Talk:Designing Online Information Portals - Monitoring and Evaluation


See the original thread of this E-Discussion on D-Groups

Taline Haytayan, 2009/02/03

Dear KM4Dev members,

I am currently working on developing a monitoring and evaluation plan for an online information portal, which includes a baseline, and I would be very interested to hear from others who have done something similar, in particular around:

  • how to design a baseline of online information tools
  • tracking usage of online information portals
  • how to assess whether the content is useful for people
  • innovative ways of gaining feedback from online users about the site and the content

Any experiences out there would be much appreciated,

Programme Officer.

Sam Lanfranco, 2009/02/03

Taline,

RE: …working on developing a monitoring and evaluation plan of an online information portal, which includes a baseline…

However you build the monitoring and evaluation tool, keep in mind the following principles.

  • The cost of the portal, in terms of finances and effort, is important for assessing the efficiency of production, but not for measuring the value of its output.
  • The “value” of the portal has to be measured by the impact of its use, not just by the extent of access (hits). The demand for the services of the portal is demand for an intermediate product in a process; the value is in the impact.
  • The distribution of “impact of use” is likely to have a “long tail”, with a lot of low-impact users and a few high-impact users. You need to make sure that you don't overvalue low-impact users because they are numerous, and that you don't undervalue high-impact users because they are few (a tiny numerical illustration follows this list).
  • In a well-functioning marketplace, consumers vote with their dollars (pesos, rupees, etc.) to reward quality and innovation. For free-access portals you need to build effective user feedback to drive quality improvement and innovation, you need to evaluate how well (or poorly) that works, and you probably need the support of that community to be sustainable in the long run.
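
To make the long-tail point concrete, here is a tiny illustrative calculation in Python; all the group sizes and the "impact score" are invented numbers, purely to show how far raw hits and impact can diverge:

  # Purely illustrative: value the portal by impact-weighted use, not by
  # raw access counts. All numbers and the "impact_score" field are
  # hypothetical, not from any real portal.
  users = [
      {"group": "low-impact browsers", "count": 900, "impact_score": 1},
      {"group": "occasional re-users", "count": 90, "impact_score": 10},
      {"group": "high-impact multipliers", "count": 10, "impact_score": 500},
  ]

  total_hits = sum(u["count"] for u in users)
  total_impact = sum(u["count"] * u["impact_score"] for u in users)

  for u in users:
      hit_share = u["count"] / total_hits
      impact_share = u["count"] * u["impact_score"] / total_impact
      print(f"{u['group']}: {hit_share:.0%} of hits, {impact_share:.0%} of impact")

With these made-up numbers, the ten high-impact users generate only 1% of the hits but roughly three quarters of the total impact.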

Michelle Laurie, 2009/02/04

Hi Taline,

I am helping to develop a monitoring system for an online community (not necessarily the same as an information portal) and have developed the following to guide the monitoring:

The purpose of this monitoring system is to assess whether enough engagement is taking place for the online collaboration system to function and for learning to occur. It will provide a snapshot analysis of the community site according to the following points:

  • Was the website platform integrated into the training program?
  • Was a community group focal point appointed, or did one volunteer?
  • How many people registered for the country groups?
  • Did discussions get established online?
  • If discussions took place, how did people participate? Did the facilitator make connections across the groups?

From there we will do an outcome monitoring style of analysis with indicators under three headings: Expect to see, Like to see and Love to see.

The community is just being set up and launched. I am proposing monitoring at the halfway point and at the end. The simplicity of the system is mainly due to the limited capacity and resources available to actually undertake the monitoring.

I hope that is helpful,

Denise Senmartin, 2009/02/06

Hi all! Taline, thanks for continuing to follow up on M&E after that session in Lisbon!

Another important step in evaluating a portal is measuring users' responses, loyalty and interests through surveys, analyzing not only the responses but also the percentage of people who respond. Perhaps it would be interesting to look into developing some standard survey questions for online information portal users that we could all benefit from and use for M&E?

I would be happy to join a Skype call on these topics as well.

International Institute for Communication and Development (IICD)

Tammy Loverdos, 2009/02/03

Hi Taline,

I conduct my on-site web analytics through Google Analytics, and am very happy with the reports it offers and its ease of use. Assessing whether content is useful can be tricky, but good overall indicators are the bounce rate and the length of time spent on the site / page.
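
For what it's worth, here is a minimal sketch of how those two indicators are defined. Google Analytics reports them out of the box; the session records below are made up purely to make the definitions concrete:

  # Hypothetical session records: pages viewed and seconds spent per visit.
  sessions = [
      {"pages_viewed": 1, "seconds_on_site": 0},    # a single-page "bounce"
      {"pages_viewed": 4, "seconds_on_site": 260},
      {"pages_viewed": 2, "seconds_on_site": 95},
      {"pages_viewed": 1, "seconds_on_site": 0},
  ]

  # Bounce rate: the share of visits that saw only one page.
  bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
  bounce_rate = bounces / len(sessions)

  # One common convention: average time on site over non-bounce visits,
  # since a single-page visit often has no measurable duration.
  engaged = [s for s in sessions if s["pages_viewed"] > 1]
  avg_seconds = sum(s["seconds_on_site"] for s in engaged) / len(engaged)

  print(f"bounce rate: {bounce_rate:.0%}, avg time on site: {avg_seconds:.0f}s")

A high bounce rate on a content page usually suggests the page is not what visitors were looking for, though the numbers always need interpretation.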

Verele de Vreede, 2009/02/03

Dear Taline,

We are also working on the development of a portal and I would like to exchange ideas with you on its monitoring and evaluation. At this moment we have decided to make use of Google Analytics, but we have not yet decided on the indicators or on how to use the analysis.

Maybe we should join forces?

Verele de Vreede, 2009/02/15

Dear Ashely,

You haven't missed anything yet. The chat has been postponed until March as some people were on the road for some time. I will add you to the list of interested people.

Anyone else want to join? Let me know one-to-one.

WASTE advisers on urban environment and development, The Netherlands.

Cristina Sette, 2009/02/04

Hi,

I am doing the same. Why don't we schedule a Skype chat sometime this week or next?

Institutional Learning and Change (ILAC) Initiative, Italy.

Christian Kreutz, 2009/02/04

Dear all,

I would also be interested in such a chat. I am working right now on several indicators to measure the impact of a knowledge-sharing platform, which is of course not completely possible without some qualitative feedback.

Google Analytics is certainly quite nice, but on its own it says very little. Most importantly, one needs to interpret the data from such a tool, and even more so the statistics from your web platform itself. In my case that is Drupal, which offers some interesting data of its own; an external tool such as Google Analytics cannot measure, for example, the interaction between users.

Here are some ideas I came up with (some might rest on false assumptions; looking forward to your feedback):

Representation:

  • A good platform has a certain mix of representation: e.g. countries, organizations, departments, etc.
  • The representation can be measured by visitors, members and contributions.

Contributions:

  • Frequency of new resources
  • Mix of contributions, e.g. links are not as valuable as blog posts (ranking).

Interaction:

  • Ratio of comments to resources: the average number of comments on each resource (see the calculation sketch after this list).
  • Ratio of comments to users.
  • Percentage of users who have contributed something; the larger this group, the better.

Content (quality):

  • Analysis of a random sample of resources against agreed criteria by several other members
  • Average time spent on pages
  • Page views: the more pages people browse, the more interest they have?

Outreach of website:

  • Growth in members or newsletter subscribers
  • The number of invitations sent from your platform.
  • Links to your platform.
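
To make the interaction ratios concrete, here is a minimal sketch in Python. It assumes you can export three simple lists from your platform (Drupal can produce something like this through its statistics modules or a custom view; the field names and figures here are invented):

  # Hypothetical export: who authored each resource and each comment.
  resources = [{"author": "amina"}, {"author": "li"}, {"author": "amina"}]
  comments = [{"author": "joe"}, {"author": "li"},
              {"author": "joe"}, {"author": "sara"}]
  registered_users = ["amina", "li", "joe", "sara", "pat", "kim"]

  # Average number of comments per resource (interaction per content item).
  comments_per_resource = len(comments) / len(resources)

  # Ratio of comments to registered users.
  comments_per_user = len(comments) / len(registered_users)

  # Share of users who have contributed anything at all;
  # the larger this group, the better.
  contributors = ({r["author"] for r in resources}
                  | {c["author"] for c in comments})
  contributing_share = len(contributors) / len(registered_users)

  print(f"comments per resource: {comments_per_resource:.2f}")
  print(f"comments per user: {comments_per_user:.2f}")
  print(f"contributing users: {contributing_share:.0%}")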

Adrian Gnagi, 2009/02/04

Dear Cris, Verele and Taline

Several colleagues in SDC are also working on the same issue, but are not yet as far along as you. I promised them I would monitor your discussion and report back. Would it be possible for you to sum up your Skype chat in this thread?

Swiss Agency for Development and Cooperation, Bern.

Grant Ballard-Tremeer, 2009/02/05

At the HEDON Household Energy Network, for the overall network, in addition to using Google Analytics we construct an engagement pyramid, an idea I picked up from the excellent publication by the folk at IISD, Strategic Intentions: Managing Knowledge Networks for Sustainable Development (Willard & Creech, IISD, 2001); you can download the HEDON pyramid snapshot from last year. I'd like to be able to generate that at the click of a button, but haven't built that functionality yet.

Each of our Special Interest Groups (SIGs) also has its own Analytics tracking on all of its SIG pages, and in our weekly SIG moderators' meeting each moderator reports on six metrics covering the past week:

  • number of emails sent through the SIG
  • number of contributors sending emails through the SIG
  • number of new SIG subscribers
  • number of welcome messages sent to new subscribers
  • number of unique visits to the SIG pages
  • number of off-list SIG emails sent (not counting welcome messages)

These metrics, which can be used to derive some useful ratios (e.g. emails per contributor), are all at the activity or output level, but we think they are key ingredients for gauging the efforts of the moderator and the overall health of the SIG. All the data is collected with the help of our software EcoTrack, which systematises the monitoring process.
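
As an illustration (only a sketch with invented figures, not how EcoTrack actually works), the weekly report could be modelled as a small record with the ratios derived from it:

  from dataclasses import dataclass

  @dataclass
  class WeeklySigReport:
      # The six metrics each moderator reports for the past week.
      emails_sent: int
      contributors: int
      new_subscribers: int
      welcome_messages: int
      unique_page_visits: int
      off_list_emails: int

      def emails_per_contributor(self) -> float:
          # One of the useful ratios mentioned above.
          return self.emails_sent / self.contributors if self.contributors else 0.0

  week = WeeklySigReport(emails_sent=42, contributors=12, new_subscribers=5,
                         welcome_messages=5, unique_page_visits=310,
                         off_list_emails=3)
  print(f"emails per contributor: {week.emails_per_contributor():.1f}")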

We haven't started yet, but we also intend to track changes in knowledge and capacity for members of each of the SIGs during the course of this year. The idea is that each SIG develops its own self-assessment form covering key practices in its subject area; every six months each member will be coerced into completing the self-assessment online, and our system will create river diagrams (as described in 'Learning to Fly'). This - if all goes according to plan - will allow us to track changes in community capacity over time, by location, region etc. We would of course still have the problem of attributing changes in capacity, but we think it will be an immensely powerful tool.

Ashely Kiehnau, 2009/02/13

Hi,

Did I miss the Skype chat? We're switching platforms and looking at ways to improve M&E for our site - including CoPs. Let me know if I haven't missed it!

Gabriele Sani, 2009/02/05

Hi Taline,

You have come up with a very interesting topic!

As usual, this thread has already seen many excellent suggestions and insights from past experiences, and I think that, to profit from them and build a good system, you should focus on the purpose of your analysis.

Why do you need a monitoring and evaluation system, and who needs to see the results of this evaluation? Answering this will help you choose the tools of the trade wisely, and decide what to make of them. There are two broad categories.

First, tools from the web, like Google Analytics, which is an excellent choice for monitoring some data. However, keep in mind that it is geared mainly towards commercial websites that want to increase their advertising revenues (and, strangely enough, it is also an excellent marketing tool for Google, since it reminds every webmaster that Google is the top dog among web search engines ;) ). The data it provides can clearly be used for more than just that... but you cannot get very far on GA alone. For example, in an e-learning website with paid memberships you already know who your users are and which pages they visit, so most of GA's functionality becomes redundant.

The second category of tools are those specific to the content delivery platform you are using. Try researching them, you could stumble upon some gems! For example, in a previous email Christian mentioned using Drupal, and let me add some more examples to the already good list he gave.

With Drupal, but with other platforms too, you can allow people to rate a page, and to customise their navigation menus by adding bookmarks to the content they liked the most. This gives you immediate feedback on how much they liked it. However, a few years ago a university decided to add some Britney Spears pics to its pages on semiconductors, and it was a huge hit (at least among would-be physicists).

You can give points to your users whenever they create new content, and then award the top contributors a new status, or a badge on their profile. But then you risk breeding one-line spammers. Whenever you are looking for active feedback from users or asking them to do something for you, you should always try to cross-check your results and look beyond the mere numbers. The thread on the Worst Practices has lots of warnings in this direction.

You can also analyse users' behaviour in subtle ways. For example, you can monitor the terms used in all the searches and compare them to a tag cloud of your website, to see how well your content matches users' interests (or to identify what users are looking for on top of what is already present); a small sketch of this idea follows below. You can also use tags to identify similar content and then find the areas with the highest and lowest numbers of dedicated pages. But this can become quite a time-consuming task if you allow free tagging of content, or if you have a huge taxonomy of terms.

Finally, you can allow users to create and join sub-groups, and then monitor the results for each one of these groups. In km4dev you can find tons of resources on the pros and cons of Communities of Practice.
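
Here is the promised sketch of the search-terms-versus-tag-cloud idea, in Python. It assumes you can export the queries typed into the site search and the tags attached to existing content; both lists below are invented, and the "demand outstrips supply" rule is only a crude heuristic:

  from collections import Counter

  # Hypothetical exports from the platform: search queries typed by
  # users, and a count of pages already carrying each content tag.
  search_queries = ["solar cooker", "biogas", "solar cooker", "stove testing",
                    "biogas", "carbon finance", "biogas"]
  tagged_pages = Counter({"solar cooker": 12, "stove testing": 7, "biogas": 1})

  demand = Counter(search_queries)

  # Flag terms that users search for more often than the site covers them:
  # candidate topics for new content.
  for term, searches in demand.most_common():
      coverage = tagged_pages.get(term, 0)
      if coverage < searches:
          print(f"gap: '{term}' searched {searches}x, only {coverage} tagged pages")

Run against these made-up numbers, it would flag 'biogas' and 'carbon finance' as under-covered topics.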

I could rant about these topics for a long time, but the key point is that different platforms offer different tools, and there is probably a wide range of them. Start from what you are aiming for, understand what you need to get there, and only then start looking into the tools you may already have at your fingertips.

If it sounds like a KM strategy... well, it is.