Talk:A rigorous approach to lesson learning

From KM4Dev Wiki

See the original thread of this E-Discussion on D-Groups

Juliana Caicedo, 2010/8/6

Hi everyone, A newcomer to this group, I’m looking forward to participating in many interesting conversations. I’d like to start by consulting you on an issue I’ve been struggling with: what would be a rigorous (almost scientific?) yet light enough methodology/approach to capturing lessons learned throughout the project cycle. As humans, we all have our own perceptions and biases, so I’m looking for a way to minimize those and extract lessons from an intervention that can become valid/solid evidence to inform a new program /project, perhaps in a different context. I believe that a methodology that is too heavy or cumbersome will just discourage people from properly reflecting and capturing their lessons learned for the good of the organization, but I also think that rigour is important if we are to use those lessons to inform decision making.

Grateful for any insights and experiences that you’d like to share with me.


Atanu Garai, 2010/8/6

Please note that the methodology depends on the intervention. Kindly provide us with some description of the intervention. A handy reference for designing scientific methods is available at:

Nancy White, 2010/8/7

Funny, when you start asking about something, another link shows up. This is a thought-provoking blog post from Nancy Dixon on lessons learned.


Arthur Shelley, 2010/8/7

Hi Nancy (and Nancy),

Thanks for this example. It is good to have a collection of such examples. There were a few great "lessons learnt" examples in Australia a while back that never made it into the general literature.

Bovis Lend Lease’s Ikonnect was not designed as a lessons-learnt system, but as a service employees could call to ask questions whose answers might already be known elsewhere in the organisation. Over time they developed a database and a network of people that enabled employees to get relevant information quickly. I heard Garry Cullen speak about it in 2004 at KM Asia, and to my knowledge it is still going, though I know he is no longer facilitating it. See: Sharing our knowledge internally and externally:

”Collaboration is essential to ensuring the best ideas are leveraged no matter where they have originated within the organisation. In 2002 we developed an innovative knowledge-sharing service called ikonnect™. This year, the ikonnect™ team of facilitators resolved 2,960 enquiries from Lend Lease employees, sourced from employees and other stakeholders across the Group.”

Another example that comes to mind is what Ian Fry did for one of the civil services in South Australia. He designed and implemented a piece of software for capturing lessons learnt that was fully searchable in a variety of ways. It enabled staff and volunteers to find emergency information, and records of what had been done previously, more easily. To my knowledge there is no case study written about this, but this is something the community of knowledge practitioners should be doing more of and making available in a public place. See:

I have no commercial relationship with either of these two organisations. Just thought members may find these unpublished stories relevant to this conversation.


Arthur Shelley Founder: Intelligent Answers & Organizational Zoo Ambassadors Network Author: The Organizational Zoo & Being a Successful Knowledge Leader Twitter: Metaphorage Blog: http// Ph +61 413 047 408 Skype: Arthur.Shelley Free Zoo Behavioural Profiles:

Juliana Caicedo, 2010/8/9

Dear all, Thank you so much for the great inputs to my question. I have read the various responses and attached links closely, and I'm now looking at my issue from different perspectives. The topic of lessons learned is indeed much more complex than one might think, given how common those words have become. I find it especially complex if you want to be rigorous about defining a lesson learned and about creating a loop that is only closed when the lesson is really learned, by making a change at a programme or policy level.

Thanks again for your wonderful contributions, Juliana

Ian Thorpe , 2010/8/9

Hi Juliana

A late addition to this discussion.

We have templates and guidance for documenting innovations, lessons learned and good practices which we use internally - but we don't have a specific methodology, although some of our country offices are working on their own specific process to document lessons locally.

From our experience it is better not to have a single methodology to use in all cases, since both the programmes and the country situations vary widely. Lessons can come from formal evaluations or operational research, but also from internal reflection processes. One of the most important aspects, however they are documented, is that there is some form of self-reflection in which those implementing the programme examine the experience themselves to draw out the lessons. If the "lessons" are documented only by evaluators or other external observers, they may be "lessons", but they might not have been "learned".

Also, it's usually difficult to transfer a lesson or approach from one context to another, so it is important i) when documenting, to look at similar cases from other countries to see which lessons are more generalizable and which are context-specific, and ii) to remember that lessons or good practices are not blueprints that can be directly copied to create new programmes; rather, they serve as possible models for inspiration, or as checklists to consider when analyzing a particular problem and designing a unique programme to address it.

I hope this is helpful.


Below is a summary of the guidance and attached are the templates:

Category: From the field
Criteria for selection: This is a good place to showcase ideas, highlight ongoing research work, report on a particular experience from your work in the field, or share something you observed or read about that is relevant to UNICEF's mandate. You can submit photos, videos and other mixed media that may better tell the story. A short (1-2 paragraph) abstract of the case study, story, staff experience, community profile, etc. gives the reader a clear overview. Indicates how the practice is relevant to UNICEF's mandate and the problem/situation it addresses; describes the practice and where it is applicable. Either in narrative form or another media format (e.g. photos, video, mixed media), more detail is provided on the key points and lessons. Provides contact persons or links to related resources for additional information.

Category: Innovation
Criteria for selection: These are summaries of programmatic or operational innovations that have been or are being implemented under UNICEF’s mandate. These innovations may be pilot projects or new approaches to a standard programming model that can demonstrate initial results. The main focus of the document you prepare is a concise description of the innovation so that its benefits are clear to your reader. Concise (1-2 short paragraphs) and clear description of the innovation and its potential application, i.e. the value added to UNICEF programme(s). Quality and source of qualitative and/or quantitative evidence presented. Description of progress made (implementation) and next steps shows the reader why the programme is innovative.
Template: Innovations template.doc

Category: Lesson learned
Criteria for selection: These are more detailed reflections (rather than just descriptions) on a particular programme or operation, and the extraction of lessons learned through its implementation. These lessons may be positive (successes) or negative (failures); both are valuable and encouraged. You should be able to state the lesson(s) learned in a few sentences and provide verifiable results that are evidence of the lesson(s). Lessons learned have undergone more of a review process than innovations and have generally been implemented over a longer time frame. The lesson learned is clear, concise (1-2 short paragraphs) and relates to the strategy and implementation of the programme; it also shows validity for a wider audience. Quality and source of qualitative and quantitative evidence presented. Implementation strategy, results and next steps relate directly to the key lesson(s) and allow the reader to understand how the conclusions (lessons) were drawn.
Template: Lessons learned template.doc

Category: Good practice
Criteria for selection: These are well-documented and assessed programming practices that provide evidence of success/impact and are valuable for replication, scaling up and further study. Documenting good practices requires more time and effort because of the need for assessment or evaluation results. The more evidence the better, as these practices should add value to development programming in a particular sector or region. Your submission of good practices to this database may be the first step towards peer review and wider publication. The summary of the good practice describes the value added to UNICEF/country programming and the potential benefits to partners. Quality of formal evaluations and studies used to validate results and conclusions. Evidence of value added to standard programming practice is presented. Strategy and implementation describe successes and replication in more than one programme setting. Recommendations are well articulated and provide important information for readers who may be interested in further investigation.
Template: Good Practices template.doc

Manuel Flury, 2010/8/10

Dear Juliana, dear colleagues,

many thanks to all for discussing this highly relevant but tricky issue. Let me add some of my rather practical experiences:

I find a lot of "lessons learnt" chapters in all kinds of activity reports, evaluation reports and experience capitalisation reports. Usually these are lists of rather abstract and at times trivial "lessons that are not yet learnt", such as "improve coordination among local and international partners", "involve communities as early as possible in activity planning" or "be clear about your strategic orientation". So the first two lessons I have learnt: (1) formulate "lessons to be learnt", and (2) formulate them as specifically as possible, e.g. "In the coordination among local and international partners, plan for a face-to-face session on the understanding of the political context". A story about the particular experience could well be added to such a lesson.

SDC has realised that in a series of strategic evaluations over a period of 8 years, the same lessons learnt were formulated several times. Obviously no learning was happening, and there might be reasons for this. These "lessons" were assessed by a consultant, whose report stimulated a very lively discussion about what possibilities for change the organisation actually has. There is no need to ask the same questions again and again in every evaluation. So the third lesson I'd like to share with you: (3) remain selective in your "lessons to be learnt", and include the perspective (and the possibilities for taking action) of those who will have to do the learning in future!

And finally: it is because of such aspects that we do not promote a standard, backward-looking "lessons learnt" culture (of lessons we have not yet learnt) in SDC. Instead we favour preparing for the future.

best regards Manuel

Manuel Flury Knowledge and Learning Processes Division

Federal Department of Foreign Affairs Swiss Agency for Development and Cooperation SDC Freiburgstrasse 130 CH-3003 Bern / Switzerland Phone +41 31 325 02 56 E-Mail: Blog sdclan

Christina Merl, 2010/8/11

What a beautiful blog post. Many thanks for sharing this!

First, I really would like to point out that it's wonderful to read such clearly structured, useful and applicable thoughts, expressed in such clear and simple language (esp. for non-native speakers of English this is very nice to read)!

Second, I am fascinated by the idea of having knowledge brokers who "transfer" knowledge and information within an organisation. Actually, I attended a conference in Nice last September (EC-TEL 09) where a keynote speaker brought up the importance of information brokers. He postulated brokers as the future key figures in a knowledge- and community-driven economy. However, I (and I think we all) have been struggling with how to prove the RoI of CoPs and information brokers. Are there any blog posts or thoughts out there that deal with RoI and measuring "lesson learning"? BTW, there is currently a Com-Praq discussion going on about "overmanaging CoP", some of you may have followed it.

Thanks, Christina

Jaap Pels, 2010/8/12

Hi All,

What lessons did we apply while sticking on names like:

  • extension workers
  • web weavers
  • knowledge activists / anarchist
  • champion
  • knowledge monk
  • information hub / broker / mover
  • librarian
  • aficionado
  • information specialist
  • sense maker
  • appreciated inquire-ist
  • connector
  • learning architect
  • boundary spanner
  • intellectual capitalist
  • keynote speaker
  • etc?

I am open to other labels for people's 'KM-related' activities/roles/functions. Perhaps we can start a label ontology page on the KM4Dev wiki for future use and musing.

On the RoI of the staff on the ground whom we aim to support through 'KM for development': the RoI of 'making information flow and knowledge shared' is very big, invaluable, and hardly attributable to the monetary input.

For Western KM4Dev organisations, I can imagine the labour costs make it hard for RoI calculations to come out bigger than 1 (break even). Be careful: the proof that 'RoI > 1' might itself be costly.

And Christina, I love your crowd sourcing asking for 'blog posts or thoughts out there'. It must be a generation abyss; I would ask for literature :-)

Best, Jaap