Why storing project files is not the same as storing project knowledge

There is often an assumption that storing project files equates to managing knowledge on behalf of future projects. This is wrong, and here’s why.

For example, a video from the USACE Knowledge Management program says “if you digitise your paper files, throw in some metadata tagging, and use our search appliance, finding what you need for your [future] project is easy”. (I have added the word [future] because this was proposed as a solution for the next project, rather than for the project that generated the files.)

However, there is a major flaw in just digitising, tagging and filing the project documents and assuming that this transfers knowledge: the project may have been doing things wrong, and almost certainly could have done things better with hindsight. Capturing the files will capture the mistakes, but it will not capture the hindsight, which is where the learning and the knowledge reside.

It is that hindsight you need to capture, not the files themselves.

  • Don’t capture the bid package presented to the client, capture what you should have bid, the price you should have quoted, and the package you should have used. All of these things should come from the post-bid win/loss review.
  • Don’t capture the proposed project budget, capture the actual budget, where the cost overruns were, and how you would avoid these next time. This should come from the post-project lessons review.
  • Don’t capture the project resource plan, capture the resource plan you should have had, and the resourcing you would recommend to future projects of this type. This also should come from the post-project lessons review.
  • Don’t capture the planned product design, capture the as-built design, where the adjustments were made, and why they were made. (See my own experience of working from stored plans rather than the as-built design, which cost me £500 and ten dead trees.)
  • And so on. You can no doubt think of other examples.
Capturing the hindsight is extra work, and requires analysis and reflection through Knowledge Management processes such as After Action Review and Retrospect. These processes need to be scheduled within the project plan, and need to focus on questions such as:
  • What have we learned?
  • What would we repeat?
  • What would we do differently?
  • What advice and guidance, with the benefit of hindsight, would we give to future projects?
These are tough questions, focused on deriving hindsight. Deriving hindsight is not easy, which is why these Knowledge Management processes need to be given time, space, and skilled facilitation. However they add huge value to future projects by capturing the lessons of hindsight. Merely filing and tagging the project files is far easier, but will capture none of the hindsight and so none of the knowledge.

Capturing documents from previous projects and repeating what they did will cause you to repeat their mistakes. Better to capture their hindsight, so it can be turned into foresight for future projects. 

View Original Source (nickmilton.com) Here.

What to do when knowledge is a core product deliverable

For some projects, knowledge is their most important deliverable, but how is that deliverable defined?

We are used to thinking of knowledge as an input for a project, but it is often an output as well. Projects can learn new things, and can create new knowledge for an organisation. Often we assume that this new knowledge will be transferred through the lesson learned system, but is that really enough?

Usually it isn’t, and instead a different approach might be better.
Lessons are increments of knowledge, usually identified after the event, at the end of an activity or a project stage. In an ideal world every lesson would be associated with an action, and each action would lead to the update of a best practice, a doctrine or a corporate standard. Lessons are usually captured in a team meeting such as a Retrospect.
However if a project is doing something new – something which has never been done before – then the standard lesson approach is insufficient. Rather than identifying and capturing the learning after the event, the organisation should identify the potential for learning at the start of the project, make sure resources are assigned to learning, and require the project to create a guideline, best practice or standard as a project deliverable. 
Imagine an organisational project to set up the company’s first manufacturing facility in Asia – the first of many such plants in an expansion program. The project is expected to deliver a manufacturing plant with a certain production capability, and the success of the project will usually be measured by whether the plant is delivered on time, to quality and to budget. However the success of the program will be influenced by how much knowledge is passed from the first plant to the others, and the value of this knowledge may be higher than the value of the plant itself.
Therefore the project can be given a second deliverable – to create best practice guidance documents, doctrine or first-pass standards on topics such as
  • Doing business in that particular Asian country
  • How to negotiate the bureaucracy
  • How to obtain permits
  • How to construct the plant efficiently and effectively
  • How to recruit a skilled workforce
and many other topics. These deliverables should be managed through the project KM plan, and reported to management in the same way as other deliverables.
This set of knowledge deliverables could be given its own resources and its own workstream, in order to make sure that the knowledge is captured. Without this focus on knowledge, it is quite possible to get to the end of the project and find that no knowledge has been captured at all. 

View Original Source (nickmilton.com) Here.

What is a knowledge product?

The concept of a Knowledge Product is a common one in the development sector, and is used as a label for many types of document.  But what makes a product a “knowledge product”?

Many organisations working in the development sector create what they call “Knowledge Products”. The African Development Bank, for example, has a whole suite of economic briefs, working papers and economic reports published under the heading of “knowledge products”. These are written by specialists in the Bank, for the education or reference of future Bank programs and for wider society. The main mechanism of “knowledge transfer” regarding these products is “dissemination” – publishing reports to target audiences, often on web-hosted repositories.

Other organisations are looking at the same topic, and wondering if Knowledge Products could be defined as a project output, for example.

But what does the “Knowledge” in the term mean? Are these reports “products of knowledge” or are they “products that aim to transfer knowledge to the user”?

If we are to use knowledge products as a component of a KM Framework, then surely they must follow the second definition, not the first?

Dr. Serafin Talisayon, in his lecture notes, suggests that

A knowledge product is something that enables effective action by an intended user, client or stakeholder of a government agency or a non-government or development organization.


This is the second definition. A Knowledge Product must carry knowledge, and must enable action by the reader (knowledge is, after all, the ability to take effective action).  It must be actionable.

  • Therefore a project report is not a knowledge product. Even if it contains a detailed history of the project, the reader does not know whether to copy this history or not. 
  • A lesson learned report IS a knowledge product, provided the lessons are actionable.
  • An economic summary of a region is arguably not a knowledge product. One could read the report, but still not know what action to take.
  • A summary of best practice, or recommended practice, is a knowledge product, provided that the description of the best practice is detailed enough, and provides enough contextual background, that people can act on it in their own context.  

We can see in this list from ITAD several knowledge products – lessons, findings, insights. The African Development Bank suite mentioned above seems to follow a looser definition of Knowledge Product, mixing a whole variety of reports – lessons reports alongside briefs and working papers.
If we are to follow Dr Talisayon’s definition, then writing Knowledge Products requires a lot of care if they are to be usable and used. To quote this blog from the World Bank,

KM should be conceived less as a purely technical information-based area and more as a communication and behaviour-change area … Knowledge producers need to package the product in a way that can be easily applied, while the users need to be “persuaded” to conceive knowledge as a practical tool that can be applied in their field.

If we are to translate the concept of Knowledge products from the Development sector to the Industrial sector, we would do well to bear this in mind, and use the term “Knowledge Products” only for items that are expressly written to convey knowledge, with the user in mind.

So a set of project reports on a website is not a collection of knowledge products. A wiki containing guidance, good practice and lessons is a knowledge product.  In an ideal world, every project should produce knowledge products which are used to grow and evolve the knowledge base in the organisation.

Knowledge products, if treated the right way, can be a core component of a KM Framework. 

View Original Source (nickmilton.com) Here.

The 11 steps of FEMA’s lesson capture process

The US Federal Emergency Management Agency (FEMA) has a pretty good process for capturing and distributing lessons. Here are the 11 steps.

Every Emergency Services organisation pays close attention to Lesson-Learning (see the approach taken by the Wildland Fire Service). They know that effective attention to learning from lessons can save lives and property when the next emergency hits.

The lesson learning system at FEMA was described in an appendix to a 2011 audit document, which showed the following 11 steps in the process for moving from activity to distributed lessons and best practices. Please note that I have not been able to find a more recent description of the process, which may have changed in the intervening 7 years.

FEMA Remedial Action Management Program Lessons Learned and Best Practices Process

  1. Team Leader (e.g., Federal Coordinating Officer) schedules after-action review
  2. After-action review facilitator is appointed
  3. Lesson Learned/Best Practice Data Collection Forms are distributed to personnel
  4. Facilitator reviews completed forms
  5. Facilitator conducts after-action review
  6. Facilitator reviews and organizes lessons learned and best practices identified in after-action review
  7. Facilitator enters lessons learned and best practices into the program’s database
  8. Facilitator Supervisor reviews lessons learned and best practices
  9. Facilitator Supervisor forwards lessons learned and best practices to Program Manager
  10. Program Manager reviews lessons learned and best practices
  11. Program Manager distributes lessons learned and best practices to Remedial Action Managers
This is a pretty good process.
However, despite this good process, the audit showed many issues, including:
  • a lack of a common understanding of what a good lesson looks like; the examples shown are mainly historical statements rather than lessons, and this example from the FEMA archives has the unhelpful lesson “Learned that some of the information is already available information is available”
  • a lack of consistent application of the after action review process (in which I would include not getting to root cause, and not identifying the remedial action),
  • a lack of use of facilitators from outside the region to provide objectivity, 
  • limited distribution of the lesson output (which has now been fixed, I believe), and
  • loss of their lessons database when the server crashed (which has also been fixed by moving FEMA lessons to the Homeland Security Digital Library).

So even a good process like the one described above can be undermined by a lack of governance, a lack of trained resources, and poor technology.

View Original Source (nickmilton.com) Here.

Why "Know-why" is as important as "Know-how" – the monastery cat story

Recording the Know-how is key to Knowledge management, but Know-Why is also important.

Do you know the story of the monastery cat? It goes like this

Image: cat at Solovki Monastery (picture from Wikimedia Commons)

Once upon a time, there was a monastery where the monks meditated from dawn to dusk. One day a cat trespassed into the monastery and disturbed the monks. The head monk instructed that the cat be caught and tied to a tree until dusk. He also advised that every day, to avoid hindrance during meditation, the cat be tied to the tree. So it became a standard operating procedure “To catch the Cat & Tie it to the Tree” before the monks started meditating.

One day the Head Monk died. The second most senior monk was chosen as Head Monk, and all the other traditions, including tying the cat to the tree, were continued. One day the cat died. The whole monastery plunged into chaos. A committee was formed to find a solution, and it was unanimously decided that a cat be bought from the nearby market and tied to the tree before starting the meditation each day. This tradition is still followed in the monastery even today.

A simple story, but it carries a clear message – if you don’t know why you are doing something, then you don’t know when you can stop, or what you can change.

We are working at the moment with a client who has a good system for documenting Standard Operating Procedures, and updating them with new Lessons Learned as appropriate. This is their way of continually improving their processes, and of institutionalising a form of corporate memory. However there is a risk here. The risk is that they are recording “how to do it” and not “why to do it this way” – the “Know-How” and not the “Know-Why”. And if the context changes, and the procedures need to be adapted, then without an understanding of the “why”, people won’t know how to change them. The SoPs could become a case of “tying the cat to the tree”.

This is an operational version of “thinking fast and slow”. The Fast thinking is to follow the SoP, when all you need to know is “what to do” or “how to react”. The Slow thinking is to go back to the principles, go back to the Why, and derive the new operating procedure. Or you could look at it as double-loop thinking – the first loop focused on “what to do”, and the second loop on “why do we do it that way”.

So how do you record the Know-Why?

You have a couple of options.

  • Record enough commentary in the SoP so that you can see what it is based on. For example, one legal company we worked with used to keep standard contracts, but each contract clause had extensive commentary describing why the clause was there, and why it was written the way it was. The contract could then be amended if needed (together with amended commentary) and the commentary stripped out when the contract was printed. That way they preserved the Know-Why.
  • Create a secondary document, such as a Basis of Design document. This document takes each element of a project design, a product design or a software design, and describes the basis and the assumptions on which it was designed. You can create a BoD before the project, and revise it during and after the project to capture changes to the design, and the rationale behind those changes. The BoD then helps future teams working on the same product to make intelligent changes, rather than blind changes. It also helps transfer knowledge to subcontractors working on the project, and protects against loss of knowledge. I recall someone saying to me, about Basis of Design, “I could look at the BoD and put out a quality project design in 2 weeks, and there hasn’t been any work done in this area for 2 years”. (A minimal sketch of such a structure is shown after this list.)
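To make the Basis of Design idea concrete, here is a minimal sketch, in Python, of how BoD entries could be structured so that every design decision keeps its rationale and assumptions alongside it. The class and field names are illustrative assumptions, not a standard or a reference to any specific tool.

```python
# Minimal sketch of a Basis of Design record: each design decision keeps its
# rationale (the "Know-Why") and assumptions next to the decision itself.
# Class and field names are illustrative, not a standard.
from dataclasses import dataclass, field
from typing import List


@dataclass
class DesignElement:
    name: str           # e.g. "Pump sizing"
    decision: str       # what was decided (the "Know-How")
    rationale: str      # why it was decided that way (the "Know-Why")
    assumptions: List[str] = field(default_factory=list)  # conditions that must still hold


@dataclass
class BasisOfDesign:
    project: str
    elements: List[DesignElement] = field(default_factory=list)

    def assumptions_to_recheck(self) -> List[str]:
        """List the assumptions a future team should re-check before changing the design."""
        return [f"{e.name}: {a}" for e in self.elements for a in e.assumptions]


# Example usage: a future team sees *why* before changing *what*.
bod = BasisOfDesign(project="Plant upgrade")
bod.elements.append(DesignElement(
    name="Pump sizing",
    decision="Two 50% duty pumps",
    rationale="Allows maintenance without a full shutdown",
    assumptions=["Throughput stays below 400 m3/h"],
))
print(bod.assumptions_to_recheck())
```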
Recording the Know-why keeps the context, and avoids the risk of repeating the Monastery Cat story.

View Original Source (nickmilton.com) Here.

Where should you focus Knowledge Retention?

Knowledge retention can be a massive exercise if not focused. But which knowledge should you focus on retaining?

Imagine you are setting up a Knowledge Retention interview with a company expert. This expert has a lifetime’s knowledge which would take an eternity to capture – where do you start? Where are the highest priority areas for capture?

A Boston Square – a simple two-by-two prioritisation grid – may help.

The first axis of the square is the routine/non-routine nature of the activity which the Expert knows about. The Expert often has something that the ordinary practitioner does not have, and that is an understanding of the non-routine activity – the “one in a thousand” occurrences that most people never see, but which an expert has either met, or heard of somewhere. Most practitioners, even the junior ones, understand routine activity – it is when they meet non-routine circumstances that an expert is needed.

The second axis is the criticality of the knowledge. How critical will that knowledge be? Will it save lives and millions of dollars, or is it not particularly critical?

Obviously the focus for your retention is the critical non-routine areas. If you do nothing else, then capture the knowledge of these topics.

Then, if you have time, address the “critical and routine” quadrant (although most people will know this already, it may be good to have the expert’s viewpoint), and then the “non-critical and non-routine” quadrant (it will at least help people avoid reinventing the wheel).
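As a rough illustration only, the prioritisation logic of this Boston Square could be written as a small Python function. The quadrant ordering follows the text above; the numeric scale, topic names and function name are my own assumptions.

```python
# Sketch of the Boston Square prioritisation for knowledge retention.
# Priority 1 = capture first. Ordering follows the text; names are illustrative.
def retention_priority(critical: bool, routine: bool) -> int:
    if critical and not routine:
        return 1  # critical, non-routine: the expert's rare "one in a thousand" knowledge
    if critical and routine:
        return 2  # critical, routine: most people know it, but the expert's view helps
    if not critical and not routine:
        return 3  # non-critical, non-routine: at least avoids reinventing the wheel
    return 4      # non-critical, routine: lowest priority for retention interviews


# Example: order candidate interview topics by capture priority.
topics = [
    ("Routine monthly reporting", False, True),
    ("Handling a rare plant shutdown", True, False),
    ("Day-to-day safety checks", True, True),
]
for name, critical, routine in sorted(topics, key=lambda t: retention_priority(t[1], t[2])):
    print(retention_priority(critical, routine), name)
```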

View Original Source (nickmilton.com) Here.

Reaching the deep knowledge

Reaching the Deep Knowledge often requires the help of a facilitator or interviewer, and there is a tell-tale sign that shows when you get there.

Image: “woman, thinking” by Robert Couse-Baker, on Flickr

Superficial knowledge transfer happens all the time.

A foreman leaves his job. The company arranges a hand-over meeting, and the foreman talks through the processes and procedures with his successor, but the real knowledge – the tips of the trade, the workarounds, the instinctive feel for the tasks – leaves with the man.

An engineer opens his email, and reads a request for advice from a colleague on another continent. The engineer drafts a quick reply, describing a solution he has applied in the past. However he fails to think through the reasoning and insights underlying the solution, and his reply is superficial and of little value to his colleague. The colleague gets no help from the suggestions, and next time she won’t even bother to ask.

A project manager finishes her project. She sits down for a couple of days and writes a close-out report, where she details the history of the project, and the successes this project has achieved. But she never gets to the secrets behind the success; these are hidden in the undocumented interactions in her team. As a result, the successes are unrepeated.

In each of these examples, an opportunity to exchange valuable knowledge has been lost – in some cases forever. The crucial knowledge stays in the head of the foreman, the project manager, the engineer, because none of them are conscious of what they know. Without being conscious of what they know, they have no way to pass that knowledge on. Any knowledge management system that fails to find the things that people don’t know they know (the unknown knowns), that fails to mine the deep knowledge, will fail to deliver its full potential.

One of the key tenets of Knowledge Management is that we know more than we realise, and more than we can record. The individual, working alone and with a blank sheet of paper, seldom accesses the deep knowledge, and ends up recording the superficial. The only way to dig a bit deeper (while still realising we won’t get everything) is to start probing with questions.

Questioning processes

A good questioner, or a good questioning process, can help the individual dig deeper than they can manage unaided. That’s why so many Knowledge Management processes are based on questioning and dialogue.

On a short-term small scale, the After Action Review is a questioning process; getting at the ground truth behind the results of an exercise or activity. The team’s expectation of an event is compared with the actuality, and the facilitator goes through a questioning process to find the reasons for the difference between the two. Where there is a difference, there is learning, but it may take some probing questioning to get to the knowledge. Oil companies use After Action Reviews in situations where a small team conducts a brief action, such as a maintenance team working a shift at a refinery, or a negotiation team conducting a difficult meeting with a host government. In every case we found that the quality of the questioning determined the value of the knowledge. Superficial questioning gives shallow knowledge of limited use. Harder questioning, maybe using the technique of ‘the five whys’, gets at the deep knowledge, where the real value lies.

On a larger scale, the Learning History uses the same sort of questioning techniques. A skilled facilitator, informed but detached, not a member of the team, will take a project team through a structured questioning and discussion process, where the history of the project is reviewed and the knowledge brought out and captured. For example, I was once part of a joint interview team, charged with capturing and packaging the knowledge from a major industrial merger. We targeted 40 of the top decision-makers, and sent them an interview guide with some high-level questions. We then followed this up with hour-long interviews, where we applied some of these questioning techniques. It was pretty obvious when we started to tap into the unconscious knowledge – the pace of the interview slowed as the interviewees started to really think deeply about what had happened, and started to ask themselves ‘what really happened there, and what did we really learn?’.

You know, as an interviewer, when you are tapping into the deep knowledge. The interviewee stops, thinks, leans back in their chair, and their gaze rises as they look upwards and inwards.

That’s the sign that you are digging deep – the sign that you are hitting Knowledge Paydirt!

View Original Source Here.
