Why storing project files is not the same as storing project knowledge

There is often an assumption that storing project files equates to managing knowledge on behalf of future projects. This is wrong, and here’s why.

For example, this video from the USACE Knowledge Management program says “if you digitise your paper files, throw in some metadata tagging, and use our search appliance, finding what you need for your [future] project is easy”. (I have added the word [future], as this was proposed as a way for the next project to find what it needs in advance.)

However, there is a major flaw in simply digitising, tagging and filing the project documents and assuming that this transfers knowledge: the project may have been doing things wrong, and almost certainly could have done things better with hindsight. Capturing the files will capture the mistakes, but will not capture the hindsight, which is where the learning and the knowledge reside.

It is that hindsight you need to capture, not the files themselves.

  • Don’t capture the bid package presented to the client, capture what you should have bid, the price you should have quoted, and the package you should have used. All of these things should come from the post-bid win/loss review.
  • Don’t capture the proposed project budget, capture the actual budget, where the cost overruns were, and how you would avoid these next time. This should come from the post-project lessons review.
  • Don’t capture the project resource plan, capture the resource plan you should have had, and the resourcing you would recommend to future projects of this type. This also should come from the post-project lessons review.
  • Don’t capture the planned product design, capture the as-built design, where the adjustments were made, and why they were made. (See my own experience of working from stored plans rather than the as-built design, which cost me £500 and ten dead trees.)
  • And so on. You can no doubt think of other examples.
Capturing the hindsight is extra work, and requires analysis and reflection through Knowledge Management processes such as After Action Review and Retrospect. These processes need to be scheduled within the project plan, and need to focus on questions such as:
  • What have we learned?
  • What would we repeat?
  • What would we do differently?
  • What advice and guidance, with the benefit of hindsight, would we give to future projects?
These are tough questions, focused on deriving hindsight. Deriving hindsight is not easy, which is why these Knowledge Management processes need to be given time, space, and skilled facilitation. However they add huge value to future projects by capturing the lessons of hindsight. Merely filing and tagging the project files is far easier, but will capture none of the hindsight and so none of the knowledge.
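To make the contrast concrete, here is a minimal sketch of what a captured hindsight record might contain, compared with a filed document. This is purely illustrative; the field names are my own, not a prescribed schema.

```python
from dataclasses import dataclass

# A hypothetical "hindsight record". A filed document captures only the
# what_we_did field; the value lies in the four hindsight fields, which
# mirror the review questions above.
@dataclass
class HindsightRecord:
    activity: str                   # what was reviewed, e.g. the resource plan
    what_we_did: str                # all that the filed document captures
    what_we_learned: str            # "What have we learned?"
    what_we_would_repeat: str       # "What would we repeat?"
    what_we_would_change: str       # "What would we do differently?"
    advice_to_future_projects: str  # hindsight turned into guidance

record = HindsightRecord(
    activity="Project resource plan",
    what_we_did="Planned four engineers for six months",
    what_we_learned="Commissioning was under-resourced from month four",
    what_we_would_repeat="Early involvement of the commissioning lead",
    what_we_would_change="Add a dedicated commissioning engineer",
    advice_to_future_projects="Resource commissioning separately from the build",
)
```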

Capturing documents from previous projects and repeating what they did will cause you to repeat their mistakes. Better to capture their hindsight, so it can be turned into foresight for future projects. 

View Original Source (nickmilton.com) Here.

Army definitions in Lesson Learning

The Army talk about building up lessons through Observations and Insights. But what do these terms mean?

Lesson learning is one area where Industry can learn from the Military. Military lesson learning can literally be a matter of life and death, so lesson learning is well developed and well understood in military organisations.

The Military see a progression in the extraction and development of lessons – from Observations to Insights to Lessons – and we see a similar progression within the questioning process in After Action Reviews and Retrospects.

On Slide 7 of this interesting presentation, given by Geoff Cooper, a senior analyst at the Australian Centre for Army Lessons Learned, at the recent 8th International Lessons Learned Conference, we have a set of definitions for these terms, which are very useful.

They read as follows (my additions in brackets):

Observation. The basic building block [for learning] from a discrete perspective. 

  • Many are subjective in nature, but provide unique insights into human experience.
  • Need to contain sufficient context to allow correct interpretation and understanding.
  • Offer recommendations from the source.
  • [They should be] categorised to speed retrieval and analysis.

Insight. The conclusion drawn from an identified pattern of observations pertaining to a common experience or theme.

  • Link differing perspectives and observations, where they exist.
  • Indicate recommendations, not direct actions.
  • Link solid data to assist decision-making processes.
  • As insights relay trends, they can be measured.

Lesson. Incorporates an insight, but adds specific action and the appropriate technical authority.  

Lesson Learned. When a desired behaviour or effect is sustained, preferably without external influence.
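As a rough illustration of this progression, the sketch below maps the four definitions onto a simple data model. The class and field names are my own invention, not taken from any Army system.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Observation:
    text: str                  # the discrete experience, with enough context
    source: str                # who observed it (often subjective)
    categories: List[str]      # categorised to speed retrieval and analysis
    recommendation: str = ""   # observations can offer recommendations

@dataclass
class Insight:
    theme: str                       # the common experience or theme
    observations: List[Observation]  # the identified pattern
    conclusion: str                  # the conclusion drawn from the pattern
    recommendation: str              # recommendations, not direct actions

@dataclass
class Lesson:
    insight: Insight
    action: str               # the specific action the lesson adds
    technical_authority: str  # who owns and validates the change
    behaviour_sustained: bool = False  # only True once the lesson is *learned*
```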

What Geoff is describing is a typical military approach to lesson-learning, where a lessons team collects many observations from Army personnel, performs analysis, and identifies the Insight and Lesson. As I pointed out in this post, this is different from the typical Engineering Project approach, where the project team compare observations, derive their own insights, and draft their own lessons.

The choice between the two approaches depends on the scale of the exercise. In the military model there can be hundreds of people contributing observations, while in a project it is usually a much smaller team (in which case it makes sense to collect the observations and insights through discussion). If you are using the military model, these definitions will be very useful.

View Original Source (nickmilton.com) Here.

How the BBC learned from their Olympic coverage

Here is a case study of one organisation – the BBC – learning from experience. 

The Olympics was a massive event, on a scale that is unprecedented in peacetime. It’s the biggest project a country will ever undertake, other than a war. I have already blogged about the Olympic Games KM program, but it’s not just the Games organisers that need Knowledge Management, it’s everyone involved, especially those working in new or developing areas.

One such area was Digital Broadcasting.  The London Olympics were the Digital Olympics, with more HD broadcasts, web feeds, twitter feeds etc than any other Games before. And to deliver the Digital Games, the country’s main broadcaster, the BBC, needed to develop and apply a raft of new technologies.

This post, from the BBC Internet Blog, shows how they used lesson-learning, in a structured and planned way, to ensure these products were delivered on time and to specification, and also to ensure that subsequent exercises will learn from this one.

I quote the relevant section:

Lessons Learned 
 We captured the lessons from the programme as we went along, from end of sprint retrospectives and the rich data captured in our information systems above. At the end of the Olympics the project managers facilitated workshops to capture additional successes and improvement opportunities and share these with their colleagues.  

From these on-line surveys and interviews with stakeholders, over 300 lessons were captured in our project register. The key lessons touched on above were the importance of organising and planning the work amongst self-directed, multi-disciplinary teams, with a layer of information and communication support provided by the management team. The ability to prioritise the scope and deliver it incrementally with frequent opportunities to test at scale and in a live environment, contributed to the success of a once-in-a-lifetime sporting event for the BBC’s on-line audiences. 

 The experience and lessons learned in delivering this exciting programme will be carried forward by the team members into their next projects, while the environment and process limitations identified, will drive improvements in technology provision and uptake of best practices.

We can see in-project learning (end of sprint retrospectives) and post-project learning (at the end of the Olympics – workshops to capture additional successes and improvement opportunities) – both activities built into the work program.

We worked with the BBC “live and learn” team about 10 years ago to introduce some of these learning principles, and they have subsequently been MAKE award finalists for many years. This blog shows that KM and learning practices are still alive and thriving at the BBC.

View Original Source (nickmilton.com) Here.

5 reasons why organisations don’t learn lessons.

If lesson learning is so simple, why do organisations so often fail to learn the big lessons?

We seem to be able to learn the little lessons, like improving small aspects of projects, but the big lessons seem to be relearned time and time again. Why is this?

Some of the answers to this question are explored in the article “Lessons We Don’t Learn: A Study of the Lessons of Disasters, Why We Repeat Them, and How We Can Learn Them” by Amy Donahue and Robert Tuohy. In this article they look at lessons from some of the major US emergency response exercises, and find that many of them are repeated time and again.

In particular, repeated lessons are found in the areas of

  • Failed Communications
  • Uncoordinated Leadership
  • Weak planning
  • Resourcing constraints
  • Poor Public relations

In fact these lessons come up so often that staff in disaster response exercises can almost tell you in advance what is going to fail.  People know that these issues will cause problems, but nobody is fixing them.  
You could draw up a similar list for commercial projects and find many of the same headings, with the possible addition of issues such as
  • Scoping and scope control
  • Subcontracting
  • Pricing
Donahue and Tuohy explore why it is so difficult to learn about these big issues, and come out with the following factors:
  1. Lack of motivation to fix the issues. As Donahue and Tuohy explain, 

“Individual citizens rarely see their emergency response systems in action. They generally assume the systems will work well when called upon. Yet citizens are confronted every day by other problems they want government to fix – failing schools, blighted communities, and high fuel prices. Politicians tend to respond to these more immediately pressing demands, deferring investments in emergency preparedness until a major event re-awakens public concern. As one incident commander put it, “Change decisions are driven by politics and scrutiny, not rational analysis.” 

In addition, they identify the sub-issues of a lack of ability to sustain commitment, a lack of shared vision on what to do about the lessons, and a lack of willingness of one federal or local body to learn from others.

All of these issues are also seen in commercial organisations. There is a reluctance to make big fixes if it’s not what you are being rewarded for, a reluctance to learn from other parts of the organisation, and difficulties in deciding which actions are valid.

  2. An ineffective lessons capture process. Donahue and Tuohy identify the following points:

“While some (AAR) reports are very comprehensive and useful, lessons reporting processes are, on the whole, ad hoc. There is no universally accepted approach to the development or content of reports… often several reports come out of any given incident… agencies or disciplines write their own without consulting each other. These reports differ and even conflict … there is no independent validation mechanism to establish whether findings and lessons are “right” … concern about attribution and retribution is a severe constraint on candour in lessons reporting …  the level of detail required to make a lesson meaningful and actionable is lost … meaning is also diluted by the lack of a common terminology … AARs typically focus on what went wrong, but chiefs want to know what they can do that is right. Reports tend to detail things that didn’t work, without necessarily proposing solutions. … those preparing the reports need to understand not only what happened, but also why it happened and what corrective action would have improved the circumstances. Reports of this depth and quality are relatively rare … many opportunities to learn smaller but valuable lessons are foregone (and) there is no mechanism by which these smaller lessons can be easily reported and widely shared”. 

That’s quite a list, and again we can see these issues in industry as well. Lesson learning crucially needs a consistent, candid, high-quality capture process.

  3. An ineffective lessons dissemination process. Donahue and Tuohy make the following points:

“The value of even well-crafted reports is often undermined because they are not distributed effectively. Most dissemination is informal, and as a result development and adoption of new practices is haphazard. Generally, responders must actively seek reports in order to obtain them. … There is no trusted, accessible facility or institution that provides lessons learned information to first responders broadly, although some disciplines do have lessons repositories. (The Wildland Fire Lessons Learned Center and the Center for Army Lessons Learned are two prominent examples.)”

In fact, the Wildland Fire lessons center and the Center for Army Lessons Learned represent good practice (not just in technology, but in resourcing and role as well) and are examples that industry can learn from. However the issue here is not just dissemination of lessons, but synthesis of knowledge from multiple lessons – something the emergency services generally do not do.

  4. An ineffective process for embedding change. Donahue and Tuohy address this under the heading of “learning and teaching”.

“Most learning and change processes lack a formal, rigorous, systematic methodology. Simplistically, the lesson learning and change process iterates through the following steps: Identify the lesson > recognize the causal process > devise a new operational process > practice the new process > embed/institutionalize and sustain the new process.  It is apparent in practice that there are weaknesses at each of these steps….

The emergency response disciplines lack a common operating doctrine…. Agencies tend to consider individual incidents and particular lessons in isolation, rather than as systems or broad patterns of behavior. … Agencies that do get to the point of practicing a new process are lulled into a false sense that they have now corrected the problem. But when another stressful event happens, it turns out this new process is not as firmly embedded as the agency thought … Old habits seem “safer,” even though past experience has shown they do not work. 

Follow-up is inadequate … Lessons are not clearly linked to corrective actions, then to training objectives, then to performance metrics, so it is difficult for organizations to notice that they have not really learned until the next incident hits and they get surprised”.

This is the issue of lesson management, which represents Stage 2 of lesson learning maturity. Many organisations, such as those involved in emergency response, are stuck at Stage 1. Lesson management involves tracking and supporting lessons through the whole lifecycle, from identification through to validated and embedded change.

There really is little point spending time collecting lessons if these lessons are then not managed through to resolution.

  5. A lack of dedicated resources. Donahue and Tuohy again –

“Commitment to learning is wasted if resources are not available to support the process. Unfortunately, funds available to sustain corrective action, training, and exercise programs are even leaner than those available for staff and equipment”.

Lesson-learning and lesson management need to be resourced. Roles are needed such as those seen in the US Army and the RCAF, or in Shell, to support the process.  Under-resourcing lesson learning is a major reason why lesson learning so often fails.

Conclusions.

Donahue and Tuohy have given us some sobering reading, and provided many reasons why lesson learning is not working for disaster response. Perhaps the underlying causes are a failure to treat lesson learning as a system rather than as a product (i.e. a report with documented lessons), and a failure to treat lesson learning with the urgency and importance it deserves.

Make no mistake, many commercial organisations are falling into the same pitfalls that Donahue and Tuohy describe.

If learning lessons is important (and it usually is), then it needs proper attention, not lip service.

View Original Source (nickmilton.com) Here.

The 11 steps of FEMA’s lesson capture process

The US Federal Emergency Management Agency (FEMA) has a pretty good process for capturing and distributing lessons. Here are the 11 steps.

Every Emergency Services organisation pays close attention to lesson-learning (see, for example, the approach taken by the Wildland Fire Service). They know that effective attention to learning from lessons can save lives and property when the next emergency hits.

The lesson learning system at FEMA was described in an appendix to a 2011 audit document, which showed the following 11 steps in the process for moving from activity to distributed lessons and best practices. Please note that I have not been able to find a more recent description of the process, which may have changed in the intervening 7 years.

FEMA Remedial Action Management Program Lessons Learned and Best Practices Process

  1. Team Leader (e.g., Federal Coordinating Officer) schedules after-action review
  2. After-action review facilitator is appointed
  3. Lesson Learned/Best Practice Data Collection Forms are distributed to personnel
  4. Facilitator reviews completed forms
  5. Facilitator conducts after-action review
  6. Facilitator reviews and organizes lessons learned and best practices identified in after-action review
  7. Facilitator enters lessons learned and best practices into the program’s database
  8. Facilitator Supervisor reviews lessons learned and best practices
  9. Facilitator Supervisor forwards lessons learned and best practices to Program Manager
  10. Program Manager reviews lessons learned and best practices
  11. Program Manager distributes lessons learned and best practices to Remedial Action Managers
This is a pretty good process.
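Read purely as a workflow, the 11 steps form a linear pipeline with a small number of hand-offs between roles, and each hand-off is a place where a lesson can stall if governance is weak. Here is a minimal sketch of that reading; the data structure and function are mine, not FEMA’s, and the actor assigned to step 2 is an assumption.

```python
# The 11 steps as (actor, action) pairs (an illustrative encoding only).
PROCESS = [
    ("Team Leader",            "schedule after-action review"),
    ("Team Leader",            "appoint after-action review facilitator"),
    ("Facilitator",            "distribute data collection forms"),
    ("Facilitator",            "review completed forms"),
    ("Facilitator",            "conduct after-action review"),
    ("Facilitator",            "review and organise lessons and best practices"),
    ("Facilitator",            "enter lessons into the programme database"),
    ("Facilitator Supervisor", "review lessons and best practices"),
    ("Facilitator Supervisor", "forward lessons to Program Manager"),
    ("Program Manager",        "review lessons and best practices"),
    ("Program Manager",        "distribute lessons to Remedial Action Managers"),
]

def hand_offs(process):
    """Count how many times ownership changes between consecutive steps."""
    return sum(1 for (a, _), (b, _) in zip(process, process[1:]) if a != b)

print(hand_offs(PROCESS))  # prints 3
```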
However despite this good process, the audit showed many issues, including 
  • a lack of a common understanding of what a good lesson looks like; the examples shown are mainly historical statements rather than lessons, and this example from the FEMA archives has the unhelpful lesson “Learned that some of the information is already available information is available”
  • a lack of consistent application of the after action review process (in which I would include not getting to root cause, and not identifying the remedial action),
  • a lack of use of facilitators from outside the region to provide objectivity, 
  • limited distribution of the lesson output (which I believe has now been fixed), and
  • loss of their lessons database when the server crashed (which has also been fixed by moving FEMA lessons to the Homeland Security Digital Library).

So even a good process like the one described above can be undermined by a lack of governance, a lack of trained resources, and poor technology.

View Original Source (nickmilton.com) Here.

10 steps of lesson learning – from review to embedded change

A lesson, or a piece of knowledge, goes through ten generic steps in its life cycle.

That’s partly why lesson learning is not easy – the lifecycle of a lesson contains several steps if the lesson is to lead to embedded change. These ten steps are listed below.

Step one, review of activity. 
The initial step in the lesson-learning process is to review and restate the purpose and context of the activity being reviewed. Each lesson is learned in a specific context, and needs to be viewed in that context. For example, a project operating in a remote third-world location, where supply of spares and material is highly challenging, may learn different lessons from a project based in the commercial US. This activity review will look at context, objectives, and outcomes.

Step two, identification of learning points.
 By comparing outcomes against objectives and expectations, this step allows a number of learning points to be identified. The outputs of this step are observations that something has been unusually successful, or unexpectedly unsuccessful, and that a lesson needs to be identified. These Observations are the first stage in lesson identification and development – the egg from which a lesson may grow, if you like.

Step three, analysis of learning point. 
This takes the form of root cause analysis, seeking to find the root cause which created the result identified in the observation. There may of course be more than one root cause. Once the root causes have been identified, these become the “Insights” of the military lesson-learning sequence of Observations/Insights/Lessons/Actions.

 Step four, generalization and creation of learning advice. Once the root causes have been identified and the insights generated, the next stage is to discuss how the same operation or project, or future operations and projects, may avoid the root causes that caused cost or delay, or reproduce the root causes that led to success. The discussion leads to derivation of a lesson, which should be phrased in the form of advice or recommendations for the future. At this stage we have a “lesson identified” rather than a lesson learned.

Step five, identification of action. Once the lesson has been identified, the next question to address is how the learning may be embedded within the processes, procedures, standards, and structures of the organization. In order for embedding to take place, somebody has to take an action, and an action must be identified and assigned.

The 5 steps above are often conducted verbally within the project team, and mirror the 5 questions of the After Action review or Retrospect. In the steps below, the lesson leaves the team and starts to move out into the organisation.

Step six, lesson documentation. The lesson may be documented after the action has been discussed, or the lesson may be documented after step four, when it is still a “lesson identified”. In some cases, when the lessons are submitted by individuals, they document the lessons step by step, as they go through the thought process. In other cases, as discussed below, the lesson is first discussed and then later documented based on notes or records from the discussion. We can think of this as a “lesson documented”. And to be honest, you can have a lesson learning system where this step is omitted, and all lessons are communicated verbally.

Step seven, lesson/action validation. Most lesson learning systems contain at least one validation step, where one or more people with authority examine the documented lesson and the assigned actions, to make sure that the lesson is valid and truly merits action and change, and that the proposed action is appropriate. Some regimes include a risk analysis, or a management of change analysis, on the actions proposed, if they are big enough. The deliverable from this step is a validated lesson/action.

Step eight, lesson management. In many ways you can describe all of the steps listed here as “lesson management”, but in many organisations, particularly in the oil and gas sector, a lessons management technology system (sometimes known as a lessons database) is brought in to ensure that lessons are “managed” by being routed to the people who most need to see them. This “routing of lessons” is crucial in a large organisation, to make sure the action-holders are notified of the lesson, and the action they need to take. The deliverable from this stage is a change request.

Step nine, take action. The action identified above, if valid, needs to be taken, and the change made. This is the most crucial step within the lesson learning system, because without change, no learning will have occurred. The deliverable from this step is Change.

Step ten, lesson closure. Once the changes have been made, the lesson can be completed, or closed. The lifecycle of that particular lesson is over, and it can be archived or deleted.
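As a minimal sketch (an illustration, not a prescribed implementation), steps six to ten can be thought of as a small state machine, with the routing of step eight as a lookup from lesson category to action-holder. All names and addresses below are hypothetical.

```python
from enum import Enum, auto

class LessonState(Enum):
    DOCUMENTED = auto()  # step six: lesson documented
    VALIDATED = auto()   # step seven: lesson/action validated
    ROUTED = auto()      # step eight: change request routed to action-holder
    ACTIONED = auto()    # step nine: the change is made
    CLOSED = auto()      # step ten: lesson archived or deleted

# Hypothetical routing table: who must act on lessons of each category.
ACTION_HOLDERS = {
    "procurement": "head.of.procurement@example.com",
    "design": "chief.engineer@example.com",
}

def advance(state: LessonState, category: str) -> LessonState:
    """Move a lesson one step through its lifecycle, notifying the
    action-holder at the routing step."""
    if state is LessonState.VALIDATED:
        holder = ACTION_HOLDERS.get(category, "km.team@example.com")
        print(f"Change request routed to {holder}")
        return LessonState.ROUTED
    transitions = {
        LessonState.DOCUMENTED: LessonState.VALIDATED,
        LessonState.ROUTED: LessonState.ACTIONED,
        LessonState.ACTIONED: LessonState.CLOSED,
    }
    return transitions.get(state, state)  # CLOSED is terminal
```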

Steps 5 to 10 are concerned not with the identification of the lesson, but with the way in which it leads to the right change in the organisation.

As this blog post shows, these ten steps can take place within a single project, across many projects, or across a whole organisation. However the ten steps are needed in each case.

View Original Source (nickmilton.com) Here.

The link between lesson learning maturity and effectiveness.

What is the best type of storage system for lessons learned? Our survey data compares the options.

We conducted a big survey of Knowledge Management this year, following on from a previous survey in 2014. Both surveys contained an optional section on lesson learning, and across both surveys we collected 222 responses related to lesson learning.

One of the lessons learned questions was “Rate the effectiveness of your organisational lessons learned process in delivering performance improvement, from 5 (completely effective) to 0 (completely ineffective)”

Another asked the respondent where their lessons were most commonly stored.

By combining these two questions, we can look at the average effectiveness of lesson learning for each storage option, as shown in the chart above. You can see clearly that organisations where lessons are stored within a custom lesson management system are far more likely to rate their lesson learning as effective than those where lessons are stored as sections within project reports, or not stored at all. Other storage options are linked to intermediate ratings scores.
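For anyone who wants to reproduce this kind of cross-tabulation from their own survey data, the calculation is a simple group-by: the mean effectiveness rating per storage option. The column names and sample rows below are illustrative, not our actual survey data.

```python
import pandas as pd

responses = pd.DataFrame({
    "storage": ["lesson management system", "project reports", "wiki",
                "lesson management system", "not stored", "spreadsheet"],
    "effectiveness": [4, 1, 3, 5, 0, 2],  # 0 = ineffective, 5 = effective
})

# Average effectiveness rating for each storage option.
effectiveness_by_storage = (
    responses.groupby("storage")["effectiveness"]
             .mean()
             .sort_values(ascending=False)
)
print(effectiveness_by_storage)
```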

This links back to a blog post I wrote in 2012 on the maturity of lesson learned systems. 

Here I identified a number of maturity levels, from level 1a through level 3. The supporting technology for storing lessons is only one part of the maturity system, but it struck me today that you can overlay these maturity levels on the chart, as shown below.

  • In levels 1a and 1b, lessons are stored in project reports
  • In level 1c, lessons are stored in a separate system – a database, a wiki, a spreadsheet
  • In level 1d, individuals can “follow” certain types of lessons, and be notified when new lessons appear
  • In level 2, lessons are stored in a lesson management system which allows them to be routed to experts to embed lessons into practice.

The diagram shows that each progression from one maturity level to the next is associated with an increase in effectiveness of the lesson learning system.

View Original Source (nickmilton.com) Here.

Lesson Quality in Knowledge management

Lesson quality is a crucial component of lesson learning. Poor quality lessons just lead to Garbage In, Garbage Out.

I came across an interesting article recently entitled “Enhancing Quality of Lessons Learned” by Lo and Fong. The authors look at lessons learned and how effective they are as a mechanism for transferring knowledge, and come out with the following areas where particular attention needs to be paid when recording lessons. These are:

  • The Meta-Knowledge – the way in which the lesson is tagged and labelled (including the organisational unit affected, the applicability, and the importance of the lesson)
  • Taking into account the needs of the users/learners
  • Comprehensibility and clarity of the lesson (selecting words that are unambiguous, and free of jargon)
  • The validity of the reasoning behind the lesson – the “Why” behind the lesson.

The authors point particularly to the last issue, and say the following:

“Since curiosity is a good motivator for learning, knowing the reasons why past practices succeeded or failed is essential for encouraging users to gain and share knowledge that contributes to organizational learning. It is argued that Lessons Learned should provide the rationales behind the lessons, fostering users’ reflection and extension of the application of lessons to other situations”.

This comes back to a point I made last week about capturing the Why.

It also makes the point that lesson quality is important if the lesson-learned component of KM is to work well. We have worked with several organisations who include quality control steps in their lesson learned program, and for several years conducted a monthly lessons quality audit for one organisation. For others we have provided lesson quality audit as part of an overall Lesson Learning audit service.
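Some of these quality checks can even be partly automated as a first-pass screen before human review. Here is a minimal sketch along Lo and Fong’s four dimensions; the record fields and the jargon watch-list are my own illustration, not a real audit tool.

```python
# Illustrative jargon watch-list for the comprehensibility check.
JARGON = {"synergise", "leverage", "paradigm"}

def audit_lesson(lesson: dict) -> list:
    """Return a list of quality warnings for one lesson record."""
    warnings = []
    # Meta-knowledge: is the lesson tagged and scoped?
    if not lesson.get("tags") or not lesson.get("applicability"):
        warnings.append("missing tags or applicability (meta-knowledge)")
    # User needs: is there advice a future project can act on?
    if not lesson.get("recommendation"):
        warnings.append("no recommendation for future users")
    # Comprehensibility: flag watch-list words for human review.
    if JARGON & set(lesson.get("text", "").lower().split()):
        warnings.append("possible jargon - check comprehensibility")
    # Validity of reasoning: is the 'Why' recorded?
    if not lesson.get("why"):
        warnings.append("no rationale recorded (the 'Why')")
    return warnings

print(audit_lesson({"text": "We must leverage synergies", "tags": ["ops"]}))
```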

Maybe it’s time you audited the quality of your lessons?

View Original Source (nickmilton.com) Here.

How can we learn lessons when every project is different?

This is another one to add to the “Common Knowledge Management Objections” list, and it’s worth thinking in advance what your counter-argument might be.

It’s a push-back you hear quite often in project organisations:

“We can’t do Knowledge Management, especially lessons learned, as all our projects are different”.

I last heard this from a technology company, and by saying “every project is different”, they meant that “every project has a different client, a different product, different technical specifications”.

To some extent, they are correct, and this project variation reduces some of the impact of lesson learning. Certainly lessons add the most value when projects are the most similar. But even when projects change, learning still adds value.

Firstly, even on those technology projects, the process will be the same. 

The process of building and understanding the client requirements, choosing and forming the team, selecting and managing sub contractors, balancing the innovation against the risk, communicating within the team, keeping the client requirements always in mind, managing quality, managing cost, managing time, managing expectations, managing risk, and so on.

There is a huge amount of learning to be done about the process of a project, even when the tasks are different.

Secondly, the other common factor for this technology company was that every project was a variant on their existing product. 

They learned a lot about the way the product worked, and the technology behind the product, with every new project. If this additional knowledge was not captured, then they would have to rediscover it anew every time.  If the knowledge is captured, then each project is an exploration into the technology, and builds the company understanding of the technology so that new products can be developed in future.

So even if every project is different, every project can still be a learning opportunity. 

View Original Source (nickmilton.com) Here.

Sharing knowledge by video – a firefighting example

The US Wildfire community is an area where Knowledge Management and Lesson Learning have been eagerly embraced, including through the use of video.

The need for Knowledge Management and Lesson Learning is most obvious where the consequences of not learning are most extreme. Fire-fighting is a prime example of this – the consequences of failing to learn can be fatal, and fire fighters were early adopters of KM. This includes the people who fight the ever-increasing numbers of grass fires and forest fires, known as Wildland fires.

The history of lesson learning in the Wildfire community is shown in the video below, including the decision after a major tragedy in 1994 to set up a lesson learned centre to cover wildfire response across the whole of the USA.

The increase in wildland fires in the 21st century made it obvious to all concerned that the fire services needed to learn quickly, and the Wildland Lessons Learned center began to introduce a number of activities, such as After Action Reviews, and to collect lessons from across the whole of the USA. A national wildfire “corporate university” is planned, of which the Lessons Learned center will form a part.

The wildfire lessons center can be found here, and this website includes lesson learned reports from various fires, online discussions, a blog (careful – some of the pictures of chainsaw incidents are a bit gruesome), a podcast, a set of resources such as recent advances in fire practice, a searchable incident database, a directory of members, and the ability to share individual lessons quickly. This is a real online community of practice.

Many of the lessons collected from fires are available as short videos published on the Wildland Lessons Center YouTube channel, and available to firefighters on handheld devices. An example is the video below, sharing lessons from a particular fire and speaking directly to the firefighter, asking them to imagine themselves in a particular situation: the “close call” deployment of a fire shelter during the Ahorn fire in 2011, which includes material recorded from people actually caught up in the situation.

Sometimes lessons can be drawn from multiple incidents, and combined into guidance. Chainsaw refueling operations are a continual risk during tree felling to manage forest fires, as chainsaw fuel tanks can become pressurised, spraying the operator with gasoline when the tank is opened (the last thing you want in the middle of a fire). Lessons from these incidents have been combined into the instructional video below.

This video library is a powerful resource, with a very serious aim – to save lives in future US Wildland fires. 

View Original Source (nickmilton.com) Here.
