10 steps of lesson learning – from review to embedded change

A lesson, or a piece of knowledge, goes through ten generic steps in its life cycle.

Free image from Max Pixel

That’s partly why lesson learning is not easy – the lifecycle of a lesson contains many steps if the lesson is to lead to embedded change. The ten steps are listed below.

Step one, review of activity. 
The initial step in the lessons learned process is to review and restate the purpose and context of the activity being reviewed. Each lesson is learned in a specific context, and needs to be viewed in that context. For example, a project operating in a remote third-world location, where the supply of spares and materials is highly challenging, may learn different lessons from a project based in the commercial US. This activity review will look at context, objectives, and outcomes.

Step two, identification of learning points.
By comparing outcomes against objectives and expectations, this step allows a number of learning points to be identified. The outputs of this step are observations that something has been unusually successful, or unexpectedly unsuccessful, and that a lesson needs to be identified. These observations are the first stage in lesson identification and development – the egg from which a lesson may grow, if you like.

Step three, analysis of learning point. 
This takes the form of root cause analysis, seeking to find the root cause which created the result identified as an observation. There may of course be more than one root cause. Once the root causes have been identified, these become the “insights” of the military lesson-learning quartet of Observations/Insights/Lessons/Actions.

Step four, generalization and creation of learning advice. Once the root causes have been identified and the insights generated, the next stage is to discuss how the same operation or project, or future operations and projects, may avoid the root causes that caused cost or delay, or reproduce the root causes that led to success. The discussion leads to derivation of a lesson, which should be phrased in the form of advice or recommendations for the future. At this stage we have a “lesson identified” rather than a lesson learned.

Step five, identification of action. Once the lesson has been identified, the next question to address is how the learning may be embedded within the processes, procedures, standards, and structures of the organization. In order for embedding to take place, somebody has to take an action, and an action must be identified and assigned.

The 5 steps above are often conducted verbally within the project team, and mirror the 5 questions of the After Action review or Retrospect. In the steps below, the lesson leaves the team and starts to move out into the organisation.

Step six, lesson documentation. The lesson may be documented after the action has been discussed, or the lesson may be documented after step four, when it is still a “lesson identified”. In some cases, when the lessons are submitted by individuals, they document the lessons step by step, as they go through the thought process. In other cases, as discussed below, the lesson is first discussed and then later documented based on notes or records from the discussion. We can think of this as a “lesson documented”. And to be honest, you can have a lesson learning system where this step is omitted, and all lessons are communicated verbally.

Step seven, lesson/action validation. Most lesson learning systems contain at least one validation step, where one or more people with authority examine the documented lesson and the assigned actions, to make sure that the lesson is valid and truly merits action and change, and that the proposed action is appropriate. Some regimes include a risk analysis, or a management of change analysis, on the actions proposed, if they are big enough. The deliverable from this step is a validated lesson/action.

Step eight, lesson management. In many ways you can describe all of the steps listed here as “lesson management”, but in many organizations, for example in the oil and gas sector, a lessons management technology system (sometimes known as a lessons database) is brought in to ensure that lessons are “managed” by being routed to the people who most need to see them. This “routing of lessons” is crucial in a large organisation, to make sure the action-holders are notified of the lesson, and of the action they need to take. The deliverable from this stage is a change request.

Step nine, take action. The action identified above, if valid, needs to be taken, and the change made. This is the most crucial step within the lesson learning system, because without change, no learning will have occurred. The deliverable from this step is Change.

Step ten, lesson closure. Once the changes have been made, the lesson can be completed, or closed. The lifecycle of that particular lesson is over, and it can be archived or deleted.

Steps 5 to 10 are concerned not with the identification of the lesson, but the way in which it leads to the right change in the organisation. 

As this blog post shows, these ten steps can take place within a single project, across many projects, or across a whole organisation. However, the ten steps are needed in each case.
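As a rough sketch (my own illustration rather than anything from the post), the ten steps can be modelled as a simple linear state machine, in which a lesson advances one stage at a time until closure. All names and the example observation are hypothetical.

```python
from enum import Enum, auto

class Stage(Enum):
    """Hypothetical labels for the ten steps of the lesson lifecycle."""
    REVIEW = auto()          # 1. review of activity
    IDENTIFY = auto()        # 2. identification of learning points
    ANALYSE = auto()         # 3. root cause analysis
    GENERALISE = auto()      # 4. lesson identified (advice for the future)
    ASSIGN_ACTION = auto()   # 5. identification of action
    DOCUMENT = auto()        # 6. lesson documented
    VALIDATE = auto()        # 7. validated lesson/action
    MANAGE = auto()          # 8. routed via lesson management (change request)
    ACT = auto()             # 9. change made
    CLOSE = auto()           # 10. lesson closed and archived

class LessonRecord:
    def __init__(self, observation: str):
        self.observation = observation
        self.stage = Stage.REVIEW  # every lesson starts with the activity review

    def advance(self) -> Stage:
        """Move the lesson one step along its lifecycle."""
        stages = list(Stage)
        i = stages.index(self.stage)
        if i < len(stages) - 1:
            self.stage = stages[i + 1]
        return self.stage

lesson = LessonRecord("Spares supply caused a three-week delay")
while lesson.stage is not Stage.CLOSE:
    lesson.advance()
```

The point of the sketch is that a lesson is only “learned” once it has passed through every stage, not merely once it has been documented.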

View Original Source (nickmilton.com) Here.

The link between lesson learning maturity and effectiveness.

What is the best type of storage system for lessons learned? Our survey data compares the options.

We conducted a big survey of Knowledge Management this year, following on from a previous survey in 2014. Both surveys contained an optional section on lesson learning, and across both surveys we collected 222 responses related to lesson learning.

One of the lessons learned questions was “Rate the effectiveness of your organisational lessons learned process in delivering performance improvement, from 5 (completely effective) to 0 (completely ineffective)”

Another asked the respondent where their lessons were most commonly stored.

By combining these two questions, we can look at the average effectiveness of lesson learning for each storage option, as shown in the chart above. You can see clearly that organisations where lessons are stored within a custom lesson management system are far more likely to rate their lesson learning as effective than those where lessons are stored as sections within project reports, or not stored at all. Other storage options are linked to intermediate ratings scores.
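The calculation behind a chart like this is straightforward: group the effectiveness ratings by storage option and average them. Here is a minimal sketch using invented illustrative responses (the real survey data is not reproduced in this post):

```python
from collections import defaultdict

# Hypothetical (storage option, effectiveness rating 0-5) survey responses.
responses = [
    ("lesson management system", 4), ("lesson management system", 5),
    ("standalone database", 3), ("standalone database", 4),
    ("project reports", 2), ("project reports", 1),
    ("not stored", 1), ("not stored", 0),
]

# Group the ratings by storage option.
ratings_by_storage = defaultdict(list)
for storage, rating in responses:
    ratings_by_storage[storage].append(rating)

# Average effectiveness per storage option - one bar per option on the chart.
averages = {s: sum(r) / len(r) for s, r in ratings_by_storage.items()}
```

Applied to the real 222 responses, the same grouping produces the pattern described above: custom lesson management systems at the top, project reports and unstored lessons at the bottom.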

This links back to a blog post I wrote in 2012 on the maturity of lesson learned systems. 

Here I identified a number of maturity levels, from level 1a through level 3. The supporting technology for storing lessons is only one part of the maturity system, but it struck me today that you can overlay these maturity levels on the chart, as shown below.

  • In levels 1a and 1b, lessons are stored in project reports
  • In level 1c, lessons are stored in a separate system – a database, a wiki, a spreadsheet
  • In level 1d, individuals can “follow” certain types of lessons, and be notified when new lessons appear
  • In level 2, lessons are stored in a lesson management system which allows them to be routed to experts, who embed the lessons into practice.

The diagram shows that each progression from one maturity level to the next is associated with an increase in effectiveness of the lesson learning system.


Lesson Quality in Knowledge Management

Lesson quality is a crucial component of lesson learning. Poor-quality lessons just lead to Garbage In, Garbage Out.

I came across an interesting article recently entitled “Enhancing Quality of Lessons Learned” by Lo and Fong. The authors look at lessons learned and how effective they are as a mechanism for transferring knowledge, and come up with the following areas where particular attention needs to be paid when recording lessons. These are:

  • The Meta-Knowledge – the way in which the lesson is tagged and labelled (including the organisational unit affected, the applicability, and the importance of the lesson)
  • Taking into account the needs of the users/learners
  • Comprehensibility and clarity of the lesson (selecting words that are unambiguous, and free of jargon)
  • The validity of the reasoning behind the lesson – the “Why” behind the lesson.

The authors point particularly to the last issue, and say the following

“Since curiosity is a good motivator for learning, knowing the reasons why past practices succeeded or failed is essential for encouraging users to gain and share knowledge that contributes to organizational learning. It is argued that Lessons Learned should provide the rationales behind the lessons, fostering users’ reflection and extension of the application of lessons to other situations”.

This comes back to a point I made last week about capturing the Why.

It also makes the point that lesson quality is important if the lesson-learned component of KM is to work well. We have worked with several organisations who include quality control steps in their lesson learned program, and for several years conducted a monthly lessons quality audit for one organisation. For others we have provided lesson quality audit as part of an overall Lesson Learning audit service.

Maybe it’s time you audited the quality of your lessons?


How can we learn lessons when every project is different?

This is another one to add to the “Common Knowledge Management Objections” list, and it’s worth thinking in advance what your counter-argument might be.

It’s a push-back you hear quite often in project organisations:

“We can’t do Knowledge Management, especially lessons learned, as all our projects are different”.

I last heard this from a technology company, and by saying “every project is different”, they mean that “every project has a different client, different product, different technical specifications”.

To some extent, they are correct, and this project variation reduces some of the impact of lesson learning. Certainly lessons add the most value when projects are the most similar. But even when projects change, learning still adds value.

Firstly, even on those technology projects, the process will be the same. 

The process of building and understanding the client requirements, choosing and forming the team, selecting and managing sub contractors, balancing the innovation against the risk, communicating within the team, keeping the client requirements always in mind, managing quality, managing cost, managing time, managing expectations, managing risk, and so on.

There is a huge amount of learning to be done about the process of a project, even when the tasks are different.

Secondly, the other common factor for this technology company was that every project was a variant on their existing product. 

They learned a lot about the way the product worked, and the technology behind the product, with every new project. If this additional knowledge was not captured, then they would have to rediscover it anew every time.  If the knowledge is captured, then each project is an exploration into the technology, and builds the company understanding of the technology so that new products can be developed in future.

So even if every project is different, every project can still be a learning opportunity. 


Sharing knowledge by video – a firefighting example

The US Wildfire community is an area where Knowledge Management and Lesson Learning have been eagerly embraced, including through the use of video.

The need for Knowledge Management and Lesson Learning is most obvious where the consequences of not learning are most extreme. Firefighting is a prime example – the consequences of failing to learn can be fatal, and firefighters were early adopters of KM. This includes the people who fight the ever-increasing numbers of grass fires and forest fires, known as wildland fires.

The history of lesson learning in the Wildfire community is shown in the video below, including the decision after a major tragedy in 1994 to set up a lesson learned centre to cover wildfire response across the whole of the USA.

The increase in wildland fires in the 21st century made it obvious to all concerned that the fire services needed to learn quickly, and the Wildland Lessons Learned Center began to introduce a number of activities, such as After Action Reviews, and the collection of lessons from across the whole of the USA. A national wildfire “corporate university” is planned, of which the Lessons Learned Center will form a part.

The wildfire lessons center can be found here, and this website includes lesson learned reports from various fires, online discussions, a blog (careful – some of the pictures of chainsaw incidents are a bit gruesome), a podcast, a set of resources such as recent advances in fire practice, a searchable incident database, a directory of members, and the ability to share individual lessons quickly. This is a real online community of practice.

Many of the lessons collected from fires are available as short videos published on the Wildland Lessons Center YouTube channel and available to firefighters on handheld devices. An example is the lesson video below, which shares lessons from a particular fire and speaks directly to the firefighter, asking them to imagine themselves in a particular situation: the “close call” deployment of a fire shelter during the Ahorn fire in 2011, including material recorded from people actually caught up in the situation.

Sometimes lessons can be drawn from multiple incidents, and combined into guidance. Chainsaw refueling operations are a continual risk during tree felling to manage forest fires, as chainsaw fuel tanks can become pressurised, spraying the operator with gasoline when the tank is opened (the last thing you want in the middle of a fire). Lessons from these incidents have been combined into the instructional video below.

This video library is a powerful resource, with a very serious aim – to save lives in future US Wildland fires. 


Lesson learning at NASA – more details

NASA has a well-developed Lesson Learning system – here are more details.

Image from Wikimedia commons

I blogged recently about lesson learning at NASA, based on a report from a few years ago, and observing that the NASA LLIS system seemed to be a passive database where lessons were left until someone came looking.

As a result of this post I was invited to join a NASA webinar on lesson learning, which you can review here, and which provides a more up to date overview of the NASA approach to lesson learning. Here are my take-aways (and thank you, Barbara, for the opportunity to attend).

Each NASA project is required to conduct lessons capture meetings, which they call “Pause and Learn”.  These Pause and Learn meetings generally use an external facilitator.  Lessons are entered into LLIS in a standard template, which contains the following sections:

  • Subject 
  • Driving Event 
  • Lesson(s) Learned 
  • Recommendation(s) (there is some variation in the way that Recommendations are differentiated from Lessons)
  • Evidence of Recurrence Control Effectiveness
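The template sections lend themselves to a simple record structure. Here is a sketch: the field names follow the list above, while the class name and example values are my own invention.

```python
from dataclasses import dataclass, field

@dataclass
class LLISEntry:
    """One lesson in the LLIS standard template."""
    subject: str
    driving_event: str
    lessons_learned: list
    recommendations: list = field(default_factory=list)
    recurrence_control_evidence: str = ""  # Evidence of Recurrence Control Effectiveness

# Hypothetical example entry.
entry = LLISEntry(
    subject="Connector corrosion on ground support equipment",
    driving_event="Pre-launch inspection finding",
    lessons_learned=["Inspect connectors after coastal storage"],
    recommendations=["Add a connector inspection step to the pre-launch checklist"],
)
```

The empty recurrence-control field is the interesting one: it stays blank until someone can show the lesson has actually been embedded, which is what separates a lesson identified from a lesson learned.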

Although LLIS is essentially a passive database, there is an external process to control the recurrence of lessons, and many lessons seem to be referenced or referred to in standards and guidance. However even when the lesson has been referenced in standards it still remains in the database, and LLIS contains lessons all the way back to the Apollo program. I submitted a question to the webinar about how NASA deals with the archival of embedded, obsolete or duplicate lessons, but this was not one of the questions selected for discussion.

Some parts of NASA take the lesson management process further. Dr Jennifer Stevens, the Chief Knowledge Integrator of the Marshall Space Flight Center, described the work of the distilling team, who look through the database of lessons and distill out the common factors and underlying issues which need correction. They see lessons as an immediate feedback system from operations, and they compartmentalise and group lessons until they can identify a corrective action; often updating a policy or guidance document as a result. Some lessons, which they can’t act on immediately, go into what they call a Stewpot, where they look for trends over time. A lesson, or type of lesson, which is seen many times is indicative of some sort of systemic or cultural issue which may merit action.

Projects at NASA are required to create a Knowledge Management plan, which they refer to as a Lesson Learning Plan, as described by Barbara Fillip, KM lead at Goddard Space Flight Center. This plan documents:

  • How the project intends to learn from others
  • How the project intends to learn through its lifecycle
  • How the project will share lessons with others.
The plan is built on a basic template of 3 pages, one for each section, and there is no requirement for a planning meeting. Each project completes the plan in their own way. This is similar to the Knoco KM plan – drop me a message if you want a copy of our free KM plan template.

A few more snippets I picked up:

NASA, in their Pause and learn sessions, use “We” language rather than “They” language. The conversation is all about what WE did, and what WE should do, rather than what THEY did and how THEY need to fix it.

A motto they use to promote Learning before doing is “Get smart before you start”.

NASA do not refer to success and failure in their Lesson Learning system – they talk about Events. An Event is what happened – a Mistake or Failure or Success is just a label we put onto events.  NASA seeks to learn from all events.

In conclusion, the NASA lesson learning system is a well-developed Level 2 system, and lessons are used to systematically drive change. Although LLIS does not seem to have the functionality to automate this driving of change, there are enough resources, such as the distilling team, to do this manually.


Lesson learning roles in the RCAF

Roles and Governance are often overlooked elements of KM. Here is a great example of a set of roles and accountabilities for Lesson learning within the Royal Canadian Air Force.

The example is taken from a web page dated 2015 called “Canadian Forces Aerospace Warfare Centre, Analysis and Lessons Learned“.

The RCAF have the following roles and accountabilities, shown in the diagram to the right, and described below:

  • A senior sponsor, known as the Lessons Learned Command Authority – this is actually the Commander of the RCAF, and is accountable to the Vice Chief of the Defence Staff for implementing and overseeing the Lesson Learned Programme. Note that the Chief of Defence Staff requires the RCAF to establish processes that add value to the existing body of knowledge, or attempt to correct deficiencies in concepts, policy, doctrine, training, equipment or organizations, and the Lessons Learned Programme is one response to this requirement.
  • A delegated customer/custodian for the Lesson learned program known as the “Lesson Learned programme Authority”. This is the Deputy Commander, who is responsible for all Air Lessons Learned matters, including maintenance and periodic review of the programme. 
  • A leader for the Lesson Learned program, called the Lessons-Learned Technical Authority. This is the Commanding Officer of the Canadian Forces Aerospace Warfare Centre, who reports to the Lesson Learned Programme Authority for lessons-learned matters, and who is responsible for executing all aspects of the programme with the help of a dedicated Analysis and Lesson Learned team.
  • Clear accountabilities for the leaders of the various divisions in their roles as Lessons Learned Operational Authorities, to effectively operationalize and implement the programme within their command areas of responsibility.
  • Each of these appoints a Lessons Learned point of contact to coordinate the Lessons Learned activities and functions for their organization, as well as to address issues that have been forwarded along the chain of command.
  • Wing Lessons-Learned Officers embedded in the organisation at wing and formation levels, who provide Lesson learning advice to the wing commander related to missions and mission-support activities.
  • Unit Lessons-Learned Officers within the RCAF units who coordinate the documentation and communication of what has been learned during daily activities; liaising directly with their relevant Wing Lessons-Learned Officer. These are like the Lesson Learned  Integrators in the US Army.
You can see how accountability for lesson learning comes down the chain of command (the red boxes in the diagram) from the RCAF Commander right down to Unit level, and how enabling and supporting roles are created at many levels – the LL Programme, the Divisional points of contact, the Wing LLOs and the Unit LLOs.

The principle of delegated accountability down the line management chain, enabled by supporting resources, is a good one, which can be applied in many organisational settings.


The importance of Reflection in KM

We don’t learn by doing, we learn by reflecting on doing, which is why your KM program should include reflective processes.

Kolb learning cycle. Public domain image from Wikipedia

There is a popular quote on the Internet, often attributed to John Dewey, that “We do not learn from an experience … We learn from reflecting on an experience“. It probably isn’t from Dewey, although it is a summarisation of Dewey’s teaching, but it does make the point that no matter how broad your experience, it doesn’t mean you have learned anything.

Let’s match that with another Internet quote, attributed to Paul Schoemaker: “Experience is inevitable, Learning is not“. Without reflection, and without change as a result of that reflection, nothing has been learned no matter how many experiences you have.

“Observation and Reflection” is also a part of the Kolb learning cycle, shown here, which is a well-established model for how individuals learn.

Knowledge Management is not so much about Individual Learning as about Organisational Learning and Team Learning, but reflection is just as important in the KM context. Reflection needs to be introduced into work practice through reflective processes such as the following:

  • Introducing After Action Review as a reflective team process, to allow teams to reflect on, and collectively learn from, actions and activities;
  • Introducing Retrospect as a reflective team process, to allow teams to reflect on, and collectively learn from, projects and project phases;
  • Introducing processes such as Knowledge Exchange to allow members of a Community of Practice to reflect on, and share, their practice;
  • Introducing processes such as knowledge interviews to guide experts through structured reflection on their own experience, and to make this public for others.

It is only through introducing reflective processes such as these, and then acting on the new knowledge gained, that your organisation will stop just Experiencing, and start Learning.


A key lesson-learning role in the military setting

Lesson Learning is well embedded in the United States Army and forms a model which industry can emulate, especially when it comes to assigning knowledge management roles within the business.

As explained in this excellent analysis from Nancy Dixon, lesson learning works well in the US Army. This article describes some of the components of the learning system they apply, and mentions processes such as After Action Reviews and Learning Interviews, but also mentions different roles with accountability for the lessons process. One of the key roles is the Lessons Learned Integrator, or L2I.

The Lessons Learned Integrator role

The Center for Army Lessons Learned is deploying Lessons Learned Integrators in operational units and in other units such as training schools and doctrine centres. These L2I analysts gather lessons learned, research requests for information (RFI), and support the unit within which they are situated. They act as conduits for lessons in and out of the units. You can find various role descriptions for this post (e.g. this one), which suggest that the role primarily involves:
  • Collecting, reporting, and disseminating lessons from the host unit
  • Monitoring lessons learned and other new knowledge from elsewhere, assessing it for relevance to the host unit, and “pushing” it to the correct people
  • Initiating actions that lead to change recommendations
  • Locally supporting the “Request for Information” process, where soldiers can make requests for information from the Center for Army Lessons Learned.
In many of the support centres, the L2I analyst also has a role in developing doctrine, as described here:

  • The L2I analyst can derive information from a variety of sources: unit after-action reports; tactics, techniques, and procedures used by units in and returning from theater; Soldier observations/submissions to the Engineer School; and requests for information. 
  • This information is used to conduct doctrine, organization, training, materiel, leadership and education, personnel, and facilities gap analyses, and to determine solutions.

As ever, Industry can learn from the Military.

Too often we see “Lessons Learned systems” which seem to have no roles or accountabilities assigned to them. The assumption seems to be that “everyone is responsible for lessons learned”, which quickly becomes “someone else will do it”, then “nobody is responsible”. The Army avoid this by identifying specific pivotal roles for identification, communication and analysis of Lessons, and for identifying what needs to be done as a result.

If you want your Lessons Learned system to really work, then you may need roles similar to the L2I in your operational units.


Three levels of Lesson Learning

Here are a couple of reprised blog posts from 5 years ago, covering the topic of lesson learning, and presenting 3 potential levels of maturity for a learning system. Most organisations are stuck at level 1.

There are three levels of rigour, or levels of maturity, regarding how Lesson-learning is applied in organisations.

  • The first is to reactively capture and document lessons for others to find and read
  • The second is to reactively capture lessons at the end of projects, document them, and as a result make changes to company procedures and practices
  • The third is to proactively hunt lessons from wherever they can be found, and make changes to company procedures and practices so that the lessons are embedded into practice. 

Lesson-learning can be a very powerful way for an organisation to learn, change and adapt, but only if it is approached in a mature way. Level 1, to be honest, will not add much value, and it’s only when you get to level 2 that lesson learning really starts to make a difference.

Let’s look at those levels in more detail.

Level 1

Level 1  is to reactively capture lessons at the end of projects, and document them so that others can learn. Lessons are stored somewhere, and people need to find and read the lesson in order to access the knowledge. There are sub-levels of maturity in level 1, which include

1a) Ad-hoc capture of lessons, often by the project leader, documenting them and storing them in project files with no quality control or validation step. Lessons must therefore be sought by reading project reports, or browsing project file structures.

1b) Structured capture of lessons, through lessons identification meetings such as retrospects, documenting and storing the lessons in project files with no quality control or validation step.

1c) Structured capture of lessons, through lessons identification meetings such as retrospects, documenting and storing the lessons in a company-wide system such as a lessons database or a wiki. This often includes a validation step.

1d) Structured capture of lessons, through lessons identification meetings such as retrospects, documenting and storing the lessons in a company-wide system with auto-notification, so that people can self-nominate to receive specific lessons.
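The auto-notification of level 1d can be sketched as a simple subscription mechanism, where individuals self-nominate to follow a tag and are notified when a matching lesson is stored. All names below are hypothetical.

```python
from collections import defaultdict

# Self-nominated subscriptions: tag -> set of people following it (level 1d).
subscribers = defaultdict(set)

def follow(person: str, tag: str) -> None:
    """A person self-nominates to receive lessons of this type."""
    subscribers[tag].add(person)

def notify(lesson_title: str, tags: list) -> set:
    """Return everyone who should be told about a newly stored lesson."""
    return {person for tag in tags for person in subscribers[tag]}

follow("carol", "drilling")
follow("dan", "logistics")
recipients = notify("Mud losses in fractured zones", ["drilling"])
```

Note that even at level 1d the system is still passive in one important sense: the recipients have to read the lesson and decide for themselves what to do with it.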

Level 2

Level 2 is to reactively capture lessons at the end of projects, document them, and as a result make changes to company procedures and practices so that the lessons are embedded into practice. Here people do not need to read the lesson to access the knowledge, they just need to follow the practice. Again, there are sub-levels of maturity in level 2, which include

2a) Lessons are forwarded (ideally automatically, by a lesson management system) to the relevant expert for information, with the expectation that they will review them and incorporate them into practice.

2b) Lessons include assigned actions for the relevant expert, and are auto-forwarded to the expert for action.

2c) As 2b, with the addition that the actions are tracked and reported.
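A minimal sketch of the level 2 behaviours: a lesson is auto-forwarded to the relevant expert (2a), an action is assigned to that expert (2b), and open actions can be tracked and reported (2c). The routing table and all names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    description: str
    owner: str           # the relevant expert (2b)
    done: bool = False   # completion is tracked and reported (2c)

@dataclass
class Lesson:
    title: str
    topic: str
    actions: list = field(default_factory=list)

# Routing table: which expert owns which topic (2a auto-forwarding).
experts = {"procurement": "alice", "commissioning": "bob"}

def route(lesson: Lesson) -> str:
    """Forward the lesson to the topic expert and assign an embedding action."""
    owner = experts[lesson.topic]
    lesson.actions.append(Action(f"Embed '{lesson.title}' into practice", owner))
    return owner

def open_actions(lessons: list) -> list:
    """2c: report all actions not yet closed out."""
    return [a for les in lessons for a in les.actions if not a.done]

lesson = Lesson("Order long-lead spares early", topic="procurement")
route(lesson)
```

The design choice that marks level 2 is that the expert, not the reader, carries the accountability: the lesson arrives with an action attached, and the open-actions report makes any failure to embed it visible.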

Level 3

Level 3 is to proactively hunt lessons from wherever they can be found, and make changes to company procedures and practices so that the lessons are embedded into practice. There are not enough organisations at level 3 to recognise sub-levels, but there are some ways in which Level 3 can operate:

3a) Senior managers can identify priority learning areas for the organisation. Projects are given learning objectives – objectives for gathering knowledge and lessons on behalf of the organisation. These may be written into project knowledge management plans.

3b) Learning teams may analyse lessons over a period of months or years to look for the common themes and the underlying trends – the weak signals that operational lessons may mask.

3c) Organisations may deploy specific learning resources (Learning Engineers, Project Historians, etc) into projects or activity, in order to pick up specific learning for the organisation.

I have only really come across level 3 in the military. For example, see this quote from Lieutenant-General Paul Newton:

The key is to ‘hunt’ not ‘gather’ lessons, apply them rigorously—and only when you have made a change have you really learned a lesson. And it applies to everyone … It is Whole Army business. 

However, even level 2 is quite rare. Many organisations have not gone beyond the “document and store” stages of Level 1, and have generally been disappointed by the outcomes.

If you aspire to be a learning organisation, set your sights at levels 2 or 3.

