Lesson Quality in Knowledge Management

Lesson quality is a crucial component of lesson learning. Poor-quality lessons just lead to garbage in, garbage out.

I came across an interesting article recently entitled “Enhancing Quality of Lessons Learned” by Lo and Fong. The authors look at lessons learned and how effective they are as a mechanism for transferring knowledge, and identify the following areas where particular attention needs to be paid when recording lessons. These are:

  • The Meta-Knowledge – the way in which the lesson is tagged and labelled (including the organisational unit affected, the applicability, and the importance of the lesson)
  • Taking into account the needs of the users/learners
  • Comprehensibility and clarity of the lesson (selecting words that are unambiguous, and free of jargon)
  • The validity of the reasoning behind the lesson – the “Why” behind the lesson.

The authors point particularly to the last issue, and say the following

“Since curiosity is a good motivator for learning, knowing the reasons why past practices succeeded or failed is essential for encouraging users to gain and share knowledge that contributes to organizational learning. It is argued that Lessons Learned should provide the rationales behind the lessons, fostering users’ reflection and extension of the application of lessons to other situations”.

This comes back to a point I made last week about capturing the Why.

It also makes the point that lesson quality is important if the lessons-learned component of KM is to work well. We have worked with several organisations that include quality control steps in their lessons-learned programs, and for several years conducted a monthly lesson quality audit for one organisation. For others we have provided lesson quality audits as part of an overall Lesson Learning audit service.

Maybe it’s time you audited the quality of your lessons?

View Original Source (nickmilton.com) Here.

How can we learn lessons when every project is different?

This is another one to add to the “Common Knowledge Management Objections” list, and it’s worth thinking in advance what your counter-argument might be.

It’s a push-back you hear quite often in project organisations:

“We can’t do Knowledge Management, especially lessons learned, as all our projects are different”.

I last heard this from a technology company, and by saying “every project is different”, they meant that “every project has a different client, different product, different technical specifications”.

To some extent, they are correct, and this project variation reduces some of the impact of lesson learning. Certainly lessons add the most value when projects are the most similar. But even when projects change, learning still adds value.

Firstly, even on those technology projects, the process will be the same: building and understanding the client requirements, choosing and forming the team, selecting and managing subcontractors, balancing innovation against risk, communicating within the team, keeping the client requirements always in mind, managing quality, managing cost, managing time, managing expectations, managing risk, and so on.

There is a huge amount of learning to be done about the process of a project, even when the tasks are different.

Secondly, the other common factor for this technology company was that every project was a variant on their existing product. 

They learned a lot about the way the product worked, and the technology behind it, with every new project. If this additional knowledge was not captured, they would have to rediscover it anew every time. If the knowledge is captured, then each project becomes an exploration into the technology, and builds the company's understanding of the technology so that new products can be developed in future.

So even if every project is different, every project can still be a learning opportunity. 

View Original Source (nickmilton.com) Here.

Sharing knowledge by video – a firefighting example

The US Wildfire community is one where Knowledge Management and Lesson Learning have been eagerly embraced, including the use of video.

The need for Knowledge Management and Lesson Learning is most obvious where the consequences of not learning are most extreme. Firefighting is a prime example of this – the consequences of failing to learn can be fatal, and firefighters were early adopters of KM. This includes the people who fight the ever-increasing numbers of grass fires and forest fires, known as wildland fires.

The history of lesson learning in the Wildfire community is shown in the video below, including the decision after a major tragedy in 1994 to set up a lesson learned centre to cover wildfire response across the whole of the USA.

The increase in wildland fires in the 21st century made it obvious to all concerned that the fire services needed to learn quickly, and the Wildland Lessons Learned Center began a number of activities, such as After Action Reviews and collecting lessons from across the whole of the USA. A national wildfire “corporate university” is planned, of which the Lessons Learned Center will form a part.

The wildfire lessons center can be found here, and this website includes lesson learned reports from various fires, online discussions, a blog (careful – some of the pictures of chainsaw incidents are a bit gruesome), a podcast, a set of resources such as recent advances in fire practice, a searchable incident database, a directory of members, and the ability to share individual lessons quickly. This is a real online community of practice.

Many of the lessons collected from fires are available as short videos published on the Wildland Lessons Center YouTube channel, accessible to firefighters on handheld devices. These videos share lessons from a particular fire and speak directly to the firefighter, asking them to imagine themselves in a particular situation. See the example below from the “close call” deployment of a fire shelter during the Ahorn fire in 2011, which includes material recorded from people actually caught up in the situation.

Sometimes lessons can be drawn from multiple incidents, and combined into guidance. Chainsaw refueling operations are a continual risk during tree felling to manage forest fires, as chainsaw fuel tanks can become pressurised, spraying the operator with gasoline when the tank is opened (the last thing you want in the middle of a fire). Lessons from these incidents have been combined into the instructional video below.

This video library is a powerful resource, with a very serious aim – to save lives in future US Wildland fires. 

View Original Source (nickmilton.com) Here.

Lesson learning at NASA – more details

NASA has a well-developed Lesson Learning system – here are more details.

Image from Wikimedia Commons

I blogged recently about lesson learning at NASA, based on a report from a few years ago, and observing that the NASA LLIS system seemed to be a passive database where lessons were left until someone came looking.

As a result of this post I was invited to join a NASA webinar on lesson learning, which you can review here, and which provides a more up-to-date overview of the NASA approach to lesson learning. Here are my take-aways (and thank you, Barbara, for the opportunity to attend).

Each NASA project is required to conduct lessons-capture meetings, which they call “Pause and Learn”. These Pause and Learn meetings generally use an external facilitator. Lessons are entered into LLIS in a standard template, which contains the following sections:

  • Subject 
  • Driving Event 
  • Lesson(s) Learned 
  • Recommendation(s) (there is some variation in the way that Recommendations are differentiated from Lessons)
  • Evidence of Recurrence Control Effectiveness
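As an illustration only (this is not NASA's actual schema – the field names simply mirror the template sections listed above, and the example entry is invented), a lesson record of this kind could be represented as a small data structure:

```python
from dataclasses import dataclass, field

@dataclass
class Lesson:
    """One lesson record; fields mirror the LLIS template sections above."""
    subject: str
    driving_event: str
    lessons_learned: list[str]
    recommendations: list[str] = field(default_factory=list)
    recurrence_control_evidence: str = ""  # filled in once a fix is verified

# A hypothetical example entry
lesson = Lesson(
    subject="Thermal vacuum test scheduling",
    driving_event="Shared test chamber was double-booked late in integration",
    lessons_learned=["Book shared facilities at project start, not at need"],
    recommendations=["Add facility booking to the project start-up checklist"],
)
```

A structured record like this is what makes the later steps – validation, notification and trend analysis – practical, because every lesson exposes the same fields.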

Although LLIS is essentially a passive database, there is an external process to control the recurrence of lessons, and many lessons seem to be referenced in standards and guidance. However, even when a lesson has been referenced in standards it still remains in the database, and LLIS contains lessons all the way back to the Apollo program. I submitted a question to the webinar about how NASA deals with the archival of embedded, obsolete or duplicate lessons, but this was not one of the questions selected for discussion.

Some parts of NASA take the lesson management process further. Dr Jennifer Stevens, the Chief Knowledge Integrator of the Marshall Space Flight Center, described the work of the distilling team, who look through the database of lessons and distill out the common factors and underlying issues which need correction. They see lessons as an immediate feedback system from operations, and they compartmentalise and group lessons until they can identify a corrective action, often updating a policy or guidance document as a result. Some lessons, which they can’t act on immediately, go into what they call a Stewpot, where they look for trends over time. A lesson, or type of lesson, which is seen many times is indicative of some sort of systemic or cultural issue which may merit action.

Projects at NASA are required to create a Knowledge Management plan, which they refer to as a Lesson Learning Plan, as described by Barbara Fillip, KM lead at Goddard Space Flight Center. This plan documents:

  • How the project intends to learn from others
  • How the project intends to learn through its lifecycle
  • How the project will share lessons with others.
The plan is built on a basic template of 3 pages, one for each section, and there is no requirement for a planning meeting. Each project completes the plan in their own way. This is similar to the Knoco KM plan – drop me a message if you want a copy of our free KM plan template.

A few more snippets I picked up:

NASA, in their Pause and Learn sessions, use “We” language rather than “They” language. The conversation is all about what WE did, and what WE should do, rather than what THEY did and how THEY need to fix it.

A motto they use to promote Learning before doing is “Get smart before you start”.

NASA do not refer to success and failure in their Lesson Learning system – they talk about Events. An Event is what happened – a Mistake or Failure or Success is just a label we put onto events.  NASA seeks to learn from all events.

In conclusion, the NASA lesson learning system is a well-developed Level 2 system, and lessons are used to systematically drive change. Although LLIS does not seem to have the functionality to automate this driving of change, there are enough resources, such as the distilling team, to be able to do this manually.

View Original Source (nickmilton.com) Here.

Lesson learning roles in the RCAF

Roles and Governance are often overlooked elements of KM. Here is a great example of a set of roles and accountabilities for Lesson learning within the Royal Canadian Air Force.

The example is taken from a web page dated 2015 called “Canadian Forces Aerospace Warfare Centre, Analysis and Lessons Learned”.

The RCAF have the following roles and accountabilities, shown in the diagram to the right, and described below:

  • A senior sponsor, known as the Lessons Learned Command Authority – this is actually the Commander of the RCAF, and is accountable to the Vice Chief of the Defence Staff for implementing and overseeing the Lesson Learned Programme. Note that the Chief of Defence Staff requires the RCAF to establish processes that add value to the existing body of knowledge, or attempt to correct deficiencies in concepts, policy, doctrine, training, equipment or organizations, and the Lessons Learned Programme is one response to this requirement.
  • A delegated customer/custodian for the Lesson learned program known as the “Lesson Learned programme Authority”. This is the Deputy Commander, who is responsible for all Air Lessons Learned matters, including maintenance and periodic review of the programme. 
  • A leader for the Lesson Learned program, called the Lessons-Learned Technical Authority. This is the Commanding Officer of the Canadian Forces Aerospace Warfare Centre, who reports to the Lesson Learned Programme Authority for lessons-learned matters, and who is responsible for executing all aspects of the programme with the help of a dedicated Analysis and Lesson Learned team.
  • Clear accountabilities for the leaders of the various divisions in their roles as Lessons Learned Operational Authorities, to effectively operationalize and implement the programme within their command areas of responsibility.
  • Each of these appoints a Lessons Learned point of contact to coordinate the Lessons Learned activities and functions for their organizations, as well as to address issues that have been forwarded along the chain of command.
  • Wing Lessons-Learned Officers embedded in the organisation at wing and formation levels, who provide Lesson learning advice to the wing commander related to missions and mission-support activities.
  • Unit Lessons-Learned Officers within the RCAF units who coordinate the documentation and communication of what has been learned during daily activities; liaising directly with their relevant Wing Lessons-Learned Officer. These are like the Lesson Learned  Integrators in the US Army.
You can see how accountability for lesson learning comes down the chain of command (the red boxes in the diagram) from the RCAF Commander right down to Unit level, and how enabling and supporting roles are created at many levels – the LL Programme, the Divisional points of contact, the Wing LLOs and the Unit LLOs.

The principle of delegated accountability down the line management chain, enabled by supporting resources, is a good one which can be applied in many organisational settings.

View Original Source (nickmilton.com) Here.

The importance of Reflection in KM

We don’t learn by doing, we learn by reflecting on doing, which is why your KM program should include reflective processes.

Kolb learning cycle. Public domain image from Wikipedia

There is a popular quote on the Internet, often attributed to John Dewey, that “We do not learn from an experience … We learn from reflecting on an experience“. It probably isn’t from Dewey, although it is a summarisation of Dewey’s teaching, but it does make the point that no matter how broad your experience, it doesn’t mean you have learned anything.

Let’s match that with another Internet quote, attributed to Paul Schoemaker: “Experience is inevitable, Learning is not“. Without reflection, and without change as a result of that reflection, nothing has been learned, no matter how many experiences you have.

“Observation and Reflection” is also a part of the Kolb learning cycle, shown here, which is a well-established model for how individuals learn.

Knowledge Management is not so much about Individual Learning as about Organisational Learning and Team Learning, but reflection is just as important in the KM context. Reflection needs to be introduced to work practice through reflective processes such as the following:

  • Introducing After Action Review as a reflective team process, to allow teams to reflect on, and collectively learn from, actions and activities;
  • Introducing Retrospect as a reflective team process, to allow teams to reflect on, and collectively learn from, projects and project phases;
  • Introducing processes such as Knowledge Exchange to allow members of a Community of Practice to reflect on, and share, their practice;
  • Introducing processes such as knowledge interviews to guide experts through structured reflection on their own experience, and to make this public for others.

It is only through introducing reflective processes such as these, and then acting on the new knowledge gained, that your organisation will stop just Experiencing, and start Learning.

View Original Source Here.

A key lesson-learning role in the military setting

Lesson Learning is well embedded in the United States Army and forms a model which industry can emulate, especially when it comes to assigning knowledge management roles within the business.

Image from www.Army.mil

As explained in this excellent analysis from Nancy Dixon, lesson learning works well in the US Army. This article describes some of the components of the learning system they apply, and mentions processes such as After Action Reviews and Learning Interviews, but also mentions different roles with accountability for the lessons process. One of the key roles is the Lessons Learned Integrator, or L2I.

The Lessons Learned Integrator role

The Center for Army Lessons Learned is deploying Lessons Learned Integrators in operational units and in other units such as training schools and doctrine centres. These L2I analysts gather lessons learned, research requests for information (RFIs), and support the unit within which they are situated. They act as conduits for lessons in and out of the units. You can find various role descriptions for this post (e.g. this one), which suggest that the role primarily involves:
  • Collecting, reporting, and disseminating lessons from the host unit
  • Monitoring lessons learned and other new knowledge from elsewhere, assessing it for relevance to the host unit, and “pushing” it to the correct people
  • Initiating actions that lead to change recommendations
  • Locally supporting the “Request for Information” process, where soldiers can make requests for information from the Center for Army Lessons Learned.
In many of the support centres, the L2I analyst also has a role in developing doctrine, as described here:

  • The L2I analyst can derive information from a variety of sources: unit after-action reports; tactics, techniques, and procedures used by units in and returning from theater; Soldier observations/submissions to the Engineer School; and requests for information. 
  • This information is used to conduct doctrine, organization, training, materiel, leadership and education, personnel, and facilities gap analyses, and to determine solutions.

As ever, Industry can learn from the Military.

Too often we see “Lessons Learned systems” which seem to have no roles or accountabilities assigned to them. The assumption seems to be that “everyone is responsible for lessons learned”, which quickly becomes “someone else will do it”, and then “nobody is responsible”. The Army avoids this by identifying specific pivotal roles for the identification, communication and analysis of lessons, and for identifying what needs to be done as a result.

If you want your Lessons Learned system to really work, then you may need roles similar to the L2I in your operational units.

View Original Source Here.

Three levels of Lesson Learning

Here are a couple of reprised blog posts from 5 years ago, covering the topic of lesson learning, and presenting 3 potential levels of maturity for a learning system. Most organisations are stuck at level 1.

There are three levels of rigour, or levels of maturity, regarding how Lesson-learning is applied in organisations.

  • The first is to reactively capture and document lessons for others to find and read
  • The second is to reactively capture lessons at the end of projects, document them, and as a result make changes to company procedures and practices
  • The third is to proactively hunt lessons from wherever they can be found, and make changes to company procedures and practices so that the lessons are embedded into practice. 

Lesson-learning can be a very powerful way for an organisation to learn, change and adapt, but only if it is approached in a mature way. Level 1, to be honest, will not add much value; it’s only when you get to Level 2 that lesson learning really starts to make a difference.

Let’s look at those levels in more detail.

Level 1

Level 1 is to reactively capture lessons at the end of projects, and document them so that others can learn. Lessons are stored somewhere, and people need to find and read the lesson in order to access the knowledge. There are sub-levels of maturity in Level 1, which include:

1a) Ad-hoc capture of lessons, often by the project leader, documenting them and storing them in project files with no quality control or validation step. Lessons must therefore be sought by reading project reports, or browsing project file structures.

1b) Structured capture of lessons, through lessons identification meetings such as retrospects, documenting and storing the lessons in project files with no quality control or validation step.

1c) Structured capture of lessons, through lessons identification meetings such as retrospects, documenting and storing the lessons in a company-wide system such as a lessons database or a wiki. This often includes a validation step.

1d) Structured capture of lessons, through lessons identification meetings such as retrospects, documenting and storing the lessons in a company-wide system with auto-notification, so that people can self-nominate to receive specific lessons.

Level 2

Level 2 is to reactively capture lessons at the end of projects, document them, and as a result make changes to company procedures and practices so that the lessons are embedded into practice. Here people do not need to read the lesson to access the knowledge; they just need to follow the practice. Again, there are sub-levels of maturity in Level 2, which include:

2a) Lessons are forwarded (ideally automatically, by a lesson management system) to the relevant expert for information, with the expectation that they will review them and incorporate them into practice.

2b) Lessons include assigned actions for the relevant expert, and are auto-forwarded to the expert for action.

2c) As 2b, with the addition that the actions are tracked and reported.
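As a minimal sketch of what sub-levels 2b and 2c describe (all names, email addresses and the owner lookup are hypothetical, not taken from any real system), a lesson management system auto-forwards each lesson's action to the relevant expert and then reports on which actions remain open:

```python
from dataclasses import dataclass

# Hypothetical routing table: which expert owns actions for each discipline.
OWNERS = {"procurement": "alice@example.com", "quality": "bob@example.com"}

@dataclass
class LessonAction:
    lesson_id: int
    discipline: str
    owner: str = ""
    closed: bool = False

def route(action: LessonAction) -> LessonAction:
    """Level 2b: auto-forward the action to the relevant expert."""
    action.owner = OWNERS.get(action.discipline, "km-team@example.com")
    return action

def open_actions(actions: list[LessonAction]) -> list[LessonAction]:
    """Level 2c: track and report actions not yet closed out."""
    return [a for a in actions if not a.closed]

actions = [route(LessonAction(1, "procurement")), route(LessonAction(2, "quality"))]
actions[0].closed = True
print([(a.lesson_id, a.owner) for a in open_actions(actions)])
# → [(2, 'bob@example.com')]
```

The point of the sketch is the separation of duties: routing happens automatically at capture time, while the open-actions report is what turns a lesson into tracked, accountable change.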

Level 3

Level 3 is to proactively hunt lessons from wherever they can be found, and make changes to company procedures and practices so that the lessons are embedded into practice. There are not enough organisations at Level 3 to recognise sub-levels, but there are some ways in which Level 3 can operate:

3a) Senior managers can identify priority learning areas for the organisation. Projects are given learning objectives – objectives for gathering knowledge and lessons on behalf of the organisation. These may be written into project knowledge management plans.

3b) Learning teams may analyse lessons over a period of months or years to look for the common themes and the underlying trends – the weak signals that operational lessons may mask.

3c) Organisations may deploy specific learning resources (Learning Engineers, Project Historians, etc) into projects or activity, in order to pick up specific learning for the organisation.

I have only really come across Level 3 in the military. For example, see this quote from Lieutenant-General Paul Newton:

The key is to ‘hunt’ not ‘gather’ lessons, apply them rigorously—and only when you have made a change have you really learned a lesson. And it applies to everyone … It is Whole Army business. 

However, even Level 2 is quite rare. Many organisations have not gone beyond the “document and store” stages of Level 1, and have generally been disappointed by the outcomes.

If you aspire to be a learning organisation, set your sights on Level 2 or 3.

View Original Source Here.

The difference between lessons and best practice – another post from the archives

Here is another post from the archives – this time looking at the difference between Best Practice and Lessons Learned.

Someone last week asked me what the difference is between Best Practice and Lessons Learned.

Now I know that some KM pundits don’t like the term “Best Practice”, as it can often be used defensively, but I think there is nothing wrong with the term itself and, if used well, Best Practice can be a very useful concept within a company. So let’s dodge the issue of whether Best Practice is a useful concept, and instead discuss its relationship to lessons learned.

My reply to the questioner was that Best Practice is the amalgamation of many lessons, and it is through their incorporation into Best Practice that they become learned.

If we believe that learning must lead to action, that lessons are the identified improvements in practice, and that the actions associated with lessons are generally practice improvements, then it makes sense that as more and more lessons are accumulated, so practices become better and better.

A practice that represents the accumulation of all lessons is the best practice available at the time, and a practice that is adapted in the light of new lessons will only get better.

View Original Source Here.

Why you need to place some demands on the knowledge sharer

Sharing knowledge is a two-sided process. There is a sharer and a receiver. Be careful that making knowledge easier to share does not make knowledge harder to re-use.

Image from wikimedia commons

Sharing knowledge is like passing a ball in a game of rugby, American Football or basketball. If you don’t place some demands on the thrower to throw well, it won’t work for the catcher; if you make it too undemanding to throw the ball, it becomes too hard to catch. Passing the ball is a skill, and needs to be practised.

The same is true for knowledge. If you make it too simple to share knowledge, you can make it too difficult to find and re-use. In knowledge transfer, sharing is the easier part of the process. There are more barriers to understanding and re-use than there are to sharing, so if you make the burden too light on the knowledge supplier, the burden on the knowledge user can become overwhelming.

Imagine a company that wants to make it easy for projects to share knowledge with other projects. They set up an online structure for doing this, with a simple form and a simple procedure. “We don’t want people to have to write too much”, they say, “because we want to make it as easy as possible for people to share knowledge”.

So what happens? People fill in the form, they put in the bare minimum, they don’t give any context, they don’t tell the story, they don’t explain the lesson. And as a result, almost none of these lessons are re-used. The feedback they get is “these lessons are too generic and too brief to be any use”. We have seen this happen many, many times.

By making the knowledge too easy to share – by demanding too little from the knowledge supplier – you can make the whole process ineffective. 

There can be other unintended consequences as well. Another company had a situation as described above, where a new project enthusiastically filled in the knowledge transfer form with 50 lessons. However this company had put in a quality assurance system for lessons, and found that 47 of the 50 lessons were too simple, too brief and too generic to add value. So they rejected them.

The project team in question felt, quite rightly, that there was no point in spending time capturing lessons if 94% of them were going to be rejected, so they stopped sharing. They became totally demotivated when it came to any further KM activity.

Here you can see some unintended consequences of making things simple. Simple does not equate to effective.

Our advice to this company was to introduce a facilitation role in the local Project Office: someone who could work with the project teams to ensure that lessons are captured with enough detail and context to be of real value. With this approach, each lesson is quality-controlled at source, and there should be no need to reject any lessons.

Don’t make it so simple to share knowledge, that people don’t give enough thought to what they write.

The sharer of knowledge, like the thrower of the ball, needs to ensure that the messages can be effectively passed to the receiver, and this requires a degree of attention and skill. 

View Original Source Here.
