Lesson learning as a supply chain

Another reprise from the archives – the idea of lessons being the “car parts” of knowledge

This post is a combination of two ideas, to see if they come up with something new.

  • Idea number 1 – the idea of an organisation as a knowledge factory, sparked by Lord Browne’s quote – “anyone in the organization who is not directly accountable for making a profit should be involved in creating and distributing knowledge that the company can use to make a profit”  

  • Idea number 2 – the idea that corporate process is a compilation or synthesis of all the lessons learned over time  

So the combined idea looks like this (pictured in the original post as two concentric rings):

The inner ring is a supply chain where components are manufactured, and assembled into products (like a car plant, or a construction site).

The outer ring is the lesson learning cycle, one of the procedural loops in Knowledge Management. Please note that this is only one of the many ways in which KM works; this is the systematic push-driven cycle involving the collection of explicit lessons, and there are many other types of interaction in KM (push and pull, connect and collect).

In our analogy, we have lessons from experience being collected, distributed through lesson management, and assembled into continuously improving corporate processes, rather like car parts are created, distributed, and assembled into cars. The links within this chain are as follows (a small illustrative sketch follows the list):

  1. The raw materials for the supply chain are the experiences of the individuals in the workplace, who are trying to apply the processes in different contexts, in a changing world.
  2. The suppliers of the raw materials are therefore the individuals themselves.
  3. Experiences are manufactured into lessons through processes of analysis and discussion – team meetings such as Retrospects, and After Action Reviews. Through discussion and analysis, individual unconscious knowledge is made conscious, and the experiences of many individuals are combined into the lessons of the team or the lessons from an event. These lessons are the components – the car parts within the supply chain. 
  4. Now we get into the Distribution part of the supply chain. We need to get those parts to the assembly plant. This is where many lessons learned systems break down: they leave those parts (lessons) in the warehouse (database), and expect people to come and find them (remember that scene from Raiders of the Lost Ark?). We need instead to have active lessons management, to push the lessons to those who need them.
  5. Those who need them are (primarily) the people in charge of corporate process – the process owners, or SMEs – who need to keep those processes fresh and updated as new learning comes in.
  6. However, that is not the end of the story. The assembled knowledge needs to get to the consumer – through the equivalent of car showrooms (community portals), supermarkets (Intranets) or street markets (wikis).
  7. The consumer is the knowledge worker. They apply the new knowledge, and in doing so, gain new experience. 
And so the cycle begins again.
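
To make the analogy concrete, here is a minimal sketch in Python. It is purely illustrative – the class names, fields and example data are my own invention, not part of any KM standard – but it shows the shape of the cycle: experiences are the raw material, lessons are the manufactured components, and the corporate process is the continuously improved product.

```python
from dataclasses import dataclass, field

@dataclass
class Experience:
    """Raw material: one individual's experience of applying a process in context."""
    author: str
    context: str
    observation: str

@dataclass
class Lesson:
    """Component: experiences combined through discussion (e.g. a Retrospect or AAR)."""
    topic: str
    recommendation: str
    sources: list  # the Experiences it was manufactured from

@dataclass
class CorporateProcess:
    """Product: a process continuously improved as lessons are assembled into it."""
    name: str
    steps: list = field(default_factory=list)
    revision: int = 1

    def assemble(self, lesson: Lesson) -> None:
        """The process owner (SME) incorporates a lesson pushed to them."""
        self.steps.append(lesson.recommendation)
        self.revision += 1

# One turn of the cycle: experiences -> lesson -> updated process -> new experiences
experiences = [
    Experience("engineer A", "night shift", "handover checklist step 4 is ambiguous"),
    Experience("engineer B", "night shift", "step 4 was misread under time pressure"),
]
lesson = Lesson("shift handover", "reword step 4 with explicit pass/fail criteria", experiences)
process = CorporateProcess("shift handover procedure")
process.assemble(lesson)  # pushed to the process owner, not left in a warehouse
```

One turn of the loop ends where it began: the revised process is applied in the workplace, generating the next batch of experiences.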

View Original Source (nickmilton.com) Here.

Why so much knowledge sharing, so little knowledge seeking?

Knowledge Management requires knowledge seeking and knowledge sharing. But why so much focus in internal processes on sharing and so little on seeking?

[Image: “Learning Happens” by shareski, on Flickr]

One of the standard models for Knowledge Management in project environments is the idea of “Learning Before, During and After”.

Ideally these three activities should be embedded in project process, so that a project

  1. Starts by reviewing and accessing all the knowledge it needs,
  2. Learns as it goes, improving its processes during the course of the project, and
  3. Identifies, analyses, documents and shares the new knowledge it has gained, for the sake of future projects. 
For the project itself, the most powerful of the three is “Learning Before”. If a project can maximise its knowledge up front, especially if the team can discover the things it doesn’t know that it doesn’t know, then success is much more likely. “Learning Before” activities such as Knowledge Gap Analysis, KM planning or Peer Assist can overcome some of the more serious cognitive biases working against KM, and are the nearest thing to KM silver bullets that we have. Learning Before activities drive receptivity, increase absorptive capacity, and help teams “want to learn”.
And yet, when you look at internal company project frameworks, or even at generic frameworks such as PRINCE2 or ISO standards, there is almost always a requirement for capturing and sharing lessons after the project, and no such requirement for Learning Before. According to our global survey, 68% of companies require their projects to do some sort of Learning After, but only 15% require them to do Learning Before. PRINCE2 has a required, and well documented, step at the end for creating lessons (although this could be much improved!), but has no step at project start-up requiring a search for, and review of, existing knowledge.
This astounds me.

Why even bother to collect lessons at the end of a project, if nobody reviews them at the start of the next project?

I think the answer is that it is psychologically easier to share than it is to learn.  A project team can feel proud and recognised (even a little smug at times) for sharing lessons, while asking for lessons can feel like an admission of incompetence (“can anyone help me with this?”). 
Learning After is Teaching – Learning Before is Learning, which is Much Harder. Knowledge reuse is more difficult than knowledge sharing, yet that is all the more reason we should make it a focused and deliberate step. 
You can get around some of these barriers by introducing non-judgemental techniques such as Peer Assist and Knowledge Management plans, which take the exposure out of asking for help or seeking knowledge. And you can also address them by developing a culture of Asking, rather than a culture of Sharing.

Please, let’s introduce the full cycle of Learning Before, During and After, and let’s not skip the Before step!

View Original Source (nickmilton.com) Here.

How emergency services developed a capability for Lessons Management

Lesson Management is a core component of Lesson Learning. Here is the story of how this capability was developed in the Australian emergency services.

This comes from the recent issue of the Australian Journal of Emergency Management, where Heather Stuart and Mark Thomason describe how Lesson Management was first recognised as important, and then developed across the services.

The need for systematic lesson management was recognised in the aftermath of some serious learning events, for example the following:

The South Australian Country Fire Service (CFS) had developed a lessons capacity following the Wangary Fires in 2005. The Wangary fire and other fires on that day were the most destructive fires, in terms of loss of life and property, that the CFS had seen since the Ash Wednesday fires in 1983. Given the losses, community grief and the Coronial inquest into this event, CFS recognised that a more formal approach into learning from these events was required and that the service owed it to the community to demonstrate improvements as soon as possible. This was the first time that a formal approach had been utilised in CFS for collecting, analysing and theming lessons.

Emergency Management Australia – the coordination body for the state emergency services – hosted a conference in 2011 to discuss lesson learning, and found that many of the state bodies were at a similar stage, with the lessons process being driven by one or two passionate individuals. A scan of the current state found:

  • a strong culture of identifying themes, trends and lessons but not much success at ensuring lessons were learnt by creating lasting behaviour change 
  • no consistent model for capturing, analysing, sharing and implementing lessons leading to poorly defined roles, responsibilities and expectations
  • “blame and shame”, although diminished, was still prevalent in some parts of the sector 
  • a lack of visibility in the process of developing lessons, leading to a perception that personal observations and contributions were not influencing change
  • many champions of learning practice in the field but there was a risk of losing momentum because of the perceived information ‘black holes’ 
  • emergency management agencies (e.g. responder agencies, government departments and non- government partners) were working separately on lessons management, creating silos of knowledge and disconnected learning opportunities 
  • there was a limited understanding of principles and benefits of lessons

The first step to address lesson management across the emergency services was to build a practitioners’ network, and then to draft a Lesson Management Handbook (one of the best I have seen, by the way). This led to a standard terminology across the states, and increasingly a standard lesson management model (read about the State of Victoria model here). A common gap – analysis of the collected lessons data – was identified and addressed through training, which led to the development of “national lessons”: lessons which appear across all states.

Through this process, lesson management, both within the individual states and across the nation, was born, defined, standardised and deployed.

As the authors conclude:

Learning lessons as a lessons practitioner is greater than the process itself and an individual agency’s activities… The synergies gained through collaboration between lessons practitioners across the emergency management sector has contributed to strengthening the lessons capability in each of the participating agencies and has resulted in greater achievements in this sphere than agencies would have achieved working in isolation.

View Original Source (nickmilton.com) Here.

The NATO Lessons Learned Portal

The video below is a neat introduction to the concept behind the new Lessons Learned Portal at NATO.

The video is publicly available on the YouTube channel of the JALLC, the Joint Analysis and Lessons Learned Centre at NATO.

The YouTube description is as follows:

The NATO Lessons Learned Portal is the Alliance’s centralized hub for all things related to Lessons learned. It is managed and maintained by the JALLC, acting as NATO’s leading agent for Lessons Learned. 

Observations and Best Practices that may lead to Lessons to be Learned can be submitted to the Portal, and the JALLC will ensure that these Observations find their way through the NATO Lessons Learned Process.

The information shared on the NATO Lessons Learned Portal can help saving lives. The little piece of information you have, may be the fragment missing to understand the bigger problem/solution – make sure you share it.

View Original Source (nickmilton.com) Here.

A case study of a failed learning system

When lesson learning failed in the Australian Defence Force, they blamed the database. But was this all that was at fault?

Here’s an interesting 2011 article entitled “Defence lessons database turns off users”. I have copied some of the text below to show that, even though the lessons management software seems to have been very clumsy (which is what the title of the article suggests), there was much more than the software at fault.

“A Department of Defence database designed to capture lessons learned from operations was abandoned by users who set up their own systems to replace it, according to a recent Audit report. The ADF Activity Analysis Data System (ADFAADS) was defeated by a “cultural bias” within Defence, the auditor found. Information became fragmented as users slowly abandoned the system”.

So although the article title is “defence lessons database turns off users”, the first paragraph says that it was “defeated by cultural bias”. There’s obviously something cultural at work here…

“Although the auditor found the structure and design of the system conformed to ‘best practice’ for incident management systems, users found some features of the system difficult to use. Ultimately it was not perceived as ‘user‐friendly’, the auditor found. Convoluted search and business rules turned some users against the system”. 

…but it also sounds like a clumsy and cumbersome system.

“In addition, Defence staff turnover meant that many were attempting to use ADFAADS with little support and training”.

…with no support and no training.

“An automatically-generated email was sent to ‘action officers’ listing outstanding issues in the system. At the time of audit, the email spanned 99 pages and was often disregarded, meaning no action was taken to clear the backlog”.

There needs to be a governance system to ensure actions are followed through, but sending a 99-page email? And with no support and follow-up?

 “It was common for issues to be sent on blindly as ‘resolved’ by frontline staff to clear them off ADFAADS, even though they remain unresolved, according to the auditor”.

Again, no governance. There needs to be a validation step for actions, and sign-off for “resolution” should not be devolved to frontline staff.

 “Apart from a single directive issued by Defence in 2007, use of the database was not enforced and there were no sanctions against staff who avoided or misused it”.

There’s the kicker. Use of the lessons system was effectively optional, with no clear expectations, no link to reward or sanction, no performance management. It’s no wonder people stopped using it.

So it isn’t as simple as “database turned off users”. It’s a combination of

  • Poor database
  • Poor notification mechanism
  • No support
  • No training
  • No incentives
  • No governance
  • No checking on actions

It’s quite possible that if the other items had been fixed, people might have persevered with the clumsy database; and it’s even more likely that if they had built a better database without fixing the other deficiencies, people still would not have used it.

What they needed was a lessons management system, not just a database.
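
To illustrate what “a system, not just a database” means in practice, here is a hedged sketch of two of the missing pieces: a validation step before an action can be marked resolved, and targeted follow-up instead of one giant email. The names and workflow are hypothetical – this is not a description of ADFAADS, just of the governance it lacked.

```python
from enum import Enum

class Status(Enum):
    OPEN = "open"
    PROPOSED_RESOLVED = "proposed resolved"
    RESOLVED = "resolved"

class LessonAction:
    def __init__(self, description: str, owner: str):
        self.description = description
        self.owner = owner
        self.status = Status.OPEN
        self.evidence = None
        self.validated_by = None

    def propose_resolution(self) -> None:
        """Frontline staff may propose closure, but cannot sign it off."""
        self.status = Status.PROPOSED_RESOLVED

    def validate(self, validator: str, evidence: str) -> None:
        """A separate validator confirms the action was genuinely completed."""
        if self.status is not Status.PROPOSED_RESOLVED:
            raise ValueError("nothing proposed for validation")
        if not evidence:
            raise ValueError("sign-off requires evidence, not a blind 'resolved'")
        self.evidence = evidence
        self.validated_by = validator
        self.status = Status.RESOLVED

def open_actions_by_owner(actions: list) -> dict:
    """Targeted follow-up: each owner sees only their own unresolved items,
    instead of everyone receiving the same 99-page email."""
    result: dict = {}
    for action in actions:
        if action.status is not Status.RESOLVED:
            result.setdefault(action.owner, []).append(action)
    return result
```

The design point is simply that “resolved” is a state transition guarded by evidence and a named validator, not a flag frontline staff can set to clear their queue.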

So what was the outcome? According to the article,

…establish a clear role and scope for future operational knowledge management repositories, and develop a clear plan for capturing and migrating relevant existing information… prepare a “user requirement” for an enterprise system to share lessons.

In other words – “build a better database and hope they use it”. Sigh.

View Original Source (nickmilton.com) Here.

How the Australian Emergency Services manage lessons

Taken from this document, here is a great insight into lesson management from Emergency Management Victoria. 

Emergency Management Victoria coordinates support for the state of Victoria, Australia during emergencies such as floods, bush fires, earthquakes, pandemics and so on. Core to their success is the effective learning of lessons from various emergencies.
A diagram in the original post summarises their approach to lesson learning, and you can read more in the review document itself, including summaries of the main lessons under 11 themes.
  • They collect Observations from individuals (sometimes submitted online), and from monitoring, formal debriefs, After Action Reviews and major reviews.
  • These observations are analysed by local teams and governance groups to identify locally relevant insights, lessons and actions required to contribute to continuous improvement. These actions are locally coordinated, implemented, monitored and reported. 
  • The State review team also take the observations from all tiers of emergency management, and analyse these for insights, trends, lessons and suggested actions. They then consult with subject matter experts to develop an action plan which will be presented to the Emergency Management Commissioner and Agency Chiefs for approval.
  • The State review team supports the action plan by developing and disseminating supporting materials and implementation products, and will monitor the progress of the action plan.

This approach sees lessons taken through to action both at local level and at State level, and is a very good example of Level 2 lesson learning.

View Original Source (nickmilton.com) Here.

What are the outputs of the KM workstream?

Organisations need a Knowledge workstream as well as a Product/Project workstream. But what are the knowledge outputs?

I have blogged several times about the KM workstream you need in your organisation: the knowledge factory that runs alongside the product factory or the project factory. But what are the outputs or products of the knowledge factory?
The outputs of the product factory are clear – they are designed and manufactured products being sold to customers. The outputs of the project factory are also clear – the project deliverables which the internal or external client has ordered and paid for. 
We can look at the products of the KM workstream in a similar way. The clients and customers for these are the knowledge workers in the organisation who need knowledge to do their work better; to deliver better projects and better products. It is they who define what knowledge is needed. Generally this knowledge comes in four forms (sketched in code below):
  • Standard practices which experience has shown are the required way to work. These might be design standards, product standards, standard operating procedures, norms, standard templates, algorithms and so on. These are mandatory, they must be followed, and have been endorsed by senior technical management.
  • Best practices and best designs which lessons and experience have shown are currently the best way to work in a particular setting or context. These are advisory, they should be followed, and they have been endorsed by the community of practice as the current best approach.
  • Good practices and good options which lessons from one or two projects have shown to be a successful way to work. These might be examples of successful bids, plans, templates or designs, and they have been endorsed by the community of practice as “good examples” which might be copied in similar circumstances, but which are not yet robust enough to be recognised as “the best”. 
  • More generic accumulated knowledge about specific tasks, materials, suppliers, customers, legal regimes, concepts etc.
The project/product workstream also creates outputs which act as inputs to the knowledge workstream; these are the knowledge deliverables: the lessons which capture hindsight, and the useful items which can be stored as good practices and good options. The link between lessons and best practices is described here, and shows how the two workstreams operate together to gather and deliver knowledge to optimise results.
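
As a rough illustration of the graded forms above, here is a small sketch in Python. The names and catalogue entries are hypothetical, but the two axes – how binding the knowledge is, and who endorsed it – come straight from the list:

```python
from dataclasses import dataclass
from enum import Enum

class Mandate(Enum):
    MANDATORY = "must be followed"          # standard practices
    ADVISORY = "should be followed"         # best practices and best designs
    OPTIONAL = "may be copied in context"   # good practices and good options

@dataclass
class KnowledgeProduct:
    name: str
    mandate: Mandate
    endorsed_by: str

# Hypothetical catalogue entries, one per graded form
catalogue = [
    KnowledgeProduct("design standard", Mandate.MANDATORY, "senior technical management"),
    KnowledgeProduct("best practice guide", Mandate.ADVISORY, "community of practice"),
    KnowledgeProduct("good practice example", Mandate.OPTIONAL, "community of practice"),
]
```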

View Original Source (nickmilton.com) Here.

The "One Year After" knowledge capture event.

Many of us are used to holding knowledge capture events at the end of a project.  There is also merit in repeating this exercise one year (or more) later.

Imagine a project that designs and builds something – a factory, for example, or a toll bridge, or a block of student accommodation. Typically such a project may capture lessons throughout the project lifetime, using After Action Reviews to capture project-specific lessons for immediate re-use, and may then capture end-of-project lessons using a Retrospect, looking back over the life of the project to identify knowledge which can be passed on to future projects. This end-of-project review tends to look at the efficiency of the practices used during the project, and how these may be improved going forward. 
The review asks “Was the project done right, and how can future projects be done better?” However, what the review often does not cover is “Was the right project done?”
At the end of the project the factory is not yet operational, the bridge has only just opened to traffic, and you have just cut the ribbon on the student accommodation block. You do not yet know how well the outcome of the project will perform in practice. 

This is where the One-Year operational lessons review comes in.

You hold this review after a year or more of operation. 
  • You look at factory throughput, and whether the lines are designed well, how they are being used, how effective the start-up process was, whether there are any bottlenecks in dispatch and access, and even whether the factory is in the correct location. 
  • You look at traffic over the bridge – is it at expected levels? Is it overused or underused? Is it bringing in the expected level of tolls? Does the bridge relieve congestion or cause congestion somewhere else? Does the road over the bridge have enough lanes?  Does it ice up in winter?
  • You look at usage of the student accommodation. Is it being used as expected? Are the kitchens big enough? Are there enough bike racks? Where is the wear and tear on the corridors? Where are accidents happening? What do the neighbours think?
In this review you are looking not at whether the project was done right, but at whether it was the right project (or at least the right design). The One Year operational learning review will generate really useful lessons to help you improve your design, and your choice of projects, in future.

Don’t stop collecting lessons at the end of the project, collect more once you have seen the results of a year or more of operations.

Contact Knoco for help in designing your lesson learned program.

View Original Source (nickmilton.com) Here.

Why storing project files is not the same as storing project knowledge

There is often an assumption that storing project files equates to managing knowledge on behalf of future projects. This is wrong, and here’s why.

For example, this video from the USACE Knowledge Management program says “if you digitise your paper files, throw in some metadata tagging, and use our search appliance, finding what you need for your [future] project is easy”. (I have added the word [future], as this was proposed as a way for the next project to find the knowledge it needs.)

However there is a major flaw with just digitising, tagging and filing the project documents and assuming that this transfers knowledge. The flaw is that the project may have been doing things wrong, and almost certainly could have done things better with hindsight. Capturing the files will capture the mistakes, but will not capture the hindsight, which is where the learning and the knowledge reside.

It is that hindsight you need to capture, not the files themselves.

  • Don’t capture the bid package presented to the client, capture what you should have bid, the price you should have quoted, and the package you should have used. All of these things should come from the post-bid win/loss review.
  • Don’t capture the proposed project budget, capture the actual budget, where the cost overruns were, and how you would avoid these next time. This should come from the post-project lessons review.
  • Don’t capture the project resource plan, capture the resource plan you should have had, and the resourcing you would recommend to future projects of this type. This also should come from the post-project lessons review.
  • Don’t capture the planned product design, capture the as-built design, where the adjustments were made, and why they were made. (See my own experience of working from stored plans rather than the as-built design, which cost me £500 and ten dead trees.)
  • And so on. You can no doubt think of other examples.
Capturing the hindsight is extra work, and requires analysis and reflection through Knowledge Management processes such as After Action Review and Retrospect. These processes need to be scheduled within the project plan, and need to focus on questions such as:
  • What have we learned?
  • What would we repeat?
  • What would we do differently?
  • What advice and guidance, with the benefit of hindsight, would we give to future projects?
These are tough questions, focused on deriving hindsight. Deriving hindsight is not easy, which is why these Knowledge Management processes need to be given time, space, and skilled facilitation. However, they add huge value to future projects by capturing the lessons of hindsight. Merely filing and tagging the project files is far easier, but will capture none of the hindsight and so none of the knowledge.
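
One way to picture the difference between a filed document and a captured lesson is as a data structure: the lesson is a structured answer to the four questions above. Here is a hypothetical sketch (the fields simply mirror the questions; this is an illustration, not a Knoco template):

```python
from dataclasses import dataclass

@dataclass
class HindsightLesson:
    """A lesson is a structured answer to the Retrospect questions;
    no stored project file contains this hindsight."""
    project: str
    what_we_learned: str
    what_we_would_repeat: str
    what_we_would_do_differently: str
    advice_to_future_projects: str

# Contrast with filing the bid package: the file records what was bid,
# while the lesson records what should have been bid, and why.
example = HindsightLesson(
    project="Bid 2024-17 (hypothetical)",
    what_we_learned="our price was well above the winning bid",
    what_we_would_repeat="early client workshops to clarify scope",
    what_we_would_do_differently="benchmark pricing against recent wins and losses",
    advice_to_future_projects="run a win/loss review before pricing the next bid",
)
```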

Capturing documents from previous projects and repeating what they did will cause you to repeat their mistakes. Better to capture their hindsight, so it can be turned into foresight for future projects. 

View Original Source (nickmilton.com) Here.

Army definitions in Lesson Learning

The Army talk about building up lessons through Observations and Insights. But what do these terms mean?

Lesson learning is one area where Industry can learn from the Military. Military lesson learning can literally be a matter of life and death, so lesson learning is well developed and well understood in military organisations.

The Military see a progression in the extraction and development of lessons – from Observations to Insights to Lessons – and we see a similar progression within the questioning process in After Action Reviews and Retrospects.

On Slide 7 of this interesting presentation, given by Geoff Cooper, a senior analyst at the Australian Centre for Army Lessons Learned, at the recent 8th International Lessons Learned Conference, we have a set of definitions for these terms, which are very useful.

They read as follows (my additions in brackets; a small data-model sketch follows the definitions):

Observation. The basic building block [for learning] from a discrete perspective. 

  • Many are subjective in nature, but provide unique insights into human experience.
  • Need to contain sufficient context to allow correct interpretation and understanding.
  • Offer recommendations from the source.
  • [They should be] categorised to speed retrieval and analysis.

Insight. The conclusion drawn from an identified pattern of observations pertaining to a common experience or theme.

  • Link differing perspectives and observations, where they exist.
  • Indicate recommendations, not direct actions.
  • Link solid data to assist decision-making processes.
  • As insights relay trends, they can be measured.

Lesson. Incorporates an insight, but adds specific action and the appropriate technical authority.  

Lesson Learned. When a desired behaviour or effect is sustained, preferably without external influence.
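
Read as a data model, the progression is quite strict: an insight aggregates a pattern of observations, a lesson adds a specific action and an owning authority, and a lesson is only “learned” once the behaviour is sustained. Here is a hedged sketch in Python (the field names are my own, inferred from the definitions above, not part of the Army's schema):

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    """Basic building block, from a discrete perspective."""
    source: str                    # who observed it
    context: str                   # enough context for correct interpretation
    text: str
    recommendation: str            # observations offer recommendations from the source
    categories: list = field(default_factory=list)  # categorised to speed retrieval

@dataclass
class Insight:
    """Conclusion drawn from a pattern of observations on a common theme."""
    theme: str
    observations: list             # links differing perspectives, where they exist
    recommendation: str            # indicates recommendations, not direct actions

@dataclass
class Lesson:
    """Incorporates an insight, but adds a specific action and an authority."""
    insight: Insight
    action: str
    technical_authority: str

@dataclass
class LessonLearned:
    """Only recorded once the desired behaviour or effect is sustained."""
    lesson: Lesson
    behaviour_sustained: bool = False
```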

What Geoff is describing is a typical military approach to lesson-learning, where a lessons team collects many observations from Army personnel, performs analysis, and identifies the Insights and Lessons. As I pointed out in this post, this is different from the typical Engineering Project approach, where the project team compare observations, derive their own insights, and draft their own lessons.

Which of the two approaches you use depends on the scale of the exercise. In the military model there can be hundreds of people contributing observations, while in a project it’s usually a much smaller project team (in which case it makes sense to collect the observations and insights through discussion). If you are using the military model, these definitions will be very useful.

View Original Source (nickmilton.com) Here.
