Lesson learning in the US Army – example from Haiti

Army learning is not just about fighting battles – here’s an example from disaster response

In 2010, the US Army was called in to provide humanitarian aid, including food and shelter, after a magnitude 7.0 earthquake in Haiti. This article by the US Army Contracting Command, entitled “Lessons learned and used during Haiti deployment”, described how lesson learning made this a more efficient and effective process.

The article describes how, at the height of the response, the US Army supplied more than 15 million meals in a 10-day period to the Haitian population, as well as setting up distribution points for families to receive boxes and bags of rice, beans and cooking oil. All of this required a supply chain, with contracting and purchasing activities, and by the end of the mission the Expeditionary Contracting Command had created more than 380 contracting actions valued at almost $12 million. Of course this supply chain needed to be slick and well organised, and to have learned from the past.

“We took advantage of a lot of lessons learned from previous deployments. We didn’t do these types of things early on in Operation Iraqi Freedom or Operation Enduring Freedom. However, we learned those lessons and brought these capabilities to Haiti early on,” said Brig. Gen. Joe Bass, commander, Expeditionary Contracting Command. “We were very proactive from the beginning, deploying the right personnel mix needed to provide quality assurance, legal, policy and other areas where we could address issues on the front end rather than after they’ve been done.”

Lessons included

  • the personnel who needed to be deployed with the first troops, 
  • the need to set up a support centre in the US, 
  • the need for review and decision making boards in Haiti, 
  • creation of “pre-positioned deployable equipment packages”, and 
  • the correct level of decision-making authority for procurement orders of different values, from $100,000 to $1 million.

The Haiti organisation did not just learn lessons from the past; it also created new lessons to help future operations.

“Learning from the past helped us deploy quicker and smarter,” Bass said. “Just as we gathered lessons learned from previous deployments, we have gathered some from the Haiti deployment that should help us the next time we have to deploy. Moving forward means reviewing what we’ve done and how we have done it in the past, then reviewing it again and constantly using those lessons to better ourselves with each new challenge.”

This is a great example of an organisation using lesson learning to continuously improve operations. 

View Original Source (nickmilton.com) Here.

Lessons Learned, or lessons lost?

Are you learning lessons, or losing lessons?

Kizar [Public domain], from Wikimedia Commons

A Lesson is an Investment

It might take a project team of 10 people one day to create, through facilitated discussion, ten high-quality lessons. Each of these therefore takes one man-day to create; say £500, plus some write-up time – let’s say £1,000 per lesson, to choose a nice round number.

However that lesson is encapsulated knowledge which, if re-used, may save man-months of wasted time and effort in the future, or tens of thousands of pounds of wasted cost. The project team has invested its time to deliver a future ten-fold or hundred-fold return on that investment. That lesson may therefore be an investment worth £10,000 to £100,000 when realised through application.
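The arithmetic above can be made concrete. Here is a minimal sketch in Python, using the illustrative figures from the text; the £500 write-up allowance is an assumption to round the total to £1,000:

```python
# Rough cost/return arithmetic for a facilitated lessons-capture session,
# using the illustrative figures from the text.
TEAM_SIZE = 10          # people in the session
SESSION_DAYS = 1        # one day of facilitated discussion
LESSONS_CAPTURED = 10   # high-quality lessons produced
DAY_RATE = 500          # pounds per man-day
WRITE_UP_COST = 500     # assumed write-up allowance per lesson

man_days = TEAM_SIZE * SESSION_DAYS
cost_per_lesson = (man_days / LESSONS_CAPTURED) * DAY_RATE + WRITE_UP_COST
print(f"Cost per lesson: £{cost_per_lesson:,.0f}")  # £1,000

# If a reused lesson saves £10,000 to £100,000 of wasted effort:
for saving in (10_000, 100_000):
    print(f"£{saving:,} saved is a {saving / cost_per_lesson:.0f}x return")
```

The point of the sketch is simply that even a modest, one-off reuse pays back the capture cost many times over.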

So what do we normally do with our investments?

Do we hide them in a hole and never look at them again? Do we file that share certificate in a drawer and never look at it again? Or do we track our investments until the time and situation is right to realise them? When it comes to money, we do the latter. But what about lessons?

Unfortunately, a common approach to lessons is to record them in a report, or on a spreadsheet, then put them into project files, and hide them in the online filing system. I heard a report a couple of weeks ago about a team that created some hugely valuable lessons, and stored them in security-controlled project files to which no other project was allowed access!

That’s not Lessons Learned; that’s Lessons Lost.

That’s like the final scene of Raiders of the Lost Ark, when the Ark of the Covenant is put in a crate and hidden in a vast warehouse of identical crates.

What we should do with investments is put them in a managed portfolio and track them, until we find the opportunity to realise their value.

What we should do with lessons is put them in a Lessons Management System and track them, until we find the opportunity to reuse them, or to embed them into process and procedure, thus realising the investment of time and effort that was put into generating them in the first place.

That’s Lessons Learned; rather than Lessons Lost.

View Original Source (nickmilton.com) Here.

How the coastguard seeks input to lesson learning

Public organisations can learn from the coastguard when it comes to getting wide scale input to lesson learning

Any public organisation, especially one delivering a high-priority service, needs a lesson-learning process to improve that service. The emergency response services in particular have well-developed lesson-learning systems, but here is a wrinkle I had not seen before, from the US coastguard.

This article from 2017, entitled “Innovation Program seeks hurricane lessons learned from Coast Guard responders” describes how the US coastguard set up what they called the “Hurricane Lessons Learned challenge” on the Coast Guard’s ideas-capturing portal CG_Ideas@Work.

This portal was started as a way to preserve and institutionalize the wealth of lessons learned during hurricane response efforts, and all Coast Guard personnel who participated in any of the response efforts are encouraged to share their observations, issues and ideas.

This is a means of capturing ideas, observations and insights which analysts could later convert into lessons (the sequence from Observations to Insights to Lessons is widely recognised in the lesson-learning community). Some direct lessons may also be captured.

As the article explains:

 The Coast Guard routinely captures lessons learned as a way to improve its operations, but the CG_Ideas@Work challenge offers one distinct advantage: “Our crowdsourcing platform not only provides a place to submit ideas, but also to collaborate on them,” (Cmdr. Thomas “Andy”) Howell said. “Everyone from non-rates to admirals can discuss ideas.” Speed is also an advantage. “Catching the ideas when they’re fresh and raw preserves their integrity,” Howell said.

The US Coastguard are well aware that capturing lessons is not enough for them to be a learning organisation. These lessons must also drive change.

“The Commandant’s Direction says we need to become an organization capable of continuous learning, so it’s important that the innovations and adaptations that made this response successful are institutionalized,” Howell said. Ideas shared through the Hurricane Lessons Learned challenge are immediately shared with the responsible program. Many will be considered as potential projects for next year’s Research, Development, Test and Evaluation Project Portfolio.

The portal has been very well received:

“We’ve heard from pilots, inspectors, commanding officers, district command staffs, reservists, Auxiliary personnel – the entire gamut of responders,” Howell said. “It’s a very user-friendly way to collect information, and comes with the benefit of collaboration,” he said.

This is an approach other similar organisations can learn from.

View Original Source (nickmilton.com) Here.

What’s the difference between a lesson-learned database and a lesson management system?

In this blog post I want to contrast two software systems, the Lessons Database, and the Lessons Management System.

There are two types of Lessons Learned approaches, which you could differentiate as “Lessons for Information” and “Lessons for Action”.

These represent maturity levels 1 and 2 from my three level categorisation, and can be described as follows.

“Lessons for Information” is where lessons are captured and put in reports, or in a database, in the hope that people will look for them, read them, and assimilate them.

“Lessons for Action” is where lessons are used to drive change and improvement. Lessons are captured, reviewed, validated, and action is taken to embed the lessons in process, procedure, standards and/or training.

“Lessons for Information” is supported by a Lessons Database, “Lessons for Action” by a Lessons Management System. Let’s contrast the two.

  • In a Lessons Database, the database is the final home of the lessons. In a Lessons Management System, the final home of lessons is considered to be the compiled knowledge of the organisation, which may be procedures, doctrine, guidance, best practices, wikis, etc.
  • In a Lessons Database, lessons reach their reader through search. In a Lessons Management System, lessons are pro-actively routed to those who need to see them and to take action.
  • In a Lessons Database, lessons accumulate over time (this was the problem with my first Lessons system in the 90s – it got clogged up over time with thousands of lessons, until people stopped looking). In a Lessons Management System, lessons are archived once they have been embedded into process and procedure, and the only live content in the system is the set of lessons currently under review.
  • In a Lesson Database there is only one type of lesson – the published lesson. In a Lesson Management system there are at least two types of lesson – the open lesson (where action has not yet been taken) and the closed lesson, which may then be archived. Some organisations recognize other types, such as the draft lesson (not yet validated) and the Parked lesson (where action cannot yet be taken, or where action is unclear, and where the lesson needs to be revisited in the future).
  • In a Lessons Database, there may be duplicate lessons, out of date lessons, or contradictory lessons. Through the use of a Lessons Management System, these have all been resolved during the incorporation into guidance.
  • In a Lessons Database, there are limited options for measuring the process: you can count how many lessons are in the system, but that’s about it (unless you capture data on re-use). With a Lessons Management System, you can track lessons through to action; you can measure whether they are being embedded into process, see where they are being held up, and by whom, and see how long the process is taking and where it needs to be speeded up.
Lessons management is the way to go. Lesson databases really do not work in the long term, and usually become lesson graveyards, and the home for Lessons Lost.
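The lesson lifecycle described above (draft, open, parked, closed) can be sketched as a simple state machine. This is a minimal illustration in Python; the state names and transitions follow the text, but the class and field names are invented for the example, not any real product’s API:

```python
# A minimal sketch of the lesson lifecycle a Lessons Management System tracks:
# draft -> open -> (closed | parked), with closed lessons archived once
# embedded into process and procedure.
from dataclasses import dataclass, field
from datetime import date

VALID_TRANSITIONS = {
    "draft":  {"open"},              # validated and published for action
    "open":   {"closed", "parked"},  # action taken, or action deferred
    "parked": {"open"},              # revisited when action becomes possible
    "closed": set(),                 # embedded in process; archived
}

@dataclass
class Lesson:
    title: str
    owner: str                       # who must take action on this lesson
    status: str = "draft"
    history: list = field(default_factory=list)

    def move_to(self, new_status: str) -> None:
        if new_status not in VALID_TRANSITIONS[self.status]:
            raise ValueError(f"cannot move from {self.status} to {new_status}")
        self.history.append((date.today(), self.status, new_status))
        self.status = new_status

# Metrics then come almost for free: count open lessons per owner to see
# where action is held up, or time the transitions in `history` to see how
# long the process takes end to end.
```

The design point is that routing, archiving and measurement all fall out of tracking the state explicitly, which a flat database of published lessons cannot do.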

View Original Source (nickmilton.com) Here.

The curse of knowledge and the danger of fuzzy statements

Fuzzy statements in lessons learned are very common, and are the result of “the curse of knowledge”

Fuzzy Monster
Clip art courtesy of DailyClipArt.net

I blogged yesterday about Statements of the Blindingly Obvious, and how you often find these in explicit knowledge bases and lessons learned systems, as a by-product of the “curse of knowledge“.

There is a second way in which this curse strikes, and that is what I call “fuzzy statements”.

It’s another example of how somebody writes something down as a way of passing on what they have learned, and writes it in such a way that it is obvious to them what it means, but which carries very little information to the reader.

A fuzzy statement contains an unqualified adjective; for example:

  • Set up a small, well qualified team…(How small? 2 people? 20 people? How well qualified? University professors? Company experts? Graduates?)
  • Start the study early….(How early? Day 1 of the project? Day 10? After the scope has been defined?)
  • A tighter approach to quality is needed…. (Tighter than what? How tight should it be?)
You can see, in each case, the writer has something to say about team size, schedule or quality, but hasn’t really said enough for the reader to understand what to do, other than in a generic “fuzzy” way, using adjectives like “small, well, early, tighter” which need to be quantified.

In each case, the facilitator of the session or the validator of the knowledge base needs to ask additional questions. How small? How well qualified? How early? How tight?
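Part of this quality control can even be automated as a first pass. As a purely hypothetical sketch, a validator could flag common unqualified adjectives and prompt for the quantifying question; the word list here is invented for illustration and would need tuning for any real domain:

```python
# A crude quality-control pass that flags "fuzzy" unqualified adjectives in
# lesson text and returns the follow-up question a facilitator should ask.
import re

FUZZY_PROMPTS = {
    "small":   "How small? How many people?",
    "early":   "How early? By which milestone?",
    "tighter": "Tighter than what? How tight?",
    "well":    "How well? Against what standard?",
    "enough":  "How much is enough?",
}

def flag_fuzzy(lesson_text: str) -> list[str]:
    """Return a follow-up question for each fuzzy word found."""
    words = set(re.findall(r"[a-z]+", lesson_text.lower()))
    return [q for word, q in FUZZY_PROMPTS.items() if word in words]

print(flag_fuzzy("Set up a small, well qualified team early in the project"))
```

A checker like this cannot replace a facilitator, but it can catch the most common fuzz before a lesson reaches the knowledge base.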

Imagine if I tried to teach you how to bake a particular cake, and told you “Select the right ingredients, put them in a large enough bowl. Make sure the oven is hotter”. You would need to ask more questions in order to be able to understand this recipe.

Again, it comes back to Quality Control.

Any lessons management system or knowledge base suffers from Garbage In, Garbage Out, and the unfortunate effect of the Curse of Knowledge is that people’s first attempt to communicate knowledge is often, as far as the reader is concerned, useless garbage.

Apply quality control to your lessons, and de-fuzz the statements.

View Original Source (nickmilton.com) Here.

The curse of knowledge, and stating the obvious

The curse of knowledge is the cognitive bias that leads to your Lesson Database being full of “statements of the obvious”

There is an interesting exercise you can do, to show how difficult it is to transfer knowledge.

This is the Newton tapper-listener exercise from 1990.

Form participants into pairs. One member is the tapper; the other is the listener. The tapper picks out a song from a list of well-known songs and taps out the rhythm of that song to the listener. The tapper then predicts how likely it is that the listener will correctly guess the song from the tapping. Finally, the listener guesses the song.

Although tappers predicted that listeners would be right 50% of the time, listeners were actually right less than 3% of the time.

The difference between the two figures (50% and 3%) is that to the tapper, the answer is obvious. To the listener, it isn’t.

This is the “curse of knowledge“.

Once we know something—say, the melody of a song—we find it hard to imagine not knowing it. Our knowledge has “cursed” us. We have difficulty sharing it with others, because we can’t readily re-create their state of mind, and we assume that what is clear to us, is clear to them.

Transferring knowledge through the written word (for example in lessons learned, or in online knowledge bases) suffers from the same problem as transferring a song by tapping. People THINK that what they have written conveys knowledge, because they can’t put themselves in the mind of people who don’t already have that knowledge.

Just because they understand their own explanations, that does not mean those explanations are clear to the reader.

This effect can be seen in written knowledge bases and lessons databases, and often appears as Statements of the Blindingly Obvious (SOTBOs).

These are statements that nobody will disagree with, but which obviously carry some more subtle import to the writer which the reader cannot discern. These include statements like:

  • “It takes time to build a relationship with the client” (Really? I thought it was instantaneous). 
  • “A task like this will require careful planning”. (Really? I thought careless planning would suffice)
  • “Make sure you have the right people on the team.” (Really? I thought we could get away with having the wrong people)
  • “Ensure that communication and distribution of information is conducted effectively.” (Really? I thought we would do it ineffectively instead)
The writer meant to convey something important through these messages, but failed completely. Why is this? Often because the writer had no help, no facilitation, and was not challenged on the emptiness of their statements.

In each case, any facilitator who had been involved in the capture of the knowledge, or any validator of the knowledge base, would ask supplementary questions:

  • How much time does it take? 
  • What would you need to do to make the planning careful enough? 
  • What are the right people for a job like this? 
  • What would ensure effective communication?
This further questioning is all part of the issue of knowledge quality assurance, to filter unhelpful material out of the knowledge base, or lessons management system, and to turn an unintelligible set of taps into a full tune.

Without this, people rapidly give up on the knowledge base as being “unhelpful”, and full of SOTBOs.

View Original Source (nickmilton.com) Here.

14 barriers to lesson learning

Lesson learning, though a simple idea, faces many barriers to its successful deployment. Here are 14 of them.

I posted, back in 2009, a list of 100 ways in which you could wreck organisational lesson-learning. These were taken from my book, The Lessons-Learned Handbook, and represent the many ways in which the lessons supply chain can be broken or corrupted.

Here’s an alternative view.

From the paper “Harvesting Project Knowledge: A Review of Project Learning Methods and Success Factors” by Martin Schindler and Martin J. Eppler, we have a list of 14 barriers to effective lesson learning through project debriefs, drawn from an extensive literature review.

  1. High time pressure towards the project’s end (completion pressure, new tasks already wait for the dissolving team).  
  2. Insufficient willingness for learning from mistakes of the persons involved.  
  3. Missing communication of the experiences by the involved people due to ‘‘wrong modesty’’ (with positive experiences) or the fear of negative sanctions (in case of mistakes). 
  4. Lacking knowledge of debriefing methods. 
  5. Underestimation of process complexity which a systematic derivation of experiences brings along. 
  6. Lacking enforcement of the procedures in the project manuals.  
  7. Missing integration of experience recording into project processes.  
  8. Team members do not see a (personal) use of coding experience and assume to address knowledge carriers directly as more efficient.  
  9. Difficulties in co-ordinating debriefings. 
  10. Persons cannot be engaged for a systematic project conclusion, since they are already involved in new projects. 

In those cases where a lessons learned gathering takes place, the gained knowledge is often not edited for reuse, or not accepted as valuable knowledge by others. If debriefings are conducted, there is still a certain risk that the results (i.e. the insights compiled by a project team):

  1. are not well documented and archived,  
  2. are described too generically or are not visualized where necessary, which prevents reuse due to a lack of context (e.g. it is too difficult to understand or not specific enough for the new purposes),  
  3. are archived in a way so that others have difficulties retrieving them,  
  4. are not accepted, although they are well documented and easy to locate (the so-called ‘‘not invented here’’-syndrome).

View Original Source (nickmilton.com) Here.

5 success factors for project learning

Learning effectively from projects is a goal for many organisations. Here are some ways to do it.

The list below of success factors for project-based learning was proposed by Schindler and Eppler, two researchers working out of the University of St. Gallen (Switzerland), in their paper “Harvesting Project Knowledge: A Review of Project Learning Methods and Success Factors”.

It’s a pretty good list! Their text is in bold, my commentary on each of these is in normal font.

  1. “From single review to continuous project learning – we stress the necessity for continuous project learning through regular reviews”.  In Knoco, we recommend any project define the methods and frequency up front through the use of a Knowledge Management Plan, and that suitable processes are the After Action review and the Retrospect, or Lessons Capture meeting, or even a Learning History in the case of mega-projects. This is the Process component of  project-based learning. Learning within the project is covered by After Action review, export of learning to other projects is covered by Retrospects.
  2. “New project roles and tasks – the need for new roles for project knowledge management should have become obvious”.  This is the Roles and Accountabilities component of project-based learning – we recommend that someone in the project team itself – a project Knowledge Manager – takes accountability for ensuring learning processes are applied, making use of facilitation skills as appropriate. This need not be a full-time role, but it should be a single point of accountability.
  3. “Integration of learning and knowledge goals into project phase models – project learning is too important to be left to chance or to the initiative of motivated individuals”. This is what we include as part of the Governance component of project-based learning. By embedding knowledge management processes into the project phase models or project management framework, we set a very clear expectation that project learning is important and part of normal project activity. Lesson learning should be emphasised in the project management policy.
  4. “Integration of learning and knowledge goals into project goals – adding knowledge goals to every project step can foster systematic reflection about every milestone in a project”. This we also include as part of the Governance component of project-based learning. This is the performance management element of Governance. If learning is in the project goals or the project objectives, then the project team will be judged and rewarded by whether they learn, as part of judging and rewarding whether they met their goals. 
Schindler and Eppler therefore cover three out of the four enablers for Knowledge Management – Roles and Accountabilities, Processes, and Governance. 
The one they do not cover is Technology, perhaps because there were few effective lessons-management technologies around in 2003. Therefore I would like to propose a fifth enabler, as follows:
  5. Application of an effective lessons management system, including workflow to ensure the lessons reach those who must take action as a result, and a tracking system to track the effectiveness and application of learning.

View Original Source (nickmilton.com) Here.

The Risk Radar – what could possibly go wrong?

An analysis of your past lessons can be used to create a Risk Radar for future projects.

Michael Mauboussin, in his book “Think Twice”, talks about the planning fallacy, and compares the inside and outside view.

He points out that people planning a task or a project tend to take the inside view – they think through how they will do the project, they think about the risks and challenges they will face, and how they will overcome them, they add up the times for each step, and use that as their baseline. Almost always they are over-optimistic, unless they also use the outside view, and calibrate their baseline against what usually happens.

Mauboussin says

“If you want to know how something is going to turn out for you, look at how it turned out for others in the same situation.”

To find out what happened to people in similar situations, we can use Lessons Analysis. If you have an effective lesson learning system you may have records from hundreds or thousands of past projects, and you have a huge database of “how things turned out”. Through lessons analysis you can recognise common themes across a number of lessons, and see the challenges and risks that projects are likely to meet, and identify those which are most common and most impactful. You can then use these to create a risk radar for future projects.

In the 90s, I was in charge of a lesson-learning system in Norway. At one point we had lessons from over 300 projects in the system, and we grouped them into a series of “things that commonly go wrong in our projects”.

In Norway, we treated our list as our “Risk Radar” – the more of these problem areas a project faced, the greater the risk to effective project delivery. The top 7 problem areas were as follows (remember these were small office-based projects):

  1. The project team is new, or contains a significant number of new members
  2. There is no clear full-time customer for the project 
  3. Satisfactory completion of the project relies on an external supplier. 
  4. The project is a low priority ‘back-burner’ project. 
  5. The project scope is poorly defined. 
  6. The project relies on an unusual degree of IT support. 
  7. The project involves consultation with large numbers of staff.

Any project where more than one of these was true had “Danger on its Radar”.

Such a project therefore needed to pay considerably more attention than usual to project management, and was advised to scan back through the relevant lessons and guidance, and to speak to the people involved, in order to find out how the danger could be averted.
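A check like this is simple enough to automate. Here is a minimal sketch, assuming the seven warning signs are recorded as booleans per project; the flag names and the amber/red thresholds are invented for illustration:

```python
# A minimal "risk radar" score over the seven warning signs from the text.
WARNING_SIGNS = [
    "new_team",
    "no_clear_customer",
    "relies_on_external_supplier",
    "low_priority",
    "poorly_defined_scope",
    "unusual_it_support",
    "wide_staff_consultation",
]

def risk_radar(project_flags: dict[str, bool]) -> str:
    """Count how many warning signs are true and map to a traffic light."""
    score = sum(project_flags.get(sign, False) for sign in WARNING_SIGNS)
    if score <= 1:
        return "green"
    elif score <= 3:
        return "amber"   # more than one sign true: danger on the radar
    return "red"         # candidate for extra help and oversight

print(risk_radar({"new_team": True, "poorly_defined_scope": True}))  # amber
```

The value is less in the arithmetic than in the fact that the warning signs themselves came from lessons analysis of real past projects.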

A few years later, the organisation applied the same concept to big engineering projects, and came up with what they called the “Train Wreck Indicator” – a similar indicator based on common reasons for failure, determined through lessons analysis. Any project that scored a Red Light on the train wreck indicator was given additional help and oversight in order to help it steer a path through the problem areas, and avoid becoming a train wreck.

Tools such as this help projects gain the Outside View to aid them in their planning.

If you have an effective lessons learned system, and are building up a library of project lessons, then consider a similar analysis, to build your own “Risk Radar“.

 Contact us if you want to know more about our lessons analysis service.

View Original Source (nickmilton.com) Here.

A lesson is not just something you learned, but something you can teach

People who have learned from experience must understand their responsibility to teach others.

I often say, at the start of lessons learned meetings, that when identifying and recording lessons we should think of them not as something we have learned, but as something we can teach others.

This is a subtle shift in emphasis from looking inwards to looking outwards, and from looking backwards to looking forwards. It also identifies the responsibility of the knowledgeable person; a responsibility to others rather than to themselves.

For much of the lessons workshop, the participants are looking back at what happened. “We had a difficult time with the client”, they might decide, and then follow this Observation with a whole set of reminiscences about how difficult the client was, and what trouble it caused.

With good facilitation, they can also reach Insights about why the problems happened.

However the facilitator then has to turn the conversation outwards, and ask: “Based on what you have learned from your reflection on the client difficulties, what can we teach the organisation about how to better deal with clients?” The participants need to stop analysing history, and start looking at generic learning they can pass on to others.

That is a critical value-added step, and it is that step, and the subtle mindset shift from passive learners to active teachers, that allows the participants to turn an observation and the subsequent insights into a Lesson.

View Original Source (nickmilton.com) Here.