Every lesson is a story

Capture your lessons as stories; that way they will be remembered more easily.

Native American Storytelling. Photo By: Johnny Saldivar

One of the ways humans learn is through stories. Stories can Inform, Educate, and/or Entertain (transmit information, knowledge, and entertainment), and some of the best stories do all three.

A post from the Farnam Street blog entitled “The structure of a Story” tells us that a great story has structure, and includes Context, Action and Result; where Context includes Where, Who, What do they Want, and What’s getting in the way. As the blog says:

“It all hangs on the central Context, Action, Result (CAR) structure. It starts with the Subject, Treasure, and Obstacle, and concludes with the Right lesson and link back to why it’s being told”.

When we transfer Knowledge, story is one of the best formats. That is as true of lesson-learning as it is of the other elements of KM. Yet all too often, lessons fail to meet the test of a good story.

As the Farnam Street blog says, aspirant storytellers usually start with the Action, because that is the exciting part, and omit the context, and as a result the reader of the story or lesson cannot easily identify with or internalise the learnings.

When we teach lesson-learning to organisations, we say that Every Lesson is a Story, and we recommend that the lesson has at least 6 components, illustrated in the sketch after the list:

1) Context. Here we explain what was expected to happen, but didn’t. This is the “What did we want” step of the story. It is amazing how many project teams or facilitators skip this step. Because “what was expected to happen” is so obvious to them, they expect it to be obvious to the reader, but of course it seldom is. 

2) Outcome. Here we explain what actually happened – the Action step of the story. What went wrong, or what went right?

3) Root cause. Here we explain WHY it happened – what the Obstacles were, what was missing that should have been there, or (in a success story) what was done to achieve the result. 

4) Impact. Here we explain or try to quantify the result, either negative or positive. 

5) Lesson. Based on the root cause, what is the learning for future projects? This is the “Right Lesson” part of the story.

6) Action to Embed. What action do we take to embed that lesson into business process, so that mistakes are never repeated and successes are always copied?
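
To make the six components concrete, here is a minimal sketch of a lesson record with this structure. The LessonRecord class, its field names and the example values are illustrative assumptions, not a prescribed format.

    from dataclasses import dataclass

    @dataclass
    class LessonRecord:
        """One lesson, told as a story: context, action, result, and the right lesson."""
        context: str          # 1) What did we want or expect to happen?
        outcome: str          # 2) What actually happened?
        root_cause: str       # 3) Why did it happen - the obstacle, or the success factor
        impact: str           # 4) The result, quantified where possible (cost, delay, benefit)
        lesson: str           # 5) The "right lesson" for future projects
        action_to_embed: str  # 6) The change to make to process, checklists or training

    # A made-up example: a page of story compressed into six fields.
    example = LessonRecord(
        context="We expected the widgets to arrive ready to install in week 12.",
        outcome="The widgets failed the quality check on arrival and had to be returned.",
        root_cause="Quality control was left until delivery; there was no earlier inspection.",
        impact="Re-ordering delayed construction by a week.",
        lesson="Inspect long-lead items at the supplier before shipment.",
        action_to_embed="Add a pre-shipment inspection step to the procurement checklist.",
    )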

A lesson with a story-based structure such as this takes about a page to write down or a couple of minutes to tell, compared to typical bullet-point one-liner lessons – told in seconds, forgotten almost as quickly.

However if we treat every lesson as a story, then we realise that it must be presented as a Narrative, with Context, Action and Result, concluding with the Right Lesson.

This is the simple structure that conveys the message.

View Original Source (nickmilton.com) Here.

The four most costly words in KM – "this time it’s different"

“This time it’s different” can be the four most costly words in project knowledge management, if they are used as a reason not to learn from the past.

Albert Einstein’s definition of insanity was “doing the same thing over and over again and expecting different results”.

And yet, any analysis of a collection of corporate lessons will show the same mistakes being made time after time. So organisations obviously DO “do the same thing over and over again and expect different results”.

Are organisations insane?  Or is there another factor at work?

The factor may well be what the Farnam Street blog calls the “This time it’s different” fallacy. I quote from the blog –

“This time is different” could be the 4 most costly words ever spoken. 

It’s not the words that are costly so much as the conclusions they encourage us to draw. We incorrectly think that differences are more valuable than similarities. After all, anyone can see what’s the same but it takes true insight to see what’s different, right? We’re all so busy trying to find differences that we forget to pay attention to what is the same.

Different is exciting and new, same is old hat. People focus on the differences and neglect the similarities. In projects, this becomes the “my project is different” fallacy that I described here. People look at their projects, see the unique situations, find the differences, overlook the similarities to all similar projects in the past, and assume that “this time it will be different”.

It never is.

The same old mistakes will creep up on you and bite you in the bottom, as they always do.

Instead of assuming “this project is different”, perhaps we should start with the assumption that “this project is just like any project. It involves building and understanding client requirements, choosing and forming the team, selecting and managing subcontractors, balancing the innovation against the risk, communicating within the team and with the client, keeping the client requirements always in mind, managing quality, managing cost, managing time, managing expectations, managing risk, and so on”.

Then look for the lessons that will help you with all those tasks, and will help you avoid all the old pitfalls. As the Farnam Street blog says,

If you catch yourself reasoning based on “this time is different” remember that you are probably speculating. While you may be right, odds are, this time is not different. You just haven’t looked for the similarities.

A great antidote to the “This time it’s different” fallacy is that good old, tried and tested mainstay of Knowledge Management, the Peer Assist. Once a project team gets into a room with a bunch of people with experience, the conversation automatically focuses on the similarities. “Yes, we’ve seen that, we’ve been there, here’s what we learned” and it becomes increasingly difficult to maintain that “This time it will be different”.

View Original Source (nickmilton.com) Here.

Operationalising Lessons Learned in a small organisation

Here’s a really great video on a small organisation operationalising a lessons learned process

The organisation is Boulder Associates, an Architect and Design firm with a couple of hundred staff working out of a handful of US locations. The video was recorded at the KA-connect conference in San Francisco in 2018, an annual knowledge-focused conference for the AEC community organised by Knowledge Architecture.

The video is of a 27 minute presentation given by Todd Henderson and James Lenhart of Boulder Associates, and thanks Todd for the namechecks!

They make the point that collecting lessons is not the purpose of lesson learning, and they drive lesson learning through just-in-time delivery, using checklists as the final destination for the learning. Their combination of spreadsheet, tracking dashboard and checklists is a very simple, very appropriate system for an organisation of this scale.

View Original Source (nickmilton.com) Here.

Why admitting mistakes is so hard, and what we can do to counter this

It is part of the human condition to deny our mistakes, but that makes it hard for us, and for our organisations, to learn.

make no mistake, a photo by Meshl on Flickr.

I can recommend a really interesting book called “Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts”.

The book is about cognitive dissonance – how people square their self-image of being a good, competent and smart person, with the realisation that they can make mistakes – sometimes pretty big mistakes.

The way most people deal with this problem is to maintain the self-image and explain away the mistake.

“It wasn’t really a mistake” – “Of course I didn’t see the stop sign, it was in such a stupid place” – “The sun was in my eyes” – “Nobody told me it was wrong” – “He pushed me” – “I don’t see why I should say sorry, he started it” – “I’m not wrong, aliens really DO exist; the government is covering it all up” and so on.

They didn’t REALLY make a mistake; they were smart people all along. It wasn’t their fault.

This cognitive dissonance works particularly strongly in cultures that are intolerant of mistakes, and to be honest, that’s most cultures. As the playwright Lillian Hellman says about US culture, for example

“We are a people who do not want to keep much of the past in our heads. It is considered unhealthy in America to remember mistakes, neurotic to think about them, psychotic to dwell on them”.

Knowledge Management, however, requires that organisations, and the people within them, learn from experience, and 50% of experience comes from Mistakes. KM, if it is balanced, must allow learning equally from failures and from successes. Somehow this very powerful driving force of cognitive dissonance, present in every culture and every human, must be allowed for, sidestepped and redressed.

How do we do this?

Firstly, there must be a very conscious top-down drive for a culture that allows learning from mistakes. Call it a no-blame culture, call it an organisational learning culture, call it openness – it needs to be a clear and conscious expectation. This sets up a counter-dissonance. “I am a smart person. But I made a mistake. But the company expects me, if I am smart, to admit mistakes. Hmmmm!”

We all know the story told about Tom Watson Sr, the first president of IBM.

A young worker had made a mistake that lost IBM $1 million in business. She was called in to the President’s office and as she walked in said, “Well, I guess you have called me here to fire me.” “Fire you?” Mr. Watson replied, “I just spent $1 million on your education!”

That is a very powerful story that sets up a counter-dissonance.

Or look at the NASA approach, of getting senior leaders to publish stories entitled “my best mistake”.  I suggested this to an organisation recently, and the head of KM actually said “you will NEVER get leaders to publish their mistakes”. And yet that’s just what they did at NASA.

Or look at the Japanese attitude of Hansei. Although alien to many in the West, Hansei is an important part of Japanese culture. Han means “change” and Sei means “to review”, so the whole thing means “introspection” or “reflection for the purposes of change”. This translates into a behaviour, instilled from childhood, of looking for mistakes, admitting responsibility, and implementing change. (When Japanese children do something wrong, for example, they are told: “Hansei shinasai” – “Do hansei!”) Hansei also contains the thought that “to say there were no mistakes is itself a mistake”.

Secondly, we use objective facilitation.

Take lessons-identification meetings, for example. The temptation, when scheduling a meeting for a project team to identify lessons from experience, is for the project leader to lead the meeting. However the outcome, most of the time, is that the project team made No Mistakes. Sure, mistakes were made, but not by the project team!

You see this, for example, when the lessons coming from the team are like this ….

“We ordered a set of number 6 widgets, and when they arrived, they were very poor quality. We had to send them all back, which delayed construction by a week. The lesson is not to use that supplier again”.

So the fault was with the supplier.

However a good objective facilitator would dig deeper. They would ask: what were your quality control procedures? What was your ordering philosophy? Did you wait for the widgets to arrive before doing quality control? If so, why did you leave it until the last minute, so that re-ordering caused delay? Did you just assume that everything would be top quality? The lesson would be more about quality control procedures and mental assumptions than it would be about suppliers and vendors.

Any good system of lessons identification and lesson-learning has to use objective external facilitation, if you are to overcome the tendency to say “mistakes were made, but not by my team.” That’s why an increasing number of companies are calling us in, for example, to provide that skilled external evaluation as part of their lesson learning system.

This is even more the case with high level review and learning. As the authors of the book say,

“Few organisations welcome outside supervision and correction. If those in power prefer to maintain their blind spots at all costs, then impartial review must improve their vision ……. If we as human beings are inevitably afflicted with tunnel vision, at least our errors are more likely to be reduced or corrected if the tunnel is made of glass”.

This goes for teams and individuals as well as organisations – impartial assistance is vital.

So, set the high level expectation for openness, and use external facilitation to probe the blind spots.

We all make mistakes – mistakes have been made, often by us, and those mistakes are an opportunity to learn and improve, despite our best efforts to pretend that they never happened.

View Original Source (nickmilton.com) Here.

Lesson learning in the US Army – example from Haiti

Army learning is not just about fighting battles – here’s an example from disaster response

In 2010, the US Army was called in to provide humanitarian aid, including food and shelter, after a magnitude 7 earthquake in Haiti. This article by the US contracting command, entitled Lessons learned and used during Haiti deployment, describes how lesson learning made this a more efficient and effective process.

The article describes how, at the height of the response, the US Army supplied more than 15 million meals  in a 10-day period to the Haitian population, as well as setting up distribution points for families to receive boxes and bags of rice, beans and cooking oil. All of this required a supply chain and contracting and purchasing activities, and by the end of the mission, the Expeditionary Contracting Command had created more than 380 contracting actions valued at almost $12 million. Of course this supply chain needed to be slick and well organised, and to have learned from the past.

“We took advantage of a lot of lessons learned from previous deployments. We didn’t do these types of things early on in Operation Iraqi Freedom or Operation Enduring Freedom. However, we learned those lessons and brought these capabilities to Haiti early on,” said Brig. Gen. Joe Bass, commander, Expeditionary Contracting Command. “We were very proactive from the beginning, deploying the right personnel mix needed to provide quality assurance, legal, policy and other areas where we could address issues on the front end rather than after they’ve been done.”

Lessons included:

  • the personnel who needed to be deployed with the first troops,
  • the need to set up a support centre in the US, 
  • the need for review and decision making boards in Haiti, 
  • creation of “pre-positioned deployable equipment packages”, and 
  • the correct level of decision making authority for procurement orders of different value from $100 thousand to $1 million.

The Haiti organisation did not just learn lessons from the past; they also created new lessons to help future operations.

“Learning from the past helped us deploy quicker and smarter,” Bass said. “Just as we gathered lessons learned from previous deployments, we have gathered some from the Haiti deployment that should help us the next time we have to deploy. Moving forward means reviewing what we’ve done and how we have done it in the past, then reviewing it again and constantly using those lessons to better ourselves with each new challenge.”

This is a great example of an organisation using lesson learning to continuously improve operations. 

View Original Source (nickmilton.com) Here.

Lessons Learned, or lessons lost?

Are you learning lessons, or losing lessons?

Kizar [Public domain], from Wikimedia Commons

A Lesson is an Investment

It might take a project team of 10 people one day to create, through facilitated discussion, ten high-quality lessons. So each of these takes one man-day to create – say £500 – plus some write-up time; let’s say £1000 per lesson, to choose a nice round number.

However, that lesson is encapsulated knowledge which, if re-used, may save man-months of wasted time and effort in the future, or tens of thousands of pounds of wasted cost. The project team have invested their time to deliver a future ten-fold or hundred-fold return on that investment. That lesson may therefore be an investment worth £10,000 to £100,000 when realised through application.
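
As a back-of-the-envelope check on those figures (a sketch only; the £500 day rate and the £10,000 to £100,000 value are the illustrative numbers used above, not measured values):

    # Rough cost and potential return of one facilitated lesson, using the figures above.
    team_size = 10          # people in the facilitated discussion
    days = 1                # one day of discussion
    day_rate = 500          # pounds per person-day (illustrative)
    lessons_created = 10    # high-quality lessons produced in that day

    cost_per_lesson = team_size * days * day_rate / lessons_created   # = 500
    cost_with_writeup = 1000                                          # rounded up for write-up time

    potential_value = (10_000, 100_000)   # plausible value if the lesson is actually re-used
    returns = [value / cost_with_writeup for value in potential_value]
    print(f"About £{cost_with_writeup} to create; roughly {returns[0]:.0f}x to {returns[1]:.0f}x if re-used")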

So what do we normally do with our investments?

Do we hide them in a hole and never look at them again? Do we file that share certificate in a drawer and never look at it again? Or do we track our investments until the time and situation is right to realise them? When it comes to money, we do the latter. But what about lessons?

Unfortunately, a common approach to lessons is to record them in a report, or on a spreadsheet, then put them into project files, and hide them in the online filing system. I heard a report a couple of weeks ago about a team that created some hugely valuable lessons, and stored them in security-controlled project files to which no other project was allowed access!

That’s not Lessons Learned; that’s Lessons Lost.

That’s like the last scene of Raiders of the Lost Ark, when the Ark of the Covenant is put in a crate and hidden in a vast warehouse of identical crates.

What we should do with investments is put them in a managed portfolio and track them until we find the opportunity to realise their value.

What we should do with lessons is put them in a Lessons Management System and track them until we find the opportunity to reuse them or to embed them into process and procedure, thus realising the investment of time and effort that was put into generating them in the first place.

That’s Lessons Learned; rather than Lessons Lost.

View Original Source (nickmilton.com) Here.

How the coastguard seeks input to lesson learning

Public organisations can learn from the coastguard when it comes to getting wide scale input to lesson learning

Any public organisation, especially one with an element of high priority service, needs a lesson-learning process to improve that service. The emergency response services in particular have well-developed lesson learning systems, but here is a wrinkle I had not seen before, from the US coastguard.

This article from 2017, entitled “Innovation Program seeks hurricane lessons learned from Coast Guard responders” describes how the US coastguard set up what they called the “Hurricane Lessons Learned challenge” on the Coast Guard’s ideas-capturing portal CG_Ideas@Work.

This portal was started as a way to preserve and institutionalize the wealth of lessons learned during hurricane response efforts, and all Coast Guard personnel who participated in any of the response efforts are encouraged to share their observations, issues and ideas.

This is a means of capturing ideas, observations and insights, which analysts can later convert into lessons (the sequence from Observations to Insights to Lessons is widely recognised in the lesson-learning community). Some direct lessons may also be captured.
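
The Observations to Insights to Lessons sequence can be pictured as a simple pipeline. The sketch below is a made-up, generic illustration of that flow; it is not a description of the CG_Ideas@Work platform or of any actual Coast Guard submission.

    # A made-up example, purely to illustrate the Observation -> Insight -> Lesson sequence.
    observation = "Supply requests were re-keyed by hand at three points in the chain."
    insight = "Duplicate data entry, not approval time, caused most of the delay."
    lesson = "Agree a single shared request form before the next response, so data is entered once."

    # An analyst links the three stages so the lesson keeps its supporting evidence.
    lesson_record = {"observation": observation, "insight": insight, "lesson": lesson, "status": "open"}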

As the article explains

 The Coast Guard routinely captures lessons learned as a way to improve its operations, but the CG_Ideas@Work challenge offers one distinct advantage: “Our crowdsourcing platform not only provides a place to submit ideas, but also to collaborate on them,” (Cmdr. Thomas “Andy”) Howell said. “Everyone from non-rates to admirals can discuss ideas.” Speed is also an advantage. “Catching the ideas when they’re fresh and raw preserves their integrity,” Howell said.

The US Coast Guard are well aware that capturing lessons is not enough for them to be a learning organisation. These lessons must also drive change.

“The Commandant’s Direction says we need to become an organization capable of continuous learning, so it’s important that the innovations and adaptations that made this response successful are institutionalized,” Howell said. Ideas shared through the Hurricane Lessons Learned challenge are immediately shared with the responsible program. Many will be considered as potential projects for next year’s Research, Development, Test and Evaluation Project Portfolio.

The portal has been very well received

“We’ve heard from pilots, inspectors, commanding officers, district command staffs, reservists, Auxiliary personnel – the entire gamut of responders,” Howell said. “It’s a very user-friendly way to collect information, and comes with the benefit of collaboration,” he said.

This is an approach other similar organisations can learn from.

View Original Source (nickmilton.com) Here.

What’s the difference between a lesson-learned database and a lesson management system?

In this blog post I want to contrast two software systems, the Lessons Database, and the Lessons Management System.

There are two types of Lessons Learned approaches, which you could differentiate as “Lessons for Information” and “Lessons for Action”.

These represent maturity levels 1 and 2 from my three level categorisation, and can be described as follows.

“Lessons for Information” is where lessons are captured and put in reports, or in a database, in the hope that people will look for them, read them, and assimilate them.

“Lessons for Action” is where lessons are used to drive change and improvement. Lessons are captured, reviewed, validated, and action is taken to embed the lessons in process, procedure, standards and/or training.

“Lessons for Information” is supported by a Lessons Database, “Lessons for Action” by a Lessons Management System. Let’s contrast the two.

  • In a Lessons Database, the database is the final home of the lessons. In a Lessons Management System, the final home of lessons is considered to be the compiled knowledge of the organisation, which may be procedures, doctrine, guidance, best practices, wikis, etc.
  • In a Lessons Database, lessons reach their reader through search. In a Lessons Management System, lessons are pro-actively routed to those who need to see them and to take action.
  • In a Lessons Database, lessons accumulate over time (this was the problem with my first Lessons system in the 90s – it got clogged up over time with thousands of lessons, until people stopped looking). In a Lessons Management System, lessons are archived once they have been embedded into process and procedure, and the only live content in the system is the set of lessons currently under review.
  • In a Lesson Database there is only one type of lesson – the published lesson. In a Lesson Management system there are at least two types of lesson – the open lesson (where action has not yet been taken) and the closed lesson, which may then be archived. Some organisations recognize other types, such as the draft lesson (not yet validated) and the Parked lesson (where action cannot yet be taken, or where action is unclear, and where the lesson needs to be revisited in the future).
  • In a Lessons Database, there may be duplicate lessons, out of date lessons, or contradictory lessons. Through the use of a Lessons Management System, these have all been resolved during the incorporation into guidance.
  •  In a Lessons Database, there are limited options for metricating the process. You can measure how many lessons are in the system, but that’s about it (unless you capture data on re-use). Through the use of a Lessons Management System, you can track lessons through to action, and can measure whether they are being embedded into process, you can see where they are being held up, and by whom, and you can see how long the process is taking and where it needs to be speeded up.
Lessons management is the way to go. Lesson databases really do not work in the long term; they usually become lesson graveyards, the home of Lessons Lost.
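
As a minimal illustration of the lifecycle described in the list above, here is a hypothetical sketch of how lesson statuses and a simple process metric might look in a Lessons Management System. The status names follow the draft, open, parked and closed types mentioned above; the code itself is an assumption, not a description of any particular product.

    from datetime import date
    from enum import Enum

    class LessonStatus(Enum):
        DRAFT = "draft"        # captured but not yet validated
        OPEN = "open"          # validated; embedding action not yet taken
        PARKED = "parked"      # action unclear or not yet possible; revisit later
        CLOSED = "closed"      # embedded into process, procedure or training
        ARCHIVED = "archived"  # closed lesson moved out of the live system

    LIVE = {LessonStatus.DRAFT, LessonStatus.OPEN, LessonStatus.PARKED}

    def live_lessons(lessons):
        """Only live lessons stay in view, so the system never clogs up with old content."""
        return [lesson for lesson in lessons if lesson["status"] in LIVE]

    def days_open(lesson, today=None):
        """A simple process metric: how long has this lesson been waiting for action?"""
        today = today or date.today()
        return (today - lesson["raised_on"]).days

    # Example:
    lesson = {"title": "Inspect widgets before shipment", "status": LessonStatus.OPEN,
              "raised_on": date(2018, 1, 8)}
    print(len(live_lessons([lesson])), days_open(lesson, today=date(2018, 2, 8)))   # -> 1 31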

View Original Source (nickmilton.com) Here.

The curse of knowledge and the danger of fuzzy statements

Fuzzy statements in lessons learned are very common, and are the result of “the curse of knowledge”

Fuzzy Monster
Clip art courtesy of DailyClipArt.net

I blogged yesterday about Statements of the Blindingly Obvious, and how you often find these in explicit knowledge bases and lessons learned systems, as a by-product of the “curse of knowledge“.

There is a second way in which this curse strikes, and that is what I call “fuzzy statements”.

It’s another example of how somebody writes something down as a way of passing on what they have learned, and writes it in such a way that it is obvious to them what it means, but which carries very little information to the reader.

A fuzzy statement contains an unqualified adjective, for example:

  • Set up a small, well qualified team…(How small? 2 people? 20 people? How well qualified? University professors? Company experts? Graduates?)
  • Start the study early….(How early? Day 1 of the project? Day 10? After the scope has been defined?)
  • A tighter approach to quality is needed…. (Tighter than what? How tight should it be?)
You can see, in each case, the writer has something to say about team size, schedule or quality, but hasn’t really said enough for the reader to understand what to do, other than in a generic “fuzzy” way, using adjectives like “small, well, early, tighter” which need to be quantified.

In each case, the facilitator of the session or the validator of the knowledge base needs to ask additional questions. How small? How well qualified? How early? How tight?
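
One way to support that validation step might be an automated first pass that flags common fuzzy words, so the facilitator knows which questions to ask. The word list and function below are a hypothetical sketch, not a substitute for a human reviewer.

    import re

    # An illustrative (not exhaustive) list of "fuzzy" words that usually need quantifying.
    FUZZY_WORDS = {"small", "large", "early", "late", "tight", "tighter", "well",
                   "good", "better", "enough", "appropriate", "sufficient", "quickly"}

    def flag_fuzzy_words(lesson_text):
        """Return the fuzzy words found, so the validator knows what to challenge."""
        words = re.findall(r"[a-z]+", lesson_text.lower())
        return sorted(set(words) & FUZZY_WORDS)

    # Example:
    print(flag_fuzzy_words("Set up a small, well qualified team and start the study early."))
    # -> ['early', 'small', 'well']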

Imagine if I tried to teach you how to bake a particular cake, and told you “Select the right ingredients, put them in a large enough bowl. Make sure the oven is hotter”. You would need to ask more questions in order to be able to understand this recipe.

Again, it comes back to Quality Control.

Any lessons management system or knowledge base suffers from Garbage In, Garbage Out, and the unfortunate effect of the Curse of Knowledge is that people’s first attempt to communicate knowledge is often, as far as the reader is concerned, useless garbage.

Apply quality control to your lessons and de-fuzz the statements.

View Original Source (nickmilton.com) Here.

The curse of knowledge, and stating the obvious

The curse of knowledge is the cognitive bias that leads to your Lesson Database being full of “statements of the obvious”

Obvious sign is obvious.

There is an interesting exercise you can do, to show how difficult it is to transfer knowledge.

This is Elizabeth Newton’s tapper-listener exercise from 1990.

 Form participants into pairs. One member is the tapper; the other is the listener. The tapper picks out a song from a list of well-known songs and taps out the rhythm of that song to the listener. The tapper then predicts how likely it will be that the listener would correctly guess the song based on the tapping. Finally, the listener guesses the song.

Although tappers predicted that listeners would be right 50% of the time, listeners were actually right less than 3% of the time.

The gap between the two figures (50% and 3%) exists because, to the tapper, the answer is obvious. To the listener, it isn’t.

This is the “curse of knowledge“.

Once we know something—say, the melody of a song—we find it hard to imagine not knowing it. Our knowledge has “cursed” us. We have difficulty sharing it with others, because we can’t readily re-create their state of mind, and we assume that what is clear to us, is clear to them.

Transferring knowledge through the written word (for example in lessons learned, or in online knowledge bases) suffers from the same problem as transferring a song by tapping. People THINK that what they have written conveys knowledge, because they can’t put themselves in the mind of people who don’t already have that knowledge.

Just because they understand their own explanations, that does not mean those explanations are clear to the reader.

This effect can be seen in written knowledge bases and lessons databases, and often appears as Statements of the Blindingly Obvious (SOTBOs).

These are statements that nobody will disagree with, but which obviously carry some more subtle import to the writer that the reader cannot discern. These include statements like:

  • “It takes time to build a relationship with the client” (Really? I thought it was instantaneous). 
  • “A task like this will require careful planning”. (Really? I thought careless planning would suffice)
  • “Make sure you have the right people on the team.” (Really? I thought we could get away with having the wrong people)
  • “Ensure that communication and distribution of information is conducted effectively.” (Really? I thought we would do it ineffectively instead)
The writer meant to convey something important through these messages, but failed completely. Why is this? Often because the writer had no help, no facilitation, and was not challenged on the emptiness of their statements.

In each case, any facilitator who had been involved in the capture of the knowledge, or any validator of the knowledge base, would ask supplementary questions:

  • How much time does it take? 
  • What would you need to do to make the planning careful enough? 
  • What are the right people for a job like this? 
  • What would ensure effective communication?
This further questioning is all part of the issue of knowledge quality assurance, to filter unhelpful material out of the knowledge base, or lessons management system, and to turn an unintelligible set of taps into a full tune.

Without this, people rapidly give up on the knowledge base as being “unhelpful”, and full of SOTBOs.

View Original Source (nickmilton.com) Here.