The "One Year After" knowledge capture event.

Many of us are used to holding knowledge capture events at the end of a project.  There is also merit in repeating this exercise one year (or more) later.

Imagine a project that designs and builds something – a factory, for example, or a toll bridge, or a block of student accommodation. Typically such a project may capture lessons throughout the project lifetime, using After Action Reviews to capture project-specific lessons for immediate re-use, and may then capture end-of-project lessons using a Retrospect, looking back over the life of the project to identify knowledge which can be passed on to future projects. This end-of-project review tends to look at the efficiency of the practices used during the project, and how these may be improved going forward. 
The review asks “Was the project done right, and how can future projects be done better?” However, what the review often does not cover is “Was the right project done?”
At the end of the project the factory is not yet operational, the bridge has only just opened to traffic, and you have just cut the ribbon on the student accommodation block. You do not yet know how well the outcome of the project will perform in practice. 

This is where the One-Year operational lessons review comes in.

You hold this review after a year or more of operation. 
  • You look at factory throughput: whether the lines are well designed, how they are being used, how effective the start-up process was, whether there are any bottlenecks in dispatch and access, and even whether the factory is in the correct location. 
  • You look at traffic over the bridge – is it at expected levels? Is it overused or underused? Is it bringing in the expected level of tolls? Does the bridge relieve congestion or cause congestion somewhere else? Does the road over the bridge have enough lanes?  Does it ice up in winter?
  • You look at usage of the student accommodation. Is it being used as expected? Are the kitchens big enough? Are there enough bike racks? Where is the wear and tear on the corridors? Where are accidents happening? What do the neighbours think?
In this review you are looking not at whether the project was done right, but at whether it was the right project (or at least the right design). The One-Year operational lessons review will generate really useful lessons to help you improve your design, and your choice of projects, in future. 

Don’t stop collecting lessons at the end of the project; collect more once you have seen the results of a year or more of operations.

Contact Knoco for help in designing your lessons learned program.

View Original Source (nickmilton.com) Here.

Why storing project files is not the same as storing project knowledge

There is often an assumption that storing project files equates to managing knowledge on behalf of future projects. This is wrong, and here’s why.

For example, this video from the USACE Knowledge Management program says “if you digitise your paper files, throw in some metadata tagging, and use our search appliance, finding what you need for your [future] project is easy”. (I have added the word [future], as this was proposed as a solution for the next project rather than for the project that created the files.)

However, there is a major flaw in just digitising, tagging and filing the project documents and assuming that this transfers knowledge: the project may have been doing things wrong, and almost certainly could have done things better with hindsight. Capturing the files will capture the mistakes, but it will not capture the hindsight, which is where the learning and the knowledge reside.

It is that hindsight you need to capture, not the files themselves.

  • Don’t capture the bid package presented to the client, capture what you should have bid, the price you should have quoted, and the package you should have used. All of these things should come from the post-bid win/loss review.
  • Don’t capture the proposed project budget, capture the actual budget, where the cost overruns were, and how you would avoid these next time. This should come from the post-project lessons review.
  • Don’t capture the project resource plan, capture the resource plan you should have had, and the resourcing you would recommend to future projects of this type. This also should come from the post-project lessons review.
  • Don’t capture the planned product design, capture the as-built design, where the adjustments were made, and why they were made. (See my own experience of working from stored plans rather than the as-built design, which cost me £500 and ten dead trees.)
  • And so on. You can no doubt think of other examples.
Capturing the hindsight is extra work, and requires analysis and reflection through Knowledge Management processes such as After Action Review and Retrospect. These processes need to be scheduled within the project plan, and need to focus on questions such as:
  • What have we learned?
  • What would we repeat?
  • What would we do differently?
  • What advice and guidance, with the benefit of hindsight, would we give to future projects?
These are tough questions, focused on deriving hindsight. Deriving hindsight is not easy, which is why these Knowledge Management processes need to be given time, space, and skilled facilitation. However, they add huge value to future projects by capturing the lessons of hindsight. Merely filing and tagging the project files is far easier, but will capture none of the hindsight, and so none of the knowledge.
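To make the contrast concrete, here is a minimal sketch in Python of the difference between filing a document and recording a lesson. The class and field names are purely illustrative (they are not taken from any particular tool or schema, and the example project details are invented); the lesson fields simply mirror the reflective questions listed above.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class FiledDocument:
    """What a document store captures: the artefact itself, plus metadata tags."""
    title: str
    file_path: str
    tags: List[str] = field(default_factory=list)


@dataclass
class CapturedLesson:
    """What a lessons process captures: the hindsight behind the artefact.

    Field names are illustrative only; they mirror the reflective questions
    above (what we learned, what we would repeat, what we would do
    differently, and the advice we would give to future projects).
    """
    context: str  # what happened, and on which project
    what_we_learned: str
    what_we_would_repeat: str
    what_we_would_do_differently: str
    advice_to_future_projects: str
    source_documents: List[FiledDocument] = field(default_factory=list)


# A filed budget on its own just records the past; the lesson carries the hindsight.
budget_file = FiledDocument("Proposed project budget", "projects/x/budget.xlsx", ["budget"])
lesson = CapturedLesson(
    context="Hypothetical project X overran its civil works budget",
    what_we_learned="The ground survey came too late to inform the cost estimate",
    what_we_would_repeat="Early supplier engagement on long-lead items",
    what_we_would_do_differently="Commission the ground survey before estimating",
    advice_to_future_projects="Budget and schedule the ground survey in the bid phase",
    source_documents=[budget_file],
)
```

The point of the sketch is simply that the second record cannot be produced by digitising and tagging alone; it only exists if someone has done the reflection.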

Capturing documents from previous projects and repeating what they did will cause you to repeat their mistakes. Better to capture their hindsight, so it can be turned into foresight for future projects. 

View Original Source (nickmilton.com) Here.

How the BBC learned from their Olympic coverage

Here is a case study of one organisation – the BBC – learning from experience. 

The Olympics is a massive event, on a scale that is unprecedented in peacetime. It’s the biggest project a country will ever undertake, other than a war. I have already blogged about the Olympic Games KM program, but it’s not just the Games organisers that need Knowledge Management, it’s everyone involved, especially those involved in new or developing areas.

One such area was Digital Broadcasting. The London Olympics were the Digital Olympics, with more HD broadcasts, web feeds, Twitter feeds and so on than any Games before. And to deliver the Digital Games, the country’s main broadcaster, the BBC, needed to develop and apply a raft of new technologies.

This post, from the BBC Internet Blog, shows how they used lesson-learning, in a structured and planned way, to ensure these products were delivered on time and to specification, and also to ensure that subsequent exercises will learn from this one.

I quote the relevant section

Lessons Learned 
 We captured the lessons from the programme as we went along, from end of sprint retrospectives and the rich data captured in our information systems above. At the end of the Olympics the project managers facilitated workshops to capture additional successes and improvement opportunities and share these with their colleagues.  

From these on-line surveys and interviews with stakeholders, over 300 lessons were captured in our project register. The key lessons touched on above were the importance of organising and planning the work amongst self-directed, multi-disciplinary teams, with a layer of information and communication support provided by the management team. The ability to prioritise the scope and deliver it incrementally with frequent opportunities to test at scale and in a live environment, contributed to the success of a once-in-a-lifetime sporting event for the BBC’s on-line audiences. 

 The experience and lessons learned in delivering this exciting programme will be carried forward by the team members into their next projects, while the environment and process limitations identified, will drive improvements in technology provision and uptake of best practices.

We can see in-project learning (end of sprint retrospectives) and post-project learning (at the end of the Olympics  – workshops to capture additional successes and improvement opportunities) – both activities built into the work program.

We worked with the BBC “live and learn” team about 10 years ago to introduce some of these learning principles, and they have subsequently been MAKE award finalists for many years. This blog shows that KM and learning practices are still alive and thriving at the BBC.

View Original Source (nickmilton.com) Here.

Is Learning from Failure the worst way to learn?

Is learning from failure the best way to learn, or the worst?

“Classic Learning” by Alan Levine on Flickr

I was driven to reflect on this when I read the following quote from Clay Shirky:

Learning from experience is the worst possible way to learn something. Learning from experience is one up from remembering. That’s not great. The best way to learn something is when someone else figures it out and tells you: “Don’t go in that swamp. There are alligators in there.”

Clay thinks that learning from (your own bad) experience is the worst possible way to learn, but perhaps  things are more complex. Here are a few assertions.

  • If you fail, then it is a good thing to learn from it. Nobody could argue with that!
  • It is a very good plan to learn from the failure of others in order to avoid failures of your own. This is Clay’s point: learning only from your own failures is a poor approach if, instead, you can learn from others. Let them fail, so you can proceed further than they did. 
  • If you are trying something new, then plan for safe failure. If there is nobody else to learn from, then you may need to plan a fail-safe learning approach. Run some early stage prototypes or trials where failure will not hurt you, your project, or anyone else, and use these as learning opportunities. Do not wait for the big failures before you start learning.
  • Learn from success as well. Learn from the people who have avoided all the alligators, not just from the people that got bitten. And if you succeed, then analyse why you succeeded and make sure you can repeat the success.
  • Learning should come first, failure or success second. That is perhaps the worst thing about learning from experience – the experience has to come first. In learning from experience “the exam comes before the lesson.” Better to learn before experience, as well as during and after.  

Learning from failure has an important place in KM, but don’t rely on making all the failures yourself. 

View Original Source (nickmilton.com) Here.

5 reasons why organisations don’t learn lessons.

If lesson learning is so simple, why do organisations so often fail to learn the big lessons?

We seem to be able to learn the little lessons, like improving small aspects of projects, but the big lessons seem to be relearned time and time again. Why is this?

Some of the answers to this question are explored in the article “Lessons We Don’t Learn: A Study of the Lessons of Disasters, Why We Repeat Them, and How We Can Learn Them” by Amy Donahue and Robert Tuohy. In this article they look at lessons from some of the major US emergency response exercises, and find that many of them are repeated time and again.

In particular, repeated lessons are found in the areas of

  • Failed Communications
  • Uncoordinated Leadership
  • Weak planning
  • Resourcing constraints
  • Poor Public relations

In fact these lessons come up so often that staff in disaster response exercises can almost tell you in advance what is going to fail.  People know that these issues will cause problems, but nobody is fixing them.  
You could draw up a similar list for commercial projects and find many of the same headings, with the possible addition of issues such as
  • Scoping and scope control
  • Subcontracting
  • Pricing
Donahue and Tuohy explore why it is so difficult to learn about these big issues, and come out with the following factors:
  1. Lack of motivation to fix the issues. As Donahue and Tuohy explain, 

“Individual citizens rarely see their emergency response systems in action. They generally assume the systems will work well when called upon. Yet citizens are confronted every day by other problems they want government to fix – failing schools, blighted communities, and high fuel prices. Politicians tend to respond to these more immediately pressing demands, deferring investments in emergency preparedness until a major event re-awakens public concern. As one incident commander put it, “Change decisions are driven by politics and scrutiny, not rational analysis.” 

In addition, they identify the sub-issues of a lack of ability to sustain commitment, a lack of a shared vision on what to do about the lessons, and a lack of willingness of one federal or local body to learn from another.

All of these issues are also seen in commercial organisations. There is a reluctance to make big fixes if it’s not what you are being rewarded for, a reluctance to learn from other parts of the organisation, and difficulties in deciding which actions are valid.

  2. An ineffective lessons capture process. Donahue and Tuohy identify the following points:

“While some (AAR) reports are very comprehensive and useful, lessons reporting processes are, on the whole, ad hoc. There is no universally accepted approach to the development or content of reports… often several reports come out of any given incident… agencies or disciplines write their own without consulting each other. These reports differ and even conflict … there is no independent validation mechanism to establish whether findings and lessons are “right” … concern about attribution and retribution is a severe constraint on candour in lessons reporting …  the level of detail required to make a lesson meaningful and actionable is lost … meaning is also diluted by the lack of a common terminology … AARs typically focus on what went wrong, but chiefs want to know what they can do that is right. Reports tend to detail things that didn’t work, without necessarily proposing solutions. … those preparing the reports need to understand not only what happened, but also why it happened and what corrective action would have improved the circumstances. Reports of this depth and quality are relatively rare … many opportunities to learn smaller but valuable lessons are foregone (and) there is no mechanism by which these smaller lessons can be easily reported and widely shared”. 

That’s quite a list, and we can see these issues in industry as well. Lesson learning crucially needs a consistent, candid and validated capture process if these problems are to be avoided.

  3. An ineffective lessons dissemination process. Donahue and Tuohy make the following points:

“The value of even well-crafted reports is often undermined because they are not distributed effectively. Most dissemination is informal, and as a result development and adoption of new practices is haphazard. Generally, responders must actively seek reports in order to obtain them. … There is no trusted, accessible facility or institution that provides lessons learned information to first responders broadly, although some disciplines do have lessons repositories. (The Wildland Fire Lessons Learned Center and the Center for Army Lessons Learned are two prominent examples.)”

In fact, the Wildland Fire Lessons Learned Center and the Center for Army Lessons Learned represent good practice (not just in technology, but in resourcing and roles as well) and are examples that industry can learn from. However, the issue here is not just the dissemination of lessons, but the synthesis of knowledge from multiple lessons – something the emergency services generally do not do.

  4. An ineffective process for embedding change. Donahue and Tuohy address this under the heading of “learning and teaching”.

“Most learning and change processes lack a formal, rigorous, systematic methodology. Simplistically, the lesson learning and change process iterates through the following steps: Identify the lesson > recognize the causal process > devise a new operational process > practice the new process > embed/institutionalize and sustain the new process.  It is apparent in practice that there are weaknesses at each of these steps….

The emergency response disciplines lack a common operating doctrine…. Agencies tend to consider individual incidents and particular lessons in isolation, rather than as systems or broad patterns of behavior. … Agencies that do get to the point of practicing a new process are lulled into a false sense that they have now corrected the problem. But when another stressful event happens, it turns out this new process is not as firmly embedded as the agency thought … Old habits seem “safer,” even though past experience has shown they do not work. 

Follow-up is inadequate … Lessons are not clearly linked to corrective actions, then to training objectives, then to performance metrics, so it is difficult for organizations to notice that they have not really learned until the next incident hits and they get surprised”.

This is the issue of lesson management, which represents Stage 2 of lesson learning maturity. Many organisations, such as the ones involved in emergency response, are stuck at Stage 1.  Lesson management involves tracking and supporting lessons through the whole lifecycle, from identification through to validated and embedded change.
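As an illustration only – this is a sketch of the general idea, not Donahue and Tuohy’s model nor the schema of any real lessons-management tool – the change process quoted above could be tracked as a simple state machine, so that every lesson is visibly managed from identification through to validated and embedded change rather than left to languish in a report:

```python
from enum import Enum, auto


class LessonState(Enum):
    """Lifecycle stages paraphrased from the quoted change process:
    identify > recognise the cause > devise a new process > practise it > embed it."""
    IDENTIFIED = auto()
    CAUSE_UNDERSTOOD = auto()
    ACTION_DEFINED = auto()
    ACTION_PRACTISED = auto()
    EMBEDDED = auto()          # validated and sustained change
    CLOSED_NO_ACTION = auto()  # explicitly rejected, with a recorded reason


# Allowed transitions: a lesson only counts as "learned" once it reaches EMBEDDED.
TRANSITIONS = {
    LessonState.IDENTIFIED:       {LessonState.CAUSE_UNDERSTOOD, LessonState.CLOSED_NO_ACTION},
    LessonState.CAUSE_UNDERSTOOD: {LessonState.ACTION_DEFINED, LessonState.CLOSED_NO_ACTION},
    LessonState.ACTION_DEFINED:   {LessonState.ACTION_PRACTISED},
    LessonState.ACTION_PRACTISED: {LessonState.EMBEDDED, LessonState.ACTION_DEFINED},  # rework if it doesn't stick
    LessonState.EMBEDDED:         set(),
    LessonState.CLOSED_NO_ACTION: set(),
}


def advance(current: LessonState, target: LessonState) -> LessonState:
    """Move a lesson to its next stage, refusing to skip steps."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"Cannot move a lesson from {current.name} to {target.name}")
    return target
```

Lessons that stall before the embedded state are exactly the ones that, as the article notes, resurface when the next incident hits.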

There really is little point spending time collecting lessons if these lessons are then not managed through to resolution.

  5. A lack of dedicated resources. Donahue and Tuohy again – 

“Commitment to learning is wasted if resources are not available to support the process. Unfortunately, funds available to sustain corrective action, training, and exercise programs are even leaner than those available for staff and equipment”.

Lesson-learning and lesson management need to be resourced. Roles are needed such as those seen in the US Army and the RCAF, or in Shell, to support the process.  Under-resourcing lesson learning is a major reason why lesson learning so often fails.

Conclusions.

Donahue and Tuohy have given us some sobering reading, and provided many reasons why lesson learning is not working in disaster response. Perhaps the underlying causes are a failure to treat lesson learning as a system rather than as a product (i.e. a report of documented lessons), and a failure to treat lesson learning with the urgency and importance it deserves.

Make no mistake, many commercial organisations are falling into the same pitfalls that Donahue and Tuohy describe.

If learning lessons is important (and it usually is), then it needs proper attention, not lip service.

View Original Source (nickmilton.com) Here.

How can we learn lessons when every project is different?

This is another one to add to the “Common Knowledge Management Objections” list, and it’s worth thinking in advance what your counter-argument might be.

It’s a push-back you hear quite often in project organisations:

“We can’t do Knowledge Management, especially lessons learned, as all our projects are different”.

I last heard this from a technology company, and by saying “every project is different”, they mean that “every project has a different client, a different product, and different technical specifications”.

To some extent, they are correct, and this project variation reduces some of the impact of lesson learning. Certainly lessons add the most value when projects are the most similar. But even when projects change, learning still adds value.

Firstly, even on those technology projects, the process will be the same. 

The process of building an understanding of the client requirements, choosing and forming the team, selecting and managing subcontractors, balancing innovation against risk, communicating within the team, keeping the client requirements always in mind, managing quality, managing cost, managing time, managing expectations, managing risk, and so on.

There is a huge amount of learning to be done about the process of a project, even when the tasks are different.

Secondly, the other common factor for this technology company was that every project was a variant on their existing product. 

They learned a lot about the way the product worked, and about the technology behind it, with every new project. If this additional knowledge is not captured, they have to rediscover it anew every time. If it is captured, then each project becomes an exploration of the technology, and builds the company’s understanding so that new products can be developed in future.

So even if every project is different, every project can still be a learning opportunity. 

View Original Source (nickmilton.com) Here.

Sharing knowledge by video – a firefighting example

The US Wildfire community is an area where Knowledge Management and Lesson Learning have been eagerly embraced, including through the use of video.

The need for Knowledge Management and Lesson Learning is most obvious where the consequences of not learning are most extreme. Firefighting is a prime example of this – the consequences of failing to learn can be fatal, and firefighters were early adopters of KM. This includes the people who fight the ever-increasing numbers of grass fires and forest fires, known as wildland fires.

The history of lesson learning in the Wildfire community is shown in the video below, including the decision after a major tragedy in 1994 to set up a lesson learned centre to cover wildfire response across the whole of the USA.

The increase in wildland fires in the 21st century made it obvious to all concerned that the fire services needed to learn quickly, and the Wildland Fire Lessons Learned Center began to introduce a number of activities, such as After Action Reviews and the collection of lessons from across the whole of the USA. A national wildfire “corporate university” is planned, of which the Lessons Learned Center will form a part.

The wildfire lessons center can be found here, and this website includes lesson learned reports from various fires, online discussions, a blog (careful – some of the pictures of chainsaw incidents are a bit gruesome), a podcast, a set of resources such as recent advances in fire practice, a searchable incident database, a directory of members, and the ability to share individual lessons quickly. This is a real online community of practice.

Many of the lessons collected from fires are available as short videos published on the Wildland Fire Lessons Learned Center YouTube channel, so they are available to firefighters on handheld devices. An example lesson video is shown below: it shares lessons from a particular fire and speaks directly to the firefighter, asking them to imagine themselves in a particular situation. This example covers the “close call” deployment of a fire shelter during the Ahorn fire in 2011, and includes material recorded from people who were actually caught up in the situation.

Sometimes lessons can be drawn from multiple incidents, and combined into guidance. Chainsaw refueling operations are a continual risk during tree felling to manage forest fires, as chainsaw fuel tanks can become pressurised, spraying the operator with gasoline when the tank is opened (the last thing you want in the middle of a fire). Lessons from these incidents have been combined into the instructional video below.

This video library is a powerful resource, with a very serious aim – to save lives in future US Wildland fires. 

View Original Source (nickmilton.com) Here.

Learning by Watching

There is only a certain amount you can learn by reading. Sometimes you have to go and see.

With complex knowledge, there is more going on than can ever be documented, and (if it’s possible) the best way to learn is to go and see for yourself. Toyota call this “Genchi Genbutsu” – an approach they apply to problem solving. Wikipedia has this story –

“Taiichi Ohno, creator of the Toyota Production System is credited, perhaps apocryphally, with taking new graduates to the shopfloor and drawing a chalk circle on the floor. The graduate would be told to stand in the circle, observe and note what he saw. When Ohno returned he would check; if the graduate had not seen enough he would be asked to keep observing. Ohno was trying to imprint upon his future engineers that the only way to truly understand what happens on the shop floor was to go there”.

In Knowledge Management, such “Knowledge Visits” have a place in knowledge transfer. If you really want to learn about something complex and truly understand what happens, then go and see, and talk to the people who are involved.

See for example the Observer Programme organised by the International Olympic Committee as part of their Knowledge Management framework.

This article describes how more than 100 staff from the PyeongChang Organizing Committee for the Olympic and Paralympic Winter Games visited and observed the Rio 2016 Olympics.

The IOC’s Observer Programme, an integral element of Olympic Games Knowledge Management, is a key component of the knowledge transfer process, giving future hosts a unique opportunity to live, learn and experience real operations, guided by key personnel of the Rio Organizing Committee or the IOC. 

POCOG will attend a total of seventy-six programmes, including Airport Operations, Look of the Games, Accreditation, Medical Services, Venue Management, Ticketing, Venue Energy, and Transport. 

SEO, Min-jung, the Head of Doping Control Team said, “After closely observing Rio Games, I now know what to do for PyeongChang 2018.” She added, “I can expect what needs to be done to give athletes absolute confidence in the doping control system and uphold the integrity of Olympics and Paralympics Games.” 

POCOG Spokesperson SUNG, Baik-you commented, “Thanks to invaluable IOC Observer Programme, POCOG staffs are here to watch and learn, and every moment and experience from airport arrival to competition venue visit will be a learning experience for PyeongChang 2018.”

Sometimes, in cases like this, you just have to go and see in order to learn.

View Original Source (nickmilton.com) Here.

Why do some organisations just not want to learn?

Having knowledge, and doing something with that knowledge, are two different things. There is often a gap between knowing and doing.

 
Why do you get teams or organisations that just don’t want to learn?

Take the example of one company, with dysfunctional project management practices. They have had several external audits which tell them that their practices are dysfunctional, and that they need to introduce proper planning, proper communication, and proper risk management, and yet they don’t change. They continue as before, and their projects are delivered late and over budget.

Why?

They have the knowledge, but they don’t take the action.

This, of course, is the phenomenon addressed by the well-known book “The Knowing-Doing Gap“, which describes several reasons why teams and organisations will not learn, including the following:

  1. They haven’t made a close enough link between knowledge and action.  They think that gaining knowledge, for example through the external audits mentioned above, is sufficient in itself. Certainly the company we studied had not committed to taking action as a result of the audit.   
  2. “The way we have always done things” is a very hard habit to break. Much work is done through habit, and those habits have been built up and reinforced over the years without being challenged. New knowledge challenges old knowledge and old habits, and old habits die hard (the curse of prior knowledge).
  3. They are disempowered. I argued recently that although knowledge management can support empowerment, it requires empowerment in the first place. Teams which are disempowered cannot learn.   
  4. They understand the how, but they don’t understand the why. They may have imported tools and techniques and processes, but they don’t understand the philosophy behind them, and so they cannot make them work.   
You can lead a company to knowledge, but you can’t make it learn. 
To become a learning organisation requires more than just effective knowledge management; it requires a commitment to learning and a commitment to change.
The organisation must accept that if knowledge is to add any value, then it must lead to action. It must accept that that action may often challenge the status quo, and will frequently change the way things are already done.

It must not just accept this, it must welcome it, and it must empower the teams and individuals within the organisation to take action on their own learning. And it must also realise that when it adopts new techniques and new approaches, it has to understand the philosophy behind them, as well as the practices themselves.

Only through this approach can an organisation hope to become a learning organisation.

View Original Source Here.

The importance of Reflection in KM

We don’t learn by doing, we learn by reflecting on doing, which is why your KM program should include reflective processes.

Kolb learning cycle. Public domain image from Wikipedia.

There is a popular quote on the Internet, often attributed to John Dewey, that “We do not learn from an experience … We learn from reflecting on an experience”. It probably isn’t from Dewey, although it is a summarisation of Dewey’s teaching, but it does make the point that no matter how broad your experience is, it doesn’t mean you have learned anything.

Let’s match that with another Internet quote, attributed to Paul Schoemaker: “Experience is inevitable, Learning is not”. Without reflection, and without change as a result of that reflection, nothing has been learned, no matter how many experiences you have.

“Observation and Reflection” is also a part of the Kolb learning cycle, shown here, which is a well-established model for how individuals learn.

Knowledge Management is not so much about Individual Learning as about Organisational Learning and Team Learning, but reflection is just as important in the KM context. Reflection needs to be built into work practice through reflective processes such as the following:

  • Introducing After Action Review as a reflective team process, to allow teams to reflect on, and collectively learn from, actions and activities;
  • Introducing Retrospect as a reflective team process, to allow teams to reflect on, and collectively learn from, projects and project phases;
  • Introducing processes such as Knowledge Exchange to allow members of a Community of Practice to reflect on, and share, their practice;
  • Introducing processes such as knowledge interviews to guide experts through structured reflection on their own experience, and to make this public for others.

It is only through introducing reflective processes such as these, and then acting on the new knowledge gained, that your organisation will stop just Experiencing, and start Learning.

View Original Source Here.
