Is it possible for an organisation to learn?

Can organisations learn, or can only people learn? Some thoughts on the subject.


We often hear about “organisational learning” but is learning something that organisations actually can do? Or is learning the province of people and animals? (Let’s put machine learning aside for the moment – that is another discussion).

There is a school of thought that learning is a human attribute – that only humans are able to learn. After all, learning requires a memory in which new knowledge can be stored. Humans have a memory, but do organisations?

You could argue that organisations have two memories – one is the collective memory of the individuals in the organisation, often reinforced through stories and “shared experience”.

The second memory is held in “the way we work” – the processes, procedures, doctrines, structures, norms, behaviours, organigrams, and the stories that are told. As one project person said to me – “our standard process is made up of all the lessons we have learned over the years”.

The first memory comes and goes with the people, and the effect of this can be observed in the cycles of unlearning you see in some organisations, where the same mistakes are repeated on a 5 to 10 year cycle as the older staff retire. The second memory is potentially longer-term, and survives the changeover of staff, but is also slower to build up and slower to respond to events.

However, I would argue that this deeper, slower memory is where real organisational learning can take place. An organisation, through deliberate learning activities and learning cycles, can steadily but surely change the way it operates, in response to events and to new experiences.

So, yes, organisations can learn. Organisations can modify their behaviour as a result of experience, and that, surely, is a form of learning. Maybe it’s more mechanistic than intuitive learning, maybe they don’t learn as fast or as well as a human does, maybe they learn more like the way a dog learns.

However, I believe that organisations can learn if they develop a structure for learning. The bigger question is: why don’t they learn better, and more often?

View Original Source (nickmilton.com) Here.

What makes a good learner?

If you want a learning organisation, you need an organisation of learners. But what makes a good learner?

Here’s an article called “Seven Characteristics of Good Learners” by Maryellen Weimer, which addresses just that question.

According to Maryellen:

  • Good learners are curious – They wonder about all sorts of things, often about things way beyond their areas of expertise. They love the discovery part of learning. 
  • Good learners pursue understanding diligently – A few things may come easily to learners but most knowledge arrives after effort, and good learners are willing to put in the time to seek and to find. Good learners are persistent. They don’t give up easily. 
  • Good learners recognise that a lot of learning isn’t fun – That doesn’t change how much they love learning. Learning is hard work sometimes, but when understanding finally comes, when they get it, when all the pieces fit together, that is one special thrill. 
  • Failure frightens good learners, but they know it’s beneficial – It’s a part of learning that offers special opportunities that aren’t there when success comes quickly and without failure. In the presence of repeated failure and seeming futility, good learners carry on, confident that they’ll figure it out. 
  • Good learners make knowledge their own – This is about making the new knowledge fit with what the learner already knows, not making it mean whatever the learner wants. Good learners change their knowledge structures in order to accommodate what they are learning. 
  • Good learners never run out of questions – There’s always more to know. Good learners are never satisfied with how much they know about anything. They are pulled around by questions—the ones they still can’t answer, or can only answer part way, or the ones without very good answers. 
  • Good learners share what they’ve learned – Good learners are teachers committed to sharing with others what they’ve learned. They write about it, and talk about it. Good learners can explain what they know in ways that make sense to others. They aren’t trapped by specialised language. They can translate, paraphrase, and find examples that make what they know meaningful to other learners. They are connected to the knowledge passed on to them and committed to leaving what they’ve learned with others.

An organisation of such people would be a powerful learning organisation indeed.

View Original Source (nickmilton.com) Here.

Keeping a decision log as an aid to learning

A decision log can be a useful tool for learning, and a valuable part of a KM system.

Many projects and many non-project bodies maintain a decision log, to keep track of, and to publish, the major decisions which have been made. This allows you to revisit the decisions later, and to understand the basis behind them in the light of later knowledge. If you know why decisions were made, then you know whether and when to revisit them.

Some public bodies publish their decision logs, for example some of the UK police and crime commissioners have public decision logs.

But how helpful are these logs for learning purposes? A simple decision log will record which decisions were made, and by whom, but there is often no way to go back and understand why those decisions were made.

The Washington DNR site has a good decision log template, including a column for the decision rationale and one for the alternatives considered, but even that template lacks an “assumptions” column, and incorrect assumptions are often one of the major sources of learning.

In engineering, the Toyota A3 report acts as a decision log for product design, and is a simple and visual way to keep track of engineering decisions, recording:

  • The problem
  • The details of the current situation
  • The root cause analysis
  • The “target state”
  • The alternative countermeasures to address the root causes
  • The chosen implementation plan, with accountable actions and costs
  • A follow-up plan, including preparation of a follow-up report

These reports are used to communicate decisions in review meetings, to build a knowledge base of good practices in product development, and to develop a final Basis of Design document.

If a decision log is to be useful as a learning tool, then it needs to cover some of the same ground as the A3 report, and to record:

  • The problem that needed to be addressed (in terms of the difference between the current reality and the desired state)
  • An analysis of the problem
  • The alternative options that were rejected
  • The decision that was made
  • Why that decision was made, i.e. the deciding factors that resulted in choosing that particular option
  • The assumptions behind the decision.

So on a crucial project, consider the use of a decision log, but make sure you record the assumptions and rationale behind each of those decisions.
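
A minimal sketch of such a log entry, expressed as a simple Python record, might look like the following. The field names are illustrative only, drawn from the list above rather than from any particular template, and the example entry is entirely hypothetical:

from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class DecisionLogEntry:
    """One entry in a project decision log (illustrative fields only)."""
    decision_id: str                    # e.g. "D-001"
    date_decided: date
    decided_by: str                     # the person or board accountable for the decision
    problem: str                        # gap between current reality and desired state
    analysis: str                       # summary of the analysis of the problem
    options_rejected: List[str] = field(default_factory=list)
    decision: str = ""                  # the option that was chosen
    rationale: str = ""                 # the deciding factors behind the choice
    assumptions: List[str] = field(default_factory=list)   # revisit the decision if these prove wrong

# Hypothetical example entry
entry = DecisionLogEntry(
    decision_id="D-001",
    date_decided=date(2018, 7, 4),
    decided_by="Project steering board",
    problem="Current plant cannot meet forecast demand",
    analysis="Root cause is line 2 capacity; three expansion options assessed",
    options_rejected=["Outsource production", "Add a night shift"],
    decision="Build a second production line",
    rationale="Lowest unit cost over ten years",
    assumptions=["Demand grows 5% per year", "Planning permission granted by Q2"],
)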

View Original Source (nickmilton.com) Here.

How KM helped England learn from history and beat Colombia

Sometimes learning from personal failure is the way to win.

Gareth Southgate in despair after missing the penalty in 1996,
an event which indirectly led to England’s win last night.

Last night England beat Colombia in the Football World Cup quarter final. The game was decided on a penalty shoot-out – a process at which England are historically, notoriously bad – during which players take it in turns to shoot for goal, competing one-on-one with the goalkeeper. After the match, the manager Gareth Southgate talked about the importance of the players “owning the process” of taking penalties as part of the reason for success.

The story of developing that process, and player ownership in the process, has a lot of Knowledge Management in it, including learning from mistakes, analysis of the situation, and plenty of preparation and practice.

Southgate has his own personal and painful history with penalty taking, having missed a crucial penalty for England as a young player against Germany in the semi-final of Euro 1996. His feelings were described in this article from the Daily Telegraph.

“I wanted (the penalty) to be firmly struck, but didn’t want to hit it too hard and risk putting it over the bar or wide. In trying to be precise, I hit a soft and badly placed penalty. Kopke saved comfortably. ‘What have I done?’ I put my arms over my head. The thought of the lads on the halfway line made me despair.”

After the 1996 match, the then Prime Minister of the UK, John Major, offered Southgate a consoling hug, but the failure that day made Southgate determined to learn to do better.

At the time, there was no “process” for penalty taking, but Southgate wanted to learn the lessons from the past, and 22 years later, as England manager, he wanted there to be a process, and for the players to “own it”. An article in the Guardian quotes him as saying “The depth of knowledge and understanding wasn’t so great (in 1996) and we didn’t have as much information as we do now … I have had a couple of decades thinking it through”.

So Southgate has taken the following learning approach for his team:

  • He started his team practising for penalties in March, well before the World Cup started.
  • He commissioned a study into the failed penalty shoot-outs of previous tournaments, which discovered that England players took less time over their kicks than any other nation. “They will take their time now”.
  • He has tried new ways of putting the players under pressure by organising competitions in which they jeer and shout at each other, just as the Colombian fans jeered and shouted last night.
  • He has a list of his squad’s penalty-takers in order from 1 to 23, constantly updated with training performance and injury.
  • Collectively, with the team, he has discussed how they would ideally approach a penalty shootout, “making sure there’s a calmness, that we own the process and it’s not decisions made on the spur of the moment. We have to make sure it’s calm, that the right people are on the pitch and it doesn’t become too many voices in the players’ heads”.
The result of this learning was clear to see last night – a cool, calm and collected process which won England the match.

View Original Source (nickmilton.com) Here.

Learning before, during and after; how it works in Sport

“Learning before, during and after” is a common principle applied to knowledge management in project-based organisations.  But what does it really mean?


We may have read about the principle of learning before, during and after in projects, but if you want to see how it really works in practice, take a look at sport.

A sports team – a football team, a rugby team or a cricket team – has built this learning rhythm into its working week.

The team do their Learning Before on a Friday. They study video recordings of their forthcoming opponents, they identify the other team’s weaknesses and patterns of play, and they discuss the tactics and strategy they might use on the Saturday. They look back on occasions when these opponents have been beaten, and analyse how best to beat them this weekend.

The team do their Learning After on a Monday. They review and analyse, often in great detail and with the aid of computer analysis, the video footage of the weekend’s match. They identify the things that have gone well, and analyse why they went well. They analyse the things which did not go to plan, look at the root causes behind them, and identify anything they need to work on during the week and do differently in future matches.

What about Learning During? Learning During takes place during the match, partly through messages sent on to the field from the management team, and partly through self-analysis during the half-time break. The purpose of Learning During is to identify and change the things that are not working, and to identify and repeat the things that are. Often the team that learns and adapts best on the pitch wins the match.

This is just the same way that a project can learn. The team can analyse, before work starts, the challenges and problems they will face, and seek to learn from previous examples where those challenges have been overcome, through mechanisms such as Knowledge Management planning and Peer Assist.

They can learn after the project: they identify the things that have gone well, and analyse why they went well; they analyse the things which did not go to plan, look at the root causes behind them, and identify anything that needs to be done differently on future projects, through processes such as Retrospect. Then, during the project itself, they should be open to learnings both from the project team themselves and from external observers. They can identify and change things that are not working, and identify and repeat things that are.

That’s how winning teams learn.

View Original Source (nickmilton.com) Here.

The right way to learn from failure

We acknowledge, in KM, that learning from failure is desirable, but what kind of failure are we talking about?


We hear the terms “Failure”, “Error” and “Mistake” very often in Knowledge Management circles, often treated as synonyms. In particular, the terms “Learning from Failure”, “Learning from Mistakes” and “Learning by Trial and Error” are used almost interchangeably.

But these words are not complete synonyms, particularly in Knowledge Management terms.

According to Oxford Dictionaries Online, a Mistake is an act or judgement that is misguided or wrong, an Error is the state or condition of being wrong in conduct, judgement or outcome, while a Failure is a lack of success.

So I could Fail at something or make an Error, without it necessarily being a Mistake, even though a Mistake always leads to error. I could try something that was a stretch or a gamble, and that turned out to be impossible. I could try something unknown, and fail to succeed, but without making an error in judgement, or doing anything misguided. You cannot be misguided when attempting the unknown. Edison famously had a whole string of failures before inventing the light bulb, but none of them were mistakes; they were all part of a process of enlightened trial, failure and (ultimately) success.

A mistake, or an error of judgement, implies that you “should have known better”, and “should have known better” implies that knowledge was available, but not accessed or not used, even though it “should have been”. To learn from a mistake, you need to both acknowledge that available knowledge (that you now realise you should have known), and also sharpen up your Knowledge Management (so that in future, you acquire the knowledge you “should know” before you start an enterprise). Repeat failures are always mistakes.

There are therefore four categories when it comes to learning from failure.

1) There is learning from a failure when trying the unknown. You were trying something new, and met an unexpected barrier. This was not your fault – this was part of the exploration process. It was a justified error. You gained new knowledge, and need not only to learn from this, but to spread the learning to others who are also exploring the same area.

2) There is learning from planned failure. Sometimes you plan to fail. This is often the best way to explore a new product or process – try many things at once and select the one that does not fail. Create prototypes, and plan pilots in as fail-safe an environment as you can. Deliberately try many things knowing that some will fail, and use this as a selection process to find the right path. This is “trial and error”. This was Edison’s approach, and is often the smartest way to learn your way forward. As Dave Kelly, CEO of IDEO, said – “Enlightened trial and error out-performs the planning of flawless intellects”.

3) There is learning from a failure when you should have known better. You were trying something new to you and met a barrier you did not expect, but found afterwards that others knew of this barrier. This was a failure of the knowledge management framework. You were either unaware of this knowledge, unable to find it, or unaware even of the need to look. In this case, you have not gained new knowledge about the barrier which needs to be shared; you have gained knowledge about the unreliability of the KM framework, and this needs to be fixed and improved before the failure is repeated (see this cautionary tale).

4) There is learning from a failure when you had access to the knowledge, but ignored it. You were trying something new, others had shared knowledge with you, warned you about the barriers, but for some reason you thought “those barriers won’t affect me – my project is different“. This was a MISTAKE. This was an error of judgement on your part. You were at fault. You have learned nothing new about the barrier, but you have learned something about yourself.
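
To make the distinction concrete, the four categories can be seen as the outcome of three questions: was the failure planned, did the knowledge exist at all, and was it accessed? Here is a minimal sketch of that logic in Python; the flag names and category labels are my own, purely for illustration:

from enum import Enum

class FailureCategory(Enum):
    EXPLORING_THE_UNKNOWN = "justified failure: the knowledge did not exist yet"
    PLANNED_FAILURE = "deliberate trial and error to select what works"
    KM_FRAMEWORK_FAILURE = "the knowledge existed but could not be found or accessed"
    MISTAKE = "the knowledge was available but ignored: an error of judgement"

def classify_failure(planned: bool, knowledge_existed: bool, knowledge_accessed: bool) -> FailureCategory:
    """Map the three questions onto the four categories described above."""
    if planned:
        return FailureCategory.PLANNED_FAILURE
    if not knowledge_existed:
        return FailureCategory.EXPLORING_THE_UNKNOWN
    if not knowledge_accessed:
        return FailureCategory.KM_FRAMEWORK_FAILURE
    return FailureCategory.MISTAKE

# A repeat failure implies the knowledge existed (it was learned last time) and was ignored:
assert classify_failure(planned=False, knowledge_existed=True, knowledge_accessed=True) == FailureCategory.MISTAKE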

Allow a lot of failure, and you will meet a lot of success.
Allow a lot of mistakes, and you won’t.

It’s as simple as that.

View Original Source (nickmilton.com) Here.

A case study of a failed learning system

When lesson learning failed in the Australian Defence Force, they blamed the database. But was this all that was at fault?

Here’s an interesting 2011 article entitled “Defence lessons database turns off users”. I have copied some of the text below to show that, even though the lessons management software seems to have been very clumsy (which is what the title of the article suggests), there was much more than the software at fault.

 “A Department of Defence database designed to capture lessons learned from operations was abandoned by users who set up their own systems to replace it, according to a recent Audit report. The ADF Activity Analysis Data System’s (ADFAADS) was defeated by a “cultural bias” within Defence, the auditor found. Information became fragmented as users slowly abandoned the system”.

So although the article title is “defence lessons database turns off users”, the first paragraph says that it was “defeated by cultural bias”. There’s obviously something cultural at work here ……

“Although the auditor found the structure and design of the system conformed to ‘best practice’ for incident management systems, users found some features of the system difficult to use. Ultimately it was not perceived as ‘user‐friendly’, the auditor found. Convoluted search and business rules turned some users against the system”. 

….but it also sounds like a clumsy and cumbersome system

“In addition, Defence staff turnover meant that many were attempting to use ADFAADS with little support and training”.

…with no support and no training.

“An automatically-generated email was sent to ‘action officers’ listing outstanding issues in the system. At the time of audit, the email spanned 99 pages and was often disregarded, meaning no action was taken to clear the backlog”.

There needs to be a governance system to ensure actions are followed through, but sending a 99-page email? And with no support and follow-up?

 “It was common for issues to be sent on blindly as ‘resolved’ by frontline staff to clear them off ADFAADS, even though they remain unresolved, according to the auditor”.

Again, no governance. There needs to be a validation step for actions, and sign-off for “resolution” should not be devolved to frontline staff.

 “Apart from a single directive issued by Defence in 2007, use of the database was not enforced and there were no sanctions against staff who avoided or misused it”.

There’s the kicker. Use of the lessons system was effectively optional, with no clear expectations, no link to reward or sanction, no performance management. It’s no wonder people stopped using it.

So it isn’t as simple as “database turned off users”. It’s a combination of:

  • Poor database
  • Poor notification mechanism
  • No support
  • No training
  • No incentives
  • No governance
  • No checking on actions

It’s quite possible that if the other items had been fixed, then people might have persevered with the clumsy database, and it’s even more likely that if they built a better database without fixing the other deficiencies, then people still would not use it.

What they needed was a lessons management system, not just a database.

So what was the outcome? According to the article,

…..establish a clear role and scope for future operational knowledge management repositories, and develop a clear plan for capturing and migrating relevant existing information ….. prepare a “user requirement” for an enterprise system to share lessons.

In other words – “build a better database and hope they use it”. Sigh.

View Original Source (nickmilton.com) Here.

What are the outputs of the KM workstream?

In KM, organisations need a Knowledge workstream as well as a Product/Project workstream. But what are the knowledge outputs?

I have blogged several times about the KM workstream you need in your organisation: the knowledge factory that runs alongside the product factory or the project factory. But what are the outputs, or products, of the knowledge factory?

The outputs of the product factory are clear – they are designed and manufactured products sold to customers. The outputs of the project factory are also clear – the project deliverables which the internal or external client has ordered and paid for.

We can look at the products of the KM workstream in a similar way. The clients and customers for these are the knowledge workers in the organisation who need knowledge to do their work better: to deliver better projects and better products. It is they who define what knowledge is needed. Generally this knowledge comes in the following forms:
  • Standard practices which experience has shown are the required way to work. These might be design standards, product standards, standard operating procedures, norms, standard templates, algorithms and so on. These are mandatory, they must be followed, and have been endorsed by senior technical management.
  • Best practices and best designs which lessons and experience have shown are currently the best way to work in a particular setting or context. These are advisory, they should be followed, and they have been endorsed by the community of practice as the current best approach.
  • Good practices and good options which lessons from one or two projects have shown to be a successful way to work. These might be examples of successful bids, plans, templates or designs, and they have been endorsed by the community of practice as “good examples” which might be copied in similar circumstances, but which are not yet robust enough to be recognised as “the best”. 
  • More generic accumulated knowledge about specific tasks, materials, suppliers, customers, legal regimes, concepts etc.
The project/product workstream also creates outputs which act as inputs to the knowledge workstream; these are the knowledge deliverables, the lessons which capture hindsight, and the useful items which can be stored as good practices and good options. The link between lessons and best practices is described here, and shows how the two workstreams operate together to gather and deliver knowledge to optimise results.
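
As an illustration, these knowledge products could be represented as records in a knowledge base, each carrying its level of endorsement. The type names and fields below are my own assumptions for the sake of the sketch, not a prescribed schema:

from dataclasses import dataclass, field
from enum import Enum
from typing import List

class KnowledgeType(Enum):
    STANDARD_PRACTICE = "mandatory - endorsed by senior technical management"
    BEST_PRACTICE = "advisory - endorsed by the community of practice as the current best"
    GOOD_PRACTICE = "a good example, worth copying in similar contexts"
    GENERAL_KNOWLEDGE = "accumulated knowledge about tasks, suppliers, customers, etc."

@dataclass
class KnowledgeAsset:
    """An output of the knowledge workstream (illustrative fields only)."""
    title: str
    knowledge_type: KnowledgeType
    endorsed_by: str                                          # senior management or the community of practice
    source_lessons: List[str] = field(default_factory=list)   # lessons fed in from the project workstream

# Hypothetical example asset
asset = KnowledgeAsset(
    title="Bid template for fixed-price contracts",
    knowledge_type=KnowledgeType.GOOD_PRACTICE,
    endorsed_by="Bidding community of practice",
    source_lessons=["Lesson: pricing risk underestimated on a recent fixed-price bid"],
)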

View Original Source (nickmilton.com) Here.

Observations, Insights, Lessons – how knowledge is born

Knowledge is born in a three-stage process of reflection on experience – here’s how.

I think most people accept that knowledge is born through reflection on experience. The three-stage process in which this happens is the core of how the military approach learning from experience, for example as documented in  this presentation from the Australian Army (slide 12).

The three stages are the identification of  Observations, Insights and Lessons, collectively referred to as OILs. Here are the stages, using some of the Australian Army explanation, and some of my own.

  • Observations. Observations are what we capture from sources, whether they be people or things or events. Observations are “What actually happened” and are usually compared to “What was supposed to happen”. Observations are the basic building blocks for knowledge, but on their own they often offer a very limited or biased perspective. However, storing observations is at least one step better than storing what was planned to happen (see here). For observations to be a valid first step they need to be the truth, the whole truth (which usually comes from multiple perspectives) and nothing but the truth (which usually requires some degree of validation against other observations and against hard data).
  • Insights. Insights are conclusions drawn from patterns we find looking at groups of observations.  They identify WHY things happened the way they did, and insights come from identifying root causes. You may need to ask the 5 whys in order to get to the root cause.  Insights are a really good step towards knowledge due to their objectivity.  The Australian Army suggests that for the standard soldier, insights are as good as lessons. 
  • Lessons. These are the inferences drawn from insights, and the recommendations for the future. Lessons are knowledge which has been formulated as advice for others, and the creation of lessons from insights requires analysis and generalisation to make the insights specific and actionable. The Australian Army defines lessons as “insights that have specific authorised actions attached…. directed to Army authorities to implement the stated action”, and there is a close link between defining an actionable lesson and assigning an action to that lesson.

This progression, from Observation to Insight to Lesson, represents the methodology of learning by reflection. The Retrospect meeting and the (smaller scale) After Action Review both provide a structured discussion format which moves increments of knowledge through the three stages.
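
As an illustration, the progression can be sketched as three linked records. The field names and the worked example are my own assumptions, intended only to show how an observation feeds an insight, which in turn feeds an actionable lesson with an owner:

from dataclasses import dataclass, field
from typing import List

@dataclass
class Observation:
    """What actually happened, compared with what was supposed to happen."""
    what_happened: str
    what_was_planned: str
    source: str                     # the person, event or data the observation came from

@dataclass
class Insight:
    """Why things happened the way they did: a root cause drawn from several observations."""
    root_cause: str
    observations: List[Observation] = field(default_factory=list)

@dataclass
class Lesson:
    """Advice for the future, with an assigned action so it gets implemented."""
    recommendation: str
    insight: Insight
    action: str
    action_owner: str

# Moving one (hypothetical) increment of knowledge through the three stages
obs = Observation(
    what_happened="Commissioning took 12 weeks",
    what_was_planned="Commissioning was planned to take 8 weeks",
    source="After Action Review with the commissioning team",
)
insight = Insight(root_cause="Vendor documentation arrived too late to train operators", observations=[obs])
lesson = Lesson(
    recommendation="Require vendor documentation a month before commissioning starts",
    insight=insight,
    action="Update the standard commissioning procedure",
    action_owner="Commissioning community of practice leader",
)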

In other organisations these three stages are separated. Observations are collected, analysts use these to derive insights, and then an authoritative body adds the action and turns the insights into lessons. My personal preference is to address all three steps as close as possible to the action which is being reviewed, using the same team who conducted the action to take Observations through to Lessons.

But however you divide the process, and whoever conducts the steps, these three stages of Observation, Insight and Lesson are fundamental to the process of learning from experience. 

View Original Source (nickmilton.com) Here.

The "One Year After" knowledge capture event.

Many of us are used to holding knowledge capture events at the end of a project.  There is also merit in repeating this exercise one year (or more) later.

Imagine a project that designs and builds something – a factory, for example, or a toll bridge, or a block of student accommodation. Typically such a project may capture lessons throughout the project lifetime, using After Action Reviews to capture project-specific lessons for immediate re-use, and may then capture end-of-project lessons using a Retrospect, looking back over the life of the project to identify knowledge which can be passed on to future projects. This end-of-project review tends to look at the efficiency of the practices used during the project, and how these may be improved going forward.

The review asks “Was the project done right, and how can future projects be done better?” However, what the review often does not cover is “Was the right project done?”

At the end of the project the factory is not yet operational, the bridge has only just opened to traffic, and you have just cut the ribbon on the student accommodation block. You do not yet know how well the outcome of the project will perform in practice.

This is where the One-Year operational lessons review comes in.

You hold this review after a year or more of operation. 
  • You look at factory throughput, and whether the lines are designed well, how they are being used, how effective the start-up process was, whether there are any bottlenecks in dispatch and access, and even whether the factory is in the correct location. 
  • You look at traffic over the bridge – is it at expected levels? Is it overused or underused? Is it bringing in the expected level of tolls? Does the bridge relieve congestion or cause congestion somewhere else? Does the road over the bridge have enough lanes?  Does it ice up in winter?
  • You look at usage of the student accommodation. Is it being used as expected? Are the kitchens big enough? Are there enough bike racks? Where is the wear and tear on the corridors? Where are accidents happening? What do the neighbours think?
In this review you are looking not at whether the project was done right, but whether it was the right project (or at least the right design). The One Year operational learning review will generate really useful lessons to help you improve your design, and your choice of projects, in future. 

Don’t stop collecting lessons at the end of the project, collect more once you have seen the results of a year or more of operations.

Contact Knoco for help in designing your lessons learned program.

View Original Source (nickmilton.com) Here.
