The importance of Reflection in KM

We don’t learn by doing, we learn by reflecting on doing, which is why your KM program should include reflective processes.

Kolb learning cycle. Public domain image from Wikipedia.

There is a popular quote on the Internet, often attributed to John Dewey: “We do not learn from an experience … We learn from reflecting on an experience”. It probably isn’t from Dewey, although it is a fair summary of Dewey’s teaching, but it makes the point that no matter how broad your experience, breadth alone doesn’t mean you have learned anything.

Let’s match that with another Internet quote, attributed to Paul Schoemaker: “Experience is inevitable, Learning is not”. Without reflection, and without change as a result of that reflection, nothing has been learned, no matter how many experiences you have.

“Observation and Reflection” is also a part of the Kolb learning cycle, shown here, which is a well-established model for how individuals learn.

Knowledge Management is not so much about individual learning as about organisational learning and team learning, but reflection is just as important in the KM context. Reflection needs to be built into work practice through reflective processes such as the following:

  • Introducing After Action Review as a reflective team process, to allow teams to reflect on, and collectively learn from, actions and activities;
  • Introducing Retrospect as a reflective team process, to allow teams to reflect on, and collectively learn from, projects and project phases;
  • Introducing processes such as Knowledge Exchange to allow members of a Community of Practice to reflect on, and share, their practice;
  • Introducing processes such as knowledge interviews to guide experts through structured reflection on their own experience, and to make this public for others.

It is only through introducing reflective processes such as these, and then acting on the new knowledge gained, that your organisation will stop just Experiencing, and start Learning.


A key lesson-learning role in the military setting

Lesson Learning is well embedded in the United States Army and forms a model which industry can emulate, especially when it comes to assigning knowledge management roles within the business.

Image from www.Army.mil

As explained in this excellent analysis from Nancy Dixon, lesson learning works well in the US Army. This article describes some of the components of the learning system they apply, and mentions processes such as After Action Reviews and Learning Interviews, as well as the different roles with accountability for the lessons process. One of the key roles is the Lessons Learned Integrator, or L2I.

The Lessons Learned Integrator role

The Center for Army Lessons Learned is deploying Lessons Learned Integrators in operational units and in other units such as training schools and doctrine centres. These L2I analysts gather lessons learned, research requests for information (RFI), and support the unit within which they are situated. They act as conduits for lessons in and out of the units. You can find various role descriptions for this post (e.g. this one), which suggest that the role primarily involves:
  • Collecting, reporting, and disseminating lessons from the host unit
  • Monitoring lessons learned and other new knowledge from elsewhere, assessing it for relevance to the host unit, and “pushing” it to the correct people
  • Initiating actions that lead to change recommendations
  • Locally supporting the “Request for Information” process, where soldiers can make requests for information from the Center for Army Lessons Learned.
In many of the support centres, the L2I analyst also has a role in developing doctrine, as described here:

  • The L2I analyst can derive information from a variety of sources: unit after-action reports; tactics, techniques, and procedures used by units in and returning from theater; Soldier observations/submissions to the Engineer School; and requests for information. 
  • This information is used to conduct doctrine, organization, training, materiel, leadership and education, personnel, and facilities gap analyses and to determine solutions.

As ever, Industry can learn from the Military.

Too often we see “Lessons Learned systems” which seem to have no roles or accountabilities assigned to them. The assumption seems to be that “everyone is responsible for lessons learned”, which quickly becomes “someone else will do it”, then “nobody is responsible”. The Army avoid this by identifying specific pivotal roles for identification, communication and analysis of Lessons, and for identifying what needs to be done as a result.

If you want your Lessons Learned system to really work, then you may need roles similar to the L2I in your operational units.


Three levels of Lesson Learning

Here are a couple of reprised blog posts from 5 years ago, covering the topic of lesson learning, and presenting 3 potential levels of maturity for a learning system. Most organisations are stuck at level 1.

There are three levels of rigour, or levels of maturity, regarding how lesson-learning is applied in organisations.

  • The first is to reactively capture and document lessons for others to find and read
  • The second is to reactively capture lessons at the end of projects, document them, and as a result make changes to company procedures and practices
  • The third is to proactively hunt lessons from wherever they can be found, and make changes to company procedures and practices so that the lessons are embedded into practice. 

Lesson-learning can be a very powerful way for an organisation to learn, change and adapt, but only if it is approached in a mature way. Level 1, to be honest, will not add much value; it’s only when you get to level 2 that lesson-learning really starts to make a difference.

Let’s look at those levels in more detail.

Level 1

Level 1 is to reactively capture lessons at the end of projects, and document them so that others can learn. Lessons are stored somewhere, and people need to find and read the lesson in order to access the knowledge. There are sub-levels of maturity in level 1, which include:

1a) Ad-hoc capture of lessons, often by the project leader, documenting them and storing them in project files with no quality control or validation step. Lessons must therefore be sought by reading project reports, or by browsing project file structures.

1b) Structured capture of lessons, through lessons identification meetings such as retrospects, documenting and storing the lessons in project files with no quality control or validation step.

1c) Structured capture of lessons, through lessons identification meetings such as retrospects, documenting and storing the lessons in a company-wide system such as a lessons database or a wiki. This often includes a validation step.

1d) Structured capture of lessons, through lessons identification meetings such as retrospects, documenting and storing the lessons in a company-wide system with auto-notification, so that people can self-nominate to receive specific lessons.
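The auto-notification described in sub-level 1d can be sketched as a simple subscription model: people self-nominate for topics, and the lessons store tells them when a matching lesson arrives. This is a minimal illustration only; all class and field names here are hypothetical, not a reference to any real lessons-database product.

```python
from dataclasses import dataclass


@dataclass
class Lesson:
    title: str
    topic: str
    context: str  # the project context the lesson came from


class LessonsDatabase:
    """A company-wide lessons store with self-nominated notification (sub-level 1d)."""

    def __init__(self):
        self.lessons = []
        self.subscribers = {}  # topic -> names of people who asked for that topic

    def subscribe(self, person, topic):
        # People self-nominate to receive lessons on specific topics
        self.subscribers.setdefault(topic, []).append(person)

    def add_lesson(self, lesson):
        # Store the lesson, then return who should be auto-notified
        self.lessons.append(lesson)
        return list(self.subscribers.get(lesson.topic, []))


db = LessonsDatabase()
db.subscribe("alice", "procurement")
notified = db.add_lesson(
    Lesson("Late vendor delivery", "procurement", "Phase 2 retrospect")
)
# notified now lists everyone subscribed to "procurement"
```

The point of the sketch is that at level 1 the system still only pushes documents at people; nothing in it changes a procedure or practice.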

Level 2

Level 2 is to reactively capture lessons at the end of projects, document them, and as a result make changes to company procedures and practices so that the lessons are embedded into practice. Here people do not need to read the lesson to access the knowledge; they just need to follow the practice. Again, there are sub-levels of maturity in level 2, which include:

2a) Lessons are forwarded (ideally automatically, by a lesson management system) to the relevant expert for information, with the expectation that they will review them and incorporate them into practice.

2b) Lessons include assigned actions for the relevant expert, and are auto-forwarded to the expert for action.

2c) As 2b, with the addition that the actions are tracked and reported.
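The progression from 2a to 2c can be sketched as a small workflow: each lesson carries an assigned action, the action is auto-forwarded to a named expert (2b), and open actions are tracked and reported until closed (2c). This is an illustrative sketch under assumed names, not a description of any particular lesson management system.

```python
from dataclasses import dataclass


@dataclass
class Action:
    description: str
    expert: str           # 2b: the action is auto-forwarded to this person
    status: str = "open"  # 2c: tracked until closed


@dataclass
class Lesson:
    title: str
    action: Action


class LessonTracker:
    def __init__(self):
        self.lessons = []

    def log(self, lesson):
        # 2b: logging a lesson forwards its action to the named expert
        self.lessons.append(lesson)
        return f"Forwarded to {lesson.action.expert}: {lesson.action.description}"

    def report(self):
        # 2c: open actions are tracked and reported, so nothing is quietly dropped
        open_actions = [l for l in self.lessons if l.action.status == "open"]
        return f"{len(open_actions)} of {len(self.lessons)} lesson actions still open"
```

The design point is that the unit of tracking is the action, not the document: a lesson is only “learned” once its action is closed and practice has changed.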

Level 3

Level 3 is to proactively hunt lessons from wherever they can be found, and make changes to company procedures and practices so that the lessons are embedded into practice. There are not enough organisations at level 3 to recognise sub-levels, but there are some ways in which Level 3 can operate:

3a) Senior managers can identify priority learning areas for the organisation. Projects are given learning objectives – objectives for gathering knowledge and lessons on behalf of the organisation. These may be written into project knowledge management plans.

3b) Learning teams may analyse lessons over a period of months or years to look for the common themes and the underlying trends – the weak signals that operational lessons may mask.

3c) Organisations may deploy specific learning resources (Learning Engineers, Project Historians, etc) into projects or activities, in order to pick up specific learning for the organisation.

I have only really come across level 3 in the military. For example, see this quote from Lieutenant-General Paul Newton:

The key is to ‘hunt’ not ‘gather’ lessons, apply them rigorously—and only when you have made a change have you really learned a lesson. And it applies to everyone … It is Whole Army business. 

However, even level 2 is quite rare. Many organisations have not gone beyond the “document and store” stages of Level 1, and have generally been disappointed by the outcomes.

If you aspire to be a learning organisation, set your sights on levels 2 or 3.


The difference between lessons and best practice – another post from the archives

Here is another post from the archives – this time looking at the difference between Best Practice and Lessons Learned.

Someone asked me last week what the difference is between Best Practice and Lessons Learned.

Now I know that some KM pundits don’t like the term “Best Practice”, as it can often be used defensively, but I think there is nothing wrong with the term itself; used well, Best Practice can be a very useful concept within a company. So let’s set aside the question of whether Best Practice is a useful concept, and instead discuss its relationship to lessons learned.

My reply to the questioner was that Best Practice is the amalgamation of many lessons, and it is through their incorporation into Best Practice that they become learned.

If we believe that learning must lead to action, that lessons are the identified improvements in practice, and that the actions associated with lessons are generally practice improvements, then it makes sense that as more and more lessons are accumulated, so practices become better and better.

A practice that represents the accumulation of all lessons is the best practice available at the time, and a practice that is adapted in the light of new lessons will only get better.


Why you need to place some demands on the knowledge sharer

Sharing knowledge is a two-sided process. There is a sharer and a receiver. Be careful that making knowledge easier to share does not make knowledge harder to re-use.

Image from Wikimedia Commons

Sharing knowledge is like passing a ball in a game of rugby, American football or basketball. If you don’t place some demands on the thrower to throw well, the pass won’t work for the catcher; if you make throwing too undemanding, the ball becomes too hard to catch. Passing the ball is a skill, and needs to be practised.

The same is true for knowledge. If you make it too simple to share knowledge, you can make it too difficult to find and re-use. In knowledge transfer, sharing is the easier part of the process. There are more barriers to understanding and re-use than there are to sharing, so if you make the burden too light on the knowledge supplier, the burden on the knowledge user can become overwhelming.

Imagine a company that wants to make it easy for projects to share knowledge with other projects. They set up an online structure for doing this, with a simple form and a simple procedure. “We don’t want people to have to write too much,” they say, “because we want to make it as easy as possible for people to share knowledge”.

So what happens? People fill in the form, put in the bare minimum, give no context, don’t tell the story, and don’t explain the lesson. As a result, almost none of these lessons are re-used. The feedback they get is “these lessons are too generic and too brief to be any use”. We have seen this happen many, many times.

By making the knowledge too easy to share – by demanding too little from the knowledge supplier – you can make the whole process ineffective. 
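One way to demand a little more from the sharer is to quality-control lessons at the point of capture, rather than reject them later. As a rough sketch, a lesson form could refuse submissions that lack context or are too brief to be re-usable. The field names and the word-count threshold below are illustrative assumptions, not a standard.

```python
def validate_lesson(lesson: dict, min_words: int = 30) -> list:
    """Return a list of quality problems; an empty list means the lesson passes.

    Field names and the min_words threshold are hypothetical examples.
    """
    problems = []
    # A re-usable lesson needs context, a story, and a recommendation,
    # not just a title
    for required in ("title", "context", "story", "recommendation"):
        if not lesson.get(required, "").strip():
            problems.append(f"missing {required}")
    # Reject bare-minimum submissions that are too brief to be any use
    body = " ".join(lesson.get(k, "") for k in ("context", "story", "recommendation"))
    if len(body.split()) < min_words:
        problems.append("too brief to be re-usable")
    return problems


# A bare-minimum submission fails; the sharer is asked for more at source
print(validate_lesson({"title": "Vendor was late"}))
```

Checking at the point of entry shifts the effort onto the sharer, where it is cheap, instead of onto every future reader, where it is overwhelming.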

There can be other unintended consequences as well. Another company had a situation like the one described above: a new project enthusiastically filled in the knowledge transfer form with 50 lessons. However, this company had put in place a quality assurance system for lessons, and found that 47 of the 50 lessons were too simple, too brief and too generic to add value. So they rejected them.

The project team in question felt, quite rightly, that there was no point in spending time capturing lessons if 94% of them were going to be rejected, so they stopped sharing. They became totally demotivated when it came to any further KM activity.

Here you can see some unintended consequences of making things simple. Simple does not equate to effective.

Our advice to this company was to introduce a facilitation role in the local Project Office: someone who could work with the project teams to ensure that lessons are captured with enough detail and context to be of real value. With this approach, each lesson is quality-controlled at source, and there should be no need to reject any lessons.

Don’t make it so simple to share knowledge that people don’t give enough thought to what they write.

The sharer of knowledge, like the thrower of the ball, needs to ensure that the messages can be effectively passed to the receiver, and this requires a degree of attention and skill. 

