The link between lesson learning maturity and effectiveness.

What is the best type of storage system for lessons learned? Our survey data compares the options.

We conducted a big survey of Knowledge Management this year, following on from a previous survey in 2014. Both surveys contained an optional section on lesson learning, and across both surveys we collected 222 responses related to lesson learning.

One of the lessons learned questions was “Rate the effectiveness of your organisational lessons learned process in delivering performance improvement, from 5 (completely effective) to 0 (completely ineffective)”

Another asked the respondent where their lessons were most commonly stored.

By combining these two questions, we can look at the average effectiveness of lesson learning for each storage option, as shown in the chart above. You can see clearly that organisations where lessons are stored within a custom lesson management system are far more likely to rate their lesson learning as effective than those where lessons are stored as sections within project reports, or not stored at all. Other storage options are linked to intermediate ratings scores.
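If you want to reproduce this kind of cross-analysis on your own survey export, the calculation is just a group-wise average. Here is a minimal Python/pandas sketch; the column names ("storage", "effectiveness") and the handful of rows are hypothetical stand-ins, not the survey data itself.

```python
# Minimal sketch: average effectiveness rating per lesson-storage option.
# Column names and the example rows are hypothetical, not the survey data.
import pandas as pd

responses = pd.DataFrame({
    "storage": ["custom lesson management system", "project reports",
                "wiki", "custom lesson management system",
                "not stored", "spreadsheet"],
    "effectiveness": [4, 2, 3, 5, 1, 3],   # 0 (ineffective) to 5 (effective)
})

avg_by_storage = (responses
                  .groupby("storage")["effectiveness"]
                  .mean()
                  .sort_values(ascending=False))
print(avg_by_storage)
```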

This links back to a blog post I wrote in 2012 on the maturity of lesson learned systems. 

Here I identified a number of maturity levels, from level 1a through level 3. The supporting technology for storing lessons is only one part of the maturity system, but it struck me today that you can overlay these maturity levels on the chart, as shown below.

  • In levels 1a and 1b, lessons are stored in project reports
  • In level 1c, lessons are stored in a separate system – a database, a wiki, a spreadsheet
  • In level 1d, individuals can “follow” certain types of lessons, and be notified when new lessons appear
  • In level 2, lessons are stored in a lesson management system which allows them to be routed to experts to embed lessons into practice.

The diagram shows that each progression from one maturity level to the next is associated with an increase in effectiveness of the lesson learning system.

View Original Source (nickmilton.com) Here.

More data on the health of KM (revised)

Is KM dying, alive and well, or on life support? Let’s bring some data into the debate (this post updated based on further data).

The debate about the health of KM is a perennial topic, with people variously claiming “KM is dead”, “KM is alive and well” or “KM is on life support”. The item commonly missing in these claims is hard data; people instead go on their impressions, or on the bold claims of a replacement for KM that will overthrow its older rival.

I have tried my best to bring some hard data into it, such as the apparently accelerating rate at which organisations are starting KM, taken from the Knoco survey data (the counter-argument to which might be that more recent entrants to the KM game are more likely to have responded to the survey).
Here are some more data.
3 years ago I did a survey within LinkedIn, looking at the number of people in different countries with “Knowledge” in their job title (or Conocimiento, or Connaissance, or Kennis, etc., depending on language). From this I concluded that there are probably about 32,000 knowledge managers in the world, with the greatest concentration in Switzerland and the Netherlands, and the lowest concentration in Russia and Brazil.
This survey is easy to repeat, comparing the number of people now with Knowledge or its equivalent in their job title against the number of people then. The results for the top 10 countries in terms of search results are shown in the figure above and the table below.

                 Total K people
Country          2014      2017
USA             10483     12494
UK               3431      3989
India            3244      4228
Canada           1736      2000
Netherlands      1656      1988
Australia        1105      1388
Spain             820       788
France            803      1000
Brazil            733       988

In almost every case the number of people with a Knowledge job title has increased, by about 26% overall.

However the number of people from those countries on LinkedIn has also increased (thanks to Mahomed Nazir for pointing this out).

If we look at the number of people with Knowledge in their job title as a percentage of LinkedIn users, then things change, as the population of LinkedIn has grown a lot over the last few years. The figures below represent the number of people from a particular country with Knowledge or its translation in their job title, per million LinkedIn users from that country.
                 K people per million LinkedIn users
Country          2014      2017
USA               101        97
UK                214       181
India             130       101
Canada            174       157
Netherlands       325       331
Australia         184       174
Spain             128       143
France             97       107
Brazil             41        33

Here, some countries have seen a fall in the number of people with a Knowledge job title per million LinkedIn users, while others have seen a rise. On average, the numbers have fallen by 6%.
So in conclusion, over the last 3 years, the number of people with Knowledge (or its equivalent) in their job title on LinkedIn has increased by 26%, but this becomes a 6% decline if you allow for the overall population growth of LinkedIn.
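For anyone repeating the exercise, the normalisation step is straightforward. A rough sketch is below; the job-title counts come from the table above, while the LinkedIn user totals are illustrative placeholders, not actual LinkedIn figures.

```python
# Sketch of the normalisation: Knowledge job titles per million LinkedIn users.
# The job-title counts are taken from the table above; the LinkedIn user
# totals below are illustrative placeholders, not actual LinkedIn figures.
k_people_2017 = {"USA": 12494, "UK": 3989, "Netherlands": 1988}
linkedin_users_2017 = {"USA": 130e6, "UK": 22e6, "Netherlands": 6e6}  # placeholders

for country, k in k_people_2017.items():
    per_million = k / linkedin_users_2017[country] * 1e6
    print(f"{country}: {per_million:.0f} K people per million LinkedIn users")
```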
 Is this the death of KM? Unlikely.
Could it be a slow decline? Possibly. 
I think we need to collect more data like this over a longer time period, and see if any trends continue.

View Original Source (nickmilton.com) Here.

How Knowledge Management maturity progresses

Here is a nice graph from our global KM surveys that shows how KM maturity progresses.

This graph is a combination of two questions, and we have combined answers from both the 2014 and 2017 surveys, so over 570 answers are included in the graph. The first question was:

Which of the following best describes the current status of KM within this organisation (or part of the organisation)?

  • We are in the early stages of introducing KM 
  • We are well in progress with KM 
  • KM is embedded in the way we work

The second question was:

To what extent is KM now integrated with the normal work of the organisation? Choose the sentence that most closely fits your answer.

  • KM is not part of normal activity but is being addressed by a separate group
  • KM is performed as a one-off intervention after which business returns to normal 
  • KM is a non-routine part of normal activity, done as an exception or when requested 
  • KM is fully integrated and is a routine part of normal activity or operations 

 The graph shows how the responses to the second question vary according to the first question, and shows how the integration of KM changes with maturity.

In the early stages of KM, KM is mostly either performed by a separate group (30% of responses) or as an exception to normal process (48% of responses).

For organisations which are well in progress, the role of the separate group is much reduced (to 12% of responses), as is the one-off intervention. The largest proportion of responses is still that KM is an exception to normal process (54% of responses), but the second largest is that KM is fully integrated in normal activity.

For organisations who claim that KM is fully embedded, almost three quarters say that KM is fully integrated in normal activity.

As you might expect, there is a close link between fully embedded KM, and full integration of KM activities into operations.
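For those who want to build the same chart from their own survey export, the underlying step is a simple cross-tabulation of the two questions, normalised within each maturity level. A minimal pandas sketch follows; the column names and the few example rows are hypothetical, not the survey dataset.

```python
# Sketch: cross-tabulate KM status (Q1) against KM integration (Q2),
# expressed as percentages within each status group.
# Column names and rows are hypothetical stand-ins for the survey data.
import pandas as pd

answers = pd.DataFrame({
    "status": ["early stages", "early stages", "well in progress",
               "well in progress", "embedded", "embedded"],
    "integration": ["separate group", "exception", "exception",
                    "fully integrated", "fully integrated", "fully integrated"],
})

pct = pd.crosstab(answers["status"], answers["integration"],
                  normalize="index") * 100
print(pct.round(1))
```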

View Original Source Here.

KM is dead? Here’s data that shows the opposite.

We often hear people say that Knowledge Management is dead, but in fact it has never been so much alive, with an accelerating take-up of the topic.

The idea that “KM is dead” is a meme that has been with us for over a decade (2004, 2008, 2011, 2012, 2015, 2016 to choose but a few) and which resurfaces several times a year; usually when a software vendor has something to sell (example). Very seldom are these assertions of the demise of KM accompanied by any data or analysis of trends, other than the Google Trends plot, which, as we have seen, is based on searches as a proportion of the total, and would also point to the demise of project management, risk management, financial management, and so on.
So where are the data to measure the health of the Knowledge Management sector?

That’s partly why we conducted our KM survey last month – to collect data. One of the sets of data we collected was related to the length of time organisations have been doing KM, and properly analysed this could show us whether the take-up of KM is accelerating or slowing.

We asked people “How many years has this organisation been doing Knowledge Management? Please select the closest number from the list below”, giving them the following options:

  • 0 years
  • .5
  • 1
  • 2
  • 4
  • 8
  • 16
  • 32 years
The results from this question are shown below. Of course we need to state the obvious caveat: these results are representative only of the population of organisations which took the survey, and not necessarily of the entire KM population.


At first sight it looks as if the “hump” of KM was 8 years ago, with more organisations choosing the category “8 years” than any other.

However we need to realise that this is not a linear scale, and that organisations who chose “8 years” as the closest number had actually been doing KM for between 6 and 12 years – a 6-year period, with organisations starting KM at a rate of about 17 a year over that period. Contrast that with the 21 organisations that have started KM in the last 6 months (a rate of over 40 a year), and we see that this plot is misleading, and that we need some way to plot KM start-up in a linear way.

That’s what we did in the plot below.

This plot takes the same figures, and converts them into the “KM start-year”. So the organisations that had started KM between 6 and 12 years ago, at a rate of about 17 a year over that period, are shown with start dates from 2006 to 2011.
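For anyone who wants to repeat this conversion, the idea is simply to spread each bin's count over the range of years it actually covers, which yields an annualised start-up rate that can be plotted against start year. The sketch below assumes bin boundaries halfway between the survey options, and the counts are illustrative placeholders (except the 21 recent starters mentioned above).

```python
# Sketch: convert "closest number of years doing KM" bins into an
# annualised start-up rate. Bin boundaries are assumed to lie midway
# between the survey options; counts are illustrative placeholders,
# except the 0.5-year bin, which echoes the 21 recent starters above.
bins = {          # chosen option -> (lower, upper) years actually covered
    0.5: (0.25, 0.75),
    1:   (0.75, 1.5),
    2:   (1.5, 3.0),
    4:   (3.0, 6.0),
    8:   (6.0, 12.0),
    16:  (12.0, 24.0),
    32:  (24.0, 48.0),
}
counts = {0.5: 21, 1: 25, 2: 40, 4: 70, 8: 100, 16: 60, 32: 20}

for option, (lo, hi) in bins.items():
    rate = counts[option] / (hi - lo)   # organisations starting KM per year
    print(f"chose {option:>4}: bin covers {hi - lo:.2f} years, "
          f"~{rate:.0f} KM start-ups per year")
# Plotting each rate against (survey year minus years ago) gives the
# linear "KM start-year" view described above.
```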

Now the picture is very different. Now we can see that the rate of take-up of KM is accelerating considerably. A very similar pattern was seen in our 2014 survey, though with different absolute values as it was a different set of participants.

These data suggest that KM is not only far from dead, it is increasing in popularity year on year as an increasing number of organisations take up the topic.

View Original Source Here.

Which KM implementation approach works the fastest?

The quickest ways to implement KM are by change management, and by piloting. The slowest are through top down directive, and KM by stealth. But how do we know this?

I blogged yesterday about how long it takes on average to implement KM, but how can you get ahead of the curve and deliver KM quicker than the average? We conducted a big global survey of KM this year, following on from a previous survey in 2014. In both surveys we asked two questions:

How many years have you been doing KM?

  • 0
  • .5
  • 1
  • 2
  • 4
  • 8
  • 16
  • 32 years

Which of these best expresses the level of KM maturity in your organisation?

  • We are in the early stages of introducing KM
  • We are well in progress with KM
  • KM is embedded in the way we work.
Yesterday we used these data to look at the average length of time organisations have been doing KM, for each of these maturity levels, which gives us a measure of the speed of KM implementation.  And then, of course, we can look at factors that influence that speed.
One of the most obvious factors would be the implementation strategy, and luckily we asked the survey respondents the following question:
How has KM been implemented in the organisation? Please choose the answer closest to your situation.

  • A KM pilot phase followed by a roll-out phase 
  • As a change management approach 
  • Introduce and promote technology 
  • Introduce processes (eg CoPs, lesson learning) 
  • Introduce technology and hope for viral growth 
  • KM by stealth/Guerrilla KM 
  • Top down directive to the entire company 
  • Not decided yet 
  • Other (please specify)
The chart shown here combines these three questions, using a combined dataset from the 2014 and 2017 surveys with duplicates removed. In total 522 people answered all three questions. The chart shows:

For organisations who have chosen each of these implementation approaches, what is the average number of years they have been doing KM, for each of these maturity levels?

For example, organisations using a change management approach and who say they are “in the early stages” have been doing KM on average for just over 3 years, whereas if they are “well in progress” they have been doing KM on average for just over 6 years.
These numbers give a proxy measure of the speed of KM implementation, and the approaches are ordered from left to right in order of overall implementation speed.
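The underlying calculation is a two-way average: years doing KM, grouped by implementation approach and by maturity level. A minimal pandas sketch is below; the column names and the handful of rows are hypothetical stand-ins rather than the survey records.

```python
# Sketch: average years doing KM, by implementation approach and maturity.
# Column names and rows are hypothetical stand-ins for the survey data.
import pandas as pd

df = pd.DataFrame({
    "approach": ["Change management", "Pilot then roll-out", "KM by stealth",
                 "Change management", "Top down directive", "Pilot then roll-out"],
    "maturity": ["early stages", "well in progress", "well in progress",
                 "well in progress", "early stages", "early stages"],
    "years_doing_km": [3, 6, 12, 6, 8, 2],
})

speed = df.pivot_table(index="approach", columns="maturity",
                       values="years_doing_km", aggfunc="mean")
print(speed)   # lower numbers = faster progress to that maturity level
```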

The fastest approaches to KM implementation are a Change Management approach, and a piloting phase followed by roll-out.

Change management is the overall quickest approach. Piloting gets you out of the “early stages” more quickly than any other approach, as a successful pilot means you are already well in progress with KM, but the roll-out phase may keep you in the “in progress” phase for longer. A combination of Change Management and Knowledge Management pilot projects is the approach we at Knoco recommend for Knowledge Management implementation.

KM by top-down directive and KM by stealth are the slowest approaches.

“KM by stealth” organisations which say they are well in progress have been doing KM for nearly 12 years; double the number for the change management approach. KM by top down directive is almost as slow.

If you are unsure about your KM implementation strategy, hopefully these results will give you some guidance. 

View Original Source Here.

How long it really takes to embed Knowledge Management

In the wake of our recent 2017 survey, here are some more data about how long it really takes to embed Knowledge Management.

We conducted a big global survey of KM this year, following on from a previous survey in 2014. In both surveys we asked two questions:

How many years have you been doing KM?

  • 0
  • .5
  • 1
  • 2
  • 4
  • 8
  • 16
  • 32

Which of these best expresses the level of KM maturity in your organisation?

  • We are in the early stages of introducing KM
  • We are well in progress with KM
  • KM is embedded in the way we work.
If you combine these questions, then you can get a measure of how long it takes to reach the various levels of KM maturity. The graph below is just such a combination, and represents all datapoints from the 2014 and 2017 surveys with duplicates removed – a dataset of just over 750 organisations.

Full dataset

This is the full dataset, and we can see that the transition from “early stages” to “well in progress” normally takes about 4 years (if you take the 50% level as normal), and the transition to fully embedded normally takes about 20 years. There is a large spread – some reach maturity far faster than others.
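The “50% level” reading works like this: for each years-doing-KM bucket, take the share of respondents who report being past the early stages, and note where that share first reaches 50%. A rough sketch, using made-up counts rather than the survey data, is shown below.

```python
# Sketch of the "50% level" reading: the transition time is the point at
# which half the respondents report being past the early stages.
# The rows below are illustrative, not the survey responses.
import pandas as pd

df = pd.DataFrame({
    "years": [1, 1, 2, 4, 4, 8, 8, 16, 32],
    "maturity": ["early stages", "early stages", "early stages",
                 "well in progress", "early stages", "well in progress",
                 "embedded", "well in progress", "embedded"],
})

past_early = (df.assign(past=df["maturity"] != "early stages")
                .groupby("years")["past"].mean() * 100)
print(past_early)
print("approximate transition point:",
      past_early[past_early >= 50].index.min(), "years")
```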
We can also see some strange anomalies:
  • organisations which have been doing KM for 0 years, yet say it is fully embedded – either these are spurious data, or they are organisations which feel they are doing KM without ever having introduced a formal KM program
  • organisations which have been doing KM for 32 years, yet are still in the early stages – either these are spurious data, or they are organisations which have been doing KM for ages but in a half-hearted manner, or doing it “under the radar”.

    However this full dataset may not be too helpful, as we know that embedding KM takes longer in larger organisations. The graphs below show sections of the dataset for small, medium and large organisations.

    Small organisations

    We also asked the participants to answer the following question:
    How large is the organisation (or part of the organisation) you are describing, in terms of staff? Please select the closest number from the list below.
    • 10
    • 30
    • 100
    • 300
    • 1000
    • 3000
    • 10000
    • 30000
    • 100000
    • 300000
    The graph above is the same plot of maturity v number of years, but only for those 148 organisations where the respondent chose a size of 10, 30 or 100.
    We can see that the strange anomalies of “doing KM for 32 years and getting nowhere” belong to this size range. We can also see that the transition from early stages to well in progress still takes just under 4 years (if you take the 50% level as normal), but the transition to fully embedded takes about 6 years.

    Medium organisations

    The graph above is the same plot of maturity v number of years, for those 351 organisations where the respondent chose a size of 300, 1000 or 3000.
    Here the transition from early stages to well in progress still takes about 4 years (if you take the 50% level as normal),  but the transition to fully embedded takes about 20 years.

    Large organisations

    The graph above is the same plot of maturity v number of years, for those 255 organisations where the respondent chose a size of 10,000 staff or larger.
    Here the transition from early stages to well in progress still takes about 4 years (if you take the 50% level as normal),  but the transition to fully embedded takes 32 years, as half of the respondents at the 32 year mark said they were still “well in progress”.

    Conclusions

    The obvious conclusion is that implementing KM takes a long time, and the bigger the organisation, the longer it takes. However we can be a bit more subtle than that, and conclude as follows:
    • The early stages of Knowledge Management take on average 4 years for any organisation, before you can begin to say “we are well in progress”. 20% of organisations may get to this point within a year; another 20% may take 8 years or more.
    • The time it takes to reach the point where KM is fully embedded depends on the size of the organisation, with an average of 6 years for the smaller ones, to 32 years for the very biggest. 
    These are average figures – some implementations are faster and some are slower. Tomorrow we might start to investigate what makes the difference in the speed of Knowledge Management implementation.

    View Original Source Here.

    Which Knowledge Management Processes add most value?

    I blogged yesterday about the usage and value of Knowledge Management technologies. Here is a similar analysis, also drawn from our 2017 Global Survey of Knowledge Management, of the usage and value of KM processes.

    We asked the survey participants to rate these different KM processes by the value they have added to their KM program, including in the question the option to choose “we do not use this process” or “it’s too early to tell”.
    The chart above shows these processes in order of value from left to right, as a stacked area chart of responses, with the weighted value of each process overlain as a line (this line would be at 100% if all the participants that used the process claimed it had “high value”, and at 0% if they all claimed it had no value). The height of the dark grey area represents usage, while the light grey area is the “Not Used” response.
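As a rough guide to how that weighted-value line can be computed, here is a small sketch. The three-point weighting (no value = 0, some value = 0.5, high value = 1) is an assumption for illustration; the survey's actual rating categories may be more finely graded.

```python
# Sketch of the weighted-value score: average the ratings of those who use
# a process, scaled so that all-"high value" = 100% and all-"no value" = 0%.
# The three-point weighting is an assumed simplification.
WEIGHTS = {"no value": 0.0, "some value": 0.5, "high value": 1.0}

def weighted_value(ratings):
    """ratings: rating strings, excluding 'not used' and 'too early to tell'."""
    used = [r for r in ratings if r in WEIGHTS]
    if not used:
        return None
    return 100 * sum(WEIGHTS[r] for r in used) / len(used)

print(weighted_value(["high value", "some value", "high value", "no value"]))
# -> 62.5
```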

    288 people answered this question on the survey, from a wide range of organisations around the globe.

    The processes are also listed below in order of the usage figures, and in order of the average value assigned by the respondents.

    Knowledge Management processes in order of usage (most common at the top):

    1. coaching and mentoring
    2. project lessons capture (large scale)
    3. after action review (small scale)
    4. knowledge roundtables
    5. Peer Assist
    6. retention interviews
    7. storytelling
    8. action learning
    9. knowledge cafe
    10. crowdsourcing
    11. open space
    12. appreciative enquiry
    13. Innovation deepdive
    14. wikithon
    15. positive deviance

    Knowledge Management processes in order of the assigned value when used (those rated most valuable at the top):

    1. knowledge roundtables
    2. coaching and mentoring
    3. project lessons capture (large scale)
    4. after action review (small scale)
    5. action learning
    6. Peer Assist
    7. retention interviews
    8. knowledge cafe
    9. Innovation deepdive
    10. storytelling
    11. appreciative enquiry
    12. open space
    13. crowdsourcing
    14. positive deviance
    15. wikithon

    Comparison of usage and value

    As with the Technology results, there is a strong correlation between usage and value. This could represent a tendency for the more valuable KM processes to get the greatest use. This is a perfectly valid interpretation.  An alternative argument would be to say that processes deliver more value the more they are used. Processes at the top of the list are mainstream processes, used frequently, and delivering high value. Processes at the bottom of the list are less mainstream, and deliver less value to the companies that use them, because those companies make less use of these processes. This is also a plausible interpretation.

    Even with this interpretation, we could still look for “Good performing” processes which deliver more value than their popularity would imply (and so are significantly higher in the value list than in the popularity list), and “Poor performing processes” which deliver less value than their popularity would imply.
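To make the rank comparison concrete, here is a short Python sketch that measures how far each process moves between the usage ordering and the value ordering listed above (positive = higher on value than on usage). The lowercasing and dropping of the “(large scale)”/“(small scale)” qualifiers is purely to keep the two lists aligned.

```python
# Sketch: rank shift between the usage ordering and the value ordering.
# Positive shift = the process sits higher on value than on usage.
usage_order = [
    "coaching and mentoring", "project lessons capture", "after action review",
    "knowledge roundtables", "peer assist", "retention interviews",
    "storytelling", "action learning", "knowledge cafe", "crowdsourcing",
    "open space", "appreciative enquiry", "innovation deepdive",
    "wikithon", "positive deviance",
]
value_order = [
    "knowledge roundtables", "coaching and mentoring", "project lessons capture",
    "after action review", "action learning", "peer assist",
    "retention interviews", "knowledge cafe", "innovation deepdive",
    "storytelling", "appreciative enquiry", "open space", "crowdsourcing",
    "positive deviance", "wikithon",
]

rank_shift = {p: usage_order.index(p) - value_order.index(p) for p in usage_order}
for process, shift in sorted(rank_shift.items(), key=lambda kv: -kv[1]):
    print(f"{process:28s} {shift:+d}")
```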

    Under this interpretation, the best performing KM processes are Innovation Deepdive, Knowledge Roundtable meetings and Action Learning (each of them 3 or 4 places higher in the Value list than in the Usage list), and the poorest performing processes in terms of value per use are Crowdsourcing and Storytelling.

    Changes since the 2014 survey

    We saw similar results in the 2014 survey, with processes such as Knowledge Roundtable, After Action Review and Coaching and Mentoring both popular and performing well. However there are also some significant changes in both usage and value.

    The processes which have seen the greatest increase in use between the 2014 and 2017 surveys are Project Lesson Capture, with a rise of 5 places in usage and 6 places in value, and Storytelling (+7 in usage, +4 in value).

    There have been some big fallers as well. Positive Deviance has dropped 9 places in usage and 8 places in value, and Crowdsourcing has dropped 6 places in usage and 9 places on the value list. We did not include Innovation Deepdive in the 2014 survey. Peer Assist has also fallen in popularity, which is a shame.

    It looks like the old staple processes of KM – the knowledge roundtable meetings, after action reviews, lesson capture, peer assist, and coaching and mentoring – remain the core process set for Knowledge Management.

    View Original Source Here.

    Which Knowledge Management technologies add most value?

    Interesting results are coming through from the Knoco 2017 Knowledge Management survey, including this plot of comparative KM technology value.

    We asked the survey participants to rate these different types of technology by the value they have added to their KM program, including in the question the option to choose “we do not use this technology” or “it’s too early to tell”.

    The chart above shows these technologies in order of value from left to right, as a stacked area chart, with the weighted value shown as a blue line (this line would be at 100% if all the participants that used the technology claimed it had “high value”, and at 0% if they all claimed it had no value).

    The top of the grey area represents the usage percentage for each technology, while the light grey area above represents people who do not use that technology. The top of the green area represents the percentage of people who said the technology had added “large value”.

    288 people answered this question.

    The technology types are listed below in order of usage, and in order of value.

    Technology type in order of usage (most common at the top):

    • Best practice repository
    • Document collaboration
    • eLearning
    • People and expertise search
    • Enterprise search
    • Enterprise content management
    • Portals (non-wiki)
    • Video publication
    • Question and answer forums
    • Blogs
    • Lessons Management
    • Microblogs
    • Brainstorming/ideation/crowdsourcing
    • Wikis
    • Social media other than microblogs
    • Expert systems
    • Data mining
    • Innovation funnel
    • Semantic search

    Technology type in order of value delivered when used (most valuable at the top):

    • Enterprise search
    • Best practice repository
    • Document collaboration
    • Enterprise content management
    • eLearning
    • Portals (non-wiki)
    • People and expertise search
    • Question and answer forums
    • Lessons Management
    • Expert systems
    • Brainstorming/ideation/crowdsourcing
    • Microblogs
    • Video publication
    • Social media other than microblogs
    • Wikis
    • Semantic search
    • Data mining
    • Innovation funnel
    • Blogs

    Comparison of usage and value

    There is a strong correlation between usage and value. This could represent a tendency for the more valuable technologies to get the greatest use. This is a perfectly valid interpretation.  An alternative argument would be to say that technologies deliver more value the more they are used. Technologies at the top of the list are mainstream technologies, used frequently, and delivering high value. Technologies at the bottom of the list are less mainstream, and deliver less value to the companies that use them, because those companies make less use of these technologies. This is also a plausible interpretation.

    Even with this interpretation, we could still look for “Good performing” technologies which deliver more value than their popularity would imply, and “Poor performing technologies” which deliver less value than their popularity would imply.

    Under this interpretation, the best performing technologies are Enterprise Search and Expert Systems (both of them several places higher in the Value list than in the Usage list), and the worst performing technology in terms of value per use is Blogs.

    This does not necessarily mean Blogs are a bad technology; it probably means they are not being used in ways that add KM value.

    Changes since the 2014 survey

    We saw very similar results in the 2014 survey, again with Blogs being the poorest performing technology given their usage figures, and again with the best performing technologies in terms of value vs use being Enterprise Search and Expert Systems.

    The technologies which have most increased in use between 2014 and 2017 are Microblogs and Video Publication, and not surprisingly these have also seen the greatest increase in value delivery. The technology which has decreased in use the most over the last 3 years is the Innovation Funnel.

    View Original Source Here.

    Are communities of practice getting smaller?

    A preliminary result from the Knoco 2016 survey suggests that Communities of Practice may be getting smaller over time.

    In our survey of Knowledge Management around the world, we asked a series of questions about Communities of Practice, one of which was “What is the typical size (number of members) of your CoPs and Knowledge Sharing networks? Please choose the nearest number from the list below”.
    We then gave them a series of size ranges to choose from (given that we had responses from organisations ranging from 10 people to 300,000 people).
    The graph above shows the number of responses for each size range, and I have restricted this graph to internal CoPs rather than external. The results from this year’s survey are in blue, and the results from the 2014 survey are in red.
    The number of CoPs of around 10 people has increased over the last 2 years, while the number in every other size range other than the largest has decreased.
    The plot below shows this as a percentage of the results rather than the absolute number of results.
    This is an interesting result, and, if real, is perhaps rather worrying, given that larger CoPs generally perform much better than smaller ones.

    Has anyone got any ideas why CoPs seem to be getting smaller, or smaller CoPs becoming more common?

    View Original Source Here.

    How to make a start in KM – answers from the survey

    The Knoco 2017 survey of global Knowledge Management has recently closed, with submissions from 428 participants. Here is an interesting insight from the preliminary results.

    This insight is a combination of two of the questions, and it seeks to analyse the most effective ways to get your KM program started.

    The first question asked the participants “What external support has the organisation accessed to help the KM program? (Please select all that apply)”. The options were
    • External consultancy from a generic consulting firm 
    • External consultancy from a specialist KM consultant or consulting firm
    • External KM training
    • Membership of a multi-organisation KM consortium
    • Informal advice and discussions with KM practitioners from other organisations
    • Attendance at KM conferences
    • No external support other than books/Internet
    • I don’t know
    • I can’t remember (it was too long ago)
    • Other (please specify) 
    The second question asked the participants to estimate, on an order of magnitude scale, the value delivered to date through their KM program (although many were unable to estimate this figure).

    We can combine these two answers and look at the value generated from KM programs which use these forms of external support. The answer is shown in the diagram above.
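Because the support question is multi-select, each respondent can contribute to more than one support category. One way to sketch the combination in pandas is to “explode” the multi-select answers before averaging the reported value; the column names and rows below are hypothetical stand-ins, and the value figures are just order-of-magnitude placeholders.

```python
# Sketch: average reported KM value per type of external support.
# The support answers are multi-select, so each respondent is expanded
# into one row per chosen option before averaging.
# Column names, rows and values are hypothetical stand-ins.
import pandas as pd

df = pd.DataFrame({
    "support": [["specialist KM consultancy", "KM conferences"],
                ["no external support (books/Internet)"],
                ["external KM training", "informal advice from practitioners"]],
    "km_value_usd": [1_000_000, 10_000, 100_000],  # order-of-magnitude estimates
})

value_by_support = (df.explode("support")
                      .groupby("support")["km_value_usd"]
                      .mean()
                      .sort_values(ascending=False))
print(value_by_support)
```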

    First the caveats: the numbers are quite small – the number of respondents who both quoted a KM value and chose any particular option above is between 11 and 25, so this is quite a small dataset. There are a few more late responses to add to the dataset, which might still alter the numbers a little. And there is not a one-for-one correlation between support and value, as many respondents used more than one source of support.

    However the conclusions seem clear.

    • Firstly, external support helps. Those respondents who tried to figure out KM for themselves from books and from the Internet delivered significantly less value than those who got external help.
    • Secondly, the best external support seems to come from discussion with others, conference attendance, training and specialist consultancy support.
    • And of those four, external consultancy seems to be associated with the most value-adding KM campaigns.
    Now you might think “Of course he would say that; he is a KM consultant”, and you would be correct.
    However it is not me saying this; it is the result of a survey. And I would not be in this job if I did not believe I added value to my clients. I sincerely believe that the right external consultant can get your KM program off to a flying start, and flying in the right direction. The alternative of “figuring it out for yourself” from books and websites does not pay in the long term.

    View Original Source Here.
