3 arguments in KM we may never resolve

Here are three perennial KM arguments. Do they matter? (This is a reprise of an original blog post from 5 years ago.)


Over the 20 years that we have been doing knowledge management, there have been a number of recurrent arguments that appear regularly, often several times a year. Watch the LinkedIn forums; you will see these arguments popping up like mushrooms. I call them cul-de-sac arguments, as they lead us nowhere. They are never resolved, and they make little difference to pragmatic knowledge management.

Here is the first and the biggest.

Can you manage knowledge?

This argument often comes about because people assume that “knowledge management” means “the management of knowledge”. Given that knowledge is intangible, they feel that it cannot be controlled as a “thing”, and that without control, there can be no management.

This argument is debated around and around and around and around, and usually ends up with the position that “it depends what you mean by knowledge, and it depends what you mean by management”. Often the argument is an emotional one, driven by the feeling that “Knowledge is Good, Management is Bad”, and that we ought to find another term. “Knowledge Sharing“, perhaps.

My position is that it doesn’t matter. The validity of “Knowledge Management” doesn’t depend on whether knowledge is a concrete object to be controlled, any more than the validity of “Reputation Management” depends on whether reputation is a concrete object to be controlled, or the validity of “Risk Management” depends on whether risk is a concrete object to be controlled.

The point is that you can do some really useful things through Management, which enable the flow and re-use of Knowledge, and which deliver real value to an organisation. It’s “Management with a focus on Knowledge“, and that’s enough to call it Knowledge Management as far as I am concerned.

Here’s the second.

“If you write it down, is it still knowledge?”

Many hours have been spent arguing whether knowledge can ever exist outside the human head. Some say “Yes, it’s explicit knowledge”. Others say “No, it’s information”.

This argument is seldom resolved, but as far as I am concerned, it doesn’t matter. It is possible to help people understand how to do things through the transmission of the spoken or recorded word. You can add a huge amount of value this way, no matter what you call the medium of transmission (explicit knowledge, or knowledge-focused information). There is a loss of value when things get written down, but where does that loss turn knowledge into information? Is it

  • when you formalise your thoughts into coherent form?
  • when you speak your thoughts?
  • when you record what you speak onto video?
  • when you write a transcript of that video?
Nobody knows, and trying to pinpoint the step at which knowledge becomes information is irrelevant to the fact that you can (in many cases) add value to an organisation through written transmission. It’s seldom the best way to transfer knowledge, but sometimes it’s the only practical way.

Here’s the third.

“What is knowledge?”


This is an argument that gets very philosophical, very quickly. Plato is often quoted, along with Senge, and other philosophers. The “justified true belief” definition is popular, but even this only covers propositional knowledge, and not other types of knowledge. Nobody can ever agree which definition is correct, nor which philosopher is authoritative.

I don’t think it matters.

You don’t need to agree on the definition of an electron to be able to create an electronic circuit which does useful things like lighting a lamp or ringing a bell. Similarly you don’t need to agree on the definition of knowledge to be able to help knowledge flow in order to do useful things like solve a problem or improve a process. You don’t need to define “a thought” in order to think, you don’t need to define “music” in order to sing, and you don’t need to define “knowledge” in order to do knowledge management. Knowledge is hard to define, but it’s not that hard to deliver real value through knowledge management.

So the answer is “we don’t really know, but see what we can do with it!”


Dealing with the important urgent knowledge first

We often say that “Knowledge Management must be focused on the critical business knowledge”, but how do we identify what that critical knowledge is?

There are actually two dimensions to identifying the criticality of a Knowledge Topic (at least in terms of steering your KM program). These are

  • Importance, and
  • Urgency
I have already blogged about how to identify the important knowledge: start with your business strategy, identify the activities needed to deliver that strategy, then identify the knowledge needed to deliver those activities. These could be activities (and knowledge) at all levels in the organisation:
  • knowledge of how to enter new markets, as well as knowledge of how to sell products
  • knowledge of how to set production forecasts, as well as knowledge of how to operate a plant
  • knowledge of how to interact with host government environmental agencies, as well as knowledge of how to avoid pollution at your chemical plant
The knowledge can be new knowledge which needs to be acquired, cutting edge knowledge which forms your competitive advantage, or core knowledge which is needed to keep your income stream alive, and to fulfil your commitments. It can even be knowledge which is supplied by your partners and contractors, but which is still vital to your business. You identify the important knowledge through conversation with senior managers.

What about the urgent knowledge? This is the knowledge which needs urgent attention from knowledge management. There are at least four cases where knowledge can be in need of urgent attention. These are as follows.

  • When knowledge is important to the company, but we don’t have it (or we don’t have enough of it). Here the focus will be on the acquisition and development of knowledge – on innovation, knowledge creation, research and action learning.
  • Where knowledge exists widely in the company, but is siloed, and not shared, or otherwise not properly managed. Here knowledge is used inefficiently – advances in one part of the business are not shared and learned from, in other parts of the business. Multiple, and inefficient, solutions exist, where one or two solutions would be better. Here the focus will be on the elements of knowledge sharing, and knowledge improvement, such as communities of practice, lessons learned, and development of knowledge assets, best practices, and standardisation.
  • Where important knowledge is at risk of loss (perhaps through the retirement of key members of staff). Here the focus must be on developing and deploying a Retention strategy.
  • When critical knowledge is held by a contractor, partner or supplier, and they don’t have knowledge Management. Here the focus is on defining a Knowledge Management Framework for them to apply, to keep your knowledge safe.
How do you identify the urgent knowledge? You need to do a Knowledge Scan of the important topics, and shortlist the ones in most need of attention.

Those important and urgent knowledge issues are the ones that should drive your knowledge management strategy, tackling them one by one.
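The two-dimensional screen described above can be sketched as a simple scoring exercise. This is a hypothetical illustration only, not how an actual Knowledge Scan scores topics; the topic names, the 1–5 scores and the threshold are all invented for the example:

```python
# Hypothetical shortlisting of knowledge topics by importance and urgency.
# Scores would come from conversations with senior managers (importance)
# and a Knowledge Scan (urgency); here they are invented for illustration.

def shortlist(topics, threshold=7):
    """Rank topics by combined importance + urgency score and
    return the names of those at or above the threshold."""
    ranked = sorted(topics,
                    key=lambda t: t["importance"] + t["urgency"],
                    reverse=True)
    return [t["name"] for t in ranked
            if t["importance"] + t["urgency"] >= threshold]

topics = [
    {"name": "entering new markets", "importance": 5, "urgency": 4},
    {"name": "plant operation", "importance": 4, "urgency": 2},
    {"name": "environmental compliance", "importance": 5, "urgency": 5},
]

print(shortlist(topics))  # → ['environmental compliance', 'entering new markets']
```

The output is the priority list that would drive the KM strategy, with the topics in most need of attention first.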


A key lesson-learning role in the military setting

Lesson Learning is well embedded in the United States Army and forms a model which industry can emulate, especially when it comes to assigning knowledge management roles within the business.

As explained in this excellent analysis from Nancy Dixon, lesson learning works well in the US Army. This article describes some of the components of the learning system they apply, mentioning processes such as After Action Reviews and Learning Interviews, as well as the different roles with accountability for the lessons process. One of the key roles is the Lessons Learned Integrator, or L2I.

The Lessons Learned Integrator role

The Center for Army Lessons Learned is deploying Lessons Learned Integrators in operational units and in other units such as training schools and doctrine centres. These L2I analysts gather lessons learned, research requests for information (RFI), and support the unit within which they are situated. They act as conduits for lessons in and out of the units. You can find various role descriptions for this post (e.g. this one), which suggest that the role primarily involves:
  • Collecting, reporting, and disseminating lessons from the host unit
  • Monitoring lessons learned and other new knowledge from elsewhere, assessing it for relevance to the host unit, and “pushing” it to the correct people
  • Initiating actions that lead to change recommendations
  • Locally supporting the “Request for Information” process, where soldiers can make requests for information from the Center for Army Lessons Learned.
In many of the support centres, the L2I analyst also has a role in developing doctrine, as described here

  • The L2I analyst can derive information from a variety of sources: unit after-action reports; tactics, techniques, and procedures used by units in and returning from theater; Soldier observations/submissions to the Engineer School; and requests for information. 
  • This information is used to conduct doctrine, organization, training, materiel, leadership and education, personnel, and facilities gap analyses and to determine solutions

As ever, Industry can learn from the Military.

Too often we see “Lessons Learned systems” which seem to have no roles or accountabilities assigned to them. The assumption seems to be that “everyone is responsible for lessons learned”, which quickly becomes “someone else will do it”, then “nobody is responsible”. The Army avoid this by identifying specific pivotal roles for identification, communication and analysis of Lessons, and for identifying what needs to be done as a result.

If you want your Lessons Learned system to really work, then you may need roles similar to the L2I in your operational units.


Making Knowledge Visible

One of the biggest challenges in Knowledge Management is the invisible and intangible nature of Knowledge. How can we make knowledge, and knowledge gaps, visible to others?

You can’t see knowledge, you can’t measure it, and you can’t tell when it’s missing, other than by observing its effects. This makes it difficult to identify opportunities for knowledge transfer, from someone who has knowledge to someone who needs it.

If you could see knowledge, and you could see its absence, then you would be in a much better position to set up the knowledge transfers that need to happen. You could say “Look, Susie needs some Red knowledge, Peter has lots of Red knowledge, let’s introduce Peter to Susie”.

But because knowledge is invisible you can’t see what Susie needs or Peter has, unless you ask them.  How can a supplier of knowledge get in touch with a needer of knowledge if both the supply and the need can’t be seen?

Here are four easy ways to make Knowledge visible, and to set up Knowledge Transfer opportunities.

The Seekers exercise

Seekers is a simple exercise, suitable for groups of 40 or 50 or more, and runs during a coffee break in a training session, or as part of a Brown Bag lunch. It requires blank name badges, so either buy a supply of badges, or if you are in a badged event such as a conference, ask people to turn their badges to the blank side. Ask them to write on the blank badge, in large clear letters, a question to which they would like an answer.

Ideally it should be a real work question rather than a home-life question, and a question where an answer would be really useful. Make sure it’s a practical question! It should be “How do I best plan a program of data collection” rather than “How do I become the next CEO”.

During the exercise, if people see a question they can help answer – either giving good advice, or pointing people to a source of advice – then they go and introduce themselves and offer help. After 20 to 30 minutes of pairing up and discussion, ask for a show of hands for “Who has received an answer?”. You should see between a third and half the people raise their hands. You can then lead a discussion on motivation  – What motivated people to help? What would motivate you to ask questions of others? –  on the power of Asking as a driver for knowledge transfer, on “how we can make our questions visible to others as part of our work”, and on KM approaches such as community forums and peer assist.

Knowledge Market

A Knowledge Market is a meeting to match up people who need learning with people who can provide it. It is a way of connecting people to stimulate knowledge sharing, make new connections, and identify new collaborative relationships; in a very simple way, it connects those who have problems with those who can potentially solve them. Knowledge Markets are commonly used within Communities of Practice.

At a Knowledge Market, you ask people to write (on post-it notes, or (better) on a large poster) two or three “Knowledge Offers”, and two or three “Knowledge Needs”. These should be real business issues – either an issue for which they have found a solution (a knowledge offer), or a business issue which they are currently facing, where they need access to more knowledge to help them make the correct decision.  Then you display these posters or notes, and ask people to walk around and identify

  1. A knowledge need they think they can help with
  2. A knowledge offer which they want to hear more about, because it will help solve a business issue for them.
Once these “matches” have been identified, then you set up follow-on conversations (either at the same event, or later) to transfer the knowledge.
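The matching step at the heart of a Knowledge Market (and of the Susie-and-Peter example earlier) can be sketched in a few lines of code. The people, topics and data structures here are hypothetical illustrations, not a prescribed tool:

```python
# Match "Knowledge Needs" to "Knowledge Offers" by shared topic.
# People and topics are invented examples.

from collections import defaultdict

def match(needs, offers):
    """Pair each person with a need to everyone offering that topic."""
    offers_by_topic = defaultdict(list)
    for person, topic in offers:
        offers_by_topic[topic].append(person)
    return [(seeker, helper, topic)
            for seeker, topic in needs
            for helper in offers_by_topic[topic]]

needs = [("Susie", "red"), ("Tom", "blue")]
offers = [("Peter", "red"), ("Jane", "green")]

print(match(needs, offers))  # → [('Susie', 'Peter', 'red')]
```

Tom’s “blue” need finds no match, which is itself useful information: it is a visible knowledge gap that the organisation may need to fill from outside.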

An online (or physical) Knowledge Wants and Offers board

“Wants and offers” forums are popular as a way for people to sell and buy items (see this example). In the UK you see this in physical form in supermarkets, where someone looking for accommodation, or with a bed for sale, puts a card up on a notice board.

You can do the same for Knowledge, and provide an online site, or (in a shared office) a notice board (with pens, pins and cards) where people can post questions and offer solutions.  Online of course is easier, as you can click on a question to email an answer. Community of Practice discussion forums often become Wants and Offers forums, with people raising questions and offering solutions.

The Yellow Pages/People Directory

You may have some system of personal pages, where people identify their skills. Why not extend this to “knowledge needs” as well?

All of these methods make Knowledge, and the need for knowledge, visible, allowing matches to be made between Knowledge Suppliers and Knowledge Customers. 


Which Knowledge Management Processes add most value?

I blogged yesterday about the usage and value of Knowledge Management technologies. Here is a similar analysis, also drawn from our 2017 Global Survey of Knowledge Management, of the usage and value of KM processes.

We asked the survey participants to rate these different KM processes by the value they have added to their KM program, including in the question the option to choose “we do not use this process” or “it’s too early to tell”.

The chart above shows these processes in order of value from left to right, as a stacked area chart of responses, with the weighted value of each process overlain as a line (this line would be at 100% if all the participants that used the process claimed it had “high value”, and at 0% if they all claimed it had no value). The height of the dark grey area represents usage, while the light grey area represents the “Not Used” responses.

288 people answered this question on the survey, from a wide range of organisations around the globe.
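The weighted-value line on the chart can be reproduced roughly as follows. The rating labels and the weights below are assumptions for illustration, since the post does not give the survey’s exact scoring scheme:

```python
# Weighted value of a KM process from survey ratings.
# Assumed weights: "high value" = 1.0, "some value" = 0.5, "no value" = 0.0.
# "Not used" and "too early to tell" responses are excluded from the weighting.

WEIGHTS = {"high value": 1.0, "some value": 0.5, "no value": 0.0}

def weighted_value(ratings):
    """Return the 0-100% weighted value among respondents who use the process."""
    used = [r for r in ratings if r in WEIGHTS]
    if not used:
        return 0.0
    return 100.0 * sum(WEIGHTS[r] for r in used) / len(used)

ratings = ["high value", "some value", "not used", "high value", "no value"]
print(weighted_value(ratings))  # → 62.5
```

Under this scheme the line sits at 100% only when every user of the process rated it “high value”, matching the description of the chart above.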

The processes are also listed below in order of the usage figures, and in order of the average value assigned by the respondents.

Knowledge Management processes in order of usage (most common at the top):

  1. coaching and mentoring
  2. project lessons capture (large scale)
  3. after action review (small scale)
  4. knowledge roundtables
  5. Peer Assist
  6. retention interviews
  7. storytelling
  8. action learning
  9. knowledge cafe
  10. crowdsourcing
  11. open space
  12. appreciative enquiry
  13. Innovation deepdive
  14. wikithon
  15. positive deviance

Knowledge Management processes in order of the assigned value when used (those rated most valuable at the top):

  1. knowledge roundtables
  2. coaching and mentoring
  3. project lessons capture (large scale)
  4. after action review (small scale)
  5. action learning
  6. Peer Assist
  7. retention interviews
  8. knowledge cafe
  9. Innovation deepdive
  10. storytelling
  11. appreciative enquiry
  12. open space
  13. crowdsourcing
  14. positive deviance
  15. wikithon

Comparison of usage and value

As with the Technology results, there is a strong correlation between usage and value. This could represent a tendency for the more valuable KM processes to get the greatest use. This is a perfectly valid interpretation.  An alternative argument would be to say that processes deliver more value the more they are used. Processes at the top of the list are mainstream processes, used frequently, and delivering high value. Processes at the bottom of the list are less mainstream, and deliver less value to the companies that use them, because those companies make less use of these processes. This is also a plausible interpretation.

Even with this interpretation, we could still look for “Good performing” processes which deliver more value than their popularity would imply (and so are significantly higher in the value list than in the popularity list), and “Poor performing processes” which deliver less value than their popularity would imply.

Under this interpretation, the best performing KM processes are Innovation Deepdive, Knowledge Roundtable meetings and Action Learning (each of them 3 or 4 places higher in the Value list than in the Usage list), and the poorest performing processes in terms of value per use are Crowdsourcing and Storytelling.
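This kind of “value per use” comparison is simply a difference of rank positions between the two lists, and can be checked mechanically. The lists below are abbreviated to five processes near the top of both rankings, purely for illustration:

```python
# Find processes that sit higher in the value ranking than in the usage ranking.
# Abbreviated lists; the full rankings appear in the tables above.

usage_order = ["coaching and mentoring", "project lessons capture",
               "after action review", "knowledge roundtables", "Peer Assist"]
value_order = ["knowledge roundtables", "coaching and mentoring",
               "project lessons capture", "after action review", "Peer Assist"]

def rank_shift(name):
    """Positive = ranked higher on value than on usage (an over-performer)."""
    return usage_order.index(name) - value_order.index(name)

for process in usage_order:
    print(process, rank_shift(process))
```

Within this abbreviated set, knowledge roundtables come out with a shift of +3, consistent with their position as an over-performer in the full analysis.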

Changes since the 2014 survey

We saw similar results in the 2014 survey, with processes such as Knowledge Roundtable, After Action Review and Coaching and Mentoring all popular and performing well. However, there are also some significant changes in both usage and value.

The processes which have seen the greatest increase in use between the two surveys in 2014 and 2017 are Project Lesson Capture, with a rise in usage of 5 places and in value of 6 places, and Storytelling (+7 in usage, +4 in value).

There have been some big fallers as well. Positive Deviance has dropped 9 places in usage and 8 places in value, and Crowdsourcing has dropped 6 places in usage and 9 places in value. We did not include Innovation deepdives in the 2014 survey. Peer Assist has also fallen in popularity, which is a shame.

It looks like the old staple processes of KM – the knowledge roundtable meetings, after action reviews, lesson capture, peer assist, and coaching and mentoring – remain the core process set for Knowledge Management.


Which Knowledge Management technologies add most value?

Interesting results are coming through from the Knoco 2017 Knowledge Management survey, including this plot of comparative KM technology value.

We asked the survey participants to rate these different types of technology by the value they have added to their KM program, including in the question the option to choose “we do not use this technology” or “it’s too early to tell”.

The chart above shows these technologies in order of value from left to right, as a stacked area chart, with the weighted value shown as a blue line (this line would be at 100% if all the participants that used this technology claimed it had “high value”, and at 0% if they all claimed it had no value).

The top of the grey area represents the usage percentage for these technologies, while the light grey area above represents people who do not use this technology. The top of the green area represents the percentage of people who said this technology had added “large value”.

288 people answered this question.

The technology types are listed below in order of usage, and in order of value.

Technology type in order of usage (most common at the top):

  • Best practice repository
  • Document collaboration
  • eLearning
  • People and expertise search
  • Enterprise search
  • Enterprise content management
  • Portals (non-wiki)
  • Video publication
  • Question and answer forums
  • Blogs
  • Lessons Management
  • Microblogs
  • Brainstorming/ideation/crowdsourcing
  • Wikis
  • Social media other than microblogs
  • Expert systems
  • Data mining
  • Innovation funnel
  • Semantic search

Technology type in order of value delivered when used (most valuable at the top):

  • Enterprise search
  • Best practice repository
  • Document collaboration
  • Enterprise content management
  • eLearning
  • Portals (non-wiki)
  • People and expertise search
  • Question and answer forums
  • Lessons Management
  • Expert systems
  • Brainstorming/ideation/crowdsourcing
  • Microblogs
  • Video publication
  • Social media other than microblogs
  • Wikis
  • Semantic search
  • Data mining
  • Innovation funnel
  • Blogs

Comparison of usage and value

There is a strong correlation between usage and value. This could represent a tendency for the more valuable technologies to get the greatest use. This is a perfectly valid interpretation.  An alternative argument would be to say that technologies deliver more value the more they are used. Technologies at the top of the list are mainstream technologies, used frequently, and delivering high value. Technologies at the bottom of the list are less mainstream, and deliver less value to the companies that use them, because those companies make less use of these technologies. This is also a plausible interpretation.

Even with this interpretation, we could still look for “Good performing” technologies which deliver more value than their popularity would imply, and “Poor performing technologies” which deliver less value than their popularity would imply.

Under this interpretation, the best performing technologies are Enterprise Search and Expert Systems (both of them several places higher in the Value list than the Usage list), and the worst performing technology in terms of value per use is Blogs.

This does not necessarily mean Blogs are a bad technology; it probably means they are not being used in ways that add KM value.

Changes since the 2014 survey

We saw very similar results in the 2014 survey, again with Blogs being the poorest performing technology given their usage figures, and again with the best performing technologies in terms of value vs use being Enterprise Search and Expert Systems.

Those technologies which have most increased in use between 2014 and 2017 are Microblogs and video publication, and not surprisingly these have also seen the greatest increase in value delivery as well. The technology which has decreased in use the most over the last 3 years is the innovation funnel technology.


What the NASA CKO said about KM policies

Knowledge Management policies are still rare, and opinion on them is divided. Here is what the CKO of NASA said about the topic.


Knowledge Management policies are coming.

When the ISO KM standard is in place, next year or the year after, a KM policy becomes a requirement under the standard. This requirement is not unique to KM – all the ISO Management System standards require a policy. After all, can an organisation be said to have adopted a management system if there is no policy?

However, many people are resistant to KM policies. “Added bureaucracy” they say. “We have a strategy – we don’t need a policy” they say. “We are getting by OK without one” they say.

The NASA CKO, Ed Hoffman (now retired from NASA) used to be similarly sceptical, but is now a big convert. Here is what he says on the matter.

“A policy sends a number of messages.

First, it declares that we, as an organization, recognize what’s important.

Second, it identifies a community of people who are held accountable for taking action.

Third, a policy indicates that the organization and its leaders want to make sure things are done the right way. It sets a course without being overly prescriptive.

Fourth, excellent organizations make a practice of communicating what they really stand for”.

This is hard to argue with really. The policy is “a statement of what we really stand for”, and if you don’t have a policy for KM, do you really stand behind the topic?

The NASA KM policy is not a top-down mandate but establishes a federated approach for governance of knowledge.  As the CKO says

“Each center and mission directorate will develop its own strategy, with the understanding that knowledge will be shared across the agency to the greatest extent possible. The policy unifies these efforts.

I am optimistic that the knowledge policy represents a significant step toward helping NASA achieve its potential as a learning organization. We have built a community that shares a commitment to sustaining NASA’s knowledge resources, and we have charted a course toward greater integration across the agency.”

If you have been doing Knowledge Management for a few years – if you feel that KM is becoming embedded in the organisation, but needs greater integration and greater commitment – then your next step is probably to craft a Knowledge Management Policy.


A value-led KM story

I have blogged many times about how Knowledge Management should be value-led, and driven by the needs of the business. Here’s a story of how one KM Community leader helped define that value in a very graphic form. 


The story was told to me by my friend Johnny, who was at one time the leader of a highly successful Community of Practice in the Oil Refining sector. The great thing about Johnny’s story is the way that the Community were able to make an enemy of the waste in the production process.

Johnny took this Cost of Lost Knowledge and personalised it as a thief – “The Phantom”.

Here’s Johnny’s story.

“It is always a hard one – to wrestle with the value that a community can deliver. It is very difficult to measure somehow. In some ways you just know instinctively that it has made a difference, but to actually pin a monetary value on it, is sometimes very very difficult.

“However, we recognised that in the operating area, there was money disappearing. Every year, while the plants are running, we say “this plant could have run better – we could have got more out of this asset”. So we have lost money somewhere along the line. 

“We like to spin it around, and say that it is money that has gone to The Phantom. It has disappeared, you can’t recover it. So we need to go after this Phantom Money. 

“One of the tools that we have in the Community tool box is “Capturing The Phantom“. We actually go after the drips and the bits and pieces like that. And shared learning is very important. If you can capture what people have done before you, you can get enormous value from that. We can start to measure that less and less money is going to The Phantom. 

“People understand what The Phantom is, and they also understand that we can maybe capture The Phantom, but if we take our eye off it, The Phantom will come back. The Phantom will always come back!”


27 ways in which a Community of Practice can add value

How can communities of practice add value? Let me count the ways.


Here’s a list we made of 27 different mechanisms by which a community of practice can add value to an organisation.

No doubt you can think of more!

Community members can

  1. solve problems for each other
  2. “Learn before” starting a piece of work – using the CoP as a “Peer Assist” mechanism
  3. “Learn during” a piece of work, drawing on the knowledge of the CoP
  4. “Learn after” by sharing lessons with the community
  5. support each other emotionally, through messages of support or congratulations
  6. benchmark performance with each other
  7. exchange resources through the community, such as tools, templates and approaches
  8. collaborate on purchasing (buying things that any one member could not justify)
  9. collaborate on contracts (using the purchasing power of the community)
  10. cooperate on trials and pilots
  11. share results of studies, and maybe remove the need for others to re-do the same study
  12. exchange equipment (re-use old equipment, share spares)
  13. mentor and coach each other
The community collectively can
  1. collaborate on a community blog, to act as a real-time story of what the community is collectively learning
  2. act as a learning resource for new staff
  3. build and maintain documented Best Practices, perhaps using a community wiki as a shared knowledge base
  4. build and maintain a curated document base as a shared resource
  5. decide a taxonomy and/or metadata scheme so members can record their knowledge in a consistent way
  6. recognise the most useful resources (for example through feedback and voting)
  7. recognise the most helpful and generous sharers (for example through “contributor of the year” awards)
  8. develop lists of common risks and warning signs (and what to do when you see them) 
  9. develop checklists and templates for member use
  10. create knowledge products for use by clients or customers
  11. identify knowledge retention issues 
  12. identify training gaps and collaborate on training provision 
  13. innovate new products, services or opportunities by combining ideas from everyone
  14. advise the organisation on strategy


Are communities of practice getting smaller?

A preliminary result from the Knoco 2016 survey suggests that Communities of Practice may be getting smaller over time.

In our survey of Knowledge Management around the world, we asked a series of questions about Communities of Practice, one of which was “What is the typical size (number of members) of your CoPs and Knowledge Sharing networks? Please choose the nearest number from the list below”.

We then gave them a series of size ranges to choose from.

The graph above shows the number of responses for each size range; I have restricted this graph to internal CoPs rather than external. The results from this year’s survey are in blue, and the results from the 2014 survey are in red.

The number of CoPs of around 10 people has increased over the last 2 years, while the number in every other size range other than the largest has decreased.

The plot below shows this as a percentage of the results rather than the absolute number of results.

This is an interesting result, and if real it is perhaps rather worrying, given that larger CoPs generally perform much better than smaller ones.

Has anyone got any ideas why CoPs seem to be getting smaller, or smaller CoPs becoming more common?

