When "cutting corners" in KM destroys value

Here is a sad story about how trying to save costs in KM destroyed value.

The moral of the story is that bringing people together face to face, and letting them experience the knowledge for themselves, in context, established the necessary credibility for re-use.

The organisation in question had an effective approach to knowledge sharing. They would identify those parts of the business which had knowledge to share (the “supplier” business units), and they would set up knowledge visits from the other business units to come and see for themselves what the supplier business unit had to offer.

The people who came on these learning visits were the operators, the foremen, and the knowledge workers. They could watch how the supplier business unit did things, they could identify new knowledge and new practices that they could use, and they could see for themselves how things were done. They could also suggest, and sometimes demonstrate, better practices they used at their own sites, so that knowledge transfer was two-way.

This approach worked.  Because the people coming on the visits were the people who would apply the knowledge, and because they could see that knowledge in context, they could go home convinced that they had seen a new and better way to do things.  Seeing was believing.  Knowledge was transferred, best practices reapplied, and significant cost savings resulted.

Then management changed the system.

They believed that transferring knowledge in this way was expensive, as each knowledge transfer involved sending somebody to another country.  Cost pressure meant that there was a restriction on travel budgets, and they decided to do things a different way.  Instead of sending the operators, the foremen and the knowledge workers, they decided to send a few company experts instead.  The theory was that the experts would then come back and spread the knowledge around the other operating units, delivering knowledge transfer more cheaply.

What happened was that because the operators and knowledge workers had not personally seen the knowledge in use, the suggestions that the experts brought back lacked credibility.  The experts found it very difficult to get people to reuse knowledge which they had not personally seen in use.  Because they hadn’t seen, they didn’t believe, or at least they didn’t trust.  The rate of knowledge reuse dropped.

Eventually the entire system was cancelled.  Now the different operating units work in silos, with no sharing and reuse of best practice.

The moral of the story, of course, is that if you wish to transfer knowledge to somebody, then you should involve them first-hand in the transfer wherever possible, and bear the costs of this.  It’s very difficult for experts to transfer knowledge on someone else’s behalf.

Seeing is believing, and experiencing first-hand is important.  Certainly this may require an investment in travel, but the investment pays for itself in improved performance.  Trying to save some of the money risks losing all of the value, and the key people to involve in the knowledge transfer are the knowledge workers themselves.

View Original Source (nickmilton.com) Here.

7 reasons for KM failure in a Pharma company

An interesting study of a failed Knowledge Management project highlights the reasons for its failure.

Here is an interesting study of a failed KM approach in a Pharma organisation, by Ashley Braganza and Gerald Mollenkramer. It starts with the following vignette.

On a sunny morning in July 1999, Samuel Parsons, Head of Knowledge Management at PharmaCorp, convened his regular Monday team meeting. ‘Last Friday evening I was informed that Wilco Smith, Head of Pharma Global Order Handling Services, no longer wants KM. His only question now is how to off-board the KM staff.’ Thus came to an end a three-year initiative that at the outset was considered to be ‘the KM showcase for the firm’.

This has happened to too many organisations. We need to learn from failures such as this, and luckily that is what the authors of the study try to do. The authors draw 5 conclusions from the study, which largely echo my 7 reasons for KM failure. The 5 conclusions are as follows:

  1. Avoid defining knowledge within functions or silo-oriented communities of practice; instead define knowledge at the level of business processes. Within the study project, KM was operationalized through the functions, and each function had its own IT interface to the knowledge repository. As a result the relevance of stored content to others was not addressed, and the technology infrastructure became too fragmented and complex for the IT domain to deliver.
  2. Remember that knowledge is operationalized by people; hence, a knowledge management initiative must relate knowledge to people’s day jobs. People lacked a clear context for specifying what knowledge was business-critical, so each knowledge element was implicitly assigned equal weighting, and the specific elements that were business-critical got insufficient attention.
  3. Tacit knowledge resides within people and their behaviours; hence attempting to apply Information Technology to tacit knowledge is fraught with difficulty. Instead, it is explicit knowledge that is most susceptible to the application of Information Technology. Pharma focused primarily upon explicit knowledge. This is a common feature of many knowledge management projects that yield poor results.
  4. Knowledge is context-specific. It should be owned and maintained by people within the organization. Hence, external input to knowledge management initiatives must be carefully managed to ensure people within the organization are in control of the initiative at all times. Three different consulting firms were involved at different times, and the external consultants held key roles. This placed team members at a disadvantage when the consultants left.
  5. Implementing knowledge management involves change in the organization. Understand the organization’s willingness to change and manage people’s expectations appropriately. People at all levels of the organization must feel part of the knowledge management initiative, and senior managers must create a space within which people from different functions can come together to forge knowledge across each business process.


A case study of a failed learning system

When lesson learning failed in the Australian Defence Force, they blamed the database. But was this all that was at fault?

Here’s an interesting 2011 article entitled “Defence lessons database turns off users”. I have copied some of the text below, to show that, even though the lessons management software seems to have been very clumsy (which is what the title of the article suggests), there was much more than the software at fault.

“A Department of Defence database designed to capture lessons learned from operations was abandoned by users who set up their own systems to replace it, according to a recent Audit report. The ADF Activity Analysis Data System (ADFAADS) was defeated by a “cultural bias” within Defence, the auditor found. Information became fragmented as users slowly abandoned the system”.

So although the article title is “Defence lessons database turns off users”, the first paragraph says that it was “defeated by cultural bias”. There’s obviously something cultural at work here…

“Although the auditor found the structure and design of the system conformed to ‘best practice’ for incident management systems, users found some features of the system difficult to use. Ultimately it was not perceived as ‘user‐friendly’, the auditor found. Convoluted search and business rules turned some users against the system”. 

…but it also sounds like a clumsy and cumbersome system.

“In addition, Defence staff turnover meant that many were attempting to use ADFAADS with little support and training”.

…with no support and no training.

“An automatically-generated email was sent to ‘action officers’ listing outstanding issues in the system. At the time of audit, the email spanned 99 pages and was often disregarded, meaning no action was taken to clear the backlog”.

There needs to be a governance system to ensure actions are followed through, but sending a 99-page email? And with no support and follow-up?

 “It was common for issues to be sent on blindly as ‘resolved’ by frontline staff to clear them off ADFAADS, even though they remain unresolved, according to the auditor”.

Again, no governance. There needs to be a validation step for actions, and sign-off for “resolution” should not be devolved to frontline staff.

 “Apart from a single directive issued by Defence in 2007, use of the database was not enforced and there were no sanctions against staff who avoided or misused it”.

There’s the kicker. Use of the lessons system was effectively optional, with no clear expectations, no link to reward or sanction, no performance management. It’s no wonder people stopped using it.

So it isn’t as simple as “database turned off users”. It’s a combination of

  • Poor database
  • Poor notification mechanism
  • No support
  • No training
  • No incentives
  • No governance
  • No checking on actions

It’s quite possible that if the other items had been fixed, then people might have persevered with the clumsy database, and it’s even more likely that if they had built a better database without fixing the other deficiencies, then people still would not have used it.

What they needed was a lessons management system, not just a database.

So what was the outcome? According to the article,

…establish a clear role and scope for future operational knowledge management repositories, and develop a clear plan for capturing and migrating relevant existing information… prepare a “user requirement” for an enterprise system to share lessons.

In other words: “build a better database and hope they use it”. Sigh.


Is Learning from Failure the worst way to learn?

Is learning from failure the best way to learn, or the worst?

Classic Learning by Alan Levine on Flickr

I was driven to reflect on this when I read the following quote from Clay Shirky:

Learning from experience is the worst possible way to learn something. Learning from experience is one up from remembering. That’s not great. The best way to learn something is when someone else figures it out and tells you: “Don’t go in that swamp. There are alligators in there.”

Clay thinks that learning from (your own bad) experience is the worst possible way to learn, but perhaps things are more complex. Here are a few assertions.

  • If you fail, then it is a good thing to learn from it. Nobody could argue with that!
  • It is a very good plan to learn from the failure of others in order to avoid failures of your own. This is Clay’s point: learning only from your own failures is bad if, instead, you can learn from others. Let them fail, so you can proceed further than they did.
  • If you are trying something new, then plan for safe failure. If there is nobody else to learn from, then you may need to plan a fail-safe learning approach. Run some early stage prototypes or trials where failure will not hurt you, your project, or anyone else, and use these as learning opportunities. Do not wait for the big failures before you start learning.
  • Learn from success as well. Learn from the people who have avoided all the alligators, not just from the people that got bitten. And if you succeed, then analyse why you succeeded and make sure you can repeat the success.
  • Learning should come first, failure or success second. That is perhaps the worst thing about learning from experience – the experience has to come first. In learning from experience “the exam comes before the lesson.” Better to learn before experience, as well as during and after.  

Learning from failure has an important place in KM, but don’t rely on making all the failures yourself. 


A story of how a community lost trust

It is possible for the members of a Community of Practice to lose trust in the community as an effective support mechanism. Here’s one story of how that happened.

The story is from one of Knoco’s Asian clients.

  • This community started well, with 4 or 5 questions per week from community members. 
  • The community facilitator forwarded these questions to community experts to answer, rather than sending them to the whole community and making use of the long tail of knowledge.  This may well have been a cultural issue, as her culture reveres experts.
  • Sometimes the expert would answer on the community discussion forum, but most of the time they answered by telephone or personal visit. Therefore the community members did not see the answer, and were not even aware the question had been answered.
  • Often the expert did not have enough business context to answer the question (this is a complicated business), so when they did answer on the forum, the answer was vague and high-level. In a culture where experts are not questioned, nobody interrogated these vague answers to get more detail. 
  • Often the questions themselves were asked with very little context or explanation, so it was not possible to give good answers. The community facilitator never “questioned the question” to find out what the real issue was.
  • Where there was a discussion around the question, it very quickly went off-topic. Again the facilitator did not play an active role in conversation management.
  • When the facilitator followed up, to see if the questioner was satisfied by the answer, the answer was usually No.
  • A year later, the questions have dropped to 1 or 2 a month.

As far as the community members were aware through observing interactions on the forum, the questions seemed either to receive no answer (as the real discussion happened offline), or to receive worthless answers. The users lost trust in the community forum as a way to get questions answered effectively, and have almost stopped asking.

One way to revitalise this community will be to set up a series of face to face meetings, so that the members regain trust in each other as knowledgeable individuals, and then to ask the members to help design an effective online interaction. This will almost certainly involve asking the community and not the experts, and making much more use of the facilitator to get the questions clarified, to make sure the answers are posted online, to probe into the details of vague answers, and to keep the discussions on topic.

This sort of discussion is needed at community kick-off, so the community can be set up as an effective problem-solving body, and so that the members trust that their questions will be answered quickly and well.

If the members do not trust that the community will answer their questions, they will soon stop asking.


Make small mistakes

You will inevitably make mistakes in your Knowledge Management program. Make sure they are small ones, not fatal ones.

Knowledge Managers need to learn, learning requires experimentation, experiments often lead to mistakes, but mistakes can be costly and derail your program.  That’s a big dilemma for every Knowledge Manager.

You cannot afford to make a big mistake. Too often we see failed KM programs which have started with grand plans and expensive software purchases, failed to deliver, and set back the cause of KM in the organisation for many years. After a big expensive flop, KM will have a tarnished reputation and management will be reluctant to spend any more money. This can be a fatal KM mistake, and impossible to recover from.

Therefore implementing Knowledge Management should be a series of smaller steps and smaller experiments, where failure will not be fatal. Follow the approach below.
  1. Do your research. Find out what is involved in Knowledge Management. Understand what your organisation needs, and the type of KM framework which will support this. Conduct an assessment, review the culture, develop a strategy – all of this before you start to make any changes at all.
  2. See what others are doing. Research the world leaders in KM. Find a consultant who has worked with them, and who can share the details.
  3. Start with small experiments; proof-of-concept trials and pilots where you introduce a minimum viable KM framework. The proof of concept trials should be small enough that failure doesn’t matter; these are your chance to learn as you go, and to experiment. The Knowledge Management pilots can be a little larger, and should be set up to solve specific business problems, but can be a simplified version of the final Knowledge Management framework. Learn from the trials and pilots, until your final KM framework is bullet-proof.
  4. Roll out the framework.

Make all your mistakes in Stage 3 (and if you have been diligent in Stages 1 and 2, these mistakes should be few and minor). This is a far better approach than starting with Stage 4 and making your mistakes there.

Make small mistakes early, and avoid the large mistakes later.

