Thursday, 11 August 2016

Experts, Zen masters (& David Brent)



What is an 'expert'? In my work facilitating knowledge transfer I have come across individuals who have widely varying degrees of expertise. Some are very aware of their unique know-how, others less so.


Low competency/ Low consciousness:



There are two categories here. Firstly, there are those who are blissfully unaware that they lack any expertise; a baby would be an example. Secondly, there are those who profess to be expert, but are actually incompetent and unaware of their incompetence. Rather than reach for an obvious example from US politics, I propose David Brent, from The Office. This phenomenon is called the Dunning–Kruger effect, named after the Cornell University researchers who published the seminal paper 'Unskilled and Unaware of It'. This is why validation and looking for evidence is such an important part of the knowledge transfer process. Incidentally, did you know that 62% of all software engineers rate themselves in the top 5% of their profession?


The trouble with the world
is that the stupid are cocksure
and the intelligent are full of doubt.
— Bertrand Russell


Low Competency / High Consciousness


Novices are obvious examples of individuals who are aware of their inexperience. One interesting observation about those who claim to have low competency is that sometimes it just takes a skilled interviewer to reveal a latent talent. One group that often benefits from this help is job-changers, unsure of their place. It can also be hugely motivating for them.


High Competency / High Consciousness



This group is the most obvious to classify as 'expert'. They can easily tell you what's right (or wrong) and why, and can provide lots of evidence. They are usually confident in their ability and will sometimes claim that their expertise is in some way unique. This is worth testing in the knowledge transfer process: is it 'commodity' know-how? Is it easily codified? (It may not have been captured precisely in order to preserve an impression of uniqueness and inaccessibility.) In my knowledge elicitation process, I use a mining metaphor. These experts are good at providing ore (superficial knowledge) but find it difficult to come up with gems (detailed knowledge with the context that makes it accessible to others). Their preferred communication style is to 'tell'.


High Competency / Low Consciousness



The Zen masters. These individuals have deep experience gained through years of practice. Ironically, this most valuable expertise is the hardest to pass on. Next time you are out on a golf course with a player who is far better than you, try this: as they tee up, ask them to explain how they play their perfect shot. Either they can't explain it, or their next shot will go into the rough. Like pro basketball players who instinctively know where every other player is, they can't explain their mastery, they just do it. This is why knowledge transfer for these deep experts benefits from skilled facilitation. The best environment to elicit this sort of know-how is a Socratic questioning approach or dialogue; quite different to the 'tell' approach of the 'expert'.

Friday, 15 July 2016

Augmented knowledge - the fourth channel

Ask anyone familiar with knowledge management what form organisational knowledge takes, and they will almost certainly mention tacit knowledge and explicit knowledge. They may also mention latent knowledge in networks. I'd like to propose a fourth: augmented knowledge. The coming-of-age of artificial intelligence, 'social robots' and big data is having a massive impact on the way decisions are made in organisations. It follows that if we are to maximise know-how and expertise, the outputs from this technology-enabled channel must be integrated into how we work. Augmenting judgment and experience in this way also supports the move towards evidence-based decision making.

It also drives new skills needed to maximise these opportunities. Data analytics and blockchain coding are not esoteric geeky pastimes, but are increasingly employed by major FMCG, finance, retail, and law firms to highlight trends and real-time patterns that augment business acumen and expertise.

This chart does not imply a hierarchy, but shows how Augmented Knowledge fits with the more established Organisational Knowledge channels.

This perspective is my own, not necessarily representative of KIN's. Alternative views are welcome in the comments!

Augmented Knowledge will be explored in the Knowledge and Innovation Network Winter Workshop on 7th December on the theme of 'Organisational Learning in the Machine Intelligence Era'.

Monday, 11 July 2016

Trends in Big Data, Data Analytics and AI


I was asked by Managing Partners Forum (MPF) recently to give a brief overview of the current status and industry trends in Big Data and Data Analytics, topics I've been keeping an eye on for several years. The slides are available on Slideshare. The following is a shortened abstract from the presentation.
One of the issues I have with Big Data is just that - the term "Big Data". It's fairly abstract and defies a precise definition. I'm guessing the name began as a marketing invention, and we've been stuck with it ever since. I'm a registered user of IBM's Watson Analytical Engine, and their free plan has a dataset limit of 500 MB. So is that 'Big Data'? In reality it's all relative. To a small accountancy firm of 20 staff, their payroll spreadsheet is probably big data, whereas the CERN research laboratory in Switzerland probably works in units of terabytes.
Eric Schmidt (Google) was famously quoted in 2010 as saying "There were 5 exabytes of information created between the dawn of civilisation through 2003, but that much information is now created in 2 days". We probably don't need to understand what an 'exabyte' is, but we can get a sense that it's very big. What's more, we begin to get a sense of the velocity of information: according to Schmidt, as much is now created every two days as in all of recorded history up to 2003, and the rate has presumably only increased in the six years since his original statement.
It probably won't come as a surprise to anyone that most organisations still don't know what data they actually have, and what they're creating and storing on a daily basis. Some are beginning to realise that these massive archives of data might hold some useful information that can potentially deliver business value. But it takes time to access, analyse, interpret and apply actions resulting from this analysis, and in the meantime, the world has moved on.
According to the "Global Databerg Report" by Veritas Technologies, 55% of all information is considered to be 'Dark', or in other words, value unknown. The report goes on to say that where information has been analysed, 33% is considered to be "ROT" - redundant, obsolete or trivial. Hence the 'credibility' gap between the rate at which information is being created, and our abilities to process and extract value from this information before it becomes "ROT".
But the good news is that more organisations are recognising that there is some potential value in the data and information that they create and store, with growing investment in people and systems that can make use of this information.
The PwC Global Data & Analytics Survey 2016 emphasises the need for companies to establish a data-driven innovation culture – but there is still some way to go. Those using data and analytics are focused on the past, looking back  with descriptive (27%) or diagnostic (28%) methods. The more sophisticated organisations (a minority at present)  use a forward-looking predictive and prescriptive approach to data.
What is becoming increasingly apparent is that C-suite executives who have traditionally relied on instinct and experience to make decisions now have the opportunity to use decision support systems driven by massive amounts of data. Sophisticated machine learning can complement experience and intuition. Today's business environment is not just about automating business processes - it's about automating thought processes. Decisions need to be made faster in order to keep pace with a rapidly changing business environment. So decision making based on a mix of mind and machine is now coming into play.
One of the most interesting by-products of this Big Data era is 'Machine Learning' - mentioned above. Machine learning's ability to scale across the broad spectrum of contract management, customer service, finance, legal, sales, pricing and production is attributable to its ability to continually learn and improve. Machine learning algorithms are iterative in nature, constantly learning and seeking to optimise outcomes. Every time a miscalculation is made, machine learning algorithms correct the error and begin another iteration of the data analysis. These calculations happen in milliseconds, which makes machine learning exceptionally efficient at optimising decisions and predicting outcomes.
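To make that "correct the error, iterate again" loop concrete, here is a minimal sketch in Python: gradient descent fitting a straight line to a handful of noisy data points. The data and parameter names are invented for illustration; it is not any particular product's algorithm, just the simplest instance of the iterative self-correction described above.

```python
# Illustrative data: (x, y) pairs that roughly follow y = 2x
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]

w = 0.0              # model parameter (slope), initially a poor guess
learning_rate = 0.01

for step in range(1000):
    # Measure the current miscalculation: gradient of mean squared error
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    # Correct the error, then begin another iteration
    w -= learning_rate * grad

print(round(w, 2))  # converges towards ~2, the slope underlying the data
```

Each pass measures how wrong the current model is and nudges the parameter in the direction that reduces the error; production systems do the same thing, just over millions of parameters and data points.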
So, where is all of this headed over the next few years? I can't recall the provenance of the quote "never make predictions, especially about the future", so treat these predictions with caution:
  1. Power to business users: Driven by a shortage of big data talent and the ongoing gap between needing business information and unlocking it from the analysts and data scientists, there will be more tools and features that expose information directly to the people who use it. (Source: Information Week 2016)
  2. Machine generated content: Content that is based on data and analytical information will be turned into natural language writing by technologies that can proactively assemble and deliver information through automated composition engines. Content currently written by people, such as shareholder reports, legal documents, market reports, press releases and white papers are prime candidates for these tools. (Source: Gartner 2016)
  3. Embedding intelligence: On a mass scale, Gartner identifies "autonomous agents and things" as one of the up-and-coming trends, which is already marking the arrival of robots, autonomous vehicles, virtual personal assistants, and smart advisers. (Source: Gartner 2016)
  4. Shortage of talent: Business consultancy A.T. Kearney reported that 72% of market-leading global companies reported that they had a hard time hiring data science talent. (Source: A.T Kearney 2016)
  5. Machine learning: An advanced form of machine learning called deep neural nets will create systems that can autonomously learn to perceive the world on their own. (Source: Ovum 2016)
  6. Data as a service: IBM's acquisition of the Weather Company -- with all its data, data streams, and predictive analytics -- highlighted something that's coming. (Source: Forrester 2016)
  7. Real-time insights: The window for turning data into action is narrowing. The next 12 months will be about distributed, open source streaming alternatives built on open source projects like Kafka and Spark. (Source: Forrester 2016)
  8. Roboboss: Some performance measurements can be consumed more swiftly by smart machine managers, aka "robo-bosses", who will perform supervisory duties and make decisions about staffing or management incentives. (Source: Gartner 2016)
  9. Algorithm markets: Firms will recognize that many algorithms can be acquired rather than developed. "Just add data". Examples of services available today include Algorithmia, DataXu, and Kaggle. (Source: Forrester 2016)
The one thing I have taken away from the various reports, papers and blogs I've read as part of this research is that you can't think about Big Data in isolation. It has to be coupled with cognitive technologies - AI, machine learning or whatever label you want to give it. Information is being created at an ever-increasing velocity. The window for decision making is getting ever narrower. These demands can only be met by coupling Big Data and Data Analytics with AI.

Steve Dale (for KIN Enterprise Technology SiG)

Friday, 1 July 2016

Measuring the impact of intangibles - fairy dust or fair enough?

An organisation I have recently been working with has a problem.

They have used a Capability Maturity Modelling approach to measure the impact of their Knowledge Management program for a number of years, apparently very successfully. The majority of teams/departments that take part in the analysis appear to be achieving very high levels of competency (80%+), leaving little room for cross-department improvement. The organisation asked if there were alternative 'light-touch' methods of measuring the impact of their Knowledge Management program.

Whatever method you may use to measure, I believe the imperative requirements are: 
1. A baseline from which to judge improvement. 
2. Credible metrics that will convince both staff and senior managers. Beware of metrics from small samples that are extrapolated to organisation-wide improvement. These are liable to immediate challenge. 
3. Metrics that are empirical, to support other evidence that may be anecdotal.

Whilst not imperative requirements, the following are of added benefit: 
4. The ability to leverage results to drive further improvement.
5. Low cost (implying as much self-assessment and automation as sensible and possible).

In addition, it is helpful to differentiate between attributable impact and contributory impact. The latter is the most common scenario, as KM (if embedded in the business process, as it should be) is usually one of many simultaneous improvement activities. The exception to this is where you have the ability/luxury of having control groups for your KM initiatives. For example:




I looked at measurement of intangible organisational assets that are analogous to KM, in particular HR and Corporate Social Responsibility.

Ideas from CSR

A common CSR method is 'The Reputation Index'. Whilst rich in data, this cannot attribute organisational performance to CSR.
Warwick Business School has an excellent article from Prof Kamal Mellahi on measuring CSR impact. In particular note his final ‘cautionary tale’ sentence. 
Some organisations have adapted Delphi Analysis, but this is more suited to forecasting than performance management.



Ideas from HR

Almost all the measurement approaches for HR impact seem to feature some sort of maturity assessment. For example, this HR.com measurement table shows, in effect, a maturity model for HR 'stability'.
The leading HR forum is CIPD. They have a sophisticated measurement tool that relies on users' levels of agreement with statements. Again, the results look very much like a maturity model with an associated action plan.

This article gives a useful representation of cause and effect from HR. Step 5 emphasises the need to set the level of attribution correctly. This is also shown in the Andrew Mayo diagram below, which we examined at the last KIN Roundtable in May.






Organisational Network Analysis

Other than maturity modelling, the other approach that meets almost all the requirements listed above is Organisational Network Analysis (ONA), also known as Social Network Analysis (SNA). The exception may be Point 5, cost.
Andrew Parker at Grenoble University and Rob Cross at the University of Virginia are the 'go-to' people for ONA. Rob has an excellent ONA primer here.
KIN Facilitator Steve Dale is also qualified in SNA modelling.

Other KIN Members’ insights

Below is a mindmap synopsis of the major ‘Take-aways’ from a previous Roundtable I ran on KM measurement.


Conclusion

Assuming that you agree with the criteria I set out at the start…

I have not come across anything better than Capability Maturity Modelling that is suited to what the organisation is trying to achieve. My recommendation was therefore that they significantly refresh their maturity modelling process and re-benchmark. The re-benchmarking is needed because I suspect there was 'grade inflation', to borrow the phrase from education. If there is only small differentiation between departments' KM performance, it should be re-baselined and exposed to identify gaps and re-invigorate competition.

SNA is a sophisticated alternative that meets most of the criteria. It could be used to measure KM interactions through careful wording of the SNA questions. Strength, direction, density and centrality of bonds can be analysed, and teams' or departments' KM performance inferred. The resulting network infographics can be particularly useful in presenting results in an empirical way to senior managers. If you have not tried SNA before, this would be worthy of a trial. SNA software is now inexpensive (in some cases free), although proper set-up and analysis can still take a lot of expert input.
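Two of the measures mentioned above, density and centrality, are straightforward to compute. Here is a minimal sketch in pure Python; the people and "who do you ask for know-how?" ties are entirely invented for illustration, and real SNA tools offer many more measures than this.

```python
# Directed "A seeks knowledge from B" ties between four invented colleagues
people = ["Ana", "Bea", "Carl", "Dev"]
edges = [("Ana", "Bea"), ("Carl", "Bea"), ("Dev", "Bea"),
         ("Bea", "Ana"), ("Dev", "Carl")]

n = len(people)
# Density: share of all possible directed ties that actually exist
density = len(edges) / (n * (n - 1))

# In-degree centrality: how often each person is sought out, normalised
in_degree = {p: sum(1 for _, tgt in edges if tgt == p) / (n - 1)
             for p in people}
most_consulted = max(in_degree, key=in_degree.get)

print(f"density = {density:.2f}")         # 5 of 12 possible ties
print("most consulted:", most_consulted)  # Bea, the knowledge hub
```

Even this toy example shows the kind of result that resonates with managers: one person (here, the invented "Bea") emerges as the hub everyone consults, which is exactly the sort of hidden dependency a KM-focused SNA survey is designed to surface.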


What are your suggestions for a 'light-touch' approach to measuring a knowledge management program that you can share here?

Tuesday, 21 June 2016

Huts and Silos

There is a scene in 'The Imitation Game' movie where the brilliant Alan Turing, played by Benedict Cumberbatch, is assembling his famous 'bombe'. This is not an Italian ice cream, but a huge, sophisticated mechanical calculator that helped accelerate the deciphering of German Enigma codes. Whilst most of the film is acknowledged as a faithful representation of the amazing work that went on at Bletchley Park during the war, there were two small factual inaccuracies.

Firstly, the original Bombe was built by Polish codebreakers in 1938. Turing's and Gordon Welchman's genius transformed it into a more accurate, electro-mechanical device that could simultaneously process 36 Enigma codes, and at phenomenal speed. Over 200 Bombes were built (not one survived the end of the war), giving industrial capability in codebreaking. Secondly, whilst Turing designed the Bombes, he didn't actually build them himself. Harold Keen was the amazing engineer who physically constructed the machines.


How do I know this? Last weekend we visited Bletchley Park and it was a revelation. The geniuses and support staff (there were 10,000 of them) that worked there are credited with shortening the Second World War by at least 2 years and saving millions of lives.

The teams at Bletchley Park worked in strictly demarcated teams for security, such that almost none of them had the full picture of what the whole was doing. The 'Listeners' who spent literally years in headphones patiently writing down reams of meaningless German Morse code, had no idea what the team next door did. They never asked. In fact they were separated physically, in numbered huts.

Despite this, the whole functioned as an efficient and effective operation. This was largely down to Commander Alastair Denniston, whose own genius lay in unparalleled leadership and organisational skills. Working with highly able, sometimes petulant, experts who were not used to collaborating, he got the most out of each team.

Many large organisations suffer from isolated knowledge, constrained by department or other 'silos' (aka huts). For ideas about how effective organisation and leadership can improve performance and how to get your experts to give their best, I highly recommend a visit to Bletchley Park. We went for a morning and spent the entire day.

Photo credit: Rebuilt Bombe, Wikipedia