Tuesday, 18 October 2016

What modern organisations can learn from the Bletchley Park code-breakers

A few weeks ago I wrote about my eye-opening visit to Bletchley Park in a post called 'Huts and Silos'. This inspired me to arrange a KIN Site Visit to the home of WW2 codebreaking. The idea was to see what modern organisations could learn from Bletchley Park's innovation, collaboration and organisational set-up. On Friday, 13 of us had an inspiring tour of the site; this is the result of our reflections at the end of the day.

Participant observations of the Bletchley Park operation, with possible lessons for modern organisations:

Observation: A diversity of backgrounds and professions was represented. Unusually, class distinctions were immaterial.
Lesson: Different perspectives and backgrounds mean a higher likelihood of finding solutions to problems, and complementary skill sets.

Observation: 'Silo' working at Bletchley Park was a necessity for security reasons.
Lesson: Sometimes there is a good reason for clear separation of operations, for example Chinese Walls for financial operations.

Observation: Despite much of the work being tedious and the workers conscripted, morale was high and ambitious targets were achieved.
Lesson: Intrinsic motivation (having a goal that workers believe in and work that plays to strengths) can compensate for difficult circumstances. It's not all about pay and rations (literally!).

Observation: Socialisation and relaxation were seen by senior management as important factors in managing stress and keeping productivity high, e.g. tennis, dances, beer!
Lesson: Informal spaces to relax and converse with co-workers are vital in building relationships, trust and the exchange of ideas (clearly the latter didn't apply at Bletchley Park).

Observation: Unusually for the time, female staff at Bletchley Park (two-thirds of the total) received equal pay to men. Note: we are unsure if this applied just to the code-breakers, or to all female staff.
Lesson: One hopes that equal pay is no longer an issue, but we must be vigilant with regard to biases. The KIN Spring 2017 Workshop will address this issue.

Observation: Individuals with specialist skills were given very specific tasks; they were not asked to be generalists.
Lesson: Too often experts are asked to take on generalist roles (such as managing teams or budgets). This can be a distraction, or cause stress or under-performance.

Observation: There were many failed attempts at problem solving. This was anticipated, and processes were in place to understand the root cause of failure. In one instance, the Navy code-breakers endured nine months of repeated failure before cracking a problem.
Lesson: We need to have a defined level of tolerance for failure, and ensure processes are in place to take action as a result. 'Anyone who has not experienced failure has never tried anything new' – attributed to Albert Einstein.

Observation: The code-breakers had to deal with up to an astonishing 6,000 messages per day. These had to be processed before midnight every day, when the Enigma settings changed. The industrialisation of the processing and analysis may be the first example of Big Data and Data Analytics.
Lesson: Processes and skills for the analysis of huge volumes of real-time data are becoming ever more important. AI may be a way of understanding hidden patterns and inferences (see KIN Winter Workshop, 7th December).

Observation: The actors in 'The Imitation Game' spent time talking directly with Bletchley Park veterans to understand what it was like to work there.
Lesson: First-hand, verbatim knowledge is vital in understanding context and nuance for handovers and other knowledge transfer situations.

Observation: Having tough targets and working under critical time constraints can sometimes foster ingenious solutions, for example the 'cribs' shortcuts.
Lesson: Sometimes disturbing the status quo or adopting counter-intuitive approaches can foster innovation.

Observation: A good source of personnel was cryptic crossword puzzle fanatics and other critical thinkers.
Lesson: Do we encourage critical thinking and individualism sufficiently in our education systems?

Monday, 3 October 2016

Rhetoric - a much-needed skill for knowledge workers

Rhetoric seems to have negative connotations these days. That's a shame, as Aristotle's approach to 'the art of effective or persuasive speaking or writing' is a skill that anyone effecting change in organisations must have. Female staffers at the White House have proactively employed rhetoric in a very innovative and specific way to get their voices heard.
While we're on the subject of philosophers, the most effective masterclass technique that I train facilitators in is Socratic knowledge transfer. Too often deep experts reach for their PowerPoint slides and simply impart their wisdom, without demanding critical thinking. It is much better to have a dialogue based on seeded 'judgement call questions' and elicit personal insights or experience from all those participating. Getting the 'expert' to hold off imparting their solution or answer until the end of the discussion is tricky!
The process involves a carefully selected and rehearsed case study that gives plenty of context and has two or three decision points which rely on judgement. The 'expert' pauses at each judgement call and asks, for example, 'what would you do?', 'what else do we need to know?' or 'what do you think happened next?'. On several occasions an entirely novel approach or solution has emerged that the 'expert' had not considered.
One vital component is getting the right participants. Everyone invited should potentially have something to contribute to the topic. In that way there is not just one 'expert' in the room.
The process is particularly effective in generating new insights and conveying complex ideas, but needs careful coaching and facilitation. 

Thursday, 11 August 2016

Experts, Zen masters (& David Brent)

What is an 'expert'? In my work facilitating knowledge transfer I have come across individuals who have widely varying degrees of expertise. Some are very aware of their unique know-how, others less so.

Low Competency / Low Consciousness

There are two categories here; firstly, those who are blissfully unaware that they lack any expertise. A baby would be an example. Secondly, there are those who profess to be expert, but are actually incompetent and unaware of their incompetence. Rather than reach for an obvious example from US politics, I propose David Brent, from The Office. This phenomenon is called the Dunning-Kruger effect, named after the Cornell University professors who published the seminal research paper entitled 'Unskilled and Unaware of It'. This is why validation and looking for evidence is such an important part of the knowledge transfer process. Incidentally, did you know that 62% of all software engineers rate themselves in the top 5% of their profession?

The trouble with the world
is that the stupid are cocksure
and the intelligent are full of doubt.
— Bertrand Russell

Low Competency / High Consciousness

Novices are obvious examples of individuals who are aware of their inexperience. One interesting observation about those who claim to have low competency is that sometimes it just takes a skilled interviewer to reveal a latent talent. One group that often benefits from this help is job-changers who are unsure of their place. It can also be hugely motivating for them.

High Competency / High Consciousness

This group is the most obvious to classify as 'expert'. They can easily tell you what's right (or wrong) and why, and can provide lots of evidence. They are usually confident in their ability and will sometimes claim that their expertise is in some way unique. This is worth testing in the knowledge transfer process: is it 'commodity' know-how? Is it easily codified? (It may deliberately not have been 'captured', in order to create an impression of uniqueness and inaccessibility.) In my knowledge elicitation process, I use a mining metaphor. These experts are good at providing ore (superficial knowledge) but find it difficult to come up with gems (detailed knowledge with the context that makes it accessible to others). Their preferred communication style is to 'tell'.

High Competency / Low Consciousness

The Zen masters. These individuals have deep expertise gained through years of practice. Ironically, this most valuable expertise is the hardest to pass on. Next time you are out on a golf course with a player who is far better than you, try this: as they tee up, ask them to explain how they play their perfect shot. Either they can't explain it, or their next shot will go into the rough. Like pro basketball players who instinctively know where every other player is, they can't explain their mastery, they just do it. This is why knowledge transfer for these deep experts benefits from skilled facilitation. The best environment to elicit this sort of know-how is a Socratic questioning approach or dialogue; quite different to the 'tell' approach of the 'expert'.

Friday, 15 July 2016

Augmented knowledge - the fourth channel

Ask anyone familiar with knowledge management what form organisational knowledge takes, and they will almost certainly mention tacit knowledge and explicit knowledge. They may also mention latent knowledge in networks. I'd like to propose a fourth - augmented knowledge. The coming-of-age of artificial intelligence, 'social robots' and big data is having a massive impact on the way decisions are made in organisations. It follows that if we are to maximise know-how and expertise, the outputs from this technology-enabled channel must be integrated into how we work. Augmenting judgement and experience in this way also supports the move towards evidence-based decision making.

It also drives demand for new skills to maximise these opportunities. Data analytics and blockchain coding are not esoteric, geeky pastimes; they are increasingly employed by major FMCG, finance, retail and law firms to highlight trends and real-time patterns that augment business acumen and expertise.

This chart does not imply a hierarchy, but shows how Augmented Knowledge fits with the more established Organisational Knowledge channels.

This perspective is my own, not necessarily representative of KIN's. Alternative views are welcome in the comments!

Augmented Knowledge will be explored in the Knowledge and Innovation Network Winter Workshop on 7th December on the theme of 'Organisational Learning in the Machine Intelligence Era'.

Monday, 11 July 2016

Trends in Big Data, Data Analytics and AI

I was asked by the Managing Partners Forum (MPF) recently to give a brief overview of the current status and industry trends in Big Data and Data Analytics, topics I've been keeping an eye on for several years. The slides are available on Slideshare. The following is a shortened abstract from the presentation.
One of the issues I have with Big Data is just that - the term "Big Data". It's fairly abstract and defies a precise definition. I'm guessing the name began as a marketing invention, and we've been stuck with it ever since. I'm a registered user of IBM's Watson Analytical Engine, and their free plan has a dataset limit of 500 MB. So is that 'Big Data'? In reality it's all relative. To a small accountancy firm of 20 staff, their payroll spreadsheet is probably big data, whereas the CERN research laboratory in Switzerland probably works in units of terabytes.
Eric Schmidt (Google) was famously quoted in 2010 as saying "There were 5 exabytes of information created between the dawn of civilisation through 2003, but that much information is now created in 2 days". We probably don't need to understand what an 'exabyte' is, but we can get a sense that it's very big. More importantly, we get a sense of the velocity of information: according to Schmidt, as much information is now created every 2 days as was created in all of history up to 2003, and that rate has almost certainly increased in the 6 years since his original statement.
It probably won't come as a surprise to anyone that most organisations still don't know what data they actually have, or what they're creating and storing on a daily basis. Some are beginning to realise that these massive archives of data might hold useful information that can potentially deliver business value. But it takes time to access, analyse, interpret and act on the results of that analysis, and in the meantime the world has moved on.
According to the "Global Databerg Report" by Veritas Technologies, 55% of all information is considered to be 'Dark', or in other words, value unknown. The report goes on to say that where information has been analysed, 33% is considered to be "ROT" - redundant, obsolete or trivial. Hence the 'credibility' gap between the rate at which information is being created, and our abilities to process and extract value from this information before it becomes "ROT".
But the good news is that more organisations are recognising that there is some potential value in the data and information that they create and store, with growing investment in people and systems that can make use of this information.
The PwC Global Data & Analytics Survey 2016 emphasises the need for companies to establish a data-driven innovation culture – but there is still some way to go. Those using data and analytics are focused on the past, looking back with descriptive (27%) or diagnostic (28%) methods. The more sophisticated organisations (a minority at present) use forward-looking predictive and prescriptive approaches to data.
What is becoming increasingly apparent is that C-suite executives, who have traditionally relied on instinct and experience to make decisions, now have the opportunity to use decision support systems driven by massive amounts of data. Sophisticated machine learning can complement experience and intuition. Today's business environment is not just about automating business processes – it's about automating thought processes. Decisions need to be made faster in order to keep pace with a rapidly changing business environment, so decision making based on a mix of mind and machine is now coming into play.
One of the most interesting by-products of this Big Data era is 'Machine Learning' - mentioned above. Machine learning's ability to scale across the broad spectrum of contract management, customer service, finance, legal, sales, pricing and production is attributable to its ability to continually learn and improve. Machine learning algorithms are iterative in nature, constantly learning and seeking to optimise outcomes. Every time a miscalculation is made, the algorithm corrects the error and begins another iteration of the data analysis. These calculations happen in milliseconds, which makes machine learning exceptionally efficient at optimising decisions and predicting outcomes.
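To make that 'measure the error, correct, repeat' loop concrete, here is a minimal sketch in Python of one of the simplest machine learning procedures, gradient descent on a one-variable prediction model. The toy data, learning rate and iteration count are invented for illustration; real systems use far richer models and data, but the iterative structure is the same.

```python
# Minimal illustration of iterative learning: fit y = w * x to toy data by
# repeatedly measuring the prediction error and correcting the weight.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8)]  # invented (x, y) pairs
w = 0.0                 # initial guess for the model weight
learning_rate = 0.01

for iteration in range(1000):
    # How wrong is the current model? (gradient of the mean squared error)
    gradient = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    # Correct the weight in the direction that reduces the error, then repeat
    w -= learning_rate * gradient

print(f"Learned weight: {w:.3f}")  # converges to roughly 2, i.e. y is about 2x
```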
So, where is all of this headed over the next few years? I can't recall the provenance of the quote "never make predictions, especially about the future", so treat these predictions with caution:
  1. Power to business users: Driven by a shortage of big data talent and the ongoing gap between needing business information and unlocking it from the analysts and data scientists, there will be more tools and features that expose information directly to the people who use it. (Source: Information Week 2016)
  2. Machine generated content: Content that is based on data and analytical information will be turned into natural language writing by technologies that can proactively assemble and deliver information through automated composition engines. Documents currently written by people, such as shareholder reports, legal documents, market reports, press releases and white papers, are prime candidates for these tools. (Source: Gartner 2016)
  3. Embedding intelligence: On a mass scale, Gartner identifies "autonomous agents and things" as one of the up-and-coming trends, which is already marking the arrival of robots, autonomous vehicles, virtual personal assistants, and smart advisers. (Source: Gartner 2016)
  4. Shortage of talent: Business consultancy A.T. Kearney found that 72% of market-leading global companies reported that they had a hard time hiring data science talent. (Source: A.T. Kearney 2016)
  5. Machine learning: Gartner said that an advanced form of machine learning called deep neural nets will create systems that can autonomously learn to perceive the world on their own. (Source: Ovum 2016)
  6. Data as a service: IBM's acquisition of the Weather Company -- with all its data, data streams, and predictive analytics -- highlighted something that's coming. (Source: Forrester 2016)
  7. Real-time insights: The window for turning data into action is narrowing. The next 12 months will be about distributed, open source streaming alternatives built on open source projects like Kafka and Spark; see the sketch after this list. (Source: Forrester 2016)
  8. Roboboss: Some performance measurements can be consumed more swiftly by smart machine managers, aka "robo-bosses", who will perform supervisory duties and make decisions about staffing or management incentives. (Source: Gartner 2016)
  9. Algorithm markets: Firms will recognise that many algorithms can be acquired rather than developed - "just add data". Examples of services available today include Algorithmia, DataXu and Kaggle. (Source: Forrester 2016)
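As a flavour of what 'real-time insights' (point 7 above) can look like in practice, here is a minimal sketch of a streaming consumer. It assumes the open-source kafka-python client, a broker on localhost:9092 and a hypothetical 'transactions' topic carrying JSON events; a production pipeline would more likely sit on Spark Structured Streaming or similar, but the principle of acting on events as they arrive, rather than in overnight batches, is the same.

```python
# Minimal streaming sketch: act on each event as it arrives.
# Assumes the kafka-python package, a broker on localhost:9092 and a
# hypothetical 'transactions' topic carrying JSON messages with an 'amount' field.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'transactions',
    bootstrap_servers='localhost:9092',
    value_deserializer=lambda raw: json.loads(raw.decode('utf-8')),
)

running_total = 0.0
for message in consumer:                   # blocks, yielding each new event
    event = message.value
    amount = event.get('amount', 0.0)
    running_total += amount
    if amount > 10000:                     # illustrative threshold for an alert
        print(f"Large transaction flagged: {event}")
```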
The one thing I have taken away from the various reports, papers and blogs I've read as part of this research is that you can't think about Big Data in isolation. It has to be coupled with cognitive technologies - AI, machine learning or whatever label you want to give it. Information is being created at an ever-increasing velocity, and the window for decision making is getting ever narrower. These demands can only be met by coupling Big Data and Data Analytics with AI.

Steve Dale (for KIN Enterprise Technology SiG)