Ask anyone who their ideal boss would be, and they’re unlikely to name a dictator.
We want our bosses to come with empathy, understanding and at least a basic awareness of social skills. We prefer to be motivated by carrots rather than sticks.
Yet the reality is, we are heading down a much scarier path. We are hiring a new generation of single-minded, half-blind autocrats, the sort of people who would make you shake with fear if you spotted them on your org chart. And we’re doing it without realising.
Who are these dictators? Well, actually, they’re not people. But we’re giving them the same level of power that we used to give to people.
The new autocrats are the systems we’re installing.
The rise of NBA
No, nothing to do with basketball. A new trend called Next Best Action (NBA) is rapidly taking over the life sciences industry. Computers are being deployed to assess and analyse risk, absorb as much information as possible, and spit out recommendations to field reps, directing them to the most intelligent move.
Sounds smart, right?
Yes – from a manager’s point of view. From an employee perspective, it could be the beginning of the end.
Currently, the majority of systems bark out a single instruction: ‘THIS IS WHAT YOU SHOULD DO TODAY’. Worse still, they are black boxes: no reasoning or logic is offered for each command.
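To make the pattern concrete, here is a deliberately minimal sketch in Python of that single-instruction style of system. The Action fields, the scores and the example actions are all invented for illustration; this is not how any particular vendor’s product works.

```python
from dataclasses import dataclass

@dataclass
class Action:
    description: str  # e.g. a suggested call, email or meeting for the rep
    score: float      # value of the action as estimated by an upstream model

def next_best_action(candidates: list[Action]) -> str:
    # The rep sees only the winning instruction: not the scores,
    # not the runners-up, and not why this action came out on top.
    best = max(candidates, key=lambda a: a.score)
    return f"THIS IS WHAT YOU SHOULD DO TODAY: {best.description}"

print(next_best_action([
    Action("Call Dr Garcia about the new dosing guidance", 0.62),
    Action("Email Dr Patel the latest efficacy data", 0.81),
    Action("Book a lunch-and-learn with the cardiology team", 0.47),
]))
```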
We know from countless examples that a lack of understanding or alignment between those in authority and their staff breeds resentment.
Resentment leads to distrust, and distrust leads to poor adherence, lower performance and a sense of injustice.
Now, whilst I don’t think that the next Terminator movie will be filmed in a pharma rep’s office, I do still think this detachment will eventually result in an uprising – a backlash against management-by-machine.
Explainability is everything
These new systems rely on artificial intelligence (AI), but like all intelligences, they are not created equal. You are probably already familiar with the ‘rules-based’ approach, where we give a computer specific instructions for completing its task, like stabilisers on a bicycle. If the system is smart enough, however, we can instead enable ‘deep’ AI, where the system figures out how to solve the problem on its own. This is, in fact, the greatest advance in the sophistication of AI over the past decade.
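The contrast is easier to see in code. Below is a toy comparison, assuming scikit-learn and NumPy are available: the rules-based priority function is fully legible because a human wrote every branch, whereas the ‘deep’ model’s logic lives in thousands of fitted weights. The features, thresholds and data are all invented for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Rules-based: the reasoning is the code itself, readable by anyone.
def rules_based_priority(days_since_last_visit: int, is_high_prescriber: bool) -> float:
    if is_high_prescriber and days_since_last_visit > 30:
        return 1.0   # overdue visit to a key account
    if days_since_last_visit > 90:
        return 0.5   # long-neglected account
    return 0.1       # no urgent reason to act

# 'Deep' approach: the system fits its own representation from data.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                # 20 engagement features per account
y = (X @ rng.normal(size=20) > 0).astype(int)  # stand-in historical outcomes

model = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500).fit(X, y)
priority = model.predict_proba(X[:1])[0, 1]    # a number, with no stated reason
```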
The problem with using the latter is that while you get superior intelligence, the computer effectively creates its own unintelligible language, and it becomes impossible for humans to understand how it came to a conclusion. The smarter the system, the more opaque it is.
So much of our latest effort is not just about making the systems smarter; it’s about making them explainable. Only the most capable systems come with a high-quality explainability engine, and whilst explainability may not seem as important as accuracy or reliability, in the medium-to-long term it is what will make the difference between a system that gets used and trusted… and yet another expensive, failed attempt at digital transformation.
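As a rough illustration of what an explainability layer adds, here is a toy sketch (again assuming scikit-learn and NumPy, and in no way representative of a production engine): a simple model’s output is paired with the features that pushed the recommendation up or down, translated into short human-readable reasons.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

feature_names = ["days_since_last_visit", "recent_rx_trend", "event_attendance",
                 "email_open_rate", "territory_potential"]

rng = np.random.default_rng(1)
X = rng.normal(size=(500, len(feature_names)))  # invented engagement data
y = (X[:, 0] + 2 * X[:, 1] + rng.normal(size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)

def explain(row: np.ndarray, top_n: int = 2) -> list[str]:
    # For a linear model, each feature's contribution is simply weight * value,
    # which we can turn into a short plain-language reason.
    contributions = model.coef_[0] * row
    top = np.argsort(-np.abs(contributions))[:top_n]
    return [f"{feature_names[i]} pushed this recommendation "
            f"{'up' if contributions[i] > 0 else 'down'}" for i in top]

score = model.predict_proba(X[:1])[0, 1]
print(f"Recommended with confidence {score:.0%}")
for reason in explain(X[0]):
    print(" -", reason)
```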
Ultimately, you must ask yourself whether you want to work with a psychopath: someone who cannot distinguish between right and wrong, yet is meticulous and calculating in achieving their goals, operating without remorse or feeling. That is, after all, the definition of psychopathy. If we cannot produce explainability, we are simply not creating systems that will be understood, trusted or able to change behaviour.
The OKRA solution – The perfect colleague
OKRA’s FieldFocus and MarketSphere products use a hybrid explainability engine to provide the reasons behind each AI insight. After three years of development, validation and exploration of the complex relationship between causality and correlation, we have built what we believe is the most trustworthy engine of its kind: one that delivers human-readable explanations for complex problems and allows people to fully embrace change with AI.
Explainability is not the only thing that enables superior levels of trust and usage between human and machine. Rather than issuing a single instruction, OKRA ensures that several recommendations are offered at any one point – multiple ways to crack the nut – enabling the human to apply their entrepreneurial skill and personal chutzpah to get the job done. This is empowering, not weakening.
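A sketch of what that multi-recommendation pattern might look like, with invented actions, scores and reasons; it illustrates the general idea rather than the behaviour of OKRA’s actual products.

```python
from dataclasses import dataclass

@dataclass
class Recommendation:
    action: str
    score: float
    reasons: list[str]   # the explanation shown alongside the suggestion

def top_recommendations(candidates: list[Recommendation], k: int = 3) -> list[Recommendation]:
    # Offer several ranked, explained options and let the rep choose,
    # instead of issuing one unexplained command.
    return sorted(candidates, key=lambda r: r.score, reverse=True)[:k]

for rec in top_recommendations([
    Recommendation("Email Dr Patel the latest efficacy data", 0.81,
                   ["High email open rate", "Asked about efficacy at the last visit"]),
    Recommendation("Call Dr Garcia about the new dosing guidance", 0.62,
                   ["No contact in six weeks", "High prescribing potential"]),
    Recommendation("Book a cardiology lunch-and-learn", 0.47,
                   ["Strong past event attendance"]),
]):
    print(f"{rec.action}  (score {rec.score:.2f})")
    for reason in rec.reasons:
        print(f"   because: {reason}")
```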
Let’s be careful to ensure that as we introduce new intelligence to the workplace, it’s the right kind of intelligence: an assistant with wisdom, a trusted friend that advises us, a colleague we would all love to work with.