Avoiding the AI Hype Bubble

In the merry old land of Oz (the fantasy land, not a nickname for our sunburnt country), little Dorothy is told by Glinda (the Good Witch of the North) that she has always possessed the ability to get what she wants. All Dorothy must do is tap her ruby slippers together and say the magic words.

Sadly, many people would have you believe that adopting AI into your business is just as easy. There are many ‘Wizards of Oz’ using smoke, mirrors and promises of grandeur to sell AI products to those in a state of FOMO!

Conversely, the wider media is creating just as dangerous an anti-hype about the many life-threatening risks and the all-out anarchy that will supposedly reign when the robots take over. It reminds me of the Wicked Witch of the West's flying monkeys in Dorothy's adventure.

The truth, as always, lies somewhere in the middle. So how do you avoid both the hype and the anti-hype? Let's examine three tips on each side to help you on your AI adoption journey.

Hype first

Tip 1: Adopt the airline model. Every flight has a pre-flight briefing (you know, that part where you ignore the cabin crew and look out the window). Airlines always tell you to put on your own oxygen mask before helping others. It is the same with AI.

Educate yourself on what AI is and is not. Find out what the terminology means. You don't need a PhD in the subject, but learn enough that you can have meaningful conversations with your colleagues and suppliers alike.

Know enough to question the glossy brochures!

Tip 2: Know the rules. Before you play any new board game or card game you must familiarise yourself with what you can do and what you can’t do. This also applies to AI.

Get together with your head of technology and start a discussion about what technical constraints might exist in your organisation around AI adoption. Just a hint: data sitting in old Lotus Notes databases is not AI ready!

Similarly, discover where your data resides. This is less about structured data (which lives in databases, CRM systems, etc.) and more about unstructured data, which lives in PDFs, SharePoint, Confluence and the like.

The rules may surprise you!

Tip 3: Think like a jockey. No jockey has ever won a race by putting the horse on the saddle, although I did once witness a horse legitimately win a race with no jockey (the jockey fell off) at the Palio di Siena in Italy, but that's another story.

Why would anyone buy a product first (the horse) and then try to shoehorn in a use case (the saddle) for that technology? Although it is a little slower and takes more effort, building a dedicated team and allowing them the time and space to examine all the possible use cases, then determining which one delivers the best bang for buck, will yield much better outcomes.

The first use case may not be the one you are thinking of!

Now the Anti-Hype bubble

Tip 1: Learn from the divers. What is the similarity between sky divers and scuba divers? Both activities are risky, and not just 'I got a paper cut' risky, but life-ending risky. Yet people continue to do both adrenaline-fuelled sports. Why? Because they understand the risks and make calculated decisions. Never truer than with AI.

A myriad of AI-related risks exists that organisations need to come to terms with, from bias to hallucination to model drift to brand degradation to ethics and many more. All of these can, and should, have mitigation strategies in place, no different to any other organisational risk.

Well-planned AI adoption that focuses on risk mitigation, responsible and ethical frameworks, regulatory compliance, return on investment and proof of concept will not be overly fraught with danger.

Take a calculated risk and enjoy the dive!

Tip 2: Make AI your Moby Dick. In this treasured Herman Melville classic, we meet Captain Ahab and follow his relentless pursuit of the elusive whale. The crew are fully immersed in the journey. AI is similar, in that it is an empirical journey.

An empirical journey means that you cannot think your way through it; you must experience it. The best use cases for AI are born only after people start using AI. This empirical approach magnifies the risks mentioned above, but it is the only way.

Call me Ishmael!

Tip 3: Take some ski lessons. Anyone who has taken snow-ski lessons has probably heard tales of instructors saying, "if you are not falling down, you are simply not trying hard enough!" This is a great lesson for AI adoption.

Organisations need to create a culture where it is OK to stumble and fall, especially for the team leading your AI adoption. More is learned from failure than from success, so bring on the mistakes. A good proof of concept should reveal errors and flaws. If it doesn't, then you simply have not pushed hard enough. Become an online troll for your own application and purposely try to break it!

Try hard to fail and you just might succeed!

Finally

The hype and anti-hype bubbles are both extremely large for AI, especially generative AI. To make matters worse, the technology is moving at such breakneck speed that it is currently outstripping the human use cases for the technology itself. This can make the adoption of AI a very daunting task for organisations.

Now is the time to learn, evaluate, understand, and then adopt AI. Remember to bring your people on the journey with you. They are likely stressed about the impact of AI on their world, and it is your job to help calm that stress.

Simon Kriss is the Chief Innovation Officer for the CX Innovation Institute. He works with Boards, Executives and Leadership teams on Responsible AI adoption. Simon is the author of “The AI Empowered Customer Experience” and hosts podcasts on CX and AI. Simon is also a Non-Executive Director of Auscontact.
