Do you need an AI Strategy?

Fig 1. A high-level plan. Photo by Jo Szczepanska on Unsplash

This article gives some of my thoughts on the following article in HBR: “Why AI Underperforms and What Companies Can Do About It”.

My primary concern with it is the assertion that an AI strategy is not needed in any organisation undertaking AI projects.

Why AI Underperforms and What Companies Can Do About It

The primary point of the HBR article “Why AI Underperforms and What Companies Can Do About It” by Mihnea Moldoveanu is that there is a large difference in communication styles between executives and those who develop machine learning solutions, and that this gap causes AI projects to fail. For AI to succeed, executives need “training in computational and algorithmic thinking”.

AI Strategy, apparently not needed

The article makes a fair point in suggesting that an organisation’s success with AI requires training executives so that they understand more about it, but one set of sentences at the end of the article really stuck in my craw.

AI strategies fail because AI is a means, not an end. “Do you have an AI strategy?” makes as much sense as asking, “Do we have an Excel strategy?”

I’m going to pull this apart in reverse order, but first let’s talk about strategy.

Strategy

A strategy is a plan. There are many plans made within an organisation, each focusing on different challenges or needs. We can think of these plans as being layered, and by common agreement we call the plans at the highest layers or levels strategies, and the ones at the lower levels tactical or operational plans. Plans at different levels, or plans for different undertakings, often require very different elements. There are analogies between business and military strategies, but they are in many ways dissimilar. Figuring out what to pay attention to in your higher-level plans (strategies) is not necessarily straightforward, especially since the consequences of poor planning at this level can be catastrophic (compared to, for example, planning where to put the photocopier that no one uses anymore).

Excel

Let’s translate the excerpt above, “Do you have an Excel strategy?”, using the word “plan”, and add some words to make it more understandable.

It translates to “Do you have a plan for how to use Excel in your organisation?”.

I put it to you that, for many decades, Excel was the backbone of many financial institutions – for far longer than it should have been, and for far longer than we like to think – and there should have been a plan. There should have been an Excel strategy, because Excel was, unfortunately, critical to financial operations in many companies, financial or not.

The fact that, for most of that time, many companies had no plan for this crucial piece of technology was stupid.

There should have been an Excel strategy, and it should have declared how dependent the company wanted to be on Excel, and in which direction to drive its use.

When you have a technology (one that can fall over in a stiff breeze) managing billions of dollars of assets, maybe you should have some strategy around it.

So it makes sense to ask if you have an explicit Excel strategy (rather than an implicit one of letting it spread through the organisation like a virus).

An AI strategy is silly

We’ve established that it would have made sense for many organisations to have an Excel strategy. The statement above, however, says the opposite: an Excel strategy is silly, there is no point in having one, and by extension there is no point in having an AI strategy either.

As a reminder, the phrase is saying that there is no sense in asking the question “Do you have an AI strategy?”, to which the (embarrassed) reply is supposed to be:

“No, just as we don’t have an Excel plan, we don’t have an AI plan, and it makes no sense to have one.”

In this case we are saying, “We have a technology that is literally world-changing, but we do not need any plan to use or engage with that technology.”

My assertion is that, because the effects of AI are so widespread, it is essential that you have an AI plan or an AI strategy. Sure, that plan might be “We don’t think it affects us now and we are not going to engage with it” – but that’s still a plan about what you are going to do with AI, made with an understanding of its potential effects on your business.

So, yes, asking if you have an AI strategy does make very good sense; likewise, asking yourself if you have a plan to deal with any other emerging disruptive technology makes good sense too.

I can’t imagine any CEO saying, “Hmm, this technology could potentially change our whole workforce, and our competitors could potentially disrupt us so badly that the company goes bankrupt. It’s OK though – we don’t need to plan for it! It’s just like Excel! It’s a technology that we use – and surely no one makes strategies for how to deal with technologies! We can just stick our heads back in the sand!”

If at this point you think I’ve built a straw man, here, once again, is what we are talking about:

“Do you have an AI strategy?” makes as much sense as asking, “Do we have an Excel strategy?”

I’m saying that that statement is nonsense, and that having an AI strategy is a sensible, very rational approach to dealing with AI.

The means to an end

Finally, the first part of the quote – “AI strategies fail because AI is a means, not an end” – is also nonsense.

So what if it’s a “means”? So what?

Let’s do a thought experiment. If we want to roll out electric cars across the country to reduce greenhouse gas emissions, we could ask ourselves “What’s our electric car strategy?”, or “What is our plan to roll out electric cars?”.

“But wait! We can’t do that! Electric cars are a “means” of reducing carbon emissions – so we cannot use them in the same sentence as “strategy”!”

“We cannot have an electric car plan!”

Put your hand up if you think that’s a sensible conversation to have.

Describing AI as a “means” serves no useful purpose, and doesn’t negate the requirement for an AI strategy in many companies.

To Summarise

Both the words “strategy” and “AI” are overloaded to such a degree that it is difficult to have a conversation about them without tonnes of confusing contextual baggage interfering with what we mean. Despite my disagreement with how the article concluded, it had something sensible to say – namely, that executives don’t know enough about AI to sensibly incorporate it into their organisations. However, it is my contention that any mid-sized or larger organisation absolutely must include AI in its plans – and therefore must have an AI strategy.


