The confusing terminology around AI 


A few weeks ago, I attended our college reunion in India. Hard to believe 20 years have passed since a piece of paper arrived in the mail certifying that I am a bona fide mechanical engineer. In these twenty years, the "work" conversation I have had most consistently, multiple times every year, is about the difference between terms like data, reporting, BI, analytics and so on. After about ten years of fighting the good fight, I gave up and reconciled in my mind that it doesn't matter what you call it as long as it solves problems for my clients.

Over these past two decades, there was always a movement in IT to show that careful analysis of data helps the business make better decisions. As a result, a lot of improvements happened in both data management and analytics. As the technology got more and more sophisticated, the terms we use to describe it got more and more confusing too. At this point, people use AI, cognitive, machine learning, neural networks, deep learning and so on interchangeably.

The amount of confusion this generates is not trivial. So now, not only do I get to explain the old "analytics vs reporting vs BI", I also get to spend countless hours explaining the nuances of "cognitive vs AI vs...".

If we need one umbrella term, I would stick with "Artificial Intelligence". AI is a term coined by the late Prof. John McCarthy over 60 years ago. Over the past few years, led by IBM, several people have started using "cognitive computing" as an umbrella term too.

I have asked around for, and read, a lot of definitions of AI, and it's hard to find any consensus. The way I look at it, AI is the discipline that today is about doing things only humans could do in the past, and that aims for a tomorrow where computers also think the way humans do.

A friend of mine and I usually joke that AI is just a series of nested if-else statements; it's just that it is written in Python 🙂

That joke is not fully off base. The traditional approach has been to model the thing we want to analyze and then ask questions of it. The intelligence comes from the brilliance of the designers, so it is not really "artificial". The challenge, of course, is that things change over time. A better approach is probably to model how humans think, so that even when things change, answers can still be found. It is just contrary to how we (or most of us) have learned to design and code all this while. That is the concept (or, more precisely, just my understanding of the concept) behind "deep learning".
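To make the contrast concrete, here is a minimal sketch; the loan scenario, thresholds and training data are all made up, purely for illustration. The first function is the "nested if-else" school, where the designers' rules are the intelligence; the second uses scikit-learn to infer a similar boundary from labeled examples, so when the world changes you retrain on fresh data instead of rewriting branches.

```python
# Hand-coded rules vs. a learned model. The loan scenario, thresholds
# and training data below are invented for illustration only.
from sklearn.tree import DecisionTreeClassifier

def approve_loan_rules(income, debt):
    # The "nested if-else" style: the designers' judgment, frozen in code.
    if income > 50000:
        if debt < 10000:
            return "approve"
        return "review"
    return "decline"

# The learned alternative: infer the boundary from labeled examples.
X = [[60000, 5000], [80000, 30000], [30000, 2000], [55000, 8000]]
y = ["approve", "review", "decline", "approve"]
model = DecisionTreeClassifier().fit(X, y)

print(approve_loan_rules(70000, 4000))   # answered by the designers' rules
print(model.predict([[70000, 4000]]))    # answered by what was learned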

Is "supervised" learning really much different from the maintenance and enhancement aspect of traditional programming, and hence is it really AI? I am conflicted on this, mostly because human learning also needs supervision in many cases.
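One way to see the parallel, in another toy sketch with made-up numbers: the "supervision" is just correct answers supplied alongside the inputs, and correcting a wrong prediction looks a lot like a bug report that gets folded into the next release.

```python
# Supervision is answers supplied with the inputs; a correction is folded
# back in much like a bug fix. All numbers here are invented.
from sklearn.linear_model import LogisticRegression

X_train = [[0.2], [0.4], [0.6], [0.9]]   # inputs
y_train = [0, 0, 1, 1]                   # the supervision: known answers

clf = LogisticRegression().fit(X_train, y_train)

# A "bug report" arrives: the model got a case wrong. In traditional code we
# would patch a branch; here we add the corrected example and retrain.
X_train.append([0.55])
y_train.append(1)
clf = LogisticRegression().fit(X_train, y_train)
print(clf.predict([[0.55]]))
```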

Some of the confusion can be avoided by thinking of today's world of AI as "narrow" intelligence and the vision for tomorrow's world as "general" intelligence. Machine learning, perhaps the most visible part of AI today, is mostly used (at least from my limited point of view) on the "narrow" use cases. The easiest way for me to think of it is that rather than requiring continuous code changes, the algorithm keeps up with change by detecting patterns as it gets access to more and more data.
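A rough sketch of that idea, using scikit-learn's incremental SGDClassifier on synthetic, drifting data (all numbers invented): instead of patching code each time the pattern moves, the model updates itself batch by batch.

```python
# Keeping up with change via data instead of code changes: an online
# learner that updates itself as each (synthetic) batch arrives.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier()
classes = np.array([0, 1])
rng = np.random.default_rng(0)

for month in range(5):
    # Each "month" brings fresh data, and the underlying pattern drifts a bit.
    X = rng.normal(loc=month * 0.1, size=(100, 3))
    y = (X.sum(axis=1) > month * 0.3).astype(int)
    model.partial_fit(X, y, classes=classes)  # update the model, not the code
    print(f"month {month}: accuracy on this batch = {model.score(X, y):.2f}")
```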

The challenge for me with the term AI is the definition of "artificial". I think the expectation for "artificial" is a lot higher than for, say, "augmented". And that is perhaps why "cognitive" doesn't get as much pushback as it should.

Another challenge in moving from the old world to the AI world is our fascination with precision. Most decisions only need directionally correct information and options; they don't need precision. But that is not something a lot of people will accept without significant pushback. AI-type projects need a lot of expectation setting and some education on the basics of probability. I had to dust off a few of my statistics books before I could talk semi-intelligently to my clients.
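A quick simulation makes the point (the "true" conversion rates are invented): with a modest sample, the precise estimate of the gap between two options wobbles, but the direction of the decision is far steadier, and the direction is usually all the decision needs.

```python
# Directionally correct beats precisely wrong: the estimated lift between
# two options wobbles, but its sign is far steadier. Rates are made up.
import random

random.seed(7)
TRUE_A, TRUE_B = 0.12, 0.10

for trial in range(5):
    a = sum(random.random() < TRUE_A for _ in range(1000)) / 1000
    b = sum(random.random() < TRUE_B for _ in range(1000)) / 1000
    print(f"trial {trial}: estimated lift {a - b:+.3f} -> pick {'A' if a > b else 'B'}")
# The lift estimate varies from trial to trial; the choice usually does not.
```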

As machines get smarter and the primary communication becomes mostly machine to machine, perhaps machine learning doesn't need to try to think like humans anymore. Whether it is going to be more complex or less complex is anybody's guess. All I am sure of is that we won't be spared some more jargon 🙂

Before I sign off, here is a shout-out to my friends in the world of hardware. Without the extreme speed of innovation in the hardware world, AI (and old-world computing too) would never have had a chance to get on the fast track. Look at how much the world of AI has changed since GPUs became mainstream, as one example. The last two or three years have seen more progress than the decades before them. It's gonna be a wild ride!



