The term “artificial intelligence” (AI) was coined in 1955 by John McCarthy. To this day, however, the term has no formal definition, and AI means different things to different people.
Common definitions include:
McKinsey: “The ability of a machine to perform cognitive functions we associate with human minds, such as perceiving, reasoning, learning, and problem solving.”
John McCarthy: “The science and engineering of making intelligent machines, especially intelligent computer programs.”
Merriam-Webster dictionary: “A branch of computer science dealing with the simulation of intelligent behavior in computers.”
Intel: “A range of computer algorithms and approaches that allow machines to sense, reason, act and adapt like humans do – or in ways beyond our abilities.”
Gartner: “Artificial intelligence (AI) applies advanced analysis and logic-based techniques, including machine learning, to interpret events, support and automate decisions, and take actions.”
Gartner's definition captures how AI is used to improve business outcomes, and it is the context in which “Artificial Intelligence for Everyone” is written.
Artificial intelligence is one of the pillars of digital transformation, also known as the Fourth Industrial Revolution. It enables remarkable systems such as self-driving cars, dynamic pricing, targeted advertising, drug discovery in healthcare, and the detection of defective products on a production line.
The online references above were accessed on 25 May 2022.