Title: Artificial Intelligence: Definition, Advice, Comparisons, Testimonials…

Language: English

Type: Blog/Article

Nature: Website

Published: November 18, 2020


Region: Global

Country: Global / Non-Specific

Keywords: Age of Disruption

Document Link(s):


Document Summary:

Aimed at simulating human intelligence, artificial intelligence has been on the rise since the early 2010s, driven by deep learning, big data and the explosion in computing power.


Document Details:

Artificial intelligence (AI) refers to “an application capable of processing tasks that are currently performed more satisfactorily by human beings insofar as they involve high-level mental processes such as perceptual learning, memory organization and critical thinking”. This is how the American scientist Marvin Lee Minsky, considered the father of AI, defined the concept. It was in 1956, at a meeting of scientists held at Dartmouth College in New Hampshire to consider the creation of thinking machines, that he convinced his audience to adopt the term.

Following early work, notably on expert systems, modern AI emerged much later. In 1989, the Frenchman Yann LeCun developed the first neural network capable of recognizing handwritten digits. But it was not until 2019 that he, together with Geoffrey Hinton and Yoshua Bengio, received the Turing Award for this line of research. Why the delay? Because deep learning faced two obstacles. First, the computing power needed to train neural networks; the rise of graphics processors (GPUs) in the 2010s provided a solution to this problem. Second, learning requires massive volumes of data. Here, the GAFAM companies hold an advantage, but open-source datasets such as ImageNet have also been published.
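To make the idea of a digit-recognizing neural network concrete, here is a minimal sketch of such a model in PyTorch. It is not LeCun's original architecture; the layer sizes, the 28x28 grayscale input and all hyperparameters are illustrative assumptions, and the example only runs a forward pass on random data.

```python
# Minimal sketch of a small convolutional network for handwritten-digit
# recognition (illustrative only, not LeCun's original network).
import torch
import torch.nn as nn


class SmallDigitNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # assumes 28x28 grayscale input
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 14x14
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 7x7
        )
        self.classifier = nn.Linear(16 * 7 * 7, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = x.flatten(1)                                  # flatten feature maps per image
        return self.classifier(x)


if __name__ == "__main__":
    model = SmallDigitNet()
    # GPUs (the "graphics processors" mentioned above) are what make training
    # such networks tractable at scale; here we just pick whichever is available.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device)
    dummy_batch = torch.randn(4, 1, 28, 28, device=device)  # 4 fake digit images
    logits = model(dummy_batch)
    print(logits.shape)  # torch.Size([4, 10]) -> one score per digit class
```

Training such a model in practice would also require a labeled dataset of digit images, which is precisely the kind of data requirement discussed above.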


Updated: December 28, 2023