Artificial Intelligence Update: AI Jargon Buster
23 May 2019
As with any new and emerging technology set to affect our daily lives, manufacturers and consumers are often bombarded with unfamiliar buzzwords and acronyms. We provide a helpful guide to commonly used AI terms.
Glossary of terms
Artificial Intelligence (AI): broadly refers to the ability of computing technologies and software to simulate human cognition and learning. AI enables machines to perform tasks normally requiring human intelligence, allowing software to learn, reason, interact, and engage in sensory perception and understanding. While there is no universal definition, AI technologies have become ubiquitous in modern life – from Siri, to tailored Spotify or YouTube recommendations, to driverless cars.
Algorithm: an unambiguous set of mathematical rules for solving a class of problems, and the key to enabling AI software to problem-solve. For example, if you need to get from A to B on Google Maps, an algorithm within the software works out the fastest route, taking into account factors such as congestion.
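To make this concrete, the route-finding idea can be sketched with Dijkstra's algorithm, one classic shortest-path algorithm (mapping services use far more sophisticated variants). The road network and travel times below are invented purely for illustration.

```python
import heapq

def shortest_path(graph, start, end):
    """Dijkstra's algorithm: repeatedly extend the cheapest known route."""
    queue = [(0, start, [start])]  # (total minutes so far, junction, route taken)
    visited = set()
    while queue:
        cost, node, route = heapq.heappop(queue)
        if node == end:
            return cost, route
        if node in visited:
            continue
        visited.add(node)
        for neighbour, minutes in graph.get(node, {}).items():
            if neighbour not in visited:
                heapq.heappush(queue, (cost + minutes, neighbour, route + [neighbour]))
    return None

# Hypothetical road network: travel times in minutes between junctions.
roads = {
    "A": {"B": 10, "C": 3},
    "C": {"B": 4, "D": 8},
    "B": {"D": 2},
}
cost, route = shortest_path(roads, "A", "D")  # picks A -> C -> B -> D (9 minutes)
```

The algorithm never guesses: it follows the same unambiguous rules every time, which is what distinguishes it from the "learned" behaviour described in the machine learning entry below.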
Artificial Neural Network: refers to a network of layers of artificial "neurons" that allows software to mimic processes of the human brain. For example, when Google Images decides whether or not a dog appears in an image, the neural network considers (in fractions of a second) various elements and characteristics of the image – the arrangement of pixels, areas of light and shadow, notable shapes and even whether the animal has pointy or floppy ears – before making a final decision.
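A minimal sketch of the idea: each "neuron" takes a weighted sum of its inputs and squashes the result into a score between 0 and 1, and layers of neurons feed into one another. The feature names and weights below are entirely made up for illustration – in a real network the weights are learned from training data, not written by hand.

```python
import math

def sigmoid(x):
    # Squash any number into the range 0..1.
    return 1 / (1 + math.exp(-x))

def layer(inputs, weights, biases):
    # Each neuron: weighted sum of its inputs, plus a bias, then squashed.
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Invented image features: [ear pointiness, snout length, fur texture]
features = [0.9, 0.7, 0.8]

# Hidden layer of two neurons, then a single output neuron ("dog-ness" score).
hidden = layer(features, [[2.0, -1.0, 0.5], [-1.5, 2.0, 1.0]], [0.0, -0.5])
output = layer(hidden, [[2.5, 2.0]], [-2.0])
```

The output is a confidence score between 0 and 1 rather than a hard yes/no, which is why such systems are usually described as making probabilistic judgements.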
Autonomous Mode: refers to the capability of AI software to operate independently, without direct human input. Examples include robots used in deep-sea and space exploration, deep-learning systems in medical diagnosis and, of course, driverless cars.
Big Data: refers to datasets which are so large or complex that traditional data processing applications are inadequate to deal with them. Examples include financial services firms using big data analytics software to detect suspicious transactions in the prevention of money laundering, or cutting-edge meteorological models (such as IBM Deep Thunder) forecasting weather patterns through high-performance computing of big data.
Blockchain: refers to a system of secure data storage in which digital transactions (most famously those made using a cryptocurrency such as Bitcoin) are stored as cryptographically linked "blocks" of data in a tamper-evident linear "chain" which is maintained across several computers within a network.
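The "chain" part can be sketched in a few lines: each block records the hash (a digital fingerprint) of the block before it, so altering any earlier block breaks every later link. The transactions below are invented, and a real blockchain adds consensus rules and proof-of-work on top of this basic structure.

```python
import hashlib
import json

def make_block(data, previous_hash):
    # A block stores its data plus the fingerprint of the previous block;
    # its own fingerprint covers both, chaining the blocks together.
    payload = json.dumps({"data": data, "prev": previous_hash}, sort_keys=True)
    return {"data": data,
            "prev": previous_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

genesis = make_block("Alice pays Bob 1 BTC", "0" * 64)
block2 = make_block("Bob pays Carol 0.5 BTC", genesis["hash"])

# Tampering with the first block produces a different fingerprint,
# so block2's stored "prev" value no longer matches.
tampered = make_block("Alice pays Bob 100 BTC", "0" * 64)
```

Because every computer in the network holds a copy of the chain, a forger would need to rewrite the fingerprints on a majority of copies simultaneously, which is what makes the record hard to falsify.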
Chatbots: these are “chat robots” that converse with human users through text or voice commands, commonly used by websites and online services to mimic human contact for customers.
Deep Learning: refers to machine learning that uses many-layered ("deep") neural networks to recognise patterns by extracting features from large datasets, allowing software to approach human-level performance on tasks such as image and speech recognition.
Internet of Things (IoT): refers generally to the interconnectivity of devices via the Internet which enables them to freely send and receive data between them (e.g. physical health sensors and activity trackers in the healthcare sphere).
Machine Learning: refers to the processes by which machines and AI algorithmic software “learn” by example and/or teach themselves to recognise patterns or reach set goals without being explicitly programmed to do so.
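The contrast with the algorithm entry above is that nobody writes the rule down. A toy sketch: given example pairs of inputs and outputs, the program repeatedly nudges a single adjustable number until its predictions match the examples – the rule (here, "multiply by 2") is never stated in the code, only learned from the data. The examples and learning rate below are chosen purely for illustration.

```python
# Example pairs (x, y) that secretly follow the rule y = 2x.
examples = [(1, 2), (2, 4), (3, 6), (4, 8)]

weight = 0.0          # the model's single adjustable parameter
for _ in range(200):  # repeat: nudge the weight to shrink the prediction error
    for x, y in examples:
        prediction = weight * x
        error = prediction - y
        weight -= 0.01 * error * x  # a basic gradient-descent step

print(round(weight, 2))  # -> 2.0: the rule was learned, never programmed
```

Real machine learning systems adjust millions of such parameters at once, but the principle – improve by example rather than by explicit instruction – is the same.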
Robotics: refers generally to the branch of scientific technology focused on the design and manufacture of "robots" which simulate human intelligence and actions. Think C-3PO, but bear in mind that robotic sentience is still a long way off and most robots in use today are focused on repetitive tasks such as welding and assembly…
Virtual Reality (VR): refers to technologies, often using VR headsets, which simulate physical, real-world environments by generating realistic images, sounds and other sensations, giving the user a sense of physical presence in a virtual or imaginary environment.
Source: Kemp IT Law, Legal Aspects of Artificial Intelligence (2018)