Triangular cross relationship between AI, Machine Learning and Deep Learning

The terms AI, machine learning, and deep learning now appear in almost every field, from driverless vehicles to games such as poker, chatbot-based customer service automation, anomaly detection for the next unavoidable breach in cyber security, and unlocking the incredibly complex, large-scale data set that makes up the human genome. Yet there are fundamental differences between these technologies that one should understand in order to become familiar with their use cases and applications.

Artificial Intelligence (AI)

In the early era of electronic computers (1940s–1950s), machines were relatively simple, small, and slow compared with today's. They were programmed (given recipe-like instructions) to do simple, tedious, but straightforward arithmetic tasks.

The phrase artificial intelligence was coined in 1956 by John McCarthy, who organized an academic conference at Dartmouth dedicated to the topic. AI is a subfield of computer science that took shape as a discipline in the late 1950s and 1960s, and it focuses on solving tasks that are easy for humans but hard for computers.

Strong AI would be a system that can do anything a human can (apart from purely physical feats), including all kinds of tasks: planning, moving around in the world, recognizing objects and sounds, speaking, translating, performing social or business transactions, creative work such as making art or poetry, and so on.

Nvidia’s blog offers a well-put-together graphical representation of how AI, ML, and DL relate [4].

More companies are investing in AI projects, and we interact with AI software every day through smartphones, social media, Web search engines, and e-commerce sites. The type of AI we interact with most often is machine learning.

Machine Learning (ML)

The phrase “machine learning” also dates back to the middle of the last century. In 1959, Arthur Samuel defined machine learning as the “ability to learn without being explicitly programmed.”

Machine learning is one subfield of AI. The core principle here is that machines take data and “learn” for themselves. It’s currently the most promising tool in the AI kit for businesses. ML systems can quickly apply knowledge and training from large data sets to excel at facial recognition, speech recognition, object recognition, translation, and many other tasks. Unlike hand-coding a software program with specific instructions to complete a task, ML allows a system to learn to recognize patterns on its own and make predictions.
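To make the contrast concrete, here is a minimal sketch (not from the post; all names and data are illustrative) of a hand-coded rule versus a classifier that derives its decision from labeled examples:

```python
def rule_based(width, height):
    """Hand-coded: a programmer chose the threshold explicitly."""
    return "apple" if width / height > 0.9 else "banana"

def nearest_neighbor(samples, labels, point):
    """Learned: the decision comes from labeled examples, not rules."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(samples)), key=lambda i: dist(samples[i], point))
    return labels[best]

# "Training data": (width, height) measurements with known labels.
samples = [(7.5, 8.0), (8.0, 7.8), (3.0, 18.0), (3.5, 20.0)]
labels = ["apple", "apple", "banana", "banana"]

print(nearest_neighbor(samples, labels, (7.8, 8.1)))  # apple
```

The learned classifier improves simply by adding more labeled measurements; the rule-based one improves only when a programmer rewrites the rule.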

Examples

  1. Deep Blue

Deep Blue was rule-based and dependent on explicit programming, so it was not a form of ML.

  2. DeepMind

DeepMind’s system, on the other hand, is: it beat the world champion in Go after training on a large data set of expert moves.

“Machine learning can happen in 3 ways. Again taking an analogy, a different one: imagine a kid. He is nothing but an AI.

  • You describe the shape of an apple to a kid and ask him to draw it. It may take several tries for him to produce a close-looking figure. This is reinforcement learning.
  • You show a picture of an apple to a kid and ask him to identify it as an apple. This is supervised learning.
  • Your kid learns that chilies are not to be eaten by actually biting into a chili. He learns this information on his own. This is unsupervised learning.”
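The three paradigms above can be sketched in a few lines each. This is an illustrative toy (not from the post): least-squares fitting for supervised learning, a 1-D two-means split for unsupervised learning, and a greedy two-armed bandit for reinforcement learning.

```python
def fit_slope(xs, ys):
    """Supervised: learn a mapping from labeled pairs (x, y).
    Least-squares slope through the origin: m = sum(x*y) / sum(x*x)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def two_means(points, iters=10):
    """Unsupervised: find two clusters in unlabeled 1-D data."""
    a, b = min(points), max(points)  # initial centroids
    for _ in range(iters):
        ca = [p for p in points if abs(p - a) <= abs(p - b)]
        cb = [p for p in points if abs(p - a) > abs(p - b)]
        a, b = sum(ca) / len(ca), sum(cb) / len(cb)
    return a, b

def greedy_bandit(rewards, steps=20):
    """Reinforcement: improve by trial and reward, with no labels given.
    Returns the index of the arm the agent learned to prefer."""
    est, counts = [0.0, 0.0], [0, 0]
    for t in range(steps):
        a = t % 2 if t < 2 else est.index(max(est))  # try each once, then greedy
        counts[a] += 1
        est[a] += (rewards[a] - est[a]) / counts[a]  # running average of reward
    return est.index(max(est))
```

Each function "learns" from a different signal: labeled targets, raw structure in the data, or a reward after acting.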

Companies like Facebook, Amazon, and Netflix use machine learning to power their recommendation engines. For example, Facebook curates your newsfeed, Amazon highlights products you might want to purchase, and Netflix suggests movies you might want to watch; all of these recommendations are based on predictions that arise from patterns in their existing data.
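The pattern-based prediction behind such engines can be sketched very simply. This toy example (not any company's actual algorithm; all data invented) recommends items that co-occur with what a user already liked across other users' histories:

```python
from collections import Counter

def recommend(history, user_items, top=2):
    """Score items by how often they co-occur with the user's items."""
    scores = Counter()
    for basket in history:
        if set(basket) & set(user_items):      # basket overlaps user's taste
            for item in basket:
                if item not in user_items:     # don't re-recommend known items
                    scores[item] += 1
    return [item for item, _ in scores.most_common(top)]

history = [["matrix", "inception", "tenet"],
           ["matrix", "inception"],
           ["titanic", "notebook"]]
print(recommend(history, ["matrix"]))  # ['inception', 'tenet']
```

Real systems replace raw co-occurrence counts with matrix factorization or learned embeddings, but the principle, predicting from patterns in existing data, is the same.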

In fact, machine learning has become so closely associated with statistics, data mining, and predictive analytics that some people argue it should be treated as a field in its own right, separate from artificial intelligence.

Deep Learning (DL)

Deep learning is a subset of ML that uses neural networks, loosely modeled on human decision-making, to solve real-world problems. Deep learning can be expensive and requires massive data sets to train on. That’s because a huge number of parameters need to be tuned by the learning algorithm, which can initially produce a lot of false positives. For instance, a deep learning algorithm could be instructed to “learn” what a cat looks like; it would take a massive data set of images for it to understand the minor details that distinguish a cat from, say, a cheetah or a panther or a fox.
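The “huge number of parameters” is easy to quantify. This small helper (illustrative; layer sizes are invented) counts the trainable weights and biases in a fully connected network:

```python
def param_count(layers):
    """Trainable parameters (weights + biases) in a dense network.
    Each layer contributes n_in * n_out weights plus n_out biases."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layers, layers[1:]))

# A tiny toy network: 2 inputs, 3 hidden units, 1 output.
print(param_count([2, 3, 1]))  # 13

# A hypothetical image classifier: 224x224 RGB input, two hidden
# layers, 2 output classes ("cat" vs "not cat").
print(param_count([224 * 224 * 3, 512, 128, 2]))  # 77,136,770
```

Tens of millions of parameters even for this modest architecture is why deep learning needs so much data and compute compared with classical ML.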

Use cases:

1) Deep learning and AI are disrupting the healthcare industry

Artificial intelligence (AI), deep learning algorithms, and complex data sets are disrupting the entire healthcare industry, from diagnostics to gene therapies to personalized medicine. Deep Genomics holds the key to unlocking the biggest disruptions in the medical, life sciences, and pharmaceutical industries.

Startups are raising billions of dollars in venture capital. Grail raised $900M in a series B round for their “high-intensity sequencing approach” in a market estimated to reach $45B by 2024.

Companies such as Fabric Genomics, Guardant Health, and Ion Torrent are prevailing in this emerging technology ecosystem.

2) Deep Learning enabling Intelligence from Big Data

Deep learning draws directly on data from two sources: recordings of the physical world (images, sound) and data humans produce (words, metadata, tagging data). Operators of a deep learning algorithm feed it data and then train it to evaluate that data effectively. Deep learning has recently disrupted robotics, manufacturing, agriculture, and a plethora of other industries.

The Ph.D.s and brain scientists at Clarifai apply award-winning speed to image recognition and tagging; Skymind will analyze millions of news articles to determine whether someone is talking about your company with emotional bias; AlchemyAPI will help you develop apps that understand customer behavior; Nervana Systems develops hardware specifically for deep learning algorithms; and Ersatz Labs provides web-based access to deep learning modules so you can run your own experiments.

With all these disruptions under way, which technology will prove the most powerful over the long run: AI, ML, or DL?

References:

[1] https://en.wikipedia.org/wiki/Artificial_intelligence

[2] http://www.datamation.com/applications/artificial-intelligence-software-45-ai-projects-to-watch-1.html

[3] https://www.vlab.org/

[4] https://blogs.nvidia.com/blog/2016/07/29/whats-difference-artificial-intelligence-machine-learning-deep-learning-ai/

[5] https://en.wikipedia.org/wiki/Arthur_Samuel


2 comments on “Triangular cross relationship between AI, Machine Learning and Deep Learning”

  1. Excellent, relevant information, Hira. Thank you for the practical perspective behind the terminology, because far too many use the nomenclature without understanding what’s behind it. In the mid-1980s at Digital Equipment Corp., we had an expert system (early AI) called XCON/XSEL to configure computer systems for a user versus the user configuring manually by evaluating backplane/chassis slot capacity and power/current consumption at the board/card level. The system was built with the OPS5 language (not sure anyone uses that these days), and, while the user interface was conversational via text entry using VT1xx or VT2xx terminals or terminal emulators, it was very primitive compared to mobile applications of today. In my current capacity, we have been working on ML applications to try to better predict sector/cell site level issues in LTE (long-term evolution) communications, including the ability to diagnose issues with SFPs (small form-factor pluggable connectors) and individual fibers in the fiber optic links that connect radios with the digital units processing network traffic. As many of our MS238 speakers have stated, the quantity of data affects the quality of what we can do with analytics, and that increases with both installed base growth as well as elapsed time of operation. Thanks again!

  2. MS&E 238A – WEEK 5 PARTICIPATION – ELENI MCFARLAND
    Great post! I like that you broke down AI, ML, and DL sequentially – this is useful for understanding the different ways AI presents itself to us every day. As you mentioned, we interact with AI constantly with our smartphones, social media, etcetera. For me, one of the most tangible examples of AI/ML is through sites such as Facebook and Netflix. I have always assumed my newsfeed and Netflix recommendations were based on somewhat straightforward algorithms – I click on a person’s profile, and then for the next month, they seem to be everywhere on my Facebook. In fact, I often say “why does Facebook think I want to see this?” or “Why is Netflix recommending that to me?” These recommendation systems are based on such advanced concepts of ML, yet they haven’t seemed to really “learn” me yet. I read this article (http://www.businessinsider.com/netflix-using-ai-to-suggest-better-films-2014-2) that came out in 2014 explaining that Netflix was working to improve their recommendation engine by moving into DL and creating neural networks. This is intended to create predictive behavior (among other things). Clearly, other people had the same problems with Netflix’s recommendation engine previously. Perhaps this means ML is something we take for granted these days, and that we are already expecting machines to be inside our minds – perhaps we are expecting them to have neural networks equal to or greater than (i.e. more predictive and ahead of) our very own brains.

