Big data: Computer vs. Human Brain

The amount of digital data doubles roughly every two years (https://www.economist.com/news/business/21717430-success-nvidia-and-its-new-computing-chip-signals-rapid-change-it-architecture). Digital data is growing so fast that computers and existing storage techniques cannot keep up (https://www.technologyreview.com/s/602156/the-big-deal-with-big-data-isnt-just-the-data/). Data is produced faster than government organizations and companies can analyze it, and much of it is never examined at all (https://www.technologyreview.com/s/602156/the-big-deal-with-big-data-isnt-just-the-data/). Examining all of this data is simply not possible with current computing resources. And while we struggle to process large amounts of data efficiently, our computers are still easily outperformed by human brains at most tasks. IBM Watson consumes about 750,000 watts of power, while the human brain runs on only about 12 watts, a difference of more than four orders of magnitude (https://www.wired.com/2011/02/what-watson-can-learn-from-the-human-brain/). Computers are great at executing specific, well-defined tasks at high speed, but humans remain significantly better at a wide variety of tasks that require, for example, creativity, common sense, pattern recognition, or language understanding. Humans excel at tasks that cannot be modularized or described by well-defined algorithms; today’s computers need to be told exactly what to do, and they are only just beginning to learn by themselves (https://www.nature.com/nature/journal/v518/n7540/pdf/nature14236.pdf).
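A quick back-of-the-envelope check of the power gap, using the (rough, commonly quoted) figures cited above:

```python
import math

# Rough figures cited above; both are commonly quoted estimates.
watson_watts = 750_000   # IBM Watson's reported power draw
brain_watts = 12         # the human brain's approximate power budget

ratio = watson_watts / brain_watts   # 62,500x
orders = math.log10(ratio)           # about 4.8 orders of magnitude
print(f"Watson uses {ratio:,.0f}x the power, ~{orders:.1f} orders of magnitude")
```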

Brains are especially more energy-efficient and better at information storage than existing computers. The brain requires only about 12 watts of power, significantly less than most light bulbs need. On top of that, all of our genetic information fits in less than 750 megabytes (https://www.wired.com/2011/02/what-watson-can-learn-from-the-human-brain/, https://www.scientificamerican.com/article/computers-vs-brains/). According to Scientific American’s comparison, an iPad 2 still has over 1,000 times less total data storage than a cat’s brain (https://www.scientificamerican.com/article/computers-vs-brains/). One of the main reasons the brain is so much more efficient is that it processes information in parallel: while today’s computers have only a handful of processors, the brain has billions of neurons processing data all at the same time (Hawkins and Blakeslee). Nevertheless, we still do not fully understand how the human brain works.
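The parallelism point can be illustrated with a toy sketch (not a model of the brain): split the data into chunks, let several workers process the chunks concurrently, then merge the partial results. The worker count and chunk size here are arbitrary.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Stand-in for any per-chunk analysis; here, just a sum.
    return sum(chunk)

data = list(range(1_000_000))
chunk_size = 250_000
chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

# Four workers handle the chunks concurrently; partial results are merged.
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(process_chunk, chunks))

total = sum(partials)
print(total)  # same result as a single serial pass over the data
```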

Many researchers are working to unravel the still-mysterious inner workings of the human brain. The Obama administration created the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies), which aims to “support the development and application of innovative technologies that can create a dynamic understanding of brain function” (https://www.technologyreview.com/s/513011/why-obamas-brain-mapping-project-matters/, https://www.technologyreview.com/s/531291/obamas-brain-project-backs-neurotechnology/, https://obamawhitehouse.archives.gov/node/300741). There are two main benefits to developing new data analysis techniques based on neuroscience and how the human brain works (http://www.cell.com/neuron/pdf/S0896-6273(17)30509-3.pdf). First, the human brain is a rich source of inspiration: it can provide researchers with new ideas for algorithms and architectures that learn from data and reach better conclusions after analyzing it. Second, the human brain can serve as a validation tool for new data mining and machine learning techniques. If researchers develop a promising new method or algorithm and an equivalent mechanism is then found in the human brain, that convergence suggests the method could play an important role in making computers as efficient as human brains. With such brain-based validation, researchers and organizations can decide to spend more time and resources developing these algorithms further.

 

Byrnes, Nanette. “The Big Deal with Big Data Isn’t (Just) the Data.” MIT Technology Review. MIT Technology Review, 07 Dec. 2016. Web. 28 July 2017. https://www.technologyreview.com/s/602156/the-big-deal-with-big-data-isnt-just-the-data/

Fischetti, Mark. “Computers versus Brains.” Scientific American. Nature Publishing Group, 1 Nov. 2011. Web. 28 July 2017. https://www.scientificamerican.com/article/computers-vs-brains/

Hassabis, Demis, et al. “Neuroscience-Inspired Artificial Intelligence.” Neuron 95.2 (2017): 245-258. http://www.cell.com/neuron/pdf/S0896-6273(17)30509-3.pdf

Hawkins, Jeff, and Sandra Blakeslee. On Intelligence. New York: Times Books/Henry Holt, 2008. Print.

Lehrer, Jonah. “What Watson Can Learn From the Human Brain.” Wired. Conde Nast, Feb. 2011. Web. 28 July 2017. https://www.wired.com/2011/02/what-watson-can-learn-from-the-human-brain/

Mnih, Volodymyr, et al. “Human-level control through deep reinforcement learning.” Nature 518.7540 (2015): 529-533. https://www.nature.com/nature/journal/v518/n7540/pdf/nature14236.pdf

Regalado, Antonio. “Obama’s Brain Project Backs Neurotechnology.” MIT Technology Review. MIT Technology Review, 30 Sept. 2014. Web. 28 July 2017. https://www.technologyreview.com/s/531291/obamas-brain-project-backs-neurotechnology/

Rojahn, Susan Young. “Why Obama’s Brain-Mapping Project Matters.” MIT Technology Review. MIT Technology Review, 08 Apr. 2013. Web. 28 July 2017. https://www.technologyreview.com/s/513011/why-obamas-brain-mapping-project-matters/

“The BRAIN Initiative.” National Archives and Records Administration. National Archives and Records Administration, Apr. 2013. Web. 28 July 2017. https://obamawhitehouse.archives.gov/node/300741

“The rise of artificial intelligence is creating new variety in the chip market, and trouble for Intel.” The Economist. The Economist Newspaper, 25 Feb. 2017. Web. 12 July 2017. https://www.economist.com/news/business/21717430-success-nvidia-and-its-new-computing-chip-signals-rapid-change-it-architecture

 

10 comments on “Big data: Computer vs. Human Brain”

  1. Hi Teun,

    Very interesting subject. I agree that more effort should be put into creating new data analysis techniques based on neuroscience. Creating services and/or techniques that, to an extent, model the way the human brain analyzes and differentiates data with such efficiency could lead to massive technological advancements.

    Although initial development is tough and requires data scientists, I think that AI could play a major role in this research. AI can conduct certain tasks at different levels and stages in researching and developing data analysis techniques based on neuroscience. AI should be implemented because it is capable of conducting and completing certain tasks at much higher rates and with lower margins of error than humans, thus speeding up and aiding the process and development significantly.

  2. Good article! It is right to state that big data analytics is an emerging issue that companies cannot do without, given the dynamics of the world market. Data has to be mined on a daily basis and properly utilized for decision making. The data is equally stored for future decision making and reference.

  3. I remember watching an interesting episode of 60 Minutes (http://www.cbsnews.com/news/60-minutes-artificial-intelligence-charlie-rose-robot-sophia/) several months ago on the advances of artificial intelligence. The segment focused on IBM’s Watson supercomputer. Although Watson is best known for playing Jeopardy!, it is also being trained to process incredible amounts of health care data. The segment provides an overview of how IBM has trained Watson to digest thousands of journal articles on new cancer treatments. The computer can then prescribe new treatments based on each patient’s data. This is a huge advantage over the status quo, as doctors are not able to process the same volume of data and recall it on a moment’s notice. IBM predicts that this technology will revamp the way cancer patients receive treatment in the near future.

  4. Thank you for your blog post, Teun. I am not only fascinated by the differences between the computer and the human brain, but more so by the potential interaction of the two. I am a fan of non-invasive interaction, and it looks like most people are like me: many are not so keen on the development of cyborg technology [1]. However, I truly love the idea of this chain reaction: brain -> facial expression/emotion -> AI response. I think it would be helpful if I drove home sad and my car decided to play a song that would cheer me up, and actually succeeded. There is great progress in the non-invasive domain, but there is also progress in the other: direct body/brain-to-computer interfaces [2].

    Helpful data and content:
    – One interesting table to check out illustrates the communication speed between brain and machine: https://28oa9i1t08037ue3m1l0i861-wpengine.netdna-ssl.com/wp-content/uploads/2018/04/Communication-Speed-GRAPH-1.png
    – Another cool advancement is this TEDx about the interface between the human skin and machine: https://www.youtube.com/watch?v=3jRNY_JcBpg

    [1] https://28oa9i1t08037ue3m1l0i861-wpengine.netdna-ssl.com/wp-content/uploads/2018/04/PS_2016.07.26_Human-Enhancement-Survey_0-01.png
    [2] Neuralink. https://waitbutwhy.com/2017/04/neuralink.html#part4

  5. Hi Teun,

    Your article is insightful and informative. I agree with you about the potential value of human brains in the development of data analytics. However, another potential risk is that humans will be surpassed sooner if algorithms are learned and copied directly from the human brain. Also, which brain we should use is another issue. Shall we use normal human brains or geniuses’ brains? Would every type of brain be suitable for data mining?

    Yi-Lin Tsai (MS&E 238 A student)

    1. Human brains of geniuses and average people work in the same way. There are some slight differences, but the main concepts are the same. We need to understand how the brain works, so learning from a genius or an average brain will not make a significant difference. Right now, our understanding of the brain is still too limited.

  6. While the human brain is better than the computer in many ways, the computer still beats the brain in scaling, speed, accuracy, and complexity of computation. The computer outperforms the brain when there is too much data to process; it performs faster and much more accurately than the human brain. Additionally, complex computations are better done by the computer; an AI system performs millions of matrix computations in seconds. Computer systems are best used for repetitive, large-scale, and well-structured tasks, while the brain is best used for creative and innovative tasks. Together, both systems can achieve much more complex tasks at a very large scale.

  7. Thanks, Teun, for your insightful article on the different capabilities of computers and brains.
    In my theoretical computer science course we discussed the halting problem. It is quite interesting to see, on a mathematical basis (automata theory and Turing machines), that it is not possible for a computer to determine, from program code and input alone, whether the program will ever halt, which is why infinite loops cannot be detected in general. Our brain, on the other hand, is often capable of determining that.
    So as long as several mathematical problems remain neither proved nor disproved, computers will not be able to beat the human brain on the halting problem. Quite impressive.

  8. Very interesting post! I agree that there is a lot that science can learn from the human brain. I still remember the quote that appeared on the first day of the leading trend class: “machines are accurate, fast, and stupid; human brains are inaccurate, slow, and brilliant”. There is probably a long way to go even to come close to the intricacy of the human brain.

