

The amount of digital data continues to double every two years. Digital data is growing so fast that computers and existing storage techniques are not able to keep up: data is produced at a rate at which government organizations and companies do not know how to analyze it, and a lot of it is never examined. Examining all of this data is simply not possible with current computing resources.
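To get a feel for what "doubling every two years" implies, here is a minimal back-of-the-envelope sketch in Python; the starting volume and time horizons are illustrative assumptions, not figures from this post:

```python
# Compound growth: data volume doubling every two years.
# The horizons below are illustrative assumptions.

def projected_volume(start: float, years: float, doubling_period: float = 2.0) -> float:
    """Volume after `years`, if it doubles every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

for years in (2, 4, 10):
    print(f"after {years:2d} years: {projected_volume(1.0, years):5.1f}x the data")
```

Ten years of doubling every two years already means 32 times as much data, which is why storage and analysis techniques struggle to keep up.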

As we struggle to process large amounts of data efficiently, our computers are still easily outperformed by human brains at most tasks. While computers are great at executing specific, well-defined tasks at high speed, humans are still significantly better at a wide variety of tasks that require, for example, creativity, common sense, pattern recognition, or language understanding. Humans are also better at tasks that cannot be modularized or described by well-defined algorithms; today's computers need to be told exactly what to do and are only just beginning to learn by themselves.

Brains are especially more energy efficient and better at information storage than existing computers. IBM Watson consumes about 750,000 watts of power, while the human brain runs on only 12 watts, a difference of more than four orders of magnitude ("Computers versus Brains"). Twelve watts is significantly less than most light bulbs need. On top of that, all of our genetic information fits in less than 750 megabytes, and an iPad 2 still has over 1,000 times less total data storage than a cat's brain ("Computers versus Brains").
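Both headline numbers are easy to sanity-check. Here is a minimal sketch; the figure of roughly 3 billion base pairs in the human genome is my assumption for the calculation, not a number from this post:

```python
import math

# Power: IBM Watson (~750,000 W) vs. the human brain (~12 W).
watson_w, brain_w = 750_000, 12
ratio = watson_w / brain_w
print(f"power ratio: {ratio:,.0f}x (~{math.log10(ratio):.1f} orders of magnitude)")

# Storage: ~3 billion base pairs (assumed round figure), each base one of
# four letters (A, C, G, T), i.e. 2 bits of information per base.
base_pairs = 3e9
megabytes = base_pairs * 2 / 8 / 1e6  # bits -> bytes -> megabytes
print(f"genome as raw data: ~{megabytes:.0f} MB")
```

The power ratio works out to about 62,500, nearly five orders of magnitude, and 3 billion two-bit bases come to exactly 750 megabytes before any compression.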

One of the main reasons the brain is so much more efficient is that information is processed in parallel: while today's computers have only a few processors, the brain has billions of cells that all process data at the same time (Hawkins et al.). Nevertheless, we still do not fully understand how the human brain works.
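The effect of spreading independent work across processing units is easy to demonstrate. Below is a minimal sketch comparing a sequential and a process-parallel run of the same workload; the workload itself (`busy`) is an arbitrary stand-in made up for illustration:

```python
import time
from multiprocessing import Pool

def busy(n: int) -> int:
    """Arbitrary CPU-bound stand-in for a unit of work."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    tasks = [2_000_000] * 8

    t0 = time.perf_counter()
    serial = [busy(n) for n in tasks]   # one task after another
    t1 = time.perf_counter()
    with Pool() as pool:                # one task per worker process
        parallel = pool.map(busy, tasks)
    t2 = time.perf_counter()

    assert serial == parallel
    print(f"serial: {t1 - t0:.2f}s, parallel: {t2 - t1:.2f}s")
```

On a multi-core machine the parallel run finishes several times faster; a brain with billions of units working simultaneously takes this principle to an extreme that no current computer matches.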
Many researchers are working on unraveling the still mysterious inner workings of the human brain. The Obama administration created the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies), which aims to "support the development and application of innovative technologies that can create a dynamic understanding of brain function" ("The BRAIN Initiative"). There are two main benefits of developing new data analysis techniques based on neuroscience and how the human brain works ("Neuroscience-Inspired Artificial Intelligence"). First, the human brain is a rich source of inspiration for researchers, suggesting new algorithms and architectures that can learn from data and reach better conclusions after analyzing it. Second, the human brain can serve as a validation tool for new data mining and machine learning techniques: if researchers develop a promising new method or algorithm and an equivalent mechanism is found in the human brain, this affirms that the method could play an important role in making computers as efficient as human brains. With such brain-based validation, researchers and organizations can decide to spend more time and resources on developing these algorithms further.

I am not only fascinated by the differences between computers and human brains, but even more so by the potential interaction of the two. I am a fan of non-invasive interaction, and it looks like most people are like me: many are not so keen on the development of cyborg technology.

References:

Byrnes, Nanette. "The Big Deal with Big Data Isn't (Just) the Data." MIT Technology Review.
"Computers versus Brains." Scientific American.
"Human-level control through deep reinforcement learning." Nature 518.7540 (2015): 529-533.
"Neuroscience-Inspired Artificial Intelligence." Neuron 95.2 (2017): 245-258.
"Obama's Brain Project Backs Neurotechnology." MIT Technology Review.
"The BRAIN Initiative." National Archives and Records Administration, Apr.
"The rise of artificial intelligence is creating new variety in the chip market, and trouble for Intel." The Economist.
"What Watson Can Learn From the Human Brain." Wired.
"Why Obama's Brain-Mapping Project Matters." MIT Technology Review.

Tags: artificial intelligence, Big Data, Cognitive Science, data mining, Neuroscience
