With the arrival of the 4th Industrial Revolution, the semiconductor industry is paying close attention to ‘neuromorphic’ chips modeled on the human brain. By imitating the way the brain processes data, a neuromorphic chip can handle the growing volumes of data generated by demanding new technologies such as Big Data, Artificial Intelligence (AI) and Machine Learning far more effectively than existing machines, which struggle to keep up. Neuromorphic computing is expected to play a key role across various sectors, including voice and face recognition and data mining, by learning accurately from constantly evolving data.
“Neuromorphic Chip,” a Semiconductor that Works like the Human Brain
The neuromorphic chip is a new kind of semiconductor chip that takes its inspiration from the human brain, replicating the way the brain processes information and thinks. The human brain processes and stores information almost instantly, with more than 100 billion neurons communicating with one another through more than 100 trillion synapses, the links that connect them. Because these synapses are wired in parallel, the network can carry out memory, computation, reasoning and calculation simultaneously on roughly 20W of power, whereas a system such as AlphaGo consumes a tremendous amount of power. The brain’s synaptic information-transfer system is emerging as a key to AI because it can handle highly parallel computations with only a small amount of energy, and this is why the semiconductor industry is shifting its efforts toward brain research.
Neuromorphic technology offers unmatched storage, calculation, recognition and pattern analysis of vast amounts of information. Engineers have modeled the neuromorphic chip on the way neural structures process information, exchanging spike-shaped signals and adjusting the strength of the synaptic connections between neurons, and reproduce that process within a semiconductor.
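To make this more concrete, the sketch below models a single spiking neuron in the simplest possible way: a handful of weighted synapses deliver spike signals to a ‘leaky integrate-and-fire’ unit, which emits its own spike once enough input has accumulated. The neuron model, weights and threshold are illustrative toy choices, not the design of any actual neuromorphic chip.

import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: an illustrative model of
# spike-shaped signals weighted by synaptic strength. All parameters are
# arbitrary toy values, not those of any real neuromorphic chip.
rng = np.random.default_rng(0)

weights = np.array([0.5, 0.2, 0.9, 0.4])    # synaptic strengths (assumed)
threshold = 1.0                              # membrane potential that triggers a spike
leak = 0.9                                   # fraction of potential kept each time step
potential = 0.0

for t in range(20):
    # Presynaptic neurons either fire (1) or stay silent (0) at this time step.
    input_spikes = rng.random(weights.size) < 0.3
    # All synapses contribute at once: a weighted sum of the incoming spikes.
    potential = leak * potential + weights @ input_spikes
    if potential >= threshold:
        print(f"t={t:2d}: output spike")
        potential = 0.0                      # reset the membrane after firing

Because the weighted sum treats every synapse at the same time, even this toy version hints at the parallel, event-driven style of computation that neuromorphic hardware aims for.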
The von Neumann computer architecture, which processes input data sequentially, has revealed its limitations in power consumption, pattern recognition, and real-time recognition and judgment. While excellent at running numerical calculations or precisely written programs, this architecture is limited in its ability to process and understand images and sounds.
To put it simply, a computer is typically composed of a CPU, memory and other hardware such as input and output devices. Because every piece of data must travel back and forth between the CPU and memory before reaching other hardware, this traffic slows the system down and often produces the well-known bottleneck. The neuromorphic chip is designed to overcome these real-time information-processing speed limits.
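The bottleneck is easiest to picture as a loop in which each piece of data makes its own round trip between memory and the CPU before the next one can be handled. The toy sketch below is only a conceptual illustration of that sequential fetch-compute-store cycle, not a model of any particular processor.

# Toy model of the von Neumann cycle: every value makes a round trip between
# memory and the CPU, one at a time, so total time grows with the number of
# trips rather than with the amount of useful computation.
memory = list(range(8))              # data sitting in main memory (illustrative)

def cpu_compute(value):
    return value * 2                 # stand-in for a single arithmetic operation

for address in range(len(memory)):
    value = memory[address]          # fetch: memory -> CPU
    result = cpu_compute(value)      # execute: the CPU does the arithmetic
    memory[address] = result         # store: CPU -> memory

print(memory)                        # [0, 2, 4, 6, 8, 10, 12, 14]

A brain-inspired design avoids much of this shuttling by keeping memory and computation physically close together.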
The key to neuromorphic technology is having computers, which traditionally work sequentially, mimic the human brain, which works in parallel, so that massive amounts of memory and computation can be handled simultaneously. By imitating the structure of the human brain, neuromorphic chips can recognize more diverse, unstructured information such as text, images, sounds and videos as patterns, something previous computers could not do. Neuromorphic chips can also perform data input and output simultaneously, while consuming significantly less power than existing semiconductors.
Key Next-Generation Technologies for the 4th Industrial Revolution
Developing artificial neural network semiconductor devices and advancing them into neuromorphic chips will ultimately create a new computing system that combines the functions of memory semiconductors with the computing power of logic semiconductors. This means that when an outside command is received, the chip can perform multiple operations and process information simultaneously, much as a human brain does.
Once the neuromorphic chip is perfected, the AI technologies of the future will be able to operate at ultra-low power while still delivering high performance, like a human brain. This hardware-based approach to future AI is known as the Spiking Neural Network (SNN), to distinguish it from the software-based Deep Neural Network (DNN). Despite the significant advances AI makes with each passing day, the technology has yet to reach the capacity of a human brain. Neuromorphic technology, however, is poised to close the gap between humans and AI.
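The contrast between the two can be sketched in a few lines: a DNN-style unit turns its inputs into a single continuous activation value, while an SNN-style unit integrates the same inputs over time and communicates through discrete spikes, with the information carried in how often they occur. The parameters below are arbitrary toy values used purely for illustration; real spiking hardware such as Loihi is far more elaborate.

import numpy as np

# Illustrative contrast between a DNN-style unit and an SNN-style unit.
inputs = np.array([0.2, 0.8, 0.5])
weights = np.array([0.4, 0.7, -0.3])         # toy weights, shared by both units

# DNN-style unit: one weighted sum, one smooth nonlinearity, one number out.
dnn_activation = np.tanh(weights @ inputs)
print("DNN activation:", round(float(dnn_activation), 3))

# SNN-style unit: the same drive is integrated over time and emitted as spikes.
potential, threshold, spike_count = 0.0, 1.0, 0
for t in range(100):
    potential += weights @ inputs            # constant input current at each step
    if potential >= threshold:
        spike_count += 1                     # fire a spike and reset
        potential = 0.0
print("SNN spikes over 100 steps:", spike_count)

In real spiking hardware, work is only done when spikes actually occur, which is where much of the ultra-low power consumption comes from.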
Neuromorphic chips are also expected to be applied to various IT technologies such as face recognition, voice recognition, robots, drones, autonomous vehicles and wearable devices, all of which are pivotal next-generation technologies with a huge part to play in the 4th Industrial Revolution. Neuromorphic technology will enable more sophisticated applications, such as an AI doctor that can diagnose illness with great accuracy or a fully autonomous car that requires zero human intervention. With such a wide range of uses, neuromorphic chips show incredible growth potential.
Igniting Competition among IT and Semiconductor Companies for Technological Development
Many global IT companies have begun developing neuromorphic technology of their own, seeing it as a way to secure innovation in advanced technologies and national competitiveness in the 4th Industrial Revolution era.
In 2013, Qualcomm presented its own processor, “Zeroth,” designed to mimic the human brain’s learning capabilities. IBM then successfully developed a neuromorphic chip named “TrueNorth” in 2014 under the “Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE)” program of the US Defense Advanced Research Projects Agency (DARPA); TrueNorth contained one million electronic neurons and 256 million electronic synapses. In 2017, Intel announced “Loihi,” its first self-learning neuromorphic test chip, with 130,000 neurons and 130 million synapses. Intel established the Intel Neuromorphic Research Community (INRC) in 2018 and, more recently in July 2019, unveiled a neuromorphic system code-named “Pohoiki Beach,” which combines 64 Loihi chips into a single system. Meanwhile, Samsung Electronics announced in June 2019 its plan to strengthen its own Neural Processing Unit (NPU) capabilities and expand them into neuromorphic processor technology.
With an agreement signed with Stanford University in October 2016 on the “Joint Research and Development of Artificial Neural Network Devices,” SK hynix has fully launched the research and development of its own neuromorphic chips. SK hynix plans to develop neuromorphic chips using a ferroelectric material that can store multiple distinct data states depending on the voltage applied.