Jan 23, 2025
Neuromorphic Computing for More Efficient AI Systems
The human brain remains a valuable source of inspiration for modeling computational capabilities. One of the most promising scientific endeavors in this area is neuromorphic computing, a brain-inspired approach to developing hardware and algorithms that can make artificial neural networks far more energy-efficient. However, the pursuit of machine intelligence modeled on the brain requires a paradigm shift in the way computing platforms are designed and built. A world-class consortium led by the University of Texas at San Antonio, involving TU Dresden and its spin-off SpiNNcloud Systems, recently published a summary of the latest findings and the current state of research in the renowned scientific journal “Nature”.
The publication focuses on the use of neuroscientific principles to develop efficient computing systems that can be scaled up to supercomputer level, particularly against the backdrop of the rapidly growing demands of AI models and data centers. The article presents approaches for developing scalable neuromorphic architectures and functions and underlines their key features. It highlights potential applications that could benefit from scaling, as well as the key challenges that arise.
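To give a rough sense of the event-driven, spike-based style of computation that such brain-inspired hardware is built to accelerate, the minimal sketch below simulates a single leaky integrate-and-fire neuron. It is purely illustrative: it is not code from the Nature paper and does not reflect SpiNNaker2's actual programming model, and the function name and parameters (simulate_lif, v_thresh, v_reset, leak) are hypothetical choices for readability.

```python
import numpy as np

# Illustrative sketch only: a generic leaky integrate-and-fire (LIF) neuron,
# not code from the Nature paper and not the SpiNNaker2 programming model.

def simulate_lif(input_current, v_thresh=1.0, v_reset=0.0, leak=0.9):
    """Simulate one LIF neuron over a sequence of input currents.

    Returns the time steps at which the neuron emits a spike. Output is
    produced only when the membrane potential crosses the threshold, so
    downstream communication and computation are event-driven rather than
    performed densely at every step.
    """
    v = v_reset
    spike_times = []
    for t, i_in in enumerate(input_current):
        v = leak * v + i_in      # leaky integration of the input
        if v >= v_thresh:        # threshold crossing: emit a spike
            spike_times.append(t)
            v = v_reset          # reset the membrane potential
    return spike_times

# Example: a weak constant input produces only sparse spikes.
print(simulate_lif(np.full(50, 0.15)))
```

Because neurons communicate only through such sparse spike events, neuromorphic systems can avoid much of the dense arithmetic and data movement of conventional accelerators, which is the intuition behind their energy-efficiency claims.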
“These new findings regarding scalability are particularly relevant for the brain-like supercomputer SpiNNaker2, which was developed at TU Dresden and the University of Manchester and is now being commercialized by the TUD spin-off SpiNNcloud Systems,” explains Prof. Christian Mayr, holder of the Chair of Highly-Parallel VLSI Systems and Neuro-Microelectronics and co-author of the study.
“With more than 5 million computing cores, SpiNNaker2 stands as the world's largest and most flexible energy-efficient supercomputer for real-time artificial intelligence at large scale. This article confirms our research direction and identifies our SpiNNaker2, alongside Intel's Loihi2, as the leading systems in the field of neuromorphic computing,” adds Mayr.
“The importance of large-scale, brain-inspired systems should not be underestimated, especially with regard to the unsustainable and rapidly growing future requirements of AI models and data centers,” adds co-author Dr. Hector Gonzalez, who, like Prof. Christian Mayr, is a co-founder of SpiNNcloud Systems. “Brain-inspired systems are a disruptive technological step away from incremental improvements in conventional technologies towards a completely new architectural paradigm for the next evolution of energy-efficient and more robust AI. The Nature paper underlines past and future efforts in this area.”
The team of authors also analyzes the ecosystem needed to enable sustainable growth and highlights the opportunities presented by scaling neuromorphic systems. The paper summarizes ideas from different areas of computing and provides valuable impetus for researchers and practitioners in neuromorphic computing to advance the technology further.
Press contact
Prof. Christian Mayr
TU Dresden
Chair of Highly-Parallel VLSI Systems and Neuro-Microelectronics
Dr. Hector Gonzalez
Co-founder and co-CEO
SpiNNcloud Systems GmbH
Participating research institutions
University of Texas at San Antonio, TX, USA
University of Tennessee, Knoxville, TN, USA
Sandia National Laboratories, Albuquerque, NM, USA
Rochester Institute of Technology, NY, USA
University of Pittsburgh, PA, USA
Intel Labs, CA, USA
Technische Universität Dresden, Germany
U.S. Naval Research Lab
Google DeepMind
Italian Institute of Technology
University of California, San Diego, USA
Institute of Neuroinformatics, University of Zurich and ETH Zurich
National Institute of Standards and Technology, CO, USA
Oak Ridge National Laboratory, TN, USA
SpiNNcloud Systems GmbH, Dresden, Germany
Indian Institute of Science, India
Royal Holloway, University of London, UK
The University of Manchester, UK