Towards Neuromorphic Machine Intelligence: Spike-Based Representation, Learning, and Applications

  • Book

  • August 2024
  • Elsevier Science and Technology
  • ID: 5955079
Towards Neuromorphic Machine Intelligence: Spike-Based Representation, Learning and Applications provides readers with an in-depth understanding of Spiking Neural Networks (SNNs), a burgeoning branch of Artificial Neural Networks (ANNs), AI, and Machine Learning that sits at the heart of the integration between Computer Science and Neural Engineering. In recent years, neural networks have re-emerged in AI as a well-grounded paradigm rooted in disciplines ranging from physics and psychology to information science and engineering. This book represents one of the established cross-over areas where neurophysiology, cognition, and neural engineering meet the development of new Machine Learning and AI paradigms.

There are many excellent theoretical achievements in neuron models, learning algorithms, network architectures, and related areas, but these achievements are numerous and scattered, lacking a straightforward, systematic integration, which makes them difficult for researchers to assimilate and apply. As the third generation of Artificial Neural Networks, Spiking Neural Networks simulate the neuron dynamics and information transmission of biological neural systems in greater detail, making them a cross-product of computer science and neuroscience.

The primary target audience of this book falls into two categories: artificial intelligence researchers who are new to SNNs, and researchers already well versed in SNNs. The former need to acquire fundamental knowledge of SNNs, but much of the existing literature either only touches on the basics or treats them too superficially; this book gives a systematic explanation from scratch. The latter need to learn about novel research achievements in the field; this book introduces the latest research results on different aspects of SNNs and provides detailed simulation procedures to facilitate readers' replication. In addition, the book introduces neuromorphic hardware architecture as a further extension of the SNN system.

The book starts with the birth and development of SNNs and then introduces the main research hotspots, including spiking neuron models, learning algorithms, network architectures, and neuromorphic hardware. It therefore gives readers easy access to both the foundational concepts and recent research findings in SNNs.
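To illustrate the kind of neuron dynamics the book is concerned with, below is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, one of the simplest spiking neuron models. It is not taken from the book; all parameter values and the input current are arbitrary choices for demonstration.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Parameters and input current are illustrative, not from the book.
def simulate_lif(current, dt=1e-3, tau=20e-3, v_rest=-70e-3,
                 v_reset=-70e-3, v_thresh=-50e-3, r_m=1e7):
    """Integrate an input current (A) and return spike times in seconds."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(current):
        # Leaky integration: decay toward rest plus the driving input.
        v += (-(v - v_rest) + r_m * i_in) * (dt / tau)
        if v >= v_thresh:          # threshold crossing emits a spike
            spikes.append(step * dt)
            v = v_reset            # membrane potential resets after the spike
    return spikes

# A constant 3 nA input for 100 ms yields a regular spike train.
print(simulate_lif(np.full(100, 3e-9)))
```

The information a downstream layer sees is only the spike times, which is what distinguishes SNNs from the real-valued activations of conventional ANNs.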

Table of Contents

1. Introduction
2. Fundamentals of Spiking Neural Networks
3. Specialized Spiking Neuron Model
4. Learning Algorithms for Shallow Spiking Neural Networks
5. Learning Algorithms for Deep Spiking Neural Networks
6. Neural Column-Inspired Spiking Neural Networks
7. Retinal-Inspired Visual Spiking Neural Network
8. ANN-SNN Algorithm Suitable for Ultra Energy Efficient Application
9. Neuromorphic Hardware
10. Conclusions

Authors

Hong Qu, Professor, Computational Intelligence Laboratory, School of Computer Science and Engineering, University of Electronic Science and Technology of China, China.

Dr. Hong Qu received the Ph.D. degree in computer science from the University of Electronic Science and Technology of China, Chengdu, China, in 2006. From 2007 to 2008, he was a Post-Doctoral Fellow with the Advanced Robotics and Intelligent Systems Laboratory, School of Engineering, University of Guelph, Guelph, ON, Canada. From 2014 to 2015, he was a Visiting Scholar with the Potsdam Institute for Climate Impact Research, Potsdam, Germany, and the Humboldt University of Berlin, Berlin, Germany. He is currently a Professor with the Computational Intelligence Laboratory, School of Computer Science and Engineering, University of Electronic Science and Technology of China. His current research interests include neural networks, machine learning, and big data.