As a result of clock-speed stagnation and the foreseeable limits of transistor density, traditional computing technology based on the Von Neumann architecture is facing fundamental limits. Neuromorphic engineering is therefore attracting increased attention, with the ultimate goal of realizing machines that rival the human brain in some aspects of cognitive intelligence. In that sense, brain research bears the promise of a new computing paradigm. One such framework is IBM's brain-inspired TrueNorth chip, powered by an unprecedented 1,000,000 neurons and 256,000,000 synapses. At 5.4 billion transistors it is the largest chip IBM has ever built, with an on-chip network of 4,096 neurosynaptic cores, yet it consumes only 70 mW during real-time operation, orders of magnitude less energy than traditional chips. As part of a complete cognitive hardware and software ecosystem, this technology opens new computing frontiers for distributed-sensor and supercomputing applications.
In this course we will study the fundamentals of neuromorphic computing, focusing on the digital neuron model for neurosynaptic cores, cognitive programming paradigms, and algorithms and applications for networks of neurosynaptic cores.
Unit 1. Introduction to cognitive computing: models, abstractions and reductionism in brain sciences. We will focus on three perspectives of cognitive computing. The scientific perspective: conceptualizing and modelling the brain, the brain's computational building blocks, neurophysiology and neuroanatomy, emergent behavior and neuro-simulations. The computer-architect perspective: CMOS circuits, quantum and power limitations of transistor density, emerging computing architectures, and paradigms of neuromorphic hardware. The algorithmic perspective: feature extraction vs. feature learning; descriptive, mechanistic, and interpretive models of receptive fields; and functional architectures.
Unit 2. From the neuron doctrine to neural networks: the Neural Engineering Framework: representation, decoding, transformation and dynamics; the Nengo software environment for the simulation of large-scale neural systems. Research focus: Spaun, a large-scale model of the functioning brain.
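The core idea of NEF representation and decoding covered in this unit can be illustrated with a short sketch. This is not Nengo itself but a minimal NumPy analogue, with made-up tuning-curve parameters: a population of rectified-linear "neurons" encodes a scalar, and linear decoders recovered by least squares read the value back out.

```python
import numpy as np

# Illustrative NEF-style sketch (not Nengo): a population of
# rectified-linear neurons represents a scalar x, and linear
# decoders solved by least squares recover it from the activity.
rng = np.random.default_rng(0)
n_neurons = 50
encoders = rng.choice([-1.0, 1.0], size=n_neurons)  # preferred directions
gains = rng.uniform(0.5, 2.0, size=n_neurons)       # illustrative gains
biases = rng.uniform(-1.0, 1.0, size=n_neurons)     # illustrative biases

def rates(x):
    """Population firing rates for input x (rectified-linear tuning curves)."""
    return np.maximum(0.0, gains * encoders * x + biases)

# Solve for decoders d minimizing ||A d - x||^2 over sampled inputs.
xs = np.linspace(-1, 1, 200)
A = np.array([rates(x) for x in xs])        # activity matrix, 200 x 50
d, *_ = np.linalg.lstsq(A, xs, rcond=None)  # least-squares decoders

x_hat = rates(0.5) @ d  # decoded estimate of x = 0.5
```

The same decoder-solving step generalizes to computing transformations (decode f(x) instead of x), which is how NEF connections between populations implement functions.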
Unit 3. Neuromorphic VLSI: Circuits: introduction to VLSI circuits and the silicon neuron; temporal integration: the pulse current-source synapse, the reset and discharge synapse, the linear charge and discharge synapse, and the Diff-Pair integrator; spike generation and the axon-hillock circuit; the DPI neuron circuit; spike-frequency adaptation; balancing a signal's energy, efficiency, and precision. Neuronal architectures: fully dedicated, shared axon, shared synapse and shared dendrite models; the address-event representation and the address-event bus. Technology focus: the Neurogrid (Stanford University) - architecture, NEF integration, Spaun integration; controlling robots with spiking silicon neurons. Research focus: the retina on a chip.
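The integrate-spike-reset behavior that circuits such as the axon-hillock implement in silicon can be previewed with a minimal software analogue. This is a leaky integrate-and-fire (LIF) model with illustrative parameters, not a circuit-level model of any chip discussed in the unit.

```python
import numpy as np

# Minimal software analogue of the integrate-and-fire behaviour that
# silicon neuron circuits implement: leaky integration toward a
# threshold, a spike on crossing, and a reset (as the discharge
# circuitry would perform). All parameters are illustrative.
def lif(input_current, dt=1e-4, tau=0.02, v_th=1.0, v_reset=0.0):
    """Simulate a LIF neuron; return the list of spike times (seconds)."""
    v = 0.0
    spikes = []
    for step, i_in in enumerate(input_current):
        v += dt * (-v + i_in) / tau  # leaky integration: dv/dt = (-v + i)/tau
        if v >= v_th:                # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset              # reset the membrane potential
    return spikes

# A constant suprathreshold input produces regular spiking.
spike_times = lif(np.full(10_000, 1.5))  # one second of input at dt = 0.1 ms
```

Spike-frequency adaptation, covered later in the unit, would add a slow negative feedback term to this loop so that the interspike interval grows under sustained input.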
Unit 4. Hybrid neuromorphic architectures: liberating programming from the Von Neumann architecture, discrete neuronal modeling, event-driven computing, SOI fabrication, the neural programming model, the Compass simulator, multi-core programming paradigms, the Corelet programming language, and neural codes (binary, rate, population and time-to-spike coding); from VLSI to FPGA. Technology focus: the TrueNorth (IBM), the SpiNNaker (University of Manchester), and the BlueHive (Cambridge University).
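Two of the neural codes listed above can be contrasted in a few lines. This is an illustrative sketch with made-up parameters (window length, maximum rate): rate coding represents a value by how many spikes fall in a time window, while time-to-first-spike coding represents it by how early a single spike arrives.

```python
# Illustrative contrast between rate coding and time-to-first-spike
# coding; the window, max_rate, and t_max parameters are made up.

def rate_code(value, window=0.1, max_rate=200.0):
    """Encode value in [0, 1] as a spike count over `window` seconds."""
    return int(round(value * max_rate * window))

def time_to_first_spike(value, t_max=0.1):
    """Encode value in [0, 1] as a latency: larger value -> earlier spike."""
    return (1.0 - value) * t_max

count = rate_code(0.5)              # 10 spikes in a 100 ms window
latency = time_to_first_spike(0.5)  # a single spike at 50 ms
```

The trade-off this exposes is the one the unit examines: rate codes are robust but need a full window to read out, while temporal codes deliver the value with the first spike at the cost of timing precision.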
Unit 5. The Tensor Processing Unit (TPU), Cloud TPU, and TensorFlow on AWS.