Neuromorphic Computing with Stochastic Binary Weights Based on Magnetic Tunnel Junctions

Many problems best solved by neural networks, such as natural language processing and image recognition for self-driving cars, are growing rapidly in both nascent and established fields. Current limitations in manufacturing technology prevent these performance demands from being met by conventional methods. Neuromorphic computing has been proposed as a potential solution for problems best addressed with artificial neural networks. However, the memory-bandwidth-intensive nature of neuromorphic computing architectures gives rise to power and performance constraints that limit scalability. Weight quantization has been explored to overcome this limitation, yet even with single-bit binary weights, stochastic behaviour is required to achieve adequate performance; this in turn necessitates additional circuitry, negating the cost and performance advantages of binary weights. The emerging technology of Magnetic Tunnel Junctions (MTJs) is therefore proposed as a means of implementing inherently stochastic, non-volatile Logic-in-Memory for binary-weighted neuromorphic computing architectures.
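
Below is a minimal sketch of the kind of stochastic weight binarization the abstract refers to (in the style of BinaryConnect-type schemes). It is illustrative only: the function and variable names are not taken from the project, and NumPy's pseudo-random generator stands in for the source of randomness that, in the proposed approach, would come from the MTJs' inherently probabilistic switching rather than from extra digital circuitry.

import numpy as np

def stochastic_binarize(weights, rng=None):
    """Stochastically map real-valued weights in [-1, 1] to {-1, +1}.

    Each weight w is set to +1 with probability p = (w + 1) / 2 and to -1
    otherwise, so the binary weight equals w in expectation. This is the
    stochastic behaviour that binary-weight networks need for adequate
    performance, per the abstract.
    """
    if rng is None:
        rng = np.random.default_rng()
    w = np.clip(weights, -1.0, 1.0)
    p = (w + 1.0) / 2.0                      # probability of drawing +1
    return np.where(rng.random(w.shape) < p, 1.0, -1.0)

# Example: binarize a small weight matrix for one forward pass
w_real = np.array([[0.3, -0.8], [0.0, 0.95]])
w_bin = stochastic_binarize(w_real)
print(w_bin)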

Faculty Supervisor:

Warren Gross

Student:

Sean Smithson

Discipline:

Engineering - computer / electrical

University:

McGill University

Program:

Globalink Research Award
