Neuromorphic Computing with Stochastic Binary Weights Based on Magnetic Tunnel Junctions

Problems best solved by neural networks are growing rapidly in both nascent and established fields, such as natural language processing and image recognition for self-driving cars. Limitations in current manufacturing technologies prevent these performance demands from being met through conventional means, and neuromorphic computing has been proposed as an alternative. However, the memory-bandwidth intensity of neuromorphic computing architectures gives rise to power and performance constraints that limit scalability. Weight quantization has been explored to overcome this limitation, yet even with single-bit binary weights, stochastic behaviour is required to achieve adequate accuracy; this in turn necessitates additional circuitry, negating the cost and performance advantages of binary weights. Here, the emerging technology of Magnetic Tunnel Junctions (MTJs) is proposed as a means of implementing inherently stochastic, non-volatile Logic-in-Memory for binary-weighted neuromorphic computing architectures.
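As a rough illustration of the stochastic behaviour the abstract refers to, the sketch below shows one common scheme (not specified in this abstract) for stochastic weight binarization: each real-valued weight is mapped to +1 with a probability given by a hard-sigmoid of its value, so the binary weight equals the real weight in expectation. In an MTJ-based design, this switching probability would come from the junction's physics rather than a software random number generator; the function name and probability mapping here are illustrative assumptions.

```python
import numpy as np

def stochastic_binarize(w, rng):
    """Binarize real-valued weights in [-1, 1] to {-1, +1} stochastically.

    Each weight becomes +1 with probability p = clip((w + 1) / 2, 0, 1),
    so E[binary weight] = w. A hypothetical MTJ implementation would
    realize p through the junction's stochastic switching rather than
    a software RNG, as assumed here for illustration.
    """
    p = np.clip((w + 1.0) / 2.0, 0.0, 1.0)
    return np.where(rng.random(w.shape) < p, 1.0, -1.0)

rng = np.random.default_rng(0)
w = np.array([-0.9, -0.2, 0.0, 0.4, 0.8])

# Averaging many stochastic binarizations recovers the real weights,
# which is why stochastic rounding preserves accuracy at 1-bit precision.
samples = np.stack([stochastic_binarize(w, rng) for _ in range(10000)])
print(samples.mean(axis=0))  # approaches w as the sample count grows
```

The key point is that a single forward pass sees only ±1 weights (cheap storage and multiply-free arithmetic), while the ensemble behaviour over many samples tracks the full-precision weights.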

Sean Smithson
Faculty Supervisor: 
Warren Gross
Partner University: 
Tohoku University