Devices and Algorithms for Analog Deep Learning

MTL Seminar Series
Speaker: Murat Onen, MIT
Location: Grier (34-401)
Open to: MIT Community

Join members of the MIT, MTL, and AI Hardware Program communities for a Joint MTL/AI Hardware Program Seminar.


Abstract: Analog deep-learning processors can provide orders of magnitude higher processing speed and energy efficiency than their traditional digital counterparts, gains that are imperative if the promise of artificial intelligence is to be realized. However, the implementation of analog processors faces a significant barrier with two coupled components: 1) the absence of devices that satisfy stringent algorithm-imposed demands, and 2) the lack of algorithms that can tolerate inevitable device nonidealities. This talk will present major advancements along both directions: a novel near-ideal device technology and a superior neural network training algorithm. The devices, first realized here, are CMOS-compatible nanoscale protonic programmable resistors that combine the benefits of nanoionics with the extreme acceleration of ion transport under strong electric fields. Enabled by a material-level breakthrough, the use of phosphosilicate glass (PSG) as a proton electrolyte, these devices achieve controlled proton intercalation in nanoseconds with high energy efficiency. Separately, a theoretical analysis explains the infamous incompatibility between asymmetric device modulation and conventional neural network training algorithms. By establishing a powerful analogy with classical mechanics, a novel method, Stochastic Hamiltonian Descent, has been developed that exploits device asymmetry as a useful feature instead. In combination, the two developments presented in this thesis can be effective in ultimately realizing the potential of analog deep learning.
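To make the asymmetry problem concrete, the minimal sketch below, which is an illustration and not material from the talk itself, models a programmable resistor whose conductance increases (potentiation) and decreases (depression) by unequal fixed step sizes. Under a conventional sign-based gradient update, the weight drifts toward the point where up- and down-pulses balance in expectation rather than toward the loss minimum. All step sizes and device parameters here are hypothetical.

```python
import numpy as np

# Hypothetical asymmetric device: potentiation and depression pulses
# change the stored weight by unequal amounts (values are illustrative).
STEP_UP = 0.010    # conductance increase per potentiation pulse
STEP_DOWN = 0.025  # conductance decrease per depression pulse

def device_update(w, grad):
    """Apply one pulsed update: the sign of the gradient picks the pulse
    polarity, but the magnitude is set by the device, not the gradient."""
    return w + STEP_UP if grad < 0 else w - STEP_DOWN

# Toy quadratic loss L(w) = (w - w_star)^2 with its minimum at w_star.
w_star = 0.5
w = 0.0
rng = np.random.default_rng(0)
for _ in range(2000):
    grad = 2 * (w - w_star) + rng.normal(scale=0.5)  # noisy gradient
    w = device_update(w, grad)

# With symmetric steps w would settle near w_star = 0.5; with these
# asymmetric steps it settles where up- and down-pulses cancel on average.
print(f"converged weight: {w:.3f} (target {w_star})")
```

Running this toy loop shows the weight settling noticeably below the target, the systematic bias that conventional training cannot correct and that Stochastic Hamiltonian Descent is designed to turn to its advantage.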


Bio: Murat Onen is a Postdoctoral Researcher at the Massachusetts Institute of Technology (MIT), where he also earned his PhD in Electrical Engineering and Computer Science. His research focuses on devices, architectures, and algorithms for analog deep learning and has led to 16 patents and numerous publications to date. Currently, he is developing nanoprotonic programmable resistors and specialized training algorithms for analog crossbar accelerators.