Thesis Defense - Nellie Wu

Speaker: Nellie Wu, MIT
Open to: MIT Community

Title: Systematic Modeling and Design of Sparse Tensor Accelerators
Date: Friday, May 5th, 2023
Time: 2:00 PM

Location: 32-D463 (STAR)

Zoom Link: https://mit.zoom.us/j/96631190975?pwd=S1FMOXMzRE9vTW9kK2tpa2ZiOVJCZz09

Abstract:
Sparse tensor algebra is an important computation kernel in many popular applications, such as deep neural networks (DNNs), circuit simulations, and graph algorithms. The sparsity (i.e., zero values) in such kernels motivates the development of many sparse tensor accelerators, which aim to translate that sparsity into hardware savings such as reduced energy consumption and processing latency. However, despite the abundance of existing proposals, there has been no systematic way to understand, model, and develop sparse tensor accelerators.
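To make the opportunity concrete, here is a minimal sketch (not from the thesis; a generic illustration) of how a compressed sparse format lets a computation skip zero values entirely, which is the kind of work reduction sparse tensor accelerators seek to realize in hardware:

```python
# Illustrative sketch: dense vs. CSR-style sparse matrix-vector multiply.
# The sparse version only performs multiplies for nonzero entries,
# mirroring the "translate sparsity into savings" idea in hardware.

def dense_mv(A, x):
    """Dense multiply: touches every element, including zeros."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def to_csr(A):
    """Compress A into CSR: (values, column indices, row pointers)."""
    vals, cols, ptrs = [], [], [0]
    for row in A:
        for j, a in enumerate(row):
            if a != 0:
                vals.append(a)
                cols.append(j)
        ptrs.append(len(vals))
    return vals, cols, ptrs

def csr_mv(vals, cols, ptrs, x):
    """Sparse multiply: zeros are never stored, so never computed on."""
    y = []
    for r in range(len(ptrs) - 1):
        y.append(sum(vals[k] * x[cols[k]] for k in range(ptrs[r], ptrs[r + 1])))
    return y

A = [[0, 2, 0],
     [1, 0, 0],
     [0, 0, 3]]
x = [4, 5, 6]
print(dense_mv(A, x))               # [10, 4, 18]
vals, cols, ptrs = to_csr(A)
print(csr_mv(vals, cols, ptrs, x))  # [10, 4, 18] -- same result
print(f"multiplies: dense=9, sparse={len(vals)}")
```

In an accelerator, the analogous savings come from skipping the memory accesses and arithmetic for zeros in silicon rather than in software.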

To address this limitation, we first present a well-defined taxonomy of sparsity-related acceleration features that allows a systematic understanding of the sparse tensor accelerator design space. Based on this taxonomy, we propose Sparseloop, the first analytical modeling framework for fast, accurate, and flexible evaluation of sparse tensor accelerators, enabling early-stage exploration of the large and diverse design space. Employing Sparseloop, we search the design space and present an efficient and flexible DNN accelerator built around a novel sparsity pattern called hierarchical structured sparsity. The key insight is that diverse degrees of sparsity can be translated efficiently into hardware savings by composing them hierarchically from simple sparsity patterns.
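The composition idea can be sketched in a few lines. The following is a hedged illustration, not the thesis's exact formulation: a coarse pattern keeps or drops whole blocks, and a fine n:m pattern (keep n of every m elements) applies inside each kept block, so the overall density is the product of two simple per-level densities. All names and parameters here are hypothetical.

```python
# Hedged sketch of hierarchically composed sparsity patterns:
# level 1 (coarse): an on/off mask over blocks,
# level 2 (fine):   an n:m pattern inside each live block.

def hss_mask(n_blocks, block_size, live_blocks, n, m):
    """Build a 1-D binary mask by composing two simple patterns."""
    mask = []
    for b in range(n_blocks):
        if b in live_blocks:                        # coarse level: block kept
            for i in range(block_size):
                mask.append(1 if i % m < n else 0)  # fine level: n of every m
        else:                                       # coarse level: block dropped
            mask.extend([0] * block_size)
    return mask

mask = hss_mask(n_blocks=3, block_size=4, live_blocks={0, 2}, n=2, m=4)
print(mask)  # [1, 1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0]
# Overall density 4/12 = (2/3 live blocks) * (2/4 kept per block):
print(sum(mask) / len(mask))
```

Because each level is a simple, regular pattern, hardware can support a wide range of overall densities while only ever processing the two easy-to-index component patterns.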

Thesis Committee:
Vivienne Sze
Joel Emer
Daniel Sanchez