Abstract
Modern-day computing systems are based on the von Neumann architecture proposed in 1945, but they face the dual challenges of: 1) the unique data-centric requirements of emerging applications and 2) the increased nondeterminism of nanoscale technologies caused by process variations and failures. This paper presents a Shannon-inspired statistical model of computation (statistical computing) that addresses the statistical attributes of both emerging cognitive workloads and nanoscale fabrics within a common framework. Statistical computing is a principled approach to the design of non-von Neumann architectures. It emphasizes the use of information-based metrics; enables the determination of fundamental limits on energy, latency, and accuracy; guides the exploration of statistical design principles for low signal-to-noise ratio (SNR) circuit fabrics and architectures such as the deep in-memory architecture (DIMA) and the deep in-sensor architecture (DISA); and thereby provides a framework for the design of computing systems that approach the limits of energy efficiency, latency, and accuracy. From its early origins, Shannon-inspired statistical computing has grown into a concrete design framework validated extensively via both theory and laboratory prototypes in CMOS and beyond-CMOS technologies. The framework continues to grow at both levels, yielding new ways of connecting systems through architectures, circuits, and devices as the semiconductor roadmap continues its march into the nanoscale era.
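To make the abstract's notion of computing reliably on low-SNR fabrics concrete, the sketch below is a hypothetical NumPy illustration, not code from the paper, of statistical error compensation in the style of algorithmic noise tolerance: a noisy main computation is fused with a reliable but reduced-precision estimate. The functions `noisy_dot` and `estimator_dot`, the error model, the bit width, and the threshold are all assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_dot(w, x, p_err=0.05, err_scale=50.0):
    """Dot product on a hypothetical low-SNR fabric: with probability
    p_err the result is corrupted by a large-magnitude error (modeling,
    e.g., a timing violation flipping a high-order bit)."""
    y = float(np.dot(w, x))
    if rng.random() < p_err:
        y += rng.choice([-1.0, 1.0]) * err_scale * rng.random()
    return y

def estimator_dot(w, x, bits=4):
    """Reliable but reduced-precision estimate of the same dot product
    (quantized operands stand in for a small, robust estimator block)."""
    scale = 2 ** (bits - 1)
    wq = np.round(w * scale) / scale
    xq = np.round(x * scale) / scale
    return float(np.dot(wq, xq))

def fused_output(y_main, y_est, threshold=5.0):
    """ANT-style fusion: accept the main (noisy) result unless it
    deviates too far from the reliable low-precision estimate."""
    return y_main if abs(y_main - y_est) < threshold else y_est

# Compare mean squared error with and without statistical error compensation.
w = rng.standard_normal(64) / 8
errors_raw, errors_fused = [], []
for _ in range(10_000):
    x = rng.standard_normal(64)
    y_true = float(np.dot(w, x))
    y_main = noisy_dot(w, x)
    y_est = estimator_dot(w, x)
    errors_raw.append((y_main - y_true) ** 2)
    errors_fused.append((fused_output(y_main, y_est) - y_true) ** 2)

print(f"MSE, noisy fabric alone      : {np.mean(errors_raw):.3f}")
print(f"MSE, with statistical fusion : {np.mean(errors_fused):.3f}")
```

Under these assumed parameters, the rare large errors of the noisy fabric dominate the raw mean squared error, while the fused output stays close to the true result, which is the basic trade the abstract describes: accepting an unreliable, energy-efficient compute path and recovering accuracy statistically.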
| Original language | English (US) |
|---|---|
| Article number | 8482253 |
| Pages (from-to) | 90-107 |
| Number of pages | 18 |
| Journal | Proceedings of the IEEE |
| Volume | 107 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2019 |
All Science Journal Classification (ASJC) codes
- General Computer Science
- Electrical and Electronic Engineering
Keywords
- Artificial intelligence
- computing
- information theory
- machine learning
- nanoscale devices
- statistical computing