Modern-day computing systems are based on the von Neumann architecture proposed in 1945, but they face two challenges: 1) the unique data-centric requirements of emerging applications and 2) the increased nondeterminism of nanoscale technologies caused by process variations and failures. This paper presents a Shannon-inspired statistical model of computation (statistical computing) that addresses the statistical attributes of both emerging cognitive workloads and nanoscale fabrics within a common framework. Statistical computing is a principled approach to the design of non-von Neumann architectures. It emphasizes the use of information-based metrics; enables the determination of fundamental limits on energy, latency, and accuracy; and guides the exploration of statistical design principles for low signal-to-noise ratio (SNR) circuit fabrics and architectures such as the deep in-memory architecture (DIMA) and the deep in-sensor architecture (DISA). It thereby provides a framework for the design of computing systems that approach the limits of energy efficiency, latency, and accuracy. From its early origins, Shannon-inspired statistical computing has grown into a concrete design framework validated extensively through both theory and laboratory prototypes, in CMOS and beyond. The framework continues to grow at both levels, yielding new ways of connecting systems through architectures, circuits, and devices as the semiconductor roadmap marches into the nanoscale era.
All Science Journal Classification (ASJC) codes
- Electrical and Electronic Engineering
- Artificial Intelligence
- Information Theory
- Machine Learning
- Nanoscale Devices
- Statistical Computing