Keyphrases

Activation Function: 6%
Analog Computing: 11%
Analog Technology: 8%
Architectural Integration: 8%
Architectural Parameters: 6%
Batch Input: 8%
Batch Normalization Layer: 8%
Binary Input: 11%
Capacitive Charging: 16%
Capacitor-based: 33%
Charge-domain Compute: 33%
CIFAR-10: 11%
Classification Accuracy: 6%
Column Parallel: 66%
Data Dimensions: 6%
Desired Conditions: 11%
Digital Switching: 22%
Dynamic Range: 11%
Embedded Microprocessor: 33%
Energy Efficiency: 22%
Factorized: 16%
General Rotation: 11%
General-rank: 16%
Global Minimum: 16%
Google TensorFlow: 11%
Gradient Flow: 16%
Hessian Eigenvalues: 16%
Hidden Layer Neurons: 8%
Image Classification: 11%
In-memory Computing: 66%
Inference Accelerator: 33%
Input Driver: 22%
Input Filter: 33%
Input Vector: 22%
Invariant Manifolds: 16%
Large-scale Neural Networks: 8%
Layer Width: 6%
Least Eigenvalue: 16%
Library Networks: 8%
Low-rank: 16%
Low-rank Matrix Factorization: 50%
Matrix Factorization: 33%
Mixed-signal Accelerators: 16%
Mixed-signal Processing: 66%
Multi-dataset Analysis: 11%
Near-memory: 8%
Near-memory Computing: 9%
Neural Network: 33%
Neural Network Classifier: 33%
Occam's Razor: 6%
Parallel Matrix Multiplication: 22%
Parameter-free: 13%
Programmable Neural Networks: 33%
Rectified Linear Unit (ReLU): 6%
Row-column: 33%
Separability: 6%
Signal Operation: 8%
Single Instruction Multiple Data: 8%
Singular Values: 33%
Software Integration: 8%
Strict Saddle: 33%
Training Data: 20%
Training Error: 6%
Training Examples: 6%
Training Set: 6%
Computer Science

Activation Function: 5%
And-States: 16%
Classification Accuracy: 5%
Compute Domain: 33%
Convolution Layer: 16%
Convolutional Neural Network: 11%
Digital Conversion: 11%
Dimensional Matrix: 8%
Energy Consumption: 11%
Energy Efficiency: 100%
Energy Efficient: 11%
Free Parameter: 11%
Heterogeneous Architecture: 11%
Image Classification: 44%
Library Network: 11%
Mapping Algorithm: 16%
Network Inference: 33%
Neural Network: 66%
Noise-to-Signal Ratio: 33%
Parallelism: 16%
Physical Design: 16%
Programmability: 11%
Quantization Level: 11%
Signal Operation: 16%
Single Instruction Multiple Data: 11%
Software Library: 11%
Software Stack: 11%
Sparsity: 11%
Training Data: 16%
Training Error: 5%
Training Example: 5%
Vector Multiplication: 50%
Virtualization: 16%
Engineering

Binary Input: 33%
Dynamic Range: 33%
Energy Conservation: 66%
Energy Efficiency: 66%
Image Classification: 33%
Input Level: 16%
Input Vector: 66%
Output Activation: 16%