Keras2c: A library for converting Keras neural networks to real-time compatible C

Rory Conlin, Keith Erickson, Joseph Abbate, Egemen Kolemen

Research output: Contribution to journal › Article › peer-review

33 Scopus citations

Abstract

With the growth of machine learning models and neural networks in measurement and control systems comes the need to deploy these models in a way that is compatible with existing systems. Existing options for deploying neural networks either introduce very high latency, require expensive and time-consuming work to integrate into existing code bases, or support only a very limited subset of model types. We have therefore developed a new method called Keras2c, a simple library for converting Keras/TensorFlow neural network models into real-time compatible C code. It supports a wide range of Keras layers and model types, including multidimensional convolutions, recurrent layers, multi-input/output models, and shared layers. Keras2c re-implements the core components of Keras/TensorFlow required for predictive forward passes through neural networks in pure C, relying only on standard library functions considered safe for real-time use. The core functionality consists of ∼1500 lines of code, making it lightweight and easy to integrate into existing codebases. Keras2c has been successfully tested in experiments and is currently in use on the plasma control system at the DIII-D National Fusion Facility at General Atomics in San Diego.

Original language: English (US)
Article number: 104182
Journal: Engineering Applications of Artificial Intelligence
Volume: 100
DOIs
State: Published - Apr 2021

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering

Keywords

  • C
  • Control systems
  • Keras
  • Neural networks
  • Real-time
  • Software
