TY - JOUR
T1 - Systematically testing OpenFlow controller applications
AU - Perešíni, Peter
AU - Kuźniar, Maciej
AU - Canini, Marco
AU - Venzano, Daniele
AU - Kostić, Dejan
AU - Rexford, Jennifer L.
N1 - Publisher Copyright:
© 2015 Elsevier B.V.
PY - 2015/12/9
Y1 - 2015/12/9
N2 - The emergence of OpenFlow-capable switches enables exciting new network functionality, at the risk of programming errors that make communication less reliable. The centralized programming model, where a single controller program manages the network, seems to reduce the likelihood of bugs. However, the system is inherently distributed and asynchronous, with events happening at different switches and end hosts, and inevitable delays affecting communication with the controller. In this paper, we present efficient, systematic techniques for testing unmodified controller programs. Our NICE tool applies model checking to explore the state space of the entire system - the controller, the switches, and the hosts. Scalability is the main challenge, given the diversity of data packets, the large system state, and the many possible event orderings. To address this, we propose a novel way to augment model checking with symbolic execution of event handlers (to identify representative packets that exercise code paths on the controller). We also present a simplified OpenFlow switch model (to reduce the state space), and effective strategies for generating event interleavings likely to uncover bugs. Our prototype tests Python applications on the popular NOX platform. In testing three real applications - a MAC-learning switch, in-network server load balancing, and energy-efficient traffic engineering - we uncover 13 bugs.
AB - The emergence of OpenFlow-capable switches enables exciting new network functionality, at the risk of programming errors that make communication less reliable. The centralized programming model, where a single controller program manages the network, seems to reduce the likelihood of bugs. However, the system is inherently distributed and asynchronous, with events happening at different switches and end hosts, and inevitable delays affecting communication with the controller. In this paper, we present efficient, systematic techniques for testing unmodified controller programs. Our NICE tool applies model checking to explore the state space of the entire system - the controller, the switches, and the hosts. Scalability is the main challenge, given the diversity of data packets, the large system state, and the many possible event orderings. To address this, we propose a novel way to augment model checking with symbolic execution of event handlers (to identify representative packets that exercise code paths on the controller). We also present a simplified OpenFlow switch model (to reduce the state space), and effective strategies for generating event interleavings likely to uncover bugs. Our prototype tests Python applications on the popular NOX platform. In testing three real applications - a MAC-learning switch, in-network server load balancing, and energy-efficient traffic engineering - we uncover 13 bugs.
KW - Model checking
KW - OpenFlow
KW - Reliability
KW - Software-defined networking
KW - Symbolic execution
UR - http://www.scopus.com/inward/record.url?scp=84948569552&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84948569552&partnerID=8YFLogxK
U2 - 10.1016/j.comnet.2015.03.019
DO - 10.1016/j.comnet.2015.03.019
M3 - Article
AN - SCOPUS:84948569552
SN - 1389-1286
VL - 92
SP - 270
EP - 286
JO - Computer Networks
JF - Computer Networks
ER -