TY - GEN
T1 - MadEye: Boosting Accuracy for Continuous Video Analytics via Camera-Server Co-Design
T2 - 21st USENIX Symposium on Networked Systems Design and Implementation, NSDI 2024
AU - Wong, Mike
AU - Ramanujam, Murali
AU - Balakrishnan, Guha
AU - Netravali, Ravi
N1 - Publisher Copyright:
© 2024 Proceedings of the 21st USENIX Symposium on Networked Systems Design and Implementation, NSDI 2024. All rights reserved.
PY - 2024
Y1 - 2024
AB - Camera orientations (i.e., rotation and zoom) govern the content that a camera captures in a given scene, which in turn heavily influences the accuracy of live video analytics pipelines. However, existing analytics approaches leave this crucial adaptation knob untouched, instead opting to only alter the way that captured images from fixed orientations are encoded, streamed, and analyzed. We present MadEye, a camera-server system that automatically and continually adapts orientations to maximize accuracy for the workload and resource constraints at hand. To realize this using commodity pan-tilt-zoom (PTZ) cameras, MadEye embeds (1) a search algorithm that rapidly explores the massive space of orientations to identify a fruitful subset at any point in time, and (2) a novel knowledge distillation strategy to efficiently (with only camera resources) select the ones that maximize workload accuracy. Experiments on diverse workloads show that MadEye boosts accuracy by 2.9-25.7% for the same resource usage, or achieves the same accuracy with 2-3.7× lower resource costs.
UR - http://www.scopus.com/inward/record.url?scp=85194198150&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85194198150&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85194198150
T3 - Proceedings of the 21st USENIX Symposium on Networked Systems Design and Implementation, NSDI 2024
SP - 549
EP - 568
BT - Proceedings of the 21st USENIX Symposium on Networked Systems Design and Implementation, NSDI 2024
PB - USENIX Association
Y2 - 16 April 2024 through 18 April 2024
ER -