TY - GEN
T1 - Floo: Automatic, Lightweight Memoization for Faster Mobile Apps
T2 - 20th ACM International Conference on Mobile Systems, Applications and Services, MobiSys 2022
AU - Ramanujam, Murali
AU - Chen, Helen
AU - Mardani, Shaghayegh
AU - Netravali, Ravi
N1 - Funding Information:
Acknowledgements: We thank Harsha Madhyastha, Amit Levy, Anirudh Sivaraman, and Aishwarya Sivaraman for their valuable feedback on earlier drafts of the paper. We also thank our shepherd, Alec Wolman, and the anonymous MobiSys reviewers for their constructive comments. This work was supported in part by NSF grants CNS-2140552, CNS-2151630, and CNS-2152313, as well as a Sloan Research Fellowship.
Publisher Copyright:
© 2022 Owner/Author.
PY - 2022/6/27
Y1 - 2022/6/27
N2 - Owing to growing feature sets and sluggish improvements to smartphone CPUs (relative to mobile networks), mobile app response times have increasingly become bottlenecked on client-side computations. In designing a solution to this emerging issue, our primary insight is that app computations exhibit substantial stability over time in that they are entirely performed in rarely-updated codebases within app binaries and the OS. Building on this, we present Floo, a system that aims to automatically reuse (or memoize) computation results during app operation in an effort to reduce the amount of compute needed to handle user interactions. To ensure practicality - the struggle with any memoization effort - in the face of limited mobile device resources and the short-lived nature of each app computation, Floo embeds several new techniques that collectively enable it to mask cache lookup overheads and ensure high cache hit rates, all the while guaranteeing correctness for any reused computations. Across a wide range of apps, live networks, phones, and interaction traces, Floo reduces median and 95th percentile interaction response times by 32.7% and 72.3%.
KW - caching
KW - memoization
KW - mobile apps
KW - performance
KW - smartphones
UR - http://www.scopus.com/inward/record.url?scp=85134046011&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85134046011&partnerID=8YFLogxK
U2 - 10.1145/3498361.3538929
DO - 10.1145/3498361.3538929
M3 - Conference contribution
AN - SCOPUS:85134046011
T3 - MobiSys 2022 - Proceedings of the 2022 20th Annual International Conference on Mobile Systems, Applications and Services
SP - 168
EP - 182
BT - MobiSys 2022 - Proceedings of the 2022 20th Annual International Conference on Mobile Systems, Applications and Services
PB - Association for Computing Machinery, Inc
Y2 - 27 June 2022 through 1 July 2022
ER -