TY - GEN
T1 - Triggering artwork swaps for live animation
AU - Willett, Nora S.
AU - Li, Wilmot
AU - Popović, Jovan
AU - Finkelstein, Adam
N1 - Publisher Copyright:
© 2017 Association for Computing Machinery.
PY - 2017/10/20
Y1 - 2017/10/20
N2 - Live animation of 2D characters is a new form of storytelling that has started to appear on streaming platforms and broadcast TV. Unlike traditional animation, human performers control characters in real time so that they can respond and improvise to live events. Current live animation systems provide a range of animation controls, such as camera input to drive head movements, audio for lip sync, and keyboard shortcuts to trigger discrete pose changes via artwork swaps. However, managing all of these controls during a live performance is challenging. In this work, we present a new interactive system that specifically addresses the problem of triggering artwork swaps in live settings. Our key contributions are the design of a multi-touch triggering interface that overlays visual triggers around a live preview of the character, and a predictive triggering model that leverages practice performances to suggest pose transitions during live performances. We evaluate our system with quantitative experiments, a user study with novice participants, and interviews with professional animators.
KW - 2D animation
KW - Live performance
UR - http://www.scopus.com/inward/record.url?scp=85041492710&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85041492710&partnerID=8YFLogxK
U2 - 10.1145/3126594.3126596
DO - 10.1145/3126594.3126596
M3 - Conference contribution
AN - SCOPUS:85041492710
T3 - UIST 2017 - Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology
SP - 85
EP - 95
BT - UIST 2017 - Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology
PB - Association for Computing Machinery, Inc
T2 - 30th Annual ACM Symposium on User Interface Software and Technology, UIST 2017
Y2 - 22 October 2017 through 25 October 2017
ER -