We present a novel method for generating performance-driven, "hand-drawn" animation in real time. Given an annotated set of hand-drawn faces for various expressions, our algorithm performs multi-way morphs to generate real-time animation that mimics the expressions of a user. Our system consists of a vision-based tracking component and a rendering component. Together, they form an animation system that can be used in a variety of applications, including teleconferencing, multi-user virtual worlds, compressed instructional videos, and consumer-oriented animation kits. This paper describes our algorithms in detail and illustrates the potential for this work in a teleconferencing application. Experience with our implementation suggests several advantages of our hand-drawn characters over other alternatives: (1) flexibility of animation style; (2) increased compression of expression information; and (3) masking of errors made by the face-tracking system that are distracting in photorealistic animations.
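To make the idea of a multi-way morph concrete, the following is a minimal sketch, not the paper's implementation: each hand-drawn face is annotated with corresponding feature points, and a morph blends those points with convex weights (which, in the full system, would be driven by the tracked expression). The function name and data layout here are illustrative assumptions.

```python
def multiway_morph(drawings, weights):
    """Blend N annotated drawings, each given as a list of (x, y)
    feature points that correspond across drawings, using convex
    weights that sum to 1. Returns the blended point positions."""
    assert len(drawings) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    n_points = len(drawings[0])
    blended = []
    for i in range(n_points):
        # Weighted average of the i-th feature point across all drawings.
        x = sum(w * d[i][0] for d, w in zip(drawings, weights))
        y = sum(w * d[i][1] for d, w in zip(drawings, weights))
        blended.append((x, y))
    return blended

# Hypothetical example: two mouth-corner points from a "neutral"
# drawing and a "smile" drawing, blended half-and-half.
neutral = [(0.0, 0.0), (1.0, 0.0)]
smile = [(0.0, 0.25), (1.0, 0.25)]
print(multiway_morph([neutral, smile], [0.5, 0.5]))
```

With more than two input drawings, the same weighted average extends naturally to blending several expressions at once, which is what distinguishes a multi-way morph from a simple pairwise one.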