A meta-instrument for interactive, on-the-fly machine learning

Rebecca Fiebrink, Dan Trueman, Perry R. Cook

Research output: Contribution to journal › Conference article › peer-review

86 Scopus citations

Abstract

Supervised learning methods have long been used to allow musical interface designers to generate new mappings by example. We propose a method for harnessing machine learning algorithms within a radically interactive paradigm, in which the designer may repeatedly generate examples, train a learner, evaluate outcomes, and modify parameters in real-time within a single software environment. We describe our meta-instrument, the Wekinator, which allows a user to engage in on-the-fly learning using arbitrary control modalities and sound synthesis environments. We provide details regarding the system implementation and discuss our experiences using the Wekinator for experimentation and performance.
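The example/train/evaluate/refine workflow described in the abstract can be pictured as a small interactive mapping loop. The sketch below is purely illustrative and is not the Wekinator's implementation; it assumes scikit-learn and NumPy, and the class, method, and parameter names (OnTheFlyMapper, add_example, map) are hypothetical.

```python
# Illustrative sketch only (not the Wekinator's actual code): an interactive
# "give examples -> train -> run -> refine" loop that maps controller features
# to synthesis parameters. Assumes scikit-learn and NumPy are installed.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor


class OnTheFlyMapper:
    """Accumulates training examples and retrains a regressor on demand."""

    def __init__(self, n_neighbors=3):
        self.n_neighbors = n_neighbors
        self.inputs, self.outputs = [], []   # paired example buffers
        self.model = None

    def add_example(self, control_features, synth_params):
        # Record one (controller state -> desired sound parameters) pair.
        self.inputs.append(np.asarray(control_features, dtype=float))
        self.outputs.append(np.asarray(synth_params, dtype=float))

    def train(self):
        # Retrain from scratch on all examples collected so far.
        k = min(self.n_neighbors, len(self.inputs))
        self.model = KNeighborsRegressor(n_neighbors=k)
        self.model.fit(np.vstack(self.inputs), np.vstack(self.outputs))

    def map(self, control_features):
        # Run-time mapping: controller features in, synthesis parameters out.
        if self.model is None:
            raise RuntimeError("train() must be called before map()")
        x = np.asarray(control_features, dtype=float).reshape(1, -1)
        return self.model.predict(x)[0]


# Usage: the designer demonstrates a few gestures, trains, then plays;
# unsatisfactory results can be fixed by adding examples and retraining.
mapper = OnTheFlyMapper()
mapper.add_example([0.1, 0.2], [440.0, 0.3])   # gesture -> (freq, amplitude)
mapper.add_example([0.9, 0.8], [880.0, 0.9])
mapper.train()
print(mapper.map([0.5, 0.5]))                   # interpolated synth parameters
```

In this sketch the regressor is simply rebuilt on every call to train(), which mirrors the tight generate-train-evaluate cycle the abstract describes; the actual control modalities and sound synthesis environments are external to the learner.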

Original language: English (US)
Pages (from-to): 280-285
Number of pages: 6
Journal: Proceedings of the International Conference on New Interfaces for Musical Expression
State: Published - 2009
Event: 9th International Conference on New Interfaces for Musical Expression, NIME 2009 - Pittsburgh, United States
Duration: Jun 4, 2009 - Jun 6, 2009

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Signal Processing
  • Instrumentation
  • Music
  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Science Applications

Keywords

  • Machine learning
  • Mapping
  • Tools
