
Fluently remixing musical objects with higher-order functions

Research output: Contribution to conference - Without ISBN/ISSN › Conference paper › peer-review

Published

Standard

Fluently remixing musical objects with higher-order functions. / Lindsay, Adam T.; Hutchison, David.
2009. pp. 422-429. Paper presented at 12th International Conference on Digital Audio Effects, DAFx 2009, Como, Italy.


Harvard

Lindsay, AT & Hutchison, D 2009, 'Fluently remixing musical objects with higher-order functions', Paper presented at 12th International Conference on Digital Audio Effects, DAFx 2009, Como, Italy, 1/09/09 - 4/09/09, pp. 422-429. <http://dafx.de/paper-archive/details.php?id=QDZBlfHPNL74NMZqF1iZUQ>

APA

Lindsay, A. T., & Hutchison, D. (2009). Fluently remixing musical objects with higher-order functions. 422-429. Paper presented at 12th International Conference on Digital Audio Effects, DAFx 2009, Como, Italy. http://dafx.de/paper-archive/details.php?id=QDZBlfHPNL74NMZqF1iZUQ

Vancouver

Lindsay AT, Hutchison D. Fluently remixing musical objects with higher-order functions. 2009. Paper presented at 12th International Conference on Digital Audio Effects, DAFx 2009, Como, Italy.

Author

Lindsay, Adam T. ; Hutchison, David. / Fluently remixing musical objects with higher-order functions. Paper presented at 12th International Conference on Digital Audio Effects, DAFx 2009, Como, Italy. 8 p.

Bibtex

@conference{3ee1ec969d7843a3a12abf5c85ad4ebd,
title = "Fluently remixing musical objects with higher-order functions",
abstract = "Soon after the Echo Nest Remix API was made publicly available and open source, the primary author began aggressively enhancing the Python framework for re-editing music based on perceptually-based musical analyses. The basic principles of this API - integrating content-based metadata with the underlying signal - are described in the paper, then the authors' enhancements are described. The libraries moved from supporting an imperative coding style to incorporating influences from functional programming and domain specific languages to allow for a much more fluent, terse coding style, allowing users to concentrate on the functions needed to find the portions of the song that were interesting, and modifying them. The paper then goes on to describe enhancements involving mixing multiple sources with one another and enabling user-created and user-modifiable effects that are controlled by direct manipulation of the objects that represent the sound. Revelations that the Remix API does not need to be as integrated as it currently is point to future directions for the API at the end of the paper.",
author = "Lindsay, {Adam T.} and David Hutchison",
year = "2009",
month = dec,
day = "1",
language = "English",
pages = "422--429",
note = "12th International Conference on Digital Audio Effects, DAFx 2009 ; Conference date: 01-09-2009 Through 04-09-2009",

}

RIS

TY - CONF

T1 - Fluently remixing musical objects with higher-order functions

AU - Lindsay, Adam T.

AU - Hutchison, David

PY - 2009/12/1

Y1 - 2009/12/1

N2 - Soon after the Echo Nest Remix API was made publicly available and open source, the primary author began aggressively enhancing the Python framework for re-editing music based on perceptually-based musical analyses. The basic principles of this API - integrating content-based metadata with the underlying signal - are described in the paper, then the authors' enhancements are described. The libraries moved from supporting an imperative coding style to incorporating influences from functional programming and domain specific languages to allow for a much more fluent, terse coding style, allowing users to concentrate on the functions needed to find the portions of the song that were interesting, and modifying them. The paper then goes on to describe enhancements involving mixing multiple sources with one another and enabling user-created and user-modifiable effects that are controlled by direct manipulation of the objects that represent the sound. Revelations that the Remix API does not need to be as integrated as it currently is point to future directions for the API at the end of the paper.

AB - Soon after the Echo Nest Remix API was made publicly available and open source, the primary author began aggressively enhancing the Python framework for re-editing music based on perceptually-based musical analyses. The basic principles of this API - integrating content-based metadata with the underlying signal - are described in the paper, then the authors' enhancements are described. The libraries moved from supporting an imperative coding style to incorporating influences from functional programming and domain specific languages to allow for a much more fluent, terse coding style, allowing users to concentrate on the functions needed to find the portions of the song that were interesting, and modifying them. The paper then goes on to describe enhancements involving mixing multiple sources with one another and enabling user-created and user-modifiable effects that are controlled by direct manipulation of the objects that represent the sound. Revelations that the Remix API does not need to be as integrated as it currently is point to future directions for the API at the end of the paper.

M3 - Conference paper

AN - SCOPUS:84872704676

SP - 422

EP - 429

T2 - 12th International Conference on Digital Audio Effects, DAFx 2009

Y2 - 1 September 2009 through 4 September 2009

ER -
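The abstract describes moving from an imperative coding style to a fluent, functional one, where users pass higher-order functions that select the "interesting" portions of a song and modify them. As a minimal illustration of that pattern (not the actual Echo Nest Remix API: the `Beat` class and `remix` function below are hypothetical stand-ins for its analysis objects), a remix step can be expressed as a predicate plus a transform applied over a list of perceptual segments:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Beat:
    """A hypothetical perceptual segment: onset, length, and loudness."""
    start: float      # onset time in seconds
    duration: float   # length in seconds
    loudness: float   # loudness estimate in dB

def remix(beats: List[Beat],
          select: Callable[[Beat], bool],
          transform: Callable[[Beat], Beat]) -> List[Beat]:
    """Keep only the beats matching `select`, then apply `transform` to each.

    The caller supplies both behaviours as functions, so the remix logic
    itself stays a one-line, declarative pipeline.
    """
    return [transform(b) for b in beats if select(b)]

# A toy "analysis": four beats of varying loudness.
song = [Beat(0.0, 0.5, -12.0), Beat(0.5, 0.5, -3.0),
        Beat(1.0, 0.5, -15.0), Beat(1.5, 0.5, -2.0)]

# "Find the interesting portions and modify them":
# keep the loud beats and halve their duration.
loud_halved = remix(
    song,
    select=lambda b: b.loudness > -10.0,
    transform=lambda b: Beat(b.start, b.duration / 2, b.loudness),
)
# loud_halved now holds the beats at 0.5 s and 1.5 s, each shortened to 0.25 s.
```

The point of the style, as the abstract puts it, is that the user concentrates on the selection and modification functions; sequencing and signal handling stay inside the framework.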