Resurrecting a Dinosaur - The Adaptation of Clarence Barlow's Legacy Software Autobusk
Bibtex
Abstract
Georg Hajdu
This paper describes efforts to conserve and further develop the legacy real-time generative music program AUTOBUSK by Clarence Barlow. We present a case study demonstrating that a simple port of 30+-year-old code may not suffice to infuse new life into a project that suffered from the abandonment of the hardware it was developed on. In the process of resurrecting this dinosaur, AUTOBUSK was entirely redesigned for the popular music software environments Max and Ableton Live (via Max for Live) and renamed DJster. It comes in several incarnations, the most recent ones being DJster Autobus for Ableton Live, a device for real-time event generation, and DJster Autobus Scorepion, a plugin for the MaxScore Editor. These incarnations take advantage of being embedded in current environments running on modern operating systems and have since acquired new and useful features. As AUTOBUSK/DJster is based on universal musical principles, which Barlow formalized during the 1970s while working on his generative piano piece Çoǧluotobüsişletmesi, its algorithms are of general applicability for composers and performers working in fields as diverse as microtonality, interactive installations and film music. It has therefore inspired us to lay the foundations of a shorthand notation, which we discuss in the last section.
@inproceedings{Hajdu_tenor2016,
Address = {Cambridge, UK},
Author = {Georg Hajdu},
Title = {Resurrecting a Dinosaur - The Adaptation of Clarence Barlow's Legacy Software Autobusk},
Booktitle = {Proceedings of the International Conference on Technologies for Music Notation and Representation - TENOR2016},
Pages = {181--186},
Year = {2016},
Editor = {Richard Hoadley and Chris Nash and Dominique Fober},
Publisher = {Anglia Ruskin University},
ISBN = {978-0-9931461-1-4}
}
Hexaphonic Guitar Transcription and Visualization
Bibtex
Abstract
Iñigo Angulo, Sergio Giraldo and Rafael Ramirez
Music representation has been a widely researched topic for centuries. Transcription through the conventional notation system has dominated the field for the better part of that time. However, this notational system often falls short of communicating the essence of music, especially to people with no musical training. Advances in signal processing and computer science over the last few decades have bridged this gap to an extent, but conveying the meaning of music remains a challenging research field. Music visualization is one such bridge, which we explore in this paper. We present an approach to visually represent music produced by a guitar. To achieve this, hexaphonic guitar processing is carried out (i.e. each of the six strings is processed as an independent monophonic sound source). Once this information is obtained, different approaches for representing it visually are explored. As a final result, a system is proposed to enrich the musical listening experience by extending the perceived auditory sensations to include visual stimuli.
@inproceedings{Angulo_tenor2016,
Address = {Cambridge, UK},
Author = {Iñigo Angulo and Sergio Giraldo and Rafael Ramirez},
Title = {Hexaphonic Guitar Transcription and Visualization},
Booktitle = {Proceedings of the International Conference on Technologies for Music Notation and Representation - TENOR2016},
Pages = {187--192},
Year = {2016},
Editor = {Richard Hoadley and Chris Nash and Dominique Fober},
Publisher = {Anglia Ruskin University},
ISBN = {978-0-9931461-1-4}
}
Designing Dynamic Networked Scores to Enhance the Experience of Ensemble Music Making
Bibtex
Abstract
Alice Eldridge, Ed Hughes and Chris Kiefer
This paper describes the impetus for, and design and evaluation of, a pilot project examining the potential for digital, dynamic networked scores to enhance the experience of ensemble music making. We present a new networked score presentation system, and describe how it has evolved through a participatory design approach. Feedback has highlighted key issues concerning synchronisation between conductor, performers and notation, and autonomy and adaptation for performers; we discuss these key points and present our future plans for the project.
@inproceedings{Eldridge_tenor2016,
Address = {Cambridge, UK},
Author = {Alice Eldridge and Ed Hughes and Chris Kiefer},
Title = {Designing Dynamic Networked Scores to Enhance the Experience of Ensemble Music Making},
Booktitle = {Proceedings of the International Conference on Technologies for Music Notation and Representation - TENOR2016},
Pages = {193--199},
Year = {2016},
Editor = {Richard Hoadley and Chris Nash and Dominique Fober},
Publisher = {Anglia Ruskin University},
ISBN = {978-0-9931461-1-4}
}
Conversion from Standard MIDI Files to Vertical Line Notation Scores and Automatic Decision of Piano Fingering for Beginners
Bibtex
Abstract
Yasuyuki Saito, Eita Nakamura, Riku Sato, Suguru Agata, Yuu Igarashi and Shigeki Sagayama
This paper introduces "vertical line notation" (VLN) for piano beginners, a method for converting standard MIDI files to VLN scores, and an algorithm for the automatic decision of piano fingering. Currently, staff notation is widely used for various instruments, including the piano. However, this notation often appears hard to beginners. VLN, on the other hand, is intuitive and easy for piano beginners to understand, since it graphically indicates the time order of notes as well as fingering. VLN scores are thus expected to help piano beginners make smooth progress with correct fingering. An issue with VLN is that it is currently created by hand with spreadsheet software; it would be desirable to produce VLN scores automatically from existing digital scores. In this paper, we propose a method of converting standard MIDI files into VLN scores and an algorithm for automatic fingering decision for piano beginners. Some examples of practical and successful use of VLN scores are shown.
@inproceedings{Saito_tenor2016,
Address = {Cambridge, UK},
Author = {Yasuyuki Saito and Eita Nakamura and Riku Sato and Suguru Agata and Yuu Igarashi and Shigeki Sagayama},
Title = {Conversion from Standard MIDI Files to Vertical Line Notation Scores and Automatic Decision of Piano Fingering for Beginners},
Booktitle = {Proceedings of the International Conference on Technologies for Music Notation and Representation - TENOR2016},
Pages = {200--211},
Year = {2016},
Editor = {Richard Hoadley and Chris Nash and Dominique Fober},
Publisher = {Anglia Ruskin University},
ISBN = {978-0-9931461-1-4}
}
Taxonomy and Notation of Spatialization
Bibtex
Abstract
Emile Ellberger, Germán Toro Pérez, Linda Cavaliero, Johannes Schuett, Basile Zimmermann and Giorgio Zoia
The SSMN Spatial Taxonomy and its symbol libraries, which are the cornerstone of the Spatialization Symbolic Music Notation (SSMN) project, emanate from research into composers' attitudes in this domain. The taxonomy was conceived as the basis for the development of dedicated notation and rendering tools within the SSMN project. It is a systematic representation of all relevant features necessary to specify sound spatiality: the shape and acoustic quality of the space, and the structure, position and movement of sound sources. It is based on single descriptors that can be combined to define complex spatial configurations. Descriptors can be transformed locally and globally and can be the object of structural and behavioral operations. The SSMN Spatial Taxonomy proposes a corresponding graphic symbolic representation of descriptors, operations and other functional elements, facilitating the communication of creative ideas to performers and technical assistants. This paper focuses on the presentation of the taxonomy and the symbols. Additionally, it describes the workflow proposed for using symbols inside a notation software prototype developed within the project. Finally, further aspects concerning current and future developments of SSMN are mentioned.
@inproceedings{Ellberger_tenor2016,
Address = {Cambridge, UK},
Author = {Emile Ellberger and Germán Toro Pérez and Linda Cavaliero and Johannes Schuett and Basile Zimmermann and Giorgio Zoia},
Title = {Taxonomy and Notation of Spatialization},
Booktitle = {Proceedings of the International Conference on Technologies for Music Notation and Representation - TENOR2016},
Pages = {212--219},
Year = {2016},
Editor = {Richard Hoadley and Chris Nash and Dominique Fober},
Publisher = {Anglia Ruskin University},
ISBN = {978-0-9931461-1-4}
}
Music Analysis Through Visualization
Bibtex
Abstract
Jia Li
In this paper, analytic visualizations are used to selectively highlight salient musical features in four modern compositions, focusing on micro or macro structures: from motivic pitch contour to large-scale form. At a glance, these visualizations allow a quick grasp of the structure and help listeners make connections between local features and global trends. Textures obscured by musical notation, such as broad registral shifts, polyphonic streaming and the interplay between instruments, become more apparent when displayed in a graphical format. Pitch, timbre and voicing are plotted against time to show large-scale patterns that would otherwise be difficult to recognize in a musical score or to compare between different works. Music analysis through compositional data visualization makes sense not only to musicians but also to non-musicians, facilitating collaboration and exchange with artists and technicians in other media.
@inproceedings{Li_tenor2016,
Address = {Cambridge, UK},
Author = {Jia Li},
Title = {Music Analysis Through Visualization},
Booktitle = {Proceedings of the International Conference on Technologies for Music Notation and Representation - TENOR2016},
Pages = {220--225},
Year = {2016},
Editor = {Richard Hoadley and Chris Nash and Dominique Fober},
Publisher = {Anglia Ruskin University},
ISBN = {978-0-9931461-1-4}
}
Notation as Temporal Instrument
Bibtex
Abstract
Eric Maestri
In this paper, the author proposes a descriptive musicological framework built on the notion of notation as a temporal instrument in today's context of electronic music. The principal goal is to discuss a categorization of musical notation that considers the performative character of musical writing in electronic music performance. In the author's intention, this framework could summarize the multiple enhancements of the temporal dimension of notation implied by the new means of performance in electronic music.
@inproceedings{Maestri_tenor2016,
Address = {Cambridge, UK},
Author = {Eric Maestri},
Title = {Notation as Temporal Instrument},
Booktitle = {Proceedings of the International Conference on Technologies for Music Notation and Representation - TENOR2016},
Pages = {226--229},
Year = {2016},
Editor = {Richard Hoadley and Chris Nash and Dominique Fober},
Publisher = {Anglia Ruskin University},
ISBN = {978-0-9931461-1-4}
}
Visual Confusion in Piano Notation
Bibtex
Abstract
Marion Wood
This series of reaction-time experiments investigates how quickly notes can be read from a screen and immediately executed on a MIDI keyboard. This makes it possible to study pitch reading and motor coordination in considerable detail, away from the customary confounds of rhythm reading or pulse entrainment. The first experiment found that reaction times were slower in extreme keys (3#, 4#, 3b, 4b), even for very experienced sight-readers; it also found a large effect of clef in most individuals, and other results suggesting that, in this simple paradigm at least, reading notation presents more of a difficulty to execution than motor coordination does. A second experiment additionally found an effect of the order in which the notes were presented. A clarified form of notation was devised that disambiguates visual confusion across key signatures and, to some extent, across clefs. Initial results from an experiment contrasting traditional noteheads with the clearer ones found substantial improvements in both reaction time and accuracy for the clarified notation. Possible applications of improved notation to the wider field of piano playing are discussed.
@inproceedings{Wood_tenor2016,
Address = {Cambridge, UK},
Author = {Marion Wood},
Title = {Visual Confusion in Piano Notation},
Booktitle = {Proceedings of the International Conference on Technologies for Music Notation and Representation - TENOR2016},
Pages = {230--239},
Year = {2016},
Editor = {Richard Hoadley and Chris Nash and Dominique Fober},
Publisher = {Anglia Ruskin University},
ISBN = {978-0-9931461-1-4}
}
From Transcription to Signal Representation: Pitch, Rhythm and Performance
Bibtex
Abstract
Marie Tahon and Pierre-Eugène Sitchet
Musical transcription is a real challenge, especially in a folk music context. Signal visualization tools could be of interest for such music. The present paper compares a musical transcription with two signal representations (pitch and rhythm) of a song from the Gwoka repertoire. The study aims at finding similarities and differences in pitch, rhythm and performance features between the transcription and the signal visualization. Signal visualization is founded on vowel segmentation and the extraction of pitch and duration information. On the one hand, the transcription gives general characteristics of the music (harmony, tonality and rhythmic structure); on the other hand, signal visualization gives performance-related characteristics. The main conclusion is that both approaches are of great interest for understanding such music.
@inproceedings{Tahon_tenor2016,
Address = {Cambridge, UK},
Author = {Marie Tahon and Pierre-Eugène Sitchet},
Title = {From Transcription to Signal Representation: Pitch, Rhythm and Performance},
Booktitle = {Proceedings of the International Conference on Technologies for Music Notation and Representation - TENOR2016},
Pages = {240--245},
Year = {2016},
Editor = {Richard Hoadley and Chris Nash and Dominique Fober},
Publisher = {Anglia Ruskin University},
ISBN = {978-0-9931461-1-4}
}