Taxonomy of Orchestral Effects Related to Auditory Grouping


-- Section under construction --

(private page)

Score Analysis:

Methods: Analysis of orchestration effects

  • The score analysis involved creating annotations.

  • In Music Information Retrieval (MIR), human annotations of audio files are considered the gold standard, but researchers acknowledge that humans are neither infallible nor always in agreement, which affects the reliability of the annotations

  • Peeters and Fort (2012) outline several reasons why human annotations fall short of this “gold standard”, including:

    • Concepts may not be defined clearly

    • Concepts may not fit the content of the audio file

    • There may be several plausible possibilities for a particular annotation

    • Annotators may lose concentration

  • To mitigate these issues, cross-validation is common practice:

    • Correction by a second annotator

    • Same file annotated by two annotators and then compared

    • Decisions about the reliability of the annotation, guided by the degree of inter-annotator agreement:

        • Sufficiently reliable?

        • Need for reanalysis?

        • Definitions/rules need to be modified?
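Inter-annotator agreement in settings like this is often quantified with a chance-corrected statistic. A minimal sketch using Cohen's kappa for two annotators labelling the same passages; the label names and data below are hypothetical, not drawn from the actual corpus:

```python
# Cohen's kappa: agreement between two annotators, corrected for the
# agreement expected by chance given each annotator's label frequencies.
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators labelled identically
    p_observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement from each annotator's label frequencies
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    p_expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Two annotators label six passages with hypothetical device categories
a = ["blend", "blend", "contrast", "blend", "stream", "stream"]
b = ["blend", "contrast", "contrast", "blend", "stream", "blend"]
print(round(cohens_kappa(a, b), 3))  # → 0.478
```

A kappa near 1 indicates strong agreement; values near 0 indicate agreement no better than chance, which would suggest the need for reanalysis or revised definitions.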



  • Team of annotators (composers, music theorists, musicologists) worked in pairs

  • Individually

    • Examine the score and listen to recording(s)

    • Identify salient examples of orchestral effects according to predefined taxonomy

    • Annotate on PDF version of the score

    • Upload files to server

  • Pairs

    • Compare analyses and develop a joint analysis by consensus

    • Catalogue details (spreadsheet/Google Form)

    • Upload new files to server

  • Group meetings

    • Shared analyses, discussed issues, refined analysis categories

  • Rotated partnerships to avoid developing idiosyncratic analyses within teams

  • Verification stage

    • A new team of annotators repeated the procedure with movements that had previously been analyzed by other teams

    • Concordances and divergences were logged and coded to find themes and patterns

    • Result: definitions/rules needed to be modified for consistency

  • Editing and review stage

    • Update analyses for comprehensiveness

    • Determine whether the analyses are sufficiently reliable or need reanalysis

To catalogue the annotations, we record the span of measures in which the orchestral device occurs, its start and end timings in the recording, the type of device, the instrumentation and the roles of the instruments, and a description of the device.

We also incorporated strength ratings based on the recordings so we could conduct analyses later to explore how performance nuances can contribute to the perception of the device.
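The catalogued fields above can be sketched as a simple record type; the field names and types here are illustrative assumptions, not the actual ORCHARD schema:

```python
# A hypothetical record for one catalogued orchestral-device annotation,
# based on the fields described above (not the actual ORCHARD schema).
from dataclasses import dataclass

@dataclass
class DeviceAnnotation:
    device_type: str       # type of orchestral device, e.g. a blend
    measure_start: int     # span of measures in which the device occurs
    measure_end: int
    time_start: float      # start/end timings in the recording, in seconds
    time_end: float
    instrumentation: dict  # instrument name -> role within the device
    description: str       # free-text description of the device
    strength: int = 0      # strength rating based on the recording

# A hypothetical annotation, not drawn from the actual corpus
ann = DeviceAnnotation(
    device_type="blend",
    measure_start=12, measure_end=16,
    time_start=34.2, time_end=47.8,
    instrumentation={"horn": "melody", "bassoon": "doubling"},
    description="Horn doubled by bassoon",
    strength=3,
)
```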

It proved difficult to standardize the information with spreadsheets, so we opted for a Google Form in phase 2. Annotators were given a code for each movement, so the entered information was controlled.

In total, 85 full movements were analyzed (1787-1943, plus 2004), yielding 4439 annotations of orchestral devices.

We started with the Romantic era, where these devices were most likely to be found, then branched out into the Classical period (Mozart, Haydn, Beethoven) and the early 20th century (Debussy, Vaughan Williams). We intend to continue filling gaps in terms of historical epochs as well as nationalities. In doing so, we anticipate that the analyses will become more complicated and that our theory will need to evolve in response to genres in which timbral composition became an aim in and of itself.

We integrated these analyses into our new Orchestration Analysis and Research Database (ORCHARD), developed with Alistair Russell.

It currently contains over 4400 annotations from 85 orchestral movements drawn from the classical period to the early 20th century.

In terms of design, it is an object-relational database that uses Apache Solr’s indexing capabilities for fast searching from the web application interface.

Currently, we can perform simple and complex queries, as well as view annotated scores while streaming musical clips. We are also developing the ability for annotators to upload future analyses directly into the collection with a computer-assisted score annotation system in MEI format.

You will be able to browse the analyses, conduct a simple keyword search, and create complex queries.

The query builder lets you specify the parameters in a hierarchical format and sort them hierarchically as well.

It looks complex, but if you understand Boolean operators it is very powerful.

Here, we are looking for either of two types of effects with a strength >= 3

AND composer (Mahler OR Mussorgsky)

AND instrumentation containing (trumpet AND clarinet) OR (trombone AND bassoon)
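Since the database is indexed with Solr, the example query above might read roughly as follows in Lucene/Solr query syntax; the field names (effect, strength, composer, instruments) and the two effect types are assumptions, not the actual ORCHARD schema:

```python
# Build a hypothetical Solr-style Boolean query for the example above.
def build_query():
    # Either of two (hypothetical) effect types, with strength >= 3
    effects = "(effect:blend OR effect:segregation) AND strength:[3 TO *]"
    # Either composer
    composers = "(composer:Mahler OR composer:Mussorgsky)"
    # Either instrument pairing
    instruments = ("((instruments:trumpet AND instruments:clarinet)"
                   " OR (instruments:trombone AND instruments:bassoon))")
    return f"({effects}) AND {composers} AND {instruments}"

print(build_query())
```

The parenthesization mirrors the hierarchy specified in the query builder, so each OR group is evaluated before being combined with AND.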

A list of annotations will appear, sorted according to the sort hierarchy.

We are developing tools to export these results.