Deliverable 5.5 Annotation Tools

Author
Abstract
In this deliverable we describe the requirements and development plan for tools to collect and
store annotations of the music content that we work with in TROMPA. These annotations will be
used in the project to enrich the music that we are studying in TROMPA, and to help us improve
algorithms for the automatic analysis and annotation of this music. We present the design for a
graphical interface for creating the types of annotations that are required in the use cases being
produced in WP 6. The individual use cases can embed the annotation tools in the applications that
they are developing to provide annotation functionality. The annotations will be stored in the
Contributor Environment (CE), which is being built as part of Deliverable 5.1. We will interact with the crowd
planning strategies (Deliverable 4.4) to determine the best way of soliciting annotations from users,
depending on the skills of the users and the requirements of a particular annotation task.
In TROMPA, we plan to collect user annotations in the use cases for performers, singers, music
enthusiasts, and music scholars. In this deliverable we focus on annotations of music recordings for
the first three user groups, but we will also design our annotation storage system to support other
kinds of annotations, such as annotations of music scores, video recordings, or images, so that other
use cases can also utilise the framework that we are developing. We will focus on a closed set of
annotation types, including free-form annotation labels, annotations with a closed vocabulary of
labels, and ratings of items in the TROMPA corpus. Annotations can apply to a single item, or to a
segment of an item (e.g., between two timestamps). Some annotations can apply to more than one
item or segment (e.g., the similarity between two recordings). We discuss these requirements in
Section 2.
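As a rough illustration, these annotation types and targets could be sketched as follows; the type
names used here are illustrative only and are not taken from the deliverable's actual data model:

    // Illustrative sketch only: names are not the deliverable's actual model.
    type ItemTarget = { itemId: string };                                 // a whole item in the CE
    type SegmentTarget = { itemId: string; start: number; end: number }; // a time region, in seconds
    type Target = ItemTarget | SegmentTarget;

    type Annotation =
      | { kind: "freeText"; targets: Target[]; text: string }            // free-form label
      | { kind: "vocabularyLabel"; targets: Target[]; term: string }     // label from a closed vocabulary
      | { kind: "rating"; targets: Target[]; value: number };            // rating on a fixed scale

    // An annotation that applies to more than one item or segment (e.g. the
    // similarity between two recordings) simply lists multiple entries in `targets`.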
The annotation interface will be web-based, so that it can be embedded in web
applications. It will provide users with the ability to view audio recordings, listen to them, select
segments of recordings, and apply an annotation to a selection. Where possible we plan to reuse
existing tools that have been developed within the Music Information Retrieval field. Annotations
will be stored in the CE. We will store annotations using the Web Annotations model, an existing
open standard for annotations which has been used in previous research by some TROMPA
consortium members. Web Annotations give us a standardised way of recording that an annotation of a
certain type was applied by a certain person to an item that exists in the CE. We will extend the data model
and API of the CE to support these annotations. The annotation interfaces and storage are described
in Section 3.
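For example, a label applied to a time segment of a recording might be expressed as a Web
Annotation along the following lines; the identifiers, URLs and label value shown here are
placeholders rather than actual CE data:

    // A minimal sketch of a Web Annotation applying a label to a segment of a
    // recording. All identifiers and URLs are placeholders.
    const exampleAnnotation = {
      "@context": "http://www.w3.org/ns/anno.jsonld",
      type: "Annotation",
      creator: "https://example.com/user/annotator-1",    // placeholder user identifier
      motivation: "tagging",
      body: {
        type: "TextualBody",
        value: "chorus",                                  // the label being applied
      },
      target: {
        source: "https://example.com/ce/recording/1234",  // placeholder CE item
        selector: {
          type: "FragmentSelector",
          conformsTo: "http://www.w3.org/TR/media-frags/",
          value: "t=30,45",                               // seconds 30 to 45 of the recording
        },
      },
    };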
In collaboration with Deliverable 4.4 (Hybrid annotation workflows) we will develop strategies to
determine which types of annotations are the most important to solicit from users for the tasks that
we want to perform in TROMPA. Different types of annotations will be used for different purposes,
from the training and evaluation of automatic music description algorithms, to the enrichment of
musical content, allowing people to learn more about the music recordings that we store in the CE.
These strategies will allow us to choose the most important type of annotation to
solicit from a user in order to maximise the benefit to other users and algorithms, and to request that
those annotations are performed first. We outline this collaboration in Section 4.
Year of Publication
2019
Report Number
TR-D5.5-Annotation Tools v1
URL
https://trompamusic.eu/deliverables/TR-D5.5-Annotation_Tools_v1.pdf
Short Title
D5.5