h1. WP1.2 Online Training Material
(back to [[Wiki]])
{{>toc}}
h2. C4DM Researcher use cases
h3. Quantitative testing - machine testing
bq. A common use-case in C4DM research is to run a newly-developed analysis algorithm on a set of audio examples and evaluate the algorithm by comparing its output with that of a human annotator. Results are then compared with published results using the same input data to determine whether the newly proposed approach makes any improvement on the state of the art.
Data involved includes:
* Software for the algorithm (which can be hosted on "Sound Software":http://www.soundsoftware.ac.uk)
* An annotated dataset against which the algorithm can be tested
* Results of applying the new algorithm and competing algorithms to the dataset
* Documentation of the testing methodology
Note that *if* other algorithms have published results using the same dataset and methodology, then results should be directly comparable between the published results and the results for the new algorithm. In this case, most of the methodology is already documented and only details specific to the new algorithm (e.g. parameters) need recording.
Also, if the testing is scripted, the code used can serve as sufficient documentation while the research is in progress, with human-readable documentation only needed at publication.
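For illustration, a minimal sketch of such a scripted evaluation for an onset-detection experiment is given below. The file names, the one-onset-time-per-line annotation format and the 50 ms matching tolerance are assumptions for the example, not C4DM conventions.

<pre><code class="python">
# Minimal sketch of a scripted evaluation: compare detected onset times
# against a human-annotated ground truth using a +/- 50 ms tolerance.
# File names and the one-onset-per-line annotation format are assumptions.
import csv

TOLERANCE = 0.05  # matching tolerance in seconds (assumed)

def load_onsets(path):
    """Read onset times (seconds), one per line, from a plain-text file."""
    with open(path) as f:
        return sorted(float(row[0]) for row in csv.reader(f) if row)

def evaluate(detected, reference, tol=TOLERANCE):
    """Greedily match detected onsets to reference onsets within the tolerance."""
    unmatched = list(reference)
    hits = 0
    for t in detected:
        match = next((r for r in unmatched if abs(r - t) <= tol), None)
        if match is not None:
            unmatched.remove(match)
            hits += 1
    precision = hits / len(detected) if detected else 0.0
    recall = hits / len(reference) if reference else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return precision, recall, f_measure

if __name__ == "__main__":
    detected = load_onsets("results/new_algorithm/track01.csv")
    reference = load_onsets("annotations/track01.csv")
    p, r, f = evaluate(detected, reference)
    print(f"precision={p:.3f} recall={r:.3f} F-measure={f:.3f}")
</code></pre>

A script of this kind, kept under version control with the algorithm code, records the methodology (tolerance, matching rule, dataset) as a side effect of running the experiment.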
If no suitable annotated dataset already exists, a new dataset may be created including:
* Selection of underlying (audio) data (the actual audio may be in the dataset or the dataset may reference material - e.g. for [[Copyright|copyright]] reasons)
* Ground-truth annotations for the audio and the type of algorithm (e.g. chord sequences for chord estimation, onset times for onset detection)
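As a sketch of how such a dataset might be assembled, the example below writes one annotation file per recording plus a sidecar metadata file that references the audio by identifier rather than bundling it. The directory layout, file names and fields are illustrative assumptions.

<pre><code class="python">
# Minimal sketch of creating ground-truth annotations that reference the
# audio by identifier rather than including it (e.g. for copyright reasons).
# The directory layout, file names and fields are illustrative assumptions.
import csv
import json
from pathlib import Path

Path("annotations").mkdir(exist_ok=True)

# Onset times in seconds produced by a human annotator (example values).
onsets = [0.512, 1.034, 1.561, 2.089]

# One onset time per line, matching the format read by the evaluation sketch above.
with open("annotations/track01.csv", "w", newline="") as f:
    csv.writer(f).writerows([[t] for t in onsets])

# Sidecar metadata: where the audio can be obtained, and who annotated it.
metadata = {
    "audio_reference": "label-catalogue/0001",  # audio held elsewhere, not in the dataset
    "annotator": "annotator_01",
    "annotation_type": "onsets",
}
with open("annotations/track01.json", "w") as f:
    json.dump(metadata, f, indent=2)
</code></pre>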
h3. Qualitative testing - Listening tests
An example would be testing audio at various levels of compression using both standard techniques and a newly derived algorithm.
e.g. [[MUSHRA]] type tests.
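As an illustration of how such test stimuli might be prepared, the sketch below encodes an uncompressed reference at several bitrates, assuming the ffmpeg command-line tool is available; the codec, bitrates and file names are assumptions for the example only.

<pre><code class="python">
# Minimal sketch of generating listening-test stimuli at several compression
# levels from an uncompressed reference, assuming the ffmpeg CLI is installed.
# Bitrates, codec and file names are illustrative assumptions only.
import subprocess
from pathlib import Path

REFERENCE = Path("stimuli/reference/track01.wav")
OUT_DIR = Path("stimuli/compressed")
BITRATES = ["32k", "64k", "128k"]  # candidate compression levels

OUT_DIR.mkdir(parents=True, exist_ok=True)

for bitrate in BITRATES:
    out_file = OUT_DIR / f"track01_{bitrate}.mp3"
    # Encode with a standard codec; a newly developed codec would be run here too.
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(REFERENCE),
         "-codec:a", "libmp3lame", "-b:a", bitrate, str(out_file)],
        check=True,
    )
    print(f"wrote {out_file}")
</code></pre>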
Data involved includes:
* Software for the algorithm (which can be hosted on "Sound Software":http://www.soundsoftware.ac.uk)
* Original uncompressed audio
* Audio output of the new algorithm on the audio
* Audio output of existing algorithms on the same audio
* Documentation of the testing methodology
We note that for listening tests, the research may involve:
* Demographic details of participants
* Identifiable participants (Data Protection)
* Release forms signed by participants
and *will* involve:
* ethics-board approval
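One sketch of handling these obligations is to keep identifiable participant details separate from the ratings, linked only by a pseudonymous participant ID, so the research data can be shared without the personal data. The file names, fields and ID scheme below are illustrative assumptions.

<pre><code class="python">
# Minimal sketch of keeping identifiable participant details separate from
# listening-test ratings, linked only by a pseudonymous participant ID.
# File names, fields and the ID scheme are illustrative assumptions.
import csv
import uuid
from pathlib import Path

Path("private").mkdir(exist_ok=True)
Path("results").mkdir(exist_ok=True)

participant_id = f"P-{uuid.uuid4().hex[:8]}"  # pseudonymous identifier

# Identifiable details (name, contact, release form) stay in a
# restricted-access file covered by data protection requirements.
with open("private/participants.csv", "a", newline="") as f:
    csv.writer(f).writerow(
        [participant_id, "Example Name", "name@example.org", "release_form_07.pdf"])

# Ratings and non-identifying demographics can live with the research data.
with open("results/mushra_ratings.csv", "a", newline="") as f:
    csv.writer(f).writerow([participant_id, "track01_64k.mp3", 73])
</code></pre>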
h3. Publication
Additionally, publication of results will require:
* Summarising the results
* Publishing the paper
h3. Principal Investigator (PI)
The data management concerns of a PI will largely revolve around planning and appraisal of data management for research projects.
Areas of interest may involve:
* legalities (Freedom of Information, Copyright and Data Protection)
* a data management plan
** covering research council requirements
** data management during the project
** data archiving
** data publication
* a post-project appraisal of how the data was managed, carried out as part of the project's "lessons learned"
h2. Overarching concerns
* Human participation - ethics and data protection
* Audio data - copyright
* Storage - where and how will data be stored, and under what service-level agreement (SLA)?