h1. WP1.2 Online Training Material

(back to [[Wiki]])

{{>toc}}

h2. C4DM Researcher use cases

h3. Quantitative testing - machine testing

bq. A common use-case in C4DM research is to run a newly-developed analysis algorithm on a set of audio examples and evaluate the algorithm by comparing its output with that of a human annotator. Results are then compared with published results using the same input data to determine whether the newly proposed approach makes any improvement on the state of the art.

Data involved includes:
* Software for the algorithm (which can be hosted on "Sound Software":http://www.soundsoftware.ac.uk)
* An annotated dataset against which the algorithm can be tested
* Results of applying the new algorithm and competing algorithms to the dataset
* Documentation of the testing methodology

Note that *if* results for other algorithms have been published using the same dataset and methodology, then the results for the new algorithm should be directly comparable with those published results. In this case, most of the methodology is already documented and only details specific to the new algorithm (e.g. parameters) need recording.

Also, if the testing is scripted, then the code used would be sufficient documentation during the research, with readable documentation only being produced at publication.
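
As an illustration of such a scripted test, the sketch below shows one way an evaluation script can double as a record of the methodology. It is illustrative only: it assumes an onset-detection task, hypothetical CSV files of onset times ("estimates.csv", "ground_truth.csv") and a simple greedy F-measure with a fixed tolerance window, with the parameters kept at the top of the script so that the file itself documents the test.

<pre><code class="python">
"""Illustrative evaluation script (assumed file names and parameters).

Keeping the parameters here means the script itself documents the methodology.
"""
import csv

TOLERANCE = 0.05  # seconds within which an estimated onset counts as correct


def load_times(path):
    """Load one onset time (in seconds) per row from a CSV file."""
    with open(path, newline="") as f:
        return sorted(float(row[0]) for row in csv.reader(f) if row)


def f_measure(estimates, ground_truth, tolerance=TOLERANCE):
    """Greedy one-to-one matching of estimated onsets to ground-truth onsets."""
    unmatched = list(ground_truth)
    hits = 0
    for t in estimates:
        match = next((g for g in unmatched if abs(g - t) <= tolerance), None)
        if match is not None:
            unmatched.remove(match)
            hits += 1
    precision = hits / len(estimates) if estimates else 0.0
    recall = hits / len(ground_truth) if ground_truth else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0


if __name__ == "__main__":
    est = load_times("estimates.csv")     # output of the new algorithm
    ref = load_times("ground_truth.csv")  # human annotations
    print("F-measure: %.3f" % f_measure(est, ref))
</code></pre>

Re-running the same script on the outputs of competing algorithms then produces the comparison results listed above.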

If no suitable annotated dataset already exists, a new dataset may be created (one possible layout is sketched below), including:
* Selection of the underlying (audio) data (the actual audio may be included in the dataset, or the dataset may only reference the material - e.g. for [[Copyright|copyright]] reasons)
* Ground-truth annotations appropriate to the audio and the type of algorithm (e.g. chord sequences for chord estimation, onset times for onset detection)
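
A minimal sketch of one way such a dataset might be organised is shown below. The layout, file names and label syntax are illustrative assumptions rather than a prescribed format: a metadata file references the audio rather than containing it, and each entry points at a ground-truth annotation file (here a "start end label" chord annotation).

<pre><code class="python">
"""Illustrative dataset layout (assumed names and formats).

The audio is referenced rather than included, e.g. for copyright reasons,
and each entry points at a separate ground-truth annotation file.
"""
import json

dataset = {
    "name": "example-chord-dataset",  # hypothetical dataset name
    "items": [
        {
            # reference to the recording rather than the audio itself
            "audio_reference": "Artist - Example Track, album release, track 3",
            "annotation_file": "track03_chords.lab",
        },
    ],
}

# A "start end label" annotation: one chord segment per line, times in seconds.
example_annotation = """0.000 2.612 N
2.612 5.935 C:maj
5.935 9.310 G:maj
"""

with open("dataset.json", "w") as f:
    json.dump(dataset, f, indent=2)

with open("track03_chords.lab", "w") as f:
    f.write(example_annotation)
</code></pre>

Whatever format is actually chosen, documenting it alongside the dataset is what makes the annotations reusable by others.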

h3. Qualitative testing - Listening tests

An example would be testing audio at various levels of compression using both standard techniques and a newly developed algorithm, e.g. in [[MUSHRA]]-type tests.

Data involved includes:
* Software for the algorithm (which can be hosted on "Sound Software":http://www.soundsoftware.ac.uk)
* Original uncompressed audio
* Audio output of the new algorithm applied to that audio
* Audio output of existing algorithms applied to the same audio
* Documentation of the testing methodology

We note that for listening tests, the research may involve:
* Demographic details of participants
* Identifiable participants (Data Protection - see the sketch below)
* Release forms from the people taking part

and *will* involve:
* Ethics-board approval
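
Where participants are identifiable, one common approach (shown as an illustrative sketch below, with hypothetical file names) is to record ratings only against pseudonymous participant IDs, keeping the mapping from IDs to names and consent forms in a separate, access-restricted location.

<pre><code class="python">
"""Illustrative sketch: separating ratings from identifying participant data.

File names and layout are assumptions. Ratings are stored against pseudonymous
IDs only; the ID-to-name mapping belongs in a separate, access-restricted
store covered by the ethics approval and data protection arrangements.
"""
import csv
import uuid


def register_participant(name, restricted_path="participants_restricted.csv"):
    """Record identifying details separately and return a pseudonymous ID."""
    pid = uuid.uuid4().hex[:8]
    with open(restricted_path, "a", newline="") as f:
        csv.writer(f).writerow([pid, name])
    return pid


def record_rating(pid, stimulus, score, ratings_path="ratings.csv"):
    """Store a single listening-test rating against the pseudonymous ID only."""
    with open(ratings_path, "a", newline="") as f:
        csv.writer(f).writerow([pid, stimulus, score])


if __name__ == "__main__":
    pid = register_participant("A. Listener")    # identifying data, kept apart
    record_rating(pid, "excerpt01_codecA", 78)   # e.g. a MUSHRA-style 0-100 score
</code></pre>

Only the pseudonymous ratings file then needs to circulate with the analysis; the restricted file can be retained or destroyed according to the ethics approval.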

h3. Publication

Additionally, publication of results will require:
* Summarising the results
* Publishing the paper

h3. Principal Investigator (PI)

The data management concerns of a PI will largely revolve around the planning and appraisal of data management for research projects.

Areas of interest may include:
* legalities (Freedom of Information, Copyright and Data Protection)
* the data management plan
** meeting research council requirements
** data management during the project
** data archiving
** data publication
* post-project appraisal of how the data was managed, carried out as part of the project's "lessons learned"

h2. Overarching concerns

* Human participation - ethics, data protection
* Audio data - copyright
* Storage - where? how? is there an SLA (service-level agreement)?