h1. WP1.2 Online Training Material

(back to [[Wiki]])

{{>toc}}

h2. C4DM Researcher use cases

h3. Quantitative testing - machine testing

bq. A common use-case in C4DM research is to run a newly-developed analysis algorithm on a set of audio examples and evaluate the algorithm by comparing its output with that of a human annotator. Results are then compared with published results using the same input data to determine whether the newly proposed approach makes any improvement on the state of the art.

Two main cases:
* New data, existing algorithms
* Existing data, new algorithm

Data involved includes:
* Software for the algorithm (which can be hosted on "Sound Software":http://www.soundsoftware.ac.uk)
* An annotated dataset against which the algorithm can be tested
* Results of applying the new algorithm and competing algorithms to the dataset
* Documentation of the testing methodology

Note that *if* other algorithms have published results using the same dataset and methodology, then the results for the new algorithm should be directly comparable with those published results. In this case, most of the methodology is already documented and only the details specific to the new algorithm (e.g. its parameters) need to be recorded separately.

Also, if the testing is scripted, the code used can serve as sufficient documentation during the research, with human-readable documentation only being required at publication.
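
A minimal sketch of such an evaluation script is given below (Python). The file names "detected_onsets.txt" and "ground_truth.txt" and the one-onset-time-per-line annotation format are assumptions for illustration, not an existing C4DM convention; note that the matching tolerance is exactly the kind of parameter that would need documenting at publication.

<pre><code class="python">
# Sketch: compare detected onset times against ground-truth annotations.
# Assumes one onset time (in seconds) per line in each file.

def load_times(path):
    """Read one onset time per line, ignoring blank lines."""
    with open(path) as f:
        return sorted(float(line) for line in f if line.strip())

def evaluate(detected, reference, tolerance=0.05):
    """Greedily match detected onsets to unused ground-truth onsets
    within +/- tolerance seconds; return precision, recall, F-measure."""
    unmatched = list(reference)
    hits = 0
    for t in detected:
        for r in unmatched:
            if abs(t - r) <= tolerance:
                hits += 1
                unmatched.remove(r)
                break
    precision = hits / len(detected) if detected else 0.0
    recall = hits / len(reference) if reference else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if (precision + recall) > 0 else 0.0)
    return precision, recall, f_measure

if __name__ == "__main__":
    # Placeholder file names - substitute the real algorithm output
    # and the annotated dataset.
    p, r, f = evaluate(load_times("detected_onsets.txt"),
                       load_times("ground_truth.txt"))
    print("precision=%.3f recall=%.3f f-measure=%.3f" % (p, r, f))
</code></pre>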

If no suitable annotated dataset already exists, a new dataset may be created (one possible layout is sketched after the list below), including:
* Selection of underlying (audio) data (the actual audio may be in the dataset or the dataset may reference material - e.g. for [[Copyright|copyright]] reasons)
* Ground-truth annotations for the audio and the type of algorithm (e.g. chord sequences for chord estimation, onset times for onset detection)
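
As a sketch of one possible dataset layout - assuming a hypothetical CSV manifest named "dataset.csv" with columns "audio_reference" and "annotation_file" - the audio itself can be referenced by identifier or URL rather than distributed, while the ground-truth annotation files sit alongside the manifest:

<pre><code class="python">
import csv

def load_manifest(path):
    """Return (audio_reference, annotation_file) pairs from the manifest.

    Illustrative manifest rows (all names are made up):
        audio_reference,annotation_file
        http://example.org/audio/track01.wav,annotations/track01_onsets.txt
        local:recordings/take_02.wav,annotations/take_02_onsets.txt
    """
    with open(path, newline="") as f:
        return [(row["audio_reference"], row["annotation_file"])
                for row in csv.DictReader(f)]

if __name__ == "__main__":
    for audio_ref, annotation_file in load_manifest("dataset.csv"):
        print(audio_ref, "->", annotation_file)
</code></pre>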

h3. Qualitative testing

An example would be using interviews with performers to evaluate a new instrument design.

Data involved may include:
* The interface design
* Captured audio from performances
* Recorded interviews with performers (possibly audio or video)
* Interview transcripts

The research may also involve:
* Demographic details of participants
* Identifiable participants ([[Data Protection]])
* Release forms for people taking part

and *will* involve:
* ethics-board approval

h3. Publication

Additionally, publication of results will require:
* Summarising the results
* Publishing the paper

Note that the EPSRC data management principles require sources of data to be referenced.

h3. Principal Investigator (PI)

The data management concerns of a PI will largely revolve around planning and appraisal of data management for research projects.

Areas of interest may include:
* legalities (Freedom of Information, Copyright and Data Protection)
* data management plan
** covering the research council requirements
** data management during the project
** data archiving
** data publication
* After the project is completed, an appraisal of how the data was managed should be carried out as part of the project's "lessons learned"

Data management training should provide an overview of all of the above, and keep PIs informed of any changes that affect data management requirements.

The DCC DMP Online tool provides a series of questions that allow the user to build a data management plan matching research council requirements.

h2. Overarching concerns

Human participation - ethics, data protection

Audio data - copyright

Storage - where? how? SLA?

Short-term resilient storage for work-in-progress

Long-term archival storage for research data outputs

Curation of archived data - refreshing media and formats

Drivers - FoI, RCUK