h1. WP1.2 Online Training Material

(back to [[Wiki]])

{{>toc}}

h2. By Stage Of Research

h3. Before Research

* Planning research data management

h3. During Research

* Backing up research data
* Documenting data

h3. At The End Of A Piece Of Research

(Includes publication of a paper based on your research)

* Archiving research data
* Publishing research data

h2. C4DM Researcher use cases

h3. Quantitative testing - machine testing

bq. A common use-case in C4DM research is to run a newly-developed analysis algorithm on a set of audio examples and evaluate the algorithm by comparing its output with that of a human annotator. Results are then compared with published results using the same input data to determine whether the newly proposed approach makes any improvement on the state of the art.

Two main cases:
* New data, existing algorithms
* Existing data, new algorithm

Data involved includes:
* Software for the algorithm (which can be hosted on "Sound Software":http://www.soundsoftware.ac.uk)
* An annotated dataset against which the algorithm can be tested
* Results of applying the new algorithm and competing algorithms to the dataset
* Documentation of the testing methodology

Note that *if* other algorithms have published results using the same dataset and methodology, then the results for the new algorithm should be directly comparable with those published results. In this case, most of the methodology is already documented and only details specific to the new algorithm (e.g. parameters) need to be recorded separately.

Also, if the testing is scripted, the test scripts themselves provide sufficient documentation while the research is in progress - human-readable documentation is only required at publication.
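
For example, a scripted evaluation can be as small as the sketch below. This is only an illustration, not an established C4DM workflow: it assumes onset detection as the task, ground truth and algorithm output stored as one onset time (in seconds) per line, a 50 ms matching tolerance, and hypothetical file names.

<pre><code class="python">
# Sketch of a scripted evaluation run: compare onset times estimated by a
# new algorithm against a ground-truth annotation (hypothetical file names).

def load_onsets(path):
    """Read one onset time per line from a plain-text annotation file."""
    with open(path) as f:
        return sorted(float(line) for line in f if line.strip())

def evaluate(estimated, reference, tolerance=0.05):
    """Match each estimate to at most one reference onset within `tolerance`
    seconds and return precision, recall and F-measure."""
    matched = set()
    hits = 0
    for est in estimated:
        for i, ref in enumerate(reference):
            if i not in matched and abs(est - ref) <= tolerance:
                matched.add(i)
                hits += 1
                break
    precision = float(hits) / len(estimated) if estimated else 0.0
    recall = float(hits) / len(reference) if reference else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return precision, recall, f_measure

if __name__ == "__main__":
    estimated = load_onsets("results/new_algorithm/track01.txt")  # algorithm output
    reference = load_onsets("annotations/track01_onsets.txt")     # ground truth
    print("precision %.3f, recall %.3f, F-measure %.3f" % evaluate(estimated, reference))
</code></pre>

Kept under version control alongside the algorithm code, a script like this records the matching tolerance and file layout in use - exactly the methodological detail that needs to be written up at publication.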

If no suitable annotated dataset already exists, a new dataset may be created (a documentation sketch follows the list), including:
* Selection of underlying (audio) data (the actual audio may be included in the dataset, or the dataset may only reference the material - e.g. for [[Copyright|copyright]] reasons)
* Ground-truth annotations for the audio, appropriate to the type of algorithm (e.g. chord sequences for chord estimation, onset times for onset detection)
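
A new dataset of this kind is easier to archive and publish later if it is documented in a machine-readable form from the start. The sketch below shows one possible layout - the field names and values are purely illustrative assumptions - in which the audio is referenced rather than redistributed, and the provenance of each ground-truth annotation is recorded.

<pre><code class="python">
# Sketch of machine-readable documentation for a new annotated dataset.
# All field names and values are illustrative; the key points are that the
# audio is referenced (not redistributed) and annotation provenance is kept.

import json

dataset = {
    "name": "example-onset-dataset",
    "description": "Onset annotations for a small set of commercial recordings",
    "licence": "Annotations CC BY; audio referenced only, for copyright reasons",
    "items": [
        {
            "audio_reference": "Artist - Title, album (2010), track 3",
            "annotation_file": "annotations/track01_onsets.txt",
            "annotation_type": "onset times (seconds)",
            "annotator": "A. Researcher",
        },
    ],
}

with open("dataset.json", "w") as f:
    json.dump(dataset, f, indent=2)
</code></pre>

A file like this doubles as the documentation needed when the dataset is later archived or published.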

h3. Qualitative testing

An example would be using interviews with performers to evaluate a new instrument design.

Data involved may include:
* The interface design
* Captured audio from performances
* Recorded interviews with performers (possibly audio or video)
* Interview transcripts

The research may also involve:
* Demographic details of participants
* Identifiable participants ([[Data Protection]])
* Release forms for people taking part

and *will* involve:
* Ethics-board approval

h3. Publication

Additionally, publication of results will require:
* Summarising the results
* Publishing the paper

Note that the EPSRC data management principles require sources of data to be referenced.

h3. Principal Investigator (PI)

The data management concerns of a PI will largely revolve around planning and appraisal of data management for research projects.

Areas of interest may include:
* legalities (Freedom of Information, Copyright and Data Protection)
* the data management plan, covering:
** research council requirements
** data management during the project
** data archiving
** data publication
* After the project is completed, an appraisal of how the data was managed should be carried out as part of the project's "lessons learned"

Data management training should provide an overview of all of the above, and keep PIs informed of any changes that affect data management requirements.

The DCC DMP Online tool provides a series of questions that allow the user to build a data management plan matching research council requirements.

h2. Overarching concerns

* Human participation - ethics, data protection
* Audio data - copyright
* Storage - where? how? SLA?
* Short-term resilient storage for work-in-progress
* Long-term archival storage for research data outputs
* Curation of archived data - refreshing media and formats
* Drivers - FoI, RCUK