h1. WP1.2 Online Training Material

{{>toc}}
4 9 Steve Welburn
5 68 Steve Welburn
We consider three stages of a research project, and the appropriate research data management considerations for each of those stages. The stages are:
6 41 Steve Welburn
* before the research;
7 41 Steve Welburn
* during the research;
8 41 Steve Welburn
* at the end of the research.
9 1 Steve Welburn
10 67 Steve Welburn
{{include(Before The Research)}}
11 67 Steve Welburn
12 71 Steve Welburn
{{include(During The Research)}}
13 28 Steve Welburn
14 40 Steve Welburn
During the course of a piece of research, data management is largely risk mitigation - it makes your research more robust and allows you to continue if something goes wrong.

The two main areas to consider are:
* [[backing up]] research data - in case you lose, or corrupt, the main copy of your data (a minimal backup sketch follows this list);
* [[documenting data]] - in case you need to return to it later.
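
A minimal sketch of what "backing up" might look like in practice is given below: it copies a working directory to a second location as a dated snapshot. The paths and snapshot layout are assumptions for illustration, not a prescribed workflow.

<pre><code class="python">
# Minimal backup sketch (illustrative only): copy a working directory to a
# second location as a dated snapshot. The paths below are placeholders.
import shutil
from datetime import date
from pathlib import Path

WORK_DIR = Path("~/research/experiment-data").expanduser()  # hypothetical working copy
BACKUP_ROOT = Path("/mnt/backup/experiment-data")           # hypothetical backup volume

def backup_snapshot(src: Path, dest_root: Path) -> Path:
    """Copy src into a dated snapshot directory under dest_root."""
    snapshot = dest_root / date.today().isoformat()
    shutil.copytree(src, snapshot, dirs_exist_ok=True)
    return snapshot

if __name__ == "__main__":
    print("Backed up to", backup_snapshot(WORK_DIR, BACKUP_ROOT))
</code></pre>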

In addition to the immediate benefits during research, applying good research data management practices makes it easier to manage your research data at the end of your research project.

We have identified three basic types of research project, two quantitative (one based on new data, one based on a new algorithm) and one qualitative, and consider the data management techniques appropriate to each workflow. More complex research projects may require a combination of these techniques.

h3. Quantitative research - New Data

For this use case, the research workflow involves:
* creating a new dataset
* testing outputs of existing algorithms on the dataset
* publication of results

The new dataset may include:
* selection or creation of the underlying (audio) data (the actual audio may be included in the dataset, or the dataset may only reference it - e.g. for [[Copyright|copyright]] reasons)
* creation of ground-truth annotations appropriate to the audio and the type of algorithm (e.g. chord sequences for chord estimation, onset times for onset detection) - an illustrative annotation format is sketched below
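
As an illustration of what ground-truth annotations can look like for the onset-detection case, the sketch below reads a plain-text file with one onset time (in seconds) per line. The file layout is an assumption made for the example, not a required format.

<pre><code class="python">
# Illustrative reader for a simple onset ground-truth format: one onset time
# (in seconds) per line of a plain-text file. The format is an assumption.
from pathlib import Path
from typing import List

def load_onsets(path: Path) -> List[float]:
    """Return onset times (seconds) from a one-value-per-line text file."""
    times = []
    for line in path.read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#"):  # skip blank lines and comments
            times.append(float(line))
    return times

# Example: onsets = load_onsets(Path("annotations/track01.onsets.txt"))
</code></pre>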

The content of the dataset will need to be [[documenting data|documented]].
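
A minimal, machine-readable description kept alongside the dataset is one way to do this. The fields below are an assumed minimal set for illustration, not a prescribed metadata schema.

<pre><code class="python">
# Illustrative dataset description written alongside the data. The fields and
# values shown are assumptions for the sketch, not a prescribed schema.
import json

description = {
    "title": "Example chord-annotation dataset",  # hypothetical dataset
    "creator": "A. Researcher",
    "created": "2012-08-22",
    "contents": "Chord-sequence annotations for a set of commercial recordings",
    "audio_included": False,  # audio referenced rather than redistributed (copyright)
    "annotation_format": "one text file per track: start_time end_time chord_label",
    "licence": "CC BY",
}

with open("DATASET_DESCRIPTION.json", "w") as fh:
    json.dump(description, fh, indent=2)
</code></pre>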

Data involved includes:
* [[Managing Software As Data|software]] for the algorithms
* the new dataset
* identification of existing datasets against which results will be compared
* results of applying the algorithms to the dataset
* documentation of the testing methodology - including algorithm parameters

Note that *if* existing algorithms have published results using the same existing datasets and methodology, then the published results and the results for the new dataset should be directly comparable. In this case, most of the methodology is already documented and only details specific to the new dataset need to be recorded separately.

If the testing is scripted, then the code used is itself sufficient documentation during the research - human-readable documentation is only required at publication. A sketch of such a script is given below.
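
The sketch below shows the idea: the script records the algorithm parameters alongside the results, so the run is documented by the code and its output. The run_algorithm() function and the parameter names are hypothetical placeholders, not an existing API.

<pre><code class="python">
# Sketch of a scripted test run that records parameters alongside results,
# so the script itself documents the methodology. run_algorithm() and the
# parameter names are hypothetical placeholders.
import json
from datetime import datetime, timezone

def run_algorithm(audio_file, window_size, hop_size):
    """Placeholder for the algorithm under test."""
    return {"f_measure": 0.0}  # stand-in result

params = {"window_size": 2048, "hop_size": 512}
test_files = ["track01.wav", "track02.wav"]
results = {f: run_algorithm(f, **params) for f in test_files}

record = {
    "run_at": datetime.now(timezone.utc).isoformat(),
    "parameters": params,
    "results": results,
}
with open("run_log.json", "w") as fh:
    json.dump(record, fh, indent=2)
</code></pre>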

h3. Quantitative research - New Algorithm

bq. A common use-case in C4DM research is to run a newly-developed analysis algorithm on a set of audio examples and evaluate the algorithm by comparing its output with that of a human annotator. Results are then compared with published results using the same input data to determine whether the newly proposed approach makes any improvement on the state of the art.

Data involved includes:
* [[Managing Software As Data|software]] for the algorithm
* an annotated dataset against which the algorithm can be tested
* results of applying the new algorithm and competing algorithms to the dataset
* documentation of the testing methodology

Note that *if* other algorithms have published results using the same dataset and methodology, then the published results and the results for the new algorithm should be directly comparable. In this case, most of the methodology is already documented and only details specific to the new algorithm (e.g. parameters) need to be recorded separately.

Also, if the testing is scripted, then the code used is itself sufficient documentation during the research - human-readable documentation is only required at publication (see the sketch in the previous section).

h3. Qualitative research

An example would be using interviews with performers to evaluate a new instrument design.

The workflow is:
* Gather data for the experiment (e.g. through interviews)
* Analyse data
* Publish data

Data involved may include:
* the interface design
* captured audio from performances
* recorded interviews with performers (possibly audio or video)
* interview transcripts

The research may also involve:
* Demographic details of participants
* Identifiable participants (Data Protection)
* Release forms for people taking part

and *will* involve:
* ethics-board approval

h2. At The End Of The Research

This may be the complete end of a research project, or the completion of an identifiable unit of research (e.g. the publication of a paper based on your research). At this stage you should consider:

* [[Archiving research data]]
* [[Publishing research data]]
* Reviewing the data management plan (possibly for the project final report)

Publication of the results of your research will require:
* Summarising the results
* Publishing the paper

Note that the EPSRC data management principles require sources of data to be referenced.

h2. Principal Investigator (PI)

The data management concerns of a PI will largely revolve around planning and appraisal of data management for research projects - both to make sure that projects conform with institutional and funder requirements, and to ensure that the data management needs of the research are met.

Areas of interest may include:
* [[legislation|legalities]] (Freedom of Information, Copyright and Data Protection)
* the data management plan
** covering the research council requirements
** during the project
** data archiving
** data publication
* appraisal, after the project is completed, of how the data was managed, as part of the project's "lessons learned"

Data management training should provide an overview of all of the above, and keep PIs informed of any changes that affect data management requirements.

The DCC DMP Online tool provides a series of questions which allow the user to build a data management plan that will match research council requirements.

h2. Overarching concerns

* Human participation - ethics, data protection
* Audio data - copyright
* Storage - where? how? SLA?
* Short-term resilient storage for work-in-progress
* Long-term archival storage for research data outputs
* Curation of archived data - refreshing media and formats
* Drivers - FoI, RCUK