WEB AUDIO EVALUATION TOOL

This is not (yet) a fully fledged manual.


AUTHORS
Nicholas Jillings <n.g.r.jillings@se14.qmul.ac.uk>
Brecht De Man <b.deman@qmul.ac.uk>
David Moffat <d.j.moffat@qmul.ac.uk>
Joshua D. Reiss (supervisor) <j.d.reiss@qmul.ac.uk>


PACKAGE CONTENTS

- main folder (/)
    - ape.css, core.css, graphics.css, structure.css: style files (edit to change appearance)
    - ape.js: JavaScript file for the APE-style interface [1]
    - core.js: JavaScript file with core functionality
    - index.html: webpage where the interface should appear
    - jquery-2.1.4.js: jQuery JavaScript library
    - pythonServer.py: webserver for running tests locally
    - pythonServer-legacy.py: webserver with limited functionality (no storing of output XML files)
- Documentation (/docs/)
    - Project Specification Document (LaTeX/PDF)
    - Results Specification Document (LaTeX/PDF)
    - SMC15: PDF and LaTeX source of the corresponding SMC2015 publication [2]
- Example project (/example_eval/)
    An example of what the setup XML should look like, with example audio files 0.wav-10.wav: short recordings (44.1 kHz, 16 bit) of a woman saying the corresponding number, useful for testing randomisation and for general familiarisation with the interface.
- Output files (/saves/)
    The output XML files of tests will be stored here by default by the pythonServer.py script.
- Auxiliary scripts (/scripts/)
    Helpful Python scripts for extraction and visualisation of data.
- Test creation tool (/test_create/)
    Webpage for easily setting up a test without having to delve into the XML.


QUICK START

Using the example project:
1. If the input XML file enforces a given sample rate, make sure your system sample rate matches the sample rate of the audio files.
2. Run pythonServer.py (make sure you have Python installed); see the example below.
3. Open a browser (anything but Internet Explorer).
4. Go to ‘localhost:8000’.
5. The test should open; complete it and look at the output XML file in /saves/.
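
For instance, from the main folder (assuming ‘python’ invokes Python 2.7 on your system):

    python pythonServer.py

The test is then served at ‘localhost:8000’.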


LEGACY

The APE interface and much of its functionality are inspired by the APE toolbox for MATLAB [1]. See https://code.soundsoftware.ac.uk/projects/ape for the source code and the corresponding paper.


CITING

We request that you acknowledge the authors and cite our work [2]; see CITING.txt.


LICENSE

See LICENSE.txt. This code is shared under the GNU General Public License v3.0 (http://choosealicense.com/licenses/gpl-3.0/). Generally speaking, this is a copyleft license that requires anyone who distributes our code or a derivative work to make the source available under the same terms.


FEATURE REQUESTS AND BUG REPORTS

We continually develop this tool to fix issues and implement features useful to us or our user base. See https://code.soundsoftware.ac.uk/projects/webaudioevaluationtool/issues for a list of feature requests and bug reports, and their status.

Please contact the authors if you experience any bugs, if you would like additional functionality, if you have questions about using the interface or if you would like to give any feedback (even positive!) about the interface. We look forward to learning how the tool has (not) been useful to you.


TROUBLESHOOTING

Thanks to feedback from experiments run by the authors and others, many bugs have been caught, and fatal crashes caused by the interface (provided it is set up properly by the user) seem to be a thing of the past.
However, if things do go wrong or the test needs to be interrupted for whatever reason, not all data is lost. In a normal scenario, the test needs to be completed until the very end (the final ‘Submit’), at which point the output XML is stored in ‘saves/’. If this stage is not reached, open the JavaScript Console (see below for how to find it), type ‘createProjectSave()’ (without the quotes) and hit Enter. This will open a pop-up window with a hyperlink that reads ‘Save File’; click it and an XML file with the results up to that point should be stored in your download folder.
Alternatively, much of this data can be read from the same console, where the tool prints a lot of debug information. Specifically:
- the randomisation of pages and fragments is logged;
- any time a slider is played, its ID and a time stamp (in seconds since the start of the test) are displayed;
- any time a slider is dragged and dropped, the drop location and the time stamp are shown;
- any comments and pre- or post-test questions and their answers are logged as well.

You can select all of this and save it into a text file, so that none of the data is lost. You may choose to do this even when a test was successful, as an extra precaution.

In Google Chrome, the JavaScript Console can be found in View>Developer>JavaScript Console, or via the keyboard shortcut Cmd + Alt + J (Mac OS X).
In Safari, the JavaScript Console can be found in Develop>Show Error Console, or via the keyboard shortcut Cmd + Alt + C (Mac OS X). Note that for the Develop menu to be visible, you have to go to Preferences (Cmd + ,) and enable ‘Show Develop menu in menu bar’ in the ‘Advanced’ tab.
In Firefox, go to Tools>Web Developer>Web Console, or hit Cmd + Alt + K.


REMOTE TESTS

As the test is browser-based, it can be run remotely from a web server without modification. To allow remote storage of the output XML files (as opposed to saving them locally on the subject’s machine, which is the default if no ‘save’ path is specified or found), a PHP script on the server needs to accept the output XML files. An example of such a script will be included in a future version.


SCRIPTS

The tool comes with a few handy Python 2.7 scripts for easy extraction of ratings and comments, and for visualisation of ratings and timelines. See below for a quick guide on how to use them. The visualisation scripts require the free matplotlib toolbox (http://matplotlib.org), as well as numpy and scipy.
By default, the scripts are run from the ‘scripts’ folder, with the result files in the ‘saves’ folder (the default location where result XMLs are stored). Each script takes the folder containing the result XML files as an argument, along with other arguments in some cases; see the example below.
Note: to avoid all kinds of problems, please avoid spaces in file and folder names (this may work on some systems, but others don’t like it).
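
For instance, to extract the rating values from all result files in the default location, run from the ‘scripts’ folder:

    python score_parser.py ../saves/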

comment_parser.py
Extracts comments from the output XML files corresponding to the different subjects found in ‘saves/’. It creates a folder per ‘audioholder’/page it finds, and stores a CSV file with the comments for every ‘audioelement’/fragment within these respective ‘audioholders’/pages. In this CSV file, every line corresponds to a subject/output XML file. Depending on the settings, the first column, which contains the name of the corresponding XML file, can be omitted (for anonymisation).
Beware of Excel: sometimes the UTF-8 encoding is not properly imported, leading to problems with special characters in the comments (particularly cumbersome for foreign languages).
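If Excel mangles special characters, the files can be inspected directly in Python instead; a minimal sketch (Python 2.7; the file path is hypothetical and depends on your page and fragment names):

    # -*- coding: utf-8 -*-
    import csv

    # Print a comments CSV as proper UTF-8, one subject per line.
    with open('../saves/page1/comments-fragment1.csv', 'rb') as f:  # hypothetical path
        for row in csv.reader(f):
            print(u' | '.join(cell.decode('utf-8') for cell in row))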

evaluation_stats.py
Shows a few statistics of the tests in the ‘saves/’ folder so far, mainly for checking for errors: the number of result files present, the audioholder IDs that were tested (and how many of each separate ID), the duration of each page, the duration of each complete test, the average duration per page, and the average duration as a function of the page number.

generate_report.py
Similar to ‘evaluation_stats.py’, but generates a PDF report based on the output files in the ‘saves/’ folder, or any folder specified as a command line argument. Writes a LaTeX document and then converts it to a PDF using pdflatex, which therefore needs to be installed.
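For instance, to generate a report on result files in a different location (hypothetical path):

    python generate_report.py /path/to/other/results/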

score_parser.py
Extracts rating values from the output XMLs to CSV files, which is necessary for the visualisation of ratings. Creates the folder ‘saves/ratings/’ if it does not exist yet, to which it writes a separate file for every ‘audioholder’/page in any of the output XMLs it finds in ‘saves/’. Within each file, rows represent different subjects (output XML file names) and columns represent different ‘audioelements’/fragments.
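These CSV files are easy to process further. A minimal sketch computing the mean rating per fragment (Python 2.7; the file name is hypothetical, and the first column is assumed to hold the subject’s XML file name, as described above):

    import csv

    # Mean rating per 'audioelement'/fragment (one CSV column each).
    with open('../saves/ratings/page1-ratings.csv', 'rb') as f:  # hypothetical name
        rows = list(csv.reader(f))  # drop a header row here first, if your file has one
    for i, column in enumerate(zip(*rows)[1:], start=1):  # skip the subject name column
        values = [float(v) for v in column if v]  # ignore empty cells
        print('fragment %d: mean %.2f (%d subjects)' % (i, sum(values) / len(values), len(values)))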

score_plot.py
Plots the ratings as stored in the CSV files created by score_parser.py.
Depending on the settings, it displays and/or saves (in ‘saves/ratings/’) a box plot, confidence interval plot, scatter plot, or a combination of the aforementioned.
Requires the free matplotlib library.
At this point, more than one subject is needed for this script to work.
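For a quick look without going through the settings of score_plot.py, a box plot can also be drawn directly from such a CSV with matplotlib; a minimal self-contained sketch (same assumptions and hypothetical file names as above):

    import csv
    import matplotlib.pyplot as plt

    # One box per 'audioelement'/fragment, fed by the columns of the ratings CSV.
    with open('../saves/ratings/page1-ratings.csv', 'rb') as f:  # hypothetical name
        rows = list(csv.reader(f))
    plt.boxplot([[float(v) for v in col if v] for col in zip(*rows)[1:]])
    plt.ylabel('rating')
    plt.savefig('page1-boxplot.pdf')  # hypothetical output name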

timeline_view_movement.py
Creates a timeline for every subject, for every ‘audioholder’/page, corresponding to any of the output XML files found in ‘saves/’. It shows the marker movements of the different fragments, along with when each fragment was played (red regions). Automatically takes fragment names, rating axis title, rating axis labels, and audioholder name from the XML file (if available).

timeline_view.py
Creates a timeline for every subject, for every ‘audioholder’/page, corresponding to any of the output XML files found in ‘saves/’. It shows when and for how long the subject listened to each of the fragments.


REFERENCES

[1] Brecht De Man and Joshua D. Reiss, “APE: Audio Perceptual Evaluation toolbox for MATLAB,” 136th Convention of the Audio Engineering Society, 2014.

[2] Nicholas Jillings, Brecht De Man, David Moffat and Joshua D. Reiss, “Web Audio Evaluation Tool: A Browser-Based Listening Test Environment,” 12th Sound and Music Computing Conference, July 2015.