annotate README.txt @ 1392:4a0c4119e00d

Bug #1486: Fixed rogue '+' appearing in move slider alert. Unlabelled axes have a default of 'Axis ' plus their index.
author Nicholas Jillings <nickjillings@users.noreply.github.com>
date Thu, 17 Dec 2015 13:03:39 +0000
children 397f19747594 1b6fa37d46a4 235594325b84
WEB AUDIO EVALUATION TOOL

This is not (yet) a fully fledged manual.


AUTHORS
Nicholas Jillings <n.g.r.jillings@se14.qmul.ac.uk>
Brecht De Man <b.deman@qmul.ac.uk>
David Moffat <d.j.moffat@qmul.ac.uk>
Joshua D. Reiss (supervisor) <j.d.reiss@qmul.ac.uk>


PACKAGE CONTENTS

- main folder (/)
  - ape.css, core.css, graphics.css, structure.css: style files (edit to change appearance)
  - ape.js: JavaScript file for the APE-style interface [1]
  - core.js: JavaScript file with core functionality
  - index.html: webpage where the interface should appear
  - jquery-2.1.4.js: jQuery JavaScript library
  - pythonServer.py: webserver for running tests locally
  - pythonServer-legacy.py: webserver with limited functionality (no storing of output XML files)
- Documentation (/docs/)
  - Project Specification Document (LaTeX/PDF)
  - Results Specification Document (LaTeX/PDF)
  - SMC15: PDF and LaTeX source of the corresponding SMC2015 publication
- Example project (/example_eval/)
  An example of what the set-up XML should look like, with example audio files 0.wav-10.wav: short recordings at 44.1 kHz, 16 bit of a woman saying the corresponding number (useful for testing randomisation and for general familiarisation with the interface).
- Output files (/saves/)
  The output XML files of tests are stored here by default by the pythonServer.py script.
- Auxiliary scripts (/scripts/)
  Helpful Python scripts for extraction and visualisation of data.
- Test creation tool (/test_create/)
  A webpage for easily setting up a test without having to delve into the XML.


QUICK START

Using the example project:
1. If the input XML file enforces a given sample rate, make sure your system sample rate matches the sample rate of the audio files.
2. Run pythonServer.py (make sure you have Python installed).
3. Open a browser (anything but Internet Explorer).
4. Go to ‘localhost:8000’.
5. The test should open; complete it and look at the output XML file in /saves/.


LEGACY

The APE interface and most of its functionality are inspired by the APE toolbox for MATLAB [1]. See https://code.soundsoftware.ac.uk/projects/ape for the source code and corresponding paper.

CITING

We request that you acknowledge the authors and cite our work [2]; see CITING.txt.


LICENSE

See LICENSE.txt. This code is shared under the GNU General Public License v3.0 (http://choosealicense.com/licenses/gpl-3.0/). Generally speaking, this is a copyleft license that requires anyone who distributes our code or a derivative work to make the source available under the same terms.


FEATURE REQUESTS AND BUG REPORTS

We continually develop this tool to fix issues and implement features useful to us or our user base. See https://code.soundsoftware.ac.uk/projects/webaudioevaluationtool/issues for a list of feature requests and bug reports, and their status.

Please contact the authors if you experience any bugs, if you would like additional functionality, if you have questions about using the interface, or if you would like to give any feedback (even positive!) about the interface. We look forward to learning how the tool has (not) been useful to you.


TROUBLESHOOTING

Thanks to feedback from the use of the interface in experiments by the authors and others, many bugs have been caught, and fatal crashes due to the interface (provided it is set up properly by the user) seem to be a thing of the past.
However, if things do go wrong or the test needs to be interrupted for whatever reason, not all data is lost. In a normal scenario, the test needs to be completed to the end (the final ‘Submit’), at which point the output XML is stored in ‘saves/’. If this stage is not reached, open the JavaScript Console (see below for how to find it), type ‘createProjectSave()’ (without the quotes) and hit enter. This opens a pop-up window with a hyperlink that reads ‘Save File’; click it and an XML file with the results up to that point should be stored in your download folder.
Alternatively, a lot of data can be read from the same console, to which the tool prints a lot of debug information. Specifically:
- the randomisation of pages and fragments is logged;
- any time a slider is played, its ID and the time stamp (in seconds since the start of the test) are displayed;
- any time a slider is dragged and dropped, the location where it is dropped is shown, including the time stamp;
- any comments and pre- or post-test questions and their answers are logged as well.

You can select all of this and save it into a text file, so that none of this data is lost. You may choose to do this even when a test was successful, as an extra precaution.

In Google Chrome, the JavaScript Console can be found in View>Developer>JavaScript Console, or via the keyboard shortcut Cmd + Alt + J (Mac OS X).
In Safari, the JavaScript Console can be found in Develop>Show Error Console, or via the keyboard shortcut Cmd + Alt + C (Mac OS X). Note that for the Develop menu to be visible, you have to go to Preferences (Cmd + ,) and enable ‘Show Develop menu in menu bar’ in the ‘Advanced’ tab.
In Firefox, go to Tools>Web Developer>Web Console, or hit Cmd + Alt + K.


REMOTE TESTS

As the test is browser-based, it can be run remotely from a web server without modification. To allow for remote storage of the output XML files (as opposed to saving them locally on the subject’s machine, which is the default if no ‘save’ path is specified or found), a PHP script on the server needs to accept the output XML files. An example of such a script will be included in a future version.

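Until that example ships, the following sketch illustrates the idea in Python 3 (not the promised PHP script, and not part of the tool): a minimal HTTP handler that accepts a POSTed XML document and writes it into a ‘saves’ folder. All names here (SaveHandler, SAVE_DIR, the result-N.xml naming) are illustrative assumptions.

```python
import http.server
import os

SAVE_DIR = "saves"  # illustrative; mirrors the tool's default output folder

class SaveHandler(http.server.BaseHTTPRequestHandler):
    """Store each POSTed XML document as a new file in SAVE_DIR."""

    def do_POST(self):
        # Read the request body (the output XML sent by the test page)
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        os.makedirs(SAVE_DIR, exist_ok=True)
        # Number the files so successive submissions do not overwrite each other
        index = len(os.listdir(SAVE_DIR))
        path = os.path.join(SAVE_DIR, "result-%d.xml" % index)
        with open(path, "wb") as f:
            f.write(body)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"OK")

# To serve: http.server.HTTPServer(("", 8000), SaveHandler).serve_forever()
```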

SCRIPTS

The tool comes with a few handy Python 2.7 scripts for easy extraction of ratings or comments, and for visualisation of ratings and timelines. See below for a quick guide on how to use them. Visualisation requires the free matplotlib toolbox (http://matplotlib.org), as well as numpy and scipy.
By default, the scripts can be run from the ‘scripts’ folder, with the result files in the ‘saves’ folder (the default location where result XMLs are stored). Each script takes the XML file folder as an argument, along with other arguments in some cases.
Note: to avoid all kinds of problems, please avoid spaces in file and folder names (they may work on some systems, but others don’t like them).

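As an illustration of the kind of traversal these scripts do, here is a sketch (in modern Python 3, unlike the bundled 2.7 scripts) that pulls ratings and comments out of a result file. The element and attribute names below are simplified assumptions, not the exact schema; consult the Results Specification Document in /docs/ for the real output XML format.

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical result file: the real schema is defined in the
# Results Specification Document (docs/).
sample = """
<result>
  <audioholder id="example-page">
    <audioelement id="0"><value>0.62</value><comment>dull</comment></audioelement>
    <audioelement id="1"><value>0.80</value><comment>bright</comment></audioelement>
  </audioholder>
</result>
"""

rows = []
root = ET.fromstring(sample)
# One row per 'audioelement'/fragment, within each 'audioholder'/page
for holder in root.iter("audioholder"):
    for fragment in holder.iter("audioelement"):
        rows.append((holder.get("id"),
                     fragment.get("id"),
                     float(fragment.findtext("value")),
                     fragment.findtext("comment")))

for row in rows:
    print(row)
```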
comment_parser.py
Extracts comments from the output XML files corresponding to the different subjects found in ‘saves/’. It creates a folder per ‘audioholder’/page it finds, and stores a CSV file with comments for every ‘audioelement’/fragment within these respective ‘audioholders’/pages. In this CSV file, every line corresponds to a subject/output XML file. Depending on the settings, the first column, containing the name of the corresponding XML file, can be omitted (for anonymisation).
Beware of Excel: sometimes the UTF-8 encoding is not imported properly, leading to problems with special characters in the comments (particularly cumbersome for foreign languages).

evaluation_stats.py
Shows a few statistics of the tests in the ‘saves/’ folder so far, mainly for checking for errors. Shows the number of files present, the audioholder IDs that were tested (and how many of each separate ID), the duration of each page, the duration of each complete test, the average duration per page, and the average duration as a function of the page number.

generate_report.py
Similar to ‘evaluation_stats.py’, but generates a PDF report based on the output files in the ‘saves/’ folder - or any folder specified as a command line argument. Writes a LaTeX document and then converts it to PDF using pdflatex.

score_parser.py
Extracts rating values from the XML to CSV - necessary for running the visualisation of ratings. Creates the folder ‘saves/ratings/’ if it does not exist yet, and writes to it a separate file for every ‘audioholder’/page in any of the output XMLs it finds in ‘saves/’. Within each file, rows represent different subjects (output XML file names) and columns represent different ‘audioelements’/fragments.

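The layout described above can be sketched as follows; the subject file names and fragment IDs are made up for illustration, and this is not the script's actual code:

```python
import csv
import io

# Hypothetical ratings for one 'audioholder'/page: outer keys are subjects
# (output XML file names), inner keys are 'audioelement'/fragment IDs.
ratings = {
    "subject1.xml": {"0": 0.62, "1": 0.80},
    "subject2.xml": {"0": 0.55, "1": 0.91},
}

# Collect every fragment ID seen across subjects, to form the columns
fragments = sorted({f for scores in ratings.values() for f in scores})
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow([""] + fragments)  # header row: one column per fragment
for subject in sorted(ratings):
    # One row per subject; blank cell where a rating is missing
    writer.writerow([subject] + [ratings[subject].get(f, "") for f in fragments])

print(buffer.getvalue())
```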
score_plot.py
Plots the ratings as stored in the CSVs created by score_parser.py.
Depending on the settings, it displays and/or saves (in ‘saves/ratings/’) a boxplot, confidence interval plot, scatter plot, or a combination of the aforementioned.
Requires the free matplotlib library.
At this point, more than one subject is needed for this script to work.

timeline_view_movement.py
Creates a timeline for every subject, for every ‘audioholder’/page, corresponding to any of the output XML files found in ‘saves/’. It shows the marker movements of the different fragments, along with when each fragment was played (red regions). Automatically takes the fragment names, rating axis title, rating axis labels, and audioholder name from the XML file (if available).

timeline_view.py
Creates a timeline for every subject, for every ‘audioholder’/page, corresponding to any of the output XML files found in ‘saves/’. It shows when and for how long the subject listened to each of the fragments.


REFERENCES

[1] B. De Man and J. D. Reiss, “APE: Audio Perceptual Evaluation toolbox for MATLAB,” 136th Convention of the Audio Engineering Society, 2014.

[2] N. Jillings, B. De Man, D. Moffat and J. D. Reiss, “Web Audio Evaluation Tool: A Browser-Based Listening Test Environment,” 12th Sound and Music Computing Conference, July 2015.