WEB AUDIO EVALUATION TOOL

This is not (yet) a fully fledged manual.


AUTHORS
Nicholas Jillings <n.g.r.jillings@se14.qmul.ac.uk>
Brecht De Man <b.deman@qmul.ac.uk>
David Moffat <d.j.moffat@qmul.ac.uk>
Joshua D. Reiss (supervisor) <j.d.reiss@qmul.ac.uk>


PACKAGE CONTENTS

- main folder (/)
    - ape.css, core.css, graphics.css, structure.css: style files (edit to change appearance)
    - ape.js: JavaScript file for the APE-style interface [1]
    - core.js: JavaScript file with core functionality
    - index.html: webpage where the interface should appear
    - jquery-2.1.4.js: jQuery JavaScript Library
    - pythonServer.py: webserver for running tests locally
    - pythonServer-legacy.py: webserver with limited functionality (no storing of output XML files)
- Documentation (/docs/)
    - Project Specification Document (LaTeX/PDF)
    - Results Specification Document (LaTeX/PDF)
    - SMC15: PDF and LaTeX source of the corresponding SMC2015 publication
- Example project (/example_eval/)
    An example of what the setup XML should look like, with example audio files 0.wav-10.wav: short recordings (44.1 kHz, 16 bit) of a woman saying the corresponding number (useful for testing randomisation and for general familiarisation with the interface).
- Output files (/saves/)
    The output XML files of tests will be stored here by default by the pythonServer.py script.
- Auxiliary scripts (/scripts/)
    Helpful Python scripts for extraction and visualisation of data.
- Test creation tool (/test_create/)
    Webpage for easily setting up a test without having to delve into the XML.


QUICK START
Using the example project:
1. Make sure your system sample rate matches the sample rate of the audio files, if the input XML file enforces a specific sample rate.
2. Run pythonServer.py (make sure you have Python installed); see the example below.
3. Open a browser (anything but Internet Explorer).
4. Go to ‘localhost:8000’.
5. The test should open; complete it and look at the output XML file in /saves/.
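
For example, from a terminal in the tool’s main folder (assuming Python 2.7 is installed and on your PATH):

    python pythonServer.py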


LEGACY
The APE interface and much of its functionality are inspired by the APE toolbox for MATLAB [1]. See https://code.soundsoftware.ac.uk/projects/ape for the source code and corresponding paper.


CITING

We request that you acknowledge the authors and cite our work [2]; see CITING.txt.


LICENSE

See LICENSE.txt. This code is shared under the GNU General Public License v3.0 (http://choosealicense.com/licenses/gpl-3.0/). Generally speaking, this is a copyleft license that requires anyone who distributes our code or a derivative work to make the source available under the same terms.


FEATURE REQUESTS AND BUG REPORTS

We continually develop this tool to fix issues and implement features useful to us or our user base. See https://code.soundsoftware.ac.uk/projects/webaudioevaluationtool/issues for a list of feature requests and bug reports, and their status.

Please contact the authors if you experience any bugs, if you would like additional functionality, if you have questions about using the interface, or if you would like to give any feedback (even positive!) about the interface. We look forward to learning how the tool has (not) been useful to you.


TROUBLESHOOTING

Thanks to feedback from experiments run by the authors and others, many bugs have been caught, and fatal crashes caused by the interface (provided it is set up properly by the user) seem to be a thing of the past.
However, if things do go wrong or the test needs to be interrupted for whatever reason, not all data is lost. In a normal scenario, the test needs to be completed to the end (the final ‘Submit’), at which point the output XML is stored in ‘saves/’. If this stage is not reached, open the JavaScript Console (see below for how to find it), type ‘createProjectSave()’ (without the quotes) and hit enter. This opens a pop-up window with a hyperlink that reads ‘Save File’; click it and an XML file with the results up to that point should be stored in your download folder.
Alternatively, much of this data can be read from the same console, where the tool prints extensive debug information. Specifically:
- the randomisation of pages and fragments is logged;
- any time a slider is played, its ID and the time stamp (in seconds since the start of the test) are displayed;
- any time a slider is dragged and dropped, the location where it is dropped is shown along with the time stamp;
- any comments and pre- or post-test questions and their answers are logged as well.

You can select all of this and save it into a text file, so that none of this data is lost. You may choose to do this even when a test was successful, as an extra precaution.

In Google Chrome, the JavaScript Console can be found in View>Developer>JavaScript Console, or via the keyboard shortcut Cmd + Alt + J (Mac OS X).
In Safari, the JavaScript Console can be found in Develop>Show Error Console, or via the keyboard shortcut Cmd + Alt + C (Mac OS X). Note that for the Develop menu to be visible, you have to go to Preferences (Cmd + ,) and enable ‘Show Develop menu in menu bar’ in the ‘Advanced’ tab.
In Firefox, go to Tools>Web Developer>Web Console, or hit Cmd + Alt + K.


SCRIPTS

The tool comes with a few handy Python scripts for easy extraction of ratings or comments, and for visualisation of ratings and timelines. See below for a quick guide on how to use them. All scripts are written for Python 2.7. Visualisation requires the free matplotlib library (http://matplotlib.org), as well as numpy and scipy.
By default, the scripts can be run from the ‘scripts’ folder, with the result files in the ‘saves’ folder (the default location where result XMLs are stored).
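
For example, from a terminal in the ‘scripts’ folder (assuming the result XMLs are in ‘saves/’ and the dependencies above are installed), a typical session might be:

    python score_parser.py
    python score_plot.py
    python comment_parser.py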

comment_parser.py
Extracts the comments from the output XML files corresponding with the different subjects found in ‘saves/’. It creates a folder per ‘audioholder’/page it finds, and stores a CSV file with the comments for every ‘audioelement’/fragment within these respective ‘audioholders’/pages. In this CSV file, every line corresponds with a subject/output XML file. Depending on the settings, the first column, containing the name of the corresponding XML file, can be omitted (for anonymisation).
Beware of Excel: sometimes the UTF-8 encoding is not imported properly, leading to problems with special characters in the comments (particularly cumbersome for foreign languages).
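
As an illustration, the CSV for one fragment might look like this, with one line per subject (hypothetical file names and comments):

    subject1.xml,Sounds muffled and distant.
    subject2.xml,"Clear, but a bit harsh in the high end."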

evaluation_stats.py
Shows a few statistics of the tests in the ‘saves/’ folder so far, mainly to check for errors. It shows the number of files present, the audioholder IDs that were tested (and how many of each separate ID), the duration of each page, the duration of each complete test, the average duration per page, and the average duration as a function of the page number.

score_parser.py
Extracts rating values from the XML to CSV - necessary for the visualisation of ratings. Creates the folder ‘saves/ratings/’ if it does not yet exist, and writes a separate file for every ‘audioholder’/page in any of the output XMLs it finds in ‘saves/’. Within each file, rows represent different subjects (output XML file names) and columns represent different ‘audioelements’/fragments.
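
A minimal sketch (Python 2.7, standard library only) of reading such a rating CSV back in for further analysis; the file name is hypothetical and the layout follows the description above:

    import csv

    # Each row: a subject (output XML file name) followed by one rating per fragment.
    with open('../saves/ratings/example_page.csv', 'rb') as f:
        for row in csv.reader(f):
            subject = row[0]
            ratings = [float(r) for r in row[1:]]
            print subject, ratings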

score_plot.py
Plots the ratings as stored in the CSVs created by score_parser.py.
Depending on the settings, it displays and/or saves (in ‘saves/ratings/’) a boxplot, confidence interval plot, scatter plot, or a combination of the aforementioned.
Requires the free matplotlib library.
At this point, more than one subject is needed for this script to work.

timeline_view.py
Creates a timeline for every subject, for every ‘audioholder’/page, corresponding with any of the output XML files found in ‘saves/’. It shows when and for how long the subject listened to each of the fragments.


REFERENCES
[1] Brecht De Man and Joshua D. Reiss, “APE: Audio Perceptual Evaluation toolbox for MATLAB,” 136th Convention of the Audio Engineering Society, 2014.

[2] Nicholas Jillings, Brecht De Man, David Moffat and Joshua D. Reiss, “Web Audio Evaluation Tool: A Browser-Based Listening Test Environment,” 12th Sound and Music Computing Conference, July 2015.