changeset 444:9c9fd68693b1

Merge: pull revision info from dev_main.
author Nicholas Jillings <n.g.r.jillings@se14.qmul.ac.uk>
date Wed, 23 Dec 2015 14:36:00 +0000
parents 1081368deed7 (current diff) 4866152611e6 (diff)
children 63c4163fc705
files core.js docs/Instructions/BuildingInterface.tex docs/Instructions/ListeningTestInstructions.bib docs/Instructions/ListeningTestInstructions.pdf docs/Instructions/ListeningTestInstructions.tex docs/Instructions/User Guide.tex pythonServer.py
diffstat 26 files changed, 3388 insertions(+), 1647 deletions(-) [+]
line wrap: on
line diff
--- a/CITING.txt	Mon Dec 21 23:18:43 2015 +0000
+++ b/CITING.txt	Wed Dec 23 14:36:00 2015 +0000
@@ -5,7 +5,7 @@
 
 Feel free to let us know how you have used it! We highly welcome any kind of feedback, bug reports and feature requests. 
 
-n.g.r.jillings@se14.qmul.ac.uk
+nicholas.jillings@mail.bcu.ac.uk
 b.deman@qmul.ac.uk
 d.j.moffat@qmul.ac.uk
 joshua.reiss@qmul.ac.uk
\ No newline at end of file
--- a/README.txt	Mon Dec 21 23:18:43 2015 +0000
+++ b/README.txt	Wed Dec 23 14:36:00 2015 +0000
@@ -1,127 +1,11 @@
 WEB AUDIO EVALUATION TOOL
 
-This is not (yet) a fully fledged manual. 
-
-
 AUTHORS
 Nicholas Jillings 		<n.g.r.jillings@se14.qmul.ac.uk>
 Brecht De Man			<b.deman@qmul.ac.uk>
 David Moffat			<d.j.moffat@qmul.ac.uk>
 Joshua D. Reiss (supervisor)	<j.d.reiss@qmul.ac.uk>
 
+INSTRUCTIONS FOR USE
 
-PACKAGE CONTENTS
-
-- main folder (/)
-	- ape.css, core.css, graphics.css, structure.css: style files (edit to change appearance)
-	- ape.js: JavaScript file for APE-style interface [1]
-	- core.js: JavaScript file with core functionality
-	- index.html: webpage where interface should appear
-	- jquery-2.1.4.js: jQuery JavaScript Library
-	- pythonServer.py: webserver for running tests locally
-	- pythonServer-legacy.py: webserver with limited functionality (no storing of output XML files)
-- Documentation (/docs/)
-	- Project Specification Document (LaTeX/PDF)
-	- Results Specification Document (LaTeX/PDF)
-	- SMC15: PDF and LaTeX source of corresponding SMC2015 publication
-- Example project (/example_eval/)
-	An example of what the setup XML should look like, with example audio files 0.wav-10.wav: short recordings at 44.1 kHz, 16 bit of a woman saying the corresponding number (useful for testing randomisation and general familiarisation with the interface). 
-- Output files (/saves/)
-	The output XML files of tests will be stored here by default by the pythonServer.py script. 
-- Auxiliary scripts (/scripts/)
-	Helpful Python scripts for extraction and visualisation of data. 
-- Test creation tool (/test_create/)
-	Webpage for easily setting up a test without having to delve into the XML. 
-
-
-QUICK START
-
-Using the example project: 
-1. Make sure your system sample rate corresponds with the sample rate of the audio files, if the input XML file enforces the given sample rate. 
-2. Run pythonServer.py (make sure you have Python installed). 
-3. Open a browser (anything but Internet Explorer). 
-4. Go to ‘localhost:8000’. 
-5. The test should open; complete it and look at the output XML file in /saves/. 
-
-
-LEGACY
-
-The APE interface and most of the functionality of the interface is inspired by the APE toolbox for MATLAB [1]. See https://code.soundsoftware.ac.uk/projects/ape for the source code and corresponding paper. 
-
-
-CITING
-
-We request that you acknowledge the authors and cite our work [2], see CITING.txt. 
-
-
-LICENSE
-
-See LICENSE.txt. This code is shared under the GNU General Public License v3.0 (http://choosealicense.com/licenses/gpl-3.0/). Generally speaking, this is a copyleft license that requires anyone who distributes our code or a derivative work to make the source available under the same terms. 
-
-
-FEATURE REQUESTS AND BUG REPORTS
-
-We continually develop this tool to fix issues and implement features useful to us or our user base. See https://code.soundsoftware.ac.uk/projects/webaudioevaluationtool/issues for a list of feature requests and bug reports, and their status. 
-
-Please contact the authors if you experience any bugs, if you would like additional functionality, if you have questions about using the interface or if you would like to give any feedback (even positive!) about the interface. We look forward to learning how the tool has (not) been useful to you. 
-
-
-TROUBLESHOOTING
-
-Thanks to feedback from experiments run by the authors and others, many bugs have been caught, and fatal crashes caused by the interface (provided it is set up properly by the user) seem to be a thing of the past. 
-However, if things do go wrong or the test needs to be interrupted for whatever reason, not all data is lost. In a normal scenario, the test needs to be completed until the end (the final ‘Submit’), at which point the output XML is stored in ‘saves/’. If this stage is not reached, open the JavaScript Console (see below for how to find it), type ‘createProjectSave()’ (without the quotes) and hit enter. This will open a pop-up window with a hyperlink that reads ‘Save File’; click it and an XML file with the results up to that point should be stored in your download folder. 
-Alternatively, much of this data can be read from the same console, where the tool prints extensive debug information. Specifically:
-	- the randomisation of pages and fragments is logged;
-	- any time a slider is played, its ID and the time stamp (in seconds since the start of the test) are displayed;
-	- any time a slider is dragged and dropped, the location where it is dropped is shown, along with the time stamp; 
-	- any comments and pre- or post-test questions and their answers are logged as well. 
-
-You can select all this and save it into a text file, so that none of this data is lost. You may choose to do this even when a test was successful, as an extra precaution. 
-
-In Google Chrome, the JavaScript Console can be found in View>Developer>JavaScript Console, or via the keyboard shortcut Cmd + Alt + J (Mac OS X). 
-In Safari, the JavaScript Console can be found in Develop>Show Error Console, or via the keyboard shortcut Cmd + Alt + C (Mac OS X). Note that for the Developer menu to be visible, you have to go to Preferences (Cmd + ,) and enable ‘Show Develop menu in menu bar’ in the ‘Advanced’ tab. 
-In Firefox, go to Tools>Web Developer>Web Console, or hit Cmd + Alt + K. 
-
-
-REMOTE TESTS
-
-As the test is browser-based, it can be run remotely from a web server without modification. To allow for remote storage of the output XML files (as opposed to saving them locally on the subject’s machine, which is the default if no ‘save’ path is specified or found), a PHP script on the server needs to accept the output XML files. An example of such a script will be included in a future version. 
-
-
-SCRIPTS
-
-The tool comes with a few handy Python 2.7 scripts for easy extraction of ratings and comments, and visualisation of ratings and timelines. See below for a quick guide on how to use them. Visualisation requires the free matplotlib toolbox (http://matplotlib.org), as well as numpy and scipy. 
-By default, the scripts can be run from the ‘scripts’ folder, with the result files in the ‘saves’ folder (the default location where result XMLs are stored). Each script takes the XML file folder as an argument, along with other arguments in some cases.
-Note: to avoid all kinds of problems, please avoid using spaces in file and folder names (this may work on some systems, but others will not accept it). 
-
-	comment_parser.py
-		Extracts comments from the output XML files corresponding with the different subjects found in ‘saves/’. It creates a folder per ‘audioholder’/page it finds, and stores a CSV file with comments for every ‘audioelement’/fragment within these respective ‘audioholders’/pages. In this CSV file, every line corresponds with a subject/output XML file. Depending on the settings, the first column containing the name of the corresponding XML file can be omitted (for anonymisation). 
-		Beware of Excel: sometimes the UTF-8 encoding is not properly imported, leading to problems with special characters in the comments (particularly cumbersome for foreign languages). 
-
-	evaluation_stats.py
-		Shows a few statistics for the tests in the ‘saves/’ folder, mainly for checking for errors: the number of result files, the audioholder IDs that were tested (and how many of each), the duration of each page, the duration of each complete test, the average duration per page, and the average duration as a function of the page number. 
-
-	generate_report.py
-		Similar to ‘evaluation_stats.py’, but generates a PDF report based on the output files in the ‘saves/’ folder - or any folder specified as a command line argument. Writes a LaTeX document and then converts it to a PDF using pdflatex. 
-
-	score_parser.py
-		Extracts rating values from the XML to CSV - necessary for running visualisation of ratings. Creates the folder ‘saves/ratings/’ if it does not yet exist, to which it writes a separate file for every ‘audioholder’/page in any of the output XMLs it finds in ‘saves/’. Within each file, rows represent different subjects (output XML file names) and columns represent different ‘audioelements’/fragments. 
-
-	score_plot.py
-		Plots the ratings as stored in the CSVs created by score_parser.py. 
-		Depending on the settings, it displays and/or saves (in ‘saves/ratings/’) a boxplot, confidence interval plot, scatter plot, or a combination of the aforementioned. 
-		Requires the free matplotlib library. 
-		At this point, more than one subject is needed for this script to work. 
-
-	timeline_view_movement.py
-		Creates a timeline for every subject, for every ‘audioholder’/page, corresponding with any of the output XML files found in ‘/saves’. It shows the marker movements of the different fragments, along with when each fragment was played (red regions). Automatically takes fragment names, rating axis title, rating axis labels, and audioholder name from the XML file (if available). 
-
-	timeline_view.py
-		Creates a timeline for every subject, for every ‘audioholder’/page, corresponding with any of the output XML files found in ‘/saves’. It shows when and for how long the subject listened to each of the fragments. 
-
-
-
-REFERENCES
-[1] B. De Man and Joshua D. Reiss, “APE: Audio Perceptual Evaluation toolbox for MATLAB,” 136th Convention of the Audio Engineering Society, 2014.
-
-[2] Nicholas Jillings, Brecht De Man, David Moffat and Joshua D. Reiss, "Web Audio Evaluation Tool: A Browser-Based Listening Test Environment," 12th Sound and Music Computing Conference, July 2015.
+Please refer to ‘docs/Instructions/Instructions.pdf’
\ No newline at end of file
--- a/analyse.html	Mon Dec 21 23:18:43 2015 +0000
+++ b/analyse.html	Wed Dec 23 14:36:00 2015 +0000
@@ -15,44 +15,18 @@
 		<script type="text/javascript">
 			// To aid 'one-page set-up' all scripts and CSS must be included directly in this file!
 			
-			//google.load("visualization", "1", {packages:["corechart"]});
+			google.load("visualization", "1", {packages:["corechart"]});
 			
 			/*************
 			*	SETUP   *
 			*************/
 			// folder where to find the XML files
-			xmlFileFolder = "analysis_test";
+			xmlFileFolder = "saves";
 			// array of XML files
-			var xmlFiles = ['McG-A-2013-09.xml', 'McG-A-2014-03.xml', 'McG-A-2014-12.xml', 'McG-B-2013-09.xml', 
-			'McG-B-2014-03.xml', 'McG-B-2014-12.xml', 'McG-C-2013-09.xml', 'McG-C-2014-03.xml', 'McG-C-2014-12.xml', 
-			'McG-D-2013-09.xml', 'McG-D-2014-03.xml', 'McG-D-2014-12.xml', 'McG-E-2013-09.xml', 'McG-E-2014-03.xml', 
-			'McG-E-2014-12.xml', 'McG-F-2013-09.xml', 'McG-F-2014-03.xml', 'McG-F-2014-12.xml', 'McG-G-2014-03.xml', 
-			'McG-G-2014-12.xml', 'McG-H-2013-09.xml', 'McG-H-2014-03.xml', 'McG-H-2014-12.xml', 'McG-I-2013-09.xml', 
-			'McG-I-2014-03.xml', 'McG-J-2013-09.xml', 'McG-J-2014-03.xml', 'McG-K-2013-09.xml', 'McG-K-2014-03.xml', 
-			'McG-L-2013-09.xml', 'McG-L-2014-03.xml', 'McG-M-2013-09.xml', 'McG-M-2014-03.xml', 'McG-N-2013-09.xml', 
-			'McG-N-2014-03.xml', 'McG-O-2013-09.xml', 'McG-O-2014-03.xml', 'McG-P-2013-09.xml', 'McG-P-2014-03.xml', 
-			'McG-pro1-2013-09.xml', 'McG-pro1-2014-03.xml', 'McG-pro1-2014-12.xml', 'McG-pro2-2013-09.xml', 
-			'McG-pro2-2014-03.xml', 'McG-pro2-2014-12.xml', 'McG-Q-2014-12.xml', 'McG-R-2014-12.xml', 
-			'McG-S-2014-12.xml', 'McG-subA-2013-09.xml', 'McG-subA-2014-03.xml', 'McG-subB-2014-03.xml', 
-			'McG-subB-2014-12.xml', 'McG-subC-2013-09.xml', 'McG-subC-2014-03.xml', 'McG-subC-2014-12.xml', 
-			'McG-subD-2013-09.xml', 'McG-subD-2014-12.xml', 'McG-subE-2014-12.xml', 'McG-subG-2014-12.xml', 
-			'McG-subH-2013-09.xml', 'McG-T-2014-12.xml', 'McG-U-2014-12.xml', 'McG-V-2014-12.xml', 
-			'McG-W-2014-12.xml', 'McG-X-2014-12.xml', 'MG1-2013-09.xml', 'MG2-2013-09.xml', 'MG3-2013-09.xml', 
-			'MG4-2013-09.xml', 'MG5-2013-09.xml', 'MG6-2013-09.xml', 'MG7-2013-09.xml', 'MG8-2013-09.xml', 
-			'MG9-2013-09.xml', 'QM-1-1.xml', 'QM-1-2.xml', 'QM-10-1.xml', 'QM-11-1.xml', 'QM-11-2.xml', 'QM-12-1.xml', 'QM-12-2.xml', 
-			'QM-13-1.xml', 'QM-14-1.xml', 'QM-15-1.xml', 'QM-16-1.xml', 'QM-17-1.xml', 'QM-18-1.xml', 'QM-18-2.xml', 
-			'QM-18-3.xml', 'QM-19-1.xml', 'QM-2-1.xml', 'QM-2-2.xml', 'QM-2-3.xml', 'QM-20-1.xml', 'QM-20-2.xml', 
-			'QM-20-3.xml', 'QM-21-1.xml', 'QM-21-2.xml', 'QM-3-1.xml', 'QM-3-2.xml', 'QM-3-3.xml', 'QM-4-1.xml', 'QM-5-1.xml', 
-			'QM-5-2.xml', 'QM-6-1.xml', 'QM-6-2.xml', 'QM-7-1.xml', 'QM-7-2.xml', 'QM-8-1.xml', 'QM-9-1.xml',
-			'PXL-L1.xml','PXL-L2.xml','PXL-L3.xml','PXL-L4.xml','PXL-L5.xml','PXL-S1.xml','PXL-S2.xml','PXL-S3.xml',
-			'PXL-S4.xml','PXL-S5.xml','PXL-S6.xml','PXL-S7.xml','PXL-pro.xml','DU-A1.xml','DU-A2.xml','DU-B1.xml',
-			'DU-B2.xml','DU-C1.xml','DU-C2.xml','DU-D1.xml','DU-D2.xml','DU-E1.xml','DU-F1.xml','DU-F2.xml','DU-G1.xml',
-			'DU-G2.xml','DU-H1.xml','DU-H2.xml','DU-I2.xml','DU-J2.xml','DU-K1.xml','DU-K2.xml','DU-L1.xml','DU-L2.xml',
-			'DU-M1.xml','DU-M2.xml','DU-N1.xml','DU-O1.xml','DU-O2.xml','DU-P1.xml','DU-P2.xml','DU-Q1.xml','DU-Q2.xml',
-			'DU-R1.xml','DU-R2.xml','DU-S1.xml','DU-S2.xml','DU-T1.xml','DU-T2.xml','DU-U1.xml','DU-U2.xml','DU-U3.xml'];
-			//['QM-1-1.xml','QM-2-1.xml','QM-2-2.xml','QM-2-3.xml','QM-3-1.xml','QM-3-2.xml','QM-4-1.xml','QM-5-1.xml','QM-5-2.xml','QM-6-1.xml','QM-6-2.xml','QM-7-1.xml','QM-7-2.xml','QM-8-1.xml','QM-9-1.xml','QM-10-1.xml','QM-11-1.xml','QM-12-1.xml','QM-12-2.xml','QM-13-1.xml','QM-14-1.xml','QM-15-1.xml','QM-16-1.xml','QM-17-1.xml','QM-18-1.xml','QM-18-2.xml','QM-18-3.xml','QM-19-1.xml','QM-20-1.xml','QM-20-2.xml','QM-20-3.xml','QM-21-1.xml','QM-21-2.xml'];
-			//['McG-A-2014-03.xml','McG-B-2014-03.xml','McG-C-2014-03.xml','McG-D-2014-03.xml','McG-E-2014-03.xml','McG-F-2014-03.xml','McG-G-2014-03.xml','McG-H-2014-03.xml'];
+			// THIS IS WHERE YOU SPECIFY RESULT XML FILES TO ANALYSE
+			var xmlFiles = ['test-0.xml','test-1.xml','test-2.xml','test-3.xml']; 
 							
+
 			//TODO: make retrieval of file names automatic / drag files on here
 			
 			/****************
@@ -103,8 +77,8 @@
 				var setup = document.createElement('div');
 				setup.id = 'setupTagDiv';
 				loadAllFiles();
+				makePlots();
 				printSurveyData() 
-				//makePlots();
 				// measure time at this point: 
 				lastTimeMeasured = new Date().getTime(); // in milliseconds
 			};
@@ -424,11 +398,11 @@
 							}
 						}
 
-						// mix experience
+						// post-test info
 						if (posttestnode) {
 							posttestcomments = posttestnode.getElementsByTagName('comment');
 							for (idx=0; idx < posttestcomments.length; idx++){
-								commentsToPrint = ['generalExperience', 'interfaceExperience'];
+								commentsToPrint = ['age', 'location']; // CHANGE WHAT TO PRINT
 								idAttribute = posttestcomments[idx].getAttribute('id');
 								if (commentsToPrint.indexOf(idAttribute) >= 0) { // if exists? 
 									document.getElementById('div_survey_'+xmlFileName).innerHTML += '<br><strong>'+idAttribute+': </strong>'+posttestcomments[idx].textContent;
--- a/ape.css	Mon Dec 21 23:18:43 2015 +0000
+++ b/ape.css	Wed Dec 23 14:36:00 2015 +0000
@@ -14,7 +14,8 @@
 div.pageTitle {
 	width: auto;
 	height: 20px;
-	margin-top: 20px;
+	margin-top: 5px;
+	margin-bottom: 10px;
 }
 
 div.pageTitle span{
@@ -27,16 +28,22 @@
 	background-color: #ddd
 }
 
-div#slider {
+div.slider {
 	/* Specify any structure for the slider holder interface */
 	background-color: #eee;
 	height: 150px;
 	margin-bottom: 5px;
+	-moz-user-select: -moz-none;
+	-khtml-user-select: none;
+	-webkit-user-select: none;
 }
 
 div.sliderScale {
 	width: 100%;
 	min-height: 30px;
+	-moz-user-select: -moz-none;
+	-khtml-user-select: none;
+	-webkit-user-select: none;
 }
 
 div.sliderScale span {
@@ -54,9 +61,12 @@
 	width: 12px;
 	float: left;
 	background-color: rgb(100,200,100);
+	-moz-user-select: -moz-none;
+	-khtml-user-select: none;
+	-webkit-user-select: none;
 }
 
-div#outside-reference {
+div.outside-reference {
 	width:120px;
 	padding-left: 55px;
 	margin-left: 100px;
--- a/ape.js	Mon Dec 21 23:18:43 2015 +0000
+++ b/ape.js	Wed Dec 23 14:36:00 2015 +0000
@@ -7,8 +7,6 @@
 // Once this is loaded and parsed, begin execution
 loadInterface();
 
-var clicking = -1;
-
 function loadInterface() {
 	
 	// Get the dimensions of the screen available to the page
@@ -20,42 +18,9 @@
 	var testContent = document.createElement('div');
 	
 	testContent.id = 'testContent';
-
-	
-	// Create APE specific metric functions
-	audioEngineContext.metric.initialiseTest = function()
-	{
-	};
-	
-	audioEngineContext.metric.sliderMoved = function()
-	{
-		var id = this.data;
-		this.data = -1;
-		var position = convSliderPosToRate(id);
-        console.log('slider ' + id + ': '+ position + ' (' + time + ')'); // DEBUG/SAFETY: show position and slider id
-		if (audioEngineContext.timer.testStarted)
-		{
-			audioEngineContext.audioObjects[id].metric.moved(time,position);
-		}
-	};
-	
-	audioEngineContext.metric.sliderPlayed = function(id)
-	{
-		var time = audioEngineContext.timer.getTestTime();
-		if (audioEngineContext.timer.testStarted)
-		{
-			if (this.lastClicked >= 0)
-			{
-				audioEngineContext.audioObjects[this.lastClicked].metric.listening(time);
-			}
-			this.lastClicked = id;
-			audioEngineContext.audioObjects[id].metric.listening(time);
-		}
-        console.log('slider ' + id + ' played (' + time + ')'); // DEBUG/SAFETY: show played slider id
-	};
 	
 	// Bindings for interfaceContext
-	Interface.prototype.checkAllPlayed = function()
+	interfaceContext.checkAllPlayed = function()
 	{
 		hasBeenPlayed = audioEngineContext.checkAllPlayed();
 		if (hasBeenPlayed.length > 0) // if a fragment has not been played yet
@@ -79,32 +44,46 @@
 	    return true;
 	};
 	
-	Interface.prototype.checkAllMoved = function() {
-		var audioObjs = audioEngineContext.audioObjects;
+	interfaceContext.checkAllMoved = function() {
 		var state = true;
-		var strNums = [];
-		for (var i=0; i<audioObjs.length; i++)
+		var str = 'You have not moved the following sliders. ';
+		for (var i=0; i<this.interfaceSliders.length; i++)
 		{
-			if (audioObjs[i].metric.wasMoved == false && audioObjs[i].specification.type != 'outsidereference') {
-				state = false;
-				strNums.push(i);
+			var interfaceTID = [];
+			for (var j=0; j<this.interfaceSliders[i].metrics.length; j++)
+			{
+				if (this.interfaceSliders[i].metrics[j].wasMoved == false)
+				{
+					state = false;
+					interfaceTID.push(j);
+				}
+			}
+			if (interfaceTID.length != 0)
+			{
+				var interfaceName = this.interfaceSliders[i].interfaceObject.title;
+				if (interfaceName == undefined) {
+					str += 'On axis '+String(i+1)+' you must move ';
+				} else {
+					str += 'On axis "'+interfaceName+'" you must move ';
+				}
+				if (interfaceTID.length == 1)
+				{
+					str += 'slider '+interfaceTID[0]+'. ';
+				}
+				else {
+					str += 'sliders ';
+					for (var k=0; k<interfaceTID.length-1; k++)
+					{
+						str += interfaceTID[k]+', ';
+					}
+					str += interfaceTID[interfaceTID.length-1] +'. ';
+				}
 			}
 		}
-		if (state == false) {
-			if (strNums.length > 1) {
-				var str = "";
-		    	for (var i=0; i<strNums.length; i++) {
-		    		str = str + strNums[i];
-		    		if (i < strNums.length-2){
-		    			str += ", ";
-		    		} else if (i == strNums.length-2) {
-		    			str += " or ";
-		    		}
-		    	}
-		    	alert('You have not moved fragments ' + str + ' yet. Please listen, rate and comment all samples before submitting.');
-	       } else {
-	       		alert('You have not moved fragment ' + strNums[0] + ' yet. Please listen, rate and comment all samples before submitting.');
-	       }
+		if (state != true)
+		{
+			alert(str);
+			console.log(str);
 		}
 		return state;
 	};
@@ -146,34 +125,46 @@
 	{
 		var audioObjs = audioEngineContext.audioObjects;
 		var audioHolder = testState.stateMap[testState.stateIndex];
-		var interfaces = audioHolder.interfaces;
-		
-		var minRanking = audioObjs[0].interfaceDOM.getValue();
-		var maxRanking = minRanking;
-		
-		var minScale;
-		var maxScale;
-		for (var i=0; i<interfaces[0].options.length; i++)
+		var state = true;
+		var str = '';
+		for (var i=0; i<this.interfaceSliders.length; i++)
 		{
-			if (interfaces[0].options[i].check == "scalerange") {
-				minScale = interfaces[0].options[i].min;
-				maxScale = interfaces[0].options[i].max;
+			var minScale;
+			var maxScale;
+			var interfaceObject = interfaceContext.interfaceSliders[0].interfaceObject;
+			for (var j=0; j<interfaceObject.options.length; j++)
+			{
+				if (interfaceObject.options[j].check == "scalerange") {
+					minScale = interfaceObject.options[j].min;
+					maxScale = interfaceObject.options[j].max;
+					break;
+				}
+			}
+			var minRanking = convSliderPosToRate(this.interfaceSliders[i].sliders[0]);
+			var maxRanking = minRanking;
+			for (var j=1; j<this.interfaceSliders[i].sliders.length; j++)
+			{
+				var ranking = convSliderPosToRate(this.interfaceSliders[i].sliders[j]);
+				if (ranking < minRanking)
+				{
+					minRanking = ranking;
+				} else if (ranking > maxRanking)
+				{
+					maxRanking = ranking;
+				}
+			}
+			if (minRanking > minScale || maxRanking < maxScale)
+			{
+				state = false;
+				str += 'On axis "'+this.interfaceSliders[i].interfaceObject.title+'" you have not used the full width of the scale. ';
 			}
 		}
-		
-		for (var i=1; i<audioObjs.length; i++){
-			if (audioObjs[i].specification.type != 'outsidereference') {
-				var ranking = audioObjs[i].interfaceDOM.getValue();
-				if (ranking < minRanking) { minRanking = ranking;}
-				if (ranking > maxRanking) { maxRanking = ranking;}
-			}
+		if (state != true)
+		{
+			alert(str);
+			console.log(str);
 		}
-		if (minRanking > minScale || maxRanking < maxScale) {
-			alert('Please use the full width of the scale');
-			return false;
-		} else {
-			return true;
-		}
+		return state;
 	};
 	
 	Interface.prototype.objectSelected = null;
@@ -207,6 +198,9 @@
 		return this.objectMoved;
 	};
 	
+	// Bindings for slider interfaces
+	Interface.prototype.interfaceSliders = [];
+	
 	// Bindings for audioObjects
 	
 	// Create the top div for the Title element
@@ -225,13 +219,6 @@
 	// Insert the titleSpan element into the title div element.
 	title.appendChild(titleSpan);
 	
-	var pagetitle = document.createElement('div');
-	pagetitle.className = "pageTitle";
-	pagetitle.align = "center";
-	var titleSpan = document.createElement('span');
-	titleSpan.id = "pageTitle";
-	pagetitle.appendChild(titleSpan);
-	
 	// Create Interface buttons!
 	var interfaceButtons = document.createElement('div');
 	interfaceButtons.id = 'interface-buttons';
@@ -259,38 +246,9 @@
 	interfaceButtons.appendChild(playback);
 	interfaceButtons.appendChild(submit);
 	
-	// Now create the slider and HTML5 canvas boxes
+	var sliderHolder = document.createElement("div");
+	sliderHolder.id = "slider-holder";
 	
-	// Create the div box to center align
-	var sliderBox = document.createElement('div');
-	sliderBox.className = 'sliderCanvasDiv';
-	sliderBox.id = 'sliderCanvasHolder';
-	
-	// Create the slider box to hold the slider elements
-	var canvas = document.createElement('div');
-	canvas.id = 'slider';
-	canvas.align = "left";
-	canvas.addEventListener('dragover',function(event){
-		event.preventDefault();
-		event.dataTransfer.effectAllowed = 'none';
-		event.dataTransfer.dropEffect = 'copy';
-		return false;
-	},false);
-	var sliderMargin = document.createAttribute('marginsize');
-	sliderMargin.nodeValue = 42; // Set default margins to 42px either side
-	// Must have a known EXACT width, as this is used later to determine the ratings
-	var w = (Number(sliderMargin.nodeValue)+8)*2;
-	canvas.style.width = width - w +"px";
-	canvas.style.marginLeft = sliderMargin.nodeValue +'px';
-	canvas.setAttributeNode(sliderMargin);
-	sliderBox.appendChild(canvas);
-	
-	// Create the div to hold any scale objects
-	var scale = document.createElement('div');
-	scale.className = 'sliderScale';
-	scale.id = 'sliderScaleHolder';
-	scale.align = 'left';
-	sliderBox.appendChild(scale);
 	
 	// Global parent for the comment boxes on the page
 	var feedbackHolder = document.createElement('div');
@@ -301,9 +259,8 @@
 	
 	// Inject into HTML
 	testContent.appendChild(title); // Insert the title
-	testContent.appendChild(pagetitle);
 	testContent.appendChild(interfaceButtons);
-	testContent.appendChild(sliderBox);
+	testContent.appendChild(sliderHolder);
 	testContent.appendChild(feedbackHolder);
 	interfaceContext.insertPoint.appendChild(testContent);
 
@@ -315,19 +272,27 @@
 
 function loadTest(audioHolderObject)
 {
-	
-	// Reset audioEngineContext.Metric globals for new test
-	audioEngineContext.newTestPage();
-	
+	var width = window.innerWidth;
+	var height = window.innerHeight;
 	var id = audioHolderObject.id;
 	
+	interfaceContext.interfaceSliders = [];
+	
 	var feedbackHolder = document.getElementById('feedbackHolder');
-	var canvas = document.getElementById('slider');
+	var sliderHolder = document.getElementById('slider-holder');
 	feedbackHolder.innerHTML = null;
-	canvas.innerHTML = null;
+	sliderHolder.innerHTML = null;
+	
+	// Delete outside reference
+	var outsideReferenceHolder = document.getElementById('outside-reference');
+	if (outsideReferenceHolder != null) {
+		document.getElementById('interface-buttons').removeChild(outsideReferenceHolder);
+	}
 	
 	var interfaceObj = audioHolderObject.interfaces;
 	for (var k=0; k<interfaceObj.length; k++) {
+		// Create the div box to center align
+		interfaceContext.interfaceSliders.push(new interfaceSliderHolder(interfaceObj[k]));
 		for (var i=0; i<interfaceObj[k].options.length; i++)
 		{
 			if (interfaceObj[k].options[i].type == 'option' && interfaceObj[k].options[i].name == 'playhead')
@@ -349,73 +314,34 @@
 					pagecountHolder = document.createElement('div');
 					pagecountHolder.id = 'page-count';
 				}
-				pagecountHolder.innerHTML = '<span>Test '+(audioHolderObject.presentedId+1)+' of '+specification.audioHolders.length+'</span>';
+				pagecountHolder.innerHTML = '<span>Page '+(audioHolderObject.presentedId+1)+' of '+specification.audioHolders.length+'</span>';
 				var inject = document.getElementById('interface-buttons');
 				inject.appendChild(pagecountHolder);
 			}
 		}
 	}
-	// Setup question title
 	
-	var commentBoxPrefix = "Comment on track";
-	if (interfaceObj.length != 0) {
-		interfaceObj = interfaceObj[0];
-		var titleNode = interfaceObj.title;
-		if (titleNode != undefined)
-		{
-			document.getElementById('pageTitle').textContent = titleNode;
-		}
-		var positionScale = canvas.style.width.substr(0,canvas.style.width.length-2);
-		var offset = Number(document.getElementById('slider').attributes['marginsize'].value);
-		var scale = document.getElementById('sliderScaleHolder');
-		scale.innerHTML = null;
-		$(interfaceObj.scale).each(function(index,scaleObj){
-			var value = document.createAttribute('value');
-			var position = Number(scaleObj[0])*0.01;
-			value.nodeValue = position;
-			var pixelPosition = (position*positionScale)+offset;
-			var scaleDOM = document.createElement('span');
-			scaleDOM.textContent = scaleObj[1];
-			scale.appendChild(scaleDOM);
-			scaleDOM.style.left = Math.floor((pixelPosition-($(scaleDOM).width()/2)))+'px';
-			scaleDOM.setAttributeNode(value);
-		});
-		
-		if (interfaceObj.commentBoxPrefix != undefined) {
-			commentBoxPrefix = interfaceObj.commentBoxPrefix;
-		}
-	}
-
-	/// CHECK FOR SAMPLE RATE COMPATIBILITY
-	if (audioHolderObject.sampleRate != undefined) {
-		if (Number(audioHolderObject.sampleRate) != audioContext.sampleRate) {
-			var errStr = 'Sample rates do not match! Requested '+Number(audioHolderObject.sampleRate)+', got '+audioContext.sampleRate+'. Please set the sample rate to match before completing this test.';
-			alert(errStr);
-			return;
-		}
-	}
+	var commentBoxPrefix = "Comment on fragment";
 	
 	var commentShow = audioHolderObject.elementComments;
 	
 	var loopPlayback = audioHolderObject.loop;
 
-	audioEngineContext.loopPlayback = loopPlayback;
-	// Create AudioEngine bindings for playback
-	
 	currentTestHolder = document.createElement('audioHolder');
 	currentTestHolder.id = audioHolderObject.id;
 	currentTestHolder.repeatCount = audioHolderObject.repeatCount;
 	
-	// Delete any previous audioObjects associated with the audioEngine
-	audioEngineContext.audioObjects = [];
-	interfaceContext.deleteCommentBoxes();
-	interfaceContext.deleteCommentQuestions();
-	
 	// Find all the audioElements from the audioHolder
 	$(audioHolderObject.audioElements).each(function(index,element){
 		// Find URL of track
 		// In this jQuery loop, variable 'this' holds the current audioElement.
 		
+		// Check if an outside reference
+		if (index == audioHolderObject.outsideReference)
+		{
+			return;
+		}
+		
 		// Now load each audio sample. First create the new track by passing the full URL
 		var trackURL = audioHolderObject.hostURL + element.url;
 		var audioObject = audioEngineContext.newTrack(element);
@@ -423,39 +349,48 @@
 		var node = interfaceContext.createCommentBox(audioObject);
 		
 		// Create a slider per track
-		audioObject.interfaceDOM = new sliderObject(audioObject);
-		
-		// Distribute it randomnly
-		var w = window.innerWidth - (offset+8)*2;
-		w = Math.random()*w;
-		w = Math.floor(w+(offset+8));
-		audioObject.interfaceDOM.trackSliderObj.style.left = w+'px';
-		
-		canvas.appendChild(audioObject.interfaceDOM.trackSliderObj);
-		audioObject.metric.initialised(convSliderPosToRate(audioObject.interfaceDOM.trackSliderObj));
+		audioObject.interfaceDOM = new sliderObject(audioObject,interfaceObj);
+		audioObject.metric.initialPosition = convSliderPosToRate(audioObject.interfaceDOM.trackSliderObjects[0]);
+		if (audioObject.state == 1)
+		{
+			audioObject.interfaceDOM.enable();
+		}
         
 	});
-	
 	$('.track-slider').mousedown(function(event) {
 		interfaceContext.selectObject($(this)[0]);
 	});
+	$('.track-slider').on('touchstart',null,function(event) {
+		interfaceContext.selectObject($(this)[0]);
+	});
 	
 	$('.track-slider').mousemove(function(event) {
 		event.preventDefault();
 	});
 	
-	$('#slider').mousemove(function(event) {
+	$('.slider').mousemove(function(event) {
 		event.preventDefault();
 		var obj = interfaceContext.getSelectedObject();
 		if (obj == null) {return;}
 		$(obj).css("left",event.clientX + "px");
 		interfaceContext.moveObject();
 	});
+	
+	$('.slider').on('touchmove',null,function(event) {
+		event.preventDefault();
+		var obj = interfaceContext.getSelectedObject();
+		if (obj == null) {return;}
+		var move = event.originalEvent.targetTouches[0].clientX - 6;
+		$(obj).css("left",move + "px");
+		interfaceContext.moveObject();
+	});
 
 	$(document).mouseup(function(event){
 		event.preventDefault();
 		var obj = interfaceContext.getSelectedObject();
 		if (obj == null) {return;}
+		var interfaceID = obj.parentElement.getAttribute("interfaceid");
+		var trackID = obj.getAttribute("trackindex");
 		if (interfaceContext.hasSelectedObjectMoved() == true)
 		{
 			var l = $(obj).css("left");
@@ -463,6 +398,7 @@
 			var time = audioEngineContext.timer.getTestTime();
 			var rate = convSliderPosToRate(obj);
 			audioEngineContext.audioObjects[id].metric.moved(time,rate);
+			interfaceContext.interfaceSliders[interfaceID].metrics[trackID].moved(time,rate);
 			console.log("slider "+id+" moved to "+rate+' ('+time+')');
 		} else {
 			var id = Number(obj.attributes['trackIndex'].value);
@@ -470,9 +406,9 @@
 			audioEngineContext.play(id);
 	        // Currently playing track red, rest green
 	        
-	        //document.getElementById('track-slider-'+index).style.backgroundColor = "#FF0000";
 	        $('.track-slider').removeClass('track-slider-playing');
-	        $(obj).addClass('track-slider-playing');
+	        var name = ".track-slider-"+obj.getAttribute("trackindex");
+	        $(name).addClass('track-slider-playing');
 	        $('.comment-div').removeClass('comment-box-playing');
 	        $('#comment-div-'+id).addClass('comment-box-playing');
 	        var outsideReference = document.getElementById('outside-reference');
@@ -482,6 +418,24 @@
 		interfaceContext.releaseObject();
 	});
 	
+	$('.slider').on('touchend',null,function(event){
+		var obj = interfaceContext.getSelectedObject();
+		if (obj == null) {return;}
+		var interfaceID = obj.parentElement.getAttribute("interfaceid");
+		var trackID = obj.getAttribute("trackindex");
+		if (interfaceContext.hasSelectedObjectMoved() == true)
+		{
+			var l = $(obj).css("left");
+			var id = obj.getAttribute('trackIndex');
+			var time = audioEngineContext.timer.getTestTime();
+			var rate = convSliderPosToRate(obj);
+			audioEngineContext.audioObjects[id].metric.moved(time,rate);
+			interfaceContext.interfaceSliders[interfaceID].metrics[trackID].moved(time,rate);
+			console.log("slider "+id+" moved to "+rate+' ('+time+')');
+		}
+		interfaceContext.releaseObject();
+	});
+	
 	
 	if (commentShow) {
 		interfaceContext.showCommentBoxes(feedbackHolder,true);
@@ -496,11 +450,12 @@
 	if (audioHolderObject.outsideReference != null) {
 		var outsideReferenceHolder = document.createElement('div');
 		outsideReferenceHolder.id = 'outside-reference';
+		outsideReferenceHolder.className = 'outside-reference';
 		outsideReferenceHolderspan = document.createElement('span');
 		outsideReferenceHolderspan.textContent = 'Reference';
 		outsideReferenceHolder.appendChild(outsideReferenceHolderspan);
 		
-		var audioObject = audioEngineContext.newTrack(audioHolderObject.outsideReference);
+		var audioObject = audioEngineContext.newTrack(audioHolderObject.audioElements[audioHolderObject.outsideReference]);
 		
 		outsideReferenceHolder.onclick = function(event)
 		{
@@ -521,30 +476,180 @@
 	//testWaitIndicator();
 }
 
-function sliderObject(audioObject) {
+function interfaceSliderHolder(interfaceObject)
+{
+	this.sliders = [];
+	this.metrics = [];
+	this.id = document.getElementsByClassName("sliderCanvasDiv").length;
+	this.name = interfaceObject.name;
+	this.interfaceObject = interfaceObject;
+	this.sliderDOM = document.createElement('div');
+	this.sliderDOM.className = 'sliderCanvasDiv';
+	this.sliderDOM.id = 'sliderCanvasHolder-'+this.id;
+	
+	var pagetitle = document.createElement('div');
+	pagetitle.className = "pageTitle";
+	pagetitle.align = "center";
+	var titleSpan = document.createElement('span');
+	titleSpan.id = "pageTitle-"+this.id;
+	if (interfaceObject.title != undefined && typeof interfaceObject.title == "string")
+	{
+		titleSpan.textContent = interfaceObject.title;
+	} else {
+		titleSpan.textContent = "Axis "+String(this.id+1);
+	}
+	pagetitle.appendChild(titleSpan);
+	this.sliderDOM.appendChild(pagetitle);
+	
+	// Create the slider box to hold the slider elements
+	this.canvas = document.createElement('div');
+	if (this.name != undefined)
+		this.canvas.id = 'slider-'+this.name;
+	else
+		this.canvas.id = 'slider-'+this.id;
+	this.canvas.setAttribute("interfaceid",this.id);
+	this.canvas.className = 'slider';
+	this.canvas.align = "left";
+	this.canvas.addEventListener('dragover',function(event){
+		event.preventDefault();
+		event.dataTransfer.effectAllowed = 'none';
+		event.dataTransfer.dropEffect = 'copy';
+		return false;
+	},false);
+	var sliderMargin = document.createAttribute('marginsize');
+	sliderMargin.nodeValue = 42; // Set default margins to 42px either side
+	// Must have a known EXACT width, as this is used later to determine the ratings
+	var w = (Number(sliderMargin.nodeValue)+8)*2;
+	this.canvas.style.width = window.innerWidth - w +"px";
+	this.canvas.style.marginLeft = sliderMargin.nodeValue +'px';
+	this.canvas.setAttributeNode(sliderMargin);
+	this.sliderDOM.appendChild(this.canvas);
+	
+	// Create the div to hold any scale objects
+	this.scale = document.createElement('div');
+	this.scale.className = 'sliderScale';
+	this.scale.id = 'sliderScaleHolder-'+this.id;
+	this.scale.align = 'left';
+	this.sliderDOM.appendChild(this.scale);
+	var positionScale = this.canvas.style.width.substr(0,this.canvas.style.width.length-2);
+	var offset = Number(this.canvas.attributes['marginsize'].value);
+	for (var index=0; index<interfaceObject.scale.length; index++)
+	{
+		var scaleObj = interfaceObject.scale[index];
+		var value = document.createAttribute('value');
+		var position = Number(scaleObj[0])*0.01;
+		value.nodeValue = position;
+		var pixelPosition = (position*positionScale)+offset;
+		var scaleDOM = document.createElement('span');
+		scaleDOM.textContent = scaleObj[1];
+		this.scale.appendChild(scaleDOM);
+		scaleDOM.style.left = Math.floor((pixelPosition-($(scaleDOM).width()/2)))+'px';
+		scaleDOM.setAttributeNode(value);
+	}
+	
+	var dest = document.getElementById("slider-holder");
+	dest.appendChild(this.sliderDOM);
+	
+	this.createSliderObject = function(audioObject)
+	{
+		var trackObj = document.createElement('div');
+		trackObj.className = 'track-slider track-slider-disabled track-slider-'+audioObject.id;
+		trackObj.id = 'track-slider-'+this.id+'-'+audioObject.id;
+		trackObj.setAttribute('trackIndex',audioObject.id);
+		trackObj.innerHTML = '<span>'+audioObject.id+'</span>';
+		if (this.name != undefined) {
+			trackObj.setAttribute('interface-name',this.name);
+		} else {
+			trackObj.setAttribute('interface-name',this.id);
+		}
+		var offset = Number(this.canvas.attributes['marginsize'].value);
+		// Distribute it randomly
+		var w = window.innerWidth - (offset+8)*2;
+		w = Math.random()*w;
+		w = Math.floor(w+(offset+8));
+		trackObj.style.left = w+'px';
+		this.canvas.appendChild(trackObj);
+		this.sliders.push(trackObj);
+		this.metrics.push(new metricTracker(this));
+		this.metrics[this.metrics.length-1].initialPosition = convSliderPosToRate(trackObj);
+		return trackObj;
+	};
+	
+	this.resize = function(event)
+	{
+		var holdValues = [];
+		for (var index = 0; index < this.sliders.length; index++)
+		{
+			holdValues.push(convSliderPosToRate(this.sliders[index])); 
+		}
+		var width = event.target.innerWidth;
+		var sliderDiv = this.canvas;
+		var sliderScaleDiv = this.scale;
+		var marginsize = Number(sliderDiv.attributes['marginsize'].value);
+		var w = (marginsize+8)*2;
+		sliderDiv.style.width = width - w + 'px';
+		var width = width - w;
+		// Move sliders into new position
+		for (var index = 0; index < this.sliders.length; index++)
+		{
+			var pos = holdValues[index];
+			var pix = pos * width;
+			this.sliders[index].style.left = pix+marginsize+'px';
+		}
+		
+		// Move scale labels
+		for (var index = 0; index < this.scale.children.length; index++)
+		{
+			var scaleObj = this.scale.children[index];
+			var position = Number(scaleObj.attributes['value'].value);
+			var pixelPosition = (position*width)+marginsize;
+			scaleObj.style.left = Math.floor((pixelPosition-($(scaleObj).width()/2)))+'px';
+		}
+	};
+}
+
+function sliderObject(audioObject,interfaceObjects) {
 	// Create a new slider object;
 	this.parent = audioObject;
-	this.trackSliderObj = document.createElement('div');
-	this.trackSliderObj.className = 'track-slider track-slider-disabled';
-	this.trackSliderObj.id = 'track-slider-'+audioObject.id;
-
-	this.trackSliderObj.setAttribute('trackIndex',audioObject.id);
-	this.trackSliderObj.innerHTML = '<span>'+audioObject.id+'</span>';
+	this.trackSliderObjects = [];
+	for (var i=0; i<interfaceContext.interfaceSliders.length; i++)
+	{
+		var trackObj = interfaceContext.interfaceSliders[i].createSliderObject(audioObject);
+		this.trackSliderObjects.push(trackObj);
+	}
 
 	// Onclick, switch playback to that track
 	
 	this.enable = function() {
 		if (this.parent.state == 1)
 		{
-			$(this.trackSliderObj).removeClass('track-slider-disabled');
+			$(this.trackSliderObjects).each(function(i,trackObj){
+				$(trackObj).removeClass('track-slider-disabled');
+			});
 		}
 	};
-	
+	this.updateLoading = function(progress)
+	{
+		if (progress != 100)
+		{
+			progress = String(progress);
+			progress = progress.split('.')[0];
+			this.trackSliderObjects[0].children[0].textContent = progress+'%';
+		} else {
+			this.trackSliderObjects[0].children[0].textContent = this.parent.id;
+		}
+	};
 	this.exportXMLDOM = function(audioObject) {
 		// Called by the audioObject holding this element. Must be present
-		var node = document.createElement('value');
-		node.textContent = convSliderPosToRate(this.trackSliderObj);
-		return node;
+		var obj = [];
+		$(this.trackSliderObjects).each(function(i,trackObj){
+			var node = document.createElement('value');
+			node.setAttribute("interface-name",trackObj.getAttribute("interface-name"));
+			node.textContent = convSliderPosToRate(trackObj);
+			obj.push(node);
+		});
+		
+		return obj;
 	};
 	this.getValue = function() {
 		return convSliderPosToRate(this.trackSliderObj);
@@ -589,6 +694,9 @@
 				var checkState = interfaceContext.checkScaleRange();
 				if (checkState == false) {canContinue = false;}
 				break;
+			default:
+				console.log("WARNING - Check option "+checks[i].check+" is not supported on this interface");
+				break;
 			}
 
 		}
@@ -612,12 +720,13 @@
     } 
 }
 
-function convSliderPosToRate(slider)
+function convSliderPosToRate(trackSlider)
 {
-	var w = document.getElementById('slider').style.width;
-	var marginsize = Number(document.getElementById('slider').attributes['marginsize'].value);
+	var slider = trackSlider.parentElement;
+	var w = slider.style.width;
+	var marginsize = Number(slider.attributes['marginsize'].value);
 	var maxPix = w.substr(0,w.length-2);
-	var pix = slider.style.left;
+	var pix = trackSlider.style.left;
 	pix = pix.substr(0,pix.length-2);
 	var rate = (pix-marginsize)/maxPix;
 	return rate;
@@ -627,63 +736,52 @@
 	// Function called when the window has been resized.
 	// MANDATORY FUNCTION
 	
-	// Store the slider marker values
-	var holdValues = [];
-	$(".track-slider").each(function(index,sliderObj){
-		holdValues.push(convSliderPosToRate(sliderObj));
-	});
-	
-	var width = event.target.innerWidth;
-	var canvas = document.getElementById('sliderCanvasHolder');
-	var sliderDiv = canvas.children[0];
-	var sliderScaleDiv = canvas.children[1];
-	var marginsize = Number(sliderDiv.attributes['marginsize'].value);
-	var w = (marginsize+8)*2;
-	sliderDiv.style.width = width - w + 'px';
-	var width = width - w;
-	// Move sliders into new position
-	$(".track-slider").each(function(index,sliderObj){
-		var pos = holdValues[index];
-		var pix = pos * width;
-		sliderObj.style.left = pix+marginsize+'px';
-	});
-	
-	// Move scale labels
-	$(sliderScaleDiv.children).each(function(index,scaleObj){
-		var position = Number(scaleObj.attributes['value'].value);
-		var pixelPosition = (position*width)+marginsize;
-		scaleObj.style.left = Math.floor((pixelPosition-($(scaleObj).width()/2)))+'px';
-	});
+	// Resize the slider objects
+	for (var i=0; i<interfaceContext.interfaceSliders.length; i++)
+	{
+		interfaceContext.interfaceSliders[i].resize(event);
+	}
 }
 
 function pageXMLSave(store, testXML)
 {
+	// MANDATORY
 	// Saves a specific test page
-	var xmlDoc = store;
-	// Check if any session wide metrics are enabled
-	
-	var commentShow = testXML.elementComments;
-	
-	var metric = document.createElement('metric');
-	if (audioEngineContext.metric.enableTestTimer)
+	// You can use this space to add any extra nodes to your XML <audioHolder> saves
+	// Get the current <audioHolder> information in store (remember to appendChild your data to it)
+	if (interfaceContext.interfaceSliders.length == 1)
 	{
-		var testTime = document.createElement('metricResult');
-		testTime.id = 'testTime';
-		testTime.textContent = audioEngineContext.timer.testDuration;
-		metric.appendChild(testTime);
+		// If there is only one axis, the default per-element metric export is sufficient
+		return;
 	}
-	xmlDoc.appendChild(metric);
-	var audioObjects = audioEngineContext.audioObjects;
-	for (var i=0; i<audioObjects.length; i++) 
+	var audioelements = store.getElementsByTagName("audioelement");
+	for (var i=0; i<audioelements.length; i++)
 	{
-		var audioElement = audioEngineContext.audioObjects[i].exportXMLDOM();
-		audioElement.setAttribute('presentedId',i);
-		xmlDoc.appendChild(audioElement);
+		// Have to append the metric specific nodes
+		if (testXML.outsideReference == null || testXML.outsideReference.id != audioelements[i].id)
+		{
+			var inject = audioelements[i].getElementsByTagName("metric");
+			if (inject.length == 0)
+			{
+				inject = document.createElement("metric");
+			} else {
+				inject = inject[0];
+			}
+			for (var k=0; k<interfaceContext.interfaceSliders.length; k++)
+			{
+				var node = interfaceContext.interfaceSliders[k].metrics[i].exportXMLDOM();
+				var mrnodes = node.getElementsByTagName("metricresult");
+				for (var j=0; j<mrnodes.length; j++)
+				{
+					var name = mrnodes[j].getAttribute("name");
+					if (name == "elementTracker" || name == "elementTrackerFull" || name == "elementInitialPosition" || name == "elementFlagMoved")
+					{
+						mrnodes[j].setAttribute("interface-name",interfaceContext.interfaceSliders[k].name);
+						mrnodes[j].setAttribute("interface-id",k);
+						inject.appendChild(mrnodes[j]);
+					}
+				}
+			}
+		}
 	}
-	
-	$(interfaceContext.commentQuestions).each(function(index,element){
-		var node = element.exportXMLDOM();
-		xmlDoc.appendChild(node);
-	});
-	store = xmlDoc;
 }
\ No newline at end of file
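With multiple axes, `exportXMLDOM` now returns one `<value>` node per axis, and `pageXMLSave` appends per-axis metric results tagged with `interface-name` and `interface-id`. The resulting fragment for a single audio element looks roughly like this (element ids, axis names, and numeric values are illustrative only):

```xml
<audioelement presentedId="0">
  <value interface-name="Quality">0.62</value>
  <value interface-name="Preference">0.31</value>
  <metric>
    <metricresult name="elementInitialPosition" interface-name="Quality" interface-id="0">0.48</metricresult>
    <metricresult name="elementInitialPosition" interface-name="Preference" interface-id="1">0.15</metricresult>
  </metric>
</audioelement>
```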
--- a/core.js	Mon Dec 21 23:18:43 2015 +0000
+++ b/core.js	Wed Dec 23 14:36:00 2015 +0000
@@ -16,10 +16,6 @@
 var audioEngineContext; // The custom AudioEngine object
 var projectReturn; // Hold the URL for the return
 
-
-// Add a prototype to the bufferSourceNode to reference to the audioObject holding it
-AudioBufferSourceNode.prototype.owner = undefined;
-
 window.onload = function() {
 	// Function called once the browser has loaded all files.
 	// This should perform any initial commands such as structure / loading documents
@@ -32,9 +28,6 @@
 	// Create test state
 	testState = new stateMachine();
 	
-	// Create the audio engine object
-	audioEngineContext = new AudioEngine();
-	
 	// Create the popup interface object
 	popup = new interfacePopup();
 	
@@ -43,8 +36,224 @@
 	
 	// Create the interface object
 	interfaceContext = new Interface(specification);
+	// Define window callbacks for interface
+	window.onresize = function(event){interfaceContext.resizeWindow(event);};
+	
+	// Add a prototype to the bufferSourceNode to reference to the audioObject holding it
+	AudioBufferSourceNode.prototype.owner = undefined;
+	// Add a prototype to the bufferNode to hold the desired LINEAR gain
+	AudioBuffer.prototype.gain = 1.0;
+	// Add a prototype to the bufferNode to hold the computed LUFS loudness
+	AudioBuffer.prototype.lufs = -23;
 };
 
+function loadProjectSpec(url) {
+	// Load the project document from the given URL, decode the XML and instruct audioEngine to get audio data
+	// If url is null, request client to upload project XML document
+	var r = new XMLHttpRequest();
+	r.open('GET',url,true);
+	r.onload = function() {
+		loadProjectSpecCallback(r.response);
+	};
+	r.send();
+};
+
+function loadProjectSpecCallback(response) {
+	// Function called after asynchronous download of XML project specification
+	//var decode = $.parseXML(response);
+	//projectXML = $(decode);
+	
+	var parse = new DOMParser();
+	projectXML = parse.parseFromString(response,'text/xml');
+	var errorNode = projectXML.getElementsByTagName('parsererror');
+	if (errorNode.length >= 1)
+	{
+		var msg = document.createElement("h3");
+		msg.textContent = "FATAL ERROR";
+		var span = document.createElement("span");
+		span.textContent = "The XML parser returned the following errors when decoding your XML file";
+		document.getElementsByTagName('body')[0].innerHTML = null;
+		document.getElementsByTagName('body')[0].appendChild(msg);
+		document.getElementsByTagName('body')[0].appendChild(span);
+		document.getElementsByTagName('body')[0].appendChild(errorNode[0]);
+		return;
+	}
+	
+	// Build the specification
+	specification.decode(projectXML);
+	
+	// Detect the interface to use and load the relevant javascripts.
+	var interfaceJS = document.createElement('script');
+	interfaceJS.setAttribute("type","text/javascript");
+	if (specification.interfaceType == 'APE') {
+		interfaceJS.setAttribute("src","ape.js");
+		
+		// APE comes with a css file
+		var css = document.createElement('link');
+		css.rel = 'stylesheet';
+		css.type = 'text/css';
+		css.href = 'ape.css';
+		
+		document.getElementsByTagName("head")[0].appendChild(css);
+	} else if (specification.interfaceType == "MUSHRA")
+	{
+		interfaceJS.setAttribute("src","mushra.js");
+		
+		// MUSHRA comes with a css file
+		var css = document.createElement('link');
+		css.rel = 'stylesheet';
+		css.type = 'text/css';
+		css.href = 'mushra.css';
+		
+		document.getElementsByTagName("head")[0].appendChild(css);
+	}
+	document.getElementsByTagName("head")[0].appendChild(interfaceJS);
+	
+	// Create the audio engine object
+	audioEngineContext = new AudioEngine(specification);
+	
+	testState.stateMap.push(specification.preTest);
+	
+	$(specification.audioHolders).each(function(index,elem){
+		testState.stateMap.push(elem);
+		$(elem.audioElements).each(function(i,audioElem){
+			var URL = audioElem.parent.hostURL + audioElem.url;
+			var buffer = null;
+			for (var i=0; i<audioEngineContext.buffers.length; i++)
+			{
+				if (URL == audioEngineContext.buffers[i].url)
+				{
+					buffer = audioEngineContext.buffers[i];
+					break;
+				}
+			}
+			if (buffer == null)
+			{
+				buffer = new audioEngineContext.bufferObj();
+				buffer.getMedia(URL);
+				audioEngineContext.buffers.push(buffer);
+			}
+		});
+	});
+	
+	testState.stateMap.push(specification.postTest);
+}
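The look-up-then-create loop above is a simple URL-keyed cache: each audio file is fetched and decoded once, no matter how many pages reference it. A standalone sketch of the same pattern (names are illustrative):

```javascript
// Return the cached entry for `url`, creating one via `create(url)` on a
// miss, so each URL is only ever fetched and decoded once.
function findOrCreate(cache, url, create) {
    for (var i = 0; i < cache.length; i++) {
        if (cache[i].url === url) {
            return cache[i];
        }
    }
    var entry = create(url);
    cache.push(entry);
    return entry;
}
```

A plain linear scan is adequate here because a test rarely references more than a handful of distinct files.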
+
+function createProjectSave(destURL) {
+	// Save the data from interface into XML and send to destURL
+	// If destURL is null then download XML in client
+	// Now time to render file locally
+	var xmlDoc = interfaceXMLSave();
+	var parent = document.createElement("div");
+	parent.appendChild(xmlDoc);
+	var file = [parent.innerHTML];
+	if (destURL == "null" || destURL == undefined) {
+		var bb = new Blob(file,{type : 'application/xml'});
+		var dnlk = window.URL.createObjectURL(bb);
+		var a = document.createElement("a");
+		a.hidden = '';
+		a.href = dnlk;
+		a.download = "save.xml";
+		a.textContent = "Save File";
+		
+		popup.showPopup();
+		popup.popupContent.innerHTML = null;
+		popup.popupContent.appendChild(a);
+	} else {
+		var xmlhttp = new XMLHttpRequest();
+		xmlhttp.open("POST",destURL,true);
+		xmlhttp.setRequestHeader('Content-Type', 'text/xml');
+		xmlhttp.onerror = function(){
+			console.log('Error saving file to server! Presenting download locally');
+			createProjectSave(null);
+		};
+		xmlhttp.onreadystatechange  = function() {
+			console.log(xmlhttp.status);
+			if (xmlhttp.status != 200 && xmlhttp.readyState == 4) {
+				createProjectSave(null);
+			} else {
+				if (xmlhttp.responseXML == null)
+				{
+					return createProjectSave(null);
+				}
+				var response = xmlhttp.responseXML.childNodes[0];
+				if (response.getAttribute('state') == "OK")
+				{
+					var file = response.getElementsByTagName('file')[0];
+					console.log('Save OK: Filename '+file.textContent+','+file.getAttribute('bytes')+'B');
+					popup.showPopup();
+					popup.popupContent.innerHTML = null;
+					popup.popupContent.textContent = "Thank you!";
+				} else {
+					var message = response.getElementsByTagName('message')[0];
+					errorSessionDump(message.textContent);
+				}
+			}
+		};
+		xmlhttp.send(file);
+	}
+}
+
+function errorSessionDump(msg){
+	// Create the partial interface XML save
+	// Include an error node with a message explaining why the dump occurred
+	popup.showPopup();
+	popup.popupContent.innerHTML = null;
+	var err = document.createElement('error');
+	var parent = document.createElement("div");
+	if (typeof msg === "object")
+	{
+		err.appendChild(msg);
+		popup.popupContent.appendChild(msg);
+		
+	} else {
+		err.textContent = msg;
+		popup.popupContent.innerHTML = "ERROR : "+msg;
+	}
+	var xmlDoc = interfaceXMLSave();
+	xmlDoc.appendChild(err);
+	parent.appendChild(xmlDoc);
+	var file = [parent.innerHTML];
+	var bb = new Blob(file,{type : 'application/xml'});
+	var dnlk = window.URL.createObjectURL(bb);
+	var a = document.createElement("a");
+	a.hidden = '';
+	a.href = dnlk;
+	a.download = "save.xml";
+	a.textContent = "Save File";
+	
+	
+	
+	popup.popupContent.appendChild(a);
+}
+
+// Only other global function which must be defined in the interface class. Determines how to create the XML document.
+function interfaceXMLSave(){
+	// Create the XML string to be exported with results
+	var xmlDoc = document.createElement("BrowserEvaluationResult");
+	var projectDocument = specification.projectXML;
+	projectDocument.setAttribute('file-name',url);
+	xmlDoc.appendChild(projectDocument);
+	xmlDoc.appendChild(returnDateNode());
+	xmlDoc.appendChild(interfaceContext.returnNavigator());
+	for (var i=0; i<testState.stateResults.length; i++)
+	{
+		xmlDoc.appendChild(testState.stateResults[i]);
+	}
+	
+	return xmlDoc;
+}
+
+function linearToDecibel(gain)
+{
+	return 20.0*Math.log10(gain);
+}
+
+function decibelToLinear(gain)
+{
+	return Math.pow(10,gain/20.0);
+}
+
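`linearToDecibel` and `decibelToLinear` are the standard amplitude conversions (dB = 20·log10(gain), gain = 10^(dB/20)), which are exact inverses of each other up to floating-point error. A minimal sketch of the same pair:

```javascript
// Linear amplitude gain -> decibels, and back.
function toDecibel(gain) {
    return 20.0 * Math.log10(gain);
}

function toLinear(db) {
    return Math.pow(10, db / 20.0);
}
```

For example, a gain of 0.5 is roughly -6.02 dB, and unity gain is exactly 0 dB.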
 function interfacePopup() {
 	// Creates an object to manage the popup
 	this.popup = null;
@@ -56,6 +265,14 @@
 	this.popupOptions = null;
 	this.currentIndex = null;
 	this.responses = null;
+	$(window).keypress(function(e){
+			if (e.keyCode == 13 && popup.popup.style.visibility == 'visible')
+			{
+				console.log(e);
+				popup.buttonProceed.onclick();
+				e.preventDefault();
+			}
+		});
 	
 	this.createPopup = function(){
 		// Create popup window interface
@@ -133,12 +350,6 @@
 		var blank = document.getElementsByClassName('testHalt')[0];
 		blank.style.zIndex = 2;
 		blank.style.visibility = 'visible';
-		$(window).keypress(function(e){
-			if (e.keyCode == 13 && popup.popup.style.visibility == 'visible')
-			{
-				popup.buttonProceed.onclick();
-			}
-		});
 	};
 	
 	this.hidePopup = function(){
@@ -148,7 +359,6 @@
 		blank.style.zIndex = -2;
 		blank.style.visibility = 'hidden';
 		this.buttonPrevious.style.visibility = 'inherit';
-		$(window).keypress(function(e){});
 	};
 	
 	this.postNode = function() {
@@ -306,12 +516,14 @@
 		} else if (node.type == "radio") {
 			var optHold = this.popupResponse;
 			var hold = document.createElement('radio');
+			console.log("Radio: "+ node.statement);
 			var responseID = null;
 			var i=0;
 			while(responseID == null) {
 				var input = optHold.childNodes[i].getElementsByTagName('input')[0];
 				if (input.checked == true) {
 					responseID = i;
+					console.log("Selected: "+ node.options[i].name);
 				}
 				i++;
 			}
@@ -397,6 +609,16 @@
 			}
 		}
 	};
+	
+	this.resize = function(event)
+	{
+		// Called on window resize;
+		this.popup.style.left = (window.innerWidth/2)-250 + 'px';
+		this.popup.style.top = (window.innerHeight/2)-125 + 'px';
+		var blank = document.getElementsByClassName('testHalt')[0];
+		blank.style.width = window.innerWidth;
+		blank.style.height = window.innerHeight;
+	};
 }
 
 function advanceState()
@@ -458,7 +680,7 @@
 				this.currentStateMap = this.stateMap[this.stateIndex];
 				if (this.currentStateMap.type == "audioHolder") {
 					console.log('Loading test page');
-					loadTest(this.currentStateMap);
+					interfaceContext.newPage(this.currentStateMap);
 					this.initialiseInnerState(this.currentStateMap);
 				} else if (this.currentStateMap.type == "pretest" || this.currentStateMap.type == "posttest") {
 					if (this.currentStateMap.options.length >= 1) {
@@ -477,8 +699,26 @@
 	
 	this.testPageCompleted = function(store, testXML, testId) {
 		// Function called each time a test page has been completed
-		// Can be used to over-rule default behaviour
-		
+		var metric = document.createElement('metric');
+		if (audioEngineContext.metric.enableTestTimer)
+		{
+			var testTime = document.createElement('metricResult');
+			testTime.id = 'testTime';
+			testTime.textContent = audioEngineContext.timer.testDuration;
+			metric.appendChild(testTime);
+		}
+		store.appendChild(metric);
+		var audioObjects = audioEngineContext.audioObjects;
+		for (var i=0; i<audioObjects.length; i++) 
+		{
+			var audioElement = audioEngineContext.audioObjects[i].exportXMLDOM();
+			audioElement.setAttribute('presentedId',i);
+			store.appendChild(audioElement);
+		}
+		$(interfaceContext.commentQuestions).each(function(index,element){
+			var node = element.exportXMLDOM();
+			store.appendChild(node);
+		});
 		pageXMLSave(store, testXML);
 	};
 	
@@ -518,216 +758,7 @@
 	this.previousState = function(){};
 }
 
-function testEnded(testId)
-{
-	pageXMLSave(testId);
-	if (testXMLSetups.length-1 > testId)
-	{
-		// Yes we have another test to perform
-		testId = (Number(testId)+1);
-		currentState = 'testRun-'+testId;
-		loadTest(testId);
-	} else {
-		console.log('Testing Completed!');
-		currentState = 'postTest';
-		// Check for any post tests
-		var xmlSetup = projectXML.find('setup');
-		var postTest = xmlSetup.find('PostTest')[0];
-		popup.initState(postTest);
-	}
-}
-
-function loadProjectSpec(url) {
-	// Load the project document from the given URL, decode the XML and instruct audioEngine to get audio data
-	// If url is null, request client to upload project XML document
-	var r = new XMLHttpRequest();
-	r.open('GET',url,true);
-	r.onload = function() {
-		loadProjectSpecCallback(r.response);
-	};
-	r.send();
-};
-
-function loadProjectSpecCallback(response) {
-	// Function called after asynchronous download of XML project specification
-	//var decode = $.parseXML(response);
-	//projectXML = $(decode);
-	
-	var parse = new DOMParser();
-	projectXML = parse.parseFromString(response,'text/xml');
-	
-	// Build the specification
-	specification.decode();
-	
-	testState.stateMap.push(specification.preTest);
-	
-	$(specification.audioHolders).each(function(index,elem){
-		testState.stateMap.push(elem);
-	});
-	 
-	 testState.stateMap.push(specification.postTest);
-	 
-	// Obtain the metrics enabled
-	$(specification.metrics).each(function(index,node){
-		var enabled = node.textContent;
-		switch(node.enabled)
-		{
-		case 'testTimer':
-			sessionMetrics.prototype.enableTestTimer = true;
-			break;
-		case 'elementTimer':
-			sessionMetrics.prototype.enableElementTimer = true;
-			break;
-		case 'elementTracker':
-			sessionMetrics.prototype.enableElementTracker = true;
-			break;
-		case 'elementListenTracker':
-			sessionMetrics.prototype.enableElementListenTracker = true;
-			break;
-		case 'elementInitialPosition':
-			sessionMetrics.prototype.enableElementInitialPosition = true;
-			break;
-		case 'elementFlagListenedTo':
-			sessionMetrics.prototype.enableFlagListenedTo = true;
-			break;
-		case 'elementFlagMoved':
-			sessionMetrics.prototype.enableFlagMoved = true;
-			break;
-		case 'elementFlagComments':
-			sessionMetrics.prototype.enableFlagComments = true;
-			break;
-		}
-	});
-	
-	
-	
-	// Detect the interface to use and load the relevant javascripts.
-	var interfaceJS = document.createElement('script');
-	interfaceJS.setAttribute("type","text/javascript");
-	if (specification.interfaceType == 'APE') {
-		interfaceJS.setAttribute("src","ape.js");
-		
-		// APE comes with a css file
-		var css = document.createElement('link');
-		css.rel = 'stylesheet';
-		css.type = 'text/css';
-		css.href = 'ape.css';
-		
-		document.getElementsByTagName("head")[0].appendChild(css);
-	} else if (specification.interfaceType == "MUSHRA")
-	{
-		interfaceJS.setAttribute("src","mushra.js");
-		
-		// MUSHRA comes with a css file
-		var css = document.createElement('link');
-		css.rel = 'stylesheet';
-		css.type = 'text/css';
-		css.href = 'mushra.css';
-		
-		document.getElementsByTagName("head")[0].appendChild(css);
-	}
-	document.getElementsByTagName("head")[0].appendChild(interfaceJS);
-	
-	// Define window callbacks for interface
-	window.onresize = function(event){interfaceContext.resizeWindow(event);};
-}
-
-function createProjectSave(destURL) {
-	// Save the data from interface into XML and send to destURL
-	// If destURL is null then download XML in client
-	// Now time to render file locally
-	var xmlDoc = interfaceXMLSave();
-	var parent = document.createElement("div");
-	parent.appendChild(xmlDoc);
-	var file = [parent.innerHTML];
-	if (destURL == "null" || destURL == undefined) {
-		var bb = new Blob(file,{type : 'application/xml'});
-		var dnlk = window.URL.createObjectURL(bb);
-		var a = document.createElement("a");
-		a.hidden = '';
-		a.href = dnlk;
-		a.download = "save.xml";
-		a.textContent = "Save File";
-		
-		popup.showPopup();
-		popup.popupContent.innerHTML = null;
-		popup.popupContent.appendChild(a);
-	} else {
-		var xmlhttp = new XMLHttpRequest;
-		xmlhttp.open("POST",destURL,true);
-		xmlhttp.setRequestHeader('Content-Type', 'text/xml');
-		xmlhttp.onerror = function(){
-			console.log('Error saving file to server! Presenting download locally');
-			createProjectSave(null);
-		};
-		xmlhttp.onreadystatechange  = function() {
-			console.log(xmlhttp.status);
-			if (xmlhttp.status != 200 && xmlhttp.readyState == 4) {
-				createProjectSave(null);
-			} else {
-				if (xmlhttp.responseXML == null)
-				{
-					return createProjectSave(null);
-				}
-				var response = xmlhttp.responseXML.childNodes[0];
-				if (response.getAttribute('state') == "OK")
-				{
-					var file = response.getElementsByTagName('file')[0];
-					console.log('Save OK: Filename '+file.textContent+','+file.getAttribute('bytes')+'B');
-					popup.showPopup();
-					popup.popupContent.innerHTML = null;
-					popup.popupContent.textContent = "Thank you!";
-				} else {
-					var message = response.getElementsByTagName('message')[0];
-					errorSessionDump(message.textContent);
-				}
-			}
-		};
-		xmlhttp.send(file);
-	}
-}
-
-function errorSessionDump(msg){
-	// Create the partial interface XML save
-	// Include error node with message on why the dump occured
-	var xmlDoc = interfaceXMLSave();
-	var err = document.createElement('error');
-	err.textContent = msg;
-	xmlDoc.appendChild(err);
-	var parent = document.createElement("div");
-	parent.appendChild(xmlDoc);
-	var file = [parent.innerHTML];
-	var bb = new Blob(file,{type : 'application/xml'});
-	var dnlk = window.URL.createObjectURL(bb);
-	var a = document.createElement("a");
-	a.hidden = '';
-	a.href = dnlk;
-	a.download = "save.xml";
-	a.textContent = "Save File";
-	
-	popup.showPopup();
-	popup.popupContent.innerHTML = "ERROR : "+msg;
-	popup.popupContent.appendChild(a);
-}
-
-// Only other global function which must be defined in the interface class. Determines how to create the XML document.
-function interfaceXMLSave(){
-	// Create the XML string to be exported with results
-	var xmlDoc = document.createElement("BrowserEvaluationResult");
-	var projectDocument = specification.projectXML;
-	projectDocument.setAttribute('file-name',url);
-	xmlDoc.appendChild(projectDocument);
-	xmlDoc.appendChild(returnDateNode());
-	xmlDoc.appendChild(interfaceContext.returnNavigator());
-	for (var i=0; i<testState.stateResults.length; i++)
-	{
-		xmlDoc.appendChild(testState.stateResults[i]);
-	}
-	
-	return xmlDoc;
-}
-
-function AudioEngine() {
+function AudioEngine(specification) {
 	
 	// Create two output paths, the main outputGain and fooGain.
 	// Output gain is default to 1 and any items for playback route here
@@ -747,13 +778,78 @@
 	// Create the timer Object
 	this.timer = new timer();
 	// Create session metrics
-	this.metric = new sessionMetrics(this);
+	this.metric = new sessionMetrics(this,specification);
 	
 	this.loopPlayback = false;
 	
 	// Create store for new audioObjects
 	this.audioObjects = [];
 	
+	this.buffers = [];
+	this.bufferObj = function()
+	{
+		this.url = null;
+		this.buffer = null;
+		this.xmlRequest = new XMLHttpRequest();
+		this.xmlRequest.parent = this;
+		this.users = [];
+		this.getMedia = function(url) {
+			this.url = url;
+			this.xmlRequest.open('GET',this.url,true);
+			this.xmlRequest.responseType = 'arraybuffer';
+			
+			var bufferObj = this;
+			
+			// Create callback to decode the data asynchronously
+			this.xmlRequest.onloadend = function() {
+				audioContext.decodeAudioData(bufferObj.xmlRequest.response, function(decodedData) {
+					bufferObj.buffer = decodedData;
+					for (var i=0; i<bufferObj.users.length; i++)
+					{
+						bufferObj.users[i].state = 1;
+						if (bufferObj.users[i].interfaceDOM != null)
+						{
+							bufferObj.users[i].bufferLoaded(bufferObj);
+						}
+					}
+					//calculateLoudness(bufferObj.buffer,"I");
+				}, function(){
+					// Should only be called if there was an error, but sometimes gets called continuously
+					// Check here if the error is genuine
+					if (bufferObj.buffer == undefined) {
+						// Genuine error
+						console.log('FATAL - Error loading buffer from '+bufferObj.url);
+						if (bufferObj.xmlRequest.status == 404)
+						{
+							console.log('FATAL - Fragment at '+bufferObj.url+' gave a 404 error');
+							errorSessionDump('Fragment at '+bufferObj.url+' gave a 404 error');
+						}
+						}
+					}
+				});
+			};
+			this.progress = 0;
+			this.progressCallback = function(event){
+				if (event.lengthComputable)
+				{
+					this.parent.progress = event.loaded / event.total;
+					for (var i=0; i<this.parent.users.length; i++)
+					{
+						if(this.parent.users[i].interfaceDOM != null)
+						{
+							if (typeof this.parent.users[i].interfaceDOM.updateLoading === "function")
+							{
+								this.parent.users[i].interfaceDOM.updateLoading(this.parent.progress*100);
+							}
+						}
+					}
+				}
+			};
+			this.xmlRequest.addEventListener("progress", this.progressCallback);
+			this.xmlRequest.send();
+		};
+	};
+	
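The `bufferObj` above lets several `audioObjects` share one decoded buffer keyed by its full URL, with each buffer notifying its registered users once data arrives. A minimal sketch of that cache-and-notify pattern in plain JavaScript (`BufferCache` and its method names are illustrative, not from the tool):

```javascript
// Minimal sketch of a URL-keyed buffer cache: each entry tracks its
// users and notifies every one of them when the data is delivered.
function BufferCache() {
    this.entries = {};           // url -> {url, buffer, users}
}
BufferCache.prototype.get = function(url) {
    if (!this.entries[url]) {
        this.entries[url] = { url: url, buffer: null, users: [] };
    }
    return this.entries[url];
};
BufferCache.prototype.deliver = function(url, decodedData) {
    var entry = this.get(url);
    entry.buffer = decodedData;
    // Notify every registered user, mirroring bufferObj's loop over users
    entry.users.forEach(function(user) { user.bufferLoaded(entry); });
};

var cache = new BufferCache();
var loaded = [];
var user = { bufferLoaded: function(entry) { loaded.push(entry.url); } };
cache.get("a.wav").users.push(user);
cache.deliver("a.wav", [0, 0, 0]);
```

Requesting the same URL twice returns the same entry, so a fragment used on several test pages is only downloaded and decoded once.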
 	this.play = function(id) {
 		// Start the timer and set the audioEngine state to playing (1)
 		if (this.status == 0 && this.loopPlayback) {
@@ -794,7 +890,7 @@
 						this.audioObjects[i].outputGain.gain.value = 0.0;
 						this.audioObjects[i].stop();
 					} else if (i == id) {
-						this.audioObjects[id].outputGain.gain.value = 1.0;
+						this.audioObjects[id].outputGain.gain.value = this.audioObjects[id].specification.gain*this.audioObjects[id].buffer.buffer.gain;
 						this.audioObjects[id].play(audioContext.currentTime+0.01);
 					}
 				}
@@ -823,9 +919,31 @@
 		audioObjectId = this.audioObjects.length;
 		this.audioObjects[audioObjectId] = new audioObject(audioObjectId);
 
-		// AudioObject will get track itself.
+		// Check if audioObject buffer is currently stored by full URL
+		var URL = element.parent.hostURL + element.url;
+		var buffer = null;
+		for (var i=0; i<this.buffers.length; i++)
+		{
+			if (URL == this.buffers[i].url)
+			{
+				buffer = this.buffers[i];
+				break;
+			}
+		}
+		if (buffer == null)
+		{
+			console.log("[WARN]: Buffer was not loaded in pre-test! "+URL);
+			buffer = new this.bufferObj();
+			buffer.getMedia(URL);
+			this.buffers.push(buffer);
+		}
 		this.audioObjects[audioObjectId].specification = element;
-		this.audioObjects[audioObjectId].constructTrack(element.parent.hostURL + element.url);
+		this.audioObjects[audioObjectId].url = URL;
+		buffer.users.push(this.audioObjects[audioObjectId]);
+		if (buffer.buffer != null)
+		{
+			this.audioObjects[audioObjectId].bufferLoaded(buffer);
+		}
 		return this.audioObjects[audioObjectId];
 	};
 	
@@ -833,6 +951,10 @@
 		this.state = 0;
 		this.audioObjectsReady = false;
 		this.metric.reset();
+		for (var i=0; i < this.buffers.length; i++)
+		{
+			this.buffers[i].users = [];
+		}
 		this.audioObjects = [];
 	};
 	
@@ -861,26 +983,19 @@
 	this.setSynchronousLoop = function() {
 		// Pads the signals so they are all exactly the same length
 		var length = 0;
-		var lens = [];
 		var maxId;
 		for (var i=0; i<this.audioObjects.length; i++)
 		{
-			lens.push(this.audioObjects[i].buffer.length);
-			if (length < this.audioObjects[i].buffer.length)
+			if (length < this.audioObjects[i].buffer.buffer.length)
 			{
-				length = this.audioObjects[i].buffer.length;
+				length = this.audioObjects[i].buffer.buffer.length;
 				maxId = i;
 			}
 		}
-		// Perform difference
-		for (var i=0; i<lens.length; i++)
+		// Extract the audio and zero-pad
+		for (var i=0; i<this.audioObjects.length; i++)
 		{
-			lens[i] = length - lens[i];
-		}
-		// Extract the audio and zero-pad
-		for (var i=0; i<lens.length; i++)
-		{
-			var orig = this.audioObjects[i].buffer;
+			var orig = this.audioObjects[i].buffer.buffer;
 			var hold = audioContext.createBuffer(orig.numberOfChannels,length,orig.sampleRate);
 			for (var c=0; c<orig.numberOfChannels; c++)
 			{
@@ -889,8 +1004,9 @@
 				for (var n=0; n<orig.length; n++)
 				{inData[n] = outData[n];}
 			}
-			this.audioObjects[i].buffer = hold;
-			delete orig;
+			hold.gain = orig.gain;
+			hold.lufs = orig.lufs;
+			this.audioObjects[i].buffer.buffer = hold;
 		}
 	};
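`setSynchronousLoop` pads every fragment to the length of the longest buffer so that looped playback stays sample-aligned. The padding step can be sketched on plain typed arrays (the real code copies `AudioBuffer` channel data and preserves the gain/lufs properties):

```javascript
// Zero-pad each signal to the length of the longest one, as
// setSynchronousLoop does with AudioBuffer channel data.
function padToLongest(signals) {
    var length = signals.reduce(function(max, s) {
        return Math.max(max, s.length);
    }, 0);
    return signals.map(function(orig) {
        var hold = new Float32Array(length);   // initialised to zeros
        hold.set(orig);                        // copy original samples
        return hold;
    });
}

var padded = padToLongest([new Float32Array([1, 2]), new Float32Array([3, 4, 5])]);
```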
 	
@@ -922,9 +1038,47 @@
 	// the audiobuffer is not designed for multi-start playback
 	// When stopped, the buffer node is deleted and recreated with the stored buffer.
 	this.buffer;
+	
+	this.bufferLoaded = function(callee)
+	{
+		// Called by the associated buffer when it has finished loading, will then 'bind' the buffer to the
+		// audioObject and trigger the interfaceDOM.enable() function for user feedback
+		if (audioEngineContext.loopPlayback){
+			// First copy the buffer into this.buffer
+			this.buffer = new audioEngineContext.bufferObj();
+			this.buffer.url = callee.url;
+			this.buffer.buffer = audioContext.createBuffer(callee.buffer.numberOfChannels, callee.buffer.length, callee.buffer.sampleRate);
+			for (var c=0; c<callee.buffer.numberOfChannels; c++)
+			{
+				var src = callee.buffer.getChannelData(c);
+				var dst = this.buffer.buffer.getChannelData(c);
+				for (var n=0; n<src.length; n++)
+				{
+					dst[n] = src[n];
+				}
+			}
+		} else {
+			this.buffer = callee;
+		}
+		this.state = 1;
+		this.buffer.buffer.gain = callee.buffer.gain;
+		this.buffer.buffer.lufs = callee.buffer.lufs;
+		/*
+		var targetLUFS = this.specification.parent.loudness;
+		if (typeof targetLUFS === "number")
+		{
+			this.buffer.buffer.gain = decibelToLinear(targetLUFS - this.buffer.buffer.lufs);
+		} else {
+			this.buffer.buffer.gain = 1.0;
+		}
+		*/
+		if (this.interfaceDOM != null) {
+			this.interfaceDOM.enable();
+		}
+	};
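The commented-out block above derives a playback gain from the difference between the target loudness and the fragment's measured LUFS. Assuming `decibelToLinear` is the usual amplitude conversion (this sketch is illustrative; the exact helper lives elsewhere in the tool):

```javascript
// Convert a level difference in decibels to a linear amplitude gain,
// as used when matching a fragment's LUFS to the target loudness.
function decibelToLinear(db) {
    return Math.pow(10.0, db / 20.0);
}

// e.g. a fragment measured at -26 LUFS against a -23 LUFS target
// needs +3 dB of gain, roughly a linear factor of 1.41
var gain = decibelToLinear(-23 - (-26));
```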
     
 	this.loopStart = function() {
-		this.outputGain.gain.value = 1.0;
+		this.outputGain.gain.value =  this.specification.gain*this.buffer.buffer.gain;
 		this.metric.startListening(audioEngineContext.timer.getTestTime());
 	};
 	
@@ -936,11 +1090,11 @@
 	};
 	
 	this.play = function(startTime) {
-		if (this.bufferNode == undefined) {
+		if (this.bufferNode == undefined && this.buffer.buffer != undefined) {
 			this.bufferNode = audioContext.createBufferSource();
 			this.bufferNode.owner = this;
 			this.bufferNode.connect(this.outputGain);
-			this.bufferNode.buffer = this.buffer;
+			this.bufferNode.buffer = this.buffer.buffer;
 			this.bufferNode.loop = audioEngineContext.loopPlayback;
 			this.bufferNode.onended = function(event) {
 				// Safari does not like using 'this' to reference the calling object!
@@ -968,7 +1122,7 @@
 		if (this.bufferNode != undefined) {
 			if (this.bufferNode.loop == true) {
 				if (audioEngineContext.status  == 1) {
-					return (time-this.metric.listenStart)%this.buffer.duration;
+					return (time-this.metric.listenStart)%this.buffer.buffer.duration;
 				} else {
 					return 0;
 				}
@@ -983,52 +1137,27 @@
 			return 0;
 		}
 	};
-
-	this.constructTrack = function(url) {
-		var request = new XMLHttpRequest();
-		this.url = url;
-		request.open('GET',url,true);
-		request.responseType = 'arraybuffer';
-		
-		var audioObj = this;
-		
-		// Create callback to decode the data asynchronously
-		request.onloadend = function() {
-			audioContext.decodeAudioData(request.response, function(decodedData) {
-				audioObj.buffer = decodedData;
-				audioObj.state = 1;
-				if (audioObj.specification.type != 'outsidereference')
-					{audioObj.interfaceDOM.enable();}
-			}, function(){
-				// Should only be called if there was an error, but sometimes gets called continuously
-				// Check here if the error is genuine
-				if (audioObj.state == 0 || audioObj.buffer == undefined) {
-					// Genuine error
-					console.log('FATAL - Error loading buffer on '+audioObj.id);
-					if (request.status == 404)
-					{
-						console.log('FATAL - Fragment '+audioObj.id+' 404 error');
-						console.log('URL: '+audioObj.url);
-						errorSessionDump('Fragment '+audioObj.id+' 404 error');
-					}
-				}
-			});
-		};
-		request.send();
-	};
 	
 	this.exportXMLDOM = function() {
 		var root = document.createElement('audioElement');
 		root.id = this.specification.id;
-		root.setAttribute('url',this.url);
+		root.setAttribute('url',this.specification.url);
 		var file = document.createElement('file');
-		file.setAttribute('sampleRate',this.buffer.sampleRate);
-		file.setAttribute('channels',this.buffer.numberOfChannels);
-		file.setAttribute('sampleCount',this.buffer.length);
-		file.setAttribute('duration',this.buffer.duration);
+		file.setAttribute('sampleRate',this.buffer.buffer.sampleRate);
+		file.setAttribute('channels',this.buffer.buffer.numberOfChannels);
+		file.setAttribute('sampleCount',this.buffer.buffer.length);
+		file.setAttribute('duration',this.buffer.buffer.duration);
 		root.appendChild(file);
 		if (this.specification.type != 'outsidereference') {
-			root.appendChild(this.interfaceDOM.exportXMLDOM(this));
+			var interfaceXML = this.interfaceDOM.exportXMLDOM(this);
+			if (interfaceXML.length == undefined) {
+				root.appendChild(interfaceXML);
+			} else {
+				for (var i=0; i<interfaceXML.length; i++)
+				{
+					root.appendChild(interfaceXML[i]);
+				}
+			}
 			root.appendChild(this.commentDOM.exportXMLDOM(this));
 			if(this.specification.type == 'anchor') {
 				root.setAttribute('anchor',true);
@@ -1084,7 +1213,7 @@
 	};
 }
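`exportXMLDOM` now accepts interfaces that return either a single node or an array of nodes, detected through the `length` property. The same normalisation can be expressed as a small helper (`toNodeList` is a hypothetical name, added for illustration):

```javascript
// Normalise a value that may be one item or an array of items,
// mirroring the length-check in audioObject.exportXMLDOM.
function toNodeList(result) {
    return (result.length == undefined) ? [result] : result;
}

var single = toNodeList({ tag: 'value' });       // plain object: wrapped
var many = toNodeList([{ tag: 'a' }, { tag: 'b' }]); // array: passed through
```

Checking `length` rather than `Array.isArray` means a DOM `NodeList` (which is array-like but not an Array) is also passed through unchanged.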
 
-function sessionMetrics(engine)
+function sessionMetrics(engine,specification)
 {
 	/* Used by audioEngine to link to the audioObjects and minimise timer calls;
 	 */
@@ -1095,6 +1224,46 @@
 		this.lastClicked = -1;
 		this.data = -1;
 	};
+	
+	this.enableElementInitialPosition = false;
+	this.enableElementListenTracker = false;
+	this.enableElementTimer = false;
+	this.enableElementTracker = false;
+	this.enableFlagListenedTo = false;
+	this.enableFlagMoved = false;
+	this.enableTestTimer = false;
+	// Obtain the metrics enabled
+	for (var i=0; i<specification.metrics.length; i++)
+	{
+		var node = specification.metrics[i];
+		switch(node.enabled)
+		{
+		case 'testTimer':
+			this.enableTestTimer = true;
+			break;
+		case 'elementTimer':
+			this.enableElementTimer = true;
+			break;
+		case 'elementTracker':
+			this.enableElementTracker = true;
+			break;
+		case 'elementListenTracker':
+			this.enableElementListenTracker = true;
+			break;
+		case 'elementInitialPosition':
+			this.enableElementInitialPosition = true;
+			break;
+		case 'elementFlagListenedTo':
+			this.enableFlagListenedTo = true;
+			break;
+		case 'elementFlagMoved':
+			this.enableFlagMoved = true;
+			break;
+		case 'elementFlagComments':
+			this.enableFlagComments = true;
+			break;
+		}
+	}
 	this.initialiseTest = function(){};
 }
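`sessionMetrics` turns the `<metricEnable>` entries from the specification into boolean flags via the switch above. The same mapping can be sketched as a lookup table (flag names follow the code; the table form is illustrative):

```javascript
// Map metricEnable names to the corresponding flag, as the
// sessionMetrics switch statement does case by case.
var METRIC_FLAGS = {
    testTimer: 'enableTestTimer',
    elementTimer: 'enableElementTimer',
    elementTracker: 'enableElementTracker',
    elementListenTracker: 'enableElementListenTracker',
    elementInitialPosition: 'enableElementInitialPosition',
    elementFlagListenedTo: 'enableFlagListenedTo',
    elementFlagMoved: 'enableFlagMoved',
    elementFlagComments: 'enableFlagComments'
};

function applyMetrics(target, metrics) {
    metrics.forEach(function(node) {
        var flag = METRIC_FLAGS[node.enabled];
        if (flag) { target[flag] = true; }   // unknown names are ignored
    });
    return target;
}

var m = applyMetrics({ enableTestTimer: false, enableFlagMoved: false },
                     [{ enabled: 'testTimer' }, { enabled: 'elementFlagMoved' }]);
```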
 
@@ -1304,7 +1473,7 @@
 	
 	hold.appendChild(date);
 	hold.appendChild(time);
-	return hold
+	return hold;
 	
 }
 
@@ -1312,18 +1481,59 @@
 	// Handles the decoding of the project specification XML into a simple JavaScript Object.
 	
 	this.interfaceType = null;
-	this.commonInterface = null;
+	this.commonInterface = new function()
+	{
+		this.options = [];
+		this.optionNode = function(input)
+		{
+			var name = input.getAttribute('name');
+			this.type = name;
+			if(this.type == "option")
+			{
+				this.name = input.id;
+			} else if (this.type == "check")
+			{
+				this.check = input.id;
+			}
+		};
+	};
+	
+	this.randomiseOrder = function(input)
+	{
+		// This takes an array of information and randomises the order
+		var N = input.length;
+		
+		var inputSequence = []; // For safety purposes: keep track of randomisation
+		for (var counter = 0; counter < N; ++counter) 
+			inputSequence.push(counter); // Fill array
+		var inputSequenceClone = inputSequence.slice(0);
+		
+		var holdArr = [];
+		var outputSequence = [];
+		for (var n=0; n<N; n++)
+		{
+			// First pick a random number
+			var r = Math.random();
+			// Multiply and floor by the number of elements left
+			r = Math.floor(r*input.length);
+			// Pick out that element and delete from the array
+			holdArr.push(input.splice(r,1)[0]);
+			// Do the same with sequence
+			outputSequence.push(inputSequence.splice(r,1)[0]);
+		}
+		console.log(inputSequenceClone.toString()); // print original array to console
+		console.log(outputSequence.toString()); 	// print randomised array to console
+		return holdArr;
+	};
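`randomiseOrder` above shuffles by repeatedly picking a random remaining element and splicing it out, logging the original and shuffled index sequences. The core draw-and-remove step, sketched with an injectable random source so it can be exercised deterministically (the `rng` parameter is an addition for illustration):

```javascript
// Shuffle by repeatedly picking a random remaining element and
// removing it, as randomiseOrder does with Array.prototype.splice.
function shuffle(input, rng) {
    rng = rng || Math.random;
    var pool = input.slice(0);   // work on a copy, keep the caller's array
    var out = [];
    while (pool.length > 0) {
        var r = Math.floor(rng() * pool.length);
        out.push(pool.splice(r, 1)[0]);
    }
    return out;
}

// With an rng that always returns 0, elements come out in original order
var shuffled = shuffle([1, 2, 3, 4], function() { return 0; });
```

Every element is drawn exactly once, so the result is always a permutation of the input; each pass is O(n²) because of the splice, which is fine for the handful of fragments a test page holds.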
 	this.projectReturn = null;
 	this.randomiseOrder = null;
 	this.collectMetrics = null;
 	this.testPages = null;
-	this.preTest = null;
-	this.postTest = null;
-	this.metrics =[];
+	this.audioHolders = [];
+	this.metrics = [];
+	this.loudness = null;
 	
-	this.audioHolders = [];
-	
-	this.decode = function() {
+	this.decode = function(projectXML) {
 		// projectXML - DOM Parsed document
 		this.projectXML = projectXML.childNodes[0];
 		var setupNode = projectXML.getElementsByTagName('setup')[0];
@@ -1343,10 +1553,29 @@
 			this.testPages = Number(this.testPages);
 			if (this.testPages == 0) {this.testPages = null;}
 		}
+		if (setupNode.getAttribute('loudness') != null)
+		{
+			var XMLloudness = setupNode.getAttribute('loudness');
+			if (isNaN(Number(XMLloudness)) == false)
+			{
+				this.loudness = Number(XMLloudness);
+			}
+		}
 		var metricCollection = setupNode.getElementsByTagName('Metric');
 		
-		this.preTest = new this.prepostNode('pretest',setupNode.getElementsByTagName('PreTest'));
-		this.postTest = new this.prepostNode('posttest',setupNode.getElementsByTagName('PostTest'));
+		var setupPreTestNode = setupNode.getElementsByTagName('PreTest');
+		if (setupPreTestNode.length != 0)
+		{
+			setupPreTestNode = setupPreTestNode[0];
+			this.preTest.construct(setupPreTestNode);
+		}
+		
+		var setupPostTestNode = setupNode.getElementsByTagName('PostTest');
+		if (setupPostTestNode.length != 0)
+		{
+			setupPostTestNode = setupPostTestNode[0];
+			this.postTest.construct(setupPostTestNode);
+		}
 		
 		if (metricCollection.length > 0) {
 			metricCollection = metricCollection[0].getElementsByTagName('metricEnable');
@@ -1388,7 +1617,10 @@
 						}
 					}
 				} else if (this.type == 'anchor' || this.type == 'reference') {
-					Console.log("WARNING: Anchor and Reference tags in the <interface> node are depricated");
+					this.value = Number(child.textContent);
+					this.enforce = child.getAttribute('enforce');
+					if (this.enforce == 'true') {this.enforce = true;}
+					else {this.enforce = false;}
 				}
 			};
 			this.options = [];
@@ -1403,11 +1635,13 @@
 		
 		var audioHolders = projectXML.getElementsByTagName('audioHolder');
 		for (var i=0; i<audioHolders.length; i++) {
-			this.audioHolders.push(new this.audioHolderNode(this,audioHolders[i]));
+			var node = new this.audioHolderNode(this);
+			node.decode(this,audioHolders[i]);
+			this.audioHolders.push(node);
 		}
 		
 		// New check if we need to randomise the test order
-		if (this.randomiseOrder)
+		if (this.randomiseOrder && typeof randomiseOrder === "function")
 		{
 	 		this.audioHolders = randomiseOrder(this.audioHolders);
 	 		for (var i=0; i<this.audioHolders.length; i++)
@@ -1432,238 +1666,617 @@
 		}
 	};
 	
-	this.prepostNode = function(type,Collection) {
+	this.encode = function()
+	{
+		var root = document.implementation.createDocument(null,"BrowserEvalProjectDocument");
+		// First get all the <setup> tag compiled
+		var setupNode = root.createElement("setup");
+		setupNode.setAttribute('interface',this.interfaceType);
+		setupNode.setAttribute('projectReturn',this.projectReturn);
+		setupNode.setAttribute('randomiseOrder',this.randomiseOrder);
+		setupNode.setAttribute('collectMetrics',this.collectMetrics);
+		setupNode.setAttribute('testPages',this.testPages);
+		if(this.loudness != null) {setupNode.setAttribute("loudness",this.loudness);}
+		
+		var setupPreTest = root.createElement("PreTest");
+		for (var i=0; i<this.preTest.options.length; i++)
+		{
+			setupPreTest.appendChild(this.preTest.options[i].exportXML(root));
+		}
+		
+		var setupPostTest = root.createElement("PostTest");
+		for (var i=0; i<this.postTest.options.length; i++)
+		{
+			setupPostTest.appendChild(this.postTest.options[i].exportXML(root));
+		}
+		
+		setupNode.appendChild(setupPreTest);
+		setupNode.appendChild(setupPostTest);
+		
+		// <Metric> tag
+		var Metric = root.createElement("Metric");
+		for (var i=0; i<this.metrics.length; i++)
+		{
+			var metricEnable = root.createElement("metricEnable");
+			metricEnable.textContent = this.metrics[i].enabled;
+			Metric.appendChild(metricEnable);
+		}
+		setupNode.appendChild(Metric);
+		
+		// <interface> tag
+		var CommonInterface = root.createElement("interface");
+		for (var i=0; i<this.commonInterface.options.length; i++)
+		{
+			var CIObj = this.commonInterface.options[i];
+			var CINode = root.createElement(CIObj.type);
+			if (CIObj.type == "check") {CINode.setAttribute("name",CIObj.check);}
+			else {CINode.setAttribute("name",CIObj.name);}
+			CommonInterface.appendChild(CINode);
+		}
+		setupNode.appendChild(CommonInterface);
+		
+		root.getElementsByTagName("BrowserEvalProjectDocument")[0].appendChild(setupNode);
+		// Time for the <audioHolder> tags
+		for (var ahIndex = 0; ahIndex < this.audioHolders.length; ahIndex++)
+		{
+			var node = this.audioHolders[ahIndex].encode(root);
+			root.getElementsByTagName("BrowserEvalProjectDocument")[0].appendChild(node);
+		}
+		return root;
+	};
+	
+	this.prepostNode = function(type) {
 		this.type = type;
 		this.options = [];
 		
-		this.OptionNode = function(child) {
+		this.OptionNode = function() {
 			
-			this.childOption = function(element) {
+			this.childOption = function() {
 				this.type = 'option';
-				this.id = element.id;
-				this.name = element.getAttribute('name');
-				this.text = element.textContent;
+				this.id = null;
+				this.name = undefined;
+				this.text = null;
 			};
 			
-			this.type = child.nodeName;
-			if (child.nodeName == "question") {
-				this.id = child.id;
-				this.mandatory;
-				if (child.getAttribute('mandatory') == "true") {this.mandatory = true;}
-				else {this.mandatory = false;}
-				this.question = child.textContent;
-				if (child.getAttribute('boxsize') == null) {
-					this.boxsize = 'normal';
-				} else {
-					this.boxsize = child.getAttribute('boxsize');
+			this.type = undefined;
+			this.id = undefined;
+			this.mandatory = undefined;
+			this.question = undefined;
+			this.statement = undefined;
+			this.boxsize = undefined;
+			this.options = [];
+			this.min = undefined;
+			this.max = undefined;
+			this.step = undefined;
+			
+			this.decode = function(child)
+			{
+				this.type = child.nodeName;
+				if (child.nodeName == "question") {
+					this.id = child.id;
+					this.mandatory;
+					if (child.getAttribute('mandatory') == "true") {this.mandatory = true;}
+					else {this.mandatory = false;}
+					this.question = child.textContent;
+					if (child.getAttribute('boxsize') == null) {
+						this.boxsize = 'normal';
+					} else {
+						this.boxsize = child.getAttribute('boxsize');
+					}
+				} else if (child.nodeName == "statement") {
+					this.statement = child.textContent;
+				} else if (child.nodeName == "checkbox" || child.nodeName == "radio") {
+					var element = child.firstElementChild;
+					this.id = child.id;
+					if (element == null) {
+						console.log('Malformed ' + child.nodeName + ' entry');
+						this.statement = 'Malformed ' + child.nodeName + ' entry';
+						this.type = 'statement';
+					} else {
+						this.options = [];
+						while (element != null) {
+							if (element.nodeName == 'statement' && this.statement == undefined){
+								this.statement = element.textContent;
+							} else if (element.nodeName == 'option') {
+								var node = new this.childOption();
+								node.id = element.id;
+								node.name = element.getAttribute('name');
+								node.text = element.textContent;
+								this.options.push(node);
+							}
+							element = element.nextElementSibling;
+						}
+					}
+				} else if (child.nodeName == "number") {
+					this.statement = child.textContent;
+					this.id = child.id;
+					this.min = child.getAttribute('min');
+					this.max = child.getAttribute('max');
+					this.step = child.getAttribute('step');
 				}
-			} else if (child.nodeName == "statement") {
-				this.statement = child.textContent;
-			} else if (child.nodeName == "checkbox" || child.nodeName == "radio") {
-				var element = child.firstElementChild;
-				this.id = child.id;
-				if (element == null) {
-					console.log('Malformed' +child.nodeName+ 'entry');
-					this.statement = 'Malformed' +child.nodeName+ 'entry';
-					this.type = 'statement';
-				} else {
-					this.options = [];
-					while (element != null) {
-						if (element.nodeName == 'statement' && this.statement == undefined){
-							this.statement = element.textContent;
-						} else if (element.nodeName == 'option') {
-							this.options.push(new this.childOption(element));
-						}
-						element = element.nextElementSibling;
+			};
+			
+			this.exportXML = function(root)
+			{
+				var node = root.createElement(this.type);
+				switch(this.type)
+				{
+				case "statement":
+					node.textContent = this.statement;
+					break;
+				case "question":
+					node.id = this.id;
+					node.setAttribute("mandatory",this.mandatory);
+					node.setAttribute("boxsize",this.boxsize);
+					node.textContent = this.question;
+					break;
+				case "number":
+					node.id = this.id;
+					node.setAttribute("mandatory",this.mandatory);
+					node.setAttribute("min", this.min);
+					node.setAttribute("max", this.max);
+					node.setAttribute("step", this.step);
+					node.textContent = this.statement;
+					break;
+				case "checkbox":
+					node.id = this.id;
+					var statement = root.createElement("statement");
+					statement.textContent = this.statement;
+					node.appendChild(statement);
+					for (var i=0; i<this.options.length; i++)
+					{
+						var option = this.options[i];
+						var optionNode = root.createElement("option");
+						optionNode.id = option.id;
+						optionNode.textContent = option.text;
+						node.appendChild(optionNode);
 					}
+					break;
+				case "radio":
+					node.id = this.id;
+					var statement = root.createElement("statement");
+					statement.textContent = this.statement;
+					node.appendChild(statement);
+					for (var i=0; i<this.options.length; i++)
+					{
+						var option = this.options[i];
+						var optionNode = root.createElement("option");
+						optionNode.setAttribute("name",option.name);
+						optionNode.textContent = option.text;
+						node.appendChild(optionNode);
+					}
+					break;
 				}
-			} else if (child.nodeName == "number") {
-				this.statement = child.textContent;
-				this.id = child.id;
-				this.min = child.getAttribute('min');
-				this.max = child.getAttribute('max');
-				this.step = child.getAttribute('step');
+				return node;
+			};
+		};
+		this.construct = function(Collection)
+		{
+			if (Collection.childElementCount != 0) {
+				var child = Collection.firstElementChild;
+				var node = new this.OptionNode();
+				node.decode(child);
+				this.options.push(node);
+				while (child.nextElementSibling != null) {
+					child = child.nextElementSibling;
+					node = new this.OptionNode();
+					node.decode(child);
+					this.options.push(node);
+				}
 			}
 		};
-		
-		// On construction:
-		if (Collection.length != 0) {
-			Collection = Collection[0];
-			if (Collection.childElementCount != 0) {
-				var child = Collection.firstElementChild;
-				this.options.push(new this.OptionNode(child));
-				while (child.nextElementSibling != null) {
-					child = child.nextElementSibling;
-					this.options.push(new this.OptionNode(child));
-				}
-			}
-		}
 	};
+	this.preTest = new this.prepostNode("pretest");
+	this.postTest = new this.prepostNode("posttest");
 	
 	this.metricNode = function(name) {
 		this.enabled = name;
 	};
 	
-	this.audioHolderNode = function(parent,xml) {
+	this.audioHolderNode = function(parent) {
 		this.type = 'audioHolder';
-		this.presentedId = parent.audioHolders.length;
-		this.interfaceNode = function(DOM) {
-			var title = DOM.getElementsByTagName('title');
-			if (title.length == 0) {this.title = null;}
-			else {this.title = title[0].textContent;}
-			this.options = parent.commonInterface.options;
-			var scale = DOM.getElementsByTagName('scale');
-			this.scale = [];
-			for (var i=0; i<scale.length; i++) {
-				var arr = [null, null];
-				arr[0] = scale[i].getAttribute('position');
-				arr[1] = scale[i].textContent;
-				this.scale.push(arr);
+		this.presentedId = undefined;
+		this.id = undefined;
+		this.hostURL = undefined;
+		this.sampleRate = undefined;
+		this.randomiseOrder = undefined;
+		this.loop = undefined;
+		this.elementComments = undefined;
+		this.outsideReference = null;
+		this.loudness = null;
+		this.initialPosition = null;
+		this.preTest = new parent.prepostNode("pretest");
+		this.postTest = new parent.prepostNode("posttest");
+		this.interfaces = [];
+		this.commentBoxPrefix = "Comment on track";
+		this.audioElements = [];
+		this.commentQuestions = [];
+		
+		this.decode = function(parent,xml)
+		{
+			this.presentedId = parent.audioHolders.length;
+			this.id = xml.id;
+			this.hostURL = xml.getAttribute('hostURL');
+			this.sampleRate = xml.getAttribute('sampleRate');
+			if (xml.getAttribute('randomiseOrder') == "true") {this.randomiseOrder = true;}
+			else {this.randomiseOrder = false;}
+			this.repeatCount = xml.getAttribute('repeatCount');
+			if (xml.getAttribute('loop') == 'true') {this.loop = true;}
+			else {this.loop = false;}
+			if (xml.getAttribute('elementComments') == "true") {this.elementComments = true;}
+			else {this.elementComments = false;}
+			if (typeof parent.loudness === "number")
+			{
+				this.loudness = parent.loudness;
+			}
+			if (typeof xml.getAttribute('initial-position') === "string")
+			{
+				var xmlInitialPosition = Number(xml.getAttribute('initial-position'));
+				if (isNaN(xmlInitialPosition) == false)
+				{
+					if (xmlInitialPosition > 1)
+					{
+						xmlInitialPosition /= 100;
+					}
+					this.initialPosition = xmlInitialPosition;
+				}
+			}
+			if (xml.getAttribute('loudness') != null)
+			{
+				var XMLloudness = xml.getAttribute('loudness');
+				if (isNaN(Number(XMLloudness)) == false)
+				{
+					this.loudness = Number(XMLloudness);
+				}
+			}
+			var setupPreTestNode = xml.getElementsByTagName('PreTest');
+			if (setupPreTestNode.length != 0)
+			{
+				setupPreTestNode = setupPreTestNode[0];
+				this.preTest.construct(setupPreTestNode);
+			}
+			
+			var setupPostTestNode = xml.getElementsByTagName('PostTest');
+			if (setupPostTestNode.length != 0)
+			{
+				setupPostTestNode = setupPostTestNode[0];
+				this.postTest.construct(setupPostTestNode);
+			}
+			
+			var interfaceDOM = xml.getElementsByTagName('interface');
+			for (var i=0; i<interfaceDOM.length; i++) {
+				var node = new this.interfaceNode();
+				node.decode(interfaceDOM[i]);
+				this.interfaces.push(node);
+			}
+			this.commentBoxPrefix = xml.getElementsByTagName('commentBoxPrefix');
+			if (this.commentBoxPrefix.length != 0) {
+				this.commentBoxPrefix = this.commentBoxPrefix[0].textContent;
+			} else {
+				this.commentBoxPrefix = "Comment on track";
+			}
+			var audioElementsDOM = xml.getElementsByTagName('audioElements');
+			var outsideReferenceHolder = null;
+			for (var i=0; i<audioElementsDOM.length; i++) {
+				var node = new this.audioElementNode();
+				node.decode(this,audioElementsDOM[i]);
+				if (audioElementsDOM[i].getAttribute('type') == 'outsidereference') {
+					if (this.outsideReference == null) {
+						outsideReferenceHolder = node;
+						this.outsideReference = i;
+					} else {
+						console.log('Error: only one audioElement can be of type outsidereference per audioHolder');
+						this.audioElements.push(node);
+						console.log('Element id '+audioElementsDOM[i].id+' made into normal node');
+					}
+				} else {
+					this.audioElements.push(node);
+				}
+			}
+			
+			if (this.randomiseOrder == true && typeof randomiseOrder === "function")
+			{
+				this.audioElements = randomiseOrder(this.audioElements);
+			}
+			if (outsideReferenceHolder != null)
+			{
+				this.audioElements.push(outsideReferenceHolder);
+				this.outsideReference = this.audioElements.length-1;
+			}
+			
+			
+			var commentQuestionsDOM = xml.getElementsByTagName('CommentQuestion');
+			for (var i=0; i<commentQuestionsDOM.length; i++) {
+				var node = new this.commentQuestionNode();
+				node.decode(commentQuestionsDOM[i]);
+				this.commentQuestions.push(node);
 			}
 		};
 		
-		this.audioElementNode = function(parent,xml) {
-			this.url = xml.getAttribute('url');
-			this.id = xml.id;
-			this.parent = parent;
-			this.type = xml.getAttribute('type');
-			if (this.type == null) {this.type = "normal";}
-			if (this.type == 'anchor') {this.anchor = true;}
-			else {this.anchor = false;}
-			if (this.type == 'reference') {this.reference = true;}
-			else {this.reference = false;}
+		this.encode = function(root)
+		{
+			var AHNode = root.createElement("audioHolder");
+			AHNode.id = this.id;
+			AHNode.setAttribute("hostURL",this.hostURL);
+			AHNode.setAttribute("sampleRate",this.sampleRate);
+			AHNode.setAttribute("randomiseOrder",this.randomiseOrder);
+			AHNode.setAttribute("repeatCount",this.repeatCount);
+			AHNode.setAttribute("loop",this.loop);
+			AHNode.setAttribute("elementComments",this.elementComments);
+			if(this.loudness != null) {AHNode.setAttribute("loudness",this.loudness);}
+			if(this.initialPosition != null) {
+				AHNode.setAttribute("initial-position",this.initialPosition*100);
+			}
+			for (var i=0; i<this.interfaces.length; i++)
+			{
+				AHNode.appendChild(this.interfaces[i].encode(root));
+			}
 			
-			if (this.anchor == true || this.reference == true)
+			for (var i=0; i<this.audioElements.length; i++) {
+				AHNode.appendChild(this.audioElements[i].encode(root));
+			}
+			// Create <CommentQuestion>
+			for (var i=0; i<this.commentQuestions.length; i++)
 			{
-				this.marker = xml.getAttribute('marker');
-				if (this.marker != undefined)
+				AHNode.appendChild(this.commentQuestions[i].exportXML(root));
+			}
+			
+			// Create <PreTest>
+			var AHPreTest = root.createElement("PreTest");
+			for (var i=0; i<this.preTest.options.length; i++)
+			{
+				AHPreTest.appendChild(this.preTest.options[i].exportXML(root));
+			}
+			
+			var AHPostTest = root.createElement("PostTest");
+			for (var i=0; i<this.postTest.options.length; i++)
+			{
+				AHPostTest.appendChild(this.postTest.options[i].exportXML(root));
+			}
+			AHNode.appendChild(AHPreTest);
+			AHNode.appendChild(AHPostTest);
+			return AHNode;
+		};
+		
+		this.interfaceNode = function() {
+			this.title = undefined;
+			this.options = [];
+			this.scale = [];
+			this.name = undefined;
+			this.decode = function(DOM)
+			{
+				var title = DOM.getElementsByTagName('title');
+				if (title.length == 0) {this.title = null;}
+				else {this.title = title[0].textContent;}
+				var name = DOM.getAttribute("name");
+				if (name != undefined) {this.name = name;}
+				this.options = parent.commonInterface.options;
+				var scale = DOM.getElementsByTagName('scale');
+				this.scale = [];
+				for (var i=0; i<scale.length; i++) {
+					var arr = [null, null];
+					arr[0] = scale[i].getAttribute('position');
+					arr[1] = scale[i].textContent;
+					this.scale.push(arr);
+				}
+			};
+			this.encode = function(root)
+			{
+				var node = root.createElement("interface");
+				if (this.title != undefined)
 				{
-					this.marker = Number(this.marker);
-					if (isNaN(this.marker) == false)
+					var title = root.createElement("title");
+					title.textContent = this.title;
+					node.appendChild(title);
+				}
+				for (var i=0; i<this.options.length; i++)
+				{
+					var optionNode = root.createElement(this.options[i].type);
+					if (this.options[i].type == "option")
 					{
-						if (this.marker > 1)
-						{	this.marker /= 100.0;}
-						if (this.marker >= 0 && this.marker <= 1)
+						optionNode.setAttribute("name",this.options[i].name);
+					} else if (this.options[i].type == "check") {
+						optionNode.setAttribute("check",this.options[i].check);
+					} else if (this.options[i].type == "scalerange") {
+						optionNode.setAttribute("min",this.options[i].min*100);
+						optionNode.setAttribute("max",this.options[i].max*100);
+					}
+					node.appendChild(optionNode);
+				}
+				for (var i=0; i<this.scale.length; i++) {
+					var scale = root.createElement("scale");
+					scale.setAttribute("position",this.scale[i][0]);
+					scale.textContent = this.scale[i][1];
+					node.appendChild(scale);
+				}
+				return node;
+			};
+		};
+		
+		this.audioElementNode = function() {
+			this.url = null;
+			this.id = null;
+			this.parent = null;
+			this.type = "normal";
+			this.marker = false;
+			this.enforce = false;
+			this.gain = 1.0;
+			this.decode = function(parent,xml)
+			{
+				this.url = xml.getAttribute('url');
+				this.id = xml.id;
+				this.parent = parent;
+				this.type = xml.getAttribute('type');
+				var gain = xml.getAttribute('gain');
+				if (isNaN(gain) == false && gain != null)
+				{
+					this.gain = decibelToLinear(Number(gain));
+				}
+				if (this.type == null) {this.type = "normal";}
+				if (this.type == 'anchor') {this.anchor = true;}
+				else {this.anchor = false;}
+				if (this.type == 'reference') {this.reference = true;}
+				else {this.reference = false;}
+				if (this.anchor == true || this.reference == true)
+				{
+					this.marker = xml.getAttribute('marker');
+					if (this.marker != undefined)
+					{
+						this.marker = Number(this.marker);
+						if (isNaN(this.marker) == false)
 						{
-							this.enforce = true;
-							return;
+							if (this.marker > 1)
+							{	this.marker /= 100.0;}
+							if (this.marker >= 0 && this.marker <= 1)
+							{
+								this.enforce = true;
+								return;
+							} else {
+								console.log("ERROR - Marker of audioElement "+this.id+" is not between 0 and 1 (float) or 0 and 100 (integer)!");
+								console.log("ERROR - Marker not enforced!");
+							}
 						} else {
-							console.log("ERROR - Marker of audioElement "+this.id+" is not between 0 and 1 (float) or 0 and 100 (integer)!");
+							console.log("ERROR - Marker of audioElement "+this.id+" is not a number!");
 							console.log("ERROR - Marker not enforced!");
 						}
-					} else {
-						console.log("ERROR - Marker of audioElement "+this.id+" is not a number!");
-						console.log("ERROR - Marker not enforced!");
 					}
 				}
-			}
-			this.marker = false;
-			this.enforce = false;
+			};
+			this.encode = function(root)
+			{
+				var AENode = root.createElement("audioElements");
+				AENode.id = this.id;
+				AENode.setAttribute("url",this.url);
+				AENode.setAttribute("type",this.type);
+				AENode.setAttribute("gain",linearToDecibel(this.gain));
+				if (this.marker != false)
+				{
+					AENode.setAttribute("marker",this.marker*100);
+				}
+				return AENode;
+			};
 		};
 		
 		this.commentQuestionNode = function(xml) {
-			this.childOption = function(element) {
+			this.id = null;
+			this.type = undefined;
+			this.question = undefined;
+			this.options = [];
+			this.statement = undefined;
+			
+			this.childOption = function() {
 				this.type = 'option';
-				this.name = element.getAttribute('name');
-				this.text = element.textContent;
+				this.name = null;
+				this.text = null;
 			};
-			this.id = xml.id;
-			if (xml.getAttribute('mandatory') == 'true') {this.mandatory = true;}
-			else {this.mandatory = false;}
-			this.type = xml.getAttribute('type');
-			if (this.type == undefined) {this.type = 'text';}
-			switch (this.type) {
-			case 'text':
-				this.question = xml.textContent;
-				break;
-			case 'radio':
-				var child = xml.firstElementChild;
-				this.options = [];
-				while (child != undefined) {
-					if (child.nodeName == 'statement' && this.statement == undefined) {
-						this.statement = child.textContent;
-					} else if (child.nodeName == 'option') {
-						this.options.push(new this.childOption(child));
+			this.exportXML = function(root)
+			{
+				var CQNode = root.createElement("CommentQuestion");
+				CQNode.id = this.id;
+				CQNode.setAttribute("type",this.type);
+				switch(this.type)
+				{
+				case "text":
+					CQNode.textContent = this.question;
+					break;
+				case "radio":
+					var statement = root.createElement("statement");
+					statement.textContent = this.statement;
+					CQNode.appendChild(statement);
+					for (var i=0; i<this.options.length; i++)
+					{
+						var optionNode = root.createElement("option");
+						optionNode.setAttribute("name",this.options[i].name);
+						optionNode.textContent = this.options[i].text;
+						CQNode.appendChild(optionNode);
 					}
-					child = child.nextElementSibling;
+					break;
+				case "checkbox":
+					var statement = root.createElement("statement");
+					statement.textContent = this.statement;
+					CQNode.appendChild(statement);
+					for (var i=0; i<this.options.length; i++)
+					{
+						var optionNode = root.createElement("option");
+						optionNode.setAttribute("name",this.options[i].name);
+						optionNode.textContent = this.options[i].text;
+						CQNode.appendChild(optionNode);
+					}
+					break;
 				}
-				break;
-			case 'checkbox':
-				var child = xml.firstElementChild;
-				this.options = [];
-				while (child != undefined) {
-					if (child.nodeName == 'statement' && this.statement == undefined) {
-						this.statement = child.textContent;
-					} else if (child.nodeName == 'option') {
-						this.options.push(new this.childOption(child));
+				return CQNode;
+			};
+			this.decode = function(xml) {
+				this.id = xml.id;
+				if (xml.getAttribute('mandatory') == 'true') {this.mandatory = true;}
+				else {this.mandatory = false;}
+				this.type = xml.getAttribute('type');
+				if (this.type == undefined) {this.type = 'text';}
+				switch (this.type) {
+				case 'text':
+					this.question = xml.textContent;
+					break;
+				case 'radio':
+					var child = xml.firstElementChild;
+					this.options = [];
+					while (child != undefined) {
+						if (child.nodeName == 'statement' && this.statement == undefined) {
+							this.statement = child.textContent;
+						} else if (child.nodeName == 'option') {
+							var node = new this.childOption();
+							node.name = child.getAttribute('name');
+							node.text = child.textContent;
+							this.options.push(node);
+						}
+						child = child.nextElementSibling;
 					}
-					child = child.nextElementSibling;
+					break;
+				case 'checkbox':
+					var child = xml.firstElementChild;
+					this.options = [];
+					while (child != undefined) {
+						if (child.nodeName == 'statement' && this.statement == undefined) {
+							this.statement = child.textContent;
+						} else if (child.nodeName == 'option') {
+							var node = new this.childOption();
+							node.name = child.getAttribute('name');
+							node.text = child.textContent;
+							this.options.push(node);
+						}
+						child = child.nextElementSibling;
+					}
+					break;
 				}
-				break;
-			}
+			};
 		};
-		
-		this.id = xml.id;
-		this.hostURL = xml.getAttribute('hostURL');
-		this.sampleRate = xml.getAttribute('sampleRate');
-		if (xml.getAttribute('randomiseOrder') == "true") {this.randomiseOrder = true;}
-		else {this.randomiseOrder = false;}
-		this.repeatCount = xml.getAttribute('repeatCount');
-		if (xml.getAttribute('loop') == 'true') {this.loop = true;}
-		else {this.loop == false;}
-		if (xml.getAttribute('elementComments') == "true") {this.elementComments = true;}
-		else {this.elementComments = false;}
-		
-		this.preTest = new parent.prepostNode('pretest',xml.getElementsByTagName('PreTest'));
-		this.postTest = new parent.prepostNode('posttest',xml.getElementsByTagName('PostTest'));
-		
-		this.interfaces = [];
-		var interfaceDOM = xml.getElementsByTagName('interface');
-		for (var i=0; i<interfaceDOM.length; i++) {
-			this.interfaces.push(new this.interfaceNode(interfaceDOM[i]));
-		}
-		
-		this.commentBoxPrefix = xml.getElementsByTagName('commentBoxPrefix');
-		if (this.commentBoxPrefix.length != 0) {
-			this.commentBoxPrefix = this.commentBoxPrefix[0].textContent;
-		} else {
-			this.commentBoxPrefix = "Comment on track";
-		}
-		
-		this.audioElements  =[];
-		var audioElementsDOM = xml.getElementsByTagName('audioElements');
-		this.outsideReference = null;
-		for (var i=0; i<audioElementsDOM.length; i++) {
-			if (audioElementsDOM[i].getAttribute('type') == 'outsidereference') {
-				if (this.outsideReference == null) {
-					this.outsideReference = new this.audioElementNode(this,audioElementsDOM[i]);
-				} else {
-					console.log('Error only one audioelement can be of type outsidereference per audioholder');
-					this.audioElements.push(new this.audioElementNode(this,audioElementsDOM[i]));
-					console.log('Element id '+audioElementsDOM[i].id+' made into normal node');
-				}
-			} else {
-				this.audioElements.push(new this.audioElementNode(this,audioElementsDOM[i]));
-			}
-		}
-		
-		if (this.randomiseOrder) {
-			this.audioElements = randomiseOrder(this.audioElements);
-		}
-		
-		this.commentQuestions = [];
-		var commentQuestionsDOM = xml.getElementsByTagName('CommentQuestion');
-		for (var i=0; i<commentQuestionsDOM.length; i++) {
-			this.commentQuestions.push(new this.commentQuestionNode(commentQuestionsDOM[i]));
-		}
 	};
 }
-
+			
 function Interface(specificationObject) {
 	// This handles the bindings between the interface and the audioEngineContext;
 	this.specification = specificationObject;
 	this.insertPoint = document.getElementById("topLevelBody");
 	
+	this.newPage = function(audioHolderObject)
+	{
+		audioEngineContext.newTestPage();
+		/// CHECK FOR SAMPLE RATE COMPATIBILITY
+		if (audioHolderObject.sampleRate != undefined) {
+			if (Number(audioHolderObject.sampleRate) != audioContext.sampleRate) {
+				var errStr = 'Sample rates do not match! Requested '+Number(audioHolderObject.sampleRate)+', got '+audioContext.sampleRate+'. Please set the sample rate to match before completing this test.';
+				alert(errStr);
+				return;
+			}
+		}
+		
+		audioEngineContext.loopPlayback = audioHolderObject.loop;
+		// Delete any previous audioObjects associated with the audioEngine
+		audioEngineContext.audioObjects = [];
+		interfaceContext.deleteCommentBoxes();
+		interfaceContext.deleteCommentQuestions();
+		loadTest(audioHolderObject);
+	};
+	
 	// Bounded by interface!!
 	// Interface object MUST have an exportXMLDOM method which returns the various DOM levels
 	// For example, APE returns  the slider position normalised in a <value> tag.
@@ -1672,6 +2285,7 @@
 	
 	this.resizeWindow = function(event)
 	{
+		popup.resize(event);
 		for(var i=0; i<this.commentBoxes.length; i++)
 		{this.commentBoxes[i].resize();}
 		for(var i=0; i<this.commentQuestions.length; i++)
@@ -2089,7 +2703,7 @@
 		this.setTimePerPixel = function(audioObject) {
 			//maxTime must be in seconds
 			this.playbackObject = audioObject;
-			this.maxTime = audioObject.buffer.duration;
+			this.maxTime = audioObject.buffer.buffer.duration;
 			var width = 490; //500 - 10, 5 each side of the tracker head
 			this.timePerPixel = this.maxTime/490;
 			if (this.maxTime < 60) {
@@ -2201,7 +2815,7 @@
 		for (var i = 0; i<audioEngineContext.audioObjects.length; i++)
 		{
 			var object = audioEngineContext.audioObjects[i];
-			var time = object.buffer.duration;
+			var time = object.buffer.buffer.duration;
 			var metric = object.metric;
 			var passed = false;
 			for (var j=0; j<metric.listenTracker.length; j++)
@@ -2225,7 +2839,7 @@
 		}
 		if (check_pass == false)
 		{
-			var str_start = "You have not listened to fragments ";
+			var str_start = "You have not completely listened to fragments ";
 			for (var i=0; i<error_obj.length; i++)
 			{
 				str_start += error_obj[i];
@@ -2239,4 +2853,64 @@
 			alert(str_start);
 		}
 	};
+	this.checkAllMoved = function()
+	{
+		var str = "You have not moved ";
+		var failed = [];
+		for (var i in audioEngineContext.audioObjects)
+		{
+			if(audioEngineContext.audioObjects[i].metric.wasMoved == false && audioEngineContext.audioObjects[i].specification.type != 'outsidereference')
+			{
+				failed.push(audioEngineContext.audioObjects[i].id);
+			}
+		}
+		if (failed.length == 0)
+		{
+			return true;
+		} else if (failed.length == 1)
+		{
+			str += 'track '+failed[0];
+		} else {
+			str += 'tracks ';
+			for (var i=0; i<failed.length-1; i++)
+			{
+				str += failed[i]+', ';
+			}
+			str += 'and '+failed[i];
+		}
+		str +='.';
+		alert(str);
+		console.log(str);
+		return false;
+	};
+	this.checkAllPlayed = function()
+	{
+		var str = "You have not played ";
+		var failed = [];
+		for (var i in audioEngineContext.audioObjects)
+		{
+			if(audioEngineContext.audioObjects[i].metric.wasListenedTo == false)
+			{
+				failed.push(audioEngineContext.audioObjects[i].id);
+			}
+		}
+		if (failed.length == 0)
+		{
+			return true;
+		} else if (failed.length == 1)
+		{
+			str += 'track '+failed[0];
+		} else {
+			str += 'tracks ';
+			for (var i=0; i<failed.length-1; i++)
+			{
+				str += failed[i]+', ';
+			}
+			str += 'and '+failed[i];
+		}
+		str +='.';
+		alert(str);
+		console.log(str);
+		return false;
+	};
 }
\ No newline at end of file
--- a/docs/Instructions/BuildingInterface.tex	Mon Dec 21 23:18:43 2015 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,60 +0,0 @@
-\documentclass[11pt, oneside]{article}   	% use "amsart" instead of "article" for AMSLaTeX format
-\usepackage[margin=2cm]{geometry}            		% See geometry.pdf to learn the layout options. There are lots.
-\geometry{letterpaper}                   		% ... or a4paper or a5paper or ... 
-%\geometry{landscape}                		% Activate for rotated page geometry
-\usepackage[parfill]{parskip}    		% Activate to begin paragraphs with an empty line rather than an indent
-\usepackage{graphicx}				% Use pdf, png, jpg, or eps§ with pdflatex; use eps in DVI mode
-								% TeX will automatically convert eps --> pdf in pdflatex		
-								
-\usepackage{listings}				% Source code
-\usepackage{amssymb}
-\usepackage{cite}
-\usepackage{hyperref}				% Hyperlinks
-
-
-\graphicspath{{img/}}					% Relative path where the images are stored. 
-
-\title{Building your own Interface for\\ Web Audio Evaluation Tool}
-\author{Nicholas Jillings}
-\date{}							% Activate to display a given date or no date
-
-\begin{document}
-\maketitle
-
-\section{Introduction}
-This guide will help you to construct your own interface on top of the WAET (Web Audio Evaluation Tool) engine. The WAET engine resides in the core.js file, this contains prototype objects to handle most of the test creation, operation and data collection. The interface simply has to link into this at the correct points.
-
-\subsection{Nodes to familiarise}
-Core.js handles several very important nodes which you should become familiar with. The first is the Audio Engine, initialised and stored in variable 'AudioEngineContext'. This handles the playback of the web audio nodes as well as storing the 'AudioObjects'. The 'AudioObjects' are custom nodes which hold the audio fragments for playback. These nodes also have a link to two interface objects, the comment box if enabled and the interface providing the ranking. On creation of an 'AudioObject' the interface link will be nulled, it is up to the interface to link these correctly.
-
-The specification document will be decoded and parsed into an object called 'specification'. This will hold all of the specifications various nodes. The test pages and any pre/post test objects are processed by a test state which will proceed through the test when called to by the interface. Any checks (such as playback or movement checks) are to be completed by the interface before instructing the test state to proceed. The test state will call the interface on each page load with the page specification node.
-
-\subsection{Modifying Core.js}
-Whilst there is very little code actually needed, you do need to instruct core.js to load your interface file when called for from a specification node. There is a function called 'loadProjectSpecCallback' which handles the decoding of the specification and setting any external items (such as metric collection). At the very end of this function there is an if statement, add to this list with your interface string to link to the source. There is an example in there for both the APE and MUSHRA tests already included. Note: Any updates to core.js in future work will most likely overwrite your changes to this file, so remember to check your interface is still here after any update that interferes with core.js.
-Any further files can be loaded here as well, such as css styling files. jQuery is already included.
-
-\section{Building the Interface}
-Your interface file will get loaded automatically when the 'interface' attribute of the setup node matches the string in the 'loadProjectSpecCallback' function. The following functions must be defined in your interface file.
-\begin{itemize}
-\item \texttt{loadInterface} - Called once when the document is parsed. This creates any necessary bindings, such as to the metric collection classes and any check commands. Here you can also start the structure for your test such as placing in any common nodes (such as the title and empty divs to drop content into later).
-\item \texttt{loadTest(audioHolderObject)} - Called for each page load. The audioHolderObject contains a specification node holding effectively one of the audioHolder nodes.
-\item \texttt{resizeWindow(event)} - Handle for any window resizing. Simply scale your interface accordingly. This function must be here, but can me an empty function call.
-\end{itemize}
-
-\subsection{loadInterface}
-This function is called by the interface once the document has been parsed since some browsers may parse files asynchronously. The best method is simply to put 'loadInterface()' at the top of your interface file, therefore when the JavaScript engine is ready the function is called.
-
-By default the HTML file has an element with id "topLevelBody" where you can build your interface. Make sure you blank the contents of that object. This function is the perfect time to build any fixed items, such as the page title, session titles, interface buttons (Start, Stop, Submit) and any holding and structure elements for later on.
-
-At the end of the function, insert these two function calls: testState.initialise() and testState.advanceState();. This will actually begin the test sequence, including the pre-test options (if any are included in the specification document).
-
-\subsection{loadTest(audioHolderObject)}
-This function is called on each new test page. It is this functions job to clear out the previous test and set up the new page. Use the function audioEngineContext.newTestPage(); to instruct the audio engine to prepare for a new page. "audioEngineContext.audioObjects = [];" will delete any audioObjects, interfaceContext.deleteCommentBoxes(); will delete any comment boxes and interfaceContext.deleteCommentQuestions(); will delete any extra comment boxes specified by commentQuestion nodes.
-
-This function will need to instruct the audio engine to build each fragment. Just passing the constructor each element from the audioHolderObject will build the track, audioEngineContext.newTrack(element) (where element is the audioHolderObject audio element). This will return a reference to the constructed audioObject. Decoding of the audio will happen asynchronously.
-
-You also need to link audioObject.interfaceDOM with your interface object for that audioObject. The interfaceDOM object has a few default methods. Firstly it must start disabled and become enabled once the audioObject has decoded the audio (function call: enable()). Next it must have a function exportXMLDOM(), this will return the xml node for your interface, however the default is for it to return a value node, with textContent equal to the normalised value. You can perform other functions, but our scripts may not work if something different is specified (as it will breach our results specifications). Finally it must also have a method getValue, which returns the normalised value.
-
-It is also the job the interfaceDOM to call any metric collection functions necessary, however some functions may be better placed outside (for example, the APE interface uses drag and drop, therefore the best way was to call the metric functions from the dragEnd function, which is called when the interface object is dropped). Metrics based upon listening are handled by the audioObject. The interfaceDOM object must manage any movement metrics. For a list of valid metrics and their behaviours, look at the project specification document included in the repository/docs location. The same goes for any checks required when pressing the submit button, or any other method to proceed the test state.
-
-\end{document}
\ No newline at end of file
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/docs/Instructions/Instructions.bib	Wed Dec 23 14:36:00 2015 +0000
@@ -0,0 +1,41 @@
+%% This BibTeX bibliography file was created using BibDesk.
+%% http://bibdesk.sourceforge.net/
+
+%% Created for Brecht De Man at 2015-12-07 15:24:14 +0100 
+
+
+%% Saved with string encoding Unicode (UTF-8) 
+
+
+
+@book{mushra,
+	Date-Added = {2015-12-07 14:24:08 +0000},
+	Date-Modified = {2015-12-07 14:24:08 +0000},
+	Keywords = {standard},
+	Publisher = {Recommendation {ITU-R BS.1534-1}},
+	Read = {1},
+	Title = {Method for the subjective assessment of intermediate quality level of coding systems},
+	Year = {2003},
+	Bdsk-File-1 = {YnBsaXN0MDDUAQIDBAUGJCVYJHZlcnNpb25YJG9iamVjdHNZJGFyY2hpdmVyVCR0b3ASAAGGoKgHCBMUFRYaIVUkbnVsbNMJCgsMDxJXTlMua2V5c1pOUy5vYmplY3RzViRjbGFzc6INDoACgAOiEBGABIAFgAdccmVsYXRpdmVQYXRoWWFsaWFzRGF0YV8QaC4uLy4uLy4uLy4uL0dvb2dsZSBEcml2ZS9Eb2N1bWVudHMvUGFwZXJzL1RlY2huaWNhbCBEb2N1bWVudHMvTVVTSFJBIFItUkVDLUJTLjE1MzQtMS0yMDAzMDEtSSEhUERGLUUucGRm0hcLGBlXTlMuZGF0YU8RAk4AAAAAAk4AAgAADE1hY2ludG9zaCBIRAAAAAAAAAAAAAAAAAAAANBcXYdIKwAAAApoyx9NVVNIUkEgUi1SRUMtQlMuMTUzNC0jQUI2NTUucGRmAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAACrZVzdlVaAAAAAAAAAAAAAQABQAACSAAAAAAAAAAAAAAAAAAAAATVGVjaG5pY2FsIERvY3VtZW50cwAAEAAIAADQXEFnAAAAEQAIAADN2TlIAAAAAQAYAApoywAKXxEACl67AApdAgAJRF4AApPVAAIAcU1hY2ludG9zaCBIRDpVc2VyczoAQnJlY2h0OgBHb29nbGUgRHJpdmU6AERvY3VtZW50czoAUGFwZXJzOgBUZWNobmljYWwgRG9jdW1lbnRzOgBNVVNIUkEgUi1SRUMtQlMuMTUzNC0jQUI2NTUucGRmAAAOAFYAKgBNAFUAUwBIAFIAQQAgAFIALQBSAEUAQwAtAEIAUwAuADEANQAzADQALQAxAC0AMgAwADAAMwAwADEALQBJACEAIQBQAEQARgAtAEUALgBwAGQAZgAPABoADABNAGEAYwBpAG4AdABvAHMAaAAgAEgARAASAGlVc2Vycy9CcmVjaHQvR29vZ2xlIERyaXZlL0RvY3VtZW50cy9QYXBlcnMvVGVjaG5pY2FsIERvY3VtZW50cy9NVVNIUkEgUi1SRUMtQlMuMTUzNC0xLTIwMDMwMS1JISFQREYtRS5wZGYAABMAAS8AABUAAgAN//8AAIAG0hscHR5aJGNsYXNzbmFtZVgkY2xhc3Nlc11OU011dGFibGVEYXRhox0fIFZOU0RhdGFYTlNPYmplY3TSGxwiI1xOU0RpY3Rpb25hcnmiIiBfEA9OU0tleWVkQXJjaGl2ZXLRJidUcm9vdIABAAgAEQAaACMALQAyADcAQABGAE0AVQBgAGcAagBsAG4AcQBzAHUAdwCEAI4A+QD+AQYDWANaA18DagNzA4EDhQOMA5UDmgOnA6oDvAO/A8QAAAAAAAACAQAAAAAAAAAoAAAAAAAAAAAAAAAAAAADxg==}}
+
+@conference{ape,
+	Author = {De Man, Brecht and Joshua D. Reiss},
+	Booktitle = {136th Convention of the Audio Engineering Society},
+	Date-Added = {2015-09-29 17:07:16 +0000},
+	Date-Modified = {2015-09-29 17:07:20 +0000},
+	Keywords = {perceptual evaluation},
+	Month = {April},
+	Read = {1},
+	Title = {{APE}: {A}udio {P}erceptual {E}valuation toolbox for {MATLAB}},
+	Year = {2014},
+	Bdsk-File-1 = {YnBsaXN0MDDUAQIDBAUGJCVYJHZlcnNpb25YJG9iamVjdHNZJGFyY2hpdmVyVCR0b3ASAAGGoKgHCBMUFRYaIVUkbnVsbNMJCgsMDxJXTlMua2V5c1pOUy5vYmplY3RzViRjbGFzc6INDoACgAOiEBGABIAFgAdccmVsYXRpdmVQYXRoWWFsaWFzRGF0YV8QOi4uLy4uLy4uLy4uL0dvb2dsZSBEcml2ZS9Xcml0aW5ncy9fcHVibGljYXRpb25zL2FlczEzNi5wZGbSFwsYGVdOUy5kYXRhTxEBsgAAAAABsgACAAAMTWFjaW50b3NoIEhEAAAAAAAAAAAAAAAAAAAA0Fxdh0grAAAACl8UCmFlczEzNi5wZGYAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAKaS7PXHsUAAAAAAAAAAAABAAEAAAJIAAAAAAAAAAAAAAAAAAAAA1fcHVibGljYXRpb25zAAAQAAgAANBcQWcAAAARAAgAAM9cbQQAAAABABQACl8UAApeugAKXQIACUReAAKT1QACAE1NYWNpbnRvc2ggSEQ6VXNlcnM6AEJyZWNodDoAR29vZ2xlIERyaXZlOgBXcml0aW5nczoAX3B1YmxpY2F0aW9uczoAYWVzMTM2LnBkZgAADgAWAAoAYQBlAHMAMQAzADYALgBwAGQAZgAPABoADABNAGEAYwBpAG4AdABvAHMAaAAgAEgARAASADtVc2Vycy9CcmVjaHQvR29vZ2xlIERyaXZlL1dyaXRpbmdzL19wdWJsaWNhdGlvbnMvYWVzMTM2LnBkZgAAEwABLwAAFQACAA3//wAAgAbSGxwdHlokY2xhc3NuYW1lWCRjbGFzc2VzXU5TTXV0YWJsZURhdGGjHR8gVk5TRGF0YVhOU09iamVjdNIbHCIjXE5TRGljdGlvbmFyeaIiIF8QD05TS2V5ZWRBcmNoaXZlctEmJ1Ryb290gAEACAARABoAIwAtADIANwBAAEYATQBVAGAAZwBqAGwAbgBxAHMAdQB3AIQAjgDLANAA2AKOApAClQKgAqkCtwK7AsICywLQAt0C4ALyAvUC+gAAAAAAAAIBAAAAAAAAACgAAAAAAAAAAAAAAAAAAAL8}}
+
+@conference{waet,
+	Author = {Nicholas Jillings and David Moffat and De Man, Brecht and Joshua D. Reiss},
+	Booktitle = {12th Sound and Music Computing Conference},
+	Date-Added = {2015-09-22 16:48:27 +0000},
+	Date-Modified = {2015-09-22 16:48:33 +0000},
+	Month = {July},
+	Read = {1},
+	Title = {Web {A}udio {E}valuation {T}ool: {A} browser-based listening test environment},
+	Year = {2015}}
Binary file docs/Instructions/Instructions.pdf has changed
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/docs/Instructions/Instructions.tex	Wed Dec 23 14:36:00 2015 +0000
@@ -0,0 +1,705 @@
+\documentclass[11pt, oneside]{article}   	% use "amsart" instead of "article" for AMSLaTeX format
+\usepackage{geometry}                		% See geometry.pdf to learn the layout options. There are lots.
+\geometry{letterpaper}                   		% ... or a4paper or a5paper or ... 
+%\geometry{landscape}                		% Activate for rotated page geometry
+\usepackage[parfill]{parskip}    		% Activate to begin paragraphs with an empty line rather than an indent
+\usepackage{graphicx}				% Use pdf, png, jpg, or eps§ with pdflatex; use eps in DVI mode
+								% TeX will automatically convert eps --> pdf in pdflatex		
+								
+\usepackage{listings}				% Source code
+\usepackage{xcolor}					% colour (source code for instance)
+\definecolor{grey}{rgb}{0.1,0.1,0.1}
+\definecolor{darkblue}{rgb}{0.0,0.0,0.6}
+\definecolor{cyan}{rgb}{0.0,0.6,0.6}
+
+\usepackage{amssymb}
+\usepackage{cite}
+\usepackage{hyperref}				% Hyperlinks
+\usepackage[nottoc,numbib]{tocbibind}	% 'References' in TOC
+
+\graphicspath{{img/}}					% Relative path where the images are stored. 
+
+\title{Instructions for \\ Web Audio Evaluation Tool}
+\author{Nicholas Jillings, Brecht De Man and David Moffat}
+\date{7 December 2015}							% Activate to display a given date or no date
+
+\begin{document}
+\maketitle
+
+These instructions cover the use of the Web Audio Evaluation Tool on Windows and Mac OS X platforms. 
+
+We request that you acknowledge the authors and cite our work when using it \cite{waet}; see also CITING.txt. 
+
+The tool is available in its entirety, including source code, at \url{https://code.soundsoftware.ac.uk/projects/webaudioevaluationtool/}, under the GNU General Public License v3.0 (\url{http://choosealicense.com/licenses/gpl-3.0/}); see also LICENSE.txt. 
+
+% TO DO: Linux (Android, iOS)
+
+\tableofcontents
+
+\clearpage
+
+\section{Installation}
+	Download the folder (\url{https://code.soundsoftware.ac.uk/hg/webaudioevaluationtool/archive/tip.zip}) and unzip in a location of your choice, or pull the source code from \url{https://code.soundsoftware.ac.uk/hg/webaudioevaluationtool} (Mercurial). 
+	
+	\subsection{Contents}
+		The folder should contain the following elements: \\
+		
+		\textbf{Main folder:} 
+			\begin{itemize}
+	            	\item \texttt{analyse.html}: analysis and diagnostics of a set of result XML files
+	            	\item \texttt{ape.css, core.css, graphics.css, mushra.css, structure.css}: style files (edit to change appearance)
+	            	\item \texttt{ape.js}: JavaScript file for APE-style interface \cite{ape}
+	            	\item \texttt{CITING.txt, LICENSE.txt, README.txt}: text files containing, respectively, the citation we ask you to include in any work where this tool or any portion thereof is used or modified; the license under which the software is shared; and a general readme file referring to these instructions.
+	            	\item \texttt{core.js}: JavaScript file with core functionality
+	            	\item \texttt{index.html}: webpage where interface should appear (includes link to test configuration XML)
+	            	\item \texttt{jquery-2.1.4.js}: jQuery JavaScript Library
+	            	\item \texttt{loudness.js}: automatic loudness calculation for Web Audio API Buffer objects, returning gain values to correct to a target loudness or to match loudness between multiple objects
+	            	\item \texttt{mushra.js}: JavaScript file for MUSHRA-style interface \cite{mushra}
+	            	\item \texttt{pythonServer.py}: webserver for running tests locally
+	            	\item \texttt{pythonServer-legacy.py}: webserver with limited functionality (no automatic storing of output XML files)
+	            	\item \texttt{save.php}: PHP script to store result XML files to web server\\
+			\end{itemize}
+	     \textbf{Documentation (./docs/)}
+	         \begin{itemize}
+	         		\item \href{http://c4dm.eecs.qmul.ac.uk/dmrn/events/dmrnp10/#posters}{DMRN+10}: PDF and \LaTeX source of poster for 10\textsuperscript{th} Digital Music Research Network One-Day workshop (``soft launch'')
+	         		\item Instructions: PDF and \LaTeX source of these instructions
+	            	\item Project Specification Document (\LaTeX/PDF)
+	            	\item Results Specification Document (\LaTeX/PDF)
+	            	\item SMC15: PDF and \LaTeX  source of 12th Sound and Music Computing Conference paper \cite{waet}
+	            	\item WAC2016: PDF and \LaTeX  source of 2nd Web Audio Conference paper\\
+			\end{itemize}
+         \textbf{Example project (./example\_eval/)}
+            	\begin{itemize}
+            		\item An example of what the setup XML should look like, with example audio files 0.wav--10.wav: short 44.1 kHz, 16 bit recordings of a woman saying the corresponding number (useful for testing randomisation and general familiarisation with the interface).\\ 
+            	\end{itemize}
+          \textbf{Output files (./saves/)}
+            	\begin{itemize}
+            		\item The output XML files of tests will be stored here by default by the \texttt{pythonServer.py} script.\\ 
+            	\end{itemize}
+          \textbf{Auxiliary scripts (./scripts/)}
+            	\begin{itemize}
+            		\item Helpful Python scripts for extraction and visualisation of data.\\ 
+            	\end{itemize}
+          \textbf{Test creation tool (./test\_create/)}
+            	\begin{itemize}
+            		\item Webpage for easily setting up your own test without having to delve into the XML.\\ 
+            	\end{itemize}
+                    	
+	\subsection{Compatibility}
+		As Microsoft Internet Explorer doesn't support the Web Audio API\footnote{\url{http://caniuse.com/\#feat=audio-api}}, you will need another browser like Google Chrome, Safari or Firefox (all three are tested and confirmed to work). 
+
+		Firefox does not currently support bit depths other than 8 or 16 bit for PCM wave files. In the future, this will throw a warning message to tell the user that their content is being quantised automatically. %Nick? Right? To be removed if and when actually implemented
+		
+		The tool is platform-independent and works in any browser that supports the Web Audio API. It does not require any specific, proprietary software. However, if the tool is hosted locally (i.e. you are not hosting it on an actual webserver), you will need Python (2.7), a free programming language - see the `Local test' instructions below. 
+	
+\clearpage
+
+
+\section{Test setup}
+
+	\subsection{Sample rate}
+		Depending on how the experiment is set up, audio is resampled automatically (the Web Audio default) or the sample rate is enforced. In the latter case, you will need to make sure that the sample rate of the system is equal to the sample rate of the audio files. For this reason, all audio files in the experiment must have the same sample rate. 
+
+		Always make sure that all other digital equipment in the playback chain (clock, audio interface, digital-to-analog converter, ...) is set to this same sample rate.
+
+		Note that upon changing the sampling rate, the browser will have to be restarted for the change to take effect. 
+		
+		\subsubsection{Mac OS X}
+			To change the sample rate in Mac OS X, go to \textbf{Applications/Utilities/Audio MIDI Setup} or find this application with Spotlight (see Figure \ref{fig:audiomidisetup}). Then select the output of the audio interface you are using and change the `Format' to the appropriate sample rate. Also make sure the bit depth and channel count are as desired. 
+			If you are using an external audio interface, you may have to go to the preference pane of that device to change the sample rate. 
+
+			Also make sure left and right channel gains are equal, as some applications alter this without changing it back, leading to a predominantly louder left or right channel. See Figure \ref{fig:audiomidisetup} for an example where the channel gains are different. 
+
+			\begin{figure}[tb]
+				\centering
+				\includegraphics[width=.65\textwidth]{img/audiomidisetup.png}
+				\caption{The Audio MIDI Setup window in Mac OS X}
+				\label{fig:audiomidisetup}
+			\end{figure}
+		
+		\subsubsection{Windows}
+			To change the sample rate in Windows, right-click on the speaker icon in the lower-right corner of your desktop and choose `Playback devices'. Right-click the appropriate playback device and click `Properties'. Click the `Advanced' tab and verify or change the sample rate under `Default Format'.    % NEEDS CONFIRMATION
+			If you are using an external audio interface, you may have to go to the preference pane of that device to change the sample rate. 
+
+	\subsection{Local test}
+		If the test is hosted locally, you will need to run the local webserver provided with this tool. 
+		
+		\subsubsection{Mac OS X \& Linux}
+
+			On Mac OS X, Python comes preinstalled, as it does on most Unix/Linux distributions.
+
+			Open the Terminal (find it in \textbf{Applications/Terminal} or via Spotlight), and go to the folder you downloaded. To do this, type \texttt{cd [folder]}, where \texttt{[folder]} is the folder containing the \texttt{pythonServer.py} script you downloaded. For instance, if the location is \texttt{/Users/John/Documents/test/}, then type
+			
+				\texttt{cd /Users/John/Documents/test/}
+				
+			Then hit enter and run the Python script by typing
+
+				\texttt{python pythonServer.py}
+
+			and hit enter again. See also Figure \ref{fig:terminal}.
+			
+			\begin{figure}[htbp]
+	                \begin{center}
+	                \includegraphics[width=.75\textwidth]{pythonServer.png}
+	                \caption{Mac OS X: The Terminal window after going to the right folder (\texttt{cd [folder\_path]}) and running \texttt{pythonServer.py}.}
+	                \label{fig:terminal}
+	                \end{center}
+	                \end{figure}
+
+	        Alternatively, you can simply type \texttt{python} (followed by a space) and drag the file into the Terminal window from Finder. % DOESN'T WORK YET
+			
+			You can leave this running throughout the different experiments (i.e. leave the Terminal open). Once running, the Terminal will report the URL to enter in your browser to initiate the test; usually this is \texttt{http://localhost:8000/}.
+
+			To start the test, open the browser and type 
+				
+			\texttt{localhost:8000}
+
+			and hit enter. The test should start (see Figure \ref{fig:test}). 
+
+			To quit the server, either close the Terminal window or press Ctrl+C on your keyboard to forcibly shut the server down.
+
+		\subsubsection{Windows}
+
+			On Windows, Python 2.7 is generally not preinstalled, and therefore has to be downloaded\footnote{\url{https://www.python.org/downloads/windows/}} and installed to be able to run scripts such as the local webserver, which is necessary if the tool is hosted locally. 
+		
+			Simply double click the Python script \texttt{pythonServer.py} in the folder you downloaded. 
+			
+			You may see a warning like the one in Figure \ref{fig:warning}. Click `Allow access'. 
+			
+			\begin{figure}[htbp]
+            \begin{center}
+            \includegraphics[width=.6\textwidth]{warning.png}
+            \caption{Windows: Potential warning message when executing \texttt{pythonServer.py}.}
+            \label{fig:warning}
+            \end{center}
+            \end{figure}
+            
+            The process should now start in the Command Prompt that opens - see Figure \ref{fig:python}. 
+            
+            \begin{figure}[htbp]
+            \begin{center}
+            \includegraphics[width=.75\textwidth]{python.png}
+            \caption{Windows: The Command Prompt after running \texttt{pythonServer.py} and opening the corresponding website.}
+            \label{fig:python}
+            \end{center}
+            \end{figure}
+            
+            You can leave this running throughout the different experiments (i.e. leave the Command Prompt open). 
+
+		    To start the test, open the browser and type 
+				
+			\texttt{localhost:8000}
+		
+			and hit enter. The test should start (see Figure \ref{fig:test}). 
+		
+			\begin{figure}[htb]
+	        \begin{center}
+	        \includegraphics[width=.8\textwidth]{test.png}
+	        \caption{The start of the test in Google Chrome on Windows 7.}
+	        \label{fig:test}
+	        \end{center}
+	        \end{figure}
+                    
+        If at any point in the test the participant reports unexpected behaviour or an error of some kind, or the test needs to be interrupted, please notify the experimenter and/or refer to Section \ref{sec:troubleshooting}. 
+		
+		When the test is over (the subject should see a message to that effect, and click `Submit' one last time), the output XML file containing all collected data should have appeared in `saves/'. The names of these files are `test-0.xml', `test-1.xml', etc., in ascending order. The Terminal or Command Prompt running the local webserver will display the next file name to be used. If such a file did not appear, please again refer to Section \ref{sec:troubleshooting}. 
+		
+		It is advised that you back up these results as often as possible, as a loss of this data means that the time and effort spent by the subject(s) has been in vain. Save the results to an external or network drive, and/or send them to the experimenter regularly. 
+		
+		To start the test again for a new participant, you do not need to close the browser or shut down the Terminal or Command Prompt. Simply refresh the page or go to \texttt{localhost:8000} again. 
+		
+
+	\subsection{Remote test}
+		Put all files on a web server which supports PHP. This allows the `save.php' script to store the XML result files in the `saves/' folder. If the web server is not able to store the XML file there at the end of the test, it will present the XML file locally to the user, as a `Save file' link. 
+
+		Make sure the \texttt{projectReturn} attribute of the \texttt{setup} node is set to the \texttt{save.php} script. 
+
+		Then, just go to the URL of the corresponding HTML file, e.g. \texttt{http://server.com/path/to/WAET/index.html?url=test/my-test.xml}. 
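+
+		A minimal \texttt{setup} node for a remote test might then look as follows (a sketch; the interface name and the remaining attributes depend on your test):
+
+		\begin{lstlisting}
+<setup interface="APE" projectReturn="save.php">
+	...
+</setup>
+		\end{lstlisting}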
+
+    \subsection{Multiple test documents}
+        By default the index page will load a demo page of tests. To automatically load a test document, you need to append its location to the URL. If your URL is normally http://localhost:8000/index.html, you would append the following: \texttt{?url=/path/to/your/test.xml}. Replace the fields with your actual path; the path is relative to the running directory, so if your test is the file \texttt{project.xml} in the directory \texttt{example\_eval}, you would append \texttt{?url=/example\_eval/project.xml}.
+
+\clearpage
+
+\section{Interfaces}
+
+	The Web Audio Evaluation Tool comes with a number of interface styles, each of which can be customised extensively, either by configuring them differently using the many optional features, or by modifying the JavaScript files. 
+
+	To set the interface style for the whole test, %Nick? change when this is not the case anymore, i.e. when the interface can be set per page
+	add \texttt{interface="APE"} to the \texttt{setup} node, where \texttt{"APE"} is one of the interface names below. 
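+
+	For example, a sketch of a \texttt{setup} node selecting the MUSHRA interface (other attributes omitted):
+
+	\begin{lstlisting}
+<setup interface="MUSHRA" ...>
+	...
+</setup>
+	\end{lstlisting}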
+
+	\subsection{APE}
+		The APE interface is based on \cite{ape}, and consists of one or more axes, each corresponding to an attribute to be rated, on which markers are placed. As such, it is a multiple stimulus interface where (for each dimension or attribute) all elements are on one axis so that they can be maximally compared against each other, as opposed to rated individually or against a single reference. 
+		It also contains an optional text box for each element, to allow for clarification by the subject, tagging, and so on. 
+
+	\subsection{MUSHRA}
+		This is a straightforward implementation of \cite{mushra}, especially common for the rating of audio quality, for instance for the evaluation of audio codecs. 
+
+	
+\clearpage
+
+\section{Features}
+
+	This section covers the different features implemented in the Web Audio Evaluation Tool, how to use them, and what to know about them. 
+
+	Unless otherwise specified, \emph{each} feature described here is optional, i.e. it can be enabled or disabled and adjusted to some extent. 
+
+	As the example project showcases (nearly) all of these features, please refer to its configuration XML document for a demonstration of how to enable and adjust them. 
+
+	\subsection{Interface layout}
+		The \texttt{interface} node (child of \texttt{audioholder}) specifies the layout of a rating axis: its title, annotations on the scale, and the comment box prefix. 
+
+		Example: 
+
+		\begin{lstlisting}
+<interface name="quality">
+	<title>Audio Quality</title>
+	<scale position="10">Poor</scale>
+	<scale position="90">Excellent</scale>
+	<commentBoxPrefix>Comment on fragment</commentBoxPrefix>
+</interface>
+		\end{lstlisting}
+
+		\subsubsection{Title}
+			Specifies the axis title as displayed on the interface. 
+
+			If this tag is absent, the title will default to `Axis \emph{[number]}'. Therefore, if no title is desired, just add the title tag (\texttt{<title/>}) without text. 
+
+		\subsubsection{Annotation}
+			Words or numbers can be placed on specific positions of the scale with the \texttt{scale} tag. The \texttt{position} attribute is a value from 0 to 100, corresponding to the percentage of the width/height of the scale where you want the string to be placed. 
+
+		\subsubsection{Comment box prefix}
+			If comment boxes corresponding with the fragments are enabled, this sets the comment box string after which the fragment number is appended. 
+
+			The default value is ``Comment on fragment''. So in this case, each comment box would have a header ``Comment on fragment \emph{[number]}''. 
+
+		\subsubsection{Multiple scales}
+			In the case of multiple rating scales, e.g. when the stimuli are to be rated in terms of attributes `timbre' and `spatial impression', multiple interface nodes will have to be added, each specifying the title and annotations. 
+
+			This is where the \texttt{interface}'s \texttt{name} attribute is particularly important: use this to retrieve the rating values, comments and metrics associated with the specified interface. 
+			If none is given, you can still use the automatically given \texttt{interface-id}, which is the interface number starting with 0 and corresponding to the order in which the rating scales appear. 
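+
+			For example, a sketch of two rating scales (names, titles and annotations are illustrative):
+
+			\begin{lstlisting}
+<interface name="timbre">
+	<title>Timbre</title>
+	<scale position="0">Unnatural</scale>
+	<scale position="100">Natural</scale>
+</interface>
+<interface name="spatial">
+	<title>Spatial impression</title>
+	<scale position="0">Narrow</scale>
+	<scale position="100">Wide</scale>
+</interface>
+			\end{lstlisting}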
+
+
+	\subsection{Surveys}
+	    Surveys are conducted through an in-page popup window which can collect data using various HTML functions; see `Survey elements' below for a list. Survey questions are placed in the \texttt{<pretest>} or \texttt{<posttest>} nodes. Appending these nodes to the \texttt{<setup>} node will have the survey options appear before any test pages (if in the \texttt{<pretest>} node) or after all test pages (if in the \texttt{<posttest>} node). Placing the survey options in the \texttt{<audioholder>} node will have them appear before or after the test page they are a child of.
+		\subsubsection{Survey elements}
+			All survey elements (which `pop up' in the centre of the browser) have an \texttt{id} attribute, for retrieval of the responses in post-processing of the results, and a \texttt{mandatory} attribute, which if set to ``true'' requires the subjects to respond before they can continue. 
+
+			\begin{description}
+				\item[statement] Simply shows text to the subject until `Next' or `Start' is clicked. 
+				\item[question] Expects a text answer (in a text box). Has the \texttt{boxsize} attribute: set to ``large'' or ``huge'' for a bigger box, or ``small'' for a smaller one.
+				\item[number] Only accepts a numerical value. The attribute \texttt{min="0"} specifies the minimum value - in this case the answer must be zero or greater before the subject can continue.
+				\item[radio] Radio buttons. Presents a list of options to the user using radio buttons, where only one option from the list can be selected.
+				\item[checkbox] Checkboxes. Note that when making a checkbox question ``mandatory'', the subject is forced to select at least one option (which could be e.g. `Other' or `None').\\
+			\end{description}
+
+			\textbf{Example usage:}\\
+
+			\lstset{
+			  basicstyle=\ttfamily,
+			  columns=fullflexible,
+			  showstringspaces=false,
+			  commentstyle=\color{grey}\upshape
+			}
+
+			\lstdefinelanguage{XML}
+			{
+			  morestring=[b]",
+			  morestring=[s]{>}{<},
+			  morecomment=[s]{<?}{?>},
+			  stringstyle=\color{black} \bfseries,
+			  identifierstyle=\color{darkblue} \bfseries,
+			  keywordstyle=\color{cyan} \bfseries,
+			  morekeywords={xmlns,version,type}, 
+			  breaklines=true% list your attributes here
+			}
+			\scriptsize
+			\lstset{language=XML}
+
+			\begin{lstlisting}
+<PostTest>
+	<question id="location" mandatory="true" boxsize="large">Please enter your location. (example mandatory text question)</question>
+	<number id="age" min="0">Please enter your age (example non-mandatory number question)</number>
+	<radio id="rating">
+		<statement>Please rate this interface (example radio button question)</statement>
+		<option name="bad">Bad</option>
+		<option name="poor">Poor</option>
+		<option name="good">Good</option>
+		<option name="great">Great</option>
+	</radio>
+	<checkbox id="background" mandatory="true">
+				<statement>Please select with which activities you have any experience (example checkbox question)</statement>
+				<option name="musician">Playing a musical instrument</option>
+				<option name="soundengineer">Recording or mixing audio</option>
+			</checkbox>
+	<statement>Thank you for taking this listening test. Please click 'Submit' and your results will appear in the 'saves/' folder.</statement>
+</PostTest>
+			\end{lstlisting}
+
+
+
+	\subsection{Randomisation}
+		[WORK IN PROGRESS]
+
+		\subsubsection{Randomisation of configuration XML files}
+		    The Python server has a special function to automatically cycle through a list of test pages. Instead of directly requesting an XML, set the \texttt{url} item in the browser URL to \texttt{pseudo.xml}; the server will then cycle through a list of XMLs, which must be in the local directory called \texttt{pseudo}.
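+
+		    For example, with the local Python server running, a sketch of the URL to request:
+
+		    \texttt{http://localhost:8000/index.html?url=pseudo.xml}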
+			% how to
+			% explain how this is implemented in the pythonServer
+			%Nick? already implemented in the PHP?
+			% Needs to be implemented in PHP and automated better, will complete soon
+
+
+		\subsubsection{Randomisation of page order}
+        The page order randomisation is set by the \texttt{<setup>} node attribute \texttt{randomise-order}, for example \texttt{<setup ... randomise-order="true">...</setup>} will randomise the test page order. When not set, the default is to \textbf{not} randomise the test page order.
+
+		\subsubsection{Randomisation of axis order}
+
+		\subsubsection{Randomisation of fragment order}
+        The audio fragment randomisation is set by the \texttt{<audioholder>} node attribute \texttt{randomise-order}; for example, \texttt{<audioholder ... randomise-order="true">...</audioholder>} will randomise the order of the fragments on that test page. When not set, the default is to \textbf{not} randomise the fragment order.
+
+		\subsubsection{Randomisation of initial slider position}
+        By default, slider values are randomised on start. The MUSHRA interface supports setting the initial values of all sliders through the \texttt{<audioholder>} attribute \texttt{initial-position}. This takes an integer between 0 and 100 to signify the slider position.
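+
+        For example, a sketch of a test page starting all sliders at the midpoint instead of random positions:
+
+        \begin{lstlisting}
+<audioholder ... initial-position="50">...</audioholder>
+        \end{lstlisting}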
+		% /subsubsection{Randomisation of survey question order}
+		% should be an attribute of the individual 'pretest' and 'posttest' elements
+		% uncomment once we have it
+
+	\subsection{Looping}
+	    Looping enables the fragments to loop until stopped by the user. Looping is synchronous between samples so all samples start at the same time.
+		Individual test pages can have their playback looped by the \texttt{<audioholder>} attribute \texttt{loop} with a value of "true" or "false".
+		If the fragments are not of equal length initially, they are padded with zeros so that they are equal length, to enable looping without the fragments going out of sync relative to each other. 
+
+		Note that fragments cannot be played until all page fragments are loaded when in looped mode, as the engine needs to know the amount to pad the fragments.
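+
+		For example, a sketch of a test page with synchronous looping enabled:
+
+		\begin{lstlisting}
+<audioholder ... loop="true">...</audioholder>
+		\end{lstlisting}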
+
+	\subsection{Sample rate}
+		If you require the test to be conducted at a certain sample rate (i.e. you do not tolerate resampling of the elements to correspond with the system's sample rate), add \texttt{sampleRate="96000"} - where ``96000'' can be any supported sample rate - so that a warning message alerts the subject if the system's sample rate differs from this enforced sample rate. This of course means that within one test, all sample rates must be equal, as it is impossible to change the system's sample rate during the test (even if you were to change it manually, the browser must be restarted for it to take effect). 
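+
+		For example, a sketch enforcing a 96 kHz sample rate (here assumed to be an attribute of the \texttt{setup} node):
+
+		\begin{lstlisting}
+<setup ... sampleRate="96000">...</setup>
+		\end{lstlisting}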
+
+	\subsection{Scrubber bar}
+		The scrubber bar, or transport bar - a visualisation of the playhead with an indication of time, showing the portion of the file played so far - is at this point purely visual, and not a controller to adjust the playhead position.
+
+		Make it visible by adding \texttt{<option name='playhead'/>} to the \texttt{interface} node (see Section \ref{sec:checks}: Checks). 
+
+	\subsection{Metrics}
+		Enable the collection of metrics by adding \texttt{collectMetrics="true"} to the \texttt{setup} node. % Should this always be on??
+
+		The \texttt{Metric} node, which contains the metrics to be tracked during the complete test, is a child of the \texttt{setup} node, and it could look as follows.
+
+		\begin{lstlisting}
+<Metric>
+	<metricEnable>testTimer</metricEnable>
+	<metricEnable>elementTimer</metricEnable>
+	<metricEnable>elementInitialPosition</metricEnable>
+	<metricEnable>elementTracker</metricEnable>
+	<metricEnable>elementFlagListenedTo</metricEnable>
+	<metricEnable>elementFlagMoved</metricEnable>
+	<metricEnable>elementListenTracker</metricEnable>
+</Metric>
+		\end{lstlisting}
+
+		When in doubt, err on the inclusive side, as one never knows which information will be needed in the future. Most of these metrics are necessary for post-processing scripts such as \texttt{timeline\_view\_movement.py}. 
+
+		\subsubsection{Time test duration}
+			\texttt{testTimer}\\
+			One per test page. Presents the total test time from the first playback on the test page to the submission of the test page (excluding time spent in the pre-/post-test surveys). This is presented in the results as \texttt{<metricresult id="testTime"> 8.60299319727892 </metricresult>}. The time is in seconds.
+
+		\subsubsection{Time fragment playback}
+			\texttt{elementTimer}\\
+			One per audio fragment per test page. This totals up the entire time the audio fragment has been listened to in this test, and is presented as \texttt{<metricresult name="enableElementTimer"> 1.0042630385487428 </metricresult>}. The time is in seconds.
+
+		\subsubsection{Initial positions}
+			\texttt{elementInitialPosition}\\
+			One per audio fragment per test page. Tracks the initial position of the sliders, especially relevant when these are randomised. Example result \texttt{<metricresult name="elementInitialPosition"> 0.8395522388059702 </metricresult>}.
+
+		\subsubsection{Track movements}
+		    \texttt{elementTracker}\\
+		    One per audio fragment per test page. Tracks the movement of each interface object. Each movement event has the time it occurred at and the new value.
+		\subsubsection{Which fragments listened to}
+		    \texttt{elementFlagListenedTo}\\
+		    One per audio fragment per test page. Boolean response, set to true if listened to.
+		\subsubsection{Which fragments moved}
+		    \texttt{elementFlagMoved}\\
+			One per audio fragment per test page. Binary check of whether or not the marker corresponding with a particular fragment was moved at all throughout the experiment. 
+
+		\subsubsection{Track playback events}
+		    \texttt{elementListenTracker}\\
+		    One per audio fragment per test page. Tracks the playback events of each audio element, pairing the time in the test when playback started with when it stopped; it also gives the buffer time positions.
+
+	\subsection{References and anchors}
+	    The audio elements, \texttt{<audioelement>}, have the attribute \texttt{type}, which defaults to ``normal''. Setting this to one of the following values has the effects described below.
+		\subsubsection{Outside Reference}
+		    Set type to 'outside-reference'. This will place the object in a separate playback element clearly labelled as an outside reference. This is exempt from any movement checks but will still be included in any listening checks.
+		\subsubsection{Hidden reference} 
+		    Set type to 'reference'. The element will still be randomised as normal (if selected) and presented to the user. However, the element will have the 'reference' type in the results so it can quickly be found. The reference can be required to be rated above a certain value before the test page can be completed, by setting the attribute 'marker' to a value between 0 and 100 representing the integer position the rating must be equal to or above.
+		\subsubsection{Hidden anchor}
+		    Set type to 'anchor'. The element will still be randomised as normal (if selected) and presented to the user. However, the element will have the 'anchor' type in the results so it can quickly be found. The anchor can be required to be rated below a certain value before the test page can be completed, by setting the attribute 'marker' to a value between 0 and 100 representing the integer position the rating must be equal to or below.
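+
+		    For example, a sketch of a hidden reference and a hidden anchor with marker constraints (file names, ids and values are illustrative):
+
+		    \begin{lstlisting}
+<audioelement url="ref.wav" id="ref" type="reference" marker="80"/>
+<audioelement url="anchor.wav" id="anch" type="anchor" marker="20"/>
+		    \end{lstlisting}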
+
+	\subsection{Checks}
+		\label{sec:checks}
+
+		%blabla
+		These checks are enabled in the \texttt{interface} node, which is a child of the \texttt{setup} node. 
+		\subsubsection{Playback checks}
+				% what it does/is
+				Enforce playing each sample at least once, for at least a little bit (i.e. this check is satisfied even if only a tiny portion of the file is played), by alerting the user to which samples have not been played upon clicking `Submit'. When enabled, one cannot proceed to the next page, answer a survey question, or finish the test before playing each sample at least once. 
+				% how to enable/disable
+
+				Alternatively, one can check whether the \emph{entire} fragment was listened to at least once. 
+				% how to enable
+
+				Add \texttt{<check name="fragmentPlayed"/>} to the \texttt{interface} node. 
+				
+
+		\subsubsection{Movement check}
+			Enforce moving each sample's marker at least once, for at least a little bit, by alerting the user to which samples have not been moved upon clicking `Submit'. When enabled, one cannot proceed to the next page, answer a survey question, or finish the test before moving each sample's marker at least once. 
+			If there are several axes, the warning will specify which samples have to be moved on which axis.
+
+			Add \texttt{<check name="fragmentMoved"/>} to the \texttt{interface} node.
+
+		\subsubsection{Comment check}
+			% How to enable/disable? 
+
+			Enforce commenting, by alerting the user to which samples have not been commented on upon clicking `Submit'. When enabled, one cannot proceed to the next page, answer a survey question, or finish the test, before putting at least one character in each comment box. 
+
+			Note that this does not apply to any extra (text, radio button, checkbox) elements, unless these have the `mandatory' option enabled. %Nick? is this extra 'mandatory' option implemented? 
+
+			Add \texttt{<check name="fragmentComments"/>} to the \texttt{interface} node. 
+
+			%ADD: how to add a custom comment box
+
+		\subsubsection{Scale use check}
+			It is possible to enforce a certain usage of the scale, meaning that at least one slider needs to be below and/or above a certain percentage of the scale. 
+
+			Add \texttt{<check name="scalerange" min="25" max="75"/>} to the \texttt{interface} node. 
+
+		\subsubsection{Note on the use of multiple rating axes}
+			I.e. what if more than one axis? How to specify which axis the checks relate to? %Nick? to add? 
+
+	\subsection{Platform information}
+		% what does it do, what does it look like
+		% limitations? 
+		For troubleshooting and usage statistics purposes, information about the browser and the operating system is logged in the results XML file. This is especially useful in the case of remote tests, when it is not certain which operating system, browser and/or browser version were used. Note that this information is not always available and/or accurate, e.g. when the subject has taken steps to be more anonymous, so it should be treated as a guide only.
+
+		Example: 
+		\begin{lstlisting}
+<navigator>
+	<platform>MacIntel</platform>
+	<vendor>Google Inc.</vendor>
+	<uagent>Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.134 Safari/537.36</uagent>
+</navigator>
+		\end{lstlisting}
+
+	\subsection{Show progress}
+		Add \texttt{<option name="page-count"/>} to the \texttt{interface} node (see Section \ref{sec:checks}: Checks) to add the current page number and the total number of pages to the interface. 
+
+	\subsection{Gain}
+		It is possible to set the gain (in decibel) applied to the different audioelements, as an attribute of the \texttt{audioelement} nodes in the configuration XML file: 
+
+		\texttt{<audioElements url="sample-01.wav" gain="-6" id="sample01quieter" />}\\
+		Please note that there are no checks on this value to detect whether a linear gain was accidentally entered instead of decibels.
+
+	\subsection{Loudness}
+		% automatic loudness equalisation
+		% guide to loudness.js 
+		Each audio fragment has its loudness calculated on loading. The tool uses the EBU R 128 recommendation, following the ITU-R BS.1770-4 loudness calculations, to return the integrated loudness in LUFS. The attribute \texttt{loudness} will set the loudness for the scope it is applied in. Applying it in the \texttt{<setup>} node will set the loudness for all test pages. Applying it in the \texttt{<audioholder>} node will set the loudness for that page. Applying it in the \texttt{<audioelement>} node will set the loudness for that fragment. The scope is resolved locally, so if there is a loudness attribute on both the \texttt{<audioholder>} and \texttt{<setup>} nodes, that test page will take the value associated with the \texttt{<audioholder>}. The loudness attribute is set in LUFS.
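+
+		For example, a sketch of the \texttt{loudness} attribute at each scope (values in LUFS, illustrative; the page-level value overrides the setup-level value, and the fragment-level value overrides both):
+
+		\begin{lstlisting}
+<setup ... loudness="-23"> ... </setup>
+<audioholder ... loudness="-20">
+	<audioelement url="0.wav" loudness="-26"/>
+</audioholder>
+		\end{lstlisting}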
+
+\clearpage
+
+
+\section{Using the test create tool}
+	We provide a test creation tool, available in the directory test\_create. This tool is a self-contained web page, so double-clicking it will launch the page in your system's default browser.
+
+	The test creation tool can help you build a simple test very quickly. By simply selecting your interface and clicking check-boxes you can build a test in minutes.
+
+	Include audio by dragging and dropping the stimuli you wish to include. 
+
+	The tool examines your XML before exporting to ensure you do not export an invalid XML structure which would crash the test.
+	
+
+\section{Building your own interface}
+
+	This guide will help you construct your own interface on top of the WAET (Web Audio Evaluation Tool) engine. The WAET engine resides in the \texttt{core.js} file, which contains prototype objects to handle most of the test creation, operation and data collection. The interface simply has to link into this at the correct points.
+	\subsection{Nodes to familiarise}
+		Core.js handles several very important nodes which you should become familiar with. The first is the Audio Engine, initialised and stored in the variable `AudioEngineContext'. This handles the playback of the web audio nodes as well as storing the `AudioObjects'. The `AudioObjects' are custom nodes which hold the audio fragments for playback. These nodes also have links to two interface objects: the comment box (if enabled) and the interface element providing the ranking. On creation of an `AudioObject' the interface link is nulled; it is up to the interface to link these correctly.
+
+		The specification document will be decoded and parsed into an object called `specification'. This will hold all of the specification's various nodes. The test pages and any pre/post test objects are processed by a test state which will proceed through the test when called to by the interface. Any checks (such as playback or movement checks) are to be completed by the interface before instructing the test state to proceed. The test state will call the interface on each page load with the page specification node.
+
+	\subsection{Modifying \texttt{core.js}}
+		Whilst very little code is actually needed, you do need to instruct core.js to load your interface file when it is called for by a specification node. The function `loadProjectSpecCallback' handles the decoding of the specification and the setting of any external items (such as metric collection). At the very end of this function there is an if statement; add a branch for your interface string to link to your source file. Examples for both the APE and MUSHRA tests are already included. Note: future updates to core.js will most likely overwrite your changes to this file, so remember to check that your interface is still listed after any update that touches core.js.
+		Any further files, such as CSS styling files, can be loaded here as well. jQuery is already included.
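As a sketch of what that branch can look like (a hypothetical reconstruction, not the exact core.js source; `selectInterfaceSource' and `myInterface.js' are illustrative names):

```javascript
// Hypothetical sketch of the interface-selection branch at the end of
// loadProjectSpecCallback; check the actual core.js source, names may differ.
function selectInterfaceSource(interfaceName) {
    switch (interfaceName) {
        case "APE":
            return "ape.js";         // shipped with the tool
        case "MUSHRA":
            return "mushra.js";      // shipped with the tool
        case "myInterface":          // add a case like this for your own interface
            return "myInterface.js";
        default:
            throw new Error("Unknown interface: " + interfaceName);
    }
}
```

The returned file name would then be loaded as a script element, alongside any extra CSS your interface needs.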
+
+	\subsection{Building the Interface}
+		Your interface file will get loaded automatically when the `interface' attribute of the setup node matches the string in the `loadProjectSpecCallback' function. The following functions must be defined in your interface file.
+		\begin{itemize}
+		\item \texttt{loadInterface} - Called once when the document is parsed. This creates any necessary bindings, such as to the metric collection classes and any check commands. Here you can also start the structure for your test such as placing in any common nodes (such as the title and empty divs to drop content into later).
+		\item \texttt{loadTest(audioHolderObject)} - Called on each page load. The audioHolderObject contains a specification node holding, effectively, one of the audioHolder nodes.
+		\item \texttt{resizeWindow(event)} - Handler for any window resizing. Simply scale your interface accordingly. This function must be present, but can be an empty function.
+		\end{itemize}
+
+		\subsubsection{loadInterface}
+			This function is called once the document has been parsed, since some browsers may parse files asynchronously. The simplest method is to put `loadInterface()' at the top of your interface file, so the function is called as soon as the JavaScript engine is ready.
+
+			By default the HTML file has an element with id ``topLevelBody'' where you can build your interface. Make sure you blank the contents of that object. This function is the perfect time to build any fixed items, such as the page title, session titles, interface buttons (Start, Stop, Submit) and any holding and structure elements for later on.
+
+			At the end of the function, insert these two function calls: testState.initialise(); and testState.advanceState();. This actually begins the test sequence, including the pre-test options (if any are included in the specification document).
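A minimal sketch of how loadInterface might end; the testState stub here only stands in for the real object that core.js provides, so the sketch runs standalone:

```javascript
// Stub standing in for the core.js test state so this sketch is self-contained;
// in a real interface file, testState is provided by core.js.
var testState = {
    initialised: false,
    stateIndex: -1,
    initialise: function () { this.initialised = true; },
    advanceState: function () { this.stateIndex += 1; }
};

function loadInterface() {
    // ... blank "topLevelBody" and build the page title, Start/Stop/Submit
    // buttons and any holder divs here ...

    // Begin the test sequence, including any pre-test options:
    testState.initialise();
    testState.advanceState();
}

loadInterface();
```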
+
+		\subsubsection{loadTest(audioHolderObject)}
+			This function is called on each new test page. It is this function's job to clear out the previous test page and set up the new one. Use audioEngineContext.newTestPage(); to instruct the audio engine to prepare for a new page. ``audioEngineContext.audioObjects = [];'' deletes any audioObjects, interfaceContext.deleteCommentBoxes(); deletes any comment boxes, and interfaceContext.deleteCommentQuestions(); deletes any extra comment boxes specified by commentQuestion nodes.
+
+			This function must also instruct the audio engine to build each fragment. Passing each audio element from the audioHolderObject to audioEngineContext.newTrack(element) builds the track and returns a reference to the constructed audioObject. Decoding of the audio happens asynchronously.
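The cleanup-and-build sequence above might look like the following sketch; the audioEngineContext and interfaceContext objects are minimal stand-ins so the sketch runs standalone (core.js provides the real ones), and MyRatingWidget is hypothetical:

```javascript
// Stand-ins for the core.js contexts, so this sketch is self-contained.
var audioEngineContext = {
    audioObjects: [],
    newTestPage: function () { /* reset playback state */ },
    newTrack: function (element) {
        // Builds an audioObject; in core.js, audio decoding then happens
        // asynchronously.
        var audioObject = { specification: element, interfaceDOM: null };
        this.audioObjects.push(audioObject);
        return audioObject;
    }
};
var interfaceContext = {
    deleteCommentBoxes: function () {},
    deleteCommentQuestions: function () {}
};

function loadTest(audioHolderObject) {
    // Clear out the previous page.
    audioEngineContext.newTestPage();
    audioEngineContext.audioObjects = [];
    interfaceContext.deleteCommentBoxes();
    interfaceContext.deleteCommentQuestions();

    // Build one audioObject per fragment on the new page.
    audioHolderObject.audioElements.forEach(function (element) {
        var audioObject = audioEngineContext.newTrack(element);
        // Link your interface widget here, e.g.:
        // audioObject.interfaceDOM = new MyRatingWidget(audioObject);
    });
}

loadTest({ audioElements: [{ url: "0.wav" }, { url: "1.wav" }] });
```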
+
+			You also need to link audioObject.interfaceDOM with your interface object for that audioObject. The interfaceDOM object has a few required methods. Firstly, it must start disabled and become enabled once the audioObject has decoded its audio (function call: enable()). Next, it must have a function exportXMLDOM(), which returns the XML node for your interface; the default is to return a value node with textContent equal to the normalised value. You can do otherwise, but our scripts may not work if something different is specified (as it would breach our results specification). Finally, it must have a method getValue(), which returns the normalised value.
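A minimal interfaceDOM sketch following these requirements; RatingInterface is a hypothetical name, and a plain object stands in for the XML DOM node so the sketch runs standalone:

```javascript
// Sketch of a minimal interfaceDOM object. Only enable/exportXMLDOM/getValue
// are required by core.js; everything else here is illustrative.
function RatingInterface(audioObject) {
    this.parent = audioObject;
    this.value = 0;        // normalised rating, 0..1
    this.enabled = false;  // must start disabled
}
RatingInterface.prototype.enable = function () {
    // Called once the linked audioObject has finished decoding its audio.
    this.enabled = true;
};
RatingInterface.prototype.getValue = function () {
    return this.value;
};
RatingInterface.prototype.exportXMLDOM = function () {
    // Default behaviour: a value node whose textContent is the normalised
    // value. (A plain object stands in for a real DOM node here.)
    return { nodeName: "value", textContent: String(this.value) };
};
```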
+
+			It is also the job of the interfaceDOM to call any metric collection functions necessary, although some functions may be better placed elsewhere (for example, the APE interface uses drag and drop, so the metric functions are best called from the dragEnd function, which fires when the interface object is dropped). Metrics based upon listening are handled by the audioObject; the interfaceDOM object must manage any movement metrics. For a list of valid metrics and their behaviours, see the project specification document in the repository's docs location. The same goes for any checks required when pressing the submit button, or any other method of advancing the test state.
+
+\clearpage
+\section{Analysis and diagnostics}
+	\subsection{In the browser}
+		See `analysis.html' in the main folder: immediate visualisation of (by default) all results in the `saves/' folder. 
+
+	\subsection{Python scripts}
+		The package includes Python (2.7) scripts (in `scripts/') to extract ratings and comments, generate visualisations of ratings and timelines, and produce a fully fledged report. 
+
+		Visualisation requires the free matplotlib toolbox (http://matplotlib.org), numpy and scipy. 
+		By default, the scripts can be run from the `scripts' folder, with the result files in the `saves' folder (the default location where result XMLs are stored). Each script takes the XML file folder as an argument, along with other arguments in some cases.
+		Note: to avoid all kinds of problems, please avoid using spaces in file and folder names (this may work on some systems, but others don't like it). 
+
+		\subsubsection{comment\_parser.py}
+			Extracts comments from the output XML files corresponding with the different subjects found in `saves/'. It creates a folder per `audioholder'/page it finds, and stores a CSV file with comments for every `audioelement'/fragment within these respective `audioholders'/pages. In this CSV file, every line corresponds with a subject/output XML file. Depending on the settings, the first column containing the name of the corresponding XML file can be omitted (for anonymisation). 
+			Beware of Excel: sometimes the UTF-8 is not properly imported, leading to problems with special characters in the comments (particularly cumbersome for foreign languages). 
+
+		\subsubsection{evaluation\_stats.py}
+			Shows a few statistics of tests in the `saves/' folder so far, mainly for checking for errors. Shows the number of files that are there, the audioholder IDs that were tested (and how many of each separate ID), the duration of each page, the duration of each complete test, the average duration per page, and the average duration in function of the page number. 
+
+		\subsubsection{generate\_report.py}
+			Similar to `evaluation\_stats.py', but generates a PDF report based on the output files in the `saves/' folder - or any folder specified as a command line argument. Writes a LaTeX document, then uses pdflatex to convert it to a PDF. 
+
+		\subsubsection{score\_parser.py}
+			Extracts rating values from the XML to CSV - necessary for running visualisation of ratings. Creates the folder `saves/ratings/' if not yet created, to which it writes a separate file for every `audioholder'/page in any of the output XMLs it finds in `saves/'. Within each file, rows represent different subjects (output XML file names) and columns represent different `audioelements'/fragments. 
+
+		\subsubsection{score\_plot.py}
+			Plots the ratings as stored in the CSVs created by score\_parser.py.
+			Depending on the settings, it displays and/or saves (in `saves/ratings/') a boxplot, confidence interval plot, scatter plot, or a combination of the aforementioned. 
+			Requires the free matplotlib library. 
+			At this point, more than one subject is needed for this script to work. 
+
+		\subsubsection{timeline\_view\_movement.py}
+			Creates a timeline for every subject, for every `audioholder'/page, corresponding with any of the output XML files found in `saves/'. It shows the marker movements of the different fragments, along with when each fragment was played (red regions). Automatically takes fragment names, rating axis title, rating axis labels, and audioholder name from the XML file (if available). 
+
+		\subsubsection{timeline\_view.py} % should be omitted or absorbed by the above soon
+			Creates a timeline for every subject, for every `audioholder'/page, corresponding with any of the output XML files found in `saves/'. It shows when and for how long the subject listened to each of the fragments. 
+
+
+
+\clearpage
+\section{Troubleshooting} \label{sec:troubleshooting}
+	\subsection{Reporting bugs and requesting features}
+	Thanks to feedback from using the interface in experiments by the authors and others, many bugs have been caught and fatal crashes due to the interface seem to be a thing of the past entirely. 
+
+	We continually develop this tool to fix issues and implement features useful to us or our user base. See \url{https://code.soundsoftware.ac.uk/projects/webaudioevaluationtool/issues} for a list of feature requests and bug reports, and their status. 
+
+	Please contact the authors if you experience any bugs, if you would like additional functionality, if you spot any errors or gaps in the documentation, if you have questions about using the interface, or if you would like to give any feedback (even positive!) about the interface. We look forward to learning how the tool has (not) been useful to you. 
+
+
+	\subsection{First aid}
+		Meanwhile, if things do go wrong or the test needs to be interrupted for whatever reason, not all data is lost. In a normal scenario, the test is completed to the end (the final `Submit'), at which point the output XML is stored in the \texttt{saves/} folder. If this stage is not reached, open the JavaScript Console (see below for how to find it) and type 
+
+		\texttt{createProjectSave()}
+
+		to present the result XML file on the client side, or
+
+		\texttt{createProjectSave(specification.projectReturn)}
+
+		to try to store it to the specified location, e.g. the `saves/' folder on the web server or the local machine (on failure the result XML should be presented directly in the web browser instead)
+
+		and hit enter. This will open a pop-up window with a hyperlink that reads `Save File'; click it and an XML file with results until that point should be stored in your download folder. 
+		
+		Alternatively, a lot of data can be read from the same console, in which the tool prints a lot of debug information. Specifically:
+	    	\begin{itemize}
+	        	\item the randomisation of pages and fragments are logged;
+	        	\item any time a slider is played, its ID and the time stamp (in seconds since the start of the test) are displayed;
+	        	\item any time a slider is dragged and dropped, the location where it is dropped including the time stamp are shown; 
+	        	\item any comments and pre- or post-test questions and their answers are logged as well. 
+	    	\end{itemize}
+
+		You can select all of this and save it into a text file, so that none of this data is lost. You may choose to do this even when a test was successful, as an extra precaution. 
+
+		If you encounter any issue which you believe to be caused by any aspect of the tool, and/or which the documentation does not mention, please do let us know! 
+
+		\subsubsection*{Opening the JavaScript Console}
+	        \begin{itemize}
+	            \item In Google Chrome, the JavaScript Console can be found in \textbf{View$>$Developer$>$JavaScript Console}, or via the keyboard shortcut Cmd + Alt + J (Mac OS X). 
+	            \item In Safari, the JavaScript Console can be found in \textbf{Develop$>$Show Error Console}, or via the keyboard shortcut Cmd + Alt + C (Mac OS X). Note that for the Developer menu to be visible, you have to go to Preferences (Cmd + ,) and enable `Show Develop menu in menu bar' in the `Advanced' tab. \textbf{Note that as long as the Developer menu is not visible, nothing is logged to the console, i.e. you will only be able to see diagnostic information from when you switched on the Developer tools onwards.}
+	            \item In Firefox, go to \textbf{Tools$>$Web Developer$>$Web Console}, or hit Cmd + Alt + K. 
+	        \end{itemize}
+
+	\subsection{Known issues and limitations}
+	\label{sec:knownissues}
+
+		The following is a non-exhaustive list of problems and limitations you may experience using this tool, due to not being supported yet by us, or by the Web Audio API and/or (some) browsers. 
+
+		\begin{itemize}
+			\item Issue \href{https://code.soundsoftware.ac.uk/issues/1463}{\textbf{\#1463}}: \textbf{Firefox} only supports 8 bit and 16 bit WAV files. Pending automatic requantisation (which deteriorates the audio signal's dynamic range to some extent), WAV format stimuli need to adhere to these limitations in order for the test to be compatible with Firefox. 
+			\item Issues \href{https://code.soundsoftware.ac.uk/issues/1474}{\textbf{\#1474}} and \href{https://code.soundsoftware.ac.uk/issues/1462}{\textbf{\#1462}}: On occasions, audio is not working - or only a continuous `beep' can be heard - notably in \textbf{Safari}. Refreshing, quitting the browser and even enabling Developer tools in Safari's Preferences pane (`Advanced' tab: ``Show `Develop' menu in menu bar'') have helped resolve this. If no (high quality) audio can be heard, make sure your entire playback system's settings are all correct. 
+		\end{itemize}
+
+\clearpage
+\bibliographystyle{ieeetr}
+\bibliography{Instructions}{}
+
+
+\clearpage
+\appendix
+
+\section{Legacy}
+	The APE interface and most of the functionality of the first WAET editions are inspired by the APE toolbox for MATLAB \cite{ape}. See \url{https://code.soundsoftware.ac.uk/projects/ape} for the source code and \url{http://brechtdeman.com/publications/aes136.pdf} for the corresponding paper. 
+
+\clearpage
+
+\section{Listening test instructions example}
+
+	Before each test, show the instructions below or similar and make sure it is available to the subject throughout the test. Make sure to ask whether the participant has any questions upon seeing and/or reading the instructions. 
+			
+	\begin{itemize}
+		\item You will be asked for your name (``John Smith'') and location (room identifier). 
+		\item An interface will appear, where you are asked to 
+		\begin{itemize}
+			\item click green markers to play the different mixes;
+			\item drag the markers on a scale to reflect your preference for the mixes;
+			\item comment on these mixes, using text boxes with corresponding numbers (in your \textbf{native language});
+			\item optionally comment on all mixes together, or on the song, in `General comments'. 
+		\end{itemize}
+		\item You are asked for your personal, honest opinion. Feel free to use the full range of the scale to convey your opinion of the various mixes. Don't be afraid to be harsh and direct. 
+		\item The markers appear at random positions at first (which means some markers may hide behind others). 
+		\item The interface can take a few seconds to start playback, but switching between mixes should be instantaneous. 
+		\item This is a research experiment, so please forgive us if things go wrong. Let us know immediately and we will fix it or restart the test.  
+		\item When the test is finished (after all songs have been evaluated), just call the experimenter, do NOT close the window.  
+		\item After the test, please fill out our survey about your background, experience and feedback on the test. 
+		\item By participating, you consent to us using all collected data for research. Unless asked explicitly, all data will be anonymised when shared. 
+	\end{itemize}
+
+\clearpage
+
+\section{Terminology} % just to keep track of what exactly we call things. Don't use terms that are too different, to avoid confusion.
+	As a guide to better understand the Instructions, and to expand them later, here is a list of terms that may be unclear or ambiguous unless properly defined. 
+	\begin{description}
+		\item[Subject] The word we use for a participant, user, ... of the test, i.e. not the experimenter who designs the test but the person who evaluates the audio under test as part of an experiment (or the preparation of one). 
+		\item[User] The person who uses the tool to configure, run and analyse the test - i.e. the experimenter, most likely a researcher - or at least 
+		\item[Page] A screen in a test; corresponds with an \texttt{audioholder}
+		\item[Fragment] An element, stimulus or sample in a test; corresponds with an \texttt{audioelement}
+		\item[Test] A complete test which can consist of several pages; corresponds with an entire configuration XML file
+		\item[Configuration XML file] The XML file containing the necessary information on interface, samples, survey questions, configurations, ... which the JavaScript modules read to produce the desired test.
+		\item[Results XML file] The output of a successful test, including ratings, comments, survey responses, timing information, and the complete configuration XML file with which the test was generated in the first place. 
+	\end{description}
+
+\clearpage
+
+\setcounter{secnumdepth}{0} % don't number this last bit
+\section{Contact details} % maybe add web pages, Twitter accounts, whatever you like
+\label{sec:contact}
+
+	\begin{itemize}
+		\item Nicholas Jillings: \texttt{nicholas.jillings@mail.bcu.ac.uk}
+		\item Brecht De Man: \texttt{b.deman@qmul.ac.uk}
+		\item David Moffat: \texttt{d.j.moffat@qmul.ac.uk}
+	\end{itemize}
+
+\end{document}  
\ No newline at end of file
--- a/docs/Instructions/ListeningTestInstructions.bib	Mon Dec 21 23:18:43 2015 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,31 +0,0 @@
-%% This BibTeX bibliography file was created using BibDesk.
-%% http://bibdesk.sourceforge.net/
-
-%% Created for Brecht De Man at 2015-09-30 17:44:12 +0200 
-
-
-%% Saved with string encoding Unicode (UTF-8) 
-
-
-
-@conference{ape,
-	Author = {De Man, Brecht and Joshua D. Reiss},
-	Booktitle = {136th Convention of the Audio Engineering Society},
-	Date-Added = {2015-09-29 17:07:16 +0000},
-	Date-Modified = {2015-09-29 17:07:20 +0000},
-	Keywords = {perceptual evaluation},
-	Month = {April},
-	Read = {1},
-	Title = {{APE}: {A}udio {P}erceptual {E}valuation toolbox for {MATLAB}},
-	Year = {2014},
-	Bdsk-File-1 = {YnBsaXN0MDDUAQIDBAUGJCVYJHZlcnNpb25YJG9iamVjdHNZJGFyY2hpdmVyVCR0b3ASAAGGoKgHCBMUFRYaIVUkbnVsbNMJCgsMDxJXTlMua2V5c1pOUy5vYmplY3RzViRjbGFzc6INDoACgAOiEBGABIAFgAdccmVsYXRpdmVQYXRoWWFsaWFzRGF0YV8QOi4uLy4uLy4uLy4uL0dvb2dsZSBEcml2ZS9Xcml0aW5ncy9fcHVibGljYXRpb25zL2FlczEzNi5wZGbSFwsYGVdOUy5kYXRhTxEBsgAAAAABsgACAAAMTWFjaW50b3NoIEhEAAAAAAAAAAAAAAAAAAAA0Fxdh0grAAAACl8UCmFlczEzNi5wZGYAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAKaS7PXHsUAAAAAAAAAAAABAAEAAAJIAAAAAAAAAAAAAAAAAAAAA1fcHVibGljYXRpb25zAAAQAAgAANBcQWcAAAARAAgAAM9cbQQAAAABABQACl8UAApeugAKXQIACUReAAKT1QACAE1NYWNpbnRvc2ggSEQ6VXNlcnM6AEJyZWNodDoAR29vZ2xlIERyaXZlOgBXcml0aW5nczoAX3B1YmxpY2F0aW9uczoAYWVzMTM2LnBkZgAADgAWAAoAYQBlAHMAMQAzADYALgBwAGQAZgAPABoADABNAGEAYwBpAG4AdABvAHMAaAAgAEgARAASADtVc2Vycy9CcmVjaHQvR29vZ2xlIERyaXZlL1dyaXRpbmdzL19wdWJsaWNhdGlvbnMvYWVzMTM2LnBkZgAAEwABLwAAFQACAA3//wAAgAbSGxwdHlokY2xhc3NuYW1lWCRjbGFzc2VzXU5TTXV0YWJsZURhdGGjHR8gVk5TRGF0YVhOU09iamVjdNIbHCIjXE5TRGljdGlvbmFyeaIiIF8QD05TS2V5ZWRBcmNoaXZlctEmJ1Ryb290gAEACAARABoAIwAtADIANwBAAEYATQBVAGAAZwBqAGwAbgBxAHMAdQB3AIQAjgDLANAA2AKOApAClQKgAqkCtwK7AsICywLQAt0C4ALyAvUC+gAAAAAAAAIBAAAAAAAAACgAAAAAAAAAAAAAAAAAAAL8}}
-
-@conference{waet,
-	Author = {Nicholas Jillings and David Moffat and De Man, Brecht and Joshua D. Reiss},
-	Booktitle = {12th Sound and Music Computing Conference},
-	Date-Added = {2015-09-22 16:48:27 +0000},
-	Date-Modified = {2015-09-22 16:48:33 +0000},
-	Month = {July},
-	Read = {1},
-	Title = {Web {A}udio {E}valuation {T}ool: {A} browser-based listening test environment},
-	Year = {2015}}
Binary file docs/Instructions/ListeningTestInstructions.pdf has changed
--- a/docs/Instructions/ListeningTestInstructions.tex	Mon Dec 21 23:18:43 2015 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,241 +0,0 @@
-\documentclass[11pt, oneside]{article}   	% use "amsart" instead of "article" for AMSLaTeX format
-\usepackage{geometry}                		% See geometry.pdf to learn the layout options. There are lots.
-\geometry{letterpaper}                   		% ... or a4paper or a5paper or ... 
-%\geometry{landscape}                		% Activate for rotated page geometry
-\usepackage[parfill]{parskip}    		% Activate to begin paragraphs with an empty line rather than an indent
-\usepackage{graphicx}				% Use pdf, png, jpg, or eps§ with pdflatex; use eps in DVI mode
-								% TeX will automatically convert eps --> pdf in pdflatex		
-								
-\usepackage{listings}				% Source code
-\usepackage{amssymb}
-\usepackage{cite}
-\usepackage{hyperref}				% Hyperlinks
-\usepackage[nottoc,numbib]{tocbibind}	% 'References' in TOC
-
-\graphicspath{{img/}}					% Relative path where the images are stored. 
-
-\title{Instructions for listening tests using\\ Web Audio Evaluation Tool}
-\author{Brecht De Man}
-\date{}							% Activate to display a given date or no date
-
-\begin{document}
-\maketitle
-
-These instructions are about use of the Web Audio Evaluation Tool \cite{waet} with the APE interface \cite{ape} on Windows and Mac OS X platforms. 
-% TO DO: Linux
-
-\tableofcontents
-
-\clearpage
-
-\section{Installation and set up}
-	Download the folder (\url{https://code.soundsoftware.ac.uk/hg/webaudioevaluationtool/archive/tip.zip}) and unzip in a location of your choice. 
-	
-	\subsection{Contents}
-		The folder should contain the following elements: \\
-		
-		\textbf{Main folder:} 
-			\begin{itemize}
-	            	\item \texttt{ape.css, core.css, graphics.css, structure.css}: style files (edit to change appearance)
-	            	\item \texttt{ape.js}: JavaScript file for APE-style interface \cite{ape}
-	            	\item \texttt{CITING.txt, LICENSE.txt, README.txt}: text files with, respectively, the citation which we ask to include in any work where this tool or any portion thereof is used, modified or otherwise; the license under which the software is shared; and a general readme file.
-	            	\item \texttt{core.js}: JavaScript file with core functionality
-	            	\item \texttt{index.html}: webpage where interface should appear
-	            	\item \texttt{jquery-2.1.4.js}: jQuery JavaScript Library
-	            	\item \texttt{pythonServer.py}: webserver for running tests locally
-	            	\item \texttt{pythonServer-legacy.py}: webserver with limited functionality (no automatic storing of output XML files)\\
-			\end{itemize}
-	     \textbf{Documentation (./docs/)}
-	         \begin{itemize}
-	         		\item Instructions: PDF and \LaTeX source of these instructions
-	            	\item Project Specification Document (\LaTeX/PDF)
-	            	\item Results Specification Document (\LaTeX/PDF)
-	            	\item SMC15: PDF and \LaTeX source of corresponding SMC2015 publication \cite{waet}
-	            	\item WAC2016: PDF and \LaTeX source of corresponding WAC2016 publication\\
-			\end{itemize}
-         \textbf{Example project (./example\_eval/)}
-            	\begin{itemize}
-            		\item An example of what the set up XML should look like, with example audio files 0.wav-10.wav which are short recordings at 44.1kHz, 16bit of a woman saying the corresponding number (useful for testing randomisation and general familiarisation with the interface).\\ 
-            	\end{itemize}
-          \textbf{Output files (./saves/)}
-            	\begin{itemize}
-            		\item The output XML files of tests will be stored here by default by the \texttt{pythonServer.py} script.\\ 
-            	\end{itemize}
-          \textbf{Auxiliary scripts (./scripts/)}
-            	\begin{itemize}
-            		\item Helpful Python scripts for extraction and visualisation of data.\\ 
-            	\end{itemize}
-          \textbf{Test creation tool (./test\_create/)}
-            	\begin{itemize}
-            		\item Webpage for easily setting up your own test without having to delve into the XML.\\ 
-            	\end{itemize}
-                    	
-	\subsection{Browser}
-		As Microsoft Internet Explorer doesn't support the Web Audio API\footnote{\url{http://caniuse.com/\#feat=audio-api}}, you will need another browser like Google Chrome, Safari or Firefox (all three are tested and confirmed to work). 
-		
-		The tool is platform-independent and works in any browser that supports the Web Audio API. It does not require any specific, proprietary software. However, in case the tool is hosted locally (i.e. you are not hosting it on an actual webserver) you will need Python, which is a free programming language - see the next paragraph. 
-	
-	\subsection{Python 2.7}
-		On Windows, Python 2.7 is not generally preinstalled and therefore has to be downloaded\footnote{\url{https://www.python.org/downloads/windows/}} and installed to be able to run scripts such as the local webserver, necessary if the tool is hosted locally. 
-		
-		On Mac OS X, Python comes preinstalled. 
-
-\clearpage
-
-\section{Listening test: Local}
-	\subsection{Start local webserver}
-		If the test is hosted locally, you will need to run the local webserver provided with this tool. 
-		
-		\subsubsection{Mac OS X}
-			Open the Terminal (find it in \textbf{Applications/Terminal} or via Spotlight), and go to the folder you downloaded. To do this, type \texttt{cd [folder]}, where \texttt{[folder]} is the folder where to find the \texttt{pythonServer.py} script you downloaded. For instance, if the location is \texttt{/Users/John/Documents/test/}, then type
-			
-				\texttt{cd /Users/John/Documents/test/}
-				
-			Then hit enter and run the Python script by typing
-
-				\texttt{python pythonServer.py}
-
-			and hit enter again. See also Figure \ref{fig:terminal}.
-			
-			\begin{figure}[htbp]
-	                \begin{center}
-	                \includegraphics[width=.75\textwidth]{pythonServer.png}
-	                \caption{Mac OS X: The Terminal window after going to the right folder (\texttt{cd [folder\_path]}) and running \texttt{pythonServer.py}.}
-	                \label{fig:terminal}
-	                \end{center}
-	                \end{figure}
-
-	        Alternatively, you can simply type \texttt{python} (follwed by a space) and drag the file into the Terminal window from Finder. % DOESN'T WORK YET
-			
-			You can leave this running throughout the different experiments (i.e. leave the Terminal open). 
-
-		\subsubsection{Windows}
-		
-			Simply double click the Python script \texttt{pythonServer.py} in the folder you downloaded. 
-			
-			You may see a warning like the one in Figure \ref{fig:warning}. Click `Allow access'. 
-			
-			\begin{figure}[htbp]
-	                \begin{center}
-	                \includegraphics[width=.6\textwidth]{warning.png}
-	                \caption{Windows: Potential warning message when executing \texttt{pythonServer.py}.}
-	                \label{fig:warning}
-	                \end{center}
-	                \end{figure}
-	                
-	                The process should now start, in the Command prompt that opens - see Figure \ref{fig:python}. 
-	                
-	                \begin{figure}[htbp]
-	                \begin{center}
-	                \includegraphics[width=.75\textwidth]{python.png}
-	                \caption{Windows: The Command Prompt after running \texttt{pythonServer.py} and opening the corresponding website.}
-	                \label{fig:python}
-	                \end{center}
-	                \end{figure}
-	                
-	                You can leave this running throughout the different experiments (i.e. leave the Command Prompt open). 
-		
-		
-\clearpage
-	\subsection{Sample rate}
-		Depending on how the experiment is set up, audio is resampled automatically (the Web Audio default) or the sample rate is enforced. In the latter case, you will need to make sure that the sample rate of the system is equal to the sample rate of these audio files. For this reason, all audio files in the experiment will have to have the same sample rate. 
-
-		Always make sure that all other digital equipment in the playback chain (clock, audio interface, digital-to-analog converter, ...) is set to this same sample rate.
-		
-		\subsubsection{Mac OS X}
-			To change the sample rate in Mac OS X, go to \textbf{Applications/Utilities/Audio MIDI Setup} or find this application with Spotlight. Then select the output of the audio interface you are using and change the `Format' to the appropriate number. Also make sure the bit depth and channel count are as desired. 
-			If you are using an external audio interface, you may have to go to the preference pane of that device to change the sample rate. 
-		
-		\subsubsection{Windows}
-			To change the sample rate in Windows, right-click on the speaker icon in the lower-right corner of your desktop and choose `Playback devices'. Right-click the appropriate playback device and click `Properties'. Click the `Advanced' tab and verify or change the sample rate under `Default Format'.    % NEEDS CONFIRMATION
-			If you are using an external audio interface, you may have to go to the preference pane of that device to change the sample rate. 
-		 
-				
-		
-	\subsection{Setting up a participant}
-		
-		\subsubsection{Instructions} % EXAMPLE?
-			Before each test, show the instructions below or similar and make sure it is available to the subject throughout the test. Make sure to ask whether the participant has any questions upon seeing and/or reading the instructions. 
-			
-			\begin{itemize}
-			\item You will be asked for your name (``John'') and location (room identifier). 
-			\item An interface will appear, where you are asked to 
-			\begin{itemize}
-				\item click green markers to play the different mixes;
-				\item drag the markers on a scale to reflect your preference for the mixes;
-				\item comment on these mixes, using text boxes with corresponding numbers (in your \textbf{native language});
-				\item optionally comment on all mixes together, or on the song, in `General comments'. 
-			\end{itemize}
-			\item You are asked for your personal, honest opinion. Feel free to use the full range of the scale to convey your opinion of the various mixes. Don?t be afraid to be harsh and direct. 
-			\item The markers appear at random positions at first (which means some markers may hide behind others). 
-			\item The interface can take a few seconds to start playback, but switching between mixes should be instantaneous. 
-			\item This is a research experiment, so please forgive us if things go wrong. Let us know immediately and we will fix it or restart the test.  
-			\item When the test is finished (after all songs have been evaluated), just call the experimenter, do NOT close the window.  
-			\item After the test, please fill out our survey about your background, experience and feedback on the test. 
-			\item By participating, you consent to us using all collected data for research. Unless asked explicitly, all data will be anonymised when shared. 
-			\end{itemize}
-
-		
-		\subsubsection{The test}
-			To start the test, open the browser and type 
-			
-			\texttt{localhost:8000}
-			
-			and hit enter. The test should start (see Figure \ref{fig:test}). 
-			
-			\begin{figure}[htb]
-                        \begin{center}
-                        \includegraphics[width=.8\textwidth]{test.png}
-                        \caption{The start of the test in Google Chrome on Windows 7.}
-                        \label{fig:test}
-                        \end{center}
-                        \end{figure}
-                        
-            If at any point in the test the participant reports weird behaviour or an error of some kind, or the test needs to be interrupted, please notify the experimenter and/or refer to Section \ref{sec:troubleshooting}. 
-			
-			When the test is over (the subject should see a message to that effect, and click `Submit' one last time), the output XML file containing all collected data should have appeared in `saves/'. These files are named `test-0.xml', `test-1.xml', etc., in ascending order, and the Terminal or Command Prompt running the local web server displays each file name as it is saved. If such a file did not appear, please again refer to Section \ref{sec:troubleshooting}. 
-			
-			It is advised that you back up these results as often as possible, as a loss of this data means that the time and effort spent by the subject(s) has been in vain. Save the results to an external or network drive, and/or send them to the experimenter regularly. 
-			
-			To start the test again for a new participant, you do not need to close the browser or shut down the Terminal or Command Prompt. Simply refresh the page or go to \texttt{localhost:8000} again. 
-			
-			
-		\subsubsection{Survey}
-			The tool allows for embedded questions before and after each page, and before and after the whole test. If these do \underline{not} include survey questions (about the participant's background, demographic information, and so on), make sure to ask the participant to complete the survey immediately after the test. Above all, this decreases the likelihood that the survey is forgotten and that the experimenters do not receive the data in time. 
-	
-\clearpage
-	\subsection{Troubleshooting} \label{sec:troubleshooting}
-		Thanks to feedback from using the interface in experiments by the authors and others, many bugs have been caught and fatal crashes due to the interface (provided it is set up properly by the user) seem to be a thing of the past. 
-		However, if things do go wrong or the test needs to be interrupted for whatever reason, not all data is lost. In a normal scenario, the test needs to be completed until the end (the final `Submit'), at which point the output XML is stored in the \texttt{saves/} folder. If this stage is not reached, open the JavaScript Console (see below for how to find it) and type 
-
-		\texttt{createProjectSave()}
-
-		and hit enter. This will open a pop-up window with a hyperlink that reads `Save File'; click it and an XML file with results until that point should be stored in your download folder. 
-		
-		Alternatively, a lot of data can be read from the same console, in which the tool prints a lot of debug information. Specifically:
-        	\begin{itemize}
-            	\item the randomisation of pages and fragments are logged;
-            	\item any time a slider is played, its ID and the time stamp (in seconds since the start of the test) are displayed;
-            	\item any time a slider is dragged and dropped, the location where it is dropped and the time stamp are shown; 
-            	\item any comments and pre- or post-test questions and their answers are logged as well. 
-        	\end{itemize}
-
-		You can select all of this and save it into a text file, so that none of this data is lost. You may choose to do this even when a test was successful, as an extra precaution. 
-
-		\subsubsection{Opening the JavaScript Console}
-            \begin{itemize}
-                \item In Google Chrome, the JavaScript Console can be found in \textbf{View$>$Developer$>$JavaScript Console}, or via the keyboard shortcut Cmd + Alt + J (Mac OS X). 
-                \item In Safari, the JavaScript Console can be found in \textbf{Develop$>$Show Error Console}, or via the keyboard shortcut Cmd + Alt + C (Mac OS X). Note that for the Developer menu to be visible, you have to go to Preferences (Cmd + ,) and enable `Show Develop menu in menu bar' in the `Advanced' tab. 
-                \item In Firefox, go to \textbf{Tools$>$Web Developer$>$Web Console}, or hit Cmd + Alt + K. 
-            \end{itemize}
-
-\clearpage
-\section{Listening test: remote}
-
-	(TBA)
-
-\clearpage
-\bibliographystyle{ieeetr}
-\bibliography{ListeningTestInstructions}{}
-
-\end{document}  
\ No newline at end of file
--- a/docs/Instructions/User Guide.tex	Mon Dec 21 23:18:43 2015 +0000
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,75 +0,0 @@
-\documentclass[11pt, oneside]{article}   	% use "amsart" instead of "article" for AMSLaTeX format
-\usepackage{geometry}                		% See geometry.pdf to learn the layout options. There are lots.
-\geometry{letterpaper}                   		% ... or a4paper or a5paper or ... 
-%\geometry{landscape}                		% Activate for rotated page geometry
-\usepackage[parfill]{parskip}    		% Activate to begin paragraphs with an empty line rather than an indent
-\usepackage{graphicx}				% Use pdf, png, jpg, or eps§ with pdflatex; use eps in DVI mode
-								% TeX will automatically convert eps --> pdf in pdflatex		
-								
-\usepackage{listings}				% Source code
-\usepackage{amssymb}
-\usepackage{cite}
-\usepackage{hyperref}				% Hyperlinks
-
-\graphicspath{{img/}}					% Relative path where the images are stored. 
-
-\title{Web Audio Evaluation Tool \\User Guide}
-\date{}							% Activate to display a given date or no date
-
-\begin{document}
-\maketitle
-
-These instructions are about use of the Web Audio Evaluation Tool \cite{deman2015c}.
-Version 1.0
-
-\tableofcontents
-
-\section{Installing}
-
-The tool can be downloaded from the SoundSoftware website, available at \url{https://code.soundsoftware.ac.uk/projects/webaudioevaluationtool/repository}. The repository contains all the files required by the tool, along with interfaces to post bug reports or issue any feature requests.
-
-Once downloaded and extracted (either through a Mercurial client or the available zip download) the tool is ready for use. The tool is designed for three modes of use:
-\begin{itemize}
-\item Single Location, One User - A listening test which will be conducted in a single location, one user at a time. Possibly on a machine with no network or internet connectivity
-\item Single Location, Multiple Users - Similar to the above but where the hosting server is located behind a networked firewall which all test machines can access
-\item Multiple Location, Multiple Users - A test operated over the web by multiple end users
-\end{itemize}
-There are other modes of use which we cannot document due to the flexible nature of the test. If your test does not fit into one of these three categories, have a look at the Advanced Test Creation section.
-
-\subsection{Python}
-
-To trial the test before deployment, or if you are performing a test on a non-networked machine, you will need to run our Python script to launch a local web server. This script is designed for Python 2.7. Running the script starts a basic web server hosting the directory it is contained in. Visit \url{http://localhost:8080/} to launch the test instance once the server is running. To quit the server, either close the terminal window or press Ctrl+C to shut it down.
-
-If your system already uses port 8080 and you wish to use the server, please read the Advanced Test Creation section.
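The bundled pythonServer.py targets Python 2.7 and is not reproduced here; as an illustration of what such a local test server does, a minimal stand-in using only the Python 3 standard library might look like the sketch below (the port constant and handler choice are assumptions for this sketch, not the script's actual code):

```python
# Sketch of a minimal local web server serving the current directory,
# assuming Python 3's standard library. Illustrative only: pythonServer.py
# itself is a separate Python 2.7 script.
import http.server
import socketserver

PORT = 8080  # change this value if port 8080 is already in use


def serve(port=PORT):
    # Serve files from the directory the script is started in
    handler = http.server.SimpleHTTPRequestHandler
    with socketserver.TCPServer(("", port), handler) as httpd:
        print("Serving on http://localhost:%d/" % port)
        httpd.serve_forever()
```

Running `serve()` from the repository root would then make the test available at `http://localhost:8080/`.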
-
-\section{Designing a Test}
-
-The test specification document is an XML file containing all the information the tool requires to operate your test. No coding in JavaScript or HTML is needed to get this test running.
-
-
-
-\subsection{Using the test create tool}
-We have supplied a test creation tool, available in the repository directory test\_creation. This tool is a self-contained web page, so double-clicking it will launch the page in your system's default browser.
-
-The test creation tool can help you build a simple test very quickly. By simply selecting your interface and clicking check-boxes you can build a test in minutes.
-
-Audio is handled by directing the tool to where your audio files are stored.
-
-The tool examines your XML before exporting to ensure you do not export an invalid XML structure which would crash the test.
-
-\subsection{Setting up the test directory}
-
-\section{Launching and operating}
-
-\section{Advanced Test Creation}
-\subsection{Multi-User}
-\subsection{3rd Party Server}
-
-\section{Errors and Troubleshooting}
-\subsection{Common Errors}
-\subsection{Forcing an Export}
-\subsection{Terminal}
-
-\section{Future Work}
-
-\end{document}
\ No newline at end of file
Binary file docs/Instructions/img/audiomidisetup.png has changed
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/example_eval/mushra_example.xml	Wed Dec 23 14:36:00 2015 +0000
@@ -0,0 +1,97 @@
+<?xml version="1.0" encoding="utf-8"?>
+<BrowserEvalProjectDocument>
+	<setup interface="MUSHRA" projectReturn="save.php" randomiseOrder='true' collectMetrics='true' testPages="2" loudness="-23">
+		<PreTest>
+			<question id="sessionId" mandatory="true">Please enter your name.</question>
+			<statement>This is an example of a 'MUSHRA'-style test, with two pages, using the test stimuli in 'example_eval/'.</statement>
+		</PreTest>
+		<PostTest>
+			<question id="location" mandatory="true" boxsize="large">Please enter your location. (example mandatory text question)</question>
+			<number id="age" min="0">Please enter your age (example non-mandatory number question)</number>
+			<radio id="rating">
+				<statement>Please rate this interface (example radio button question)</statement>
+				<option name="bad">Bad</option>
+				<option name="poor">Poor</option>
+				<option name="good">Good</option>
+				<option name="great">Great</option>
+			</radio>
+			<statement>Thank you for taking this listening test. Please click 'submit' and your results will appear in the 'saves/' folder.</statement>
+		</PostTest>
+		<Metric>
+			<metricEnable>testTimer</metricEnable>
+			<metricEnable>elementTimer</metricEnable>
+			<metricEnable>elementInitialPosition</metricEnable>
+			<metricEnable>elementTracker</metricEnable>
+			<metricEnable>elementFlagListenedTo</metricEnable>
+			<metricEnable>elementFlagMoved</metricEnable>
+			<metricEnable>elementListenTracker</metricEnable>
+		</Metric>
+		<interface>
+			<check name="fragmentMoved"/>
+			<option name='playhead'/>
+			<option name="page-count"/>
+		</interface>
+	</setup>
+	<audioHolder id='test-0' hostURL="example_eval/" randomiseOrder='true' repeatCount='0' loop='true' elementComments='true' loudness="-12" initial-position='50'>
+		<interface>
+			<title>Preference</title>
+			<check name='fragmentPlayed'/>
+			<scale position="0">Min</scale>
+			<scale position="100">Max</scale>
+			<scale position="50">Middle</scale>
+			<scale position="20">20</scale>
+			<commentBoxPrefix>Comment on fragment</commentBoxPrefix>
+		</interface>
+		<audioElements url="0.wav" id="0" type="anchor"/>
+		<audioElements url="1.wav" id="1"/>
+		<audioElements url="2.wav" id="2"/>
+		<audioElements url="3.wav" id="3"/>
+		<audioElements url="4.wav" id="4"/>
+		<PreTest>
+			<statement>Example of a 'MUSHRA'-style interface with hidden anchor 'zero' (which needs to be below 20%), looping of the samples, randomisation of marker labels, mandatory moving of every sample, mandatory listening to every sample, and initialisation of all markers to 50%.</statement>
+		</PreTest>
+		<PostTest>
+			<question id="genre" mandatory="true">Please enter the genre.</question>
+		</PostTest>
+	</audioHolder>
+    <audioHolder id='test-1' hostURL="example_eval/" randomiseOrder='true' repeatCount='0' loop='false' elementComments='true'>
+        <interface name="preference">
+            <title>Example Test Question</title>
+            <scale position="0">Min</scale>
+            <scale position="100">Max</scale>
+            <scale position="50">Middle</scale>
+            <scale position="75">75</scale>
+            <scalerange min="25" max="75"/>
+            <commentBoxPrefix>Comment on fragment</commentBoxPrefix>
+        </interface>
+        <audioElements url="0.wav" gain="-6" id="0" type="anchor" marker="20"/>
+        <audioElements url="1.wav" gain="0.0" id="1" type="reference" marker="80"/>
+        <audioElements url="2.wav" gain="0.0" id="2"/>
+        <audioElements url="3.wav" gain="0.0" id="3"/>
+        <audioElements url="4.wav" gain="0.0" id="4"/>
+        <audioElements url="5.wav" gain="0.0" id="5"/>
+        <audioElements url="6.wav" gain="0.0" id="6" type="outsidereference"/>
+        <CommentQuestion id='mixingExperience' type="text">What is your general experience with numbers?</CommentQuestion>
+        <CommentQuestion id="preference" type="radio">
+            <statement>Please enter your overall preference</statement>
+            <option name="worst">Very Bad</option>
+            <option name="bad"></option>
+            <option name="OK">OK</option>
+            <option name="Good"></option>
+            <option name="Great">Great</option>
+        </CommentQuestion>
+        <CommentQuestion id="character" type="checkbox">
+            <statement>Please describe the overall character</statement>
+            <option name="funky">Funky</option>
+            <option name="mellow">Mellow</option>
+            <option name="laidback">Laid back</option>
+            <option name="heavy">Heavy</option>
+        </CommentQuestion>
+        <PreTest>
+        	<statement>Example of a 'MUSHRA'-style interface with outside reference 'six', hidden anchor 'zero' (which needs to be below 20%), hidden reference 'one' (which needs to be above 80%), and randomisation of marker labels.</statement>
+        </PreTest>
+        <PostTest>
+            <question id="genre" mandatory="true">Please enter the genre.</question>
+        </PostTest>
+    </audioHolder>
+</BrowserEvalProjectDocument>
\ No newline at end of file
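A project document such as mushra_example.xml above can be inspected with any XML parser. The sketch below is not part of the tool; it borrows the element and attribute names (`BrowserEvalProjectDocument`, `setup`, `audioHolder`, `audioElements`) from the examples in this changeset and embeds a trimmed-down document as a string so it is self-contained:

```python
import xml.etree.ElementTree as ET

# Trimmed-down project document using the same element/attribute names
# as example_eval/mushra_example.xml (embedded for self-containment)
doc = """<?xml version="1.0" encoding="utf-8"?>
<BrowserEvalProjectDocument>
  <setup interface="MUSHRA" testPages="2" loudness="-23"/>
  <audioHolder id="test-0" hostURL="example_eval/">
    <audioElements url="0.wav" id="0" type="anchor"/>
    <audioElements url="1.wav" id="1"/>
  </audioHolder>
</BrowserEvalProjectDocument>"""

root = ET.fromstring(doc)
setup = root.find("setup")
print("interface:", setup.get("interface"))
for page in root.findall("audioHolder"):
    urls = [el.get("url") for el in page.findall("audioElements")]
    print(page.get("id"), "->", urls)
```

This kind of quick check is useful for confirming a hand-edited project file is well-formed before loading it into the tool.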
--- a/example_eval/project.xml	Mon Dec 21 23:18:43 2015 +0000
+++ b/example_eval/project.xml	Wed Dec 23 14:36:00 2015 +0000
@@ -1,29 +1,29 @@
 <?xml version="1.0" encoding="utf-8"?>
 <BrowserEvalProjectDocument>
-	<setup interface="APE" projectReturn="save.php" randomiseOrder='true' collectMetrics='true' testPages="2">
+	<setup interface="APE" projectReturn="save.php" randomiseOrder='true' collectMetrics='true' testPages="2" loudness="-23">
 		<PreTest>
-			<question id="Location" mandatory="true" boxsize="large">Please enter your location.</question>
-			<checkbox id="experience">
-				<statement>Check options which are relevant to you</statement>
-				<option id="digital">Digital Consoles</option>
-				<option id="analog">Analog Consoles</option>
-				<option id="live">Live Mixing</option>
-				<option id="studio">Studio Mixing</option>
-				<option id="player">Play an instrument</option>
+			<question id="sessionId" mandatory="true">Please enter your name.</question>
+			<checkbox id="checkboxtest" mandatory="true">
+				<statement>Please select with which activities you have any experience (example checkbox question)</statement>
+				<option name="musician">Playing a musical instrument</option>
+				<option name="soundengineer">Recording or mixing audio</option>
+				<option name="developer">Developing audio software</option>
+				<option name="hwdesigner">Designing or building audio hardware</option>
+				<option name="researcher">Research in the field of audio</option>
 			</checkbox>
-			<number id="age" min="0">Please enter your age</number>
-			<statement>Please listen to all fragments</statement>
+			<statement>This is an example of an 'APE'-style test, with two pages, using the test stimuli in 'example_eval/'.</statement>
 		</PreTest>
 		<PostTest>
-			<question id="SessionID" mandatory="true">Please enter your name.</question>
+			<question id="location" mandatory="true" boxsize="large">Please enter your location. (example mandatory text question)</question>
+			<number id="age" min="0">Please enter your age (example non-mandatory number question)</number>
 			<radio id="rating">
-				<statement>Please rate this interface</statement>
+				<statement>Please rate this interface (example radio button question)</statement>
 				<option name="bad">Bad</option>
-				<option name="OK">OK</option>
-				<option name="Good">Good</option>
-				<option name="Great">Great</option>
+				<option name="poor">Poor</option>
+				<option name="good">Good</option>
+				<option name="great">Great</option>
 			</radio>
-			<statement>Thank you for taking this listening test.</statement>
+			<statement>Thank you for taking this listening test. Please click 'submit' and your results will appear in the 'saves/' folder.</statement>
 		</PostTest>
 		<Metric>
 			<metricEnable>testTimer</metricEnable>
@@ -41,44 +41,34 @@
 			<option name="page-count"/>
 		</interface>
 	</setup>
-	<audioHolder id='test-0' hostURL="example_eval/" sampleRate="44100" randomiseOrder='true' repeatCount='0' loop='true' elementComments='true'>
-		<interface>
-			<title>Example Test Question</title>
+	<audioHolder id='test-0' hostURL="example_eval/" randomiseOrder='true' repeatCount='0' loop='true' elementComments='true' loudness="-12">
+		<interface name="preference">
 			<scale position="0">Min</scale>
 			<scale position="100">Max</scale>
 			<scale position="50">Middle</scale>
 			<scale position="20">20</scale>
 			<commentBoxPrefix>Comment on fragment</commentBoxPrefix>
 		</interface>
+		<interface name="depth">
+			<title>Depth</title>
+			<scale position="0">Low</scale>
+			<scale position="100">High</scale>
+			<scale position="50">Middle</scale>
+		</interface>
 		<audioElements url="0.wav" id="0" type="anchor"/>
 		<audioElements url="1.wav" id="1"/>
 		<audioElements url="2.wav" id="2"/>
 		<audioElements url="3.wav" id="3"/>
 		<audioElements url="4.wav" id="4"/>
-		<CommentQuestion id='mixingExperience' type="text">What is your mixing experience</CommentQuestion>
-		<CommentQuestion id="preference" type="radio">
-			<statement>Please enter your ranking preference on this song</statement>
-			<option name="worst">Very Bad</option>
-			<option name="bad"></option>
-			<option name="OK">OK</option>
-			<option name="Good"></option>
-			<option name="Great">Great</option>
-		</CommentQuestion>
-		<CommentQuestion id="preference" type="checkbox">
-			<statement>Describe this song</statement>
-			<option name="bright">Bright</option>
-			<option name="punchy">Punchy</option>
-			<option name="dark">Dark</option>
-			<option name="muddy">Muddy</option>
-			<option name="thin">Thin</option>
-		</CommentQuestion>
-		<PreTest/>
+		<PreTest>
+			<statement>Example of an 'APE' style interface with hidden anchor 'zero' (which needs to be below 20%), looping of the samples, randomisation of marker labels, mandatory moving of every sample, and a forced scale usage of at least 25%-75%. </statement>
+		</PreTest>
 		<PostTest>
 			<question id="genre" mandatory="true">Please enter the genre.</question>
 		</PostTest>
 	</audioHolder>
-    <audioHolder id='test-1' hostURL="example_eval/" sampleRate="44100" randomiseOrder='true' repeatCount='0' loop='false' elementComments='true'>
-        <interface>
+    <audioHolder id='test-1' hostURL="example_eval/" randomiseOrder='true' repeatCount='0' loop='false' elementComments='true'>
+        <interface name="preference">
             <title>Example Test Question</title>
             <scale position="0">Min</scale>
             <scale position="100">Max</scale>
@@ -87,16 +77,16 @@
             <scalerange min="25" max="75"/>
             <commentBoxPrefix>Comment on fragment</commentBoxPrefix>
         </interface>
-        <audioElements url="0.wav" id="0" type="reference" marker="80"/>
-        <audioElements url="1.wav" id="1" type="anchor" marker="20"/>
-        <audioElements url="2.wav" id="2"/>
-        <audioElements url="3.wav" id="3"/>
-        <audioElements url="4.wav" id="4"/>
-        <audioElements url="5.wav" id="5"/>
-        <audioElements url="6.wav" id="6" type="outsidereference"/>
-        <CommentQuestion id='mixingExperience' type="text">What is your mixing experience?</CommentQuestion>
+        <audioElements url="0.wav" gain="-6" id="0" type="anchor" marker="20"/>
+        <audioElements url="1.wav" gain="0.0" id="1" type="reference" marker="80"/>
+        <audioElements url="2.wav" gain="0.0" id="2"/>
+        <audioElements url="3.wav" gain="0.0" id="3"/>
+        <audioElements url="4.wav" gain="0.0" id="4"/>
+        <audioElements url="5.wav" gain="0.0" id="5"/>
+        <audioElements url="6.wav" gain="0.0" id="6" type="outsidereference"/>
+        <CommentQuestion id='mixingExperience' type="text">What is your general experience with numbers?</CommentQuestion>
         <CommentQuestion id="preference" type="radio">
-            <statement>Please enter your ranking preference on this song.</statement>
+            <statement>Please enter your overall preference</statement>
             <option name="worst">Very Bad</option>
             <option name="bad"></option>
             <option name="OK">OK</option>
@@ -104,13 +94,15 @@
             <option name="Great">Great</option>
         </CommentQuestion>
         <CommentQuestion id="preference" type="checkbox">
-            <statement>Describe this song</statement>
+            <statement>Please describe the overall character</statement>
             <option name="funky">Funky</option>
             <option name="mellow">Mellow</option>
             <option name="laidback">Laid back</option>
             <option name="heavy">Heavy</option>
         </CommentQuestion>
-        <PreTest/>
+        <PreTest>
+        	<statement>Example of an 'APE' style interface with outside reference 'six', hidden anchor 'zero' (which needs to be below 20%), hidden reference 'one' (which needs to be above 80%), randomisation of marker labels, and a forced scale usage of at least 25%-75%. </statement>
+        </PreTest>
         <PostTest>
             <question id="genre" mandatory="true">Please enter the genre.</question>
         </PostTest>
--- a/index.html	Mon Dec 21 23:18:43 2015 +0000
+++ b/index.html	Wed Dec 23 14:36:00 2015 +0000
@@ -18,25 +18,53 @@
 		<!--<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>-->
 		<script src="jquery-2.1.4.js"></script>
 		<script src='core.js'></script>
+		<script src='loudness.js'></script>
 		<script type="text/javascript">
-			window.onbeforeunload = function() {
-				return "Please only leave this page once you have completed the tests. Are you sure you have completed all testing?";
-			};
+			// SEARCH QUERY: by appending the GET request ?url=local/path/to/project.xml to the URL, you can load a project directly
+			if (window.location.search.length != 0)
+			{
+				var search = window.location.search.split('?')[1];
+				// Now split the requests into pairs
+				var searchQueries = search.split('&');
+				for (var i in searchQueries)
+				{
+					// Split each request into a key/value pair
+					searchQueries[i] = searchQueries[i].split('=');
+					if (searchQueries[i][0] == "url")
+					{
+						url = searchQueries[i][1];
+					}
+				}
+				loadProjectSpec(url);
+				window.onbeforeunload = function() {
+					return "Please only leave this page once you have completed the tests. Are you sure you have completed all testing?";
+				};
+			}
 		</script>
-		<!-- Uncomment the following script for automatic loading of projects -->
-		<script>
-			//url = '/pseudo.xml'; //Project XML document location
-			url = 'example_eval/project.xml';
-			loadProjectSpec(url);
-		</script>
-		
 	</head>
 
 	<body>
 		<!-- Load up the default page interface allowing for project setting loads, even if hard-coded-->
 		<!-- Actual test interface design should be contained in the .js for ease of dynamic content-->
 		<div id='topLevelBody'>
-			<p>HTML5 APE Tool</p>
+			<h1>Web Audio Evaluation Tool</h1>
+			<h2>Start menu </h2>
+			<ul>
+				<li><a href="index.html?url=example_eval/project.xml" target="_blank">APE interface test example</a></li>
+				<li><a href="index.html?url=example_eval/mushra_example.xml" target="_blank">MUSHRA interface test example</a></li>
+				<li><a href="test_create/test_create.html" target="_blank">Test creator</a></li>
+				<li><a href="analyse.html" target="_blank">Analysis and diagnostics of results</a></li>
+			</ul>
+
+			<br>
+
+			<ul>
+				<li><a href="LICENSE.txt" target="_blank">License</a></li>
+				<li><a href="CITING.txt" target="_blank">Citing</a></li>
+				<li><a href="docs/Instructions/Instructions.pdf" target="_blank">Instructions</a></li>
+			</ul>
+
+
 		</div>
 	</body>
 </html>
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/loudness.js	Wed Dec 23 14:36:00 2015 +0000
@@ -0,0 +1,181 @@
+/**
+ *  loudness.js
+ *  Loudness module for the Web Audio Evaluation Tool.
+ *  Allows for automatic calculation of the loudness of Web Audio API Buffer
+ *  objects, returning gain values to correct for a target loudness or to
+ *  match loudness between multiple objects.
+ */
+
+var interval_cal_loudness_event = null;
+
+if (typeof OfflineAudioContext == "undefined"){
+	var OfflineAudioContext = webkitOfflineAudioContext;
+}
+
+function calculateLoudness(buffer, timescale, target, offlineContext)
+{
+	// Calculates the loudness of the buffer per the EBU R 128 specification model and stores the result in buffer.lufs
+	// buffer -> Web Audio API Buffer object
+	// timescale -> M or Momentary (returns Array), S or Short (returns Array),
+	//   I or Integrated (default, returns number)
+	// target -> default is -23 LUFS but can be any LUFS measurement.
+	
+	if (buffer == undefined)
+	{
+		return 0;
+	}
+	if (timescale == undefined)
+	{
+		timescale = "I";
+	}
+	if (target == undefined)
+	{
+		target = -23;
+	}
+	if (offlineContext == undefined)
+	{
+		offlineContext = new OfflineAudioContext(buffer.numberOfChannels, buffer.length, buffer.sampleRate);
+	}
+	// Create the required filters
+	var KFilter = offlineContext.createBiquadFilter();
+	KFilter.type = "highshelf";
+	KFilter.gain.value = 4;
+	KFilter.frequency.value = 1480;
+	
+	var HPFilter = offlineContext.createBiquadFilter();
+	HPFilter.type = "highpass";
+	HPFilter.Q.value = 0.707;
+	HPFilter.frequency.value = 60;
+	// copy Data into the process buffer
+	var processSource = offlineContext.createBufferSource();
+	processSource.buffer = buffer;
+	
+	processSource.connect(KFilter);
+	KFilter.connect(HPFilter);
+	HPFilter.connect(offlineContext.destination);
+	processSource.start();
+	offlineContext.oncomplete = function(renderedBuffer) {
+		// Have the renderedBuffer information, now continue processing
+		if (typeof renderedBuffer.renderedBuffer == 'object') {
+			renderedBuffer = renderedBuffer.renderedBuffer;
+		}
+		switch(timescale)
+		{
+		case "I":
+			var blockEnergy = calculateProcessedLoudness(renderedBuffer, 400, 0.75);
+			// Apply the absolute gate
+			var loudness = calculateLoudnessFromChannelBlocks(blockEnergy);
+			var absgatedEnergy = new Array(blockEnergy.length);
+			for (var c=0; c<blockEnergy.length; c++)
+			{
+				absgatedEnergy[c] = [];
+			}
+			for (var i=0; i<loudness.length; i++)
+			{
+				if (loudness[i] >= -70)
+				{
+					for (var c=0; c<blockEnergy.length; c++)
+					{
+						absgatedEnergy[c].push(blockEnergy[c][i]);
+					}
+				}
+			}
+			var overallAbsLoudness = calculateOverallLoudnessFromChannelBlocks(absgatedEnergy);
+			
+			//applying the relative gate 8 dB down from overallAbsLoudness
+			var relGateLevel = overallAbsLoudness - 8;
+			var relgateEnergy = new Array(blockEnergy.length);
+			for (var c=0; c<blockEnergy.length; c++)
+			{
+				relgateEnergy[c] = [];
+			}
+			for (var i=0; i<loudness.length; i++)
+			{
+				if (loudness[i] >= relGateLevel)
+				{
+					for (var c=0; c<blockEnergy.length; c++)
+					{
+						relgateEnergy[c].push(blockEnergy[c][i]);
+					}
+				}
+			}
+			var overallRelLoudness = calculateOverallLoudnessFromChannelBlocks(relgateEnergy);
+			buffer.lufs =  overallRelLoudness;
+		}
+	};
+	offlineContext.startRendering();
+}
+
+function calculateProcessedLoudness(buffer, winDur, overlap)
+{
+	// Buffer		Web Audio buffer node
+	// winDur		Window Duration in milliseconds
+	// overlap		Window overlap as normalised (0.5 = 50% overlap);
+	if (buffer == undefined)
+	{
+		return 0;
+	}
+	if (winDur == undefined)
+	{
+		winDur = 400;
+	}
+	if (overlap == undefined)
+	{
+		overlap = 0.5;
+	}
+	var winSize = buffer.sampleRate*winDur/1000;
+	var olapSize = (1-overlap)*winSize;
+	var numberOfFrames = Math.floor(buffer.length/olapSize - winSize/olapSize + 1);
+	var blockEnergy = new Array(buffer.numberOfChannels);
+	for (var channel = 0; channel < buffer.numberOfChannels; channel++)
+	{
+		blockEnergy[channel] = new Float32Array(numberOfFrames);
+		var data = buffer.getChannelData(channel);
+		for (var i=0; i<numberOfFrames; i++)
+		{
+			var sigma = 0;
+			for (var n=i*olapSize; n < i*olapSize+winSize; n++)
+			{
+				sigma += Math.pow(data[n],2);
+			}
+			blockEnergy[channel][i] = sigma/winSize;
+		}
+	}
+	return blockEnergy;
+}
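calculateProcessedLoudness above windows the signal into blocks (400 ms with 75% overlap in the integrated branch) and takes the mean-square energy of each. The same arithmetic in Python, as a sketch for sanity-checking the JavaScript (single channel, made-up sample data; not part of the tool):

```python
def block_energies(samples, sample_rate, win_ms=400, overlap=0.75):
    # Mean-square energy of overlapping blocks for one channel, mirroring
    # calculateProcessedLoudness: window size in samples, hop = (1 - overlap) * window
    win = int(sample_rate * win_ms / 1000)
    hop = int((1 - overlap) * win)
    n_frames = (len(samples) - win) // hop + 1
    return [
        sum(s * s for s in samples[i * hop : i * hop + win]) / win
        for i in range(n_frames)
    ]
```

For a one-second constant signal at a 1 kHz sample rate this yields seven blocks (window of 400 samples, hop of 100), each with energy equal to the squared sample value.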
+function calculateLoudnessFromChannelBlocks(blockEnergy)
+{
+	// Loudness
+	var loudness = new Float32Array(blockEnergy[0].length);
+	for (var i=0; i<blockEnergy[0].length; i++)
+	{
+		var sigma = 0;
+		for (var channel = 0; channel < blockEnergy.length; channel++)
+		{
+			var G = 1.0;
+			if (channel >= 4) {G = 1.41;}
+			sigma += blockEnergy[channel][i]*G;
+		}
+		loudness[i] = -0.691 + 10*Math.log10(sigma);
+	}
+	return loudness;
+}
+function calculateOverallLoudnessFromChannelBlocks(blockEnergy)
+{
+	// Loudness
+	var summation = 0;
+	for (var channel = 0; channel < blockEnergy.length; channel++)
+	{
+		var G = 1.0;
+		if (channel >= 4) {G = 1.41;}
+		var sigma = 0;
+		for (var i=0; i<blockEnergy[0].length; i++)
+		{
+			blockEnergy[channel][i] *= G;
+			sigma += blockEnergy[channel][i];
+		}
+		sigma /= blockEnergy[channel].length; // mean over blocks, not over channels
+		summation+= sigma;
+	}
+	return -0.691 + 10*Math.log10(summation);
+}
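The integrated ('I') branch of calculateLoudness applies the formula -0.691 + 10*log10(energy) per block, drops blocks below an absolute gate of -70 LUFS, then applies a relative gate 8 dB below the absolute-gated loudness. A single-channel Python sketch of that gating logic (illustrative only; the per-channel G weighting is omitted):

```python
import math

ABS_GATE = -70.0  # absolute gate threshold, in LUFS


def block_loudness(energy):
    # Same formula as calculateLoudnessFromChannelBlocks, one channel
    return -0.691 + 10 * math.log10(energy)


def gated_loudness(energies):
    # Stage 1: discard blocks quieter than the -70 LUFS absolute gate
    abs_gated = [e for e in energies if block_loudness(e) >= ABS_GATE]
    ungated = block_loudness(sum(abs_gated) / len(abs_gated))
    # Stage 2: relative gate 8 dB below the absolute-gated loudness
    threshold = ungated - 8.0
    rel_gated = [e for e in abs_gated if block_loudness(e) >= threshold]
    return block_loudness(sum(rel_gated) / len(rel_gated))
```

The effect of the gating is that near-silent blocks (e.g. energy 1e-10, about -100 LUFS) are excluded and do not drag the integrated result down.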
--- a/mushra.css	Mon Dec 21 23:18:43 2015 +0000
+++ b/mushra.css	Wed Dec 23 14:36:00 2015 +0000
@@ -35,6 +35,21 @@
 	background-color: #ddd
 }
 
+div#slider-holder {
+	height: inherit;
+	position: absolute;
+	top: 125px;
+	left: 0px;
+	z-index: 2;
+}
+
+div#scale-holder {
+	height: inherit;
+	position: absolute;
+	top: 125px;
+	left: 0px;
+	z-index: 3;
+}
 
 div.track-slider {
 	float: left;
@@ -43,6 +58,24 @@
 	border-width: 1px;
 	border-color: black;
 	padding:2px;
+	margin-left: 50px;
+}
+
+div.outside-reference {
+	width:120px;
+	padding-left: 55px;
+	margin-left: 100px;
+	height:20px;
+	margin-bottom:5px;
+	background-color: rgb(100,200,100);
+}
+
+div.track-slider-playing {
+	background-color: #FFDDDD;
+}
+
+input.track-slider-range {
+	margin: 2px 0px;
 }
 
 input[type=range][orient=vertical]
@@ -50,6 +83,44 @@
     writing-mode: bt-lr; /* IE */
     -webkit-appearance: slider-vertical; /* WebKit */
     width: 8px;
-    height: 175px;
     padding: 0 5px;
-}
\ No newline at end of file
+    color: rgb(255, 144, 144);
+}
+
+input[type=range]::-webkit-slider-runnable-track {
+	width: 8px;
+	cursor: pointer;
+	background: #fff;
+	border-radius: 4px;
+	border: 1px solid #000;
+}
+
+input[type=range]::-moz-range-track {
+	width: 8px;
+	cursor: pointer;
+	background: #fff;
+	border-radius: 4px;
+	border: 1px solid #000;
+}
+
+input.track-slider-not-moved[type=range]::-webkit-slider-runnable-track {
+	background: #aaa;
+}
+
+input.track-slider-not-moved[type=range]::-moz-range-track {
+	background: #aaa;
+}
+
+
+input[type=range]::-moz-range-thumb {
+	margin-left: -7px;
+	cursor: pointer;
+	margin-top: -1px;
+	box-shadow: 1px 1px 1px #000000, 0px 0px 1px #0d0d0d;
+}
+
+input[type=range]::-webkit-slider-thumb {
+	cursor: pointer;
+	margin-top: -1px;
+	margin-left: -4px;
+}
--- a/mushra.js	Mon Dec 21 23:18:43 2015 +0000
+++ b/mushra.js	Wed Dec 23 14:36:00 2015 +0000
@@ -47,6 +47,7 @@
 	var playback = document.createElement("button");
 	playback.innerHTML = 'Stop';
 	playback.id = 'playback-button';
+	playback.style.float = 'left';
 	// onclick function. Check if it is playing or not, call the correct function in the
 	// audioEngine, change the button text to reflect the next state.
 	playback.onclick = function() {
@@ -62,6 +63,7 @@
 	submit.innerHTML = 'Submit';
 	submit.onclick = buttonSubmitClick;
 	submit.id = 'submit-button';
+	submit.style.float = 'left';
 	// Append the interface buttons into the interfaceButtons object.
 	interfaceButtons.appendChild(playback);
 	interfaceButtons.appendChild(submit);
@@ -69,9 +71,16 @@
 	// Create a slider box
 	var sliderBox = document.createElement('div');
 	sliderBox.style.width = "100%";
-	sliderBox.style.height = window.innerHeight - 180 + 'px';
+	sliderBox.style.height = window.innerHeight - 200 + 12 + 'px';
+	sliderBox.style.marginBottom = '10px';
 	sliderBox.id = 'slider';
-	sliderBox.align = "center";
+	var scaleHolder = document.createElement('div');
+	scaleHolder.id = "scale-holder";
+	sliderBox.appendChild(scaleHolder);
+	var sliderObjectHolder = document.createElement('div');
+	sliderObjectHolder.id = 'slider-holder';
+	sliderObjectHolder.align = "center";
+	sliderBox.appendChild(sliderObjectHolder);
 	
 	// Global parent for the comment boxes on the page
 	var feedbackHolder = document.createElement('div');
@@ -95,20 +104,27 @@
 
 function loadTest(audioHolderObject)
 {
-	// Reset audioEngineContext.Metric globals for new test
-	audioEngineContext.newTestPage();
-	
-	// Delete any previous audioObjects associated with the audioEngine
-	audioEngineContext.audioObjects = [];
-	interfaceContext.deleteCommentBoxes();
-	interfaceContext.deleteCommentQuestions();
-	
 	var id = audioHolderObject.id;
 	
 	var feedbackHolder = document.getElementById('feedbackHolder');
 	var interfaceObj = audioHolderObject.interfaces;
+	if (interfaceObj.length > 1)
+	{
+		console.log("WARNING - This interface only supports one <interface> node per page. Using first interface node");
+	}
+	interfaceObj = interfaceObj[0];
+	if(interfaceObj.title != null)
+	{
+		document.getElementById("pageTitle").textContent = interfaceObj.title;
+	}
 	
-	var sliderBox = document.getElementById('slider');
+	// Delete outside reference
+	var outsideReferenceHolder = document.getElementById('outside-reference');
+	if (outsideReferenceHolder != null) {
+		document.getElementById('interface-buttons').removeChild(outsideReferenceHolder);
+	}
+	
+	var sliderBox = document.getElementById('slider-holder');
 	feedbackHolder.innerHTML = null;
 	sliderBox.innerHTML = null;
 	
@@ -116,20 +132,8 @@
 	if (interfaceObj.commentBoxPrefix != undefined) {
 		commentBoxPrefix = interfaceObj.commentBoxPrefix;
 	}
-	
-	/// CHECK FOR SAMPLE RATE COMPATIBILITY
-	if (audioHolderObject.sampleRate != undefined) {
-		if (Number(audioHolderObject.sampleRate) != audioContext.sampleRate) {
-			var errStr = 'Sample rates do not match! Requested '+Number(audioHolderObject.sampleRate)+', got '+audioContext.sampleRate+'. Please set the sample rate to match before completing this test.';
-			alert(errStr);
-			return;
-		}
-	}
-	
 	var loopPlayback = audioHolderObject.loop;
 	
-	audioEngineContext.loopPlayback = loopPlayback;
-	
 	currentTestHolder = document.createElement('audioHolder');
 	currentTestHolder.id = audioHolderObject.id;
 	currentTestHolder.repeatCount = audioHolderObject.repeatCount;
@@ -144,8 +148,12 @@
 		// Find URL of track
 		// In this jQuery loop, variable 'this' holds the current audioElement.
 		
-		// Now load each audio sample. First create the new track by passing the full URL
-		var trackURL = audioHolderObject.hostURL + element.url;
+		// Check if an outside reference
+		if (index == audioHolderObject.outsideReference)
+		{
+			return;
+		}
+		
 		var audioObject = audioEngineContext.newTrack(element);
 		
 		var node = interfaceContext.createCommentBox(audioObject);
@@ -153,8 +161,14 @@
 		// Create a slider per track
 		audioObject.interfaceDOM = new sliderObject(audioObject);
 		
-		// Distribute it randomnly
-		audioObject.interfaceDOM.slider.value = Math.random();
+		if (typeof audioHolderObject.initialPosition === "number")
+		{
+			// Set the values
+			audioObject.interfaceDOM.slider.value = audioHolderObject.initialPosition;
+		} else {
+			// Distribute it randomly
+			audioObject.interfaceDOM.slider.value = Math.random();
+		}
 		
 		sliderBox.appendChild(audioObject.interfaceDOM.holder);
 		audioObject.metric.initialised(audioObject.interfaceDOM.slider.value);
@@ -165,7 +179,33 @@
 	var numObj = audioHolderObject.audioElements.length;
 	var totalWidth = (numObj-1)*150+100;
 	var diff = (window.innerWidth - totalWidth)/2;
-	audioEngineContext.audioObjects[0].interfaceDOM.holder.style.marginLeft = diff + 'px';
+	sliderBox.style.marginLeft = diff + 'px';
+	
+	// Construct outside reference
+	if (audioHolderObject.outsideReference != null) {
+		var outsideReferenceHolder = document.createElement('div');
+		outsideReferenceHolder.id = 'outside-reference';
+		outsideReferenceHolder.className = 'outside-reference';
+		var outsideReferenceHolderspan = document.createElement('span');
+		outsideReferenceHolderspan.textContent = 'Reference';
+		outsideReferenceHolder.appendChild(outsideReferenceHolderspan);
+		
+		var audioObject = audioEngineContext.newTrack(audioHolderObject.audioElements[audioHolderObject.outsideReference]);
+		
+		outsideReferenceHolder.onclick = function(event)
+		{
+			audioEngineContext.play(audioEngineContext.audioObjects.length-1);
+			$('.track-slider').removeClass('track-slider-playing');
+            $('.comment-div').removeClass('comment-box-playing');
+            if (event.currentTarget.nodeName == 'DIV') {
+            	$(event.currentTarget).addClass('track-slider-playing');
+            } else {
+            	$(event.currentTarget.parentElement).addClass('track-slider-playing');
+            }
+		};
+		
+		document.getElementById('interface-buttons').appendChild(outsideReferenceHolder);
+	}
 }
 
 function sliderObject(audioObject)
@@ -183,7 +223,10 @@
 	this.holder.appendChild(this.slider);
 	this.holder.appendChild(this.play);
 	this.holder.align = "center";
-	this.holder.style.marginLeft = "50px";
+	if (audioObject.id == 0)
+	{
+		this.holder.style.marginLeft = '0px';
+	}
 	this.holder.setAttribute('trackIndex',audioObject.id);
 	
 	this.title.textContent = audioObject.id;
@@ -191,12 +234,11 @@
 	this.title.style.float = "left";
 	
 	this.slider.type = "range";
+	this.slider.className = "track-slider-range track-slider-not-moved";
 	this.slider.min = "0";
 	this.slider.max = "1";
 	this.slider.step = "0.01";
 	this.slider.setAttribute('orient','vertical');
-	this.slider.style.float = "left";
-	this.slider.style.width = "100%";
 	this.slider.style.height = window.innerHeight-250 + 'px';
 	this.slider.onchange = function()
 	{
@@ -204,27 +246,31 @@
 		var id = Number(this.parentNode.getAttribute('trackIndex'));
 		audioEngineContext.audioObjects[id].metric.moved(time,this.value);
 		console.log('slider '+id+' moved to '+this.value+' ('+time+')');
+		$(this).removeClass('track-slider-not-moved');
 	};
 	
-	this.play.textContent = "Play";
+	this.play.textContent = "Loading...";
 	this.play.value = audioObject.id;
 	this.play.style.float = "left";
 	this.play.style.width = "100%";
-	this.play.onclick = function()
+	this.play.disabled = true;
+	this.play.onclick = function(event)
 	{
-		audioEngineContext.play();
-		if (audioEngineContext.audioObjectsReady) {
-			var id = Number(event.srcElement.value);
-			//audioEngineContext.metric.sliderPlayed(id);
-			audioEngineContext.play(id);
+		var id = Number(event.currentTarget.value);
+		//audioEngineContext.metric.sliderPlayed(id);
+		audioEngineContext.play(id);
+		$(".track-slider").removeClass('track-slider-playing');
+		$(event.currentTarget.parentElement).addClass('track-slider-playing');
+		var outsideReference = document.getElementById('outside-reference');
+		if (outsideReference != null) {
+			$(outsideReference).removeClass('track-slider-playing');
 		}
 	};
 	
 	this.enable = function() {
-		if (this.parent.state == 1)
-		{
-			$(this.slider).removeClass('track-slider-disabled');
-		}
+		this.play.disabled = false;
+		this.play.textContent = "Play";
+		$(this.slider).removeClass('track-slider-disabled');
 	};
 	
 	this.exportXMLDOM = function(audioObject) {
@@ -236,6 +282,41 @@
 	this.getValue = function() {
 		return this.slider.value;
 	};
+	
+	this.resize = function(event)
+	{
+		this.holder.style.height = window.innerHeight-200 + 'px';
+		this.slider.style.height = window.innerHeight-250 + 'px';
+	};
+	this.updateLoading = function(progress)
+	{
+		progress = String(progress);
+		progress = progress.substr(0,5);
+		this.play.textContent = "Loading: "+progress+"%";
+	};
+	
+	if (this.parent.state == 1)
+	{
+		this.enable();
+	}
+}
+
+function resizeWindow(event)
+{
+	// Function called when the window has been resized.
+	// MANDATORY FUNCTION
+	
+	// Auto-align
+	var numObj = audioEngineContext.audioObjects.length;
+	var totalWidth = (numObj-1)*150+100;
+	var diff = (window.innerWidth - totalWidth)/2;
+	document.getElementById('slider').style.height = window.innerHeight - 180 + 'px';
+	if (diff <= 0){diff = 0;}
+	document.getElementById('slider-holder').style.marginLeft = diff + 'px';
+	for (var i in audioEngineContext.audioObjects)
+	{
+		audioEngineContext.audioObjects[i].interfaceDOM.resize(event);
+	}
 }
 
 
@@ -247,7 +328,7 @@
 	// Check that the anchor and reference objects are correctly placed
 	if (interfaceContext.checkHiddenAnchor() == false) {return;}
 	if (interfaceContext.checkHiddenReference() == false) {return;}
-	/*
+	
 	for (var i=0; i<checks.length; i++) {
 		if (checks[i].type == 'check')
 		{
@@ -273,17 +354,20 @@
 				var checkState = interfaceContext.checkAllCommented();
 				if (checkState == false) {canContinue = false;}
 				break;
-			case 'scalerange':
+			//case 'scalerange':
 				// Check the scale is used to its full width outlined by the node
-				var checkState = interfaceContext.checkScaleRange();
-				if (checkState == false) {canContinue = false;}
+				//var checkState = interfaceContext.checkScaleRange();
+				//if (checkState == false) {canContinue = false;}
+			//	break;
+			default:
+				console.log("WARNING - Check option "+checks[i].check+" is not supported on this interface");
 				break;
 			}
 
 		}
 		if (!canContinue) {break;}
 	}
-   */
+	
     if (canContinue) {
 	    if (audioEngineContext.status == 1) {
 	        var playback = document.getElementById('playback-button');
@@ -303,32 +387,7 @@
 
 function pageXMLSave(store, testXML)
 {
+	// MANDATORY
 	// Saves a specific test page
-	var xmlDoc = store;
-	// Check if any session wide metrics are enabled
-	
-	var commentShow = testXML.elementComments;
-	
-	var metric = document.createElement('metric');
-	if (audioEngineContext.metric.enableTestTimer)
-	{
-		var testTime = document.createElement('metricResult');
-		testTime.id = 'testTime';
-		testTime.textContent = audioEngineContext.timer.testDuration;
-		metric.appendChild(testTime);
-	}
-	xmlDoc.appendChild(metric);
-	var audioObjects = audioEngineContext.audioObjects;
-	for (var i=0; i<audioObjects.length; i++) 
-	{
-		var audioElement = audioEngineContext.audioObjects[i].exportXMLDOM();
-		audioElement.setAttribute('presentedId',i);
-		xmlDoc.appendChild(audioElement);
-	}
-	
-	$(interfaceContext.commentQuestions).each(function(index,element){
-		var node = element.exportXMLDOM();
-		xmlDoc.appendChild(node);
-	});
-	store = xmlDoc;
+	// You can use this space to add any extra nodes to your XML saves
 }
--- a/pythonServer.py	Mon Dec 21 23:18:43 2015 +0000
+++ b/pythonServer.py	Wed Dec 23 14:36:00 2015 +0000
@@ -26,9 +26,9 @@
 	curSaveIndex += 1;
 	curFileName = 'test-'+str(curSaveIndex)+'.xml'
 
-print "Next save - " + curFileName
 pseudo_index = curSaveIndex % len(pseudo_files)
-print "Next test in pseudo-random queue - " + pseudo_files[pseudo_index]
+
+print 'URL: http://localhost:8000/index.html'
 
 def send404(s):
 	s.send_response(404)
@@ -36,6 +36,8 @@
 	s.end_headers()
 	
 def processFile(s):
+	s.path = s.path.rsplit('?')
+	s.path = s.path[0]
 	s.path = s.path[1:len(s.path)]
 	st = s.path.rsplit(',')
 	lenSt = len(st)
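
The server change above strips any query string before resolving the requested path, so a URL like `/index.html?url=test.xml` still maps to the file on disk. The same idea as a small standalone Python helper (function name is illustrative):

```python
def strip_query(path):
    # Drop everything from the first '?' onward, mirroring the
    # pythonServer.py change: rsplit('?') then keep the first piece.
    return path.rsplit('?')[0]
```

A path without a query string passes through unchanged.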
--- a/scripts/generate_report.py	Mon Dec 21 23:18:43 2015 +0000
+++ b/scripts/generate_report.py	Wed Dec 23 14:36:00 2015 +0000
@@ -11,14 +11,14 @@
 import numpy as np # numbers
 
 # Command line arguments
-assert len(sys.argv)<4, "evaluation_stats takes at most 2 command line argument\n"+\
+assert len(sys.argv)<4, "generate_report takes at most 2 command line arguments\n"+\
                         "Use: python generate_report.py [results_folder] [no_render | -nr]"
 
 render_figures = True
 
 # XML results files location
 if len(sys.argv) == 1:
-    folder_name = "../saves"    # Looks in 'saves/' folder from 'scripts/' folder
+    folder_name = "../saves/"    # Looks in 'saves/' folder from 'scripts/' folder
     print "Use: python generate_report.py [results_folder] [no_render | -nr]"
     print "Using default path: " + folder_name
 elif len(sys.argv) == 2:
@@ -79,6 +79,7 @@
           \geometry{a4paper}
           \usepackage[parfill]{parskip} % empty line instead of indent
           \usepackage{graphicx}    % figures
+          \usepackage[space]{grffile} % include figures with spaces in paths
           \usepackage{hyperref}
           \usepackage{tikz}           % pie charts
           \title{Report}
@@ -87,7 +88,7 @@
           r'''}
           \graphicspath{{'''+\
           folder_name+\
-          r'''/}}
+          r'''}}
           %\setcounter{section}{-1} % Summary section 0 so number of sections equals number of files
           \begin{document}
           \maketitle
@@ -108,18 +109,21 @@
 
 body = ''
 
+# make sure folder_name ends in '/'
+folder_name = os.path.join(folder_name, '')
+
 # generate images for later use
 if render_figures:
-    subprocess.call("python timeline_view_movement.py "+folder_name, shell=True)
-    subprocess.call("python score_parser.py "+folder_name, shell=True)
-    subprocess.call("python score_plot.py "+folder_name, shell=True)
+    subprocess.call("python timeline_view_movement.py '"+folder_name+"'", shell=True)
+    subprocess.call("python score_parser.py '"+folder_name+"'", shell=True)
+    subprocess.call("python score_plot.py '"+folder_name+"ratings/'", shell=True)
 
 # get every XML file in folder
 files_list = os.listdir(folder_name)
 for file in files_list: # iterate over all files in files_list
     if file.endswith(".xml"): # check if XML file
         number_of_XML_files += 1
-        tree = ET.parse(folder_name + '/' + file)
+        tree = ET.parse(folder_name + file)
         root = tree.getroot()
         
         # PRINT name as section
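
The hunk above normalises `folder_name` with `os.path.join(folder_name, '')`, which appends the platform separator only when one is missing, so the later string concatenations like `folder_name + file` stay valid. A quick sketch of that trick (POSIX-style paths assumed):

```python
import os

def ensure_trailing_sep(folder_name):
    # Joining with an empty component adds a trailing separator
    # only if the path does not already end in one.
    return os.path.join(folder_name, '')
```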
@@ -220,10 +224,10 @@
             img_path = 'timelines_movement/'+file[:-4]+'-'+page_name+'.pdf'
             
             # check if available
-            if os.path.isfile(folder_name+'/'+img_path):
+            if os.path.isfile(folder_name+img_path):
                 # SHOW timeline image
                 timeline_plots += '\\includegraphics[width=\\textwidth]{'+\
-                         folder_name+'/'+img_path+'}\n\t\t'
+                         folder_name+img_path+'}\n\t\t'
             
             # keep track of duration in function of page index
             if len(duration_order)>page_number:
@@ -330,7 +334,7 @@
 plt.xlim(.8, len(duration_order)+1)
 plt.xticks(np.arange(1,len(duration_order)+1)+.4, range(1,len(duration_order)+1))
 plt.ylabel('Average time [minutes]')
-plt.savefig(folder_name+"/time_per_page.pdf", bbox_inches='tight')
+plt.savefig(folder_name+"time_per_page.pdf", bbox_inches='tight')
 plt.close()
 #TODO add error bars
 
@@ -372,7 +376,7 @@
 plt.xlim(.8, len(audioholder_names_ordered)+1)
 plt.xticks(np.arange(1,len(audioholder_names_ordered)+1)+.4, audioholder_names_ordered, rotation=90)
 plt.ylabel('Average time [minutes]')
-plt.savefig(folder_name+"/time_per_audioholder.pdf", bbox_inches='tight')
+plt.savefig(folder_name+"time_per_audioholder.pdf", bbox_inches='tight')
 plt.close()
 
 # SHOW bar plot of average time per page
@@ -385,7 +389,7 @@
 ylims = ax.get_ylim()
 yint = np.arange(int(np.floor(ylims[0])), int(np.ceil(ylims[1]))+1)
 plt.yticks(yint)
-plt.savefig(folder_name+"/subjects_per_audioholder.pdf", bbox_inches='tight')
+plt.savefig(folder_name+"subjects_per_audioholder.pdf", bbox_inches='tight')
 plt.close()
 
 # SHOW both figures
@@ -393,7 +397,7 @@
          \begin{figure}[htbp]
          \begin{center}
          \includegraphics[width=.65\textwidth]{'''+\
-         folder_name+"/time_per_page.pdf"+\
+         folder_name+'time_per_page.pdf'+\
         r'''}
         \caption{Average time spent per page.}
         \label{fig:avgtimeperpage}
@@ -404,7 +408,7 @@
 body += r'''\begin{figure}[htbp]
          \begin{center}
          \includegraphics[width=.65\textwidth]{'''+\
-         folder_name+"/time_per_audioholder.pdf"+\
+         folder_name+'time_per_audioholder.pdf'+\
         r'''}
         \caption{Average time spent per audioholder.}
         \label{fig:avgtimeperaudioholder}
@@ -415,7 +419,7 @@
 body += r'''\begin{figure}[htbp]
          \begin{center}
          \includegraphics[width=.65\textwidth]{'''+\
-         folder_name+"/subjects_per_audioholder.pdf"+\
+         folder_name+'subjects_per_audioholder.pdf'+\
         r'''}
         \caption{Number of subjects per audioholder.}
         \label{fig:subjectsperaudioholder}
@@ -430,11 +434,11 @@
 #TODO order in decreasing order of participants
 for audioholder_name in page_names: # get each name
     # plot boxplot if exists (not so for the 'alt' names)
-    if os.path.isfile(folder_name+'/ratings/'+audioholder_name+'-ratings-box.pdf'):
+    if os.path.isfile(folder_name+'ratings/'+audioholder_name+'-ratings-box.pdf'):
         body += r'''\begin{figure}[htbp]
              \begin{center}
              \includegraphics[width=.65\textwidth]{'''+\
-             folder_name+"/ratings/"+audioholder_name+'-ratings-box.pdf'+\
+             folder_name+"ratings/"+audioholder_name+'-ratings-box.pdf'+\
             r'''}
             \caption{Box plot of ratings for audioholder '''+\
             audioholder_name+' ('+str(subject_count[real_page_names.index(audioholder_name)])+\
@@ -505,21 +509,23 @@
 
 texfile = header+body+footer # add bits together
 
+print 'pdflatex -output-directory="'+folder_name+'" "'+ folder_name + 'Report.tex"' # DEBUG
+
 # write TeX file
-with open(folder_name + '/' + 'Report.tex','w') as f:
+with open(folder_name + 'Report.tex','w') as f:
     f.write(texfile)
-proc=subprocess.Popen(shlex.split('pdflatex -output-directory='+folder_name+' '+ folder_name + '/Report.tex'))
+proc=subprocess.Popen(shlex.split('pdflatex -output-directory="'+folder_name+'" "'+ folder_name + 'Report.tex"'))
 proc.communicate()
 # run again
-proc=subprocess.Popen(shlex.split('pdflatex -output-directory='+folder_name+' '+ folder_name + '/Report.tex'))
+proc=subprocess.Popen(shlex.split('pdflatex -output-directory="'+folder_name+'" "'+ folder_name + 'Report.tex"'))
 proc.communicate()
 
 #TODO remove auxiliary LaTeX files
 try:
-    os.remove(folder_name + '/' + 'Report.aux')
-    os.remove(folder_name + '/' + 'Report.log')
-    os.remove(folder_name + '/' + 'Report.out')
-    os.remove(folder_name + '/' + 'Report.toc')
+    os.remove(folder_name + 'Report.aux')
+    os.remove(folder_name + 'Report.log')
+    os.remove(folder_name + 'Report.out')
+    os.remove(folder_name + 'Report.toc')
 except OSError:
     pass
     
\ No newline at end of file
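
The quoting change above wraps both paths in double quotes so that `shlex.split` keeps a folder name containing spaces as a single argument. A sketch of the command construction only (the folder name is illustrative; pdflatex itself is not invoked here):

```python
import shlex

def pdflatex_command(folder_name):
    # Build the argument list the way generate_report.py now does:
    # quote both the output directory and the .tex path so that
    # shlex.split does not break them on embedded spaces.
    cmd = ('pdflatex -output-directory="' + folder_name + '" "'
           + folder_name + 'Report.tex"')
    return shlex.split(cmd)
```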
--- a/scripts/score_parser.py	Mon Dec 21 23:18:43 2015 +0000
+++ b/scripts/score_parser.py	Wed Dec 23 14:36:00 2015 +0000
@@ -97,7 +97,7 @@
                                 filewriter.writerow(row + ['']*len(newfragments))
                     os.rename('temp.csv', file_name) # replace old file with temp file
                     headerrow = headerrow + newfragments
-
+                    
             # if not, create file and make header
             else:
                 headerrow = sorted(fragmentnamelist) # sort alphabetically
--- a/scripts/score_plot.py	Mon Dec 21 23:18:43 2015 +0000
+++ b/scripts/score_plot.py	Wed Dec 23 14:36:00 2015 +0000
@@ -115,7 +115,7 @@
 # CODE
 
 # get every csv file in folder
-for file in os.listdir(rating_folder): # You have to put this in folder where rating csv files are.
+for file in os.listdir(rating_folder):
     if file.endswith(".csv"):
         page_name = file[:-4] # file name (without extension) is page ID
 
--- a/test_create/test_create.html	Mon Dec 21 23:18:43 2015 +0000
+++ b/test_create/test_create.html	Wed Dec 23 14:36:00 2015 +0000
@@ -104,6 +104,38 @@
 					dnd.style.width = "100%";
 					dnd.style.height = "50px";
 					dnd.className = "dragndrop";
+					dnd.ondragover = function(e) {
+						e.stopPropagation();
+						e.preventDefault();
+					};
+					dnd.ondragenter = function(e) {
+						e.stopPropagation();
+						e.preventDefault();
+						this.style.backgroundColor = '#AAFFAA';
+					};
+					dnd.ondragleave = function(e) {
+						e.stopPropagation();
+						e.preventDefault();
+						this.style.backgroundColor = "#FFFFFF";
+					};
+					dnd.ondrop = function(e) {
+						e.stopPropagation();
+						e.preventDefault();
+						
+						var file = e.dataTransfer.files[0];
+						
+						// Uses HTML5 FileAPI - https://w3c.github.io/FileAPI/#filereader-interface
+						var reader = new FileReader();
+						reader.onload = function() {
+							var parse = new DOMParser();
+							var xml = parse.parseFromString(reader.result,'text/xml');
+							specificationNode.decode(xml);
+							popupInstance.hidePopup();
+							SpecificationToHTML();
+						};
+						reader.readAsText(file);
+						
+					};
 					this.popupBody.appendChild(text);
 					this.popupBody.appendChild(dnd);
 					this.showPopup();
@@ -1511,7 +1543,7 @@
 					case 8:
 						this.hidePopup();
 						this.state = 0;
-						SpecficationToHTML();
+						SpecificationToHTML();
 					}
 					this.state++;
 				};
@@ -1585,6 +1617,34 @@
 						}
 					};
 				};
+				
+				this.randomiseOrder = function(input)
+				{
+					// This takes an array of information and randomises the order
+					var N = input.length;
+					
+					var inputSequence = []; // For safety purposes: keep track of randomisation
+					for (var counter = 0; counter < N; ++counter) 
+						inputSequence.push(counter); // Fill array
+					var inputSequenceClone = inputSequence.slice(0);
+					
+					var holdArr = [];
+					var outputSequence = [];
+					for (var n=0; n<N; n++)
+					{
+						// First pick a random number
+						var r = Math.random();
+						// Multiply and floor by the number of elements left
+						r = Math.floor(r*input.length);
+						// Pick out that element and delete from the array
+						holdArr.push(input.splice(r,1)[0]);
+						// Do the same with sequence
+						outputSequence.push(inputSequence.splice(r,1)[0]);
+					}
+					console.log(inputSequenceClone.toString()); // print original array to console
+					console.log(outputSequence.toString()); 	// print randomised array to console
+					return holdArr;
+				};
 				this.projectReturn = null;
 				this.randomiseOrder = null;
 				this.collectMetrics = null;
@@ -1592,7 +1652,7 @@
 				this.audioHolders = [];
 				this.metrics = [];
 				
-				this.decode = function() {
+				this.decode = function(projectXML) {
 					// projectXML - DOM Parsed document
 					this.projectXML = projectXML.childNodes[0];
 					var setupNode = projectXML.getElementsByTagName('setup')[0];
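
The restored `randomiseOrder` above repeatedly `splice`s a random remaining element out of the input while recording which original indices were drawn, so the randomisation can be audited later. An equivalent Python sketch (names are illustrative):

```python
import random

def randomise_order(items):
    # Draw a random remaining element until the pool is empty,
    # popping the same position from a parallel index list so that
    # sequence[k] is the original index of output[k].
    pool = list(items)
    indices = list(range(len(pool)))
    output, sequence = [], []
    while pool:
        r = random.randrange(len(pool))
        output.append(pool.pop(r))
        sequence.append(indices.pop(r))
    return output, sequence
```

Both return values are permutations of their inputs, and the index sequence reconstructs the output ordering.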
@@ -1614,8 +1674,19 @@
 					}
 					var metricCollection = setupNode.getElementsByTagName('Metric');
 					
-					this.preTest = new this.prepostNode('pretest',setupNode.getElementsByTagName('PreTest'));
-					this.postTest = new this.prepostNode('posttest',setupNode.getElementsByTagName('PostTest'));
+					var setupPreTestNode = setupNode.getElementsByTagName('PreTest');
+					if (setupPreTestNode.length != 0)
+					{
+						setupPreTestNode = setupPreTestNode[0];
+						this.preTest.construct(setupPreTestNode);
+					}
+					
+					var setupPostTestNode = setupNode.getElementsByTagName('PostTest');
+					if (setupPostTestNode.length != 0)
+					{
+						setupPostTestNode = setupPostTestNode[0];
+						this.postTest.construct(setupPostTestNode);
+					}
 					
 					if (metricCollection.length > 0) {
 						metricCollection = metricCollection[0].getElementsByTagName('metricEnable');
@@ -1675,11 +1746,13 @@
 					
 					var audioHolders = projectXML.getElementsByTagName('audioHolder');
 					for (var i=0; i<audioHolders.length; i++) {
-						this.audioHolders.push(new this.audioHolderNode(this,audioHolders[i]));
+						var node = new this.audioHolderNode(this);
+						node.decode(this,audioHolders[i]);
+						this.audioHolders.push(node);
 					}
 					
 					// New check if we need to randomise the test order
-					if (this.randomiseOrder)
+					if (this.randomiseOrder && typeof randomiseOrder === "function")
 					{
 				 		this.audioHolders = randomiseOrder(this.audioHolders);
 				 		for (var i=0; i<this.audioHolders.length; i++)
@@ -1718,13 +1791,13 @@
 					var setupPreTest = root.createElement("PreTest");
 					for (var i=0; i<this.preTest.options.length; i++)
 					{
-						setupPreTest.appendChild(this.preTest.options.exportXML(root));
+						setupPreTest.appendChild(this.preTest.options[i].exportXML(root));
 					}
 					
 					var setupPostTest = root.createElement("PostTest");
-					for (var i=0; i<this.preTest.options.length; i++)
+					for (var i=0; i<this.postTest.options.length; i++)
 					{
-						setupPostTest.appendChild(this.postTest.options.exportXML(root));
+						setupPostTest.appendChild(this.postTest.options[i].exportXML(root));
 					}
 					
 					setupNode.appendChild(setupPreTest);
@@ -1734,7 +1807,7 @@
 					var Metric = root.createElement("Metric");
 					for (var i=0; i<this.metrics.length; i++)
 					{
-						var metricEnable = document.createElement("metricEnable");
+						var metricEnable = root.createElement("metricEnable");
 						metricEnable.textContent = this.metrics[i].enabled;
 						Metric.appendChild(metricEnable);
 					}
@@ -1745,7 +1818,7 @@
 					for (var i=0; i<this.commonInterface.options.length; i++)
 					{
 						var CIObj = this.commonInterface.options[i];
-						var CINode = document.createElement(CIObj.type);
+						var CINode = root.createElement(CIObj.type);
 						if (CIObj.type == "check") {CINode.setAttribute("name",CIObj.check);}
 						else {CINode.setAttribute("name",CIObj.name);}
 						CommonInterface.appendChild(CINode);
@@ -1756,85 +1829,8 @@
 					// Time for the <audioHolder> tags
 					for (var ahIndex = 0; ahIndex < this.audioHolders.length; ahIndex++)
 					{
-						var AHObj = this.audioHolders[ahIndex];
-						var AHNode = root.createElement("audioHolder");
-						AHNode.id = AHObj.id;
-						AHNode.setAttribute("hostURL",AHObj.hostURL);
-						AHNode.setAttribute("sampleRate",AHObj.samplerate);
-						AHNode.setAttribute("randomiseOrder",AHObj.randomiseOrder);
-						AHNode.setAttribute("repeatCount",AHObj.repeatCount);
-						AHNode.setAttribute("loop",AHObj.loop);
-						AHNode.setAttribute("elementComments",AHObj.elementComments);
-						
-						// Create <interface> tag
-						for (var i=0; i<AHObj.interfaces.length; i++)
-						{
-							var AHObjInterface = AHObj.interfaces[i];
-							var AHinterface = root.createElement("interface");
-							if (AHObjInterface.title != undefined)
-							{
-								var title = root.createElement("title");
-								title.textContent = AHObjInterface.title;
-								AHinterface.appendChild(title);
-							}
-							for (var j=0; j<AHObjInterface.options.length; j++)
-							{
-								var CIObj = AHObjInterface.options[j];
-								var CINode = root.createElement(CIObj.type);
-								if (CIObj.type == "check") {CINode.setAttribute("name",CIObj.check);}
-								else {CINode.setAttribute("name",CIObj.name);}
-								AHinterface.appendChild(CINode);
-							}
-							if (AHObjInterface.scale != undefined)
-							{
-								for (var j=0; j<AHObjInterface.scale.length; j++)
-								{
-									var CIObj = AHObjInterface.scale[j];
-									var CINode = root.createElement("scale");
-									CINode.setAttribute("position",CIObj[0]);
-									CINode.textContent = CIObj[1];
-									AHinterface.appendChild(CINode);
-								}
-							}
-							AHNode.appendChild(AHinterface);
-						}
-						
-						// Create <audioElements>
-						for (var aeIndex = 0; aeIndex < AHObj.audioElements.length; aeIndex++)
-						{
-							var AEObj = AHObj.audioElements[aeIndex];
-							var AENode = root.createElement("audioElements");
-							AENode.id = AEObj.id;
-							AENode.setAttribute("url",AEObj.url);
-							AENode.setAttribute("type",AEObj.type);
-							if (AEObj.marker != undefined && AEObj.enforce)
-							{
-								AENode.setAttribute("marker",AEObj.marker*100);
-							}
-							AHNode.appendChild(AENode);
-						}
-						
-						// Create <CommentQuestion>
-						for (var i=0; i<AHObj.commentQuestions.length; i++)
-						{
-							AHNode.appendChild(AHObj.commentQuestions[i].exportXML(root));
-						}
-						
-						// Create <PreTest>
-						var AHPreTest = document.createElement("PreTest");
-						for (var i=0; i<AHObj.preTest.options.length; i++)
-						{
-							AHPreTest.appendChild(AHObj.preTest.options.exportXML(root));
-						}
-						
-						var AHPostTest = document.createElement("PostTest");
-						for (var i=0; i<AHObj.preTest.options.length; i++)
-						{
-							AHPostTest.appendChild(AHObj.postTest.options.exportXML(root));
-						}
-						AHNode.appendChild(AHPreTest);
-						AHNode.appendChild(AHPostTest);
-						root.getElementsByTagName("BrowserEvalProjectDocument")[0].appendChild(AHNode);
+						var node = this.audioHolders[ahIndex].encode(root);
+						root.getElementsByTagName("BrowserEvalProjectDocument")[0].appendChild(node);
 					}
 					return root;
 				};
@@ -1847,9 +1843,9 @@
 						
 						this.childOption = function() {
 							this.type = 'option';
-							this.id = element.id;
-							this.name = element.getAttribute('name');
-							this.text = element.textContent;
+							this.id = null;
+							this.name = undefined;
+							this.text = null;
 						};
 						
 						this.type = undefined;
@@ -1863,6 +1859,53 @@
 						this.max = undefined;
 						this.step = undefined;
 						
+						this.decode = function(child)
+						{
+							this.type = child.nodeName;
+							if (child.nodeName == "question") {
+								this.id = child.id;
+								this.mandatory;
+								if (child.getAttribute('mandatory') == "true") {this.mandatory = true;}
+								else {this.mandatory = false;}
+								this.question = child.textContent;
+								if (child.getAttribute('boxsize') == null) {
+									this.boxsize = 'normal';
+								} else {
+									this.boxsize = child.getAttribute('boxsize');
+								}
+							} else if (child.nodeName == "statement") {
+								this.statement = child.textContent;
+							} else if (child.nodeName == "checkbox" || child.nodeName == "radio") {
+								var element = child.firstElementChild;
+								this.id = child.id;
+								if (element == null) {
+									console.log('Malformed ' + child.nodeName + ' entry');
+									this.statement = 'Malformed ' + child.nodeName + ' entry';
+									this.type = 'statement';
+								} else {
+									this.options = [];
+									while (element != null) {
+										if (element.nodeName == 'statement' && this.statement == undefined){
+											this.statement = element.textContent;
+										} else if (element.nodeName == 'option') {
+											var node = new this.childOption();
+											node.id = element.id;
+											node.name = element.getAttribute('name');
+											node.text = element.textContent;
+											this.options.push(node);
+										}
+										element = element.nextElementSibling;
+									}
+								}
+							} else if (child.nodeName == "number") {
+								this.statement = child.textContent;
+								this.id = child.id;
+								this.min = child.getAttribute('min');
+								this.max = child.getAttribute('max');
+								this.step = child.getAttribute('step');
+							}
+						};
+						
 						this.exportXML = function(root)
 						{
 							var node = root.createElement(this.type);
@@ -1880,43 +1923,61 @@
 							case "number":
 								node.id = this.id;
 								node.setAttribute("mandatory",this.mandatory);
-								node.setAttribute("min"), this.min;
-								node.setAttribute("max"), this.max;
-								node.setAttribute("step"), this.step;
+								node.setAttribute("min", this.min);
+								node.setAttribute("max", this.max);
+								node.setAttribute("step", this.step);
+								node.textContent = this.statement;
 								break;
 							case "checkbox":
 								node.id = this.id;
-								var statement = document.createElement("statement");
+								var statement = root.createElement("statement");
 								statement.textContent = this.statement;
 								node.appendChild(statement);
 								for (var i=0; i<this.options.length; i++)
 								{
-									var option = this.option[i];
-									var optionNode = document.createElement("option");
+									var option = this.options[i];
+									var optionNode = root.createElement("option");
 									optionNode.id = option.id;
 									optionNode.textContent = option.text;
-									node.appendChild(option);
+									node.appendChild(optionNode);
 								}
 								break;
 							case "radio":
 								node.id = this.id;
-								var statement = document.createElement("statement");
+								var statement = root.createElement("statement");
 								statement.textContent = this.statement;
 								node.appendChild(statement);
 								for (var i=0; i<this.options.length; i++)
 								{
-									var option = this.option[i];
-									var optionNode = document.createElement("option");
-									optionNode.setAttribute("name") = option.name;
+									var option = this.options[i];
+									var optionNode = root.createElement("option");
+									optionNode.setAttribute("name",option.name);
 									optionNode.textContent = option.text;
-									node.appendChild(option);
+									node.appendChild(optionNode);
 								}
 								break;
 							}
 							return node;
 						};
 					};
+					this.construct = function(Collection)
+					{
+						if (Collection.childElementCount != 0) {
+							var child = Collection.firstElementChild;
+							var node = new this.OptionNode();
+							node.decode(child);
+							this.options.push(node);
+							while (child.nextElementSibling != null) {
+								child = child.nextElementSibling;
+								node = new this.OptionNode();
+								node.decode(child);
+								this.options.push(node);
+							}
+						}
+					};
 				};
+				this.preTest = new this.prepostNode("pretest");
+				this.postTest = new this.prepostNode("posttest");
 				
 				this.metricNode = function(name) {
 					this.enabled = name;
@@ -1931,38 +1992,257 @@
 					this.randomiseOrder = undefined;
 					this.loop = undefined;
 					this.elementComments = undefined;
-					this.preTest = new parent.prepostNode('pretest');
-					this.postTest = new parent.prepostNode('posttest');
+					this.outsideReference = null;
+					this.preTest = new parent.prepostNode("pretest");
+					this.postTest = new parent.prepostNode("posttest");
 					this.interfaces = [];
 					this.commentBoxPrefix = "Comment on track";
 					this.audioElements = [];
 					this.commentQuestions = [];
 					
-					this.interfaceNode = function(DOM) {
-						var title = DOM.getElementsByTagName('title');
-						if (title.length == 0) {this.title = null;}
-						else {this.title = title[0].textContent;}
-						this.options = parent.commonInterface.options;
-						var scale = DOM.getElementsByTagName('scale');
-						this.scale = [];
-						for (var i=0; i<scale.length; i++) {
-							var arr = [null, null];
-							arr[0] = scale[i].getAttribute('position');
-							arr[1] = scale[i].textContent;
-							this.scale.push(arr);
+					this.decode = function(parent,xml)
+					{
+						this.presentedId = parent.audioHolders.length;
+						this.id = xml.id;
+						this.hostURL = xml.getAttribute('hostURL');
+						this.sampleRate = xml.getAttribute('sampleRate');
+						if (xml.getAttribute('randomiseOrder') == "true") {this.randomiseOrder = true;}
+						else {this.randomiseOrder = false;}
+						this.repeatCount = xml.getAttribute('repeatCount');
+						if (xml.getAttribute('loop') == 'true') {this.loop = true;}
+						else {this.loop = false;}
+						if (xml.getAttribute('elementComments') == "true") {this.elementComments = true;}
+						else {this.elementComments = false;}
+						
+						var setupPreTestNode = xml.getElementsByTagName('PreTest');
+						if (setupPreTestNode.length != 0)
+						{
+							setupPreTestNode = setupPreTestNode[0];
+							this.preTest.construct(setupPreTestNode);
+						}
+						
+						var setupPostTestNode = xml.getElementsByTagName('PostTest');
+						if (setupPostTestNode.length != 0)
+						{
+							setupPostTestNode = setupPostTestNode[0];
+							this.postTest.construct(setupPostTestNode);
+						}
+						
+						var interfaceDOM = xml.getElementsByTagName('interface');
+						for (var i=0; i<interfaceDOM.length; i++) {
+							var node = new this.interfaceNode();
+							node.decode(interfaceDOM[i]);
+							this.interfaces.push(node);
+						}
+						var commentBoxPrefixDOM = xml.getElementsByTagName('commentBoxPrefix');
+						if (commentBoxPrefixDOM.length != 0) {
+							this.commentBoxPrefix = commentBoxPrefixDOM[0].textContent;
+						} else {
+							this.commentBoxPrefix = "Comment on track";
+						}
+						var audioElementsDOM = xml.getElementsByTagName('audioElements');
+						for (var i=0; i<audioElementsDOM.length; i++) {
+							var node = new this.audioElementNode();
+							node.decode(this,audioElementsDOM[i]);
+							if (audioElementsDOM[i].getAttribute('type') == 'outsidereference') {
+								if (this.outsideReference == null) {
+									this.outsideReference = node;
+								} else {
+									console.log('Error: only one audioElement can be of type outsidereference per audioHolder');
+									this.audioElements.push(node);
+									console.log('Element id '+audioElementsDOM[i].id+' made into normal node');
+								}
+							} else {
+								this.audioElements.push(node);
+							}
+						}
+						
+						if (this.randomiseOrder == true && typeof randomiseOrder === "function")
+						{
+							this.audioElements = randomiseOrder(this.audioElements);
+						}
+						
+						var commentQuestionsDOM = xml.getElementsByTagName('CommentQuestion');
+						for (var i=0; i<commentQuestionsDOM.length; i++) {
+							var node = new this.commentQuestionNode();
+							node.decode(commentQuestionsDOM[i]);
+							this.commentQuestions.push(node);
 						}
 					};
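Several branches in the decode() above turn string attributes into booleans by comparing against the literal string "true". A compact equivalent of that pattern (the helper name `boolAttr` and the stand-in element are illustrative, not part of core.js):

```javascript
// Illustrative helper (not in core.js): the boolean-attribute pattern used
// throughout decode(), where anything other than the string "true" is false.
function boolAttr(xml, name) {
    return xml.getAttribute(name) == "true";
}

// A stand-in for a DOM element, just enough to exercise getAttribute():
var fakeElement = {
    getAttribute: function (name) {
        return name == "loop" ? "true" : null;   // absent attributes return null
    }
};
var loop = boolAttr(fakeElement, "loop");                  // true
var randomise = boolAttr(fakeElement, "randomiseOrder");   // false
```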
 					
-					this.audioElementNode = function(parent,audioObject) {
-						this.url = audioObject.file.name;
-						this.id = audioObject.id;
-						this.parent = parent;
+					this.encode = function(root)
+					{
+						var AHNode = root.createElement("audioHolder");
+						AHNode.id = this.id;
+						AHNode.setAttribute("hostURL",this.hostURL);
+						AHNode.setAttribute("sampleRate",this.sampleRate);
+						AHNode.setAttribute("randomiseOrder",this.randomiseOrder);
+						AHNode.setAttribute("repeatCount",this.repeatCount);
+						AHNode.setAttribute("loop",this.loop);
+						AHNode.setAttribute("elementComments",this.elementComments);
+						
+						for (var i=0; i<this.interfaces.length; i++)
+						{
+							AHNode.appendChild(this.interfaces[i].encode(root));
+						}
+						
+						for (var i=0; i<this.audioElements.length; i++) {
+							AHNode.appendChild(this.audioElements[i].encode(root));
+						}
+						// Create <CommentQuestion>
+						for (var i=0; i<this.commentQuestions.length; i++)
+						{
+							AHNode.appendChild(this.commentQuestions[i].exportXML(root));
+						}
+						
+						// Create <PreTest>
+						var AHPreTest = root.createElement("PreTest");
+						for (var i=0; i<this.preTest.options.length; i++)
+						{
+							AHPreTest.appendChild(this.preTest.options[i].exportXML(root));
+						}
+						
+						var AHPostTest = root.createElement("PostTest");
+						for (var i=0; i<this.postTest.options.length; i++)
+						{
+							AHPostTest.appendChild(this.postTest.options[i].exportXML(root));
+						}
+						AHNode.appendChild(AHPreTest);
+						AHNode.appendChild(AHPostTest);
+						return AHNode;
+					};
+					
+					this.interfaceNode = function() {
+						this.title = undefined;
+						this.options = [];
+						this.scale = [];
+						this.name = undefined;
+						this.decode = function(DOM)
+						{
+							var title = DOM.getElementsByTagName('title');
+							if (title.length == 0) {this.title = null;}
+							else {this.title = title[0].textContent;}
+							var name = DOM.getAttribute("name");
+							if (name != undefined) {this.name = name;}
+							this.options = parent.commonInterface.options;
+							var scale = DOM.getElementsByTagName('scale');
+							this.scale = [];
+							for (var i=0; i<scale.length; i++) {
+								var arr = [null, null];
+								arr[0] = scale[i].getAttribute('position');
+								arr[1] = scale[i].textContent;
+								this.scale.push(arr);
+							}
+						};
+						this.encode = function(root)
+						{
+							var node = root.createElement("interface");
+							if (this.title != undefined)
+							{
+								var title = root.createElement("title");
+								title.textContent = this.title;
+								node.appendChild(title);
+							}
+							for (var i=0; i<this.options.length; i++)
+							{
+								var optionNode = root.createElement(this.options[i].type);
+								if (this.options[i].type == "option")
+								{
+									optionNode.setAttribute("name",this.options[i].name);
+								} else if (this.options[i].type == "check") {
+									optionNode.setAttribute("check",this.options[i].check);
+								} else if (this.options[i].type == "scalerange") {
+									optionNode.setAttribute("min",this.options[i].min*100);
+									optionNode.setAttribute("max",this.options[i].max*100);
+								}
+								node.appendChild(optionNode);
+							}
+							for (var i=0; i<this.scale.length; i++) {
+								var scale = root.createElement("scale");
+								scale.setAttribute("position",this.scale[i][0]);
+								scale.textContent = this.scale[i][1];
+								node.appendChild(scale);
+							}
+							return node;
+						};
+					};
+					
+					this.audioElementNode = function() {
+						this.url = null;
+						this.id = null;
+						this.parent = null;
 						this.type = "normal";
-						
-						this.marker = undefined;
+						this.marker = false;
+						this.enforce = false;
+						this.gain = 1.0;
+						this.decode = function(parent,xml)
+						{
+							this.url = xml.getAttribute('url');
+							this.id = xml.id;
+							this.parent = parent;
+							this.type = xml.getAttribute('type');
+							var gain = xml.getAttribute('gain');
+							if (isNaN(gain) == false && gain != null)
+							{
+								this.gain = decibelToLinear(Number(gain));
+							}
+							if (this.type == null) {this.type = "normal";}
+							if (this.type == 'anchor') {this.anchor = true;}
+							else {this.anchor = false;}
+							if (this.type == 'reference') {this.reference = true;}
+							else {this.reference = false;}
+							if (this.anchor == true || this.reference == true)
+							{
+								this.marker = xml.getAttribute('marker');
+								if (this.marker != undefined)
+								{
+									this.marker = Number(this.marker);
+									if (isNaN(this.marker) == false)
+									{
+										if (this.marker > 1)
+										{	this.marker /= 100.0;}
+										if (this.marker >= 0 && this.marker <= 1)
+										{
+											this.enforce = true;
+											return;
+										} else {
+											console.log("ERROR - Marker of audioElement "+this.id+" is not between 0 and 1 (float) or 0 and 100 (integer)!");
+											console.log("ERROR - Marker not enforced!");
+										}
+									} else {
+										console.log("ERROR - Marker of audioElement "+this.id+" is not a number!");
+										console.log("ERROR - Marker not enforced!");
+									}
+								}
+							}
+						};
+						this.encode = function(root)
+						{
+							var AENode = root.createElement("audioElements");
+							AENode.id = this.id;
+							AENode.setAttribute("url",this.url);
+							AENode.setAttribute("type",this.type);
+							AENode.setAttribute("gain",linearToDecibel(this.gain));
+							if (this.marker != false)
+							{
+								AENode.setAttribute("marker",this.marker*100);
+							}
+							return AENode;
+						};
 					};
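The marker handling in audioElementNode.decode() above accepts either a float in [0, 1] or an integer percentage in (1, 100]; anything else is rejected and not enforced. A standalone sketch of that rule (the helper name `normaliseMarker` is illustrative, not part of core.js):

```javascript
// Illustrative helper (not in core.js): mirrors the marker rule in
// audioElementNode.decode(). Returns the normalised marker in [0, 1],
// or null where decode() would log an error and leave the marker unenforced.
function normaliseMarker(raw) {
    var marker = Number(raw);
    if (isNaN(marker)) { return null; }      // "marker" attribute not numeric
    if (marker > 1) { marker /= 100.0; }     // integer percentage, e.g. "85" -> 0.85
    if (marker >= 0 && marker <= 1) { return marker; }
    return null;                             // e.g. "150" or "-5": out of range
}
```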
 					
 					this.commentQuestionNode = function(xml) {
+						this.id = null;
+						this.type = undefined;
+						this.question = undefined;
+						this.options = [];
+						this.statement = undefined;
+						
+						this.childOption = function() {
+							this.type = 'option';
+							this.name = null;
+							this.text = null;
+						};
 						this.exportXML = function(root)
 						{
 							var CQNode = root.createElement("CommentQuestion");
@@ -1973,13 +2253,25 @@
 							case "text":
 								CQNode.textContent = this.question;
 								break;
-							case "radio" || "checkbox":
-								var statement = document.createElement("statement");
+							case "radio":
+								var statement = root.createElement("statement");
 								statement.textContent = this.statement;
 								CQNode.appendChild(statement);
 								for (var i=0; i<this.options.length; i++)
 								{
-									var optionNode = document.createElement("option");
+									var optionNode = root.createElement("option");
+									optionNode.setAttribute("name",this.options[i].name);
+									optionNode.textContent = this.options[i].text;
+									CQNode.appendChild(optionNode);
+								}
+								break;
+							case "checkbox":
+								var statement = root.createElement("statement");
+								statement.textContent = this.statement;
+								CQNode.appendChild(statement);
+								for (var i=0; i<this.options.length; i++)
+								{
+									var optionNode = root.createElement("option");
 									optionNode.setAttribute("name",this.options[i].name);
 									optionNode.textContent = this.options[i].text;
 									CQNode.appendChild(optionNode);
@@ -1988,52 +2280,62 @@
 							}
 							return CQNode;
 						};
-						this.childOption = function(element) {
-							this.type = 'option';
-							this.name = element.getAttribute('name');
-							this.text = element.textContent;
+						this.decode = function(xml) {
+							this.id = xml.id;
+							if (xml.getAttribute('mandatory') == 'true') {this.mandatory = true;}
+							else {this.mandatory = false;}
+							this.type = xml.getAttribute('type');
+							if (this.type == undefined) {this.type = 'text';}
+							switch (this.type) {
+							case 'text':
+								this.question = xml.textContent;
+								break;
+							case 'radio':
+								var child = xml.firstElementChild;
+								this.options = [];
+								while (child != undefined) {
+									if (child.nodeName == 'statement' && this.statement == undefined) {
+										this.statement = child.textContent;
+									} else if (child.nodeName == 'option') {
+										var node = new this.childOption();
+										node.name = child.getAttribute('name');
+										node.text = child.textContent;
+										this.options.push(node);
+									}
+									child = child.nextElementSibling;
+								}
+								break;
+							case 'checkbox':
+								var child = xml.firstElementChild;
+								this.options = [];
+								while (child != undefined) {
+									if (child.nodeName == 'statement' && this.statement == undefined) {
+										this.statement = child.textContent;
+									} else if (child.nodeName == 'option') {
+										var node = new this.childOption();
+										node.name = child.getAttribute('name');
+										node.text = child.textContent;
+										this.options.push(node);
+									}
+									child = child.nextElementSibling;
+								}
+								break;
+							}
 						};
-						this.id = xml.id;
-						if (xml.getAttribute('mandatory') == 'true') {this.mandatory = true;}
-						else {this.mandatory = false;}
-						this.type = xml.getAttribute('type');
-						if (this.type == undefined) {this.type = 'text';}
-						switch (this.type) {
-						case 'text':
-							this.question = xml.textContent;
-							break;
-						case 'radio':
-							var child = xml.firstElementChild;
-							this.options = [];
-							while (child != undefined) {
-								if (child.nodeName == 'statement' && this.statement == undefined) {
-									this.statement = child.textContent;
-								} else if (child.nodeName == 'option') {
-									this.options.push(new this.childOption(child));
-								}
-								child = child.nextElementSibling;
-							}
-							break;
-						case 'checkbox':
-							var child = xml.firstElementChild;
-							this.options = [];
-							while (child != undefined) {
-								if (child.nodeName == 'statement' && this.statement == undefined) {
-									this.statement = child.textContent;
-								} else if (child.nodeName == 'option') {
-									this.options.push(new this.childOption(child));
-								}
-								child = child.nextElementSibling;
-							}
-							break;
-						}
 					};
 				};
-				
-				this.preTest = new this.prepostNode("pretest");
-				this.postTest = new this.prepostNode("posttest");
 			}
 			
+			function linearToDecibel(gain)
+			{
+				return 20.0*Math.log10(gain);
+			}
+			
+			function decibelToLinear(gain)
+			{
+				return Math.pow(10,gain/20.0);
+			}
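The two gain helpers added above are inverses of each other; a minimal round-trip sketch (the formulas are copied verbatim from this changeset):

```javascript
// Copies of the gain helpers added in this changeset, shown standalone.
function linearToDecibel(gain) {
    return 20.0 * Math.log10(gain);
}

function decibelToLinear(gain) {
    return Math.pow(10, gain / 20.0);
}

// Unity gain maps to 0 dB, and halving the amplitude is about -6.02 dB;
// converting there and back recovers the original linear gain.
var unity = linearToDecibel(1.0);                        // 0
var half = linearToDecibel(0.5);                         // about -6.02
var roundTrip = decibelToLinear(linearToDecibel(0.25));  // 0.25 (within float error)
```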
+		
 			function createDeleteNodeButton(node)
 			{
 				var button = document.createElement("button");
@@ -2046,7 +2348,7 @@
 				return button;
 			}
 			
-			function SpecficationToHTML()
+			function SpecificationToHTML()
 			{
 				// Take information from Specification Node and format it into an HTML layout
 				var destination = document.getElementById("content");
@@ -2479,16 +2781,16 @@
 					var title = document.createElement("h3");
 					title.textContent = "Audio Elements";
 					audioElems.appendChild(title);
-					for (var i=0; i<aH.audioElements.length; i++)
+					for (var j=0; j<aH.audioElements.length; j++)
 					{
 						var entry = document.createElement("div");
 						entry.className = "SecondLevel";
-						entry.id = audioElems.id+"-"+aH.audioElements[i].id;
+						entry.id = audioElems.id+"-"+aH.audioElements[j].id;
 						var text = document.createElement("span");
 						text.textContent = "ID:";
 						var input = document.createElement("input");
 						input.id = entry.id+"-id";
-						input.value = aH.audioElements[i].id;
+						input.value = aH.audioElements[j].id;
 						input.onchange = function() {
 							var IDSplit = event.currentTarget.id.split("-");
 							var ahNode = specificationNode.audioHolders[IDSplit[1]];
@@ -2502,7 +2804,7 @@
 						text.textContent = "URL:";
 						input = document.createElement("input");
 						input.id = entry.id+"-URL";
-						input.value = aH.audioElements[i].url;
+						input.value = aH.audioElements[j].url;
 						input.onchange = function() {
 							var IDSplit = event.currentTarget.id.split("-");
 							var ahNode = specificationNode.audioHolders[IDSplit[1]];
@@ -2512,6 +2814,30 @@
 						input.style.margin = "5px";
 						entry.appendChild(text);
 						entry.appendChild(input);
+						text = document.createElement("span");
+						text.textContent = "Gain:";
+						input = document.createElement("input");
+						input.type = "range";
+						input.id = entry.id+"-gain";
+						input.value = linearToDecibel(aH.audioElements[j].gain);
+						input.min = -25;
+						input.max = 6;
+						input.step = 1;
+						input.onchange = function() {
+							var IDSplit = event.currentTarget.id.split("-");
+							var ahNode = specificationNode.audioHolders[IDSplit[1]];
+							ahNode.audioElements[IDSplit[3]].gain = decibelToLinear(Number(event.currentTarget.value));
+							var textRet = document.getElementById(event.currentTarget.id+"-ret");
+							textRet.textContent = String(event.currentTarget.value)+"dB";
+						};
+						var textRet = document.createElement("span");
+						textRet.textContent = String(linearToDecibel(aH.audioElements[j].gain))+"dB";
+						textRet.id = entry.id+"-gain-ret";
+						text.style.margin = "5px";
+						input.style.margin = "5px";
+						entry.appendChild(text);
+						entry.appendChild(input);
+						entry.appendChild(textRet);
 						var button_delete = document.createElement("button");
 						button_delete.textContent = "Delete";
 						button_delete.onclick = function() {