comparison docs/SMC15/smc2015template.tex @ 984:9f51f5c015a3

Paper: Changed links into footnotes instead of references.
author Brecht De Man <BrechtDeMan@users.noreply.github.com>
date Fri, 24 Apr 2015 18:53:24 +0100
parents 652d4c865ac8
children d610dec1aa2e

\section{Introduction}\label{sec:introduction}

%NICK: examples of what kind of audio applications HTML5 has made possible, with references to publications (or website)\\

Perceptual evaluation of audio plays an important role in a wide range of research including audio quality \cite{schoeffler2013impact,repp}, sound synthesis \cite{de2013real,durr2015implementation}, audio effect design \cite{deman2014a}, source separation \cite{uhlereiss}, music and emotion analysis \cite{song2013b,song2013a}, and many others \cite{friberg2011comparison}. % codec design?

The Web Audio API is a high-level JavaScript Application Programming Interface (API) designed for real-time processing of audio inside the browser through various processing nodes\footnote{http://webaudio.github.io/web-audio-api/}. Various websites have used the Web Audio API for creative purposes, such as drum machines and score creation tools\footnote{http://webaudio.github.io/demo-list/};
others in the same list process audio captured in real time from the system microphone, such as room reverberation tools and a phase vocoder. The BBC Radiophonic Workshop demonstrates effects used on famous TV shows such as Doctor Who, simulated inside the browser\footnote{http://webaudio.prototyping.bbc.co.uk/}.
Another example is the BBC R\&D personalised compressor, which applies a dynamic range compressor to a radio stream and dynamically adjusts the compressor settings to match the listener's environment \cite{mason2015compression}.

This work is based in part on the APE audio perceptual evaluation interface for MATLAB \cite{deman2014b}. An important drawback of that toolbox is that MATLAB is needed both to create and to run a test (barring the use of an executable generated by MATLAB), and its limited compatibility with both earlier and newer versions of MATLAB makes it hard to maintain. A web application, on the other hand, has the advantage of running in most browsers on most platforms, and we present a tool that requires no specialised software or even programming knowledge to set up.

% IMPORTANT
\item \texttt{index.html}: The main index file that loads the scripts; this is the file the browser must request.
\item \texttt{core.js}: Contains functions and objects defining the audio objects and the audio playback engine, and for loading media files.
\item \texttt{ape.js}: Parses the set up files to create the interface as instructed, following the same style chain as the MATLAB APE Tool \cite{deman2014b}.
\end{itemize}

The HTML file loads \texttt{core.js} along with a few other ancillary files (such as the jQuery JavaScript library\footnote{http://jquery.com/}), at which point the browser begins to execute the on-page JavaScript instructions, which give the URL of the test set up XML document (outlined in Section \ref{sec:setupresultsformats}). \texttt{core.js} parses this document and executes the functions in \texttt{ape.js} to build the web page with the given audio files. The reason for separating these two files is to allow further interface designs (such as MUSHRA \cite{mushra} or AB tests \cite{bech}) to be added, which would still require the same underlying core functions outlined in \texttt{core.js}; see also Section \ref{sec:interface}.
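
This loading flow can be sketched as follows (an illustrative sketch only, not the toolbox's literal code; the way the project URL is embedded in the page and the variable names are hypothetical, while \textit{loadInterface(xmlDoc)} is the interface entry point described below):
\begin{verbatim}
// core.js (sketch): fetch the set up XML named on the page and
// hand it to the interface script (ape.js).
var projectURL =
    document.getElementById('project-url').textContent; // hypothetical
var request = new XMLHttpRequest();
request.open('GET', projectURL, true);
request.onload = function () {
    var xmlDoc = new DOMParser()
        .parseFromString(request.responseText, 'text/xml');
    loadInterface(xmlDoc); // entry point defined in ape.js
};
request.send();
\end{verbatim}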

The \texttt{ape.js} file has several main functions, the most important of which are \textit{loadInterface(xmlDoc)}, \textit{loadTest(id)}, \textit{pageXMLSave(testId)} and \textit{interfaceXMLSave()}. \textit{loadInterface(xmlDoc)} is called to decode the supplied project document with respect to the specified interface and to define any global structures (such as the slider interface). It also identifies the number of pages in the test and randomises their order, if specified to do so. This is the only mandatory function in any interface file, as it is called by \texttt{core.js} when the document is ready. The design is such that \texttt{core.js} cannot `see' any interface-specific functions and therefore cannot assume any are available, so \textit{loadInterface(xmlDoc)} is essential to set up the entire test environment. The interface files, however, can `see' \texttt{core.js} and can therefore not only interact with it, but also modify it.
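
A skeleton of \textit{loadInterface(xmlDoc)} might look as follows; this is a sketch under assumed element and attribute names (\texttt{audioHolder}, \texttt{randomiseOrder}) and an assumed helper \textit{createSliderInterface()}, not the actual implementation:
\begin{verbatim}
// ape.js (sketch): entry point called by core.js once the
// project document is ready.
function loadInterface(xmlDoc) {
    // Collect the test pages defined in the set up document
    // ('audioHolder' is a hypothetical element name).
    var pages = Array.prototype.slice.call(
        xmlDoc.getElementsByTagName('audioHolder'));
    // Shuffle the page order if the document requests it
    // ('randomiseOrder' is likewise hypothetical).
    var root = xmlDoc.documentElement;
    if (root.getAttribute('randomiseOrder') === 'true') {
        for (var i = pages.length - 1; i > 0; i--) {
            var j = Math.floor(Math.random() * (i + 1));
            var tmp = pages[i]; pages[i] = pages[j]; pages[j] = tmp;
        }
    }
    createSliderInterface(); // hypothetical global set up helper
    loadTest(0);             // load the first test page
}
\end{verbatim}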

Each test page is loaded using \textit{loadTest(id)}, which performs two major tasks: firstly, to populate the interface with the slider elements and comment boxes; and secondly, to instruct the \textit{audioEngine} to load the audio fragments and construct the backend audio graph. The markers on the slider at the top of the page are positioned randomly, to minimise the bias that may be introduced when the initial positions are near the beginning, end or middle of the slider. Another approach is to place the markers outside the slider bar at first and have the subject drag them in, but the authors believe this does not encourage careful consideration and comparison of the different fragments, as the implicit goal of the test becomes to audition and drag in each fragment just once, rather than to compare all fragments rigorously.
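
For illustration, the random initial placement of the markers could be implemented along these lines (a sketch; the marker elements and slider width are assumed to be supplied by the interface code):
\begin{verbatim}
// Sketch: give each fragment marker a random initial position on
// the slider, so no ordering is suggested to the subject.
function randomiseMarkerPositions(markers, sliderWidth) {
    markers.forEach(function (marker) {
        // Offset in pixels from the left edge of the slider bar.
        marker.style.left =
            Math.floor(Math.random() * sliderWidth) + 'px';
    });
}
\end{verbatim}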

When an \textit{audioObject} is created, it is given the URL of the audio sample to load. The sample is downloaded into the browser asynchronously using the \textit{XMLHttpRequest} object, which can fetch any file into the JavaScript environment for further processing. This is particularly useful for the Web Audio API because files can be downloaded in their binary form for decoding. Once downloaded, the file is decoded using the Web Audio API offline decoder, which uses the decoding schemes available in the browser to decode the audio files into raw float32 arrays; these are in turn passed to the relevant \textit{audioObject} for playback.
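
A condensed sketch of this download-and-decode step, using the standard \textit{XMLHttpRequest} and Web Audio API calls (the surrounding \textit{audioObject} structure is assumed):
\begin{verbatim}
// Sketch: fetch an audio file as binary data, decode it with the
// Web Audio API and hand the buffer to the audioObject.
function loadSample(audioObject, url, audioContext) {
    var request = new XMLHttpRequest();
    request.open('GET', url, true);
    request.responseType = 'arraybuffer'; // binary download
    request.onload = function () {
        // Decode with whichever codecs the browser supports; the
        // result holds the raw float32 sample data.
        audioContext.decodeAudioData(request.response,
            function (buffer) {
                audioObject.buffer = buffer; // ready for playback
            });
    };
    request.send();
}
\end{verbatim}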

Once each page of the test is completed, indicated by pressing the Submit button, \textit{pageXMLSave(testId)} is called to store all of the collected data until all pages of the test are completed. After the final test page and any post-test questions are completed, the \textit{interfaceXMLSave()} function is called. This function generates the final XML file for submission, as outlined in Section \ref{sec:setupresultsformats}.
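
The result document could be assembled with the browser's standard XML facilities, for instance as in the following sketch (the element names and stored fields are hypothetical and do not reflect the toolbox's actual result schema):
\begin{verbatim}
// Sketch: build a result document in memory and serialise it to a
// string for submission once all pages are complete.
function buildResultXML(ratings) {
    var doc = document.implementation
        .createDocument(null, 'evaluationresult', null);
    ratings.forEach(function (r) {
        var el = doc.createElement('audioelement'); // hypothetical
        el.setAttribute('id', r.id);
        el.setAttribute('value', String(r.value)); // slider position
        doc.documentElement.appendChild(el);
    });
    return new XMLSerializer().serializeToString(doc);
}
\end{verbatim}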

Browsers support various audio file formats, and no single format is supported consistently across all of them. Currently the Web Audio API is best supported in Chrome, Firefox, Opera and Safari, all of which support the uncompressed WAV format. Although WAV is not a compact, web-friendly format, most transport systems have a high enough bandwidth that this should not be a problem. Ogg Vorbis is another well supported format across the four major desktop browsers, as is MP3 (although Firefox may not support all MP3 types)\footnote{https://developer.mozilla.org/en-US/docs/Web/HTML/\\Supported\_media\_formats}.
One issue with the Web Audio API is that the sample rate is assigned by the system sound device rather than requested by the application, and there is no ability to request a different one.
Therefore, the default behaviour when an audio file is loaded with a sample rate different from that of the system is to convert the sample rate. To provide a check for this, the desired sample rate can be supplied in the set up XML and checked against the system. If the sample rates do not match, a browser alert window is shown asking for the sample rate to be adjusted correctly.
As this check happens before any loading or decoding of audio files, the system only fetches the files once the system's sample rate meets the requirement, avoiding requests for large files until they are actually needed.
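
The sample rate check described above can be as simple as the following sketch, where \textit{desiredSampleRate} stands for the value read from the set up XML (the attribute name and surrounding logic are assumptions):
\begin{verbatim}
// Sketch: compare the system sample rate (fixed by the sound
// device) with the rate requested in the set up XML before any
// audio files are fetched.
function checkSampleRate(audioContext, desiredSampleRate) {
    if (desiredSampleRate &&
        audioContext.sampleRate !== desiredSampleRate) {
        alert('Please set your audio device to ' +
              desiredSampleRate + ' Hz (currently ' +
              audioContext.sampleRate + ' Hz).');
        return false; // caller should postpone loading audio
    }
    return true;
}
\end{verbatim}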

%During playback, the playback nodes loop indefinitely until playback is stopped. The gain nodes in the \textit{audioObject}s enable dynamic muting of nodes. When a bar in the sliding ranking is clicked, the audio engine mutes all \textit{audioObject}s and un-mutes the clicked one. Therefore, if the audio samples are perfectly aligned up and of the same sample length, they will remain perfectly aligned with each other.

\section{Input and result files}\label{sec:setupresultsformats}

The set up and result files both use the common XML document format to outline the various parameters. The set up file determines which interface to use, the location of the audio files, the number of pages and other general rules defining the testing environment. Having one document to modify allows quick manipulation in a `human readable' form to create new tests, or adjust current ones, without needing to edit multiple web files. An example of this XML document is presented in Figure~\ref{fig:xmlIn}. % I mean the .js and .html files, though not sure if any better.

\subsection{Set up and configurability}

\begin{figure}[ht]
\begin{center}
\includegraphics[width=0.5\textwidth]{XMLInput.png}
\caption{An Example Input XML File}