changeset 65:9d7e5f690f28

Merged.
author samer
date Sat, 17 Mar 2012 01:03:15 +0000
parents a18a4b0517e8 (diff) 2994e5e485e7 (current diff)
children 6d67c0c11b2b
files draft.tex
diffstat 2 files changed, 52 insertions(+), 20 deletions(-)
Binary file draft.pdf has changed
--- a/draft.tex	Sat Mar 17 00:04:51 2012 +0000
+++ b/draft.tex	Sat Mar 17 01:03:15 2012 +0000
@@ -749,18 +749,49 @@
       }
     \end{fig}
 
-    \subsection{Audio based content analysis}
-     Using analogous definitions of differential entropy, the methods outlined
-     in the previous section are equally applicable to continuous random variables.
-     In the case of music, where expressive properties such as dynamics, tempo,
-     timing and timbre are readily quantified on a continuous scale, the information
-     dynamic framework may also be considered.
+    \subsection{Real-valued signals and audio analysis}
+	 Using analogous definitions based on the differential entropy
+	 \cite{CoverThomas}, the methods outlined
+	 in \secrf{surprise-info-seq} and \secrf{process-info}
+	 are equally applicable to random variables taking values in a continuous domain.
+	 In the case of music, where expressive properties such as dynamics, tempo,
+	 timing and timbre are readily quantified on a continuous scale, the information
+	 dynamic framework may thus be applied.
+%    \subsection{Audio based content analysis}
+%    Using analogous definitions of differential entropy, the methods outlined
+%     in the previous section are equally applicable to continuous random variables.
+%     In the case of music, where expressive properties such as dynamics, tempo,
+%     timing and timbre are readily quantified on a continuous scale, the information
+%     dynamic framework may also be considered.
 
-     In \cite{Dubnov2006}, Dubnov considers the class of stationary Gaussian
-     processes. For such processes, the entropy rate may be obtained analytically
-     from the power spectral density of the signal, allowing the multi-information
-     rate to be subsequently obtained. One aspect demanding further investigation
-     involves the comparison of alternative measures of predictability. In the case of the PIR, a Gaussian linear formulation is applicable, indicating that the PIR is a function of the correlation  between random innovations supplied to the stochastic process CITE.
+	 Dubnov \cite{Dubnov2006} considers the class of stationary Gaussian
+	 processes. For such processes, the entropy rate may be obtained analytically
+	 from the power spectral density of the signal. Dubnov found that the
+	 multi-information rate (which he refers to as `information rate') can be
+	 expressed as a function of the spectral flatness measure. For a given variance,
+	 Gaussian processes with maximal multi-information rate are those with maximally
+	 non-flat spectra. These essentially consist of a single
+	 sinusoidal component and hence are completely predictable and periodic once
+	 the parameters of the sinusoid have been inferred.
+%	 Local stationarity is assumed, which may be achieved by windowing or 
+%	 change point detection \cite{Dubnov2008}. 
+	 %TODO
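The entropy rate and spectral flatness relations discussed above can be sketched numerically. This is a minimal illustration, not the computation used for the paper: it assumes a strictly positive PSD sampled uniformly over one period of the spectrum, works in nats, and uses the Kolmogorov-Szego formula together with Dubnov's identity relating the multi-information rate to the spectral flatness measure.

```python
import numpy as np

def spectral_flatness(S):
    """Geometric mean over arithmetic mean of sampled PSD values S > 0."""
    return np.exp(np.mean(np.log(S))) / np.mean(S)

def entropy_rate(S):
    """Differential entropy rate (nats) of a stationary Gaussian process
    with sampled power spectral density S, via the Kolmogorov-Szego
    formula: h = 1/2 log(2*pi*e) + (1/4pi) * integral of log S(w) dw."""
    return 0.5 * np.log(2 * np.pi * np.e) + 0.5 * np.mean(np.log(S))

def multi_information_rate(S):
    """Dubnov's relation: rho = H(X) - h = -1/2 log(spectral flatness).
    Zero for a flat (white) spectrum, growing as the spectrum peaks."""
    return -0.5 * np.log(spectral_flatness(S))
```

For a flat spectrum the flatness is 1 and the multi-information rate vanishes; a strongly peaked spectrum drives the flatness towards zero and the rate towards infinity, consistent with the near-sinusoidal limiting case described above.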
+
+	 We are currently working towards methods for the computation of predictive information
+	 rate in some restricted classes of Gaussian processes, including finite-order
+	 autoregressive models and processes with power-law spectra (fractional Brownian
+	 motions).
+
+%	 mention non-gaussian processes extension Similarly, the predictive information
+%	 rate may be computed using a Gaussian linear formulation CITE. In this view,
+%	 the PIR is a function of the correlation  between random innovations supplied
+%	 to the stochastic process.  %Dubnov, MacAdams, Reynolds (2006) %Bailes and Dean (2009)
+
+%     In \cite{Dubnov2006}, Dubnov considers the class of stationary Gaussian
+%     processes. For such processes, the entropy rate may be obtained analytically
+%     from the power spectral density of the signal, allowing the multi-information
+%     rate to be subsequently obtained. One aspect demanding further investigation
+%     involves the comparison of alternative measures of predictability. In the case of the PIR, a Gaussian linear formulation is applicable, indicating that the PIR is a function of the correlation  between random innovations supplied to the stochastic process CITE.
     % !!! FIXME
 
 
@@ -899,9 +930,9 @@
 The distribution of transition matrices plotted in this space forms an arch shape
 that is fairly thin. Thus, it is a reasonable simplification to project out the 
 third dimension (the PIR) and present an interface that is just two dimensional. 
-The right-angled triangle is rotated and stretched to form an equilateral triangle with
-the $h_\mu=0, \rho_\mu=0$ vertex at the top, the `redundancy' axis down the right-hand
-side, and the `entropy rate' axis down the left, as shown in \figrf{TheTriangle}.
+The right-angled triangle is rotated, reflected and stretched to form an equilateral triangle with
+the $h_\mu=0, \rho_\mu=0$ vertex at the top, the `redundancy' axis down the left-hand
+side, and the `entropy rate' axis down the right, as shown in \figrf{TheTriangle}.
 This is our `Melody Triangle' and
 forms the interface by which the system is controlled. 
 %Using this interface thus involves a mapping to information space; 
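The rotation, reflection and stretch described here amount to an affine change of coordinates. A minimal sketch, in which the vertex positions and the normalising ranges `h_max` and `rho_max` are illustrative assumptions rather than the values used in the actual interface:

```python
import numpy as np

def to_melody_triangle(h, rho, h_max=1.0, rho_max=1.0):
    """Affinely map a point (h, rho) from a right-angled triangle
    (h/h_max + rho/rho_max <= 1, both non-negative) to an equilateral
    triangle with the (0, 0) corner at the top vertex, the entropy rate
    axis running down the right-hand edge and the redundancy axis down
    the left-hand edge. Vertex coordinates are illustrative."""
    top = np.array([0.5, np.sqrt(3) / 2])   # h = 0, rho = 0
    bottom_right = np.array([1.0, 0.0])     # h = h_max, rho = 0
    bottom_left = np.array([0.0, 0.0])      # h = 0, rho = rho_max
    return (top
            + (h / h_max) * (bottom_right - top)
            + (rho / rho_max) * (bottom_left - top))
```

Because the map is affine, the three corners of the information-space triangle land exactly on the three vertices of the equilateral triangle, and interior points interpolate linearly between them.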
@@ -970,17 +1001,18 @@
 	\def\scat#1{\colfig[0.42]{mtri/#1}}
 	\def\subj#1{\scat{scat_dwells_subj_#1} & \scat{scat_marks_subj_#1}}
 	\begin{tabular}{cc}
-		\subj{a} \\
+%		\subj{a} \\
 		\subj{b} \\
-		\subj{c} \\
-		\subj{d}
+		\subj{c} 
+%		\subj{d}
 	\end{tabular}
 	\caption{Dwell times and mark positions from user trials with the
-	on-screen Melody Triangle interface. The left-hand column shows
+	on-screen Melody Triangle interface, for two subjects. The left-hand column shows
 	the positions in a 2D information space (entropy rate vs multi-information rate
-	in bits) where spent their time; the area of each circle is proportional
+	in bits) where each spent their time; the area of each circle is proportional
	to the time spent there. The right-hand column shows the points which subjects
-	`liked'.}
+	`liked'; the area of the circles here is proportional to the duration spent at
+	that point before the point was marked.}
 \end{fig}
 
 Information measures on a stream of symbols can form a feedback mechanism; a