Times series averaging and denoising from a probabilistic perspective on time-elastic kernels. Marteau, P. CoRR, 2016.
In the light of regularized dynamic time warping kernels, this paper re-considers the concept of time elastic centroid for a set of time series. We derive a new algorithm based on a probabilistic interpretation of kernel alignment matrices. This algorithm expresses the averaging process in terms of a stochastic alignment automata. It uses an iterative agglomerative heuristic method for averaging the aligned samples, while also averaging the times of occurrence of the aligned samples. By comparing classification accuracies for 45 heterogeneous time series datasets obtained by first nearest centroid/medoid classifiers we show that: i) centroid-based approaches significantly outperform medoid-based approaches, ii) for the considered datasets, our algorithm that combines averaging in the sample space and along the time axes, emerges as the most significantly robust model for time-elastic averaging with a promising noise reduction capability. We also demonstrate its benefit in an isolated gesture recognition experiment and its ability to significantly reduce the size of training instance sets. Finally we highlight its denoising capability using demonstrative synthetic data: we show that it is possible to retrieve, from few noisy instances, a signal whose components are scattered in a wide spectral band.
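
The abstract's idea of averaging both the aligned sample values and their times of occurrence can be illustrated with a much simpler DTW-based sketch. The snippet below is my own simplification, not the paper's stochastic alignment automata: it uses a single hard DTW path per series against a reference (the function names dtw_path and elastic_average are hypothetical), whereas the paper weights all alignments probabilistically.

# Minimal sketch of time-elastic averaging along hard DTW paths.
# NOT the paper's method; only illustrates averaging values and occurrence times.
import numpy as np

def dtw_path(x, y):
    """Classic DTW between 1-D series x and y; returns the optimal alignment path."""
    n, m = len(x), len(y)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (x[i - 1] - y[j - 1]) ** 2
            cost[i, j] = d + min(cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1])
    # Backtrack the optimal path from the end of both series.
    path, i, j = [], n, m
    while (i, j) != (0, 0):
        path.append((i - 1, j - 1))
        moves = [(i - 1, j - 1), (i - 1, j), (i, j - 1)]
        i, j = min(moves, key=lambda ij: cost[ij])
    return path[::-1]

def elastic_average(series, ref):
    """Average aligned sample values and their time stamps against a reference series."""
    sums = np.zeros(len(ref))    # accumulated aligned sample values per reference index
    times = np.zeros(len(ref))   # accumulated times of occurrence per reference index
    counts = np.zeros(len(ref))
    for x in series:
        for i, j in dtw_path(ref, x):  # i indexes ref, j indexes x
            sums[i] += x[j]
            times[i] += j
            counts[i] += 1
    return sums / counts, times / counts  # averaged values, averaged occurrence times

With the reference chosen as the medoid of the set and the averaged values fed back as the new reference, iterating this gives a DBA-like procedure; the paper's contribution is to replace the single hard path with a probabilistic weighting over all alignments derived from the kernel alignment matrices.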
@Article{Marteau2016a,
  author    = {Pierre{-}Fran{\c{c}}ois Marteau},
  title     = {Times series averaging and denoising from a probabilistic perspective on time-elastic kernels},
  journal   = {CoRR},
  year      = {2016},
  volume    = {abs/1611.09194},
  abstract  = {In the light of regularized dynamic time warping kernels, this paper re-considers the concept of time elastic centroid for a set of time series. We derive a new algorithm based on a probabilistic interpretation of kernel alignment matrices. This algorithm expresses the averaging process in terms of a stochastic alignment automata. It uses an iterative agglomerative heuristic method for averaging the aligned samples, while also averaging the times of occurrence of the aligned samples. By comparing classification accuracies for 45 heterogeneous time series datasets obtained by first nearest centroid/medoid classifiers we show that: i) centroid-based approaches significantly outperform medoid-based approaches, ii) for the considered datasets, our algorithm that combines averaging in the sample space and along the time axes, emerges as the most significantly robust model for time-elastic averaging with a promising noise reduction capability. We also demonstrate its benefit in an isolated gesture recognition experiment and its ability to significantly reduce the size of training instance sets. Finally we highlight its denoising capability using demonstrative synthetic data: we show that it is possible to retrieve, from few noisy instances, a signal whose components are scattered in a wide spectral band.},
  bibsource = {dblp computer science bibliography, http://dblp.org},
  biburl    = {http://dblp.uni-trier.de/rec/bib/journals/corr/Marteau16},
  review    = {This paper is interested in creating a base template from multiple examples of a trajectory by using a modified version of DTW. The authors comment on a few DTW-based methods for creating a trajectory template, noting that they are highly heuristic and too dependent on initial conditions and tuning, and therefore modify DTW by borrowing concepts from left-right HMMs. Instead of building an HMM whose state sequence traverses the time series, the states describe the alignment between two trajectories. The state transitions are similar to DTW's (diagonal, right, or down); see the simplified alignment-lattice sketch after this entry. I haven't wrapped my mind around all the math yet, but it seems their training and identification methods modify the HMM Expectation-Maximization and Forward-Backward procedures to fit their formulation. They compare their method against other DTW-based methods by training a central trajectory template with each technique, aligning the observed trajectory to the template, and then classifying with either kNN or SVM. They also show that their method can combine noisy training data to create a denoised template.},
  timestamp = {Thu, 01 Dec 2016 19:32:08 +0100},
  url       = {http://arxiv.org/abs/1611.09194},
}
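
As a rough companion to the review above, here is a simplified forward recursion over the pairwise alignment lattice: each state is a pair (i, j) of aligned sample indices, reachable by diagonal, right, or down moves, as in the left-right-HMM view of alignment. The uniform transition weights, the Gaussian local kernel, and the name alignment_forward are my assumptions, not the authors' exact formulation.

# Simplified forward pass over an alignment lattice (illustrative only).
import numpy as np

def alignment_forward(x, y, sigma=1.0):
    """Return alpha[i, j], the unnormalized forward score of aligning x[:i+1] with y[:j+1]."""
    n, m = len(x), len(y)
    # Local emission term: similarity of the aligned pair of samples.
    local = np.exp(-np.subtract.outer(np.asarray(x), np.asarray(y)) ** 2 / (2.0 * sigma ** 2))
    alpha = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            if i == 0 and j == 0:
                prev = 1.0
            else:
                prev = 0.0
                if i > 0 and j > 0:
                    prev += alpha[i - 1, j - 1]  # diagonal move
                if i > 0:
                    prev += alpha[i - 1, j]      # advance in x only
                if j > 0:
                    prev += alpha[i, j - 1]      # advance in y only
            alpha[i, j] = local[i, j] * prev
    return alpha  # alpha[-1, -1] acts as an (unnormalized) alignment score, akin to a DTW kernel

Combined with the symmetric backward pass, such a lattice yields posterior alignment weights; as far as I understand the paper, weights of this kind are what drive the averaging of both the sample values and their times of occurrence.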
