Systems and Means of Informatics
2015, Volume 25, Issue 3, pp 60-77
BUILDING SUPERPOSITION OF DEEP LEARNING NEURAL NETWORKS FOR SOLVING THE PROBLEM OF TIME SERIES CLASSIFICATION
- M. S. Popova
- V. V. Strijov
Abstract
This paper addresses the problem of time series classification with deep learning neural networks. It proposes a multilevel superposition of models from the following classes of neural networks: two-layer neural networks, Boltzmann machines, and autoencoders. The lower levels of the superposition extract informative features from noisy, high-dimensional data, while the upper level performs classification on these extracted features. The proposed model was tested on two samples of physical activity time series, and the classification results obtained in the computational experiment were compared with results reported by other authors on the same datasets. The study demonstrates that deep learning neural networks can be applied to the classification of physical activity time series.
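The sketch below is a rough illustration of the architecture described in the abstract, not the authors' implementation: unsupervised lower levels (here, stacked restricted Boltzmann machines) extract features from high-dimensional, noisy windows of a time series, and an upper-level small network classifies them. The layer sizes, learning rates, and the synthetic accelerometer-like data are assumptions made only to keep the example self-contained and runnable.

```python
# Hedged sketch of a "superposition" of feature-extracting and classifying
# networks, in the spirit of the paper. Hyperparameters and data are
# illustrative guesses, not values from the paper.
import numpy as np
from sklearn.neural_network import BernoulliRBM, MLPClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for segmented physical activity time series:
# 600 windows of 128 samples each, 3 activity classes.
n_windows, window_len, n_classes = 600, 128, 3
labels = rng.integers(0, n_classes, size=n_windows)
freqs = np.array([1.0, 2.0, 4.0])[labels]          # class-dependent frequency
t = np.linspace(0, 2 * np.pi, window_len)
X = np.sin(freqs[:, None] * t[None, :]) + 0.3 * rng.standard_normal((n_windows, window_len))

X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.25, random_state=0)

model = Pipeline([
    ("scale", MinMaxScaler()),                      # RBMs expect inputs in [0, 1]
    ("rbm1", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20, random_state=0)),
    ("clf", MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)),
])

model.fit(X_train, y_train)                         # unsupervised RBM layers, then supervised classifier
print("test accuracy:", model.score(X_test, y_test))
```

Unlike full greedy layer-wise pretraining with supervised fine-tuning of the entire stack, this pipeline keeps the unsupervised layers fixed after training; it is meant only to make the data flow of the superposition concrete.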
About this article
Cover Date
2015-09-30
DOI
10.14357/08696527150304
Print ISSN
0869-6527
Publisher
Institute of Informatics Problems, Russian Academy of Sciences
Key words
classification; time series; deep learning neural networks; model superposition; feature extraction
Authors
M. S. Popova and V. V. Strijov
Author Affiliations
Moscow Institute of Physics and Technology, 9 Institutskiy Per., Dolgoprudny, Moscow Region 141700, Russian Federation
Dorodnicyn Computing Center, Federal Research Center "Computer Science and Control" of the Russian Academy of Sciences, 40 Vavilov Str., Moscow 119333, Russian Federation