Informatics and Applications
2015, Volume 9, Issue 1, pp 76-86
SELECTION OF OPTIMAL PHYSICAL ACTIVITY CLASSIFICATION MODEL USING MEASUREMENTS OF ACCELEROMETER
Abstract
The paper solves the problem of selecting optimal stable models for classification of physical activity. Each
type of physical activity of a particular person is described by a set of features generated from an accelerometer
time series. Under feature multicollinearity, selection of stable models is hampered by the need to estimate
a large number of model parameters. Estimation of optimal parameter values is also difficult because the
error function has a large number of local minima in the parameter space. In the paper, optimal models
are chosen from the class of two-layer artificial neural networks, and the problem of finding the Pareto-optimal
front of the set of models is solved. The paper presents a stepwise strategy for building optimal stable models.
The strategy includes steps of deleting and adding parameters, criteria for pruning and growing the model, and
criteria for stopping the building process. The computational experiment compares the models generated by the
proposed strategy on three quality criteria: complexity, accuracy, and stability.
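The Pareto-front selection described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the model names and criterion values are hypothetical, and each model is scored on the three quality criteria (complexity, error, instability), all to be minimized.

```python
def pareto_front(models):
    """Return the models not dominated on the quality criteria.

    `models` is a list of (name, criteria) pairs, where `criteria` is a
    tuple (complexity, error, instability), all to be minimized.
    A model is dominated if another model is no worse on every
    criterion and strictly better on at least one.
    """
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    return [(name, c) for name, c in models
            if not any(dominates(other, c)
                       for _, other in models if other != c)]

# Hypothetical candidate networks scored as (complexity, error, instability):
candidates = [
    ("net-A", (10, 0.20, 0.05)),
    ("net-B", (25, 0.10, 0.04)),
    ("net-C", (25, 0.15, 0.06)),  # dominated by net-B on every criterion
    ("net-D", (40, 0.09, 0.10)),
]
front = pareto_front(candidates)  # keeps net-A, net-B, net-D
```

A stepwise strategy of the kind the paper describes would generate such candidates by alternately adding and pruning parameters, then report the Pareto-efficient subset rather than a single "best" model.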
References (19)
- Vizilter, Y., V. Gorbatcevich, S. Karateev, and N. Kostromov. 2012. Obuchenie algoritmov vydeleniya kozhi na tsvetnykh izobrazheniyakh lits [Teaching of skin extraction algorithms for human face color images]. Informatika i ee Primeneniya — Inform. Appl. 6(1):109–113.
- Tokmakova, A. A., and V. V. Strizhov. 2012. Otsenivanie giperparametrov lineynykh i regressionnykh modeley pri otbore shumovykh i korreliruyushchikh priznakov [Estimation of linear model hyperparameters for the noise or correlated feature selection problem]. Informatika i ee Primeneniya — Inform. Appl. 6(4):66–75.
- Khaplanov, A. Yu. 2013. Asimptoticheskaya normal'nost' otsenki parametrov mnogomernoy logisticheskoy regressii [Asymptotic normality of the estimation of the multivariate logistic regression]. Informatika i ee Primeneniya — Inform. Appl. 7(2):69–74.
- Myung, I. J. 2000. The importance of complexity in model selection. J. Math. Psychol. 44(1):190–204.
- MacLeod, C., and M. Maxwell. 2001. Incremental evolution in ANNs: Neural nets which grow. Artif. Intell. Rev. 16(3):201–224.
- Karnin, E. D. 1990. A simple procedure for pruning back-propagation trained neural networks. IEEE Trans. Neural Networks 1(2):239–242.
- LeCun, Y., J. S. Denker, and S. A. Solla. 1990. Optimal brain damage. Adv. Neur. Inform. Processing Syst. 2(2):598–605.
- Hassibi, B., D. G. Stork, and G. J. Wolff. 1993. Optimal brain surgeon and general network pruning. IEEE Conference (International) on Neural Networks Proceedings. 293–299.
- Hong-Gui, H., C. Qi-li, and Q. Jun-Fei. 2011. An efficient self-organizing RBF neural network for water quality prediction. Neural Networks 24(7):717–725.
- Yang, S., and Y. Chen. 2012. An evolutionary constructive and pruning algorithm for artificial neural networks and its prediction applications. Neurocomputing 86(1):140–149.
- Pu, X., and P. Pengfei-Sun. 2013. A new hybrid pruning neural network algorithm based on sensitivity analysis for stock market forecast. J. Inform. Comput. Sci. 3(1):883–892.
- Knerr, S., L. Personnaz, and G. Dreyfus. 1990. Single-layer learning revisited: A stepwise procedure for building and training a neural network. Neurocomputing Algorithms Architectures Applications 68(1):41–50.
- Strijov, V., E. Krymova, and S. Weber. 2013. Evidence optimization for consequently generated models. Math. Comput. Modell. 57(1-2):50–56.
- Leont'eva, L. N. 2012. Posledovatel'nyy vybor priznakov pri vosstanovlenii regressii [Sequential feature selection in regression reconstruction]. J. Machine Learning Data Analysis 1(3):335–346.
- Zaytsev, A. A., and A. A. Tokmakova. 2012. Otsenka giperparametrov lineynykh regressionnykh modeley metodom maksimal'nogo pravdopodobiya pri otbore shumovykh i korreliruyushchikh priznakov [Estimation of regression model hyperparameters using maximum likelihood]. J. Machine Learning Data Analysis 1(3):347–353.
- Kwapisz, J. R., G. M. Weiss, and S. Moore. 2010. Activity recognition using cell phone accelerometers. SIGKDD Explorations 12(2):74–82.
- Belsley, D. A., E. Kuh, and R. E. Welsch. 2005. Regression diagnostics: Identifying influential data and sources of collinearity. New York: John Wiley and Sons. 302 p.
- Sanduljanu, L. N., and V. V. Strizhov. 2012. Vybor priznakov v avtoregressionnykh zadachakh prognozirovaniya [Feature selection in autoregression forecasting]. Information Technologies 7:11–15.
- Popova, M. S. 2014. Realizatsiya strategii poshagovoy modifikatsii neyronnoy seti [Realization of a stepwise strategy for neural network modification]. Available at: http://sourceforge.net/p/mlalgorithms/code/HEAD/tree/Group174/Popova2014OptimalModelSelection/code/main.m (accessed February 10, 2015).
About this article
Cover Date
2014-10-30
DOI
10.14357/19922264150107
Print ISSN
1992-2264
Publisher
Institute of Informatics Problems, Russian Academy of Sciences
Key words
classification; artificial neural networks; complexity; accuracy; stability; Pareto efficiency; growing and pruning criteria
Authors
M. Popova and V. Strijov
Author Affiliations
Moscow Institute of Physics and Technology, 9 Institutskiy Per., Dolgoprudny, Moscow Region 141700, Russian
Federation
Dorodnicyn Computing Center, Russian Academy of Sciences, 40 Vavilov Str., Moscow 119333, Russian
Federation