Electrical and Computer Engineering Department, University of West Florida, Pensacola, FL.
Feature Selection for Actigraphy Signal Processing and Recognition
- Presented on June 17, 2013
Introduction
In classifying actigraphy signals, features are usually extracted from the raw data and fed to a classifier that determines the type and level of activity. A wide variety of features has been tried by researchers in this field: some are statistical in nature, some are based on the frequency content of the signals, others on the time-domain content, and so on. In much of the literature we reviewed for this work, there was usually no justification of why a particular feature or set of features was used with a particular classifier.
Purpose
The purpose of this research was to create algorithms that automatically evaluate and select the top-performing feature or set of features to maximize the correct classification rate of actigraphy signals.
Methods
Two selection algorithms were developed to evaluate and select the top-performing features among 63 commonly used features in actigraphy signal processing. The first algorithm is based on entropy minimization: it tries to minimize the uncertainty and disorder (i.e., entropy) in a classifier's decision. The second algorithm (a.k.a. Add-One) first classifies the data using each feature individually and picks the top-performing feature. That feature is then paired with each of the remaining features, and the top-performing pair is chosen. Triplets are formed by combining that pair with each of the remaining features, the top-performing triplet is chosen, and so on until the specified number of features is reached. Once the top features were selected, they were used to classify signals from five activity types (standing, lying down, working at a desk, and jogging at 2 mph and 3 mph). The actigraphy data were sampled at 30 Hz and divided into 5-second intervals. The data were split into training and testing subsets using the jackknife method, and minimum-distance and artificial neural network classifiers were used to classify the testing data.
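The Add-One procedure described above is a greedy forward feature selection: grow the feature set one feature at a time, always keeping the single addition that scores best. A minimal sketch, assuming a caller-supplied scoring function; the feature names and toy scores below are illustrative placeholders, not the authors' actual features or classifier:

```python
def add_one_selection(features, score, max_features):
    """Greedily grow a feature set: at each step, add the one
    remaining feature that maximizes the score of the set."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < max_features:
        # Evaluate each candidate appended to the current set.
        best = max(remaining, key=lambda f: score(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy stand-in for "correct classification rate of a feature subset".
# In the study this would come from training/testing a classifier.
toy_scores = {"mean": 0.6, "p90": 0.5, "zcr": 0.2, "energy": 0.3}
score = lambda subset: sum(toy_scores[f] for f in subset)

print(add_one_selection(["mean", "p90", "zcr", "energy"], score, 2))
# → ['mean', 'p90']
```

With a real classifier in place of `score`, each candidate evaluation is a full train/test cycle, which is why the method is typically stopped after a small number of features.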
Results
Features selected using the entropy minimization algorithm yielded correct classification rates of 85-90% using only the top 5-10 features. Correct classification rates of 95-100% were achieved using only 1-5 features selected with the Add-One method. A closer look at the top-performing features shows that some were highly ranked across many of the experiments we conducted. Overall, features based on statistics and nth percentiles seemed to outperform all the others.
Conclusions
The Add-One feature selection algorithm yielded better results than the entropy minimization algorithm: correct classification rates greater than 95% were achieved with only 1-5 features. Some features were consistently among the top performers, and most of them were statistical in nature.
Supported by University of West Florida Office of Research and Sponsored Programs and ActiGraph.
- Mohamed A. Khabou
- Michael V. Parlato