A Cost-Effective Real-Time Human Activity Recognition System Using Supervised Learning Algorithms and Wearable Acceleration Sensors
DOI: https://doi.org/10.37385/jaets.v6i2.6830

Keywords: Accelerometer, Classification, Feature Extraction, Human Activity Recognition, Wearable

Abstract
Human activity recognition (HAR) plays a vital role in health monitoring by providing detailed insights into daily movements. This study aims to enhance HAR by developing a lightweight and efficient machine learning model that balances accuracy, real-time performance, and affordability. Using acceleration data from a wearable inertial sensor, we extracted a novel feature set optimized for computational efficiency. The proposed model was evaluated on a benchmark dataset, achieving an accuracy of 98.9% in classifying six essential daily activities: walking, walking upstairs, walking downstairs, laying, sitting, and standing. These results demonstrate the model's potential for real-time health monitoring applications, offering a cost-effective and deployable solution for wearable-based activity recognition.
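The abstract does not specify the feature set, window length, or classifier, so the sketch below is only a minimal illustration of the general pipeline it describes: slide a fixed-length window over tri-axial acceleration samples, compute a few cheap time-domain statistics per window, and train an off-the-shelf supervised classifier. The window size, the particular features, the random forest model, and the randomly generated data and labels are all assumptions made for illustration, not the configuration reported in the paper.

```python
# Illustrative sketch only: windowed time-domain features from tri-axial
# acceleration plus a generic supervised classifier. Window size, feature
# set, and model choice are assumptions, not the paper's configuration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def extract_features(window):
    """Cheap time-domain features from one (n_samples, 3) acceleration window."""
    mean = window.mean(axis=0)                    # per-axis mean
    std = window.std(axis=0)                      # per-axis standard deviation
    sma = np.abs(window).sum() / len(window)      # signal magnitude area
    mag = np.linalg.norm(window, axis=1)          # acceleration magnitude
    return np.concatenate([mean, std, [sma, mag.mean(), mag.std()]])

def window_signal(acc, window_len=128, step=64):
    """Slide a fixed-length window with 50% overlap over an (N, 3) signal."""
    return [acc[i:i + window_len] for i in range(0, len(acc) - window_len + 1, step)]

# Placeholder data: acc stands in for an (N, 3) stream of accelerometer
# samples; y stands in for one activity id (0..5) per window.
rng = np.random.default_rng(0)
acc = rng.standard_normal((12800, 3))
windows = window_signal(acc)
X = np.array([extract_features(w) for w in windows])
y = rng.integers(0, 6, size=len(X))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

On real data the labels would come from the annotated benchmark dataset rather than being drawn at random, and the feature set and classifier would be chosen to fit the memory and latency budget of the target wearable device.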