Human Activity Recognition

Recording and analyzing a person's physical and cognitive activities, both in everyday life and in the laboratory, can lead to better diagnostic and therapeutic decisions, for example in the case of neurodegenerative diseases or in personalized nutrition. Rapid developments in sensor technology, especially wearable sensor systems, provide an ever-growing set of tools for physiological and behavioral measurements on humans, not only in the laboratory but also in everyday life. Evaluated with modern methods of learning-based pattern recognition (machine learning), these multimodal sensor data can inform health-related decisions.

In the CogAge project, we have developed a platform that recognizes a person's physical activities continuously and in real time from the sensor data of their wearables. Since it is unclear a priori which features allow accurate recognition of physical activity, we generate them with the so-called codebook approach. Inspired by text processing, we have extended this approach to the modeling and classification of multimodal time series. Each one-dimensional sensor signal is first divided into time windows. The signal subsequences in these windows are then grouped by shape similarity (clustering), and the cluster centers serve as code words. The distribution of code-word assignments over a temporal segment of the signal forms a feature vector describing that segment. Finally, we apply different classification algorithms to these feature vectors for the automated recognition of physical activities.
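The pipeline above can be sketched in a few lines. This is a minimal illustration, not the CogAge implementation: it assumes k-means clustering to learn the code words and a normalized histogram of nearest-code-word assignments as the feature vector; the window length, stride, and codebook size are arbitrary illustrative values.

```python
import numpy as np
from sklearn.cluster import KMeans


def extract_windows(signal, win_len, stride):
    """Slide a window over a 1-D signal and stack the subsequences."""
    starts = range(0, len(signal) - win_len + 1, stride)
    return np.array([signal[s:s + win_len] for s in starts])


def learn_codebook(windows, n_codewords, seed=0):
    """Cluster subsequences by shape similarity; cluster centers are the code words."""
    km = KMeans(n_clusters=n_codewords, n_init=10, random_state=seed)
    km.fit(windows)
    return km


def codebook_features(signal, codebook, win_len, stride):
    """Histogram of code-word assignments = feature vector for one signal segment."""
    windows = extract_windows(signal, win_len, stride)
    assignments = codebook.predict(windows)
    hist = np.bincount(assignments, minlength=codebook.n_clusters)
    return hist / hist.sum()  # normalize so segments of different lengths compare


# Two synthetic one-dimensional sensor signals standing in for wearable data
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
walking = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(t.size)
resting = 0.1 * rng.standard_normal(t.size)

train_windows = np.vstack([extract_windows(s, 25, 5) for s in (walking, resting)])
codebook = learn_codebook(train_windows, n_codewords=8)

f_walk = codebook_features(walking, codebook, 25, 5)
f_rest = codebook_features(resting, codebook, 25, 5)
```

The resulting histograms `f_walk` and `f_rest` are fixed-length feature vectors that can be fed to any standard classifier (e.g. an SVM or random forest) in the final step.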

In our future research in this area, we plan to build a multimodal sensor platform (SensFloor, RGB cameras, depth sensors, wearables, etc.) for the detailed detection of body movements, and to develop algorithms for sensor data fusion and feature extraction that turn the raw signals into motion profiles in the form of high-dimensional mathematical models. This will provide scientists and physicians who study body movements in their research or clinical care with a tool for their precise, quantitative assessment. These research activities are carried out in the APPS Lab, which we have established at the University of Lübeck.

Third-party funded projects, publications, and demos:

BMBF Project: Cognitive Village - Adaptively Learning Technical Assistance for Elderly. Duration: 01.09.2015 - 30.11.2018.

Jinghua Zhang, Chen Li, Sergey Kosov, Marcin Grzegorzek, Kimiaki Shirahama, Tao Jiang, Changhao Sun, Zihan Li, and Hong Li. LCU-Net: A Novel Low-Cost U-Net for Environmental Microorganism Image Segmentation. Pattern Recognition (Elsevier, IF: 7.196), July 2021.

Frédéric Li, Kimiaki Shirahama, Muhammad Adeel Nisar, Xinyu Huang, and Marcin Grzegorzek. Deep Transfer Learning for Time Series Data Based on Sensor Modality Classification. Sensors (MDPI, IF: 3.275), July 2020.

Lukas Köping, Kimiaki Shirahama, and Marcin Grzegorzek. A General Framework for Sensor-based Human Activity Recognition. Computers in Biology and Medicine (Elsevier), April 2018.

Przemysław Łagodziński, Kimiaki Shirahama, and Marcin Grzegorzek. Codebook-based Electrooculography Data Analysis Towards Cognitive Activity Recognition. Computers in Biology and Medicine (Elsevier), April 2018.

Frédéric Li, Kimiaki Shirahama, Muhammad Adeel Nisar, Lukas Köping, and Marcin Grzegorzek. Comparison of Feature Learning Methods for Human Activity Recognition using Wearable Sensors. Sensors (MDPI), 18(2), February 2018.

Demo-Video: Atomic Activity Recognition Using Wearable Devices.

Demo-Video: Composite Activity Recognition Using Wearable Devices.