Abstract
Recognizing human behavior is essential for early intervention in cognitive rehabilitation, particularly for older adults. Traditional methods often focus on third-person vision but overlook the importance of human visual attention during object interactions. This study introduces an egocentric behavior analysis (EBA) framework that uses transfer learning to analyze object relationships. Egocentric vision is used to extract features from hand movements, object detection, and visual attention. These features are then used to validate hand-object interactions (HOIs) and to describe human activities involving multiple objects. The proposed method employs graph attention networks (GATs) with transfer learning, achieving 97% accuracy in categorizing various activities while reducing computation time. These findings suggest that integrating EBA with advanced machine learning methods could substantially improve cognitive rehabilitation by enabling more personalized and efficient interventions. Future research could explore real-world applications of this approach, potentially improving the quality of life for older adults through better cognitive health monitoring.
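The graph-attention step mentioned in the abstract can be illustrated with a single attention layer over a small hand-object graph. The following is a minimal pure-Python sketch of a single-head graph attention layer in the general style of GATs; it is not the paper's implementation, and the node features, weight matrix `W`, attention vector `a`, and the hand/object graph are all hypothetical placeholders chosen for illustration.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def gat_layer(features, adj, W, a):
    """One single-head graph attention layer.
    features: per-node feature vectors (e.g. hand, objects)
    adj:      adjacency list; adj[i] lists neighbors of node i (incl. i)
    W:        linear projection matrix (out_dim x in_dim)
    a:        attention vector of length 2 * out_dim
    Returns attention-weighted aggregated features per node.
    """
    def matvec(M, x):
        return [sum(w * xi for w, xi in zip(row, x)) for row in M]

    # Shared linear projection of every node's features.
    h = [matvec(W, x) for x in features]
    out = []
    for i, nbrs in enumerate(adj):
        # Attention logits e_ij = LeakyReLU(a . [h_i || h_j]).
        logits = []
        for j in nbrs:
            z = h[i] + h[j]  # list concatenation = feature concat
            e = sum(ak * zk for ak, zk in zip(a, z))
            logits.append(e if e > 0 else 0.2 * e)
        alpha = softmax(logits)
        # Aggregate neighbor features with the attention weights.
        out_i = [0.0] * len(h[i])
        for w_ij, j in zip(alpha, nbrs):
            for k in range(len(out_i)):
                out_i[k] += w_ij * h[j][k]
        out.append(out_i)
    return out

# Toy hand-object graph: node 0 = hand, nodes 1-2 = objects.
features = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
adj = [[0, 1, 2], [0, 1], [0, 2]]   # hand attends to both objects
W = [[1.0, 0.0], [0.0, 1.0]]        # identity projection
a = [1.0, 0.0, 0.0, 1.0]
updated = gat_layer(features, adj, W, a)
print(updated[0])  # hand node after attention, ≈ [0.578, 0.845]
```

In a full pipeline, such a layer would sit on top of features extracted from hand movements, detected objects, and gaze, with the final node embeddings fed to an activity classifier; the transfer-learning aspect would come from initializing the feature extractors with pretrained weights.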
| Original language | English |
|---|---|
| Pages (from-to) | 12-22 |
| Number of pages | 11 |
| Journal | Journal of Advanced Computational Intelligence and Intelligent Informatics |
| Volume | 29 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2025 |
| Externally published | Yes |
Bibliographical note
Publisher Copyright: © Fuji Technology Press Ltd.
Keywords
- behavior recognition
- environmental context
- episodic memory
- first-person vision
- human attention
ASJC Scopus subject areas
- Human-Computer Interaction
- Computer Vision and Pattern Recognition
- Artificial Intelligence