Combining Human Action Sensing of Wheelchair Users and Machine Learning for Autonomous Accessibility Data Collection
IEICE TRANSACTIONS on Information and Systems   Vol.E99-D   No.4   pp.1153-1161
Publication Date: 2016/04/01
Publicized: 2016/01/22
Online ISSN: 1745-1361
DOI: 10.1587/transinf.2015EDP7278
Type of Manuscript: PAPER
Category: Rehabilitation Engineering and Assistive Technology
Keywords: street-level accessibility, wearable sensor, assistive technology, machine learning


The recent spread of intelligent devices such as smartphones has strengthened the link between everyday human-behavior sensing and useful applications in ubiquitous computing. This paper proposes a novel method, inspired by personal sensing technologies, for collecting and visualizing road accessibility at lower cost than traditional data-collection methods. To evaluate the methodology, we recorded the outdoor activities of nine wheelchair users in Tokyo for approximately one hour each, using an accelerometer on an iPod touch together with a camcorder, labeled the supervised training data by hand from the video, and estimated wheelchair actions as a measure of street-level accessibility. The system detected curb climbing, moving on tactile indicators, moving on slopes, and stopping, with F-scores of 0.63, 0.65, 0.50, and 0.91, respectively. In addition, we conducted experiments with an artificially limited amount of training data to investigate how many samples are required to estimate the target actions.
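The per-class F-scores reported above combine precision and recall into a single measure. As a minimal sketch of how such an evaluation is computed, the snippet below derives F-scores from per-action confusion counts; the action names come from the paper, but the true-positive/false-positive/false-negative counts are invented purely for illustration and are not the paper's data.

```python
# Hypothetical sketch of per-class F-score (F1) evaluation for an
# action classifier. The (tp, fp, fn) counts below are illustrative
# only; they are NOT the counts from the paper.

def f_score(tp, fp, fn):
    """F1: harmonic mean of precision and recall."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Per-action confusion counts (tp, fp, fn) -- invented for illustration.
counts = {
    "curb climbing": (63, 40, 34),
    "tactile indicators": (65, 35, 35),
    "slopes": (50, 50, 50),
    "stopping": (91, 9, 9),
}

for action, (tp, fp, fn) in counts.items():
    print(f"{action}: F = {f_score(tp, fp, fn):.2f}")
```

With counts like these, a class with balanced precision and recall (e.g. 0.91 each) yields the same F-score, which is why a single F value is a convenient summary when comparing detection quality across the four action types.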