A novel approach to predict ingress/egress discomfort based on human motion and biomechanical analysis.
Keywords
Biomechanics
Discomfort
Feature selection
Ingress/egress
Motion analysis
Journal
Applied ergonomics
ISSN: 1872-9126
Abbreviated title: Appl Ergon
Country: England
NLM ID: 0261412
Publication information
Publication date: Feb 2019
History:
received: 31 Jan 2018
revised: 1 Oct 2018
accepted: 11 Nov 2018
entrez: 5 Dec 2018
pubmed: 5 Dec 2018
medline: 23 Mar 2019
Status:
ppublish
Abstract
This study proposes an ingress/egress discomfort prediction algorithm based on an in-depth biomechanical method and a motion-capture database. Each subject's ingress/egress motion was captured with an optical motion capture system and a physically adjustable vehicle mock-up, and subjective discomfort evaluations were recorded at the same time. Inverse kinematics and inverse dynamics analyses of the captured ingress/egress motion provided motion and joint-torque information for each subject. Based on these analysis results, this study proposes two novel features: the accumulated movement of each joint and the sum of the rectified joint torque. A feature selection procedure was then conducted: recursive feature selection and optimal feature selection methods identified the feature subset most relevant to the collected subjective responses. Finally, a prediction model was constructed using a support vector machine and evaluated in terms of prediction accuracy and statistical analysis. For comparison with previous work, two representative models from earlier studies were implemented and evaluated on the identical dataset; this comparison demonstrated the effectiveness of the proposed algorithm.
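The pipeline in the abstract (proposed features, then recursive feature elimination, then an SVM) can be sketched in code. The following Python sketch is illustrative only: the function names (accumulated_joint_movement, sum_rectified_torque), the per-joint feature definitions, the placeholder data, and the choice of a linear-kernel SVR inside scikit-learn's RFE are all assumptions, since the abstract does not specify implementation details; the paper's "optimal feature selection" step is not reproduced here.

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR


def accumulated_joint_movement(angles):
    # Sum of absolute frame-to-frame angle changes for one joint,
    # an assumed reading of "accumulated movement of joint".
    return float(np.sum(np.abs(np.diff(angles))))


def sum_rectified_torque(torques, dt):
    # Time integral of the rectified (absolute) joint torque,
    # an assumed reading of "sum of rectified joint torque".
    return float(np.sum(np.abs(torques)) * dt)


def trial_features(joint_angles, joint_torques, dt):
    # Stack the two proposed features across all joints of one trial.
    feats = []
    for q, tau in zip(joint_angles, joint_torques):
        feats.append(accumulated_joint_movement(q))
        feats.append(sum_rectified_torque(tau, dt))
    return np.asarray(feats)


# Placeholder data: 40 trials x 20 candidate features, with random
# values standing in for the subjective discomfort responses.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 20))
y = rng.normal(size=40)

# Recursive feature elimination needs an estimator that exposes
# coefficients; a linear-kernel SVR is one such choice (the exact
# SVM variant and kernel are not stated in the abstract).
selector = RFE(SVR(kernel="linear"), n_features_to_select=5)
model = make_pipeline(StandardScaler(), selector)
model.fit(X, y)

print("selected feature mask:", selector.support_)
print("predicted discomfort for first 3 trials:", model.predict(X[:3]))
```

The linear kernel is used here only because RFE ranks features by coefficient magnitude; a nonlinear kernel would require a different feature-ranking strategy.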
Identifiers
pubmed: 30509535
pii: S0003-6870(18)30622-7
doi: 10.1016/j.apergo.2018.11.003
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Pagination
263-271
Copyright information
Copyright © 2018 Elsevier Ltd. All rights reserved.