Predicting Sagittal Plane Lifting Postures from Image Bounding Box Dimensions
Public Domain
2019/02/01
Description: OBJECTIVE: A method for automatically classifying lifting postures from simple features in video recordings was developed and tested. We explored whether an "elastic" rectangular bounding box, drawn tightly around the subject, can be used to classify standing, stooping, and squatting at the lift origin and destination. BACKGROUND: Current marker-less video tracking methods depend on a priori skeletal human models, which are prone to error from poor illumination, obstructions, and difficulty placing cameras in the field. Robust computer vision algorithms based on spatiotemporal features were previously applied to evaluate repetitive motion tasks, exertion frequency, and duty cycle. METHODS: Mannequin poses were systematically generated using the Michigan 3DSSPP software for a wide range of hand locations and lifting postures. The stature-normalized height and width of a bounding box were measured in the sagittal plane and when rotated horizontally by 30 degrees. After randomly ordering the data, a classification and regression tree (CART) algorithm was trained to classify the lifting postures. RESULTS: The resulting tree had four levels and four splits, misclassifying 0.36% of training-set cases. The algorithm was tested using 30 video clips of industrial lifting tasks, misclassifying 3.33% of test-set cases. Sensitivity and specificity, respectively, were 100.0% and 100.0% for squatting, 90.0% and 100.0% for stooping, and 100.0% and 95.0% for standing. CONCLUSIONS: The tree classification algorithm is capable of classifying lifting postures based only on the dimensions of bounding boxes. APPLICATIONS: It is anticipated that this practical algorithm can be implemented on handheld devices such as smartphones, making it readily accessible to practitioners. [Description provided by NIOSH]
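The approach described in the abstract — a shallow decision tree that splits on stature-normalized bounding-box dimensions — can be sketched as follows. This is a minimal illustration only: the split variables, thresholds, and posture geometry below are hypothetical assumptions for demonstration, not the four-level, four-split tree the study actually learned from its 3DSSPP-generated training data.

```python
def classify_posture(norm_height, norm_width):
    """Classify a lifting posture from the stature-normalized height and
    width of a bounding box drawn tightly around the subject.

    A hand-written threshold tree in the spirit of the paper's CART
    classifier; the thresholds here are illustrative guesses, not the
    splits reported in the study.
    """
    if norm_height > 0.85:
        # Box is nearly as tall as the subject: upright posture.
        return "standing"
    if norm_width > 0.60:
        # Short and wide box: trunk flexed forward over the load.
        return "stooping"
    # Short and comparatively narrow box: knees bent, trunk upright.
    return "squatting"


# Hypothetical measurements at a lift origin (height, width as a
# fraction of stature):
print(classify_posture(0.95, 0.30))  # standing
print(classify_posture(0.60, 0.70))  # stooping
print(classify_posture(0.55, 0.45))  # squatting
```

In practice such a tree would be fit with a CART learner on labeled pose data (the study also used box dimensions from a view rotated 30 degrees as additional features); the point here is only that the classifier reduces to a few cheap comparisons, which is why it could plausibly run on a handheld device.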
ISSN: 0018-7208
Pages in Document: 64-77
Volume: 61
Issue: 1
NIOSHTIC Number: nn:20052447
Citation: Hum Factors 2019 Feb; 61(1):64-77
Contact Point Address: Robert G. Radwin, Department of Industrial and Systems Engineering, University of Wisconsin-Madison, 1550 Engineering Drive, Madison, WI 53706
Email: rradwin@wisc.edu
Federal Fiscal Year: 2019
Peer Reviewed: True
Source Full Name: Human Factors
Main Document Checksum: urn:sha-512:6668f07f10fc02e80258984c371b0c616e9d88cb119ae03b88932aff3107e9e6b9be7c6a4d08f743a72e1b757e95c1166dc516324e7b2b7f655edaee4a02a63d
CDC STACKS serves as an archival repository of CDC-published products including scientific findings, journal articles, guidelines, recommendations, and other public health information authored or co-authored by CDC or funded partners.
As a repository, CDC STACKS retains documents in their original published format to ensure public access to scientific information.