Interrater Reliability of Posture Observations
2009/06/01
Description: OBJECTIVE: The aims of this research were (a) to study the interrater reliability of a posture observation method, (b) to test the impact of different posture categorization systems on interrater reliability, and (c) to provide guidelines for improving interrater reliability. BACKGROUND: Estimation of posture through observation is challenging. Previous studies have shown varying degrees of validity and reliability, providing little information about the conditions necessary to achieve acceptable reliability. METHOD: Seven raters estimated posture angles from video recordings. Different measures of interrater reliability were computed, including percentage agreement, precision (expressed as interrater standard deviation), and intraclass correlation coefficients (ICC). RESULTS: Some posture parameters, such as upper arm flexion and extension, had ICCs of 0.50 or greater. Most posture parameters had a precision around the 10-degree range. The predefined categorization and 30-degree posture categorization strategies showed substantially better agreement among the raters than did the 10-degree strategy. CONCLUSIONS: Different interrater reliability measures described different aspects of agreement for the posture observation tool, and the level of agreement differed substantially between the measures used. Observation of large body parts generally resulted in better reliability. Wider angle intervals resulted in better percentage agreement than narrower intervals; for most postures, 30-degree angle intervals are appropriate. Training aimed at using a properly designed data entry system, together with clear posture definitions and relevant examples (including definitions of the neutral positions of the various body parts), will help improve interrater reliability.
APPLICATION: The results provide ergonomics practitioners with information about the interrater reliability of a postural observation method and guidelines for improving interrater reliability for video-recorded field data. [Description provided by NIOSH]
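The reliability measures named in the abstract can be illustrated with a short sketch: percentage agreement over binned angle categories (wider bins should yield higher agreement, as the study reports) and ICC(2,1), the two-way random-effects, absolute-agreement, single-rater intraclass correlation. The rater data and angle values below are invented for illustration and are not from the study.

```python
# Sketch of two interrater reliability measures mentioned in the abstract.
# The angle data are hypothetical, not taken from the study.
import numpy as np

def percentage_agreement(ratings, bin_width):
    """Share of rater pairs whose angle estimates fall in the same category."""
    bins = np.floor(np.asarray(ratings, dtype=float) / bin_width)  # subjects x raters
    n_subj, n_raters = bins.shape
    agree = total = 0
    for i in range(n_raters):
        for j in range(i + 1, n_raters):
            agree += np.sum(bins[:, i] == bins[:, j])
            total += n_subj
    return agree / total

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    ms_r = k * np.sum((Y.mean(axis=1) - grand) ** 2) / (n - 1)  # subjects (rows)
    ms_c = n * np.sum((Y.mean(axis=0) - grand) ** 2) / (k - 1)  # raters (columns)
    resid = Y - Y.mean(axis=1, keepdims=True) - Y.mean(axis=0, keepdims=True) + grand
    ms_e = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical upper-arm flexion angles (degrees): 5 video clips x 3 raters.
angles = [[12, 18, 15], [45, 50, 43], [78, 70, 74], [30, 33, 29], [60, 55, 62]]
print(percentage_agreement(angles, bin_width=30))  # wider bins -> more agreement
print(percentage_agreement(angles, bin_width=10))
print(icc_2_1(angles))
```

With these invented numbers, 30-degree bins produce higher percentage agreement than 10-degree bins, mirroring the abstract's conclusion that wider angle intervals improve agreement even though the underlying angle estimates are unchanged.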
ISSN:0018-7208
Pages in Document:292-309
Volume:51
Issue:3
NIOSHTIC Number:nn:20035896
Citation:Hum Factors 2009 Jun; 51(3):292-309
Contact Point Address:Washington State Department of Labor and Industries, SHARP Program, P.O. Box 44330, Olympia, WA 98504-4330
Email:baos235@lni.wa.gov
Federal Fiscal Year:2009
Performing Organization:Washington State Department of Labor and Industries
Peer Reviewed:True
Start Date:20000930
Source Full Name:Human Factors
End Date:20060929
Main Document Checksum:urn:sha-512:c580c0819ab160e19570cc81e9c12e9f5f69aacbb39877eb14afae9fedc425f064b592e7fae09e25c7db97723354eac6a87072a31940f6094676a6ce43b1992d
CDC STACKS serves as an archival repository of CDC-published products including
scientific findings,
journal articles, guidelines, recommendations, or other public health information authored or
co-authored by CDC or funded partners.
As a repository, CDC STACKS retains documents in their original published format to ensure public access to scientific information.