How Often Are Study Design and Level of Evidence Misreported in the Pediatric Orthopaedic Literature?
Journal
Journal of Pediatric Orthopedics
ISSN: 1539-2570
Abbreviated title: J Pediatr Orthop
Country: United States
NLM ID: 8109053
Publication information
Publication date:
History:
pubmed: 27 Nov 2019
medline: 8 Oct 2020
entrez: 27 Nov 2019
Status: ppublish
Abstract
Observational studies are the most commonly used study designs in the pediatric orthopaedic literature. The differences between observational study designs are important but not widely understood, leading to potential discrepancies between the reported and actual study design. Study design misclassification is associated with a potential for misreporting level of evidence (LOE). The purpose of this study was to determine the degree of study design and LOE misclassification in the pediatric orthopaedic literature. The Institute for Scientific Information (ISI) Web of Science was queried to identify all pediatric orthopaedic observational studies published from 2014 to 2017. Reported study design and LOE were recorded for each study. The actual study design and LOE were determined on the basis of established clinical epidemiological criteria by reviewers with advanced epidemiological training. Studies with a discrepancy between reported and actual study design and LOE were identified. The following covariates were recorded for each study: subspecialty, inclusion of a statistician coauthor, sample size, journal, and journal impact factor. The χ² test was used to identify factors associated with study design and LOE misreporting. In total, 1000 articles were screened, yielding 647 observational studies. A total of 335 publications (52%) did not clearly report a study design in the abstract or manuscript text. Of those that did, 59/312 (19%) reported the incorrect study design. The largest discrepancy was in the 109 studies that were reported to be case series, among which 30 (27.5%) were actually retrospective cohort studies. In total, 313 publications (48%) did not report an LOE. Of those that did, 95/334 (28%) reported the incorrect LOE. In total, 33 studies (19%) reported an LOE that was higher than the actual LOE and 62 (35%) under-reported the LOE.
The majority of observational pediatric orthopaedic studies did not report a study design or reported the wrong study design. Similarly, the majority of studies did not report or misreported their LOE. Greater epidemiological rigor in evaluating observational studies is required on the part of investigators, reviewers, and editors. Level of Evidence: Level II.
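The methods describe a χ² test to check whether covariates (e.g., having a statistician coauthor) are associated with misreporting. A minimal sketch of that kind of analysis is below; the 2×2 counts are hypothetical illustrations, not the study's data, and the comparison against the 0.05 critical value for 1 degree of freedom (3.841) stands in for a full p-value computation.

```python
import math


def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat


# Hypothetical 2x2 table: rows = statistician coauthor (yes/no),
# columns = study design misreported (yes/no)
table = [[10, 90],
         [49, 163]]
stat = chi_square_stat(table)
df = (len(table) - 1) * (len(table[0]) - 1)  # 1 degree of freedom for a 2x2 table
print(f"chi-square = {stat:.3f}, df = {df}, significant at 0.05: {stat > 3.841}")
```

In practice one would use a library routine (e.g., `scipy.stats.chi2_contingency`) to obtain an exact p-value, but the hand computation above shows what the statistic measures: how far the observed counts deviate from the counts expected if misreporting were independent of the covariate.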
Identifiers
pubmed: 31770169
doi: 10.1097/BPO.0000000000001470
pii: 01241398-202005000-00020
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Pagination
e385-e389

References
Cashin M, Kelley S, Douziech J, et al. The levels of evidence in pediatric orthopaedic journals: where are we now? J Pediatr Orthop. 2011;31:721–725.
Sheffler L, Yoo B, Bhandari M, et al. Observational studies in orthopaedic surgery: the STROBE statement as a tool for transparent reporting. J Bone Joint Surg Am. 2013;95:1–12.
Von Elm E, Altman DG, Egger M, et al. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Med. 2007;4:1623–1627.
Grimes DA. “Case-control” confusion: mislabeled reports in obstetrics and gynecology journals. Obstet Gynecol. 2009;114:1284–1286.
Hellems M, Kramer M, Hayden G. Case-control confusion. Ambul Pediatr. 2006;6:96–99.
Esene IN, Ngu J, El Zoghby M, et al. Case series and descriptive cohort studies in neurosurgery: the confusion and solution. Childs Nerv Syst. 2014;30:1321–1332.
Mathes T, Pieper D. Clarifying the distinction between case series and cohort studies in systematic reviews of comparative studies: potential impact on body of evidence and workload. BMC Med Res Methodol. 2017;17:8–13.
Wright J, Swiontkowski M, Heckman J. Introducing levels of evidence to the journal. J Bone Joint Surg Am. 2003;85(A1):1–3.
Greenhalgh T. How to Read a Paper: The Basics of Evidence-based Medicine, 4th ed. London: BMJ Books; 2010.
LeBrun DG, Tran T, Wypij D, et al. How often do orthopaedic matched case-control studies use matched methods? A review of methodological quality. Clin Orthop Relat Res. 2019;477:655–662.
Clarivate Analytics. InCites Journal Citation Reports® Science Edition; 2017. Available at: https://jcr.incites.thomsonreuters.com/. Accessed August 1, 2018.
Bhandari M, Morrow F, Kulkarni A, et al. Meta-analyses in orthopaedic surgery. A systematic review of their methodologies. J Bone Joint Surg Am. 2001;83-A:15–24.
Leopold SS, Warme WJ, Fritz Braunlich E, et al. Association between funding source and study outcome in orthopaedic research. Clin Orthop Relat Res. 2003;415:293–301.
Bland J, Altman D. Multiple significance tests: the Bonferroni method. BMJ. 1995;310:170.
Simunovic N, Sprague S, Bhandari M. Methodological issues in systematic reviews and meta-analyses of observational studies in orthopaedic research. J Bone Joint Surg Am. 2009;91(suppl 3):87–94.
Esene IN, Mbuagbaw L, Dechambenoit G, et al. Misclassification of case-control studies in neurosurgery and proposed solutions. World Neurosurg. 2018;112:233–242.
Leopold SS. Editorial: getting the most from what you read in orthopaedic journals. Clin Orthop Relat Res. 2017;475:1757–1761.
Bernstein J. Not the last word: rethinking the resident research requirement. Clin Orthop Relat Res. 2017;475:1948–1953.
Reeves B, Deeks J, Higgins J, et al. Including non-randomized studies. In: Higgins JPT, Green S, eds. Cochrane Handbook for Systematic Reviews of Interventions; 5.1.0. West Sussex, England: Cochrane Collaboration; 2011:391–432.