AEM Education and Training 06: Editorial Processes in FOAM
Welcome to the sixth episode of AEM Education and Training, a podcast collaboration between the Academic Emergency Medicine Education and Training (AEM E&T) journal and Brown Emergency Medicine. Each quarter, we'll give you digital open access to AEM E&T articles or Articles in Press, along with an author interview podcast and links to curated supportive educational materials for EM learners and medical educators.
Find this podcast series on iTunes here.
DISCUSSING (CLICK ON TITLE FOR FULL-TEXT ARTICLE):
Editorial Processes in Free Open Access Medical Education (FOAM) Resources. Arden Azim; Jennifer Beck‐Esmay, MD; Teresa M. Chan, MD, MHPE
LISTEN NOW: AUTHOR INTERVIEW WITH ARDEN AZIM
ARTICLE ABSTRACT:
Background
Much of the skepticism toward online educational resources (OERs) in emergency medicine (EM) stems from the low barrier to publishing and a perceived lack of editorial rigor. Learners and educators have demonstrated unreliable gestalt ratings of OERs, suggesting a lack of capacity to consistently appraise these resources. The development of tools to guide clinicians and learners in the selection and use of blogs and podcasts is a growing area of interest. Disclosure of editorial process was identified in previous studies as an important quality indicator for OERs. However, little is known about editorial process in online EM resources and whether it can be reliably integrated into a critical appraisal tool.
Methods
Two reviewers assessed 100 top EM and critical care OERs for mention and description of editorial process and academic and nonacademic affiliations. Ninety‐two sites were accessible for review. All sites were also contacted to attempt clarification of their editorial process. Inter‐rater reliability for mention and description of editorial process was evaluated using Cohen's kappa, and the relationship between academic affiliation and disclosure of editorial process was assessed by odds ratio (OR).
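For readers less familiar with the statistics named here, the sketch below shows one way Cohen's kappa could be computed for two reviewers' yes/no ratings of whether a site mentions an editorial process. The function, the reviewer names, and the ratings are illustrative assumptions for demonstration only; they are not the study's actual analysis or data.

```python
# Minimal sketch (hypothetical data): Cohen's kappa for two raters' binary ratings.

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical (here binary) ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)

    # Observed agreement: proportion of items both raters labelled identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Expected chance agreement, from each rater's marginal rating frequencies.
    p_expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical ratings for 10 sites (1 = editorial process mentioned, 0 = not).
reviewer_1 = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
reviewer_2 = [1, 0, 0, 1, 0, 0, 0, 0, 0, 0]
print(round(cohens_kappa(reviewer_1, reviewer_2), 2))  # 0.74 for this toy data
```

Kappa corrects raw percent agreement for the agreement expected by chance, which is why two reviewers can agree on 90% of sites in this toy example yet have a kappa well below 0.90.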
Results
Eleven sites mentioned an editorial process, and 10 of these sites included a description. Five of the seven sites that responded to contact also described an editorial process. Inter‐rater agreement was excellent for mention (κ = 0.90) and description (κ = 1.00) of editorial process. Eighteen sites listed academic affiliations and 21 sites had nonacademic affiliations. A greater proportion of sites with academic affiliations disclosed their editorial process compared to sites without academic affiliations (OR = 5.3, 95% confidence interval [CI] = 1.3–21.0; difference in proportions of 0.40, 95% CI = 11.6–60.8 percentage points).
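To make the reported effect measures concrete, the sketch below works through an odds ratio, a Wald-style 95% confidence interval on the log-odds scale, and a difference in proportions from a 2x2 table. The counts are invented for illustration; they are not the counts underlying the OR of 5.3 reported above.

```python
# Minimal sketch (hypothetical counts): odds ratio, 95% CI, difference in proportions.
import math

# Hypothetical 2x2 table:
#                          disclosed   not disclosed
# academic affiliation          8            10
# no academic affiliation       5            50
a, b = 8, 10
c, d = 5, 50

odds_ratio = (a * d) / (b * c)                    # (a/b) / (c/d)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)      # standard error of ln(OR)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

diff_in_proportions = a / (a + b) - c / (c + d)   # risk difference

print(f"OR = {odds_ratio:.1f}, 95% CI {ci_low:.1f} to {ci_high:.1f}")
print(f"difference in proportions = {diff_in_proportions:.2f}")
```

An odds ratio above 1, as reported in the study, indicates that sites with academic affiliations had higher odds of disclosing an editorial process than sites without them.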
Conclusions
Although transparency is lacking, editorial processes exist among OERs. Inter‐rater reliability for disclosure of editorial process is excellent, supporting its use within critical appraisal tools.