Research to Actions Interview With Andy Crim, MEd, CHCP, FACEhp

In this interview, we sit down with Andy Crim, director of education and professional development at the American College of Osteopathic Obstetricians and Gynecologists. We’ll discuss his 2016 JCEHP publication, “Comparison of the Effectiveness of Interactive Didactic Lecture Versus Online Simulation-Based CME Programs Directed at Improving the Diagnostic Capabilities of Primary Care Practitioners.”

As with each interview, we explored our four basic research-to-actions questions in our discussion, and we hope to help the community find some actionable lessons from Andy's recent research.

Here is a quick summary of what Andy shared:

  1. What was the problem or question you set out to answer?
    • We wanted to see whether an online activity grounded in the learning sciences was less effective than, as effective as, or more effective than a traditional CME format at improving a primary care provider's ability to diagnose joint pain. An online tutor had been used in our university to help students learn to differentially diagnose many conditions and had been shown to be effective. We wanted to test whether its effectiveness extended to providers in practice.
  2. What were the methods you applied to answer the question?
    • We conducted two CME activities, one face-to-face and one online (enduring). Both were structured using best practices from the literature, and controls were put in place to account for content and delivery variations. The live activity was held in conjunction with an annual conference, and the online activity was hosted on our website. Each participant took the same pre- and post-tests, which consisted of cases representative of the content and featured seven conditions for which differential diagnosis should be performed.
  3. What did you learn?
    • There was no statistically significant difference in pre-test responses between the two cohorts, which is consistent with published studies. On the post-test, however, the online cohort demonstrated significantly greater improvement in diagnostic accuracy (~2x) than the live group, which showed no statistically significant improvement. Interestingly, the online cohort also completed the activity in half the time. The learn-practice-feedback-try-again approach used in the online activity worked much more effectively than it did in the live setting.
  4. How do you think this could be applied in practice?
    • Educational activities intended to improve diagnostic performance should include multiple practice opportunities with immediate feedback.
    • Online activities that teach differential diagnosis can be effective, both in educational result and in cost.
    • A codified set of learning principles, sound instructional design, and an appropriately executed activity support participants in transforming the declarative knowledge presented during an activity into procedural knowledge, making it more likely they will develop the competencies needed to improve outcomes.

If you learned something from this episode, please share the lessons and the link with your colleagues. The Almanac is open access, meaning everyone in your organization or professional social network can benefit.

Please feel free to reach out if you have suggestions on folks you’d like to see us interview. Or maybe there are published articles you would like to see deconstructed or simplified. Just let us know. You can contact me through LinkedIn or Twitter @briansmcgowan.

Keep in mind that with every educational program we build, there are a thousand opportunities to ask a research question, and with every research article that is published there are dozens of lessons to learn. You don’t have to be a research scientist to build great training experiences, but you do need to embrace what the literature says and move past the status quo.

Thanks for joining us and until next time, NEVER STOP LEARNING.
