Enhanced Performance Measurement with Case-Based, Follow-up CE

By Katie Robinson, PhD, CHCP, Associate Director of Medical Research and Outcomes, Vindico Medical Education

Introduction

In the era of digital information, busy clinicians are inundated with clinical evidence that should guide their practice. Repetitive education can help them grasp the subject matter and translate new knowledge into clinical practice. For this reason, CE that is part of an educational curriculum may be regarded as more impactful for learning than a standalone program. Additionally, assessment of performance change in continuing education in the health professions (CEhp) is traditionally too subjective (follow-up surveys), too expensive (integration into a health system's EHR), or too delayed (claims data).

To that end, Vindico created an education model, Precision Decisions: Realistic Cases (PDRC), that can be used to reinforce content while subjectively and objectively measuring practice change. The model has been successfully adopted across several therapeutic areas.

Implementation and Impact of Precision Decisions: Realistic Cases (PDRC)

Vindico initiated the PDRC curriculum in 2018 with separate curricula designed for neurologists, oncologists, and rheumatologists. A schematic of the program design is highlighted in Figure 1.

Figure 1: Schematic of the PDRC model described herein.

Engagement: As an illustrative case, 261 emails were sent to completers of either a live or web rheumatology activity. Of them, 45 providers (17%) participated in the follow-up CME. This percentage is about three to five times higher than participation rates observed among rheumatology providers for standard, non-accredited, non-incentivized follow-up assessments (eg, surveys). Importantly, the 17% participation rate has also been observed in PDRC activities offered by Vindico in other specialties, suggesting that incentivization with CE credit may be a broadly applicable strategy to increase engagement in follow-up education (Table 1).

Table 1: Participation rates of PDRC activities observed across various specialties are higher than observed rates from non-accredited, non-incentivized follow-up assessments (eg, surveys).

Reinforcement of Content: The rheumatology program was designed to reinforce knowledge of treatment options for a patient with RA who has failed first-line biologic treatment with a TNFi. On the evaluation of the PDRC activity, 100% of learners noted that the activity reinforced their knowledge of this subject matter. Additionally, pre-test scores on the PDRC activity were 30% higher than pre-test scores from control participants who did not engage in the initial CE (live or web), suggesting that those who participated in the initial CE had higher baseline knowledge of this subject.

Subjective Measures of Performance: At the start of the rheumatology PDRC activity, 94% of learners indicated that they had implemented changes into practice as a result of the initial CME, either the live or web component.

Objective Measures of Performance: Throughout the rheumatology PDRC, learners answered two practice challenge questions that served as surrogate markers for clinical decision making within their own practice. First, PDRC participants were more likely than control participants to select a non-TNFi after failure to achieve the therapeutic target with methotrexate (MTX). Similarly, after failure to achieve the therapeutic target with a TNFi, PDRC participants were 27% more likely than control participants to switch the patient to a non-TNFi.

On an individual level, 89% of learners who had preferred to cycle TNFis rather than switch to a new biologic class after TNFi failure would now choose to switch to a non-TNFi. Interestingly, all but one (96%) of the individuals who would now prefer switching over cycling noted that they had implemented practice changes, suggesting a high correlation between self-reported practice change and actual change in practice. While this percentage is consistent across the specialties in which Vindico has executed the PDRC model, the n values are small.

Discussion

The assessment of performance change in CE is challenging, with major barriers including subjectivity of data, access to electronic health records, costs, and timeliness of data collection. The PDRC model is specifically designed to overcome many of these barriers while also reinforcing learning and allowing learners to obtain additional CE credit. By coupling careful educational design with the latest technology for data collection, analysis, and communication, the PDRC model delivers the following:

  • Participation rates are higher than those of traditional follow-up performance assessments, particularly among specialized audiences, suggesting that the potential to earn CE credit incentivizes follow-up participation.
  • Follow-up CE reinforces knowledge: Those who participated in an initial CE activity scored approximately 30% higher on the pre-test than those who did not.
  • PDRC allows for self-reporting of implementation of practice changes due to initial CE.
  • Carefully-constructed practice challenge questions allow for objective measurement of practice changes on both an individual and aggregate level.
  • There is a high correlation between self-reported implementation of practice change and practice change measured objectively.

The success of the PDRC model depends on the content of the follow-up case being closely related to the content of the initial CE. Open-ended, practice-based questions must be carefully constructed to reflect realistic clinical decision making. Engaging expert faculty who truly value the role of CME in encouraging practice change can streamline the process. Moreover, having the appropriate software for data collection, analysis, and email management is also vital for successful program execution.

Conclusion

The PDRC model is an effective method to encourage follow-up participation, reinforce knowledge, and provide both subjective and objective measures of performance change without relying on expensive or untimely alternatives, such as chart review or claims data. Expanded use of this design may be a valuable tool to obtain more meaningful measures of performance impact while reinforcing critical learning points for practicing clinicians.

Support

The initiatives discussed herein were supported by educational grants from AbbVie, Inc.; Amgen; Bristol-Myers Squibb; Genentech, Inc.; Lilly USA, LLC; Pfizer, Inc.; Sanofi Genzyme; and Sanofi Genzyme and Regeneron Pharmaceuticals.
