Do You Ever Struggle With Regularly Scheduled Series (RSS)? We Do.

By Marci Fjelstad, MPH, MBA, CHCP; Sahar Pastel-Daneshgar; Trisha Veenema; Kari Durrant (University of Utah CME team) 

We run more than 140 regularly scheduled series (RSS) each year for our program, UUCME, which is decentralized: our office enforces and tracks compliance with ACCME criteria and policies, while our planning partners develop the specific content.  

We work with planning partners throughout our organization, across varying specialties, as they develop series that address their specific professional practice gaps. Our relationships with planning partners (directors and coordinators) are especially crucial to our program's success. Planning partners generally engage with learners directly, so we work with them to facilitate effective learning and evaluation. Planning partners are healthcare providers who are typically very busy taking care of patients, so it falls to us to be as clear, simple, and efficient as possible while still meeting our needs for high-quality education. Our general philosophy: make it easy, make it valuable, win friends, and improve patient care.  

Here are some of our challenges and ideas, as we are trying to turn challenges into opportunities. If you've had similar experiences, we'd love your feedback!

First, and I'm sure this will come as a shock to everyone, we had low learner response rates on RSS evaluation surveys. Most of our RSS use an annual survey sent to learners that asks about changes in practice over the past year, along with a few other questions. We write the digital survey, distribute it to our planning partners, and ask them to email it to their learners. We hear about survey fatigue, email overload, and provider burnout, so we expect a lower response rate. But these results are crucial to helping our planning partners determine whether their RSS is producing the changes they intend and closing the professional practice gap, and if not, how things should be adapted.  

So, our challenge: How can we improve response rates from learners, when we are downstream and talking with the planners? 

We had already moved to a once-a-year survey of RSS learners in an attempt to address survey fatigue. In the past, we had noticed that some coordinators delayed sending the survey request to learners, which limited the time learners had to complete the survey. So, this year, we decided to draft sample language for the coordinators to send to their learners. This allowed us to control the message that went to learners, encouraging response. It made life easier, more efficient, and simpler for our planning partners — all they had to do was copy and paste. Seems obvious, right? 

In the past, we wanted the planners to put the request in their own language, reflecting the unique makeup of their learners. But drafting copy/paste language alone won us more friends and made a difference. Planning partners who would previously delay or skip sending the survey contacted us to thank us for the innovation. We have also noticed that some learners' responses are now more complete and useful, and more reflective of improvements in patient outcomes.

Another idea we implemented this cycle was a random gift card drawing for coordinators who sent out the survey by a specific date. We offered a single $50 gift card. We weren't sure people would really respond to a roughly 1-in-140 chance of winning. Again, we were surprised by the "buzz" we stirred up among our coordinators! We loved being able to acknowledge their work and their part in this important process. Plus, they loved being recognized a bit for what they do. 

A second challenge for our RSS is learner engagement. We hear from our healthcare providers that it can be difficult to pay attention during yet another weekly lecture, or to absorb information that doesn't directly apply to their practice that week. We know a series can be a powerful tool for longitudinal, "plant the seed" learning, so we want the learning to engage and stick with the learners. We work with planners in many ways to create more engaging learning experiences across all of our activities. Turning this challenge into an opportunity, over the past couple of years we've tried a new approach: using Continuing Certification Lifelong Learning and Self-Assessment (aka MOC Part II) credit to help engage learners. We know that offering MOC credit adds value, and we want to help our physician learners meet their Board requirements as much as we can. And MOC is all about making learning more interactive, more personal, and more about participation rather than attendance alone.  

Sahar Pastel-Daneshgar, our MOC coordinator, works with series planners who express an interest in MOC to incorporate the requirements into their series. Typically, ACCME provides guidelines that explain the Board requirements and give us a mechanism to report credit. (See ACCME resources.) We ask our planning partners what a typical session of their RSS is like and then talk through the specific requirements with them. Our goal is to identify areas where their RSS is already working toward participation, add a few assessment/evaluation components, and then tweak the process a bit to help us all get what we need to provide the credit. An example: we identified that treatment planning conferences already include a case discussion component as part of their learning format. So, why not work with them to offer MOC credit? 

We are still examining how these changes have worked, and if we have made a difference for our RSS, our learners and ourselves. We will update you again with results. 

What have you done to improve learner response rates? How are you working to improve learner engagement? Please feel free to share your feedback with us. We'd love to try some of your ideas, too! 

