Do you have a burning question for someone in the grantor space, or are you in need of an answer to a frequently asked, grant-related question? The Almanac is excited to launch a new series, “Ask the Grantor.” Created by and for Alliance members, this series offers a place where those questions can be answered for all to see. Because grantors' policies and procedures differ, we have asked multiple Alliance members from the Industry Alliance for CE section (IACE) to weigh in on the questions. Both questions and answers will remain anonymous.
Question 1: Most grant portals ask for “total reach” but do not define what can be counted in that number. Most supporters also don't clearly state whether or not they follow the Outcomes Standardization Project (OSP). When counting total reach, can you clarify who can be included and who can't? Do views on social media count? How long does a participant have to engage with the content to be counted?
Response 1
Although there have been attempts to standardize language for the definitions of those who engage in CME/CE activities (“participants,” “learners,” etc.), the CME/CE community has yet to align. Only 34% of accredited providers accept commercial support, so many providers rely on their accrediting body's definition. Supporters are subject to their own internal policies and reporting requirements and, therefore, use their own definitions.
However, more and more supporters are focused on the quality of the learners, not the quantity. Rather than counting brief clicks from individuals who may not learn anything (or have their knowledge reinforced), consider how many learners stay to consume the content. While claimed credits may not be the best metric for assessing completion, consider how many learners completed the activity in its entirety (a post-test may not be the best metric, either). Because time spent in an activity varies by format (live vs. online), many supporters would simply prefer to know that the activities they supported were compelling enough for learners to stay awhile.
Think of it like a movie: Does the audience leave after the previews? After the first act? Or does the audience stay until the end of the credits because the movie is so compelling?
Response 2
When applying for grant support, it is very important to be clear about the goal of the initiative and how, specifically, the impact of the initiative will be measured. If the goal is to create awareness, recognition or simple reinforcement in a passive manner, perhaps “reach” or “impressions” are sufficient.
For example, say the goal is to address a healthcare gap, and the proposal describes an interactive or engaging educational design meant to examine new information, help answer a question, or aid future clinical decision-making. In that case, it is more appropriate to report how many clinicians will move beyond the front matter and actually interact or engage with the education in some way. In general, the number of learners engaging with content is the most meaningful metric in this circumstance (and “reach” less so).
In some cases, a combination of metrics may be helpful when an initiative includes multiple components with different goals (e.g., predisposing, learning, reinforcement). Identifying within the application which metrics will be collected is important, and those metrics should align with the overall goals and design of the initiative. To contextualize learning metrics (e.g., objective/subjective interactivity questions, pre-/post-questionnaires, verbatims), clearly indicate the number of learners anticipated to contribute such feedback; this sets expectations at the outset. Finally, while supporters may or may not identify the OSP specifically as the source they use to define a learner, it should be noted that the OSP definitions were developed and vetted across the IME industry, with input from multiple experts at different types of organizations, and they provide reasonable guidance. When in doubt, consider which number will be most meaningful given the purpose of the initiative being proposed.
Response 3
You asked, “Most grant portals ask for total reach but do not define what can be counted in that number.”
- This is an observation I’ve heard a lot. In the absence of standardized language across all grant portals, as a provider of continuing education, you should clearly define your organization’s methodology and rationale for how you measure activity participation. For example, if you are counting individuals who click on a link in the activity, clearly state why they are being counted, what can be learned from the “click-throughs,” etc. Each type of engagement with the content should be clearly differentiated and help elucidate something different. Also, be consistent in your interpretation from one grant request to the next.
“Most supporters also don't clearly state whether they follow OSP or not. When counting total reach, can you clarify who can be included and who can't?”
- Before you submit a grant request, you can ask compliant questions about supporter expectations for participation metrics so that you are not blindsided after the fact. If the information is not available in the grant system, request clarification so that you can determine if your instructional design and/or educational platform can accommodate their expectations. You may need to collaborate in order to ensure you are capturing the right data, or you may decide not to move forward.
“Do views on social media count?”
- It’s unlikely that views alone would be acceptable as a metric to determine educational value, engagement, participation, etc. What purpose does social media serve based on the gaps and objectives? Know your “why” so you can articulate the value and impact of your approach.
“How long does a participant have to participate in the content for them to be counted?”
- When in doubt, always return to the application of adult learning principles. Unless your goal is merely to get eyes on the content, determining what “meaningful” engagement looks like for your target learners should happen in the design phase.
Question 2: I work for a medical education company (MEC) that is accredited by the ACCME. We do not accept any commercial support, but occasionally other organizations reach out to us for accreditation services. If those activities are grant-supported, can you recommend resources to ensure compliance, or any tips/cautions for beginners in this area?
Response 1
The best resources to ensure compliance are your own policies and your accrediting body (or bodies). Too often, accredited providers do not monitor the activities they jointly provide, which could impact their accreditation status. If you are considering jointly providing commercially supported activities, keep the following in mind:
- Have documentation that sets clear expectations about what is required before you get started. Get a partnership agreement in writing (signed by all parties involved) outlining roles, responsibilities and how funds will be handled. Ensure this is included in grant requests.
- Explain accreditation requirements, particularly the Standards for Integrity and Independence (identification and mitigation of relevant financial relationships, management of commercial support, interactions with ineligible companies). Send (and obtain) copies of internal compliance policies and standard operating procedures (SOPs) to ensure everyone knows the rules.
- Be involved at every step of the way, including reviewing content and materials, as well as all grant requests. You wouldn’t want to get a finding of non-compliance because a presentation was extremely biased or the educational partner disclosed support using a logo.
- Get the history of the non-accredited provider, and check relevant accreditor websites (ACCME, ACPE, ANCC, etc.) to make sure they aren’t on probation.
- Ensure the provider has engaged in successful endeavors of this kind (ask for references!).
- Vet their CE staff (who are they, how long have they worked in CE, do they have support staff?).
Ultimately, it’s your accreditation — and reputation — if an activity goes awry.
Look forward to the next installment of “Ask the Grantor” coming soon. Did this article spark a grant-related question for you? Email your question to almanac@acehp.org, and it will be submitted to Alliance volunteers for consideration in an upcoming article.