- continuing medical education
- continuing professional development
- longitudinal studies
- impact of CME
- cardiology
- cardiovascular risk
In this issue of the Annals of Family Medicine, Kiessling and her colleagues1 describe a randomized controlled trial of a case-based continuing medical education (CME) intervention associated with decreased mortality in Swedish cardiac patients. Though at first glance outdated (the study commenced in 1995 and observed its patient population for 10 years), there is a currency—even an urgency, considering the imperative of better patient care—to the study. There is also reason embedded in its findings for optimism about the effect of a carefully planned and implemented methodology on patient outcomes. The study allows many observations about evidence, physician practice, and the roles that an effective continuing education presence can occupy in health care and its quality and reform efforts, arenas in which this presence is often invisible, unconsidered, and neglected. These observations can be couched in terms of several questions: What do we mean by CME? Does it work? How would we know if it does? Where does it fit in the picture of health services, health care reform, and the implementation of best practice?
First, what is CME? To most, the acronym conjures visions of lectures and conferences: one-time-only activities using didactic methods to convey new information, generally shown to be ineffective in changing physician behavior.2 In the US context, this picture is confounded by the need for most physicians to claim credit by attending these lectures—providing an equally inadequate view of the field. So great is the negativity that CME can conjure that other terms have emerged, among them continuing professional development, lifelong learning, and maintenance of competence.
Regardless of its terminology, the Kiessling et al study describes a more complete picture of the methods of an effective, multiphase physician continuing education intervention. In this case, the intervention comprised a standard lecture, mailed distribution of guidelines to all participants, and an interactive, sequenced strategy in which randomly selected physicians' practices participated in several case-based seminars separated by work experience. The latter, more effective intervention permitted discussion of usual case presentations in the primary care setting, problem solving, and perhaps most importantly, learner engagement. There are several elements here that build on the literature supporting effective strategies—sequencing of learning activities and close attention to the adult learning principles of relevance, engagement, and interactivity. These tools, of course, are not the only ones available to the CME provider; other methods, ranging from academic detailing to reminders, offer a broad set of effective strategies to engage the physician learner.3 Taken together, they broaden the definition of CME.
Second, CME, defined more narrowly or in this more holistic fashion, is subject to the question, Does it work? Certainly it worked in this study and in many others.4 Perhaps a more important question here is, What do we mean by work? For many, an effective educational intervention improves or optimizes the competence of physicians—their ability to demonstrate knowledge, skills, or attitudes in the test or educational environment. For many others, performance is the reference standard, ie, the demonstrated behavior of clinicians in the work environment. For Kiessling and her coauthors, and perhaps for the health care system as a whole, however, patient care outcomes form the ultimate proof of concept. So many variables intervene between competence and performance, and between performance and health care outcomes, that a sizable voltage drop exists between each of these levels; the authors carefully use the phrase “associated with” as opposed to “caused by” in their final analysis of the effect of their intervention. Nonetheless, this attempt to mark that trajectory is rigorous and thus permits a discussion of CME efforts in the subtle context of health care by not placing CME in a solely educational environment parallel to, or even unrelated to, health care outcomes.
It is an unusual study that undertakes this effort.4 Ironically, Sweden was the setting for one of the first CME-to-patient-outcomes studies,5 giving rise to an optimistic, positivist response to those who believe it is important—but impossible—to track the effects of CME right to their impact on patients. Although the Swedish setting may be too homogeneous to be replicable in the United States or other world contexts, it appears that we are in a position to move beyond the does-it-work question. The robust literature in this area offers many insights about effective strategies; better and more current questions follow the lines of: which educational strategies, in which settings, and with which health professional and patient populations?
Third, how would we know that it worked? This study shows many elements conveyed by the descriptor elegant. It paid attention to strict randomization and cross-group comparisons, possibilities of bias, trial design, and educational theory. It used proximal measures (laboratory values, medication usage) as well as distal ones (mortality) to track its effects. It used carefully applied biostatistical principles. It built from a knowledge platform in which the clinical evidence and level of recommendation are clear, well developed, and robust. It is representative of an exquisitely pragmatic, longitudinal study, one of very few to follow the full course of the evidence-to-practice journey and to operate in a context that is both explanatory and supportive of policy directions.6 To be fair, these are extraordinarily difficult studies. Although such a study exemplifies expense and duration, it can and should answer questions about the efficacy and effect of CME—worth twice the price in any context.
Fourth, where does all of this discussion fit in the broader, US-focused health care system? Two sets of answers, representing research and practice, come to mind. From the research side, not all trials of CME interventions need to be as elaborate, but they must find a place within the latticework of a larger theoretical framework. In Canada, that framework is termed knowledge translation (KT),7 recognizing the complex and interdependent variables that affect the transmission of information and best practices to clinicians and health care settings and systems. In the United States and for much of the world, the term implementation science (IS) covers much of the same highly interdisciplinary terrain.8 From the practice side of CME, there is a strong association (the multiple intervening variables in the Kiessling et al study limit the use of the term causation) with population health outcomes of some significance. Thus, clearly CME has an important and necessary (if not quite sufficient) role to play in health care delivery.9 To do so, it cannot use exclusively ineffective, traditional methods, and it cannot exist in a world parallel to health care, perhaps created in part by commercial needs. It needs, of course, to use effective methods, to be anchored in the health system, to build on valid learner and patient needs, and to help develop a science of CME, integrated into KT and IS principles,10 thus shedding the negativity so often associated with it. There is clear evidence that this transition is occurring, at least among academic CME providers.11
As the world of health care is discovering, it is not pills or new investigations by themselves that save lives, but rather a holistic understanding of the journey that takes us from the development and localization of clinical evidence, to its widespread and effective transmission and adoption, to ultimate patient outcomes—and the importance of effective continuing education or professional development of physicians in that process. If we cannot say that CME saves lives, we can certainly claim from this study and many others that there is a strong association—one that we ignore at some risk to better patient care.
Footnotes
- Conflicts of interest: author reports none.
- Received for publication April 10, 2011.
- Accepted for publication April 11, 2011.
- © 2011 Annals of Family Medicine, Inc.