Background

Residency programs are increasingly being asked to defend their quality, and that of the residents they produce. Yet “residency quality” is a construct that has not been well defined, with no accepted measures beyond meeting accreditation requirements. In 2009, the Association of Family Medicine Residency Directors developed a strategic plan that included the goal of raising the quality of family medicine training.

Objective

We describe the development of a quality improvement tool, the residency performance index (RPI), and its first year of use by family medicine residency programs as a “dashboard” to facilitate program self-improvement.

Intervention

Using program metrics specific to family medicine training, with benchmark criteria for each metric, the RPI was launched in 2012 to help programs identify strengths and areas for improvement in their educational activities and resident clinical experiences, which could then be tracked and reviewed as part of the annual program evaluation.

Results

Approximately 100 program directors began using the tool; 70 completed the process and were provided aggregate data. Initial review of this experience revealed difficulties with collecting data and a lack of information on graduates' scope of practice. It also showed the tool's potential usefulness as a program improvement mechanism.

Conclusions

The RPI is a new quality improvement tool for family medicine residency programs. Although some initial challenges need to be addressed, it shows promise as an aid to family medicine residency programs' internal improvement efforts.

What was known

Residency programs need data and metrics to conduct meaningful quality assessment and improvement activities.

What is new

A residency performance index (RPI) that creates a dashboard to facilitate assessment and self-improvement in family medicine residency programs.

Limitations

A single-specialty study reduces generalizability; volunteer participation introduces the potential for selection bias.

Bottom line

The RPI was well accepted and shows promise in aiding family medicine programs in improvement efforts.

Editor's Note: The online version of this article contains quality criteria and metrics of the residency performance index and a color-coded report reflecting performance relative to the metrics.

Graduate medical education has undergone tremendous change in recent years.1 From duty hour standards to a new accreditation system to threats of funding cuts, residency programs are increasingly being called on to justify their existence and defend the quality of the residents they produce. The Accreditation Council for Graduate Medical Education (ACGME) requires all programs to perform an annual program evaluation, which must include a systematic evaluation of curriculum, resident and graduate performance, faculty development, and program quality.2 It is therefore important for residency programs to engage in continuous program quality improvement.

To improve quality, it is helpful to have accepted measures of excellence. Widely agreed-upon quality metrics for family medicine programs do not currently exist, potentially owing to the different philosophies that underpin educational quality metrics. Some favor traditional educational metrics, such as board certification rates, entering resident qualifications, scholarly activity, and research grants.3,4 Others believe that more patient-centered criteria should be used, such as clinical performance measures or graduates' scope and quality of practice.5,6 Still others suggest a combination of the 2 approaches.7,8 Regardless of the proposed criteria, there has been little or no study of their effectiveness in improving residency program quality. In 2009, the Association of Family Medicine Residency Directors (AFMRD) developed a strategic plan that included the goal of raising the quality of family medicine education. The Board of Directors of the AFMRD proposed the development of the residency performance index (RPI), a “dashboard”-style tool to help programs in the process of self-improvement.9 Here we describe the process used to develop this tool.

The RPI was developed in a multistage process. The first stage involved 5 experienced program directors and a third-year resident reviewing the literature on quality assessment tools for residency programs. This group developed a consensus opinion on the quality criteria to be measured and the metrics for each. Quality measures were chosen by using a modified Delphi process, through which an initial list of more than 35 metrics was pared down to 12 measures. Criteria chosen by the committee included the following: (1) measures must be relevant to accreditation, board certification, or graduate practice; and (2) there must be a published accreditation standard or basis in the literature for at least 1 of the chosen metrics. This phase of the development process took approximately 18 hours over 3 months. The conceptual design was presented at the 2011 program directors' workshop. The criteria and metrics were then refined by using audience feedback and data from alpha testing. The criteria and corresponding metrics are shown in table 1 and provided as online supplemental material.
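To make the paring step concrete, the following minimal sketch (in Python) shows how one modified-Delphi rating round might be scored. The committee's actual rating scale and retention rule are not reported here, so the 1–9 scale, the median cutoff of 7, and the metric names are illustrative assumptions only.

    # A minimal sketch of one modified-Delphi paring round. The 1-9 rating
    # scale, the median >= 7 retention rule, and the metric names are
    # assumptions for illustration, not the committee's actual rules.
    from statistics import median

    def delphi_round(ratings, cutoff=7):
        """Retain candidate metrics whose panel median rating meets the cutoff."""
        return [name for name, scores in ratings.items() if median(scores) >= cutoff]

    # Six hypothetical panelists (5 program directors + 1 resident) rate
    # 3 of the 35+ candidate metrics on a 1-9 importance scale:
    ratings = {
        "metric_a": [9, 8, 9, 7, 8, 9],
        "metric_b": [8, 7, 9, 8, 7, 8],
        "metric_c": [2, 3, 1, 2, 4, 2],
    }
    print(delphi_round(ratings))  # ['metric_a', 'metric_b']; metric_c is pared away

In practice, such rounds would repeat, with panelists seeing the group's prior ratings, until the list stabilized at the final 12 measures.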

TABLE 1

Residency Performance Index (RPI) Quality Criteria^a

The resulting RPI is intended to assist programs in identifying program-level strengths and areas for improvement that could be analyzed and tracked. A “stoplight” convention of “red, yellow, and green” was adopted for the RPI, with green representing achievement of excellence in quality targets; yellow denoting adequate program quality but with room for improvement or caution; and red indicating metrics below accreditation standards, national norms, or targets promoted by family medicine professional organizations, based on published requirements or literature.
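As an illustration of how the stoplight convention maps a metric value to a color, consider the following sketch. The threshold values and the example metric are hypothetical placeholders; the actual targets were set by the RPI Task Force, and some metrics may not follow the higher-is-better pattern assumed here.

    # Illustrative sketch of the "stoplight" mapping. The thresholds and the
    # example metric below are hypothetical, not actual RPI Task Force targets.
    # Assumes a higher value is better; some metrics may invert this.
    from dataclasses import dataclass

    @dataclass
    class Metric:
        name: str
        green_at_least: float   # excellence target
        yellow_at_least: float  # adequate-quality floor

    def stoplight(metric, value):
        """Map a program's value for a metric to red, yellow, or green."""
        if value >= metric.green_at_least:
            return "green"   # excellence target achieved
        if value >= metric.yellow_at_least:
            return "yellow"  # adequate, but room for improvement
        return "red"         # below the standard, norm, or target

    # Hypothetical example: a board pass-rate metric with a 95% excellence
    # target and a 90% adequacy floor.
    board_pass = Metric("board pass rate (%)", 95.0, 90.0)
    print(stoplight(board_pass, 92.5))  # -> yellow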

The RPI drew on data already collected for the annual program evaluation that family medicine programs perform to meet ACGME requirements. All AFMRD program director members were invited to complete the online tool using data from their programs' annual program evaluation. The initial test of the RPI assessed the feasibility of administering the tool online, its acceptability to program directors, the ease of collecting and entering data, and the tool's utility for the annual program evaluation. Our analysis collated online comments solicited through a survey of participants after they received their RPI results.

The study was submitted to the American Academy of Family Physicians Institutional Review Board and deemed exempt.

The first version of the tool was launched as an online survey in the fall of 2012. It was sent to all family medicine program directors, with results available to program directors who were members of AFMRD. Approximately 100 program directors began using the tool, and 70 finished data collection with sufficient data to generate a meaningful report. The 30 program directors who did not complete data collection reported reasons other than lack of trust in the tool. Aggregate data from the 70 completing programs were the source of the median RPI values reported to programs for comparison with their own data. Each program received a color-coded report reflecting its performance relative to the metrics set by the RPI Task Force, using the “red, yellow, green” rubric previously described (an example is provided as online supplemental material). Programs were encouraged to use the report during their annual program evaluation for discussion, as motivation for self-improvement, and as a source of specific, measurable targets in striving for excellence. Collation of online comments (table 2) from program directors completing the tool revealed that most programs used the data in their annual program evaluation.
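The benchmarking step can be pictured as follows: submissions from completing programs are collated into a median per metric, against which each program compares its own value. In this sketch, the program names, metric key, and values are fabricated placeholders, not actual RPI submissions.

    # Illustrative sketch of collating completed submissions into the median
    # values reported back to programs. All data here are fabricated.
    from statistics import median

    submissions = {
        "program_a": {"board_pass_rate": 96.0},
        "program_b": {"board_pass_rate": 88.0},
        "program_c": {"board_pass_rate": 93.0},
    }

    def median_benchmarks(data):
        """Compute the median value of each metric across completed programs."""
        names = {m for program in data.values() for m in program}
        return {m: median(p[m] for p in data.values() if m in p) for m in names}

    benchmarks = median_benchmarks(submissions)
    print(benchmarks["board_pass_rate"])  # 93.0: the value a program compares against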

TABLE 2

Summary Comments From Users of the Residency Performance Index (RPI) Tool^a

A multistep consensus approach to creating a quality assessment tool for family medicine programs produced a visual display that allowed programs to compare their performance with national benchmarks. In its first implementation, the RPI appeared acceptable and feasible to use, and it was usually integrated into the annual program evaluation. Problems noted with its implementation included access to, and the time required for, data acquisition, particularly data on program graduates.

To our knowledge, this is the first US specialty-based comprehensive quality improvement tool for residency programs. The ACGME Next Accreditation System requires an annual program evaluation, which may be enhanced by the specialty-specific assessments and national benchmarks the RPI provides. The RPI could serve as an objective means of promoting rigorous internal program reflection based on comparison with a national data set. The AFMRD Board of Directors views this tool as a valuable element in its strategic plan to support continuously improving the quality of family medicine training programs.18 Given our experience, and with modifications based on user feedback and evolving program requirements during its second year of use in 2013–2014, we believe the RPI will mature over time and become more useful to programs.

Problems accessing data, particularly data on program graduates, may be difficult to overcome, yet this type of data is essential to creating validity evidence for the RPI. Otherwise, the tool may provide only measures of process and of current, rather than future, performance. Future steps in tool development and administration include the use of family medicine graduate data from a standardized survey tool now being developed jointly by the American Board of Family Medicine (ABFM) and the AFMRD. In addition, refining the metrics to reflect the ACGME Accreditation Data System national data set for the family medicine program requirements that took effect in July 2014 should provide more accurate measurement of patient encounter data. Finally, the possibility of ABFM Maintenance of Certification Part IV credit for program directors who use the tool as a practice improvement process may stimulate interest among a larger group of programs and produce a more robust data set for future analyses.

Limitations of our study include the fact that participants were volunteers, who may have been more motivated to complete and submit the online data and more likely to find the process acceptable, limiting generalizability to the population of all programs. The RPI was developed by a consensus process, and elements may have been omitted, particularly given the mandate to create the least burdensome tool. How assessments of quality using the RPI relate to future physician performance is not known and will be difficult to evaluate.

The RPI tool appears feasible and acceptable to many family medicine program directors and may aid in the annual program evaluation. Developing the tool through a broad consensus process proved practical and could be replicated by other specialties. The tool shows early promise in aiding family medicine programs in their annual program evaluation and improvement process.

The authors would like to thank Vickie Greenwood and Susan Quigg of the Association of Family Medicine Residency Directors for their role in the development and implementation of the Residency Performance Index tool.

1. Hackbarth G, Boccuti C. Transforming graduate medical education to improve health care value. N Engl J Med. 2011;364(8):693–695.
2. Accreditation Council for Graduate Medical Education. ACGME Common Program Requirements.
3. Iverson DJ. Meritocracy in graduate medical education: some suggestions for creating a report card. Acad Med. 1998;73(12):1223–1225.
4. Murray PM, Valdivia JH, Berquist MR. A metric to evaluate the comparative performance of an institution's graduate medical education program. Acad Med. 2009;84(2):212–219.
5. Asch DA, Epstein A, Nicholson S. Evaluating medical training programs by the quality of care delivered by their alumni. JAMA. 2007;298(9):1049–1051.
6. Giudice E, Carraccio C. Best evidence calling for educational reform: will we do the right thing? J Grad Med Educ. 2011;3(4):577–579.
7. Klessig JM, Wolfsthal SD, Levine MA, Stickley W, Bing-You RG, Lansdale TF, et al. A pilot survey study to define quality in residency education. Acad Med. 2000;75(1):71–73.
8. Rose SH, Long TR. Accreditation Council for Graduate Medical Education (ACGME) annual anesthesiology residency and fellowship program review: a “report card” model for continuous improvement. BMC Med Educ. 2010;10:13.
9. Hoekzema G, Abercrombie S, Carr S, Gravel JW Jr, Hall KL, Kozakowski S, et al. Residency “dashboard”: family medicine GME's step towards transparency and accountability. Ann Fam Med. 2010;8(5):470.
10. American Academy of Family Physicians. The Residency Program Solutions Criteria for Excellence 2011. 8th ed. http://www.aafp.org/rps. Accessed September 4, 2014.
11. Nothnagle M, Sicilia JM, Forman S, Fish J, Ellert W, Gebhard R, et al. Required procedural training in family medicine residency: a consensus statement. Fam Med. 2008;40(4):248–252.
12. Accreditation Council for Graduate Medical Education. Scholarly Activity Guidelines Review Committee for Family Medicine. 2012.
13. Kruse J. The patient-centered medical home: a brief educational agenda for teachers of family medicine. Fam Med. 2013;45(2):132–136.
14. Newton WP. Family physician scope of practice: what it is and why it matters. J Am Board Fam Med. 2011;24(6):633–634.
15. Tong ST, Makaroff LA, Xierali IM, Parhat P, Puffer JC, Newton WP, et al. Proportion of family physicians providing maternity care continues to decline. J Am Board Fam Med. 2012;25(3):270–271.
16. Blanchette H. The impending crisis in the decline of family physicians providing maternity care. J Am Board Fam Med. 2012;25(3):272–273.
17. Bazemore AW, Petterson S, Johnson N, Xierali IM, Phillips RL, Rinaldo J, et al. What services do family physicians provide in a time of primary care transition? J Am Board Fam Med. 2011;24(6):635–636.
18. Association of Family Medicine Residency Directors. Year-2 (2011–2012) strategic plan update.

Author notes

Grant S. Hoekzema, MD, is Immediate Past-President, Association of Family Medicine Residency Directors, and Program Director, Mercy Family Medicine Residency; Lisa Maxwell, MD, is Program Director, Christiana Care Health System Family Medicine Residency; Joseph W. Gravel Jr, MD, is Program Director, Lawrence Family Medicine Residency; Walter W. Mills, MD, is Associate Program Director, Natividad Family Medicine Residency, Department of Family and Community Medicine, University of California, San Francisco School of Medicine; and William Geiger, MD, is Senior Associate Program Director, Grant Family Medicine Residency, Grant Medical Center (Ohio Health).

Funding: The authors report no external funding source for this study.

Conflict of interest: The authors declare they have no competing interests.

Supplementary data