Abstract
PURPOSE We examined quality, satisfaction, financial, and productivity outcomes associated with implementation of Care by Design (CBD), the University of Utah’s version of the patient-centered medical home.
METHODS We measured the implementation of individual elements of CBD using a combination of observation, chart audit, and collection of data from operational reports. We assessed correlations between level of implementation of each element and measures of quality, patient and clinician satisfaction, financial performance, and efficiency.
RESULTS Team function elements had positive correlations (P ≤.05) with 6 quality measures, 4 patient satisfaction measures, and 3 clinician satisfaction measures. Continuity elements had positive correlations with 2 satisfaction measures and 1 quality measure. Clinician continuity was the key driver in the composite element of appropriate access. Unexpected findings included the negative correlation of use of templated questionnaires with 3 patient satisfaction measures. Trade-offs were observed for performance of blood draws in the examination room and for the efficiency of visits, with some positive and some negative correlations depending on the outcome.
CONCLUSIONS Elements related to care teams and continuity appear to be key elements of CBD as they influence all 3 CBD organizing principles: appropriate access, care teams, and planned care. These relationships, as well as unexpected, unfavorable ones, require further study and refined analyses to identify causal associations.
INTRODUCTION
Despite widespread pilot implementation and favorable initial results of the patient-centered medical home (PCMH),1 assessment of its impacts is in an early stage.2 Too little is known about the model’s effects to determine its impact on practice quality, satisfaction, or finances.3 Evidence is also lacking about relationships between individual elements of the PCMH model and specific beneficial outcomes.
In this study, in contrast to looking at the PCMH as a whole, we examined the relationship between individual elements of Care by Design (CBD), a comprehensive redesigned model of care that incorporates many elements of a PCMH, and multiple outcomes in quality of care, patient and clinician satisfaction, productivity, and operational costs.
The University of Utah’s Community Clinics introduced CBD in 2003. The model had 3 founding principles: appropriate access, care teams, and planned care. The transformation included expanded and new roles for support staff and redesigned workflows and processes.
Implementation initially focused on improved access with an emphasis on same-day appointments. Appropriate access was designed primarily to improve patient satisfaction. By 2006, the model had incorporated additional elements including team-based care and more comprehensive planned care. Care teams enhanced efficiency by using the time and skills of support staff, allowing clinicians to focus more on relationships with patients. Medical assistants (MAs) assumed increased responsibilities for many of the visit’s time-consuming tasks. This coordinated team care was also intended to increase quality of care through improved communication and information sharing with patients.
Planned care was implemented to enhance continuity and integration of care. New care team members, including clinical pharmacists, care managers, and, most recently, a transitions navigator, have contributed considerably to these goals.
We expected the revised work process and personnel changes involved in implementing CBD to improve quality, patient and clinician satisfaction, and productivity, while at the same time avoiding increased operational costs. After 10 years of experience with different phases of the transformation, we can now better assess the relationships between the elements of our model and multiple outcomes. This study was approved by the University of Utah Institutional Review Board.
METHODS
Setting
The University of Utah Community Clinics are a network of 10 university-owned clinics with approximately 70 primary care clinicians (physicians, physician assistants, nurse practitioners) in addition to specialists. The clinics provide primary and secondary medical care for approximately 100,000 active patients generating more than 200,000 primary care visits annually, as shown in Table 1 and in Supplemental Appendix 1 (available online at http://annfammed.org/content/11/Suppl_1/S50/suppl/DC1). Evolution of the clinics has been described elsewhere.4,5
Data Collection
Implementation Data
We assessed clinic CBD implementation 3 times between 2008 and 2011, using an internally developed tool with 28 measures: 6 for appropriate access, 14 for care teams, and 8 for planned care. Clinic management developed operational definitions for each of the 3 principles. We report data from the most recent assessment, July 2011, reflecting the fullest implementation and most comprehensive assessment of CBD. The instruments and assessments for the current analysis reflect the latest refinements and newest elements of redesigned care. We assessed the level of implementation of each element using an ordinal scale ranging from 0 (not implemented) to 4 (fully implemented). Criteria for the scale points were established by Community Clinics’ operations staff on the basis of internally developed benchmarks and expectations as detailed in Table 2 and in Supplemental Appendix 2 (available online at http://annfammed.org/content/11/Suppl_1/S50/suppl/DC1).
To evaluate the level of implementation of each CBD element, we compiled data from several sources. Quantitative data were obtained from chart audits and operational/administrative reports. For data obtained by chart audit, a sample of 5 patient charts per element, per clinician, was evaluated to determine the presence or absence of each element. Results were reported as a percentage for each element for each clinician. These percentages were used to calculate an implementation score of 0 to 4 and were averaged over clinicians at the clinic level to calculate clinic-specific implementation scores of 0 to 4. The samples relevant to each element are shown in Table 2 and depicted in Figure 1.
We gathered qualitative assessments from systematic direct observations of clinic processes and care team interactions, along with information provided by staff. Clinicians were observed during 5 successive patient visits. Observers were trained in the use of the data collection tool and scoring criteria. For elements evaluated by observation, we used percentages to derive a score from 0 to 4. For data pulled from operational reports, we converted percentages to a score from 0 to 4. The composite scores for each of the 3 areas (appropriate access, care teams, and planned care) were calculated by averaging the individual CBD element scores for each clinic. The overall CBD implementation score was calculated for each clinic by averaging the composite scores for the 3 areas.
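To make the scoring roll-up concrete, the following minimal sketch (in Python) shows one way the calculation could be structured. The percentage cut points and function names are hypothetical placeholders, not the internally developed benchmarks in Table 2; only the shape of the calculation (percentage to 0–4 score, averaged over clinicians, elements, and areas) follows the description above.

```python
# Minimal sketch of the implementation-score roll-up described above.
# The cut points below are hypothetical placeholders for the benchmarks
# in Table 2; only the structure of the calculation follows the text.
from statistics import mean

def percent_to_score(pct, cutoffs=(20, 40, 60, 80)):
    """Map an element's percentage (0-100) to an ordinal 0-4 implementation score."""
    return sum(pct >= c for c in cutoffs)

def clinic_element_score(clinician_percentages):
    """Average clinician-level 0-4 scores for one element into a clinic-level score."""
    return mean(percent_to_score(p) for p in clinician_percentages)

def area_composite(element_scores):
    """Composite for one area (appropriate access, care teams, or planned care)."""
    return mean(element_scores)

def overall_cbd_score(access, care_teams, planned_care):
    """Overall clinic CBD score: average of the 3 area composites."""
    return mean([access, care_teams, planned_care])
```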
Outcome Data
We used 21 measures of chronic and preventive care services (adapted from the Medicare Care Management Performance demonstration project6) to assess clinical quality. These data were reported at the clinician level as the number of eligible patients (denominator) and the number of eligible patients for whom the standard was met (numerator). Clinician-level scores were rolled up into a percentage of patients meeting the standard for each measure for each clinic. Scores ranged from 0% to 100%.
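One plausible reading of this roll-up, sketched below, pools the numerators and denominators across a clinic’s clinicians for a given measure; whether patients were pooled or clinician percentages were averaged is not specified here, so the example and its counts are illustrative only.

```python
# Illustrative clinic-level roll-up for one quality measure, assuming eligible
# patients are pooled across clinicians (an assumption; the text does not say
# whether pooling or averaging of clinician percentages was used).
def clinic_quality_percentage(clinician_counts):
    """clinician_counts: (numerator, denominator) pairs, one per clinician."""
    met = sum(n for n, _ in clinician_counts)
    eligible = sum(d for _, d in clinician_counts)
    return 100.0 * met / eligible if eligible else None

# Example with invented counts for 3 clinicians on a single measure:
print(clinic_quality_percentage([(18, 25), (30, 40), (12, 20)]))  # about 70.6%
```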
We measured patient satisfaction with Press Ganey’s Medical Practice Survey7 adapted from a visit encounter survey originally developed by Ware and Hays.8 This questionnaire includes 8 domain-specific items with high internal consistency (Cronbach α = 0.89) and high correlation with an overall satisfaction item (P <.001).7 These data were reported at the clinician level as the percentage of “top-box” (very satisfied) responses. These scores were averaged across all clinicians in a given clinic to determine the clinic-level patient satisfaction scores. Scores ranged from 0% to 100%.
We used responses to 15 items from the American Medical Group Association (AMGA) questionnaire to measure clinician satisfaction. This survey, based on the Physician Worklife Survey9 and the Primary Care Physician Survey10 with additional items evaluated by AMGA, includes 12 dimensions with average α coefficients of 0.78.11 Responses to the clinician satisfaction questionnaire were reported at the clinic level as the percentage of very satisfied/strongly agree responses. Scores ranged from 0% to 100%.
To measure financial performance and productivity, we used 9 measures derived from clinic operations data including various operational ratios such as staff cost per clinician.
Statistical Analysis
We report correlations between CBD implementation scores at the clinic level and quality measures, patient satisfaction, clinician satisfaction, and our financial and productivity parameters at the clinic level. All measures, including the implementation scores, were converted to clinic ranks, with the lowest-performing clinic ranked 1 and the highest-performing clinic ranked 10. For example, the clinic with the lowest proportion of patients who responded “very satisfied” was ranked 1, whereas the clinic with the highest proportion was ranked 10. We applied the same ranking approach to the other measures. The numeric scaling represented ordinal rather than strictly continuous, quantitative assessments. Given both the small number of clinics in our sample (N = 10) and the rank-ordered measures, we selected the Spearman rank correlation for our analyses. We set a P value ≤.05 for the correlation coefficient as the threshold for statistical significance.
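As an illustration of this analysis, the sketch below ranks 10 clinics on an implementation score and an outcome and computes the Spearman rank correlation. The data values are invented for demonstration, and availability of NumPy and SciPy is assumed.

```python
# Illustration of the clinic-level Spearman analysis described above.
# All values are invented for demonstration; scipy/numpy are assumed available.
import numpy as np
from scipy.stats import rankdata, spearmanr

implementation = np.array([1.2, 2.5, 1.8, 3.1, 2.0, 2.7, 1.5, 3.4, 2.2, 2.9])  # CBD scores
satisfaction = np.array([55, 70, 60, 82, 66, 75, 58, 80, 64, 78])  # % top-box responses

# Rank clinics from 1 (lowest performing) to 10 (highest), mirroring the text.
impl_ranks = rankdata(implementation)
sat_ranks = rankdata(satisfaction)

# Spearman correlation on the ranks (equivalent to applying it to the raw values).
rho, p_value = spearmanr(impl_ranks, sat_ranks)
print(f"Spearman rho = {rho:.2f}, P = {p_value:.3f}")
```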
RESULTS
Table 3 provides a compilation of the 2011 summary statistics including means, SDs, and ranges for measures of CBD implementation, patient satisfaction, clinician satisfaction, and financial data. The overall mean (SD) implementation score was 1.94 (0.20) on a scale of 0 to 4. Results revealed a range of scores among the CBD elements.
Results of correlation analyses between CBD implementation scores at the clinic level and each of the sets of outcome variables at the clinic level are reported in Supplemental Appendix 3 Tables A through D (available online at http://annfammed.org/content/11/Suppl_1/S50/suppl/DC1). A summary of correlations significant at P ≤.05 is presented in Table 4.
Multiple CBD elements related to team function were positively correlated with quality, patient satisfaction, and clinician satisfaction measures. Contacting patients after hospital discharge was positively correlated (P ≤.05) with 5 quality measures, including those for diabetes and preventive care. Use of an After-Visit Summary was positively correlated with 4 quality measures, including ones for coronary artery disease, diabetes, and congestive heart failure. Presence of advance directives was positively correlated with diabetes quality measures. Medication reconciliation was positively correlated with patient heart failure education. MA engagement, huddles (brief team meetings), and clinician continuity were positively correlated with quality measures.
The overall care team implementation score was positively correlated with patients’ overall satisfaction and clinicians’ satisfaction with time spent working and their patient relationships.
Elements related to continuity had positive correlations with patient satisfaction. Primary care clinician continuity was positively correlated with patients’ likelihood of recommending their clinician. The composite for appropriate access, which contains the key driver of primary care clinician continuity, was positively correlated with patient satisfaction with explanations and instructions, and with the likelihood of recommending the clinician. MA continuity was positively correlated with patient satisfaction with the friendliness of the clinician.
Results also showed a number of potential unintended and unfavorable consequences in terms of correlations. The use of templated questionnaires (X-files) was negatively correlated with 3 patient satisfaction measures: explanations, instructions, and likelihood to recommend.
Some correlations suggest that trade-offs may be necessary. Performing blood draws in the examination room was positively correlated with 2 patient satisfaction measures, as well as 1 clinician satisfaction measure; it was negatively correlated, however, with the efficiency measure of work relative value units (WRVUs) per clinician. An efficient visit was positively correlated with patients’ satisfaction with the wait time and negatively correlated with clinicians’ satisfaction with the time spent with patients. Additionally, it had negative correlations with 3 quality measures.
DISCUSSION
In this study, we conducted a preliminary assessment of the relationships between multiple individual elements of our PCMH model and 4 important types of outcomes. Identifying these associations provides insights about what the “active ingredients” of the PCMH model might be. Our analyses suggest several encouraging relationships, but also reveal some potential unintended negative outcomes.
Two dominant themes emerged from our data: the importance of team-based care and the importance of continuity of care. Our original care team consisted primarily of microteams with the MAs working in an expanded role. Over time, we added clinical pharmacists and care managers, who are responsible for engaging patients in their treatment as well as participating in transitions management. Team functions span a number of CBD elements for which there are positive correlations with quality outcomes as well as patient and clinician satisfaction. The numerous positive correlations between quality outcomes and multiple team functions highlight some of the most critical functions of team members, including patient contact postdischarge, provision of After-Visit Summaries, use of advance directives, and medication reconciliation. The positive correlations for primary care clinician continuity and MA engagement suggest that the entire team plays a positive role in delivering quality health care. The clinician and patient satisfaction elements associated with the composite team score indicate that patients value the team and that it enhances the clinicians’ relationships with patients and helps them be more efficient.
Continuity of care as seen in our measures includes primary care clinician continuity, MA continuity, and the composite appropriate access element. Primary care clinician continuity in the composite measure appears to be the driver of patient satisfaction with the explanations and instructions they received. These findings suggest that continuity with a clinician positively affects patients’ perceptions that they are receiving personally relevant care. Patients appear to be more likely to recommend their clinician if they have established a continuous relationship. Continuity with the team MA adds cohesiveness and contributes to a positive perception of the clinician. Continuity with the clinician and the team appears to be important to the patient and to further the provision of better, safer care.
Our study revealed not only the intended relationships discussed but also some potential unintended consequences. The MAs’ use of templated, symptom-based questionnaires was intended to increase efficiency, allowing clinicians to relate more personally to patients, thereby enhancing patient satisfaction; however, our analysis revealed that this use was negatively correlated with 3 patient satisfaction measures. We speculate that in our effort to enhance efficiency and patient satisfaction, the mechanized delivery of the questionnaire was not well received by patients. Additionally, although we expected to see positive correlations between use of Best Practice Alerts and clinical quality, we found none.
Our analyses also suggest that in a multifaceted transformation certain trade-offs or tensions exist. In the development of CBD, performing blood draws in the room was intended to increase the efficiency of the visit for the patient. We found that this practice was positively correlated with patient satisfaction measures as well as with clinicians’ satisfaction with their relationship with their patients, but it was negatively correlated with the efficiency measure of WRVUs per clinician. Drawing blood in the examination room could slow down the workflow. Alternatively, it could be that when clinic volumes were high, patients were instead sent to the laboratory.
Although efficiency is often viewed in terms of efficiency for clinicians and staff, it should also be viewed from the patient’s perspective. The correlations associated with our measure, the efficient visit, also demonstrate the trade-offs that occur when implementing transformation. Although patients find an efficient visit satisfying, clinicians perceive that it negatively affects their satisfaction with the time they spend with patients. Three quality measures were also negatively correlated with the efficient visit, suggesting that if there is an emphasis on efficiency, quality may suffer.
We found that several individual elements of CBD correlated positively and negatively with multiple types of outcomes. These findings illustrate the importance of monitoring both intended and unintended consequences of practice redesign.
Interpretation of this study must be tempered by its limitations. Our correlation analyses identified important associations between elements of our care model and several important outcomes. The absence of significant correlations, however, does not rule out the possibility that other important relationships may exist.
Our analyses were limited to CBD implementation data collected through chart review (5 patients per clinician) and half-day clinic observations, using an assessment tool developed for use by clinic operations and modified over time to reflect evolution of the model. Data obtained using this tool were transformed into a 5-point ordinal scale based on subjective benchmarks developed by staff. Further, our data reflect a small number of clinics, only 10, in a single system.
Contextual factors are likely to affect the applicability of our findings to other health care systems. Although we identified significant correlations among our CBD elements and outcomes, our data do not permit determination of cause and effect. Our results may reflect the impact of other factors influencing both our model elements and our outcomes (Supplemental Appendix 1, Context Matters, available online at http://annfammed.org/content/11/Suppl_1/S50/suppl/DC1).
There are certain limitations related to the statistical methods and data that should be noted. Because of the number of correlations generated, some of the significant ones reported may be due to type I errors. Although the correlations should therefore be interpreted cautiously, the consistent finding of significant relationships between specific implementation criteria and outcomes reinforces our conclusion that many of the identified associations are not attributable to type I error.
Correlations were all generated at the clinic level because the implementation involved clinic-level interventions. Variation among clinicians and patients within clinics may, however, provide additional important insights into the relationship between implementation and outcomes. Such nesting of patients within clinicians and of clinicians within clinics was not considered in these analyses, but it merits attention in further research. We assumed a monotonic, not necessarily linear, relationship as the basis for adopting Spearman rank-order correlations rather than parametric regression analysis, which requires stricter distributional assumptions.
In sum, to our knowledge, this is the first report relating implementation of specific PCMH elements to specific measures of clinical quality, patient and clinician satisfaction, and financial performance. Our evidence suggests that particular elements of a PCMH are associated with several important outcomes. Some of these associations were consistent and as intended, whereas others were not. We found multiple positive correlations related to team functions across all outcome measures, including quality of care and patient and clinician satisfaction. We conclude that all members of the team are important to delivering high-quality care and enhancing the patient experience. Our findings are consistent with those of others who have found continuity of care to be associated with improved patient care and patient satisfaction. These associations support the principle that successful medical homes must foster personal, healing relationships.
Our findings suggest several future directions for research. The elements of care teams and continuity of care are inherent to the structure and success of the PCMH; however, the comprehensive system as a whole is likely more important than any individual elements. Further, PCMH implementation is a dynamic process, with changing relationships between individual elements. The context in which one operates may have considerable impact on individual elements as well as the overall system design. Full evaluation of PCMH implementation will require complex mixed methods studies to identify the most productive approach to primary care redesign.
Acknowledgments
We thank Ken Gondor, IT Specialist, University of Utah Community Clinics, for assistance with data collection; Lisa Simpson, Data Analyst, Department of Family and Preventive Medicine, School of Medicine, University of Utah, for assistance with data analysis; and Jennifer Tabler, MS, Department of Sociology, University of Utah, for assistance with literature review and manuscript preparation.
Footnotes
- Conflicts of interest: authors report none.
- Funding support: This project was supported by grant R18HS019136 from the Agency for Healthcare Research and Quality, Michael K. Magill, MD, Principal Investigator, and in part by funding from the National Institutes of Health (NIH) grant 1KM1CA156723 to Jaewhan Kim.
- Disclaimer: The content of this article is solely the responsibility of the authors and does not necessarily represent the official views of the Agency for Healthcare Research and Quality or NIH.
- Received for publication July 1, 2012.
- Revision received November 7, 2012.
- Accepted for publication November 19, 2012.
- © 2013 Annals of Family Medicine, Inc.