Abstract
PURPOSE The objective of this study was to elucidate the effect of facilitation on practice outcomes in the 2-year patient-centered medical home (PCMH) National Demonstration Project (NDP) intervention, and to describe practices’ experience in implementing different components of the NDP model of the PCMH.
METHODS Thirty-six family practices were randomized to a facilitated intervention group or a self-directed intervention group. We measured 3 practice-level outcomes: (1) the proportion of 39 components of the NDP model that practices implemented, (2) the aggregate patient rating of the practices’ PCMH attributes, and (3) the practices’ ability to make and sustain change, which we term adaptive reserve. We used a repeated-measures analysis of variance to test the intervention effects.
RESULTS By the end of the 2 years of the NDP, practices in both facilitated and self-directed groups had at least 70% of the NDP model components in place. Implementation was relatively harder if the model component affected multiple roles and processes, required coordination across work units, necessitated additional resources and expertise, or challenged the traditional model of primary care. Electronic visits, group visits, team-based care, wellness promotion, and proactive population management presented the greatest challenges. Controlling for baseline differences and practice size, facilitated practices had greater increases in adaptive reserve (group difference by time, P = .005) and the proportion of NDP model components implemented (group difference by time, P = .02); the latter increased from 42% to 72% in the facilitated group and from 54% to 70% in the self-directed group. Patient ratings of the practices’ PCMH attributes did not differ between groups and, in fact, diminished in both of them.
CONCLUSIONS Highly motivated practices can implement many components of the PCMH in 2 years, but apparently at a cost of diminishing the patient’s experience of care. Intense facilitation increases the number of components implemented and improves practices’ adaptive reserve. Longer follow-up is needed to assess the sustained and evolving effects of moving independent practices toward PCMHs.
- Primary care
- family practice
- National Demonstration Project
- organizational change
- patient-centered medical home
- patient-centered care
- adaptive reserve
- outcomes assessment
- practice-based research
INTRODUCTION
The Future of Family Medicine (FFM) report of the American Academy of Family Physicians (AAFP) recognized the dire need for family practice to improve its practice model in an uncertain health care environment.1 A report in 2004 by 1 of the 6 FFM task forces provided the first outline of its “New Model” of primary care practice and recommended a large-scale national demonstration project.2 The AAFP launched the National Demonstration Project (NDP) in 2006 to explore the feasibility of implementing the new model in existing family practices. Based initially on the FFM report, the NDP model was widely examined and considered, and further refined with the publication of the Joint Principles of the Patient-Centered Medical Home (PCMH).3 The model continues to evolve as experience with it grows.
Despite widespread enthusiasm for such change,4–8 there is little systematic evidence of what it takes to transform a traditional family practice into a PCMH, nor of the relative difficulty for practices attempting the specific changes required. Although many demonstration projects are planned and already in the field,9 the NDP is the first national, large-scale demonstration project, with a detailed multimethod evaluation of what it takes to implement the PCMH.
The NDP compared 2 approaches to implementation: facilitated and self-directed. The facilitated approach used an intense combination of on-site assistance from practice change facilitators, learning sessions, national consultants, and preselected vendors of a range of health information technology. The self-directed approach entailed access to Web-based practice improvement tools and services. Articles in this supplement describe our observation of the NDP intervention process,10 patient-level outcomes,11 and a qualitative analysis of the practices’ experience in integrating the NDP model components into their operations.12 This article specifically focuses on the effect of facilitation vs self-direction on practice-level outcomes. We tested hypotheses that, compared with self-directed practices, facilitated practices would be able to put more NDP model components in place, would receive higher ratings as PCMHs from their patients, and would be better able to improve their adaptive reserve (capability to make and sustain change).12–14 We also present a secondary analysis examining the effect of adaptive reserve at baseline on the ability to implement NDP model components in all practices. Finally, we present qualitative data on practices’ experience in implementing the different components of the NDP model.
METHODS
We obtained approval for the evaluation protocol from the appropriate institutional review boards (IRBs), including those of the AAFP and the academic institutions of each evaluation team member.
Participants and Settings
The NDP was launched in June 2006 by TransforMED, a division of the AAFP,15 to test an evolving model of the PCMH.10 Thirty-six family practices were selected by an NDP Technical Advisory Committee from a national pool of 337 practices that completed a detailed online application. The Committee chose practices that appeared ready to take on the NDP model and that, as a group, were maximally diverse in terms of geographic location, size, age, physician and staff structure, ownership arrangements, and scope of practice.
Overall, the practices were located in 25 states, with 11 situated in rural communities, 16 in suburban areas, and 9 in urban areas. Ten practices were solo physicians (some having a midlevel clinician), 8 practices were small (2–3 physicians), 10 practices were medium sized (4–6 physicians), and 8 practices were large (≥7 physicians). Twenty-two practices were owned by physicians, 1 was owned by a governing board, and 13 were owned by larger hospital or medical systems.
Table 1 shows a comparison of the practices in the 2 groups on baseline characteristics: basic practice demographics, number of NDP model components in place, and patient ratings as a PCMH. Of the 27 characteristics compared, only 3 differed significantly (P <.05) between groups. Since the self-directed practices generally started with more model components in place at baseline, we adjusted subsequent analyses for baseline status.
During the NDP, 5 practices withdrew from the project. One facilitated practice dropped out because of local IRB issues and another closed because of financial difficulty. Among the self-directed practices, 2 dropped out because their larger system closed or restructured their office, and 1 dropped out because of local competing demands for time and attention. Analyses are based on complete data for 16 facilitated and 15 self-directed practices.
The NDP Intervention
The evaluation team randomly assigned practices to facilitated and self-directed intervention groups. Details of the 2-year intervention are described elsewhere in this supplement.10 In brief, facilitated practices received ongoing assistance from a change facilitator; ongoing consultation from experts in practice economics, health information technology, and quality improvement; and discounted software technology, training, and support. They also participated in four 2-day learning sessions and regular group conference calls. Self-directed practices were given access to Web-based practice improvement tools and services, but did not receive any on-site assistance. The self-directed practices organized their own retreat halfway through the 2-year project and shared their interim experiences. They also participated in the final learning session with the facilitated practices.
Measure Development and Data Collection
Assessing Implementation of Model Components
The NDP model originated with the FFM report1 and went through several iterations from its inception in June 2006, including modification consistent with the joint principles of the PCMH.16 The version of the model we used for the evaluation is described elsewhere in this supplement10 and consisted of 55 components within 8 domains: access to care and information, care management, practice services, continuity of care services, practice management, quality and safety, health information technology, and practice-based care teams.15
Of the 55 NDP model components, we judged 16 to be unmeasurable by our observational methods. For example, assessment of some components would require further observation of patient visits (eg, evidence-based practices), careful observation of staff activities beyond self-report (eg, patient participation), and judgment calls on our part that could not be made consistently (eg, culturally sensitive care). In some cases, we simply did not have reliable data on the status of the component at baseline (eg, optimized coding and billing). Removing these unmeasurable components left 39 NDP model components that we assessed for each practice at several points in time. Supplemental Appendix 1 (available at: http://www.annfammed.org/cgi/content/full/8/Suppl_1/S33/DC1) shows all 55 components and their operational definition, and indicates which were not assessed.
For this analysis, we developed a template to guide data collection and to assess the status of each practice for the 39 model components. Initial data were collected during a 2- to 3-day site visit by one of the authors (E.E.S.) to each self-directed practice (summer through fall of 2007) and each facilitated practice (summer through fall 2008). On-site interviews with multiple practice staff were used to establish the model components that were in place, as well as when and how they were implemented. We recognize that memory of when events occurred may create error in judging when some components were implemented. After the end of the NDP, we followed up extensively by telephone interviews with 1 or more informants in each practice (always the physician champion and often the office manager) to ensure accuracy of the final assessment of model components in place at the end of the NDP and to assess components implemented in the 9 months after the project ended. We also gathered additional qualitative data on the processes, barriers, challenges, and special notes of accomplishment that fleshed out the practice-specific experience with the implementation process.
Before the telephone interviews, we reviewed previous practice data to customize standard questions used for each practice. We asked specific open-ended follow-up questions on model components. For example, for population management, we asked, “Describe to me how you are tracking your patients requiring care for chronic conditions, such as diabetes, now. When did you start this process, and how is that different from before? Who is chiefly responsible for the tasks?” During the interviews, we made notes on the template and then expanded and edited these notes immediately after the interview. When possible, we collected direct quotes.
We constructed a categorical variable for each of the 39 NDP model components for each practice. Components fell into 4 categories: not implemented at all, in place at baseline, implemented during the NDP, and implemented in the 9 months after the NDP ended. Where ambiguity about the status of a model component remained, 2 members of the team (E.E.S. and P.A.N.) discussed the data and made a consensus judgment. In cases where clear consensus was not achieved, we recontacted the practice by phone or e-mail for additional data. We repeated this process until we were confident of the accuracy of data. In some cases, the practice implemented a component, tried it, and decided not to continue to use it. In these instances, we considered the components to be implemented, although we recognize that they were not successfully sustained. Finally, we tabulated the categorical data by practice and examined patterns across practices and practice groups.
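As an illustration of how this kind of categorical status coding can be tabulated across practices, the sketch below uses Python and pandas with entirely hypothetical practice identifiers, component names, and status values; it is not the project’s actual coding scheme.

```python
import pandas as pd

# Hypothetical long-format table: one row per practice-component pair,
# with status coded into the 4 categories described above.
STATUS = ["not implemented", "at baseline", "during NDP", "after NDP"]

data = pd.DataFrame({
    "practice":  ["F01", "F01", "S01", "S01"],
    "group":     ["facilitated", "facilitated", "self-directed", "self-directed"],
    "component": ["same-day appointments", "e-visits",
                  "same-day appointments", "e-visits"],
    "status":    ["at baseline", "during NDP", "during NDP", "not implemented"],
})
data["status"] = pd.Categorical(data["status"], categories=STATUS, ordered=True)

# Marginal counts of implementation status for each component across practices.
by_component = pd.crosstab(data["component"], data["status"])

# Proportion of components each practice had in place by the end of the NDP.
in_place = data["status"].isin(["at baseline", "during NDP"])
by_practice = in_place.groupby(data["practice"]).mean()

print(by_component, by_practice, sep="\n\n")
```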
Assessing Patient-Rated PCMH Attributes
Development and administration of the patient outcomes survey (POS) is described in detail elsewhere in this supplement.17 NDP staff mailed the POS to a cross-sectional sample of 120 consecutive patients of any age seen in the practice on 3 target dates: baseline (July 3, 2006), 9 months (April 1, 2007), and 26 months (August 1, 2008). The POS included more than 100 items, most of which used a 5-point Likert-type scale. Response rates across all 31 practices for the POS were 27% (wave 1), 22% (wave 2), and 21% (wave 3).
For this analysis, we constructed a practice-level measure of the patient’s assessment of the PCMH attributes of the practice (the patient-rated PCMH) that consisted of 23 items in 5 scales (Table 2). Analysis of these data as patient-level outcomes is reported elsewhere in this supplement.11 As a group, the patient-rated PCMH measure addressed the 4 pillars of primary care (easy access to first-contact care, comprehensive care, coordination of care, and personal relationship over time) that have been shown to be associated with improved outcomes and reduced cost.21–23 For the 4 pillars of primary care, we used well-validated measures: the Ambulatory Care Experience Survey (ACES)19 for organizational access and the Components of Primary Care Index (CPCI)18 for measures of comprehensive care, coordination of care, and accumulated knowledge as a proxy for personal relationship over time. The patient-rated PCMH also used 2 new items in a fifth scale to assess the global practice experience, as rated on Likert-type scales regarding statements of “I am delighted with this practice” and “I receive the care I want and need when and how I want and need it.”11,20 We conducted a reliability analysis using PASW Statistics version 17 (SPSS Inc, Chicago, Illinois) by entering all 23 items, resulting in a measure with a Cronbach α of .92. Cronbach α is a measure of the internal consistency of a scale. High values (eg, >.7) indicate that all variables in the set correlate well with one another.
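For readers who wish to reproduce this type of reliability check outside PASW/SPSS, the following is a minimal sketch of the standard Cronbach α computation for a respondents-by-items matrix; the simulated responses and variable names are illustrative only, not the NDP survey data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix (rows = respondents)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items (23 in this study)
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative use with simulated Likert-type responses (1-5) for 23 items.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(200, 23))
print(round(cronbach_alpha(responses), 2))
```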
Assessing Practice Adaptive Reserve
Reports have identified the importance of a practice’s ability to make and sustain change.24–26 We have termed this characteristic the adaptive reserve and have observed how it becomes important in times of stress and rapid change.12,14,27 Adaptive reserve includes the practice relationship infrastructure; alignment of management functions in which clinical care, practice operations, and financial functions share and reflect a consistent vision; facilitative leadership; teamwork; sensemaking; a positive work environment; and a culture of learning.17 The relationship infrastructure in turn consists of trust, mindfulness, heedful interaction, respectful interaction, cognitive diversity, a balance of social and task relatedness, and a balance of rich and lean communication venues.28
We created the adaptive reserve scale from the clinician staff questionnaire (CSQ) described elsewhere in this supplement.17 The purpose of the CSQ was to measure and track changes over the course of the NDP in how clinicians and office staff perceived key practice attributes, such as modes of communication, leadership styles, culture of learning, psychological safety, and approach to cultural diversity. The CSQ was distributed to all clinical and nonclinical practice staff at each practice in person and collected by mail in 3 waves (baseline, 9 months, and 26 months). Staff who agreed to participate returned the CSQ by mail directly to the study center. To comply with the IRB protocol, the questionnaires did not include an individual identifier, so the 3 waves of the CSQ represent repeated cross-sections of the staff at each practice. Response rates for the CSQ were 60% (wave 1), 48% (wave 2), and 52% (wave 3).
We submitted 82 items from the CSQ to a principal components factor analysis separately for each of the 3 waves, as described in detail elsewhere in this supplement.17 The analysis identified a 23-item scale that addressed the relationship infrastructure, facilitative leadership, culture of learning, and work environment (Table 3). Items assessing alignment of management functions were not included because the importance of this characteristic emerged only later from our analysis12 and was therefore not captured in the original CSQ. The adaptive reserve measure had a Cronbach α of .97.
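A rough sketch of this scale-construction step is shown below, using scikit-learn’s principal component analysis as a stand-in for the principal components factor analysis reported above; the simulated responses and the number of retained components are assumptions for illustration, not the actual CSQ data or results. Items with consistently strong loadings across waves would be the candidates retained for the final scale.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical respondents-by-items matrix (82 CSQ items), analyzed one wave at a time.
rng = np.random.default_rng(1)
csq = rng.integers(1, 6, size=(150, 82)).astype(float)

# Standardize the items, then extract components; loadings guide item retention.
z = StandardScaler().fit_transform(csq)
pca = PCA(n_components=5).fit(z)

# Loadings of each item on each retained component.
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
print(loadings.shape)                            # (82, 5): items x components
print(pca.explained_variance_ratio_.round(2))    # share of variance per component
```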
Data Analysis
We analyzed the effect of facilitation on the 3 main outcomes, namely, the proportion of model components implemented by the practice during the NDP, the patient-rated PCMH, and the practice’s adaptive reserve. We used a full factorial repeated-measures analysis of variance (ANOVA) to assess the main effects (eg, mean differences between groups) and the within-group change over time. This approach also allowed us to determine whether one group changed more rapidly over time (group-by-time interaction).29 We weighted the analysis by the number of respondents in each practice because of varying response rates. Although the patient-rated PCMH and the practice adaptive reserve measures are based on individual responses, variables in this analysis were aggregated practice-level scores. This approach precluded the necessity of using multilevel methods because the practice, rather than the individual, was the unit of analysis.
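The sketch below approximates this group, time, and group-by-time analysis on practice-level scores. It uses a linear mixed model with a random intercept per practice (a common alternative to repeated-measures ANOVA) rather than the weighted full-factorial ANOVA actually used in the study, and all data and variable names are simulated for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated long-format practice-level data: one row per practice per wave.
rng = np.random.default_rng(2)
records = []
for i in range(32):
    group = "facilitated" if i < 16 else "self-directed"
    baseline = rng.normal(0.65, 0.05)
    gain = 0.08 if group == "facilitated" else 0.01   # assumed group difference
    for months in (0, 26):
        records.append({
            "practice": f"P{i:02d}",
            "group": group,
            "months": months,
            "adaptive_reserve": baseline + gain * (months == 26) + rng.normal(0, 0.02),
        })
df = pd.DataFrame(records)

# A random intercept per practice accounts for repeated measures; the
# C(group):months coefficient tests whether the groups changed at different
# rates over time (the group-by-time interaction).
model = smf.mixedlm("adaptive_reserve ~ C(group) * months", df, groups=df["practice"])
print(model.fit().summary())
```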
In a secondary analysis, we assessed whether practice adaptive reserve at baseline was associated with number of model components implemented. We used an ordinary least squares regression model and adjusted for the number of model components in place at baseline.
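A minimal sketch of this secondary analysis follows, assuming hypothetical practice-level variables: an ordinary least squares model regresses the number of components implemented during the NDP on baseline adaptive reserve, adjusting for components already in place at baseline.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical practice-level data (one row per practice).
practices = pd.DataFrame({
    "components_implemented":    [10, 12, 7, 9, 11, 6, 8, 13],
    "adaptive_reserve_baseline": [0.70, 0.74, 0.58, 0.66, 0.72, 0.55, 0.61, 0.77],
    "components_baseline":       [18, 15, 21, 17, 16, 22, 19, 14],
})

# Components implemented during the NDP, regressed on baseline adaptive
# reserve while adjusting for components already in place at baseline.
model = smf.ols(
    "components_implemented ~ adaptive_reserve_baseline + components_baseline",
    data=practices,
).fit()
print(model.summary())
```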
We also examined qualitatively the patterns of practice adoption of the NDP model components. We produced marginal counts for each of the 39 measurable components. For each practice and for each model component, we assessed whether and when the component was implemented, and we examined patterns of implementation across practices.
Finally, we analyzed qualitative field notes made during and immediately after telephone interviews with practices to enrich understanding of the challenges faced in implementing the components and some of the patterns of adoption that emerged.
RESULTS
Effect of the NDP Intervention
Results of the analyses of the effect of the NDP intervention on the 3 main outcomes are shown in Table 4. Practices in both groups significantly increased the proportion of NDP model components in place from baseline to the 26-month follow-up; however, the facilitated practices had fewer components in place at baseline, and the significant interaction term (between group and time) indicates that facilitation significantly increased component implementation. The patient-rated PCMH measure significantly decreased in both facilitated and self-directed groups, with no significant difference between them. Finally, practice adaptive reserve increased during the NDP in the facilitated practices but remained essentially the same in self-directed practices, with a significant difference between groups.
Adaptive Reserve and Component Implementation
Baseline adaptive reserve appeared to influence the number of NDP model components implemented. After adjusting for differences between groups in components in place at baseline (P = .04), there was a nonsignificant trend toward implementation of more model components in practices having greater adaptive reserve (standardized β = .23, SE = .21, P = .08). We should note that our analysis had only 60% power to detect significant group, time, and group-by-time effects (P <.05) for these variables.
Patterns of Practice Implementation of Components
Data on implementation of individual NDP model components for all 31 practices completing the NDP are summarized in Table 5 and shown in detail by practice in Supplemental Appendix 2 (available at http://www.annfammed.org/cgi/content/full/8/Suppl_1/S33/DC1). Practices in both groups already had many model components in place at baseline (Table 5). The facilitated practices had on average 17.0 (43.6%) and the self-directed practices had on average 20.1 (51.5%) of the 39 model components when the NDP began. In general, most practices entered the NDP with most of the components of practice services and many of the components that had to do with the scope of services (after-hours coverage, hospital care, maternity care, and disease prevention) in place. The few missing components in these areas were completed during the NDP.
Practices in both groups were successful in implementing many additional components: facilitated practices added an average of 10.7 new components and self-directed practices added an average of 7.7. Consequently, by the end of the NDP, practices in both groups had in place at least 70% of the components: an average of 27.7 and 27.9 of the 39 NDP model components in the facilitated and self-directed practices, respectively. In the 9 months after the end of the NDP, practices continued to implement model components, adding an average of 1.5 more in facilitated and 2.5 more in self-directed practices.
Model components were not all equally likely to be implemented. All practices were able to implement same-day appointments, and nearly all were able to implement electronic prescribing and make laboratory results highly accessible to patients. Many were able to improve practice management processes, such as implementing more disciplined financial management, cost-benefit decision making, revenue enhancement, improved personnel management, and more efficient office design. Many practices also had or developed a practice Web site, although providing a fully functioning patient portal proved more difficult for most.
The NDP practices appeared to be early adopters of electronic medical records (EMRs). Twenty-two (71%) had EMRs in place at baseline (well above the national average30–32), another 6 implemented them during the NDP, and only 3 (2 facilitated and 1 self-directed) did not have EMRs by 26 months. The 2 facilitated practices without EMRs were both part of larger systems for which EMR implementation was an evolving priority.
At the same time, some model components presented greater challenges. These components included e-visits, group visits, team-based care, wellness promotion, and population management (involving 3 model components). We describe these more challenging components in greater detail below.
e-Visits
Practices in both groups struggled with e-visits. Only 1 practice had implemented e-visits before the NDP and only 8 more put e-visits into place during the NDP. Of these, 4 used them for a time, but subsequently stopped. There were several reasons why e-visits were not popular among practices. Several practices felt they were useful “when they worked”; however, they had great difficulty using the templates from their commercial vendor and obtaining commercial products to enable patients to pay with credit cards online. E-visits were also seen as inefficient and as requiring a great deal of effort to market to patients. For example, one physician noted he spent more time on the computer for an e-visit than he did in the examination room for a conventional visit. At least 5 practices indicated that a lack of coverage by health plans was a major impediment, and they resisted asking patients to pay for a service they saw as providing only marginal value. Interestingly, several practices noted that once they had implemented same-day visits, adding e-visits seemed contradictory to both them and their patients. In fact, the facilitated practice that came into the NDP with e-visits in place reported a rapid decline in patient interest once it had implemented same-day visits.
On the other hand, the majority of practices enthusiastically used e-mail to communicate with patients in some fashion. A few found e-mail to be a useful way for patients to ask questions that could be triaged within the practice for an appropriate response. Several physicians felt that using e-mail provided a useful adjunct to office visits, but did not serve to replace them. Some practices reported that they believed providing e-mail access to their patients reduced the number of telephone calls. Among both groups, only 7 practices did not communicate with patients in some manner using e-mail.
Group Visits
Group visits also presented a quandary for many NDP practices. Only 1 practice in each group had experimented with group visits before the NDP. Fifteen practices implemented 1 or more group visits during the NDP; however, 9 of these practices subsequently discontinued group visits, citing lack of time and support for planning and a general sense that they did not have enough value to justify the financial investment. Two practices independently estimated that they would need 7 to 8 patients per session to break even financially, a goal they were unable to reach. On the other hand, several practices planned to continue to explore group visits, and a number were modifying their format to shift emphasis onto wellness, support groups, and sessions that emphasized education over providing visit-type services. Group visits were a particular challenge for small practices, which had difficulty in finding time, space, and a critical mass of patients. One self-directed practice pointed out that the time and energy spent planning and organizing the group visit directly competed with that available for nurse visits to educate patients with chronic illness. All practices struggled with how to code for group visits, because of the lack of clear guidelines and the variation they observed in expert opinion.
Team-Based Care
Practices often had trouble implementing team-based care. Many took initial steps by creating stable physician–medical assistant teams and locating physicians and medical assistants in the same work area; however, these actions were generally viewed only as important intermediate steps and did not constitute team care. Creating care teams required bridging the traditional gap in front-back office communication by developing shared visions of how care teams affect the patient experience, having frequent front-back office meetings and retreats, and reconfiguring office work flow and patient flow across front-back office functions. Developing team-based care also required substantial effort in cross-training and systematically creating agendas for ongoing training in expanded tasks. One physician observed, “Taking the time to train my staff to take part in the history and physical exam was the smartest thing I ever did.” In addition, standing orders and protocols for ordering laboratory tests and refilling prescriptions were important elements of team-based care. A number of practices reported that daily huddles33 were an important way to model team behavior. Two practices strengthened both their care teams and their community connections by providing a lunch allowance for staff to meet with other service providers in the community and bring back relevant information for the practice. In the words of one facilitated practice physician, “We have always been in touch with community services, the difference is now we are using the practice care team to help build a knowledge base—it’s not just [physician name] and me anymore. The MAs [medical assistants] help to coordinate this stuff.”
Nevertheless, practices cited a number of barriers to care teams, including reliance on part-time staff and physicians, which created challenges to continuity. Additionally, part-time staff had less incentive to expend effort on a larger shared practice vision. A barrier that did not surface without probing was many physicians’ perception of their role and a reluctance to share that role with others. As one physician noted, “Doctors should be doing the doctor things.” Another physician pointed out that he had gained an appreciation of care teams during the NDP, but that other physicians in the practice were “stuck in the old way of doctoring.”
Wellness Promotion
Four facilitated practices and 5 self-directed practices had an emphasis on wellness coming into the NDP. Another 3 practices in the facilitated group and 1 in the self-directed group made substantial progress on this component during the NDP. Nine practices in each group did not report progress in emphasizing wellness, however. Although virtually all practices valued wellness as an integral part of the scope of their work, they largely cited time and energy as barriers. Several physicians saw an important association between an emphasis on wellness and expanding team care in the practice, both of which were seen as challenges to be faced as the practice developed further. Many of the practices reporting an emphasis on wellness were able to offer wellness services through their larger hospital or medical system. Several practices pointed out an association between an emphasis on wellness and strong connections with the larger community. Most of the practices that strengthened their community connections during the NDP did so by participating in health fairs or sponsoring community health or fitness events.
Population Management
Three of the NDP model components focused on building practices’ ability to monitor and proactively address the health care of subpopulations. Two of the components—population management (in the care management domain) and population management/registries (in the health information technology domain)—had overlapping properties for identifying groups of patients with selected characteristics such as diabetes. The third component, case management, addressed processes for identifying, tracking, and taking action for patients with complex comorbidities and preventing those patients from falling through the cracks. Nevertheless, the technologic solutions for information support and population management were often less than ideal. Private practices were typically at the mercy of EMR vendors’ time lines, whereas system-owned practices had to wait until a feature became a priority for their system’s information technology department. Many practices found their EMR could not provide information support for population-based care, although some systems could print out lists of patients with certain conditions so practices could catch patients as they came in for visits. As the physician in one facilitated practice said, “We’re ready and willing—the software isn’t willing yet!”
Some practices therefore used billing data to identify target populations by age and sex, and send an e-mail or postal “blast” for special purposes, such as encouraging influenza shots. In several practices, particularly motivated individual physicians created their own work-around for a topic-specific population management issue. For example, one self-directed practice jury-rigged their EMR to produce population reports and point-of-service reminders, and to place the reminders on their patient portal. Although facilitated practices had access to an innovative and sophisticated proprietary disease management tool, most who tried to implement the tool discovered it did not integrate easily with their EMR. Many practices decided to wait until their EMR offered an upgrade with population management features. In addition to limitations in available technology and the added time required for activities not traditionally included in primary care practice, there was resistance to change in roles for existing personnel and to the required shift in paradigm from care of 1 patient at a time to population-based, proactive care of groups of patients. Even in many practices embedded within a larger system that was capable of producing population reports, it was still up to the practice to request and use the reports. In many cases, this capacity was not used.
DISCUSSION
This analysis of the effects of the NDP revealed important findings on the 3 hypothesized practice-level effects of facilitation. There are, however, several important limitations of the analysis. First, generalizability of the findings is limited: the practices were highly selected and the change facilitators extremely capable, and both worked under intense national scrutiny. Similarly, the NDP facilitated intervention was very intense and involved a learning evaluation (described elsewhere in this supplement10) that interacted in important ways with the unusual capability and motivation of the practices. Although these characteristics are helpful in examining the feasibility of implementing many of the features of a PCMH, they limit our ability to understand how more typical practices will be able to adopt these features. Future efforts to adopt a PCMH model may find that less can be accomplished in more typical settings or that adoption requires even more time and resources.
A second limitation is possible bias in the 3 outcome measures. The number of NDP model components implemented was derived from self-report by practice informants, although we were able to triangulate the assessment with multiple practice informants, facilitator reports, and e-mail streams. The response rates for the 2 surveys (in the range of 20% for the POS and 50% for the CSQ) may have produced selection bias, and perhaps a rosier result; the findings therefore need to be replicated in other studies. A third limitation is that the NDP focused intensely on specific model components; further work is required to understand the strategies for ensuring that implementing PCMH model components leads eventually to strong patient-centered characteristics. Finally, the NDP did not incorporate new reimbursement strategies into the intervention, and the effect of various types of reimbursement reform must be studied in current and future demonstration projects.
The NDP facilitated intervention increased the practices’ adaptive reserve,12–14 a characteristic shown in our qualitative analysis to be important for success in adopting model components.12 In the analysis reported here, we also observed a nonsignificant positive association between adaptive reserve at baseline and implementation of NDP model components. This is an important finding and taken together with our qualitative findings,12 suggests that strengthening adaptive reserve will serve the practices well over the next decade as they continue the transformation to PCMHs and adapt to rapidly changing demands of the health care environment.
Most practices in both groups were able to implement many of the NDP model components over the 2 years of the initiative. Facilitation appeared to significantly increase the number of adopted model components, with an average of 10.7 added in facilitated practices, compared with 7.7 in self-directed practices. The cost and effort required in the NDP intervention to achieve a modest difference in model components implemented call into question the feasibility of such an intense intervention as a national strategy for adopting a PCMH model. Importantly, the self-directed practices were also successful in adopting model components, and practices in both groups ended up with at least 70% of model components in place. The ability of many self-directed practices to make substantial progress suggests that not all practices need intense assistance. We believe from this and other work that a practice’s baseline adaptive reserve can be an important indicator of the magnitude and kind of assistance a practice may need.
We also observed that facilitation did not directly increase patient ratings of their practice as a PCMH. In fact, the patient-rated PCMH significantly decreased in both groups. Whether the intense effort to adopt model components or the nature of the components themselves (eg, an EMR in the examination room) has a deleterious effect on the patient experience is not clear from our data, but deserves further study. The differing effect of facilitation on implementing model components and on patient-rated PCMH attributes suggests that from a patient’s perspective, a PCMH is more than the sum of the NDP model components.11 Changing a practice in a way that improves the patient’s experience requires either a different set of strategies or more time for existing strategies to take effect. Adopting NDP model components is a very proximal step in a complex chain of events that also includes effective and consistent application of model components to the patient population before improved patient-level outcomes will be realized.
Finally, not all changes included in the NDP model required the same level of effort. In looking across all practices, it became apparent that changes were relatively harder if they had an impact on multiple roles and processes, required coordination across work units, necessitated additional resources and expertise, and challenged the traditional model of primary care. Some model components were necessarily implemented in sequence. For example, practices often postponed addressing case management until they had a functioning registry in place, while such a registry, in turn, was rarely available as a routine function of the EMR. Components were also more difficult to implement when they required shifts in the ways people thought about and understood their roles. For example, adopting a team care approach required that multiple roles in the practice be redefined, representing a more difficult task than implementing same-day appointments. Although the latter was very difficult, it did not generally create a ripple effect through the practice that disrupted the practice’s working relationships and style. Changes were also more challenging when they required that individuals or groups adopt a different mental model of their work. Adopting team care was seen to conflict with some physicians’ vision of their work as a doctor, whereas adopting a population-based approach to care required the entire practice to shift from a model of (in the words of one physician) “get ‘em in, get ‘em out” to one that viewed population-based proactive care of defined populations as legitimate work of the practice.
Acknowledgments
The NDP was designed and implemented by TransforMED, LLC, a wholly-owned subsidiary of the AAFP. We are indebted to the participants in the NDP and to TransforMED for their tireless work.
Footnotes
- Conflicts of interest: The authors’ funding partially supports their time devoted to the evaluation, but they have no financial stake in the outcome. The authors’ agreement with the funders gives them complete independence in conducting the evaluation and allows them to publish the findings without prior review by the funders. The authors have full access to and control of study data. The funders had no role in writing or submitting the manuscript.
- Disclaimer: Drs Stange and Nutting, who are editors of the Annals, were not involved in the editorial evaluation of or decision to publish this article.
- Funding support: The independent evaluation of the National Demonstration Project (NDP) practices was supported by the American Academy of Family Physicians (AAFP) and The Commonwealth Fund. The Commonwealth Fund is a national, private foundation based in New York City that supports independent research on health care issues and makes grants to improve health care practice and policy.
- Publication of the journal supplement is supported by the American Academy of Family Physicians Foundation, the Society of Teachers of Family Medicine Foundation, the American Board of Family Medicine Foundation, and The Commonwealth Fund.
- Dr Stange’s time was supported in part by a Clinical Research Professorship from the American Cancer Society.
- Disclaimer: The views presented here are those of the authors and not necessarily those of The Commonwealth Fund, its directors, officers, or staff.
- Received for publication November 12, 2009.
- Revision received February 12, 2010.
- Accepted for publication March 11, 2010.
- © 2010 Annals of Family Medicine, Inc.