Abstract
The patient-centered medical home (PCMH) is widely touted as the current pathway to high-quality primary care practice. Many payers and institutions use the formal National Committee for Quality Assurance (NCQA) PCMH tool to evaluate practices, and practices commonly feel financial pressure to achieve NCQA recognition. As 2 small, high-functioning, innovative primary care practices, we describe the actual process of using this tool and assess its utility within a framework based on patient experience of care, costs, and population health. We both attained certification as Level 3 PCMHs but conclude that NCQA’s tool conflates form with function, is costly and wasteful, and may succeed more in documenting policies than in supporting improved outcomes in practices.
ARE WE MEDICAL HOMES?
Our practices were recently recognized by the National Committee for Quality Assurance (NCQA) as Level 3 Patient-Centered Medical Homes (PCMHs). After completing the process and weighing the results, we came to an inescapable conclusion: this process wastes time and money and fails to improve patient care. The recognition process encourages low-value documentation in practices of all sizes, unintentionally handicaps small practices compared with their larger counterparts, and highlights how lofty goals alone do not guarantee improved care.
We are 2 independent solo family medicine practices located in Maine (J.A.) and Rhode Island (L.H.) that are insurance based, assess no added fees, have low overhead, and follow a high tech–high touch approach (using technology to lower overhead so that we can focus on patients and have time to establish relationships with them). Our practices were designed from the ground up to deliver patient-centered collaborative care, based on the Idealized Design of Clinical Office Practices principles that anchor effective ambulatory care (Table 1).1 According to well-established metrics of patients’ experience of care (the HowsYourHealth.org [HYH] and Consumer Assessment of Healthcare Providers and Systems [CAHPS] surveys) and of cost (J.A.), our practices were high-functioning primary care practices before we undertook the NCQA certification process. We embarked on the process to answer 3 questions: (1) What costs would be incurred? (2) What exactly would certification entail? and (3) Would our practices improve as a result? Additionally, obtaining recognition offered potential financial incentives.
THE MEDICAL HOME MODEL MAKES SENSE
The principles represented by NCQA’s PCMH model are laudable and sensible—who could be opposed to excellent patient access; use of population data, community resources, and evidence-based guidelines to improve care; enhanced patient engagement; tracking tests and referrals; and running practice improvement projects? Unfortunately, the devil is in the details as discussed below: high costs, excessive documentation, and rigid structural and process requirements.
COSTS IN TIME AND MONEY
Neither of our practices had administrative support to expedite the application process. In our separate but parallel worlds, completing the necessary work averaged about 15 to 20 hours per week for approximately 8 months, or about 500 hours total (in addition to our usual 45- to 65-hour clinical work weeks). Although our practices had already deployed many of the principles underlying the PCMH model, the effort to document our use of those standards and to fit the compliance data into the required format was herculean, consuming about two-thirds of total project time (300 hours).
If competent support staff had been available, salary costs ($30/h) to manage documentation would have run about $9,000, while the opportunity cost of physician oversight, planning, and execution ($150/h) would have run about $30,000. Absent staff to perform the bulk of the document handling, we estimated our opportunity costs at $75,000 per physician. Certification reporting costs for 1 practice (L.H.) ran approximately $500 for the application and $1,500 for documentation review. Neither of our practices opted to use an NCQA-approved CAHPS vendor, which costs about $30 per patient ($1,500 for 50 patients).
Less easy to calculate were the compliance costs of building the suggested “team,” which required hiring staff and adding computers and space. One practice (L.H.) did hire a nurse care manager to assist with population health and outreach, spending $5,000 on her salary. We estimate a typical practice might expect an outlay of about $46,000 in staff, opportunity, and reporting costs, as the sketch below illustrates. As small practices, we found the costs prohibitive.
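For transparency, the arithmetic behind these estimates can be laid out as a minimal back-of-envelope sketch. The decomposition below is our own plausible reading of the figures reported above; hour counts and rates are estimates from our experience, not audited accounting:

```python
# Back-of-envelope model of PCMH certification costs (estimates, not audited figures)
DOC_HOURS = 300        # documentation work, about 2/3 of the ~500 total project hours
OVERSIGHT_HOURS = 200  # physician oversight, planning, and execution
STAFF_RATE = 30        # $/hour for competent support staff
PHYSICIAN_RATE = 150   # $/hour of physician opportunity cost

staff_cost = DOC_HOURS * STAFF_RATE                 # $9,000 if staff handle documentation
oversight_cost = OVERSIGHT_HOURS * PHYSICIAN_RATE   # $30,000 of physician time
reporting_cost = 500 + 1_500                        # application fee + documentation review
care_manager_cost = 5_000                           # nurse care manager salary (L.H.)

typical_outlay = staff_cost + oversight_cost + reporting_cost + care_manager_cost
print(f"Typical outlay: ${typical_outlay:,}")       # Typical outlay: $46,000

# Without support staff, the physician absorbs all ~500 hours personally:
solo_opportunity = (DOC_HOURS + OVERSIGHT_HOURS) * PHYSICIAN_RATE
print(f"Solo opportunity cost: ${solo_opportunity:,}")  # Solo opportunity cost: $75,000
```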
RECOGNITION IS BASED ON THIS?
We believe that the NCQA PCMH recognition process is based on 3 erroneous assumptions: documentation equals actualization, form equals function, and technologic capability equals utility.
Documentation Does Not Equal Actualization
Much information can be placed in the record, but what message did the patient actually take home?
A particularly galling exercise in the PCMH recognition process was inserting standardized text into patient records for 48 instances of care, then re-reviewing the same charts before submission to check off on a spreadsheet that all 912 pieces of data (19 per chart, across 48 charts) had been inserted. The practice chooses 3 or 4 chronic conditions, and each time a patient with any of these conditions is seen, the practice documents up to 19 items in the chart, such as “collaborating with the patient and family to document an individual care plan,” “assessing and addressing barriers to meeting goals,” and “developing and documenting self-management plans and goals.” If this task is done correctly, a practice can earn 20 points (out of 100) toward certification. This requirement fosters a template-inserting, copy-and-paste mentality that provides little benefit to the patient and wastes the time of the physician “doctoring” the record, as shown in an example below.
This requirement is also a crushing documentation burden for small and solo practices, as they must submit the same 912 data points for certification as 25-practitioner practices. Most importantly, what is glaringly absent from this exercise is the patient voice. If one really wants to know what patients took home about their chronic condition, query the patients, not the chart! Sophisticated yet simple tools exist to solicit patient voice with minimal added burden to practices.2,3 We think this exercise should be removed from the recognition process.
Form Does Not Equal Function
Many activities must be done well in a practice to achieve patient-centered collaborative care, and no single prescribed care delivery structure ensures excellent care. With its standards, NCQA recognition seeks to establish that large care teams act with the same unified purpose that solo practices provide by default. The 2014 standards place an even greater emphasis on team functions. As the documentation example below shows, however, having a team in no way ensures that 1 person assumes responsibility for the team’s conclusion, and a team that does not communicate well, as in Texas Presbyterian’s missed Ebola diagnosis,4 can be downright dangerous.
Documenting that a practice is set up to deliver team care does not automatically equate to quality care. High-functioning teams are hard to assemble and maintain, and managing communication can be problematic. A team of n members has n(n − 1)/2 potential lines of communication, so each added member brings as many new lines as there are existing members, and one can quickly see how a large team can degenerate into chaos. Conversely, an individual approach does not, as a matter of course, predict poor-quality care. Small practices excel at maintaining strong clinician-patient relationships, a quality difficult to measure but crucial for delivering excellent patient-centered care. It is not clear to us that the team model is inherently superior to a solo model running with technologic and carefully selected logistical and community support. Rigid insistence that a care team must provide patient care functions may be neither feasible nor desirable for small practices. NCQA’s documentation requirements may be helpful in clarifying roles for teams in large practices, but they are of limited utility for small and solo practices. For example, as a solo practitioner, J.A. was astonished to find that she was penalized for not including her own job description in her policy manual. L.H. hired a nurse care manager to meet a requirement for team previsit huddles, although for the preceding 9 years she had adequately conducted previsit chart reviews herself. Both small practices spent hours perfecting a more than 40-page policy manual that was superfluous for us but required for recognition.
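For readers who want the arithmetic behind that claim, a brief worked example (standard combinatorics, not drawn from the NCQA materials): the number of pairwise communication channels among n team members, and the increase from adding one member, are

$$C(n) = \binom{n}{2} = \frac{n(n-1)}{2}, \qquad C(n+1) - C(n) = n.$$

Thus a 3-person team maintains 3 channels, a 6-person team 15, and a 12-person team 66: doubling the team roughly quadruples the communication burden.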
Technology Does Not Equal Utility
“Not everything that counts can be counted and not everything that can be counted counts.”
–W.B. Cameron
Awarding points for specific structural data-handling capabilities, which depend on the sophistication (read: cost) of the practice’s electronic health record, bears no correlation with practice quality. Points are awarded for the electronic capability to submit syndromic surveillance data to public health agencies and for the ability to quantify the ratio of electronic to written prescriptions per quarter. A point can be earned for the ability to count (using a searchable data set) how many patients have a designated caregiver. We argue that inputting and then counting data, simply because it can be done, wastes time and distracts physicians from patient care. What is the utility of these data for the practice, and when and how did the ability to meet Meaningful Use requirements become a proxy for high-quality medical practice?
NCQA DOCUMENTATION EXAMPLES: MEETING THE MEASURES
One of our patients was referred to an NCQA-recognized large group practice to see their pulmonologist for a severe asthma exacerbation. The following example from the consult note exemplifies several of our concerns about the recognition process:
- Overweight.
- Collaborated with patient on care and goal setting.
- Assessed and addressed barriers to achieving treatment goals.
- Educated patient on nutrition and exercise. Patient agreed to work towards the following goals: exercise more, eat healthier foods, eat less.
When this note was reviewed with the patient, she stated that weight was never broached during her visit. Here the certification process is promoting meaningless verbiage, dishonesty in documenting the clinical encounter, and an abdication of responsibility. Such boilerplate text is a survival strategy that practices have adopted to cope with the proliferating documentation requirements imposed by coding, billing, and quality demands. The work documented in the snippet above might realistically occur over months in a primary care practice, but probably never in a new-patient problem visit with a specialist; yet this is precisely the type of documentation the NCQA process inspires. Which part of the practice team was responsible for this note?
In another example, a colleague from a large group had always given his cell phone number to patients to ensure easy access. To meet NCQA’s requirements, the group had to conform to a single standard, and he began to use an answering service instead. It is doubtful that this extra step between the patient and physician improved access, and it certainly increased costs.
Both authors used emergency department and hospitalization cost data from HYH, an online nonproprietary patient-reported outcomes tool, to meet a documentation requirement around costs. One practice received the point and the other did not, suggesting that scoring within the recognition process is itself inconsistent.
WERE THERE USEFUL CONSEQUENCES OF THIS EXERCISE?
We both felt that the review process led to internal clarification of our practices’ responses to routine office events and promoted reliability in key office processes. A focus on population management and outreach (work not usually reimbursed in typical fee-for-service interactions) was educational and not overly time-consuming.
SMALL PRACTICES ARE NOT PROTECTED
Because certification seems predicated on chart documentation, structural form, and electronic record and data-mining capabilities, larger practices seeking recognition can hire or redeploy administrative staff and can purchase more robust electronic health records that ease extraction of the data NCQA requires. Smaller practices lack these financial resources. In speaking with colleagues whose large practices have become certified, we hear comments such as, “Oh, they just hired people to check the boxes for me.” How much real change can occur when physicians have no buy-in to the process? A practice manager described her efforts to “protect her providers” from the certification process. One wonders: how valuable can this process be when physicians must be protected from it?
We believe that the effort required to complete the recognition process is prohibitive for most small independent practices. These practices comprise 40% to 60% of the national primary care workforce5; as many hover on the brink of financial viability, excluding them from a recognition pathway that may lead to greater reimbursement will only hasten their demise. Given data showing that small practices have a 30% lower hospital readmission rate than larger practices,5 is this sidelining of small practices truly the outcome our policies should promote?
A BETTER WAY
According to Friedberg et al,6 the medical home as outlined by NCQA neither saves money nor improves quality. We believe that in place of costly, proprietary, practice-reported processes such as NCQA PCMH certification, short, low-cost, Internet-aggregated surveys of patient experience of care would shift the work of documenting “medical home-ness” to the patient, where it rightfully belongs. Patient-completed nonproprietary surveys such as HYH2 take profit out of the equation and enable easy measurement of key patient metrics such as access, continuity, confidence, and coordination of care, with built-in opportunities to improve patient care.3 Such surveys decrease administrative and time costs, remove reliance on practice self-report, circumvent text insertion and box checking, increase flexibility of measurement for all kinds of practices, and abolish the rigid 1-size-fits-all structure currently imposed on practices to meet recognition requirements. Combining patient experience of care surveys with burden-of-illness measures and claims cost data could produce a powerful lens through which to measure the quality of all primary care practices.
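As a rough illustration of how lightweight such patient-sourced measurement could be, consider this hypothetical sketch; the field names, scale, and data are ours for illustration and are not drawn from HYH or CAHPS:

```python
from statistics import mean

# Hypothetical patient-reported survey rows (1-5 scale); field names are illustrative only.
responses = [
    {"access": 5, "continuity": 5, "confidence": 4, "coordination": 5},
    {"access": 3, "continuity": 5, "confidence": 5, "coordination": 4},
    {"access": 4, "continuity": 4, "confidence": 5, "coordination": 5},
]

# Average each metric across patients to yield a practice-level profile.
metrics = {key: round(mean(r[key] for r in responses), 2) for key in responses[0]}
print(metrics)  # {'access': 4.0, 'continuity': 4.67, 'confidence': 4.67, 'coordination': 4.67}
```

A practice profile of this kind could then be joined with burden-of-illness measures and claims cost data for the 3-pronged assessment described below.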
We find NCQA’s PCMH model an especially poor fit for small practices because of the proportionately greater costs of implementation and because the model’s requirements for a team approach are often inapplicable. We fear that for all practices, actualizing this model nurtures “chart-centered medical homes” rather than PCMHs. We find it ironic that a model claiming to focus on patient empowerment does not base its results chiefly on patient feedback and patient-entered data, forgoing the simplification of measurement they would allow. We agree that population health measurement is doable and that we can assume that role, yet we object that the fee-for-service primary care model does not pay us to perform this extra work. A 3-pronged certification process, based on easily obtainable patient-reported outcome measures, documented processes for population preventive health, and claims data to evaluate costs, would simplify measurement while covering the 3 assessment areas: patient experience of care, population health, and costs.7
In conclusion, despite having completed and excelled in the NCQA PCMH recognition process, we strongly advocate that it be discontinued.
Acknowledgments
We acknowledge Dr John Wasson and Dr James Rickert for their help in reviewing this work as well as the many hard-working primary care doctors across the country who shared their stories with us. We thank James Warren and Dr John Machata for editorial assistance.
Footnotes
- Conflicts of interest: authors report none.
- Received for publication December 18, 2014.
- Revision received February 25, 2015.
- Accepted for publication March 10, 2015.
- © 2015 Annals of Family Medicine, Inc.