Abstract
PURPOSE In 2004, we undertook a consultation with Canadian primary health care experts to define the attributes that should be evaluated in predominant and proposed models of primary health care in the Canadian context.
METHOD Twenty persons, recognized as experts in primary health care or recommended by at least 2 peers, responded to an electronic Delphi process. The expert group was balanced between clinicians (principally family physicians and nurses), academics, and decision makers from all regions in Canada. In 4 iterative rounds, participants were asked to propose and modify operational definitions. Each round incorporated the feedback from the previous round until consensus was achieved on most attributes, with a final consensus process in a face-to-face meeting with some of the experts.
RESULTS Operational definitions were developed and are proposed for 25 attributes; only 5 are rated as specific to primary health care. Consensus on some was achieved early (relational continuity, coordination-continuity, family-centeredness, advocacy, cultural sensitivity, clinical information management, and quality improvement process). The definitions of other attributes were refined over time to increase their precision and reduce overlap between concepts (accessibility, quality of care, interpersonal communication, community orientation, comprehensiveness, multidisciplinary team, responsiveness, integration).
CONCLUSION This description of primary care attributes in measurable terms provides an evaluation lexicon to assess initiatives to renew primary health care and serves as a guide for instrument selection.
- Primary health care
- delivery of health care
- outcome and process assessment (health care)
- terminology
- Delphi technique
INTRODUCTION
Health systems built on a strong primary health care base are more effective and efficient than those centered on specialty and tertiary care.1 In Canada, various national and provincial commissions on health care2–8 concluded that strengthening and expanding primary health care will meet Canadians’ needs for prompt access to comprehensive evidence-based services. Major initiatives have also been undertaken in New Zealand and the United Kingdom to strengthen primary health care.9,10 As health systems worldwide engage in evaluation efforts to assess the impacts of primary health care renewal initiatives, there is a critical need to provide evaluation frameworks and tools to facilitate these efforts.
An important starting point for evaluation is an operational definition of the dimension being evaluated. An operational definition is a description of a concept in measurable terms. It is used to remove ambiguity, to serve as a guide for the selection of measurement tools, and to reduce the likelihood of disparate results between different data collections. In 2004 we conducted a consultation with Canadian primary health care experts to develop a common lexicon of operational definitions of attributes to be evaluated in predominant and emerging models of primary health care in Canada, but many of these definitions will be relevant to primary health care models in other countries.
We took as a starting point a list of 13 attributes of primary health care that had been identified by an Expert Working Group consisting of academic primary care physicians and researchers and convened in 2000 by the Canadian Institute for Health Information (personal communication, J. Zelmer, 2003). This initial work focused on family medicine, whereas the new policy directions—and evaluation challenges—emphasize multidisciplinary and population-based models of primary health care. We built on this work to create a comprehensive list of measurable descriptions for key concepts to be evaluated. We report here the results of a consensus development process with key Canadian primary care researchers and evaluators using a combination of an electronic Delphi process and face-to-face consultations.
METHODS
The Delphi is a written consensus process whereby documents are circulated to a group of experts, with each round of the document incorporating the feedback from the previous round until sufficient consensus has been achieved and no more major changes are suggested. Consensus was defined to participants as “I can live with it,” which still allows for variation in the details. The circulated document consisted principally of the list of operational definitions in alphabetical order with instructions to modify them or to indicate that they were adequate. Additional questions reflected comments raised in previous rounds or addressed specific measurement issues. When at least 80% of respondents agreed on a definition or question, we assumed consensus, and the issue was dropped from subsequent rounds.
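For readers who prefer a concrete restatement, the 80% agreement rule described above can be expressed as a short calculation. The following Python sketch is purely illustrative and was not part of the study protocol; the data structure, function name, and example votes are hypothetical.

```python
# Illustrative sketch of the consensus rule described above (hypothetical data).
# An item is treated as having reached consensus, and is dropped from the next
# round, when at least 80% of that round's respondents agree with its definition.

CONSENSUS_THRESHOLD = 0.80

def items_to_drop(responses):
    """responses maps each attribute label to a list of True/False agreement votes."""
    dropped = []
    for item, votes in responses.items():
        agreement = sum(votes) / len(votes)  # proportion of respondents who agree
        if agreement >= CONSENSUS_THRESHOLD:
            dropped.append(item)
    return dropped

# Example with invented votes for 2 attributes and 15 respondents:
round_1 = {
    "relational continuity": [True] * 13 + [False] * 2,  # 87% agreement -> dropped
    "comprehensiveness":     [True] * 9  + [False] * 6,  # 60% agreement -> retained
}
print(items_to_drop(round_1))  # ['relational continuity']
```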
Our goal was to have 12 to 15 respondents on most rounds.11 We identified experts from the previously mentioned Expert Working Group and from names recommended by a group of 8 primary health care researchers and decision makers who participated in a Canadian Health Services Research Foundation event on April 7, 2004, as well as names suggested during contacts with the above-mentioned recommended experts. A recommendation by more than 1 person was taken as an indication of recognition by peers. We generated a list of 26 Canadian primary health care experts, equally balanced among clinicians (principally family physicians and nurses), academics, and decision makers from all regions in the country; we selected those with community-oriented as well as medical approaches to primary health care.
Of the 26 experts, we successfully contacted 20 by telephone; 18 participated in at least 2 Delphi rounds and 6 in all 4 rounds, 4 of them leading experts. The number of respondents was 15, 12, 14, and 11 in the 4 rounds, respectively. We conducted the first 3 rounds in June 2004, and round 4 and the face-to-face consultation in fall 2004. The study received ethical approval, and experts consented to be identified as participants.
We purposely did not define primary health care, although we encouraged participants to think broadly, and we introduced the notions of professional models (primary care) and community-oriented models (comprehensive primary health care) that emerged in a classification of international models conducted by Lamarche and colleagues.12
We specified that the attribute descriptions be stated in measurable terms that reflect on the organization providing care. Participants were asked (1) to modify the existing definitions, (2) to suggest other attributes that should be included, and (3) to propose operational definitions for new attributes. In the second round, experts were again asked to modify definitions where consensus had not emerged and to comment on the 9 additional dimensions that had been added. They were also asked to remove any attributes that were either redundant or unmeasurable.
In the third round, definitions or labels were again refined. Additionally, participants were asked to identify the best data source for measuring each attribute and to indicate whether the attribute was specific to primary health care or a generic attribute of health care.
During the summer we reviewed the literature and mapped the operational definitions to validated questionnaires that evaluate primary health care attributes from the client perspective. This effort led us to identify additional attributes and to split some definitions into measurable components within distinct data sources. The results and suggestions were again circulated to experts. Issues for which consensus had not been achieved were brought to a face-to-face meeting with the research team, 6 of the participants, 3 project officers from the Canadian Institute for Health Information, and the PHC research team from the University of Ottawa that hosted the meeting. We also revisited all the attributes that had been raised at any point in the process to determine whether they should be retained.
RESULTS
The evolution of the attributes and their labels during the 4 Delphi rounds is displayed in Figure 1. Some attributes and labels were retained over most of the rounds, as indicated by arrows that cut across the rounds. Converging arrows indicate attributes that were collapsed. For instance, some elements of the attribute of patient safety were integrated into the definition of quality of care and others into quality improvement process. Other concepts were split for either definitional or measurement clarity, as indicated by diverging arrows. For instance, comprehensiveness was split into whole-person care and comprehensiveness of services. Clinical quality of care was split into technical quality of clinical care and interpersonal communication, because the sources of information for these 2 components were completely different.
Table 1 displays the attribute labels, indicates whether the attribute is specific to primary health care, and shows the best information source for measuring the attribute. We have categorized the attributes as clinical practice attributes, structural dimensions, person-centered dimensions, community-oriented dimensions, and system performance dimensions.
Table 2 displays the final list of 25 operational definitions in the same order as in Table 1, along with the degree of consensus for each definition: high when at least 80% of participants agreed completely with the definition, moderate when 60% to 79% agreed, and low when less than 60% agreed. Definitions designated as new were not submitted to the Delphi consultation; they were proposed at the final face-to-face consultation to clarify attributes that had been problematic over various rounds.
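Purely as an illustration, and not part of the study’s methods, the consensus categories can be restated as a simple threshold rule; the function name and example values below are hypothetical.

```python
# Hypothetical sketch of the consensus categories described above:
# high (>= 80% complete agreement), moderate (60%-79%), low (< 60%).

def consensus_category(percent_agreeing: float) -> str:
    if percent_agreeing >= 80:
        return "high"
    if percent_agreeing >= 60:
        return "moderate"
    return "low"

print(consensus_category(85))  # high
print(consensus_category(72))  # moderate
print(consensus_category(50))  # low
```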
DISCUSSION
This consensus process resulted in 25 operational definitions of attributes of primary health care. Only a few were identified as being specific to primary health care: first-contact accessibility, relational continuity, family-centered care, population orientation, and intersectoral team work. The last 2 pertain particularly to community-oriented primary health care models. Although some attributes, such as comprehensiveness of services and technical quality of clinical care, are relevant to all parts of the health system, there are subdomains that pertain specifically to primary health care. Comprehensiveness specific to primary health care covers diagnosis and management of commonly occurring acute and chronic conditions as well as clinical preventive care. Likewise, the criteria for technical quality of clinical care will differ for primary health care, specialty ambulatory care, and in-hospital care.
The request to identify the best information sources for measurement led to definitional clarity in some cases. For instance, comprehensiveness, as defined in the first 2 rounds, encompassed several sub-concepts and threatened to become immeasurable. This problem was resolved by splitting the definition into whole-person care (consideration of all dimensions of a person), which is best measured by the patient, and comprehensiveness of services (availability of a range of health services), which is best measured by the provider. Likewise, quality of clinical care, classically defined as care conforming to technical and interpersonal standards,13,14 was narrowed to technical quality of clinical care, best measured from chart audit or clinician report, and to specific aspects of interpersonal care, such as interpersonal communication and respectfulness, which are best measured by the patient.
The intentional inclusion of panelists who represented both the community-oriented (comprehensive primary health care) and professional (primary care) models enriched the operational definitions, making this list broadly relevant to a variety of primary health care models across Canada and internationally. Client/community participation and the intersectoral team are relevant principally to community-oriented models. The initial conception of community orientation as “the extent to which the primary care provider assesses and responds to the health needs of the community” was expanded by many clinicians to include “… and to which the community context is considered in the care of individual patients.” The latter phrase was eventually incorporated into the definition of whole-person care, and community orientation was reframed as “population orientation,” with a note that the population is conceived differently in community and professional models.
Program evaluators from regional and provincial government health authorities also enriched and expanded operational definitions. From an evaluation perspective, they were particularly interested in how primary health care fits into the health system rather than in the effectiveness of processes and structures at the clinical level. Decision makers placed particular emphasis on such concepts as resource availability, efficiency, and accountability. Efficiency was recognized by all as an important outcome, but despite repeated efforts, it seemed impossible to find a concise operational definition on which everyone could agree. We leave the completion of this exercise to others! Our attempt to map the dimensions to questionnaires also suggests that certain attributes, such as equity and accountability, are inferred as outcomes from analyses rather than measured directly.
Some concepts and definitions remained thorny throughout the process. Responsiveness is a case in point. Responsiveness is an attribute used by the World Health Organization when ranking health systems,15 and it is defined “as a measure of how the system performs relative to non-health expectations for how people should be treated by providers with respect to dignity of the person, confidentiality, autonomy to participate in choices about one’s own health, prompt attention, quality of amenities, choice of provider, and access to family and friends during care.”15 Because the WHO definition overlapped with our definitions of other attributes, we removed the obviously overlapping elements of confidentiality (addressed in clinical information management), autonomy (addressed elsewhere as shared decision making), and choice of clinician (addressed in accessibility). We dropped access to social support (family and friends) because it is mostly pertinent to hospital services in nonindustrialized countries. We did achieve high consensus on the definition of responsiveness as the “ability of the primary care unit to provide care that meets the non-health expectations of users in terms of dignity, privacy, promptness, and quality of basic amenities.” We experienced problems, however, when mapping questionnaires to the operational definitions: many questionnaire subscales mapped equally to responsiveness, interpersonal communication, whole-person care, and relational continuity. Consequently, at the face-to-face meeting the panelists concluded that responsiveness was unworkable as an operational definition, and it was narrowed to the more measurable and distinct dimension of respectfulness.
The placement of shared decision making within a dimension was also problematic. It was initially represented in both advocacy (the clinician representing the individual’s best interests, including informed decision making) and interpersonal communication (the ability of the clinician to engage in shared decision making). At the face-to-face meeting, the suggestion was made to include it in another dimension, patient-centeredness, following the work of Moira Stewart.16–19 A literature search found a variety of definitions of patient-centered care, with common elements being shared decision making and the explicit recognition of the individual’s values, preferences, and ways of understanding their health.20–24 A literature-based definition rapidly became unwieldy, however, including too many elements of care and overlapping with other definitions. It became more workable to recognize that patient-centered care encompasses the attributes of whole-person care, family-centered care, respectfulness, cultural sensitivity, and advocacy, for which there is empirical support.25 Even shared decision making is not essential to patient-centeredness. Stewart observes that “being patient centred means taking into account the patient’s desire for information and for sharing decision making and responding appropriately.”22 In consultation with Stewart, the research team decided that the important concept of patient-centeredness was already included in 7 existing definitions and that a separate definition would therefore be redundant. As well, the team and Dr Stewart agreed with the Delphi panel’s preference to place shared decision making in interpersonal communication.
The final list of operational definitions includes 2 that were not submitted to the full consensus process: informational continuity and accessibility-accommodation. Informational continuity was suggested as an attribute in round 2 but was then subsumed within clinical information management and relational continuity in round 3. At the face-to-face meeting, however, the panelists suggested that it be reintroduced in accordance with the conceptual work of Haggerty and colleagues.26 First-contact accessibility was recognized as a core attribute of primary care after round 1, but as more elements were added to the definition, we began to lose the specificity of first contact. In round 4, the panel agreed that it was important to keep first contact (patient initiated) as a separate dimension and to introduce another definition, accessibility-accommodation, for general accessibility. The definition we present here recognizes that accessibility occurs at the interface of service availability and patient capacity. We include it for completeness, recognizing that it has not been submitted to scrutiny.
Some will be surprised at the absence of satisfaction as an attribute to be measured. Satisfaction is defined as the “extent to which services adequately fulfill the expectations of patients.”27 Our panelists were divided as to whether satisfaction was an attribute of care or a metric of the achievement of other attributes; ultimately, by a small majority, it was not included as an attribute of care. A systematic review by Crow and colleagues27 points out that satisfaction is a relative concept and that little is known about the mechanisms by which judgments are made. Furthermore, expressions of satisfaction and dissatisfaction seem to tap into different constructs rather than being extreme ends of the same dimension. Even so, we cannot ignore that many questionnaires, especially visit-based ones, elicit satisfaction with care.
Several other dimensions identified in questionnaires did not emerge as attributes in our consultation. One is patient enablement, a patient’s sense of self-efficacy in being better able to understand and manage a health condition as a result of the clinician’s behavior during the visit.28,29 Like satisfaction, patient enablement is considered an outcome rather than an attribute of care and is elicited only in visit-based questionnaires.30 Another is trust,31 which we also believe is an outcome rather than an attribute of the care process. The omission of cost as a barrier to accessibility may also strike some as an important lapse. This omission undoubtedly reflects the Canadian context of universal access to medical services, in which cost barriers are theoretically minimal. There may be other omissions or particularities that reflect the Canadian nature of the consultation, but we believe that the definitions are robust internationally. For example, the attributes map well to the 6 characteristics the Institute of Medicine identified for a health system that meets patients’ needs: safe, effective, patient-centered, timely, efficient, and equitable.32
There are important limitations to address. Some may find that the number of experts (see Acknowledgments) was too small to constitute a consensus among Canadian experts; however, this number is within the norm for Delphi studies, as it allows for rich input and feedback.33 The Delphi method falls more within the qualitative than the quantitative paradigm, and participants were purposefully selected to maximize variation in perspective. We acknowledge that our desire to balance the panel meant that expertise for any given perspective was limited, and in future efforts it would be important to seek specific expertise on areas of low consensus. Nonetheless, we are confident that areas where consensus is high would be largely agreed on by an independent consensus consultation, recalling that consensus reflects general agreement while allowing for variation in details. Definitions with moderate or low consensus may indeed benefit from further development and broader consultation, but we leave that to future research.
Some may also be concerned that not having all the experts respond to every round may bias the consensus. Although the varying number of responses for each round was not ideal, we were comforted that the core of consistent responders included the most experienced and widely recognized experts on the panel, and that the major changes occurred early in the process. Again, we refer the reader to the degree of consensus and the stability of definition (described in Figure 1) as a guide to confidence in the definitions proposed.
This attempt is not the first, nor will it be the last, to define qualities and attributes of primary health care.24,34 Our goal was to arrive at operational definitions that would establish a common lexicon for describing attributes in the health system and in primary health care in particular and to aid in the selection of evaluation tools. Our report highlights the need for data from patients and clinicians, as well as routine administrative data, to get a valid and global evaluation of primary health care.
Acknowledgments
In addition to Christine Beaulieu, who coordinated the project, Ian Haggerty, who conducted the Delphi, and Gervais Beneguissé, who mapped attributes to validated questionnaires, we are indebted to all the Canadian primary health care experts who agreed to participate and be named: Jan Barnsley, PhD, University of Toronto, Ontario; June Bergman, MD, Department of Family Medicine, University of Calgary, Alberta; Brendan Carr, Capital District Health Authority & Dalhousie University, Nova Scotia; Brian Hutchison, MD, MSc, Department of Family Medicine, McMaster University, Ontario; Judith McIntosh, PhD, Nursing, University of New Brunswick, New Brunswick; Carmel Martin, MD, PhD, Department of Family Medicine, University of Ottawa, Ontario; Ruth Martin, RN, Nursing, Dalhousie University, Nova Scotia; Maria Mathews, PhD, Community Health, Memorial University of Newfoundland, Newfoundland; John Millar, CIHI, Vice President of Research, Vancouver Coastal Health Authority, Vancouver, British Columbia; Donna Murnaghan, PhD, School of Nursing, PEI Health Research Institute, Prince Edward Island; Betty Newson, Consultant, Policy Analyst, Prince Edward Island; Pierre Tousignant, McGill University, and Département de Santé Publique; Lysette Trahan, Ministère de la santé et des services sociaux du Québec, Québec; W. E. Thurston, University of Calgary, Alberta; Diane Watson, Health Council of Canada & University of British Columbia, British Columbia; Greg Webster, Canadian Institute for Health Information, Ontario; Ruth Wilson, Department of Family Medicine, Queen’s University, Ontario; Sabrina Wong, RN, PhD, School of Nursing, University of British Columbia, British Columbia; Christel Woodward, PhD, Clinical Epidemiology & Biostatistics, McMaster University, Ontario; Moira Stewart, University of Western Ontario, London, Ontario.
Footnotes
- Conflicts of interest: none reported
- Funding support: This project was funded by the Canadian Institutes of Health Research.
- Received for publication October 24, 2006.
- Revision received January 16, 2007.
- Accepted for publication January 29, 2007.
- © 2007 Annals of Family Medicine, Inc.