Abstract
PURPOSE Increased workload associated with electronic visits (eVisits) in primary care could potentially be decreased by the use of artificial intelligence (AI); however, it is unknown whether this use of AI would be acceptable to staff and patients. We explored patient and primary care staff views on the use of and opportunities for AI during eVisits.
METHODS We conducted semistructured interviews and focus groups with primary care staff (n = 16) and patients (n = 37) from primary care practices in northwest England and London (n = 14) using the Patchs eVisits system (Patchs Health Limited; www.patchs.ai) from May 2020 to September 2021. We analyzed verbatim transcripts using thematic analysis.
RESULTS Misconceptions regarding AI were common, which led to initial reservations on its use during eVisits. Perceived potential AI benefits included decreased staff workload and faster response times for patients. Safety concerns stemmed from the complexity of primary care and fears of depersonalized service. The following 7 opportunities for AI during eVisits were identified: workflow, directing, prioritization, asking questions, writing assistance, providing self-help information, and face-to-face appointment booking. Despite staff concerns regarding patient acceptability, most patients welcomed the use of AI if it were used as an adjunct to (not replacement for) clinical judgment and could support them in getting help more quickly. Retention of clinical oversight and ongoing evaluation was key to staff acceptability.
CONCLUSIONS Patients and staff welcomed the use of AI and identified 7 potential uses during eVisits to decrease staff workload and improve patient safety. Successful implementation will depend on clear communication from practices, demonstration and monitoring of safety, clarification of misconceptions, and reassurance that AI will not replace humans.
- primary health care
- general practice
- electronic visits (eVisits)
- artificial intelligence (AI)
- qualitative research
INTRODUCTION
Remote or electronic visits (eVisits) enable patients to contact their health care provider by completing an online form.1 Also described as e-consultations, online consultation systems, and online triage, eVisits have been promoted internationally as a way to relieve pressure on health care services via more effective use of resources.2-8
Primary care providers in England have been mandated to offer eVisits for all patients since April 2020.6 Adoption of eVisits was rapidly accelerated during the COVID-19 pandemic, and eVisits have been available to 85% of the population of England since May 2020.9 Whereas eVisits can increase access to health care for certain groups of patients,10 their use can also have negative consequences, including increased staff workload,11,12 increased patient demand,13 and the creation of new inequalities in access.14 This increased demand might be supply-induced15 (ie, eVisits make it easier to contact primary care clinicians) or might unmask a previously unmet need.16 Negative outcomes of eVisits could potentially be mitigated by improved system design, better integration into staff workflows, and the incorporation of advanced technologies such as artificial intelligence (AI).8
Artificial intelligence can be defined as the ability of machines to “mimic human intelligence as characterized by behaviors such as cognitive ability, memory, learning, and decision making.”10 One way AI could help decrease the workload and patient demand associated with eVisits is by supporting patient self-care or directing patients to other services as appropriate.8
Despite the potential to improve health care, AI tools are not yet routinely used in primary care.17-20 Barriers include data privacy and ethical concerns, legislation, and a lack of training and trust among health care professionals and patients.19-22 There are also broader concerns regarding how to adequately validate AI performance, particularly for AI that continuously learns from clinical data over time, a concern highlighted by regulators including the US Food and Drug Administration and the European Union.23 There are few examples of eVisits using AI and limited research on its use in this context. In our recent review, 13 of 24 eVisit systems used AI, though this was mainly restricted to adapting the questions used to elicit information from patients.8 The studies in our review offered little exploration of staff and patient experiences of AI features and were limited by poor patient uptake.8
To ensure the successful adoption of AI systems, primary care staff need to consider the technology to be advantageous24 and be willing to make the necessary adjustments to their work practices.25,26 However, there is a paucity of research at the preadoption stage of AI innovation and limited exploration of patient perspectives on the use of AI in primary care.19,27 To address this gap in the literature, we used qualitative methods to explore the views of staff and patients in primary care to inform the development of AI features for eVisits. We report our findings according to the Consolidated Criteria for Reporting Qualitative Research (COREQ).28
METHODS
Setting and Sample
Primary care practices in England using the Patchs eVisit system (Patchs Health Limited; www.patchs.ai) were eligible for the study (n = 51 during the study period [May 2020-September 2021]). Practices were identified to obtain variation in the following characteristics thought to affect the adoption of health technologies8: practice size (according to number of registered patients), rurality, and level of socioeconomic deprivation. Fourteen practices were recruited in northwest England (n = 11) and London (n = 3), representing different geographic areas, patient population sizes, and levels of socioeconomic deprivation. Recruited practices had used Patchs for 1-21 months by the end of the study period.
eVisit System
Patients access Patchs from their primary care practice’s website and can submit clinical (eg, new health problem, ongoing health problem) and nonclinical (eg, administrative, medication) requests as unstructured free-text responses to open-ended questions in an online form. Questions are determined by the type of request selected by patients and cover topics of typical traditional primary care consultations.29 Staff aim to respond within a stipulated timeframe (eg, 48 working hours) set by the practice, either by written message, by video or telephone call, or by arranging an in-person visit. Supplemental Appendix 1 provides a full description of the system using the template for intervention description and replication (TIDieR) checklist.30
Data Collection
We collected data using semistructured interviews and focus groups with primary care practice staff and patients. We invited 37 practice staff and 230 patients via e-mail to be interviewed. Recruitment was purposeful to achieve variation in the following characteristics that could affect the adoption of health technologies: age, sex, ethnicity, frequency of eVisit use, and (for staff) role. Recruitment was stopped at 40 participants (16 staff and 24 patients) when data saturation was reached.31
Interviews were conducted by telephone by S.M., S.D., or T.C. (mean duration 33 minutes, range 13-62 minutes) and were audio-recorded and transcribed verbatim. The interview topic guide (Supplemental Appendix 2) started by exploring each interviewee’s understanding of AI, which was clarified by presenting a brief explanation in plain English. It then covered the interviewee’s views on the potential uses of AI during eVisits, its risks and benefits, and likely challenges to its adoption into clinical practice. Our initial topic guide and explanation of AI were refined based on feedback from a meeting with the National Institute for Health and Care Research Applied Research Collaboration Greater Manchester Public and Community Involvement and Engagement Panel (8 members). This meeting (June 2020) was also used to test emerging findings from early interviews. A second focus group was conducted (May 2021) with a different patient group (5 participants) to discuss and interpret the findings of the interviews. Discussions were led by S.M. while S.D. took detailed notes (Supplemental Appendix 2).
Data Analysis
We imported interview transcripts and focus group notes into NVivo v12 (Lumivero LLC). Each was independently coded line by line by at least 2 of 3 authors (S.M., S.D., T.C.) using thematic analysis.31 Disagreements were resolved via discussion, with the broader study team consulted as necessary. The coding framework was both emergent from the data (inductive) and guided by findings from our recent systematic review of eVisit research (deductive).8 Findings were triangulated between focus groups and interviews32 and compared and contrasted between participants and primary care practices. Codes and themes were refined during weekly research meetings with the broader study team and organized into descriptive themes to present the results. We conducted data analysis alongside data collection; both ceased at saturation, when themes were fully developed with clear definitions and no new information emerged after ≥3 interviews.33 Supplemental Appendix 3 provides further information on data analysis.
RESULTS
Both clinical (n = 10) and nonclinical (n = 6) staff agreed to be interviewed (Table 1). Patient interviewee (n = 24) demographic characteristics are presented in Table 2. Patient and staff perspectives are reported together, with differences highlighted as necessary. The following 4 overarching themes were identified: initial misconceptions and reservations, potential benefits of using AI during eVisits, potential risks of using AI during eVisits, and opportunities for using AI during eVisits.
Table 1. Staff Characteristics
Table 2. Patient Characteristics
Initial Misconceptions and Reservations
Unfamiliarity With Artificial Intelligence
For some patients, the term AI was off-putting and carried negative connotations. They felt AI to be intimidating and otherworldly. The suggestion of AI in the context of health care was therefore alarming:
Well, something like ET I think, twiddling a finger and thinking, what shall I tell this human? No, I just, no. I just…no (patient 23, practice 44, female, age 75 years).
Staff felt they knew little about AI and believed they had no experience in using AI technologies in their clinical practice. This led to some trepidation regarding the incorporation of AI, which was balanced with acknowledgment of their initial skepticism about the use of eVisits:
I think, probably like a lot of clinicians, I’m probably initially sceptical…it just comes back to safety. But then, if you’d have asked me a year ago, or a bit longer, what do I think about eVisits, again, I would have been so sceptical, but, you know, now I’m quite a fan (staff 13, practice 1, female, general practitioner [GP]).
Overestimation and Underestimation of Artificial Intelligence Capabilities
Unfamiliarity with AI led patients to overestimate the level of input and responsibility AI would have, believing that AI could diagnose and prescribe with no input from their primary care provider:
My concern would be, if it’s possible for the bot to straight away prescribe you medication or make a diagnosis (patient 4, practice 2, female, age 38 years)?
Patients feared that if AI proved too successful, it would eventually replace doctors, leading to job losses across the workforce:
One concern is that maybe some people, like they will lose their jobs, this is the only thing, that the machine will make the decision and maybe after we have more data and you teach the model better, so when the time comes that it makes knowledgeable decisions than a human, then it will replace the human (patient 7, practice 1, male, age 36 years).
Although few staff identified this as a real concern, they also expressed initial confusion as to what features of the eVisit could be automated. Some were therefore unable to specify in advance how they thought AI features could improve their workflow:
I suppose, I mean I don’t really know, I’m not entirely sure kind of what the ability of the AI is going to be (staff 7, practice 6, male, GP).
Staff and patients who could provide examples of AI largely recalled frustrating encounters with online customer service chatbots or virtual assistants. Despite an initial overestimation of AI capabilities, interviewees described these chatbots as rudimentary and unable to provide adequate and relevant information. Such negative encounters contributed to concerns regarding the use of AI in health care, given the higher stakes involved:
I don’t like it from [energy supplier]. I wonder how I would find it in terms of my healthcare requirements (patient 10, practice 6, female, age 57 years).
Once the plain English explanation of AI had been presented (Supplemental Appendix 2), most participants were able to discuss the potential benefits and risks of AI during eVisits in more depth.
Potential Benefits of Using Artificial Intelligence During eVisits
Workload Reduction
The potential to ease clinical staff workload was the most commonly cited benefit. Patients readily acknowledged the strain placed on primary care and believed that AI could make certain tasks more efficient and cost effective:
I’m extremely comfortable with it, and obviously I recognise the benefits of it, because we are an ageing population, and healthcare is becoming more and more expensive, and it is eventually going to become unaffordable unless we introduce technologies like AI (patient 11, practice 1, male, age 69 years).
Primary care staff hoped that AI would help address the perceived supply-induced demand14 witnessed since adoption of eVisits:
Because I think it’s good that patients can access GPs a bit easier, but hopefully…you’re dealing with the stuff that needs to be dealt with rather than wasting your time dealing with trivial stuff, you know, when someone else with a more urgent problem could be being dealt with…I think hopefully it would free us up but still sort the patients out (staff 5, practice 2, female, GP).
Safety Improvements
Some interviewees identified potential safety benefits from the incorporation of AI. They believed that AI would decrease the risk of human fatigue and error and welcomed its ability to access and process large amounts of information quickly:
I could see perhaps a potential benefit from the wealth of information that would be available to it. It can assess things with a huge amount of information that has come in, can’t it. So it should be pretty accurate really, I would have thought (patient 18, practice 5, male, age 73 years).
Potential Risks of Using Artificial Intelligence During eVisits
Depersonalized Service
Several patients questioned whether the inclusion of AI would lead to less in-person contact. They feared a depersonalized service and a disconnection between themselves and their primary care practice:
I feel uncomfortable if I’m being really honest. Knee-jerk reaction is I feel uncomfortable [about] any kind of AI because it’s…I just think certain things, you still need some face-to-face interaction, some interaction with another human (patient 15, practice 6, male, age 58 years).
Effect on Patient Use of eVisits
Staff were eager for patients to continue to use eVisits. They worried that AI incorporation could jeopardize patient acceptability:
So, I kind of think, taking that little bit away from them, they’ll then go back to the phones (staff 4, practice 3, female, receptionist).
However, some staff believed that any patient trepidation would be part of an inevitable transition period, similar to that observed after the adoption of eVisits. They believed that over time, patients would begin to trust a system with automated features, just as they had done with eVisits:
I think, initially, there possibly might be quite a lot of resistance from it…it might be extra work in the beginning, but once they start to spot a pattern, actually, no, the doctors are agreeing with what the AI is telling us, that might then disappear (staff 10, practice 5, female, GP).
Many patients were largely apathetic about the introduction of AI as long as their user experience was not adversely affected. They found eVisits an easy way to contact their primary care provider and were happy with the introduction of AI as long as they could still get help swiftly:
I mean, they’re going to get replies, aren’t they, still? And be treated exactly the same…Is there going to be a big difference if they do it (patient 5, practice 8, female, age 49 years)?
Safety Concerns
Both patients and staff expressed safety concerns, specifically with regard to the complexity of primary care practice. Worries stemmed from the belief that AI would miss nuanced issues, leading to delays in patients receiving the correct help. Some clinicians admitted that they were not always confident how to prioritize requests based on the information currently received from eVisits and were therefore unsure about trusting AI, given that it lacked human instinct and knowledge of their individual patients’ behavior:
I think general practices are so complicated and sometimes I can’t even work out what’s routine and what isn’t based on an initial request. It just rings a lot of alarm bells for me (staff 9, practice 3, male, GP).
The requirement for rigorous and ongoing evaluation was a prominent theme for clinicians, along with the need for clarification of clinical responsibility and oversight. However, most believed that there was minimal risk to patient safety in automating specific tasks. Patients and staff felt reassured if each eVisit was reviewed by a human at some point after submission. This would provide an opportunity for staff to intercept any errors (eg, an eVisit sent to reception instead of a clinician) and to escalate urgent eVisits as needed:
I think there’s obviously risk involved with trusting computers and AI, so we’d want to know that it was fit for purpose…I wouldn’t want it to just do something automatically in the background and then close something off without a human taking a look at that request too at some point in the process (staff 1, practice 3, male, GP).
Patients voiced concerns that the use of AI would increase the responsibility placed on them to input their symptoms accurately and in sufficient detail. They worried that omitting key words or using incorrect terminology could result in their request being deprioritized in error or directed to the wrong person, or even in a misdiagnosis. They believed that administrative staff triaging requests over the telephone, or a clinician triaging eVisits, would note the absence of key information and probe with further questions, whereas they feared AI would not:
If someone was having a heart attack, say, but didn’t express themselves very well, and the AI thought, oh, it’s nothing much that and said just rest at home or something (patient 18, practice 5, male, age 73 years).
Few patients raised concerns pertaining to data security and the potential vulnerability of AI systems. Most dismissed fears of data misuse, citing trust in their primary care practice to have undertaken all necessary checks and precautions before the introduction of any new software:
Well, there’s the element of trust in everything, isn’t there? If you can’t trust your GP, you’re on a loser straight away (patient 17, practice 5, male, age 58 years).
Opportunities for Using Artificial Intelligence During eVisits
Given their lack of clinical training and unfamiliarity with how eVisits work from an organizational perspective, patients found it challenging to suggest specific uses for AI. Staff were able to identify 7 potential uses of AI during eVisits, which are described below. When these were explained to patients, they considered them both acceptable and useful.
Workflow
The ability of AI to direct eVisits to the most appropriate professional within the practice was suggested, to speed up triage decisions and ensure that requests needing clinical input were reviewed more quickly:
I certainly think from an AI point of view, things like allocating tasks to the right person would be quite useful for us…so taking out those steps where we have to kind of triage things to various clinicians. And perhaps if it learned…if somebody wanted a form completing it would go to…a member of staff or if somebody needed a prescription it would go to another member of staff, it might be quite useful (staff 1, practice 3, male, GP).
Directing
Directing patients away from the practice was also proposed. Requests containing emergency symptoms, such as chest pains or suicidal intentions, could be directed by AI to emergency services to improve safety. Staff also believed that directing eVisits to community pharmacies or supporting self-care would be particularly useful to relieve unnecessary pressure on primary care practices:
I think that’ll be really useful for quite a few tricky patients, that are repeatedly coming back with the same thing that’s not requiring a doctor or obviously the problems I was saying about over the weekend, signposting to the right services. I think it’ll be really useful for that (staff 5, practice 2, female, GP).
Prioritization
Clinicians believed that AI that could highlight urgent requests would help structure their time more effectively. This would support them to prioritize their workload, speed up triage decisions, and improve patient safety:
I think triaging urgent queries and putting them to the top of the list would be really useful. Because that is something that, at the moment, we’re having to do ourselves (staff 10, practice 5, female, GP).
Asking Questions
Patients voiced concerns regarding the use of AI to adapt questions during eVisit query submission, as used in some existing eVisit systems.8 They believed that it would lead to irrelevant or superfluous questions and feared that the system would no longer be quick and easy to use. However, staff suggested that AI could help gain more targeted information from patients after submission, which was also acceptable to patients. This could include sending relevant validated questionnaires (eg, the 9-item Patient Health Questionnaire for depression) based on the content of the request, prompting patients to upload photos (eg, of rashes), and supporting patients to add more information as necessary:
The computer system could say, right, send me a picture, rather than [me] having to telephone the patient and say, you know, your rash, send me a picture (staff 16, practice 2, male, advanced nurse practitioner).
Writing Assistance
Clinicians also felt that a significant proportion of their time was spent responding to common patient requests, for example, mental health inquiries. They hoped that AI could recommend messages, based on their previous responses, that they could choose and edit before sending to patients:
Sometimes there’s certain things that I find I’m writing over and over again. So just as an example, in the last month, the number of people that I’ve, had a discussion about mental health, gone through the options…I always have the same set advice…typing that out over and over again…it would save us a lot of time and it would save a lot of repetition in us writing the same thing (staff 10, practice 5, female, GP).
Providing Self-Help Information
The ability to provide patients with appropriate trusted information regarding their condition without interacting with a clinician was suggested by staff. This could decrease clinician workload by encouraging patient self-care and making it unnecessary for them to locate and send such information to the patient themselves:
If a patient, for example, mentions a particular condition, or a certain key word…for example, irritable bowel syndrome, or some sort of common condition like that, perhaps [AI] could flag up some already known resources, which are regulated and safe, for example, from the NHS website…they might find the answer that they’re looking for that way [without needing input from a clinician] (staff 13, practice 1, female, GP).
Face-to-Face Appointment Booking
Some clinicians hoped that AI could book face-to-face appointments for patients to speed up the process and avoid workload duplication. In addition, eVisits requiring a physical examination or pertaining to complex, vague, or multiple symptoms could be highlighted as requiring a face-to-face appointment, which could then be arranged automatically. This would avoid back-and-forth messaging or telephone calls with the patient, which result in multiple contacts for the same issue:
If it could automatically triage things that definitely need an [face-to-face] appointment, because actually that could be one way in which it could save some time, rather than having a telephone call from somebody to say, yes, you need to come in…if they needed a pill check and they had to have a blood pressure [check]…could it direct them to an appointment where we have some certain slots for the pill checks (staff 7, practice 6, male, GP)?
DISCUSSION
Summary
This is the first study to report in-depth perspectives of patients and staff on the prospective use of AI during eVisits. Our findings highlight the need to challenge common misconceptions regarding AI and manage user expectations before its adoption into practice. Perceived risks of AI included a depersonalized service. However, staff and patients welcomed AI when used as an adjunct to (not a replacement for) clinical judgment. Perceived benefits of AI included the ability to speed up the consultation process and decrease staff workload. Participants felt that AI had the potential to both improve and worsen patient safety. Despite staff concerns regarding patient acceptability, most patients welcomed the use of AI during eVisits. There are currently few examples of eVisits using AI, and these are restricted to adapting the questions asked of patients.8 We propose that the use of AI during eVisits could be expanded. The following 7 ways to incorporate AI into eVisits were suggested by participants: (1) workflow, (2) directing, (3) prioritization, (4) asking questions, (5) writing assistance, (6) providing self-help information, and (7) face-to-face appointment booking.
Strengths and Limitations
The qualitative research methods we used were suited to providing a detailed understanding of staff and patient views on AI and its use during eVisits. We used interviews and focus groups to triangulate findings. The topic guide encouraged participants’ responses to be grounded in their real-world experience of an eVisit system, which helped them formulate practical responses. Although patients were recruited from a broad range of backgrounds (eg, age, ethnicity, and occupation) (Table 2), the majority of participants were White and British. Interviews occurred during the COVID-19 pandemic, when patients were strongly encouraged to use eVisits, and all patients had used the eVisit system at least once. Patients who had not used the eVisit system might be more averse to new technology, and their perspectives were not explored in this study. Given the increasing global interest in and public use of AI (eg, large language models such as ChatGPT) in the past few years, it is possible that public acceptance of AI has evolved since our data were collected.
Comparisons With Existing Literature
Patients in this study initially overestimated AI capabilities and the level of responsibility that it would have, leading to fears that AI would completely replace clinicians during eVisits. Staff did not echo these fears and, in contrast to previous research, had few concerns regarding clinician unemployment and deskilling.19,20 However, staff were also unfamiliar with AI and initially unsure how it could be used for eVisits. Staff are crucial to the successful implementation of eVisits within primary care and the promotion of AI to patients.19,22,26,27 This reinforces the importance of exploring perceptions before the adoption of AI into practice23 and the need to challenge common misconceptions to ensure that staff are engaged with and supportive of the new technology.19,25
In line with broader research on the use of AI in health care, interviewees saw the value of AI for eVisits but voiced concerns regarding its potential effects on patient safety and the doctor-patient relationship.18,19 We found that patient and staff acceptability depended on the retention of clinician oversight and ongoing evaluation.
In our recent review of the eVisit literature,8 we proposed the following 4 potential new opportunities for AI-powered features in addition to adapting questions posed to patients during eVisit submission: workflow, directing, prioritization, and face-to-face appointment booking. Staff in the present study independently suggested these same features, providing further evidence for their potential utility. Participants also identified the following 3 additional opportunities for AI during eVisits: asking questions, writing assistance, and providing self-help information. Figure 1 shows how these new suggestions relate to the ones identified in our prior review within the broader context of processing eVisits.
Figure 1. Newly Identified Opportunities for Use of Artificial Intelligence
A novel finding of the present study is that despite staff concerns regarding patient acceptability, patients largely expressed a progressive view of AI. However, this study was conducted after the onset of the COVID-19 pandemic, and both staff and patients reflected that their initial skepticism of eVisits had been allayed by recent positive experiences. In the context of these recent rapid changes in primary care, participants might have felt more open to trying new technologies. This could explain why we observed a less mixed response to AI innovation in comparison with other studies24 and why patients raised fewer concerns regarding data security.20
Implications for Research and Practice
eVisit System Developers and Designers
Our results suggested the following 3 new AI-powered features, not previously reported, that could be incorporated into future eVisit system designs (Figure 1): (1) asking questions (eg, sending patients detailed questionnaires about symptoms), (2) writing assistance (eg, recommending editable templates based on previous responses), and (3) providing self-help information (eg, directing patients to relevant websites). System designers could build these new features into their eVisit systems to evaluate their utility in clinical practice.
Our findings suggest that the use of AI to adapt questions posed to patients during eVisit query submission might not be acceptable to all patients. Patients had negative associations with online chatbots and believed that using AI to determine questions would encroach on the core doctor role. Patients were also reluctant to alter the user experience of this particular eVisit system, which consisted of a few simple questions and free-text responses. A combination of multiple-choice questions and free-text responses could be considered to ensure that sufficient information is captured for AI to manage workflow and to direct and prioritize requests effectively. This would also address patient concerns that the use of AI could increase their responsibility to input symptoms accurately and in sufficient detail.
Future system development could include the integration of internet of things devices to supplement information received during eVisits, which could be analyzed by AI. Examples include physiologic data from wearables such as smart watches and devices that enable remote physical examinations (eg, TytoCare [www.tytocare.com]; a handheld exam kit and app that connects patients with clinicians for on-demand medical exams).
Primary Care Staff
Primary care staff should be reassured that patients are open to the use of AI during eVisits once their initial anxieties have been addressed, for example, by providing reassurance that AI will not replace clinical input from their primary care clinician and emphasizing that face-to-face appointments will still be available if necessary. Highlighting expected benefits for patients, such as improved patient safety and the ability to speed up the process of resolving their eVisit, is also important to ensure sustained use by patients.
Researchers
This study investigated preadoption views of AI during eVisits. Future research should develop and implement the suggested AI features into eVisits in practice. Discussions with staff and patients should aim to establish key performance criteria for each AI feature to help inform eVisit system design. Development and evaluation of AI features have already begun in the Patchs system, and articles are currently being prepared for publication. Given that participants in the present study were current users of eVisits, more research is required to determine whether particular groups, such as racial and ethnic minority groups or those less confident with technology, are more hesitant regarding the incorporation of AI.
Conclusion
This is the first reported exploration of patient and staff perspectives on the use of AI during eVisits. We identified 7 specific AI functions for eVisits that could improve patient experience, support patient safety, and decrease staff workload. Successful implementation will depend on demonstrating and monitoring safety, clarifying misconceptions, and reassuring patients that AI use will not negatively affect their experience or replace clinical input.
Acknowledgments
The study authors wish to thank members of the NIHR Applied Research Collaboration Greater Manchester Public and Community Involvement and Engagement Panel for their input to the design of the study and patients and staff from general practice who volunteered their time to participate in this study.
Footnotes
Conflicts of interest: B.C.B. is a shareholder in Patchs Health, which developed the Patchs eVisit system. All other authors report none.
Funding support: This research was funded by Innovate UK (105178) and a Wellcome Trust Clinical Research Career Development Fellowship for B.C.B. (209593/Z/17/Z). This work was supported by the National Institute for Health and Care Research (NIHR) Greater Manchester Patient Safety Translational Research Centre (award number PSTRC-2016-003).
Disclaimer: The views expressed are those of the authors and not necessarily those of the NIHR or the Department of Health and Social Care. The NIHR had no role in study design, data collection, data analysis, data interpretation, or writing of the report. The corresponding author had full access to all of the data and the final responsibility to submit for publication.
Ethical approval: Ethical approval was obtained from the National Health Service Health Research Authority Yorkshire and The Humber - Bradford Leeds Research Ethics Committee (reference 20/YH/0020).
- Received for publication June 20, 2024.
- Revision received November 12, 2024.
- Accepted for publication January 20, 2025.
- © 2025 Annals of Family Medicine, Inc.