Article Figures & Data
Tables
- Table 1.
Participant Characteristics
Gender, No. (%)
- Women: 9 (75.0)
- Men: 3 (25.0)
Age group, No. (%)
- ≤29 years: 3 (25.0)
- 30-44 years: 8 (66.7)
- 45-59 years: 1 (8.3)
Main role, No. (%)
- Family physician: 4 (33.3)
- Practice nurse (eg, RN, LPN): 3 (25.0)
- Nurse practitioner: 2 (16.7)
- Clinic lead or manager: 1 (8.3)
- Social worker: 1 (8.3)
- Income security health promoter: 1 (8.3)
Time in clinical practice, mean (SD), y: 6.9 (9.2)
Time in nonclinical role, mean (SD), y: 3.3 (3.8)
General knowledge of AI, No. (%)
- Minimally knowledgeable: 7 (58.3)
- Moderately knowledgeable: 4 (33.3)
- Very knowledgeable: 1 (8.3)
AI = artificial intelligence; LPN = licensed practical nurse; RN = registered nurse.
- Table 2.
Using a Sociotechnical Systems Framework for the Application of AI in Health Care Delivery to Describe Participant Preferences on People, Process, and Technology
Domain: People

Who should have access to AI-derived social data?
Themes and subthemes:
- “Circle of care” team or anyone already having access to patient record
- Uncertainty whether clerical staff and/or patients should have access
- Appropriate to be accessed for quality improvement, practice-related purposes (eg, program planning/development, practice management, resource allocation) and research
- No access for private/third-party entities outside of circle of care (eg, insurance companies, housing agencies, any commercial organization)
Salient quote:
- “I think it’s very important to really understand those vulnerabilities linked to the social determinants to, kind of, allocate your resources as a provider, how much time investment is required to cater to the specific needs. So yeah, definitely, that information should be available to all the providers in the family health team, so they can provide that targeted, tailored care.”

Who should take action if a social need is identified through the AI tool?
Themes and subthemes:
- Should be led by most responsible provider (usually the family physician)
- Varying levels of action for different team members involved with patient care (eg, social workers, nurses)
- Many expressed the challenge of deciding when and how to act on an identified social need
Salient quote:
- “I think MRP is the most responsible. But I think anybody that sees that information could take a step to act on it. So you know, if they they’re meeting with nursing that day, and they notice something, and it’s something that might be appropriate for a referral to a social worker, the income program, you know, we get referrals from doc, from nurses, everybody does.”

Domain: Process

Start with pilot implementation
Themes and subthemes:
- Focus on 1 or 2 social determinants initially
- Verify AI response with patients to establish accuracy of the tool
- Anticipate guidance or management required according to AI output
- Obtain feedback from clinicians at end of pilot phase to measure satisfaction and usefulness
Salient quote:
- “How would I go about contacting the patient and saying, ‘Hey, the computer thinks you might have low income? What’s your income?’ How would that communication piece go?”

Workflow considerations
Themes and subthemes:
- Patients were to verify/confirm a social need once identified, but participants were uncertain how to do this without concerning their patients
- Additional staff time and resources needed
- Participants desired a balance between asking for SDOH information directly from patients (eg, surveys) and using AI to derive it where missing
- Participants indicated that consent should be sought from patients to use AI to derive social information (either direct or implied)
Salient quotes:
- “I can’t imagine telling all my patients that AI is going to be reviewing their charts. They would absolutely never see me again.”
- “… for an individual social worker or physician, I think this would add work to our day. But probably provide better care. There’s a chance that we’d maybe solve their homelessness earlier and then later not have to deal with terrible mental health issues. So I guess that could be time gained. But overall, I suspect it would cause more work, which isn’t bad, because it’s probably for the best of the patient.”

Activities or initiatives to support the adoption and integration of the AI tool
Themes and subthemes:
- Regularly scheduled meetings to discuss AI tool, implementation, and evaluation
- Use team to develop the algorithms alongside FHT staff who are ideally knowledgeable about the clinical environment
- Additional hours/staffing for nursing, physician assistant, and/or social work
- More staff for income and housing supports
- Ensure free tax clinics for patients are available
- Hire additional community health workers to conduct telephone check-ins with vulnerable patients
- Ensure good connections and referral pathways to community agencies (ie, housing, income, gender transitioning)
- Plans or recommendations for physicians managing a large volume of messages flagging social needs or concerns identified by the AI tool, which would then need to be sorted and verified; this is especially pertinent for the FHT, as they provide care for a large proportion of patients who would potentially be flagged with social needs or concerns
- Meaningful and long-term engagement with patients and communities
- Support from leadership and clinic management
Salient quotes:
- “But if you want it to be more actionable, then you’d have to have scheduled meetings and have people suited to the clinical environment to help develop algorithms with the staff. So [you’d need] personnel to help do that. And ideally, like nursing and social work hours or physician assistant hours, but that’s like in a dream world, because that’s a huge cost.”
- “It would be something I feel like management could be involved in supporting, whether it’s programs run by the nurses or something, but like we would need support from management and leadership.”

Domain: Technology

What SDOH data should be included in the AI tool?
Themes and subthemes:
- All participants agreed on housing and income insecurity as priorities
- Other SDOH suggested by participants: drug benefits/coverage and other medical coverage (eg, relevant to which medications are prescribed and allied health referrals, such as physiotherapy and massage therapy), sexual orientation, gender, country of origin, education, food insecurity, social isolation (particularly for elderly patients or immigrants), ability to navigate the health care system, health care access
- Date associated with each determinant
Salient quotes:
- “Income is such a broad category, that kind of ties to so many different aspects like food security, housing, job security. And usually, it’s almost like, it’s so interchangeable, like, because of the health, you know, all these things are affected, or because of the income, the health is affected. So it just relates so well. So that [income] will be a very broad theme that should be given good focus.”
- “… Sure, it takes a lot of work and resources to get something like this going. So if we know that it’s this … and then maybe we move on to another one. We’d say like, ‘Oh, I really like housing and income, it’s really important. It’s helpful.’ And maybe the rest of it is like, we’re okay to do without or something, we can just figure that out. And it’s less critical or maybe down the road.”
- “Like the prescribing, you briefly glance at the side to see ‘Do they have insurance coverage?’, like ‘What did they do for work?’ all of like micro pieces of information that guide your decision with income and employment.”

How could this AI tool be most useful?
Themes and subthemes:
- Embedded in patient record/EHR
- AI tool to gather and summarize important information from patient record
- Ensure tool provides actionable output
- Monitor changes in patient status (eg, housing, income) and alert if potential challenge arises (an illustrative sketch follows this table)
- Automatic prompts for specific appointment or referral related to social work, income support, telephone support, or other resources
- AI tool output connected to local, evidence-based solutions
Salient quotes:
- “I think kind of a change in status could also be interesting, you know, someone who’s kind of been as, let’s say, middle of the road, and then all of a sudden, the algorithm predicts that there’s been a significant drop in their income security, housing security, etcetera. And flagging that to the provider, kind of using that as a prompt to have a discussion around that. I think that could also be a pretty useful tool.”
- “But if we could use it for programming, like, if we find that like, a lot of our patients are low income and not filing their taxes, then we could send them directly to like tax clinics and make sure that those get done.”

AI = artificial intelligence; EHR = electronic health record; FHT = family health team; MRP = most responsible provider; SDOH = social determinants of health.
Note: Framework was developed by Salwei and Carayon.9
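To make the “monitor changes in patient status and alert the provider” preference above concrete, the sketch below shows one way such a rule could behave. It is a minimal illustration only, not the study’s tool: the SdohEstimate record, the 0 to 1 score scale, the 0.25 drop threshold, and the suggested referral text are all assumptions introduced here for illustration.

```python
# Minimal sketch (not the study's tool): flag a significant drop in an
# AI-derived social-risk score and suggest an actionable next step.
# The record fields, the score scale (0 = insecure, 1 = secure), and the
# 0.25 drop threshold are illustrative assumptions.

from dataclasses import dataclass
from datetime import date


@dataclass
class SdohEstimate:
    determinant: str      # eg, "income security", "housing security"
    score: float          # hypothetical model output, 0.0-1.0
    estimated_on: date    # date stamp, as participants requested


# Hypothetical mapping from a flagged determinant to a local resource or referral.
SUGGESTED_ACTIONS = {
    "income security": "Offer referral to income security health promoter / tax clinic",
    "housing security": "Offer referral to social work for housing supports",
}


def flag_status_changes(previous: list[SdohEstimate],
                        current: list[SdohEstimate],
                        drop_threshold: float = 0.25) -> list[str]:
    """Return provider-facing prompts when a determinant's score drops sharply."""
    prior = {e.determinant: e for e in previous}
    prompts = []
    for est in current:
        old = prior.get(est.determinant)
        if old and (old.score - est.score) >= drop_threshold:
            action = SUGGESTED_ACTIONS.get(est.determinant, "Discuss with patient")
            prompts.append(
                f"{est.determinant.title()} dropped from {old.score:.2f} "
                f"({old.estimated_on}) to {est.score:.2f} ({est.estimated_on}). "
                f"Suggested step: {action}. Please verify with the patient."
            )
    return prompts


if __name__ == "__main__":
    before = [SdohEstimate("income security", 0.70, date(2023, 1, 15))]
    after = [SdohEstimate("income security", 0.35, date(2023, 6, 1))]
    for prompt in flag_status_changes(before, after):
        print(prompt)
```

The “please verify with the patient” wording in the prompt reflects the Process-domain themes above, in which participants expected AI output to be confirmed with patients before acting on it.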
- Table 3.
Barriers and Facilitators for the Use of AI-Derived Social Data as Reported by Participants
Domain: People
Barriers:
- Ensuring patient privacy, especially for patients who have expressed they do not want their health information shared outside the circle of care
- Use of AI might lead to patient mistrust
- Some clinicians expressed distrust in AI, especially in its ability to be accurate
- Clinic staff who do not like technology or are unfamiliar with technology are less likely to adopt
- Use of AI might lead to patient mistrust in their clinicians
- Possible public perception of AI as going through their sensitive medical information or disclosing this information outside of the clinical care circle
Facilitators:
- Provide 1-page summary for staff about AI tool
- Make available 1-page lay summary for patients about AI tool
- Don’t call it AI
- Demonstrate the benefit to patients of allowing their clinicians access to AI-derived social data
- Have an AI champion at each FHT site to advocate for its use and to problem solve, and ensure champion is well supported
- Ensure FHT leadership and clinic management support tool and integration into clinical processes
- Have a trustworthy team managing the AI tool who are familiar to the FHT

Domain: Process
Barriers:
- Lack of knowledge among participants about how the AI tool works or what data are being used
- Differing opinions about the AI tool and how it should be used
- Too much additional staff time and resources required
- Overreliance on AI may erode clinical skills and decision making over time
Facilitators:
- AI-derived data presented consistently and as a simple summary
- AI-derived data easy to find and access (eg, in CPP; not a lot of clicks to get to it)
- AI should not slow down the EHR
- AI tool should require little to no learning curve
- Ensuring adequate practice staffing to manage increase in workload
- Regular engagement needed with FHT to explain how the AI tool works (eg, site visits, communication, embedding of presentations into regular clinic meetings)
- Circulate background information about AI tool (eg, its performance, validity) for reference

Domain: Technology
Barriers:
- AI tool requiring extra technology or hardware
- EHR data used for the AI seen as not reliable or timely
- Variation in the data (eg, type and amount of information entered) could cause bias
- Accuracy of AI tool in correctly classifying patients
Facilitators:
- Accurate representation of the data in the AI tool
- Display accuracy metrics (eg, percentages) (an illustrative sketch follows this table)
- More/better data collection within the patient record (to help improve accuracy, trust)
- Automatically updating AI output or EHR if circumstances change (eg, date-stamped)
AI = artificial intelligence; CPP = Cumulative Patient Profile; EHR = electronic health record; FHT = family health team.
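Several facilitators above (a consistent, simple summary; displayed accuracy metrics; date-stamped updates) describe how AI-derived data might be surfaced in the chart. The sketch below is one hypothetical rendering under those stated preferences, not the study’s implementation; the SdohFlag fields, the example accuracy figures, and the one-line CPP-style summary format are assumptions made for illustration.

```python
# Minimal sketch (assumptions, not the study's implementation): render
# AI-derived SDOH flags as one consistent, date-stamped summary line per
# determinant, with a reported accuracy figure shown alongside.

from dataclasses import dataclass
from datetime import date


@dataclass
class SdohFlag:
    determinant: str        # eg, "income insecurity"
    flagged: bool           # whether the model flags a potential need
    model_accuracy: float   # hypothetical validation accuracy for this determinant
    updated_on: date        # date stamp so stale output is visible
    verified_by_patient: bool = False  # set after the care team confirms with the patient


def cpp_summary_line(flag: SdohFlag) -> str:
    """Format a flag as a single line suitable for a summary section (eg, the CPP)."""
    status = "possible need" if flag.flagged else "no need identified"
    verified = "patient-verified" if flag.verified_by_patient else "not yet verified"
    return (f"{flag.determinant.title()}: {status} "
            f"(AI-derived, accuracy ~{flag.model_accuracy:.0%}, "
            f"updated {flag.updated_on}, {verified})")


if __name__ == "__main__":
    flags = [
        SdohFlag("income insecurity", True, 0.82, date(2023, 6, 1)),
        SdohFlag("housing insecurity", False, 0.78, date(2023, 6, 1), verified_by_patient=True),
    ]
    for f in flags:
        print(cpp_summary_line(f))
```

Keeping the output to one line per determinant is meant to echo the “not a lot of clicks” and “little to no learning curve” facilitators, and the patient-verification field reflects the workflow concerns described in Table 2.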
Additional Files
SUPPLEMENTAL APPENDIX IN PDF FILE BELOW
Supplemental Appendix. Workshop Guide
VISUAL ABSTRACT IN PNG FILE BELOW
Visual Abstract: GariesAIVisualAbstract.png (PNG file)
PLAIN-LANGUAGE SUMMARY OF ARTICLE
Original Research
Health Care Providers Weigh In on Their Experiences Developing an AI Tool to Understand Primary Care Patients’ Social Determinants of Health
Background and Goal: Social determinants of health are the conditions in which people are born, grow, live, work, and age; they include income, education, and access to health care. Knowledge of these factors is essential for primary care clinicians to deliver fair and complete care, plan programs, and distribute resources effectively, yet this information is rarely captured consistently in clinical settings. This study used a collaborative design strategy with primary care team members to identify how an artificial intelligence (AI) tool for social determinants of health could be designed.
Study Approach: Semi-structured, 50-minute workshops were conducted with the St. Michael’s Hospital Academic Family Health Team in Toronto, Ontario, Canada, from May to June 2023. Participants were asked for feedback on a proposed AI-based tool that derives patients’ social determinants of health from electronic health record (EHR) data.
Main Results:
- 15 members of the family health team participated across 4 workshops.
- Participants reported that patients’ social determinants of health information was often not available or was difficult to find in their EHR.
- Participants recommended starting with 1 or 2 social determinants (such as income and housing). They also emphasized the need for adequate resources, staff, and training materials.
- Many challenges were reported, including how to discuss the use of AI with patients and how to confirm the social needs identified by the AI tool.
Why It Matters: The integration of AI into health care is advancing rapidly, presenting both opportunities and challenges. This study provides insights from end users on the meaningful design and implementation of an AI-based tool for social data in primary care. Such a tool could potentially improve patient care and reduce clinician burnout.
Visual Abstract: GariesAIVisualAbstract.png (PNG file, above)