Research Article | Methodology

Adapting Psychosocial Intervention Research to Urban Primary Care Environments: A Case Example

Luis H. Zayas, PhD; M. Diane McKee, MD, MS; Katherine R. B. Jankowski, MA
The Annals of Family Medicine September 2004, 2 (5) 504-508; DOI: https://doi.org/10.1370/afm.108

Abstract

PURPOSE We wanted to describe the unique issues encountered by our research team in testing an intervention to reduce perinatal depression in real-world community health centers.

METHOD We present a case study of our experience in conducting a randomized controlled trial designed to test the effectiveness of a low-cost multimodal psychosocial intervention to reduce prenatal and postpartum depression. Low-income minority women (N = 187) with low-risk pregnancies were randomly assigned to the intervention or treatment as usual. Outcomes of interest were depressive symptoms and social support assessed at 3 months postpartum.

RESULTS Our intervention was not associated with changes in depressive symptoms or social support. Challenges in implementation were related to participant retention and intervention delivery. Turnover of student therapists affected continuity in participant-therapist relationships and created missed opportunities to deliver the intervention. The academic-community partnership that was formed also required more involvement of health center personnel to facilitate ownership at the site level, especially for fidelity monitoring. Although we were attentive to cultural sensitivity, the project called for more collaboration with participants to define common goals and outcomes. Participatory research strategies could have anticipated barriers to uptake of the intervention and achieved a better match between the outcomes desired by researchers and those of participants.

CONCLUSION Several criteria for future research planning emerged: assessing what the population is willing and able to accept, considering what treatment providers can be expected to implement, assessing the setting’s capacity to accommodate intervention research, and collecting and using emerging unanticipated data.

  • Psychosocial treatment
  • Intervention research
  • Primary health care
  • Perinatal depression
  • Mental health

INTRODUCTION

During the past 3 decades, primary care practice-based research has made notable progress in establishing its feasibility and potential to address questions important to primary care and the communities we serve.1–5 Practice-based research can answer questions of greater relevance to primary care practice, test the effectiveness of treatments in undifferentiated patient populations,3 and engage clinicians in the generation of new knowledge that can be readily assimilated into practice.5 Despite the gains made, researchers note that challenges persist in the successful implementation of studies in primary care settings.6–8 Moving psychosocial intervention research into urban community-based primary care settings that are not research oriented remains highly challenging. Disappointing results of well-designed intervention research may be due to the practical difficulties of implementing studies in unpredictable community-based health centers rather than to a lack of intervention efficacy.9–11 Barriers, expected and unexpected, to implementing intervention research in community-based settings will remain1,2,12 and will likely be exacerbated in the unique setting of safety net sites.13

In this article, we describe logistical hurdles encountered in testing an intervention to reduce depression in pregnant, low-income urban minority women in community health centers. We describe the varied elements that affected our research: the persons who receive interventions, the persons who provide interventions, and the persons who collectively create service delivery systems—contexts subject to the effects of unanticipated events. We offer several criteria for adapting intervention research to real-world circumstances.

METHODS

Participants were 187 African American and Hispanic women with low-risk pregnancies (mean age = 25 years; mean education = 12 years). Women were screened for depressive symptoms in the third trimester, and those with elevated depressive symptoms were randomly assigned to an intervention group (n = 57) or a group receiving treatment as usual (n = 43) offered by their health center. A third group of women who were not depressed (n = 87) was used for comparison. Data were collected again at 2 weeks and 3 months postpartum.14,15 Depression was measured with the Beck Depression Inventory, second edition (BDI-II), on which a score of 14 is considered the lower end of depressive symptoms.16 Total functional social support (ie, actual received support) was measured with the Norbeck Social Support Questionnaire, an instrument for use with pregnant women.17 Life stressors were measured with the Life Events Questionnaire,18 and functional abilities were measured with the Medical Outcomes Study Short Form-36 (SF-36).19
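The screening-and-allocation flow above can be sketched as follows. This is an illustrative reconstruction, not the study's actual procedure: the `assign_groups` helper, the simple 1:1 coin-flip allocation, and the data layout are assumptions; only the BDI-II cutoff of 14 is taken from the text.

```python
import random

def assign_groups(women, seed=0):
    """Illustrative sketch: screen third-trimester women on BDI-II score,
    then randomize screen-positive women to intervention vs. treatment as
    usual; women below the cutoff form the nondepressed comparison group."""
    rng = random.Random(seed)
    groups = {
        "intervention": [],
        "treatment_as_usual": [],
        "nondepressed_comparison": [],
    }
    for woman_id, bdi_score in women:
        if bdi_score >= 14:  # BDI-II cutoff reported in the Methods
            arm = rng.choice(["intervention", "treatment_as_usual"])
            groups[arm].append(woman_id)
        else:
            groups["nondepressed_comparison"].append(woman_id)
    return groups
```

In the study itself the randomized arms were unequal (n = 57 vs n = 43), so the actual allocation scheme may not have been a simple 1:1 coin flip; the sketch only shows the overall flow.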

The intervention consisted of 3 components: an 8-session cognitive behavioral treatment specifically developed for depression prevention in primary care,20 4 psychoeducation sessions using videotaped and written materials on infant development and maternal sensitivity, and ongoing social support building from the therapist. Therapists were graduate students in social work (all women of an ethnic minority completing clinical internships) who were trained and supervised in both the intervention and cultural sensitivity. Intervention sessions were offered in the health center or at home at least twice a month.

The settings were 3 health centers in the South Bronx, NY, all of which had prenatal teams of physicians, nurses, and social workers. The 2 larger, federally funded community health centers had formal psychosocial service teams. The smallest site was a family practice setting with a part-time social worker. The sites had little experience with research and none with intervention studies.

RESULTS

We describe the challenges and lessons learned in implementing the study and measuring its effects. Overall, 38% of the sample was lost to attrition: 42% of the intervention group and 44% of the treatment-as-usual group did not complete the study, compared with 32% of the women who were not depressed (Table 1). No significant differences were apparent between the 2 treatment groups in changes in total functional social support or reduction of depression.
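The reported percentages can be reconciled arithmetically. A minimal sketch, assuming the per-group dropout counts are recovered by rounding (the paper reports only percentages, not counts):

```python
# Group sizes (from Methods) and attrition rates (from Results)
groups = {
    "intervention": (57, 0.42),
    "treatment_as_usual": (43, 0.44),
    "nondepressed": (87, 0.32),
}

# Dropout counts are not reported; inferring them by rounding is an assumption.
dropped = {name: round(n * rate) for name, (n, rate) in groups.items()}

total_n = sum(n for n, _ in groups.values())   # 187 enrolled
total_dropped = sum(dropped.values())          # inferred total dropouts
overall_attrition = total_dropped / total_n    # matches the reported 38%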

Table 1.

Summary of Attrition and Depression Scores

Methodological and logistical challenges arose that affected outcomes, some from the realities of the research contexts and some through errors of commission and omission.

Community Health Center Structures

The reputations of the health centers aided patient recruitment, but conducting an intervention in 3 health centers required negotiating with different systems and staff attitudes toward research. Administrators feared that service productivity and the bottom line would be compromised rather than enhanced by an intervention study. Occasional chaos broke out in health centers from operational changes wrought by regulatory and funding authorities.21 Collaborating with different sets of clinicians facing extraordinary demands from patients and administration also affected delivery of the intervention. Other interruptions occurred when equipment or space was unavailable, requiring sudden adjustments in plans for sessions and many missed treatment opportunities.

Therapists and Intervention Fidelity

Constraints of the academic calendar on student therapists affected the continuity of therapeutic relationships, leading us to conclude that engaging full-time on-site psychosocial providers might have resulted in both more sessions delivered and less attrition. At the same time, our ability to monitor intervention fidelity was limited by space constraints, a lack of secure storage for recording equipment, and logistical barriers to videotaping or audio recording sessions, which further hampered implementation. Although we succeeded in an important goal of minimizing the intrusiveness of our study to the practices, greater involvement of center personnel in problem solving, especially the presence of an on-site champion, might have encouraged a greater sense of ownership in the project and commitment to facilitating fidelity monitoring.6–8

Patient Adherence to the Intervention

Although participants were well informed of what the study entailed, through their behavior and comments it was apparent that they determined—generally by default—what parts of the planned intervention they would or could accept. Often, these choices were dictated by life circumstances (eg, caring for a newborn, a sick child, financial problems, competing appointments) rather than simple refusals. We found less acceptance of home visits than expected; reasons cited included the social activities of health centers, concerns for the safety of our therapists, and occasionally the opposition of partners. Adherence might have been enhanced with a more systematic collection and analysis of qualitative information so we could adapt our protocol to the women’s real-life circumstances. Strategies from participatory research could have anticipated these barriers by collaboration with members of our target community throughout study development and implementation.21,22

Relevance of Outcomes

We encountered differences in what researchers consider relevant patient outcomes and what patients think are relevant outcomes for themselves.10 Women described themselves as stressed, not depressed; thus, those elements of the intervention aimed toward reducing depressive symptoms may have seemed irrelevant to many women. We learned that getting social support or psychoeducational information, having a compassionate listener, or learning about available childcare was more important to them than reducing depression or engaging in pleasurable activities. Participatory research strategies that included patients’ and other stakeholders’ voices in all stages of the project, even if it involved modifying the goals initially set for the study, would have created a better match in the preferred outcomes of patients and researchers.21,22

Retention of Participants

As Table 1 shows, retaining participants for a 4- to 6-month period was a major challenge, an issue especially common in studies of poor urban minority populations.23,24 The turnover in student therapists affected some participants’ willingness to be transferred to new therapists. Motivation to continue may have waned for some clients as a result of a mismatch between the intervention components and women’s perceived needs.

Uptake of the Intervention

Providing full dosages of a varied intervention proved to be more than the women’s lives allowed, even with therapists’ best efforts and the most motivated participants. The intervention group received an average of 3 of 8 planned cognitive behavioral treatment sessions (range 0 to 7), 1 of 4 planned child development sessions (range 0 to 3), and 2 of an indefinite number of social-support-building sessions (range 0 to 14), with substantial variability among individual women. Women who reported more positive life events, more social support, and better perceived general health availed themselves of more treatment than those with less social support, lower perceived health, and fewer positive life events, suggesting that delivering interventions may be most problematic for those with the greatest need.

CONCLUSION

We concur with others that the adaptations and findings of field trials may actually be stronger than those of laboratory studies because of the more demanding, fluid research environments in which they are conducted.25 Thus, practice-based research has greater relevance and applicability to the primary care patients we serve.2,3 We affirm the reality that intervention research is difficult to implement faithfully in urban primary care settings, especially those without a tradition or infrastructure for research. Our project represented an academic-community partnership, but we had insufficient involvement of the community practices to facilitate ownership at the site level or optimal problem solving, especially beyond its initial months. While we were attentive to cultural sensitivity, we lacked the true collaboration with participants necessary to define common goals and outcomes.6,22

Our key message is that psychosocial intervention research in community-based primary care will require flexibility in adjusting the research design before and during implementation.10 The concerns of multiple stakeholders, nonresearch settings, and real-life issues in patients’ lives can be reconciled by research designs that result in cycles of outcome assessments and procedure refinements.26

Observing several criteria may help adapt research design to stakeholder and service system characteristics:

1. Assess what the target population can and will accept of an intervention. Using participatory research approaches, patients and knowledgeable staff can answer such questions as, What are the key issues in patients’ lives that are essential to target with the intervention? What kinds of interventions and in what dosages will patients accept? What kinds of outcomes do patients want and in what domains? Which outcomes will be most compelling to patients? Answers to these questions are contingent on understanding service and community contexts early on in a project. Operationally, this understanding may be accomplished by partnering with community representatives and conducting key stakeholder focus groups and interviews.

2. Aim toward interventions that clinicians in comparable settings can deliver, given the typical demands of practice in community settings. This criterion poses such questions as, What will clinicians in community settings like this one need to know? What intervention, and how much of it, can clinicians reasonably be expected to deliver? What will they embrace and what will they reject? Clinicians’ acceptance of empirically based interventions in these circumstances is more likely when combinations of treatments with small doses are available and training requirements are not excessive. Despite the demands of the primary care setting, training on-site personnel to deliver the intervention might raise their ownership of the study and help identify potential problems and realistic solutions.

3. Assess the service setting’s capacity to accommodate intervention research. How does the intervention researcher integrate externally and internally induced changes in the service system into the research protocol? Countless predictable and unpredictable changes in participating practices can disrupt implementation and data collection. In every dimension of research, adaptations might be required that deviate from the original plans; these adaptations can be creatively pursued while maintaining the fundamental research design. This criterion underscores the need for ongoing discussions with providers and administrators in preparing for impending changes, particularly in nonresearch environments. Looking ahead at looming policy or fiscal realities that will bring changes to service systems, and asking how the centers managed similar upheavals in the past, can prepare the research project staff to reduce interruptions and adapt in advance.

4. Collect emerging data systematically, even when they are unanticipated or not easily quantifiable, because they may be important later. Typically, this information is not originally considered part of the data to be collected, yet the events that occur can have powerful effects on research implementation. The question then becomes, How can emerging, even anecdotal, information be collected and reviewed systematically to make appropriate design adaptations while maintaining scientific integrity? We recommend encouraging regular input of anecdotes and hunches from all members of the team, recording and examining these systematically for indications of trends and troubles. Our experience shows that unexpected findings or implementation problems can lead to creative and useful research adaptations.

Footnotes

  • Conflicts of interest: none reported

  • Funding support: Financial support was provided, in part, by National Institute of Mental Health grants R24 MH57936 and R24 MH60002.

  • Received for publication March 3, 2003.
  • Revision received October 12, 2003.
  • Accepted for publication October 18, 2003.
  • © 2004 Annals of Family Medicine, Inc.

REFERENCES

  1. Nutting PE, Beasley JW, Werner JJ. Practice-based research networks answer primary care questions. JAMA. 1999;281:686–688.
  2. Stange K. Primary care research: barriers and opportunities. J Fam Pract. 1996;42:192–198.
  3. Green LA, Hames CG, Nutting P. Potential of practice-based research networks: experiences from ASPN. J Fam Pract. 1994;38:400–406.
  4. Yawn B. Conference Proceedings. Methods for Practice-Based Research Networks: Challenges and Opportunities. Leawood, Kan: American Academy of Family Physicians; 2001:13–14.
  5. Stange K, Miller W, McWhinney I. Developing the knowledge base of family practice. Fam Med. 2001;33:286–297.
  6. Christoffel KK, Binns HJ, Stockman JA, et al, and the Pediatric Practice Research Group. Practice-based research: opportunities and obstacles. Pediatrics. 1988;82:399–406.
  7. Beasley JW. Success and failure in practice-based research: WReN. In: Practice-Based Research Networks in the 21st Century, Conference Proceedings. Leawood, Kan: American Academy of Family Physicians; 1998:65–71.
  8. Hickner J. The best and worst studies of the Upper Peninsula Research Network. In: Practice-Based Research Networks in the 21st Century, Conference Proceedings. Leawood, Kan: American Academy of Family Physicians; 1998:72–75.
  9. Farmer EMZ, Burns BJ, Guiles HB, Behar L, Gerber D. Conducting randomized clinical trials in children’s mental health: experiences and lessons from one venture. In: Nixon C, Northrup D, eds. Evaluating Mental Health Services: How Do Programs for Children “Work” in the Real World? Thousand Oaks, Calif: Sage Publications; 1997.
  10. Hohman AA, Shear MK. Community-based intervention research: coping with the “noise” of real life in study design. Am J Psychiatry. 2002;159:201–207.
  11. Rogers S, Humphrey C, Nazareth I, Lister S, Tomlin Z, Haines A. Designing trials of interventions to change professional practice in primary care: lessons from an exploratory study of two change strategies. BMJ. 2000;320:1580–1583.
  12. Klinkman MS, Okkes I. Mental health problems in primary care: a research agenda. J Fam Pract. 1998;47:379–384.
  13. Dietrich AJ, Tobin JN, Sox CH, et al. Cancer early-detection services in community health centers for the underserved: a randomized controlled trial. Arch Fam Med. 1998;7:320–327.
  14. McKee MD, Cunningham M, Jankowski KRB, Zayas LH. Health-related functional status in pregnancy: relationship to depression and social support in a multi-ethnic population. Obstet Gynecol. 2001;97:988–993.
  15. Zayas LH, Cunningham M, McKee MD, Jankowski KRB. Depression and negative life events among pregnant African-American and Hispanic women. Womens Health Issues. 2002;12:16–22.
  16. Beck AT, Steer RA, Brown GK. BDI-II: Beck Depression Inventory Manual. 2nd ed. San Antonio, Tex: Psychological Corporation; 1996.
  17. Norbeck JS, Lindsey AM, Carrieri VL. Further development of the Norbeck Social Support Questionnaire: normative data and validity testing. Nurs Res. 1983;32:4.
  18. Norbeck JS, Anderson NJ. Life stress, social support, and anxiety in mid- and late-pregnancy among low income women. Res Nurs Health. 1989;12:281–289.
  19. Ware JE. Short Form-36 Health Survey: Manual and Interpretation Guide. Boston, Mass: The Health Institute, New England Medical Center; 1993.
  20. Miranda J, Muñoz RF. Intervention for minor depression in primary care patients. Psychosom Med. 1995;56:136–142.
  21. Green LW, Mercer SL. Can public health researchers and agencies reconcile the push from funding bodies and the pull from communities? Am J Public Health. 2001;91:1926–1929.
  22. Macaulay AC, Commanda LE, Freeman WL, et al. Participatory research maximizes community and lay involvement. BMJ. 1999;319:774–778.
  23. Miranda J, Azocar F, Organista KC, Muñoz RF, Lieberman A. Recruiting and retaining low-income Latinos in psychotherapy research. J Consult Clin Psychol. 1996;64:868–874.
  24. Thompson EE, Neighbors HW, Munday C, Jackson JS. Recruitment and retention of African American patients for clinical research: an exploration of response rates in an urban psychiatric hospital. J Consult Clin Psychol. 1996;64:861–867.
  25. Hoagwood K, Hibbs T, Brent D, Jensen P. Introduction to the special section: efficacy and effectiveness in studies of child and adolescent psychotherapy. J Consult Clin Psychol. 1995;63:683–687.
  26. Weisz JR, Donenberg GR, Han SS, Weiss B. Bridging the gap between laboratory and clinic in child and adolescent psychotherapy. J Consult Clin Psychol. 1995;63:688–701.