Published on 26.03.2024 in Vol 13 (2024)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/51974.
Designing mHealth Apps to Incorporate Evidence-Based Techniques for Prolonging User Engagement


Viewpoint

1Medable Inc, Palo Alto, CA, United States

2Weill Cornell Medical College, White Plains, NY, United States

3The Public Health Institute, Oakland, CA, United States

Corresponding Author:

Ingrid Oakley-Girvan, MPH, PhD

Medable Inc

525 University Ave

Palo Alto, CA, 94301

United States

Phone: 1 8778206259

Email: oakley@stanford.edu


Maintaining user engagement with mobile health (mHealth) apps can be a challenge. Previously, we developed a conceptual model to optimize patient engagement in mHealth apps by incorporating multiple evidence-based methods, including increasing health literacy, enhancing technical competence, and improving feelings about participation in clinical trials. This viewpoint aims to report on a series of exploratory mini-experiments demonstrating the feasibility of testing our previously published engagement conceptual model. We collected data from 6 participants using an app that showed a series of educational videos and obtained additional data via questionnaires to illustrate and pilot the approach. The videos addressed 3 elements shown to relate to engagement in health care app use: increasing health literacy, enhancing technical competence, and improving positive feelings about participation in clinical trials. We measured changes in participants’ knowledge and feelings, collected feedback on the videos and content, made revisions based on this feedback, and conducted participant reassessments. The findings support the feasibility of an iterative approach to creating and refining engagement enhancements in mHealth apps. Systematically identifying the key evidence-based elements to include in an app’s design and then testing the implementation of each element separately, until a satisfactory level of positive impact is achieved, is feasible and should be incorporated into standard app design. While mHealth apps have shown promise, participants are more likely to drop out than to be retained. This viewpoint highlights the potential for mHealth researchers to test and refine mHealth apps using approaches that better engage users.

Interact J Med Res 2024;13:e51974

doi:10.2196/51974



Smartphones have a global penetration estimated at 3.9 billion users [1], enabling mobile health (mHealth) apps to reach even low-resource areas and underserved populations [2,3]. mHealth apps have been developed to enable remote participation in clinical trials [4-7] and provide health education, health management, and other uses across the continuum from prevention through active treatment to palliative care [8]. Decentralized clinical trials using mHealth technologies promise faster participant accrual and a higher return on investment than traditional site-based trials [9]. mHealth apps have been shown to reduce inpatient readmission rates and decrease the length of hospital stay [10]. mHealth can increase knowledge and improve confidence and communication with health professionals [11]. However, while participants readily sign up for mHealth education and decentralized clinical trial apps, retention remains a major challenge [12-14]. For mHealth apps to succeed, users must consistently engage with them [15,16].

Engagement in mHealth apps has been conceptualized to include behavior, cognition, and affective components [17]. However, measures of patient engagement are underreported and lack consistency [18,19]. Participants are more likely to drop out than be retained despite app elements such as feedback, reminders, in-app support, gamification, and participant compensation [20-23].

We developed a conceptual model to optimize patient engagement based on different phases of the engagement process [24]. Because digital literacy and anxiety have been shown to be negatively correlated with engagement [25], we established an approach to develop and test the educational components of our conceptual model to enhance app engagement by increasing health literacy, enhancing technical competence, and improving feelings about clinical trials. This viewpoint aims to report on a series of exploratory mini-experiments, demonstrating the feasibility of testing our engagement conceptual model.


Testing Design

We used a product testing approach rather than the traditional research evaluation approach. We used a group of existing product testers who are patients or caregivers working for Medable to rapidly test different iterations of our educational videos. Questionnaires were used both before and after participants viewed the videos, and semistructured interviews were also conducted.

Data Collection

We developed apps to collect specific data from participants over 1 week through questionnaires available on their smartphones before and after exposure to videos, as shown in Table 1. The videos were based on a review of the literature defining and studying each of the 3 target areas: health literacy, technical competence, and feelings about participation in clinical trials [24]. Each concept area was tested separately, with questionnaires specific to the educational component.

Table 1. Schedule of tasks or questionnaires.
| Task or questionnaire | Frequency | Day 1 | Day 2 | Day 3 | Day 4 | Day 5 | Day 6 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Sign up | Once | Task 1 | N/A^a | N/A | N/A | N/A | N/A |
| Demographics (age, gender, education, housing, race, ethnicity, location, health condition, and notification) | Once | Task 2 | N/A | N/A | N/A | N/A | N/A |
| Technology competence questions (TAM3^b CANX,^c worries about pressing the wrong button, and device preference) | Twice | Task 3^d | Task 2^e | N/A | N/A | N/A | N/A |
| Technology competence video combined with practice questions (video on how to answer questions and practice questions) | Once | N/A | Task 1^d,e | N/A | N/A | N/A | N/A |
| Health literacy questions (BRIEF^f, own health knowledge, clinical trial knowledge, and BMI) | Twice | N/A | N/A | Task 1^d,e | Task 2^e | N/A | N/A |
| Health literacy video with knowledge check questions (St. Luke’s University Health Network’s video, “Wellness 101 – How to Improve Your Overall Health”; knowledge check questions) | Once | N/A | N/A | N/A | Task 1^d,e | N/A | N/A |
| Clinical trials question (temperature scale regarding participant’s feelings about study participation; Figure S1 in Multimedia Appendix 1) | Twice | N/A | N/A | N/A | N/A | Task 1^d,e | Task 2 |
| Clinical trials video (National Institute of Diabetes and Digestive and Kidney Diseases video: “Why Should I Join a Clinical Trial?”) | Once | N/A | N/A | N/A | N/A | N/A | Task 1^d,e |
| Study complete | Once | N/A | N/A | N/A | N/A | N/A | Task 3 |

aN/A: not applicable.

bTAM3: Technology Acceptance Model 3.

cCANX: Computer Anxiety.

dSet up with an automated morning reminder: “You have new tasks available today in the Patient Engagement app!”

eSet up with an automated 8 pm reminder: “You have uncompleted tasks in the Patient Engagement app. Please finish them before midnight!”

fBRIEF: Brief Health Literacy Screening Tool.
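
To make the reminder logic in footnotes d and e concrete, the sketch below shows one way the Table 1 schedule and its automated notifications could be encoded as app configuration. This is a minimal illustration in Python, not the study app’s actual implementation; the ScheduledTask class, the reminders_for_day helper, and the day assignments shown are our own hypothetical constructions, and only the message texts are quoted from the footnotes above.

```python
# Minimal sketch (not the study app's actual code) of the Table 1 schedule
# encoded as configuration. Class and function names are hypothetical;
# the reminder texts are quoted from footnotes d and e above.
from dataclasses import dataclass

@dataclass
class ScheduledTask:
    day: int                 # study day (1-6)
    name: str                # task or questionnaire from Table 1
    morning_reminder: bool   # footnote d: automated morning reminder
    evening_reminder: bool   # footnote e: automated 8 pm reminder

SCHEDULE = [
    ScheduledTask(1, "Sign up", False, False),
    ScheduledTask(1, "Demographics", False, False),
    ScheduledTask(1, "Technology competence questions", True, False),
    ScheduledTask(2, "Technology competence video with practice questions", True, True),
    # ...remaining Table 1 rows follow the same pattern
]

MORNING_MSG = "You have new tasks available today in the Patient Engagement app!"
EVENING_MSG = ("You have uncompleted tasks in the Patient Engagement app. "
               "Please finish them before midnight!")

def reminders_for_day(day: int) -> list[str]:
    """Collect the reminder messages to queue on a given study day."""
    messages = []
    for task in SCHEDULE:
        if task.day != day:
            continue
        if task.morning_reminder:
            messages.append(MORNING_MSG)
        if task.evening_reminder:
            messages.append(EVENING_MSG)
    return messages

print(reminders_for_day(2))  # both reminders fire for the day 2 video task
```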

In addition to collecting data through the questionnaires noted in Table 1, we conducted individual semistructured interviews with each participant at the end of the series of user evaluations using a video conferencing platform. Based on feedback, we revised specific videos and content and then collected a second round of feedback. In this second round, after all the questions were answered in each section, participants were asked for immediate feedback with the open-ended question, “What did you think of this video?”

Recruitment

We recruited 6 individuals from the Medable Patient Care Network (PCN) who participated in this product development effort from May 2022 through July 2022. The PCN is a group of patients and caregivers who provide insights and user feedback from their perspective for a variety of apps being developed as products at Medable.

Ethical Considerations

This work was conducted and approved under the Advarra IRB (Pro00062352). PCN members were paid an hourly rate of approximately US $150 by Medable for their work on behalf of the network in support of Medable product development efforts. Informed consent was obtained from all participants. All data were deidentified.


Health Literacy

The Brief Health Literacy Screening Tool [26] was selected to measure change before and after the health literacy intervention video. This questionnaire has 4 items that are rated on a 5-point Likert scale from “always” to “never”: (1) How often do you have someone help you read hospital materials? (2) How often do you have problems learning about your medical condition because of difficulty understanding written information? (3) How often do you have a problem understanding what is told to you about your medical condition? (4) How confident are you filling out medical forms by yourself? In addition to the Brief Health Literacy Screening Tool, 2 questions were asked using a 10-point scale (“How much do you know about your own health?” and “How much do you know about clinical trials?”) along with 1 true or false question: “Do you know your body mass index (BMI)?” (Multimedia Appendix 1).

Before showing a video, we included a module with a knowledge check portion to facilitate pre- and postvideo knowledge comparison. St. Luke’s University Health Network’s video, titled “Wellness 101 – How to Improve Your Overall Health” [27], was chosen for the content area of health literacy. The video provided 5 tips to improve an individual’s overall health. Five knowledge-related questions based on the video were asked. A BMI calculator was included as the final task in this section. Participants were instructed to “Try calculating your BMI on this website” using the CDC BMI calculator [28].
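
For reference, the CDC calculator linked above computes the standard BMI formula: weight in kilograms divided by the square of height in meters. A minimal sketch follows; the bmi function is our own illustrative helper, not part of the study app.

```python
# Standard BMI formula: weight (kg) / height (m) squared.
# Illustrative helper only; participants used the CDC web calculator [28].
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

print(round(bmi(70, 1.75), 1))  # 22.9, within the normal range (18.5-24.9)
```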

Technology Competence

We measured internet skills using a section of the Technology Acceptance Model 3 [29]. Statements presented to the participant included “The study website does not scare me at all,” “Working with the study website makes me nervous,” “The study website makes me feel uncomfortable,” and “The study website makes me feel uneasy.” We also asked, “While using the study website, I’m worried that I might press the wrong button and make a mistake that crashes the program” and “I am most comfortable using my (multi-select) iPad/Tablet, Smart Phone (iPhone or Android), Computer, Other.”

The video we used to increase technology competence was created in-house and showed participants examples of how to click on checkboxes or “radio button” response buttons and how to move the cursor to a particular spot to answer different types of questions, including multiple choice, multiple selection, and a sliding scale. Participants were then asked to practice answering the same types of questions on their own (see Multimedia Appendix 1 for the Technology Competence Questionnaire).

Clinical Trials

We also asked the question, “When it comes to your feelings about participating in this study, how do you rate your comfort?” using a scale ranging from “0, meaning no distress; totally relaxed” to “100, reflecting the highest anxiety/distress that you have ever felt” (Figure S1 in Multimedia Appendix 1). We showed the video from the National Institute of Diabetes and Digestive and Kidney Diseases, “Why Should I Join a Clinical Trial?” [30].

Semistructured Interviews

RM and AB conducted 45-minute interviews using Zoom (Zoom Video Communications) video conferencing with all 6 study participants using semistructured guides. Open-ended questions explored how participants felt about the process of using the app and how they felt about the questions that were asked through the app. A 5-point Likert scale was used to determine whether they agreed or disagreed with several statements focusing on how useful and informational they felt each of the videos and questionnaires were, for example, “I found the knowledge check questions to be useful” and “I felt less anxious about the idea of participating in a clinical trial after completing the knowledge check.”

Data Analysis

Given the small sample size and our product testing approach, we used simple descriptive statistics to give us insights into the differences between the “before” and “after” questionnaire results. The semistructured interviews were reviewed for commonalities.
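
As an illustration of the kind of descriptive before-and-after comparison described here, the sketch below computes means and SDs for hypothetical item scores. The numbers are invented for demonstration and are not study data.

```python
# Descriptive before/after comparison on hypothetical scores (not study data).
from statistics import mean, stdev

pre = [3, 4, 2, 5, 4, 3]    # illustrative item scores before the video
post = [4, 4, 3, 5, 4, 4]   # illustrative item scores after the video
diffs = [b - a for a, b in zip(pre, post)]

print(f"before: mean={mean(pre):.2f}, SD={stdev(pre):.2f}")
print(f"after:  mean={mean(post):.2f}, SD={stdev(post):.2f}")
print(f"change: mean={mean(diffs):.2f}, SD={stdev(diffs):.2f}")
# When the SD exceeds the mean change (as for several items in Table S1),
# the observed difference could plausibly be due to chance.
```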


User Demographics

Users testing the smartphone apps ranged in age from 54 to 69 years. Of the 6 participants, 3 identified as female and 3 as male. Five stated their race as White, 1 selected Black or African American, and none identified as Hispanic. Participants lived in the United States and Europe; 5 owned their homes and 1 rented. The majority (n=5) indicated they had 1 or more health conditions, and 5 had completed at least 4 years of college. All participants indicated they preferred to receive notifications before noon, 4 preferred SMS text message notifications, and 2 preferred email notifications.

Health Literacy

The mean score and SD for each survey item before and after viewing the instructional video are listed in Table S1 in Multimedia Appendix 1. Lower scores indicated a more positive response for 2 of the questions, specifically, “How confident are you filling out medical forms by yourself?” and “Do you know your Body Mass Index (BMI)?” The mean score for each survey item was slightly more positive after viewing the instructional video, except for the item “How often do you have a problem understanding what is told to you about your medical condition?” However, the SD was greater than the change in mean scores, indicating that the change could be due to chance. Questionnaire responses before and after watching the instructional video did not change for 3 of the 6 participants; of the remaining participants, 1 improved on 2 questions and declined on 1 question, 1 improved on 4 questions with no change on 3 questions, and 1 improved on knowledge of their overall health but declined on the clinical trial knowledge item.

Participants’ feedback from the semistructured interviews revealed negative feelings toward the video “Wellness 101 – How to Improve Your Overall Health.” One participant described the video as “juvenile,” while another noted concerns that some participants might object to the health video if they already smoke or have a high BMI. All participants agreed or strongly agreed with the statement, “I found the knowledge check questions to be useful.” In total, 4 of the 6 participants neither agreed nor disagreed with the statement, “I felt less anxious about the idea of participating in a clinical trial after completing the knowledge check.” Participants liked the alternative health literacy video, “5 Ways to Make the Most of Your Doctor Visit” [31].

Technology Competence

The mean scores for all items were more positive after viewing the video. The scores for 3 participants improved after watching the instructional video but declined for 2 participants. There was no change for 1 participant (Table S1 in Multimedia Appendix 1).

Participants (N=6) shared positive feedback about the video showing how to answer and practice questions. In response to the statement, “Watching somebody else demonstrate how to answer questions made me feel like I knew what was expected of me in the study,” 2 participants answered “agree” and 4 answered “strongly agree.” Additionally, 4 participants answered “strongly agree” to the statement, “Practicing answering questions on my own made me feel less anxious about participating in the study.” Some participants felt the questions in this section were redundant and thought all the questions could be combined into 1 question. In addition, participants thought the questions felt negative with the emphasis on the terms “anxious” and “nervous” and suggested changing the questions to make them seem more positive. One participant suggested making the technology anxiety section optional for those who feel more comfortable using the study website.

Clinical Trials

Mean scores were slightly higher prior to viewing the instructional video compared with after the video. One participant improved by 10 points, 1 decreased by 10 points, and the other 4 stayed the same.

Participants had positive things to say about the video “Why Should I Join a Clinical Trial?” [30]. In response to the statement, “I found the video to be useful,” 2 participants answered “agree” and 4 answered “strongly agree.”

Additional Overall User Feedback

The participants had several specific suggestions. One participant suggested making sure that the videos were clearly specific to the diseases or therapy areas in the trial and gave specific information on the trial structure. Two participants suggested including additional content on participant safety. We did an ad hoc assessment of this suggestion and sent the National Cancer Institute’s “Patient Safety in Clinical Trials” video for feedback [32]. The majority (n=5) of participants liked the video. The National Institute of Mental Health’s video, “What are the risks and benefits of participating in clinical research?” was also considered [33]. Most (n=4) participants liked the video and 2 did not. Some participants thought the video was juvenile and better suited for children or young adults. Others thought the video did not explain concepts such as placebos well. In response to this feedback, the following resources from the National Institute on Aging were added: “What are Clinical Trials and Studies?” [34] and “Clinical Research Benefits, Risks and Safety” [35], after which we sought a second round of feedback. Most participants (n=5) liked the additional resources.


Key Lessons

Maintaining continuous and complete use of mHealth apps has remained a persistent problem, one that has not yielded even to sophisticated solutions such as timed and individualized user messaging. A newer and evolving understanding of the foundational importance of user engagement with mHealth suggests that this problem stems from a lack of appreciation by mHealth app designers of the complex and multicomponent structures behind user engagement. We have built on prior knowledge and work to develop a model of engagement that accounts for this complexity [24].

This viewpoint was an exploratory study to determine the feasibility of this approach and to guide the refinement of this iterative test strategy. We learned several key lessons: (1) Specificity: the participants endorsed the recommendation that the interventions should be specific to the educational needs of the target of the mHealth app. The most positive feedback was given to the video we developed de novo to teach participants the technical competence required to correctly and effectively use the app to report their evaluations. (2) Attention to inadvertent adverse affective variables: the participants noted the importance of avoiding or rephrasing medical terms that could be seen as demeaning by some participants (eg, obesity, age, and infirmity). (3) Individualization: the participants clearly reflected different levels of need for improving their technical competence and health literacy. Our results indicate the potential importance of personalization in health app design, addressing individuals’ levels of need and cultural and personal sensitivities. For example, individualization could be increased by allowing users to opt out of certain learning features they feel they do not need.

Comparison With Prior Work

Other studies have assessed how to adjust apps to increase engagement. One study found positive effects on adherence from personalization or tailoring of the app content to users’ needs, push notification reminders, user-friendly design, and personal support along with digital intervention [36]. However, the high dropout rate in app usage remains a major challenge [37]. A recent literature review found that despite factors such as appropriate reminders and feedback, app participants were more likely to drop out than be retained [20]. App literacy skills have been identified as a major factor in the uptake and engagement of smartphone apps [38]. Although we identified studies recommending web-based interventions to increase health literacy and technical skills [39,40], we have not found other studies testing approaches to increase those skills.

Limitations and Strengths

Our sample size was limited to 6 participants, 5 of whom were highly educated. Because of the small sample size, we were unable to use statistical significance measures or draw conclusions that would apply to a larger population. Our highly educated sample is a limitation because these individuals may have better digital literacy than the general population. The main goal of our series of mini-experiments was to assess the feasibility of our approach and see whether we could retain the interest of the participants and obtain useful feedback on the interventions, which was successful. This type of testing is intended to evaluate tailored iterations of the app after gathering rapid participant feedback, with the ultimate goal of developing an app that will engage users. The next phase of our work will be to undertake the systematic testing of each component of this model in a larger and more diverse sample. We will then be able to refine the interventional enhancements for those components for a broader population. Ultimately, the functional use of this approach requires much larger, population-specific samples.

Conclusions

To fulfill the promise of mHealth apps to improve health outcomes, apps need to be improved so that they reduce participant attrition. Health care apps do not work for people who do not use them. To date, app feedback, notifications and reminders, in-app support, gamification, and participant compensation have not been consistently successful in eliminating participant dropout. This viewpoint highlights the potential for mHealth researchers to test and refine mHealth apps using evidence-based interventions derived from a broad range of behavioral and social science to increase engagement and thereby improve participant retention. The preliminary experience reported here supports the feasibility of this iterative approach to create and refine engagement interventional enhancements for each element of the multidimensional, multicomponent theory of the engagement process.

Acknowledgments

We thank members of the Medable Patient Care Network for their valuable input and critical insights. No artificial intelligence products were used to develop the questionnaires, interventions, or analysis, or to write any part of this manuscript. Research reported in this publication was partially supported by the National Cancer Institute of the National Institutes of Health under contract numbers HHSN261201700030C and HHSN261201800010C. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Data Availability

The data sets generated and analyzed during this study are not publicly available due to the data release requirements of the Patient Care Network and the intellectual property associated with this work but are available from the corresponding author on reasonable request.

Authors' Contributions

RM, AB, JPD, and IO-G conceived the mini-experiments designed to test the model developed by JPD and IO-G. RM and AB developed the app to collect data, coordinated with the Patient Care Network, and collected and analyzed the data. RM, JPD, IO-G, and SWD wrote the first draft of the manuscript. All authors reviewed and edited the manuscript and approved the final version of the manuscript.

Conflicts of Interest

RM, JPD, SWD, and AB were employed by Medable at the time the data collection and manuscript writing were done. ML and IO-G are currently employed at Medable. Medable is a clinical trial software-as-a-service platform and evidence-generation company and supported the authors’ conduct of this work without interference.

Multimedia Appendix 1

Study questionnaires and responses.

DOCX File, 158 KB

  1. Newzoo global mobile market report 2021. Newzoo. 2021. URL: https://newzoo.com/insights/trend-reports/newzoo-global-mobile-market-report-2021-free-version [accessed 2023-07-18]
  2. Purnell JQ, Griffith J, Eddens KS, Kreuter MW. Mobile technology, cancer prevention, and health status among diverse, low-income adults. Am J Health Promot. 2014;28(6):397-402. [FREE Full text] [CrossRef] [Medline]
  3. Anderson-Lewis C, Darville G, Mercado RE, Howell S, Di Maggio S. mHealth technology use and implications in historically underserved and minority populations in the United States: systematic literature review. JMIR Mhealth Uhealth. 2018;6(6):e128. [FREE Full text] [CrossRef] [Medline]
  4. Dorsey ER, Kluger B, Lipset CH. The new normal in clinical trials: decentralized studies. Ann Neurol. 2020;88(5):863-866. [CrossRef] [Medline]
  5. Sarraju A, Seninger C, Parameswaran V, Petlura C, Bazouzi T, Josan K, et al. Pandemic-proof recruitment and engagement in a fully decentralized trial in atrial fibrillation patients (DeTAP). NPJ Digit Med. 2022;5(1):80. [FREE Full text] [CrossRef] [Medline]
  6. Sessa C, Cortes J, Conte P, Cardoso F, Choueiri T, Dummer R, et al. The impact of COVID-19 on cancer care and oncology clinical research: an experts' perspective. ESMO Open. 2022;7(1):100339. [FREE Full text] [CrossRef] [Medline]
  7. Patel S, Goldsack JC, Cordovano G, Downing A, Fields KK, Geoghegan C, et al. Advancing digital health innovation in oncology: priorities for high-value digital transformation in cancer care. J Med Internet Res. 2023;25:e43404. [FREE Full text] [CrossRef] [Medline]
  8. Davis SW, Oakley-Girvan I. mHealth education applications along the cancer continuum. J Cancer Educ. 2015;30(2):388-394. [CrossRef] [Medline]
  9. DiMasi JA, Smith Z, Oakley-Girvan I, Mackinnon A, Costello M, Tenaerts P, et al. Assessing the financial value of decentralized clinical trials. Ther Innov Regul Sci. 2023;57(2):209-219. [FREE Full text] [CrossRef] [Medline]
  10. Bruce CR, Harrison P, Nisar T, Giammattei C, Tan NM, Bliven C, et al. Assessing the impact of patient-facing mobile health technology on patient outcomes: retrospective observational cohort study. JMIR Mhealth Uhealth. 2020;8(6):e19333. [FREE Full text] [CrossRef] [Medline]
  11. Tuvesson H, Eriksén S, Fagerström C. mHealth and engagement concerning persons with chronic somatic health conditions: integrative literature review. JMIR Mhealth Uhealth. 2020;8(7):e14315. [FREE Full text] [CrossRef] [Medline]
  12. Pratap A, Neto EC, Snyder P, Stepnowsky C, Elhadad N, Grant D, et al. Indicators of retention in remote digital health studies: a cross-study evaluation of 100,000 participants. NPJ Digit Med. 2020;3:21. [FREE Full text] [CrossRef] [Medline]
  13. Druce KL, Dixon WG, McBeth J. Maximizing engagement in mobile health studies: lessons learned and future directions. Rheum Dis Clin North Am. 2019;45(2):159-172. [FREE Full text] [CrossRef] [Medline]
  14. Kaveladze BT, Wasil AR, Bunyi JB, Ramirez V, Schueller SM. User experience, engagement, and popularity in mental health apps: secondary analysis of app analytics and expert app reviews. JMIR Hum Factors. 2022;9(1):e30766. [FREE Full text] [CrossRef] [Medline]
  15. Serrano KJ, Coa KI, Yu M, Wolff-Hughes DL, Atienza AA. Characterizing user engagement with health app data: a data mining approach. Transl Behav Med. 2017;7(2):277-285. [FREE Full text] [CrossRef] [Medline]
  16. Trifan A, Oliveira M, Oliveira JL. Passive sensing of health outcomes through smartphones: systematic review of current solutions and possible limitations. JMIR Mhealth Uhealth. 2019;7(8):e12649. [FREE Full text] [CrossRef] [Medline]
  17. Kelders SM, van Zyl LE, Ludden GDS. The concept and components of engagement in different domains applied to eHealth: a systematic scoping review. Front Psychol. 2020;11:926. [FREE Full text] [CrossRef] [Medline]
  18. Madujibeya I, Lennie T, Aroh A, Chung ML, Moser D. Measures of engagement with mHealth interventions in patients with heart failure: scoping review. JMIR Mhealth Uhealth. 2022;10(8):e35657. [FREE Full text] [CrossRef] [Medline]
  19. Pham Q, Graham G, Carrion C, Morita PP, Seto E, Stinson JN, et al. A library of analytic indicators to evaluate effective engagement with consumer mHealth apps for chronic conditions: scoping review. JMIR Mhealth Uhealth. 2019;7(1):e11941. [FREE Full text] [CrossRef] [Medline]
  20. Amagai S, Pila S, Kaat AJ, Nowinski CJ, Gershon RC. Challenges in participant engagement and retention using mobile health apps: literature review. J Med Internet Res. 2022;24(4):e35120. [FREE Full text] [CrossRef] [Medline]
  21. Mustafa AS, Ali N, Dhillon JS, Alkawsi G, Baashar Y. User engagement and abandonment of mHealth: a cross-sectional survey. Healthcare (Basel). 2022;10(2):221. [FREE Full text] [CrossRef] [Medline]
  22. Oakley-Girvan I, Lavista JM, Miller Y, Davis S, Acle C, Hancock J, et al. Evaluation of a mobile device survey system for behavioral risk factors (SHAPE): app development and usability study. JMIR Form Res. 2019;3(1):e10246. [FREE Full text] [CrossRef] [Medline]
  23. Oakley-Girvan I, Yunis R, Longmire M, Ouillon JS. What works best to engage participants in mobile app interventions and e-health: a scoping review. Telemed J E Health. 2022;28(6):768-780. [FREE Full text] [CrossRef] [Medline]
  24. Oakley-Girvan I, Docherty JP. A new approach to enhancing engagement in eHealth apps. Interact J Med Res. 2022;11(2):e38886. [FREE Full text] [CrossRef] [Medline]
  25. Lepore SJ, Rincon MA, Buzaglo JS, Golant M, Lieberman MA, Bass SB, et al. Digital literacy linked to engagement and psychological benefits among breast cancer survivors in internet-based peer support groups. Eur J Cancer Care (Engl). 2019;28(4):e13134. [FREE Full text] [CrossRef] [Medline]
  26. Haun J, Noland-Dodd V, Varnes J, Graham-Pole J, Rienzo B, Donaldson P. Testing the BRIEF health literacy screening tool. Fed Pract. 2009;26(12):24-31. [FREE Full text]
  27. Wellness 101 – how to improve your overall health. St. Luke's University Health Network YouTube page. 2019. URL: https://www.youtube.com/watch?v=_nuDp-fded8 [accessed 2024-03-01]
  28. Adult BMI calculator. Centers for Disease Control and Prevention. URL: https://www.cdc.gov/healthyweight/assessing/bmi/adult_bmi/english_bmi_calculator/bmi_calculator.html [accessed 2023-07-18]
  29. Venkatesh V, Bala H. Technology acceptance model 3 and a research agenda on interventions. Decis Sci. 2008;39(2):273-315. [FREE Full text] [CrossRef]
  30. Rodgers GP. Why should I join a clinical trial? National Institute of Diabetes and Digestive and Kidney Disease YouTube page. 2019. URL: https://www.youtube.com/watch?v=36Sd8WpgR94 [accessed 2024-03-01]
  31. 5 ways to make the most of your doctor visit. National Institute on Aging YouTube page. URL: https://www.youtube.com/watch?v=BincCVl-YsI [accessed 2023-07-18]
  32. Are clinical trials safe? National Cancer Institute. URL: https://www.cancer.gov/about-cancer/treatment/clinical-trials/patient-safety [accessed 2023-07-19]
  33. What are the risks and benefits of participating in clinical research? National Institute of Mental Health. URL: https://www.nimh.nih.gov/news/media/2021/what-are-the-risks-and-benefits-of-participating-in-clinical-research [accessed 2023-07-19]
  34. What are clinical trials and studies? National Institute on Aging. URL: https://www.nia.nih.gov/health/what-are-clinical-trials-and-studies [accessed 2023-07-19]
  35. Clinical research: benefits, risks, and safety. National Institute on Aging. URL: https://www.nia.nih.gov/health/clinical-research-benefits-risks-and-safety [accessed 2023-07-19]
  36. Jakob R, Harperink S, Rudolf AM, Fleisch E, Haug S, Mair JL, et al. Factors influencing adherence to mHealth apps for prevention or management of noncommunicable diseases: systematic review. J Med Internet Res. 2022;24(5):e35371. [FREE Full text] [CrossRef] [Medline]
  37. Sim I. Mobile devices and health. N Engl J Med. 2019;381(10):956-968. [CrossRef] [Medline]
  38. Szinay D, Jones A, Chadborn T, Brown J, Naughton F. Influences on the uptake of and engagement with health and well-being smartphone apps: systematic review. J Med Internet Res. 2020;22(5):e17572. [FREE Full text] [CrossRef] [Medline]
  39. Moon Z, Zuchowski M, Moss-Morris R, Hunter MS, Norton S, Hughes LD. Disparities in access to mobile devices and e-health literacy among breast cancer survivors. Support Care Cancer. 2022;30(1):117-126. [FREE Full text] [CrossRef] [Medline]
  40. O'Connor S, Hanlon P, O'Donnell CA, Garcia S, Glanville J, Mair FS. Understanding factors affecting patient and public engagement and recruitment to digital health interventions: a systematic review of qualitative studies. BMC Med Inform Decis Mak. 2016;16(1):120. [FREE Full text] [CrossRef] [Medline]


Abbreviations

mHealth: mobile health
PCN: Patient Care Network


Edited by G Eysenbach, T de Azevedo Cardoso; submitted 18.08.23; peer-reviewed by S Amagai, B Oh, C Latkin; comments to author 17.10.23; revised version received 14.11.23; accepted 27.02.24; published 26.03.24.

Copyright

©Rebecca Monachelli, Sharon Watkins Davis, Allison Barnard, Michelle Longmire, John P Docherty, Ingrid Oakley-Girvan. Originally published in the Interactive Journal of Medical Research (https://www.i-jmr.org/), 26.03.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Interactive Journal of Medical Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.i-jmr.org/, as well as this copyright and license information must be included.