“Should I trust or Should I not?” A Survey to measure AI Confidence Levels Among Students of the bachelor’s degree in Psychiatric Rehabilitation Techniques and Psychiatry Residents. Introduction

Journal: RIVISTA SPERIMENTALE DI FRENIATRIA
Authors/Editors: Edited by the Editorial Staff
Year of publication: 2024; Issue: 2024/3
Language: English; Pages: 21 (pp. 47-67); File size: 829 KB
DOI: 10.3280/RSF2024-003004

The use of Artificial Intelligence (AI) in mental health contexts has expanded significantly in recent years, offering innovative solutions in various clinical settings. This study investigates levels of trust in and familiarity with AI among psychiatric rehabilitation students (PRTS) and psychiatry residents (PR) at the University of Modena and Reggio Emilia. Methods: An online questionnaire developed by the University of Twente was administered via the REDCap platform to collect data from 78 participants: 53 PRTS and 25 PRs. Results: The findings revealed that 80% of PRTS reported familiarity with AI, compared with less than 50% of PRs. However, only 42.6% of PRTS and 22.5% of PRs felt familiar with AI chatbots. Trust in AI-driven recommendations was generally neutral in both groups, with 40.8% of respondents expressing neither agreement nor disagreement. Additionally, over 68% of PRTS and 70.8% of PRs expressed confidence in their ability to generate high-quality ideas. Conclusion: More structured AI education within medical training is needed to bridge the gap between familiarity and trust. AI, with its continuous advancements, offers substantial potential in healthcare, bringing with it both opportunities and risks.
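The percentages above are group-wise and pooled proportions of questionnaire responses. As a minimal sketch (not the authors' analysis code), the snippet below shows how Likert-style answers collected for the two groups could be tabulated into figures of this kind; the group sizes follow the study (53 PRTS, 25 PR), but the individual responses and the question wording are invented for illustration only.

```python
# Hypothetical tabulation of Likert-style survey responses by group.
# Group sizes match the study (53 PRTS, 25 PR); answers are simulated.
from collections import Counter
import random

random.seed(0)
LIKERT = ["disagree", "neutral", "agree"]

# Invented responses to a statement such as "I trust AI-driven recommendations"
responses = {
    "PRTS": [random.choice(LIKERT) for _ in range(53)],
    "PR": [random.choice(LIKERT) for _ in range(25)],
}

# Per-group percentages, as in "80% of PRTS reported familiarity with AI"
for group, answers in responses.items():
    counts = Counter(answers)
    n = len(answers)
    summary = ", ".join(f"{level}: {100 * counts[level] / n:.1f}%" for level in LIKERT)
    print(f"{group} (n={n}) -> {summary}")

# Pooled proportion across both groups, the kind of figure behind
# "40.8% of respondents expressed neither agreement nor disagreement"
pooled = Counter(a for answers in responses.values() for a in answers)
total = sum(len(a) for a in responses.values())
print(f"Pooled neutral: {100 * pooled['neutral'] / total:.1f}%")
```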

Keywords: AI confidence, Students, University, AI Chatbot

Edited by the Editorial Staff, “Should I trust or Should I not?” A Survey to measure AI Confidence Levels Among Students of the bachelor’s degree in Psychiatric Rehabilitation Techniques and Psychiatry Residents. Introduction, in "RIVISTA SPERIMENTALE DI FRENIATRIA" 3/2024, pp. 47-67, DOI: 10.3280/RSF2024-003004