Specchio riflesso! Un’esperienza etnografica con TikTok

Journal: SOCIOLOGIA DELLA COMUNICAZIONE
Author: Elisabetta Zurovac
Year of publication: 2025 — Issue: 2024/68
Language: Italian — Pages: 19 (pp. 83-101) — File size: 183 KB
DOI: 10.3280/SC2024-068007





Drawing on a critical perspective within algorithm studies, this study analyzes the possible algorithmic harms caused by TikTok’s For You Page (FYP). Through a twenty-month ethnographic study, this paper delves into the complex relationship between the researcher and the algorithm, resulting in an autoethnographic reporting of data. In doing so, the study proposes four main phases of the FYP’s interaction with users: generalization, deviation, synchronization, and intuition. This research underscores the dual nature of TikTok’s algorithmic mediation: it can provide a personalized, engaging experience, but it also presents risks related to bias, social harm, and loss of agency.

Keywords: TikTok; algorithm studies; algorithmic ethnography; algorithmic harms; algorithmic practices; digital media.

  1. Abidin C. (2021), From “networked publics” to “refracted publics”: A companion framework for researching “below the radar” studies, in «Social Media + Society» 7. DOI: 10.1177/2056305120984458
  2. Atay A. (2020), What is cyber or digital autoethnography?, in «International Review of Qualitative Research», 13(3), pp. 267-279. DOI: 10.1177/1940844720934373
  3. Avella H. (2024), “TikTok ≠ therapy”: Mediating mental health and algorithmic mood disorders, in «New Media & Society», 26(10), pp. 6040-6058. DOI: 10.1177/14614448221147284
  4. Benjamin R. (2019), Race After Technology, Polity Press, Cambridge.
  5. Bhandari A., Bimo S. (2022), Why’s Everyone on TikTok Now? The Algorithmized Self and the Future of Self-Making on Social Media, in «Social Media + Society», 8(1), pp. 1-11. DOI: 10.1177/20563051221086241
  6. Boccia Artieri G. (2020), Fare Sociologia attraverso l’algoritmo: potere, cultura e agency, in «Sociologia italiana», (15), pp. 137-148. DOI: 10.1485/2281-2652-202015-7
  7. Boccia Artieri G., Bartoletti R. (2023), Algoritmi e vita quotidiana: un approccio socio-comunicativo critico, in «Sociologia della comunicazione», 66(2), pp. 5-20.
  8. Bonini T., Treré E. (2024), Algorithms of Resistance: The Everyday Fight Against Platform Power, MIT Press, Cambridge (MA).
  9. Bucher T. (2018), If... then: Algorithmic power and politics, Oxford University Press, Oxford.
  10. Bucher T. (2020), The right-time web: Theorizing the kairologic of algorithmic media, in «New Media & Society», 22(9), pp. 1699-1714. DOI: 10.1177/1461444820913560
  11. Cinnamon J. (2017), Social injustice in surveillance capitalism, in «Surveillance & Society», 15(5), pp. 609–625.
  12. Cotter K., DeCook J.R., Kanthawala S., Foyle K. (2022), In FYP we trust: the divine force of algorithmic conspirituality, in «International Journal of Communication», 16, pp. 1-23.
  13. Douglas M. (1975), Purezza e pericolo. Un’analisi dei concetti di contaminazione e tabù, Il Mulino, Bologna.
  14. Eriksson-Krutrök M. (2021), Algorithmic closeness in mourning: Vernaculars of the hashtag #grief on TikTok, in «Social Media + Society», 7(3), pp. 1-12. DOI: 10.1177/20563051211042396
  15. Parisi S., Firth E. (2023), “Il magico mondo dell'algoritmo”: immaginario, percezione e interazione degli utenti di TikTok con l'algoritmo di piattaforma, in «Sociologia della comunicazione», 66(2), pp. 60-76. DOI: 10.3280/SC2023-066001
  16. Fisher E. (2020), Can algorithmic knowledge about the self be critical?, in «The digital age and its discontents: critical reflections in education», pp. 111-122.
  17. Francisco M.E.Z., Ruhela S. (2021), Investigating TikTok as an AI user platform, in «2021 2nd International Conference on Computation, Automation and Knowledge Management», pp. 293-298. DOI: 10.1109/ICCAKM50778.2021.9357752
  18. Fuchs C. (2008), Internet and Society: Social Theory in the Information Age, Routledge, London.
  19. Cardano M., Gariglio L. (2022), Metodi qualitativi. Pratiche di ricerca in presenza, a distanza e ibride, Carocci, Roma.
  20. Gebru T. (2019), Race and Gender, in Dubber M.D., Pasquale F., Das S. (eds.), Oxford Handbook on AI Ethics, Oxford University Press, Oxford, pp. 1-27.
  21. Gillespie T. (2016), #trendingistrending: When algorithms become culture, in Seyfert R., Roberge J. (eds.), Algorithmic cultures: Essays on meaning, performance and new technologies, Routledge, London, pp. 64-87.
  22. Grandinetti J., Bruinsma J. (2023), The affective algorithms of conspiracy TikTok, in «Journal of Broadcasting & Electronic Media», 67(3), pp.274-293. DOI: 10.1080/08838151.2022.2140806
  23. Harriger J.A., Evans J.A., Thompson J.K., Tylka T.L. (2022), The dangers of the rabbit hole: Reflections on social media as a portal into a distorted world of edited bodies and eating disorder risk and the role of algorithms, in «Body Image», 41, pp. 292-297.
  24. Herrick S.S., Hallward L., Duncan L.R. (2021), “This is just how I cope”: An inductive thematic analysis of eating disorder recovery content created and shared on TikTok using #EDrecovery, in «International Journal of Eating Disorders», 54(4), pp. 516-526.
  25. Hine C. (2015), Ethnography for the internet: Embedded, embodied and everyday, Routledge, London.
  26. Hofmann K., Schuth A., Bellogin A., De Rijke M. (2014), Effects of Position Bias on Click-Based Recommender Evaluation, in «European Conference on Information Retrieval», Springer, pp. 624–630. DOI: 10.1007/978-3-319-06028-6_67
  27. Kang H., Lou C. (2022), AI agency vs. human agency: understanding human–AI interactions on TikTok and their implications for user engagement, in «Journal of Computer-Mediated Communication», 27(5).
  28. Lee A.Y., Mieczkowski H., Ellison N.B., Hancock J.T. (2022), The algorithmic crystal: Conceptualizing the self through algorithmic personalization on TikTok, in «Proceedings of the ACM on Human-computer Interaction», 6, pp.1-22. DOI: 10.1145/3555601
  29. Malik H.M., Lepinkäinen N., Alvesalo-Kuusi A., Viljanen M. (2022), Social harms in an algorithmic context, in «Justice, Power and Resistance», 5(3), pp. 193–207. DOI: 10.1332/OYUA8095
  30. Marjanovic O., Cecez-Kecmanovic D., Vidgen R. (2021), Algorithmic pollution: Making the invisible visible, in «Journal of Information Technology», 36(4), pp. 391-408. DOI: 10.1177/02683962211010356
  31. Noble S.U. (2018), Algorithms of Oppression: How Search Engines Reinforce Racism, New York University Press, New York.
  32. Nunes M. (2024), Algorithmic Agency, Automated Content, and User Engagement on TikTok, in Fors V., Berg M., Brodersen M. (eds.), The De Gruyter Handbook of Automated Futures: Imaginaries, Interactions and Impact, De Gruyter, Berlin, pp. 175-190. DOI: 10.1515/9783110792256-011
  33. O’Neil C. (2016), Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, Penguin Random House, New York.
  34. Prawesh S., Padmanabhan B. (2014), The “Most Popular News” Recommender: Count Amplification and Manipulation Resistance, in «Information Systems Research», 25(3), pp. 569–589.
  35. Rettberg J.W. (2017), Hand signs for lip-syncing: The emergence of a gestural language on musical.ly as a video-based equivalent to emoji, in «Social Media + Society», 3(4), pp. 1-11. DOI: 10.1177/2056305117735751
  36. Safransky S. (2020), Geographies of algorithmic violence: redlining the smart city, in «International Journal of Urban and Regional Research» 44(2), pp. 200–218. DOI: 10.1111/1468-2427.12833.
  37. Sarine L.E. (2012), Regulating the social pollution of systematic discrimination caused by implicit bias, in «California Law Review», 100(5), pp. 1359–1399, http://www.jstor.org/stable/23408739.
  38. Schellewald A. (2022), Theorizing “stories about algorithms” as a mechanism in the formation and maintenance of algorithmic imaginaries, in «Social Media + Society», 8(1). DOI: 10.1177/20563051221077025
  39. Seaver N. (2017), Algorithms as culture: Some tactics for the ethnography of algorithmic systems, in «Big Data & Society», 4(2). DOI: 10.1177/2053951717738104
  40. Shelby R., Rismani S., Henne K., Moon A., Rostamzadeh N., Nicholas P., Yilla-Akbari N.M., Gallegos J., Smart A., Garcia E., Virk G. (2023), Sociotechnical harms of algorithmic systems: Scoping a taxonomy for harm reduction, in «Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society», pp. 723-741. DOI: 10.1145/3600211.3604673
  41. Siles I., Espinoza-Rojas J., Naranjo A., Tristán M. F. (2019), The mutual domestication of users and algorithmic recommendations on Netflix, in «Communication, Culture & Critique», 12(4), pp. 499-518.
  42. Siles I., Valerio-Alfaro L., Meléndez-Moran A. (2024), Learning to like TikTok… and not: Algorithm awareness as process, in «New Media & Society», 26(10), pp. 5702-5718. DOI: 10.1177/1461444822113897
  43. Suarez-Villa L. (2009), Technocapitalism: A Critical Perspective on Technological Innovation and Corporatism, Temple University Press, Philadelphia.
  44. Swart J. (2021), Experiencing Algorithms: How Young People Understand, Feel About, and Engage With Algorithmic News Selection on Social Media, in «Social Media + Society», 7(2). DOI: 10.1177/20563051211008828
  45. The Guardian (2019, Dec. 3), TikTok owns up to censoring some users' videos to stop bullying, available at: https://www.theguardian.com/technology/2019/dec/03/TikTok-owns-up-to-censoring-some-users-videos-to-stop-bullying
  46. Tufekci Z. (2015), Algorithmic harms beyond Facebook and Google: emergent challenges of computational agency, in «Colorado Technology Law Journal», 13, pp. 203-216.
  47. Weimann G., Masri N. (2020), Research note: Spreading hate on TikTok, in «Studies in Conflict and Terrorism», pp. 1-14.
  48. Zuboff S. (2019), The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, Public Affairs, New York.
  49. Zurovac E., Boccia Artieri G., Donato V. (2023), Tales of Visibility in TikTok: The Algorithmic Imaginary and Digital Skills in Young Users, in Radovanovic D. (ed.), Digital Literacy and Inclusion: Stories, Platforms, Communities, Springer International Publishing, pp. 113-126. DOI: 10.1007/978-3-031-30808-6_8

Elisabetta Zurovac, Specchio riflesso! Un’esperienza etnografica con TikTok, in «SOCIOLOGIA DELLA COMUNICAZIONE» 68/2024, pp. 83-101, DOI: 10.3280/SC2024-068007