The Reproduction of Gender Inequality in Artificial Intelligence Applications: Hidden Biases in Coding and Data Sets


Üçler N., Şahin Z. B.

4th International Communication in the Digital Age Symposium, Gazimağusa, Cyprus (TRNC), 28–30 November 2024, pp. 0-1

  • Publication Type: Conference Paper / Abstract
  • City of Publication: Gazimağusa
  • Country of Publication: Cyprus (TRNC)
  • Page Numbers: pp. 0-1
  • Istanbul Gelisim University Affiliated: Yes

Abstract

Gender equality, identified by the United Nations as a fundamental human right, is set out as the fifth goal of the Sustainable Development Goals. In Turkey, women, who constitute nearly half of the population, do not enjoy equal rights with men in political, economic, and social domains. Gender equality can be achieved through the equal representation of women and men across all fields. As technology advances, artificial intelligence programs have been used for a variety of purposes, including coding, visualization, data analysis, natural language processing, security, and monitoring. These applications generate output based on predefined algorithms and data sets. Feminist technology theory, meanwhile, is an approach that examines how technologies reflect and shape gender relations. This study employs feminist technology theory to explore how AI systems represent gender and how these representations mirror societal biases. Its objective is to identify how the algorithms and data sets used in AI applications reproduce existing societal biases. In this context, prominent AI programs known for their image-generation capabilities, such as ChatGPT, DALL-E, Midjourney, Runway ML, and Artbreeder, are analyzed to determine how stereotypes about male and female roles are reproduced and reinforced. Questions concerning gender roles, the definition of gender, and factors influencing gender equality in society were posed to these applications, which were also asked to generate figures based on gender roles in order to visualize these representations. The data obtained were analyzed using the descriptive analysis method. The findings reveal that AI applications define gender roles differently: women are commonly associated with descriptors such as "gentle," "emotional," and "caring," while men are more often linked with "individual success," "power," and "status." The study concludes with recommendations for identifying and addressing gender biases in AI.
Keywords: Gender, Feminist Technology Theory, AI Applications, Algorithms, Data Sets, Descriptive Analysis.