Daily Arxiv

This is a page that curates AI-related papers published worldwide.
All content here is summarized using Google Gemini and operated on a non-profit basis.
Copyright for each paper belongs to the authors and their institutions; please make sure to credit the source when sharing.

How Do Generative Models Draw a Software Engineer? A Case Study on Stable Diffusion Bias

Created by
  • Haebom

Author

Tosin Fadahunsi, Giordano d'Aloisio, Antinisca Di Marco, Federica Sarro

Outline

This paper analyzes gender and racial bias in images of software engineers generated by three versions of the Stable Diffusion (SD) model (SD 2, SD XL, and SD 3). Motivated by the gender and racial imbalances in the software engineering field, the authors generated 6,720 images with each model using two prompts: one containing the keyword "software engineer" and one without it. The analysis reveals that all models tend to portray software engineers as male, with SD 2 and SD XL showing a slightly higher representation of White individuals and SD 3 showing a slightly higher representation of Asian individuals. All models underrepresent Black and Arab individuals. These results raise serious concerns about bias when using generative models in the software engineering field and point to the need for further research on mitigating it.
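The representation analysis described above can be sketched as a simple post-hoc tally. The snippet below is a hypothetical illustration (the label data and function name are not from the paper), assuming each generated image has already been annotated with a perceived demographic label:

```python
from collections import Counter

def representation(labels):
    """Return each label's share among the annotated images."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {label: count / total for label, count in counts.items()}

# Hypothetical annotations for a handful of generated images
# (illustrative only; the paper used 6,720 images per model).
gender_labels = ["male", "male", "male", "female", "male"]
print(representation(gender_labels))  # {'male': 0.8, 'female': 0.2}
```

The same tally, applied separately per model and per prompt, would surface the skews reported in the paper, such as the underrepresentation of Black and Arab individuals.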

Takeaways, Limitations

Takeaways:
  • Empirically demonstrates gender and racial bias in generative models within the software engineering domain.
  • Highlights that generative models can amplify existing social biases.
  • Raises the need for bias mitigation strategies when using generative models in software engineering.
  • Calls for further research on bias in generative models.
Limitations:
  • The analysis covers only one family of generative models (Stable Diffusion), limiting generalizability.
  • No concrete bias mitigation techniques are proposed.
  • The variety of prompts used in the analysis is limited.
  • Race and gender classification is subjective, introducing potential for labeling error.