iScience. Sepideh Bazazi, Jurgis Karpus, Taha Yasseri. Published December 2025.

One understudied anthropomorphic feature of AI agents, yet one perfectly familiar to anyone who has used a voice-guided GPS navigator or a smart home assistant, is gender. There is evidence that an AI’s assigned gender can influence people’s behavioral dispositions, such as their willingness to donate money, and that the gender stereotypes affecting human-human interactions extend to human interactions with “gendered” voice-based computers. In general, people have been found to perceive female bots as more human-like than male bots. Even when a design contains no explicit gender cues, users often assign human-like attributes, including gender, to AI systems such as ChatGPT. ChatGPT is typically reported to be perceived as male by default; however, this perception can be reversed when the chatbot’s “feminine” abilities (e.g., providing emotional support) are emphasized.

Participants exploited female-labeled AI agents and distrusted male-labeled AI agents more than they did human counterparts bearing the same gender labels, reflecting gender biases similar to those observed in human-human interactions. These findings underscore the importance of accounting for gender bias in the design, policy, and regulation of AI.