“Popular digital assistants that reply in a woman’s voice and are styled as female helpers are reinforcing sexist stereotypes, according to a United Nations report released on Wednesday,” Sonia Elks reports for Reuters. “The vast majority of assistants such as Apple’s Siri, Amazon Alexa and Microsoft’s Cortana are designed to be seen as feminine, from their names to their voices and personalities, said the study.” “They are programmed to be submissive and servile – including politely responding to insults – meaning they reinforce gender bias and normalize sexist harassment, said researchers from the U.N. scientific and cultural body UNESCO,” Elks reports. “The study highlighted that Siri was previously programmed to respond to users calling her a ‘bitch’ by saying ‘I’d blush if I could’ as an example of the issue. ‘Siri’s submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products,’ it said.” “‘The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them,’ said Saniye Gulser Corat, UNESCO’s director for gender equality,” Elks reports.