
Racist, sexist, casteist: Is AI bad news for India?
Urvashi was quoted in a Context Newsroom piece on the impact of AI on marginalised groups in India.
"The irony is that people who are not counted in these datasets are still subject to these data-driven systems which reproduce bias and discrimination," said Urvashi Aneja, founding director of Digital Futures Lab, a research collective. Having mostly high-caste men design AI tools can unduly benefit the privileged and altogether bypass women, lower-caste and other marginalised groups, said Aneja.
"How much agency do women or lower-caste groups have to check or contradict what's coming out of a system? Especially generative AI, which is designed to seem human-like," she said. A technical fix cannot take existing bias out of the system; what's needed is a better understanding of the biases and their impacts in different social contexts, Aneja said.
"We should shed the assumption that bias is going to go away - instead, we should accept that bias is always going to be there, and design and build systems accordingly."
These are excerpts from the piece. Read the complete report here.