Miles Brundage: "Does Object Recognition Work for Everyone?," DeVries, Misra, and Wang et al.: https://arxiv.org/abs/1906.02659
"The results of our study suggest that further work is needed to make object-recognition systems work equally well for people across different countries and income levels." https://t.co/bMHdYMvzWO
7 replies, 224 likes
mike cook: AI is dominated by white, male, western, middle-class perceptions of the world, driven both by the imbalance in the makeup of research communities and by pre-existing biases in datasets and in test/experiment participant demographics. This is a good, clear example of what effect that has.
5 replies, 183 likes
Deb Raji: Newer papers:
"Does Object Recognition Work for Everyone?" https://arxiv.org/abs/1906.02659
"Predictive Inequity in Object Detection"
"Gender Shades" http://gendershades.org/ (+"Actionable Auditing" https://dl.acm.org/doi/10.1145/3306618.3314244, "Saving Face" https://arxiv.org/abs/2001.00964)
2 replies, 17 likes
Nasrin Mostafazadeh: It’s been known that image recognition models are extremely biased towards recognizing the limited categories of things they’ve been trained on. Now, this new work uncovers that even within the same category of household items, performance drops for countries with low household income!
1 reply, 13 likes
David Grangier: @WWRob @stanfordnlp @david__jurgens @jurafsky Does Object Recognition Work for Everyone?
Conference Computer Vision and Pattern Recognition (CVPR)
By: Terrance DeVries, Ishan Misra, Changhan Wang, Laurens van der Maaten
1 reply, 5 likes
Sebastian Flores: Another type of bias. This is why diversity in all aspects is key.
0 replies, 1 like
Deb Raji: Change the language of the queries and suddenly the world becomes so much bigger. Here's an example of what happened when researchers sourced a dataset after translating words into Hindi. (ref: https://arxiv.org/abs/1906.02659) https://t.co/k4d6C1NL34
1 reply, 0 likes
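The query-translation idea above can be sketched in a few lines. This is a hedged illustration, not code from the paper: the `EN_TO_HI` table and the idea of feeding both English and Hindi strings to an image-search crawler are stand-ins chosen for this example.

```python
# Sketch of translation-based query expansion for dataset sourcing.
# EN_TO_HI is a hypothetical hand-made translation table; a real pipeline
# would use a translation service and an image-search API instead.

EN_TO_HI = {
    "soap": "साबुन",
    "stove": "चूल्हा",
    "spices": "मसाले",
}

def build_queries(terms, translations):
    """Return each query term plus its translation (when one exists),
    so an image crawl covers both language communities."""
    queries = []
    for term in terms:
        queries.append(term)
        if term in translations:
            queries.append(translations[term])
    return queries

print(build_queries(["soap", "stove"], EN_TO_HI))
# ['soap', 'साबुन', 'stove', 'चूल्हा']
```

Searching with the translated strings surfaces images indexed by pages written in that language, which is why the resulting dataset looks so different from an English-only crawl.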
Found on Jun 07 2019 at https://arxiv.org/pdf/1906.02659.pdf