Does Object Recognition Work for Everyone?


Jun 07 2019 Miles Brundage

"Does Object Recognition Work for Everyone?," DeVries, Misra, and Wang et al.: "The results of our study suggest that further work is needed to make object-recognition systems work equally well for people across different countries and income levels."
7 replies, 224 likes

Jun 07 2019 mike cook

AI is dominated by white, male, western, middle-class perceptions of the world, driven both by the imbalance in the makeup of research communities, and pre-existing biases in datasets and in test/experiment participant demos. This is a good, clear example of what effect that has.
5 replies, 183 likes

Jun 08 2019 Nasrin Mostafazadeh

It’s been known that image recognition models are extremely biased toward recognizing the limited categories of things they’ve been trained on. Now, this new work uncovers that even within the same category of household items, performance drops for countries with low household income!
1 reply, 13 likes

Jan 23 2020 David Grangier

@WWRob @stanfordnlp @david__jurgens @jurafsky Does Object Recognition Work for Everyone? Conference on Computer Vision and Pattern Recognition (CVPR). By: Terrance DeVries, Ishan Misra, Changhan Wang, Laurens van der Maaten
1 reply, 5 likes

Jun 07 2019 Sebastian Flores

Another type of bias. This is why diversity in all aspects is key.
0 replies, 1 likes

Sep 09 2019 Deb Raji

Change the language of the queries and suddenly the world becomes so much bigger. Here's an example of what happened when researchers sourced a dataset after translating words into Hindi.
1 reply, 0 likes