
Faster Gaze Prediction With Dense Networks and Fisher Pruning

Comments

👻🎃 Status Quo 🎃👻: @socrates1024 @bascule No, Twitter decides the cropping via the actual image data, not the metadata, to try to pick out the most salient items in the photo, which they describe in detail here: https://arxiv.org/pdf/1801.05787.pdf In practice, it just prefers white over black faces 🙃

6 replies, 812 likes
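
To make "pick out the most salient items in the photo" concrete, here is a minimal, hypothetical sketch (not Twitter's code) of one common way to turn a predicted saliency map into a crop: slide a window of the target size over the map and keep the position with the largest total saliency. The function and parameter names are made up for illustration.

```python
import numpy as np

def best_crop(saliency_map: np.ndarray, crop_h: int, crop_w: int, stride: int = 8):
    """Return the (top, left) corner of the crop window with the largest summed saliency.

    `saliency_map` is a 2-D array of per-pixel saliency scores (higher = more
    likely to draw the eye), e.g. the output of a gaze-prediction model.
    """
    H, W = saliency_map.shape
    # An integral image lets us sum any rectangle in O(1).
    integral = np.pad(saliency_map, ((1, 0), (1, 0))).cumsum(0).cumsum(1)

    best_score, best_pos = -np.inf, (0, 0)
    for top in range(0, H - crop_h + 1, stride):
        for left in range(0, W - crop_w + 1, stride):
            score = (integral[top + crop_h, left + crop_w]
                     - integral[top + crop_h, left]
                     - integral[top, left + crop_w]
                     + integral[top, left])
            if score > best_score:
                best_score, best_pos = score, (top, left)
    return best_pos
```

The window search is the easy part; the interesting ingredient is the saliency map itself, which is what the gaze-prediction model in the linked paper produces.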


˗ˏˋ Insane Clown Posse Comitatus ˎˊ˗: Twitter has a face-detection algorithm to try to center faces that are looking directly at the camera to create the preview image. Problem: it always favors white faces when automatically deciding which part of the larger photo to feature. https://arxiv.org/pdf/1801.05787.pdf

0 replies, 135 likes


Bianca Kastl: In case you want to reverse engineer further: The paper with the machine learning stuff and some example saliency maps https://arxiv.org/pdf/1801.05787.pdf https://t.co/yNtAmjFcYq

1 reply, 67 likes


Kyle McDonald: twitter has a paper on their cropping algorithm here https://arxiv.org/pdf/1801.05787.pdf and google has a saliency model on github https://github.com/PAIR-code/saliency but it's hard to imagine a better intervention to come out of this than "get everyone posting variations to interrogate the system"

2 replies, 21 likes


Elias Probst: @robertorourke @NeilCastle It uses a neural net trained on saliency prediction data; basically, it crops to the section that people are expected to look at first. See also: https://blog.twitter.com/engineering/en_us/topics/infrastructure/2018/Smart-Auto-Cropping-of-Images.html https://arxiv.org/abs/1801.05787

1 reply, 13 likes


Robert Zubek: In short: it's using gaze prediction. It starts with data about how the human eye wanders when looking at images. With this data, one can train ML systems to predict which parts of photos are most likely to be looked at. There's a great paper here: https://arxiv.org/pdf/1801.05787.pdf 4/ https://t.co/G5SW5i57Od

1 reply, 6 likes
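
As a companion to the description above of training on eye-tracking data, here is a minimal, hypothetical PyTorch sketch of the general recipe: a toy fully-convolutional model predicts a per-pixel fixation distribution and is fit to ground-truth fixation heatmaps with a KL-divergence loss. This only illustrates the idea; it is not the dense network or the Fisher pruning from the paper, and the model, loss, and tensors are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySaliencyNet(nn.Module):
    """Toy fully-convolutional model: RGB image in, single-channel saliency map out."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 1),  # per-pixel saliency logit
        )

    def forward(self, x):
        logits = self.features(x)
        # Normalise each map into a probability distribution over pixels,
        # so it can be compared with a ground-truth fixation density map.
        b, _, h, w = logits.shape
        return F.softmax(logits.view(b, -1), dim=1).view(b, 1, h, w)

def kl_loss(pred, target, eps=1e-8):
    """KL divergence between ground-truth and predicted fixation distributions."""
    target = target / (target.sum(dim=(2, 3), keepdim=True) + eps)
    return (target * ((target + eps).log() - (pred + eps).log())).sum(dim=(2, 3)).mean()

# One hypothetical training step on a batch of images and fixation heatmaps.
model = TinySaliencyNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
images = torch.rand(4, 3, 96, 128)      # stand-in for real photos
fixations = torch.rand(4, 1, 96, 128)   # stand-in for eye-tracking heatmaps

loss = kl_loss(model(images), fixations)
opt.zero_grad()
loss.backward()
opt.step()
```

In practice a pretrained backbone and real fixation datasets (such as the CAT2000 data mentioned further down this thread) replace the toy pieces here.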


carter: @R3DAC73D @bascule @jack @Twitter @TwitterSupport @TwitterAPI I think they have... and the explanation isn't great... https://twitter.com/KardOnIce/status/1307442695925837830

1 reply, 5 likes


Vicki Boykis: And here is the actual paper https://arxiv.org/pdf/1801.05787.pdf

1 reply, 5 likes


Igor Brigadir: @alexhanna Did they change it from the saliency one? I thought they didn't rely on faces and instead detected areas of higher-contrast patterns - this was a while ago though: https://twitter.com/IgorBrigadir/status/1252631290169434113

1 reply, 3 likes


Igor Brigadir: @generativist For further context, and a link dump of more relevant things: Twitter's image cropping uses image salience. Paper here: https://arxiv.org/abs/1801.05787 https://blog.twitter.com/engineering/en_us/topics/infrastructure/2018/Smart-Auto-Cropping-of-Images.html

1 reply, 3 likes


Silvia 🏴‍☠️ 🔻[hiro]: Apparently, from their paper (https://arxiv.org/pdf/1801.05787.pdf), the training set was CAT2000. Please check it out; I think it is 90% white people (https://saliency.tuebingen.ai/datasets.html).

0 replies, 2 likes


Future Sprog: @BexGraham @Te_Taipo @JanelleCShane It eventually traces its lineage back to a neural network trained on ImageNet https://arxiv.org/abs/1409.1556v6

0 replies, 1 like


Prithviraj Ammanabrolu: @geomblog This paper on gaze prediction by Twitter was mentioned: https://arxiv.org/abs/1801.05787; I can't find the original tweet anymore. Unsure whether it's what's actually being used in production, but gaze prediction has historically been used for image cropping

0 replies, 1 like


Content

Found on Sep 19 2020 at https://arxiv.org/pdf/1801.05787.pdf

PDF content of a computer science paper: Faster Gaze Prediction With Dense Networks and Fisher Pruning