
Connections between Support Vector Machines, Wasserstein distance and gradient-penalty GANs

Comments

Oct 16 2019 Alexia Jolicoeur-Martineau

My new paper is out! We show a framework in which we can derive both #SVMs and gradient-penalized #GANs! We also show how to make better gradient penalties! https://ajolicoeur.wordpress.com/MaximumMarginGAN https://arxiv.org/abs/1910.06922
9 replies, 677 likes


Oct 16 2019 hardmaru 😷

Connections between Support Vector Machines, Wasserstein distance and gradient-penalty GANs New work by @jm_alexia and @bouzoukipunks 🔥🔥 https://arxiv.org/abs/1910.06922 https://t.co/Pc3DmcRTrt
1 replies, 383 likes


Oct 19 2019 Alexia Jolicoeur-Martineau

Proof that SVMs are just a trivial case of GANs. 🤠
4 replies, 294 likes


Oct 27 2019 Alexia Jolicoeur-Martineau

Do you miss the old days when SVMs were at the top of the food chain? It turns out that gradient-penalized classifiers are a generalization of Soft-SVMs. Read my paper to find out how to make your #NeuralNetworks act like SVMs. #ML #AI
5 replies, 198 likes


Nov 04 2019 Alexia Jolicoeur-Martineau

Can you train a #SupportVectorMachine (SVM) when your classifier is a #NeuralNetwork? 🤔 Yes: use a hinge-loss classifier with an L2-norm gradient penalty, see: https://arxiv.org/abs/1910.06922 #MachineLearning #AI #ArtificialIntelligence #Math
2 replies, 151 likes
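The recipe in the tweet above (hinge loss plus an L2-norm gradient penalty) can be sketched in the linear case, where it reduces exactly to a soft-SVM: the input gradient of a linear classifier f(x) = w·x + b is w everywhere, so penalizing the L2 norm of the gradient is just the familiar ||w||² term. A minimal NumPy sketch; the toy data, learning rate, and lambda are illustrative choices, not values from the paper:

```python
import numpy as np

# Toy linearly separable data; labels in {-1, +1}.
X = np.array([[2.0, 2.0], [3.0, 1.0], [-2.0, -1.0], [-3.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

def svm_like_loss(w, b, lam=0.1):
    """Hinge loss + L2 gradient penalty for a linear classifier f(x) = w.x + b.
    For a linear model, grad_x f(x) = w at every point, so penalizing the
    squared L2 norm of the input gradient recovers the soft-SVM objective."""
    margins = y * (X @ w + b)
    hinge = np.maximum(0.0, 1.0 - margins).mean()
    grad_penalty = lam * np.dot(w, w)   # ||grad_x f||^2 = ||w||^2
    return hinge + grad_penalty

# Plain (sub)gradient descent on (w, b).
w, b = np.zeros(2), 0.0
lr, lam = 0.1, 0.1
for _ in range(500):
    margins = y * (X @ w + b)
    active = (margins < 1.0).astype(float)       # examples inside the margin
    grad_w = -(active * y) @ X / len(y) + 2 * lam * w
    grad_b = -(active * y).mean()
    w -= lr * grad_w
    b -= lr * grad_b

print(np.sign(X @ w + b))  # prints [ 1.  1. -1. -1.]
```

For a neural network the same objective applies, but the input gradient is no longer constant, so it has to be computed per example (e.g. with autograd) and penalized at sampled points.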


Nov 13 2019 Alexia Jolicoeur-Martineau

Frequent users of gradient penalty (WGAN-GP, StyleGAN, etc.), make sure to try out the new Linfinity hinge gradient penalty from https://arxiv.org/abs/1910.06922 for better results. See https://github.com/AlexiaJM/MaximumMarginGANs for how to quickly and easily implement it in #PyTorch.
0 replies, 140 likes
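To make the "Linfinity hinge gradient penalty" concrete, here is a rough NumPy sketch on a linear critic, whose input gradient is constant. The one-sided ("hinged") form only penalizes gradients whose L-infinity norm exceeds 1; the squared hinge and the coefficient lam=10 are assumptions borrowed from WGAN-GP conventions, not the authors' exact implementation (see the linked repo for that):

```python
import numpy as np

rng = np.random.default_rng(0)

def linf_hinge_penalty(grads, lam=10.0):
    """One-sided gradient penalty on the L-infinity norm: penalize only
    where ||grad||_inf exceeds 1. Squaring the hinge is one common choice."""
    linf = np.abs(grads).max(axis=1)                 # per-sample L-inf norm
    return lam * np.mean(np.maximum(0.0, linf - 1.0) ** 2)

# For a linear critic f(x) = w.x, grad_x f = w at every point,
# so the penalty depends only on w.
w = np.array([0.5, 1.5, -0.3])
real = rng.normal(size=(8, 3))
fake = rng.normal(size=(8, 3))
eps = rng.uniform(size=(8, 1))
x_hat = eps * real + (1 - eps) * fake    # WGAN-GP-style interpolated points
grads = np.broadcast_to(w, x_hat.shape)  # gradient of the linear critic
print(linf_hinge_penalty(grads))         # 10 * (1.5 - 1)^2 = 2.5
```

In a real GAN, `grads` would come from autograd on the critic at the interpolated points rather than being constant.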


Nov 02 2019 Alexia Jolicoeur-Martineau

You like Relativistic GANs? 🤩 Read my recent paper https://arxiv.org/abs/1910.06922 for a geometrical interpretation of Relativistic GANs. I also explain why certain variants only perform well with a gradient penalty. #AI #DeepLearning #MachineLearning
0 replies, 88 likes


Oct 16 2019 Statistics Papers

Connections between Support Vector Machines, Wasserstein distance and gradient-penalty GANs. http://arxiv.org/abs/1910.06922
0 replies, 37 likes


Oct 29 2019 Alexia Jolicoeur-Martineau

There are many ways to explain gradient penalties in GANs, but most are post-hoc justifications. A more satisfying explanation is that gradient penalties result from assuming a maximum-margin discriminator/critic (a generalization of the SVM to non-linear classifiers). https://arxiv.org/abs/1910.06922
1 replies, 34 likes
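In symbols, the maximum-margin view sketched above amounts to an objective of roughly this shape (notation paraphrased, not copied from the paper):

```latex
% Maximum-margin classifier with a gradient-norm penalty:
% hinge loss on the margin plus a penalty on the input gradient.
\min_{f}\; \mathbb{E}_{(x,y)}\!\left[\max\bigl(0,\, 1 - y\, f(x)\bigr)\right]
         + \lambda\, \mathbb{E}_{x}\!\left[\lVert \nabla_{x} f(x) \rVert^{2}\right]
% For a linear f(x) = w^{\top}x + b we have \nabla_{x} f = w,
% and the objective reduces to the classic soft-SVM.
```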


Nov 07 2019 Alexia Jolicoeur-Martineau

Learn how to generalize Support Vector Machines (SVMs) with #DeepLearning: https://arxiv.org/abs/1910.06922 #AI #ML #MachineLearning #ComputerScience #Mathematics
0 replies, 16 likes


Oct 21 2019 Alexia Jolicoeur-Martineau

@reworkdl I'll be available to talk on Thursday and Friday at the conference. Feel free to ask me questions about the links between SVMs and GANs: https://arxiv.org/abs/1910.06922. 🧐
0 replies, 9 likes


Oct 24 2019 Montreal.AI

http://Montreal.AI just had an insightful conversation with @jm_alexia about Relativistic GANs https://arxiv.org/abs/1910.06922 https://t.co/bFApdYRved
1 replies, 8 likes


Oct 22 2019 HotComputerScience

Most popular computer science paper of the day: "Connections between Support Vector Machines, Wasserstein distance and gradient-penalty GANs" https://hotcomputerscience.com/paper/connections-between-support-vector-machines-wasserstein-distance-and-gradient-penalty-gans https://twitter.com/jm_alexia/status/1184429680746815488
0 replies, 2 likes


Nov 16 2019 Pinaki Dasgupta, MBA ✨

This work by @jm_alexia provides a framework for deriving MMCs (maximum-margin classifiers) that results in very effective #GAN loss functions. It can be used to derive new gradient-norm penalties and to improve the performance of #GANs. #SVMs
0 replies, 1 likes

