hardmaru: Adversarial Latent Autoencoders
Kind of creepy to imagine what my entire family might look like as Tyrion Lannister, Neo, or Emma Watson.
6 replies, 666 likes
Stanislav Pidhorskyi: Check out "Adversarial Latent Autoencoders" (ALAE), #CVPR2020
Code and pre-trained models: https://github.com/podgorskiy/ALAE
#CVPR, #AI, #ML https://t.co/pt0CJZAUOf
2 replies, 95 likes
Shirley Ho: One can imagine lots of physical applications of this adversarial latent autoencoder.
But it would be interesting to morph our politicians into #GameofThrones characters ... who should @realDonaldTrump be?
2 replies, 44 likes
Daisuke Okanohara: ALAE is the first autoencoder that can generate images with fidelity comparable to StyleGAN. ALAE combines an adversarial loss in image space with a reconstruction error in the "learned" latent space. GAN quality with encoding capability. https://arxiv.org/abs/2004.04467
1 replies, 37 likes
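To make the description above concrete, here is a minimal numpy sketch of the two ALAE objectives. The linear matrices standing in for the four networks (mapping F, generator G, encoder E, discriminator head D), their dimensions, and the function name `alae_losses` are all hypothetical simplifications; the real method uses deep (Style-)networks, but the loss structure — an adversarial game played through the encoder plus a reconstruction error in the learned latent space rather than in pixels — is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear stand-ins for ALAE's four networks (hypothetical sizes):
# F: z -> w (mapping), G: w -> x (generator),
# E: x -> w (encoder), D: w -> realness score (discriminator head).
dz, dw, dx = 8, 8, 16
F = rng.normal(size=(dw, dz))
G = rng.normal(size=(dx, dw))
E = rng.normal(size=(dw, dx))
D = rng.normal(size=(1, dw))

def alae_losses(x_real, z):
    """Return (adversarial loss, latent reconstruction loss) for one batch."""
    w = F @ z            # latent code driving the generator
    x_fake = G @ w       # generated "image"
    # Adversarial term: the discriminator head never sees pixels directly;
    # real and fake images are both scored through the encoder E.
    score_real = D @ (E @ x_real)
    score_fake = D @ (E @ x_fake)
    adv = np.mean(np.log1p(np.exp(-score_real)) +   # softplus(-real)
                  np.log1p(np.exp(score_fake)))     # softplus(+fake)
    # Reconstruction term lives in the *learned* latent space:
    # E(G(w)) should reproduce w, not the pixels of any image.
    rec = np.mean((E @ x_fake - w) ** 2)
    return adv, rec

z = rng.normal(size=(dz, 4))       # batch of 4 latent samples
x_real = rng.normal(size=(dx, 4))  # stand-in "real" images
adv, rec = alae_losses(x_real, z)
print(float(adv), float(rec))
```

Because the reconstruction constraint is imposed on latent codes instead of pixels, the encoder inherits the generator's output quality while still providing an embedding — which is what lets ALAE reconstruct and manipulate real faces.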
AverageName: To start with, I'll try to analyze this paper:
[CVPR2020] Adversarial Latent Autoencoders
The main idea is to combine ideas from GANs and VAEs by requiring the discriminator and generator to share a latent space.
1 replies, 7 likes
By splitting the encoder and decoder into two parts and letting them learn the distribution of intermediate representations adversarially, as in a GAN, the authors propose an autoencoder that achieves both SOTA GAN-level expressiveness and an organized latent space https://t.co/N7Uw2MC1V7
0 replies, 4 likes
Alberto Tono: 👉🏻💫Great weekend reading. 🌈StyleALAE: https://arxiv.org/abs/2004.04467 The latent distribution is learned from the data in an adversarial setting, enabling representations that are likely less entangled. #ai #ml #DeepLearning 💫
0 replies, 3 likes
Vladimir: Adversarial Latent Autoencoders 🤖. Wow, how much the output quality has increased in the last two years — deepfakes are real 😱😱😱
#MachineLearning #ArtificialIntelligence https://t.co/1uQTypmhBx
1 replies, 2 likes
Found on Apr 22 2020 at https://arxiv.org/pdf/2004.04467.pdf