Building your own Self-attention GANs


A PyTorch implementation of SAGAN with the MNIST and CelebA datasets


GANs, also known as Generative Adversarial Networks, are one of the most popular topics in machine learning in recent years. A GAN consists of two different neural network models, one called the Generator and one called the Discriminator. It sounds hard to understand, so let me try to put it this way: say we want to forge famous paintings, starting with no knowledge of painting at all. What should we do? Most would say: just look at the paintings and learn to imitate them. Up to a point that works, and your forgeries will keep getting better, but it is not a one-person job. You will need a friend to stand in front of one real painting and one that you forged and guess which one is real. It will be pretty easy for him in the beginning, but keep it going and you will eventually confuse your friend.

In a GAN, the generator is you, the forger, and the discriminator is the friend who specializes in telling which painting is fake. Think about the goal here: you want to make it hard for your friend to tell real from fake. If your friend gives each painting a probability of being real between 0 and 1, you want him to assign 0.5 to every painting you show him, whether real or forged. This is also the objective of GANs, as reflected in their loss functions.
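To make that objective concrete, here is a minimal sketch of the standard non-saturating GAN losses in PyTorch, using binary cross-entropy on the discriminator's real/fake scores. SAGAN itself is often trained with a hinge loss instead, but the adversarial structure is the same; `D`, `G`, and `z` below are hypothetical stand-ins for your own models and noise input, not the article's exact code:

```python
import torch
import torch.nn.functional as F

def discriminator_loss(D, G, real_images, z):
    """The discriminator learns to score real images as 1 and fakes as 0."""
    real_logits = D(real_images)
    fake_logits = D(G(z).detach())  # detach: don't backprop into G here
    real_loss = F.binary_cross_entropy_with_logits(
        real_logits, torch.ones_like(real_logits))
    fake_loss = F.binary_cross_entropy_with_logits(
        fake_logits, torch.zeros_like(fake_logits))
    return real_loss + fake_loss

def generator_loss(D, G, z):
    """The generator learns to make D score its fakes as real (1)."""
    fake_logits = D(G(z))
    return F.binary_cross_entropy_with_logits(
        fake_logits, torch.ones_like(fake_logits))
```

When the generator succeeds, the discriminator can do no better than output 0.5 for everything, which is exactly the "confused friend" from the analogy above.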

We also see DCGAN a lot, which stands for Deep Convolutional GAN. It is a GAN design specialized for image generation, using convolution layers in both the generator and the discriminator, so it works much like a CNN. A Self-Attention GAN (SAGAN) is a DCGAN that adds self-attention layers. The idea of self-attention has been around for years, also known as "non-local" in some research. Think about how convolution works: it convolves nearby pixels and extracts features from local blocks, so each layer works "locally". In contrast, self-attention layers learn from distant blocks. In 2017, Google published the paper "Attention Is All You Need", bringing more hype to the topic. For a single image input, it works like this:
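Since the original figure is not reproduced here, below is a minimal sketch of such a self-attention layer for image feature maps, following the design in the SAGAN paper: 1x1 convolutions produce query, key, and value maps, attention is taken over all spatial positions, and a learnable scalar `gamma` weights the attention output against the residual input. Treat it as an illustration under those assumptions rather than the article's exact code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention(nn.Module):
    """Self-attention layer in the style of SAGAN.

    Every spatial position can attend to every other position,
    letting the network relate distant blocks of the image that
    a convolution kernel would never see together.
    """
    def __init__(self, in_channels):
        super().__init__()
        # Reducing the query/key channels by 8 follows the SAGAN paper.
        self.query = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.key = nn.Conv2d(in_channels, in_channels // 8, kernel_size=1)
        self.value = nn.Conv2d(in_channels, in_channels, kernel_size=1)
        # gamma starts at 0, so the network first relies on local
        # convolutions and gradually learns to use attention.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        B, C, H, W = x.size()
        N = H * W
        q = self.query(x).view(B, -1, N).permute(0, 2, 1)  # B x N x C'
        k = self.key(x).view(B, -1, N)                     # B x C' x N
        attn = F.softmax(torch.bmm(q, k), dim=-1)          # B x N x N
        v = self.value(x).view(B, -1, N)                   # B x C x N
        out = torch.bmm(v, attn.permute(0, 2, 1))          # B x C x N
        out = out.view(B, C, H, W)
        return self.gamma * out + x                        # residual connection
```

A layer like this can be dropped between convolution blocks in both the generator and the discriminator, which is exactly where SAGAN places it.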
