Building your own Self-attention GANs


A PyTorch implementation of SAGAN on the MNIST and CelebA datasets

Meme from imgflip.com

GANs, also known as Generative Adversarial Networks, are one of the most popular topics in machine learning in recent years. A GAN consists of two different neural network models, one called the Generator and one called the Discriminator. It sounds hard to understand, but let me try to put it this way: suppose we want to forge famous paintings, starting with no knowledge of painting at all. What should we do? Most would say: just look at the paintings and learn how to do it. But it's not a one-man job. Studying on your own will make you better and better at painting up to a point, but to go further you will need a friend to stand in front of one real painting and one that you forged and guess which one is real. It will be pretty easy for him to guess in the beginning, but keep it going and you will eventually confuse your friend.

In GANs, the generator is like you, the forger of paintings, and the discriminator is the friend who specializes in telling which painting is fake. Think about the goal here: you want to make it hard for your friend to tell real from fake. If your friend were to assign each painting a probability of being real, from 0 to 1, you would want him to give 0.5 to every painting you show him, whether real or forged. This is also the objective of GANs, as reflected in their loss functions.
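The 0.5 equilibrium described above falls out of the standard (non-saturating) GAN objective, which is binary cross-entropy on the discriminator's output. Below is a minimal sketch of that objective; the function names are illustrative and not taken from the article's own code:

```python
import torch
import torch.nn.functional as F

# Illustrative sketch of the standard GAN objective.
# d_real / d_fake are the discriminator's raw logits (pre-sigmoid)
# for real and generated samples.

def discriminator_loss(d_real, d_fake):
    # The discriminator is pushed to output 1 ("real") on real
    # samples and 0 ("fake") on generated ones.
    real_term = F.binary_cross_entropy_with_logits(
        d_real, torch.ones_like(d_real))
    fake_term = F.binary_cross_entropy_with_logits(
        d_fake, torch.zeros_like(d_fake))
    return real_term + fake_term

def generator_loss(d_fake):
    # The generator wants the discriminator to label its fakes
    # as real, so it maximizes log D(G(z)).
    return F.binary_cross_entropy_with_logits(
        d_fake, torch.ones_like(d_fake))
```

At the "confused friend" equilibrium the discriminator outputs probability 0.5 (logit 0) everywhere, so each cross-entropy term equals ln 2 ≈ 0.693 and neither player can improve.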

We also see DCGAN a lot, which stands for Deep Convolutional GAN. It is a GAN design specialized in image generation, using convolution layers in both the generator and the discriminator, so each network works much like a CNN. A Self-attention GAN is a DCGAN that adds self-attention layers. The idea of self-attention has been around for years, also known as "non-local" in some research. Think about how convolutions work: they convolve nearby pixels and extract features from local blocks; they work "locally" in each layer. In contrast, self-attention layers learn relationships between distant blocks. In 2017, Google published the paper "Attention Is All You Need", bringing more hype to the topic. For a single image input, a self-attention layer lets every position in the feature map attend to every other position.
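The all-positions-to-all-positions idea can be sketched as a PyTorch module. This follows the SAGAN formulation (1x1 convolutions for query/key/value, an 8x channel reduction for query and key, and a learnable gamma initialized to zero); the exact layer names here are my own, not necessarily the article's:

```python
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    """Self-attention layer in the SAGAN style (illustrative sketch)."""

    def __init__(self, in_channels):
        super().__init__()
        # 1x1 convolutions project the feature map into query, key,
        # and value spaces; query/key use reduced channels (C // 8).
        self.query = nn.Conv2d(in_channels, in_channels // 8, 1)
        self.key = nn.Conv2d(in_channels, in_channels // 8, 1)
        self.value = nn.Conv2d(in_channels, in_channels, 1)
        # Learnable scale, initialized to 0 so the layer starts out
        # as the identity and attention is blended in during training.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.shape
        n = h * w  # number of spatial positions
        q = self.query(x).view(b, -1, n).permute(0, 2, 1)  # B x N x C'
        k = self.key(x).view(b, -1, n)                     # B x C' x N
        # attn[b, i, j]: how much position i attends to position j
        attn = torch.softmax(torch.bmm(q, k), dim=-1)      # B x N x N
        v = self.value(x).view(b, -1, n)                   # B x C x N
        out = torch.bmm(v, attn.permute(0, 2, 1))          # B x C x N
        out = out.view(b, c, h, w)
        return self.gamma * out + x  # residual connection
```

Because the B x N x N attention map relates every pixel to every other pixel, the layer captures long-range dependencies that a stack of local convolutions would need many layers to see.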
