Google ‘BigBird’ Achieves SOTA Performance on Long-Context NLP Tasks



The phenomenal success of Google’s BERT and other transformer-based natural language processing (NLP) models is no accident. Behind all the SOTA performances lies the transformers’ innovative self-attention mechanism, which enables networks to capture contextual information from an entire text sequence. However, the memory and computational requirements of self-attention grow quadratically with sequence length, making transformer-based models very expensive to apply to long sequences.
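To see the quadratic cost concretely, consider a toy calculation (a hedged sketch, not from the paper): one attention head’s score matrix has n × n entries, so an 8x longer input needs 64x the memory.

```python
# Toy illustration of quadratic self-attention cost: each of the n tokens
# attends to every other token, so one head's score matrix has n * n entries.
def full_attention_score_bytes(n, bytes_per_score=4):  # 4 bytes per float32
    return n * n * bytes_per_score

for n in (512, 4096):
    print(f"n={n}: {full_attention_score_bytes(n) / 2**20:.0f} MiB per head")
# An 8x longer sequence (512 -> 4096) needs 64x the memory (1 -> 64 MiB).
```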

To alleviate this quadratic dependency, a team of researchers from Google Research recently proposed a new sparse attention mechanism dubbed BigBird. In their paper Big Bird: Transformers for Longer Sequences, the team demonstrates that despite being a sparse attention mechanism, BigBird preserves all known theoretical properties of quadratic full-attention models. In experiments, BigBird dramatically improves performance across long-context NLP tasks, producing SOTA results in question answering and summarization.


The researchers designed BigBird to satisfy all known theoretical properties of full transformers, building three main components into the model (a code sketch of the resulting attention pattern follows the list):

  • A set of g global tokens that attend to all parts of the sequence.
  • For each query q_i, a set of r random keys that the query attends to.
  • A block of w local neighbours, so that each node attends to its local structure.
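The following is a minimal, illustrative sketch of this sparse pattern in Python; the default sizes and the dense boolean mask are simplifications for clarity, not the paper’s blocked implementation.

```python
import numpy as np

# A minimal sketch of BigBird's sparse attention pattern (illustrative only):
# mask[i, j] == True means query i may attend to key j.
def bigbird_mask(n, g=2, w=3, r=2, seed=0):
    rng = np.random.default_rng(seed)
    mask = np.zeros((n, n), dtype=bool)
    mask[:g, :] = True                 # g global tokens attend everywhere ...
    mask[:, :g] = True                 # ... and are attended to by everyone
    for i in range(n):
        lo, hi = max(0, i - w // 2), min(n, i + w // 2 + 1)
        mask[i, lo:hi] = True          # local window of width w
        mask[i, rng.choice(n, size=r, replace=False)] = True  # r random keys
    return mask

m = bigbird_mask(64)
print(f"{m.sum()} of {m.size} entries attended")  # far fewer than n * n
```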

These innovations enable BigBird to handle sequences up to eight times longer than was previously possible using standard hardware.
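As a usage note beyond the original article, BigBird was later ported to the Hugging Face transformers library; assuming that port and the public google/bigbird-roberta-base checkpoint, trying the block-sparse model looks roughly like this:

```python
# Assumes the Hugging Face `transformers` port of BigBird and the public
# `google/bigbird-roberta-base` checkpoint; not part of the paper's release.
from transformers import BigBirdModel, BigBirdTokenizer

tokenizer = BigBirdTokenizer.from_pretrained("google/bigbird-roberta-base")
model = BigBirdModel.from_pretrained(
    "google/bigbird-roberta-base", attention_type="block_sparse"
)

inputs = tokenizer("A very long document ...", return_tensors="pt")
outputs = model(**inputs)  # this checkpoint handles inputs up to 4096 tokens
print(outputs.last_hidden_state.shape)
```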


Additionally, inspired by BigBird’s capability to handle long contexts, the team introduced a novel application of attention-based models: extracting contextual representations of genomic sequences such as DNA. In experiments, BigBird proved beneficial in processing longer input sequences and also delivered improved performance on downstream tasks such as promoter-region and chromatin-profile prediction.
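One common way to turn raw DNA into model inputs is to split the string into overlapping k-mers; the sketch below is a generic illustration of that idea, not necessarily the tokenization used in the paper.

```python
# Illustrative k-mer tokenization of a DNA string (a generic approach,
# not necessarily the BigBird paper's tokenizer).
def dna_to_kmers(seq, k=6, stride=1):
    return [seq[i:i + k] for i in range(0, len(seq) - k + 1, stride)]

print(dna_to_kmers("ATGCGTAC", k=4))  # ['ATGC', 'TGCG', 'GCGT', 'CGTA', 'GTAC']
```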


The paper Big Bird: Transformers for Longer Sequences is on arXiv.

Reporter: Fangyu Cai | Editor: Michael Sarazen




