Google ‘BigBird’ Achieves SOTA Performance on Long-Context NLP Tasks

The phenomenal success of Google’s BERT and other natural language processing (NLP) models based on transformers isn’t accidental. Behind all the SOTA performances lies transformers’ innovative self-attention mechanism, which enables networks to capture contextual information from an entire text sequence. However, the memory and computational requirements of self-attention grow quadratically with sequence length, making it very expensive to use transformer-based models for processing long sequences.
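To see the scale of the problem, here is a rough back-of-the-envelope calculation (our illustration, assuming float32 scores and counting only the attention score matrix for a single head): memory quadruples every time the sequence length doubles.

```python
# Full self-attention materializes one score per (query, key) pair,
# so the score matrix alone grows quadratically with sequence length n.
for n in (512, 1024, 4096):
    scores = n * n                # entries in the n x n attention matrix
    mib = scores * 4 / 2**20      # float32 bytes, per head, per layer
    print(f"n={n:5d}: {scores:>12,} scores ≈ {mib:7.1f} MiB per head")
```

At BERT’s usual 512 tokens this is about 1 MiB per head, but at 4,096 tokens it is already 64 MiB per head, per layer, before any other activations are counted.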

To alleviate the quadratic dependency of transformers, a team of researchers from Google Research recently proposed a new sparse attention mechanism dubbed BigBird. In their paper Big Bird: Transformers for Longer Sequences, the team demonstrates that despite being a sparse attention mechanism, BigBird preserves all known theoretical properties of quadratic full-attention models. In experiments, BigBird is shown to dramatically improve performance across long-context NLP tasks, producing SOTA results in question answering and summarization.

The researchers designed BigBird to satisfy all known theoretical properties of full transformers, building three main components into the model (a minimal sketch in code follows the list):

  • A set of g global tokens that attend to all parts of the sequence.
  • For each query q_i, a set of r random keys that the query attends to.
  • A block of w local neighbours, so that each node attends to its local structure.
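To make the three components concrete, below is a minimal NumPy sketch of how the combined attention mask could be assembled. This is our illustration, not Google’s implementation: the actual BigBird uses a block-sparse formulation for hardware efficiency, and all parameter names here are our own.

```python
import numpy as np

def bigbird_style_mask(seq_len, num_global=2, window=3, num_random=3, seed=0):
    """Boolean mask combining global, sliding-window and random attention."""
    rng = np.random.default_rng(seed)
    mask = np.zeros((seq_len, seq_len), dtype=bool)

    # 1. Global tokens: the first num_global tokens attend everywhere,
    #    and every token attends back to them.
    mask[:num_global, :] = True
    mask[:, :num_global] = True

    # 2. Sliding window: each token attends to `window` neighbours per side.
    for i in range(seq_len):
        mask[i, max(0, i - window):min(seq_len, i + window + 1)] = True

    # 3. Random keys: each query attends to num_random random positions.
    for i in range(seq_len):
        mask[i, rng.choice(seq_len, size=num_random, replace=False)] = True

    return mask

mask = bigbird_style_mask(seq_len=16)
print(f"{mask.sum()} of {mask.size} attention entries are active")
```

Because each query touches only on the order of g + w + r keys rather than all n, the cost of attention grows linearly rather than quadratically with sequence length.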

These innovations enable BigBird to handle sequences up to eight times longer than what was previously possible using standard hardware.

Additionally, inspired by BigBird’s capability to handle long contexts, the team introduced a novel application of attention-based models for extracting contextual representations of genomics sequences such as DNA. In experiments, BigBird proved beneficial in processing longer input sequences and delivered improved performance on downstream tasks such as promoter region and chromatin profile prediction.

The paper Big Bird: Transformers for Longer Sequences is on arXiv.

Reporter: Fangyu Cai | Editor: Michael Sarazen


