LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention

2021/11/17 (Wed) 12:00 (JST)

Ikuya Yamada (Studio Ousia, Inc.)

[Website]

Founder of Studio Ousia, Inc., where he works on research and development in natural language processing. He founded Neuron, Inc. in 2000 and sold it in 2005, then co-founded Studio Ousia, Inc. in 2007. He is also a visiting researcher at RIKEN AIP and a Kaggle Master.

Abstract

Entity representations are useful in natural language tasks involving entities. LUKE provides new pretrained contextualized representations of words and entities based on the bidirectional transformer. LUKE treats words and entities in a given text as independent tokens and outputs contextualized representations of them. Our model is trained using a new pretraining task based on the masked language model of BERT. The task involves predicting randomly masked words and entities in a large entity-annotated corpus retrieved from Wikipedia. We also propose an entity-aware self-attention mechanism, an extension of the transformer's self-attention mechanism that considers the types of tokens (words or entities) when computing attention scores. The proposed model achieves impressive empirical performance on a wide range of entity-related tasks. In particular, it obtains state-of-the-art results on five well-known datasets: Open Entity (entity typing), TACRED (relation classification), CoNLL-2003 (named entity recognition), ReCoRD (cloze-style question answering), and SQuAD 1.1 (extractive question answering).
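To make the entity-aware self-attention concrete, here is a minimal single-head PyTorch sketch of the idea (the class and variable names are mine, not from the official implementation): the key and value projections are shared across token types, while the query projection is chosen from four matrices (word-to-word, word-to-entity, entity-to-word, entity-to-entity) according to the types of the attending and attended tokens.

```python
import math
import torch
import torch.nn as nn

class EntityAwareSelfAttention(nn.Module):
    """Single-head sketch of entity-aware self-attention.

    Keys and values are shared; the query matrix is selected per
    (query token type, key token type) pair, as in the LUKE paper.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        self.hidden_size = hidden_size
        # Four query matrices: word->word, word->entity, entity->word, entity->entity.
        self.q_w2w = nn.Linear(hidden_size, hidden_size)
        self.q_w2e = nn.Linear(hidden_size, hidden_size)
        self.q_e2w = nn.Linear(hidden_size, hidden_size)
        self.q_e2e = nn.Linear(hidden_size, hidden_size)
        self.key = nn.Linear(hidden_size, hidden_size)
        self.value = nn.Linear(hidden_size, hidden_size)

    def forward(self, hidden: torch.Tensor, is_entity: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_size); is_entity: (batch, seq_len) bool
        k = self.key(hidden)    # shared keys
        v = self.value(hidden)  # shared values

        # Scaled dot-product scores under each of the four query matrices.
        def scores(q_proj: nn.Linear) -> torch.Tensor:
            q = q_proj(hidden)
            return q @ k.transpose(-1, -2) / math.sqrt(self.hidden_size)

        s_w2w, s_w2e = scores(self.q_w2w), scores(self.q_w2e)
        s_e2w, s_e2e = scores(self.q_e2w), scores(self.q_e2e)

        # Select the score matching the (attending, attended) token types.
        q_is_e = is_entity.unsqueeze(-1)  # (batch, seq, 1): type of attending token
        k_is_e = is_entity.unsqueeze(-2)  # (batch, 1, seq): type of attended token
        s = torch.where(
            q_is_e,
            torch.where(k_is_e, s_e2e, s_e2w),
            torch.where(k_is_e, s_w2e, s_w2w),
        )
        return torch.softmax(s, dim=-1) @ v

# Toy usage: a sequence of 5 word tokens followed by 2 entity tokens.
hidden = torch.randn(1, 7, 64)
is_entity = torch.tensor([[False] * 5 + [True] * 2])
out = EntityAwareSelfAttention(64)(hidden, is_entity)  # (1, 7, 64)
```

Sharing the key and value projections across token types keeps the extra parameter cost limited to the three additional query matrices.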

Note: The talk will be given in Japanese.

[Video] [Slides] [Paper] (EMNLP 2020)

Mailing list registration: If you would like to receive announcements about the NLP Colloquium, including the participation URL, please register for the mailing list.

Mailing list registration form

[Back to top page]