Deep Contextualized Word Representations

NLPIR SEMINAR Y2018#12

INTRO

       In the new semester, our lab, the Web Search Mining and Security Lab, plans to hold an academic seminar every Wednesday. Each time, a keynote speaker will share his or her understanding of papers published in recent years.

Arrangement

This week’s seminar is organized as follows:
1. The seminar takes place at 1 p.m. on Wednesday at Zhongguancun Technology Park, Building 5, Room 1306.
2. The lecturer is Zhaoyang Wang; the paper is "Deep Contextualized Word Representations."
3. The seminar will be hosted by Li Shen.
4. The paper for this seminar is attached; please download it in advance.

Deep contextualized word representations

Matthew E. Peters, Mark Neumann, Mohit Iyyer, Matt Gardner,

Christopher Clark, Kenton Lee, Luke Zettlemoyer

Abstract

       We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these usages vary across linguistic contexts (i.e., to model polysemy). Our word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pre-trained on a large text corpus. We show that these representations can be easily added to existing models and significantly improve the state of the art across six challenging NLP problems, including question answering, textual entailment and sentiment analysis. We also present an analysis showing that exposing the deep internals of the pre-trained network is crucial, allowing downstream models to mix different types of semi-supervision signals.
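To make the abstract's key idea concrete, below is a minimal sketch (not the authors' released code) of how an ELMo-style representation collapses the biLM's layer states into one task-specific word vector: the downstream model learns softmax-normalized scalar weights over the L layers and a global scale gamma, computing ELMo_k = gamma * sum_j s_j * h_{k,j} for each token k. The arrays here are random stand-ins for the pretrained biLM's hidden states, and the function name elmo_combine and all tensor shapes are illustrative assumptions.

import numpy as np

def elmo_combine(layer_states, layer_logits, gamma):
    """Collapse the biLM's L layers into one representation per token.

    layer_states: (L, seq_len, dim) hidden states from the pretrained biLM.
    layer_logits: (L,) unnormalized layer weights, learned by the task model.
    gamma:        learned scalar that scales the whole representation.
    """
    s = np.exp(layer_logits - layer_logits.max())
    s = s / s.sum()                          # softmax over the L layers
    # Weighted sum over the layer axis -> (seq_len, dim)
    return gamma * np.tensordot(s, layer_states, axes=1)

# Random stand-ins for biLM states: L = 3 layers, 5 tokens, 8-dim vectors.
rng = np.random.default_rng(0)
states = rng.standard_normal((3, 5, 8))
rep = elmo_combine(states, layer_logits=np.zeros(3), gamma=1.0)
print(rep.shape)  # (5, 8): one deep contextualized vector per token

With layer_logits set to zeros, every layer is weighted equally; in practice the task model learns these logits jointly with its other parameters, which is what lets different downstream tasks mix the biLM's internal layers in different proportions.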
