Transfer Learning For Sequence Tagging With Hierarchical Recurrent Networks

NLPIR SEMINAR Y2019#39

INTRO

In the new semester, our lab, the Web Search Mining and Security Lab, plans to hold an academic seminar every Monday. Each session, a keynote speaker will share his or her understanding of papers related to his or her research.

Arrangement

Tomorrow’s seminar is organized as follows:

  1. The seminar will take place at 1:20 p.m. on Monday (December 2, 2019), at Zhongguancun Technology Park, Building 5, Room 1306.
  2. Yvette GBEDEVI is going to give a presentation on the paper, Transfer Learning For Sequence Tagging With Hierarchical Recurrent Networks.
  3. The seminar will be hosted by Qinghong Jiang.

Everyone interested in this topic is welcome to join us.

Transfer Learning For Sequence Tagging With Hierarchical Recurrent Networks

Zhilin Yang, Ruslan Salakhutdinov & William W. Cohen

Abstract

Recent papers have shown that neural networks obtain state-of-the-art performance on several different sequence tagging tasks. One appealing property of such systems is their generality, as excellent performance can be achieved with a unified architecture and without task-specific feature engineering. However, it is unclear if such systems can be used for tasks without large amounts of training data. In this paper we explore the problem of transfer learning for neural sequence taggers, where a source task with plentiful annotations (e.g., POS tagging on Penn Treebank) is used to improve performance on a target task with fewer available annotations (e.g., POS tagging for microblogs). We examine the effects of transfer learning for deep hierarchical recurrent networks across domains, applications, and languages, and show that significant improvement can often be obtained. These gains translate into improvements over the current state-of-the-art on several well-studied tasks.
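The transfer scheme the abstract describes (sharing lower layers of a hierarchical recurrent tagger between a high-annotation source task and a low-annotation target task) can be sketched roughly as follows. This is an illustrative sketch in PyTorch, not the paper's implementation; the class and parameter names are hypothetical:

```python
import torch
import torch.nn as nn

class SharedTagger(nn.Module):
    # Hypothetical sketch: the embedding layer and word-level BiLSTM encoder
    # are shared across the source and target tasks, while each task keeps
    # its own output layer over its own tag set.
    def __init__(self, vocab_size, emb_dim, hidden_dim, n_src_tags, n_tgt_tags):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)            # shared
        self.encoder = nn.LSTM(emb_dim, hidden_dim,
                               batch_first=True, bidirectional=True)  # shared
        self.src_head = nn.Linear(2 * hidden_dim, n_src_tags)    # source-task only
        self.tgt_head = nn.Linear(2 * hidden_dim, n_tgt_tags)    # target-task only

    def forward(self, tokens, task):
        h, _ = self.encoder(self.embed(tokens))
        head = self.src_head if task == "source" else self.tgt_head
        return head(h)  # per-token tag scores

# Toy usage: one model, two tasks, shared encoder parameters.
model = SharedTagger(vocab_size=100, emb_dim=16, hidden_dim=32,
                     n_src_tags=45, n_tgt_tags=12)
batch = torch.randint(0, 100, (2, 7))      # 2 sentences, 7 tokens each
src_scores = model(batch, task="source")   # shape (2, 7, 45)
tgt_scores = model(batch, task="target")   # shape (2, 7, 12)
```

In this setup, training first on the source task fits the shared embedding and encoder, so the target task starts from representations learned on plentiful data; only the small task-specific head must be learned from scratch on the low-resource task.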

About the Author: nlpvv
