Densely Connected CNN with Multi-scale Feature Attention for Text Classification – NLPIR Natural Language Processing and Information Retrieval Sharing Platform

Densely Connected CNN with Multi-scale Feature Attention for Text Classification

NLPIR SEMINAR Y2019#33

INTRO

In the new semester, our lab, the Web Search Mining and Security Lab, plans to hold an academic seminar every Monday. Each time, a keynote speaker will share his or her understanding of papers related to his or her research.

Arrangement

Tomorrow’s seminar is organized as follows:

  1. The seminar will be held at 1:20 p.m. on Monday, October 21, 2019, at Zhongguancun Technology Park, Building 5, Room 1306.
  2. Qinghong Jiang is going to give a presentation on the paper "Densely Connected CNN with Multi-scale Feature Attention for Text Classification" (Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI-18), July 13–19, 2018, Stockholm, Sweden).
  3. The seminar will be hosted by Wang Gang.

Everyone interested in this topic is welcome to join us.

Densely Connected CNN with Multi-scale Feature Attention for Text Classification

Shiyao Wang, Minlie Huang, Zhidong Deng

Abstract

Text classification is a fundamental problem in natural language processing. As a popular deep learning model, the convolutional neural network (CNN) has demonstrated great success in this task. However, most existing CNN models apply convolution filters of a fixed window size and are thus unable to learn variable n-gram features flexibly. In this paper, we present a densely connected CNN with multi-scale feature attention for text classification. The dense connections build shortcut paths between upstream and downstream convolutional blocks, which enable the model to compose features of larger scale from those of smaller scale, and thus produce variable n-gram features. Furthermore, a multi-scale feature attention is developed to adaptively select multi-scale features for classification. Extensive experiments demonstrate that our model obtains competitive performance against state-of-the-art baselines on six benchmark datasets. Attention visualization further reveals the model's ability to select proper n-gram features for text classification. Our code is available at: https://github.com/wangshy31/Densely-Connected-CNN-with-Multiscale-Feature-Attention.git.
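The core idea in the abstract — adaptively weighting feature maps produced at different n-gram scales — can be illustrated with a minimal numpy sketch. This is a simplified illustration, not the authors' implementation: the function names and the mean-pooled scalar descriptors are assumptions for demonstration, whereas the paper learns the attention parameters jointly with the densely connected convolutional blocks.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def multiscale_feature_attention(features):
    """Combine feature maps from multiple scales with scalar attention weights.

    features: list of L arrays, each of shape (seq_len, dim), e.g. the outputs
    of successive densely connected convolutional blocks (scale 1..L).
    Returns the attention-weighted combination, shape (seq_len, dim).
    """
    # Hypothetical descriptor per scale: mean-pool each feature map to a scalar.
    scores = np.array([f.mean() for f in features])
    weights = softmax(scores)                 # one weight per scale, sums to 1
    stacked = np.stack(features)              # shape (L, seq_len, dim)
    # Weighted sum over the scale axis selects the most informative n-gram scales.
    return (weights[:, None, None] * stacked).sum(axis=0)
```

A usage example: given two scales of feature maps for a 5-token sequence, the function returns a single fused (5, dim) feature map whose contribution per scale is governed by the softmax weights.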
