NLPIR/ICTCLA2018 ACADEMIC SEMINAR 1st ISSUE

Posted: September 18, 2018, 20:51



INTRO


In the new semester, our lab, the Web Search Mining and Security Lab, plans to hold an academic seminar every Wednesday. Each time, a keynote speaker will share his or her understanding of a paper published in recent years.

 


This week's seminar is organized as follows:

1. The seminar will be held at 1 p.m. tomorrow in Room 1013 of the Central Building.

2. The lecturer is Zhang Xi, and the paper's title is A Neural Attention Model for Abstractive Sentence Summarization.

3. The paper for this seminar is attached; please download it in advance.


 


Abstract

Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.
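The core idea of the attention mechanism described in the abstract can be sketched in a few lines: the decoder scores each input position against its current decoding context and forms a weighted average of the input representations. The sketch below is a minimal, self-contained illustration of that pattern, not the paper's actual architecture; the dot-product scoring, dimensions, and variable names are illustrative assumptions.

```python
import math

def softmax(scores):
    """Normalize raw scores into a probability distribution."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(context, input_vecs):
    """Score each input vector against the decoder context (dot product),
    normalize with softmax, and return the attention weights together with
    the weighted average of the input vectors."""
    scores = [sum(c * x for c, x in zip(context, v)) for v in input_vecs]
    weights = softmax(scores)
    dim = len(input_vecs[0])
    avg = [sum(w * v[d] for w, v in zip(weights, input_vecs))
           for d in range(dim)]
    return weights, avg

# Toy example: three 2-dimensional "word embeddings" and a decoder context.
inputs = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
ctx = [1.0, 0.0]
weights, avg = attend(ctx, inputs)
```

In a real summarization model, `context` would come from the decoder state and the summary words generated so far, and the attended average would feed the prediction of the next summary word.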

