﻿{"id":1104,"date":"2018-09-18T09:18:58","date_gmt":"2018-09-18T01:18:58","guid":{"rendered":"http:\/\/www.nlpir.org\/wordpress\/?p=1104"},"modified":"2018-12-13T22:13:58","modified_gmt":"2018-12-13T14:13:58","slug":"nlpir-ictcla2018-academic-seminar-1st-issue","status":"publish","type":"post","link":"http:\/\/www.nlpir.org\/wordpress\/2018\/09\/18\/nlpir-ictcla2018-academic-seminar-1st-issue\/","title":{"rendered":"NLPIR\/ICTCLA2018 ACADEMIC SEMINAR 1st ISSUE"},"content":{"rendered":"<div id=\"article_body\">\n<p class=\"MsoNormal\" style=\"text-align: center; margin: 0cm 0cm 0pt;\" align=\"center\"><b style=\"mso-bidi-font-weight: normal;\"><span lang=\"EN-US\" style=\"font-size: 16pt; mso-bidi-font-size: 11.0pt;\"><span style=\"font-family: Times New Roman;\">NLPIR\/ICTCLA2018 ACADEMIC SEMINAR 1st ISSUE<!--?xml:namespace prefix = \"o\" ns = \"urn:schemas-microsoft-com:office:office\" \/--><\/span><\/span><\/b><span style=\"display: none;\">\u81ea\u7136\u8bed\u8a00\u5904\u7406\u4e0e\u4fe1\u606f\u68c0\u7d22\u5171\u4eab\u5e73\u53f0*Y\u0016U\u0012s7v^6p3X\u0018o#@0S\fU<\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><b style=\"mso-bidi-font-weight: normal;\"><span lang=\"EN-US\" style=\"font-size: 12pt; mso-bidi-font-size: 11.0pt;\"><span style=\"font-family: Times New Roman;\">\u00a0 \u00a0 \u00a0 \u00a0 INTRO<\/span><\/span><\/b><span style=\"display: none;\">\u81ea\u7136\u8bed\u8a00\u5904\u7406\u4e0e\u4fe1\u606f\u68c0\u7d22\u5171\u4eab\u5e73\u53f0\u0017K&amp;o\u0011P\u0017G\u0018Y\u0010B\/`<br \/>\nC9\\*F&#8221;t<\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\"><strong>\u00a0 \u00a0 \u00a0 \u00a0 <\/strong>In the new semester, our Lab, Web Search Mining and Security Lab, plans to hold an academic seminar every Wednesdays, and each time a keynote speaker will share understanding of papers published in recent years with 
you.<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-family: Times New Roman; font-size: medium;\">\u00a0<\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\"><strong>\u00a0 \u00a0 \u00a0 \u00a0 <\/strong>This week&#8217;s seminar is organized as follows:<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\"><strong>\u00a0 \u00a0 \u00a0 \u00a0 <\/strong>1. The seminar will be held at 1 pm tomorrow in Room 1013 of the Center Building.<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\"><strong>\u00a0 \u00a0 \u00a0 \u00a0 <\/strong>2. 
The lecturer is Zhang Xi, and the paper&#8217;s title is &#8220;A Neural Attention Model for Abstractive Sentence Summarization&#8221;.<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\"><strong>\u00a0 \u00a0 \u00a0 \u00a0 <\/strong>3. The attachment is the paper for this seminar; please download it in advance.<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-family: Times New Roman; font-size: medium;\">\u00a0<\/span><\/span><\/p>\n<div style=\"mso-element: para-border-div; mso-border-alt: dotted windowtext .5pt; border: windowtext 1pt dotted; padding: 1pt 4pt 1pt 4pt;\">\n<p class=\"MsoNormal\" style=\"text-align: center; margin: 0cm 0cm 0pt; mso-border-alt: dotted windowtext .5pt; mso-padding-alt: 1.0pt 4.0pt 1.0pt 4.0pt; padding: 0cm;\" align=\"center\"><span lang=\"EN-US\" style=\"font-size: 12pt; mso-bidi-font-size: 11.0pt;\"><span style=\"font-family: Times New Roman;\">Abstract<\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt; mso-border-alt: dotted windowtext .5pt; mso-padding-alt: 1.0pt 4.0pt 1.0pt 4.0pt; padding: 0cm;\"><span lang=\"EN-US\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\"><strong>\u00a0 \u00a0 \u00a0 \u00a0 <\/strong>Summarization based on text extraction is inherently limited, but generation-style 
abstractive methods have proven challenging to build. In this work, we propose a fully data-driven approach to abstractive sentence summarization. Our method utilizes a local attention-based model that generates each word of the summary conditioned on the input sentence. While the model is structurally simple, it can easily be trained end-to-end and scales to a large amount of training data. The model shows significant performance gains on the DUC-2004 shared task compared with several strong baselines.<\/span><\/span><\/span><\/p>\n<\/div>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-family: Times New Roman; font-size: medium;\">\u00a0<\/span><\/span><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>NLPIR\/ICTCLA2018 ACADEMIC SEMINAR 1st IS &hellip; <a href=\"http:\/\/www.nlpir.org\/wordpress\/2018\/09\/18\/nlpir-ictcla2018-academic-seminar-1st-issue\/\">Continue reading <span 
class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[38],"tags":[],"_links":{"self":[{"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/posts\/1104"}],"collection":[{"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/comments?post=1104"}],"version-history":[{"count":1,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/posts\/1104\/revisions"}],"predecessor-version":[{"id":1105,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/posts\/1104\/revisions\/1105"}],"wp:attachment":[{"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/media?parent=1104"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/categories?post=1104"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/tags?post=1104"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}