{"id":1085,"date":"2018-11-06T09:11:45","date_gmt":"2018-11-06T01:11:45","guid":{"rendered":"http:\/\/www.nlpir.org\/wordpress\/?p=1085"},"modified":"2018-12-13T22:13:58","modified_gmt":"2018-12-13T14:13:58","slug":"nlpir-ictcla2018-academic-seminar-7th-issue","status":"publish","type":"post","link":"http:\/\/www.nlpir.org\/wordpress\/2018\/11\/06\/nlpir-ictcla2018-academic-seminar-7th-issue\/","title":{"rendered":"NLPIR\/ICTCLA2018 ACADEMIC SEMINAR 7th ISSUE"},"content":{"rendered":"<div id=\"article_body\">\n<p class=\"MsoNormal\" style=\"text-align: center; margin: 0cm 0cm 0pt;\" align=\"center\"><b style=\"mso-bidi-font-weight: normal;\"><span lang=\"EN-US\" style=\"font-size: 16pt; mso-bidi-font-size: 11.0pt; mso-bidi-font-family: 'Times New Roman';\"><span style=\"font-family: Times New Roman;\">NLPIR\/ICTCLA2018 ACADEMIC SEMINAR 7th ISSUE<\/span><\/span><\/b><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt; padding-left: 30px;\"><b style=\"mso-bidi-font-weight: normal;\"><span lang=\"EN-US\" style=\"font-size: 12pt; mso-bidi-font-size: 11.0pt; mso-bidi-font-family: 'Times New Roman';\"><span style=\"font-family: Times New Roman;\">INTRO<\/span><\/span><\/b><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\" style=\"mso-bidi-font-family: 'Times New Roman';\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\">\u00a0 \u00a0 \u00a0 \u00a0 In the new semester, our lab, the Web Search Mining and Security Lab, plans to hold an academic seminar every Wednesday, and each time a 
keynote speaker will share his or her understanding of a paper published in recent years.<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-family: Times New Roman; font-size: medium;\">\u00a0<\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt; padding-left: 30px;\"><b style=\"mso-bidi-font-weight: normal;\"><span lang=\"EN-US\" style=\"font-size: 12pt; mso-bidi-font-size: 11.0pt; mso-bidi-font-family: 'Times New Roman';\"><span style=\"font-family: Times New Roman;\">Arrangement<\/span><\/span><\/b><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt; padding-left: 30px;\"><span lang=\"EN-US\" style=\"mso-bidi-font-family: 'Times New Roman';\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\">This week&#8217;s seminar is organized as follows:<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\" style=\"mso-bidi-font-family: 'Times New Roman';\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\">\u00a0 \u00a0 \u00a0 \u00a0 1. 
The seminar time is <b style=\"mso-bidi-font-weight: normal;\">1 p.m., Wednesday<\/b>, at Zhongguancun Technology Park, Building 5, Room 1306.<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"text-align: left; margin: 0cm 0cm 0pt;\" align=\"left\"><span lang=\"EN-US\" style=\"mso-bidi-font-family: 'Times New Roman';\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\">\u00a0 \u00a0 \u00a0 \u00a0 2. The lecturer is <b style=\"mso-bidi-font-weight: normal;\">Ziyu Liu<\/b>, and the paper&#8217;s title is <b style=\"mso-bidi-font-weight: normal;\">Neural Text Generation in Stories Using Entity Representations as Context<\/b>.<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"text-align: left; margin: 0cm 0cm 0pt;\" align=\"left\"><span lang=\"EN-US\" style=\"mso-bidi-font-family: 'Times New Roman';\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\">\u00a0 \u00a0 \u00a0 \u00a0 3. The seminar will be hosted by Zhaoyang Wang.<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\" style=\"mso-bidi-font-family: 'Times New Roman';\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\">\u00a0 \u00a0 \u00a0 \u00a0 4. 
The attachment is the paper for this seminar; please download it in advance.<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-family: Times New Roman; font-size: medium;\">\u00a0<\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\" style=\"mso-bidi-font-family: 'Times New Roman';\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\">\u00a0 \u00a0 \u00a0 \u00a0 Everyone interested in this topic is welcome to join us. The following is the abstract for this week\u2019s paper:<\/span><\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-family: Times New Roman; font-size: medium;\">\u00a0<\/span><\/span><\/p>\n<div style=\"mso-element: para-border-div; mso-border-alt: dotted windowtext .5pt; border: windowtext 1pt dotted; padding: 1pt 4pt 1pt 4pt;\">\n<p class=\"MsoNormal\" style=\"layout-grid-mode: char; text-align: center; margin: 0cm 0cm 0pt; mso-border-alt: dotted windowtext .5pt; mso-padding-alt: 1.0pt 4.0pt 1.0pt 4.0pt; padding: 0cm;\" align=\"center\"><span lang=\"EN-US\" style=\"font-size: 14pt; mso-bidi-font-size: 11.0pt;\"><span style=\"font-family: Times New Roman;\">Neural Text Generation in Stories Using Entity Representations as Context<\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"text-align: center; margin: 0cm 0cm 0pt; mso-border-alt: dotted windowtext .5pt; mso-padding-alt: 1.0pt 4.0pt 1.0pt 4.0pt; padding: 0cm;\" align=\"center\"><span lang=\"EN-US\" style=\"font-size: 9pt; mso-bidi-font-size: 11.0pt;\"><span style=\"font-family: Times New Roman;\">Elizabeth Clark<span style=\"mso-tab-count: 2;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 <\/span>Yangfeng Ji<span style=\"mso-tab-count: 1;\">\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 <\/span>Noah A. Smith<\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"text-align: center; margin: 0cm 0cm 0pt; mso-border-alt: dotted windowtext .5pt; mso-padding-alt: 1.0pt 4.0pt 1.0pt 4.0pt; padding: 0cm;\" align=\"center\"><span lang=\"EN-US\" style=\"font-size: 12pt; mso-bidi-font-size: 11.0pt;\"><span style=\"font-family: Times New Roman;\">Abstract<\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt; padding-top: 0cm; padding-right: 0cm; padding-bottom: 0cm;\"><span lang=\"EN-US\"><span style=\"font-size: medium;\"><span style=\"font-family: Times New Roman;\">\u00a0 \u00a0 \u00a0 \u00a0 We introduce an approach to neural text generation that explicitly represents entities mentioned in the text. Entity representations are vectors that are updated as the text proceeds; they are designed specifically for narrative text like fiction or news stories. 
Our experiments demonstrate that modeling entities offers a benefit in two automatic evaluations: mention generation (in which a model chooses which entity to mention next and which words to use in the mention) and selection between a correct next sentence and a distractor from later in the same story. We also conduct a human evaluation on automatically generated text in story contexts; this study supports our emphasis on entities and suggests directions for further research.<\/span><\/span><\/span><\/p>\n<\/div>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-family: Times New Roman; font-size: medium;\">\u00a0<\/span><\/span><\/p>\n<p class=\"MsoNormal\" style=\"margin: 0cm 0cm 0pt;\"><span lang=\"EN-US\"><span style=\"font-family: Times New Roman; font-size: medium;\">\u00a0<\/span><\/span><\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>NLPIR\/ICTCLA2018 ACADEMIC SEMINAR 7th IS &hellip; <a href=\"http:\/\/www.nlpir.org\/wordpress\/2018\/11\/06\/nlpir-ictcla2018-academic-seminar-7th-issue\/\">Continue reading <span 
class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[38],"tags":[],"_links":{"self":[{"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/posts\/1085"}],"collection":[{"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/comments?post=1085"}],"version-history":[{"count":1,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/posts\/1085\/revisions"}],"predecessor-version":[{"id":1086,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/posts\/1085\/revisions\/1086"}],"wp:attachment":[{"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/media?parent=1085"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/categories?post=1085"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.nlpir.org\/wordpress\/wp-json\/wp\/v2\/tags?post=1085"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}