Deep Natural Language Processing course slides (Natural Language Processing with Deep Learning), Lecture 14: Question Answering


Natural Language Processing with Deep Learning
Xi'an Jiaotong University
Question Answering
Chen Li, 2023

Outlines
1. Motivation/History
2. The SQuAD dataset
3. The Stanford Attentive Reader model
4. BiDAF
5. Recent, more advanced architectures

Motivation
[Screenshot of a Google search for "Who was Australia's third prime minister?" The featured snippet answers: John Christian Watson (born John Christian Tanck; 9 April 1867 – 18 November 1941), commonly known as Chris Watson, was an Australian politician who served as the third Prime Minister of Australia. Source: Chris Watson – Wikipedia, https://en.wikipedia.org/wiki/Chris_Watson]

Motivation: Question answering
• With massive collections of full-text documents, i.e., the web, simply returning relevant documents is of limited use
• Rather, we often want answers to our questions
  • Especially on mobile
  • Or using a digital assistant device, like Alexa or Google Assistant
• We can factor this into two parts:
  1. Finding documents that (might) contain an answer
     • Which can be handled by traditional information retrieval/web search
  2. Finding an answer in a paragraph or a document
     • This problem is often termed Reading Comprehension
     • It is what we will focus on today
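The two-part factoring above can be sketched as a minimal pipeline: a TF-IDF-style retrieval stage picks the most relevant document, and a toy "reader" stage returns the sentence with the most question-word overlap. This is an illustrative baseline only (the function names and scoring details are assumptions, not the course's models), but it shows how the IR stage and the reading-comprehension stage fit together:

```python
import math
from collections import Counter

def tokenize(text):
    # Lowercase and strip edge punctuation -- a deliberately crude tokenizer
    return [w.strip(".,?!'\"").lower() for w in text.split()]

def tfidf_score(query_tokens, doc_tokens, n_docs, df):
    # Sum of smoothed tf-idf weights for query terms found in the document
    tf = Counter(doc_tokens)
    score = 0.0
    for w in set(query_tokens):
        if tf[w]:
            idf = math.log((1 + n_docs) / (1 + df[w])) + 1
            score += tf[w] * idf
    return score

def answer(question, docs):
    # Stage 1: retrieve the document most similar to the question (IR/web search)
    q = tokenize(question)
    df = Counter(w for d in docs for w in set(tokenize(d)))
    best_doc = max(docs, key=lambda d: tfidf_score(q, tokenize(d), len(docs), df))
    # Stage 2: "read" the document -- return the sentence with most question-word overlap
    sentences = [s.strip() for s in best_doc.split(".") if s.strip()]
    return max(sentences, key=lambda s: len(set(q) & set(tokenize(s))))

docs = [
    "Chris Watson was an Australian politician. He served as the third Prime Minister of Australia.",
    "Edmund Barton was the first Prime Minister of Australia. He was also a judge.",
]
print(answer("Who was the third Prime Minister of Australia?", docs))
```

Real systems replace stage 1 with an inverted index or dense retriever and stage 2 with a neural reader (the focus of this lecture), but the division of labor is the same.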

Motivation: A Brief History of Reading Comprehension
• Much early NLP work attempted reading comprehension
  • Schank, Abelson, Lehnert et al., c. 1977 – "Yale A.I. Project"
• Revived by Lynette Hirschman in 1999:
  • Could NLP systems answer human reading comprehension questions for 3rd to 6th graders? Simple methods attempted.
• Revived again by Chris Burges in 2013 with MCTest
  • Again answering questions over simple story texts
• Floodgates opened in 2015/16 with the production of large datasets which permit supervised neural systems to be built
  • Hermann et al. (NIPS 2015): DeepMind CNN/DM dataset
  • Rajpurkar et al. (EMNLP 2016): SQuAD
  • MS MARCO, TriviaQA, RACE, NewsQA, NarrativeQA, ...

History: A Brief History of Reading Comprehension
[Timeline figure, 2015–2019: CNN/Daily Mail dataset and Attentive Reader; Children's Book Test; SQuAD 1.1; Stanford Attentive Reader; Match-LSTM; BiDAF; R-Net; TriviaQA; RACE; NarrativeQA; SQuAD 2.0; QANet; BiDAF + self-attention + ELMo; BERT; HotpotQA]

Motivation: Machine Comprehension (Burges 2013)
• "A machine comprehends a passage of text if, for any question regarding that text that can be answered correctly by a majority of native speakers, that machine can provide a string which those speakers would agree both answers that question, and does not contain information irrelevant to that question."
  – Christopher J. C. Burges, "Towards the Machine Comprehension of Text: An Essay", Microsoft Research, Redmond, WA, December 23, 2013

Motivation: MCTest Reading Comprehension
Passage (P) + Question (Q) → Answer (A)

P: Alyssa got to the beach after a long trip. She's from Charlotte. She traveled from Atlanta. She's now in Miami. She went to Miami to visit some friends. But she wanted some time to herself at the beach, so she went there first. After going swimming and laying out, she went to her friend Ellen's house. Ellen greeted Alyssa and they both had some lemonade to drink. Alyssa called her friends Kristin and Rachel to meet at Ellen's house.

Q: Why did Alyssa go to Miami?
A: To visit some friends
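MCTest poses its questions as 4-way multiple choice over the story. A minimal sketch of that data shape (the field names and the distractor choices below are illustrative assumptions, not the official release format):

```python
from dataclasses import dataclass

@dataclass
class MCTestExample:
    passage: str        # the story text
    question: str       # one question about the passage
    choices: list[str]  # four candidate answers (MCTest is 4-way multiple choice)
    answer_idx: int     # index of the correct choice

ex = MCTestExample(
    passage=("Alyssa got to the beach after a long trip. "
             "She went to Miami to visit some friends."),
    question="Why did Alyssa go to Miami?",
    choices=["To visit some friends", "To go swimming",
             "To see Charlotte", "To drink lemonade"],
    answer_idx=0,
)
print(ex.choices[ex.answer_idx])  # -> To visit some friends
```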

History: Turn-of-the-Millennium Full NLP QA
[Architecture diagram of the LCC (Harabagiu/Moldovan) QA system, circa 2003. Question Processing (question parse, semantic transformation, recognition of expected answer type for NER, keyword extraction) feeds Passage Retrieval over a Document Processing pipeline (document index, document collection). Separate Factoid, List, and Definition Answer Processing modules perform answer extraction (NER, pattern matching), answer justification (alignment, relations), and answer reranking (theorem prover), backed by an axiomatic knowledge base, a WordNet answer type hierarchy, a pattern repository, and CICERO LITE named entity recognition.]
Complex systems, but they did work fairly well on "factoid" questions.
