
RANLP 2015, Hissar, Bulgaria
Deep Learning in Industry Data Analytics
Junlan Feng, China Mobile Research

The starting point of AI: the Dartmouth Conference (1956)
- Proposed by John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon
- Topics from the proposal: automatic computers; how to program a computer to use language; neural networks; the theory of the size of a calculation; self-improvement; abstraction; randomness and creativity

Stages of AI
1. 1950s: the Dartmouth agenda above
2. 1980s: rule-based expert systems
3. 2000s and beyond: toward general intelligence

Problems with today's AI technology
1. It depends on large amounts of labeled data
2. It is "narrow AI", trained to perform specific tasks
3. It is not robust or secure enough
4. It cannot explain itself; the models are opaque

The current state of AI: applications. Why AI has become a hot topic: deep learning and reinforcement learning, applied to large-scale, complex, streaming data.

Outline
1. An analysis of the White House AI R&D strategic plan
2. An analysis of the AI strategies of ten technology companies
3. Deep learning and its latest advances
4. Reinforcement learning and its latest advances
5. Deep learning in enterprise data analytics

The U.S. AI R&D Strategic Plan

Strategy I: Make long-term investments in AI research
Goals: ensure America's world leadership; prioritize investment in next-generation AI technology.
1. Advance data-focused methodologies for knowledge discovery
   - Efficient data-cleaning techniques that ensure the veracity and appropriateness of training data
   - Jointly consider data, metadata, and human feedback or knowledge
   - Analysis and mining of heterogeneous and multi-modal data: discrete, continuous, temporal, spatial, spatio-temporal, and graph data
   - Small-data mining, emphasizing the importance of rare events
   - Fusing data with knowledge, especially domain knowledge bases
2. Enhance the perceptual capabilities of AI systems
   - Hardware and algorithms that make machine perception more robust and reliable
   - Better detection, classification, discrimination, and recognition of objects in complex, dynamic environments
   - Better perception of humans by sensors and algorithms, so systems can collaborate with people more smoothly
   - Computing and communicating the uncertainty of perception systems, enabling better downstream judgment

Strategy I (continued)
3. Understand the theoretical capabilities and limits of AI
   - The theoretical upper bound of AI under today's hardware and algorithmic frameworks, across learning, language, perception, reasoning, creativity, and planning
4. Pursue research on general-purpose AI
   - Today's systems are all "Narrow AI", not "General AI"
   - General AI: flexible, multi-task, self-directed, with general competence across cognitive tasks (learning, language, perception, reasoning, creativity, and planning); transfer learning
5. Develop scalable AI systems
   - Coordination among multiple AI systems; distributed planning and control
6. Foster research on human-like AI
   - Self-explanation capability for AI systems
   - Today's AI learns from big data as a black box; humans learn from small data, guided by formal instruction, rules, and hints of all kinds
   - Human-like AI systems could serve as intelligent assistants and intelligent tutors
7. Develop practical, reliable, easy-to-use robots
   - Improve robot perception for smarter interaction with the complex physical world
8. Advance hardware for AI, and AI for hardware
   - GPUs: improved memory, I/O, clock speed, parallelism, and energy efficiency
   - "Neuron-like" processors; processing of streaming, dynamic data
   - Using AI to improve hardware: high-performance computing, optimized energy consumption, enhanced computing performance, intelligent self-configuration, and optimized data movement between multi-core processors and memory

Strategy II: Develop effective methods for human-AI collaboration
Not replacing humans, but working with them, emphasizing the complementarity between people and AI systems.
1. AI technologies that assist humans
   - Many AI systems are designed for human use, replicating human computation, decision-making, and cognition

Strategy II (continued)
2. Techniques that augment humans
   - Stationary devices, wearable devices, implanted devices, and aids for understanding data
3. Visualization and human-friendly AI interfaces
   - Present data and information in ways people can understand; make human-system communication more efficient
4. More effective language-processing systems
   - Solved: fluent speech recognition in quiet environments
   - Unsolved: recognition in noisy environments, far-field speech, accents, children's speech, impaired speech, language understanding, and dialog

Strategy III: Understand and address the ethical, legal, and societal implications of AI
1. Study the ethical, legal, and societal implications of AI technology, and expect systems to conform to human norms
2. AI systems should be designed to meet human moral standards: fairness, justice, transparency, and accountability
3. Build ethical AI techniques
   - How to quantify ethics, turning fuzzy notions into precise systems and algorithms; ethics is usually vague and varies across cultures, religions, and beliefs
4. Frameworks for implementing ethical AI
   - A two-tier architecture in which one layer is dedicated to ethics; embed ethical standards into every AI engineering step

Strategy IV: Ensure the safety and security of AI systems and their environments
1. Before AI systems are in widespread use, their safety must be assured
2. Challenges and solutions for creating stable, dependable, trustworthy, understandable, and controllable AI systems:
   - Improve the explainability and transparency of AI systems
   - Build trust
   - Strengthen verification and validation
   - Self-monitoring, self-diagnosis, and self-correction
   - Handling of the unexpected, and resistance to attacks

Strategy V: Develop shared public datasets and simulation environments for AI
1. An important public good, pursued while fully respecting the rights and interests of companies and individuals in their data
2. Encourage open source

Strategy VI: Establish standards for measuring and evaluating AI technologies
1. Develop appropriate evaluation strategies and methods

Strategy VII: Better understand the national AI R&D workforce needs
1. Ensure a sufficient supply of talent

Big data and AI
- Data is the source of AI; big-data technologies such as parallel and stream computing make AI practical; AI is the main approach to analyzing big, and especially complex, data.

The AI strategies of the top 10 technology companies

Google: an AI-first strategy
1. Spent about $400 million to acquire DeepMind, a London AI startup with roots at University College London
2. AlphaGo
3. GNC
4. WaveNet
5. Q-Learning
Since 2011: (1) speech recognition and synthesis; (2) machine translation; (3) self-driving cars; (4) Google Glass; (5) Google Now; (6) acquisition of Api.ai

Facebook
- Open-sourced its deep learning code: Torch
- Facebook M digital assistant
- Research and applications: FAIR & AML

Apple
- Apple Siri; Apple bought Emotient and VocalIQ

Partnership on AI (September 29, 2016)
- It will "conduct research, recommend best practices, and publish research under an open license in areas such as ethics, fairness and inclusivity; transparency, privacy, and interoperability; collaboration between people and AI systems; and the trustworthiness, reliability and robustness of the technology"

Elon Musk: OpenAI
- CEO of PayPal, Tesla, SpaceX, and SolarCity; invested $1 billion to found OpenAI

Microsoft, IBM, Baidu
- Chinese technology giants Tencent, Alibaba, and iFlytek are also investing heavily in AI

5. Deep Learning in Enterprise Data Analytics: A Case Study
(RANLP 2015, Hissar, Bulgaria, 6 September 2015)

An example: AI in data analytics with deep learning, applied to customer sentiment analysis
1. Introduction
2. Emotion Recognition in Text
3. Emotion Recognition in Speech
4. Emotion Recognition in Conversations
5. Industrial Application
(covering datasets, features, and methods)

Introduction: interchangeable terms
- Opinion mining, sentiment analysis, emotion recognition, polarity detection, review mining

Introduction: what are emotions?

Introduction: problem definition
- We will focus only on document-level sentiment (opinion mining)

Introduction: text examples
- "a thriller without a lot of thrills"
- "An edgy thriller that delivers a surprising punch"
- "A flawed but engrossing thriller"
- "It's unlikely we'll see a better thriller this year"
- "An erotic thriller that's neither too erotic nor very thrilling either"
- Emotions are expressed artistically, with the help of negation, conjunction words, and sentimental words

Introduction: text examples (continued)
- DSE: explicitly expresses an opinion holder's attitude
- ESE: indirectly expresses the attitude of the writer
- Emotions are expressed both explicitly and indirectly

Introduction: text examples (continued)
- Emotions are expressed in language that is often obscured by sarcasm, ambiguity, and plays on words, all of which can be very misleading for both humans and computers:
  - "A sharp tongue does not mean you have a keen mind"
  - "I don't know what makes you so dumb but it really works"
  - "Please, keep talking. So great. I always yawn when I am interested."

Introduction: speech and conversation examples

Typical approach for text: a classification task
- Supervised learning: from a document, extract features such as n-grams (unigrams, bigrams), POS tags, term frequency, syntactic dependencies, and negation tags; classify with an SVM, MaxEnt, Naive Bayes, CRF, or Random Forest into Pos / Neu / Neg
- Unsupervised learning: POS-tag patterns + dictionary + mutual-information rules

Typical approach for speech: a classification task
- Features:
  - Prosodic features: pitch, energy, formants, etc.
  - Voice-quality features: harsh, tense, breathy, etc.
  - Spectral features: LPC, MFCC, LPCC, etc.
  - Teager Energy Operator (TEO)-based features: TEO-FM-var, TEO-Auto-Env, etc.
- Supervised classifiers: SVM, GMM, HMM, DBN, KNN, LDA, CART, into Pos / Neu / Neg

Challenges remain
1. Text-based: capturing compositional effects with higher accuracy: negating positive sentences, negating negative sentences, conjunctions
2. Speech-based: effective features are unknown, and emotional speech segments tend to be transcribed with lower ASR accuracy

Overview of Section 2, Emotion Recognition in Text
- Word embedding for sentiment analysis; CNNs for sentiment classification; RNNs and LSTMs for sentiment classification; prior knowledge + CNN/LSTM; parsing + RNN
- How can deep learning change the game? Emotion classification with deep learning approaches.

1. Word embeddings as features
- The representation of text is very important for the performance of many real-world applications, including emotion recognition:
  - Local representations: n-grams, bag-of-words, 1-of-N coding
  - Continuous representations: Latent Semantic Analysis, Latent Dirichlet Allocation
  - Distributed representations: word embeddings
- Tomas Mikolov, "Learning Representations of Text using Neural Networks", NIPS Deep Learning Workshop 2013 (Bengio et al., 2006; Collobert & Weston, 2008; Mnih & Hinton, 2008; Turian et al., 2010; Mikolov et al., 2013a;c)

Word embedding
- Skip-gram and CBOW architectures; the hidden-layer vector is the word-embedding vector for w(t)

Word embedding for sentiment detection
- Since Mikolov 2013, word embeddings have been widely accepted as standard features for NLP applications, including sentiment analysis
- The word vector space implicitly encodes many linguistic regularities among words, both semantic and syntactic
- Example: Google pre-trained word vectors (1000 billion words). Do they encode polarity similarities?

Top relevant words to "good":
  great 0.729151, bad 0.719005, terrific 0.688912, decent 0.683735, nice 0.683609, excellent 0.644293, fantastic 0.640778, better 0.612073, solid 0.580604, lousy 0.576420, wonderful 0.572612, terrible 0.560204, Good 0.558616
- Mostly yes, but the space does not separate antonyms well ("bad", "lousy", and "terrible" all rank close to "good")

Learning sentiment-specific word embedding (SSWE)
- Tang et al., "Learning Sentiment-Specific Word Embedding for Twitter Sentiment Classification", ACL 2014
- In spirit, similar to multi-task learning: it learns the same way as regular word embeddings, but the loss function considers both the semantic context and the sentiment distance to Twitter emoticons
- Training data: 10 million tweets selected by positive and negative emoticons; evaluated on the Twitter sentiment classification track of SemEval 2013

Paragraph vectors
- Le and Mikolov, "Distributed Representations of Sentences and Documents", ICML 2014
- Paragraph vectors are distributed vector representations for pieces of text, such as sentences or paragraphs
- The paragraph vector is also asked to contribute to the task of predicting the next word, given many contexts sampled from the paragraph; each paragraph corresponds to one column in the matrix D
- It acts as a memory, remembering what is missing from the current context: the topic of the paragraph
- Best results on the MR dataset
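To make the skip-gram idea above concrete, here is a minimal NumPy sketch of one negative-sampling update. The toy vocabulary, dimensions, and learning rate are invented for illustration; this is a sketch of the training signal, not the word2vec implementation itself.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["movie", "great", "bad", "plot", "acting"]
V, D = len(vocab), 8                  # vocabulary size, embedding dimension
W_in = rng.normal(0.0, 0.1, (V, D))   # center-word embeddings
W_out = rng.normal(0.0, 0.1, (V, D))  # context-word embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context, negatives, lr=0.1):
    """One skip-gram negative-sampling update: raise the score of the true
    context word, lower the scores of the sampled negative words."""
    for w, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        v = W_in[center].copy()
        grad = sigmoid(v @ W_out[w]) - label   # d(loss)/d(score)
        W_out[w] -= lr * grad * v
        W_in[center] -= lr * grad * W_out[w]

center, context = vocab.index("great"), vocab.index("movie")
negatives = [vocab.index("bad"), vocab.index("plot")]
before = sigmoid(W_in[center] @ W_out[context])
for _ in range(50):
    sgns_step(center, context, negatives)
after = sigmoid(W_in[center] @ W_out[context])
print(before, after)   # the context score rises after training
```

After repeated updates, the score of the observed (center, context) pair increases while the sampled negatives are pushed down, which is exactly the mechanism that makes co-occurring words end up with similar vectors.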

Overview of Section 2 (continued): CNNs for sentiment classification

CNN for sentiment classification
- Ref: Yoon Kim, "Convolutional Neural Networks for Sentence Classification", EMNLP 2014
1. A simple CNN with one layer of convolution on top of word vectors, motivated by CNNs' success on many other NLP tasks
2. Input layer: word vectors from word2vec pre-trained on Google News
3. Convolution layer: window sizes of 3, 4, and 5 words, each with 100 feature maps, giving 300 features in the penultimate layer
4. Pooling layer: max-over-time pooling
5. Output layer: a fully connected softmax layer producing a distribution over labels
6. Regularization: dropout on the penultimate layer, with a constraint on the L2 norms of the weight vectors
7. Embedding vectors are fine-tuned during training

Common datasets

CNN for sentiment classification: results
- CNN-rand: randomly initialize all word embeddings
- CNN-static: word2vec embeddings, kept fixed
- CNN-nonstatic: fine-tuned embedding vectors

Why is it successful?
1. Multiple filters and multiple feature maps: emotions are expressed in segments rather than spanning the whole sentence
2. Pre-trained word2vec vectors are used as input features
3. Embedding vectors are further improved by non-static training; antonyms become further separated after training
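The architecture described above can be sketched as a single NumPy forward pass. The weights are random toy values, not Kim's trained model; only the structure (three filter widths, 100 feature maps each, max-over-time pooling, softmax) follows the slide.

```python
import numpy as np

rng = np.random.default_rng(1)
T, D = 9, 16                     # sentence length, embedding dimension
X = rng.normal(size=(T, D))      # the sentence as a matrix of word vectors

def conv_max_pool(X, window, n_maps, rng):
    """1-D convolution over word windows, ReLU, then max-over-time pooling."""
    T, D = X.shape
    W = rng.normal(size=(n_maps, window * D))
    feats = np.stack([W @ X[t:t + window].ravel()
                      for t in range(T - window + 1)])   # (T-w+1, n_maps)
    return np.maximum(feats, 0).max(axis=0)              # one value per map

# three filter widths (3, 4, 5 words), 100 feature maps each, as in Kim (2014)
z = np.concatenate([conv_max_pool(X, w, 100, rng) for w in (3, 4, 5)])

W_out = rng.normal(size=(2, z.size))   # binary pos/neg softmax layer
logits = W_out @ z
p = np.exp(logits - logits.max())
p /= p.sum()
print(z.shape, p)   # a 300-d penultimate vector and a 2-way distribution
```

Max-over-time pooling is what lets the model fire on an emotional segment anywhere in the sentence: only the strongest response of each feature map survives, regardless of position.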

Resources for this work
1. Source code: https:/ (link truncated in the source)
2. Parameters in experiments: K = 4; m = 5 with 14 feature maps; m = 7 with 6 feature maps; d = 48

Dynamic CNN (DCNN) for sentiment
- Kalchbrenner et al., "A Convolutional Neural Network for Modelling Sentences", ACL 2014
- One-dimensional versus two-dimensional convolution
- 48-dimensional word vectors, randomly initialized, versus 300-dimensional vectors initialized with Google word2vec
- A more complicated model architecture with dynamic pooling, using 6 and 4 feature maps, versus the straightforward CNN's 100-128 feature maps

Why the CNN is effective
- Johnson and Zhang, "Effective Use of Word Order for Text Categorization with Convolutional Neural Networks", ACL 2015
- It has been noted that the loss of word order caused by bag-of-words (bow) vectors is particularly problematic for sentiment classification; a simple remedy is to use word bigrams in addition to unigrams
- Comparing an SVM with trigram features against a CNN with window filters of widths 1, 2, and 3, the top 100 features break down as:

  Top 100 features | SVM | CNN
  Unigrams         |  68 |   7
  Bigrams          |  28 |  33
  Trigrams         |   4 |  60

- SVMs cannot fully take advantage of high-order n-grams

Sentiment classification with features beyond text, using CNN models
- Tang et al., "Learning Semantic Representations of Users and Products for Document Level Sentiment Classification", ACL 2015

Overview of Section 2 (continued): RNNs and LSTMs for sentiment classification

Recursive Neural Tensor Network (RNTN)
- Socher et al., "Recursive Deep Models for Semantic Compositionality over a Sentiment Treebank", EMNLP 2013. http://nlp.stanford.edu/sentiment/
1. The Stanford Sentiment Treebank is a corpus with fully labeled parse trees
2. Created to facilitate analysis of the compositional effects of sentiment in language
3. 10,662 sentences from movie reviews, parsed by the Stanford parser; 215,154 phrases are labeled
4. A model called the Recursive Neural Tensor Network was proposed

Distribution of sentiment values for n-grams
- Stronger sentiment often builds up in longer phrases, and the majority of shorter phrases are neutral

RNTN composition
- A parent vector is composed from child vectors a and b as p = f([a; b]^T V [a; b] + W [a; b]) with f = tanh, where V is the tensor that directly relates the input vectors and W is the regular RNN weight matrix
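The composition rule just stated can be written out directly. Below is a NumPy sketch of a single RNTN node combining two child vectors; the dimensions and random parameters are toy values, and the point is only the tensor-plus-matrix form of the composition.

```python
import numpy as np

rng = np.random.default_rng(2)
d = 4                                            # word-vector dimension
a, b = rng.normal(size=d), rng.normal(size=d)    # child node vectors

V = rng.normal(size=(d, 2 * d, 2 * d)) * 0.1     # tensor: d slices of (2d x 2d)
W = rng.normal(size=(d, 2 * d)) * 0.1            # regular RNN weight matrix

def rntn_compose(a, b):
    """Parent vector p = tanh([a;b]^T V [a;b] + W [a;b])."""
    c = np.concatenate([a, b])                   # stacked children, shape (2d,)
    tensor_term = np.array([c @ V[k] @ c for k in range(d)])
    return np.tanh(tensor_term + W @ c)

p = rntn_compose(a, b)
print(p.shape)   # same dimension as each child, so the composition recurses
```

Because the parent has the same dimensionality as the children, the same function is applied bottom-up over the whole parse tree, which is what lets the model learn phrase-level effects such as negation flipping a subtree's sentiment.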

LSTM for sentiment analysis
- Wang et al., "Predicting Polarities of Tweets by Composing Word Embedding with Long Short-Term Memory", ACL 2015
- LSTMs work tremendously well on a large number of problems
- Such architectures are more capable than simple RNNs of learning complex compositions of word vectors, such as negation; the input, the stored information, and the output are controlled by three gates
- Dataset: the Stanford Twitter Sentiment corpus (STS)
- LSTM-TLT: word-embedding vectors as input, where TLT is a trainable look-up table
- It is observed that negations are better captured

Gated Recurrent Neural Network
- Tang et al., "Document Modeling with Gated Recurrent Neural Network for Sentiment Classification", EMNLP 2015
- A CNN or LSTM generates sentence representations from word vectors; a gated recurrent neural network (GRU) then encodes sentence relations for sentiment classification
- A GRU can be viewed as a variant of the LSTM with the output gate always on
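As a reference point for the gating just described, a single GRU step can be sketched in NumPy. The parameters are random toy values and the update/reset gate naming follows the standard GRU formulation, not anything specific to Tang et al.

```python
import numpy as np

rng = np.random.default_rng(3)
dx, dh = 5, 7                               # input and hidden sizes

# one weight matrix per gate, acting on the concatenated [h, x]
Wz = rng.normal(size=(dh, dh + dx)) * 0.1   # update gate
Wr = rng.normal(size=(dh, dh + dx)) * 0.1   # reset gate
Wh = rng.normal(size=(dh, dh + dx)) * 0.1   # candidate state

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def gru_step(h, x):
    """h_t = (1-z)*h_prev + z*h_candidate, with the reset gate r
    deciding how much of the old state feeds the candidate."""
    hx = np.concatenate([h, x])
    z = sigmoid(Wz @ hx)
    r = sigmoid(Wr @ hx)
    h_tilde = np.tanh(Wh @ np.concatenate([r * h, x]))
    return (1 - z) * h + z * h_tilde

h = np.zeros(dh)
for x in rng.normal(size=(4, dx)):          # run a 4-step toy sequence
    h = gru_step(h, x)
print(h.shape)
```

Note there is no output gate: the hidden state is exposed directly at every step, which is the sense in which the GRU behaves like an LSTM whose output gate is always on.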

Regional CNN-LSTM
- J. Wang et al., "Dimensional Sentiment Analysis Using a Regional CNN-LSTM Model", ACL 2016
- The dimensional approach represents emotional states as continuous numerical values in multiple dimensions, such as the valence-arousal (VA) space (Russell, 1980); valence refers to the degree of positive versus negative sentiment, whereas arousal refers to the degree of calm versus excitement

Tree-LSTM
- K. S. Tai et al., "Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks", ACL 2015
- The Tree-LSTM is a generalization of LSTMs to tree-structured network topologies
- Tree-LSTMs outperform all existing systems and strong LSTM baselines on two tasks: predicting the semantic relatedness of two sentences (SemEval 2014, Task 1) and sentiment classification (Stanford Sentiment Treebank)
- They achieve comparable accuracy overall, and the constituency-tree-based variant performs better
- Word vectors are initialized with GloVe vectors trained on 840 billion tokens of Common Crawl data (http://nlp.stanford.edu/projects/glove/)
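To show how the LSTM generalizes to trees, here is a NumPy sketch of one child-sum Tree-LSTM node in the style of Tai et al.: the gates see the sum of the children's hidden states, and each child gets its own forget gate. Dimensions and parameters are toy values, purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
dx, dh = 4, 6

def small(shape, rng):
    return rng.normal(size=shape) * 0.1

# gates take the node input x and the sum of child hidden states
Wi, Ui = small((dh, dx), rng), small((dh, dh), rng)   # input gate
Wf, Uf = small((dh, dx), rng), small((dh, dh), rng)   # forget gate (per child)
Wo, Uo = small((dh, dx), rng), small((dh, dh), rng)   # output gate
Wu, Uu = small((dh, dx), rng), small((dh, dh), rng)   # candidate

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def child_sum_node(x, children):
    """One child-sum Tree-LSTM node; children is a list of (h, c) pairs."""
    h_sum = sum((h for h, _ in children), np.zeros(dh))
    i = sigmoid(Wi @ x + Ui @ h_sum)
    o = sigmoid(Wo @ x + Uo @ h_sum)
    u = np.tanh(Wu @ x + Uu @ h_sum)
    # a separate forget gate per child, computed from that child's own h
    c = i * u + sum(sigmoid(Wf @ x + Uf @ h_k) * c_k for h_k, c_k in children)
    return o * np.tanh(c), c

leaf1 = child_sum_node(rng.normal(size=dx), [])       # leaves have no children
leaf2 = child_sum_node(rng.normal(size=dx), [])
root_h, root_c = child_sum_node(rng.normal(size=dx), [leaf1, leaf2])
print(root_h.shape)
```

With per-child forget gates, the node can keep the memory of the sentiment-bearing subtree while discounting the other branch, which is the property that helps on the Sentiment Treebank.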

Overview of Section 2 (continued): prior knowledge + CNN/LSTM

Prior knowledge + deep neural networks
- Hu et al., "Harnessing Deep Neural Networks with Logic Rules", ACL 2016
- For each iteration: (1) the teacher network is obtained by projecting the student network onto a rule-regularized subspace; (2) the student network is updated to balance emulating the teacher's output and predicting the true labels
- This process is agnostic to the student network and applicable to any architecture: RNN, DNN, or CNN
- The teacher network is created each iteration based on two criteria: (1) it stays close enough to the student network; (2) it reflects all rules
- Accuracy on SST2 datasets; "-Rule-q" denotes the teacher network
- One difficulty for a plain neural network is identifying contrastive sense in order to capture the dominant sentiment precisely
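The rule-projection step has a simple closed form in Hu et al.'s framework: the teacher distribution reweights the student's output by how well each label satisfies the rules. Below is a NumPy sketch for a single soft constraint; the example sentence, the probabilities, and the constraint strength C are invented toy values.

```python
import numpy as np

def project_to_rules(p, rule_sat, C=6.0):
    """Teacher distribution q(y|x) proportional to p(y|x) * exp(-C * (1 - r(x,y))):
    probability mass is shifted away from labels that violate the rule."""
    q = p * np.exp(-C * (1.0 - rule_sat))
    return q / q.sum()

# student output for a sentence like "the plot is slow but the acting is great":
p_student = np.array([0.55, 0.45])    # [P(neg), P(pos)]
# the "A but B" rule says the overall label should follow clause B (positive),
# so the negative label largely violates the rule:
rule_sat = np.array([0.1, 1.0])       # degree to which each label satisfies it
q_teacher = project_to_rules(p_student, rule_sat)
print(q_teacher)                      # most mass now on the positive label
```

The student is then trained toward a mix of the true labels and this teacher distribution, so the rule is distilled into the weights rather than applied only at test time.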

- Prior knowledge in the experiment: for "A but B" sentences, the overall sentiment is consistent with the sentiment of B

Overview of Section 2 (continued): dataset collection

Text corpora for sentiment analysis
- MR: movie reviews with one sentence per review; classification involves detecting positive/negative reviews. https://www.cs.cornell.edu/people/pabo/movie-review-data/
- SST: the Stanford Sentiment Treebank, an extension of MR with train/dev/test splits provided and fine-grained labels (very positive, positive, neutral, negative, very negative), re-labeled by Socher et al. (2013). http://nlp.stanford.edu/sentiment/
- CR: customer reviews of various products (cameras, MP3 players, etc.); the task is to predict positive/negative reviews (Hu and Liu, 2004). http://www.cs.uic.edu/liub/FBS/sentiment-analysis.html
- MPQA: the opinion polarity detection subtask of the MPQA dataset (Wiebe et al., 2005). http://www.cs.pitt.edu/mpqa/
- Yelp Dataset Challenge in 2013 and 2014. http:/ (link truncated in the source)
- Blog emotion corpus (Quan & Ren, 2009): sentences annotated with eight emotions: joy, expectation, love, surprise, anxiety, sorrow, anger, and hate
- 2013 Chinese Microblog Sentiment Analysis Evaluation (CMSAE): posts from Sina Weibo annotated with seven emotions (anger, disgust, fear, happiness, like, sadness, and surprise); train set: 4,000 instances (13,252 sentences); test set: 10,000 instances (32,185 sentences). http:/ (link truncated in the source)

Emotion lexicons
- Dictionary of Affect in Language (Whissell). http://sail.usc.edu/dal_app.php
- Affective Norms for English Words and Texts (Bradley & Lang). http://csea.phhp.ufl.edu/media.html
- Harvard General Inquirer categories (Stone et al.). http://www.wjh.harvard.edu/inquirer/
- NRC Emotion Lexicon (Mohammad & Turney). http:/ (link truncated in the source)
- MaxDiff Sentiment Lexicon (Kiritchenko, Zhu, & Mohammad). http:/ (link truncated in the source)

Shared tasks
- SemEval-2007: Affective Text. http://nlp.cs.swarthmore.edu/semeval/tasks/task14/summary.shtml
- SemEval-2013/2014/2015: Sentiment Analysis in Twitter. https://www.cs.york.ac.uk/semeval-2013/task2/, http://alt.qcri.org/semeval2014/task9/, http://alt.qcri.org/semeval2015/task10/
- SemEval-2015: Sentiment Analysis of Figurative Language in Twitter. http://alt.qcri.org/semeval2015/task11/
- Kaggle competition: Sentiment Analysis on Movie Reviews. https:/ (link truncated in the source)

Emotion datasets
- Affective Text Dataset (Strapparava & Mihalcea): news headlines. http://web.eecs.umich.edu/mihalcea/downloads.html#affective
- Affect Dataset (Alm): classic literary tales, annotated at the sentence level. http://people.rc.rit.edu/coagla/
- 2012 US Presidential Elections tweets (Mohammad et al.). http:/ (link truncated in the source)
- Emotional Prosody Speech and Transcripts: actors reading numbers (Liberman et al.). https://catalog.ldc.upenn.edu/LDC2002S28
- HUMAINE multimodal database (Douglas-Cowie et al.). http://emotion- (link truncated in the source)
- Other: EmotionML (Schröder et al.). http://www.w3.org/TR/emotionml/
- ACII (multiple data formats), Interspeech (spoken language), IEEE Transactions on Affective Computing. http:/puter.org/web/tac (link truncated in the source)
- Saif M. Mohammad, "Computational Analysis of Affect and Emotion in Language", EMNLP 2015

Overview of Section 3, Emotion Recognition in Speech
- The common framework; DNNs, RNNs, and CNNs for speech emotion recognition; data collection

The common framework
- Step 1, segment level: a classifier (CNN, DNN/LSTM-RNN, or ELM)
- Step 2, utterance level

The common features
1. Frame feature set: frame length 25 ms with a 10 ms slide; segment length 265 ms, enough to express emotion
   - INTERSPEECH 2009 Emotion Challenge feature set: 12 MFCCs; F0; root-mean-square signal frame energy; the zero-crossing rate of the time signal; the voicing probability computed from the ACF; and 1st-order derivatives
2. Acoustic features: segment length 250 ms; stack the frame features, feed a classifier, and output a distribution over emotion states

DBN + iVector
- Rui Xia and Yang Liu, "A DBN-ivector Framework for Acoustic Emotion Recognition", Interspeech 2016

DNN + ELM
- K. Han, D. Yu and I. Tashev, "Speech Emotion Recognition Using Deep Neural Network and Extreme Learning Machine", Interspeech 2014
1. Frame-level features: 30 common acoustic features
2. Segment-level features: stacks of low-level frame-based features; a DNN serves as a classifier separating positive from negative
3. Utterance-level features: statistics of the segment-level probabilities: the maximum, minimum, and mean of the segment-level probability of the k-th emotion over the utterance, plus the percentage of segments with a high probability of emotion k

DNN + ELM: the frame/segment-level DNN
1. Input layer: 750 units (25 frames x 30 LLD features per frame)
2. Hidden layers: 3 layers with 256 ReLU neurons per layer
3. Output layer: 5 emotions (excitement, frustration, happiness, neutral, and surprise)
4. Training: mini-batch gradient descent with cross-entropy as the objective function

DNN + ELM: the utterance-level extreme learning machine
1. Input layer: 4 statistics x 5 emotions
2. Hidden layer: 1 layer of 120 units
3. Output layer: 5 emotions (as above)
4. Training: super fast

Evaluation
- The Interactive Emotional Dyadic Motion Capture (IEMOCAP) database is used to evaluate the approach; it contains audiovisual data from 10 actors, of which only the audio track is used here

RNN-LSTM
- Jinkyu Lee and Ivan Tashev, "High-level Feature Representation using Recurrent Neural Network for Speech Emotion Recognition", Interspeech 2015
1. Apply an LSTM to replace the DNN in the previous work
2. Change the segment-selection strategy: randomly assign a "NULL" emotion to non-silent segments, instead of using the segments with the highest energy
3. Motivation: the DNN assumes the contextual effect is covered by a long (150-250 ms) window; an RNN-LSTM is capable of modeling long and variable context effects

Overview of Section 3 (continued): CNNs for speech emotion recognition

CNNs extract affect-salient features for SER
- Z. Huang et al., "Speech Emotion Recognition using CNN", ACM Multimedia 2014
- Instead of hand-tuned features, apply a CNN to automatically select affect-salient features, disentangling emotion from other factors such as speaker and noise
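To make the two-stage DNN + ELM pipeline concrete, here is a NumPy sketch of the utterance-level stage: pool segment-level emotion probabilities into the 4 statistics x 5 emotions layout from the slide, then train an extreme learning machine in closed form (a fixed random hidden layer plus a least-squares output layer, which is why ELM training is "super fast"). The segment probabilities here are synthetic random data, not IEMOCAP.

```python
import numpy as np

rng = np.random.default_rng(5)
n_emotions, n_utts = 5, 40

def utterance_features(seg_probs, thresh=0.2):
    """Pool segment-level probabilities (n_segments x n_emotions) into
    4 statistics per emotion: max, min, mean, fraction above a threshold."""
    return np.concatenate([seg_probs.max(0), seg_probs.min(0),
                           seg_probs.mean(0), (seg_probs > thresh).mean(0)])

# synthetic "DNN outputs": each utterance has 10-29 segment probability rows
X = np.stack([utterance_features(rng.dirichlet(np.ones(n_emotions),
                                               size=rng.integers(10, 30)))
              for _ in range(n_utts)])            # (40, 20)
y = rng.integers(0, n_emotions, n_utts)
Y = np.eye(n_emotions)[y]                         # one-hot targets

# ELM: random fixed hidden layer, output weights solved by least squares
W_hidden = rng.normal(size=(120, X.shape[1]))     # 120 hidden units, as on the slide
H = np.tanh(X @ W_hidden.T)                       # (40, 120) hidden activations
beta = np.linalg.pinv(H) @ Y                      # closed-form output weights
pred = (H @ beta).argmax(1)
train_acc = (pred == y).mean()
print(train_acc)
```

Because the hidden layer is never trained, fitting reduces to one pseudo-inverse, which is the ELM design choice the slides summarize as "training: super fast".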

- The input is a spectrogram at two different resolutions. Through unsupervised feature learning, the system obtains one long feature vector y = F(x); based on it, semi-supervised feature learning produces the affect-salient features (e) and the nuisance features (o). Finally, the affect-salient features are fed to a linear SVM for SER.

Semi-CNN
1. Unsupervised learning
2. Semi-supervised learning
- Data in different languages: the Surrey Audio-Visual Expressed Emotion database (SAVEE), the Berlin Emotional Database (Emo-DB), the Danish Emotional Speech database (DES), and the Mandarin Emotional Speech database (MES)
- Results with different features: the spectrogram representation ("RAW" features), acoustic features, the Teager Energy Operator (TEO), and Local Invariant Features (LIF); with and without the affect-salience penalty; with and without the orthogonality penalty

End-to-end SER with CNN + LSTM
- G. Trigeorgis et al., "Adieu Features? End-to-End Speech Emotion Recognition Using a Deep Convolutional Recurrent Network", ICASSP 2016

Datasets for speech emotion recognition (1-3)
- Chung-Hsien Wu, "Emotion and mental state recognition: features, models, systems, applications and beyond", ISCSLP 2014

Section 4: Emotion Recognition in Real-Life Conversations (customer-care dialogs)

Real-life conversations
- C. Vaudable and L. Devillers, "Negative Emotions Detection as an Indicator of Dialogs Quality in Call Centers", ICASSP 2012
- Not widely researched; emotions are much more shaded in the Voxfactory corpus (call-center recordings of a power-supply company in France) than in the prototypical corpus JEMO (a portrayed-emotion corpus)

Unsatisfied customer detection with deep learning
- P. Cong et al., "Unsatisfied Customer Call Detection with Deep Learning", ISCSLP 2016
- Data sampled from a call center; labels are provided by the users
- Challenges: (1) a low sampling rate of 6 kHz; (2) users call under unpredictable background noise and with different levels of accent; (3) frequent overtalk; (4) negative segments are rare

Section 5: Industrial Applications
- Chat robots: detect negative/positive sentiment and give a proper response
- Public opinion polling: surveys of public opinion are widely used in industry, government, and research (Owen Rambow, "Sentiment and Belief: How to Think about, Represent, and Annotate Private States", ACL 2015)
- Stock market prediction: daily live stock-market prediction and tracking using Twitter sentiment (Furu Wei, "Sentiment Analysis and Opinion Mining", tech report)
- Applications viable in the near future and related to audio-visual emotion recognition: affective robots, affective games, intelligent classrooms, intelligent homes
- More applications:
  - Product review mining: what features of a product do customers like, and which do they dislike?
  - Review classification: is a review positive or negative toward the movie?
  - Tracking sentiments toward topics over time: is anger ratcheting up or cooling down?
  - Prediction (election outcomes, market trends): will Clinton or Cruz win?

Trends from paper counts
- Data from Interspeech and ICASSP, currently covering 2015 and 2016 (speaker's note: extend the statistics back to 2011-2014 if the data proves useful):
  1. Counting only speech-emotion papers, the number is declining
  2. To show the rising interest, papers on multi-modal interaction should probably be counted as well
- Sentiment-analysis paper counts in ACL from 2011 to 2016

References
- Tomas Mikolov, "Learning Representations of Text using Neural Networks", NIPS Deep Learning Workshop 2013
- Tang et al., "Learning Sentiment-Specific Word Embedding for Twitter Sentiment Classification", ACL 2014
- Le and Mikolov, "Distributed Representations of Sentences and Documents", ICML 2014
- Yoon Kim, "Convolutional Neural Networks for Sentence Classification", EMNLP 2014
- Kalchbrenner et al., "A Convolutional Neural Network for Modelling Sentences", ACL 2014
- Johnson and Zhang, "Effective Use of Word Order for Text Categorization with Convolutional Neural Networks", ACL 2015
- Tang et al., "Learning Semantic Representations of Users and Products for Document Level Sentiment Classification", ACL 2015
- Socher et al., "Recursive Deep Models for Semantic Compositionality over a Sentiment Treebank", EMNLP 2013
- Wang et al., "Predicting Polarities of Tweets by Composing Word Embedding with Long Short-Term Memory", ACL 2015
- Tang et al., "Document Modeling with Gated Recurrent Neural Network for Sentiment Classification", EMNLP 2015
- J. Wang et al., "Dimensional Sentiment Analysis Using a Regional CNN-LSTM Model", ACL 2016
- K. S. Tai et al., "Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks", ACL 2015
- Saif M. Mohammad, "Computational Analysis of Affect and Emotion in Language", EMNLP 2015

End of presentation. Thank you for watching!
