Very Deep Convolutional Networks


Very Deep Convolutional Networks for Text Classification

Alexis Conneau (Facebook AI Research, aconneau@fb.com)
Holger Schwenk (Facebook AI Research, schwenk@fb.com)
Yann LeCun (Facebook AI Research, yann@fb.com)
Loïc Barrault (LIUM, University of Le Mans, France, loic.barrault@univ-lemans.fr)

arXiv:1606.01781v2 [cs.CL] 27 Jan 2017

Abstract

The dominant approach for many NLP tasks are recurrent neural networks, in particular LSTMs, and convolutional neural networks. However, these architectures are rather shallow in comparison to the deep convolutional networks which have pushed the state-of-the-art in computer vision. We present a new architecture (VDCNN) for text processing which operates directly at the character level and uses only small convolutions and pooling operations. We are able to show that the performance of this model increases with the depth: using up to 29 convolutional layers, we report improvements over the state-of-the-art on several public text classification tasks. To the best of our knowledge, this is the first time that very deep convolutional nets have been applied to text processing.

1 Introduction

The goal of natural language processing (NLP) is to process text with computers in o…

…interest in the research community and they are systematically applied to all NLP tasks. However, while the use of (deep) neural networks in NLP has shown very good results for many tasks, it seems that they have not yet reached the level to outperform the state-of-the-art by a large margin, as it was observed in computer vision and speech recognition.

Convolutional neural networks, in short ConvNets, are very successful in computer vision. In early approaches to computer vision, handcrafted features were used, for instance the "scale-invariant feature transform (SIFT)" (Lowe, 2004), followed by some classifier. The fundamental idea of ConvNets (LeCun et al., 1998) is to consider feature extraction and classification as one jointly trained task. This idea has been improved over the years, in particular by using many layers of convolutions and pooling to sequentially extract a hierarchical representation (Zeiler and Fergus, 2014) of the input. The best networks are using more than 150 layers as in (He et al., 2016a; He et al., 2016b).

Many NLP approaches consider words as basic units. An important step was the introduction…
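The architecture described in the abstract — character-level input, stacks of small (width-3) convolutions interleaved with pooling, and a classifier on top — can be sketched as a forward pass in plain NumPy. This is a minimal illustration of the general idea, not the authors' implementation: the alphabet, embedding size, layer count, and class count below are illustrative assumptions, and the paper's actual models go much deeper (up to 29 convolutional layers) with learned, trained weights.

```python
import numpy as np

# Sketch of a character-level convolutional text classifier in the spirit of
# VDCNN: embed characters, stack width-3 convolutions with pooling, classify.
# All sizes here are toy assumptions, not the paper's configuration.

rng = np.random.default_rng(0)

ALPHABET = "abcdefghijklmnopqrstuvwxyz "   # toy character set (assumption)
EMBED_DIM = 16                              # character embedding size
NUM_CLASSES = 4                             # e.g. topic labels

embedding = rng.normal(scale=0.1, size=(len(ALPHABET), EMBED_DIM))

def conv1d(x, w):
    """'Same'-padded width-3 convolution over the character axis.
    x: (length, channels_in), w: (3, channels_in, channels_out)."""
    pad = np.pad(x, ((1, 1), (0, 0)))
    out = np.zeros((x.shape[0], w.shape[2]))
    for t in range(x.shape[0]):
        # combine a window of 3 consecutive positions into one output vector
        out[t] = np.einsum("kc,kco->o", pad[t:t + 3], w)
    return np.maximum(out, 0.0)             # ReLU non-linearity

def max_pool(x):
    """Halve the temporal resolution with stride-2 max pooling."""
    trimmed = x[: (x.shape[0] // 2) * 2]
    return trimmed.reshape(-1, 2, x.shape[1]).max(axis=1)

# Three conv blocks here; the paper stacks many more (up to 29 conv layers).
weights = [rng.normal(scale=0.1, size=(3, EMBED_DIM, EMBED_DIM))
           for _ in range(3)]
classifier = rng.normal(scale=0.1, size=(EMBED_DIM, NUM_CLASSES))

def forward(text):
    # characters -> embeddings -> (conv, ReLU, pool) blocks -> global max -> linear
    x = embedding[[ALPHABET.index(c) for c in text.lower() if c in ALPHABET]]
    for w in weights:
        x = max_pool(conv1d(x, w))
    return x.max(axis=0) @ classifier       # one logit per class

logits = forward("very deep convolutional networks")
print(logits.shape)                         # (4,)
```

The key property this sketch shares with the paper's design is that every convolution is small and local, so depth — not filter width — is what grows the receptive field over the character sequence.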
