Paper: SP-L11.2
Session: Language Modeling and Search
Time: Friday, May 21, 15:50 - 16:10
Presentation: Lecture
Topic: Speech Processing: Language Modeling
Title: EXACT TRAINING OF A NEURAL SYNTACTIC LANGUAGE MODEL
Authors: Ahmad Emami, Johns Hopkins University; Frederick Jelinek, Johns Hopkins University
Abstract: The Structured Language Model (SLM) aims to predict the next word in a given word string by performing a syntactic analysis of the preceding words. However, it faces a data sparseness problem because of the large dimensionality and diversity of the information available in the syntactic parses. In previous work, we proposed using neural network models for the SLM. The neural network model is better suited to tackling the data sparseness problem, and the combination gave significant improvements in perplexity and word error rate over the baseline SLM. In this paper we present a new method of training the neural-network-based SLM. The presented procedure makes use of the partial parses hypothesized by the SLM itself, and is more expensive than the approximate training method used in previous work. Experiments with the new training method on the UPenn and WSJ corpora show significant reductions in perplexity and word error rate, achieving the lowest published results for the given corpora.
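As a rough illustration of the idea in the abstract (not the authors' model): the Structured Language Model conditions its next-word prediction on heads exposed by a syntactic analysis of the prefix, rather than only on the immediately preceding words. The minimal count-based sketch below mimics that conditioning with a hypothetical pair of exposed heads; the head words and sentence are invented for illustration.

```python
from collections import defaultdict

class ToyHeadPredictor:
    """Toy next-word predictor conditioned on two exposed head words.

    This is a count-based stand-in for illustration only; the paper's
    model is a neural network trained over the SLM's partial parses.
    """

    def __init__(self):
        # counts[(h_minus2, h_minus1)][next_word] = frequency
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, h2, h1, next_word):
        self.counts[(h2, h1)][next_word] += 1

    def prob(self, h2, h1, next_word):
        ctx = self.counts[(h2, h1)]
        total = sum(ctx.values())
        return ctx[next_word] / total if total else 0.0

model = ToyHeadPredictor()
# For "the dog on the hill barked", a toy dependency analysis might
# expose the heads ("dog", "hill") before predicting "barked"
# (an assumed analysis, purely for illustration).
model.observe("dog", "hill", "barked")
model.observe("dog", "hill", "barked")
model.observe("dog", "hill", "slept")
print(model.prob("dog", "hill", "barked"))  # 2/3
```

Note how the context is a pair of syntactic heads rather than the two literal preceding words ("the", "hill" in a trigram model), which is the structural advantage the abstract attributes to the SLM.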