Result Details
Extensions of Recurrent Neural Network Language Model
Kombrink Stefan, Dipl.-Linguist., DCGM (FIT)
Burget Lukáš, doc. Ing., Ph.D., DCGM (FIT)
Černocký Jan, prof. Dr. Ing., DCGM (FIT)
Khudanpur Sanjeev
This paper describes the results we obtained with extensions of the Recurrent Neural Network (RNN) language model.
language modeling, recurrent neural networks, speech recognition
We present several modifications of the original recurrent neural network language model (RNN LM). While this model has been shown to significantly outperform many competitive language modeling techniques in terms of accuracy, the remaining problem is its computational complexity. In this work, we show approaches that lead to more than a 15-fold speedup in both the training and testing phases. Next, we show the importance of using the backpropagation through time algorithm. An empirical comparison with feedforward networks is also provided. Finally, we discuss possibilities for reducing the number of parameters in the model. The resulting RNN model can thus be smaller, faster in both training and testing, and more accurate than the basic one.
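For readers unfamiliar with the baseline model whose complexity the paper reduces, the following is a minimal illustrative sketch (not code from the paper) of one forward step of an Elman-style RNN LM: the previous word and the previous hidden state produce a new hidden state, from which a softmax gives the distribution over the next word. All sizes and weights here are hypothetical; the full softmax over the vocabulary is exactly the dominant cost that the paper's speedup techniques address.

```python
import numpy as np

rng = np.random.default_rng(0)

V, H = 1000, 50  # hypothetical vocabulary size and hidden-layer size

# Elman-style RNN LM parameters (random, for illustration only)
U = rng.normal(0, 0.1, (H, V))     # input word (one-hot) -> hidden
W = rng.normal(0, 0.1, (H, H))     # previous hidden state -> hidden
Vout = rng.normal(0, 0.1, (V, H))  # hidden -> output scores

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def step(word_id, h_prev):
    """One time step: consume word_id, return (P(next word), new hidden state)."""
    # sigmoid hidden layer; U[:, word_id] is the one-hot input projection
    h = 1.0 / (1.0 + np.exp(-(U[:, word_id] + W @ h_prev)))
    p = softmax(Vout @ h)    # O(V * H) output layer -- the expensive part
    return p, h

h = np.zeros(H)
p, h = step(42, h)           # feed a hypothetical word id
```

The `Vout @ h` product and softmax cost O(V·H) per word, which dominates for realistic vocabularies; that is why reducing output-layer cost yields the large training and testing speedups discussed in the paper.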
@inproceedings{BUT76388,
author="Tomáš {Mikolov} and Stefan {Kombrink} and Lukáš {Burget} and Jan {Černocký} and Sanjeev {Khudanpur}",
title="Extensions of Recurrent Neural Network Language Model",
booktitle="Proceedings of the 2011 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2011",
year="2011",
pages="5528--5531",
publisher="IEEE Signal Processing Society",
address="Praha",
isbn="978-1-4577-0537-3",
url="https://www.fit.vut.cz/research/publication/9658/"
}
Recognition and presentation of multimedia data, BUT, BUT internal projects, FIT-S-10-2, 2010, start: 2010-04-01, end: 2010-12-31, completed
Security-Oriented Research in Information Technology, MŠMT, institutional funding from the state budget of the Czech Republic (e.g. research plans, research centres), MSM0021630528, start: 2007-01-01, end: 2013-12-31, running
Speech Recognition under Real-World Conditions, GACR, standard projects, GA102/08/0707, start: 2008-01-01, end: 2011-12-31, completed