Result Details
RNNLM - Recurrent Neural Network Language Modeling Toolkit
Mikolov Tomáš, Ing., DCGM (FIT)
Kombrink Stefan, Dipl.-Linguist., DCGM (FIT)
Deoras Anoop
Burget Lukáš, doc. Ing., Ph.D., DCGM (FIT)
Černocký Jan, prof. Dr. Ing., DCGM (FIT)
This article describes the RNNLM - Recurrent Neural Network Language Modeling Toolkit, which was presented at the poster session of the ASRU 2011 conference.
neural network, language modeling, speech recognition
We present a freely available open-source toolkit for training recurrent neural network based language models. It can be easily used to improve existing speech recognition and machine translation systems. It can also serve as a baseline for future research into advanced language modeling techniques. In the paper, we discuss optimal parameter selection and different modes of functionality. The toolkit, example scripts and basic setups are freely available at http://rnnlm.sourceforge.net/.
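The toolkit is driven entirely from the command line. As a rough illustration (not taken from the paper), the two lines below sketch a typical training and evaluation run in the style of the example scripts distributed with the toolkit; the file names train.txt, valid.txt, test.txt and model.rnn are placeholders, and the option values are only indicative and should be tuned as discussed in the paper:

rnnlm -train train.txt -valid valid.txt -rnnlm model.rnn -hidden 100 -class 100 -bptt 4 -bptt-block 10 -rand-seed 1 -debug 2
rnnlm -rnnlm model.rnn -test test.txt

Here -hidden sets the hidden-layer size, -class the number of output classes used to speed up training, and -bptt the number of backpropagation-through-time steps; the second command evaluates the trained model on held-out data and reports its log-probability and perplexity, and such scores can then be combined with an n-gram model when rescoring recognition hypotheses. Exact option names may differ between toolkit versions, so the distributed example scripts should be consulted.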
@inproceedings{BUT97008,
author="Tomáš {Mikolov} and Stefan {Kombrink} and Anoop {Deoras} and Lukáš {Burget} and Jan {Černocký}",
title="RNNLM - Recurrent Neural Network Language Modeling Toolkit",
booktitle="Proceedings of ASRU 2011",
year="2011",
pages="1--4",
publisher="IEEE Signal Processing Society",
address="Hilton Waikoloa Village, Big Island, Hawaii",
isbn="978-1-4673-0366-8",
url="http://www.fit.vutbr.cz/research/groups/speech/publi/2011/mikolov_asru2011_demo_RNNLM-1.pdf"
}
Security-Oriented Research in Information Technology, MŠMT, institutional funding from the Czech state budget (e.g., research plans, research centres), MSM0021630528, start: 2007-01-01, end: 2013-12-31, running
Speech Recognition under Real-World Conditions, GACR, Standard Projects, GA102/08/0707, start: 2008-01-01, end: 2011-12-31, completed
Technologies of speech processing for efficient human-machine communication, TAČR, ALFA Programme of Applied Research and Experimental Development, TA01011328, start: 2011-01-01, end: 2014-12-31, completed