TELKOMNIKA, Vol. 16, No. 3, June 2018

Shared-hidden-layer Deep Neural Network for Under-resourced Language

Journal from gdlhub / 2018-07-25 15:17:49
By : Devin Hoesen, Dessi Puji Lestari, Dwi Hendratmo Widyantoro, Telkomnika
Created : 2018-07-25, with 1 file

Keyword : deep neural network; grapheme-to-phoneme; indonesian; shared hidden layer; under-resourced;
Url : http://journal.uad.ac.id/index.php/TELKOMNIKA/article/view/7984
Document source : WEB

Training a speech recognizer with under-resourced language data remains difficult. Indonesian is considered under-resourced because of the lack of a standard speech corpus, text corpus, and pronunciation dictionary. In this research, the efficacy of augmenting limited Indonesian speech training data with training data from a highly resourced language, such as English, to train an Indonesian speech recognizer was analyzed. The training was performed in the form of shared-hidden-layer deep-neural-network (SHL-DNN) training. An SHL-DNN has language-independent hidden layers and can be pre-trained and trained on multilingual data in the same way as a monolingual deep neural network. The SHL-DNN trained on Indonesian and English speech data proved effective for reducing the word error rate (WER) when decoding Indonesian dictated speech, achieving a 3.82% absolute decrease compared to a monolingual Indonesian hidden Markov model with Gaussian mixture model emissions (GMM-HMM). The result was confirmed when the SHL-DNN was also employed to decode Indonesian spontaneous speech, achieving a 4.19% absolute WER decrease.
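The sketch below illustrates, under assumed settings, the shared-hidden-layer idea the abstract describes: the hidden layers are language-independent and receive gradients from every language's data, while each language keeps its own output (senone) layer. All layer sizes, layer counts, senone counts, feature dimensions, and the PyTorch framing are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class SharedHiddenLayerDNN(nn.Module):
    """Acoustic model with hidden layers shared across languages and one
    language-specific output layer per language (illustrative only)."""

    def __init__(self, feat_dim, hidden_dim, num_layers, senones_per_lang):
        super().__init__()
        layers, in_dim = [], feat_dim
        for _ in range(num_layers):
            layers += [nn.Linear(in_dim, hidden_dim), nn.Sigmoid()]
            in_dim = hidden_dim
        # Language-independent hidden stack, trained on pooled multilingual data
        self.shared = nn.Sequential(*layers)
        # One language-specific output layer per language
        self.outputs = nn.ModuleDict({
            lang: nn.Linear(hidden_dim, n_senones)
            for lang, n_senones in senones_per_lang.items()
        })

    def forward(self, x, lang):
        h = self.shared(x)
        return self.outputs[lang](h)  # senone logits for the given language


# Assumed dimensions: 440-dim spliced features, 5 x 1024 hidden layers,
# and hypothetical senone counts for Indonesian ("id") and English ("en").
model = SharedHiddenLayerDNN(feat_dim=440, hidden_dim=1024, num_layers=5,
                             senones_per_lang={"id": 2000, "en": 3000})
optim = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Alternating Indonesian and English mini-batches: every batch updates the
# shared layers, but only its own language's output layer.
for lang, n_senones in [("id", 2000), ("en", 3000)]:
    feats = torch.randn(32, 440)                  # stand-in acoustic features
    targets = torch.randint(n_senones, (32,))     # stand-in senone labels
    optim.zero_grad()
    loss = loss_fn(model(feats, lang), targets)
    loss.backward()
    optim.step()
```

In this arrangement the English data regularizes and enriches the shared hidden layers, which is the mechanism the abstract credits for the WER reductions on Indonesian test speech.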


Property : Value
ID Publisher : gdlhub
Organization : Telkomnika
Contact Name : Herti Yani, S.Kom
Address : Jln. Jenderal Sudirman
City : Jambi
Region : Jambi
Country : Indonesia
Phone : 0741-35095
Fax : 0741-35093
Administrator E-mail : elibrarystikom@gmail.com
CKO E-mail : elibrarystikom@gmail.com

Contributor : Editor: sukadi