Marta Campos Ferreira
João Manuel R. S. Tavares
|A customized residual neural network and bi-directional gated recurrent unit-based automatic speech recognition model
|Speech recognition aims to convert human speech into text and has applications in security, healthcare, commerce, automobiles, and technology, to name a few. Inserting residual neural networks before recurrent neural network cells improves accuracy and substantially reduces training time. Furthermore, layer normalization is more effective than batch normalization for model training and performance enhancement. The size of the dataset also has a strong influence on the achievable performance. Leveraging these techniques, this article proposes an automatic speech recognition model that stacks five layers of a customized Residual Convolutional Neural Network and seven layers of Bi-Directional Gated Recurrent Units, followed by a logarithmic softmax for the model output. Each layer incorporates layer normalization with learnable per-element affine parameters. The new model was trained and tested on the LibriSpeech corpus and the LJ Speech dataset. The experimental results demonstrate character error rates (CER) of 4.7% and 3.61% on the two datasets, respectively, with only 33 million parameters and without requiring any external language model.
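The architecture described in the abstract can be sketched as follows. This is a minimal PyTorch illustration under assumed hyperparameters (channel widths, GRU dimension, 29 character classes), not the authors' exact 33M-parameter implementation: five residual convolutional blocks, seven bidirectional GRU layers, per-element affine layer normalization (PyTorch's `nn.LayerNorm` with `elementwise_affine=True`), and a log-softmax output suitable for CTC training.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualCNNBlock(nn.Module):
    """Conv block with a skip connection and per-element affine layer norm."""
    def __init__(self, channels, n_feats):
        super().__init__()
        # elementwise_affine=True enables learnable per-element scale and bias
        self.ln1 = nn.LayerNorm(n_feats, elementwise_affine=True)
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.ln2 = nn.LayerNorm(n_feats, elementwise_affine=True)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):                  # x: (batch, channels, feats, time)
        residual = x
        # LayerNorm acts on the last dim, so swap feats to the end and back
        out = self.ln1(x.transpose(2, 3)).transpose(2, 3)
        out = self.conv1(F.gelu(out))
        out = self.ln2(out.transpose(2, 3)).transpose(2, 3)
        out = self.conv2(F.gelu(out))
        return out + residual              # residual (skip) connection

class ASRModel(nn.Module):
    """5 residual CNN blocks -> 7 BiGRU layers -> log-softmax over characters."""
    def __init__(self, n_feats=128, n_classes=29, cnn_channels=32, rnn_dim=256):
        super().__init__()
        self.input_conv = nn.Conv2d(1, cnn_channels, kernel_size=3, padding=1)
        self.res_cnn = nn.Sequential(
            *[ResidualCNNBlock(cnn_channels, n_feats) for _ in range(5)])
        self.proj = nn.Linear(cnn_channels * n_feats, rnn_dim)
        self.bigru = nn.GRU(rnn_dim, rnn_dim, num_layers=7,
                            batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * rnn_dim, n_classes)

    def forward(self, spec):               # spec: (batch, 1, n_feats, time)
        x = self.res_cnn(self.input_conv(spec))
        b, c, f, t = x.shape
        x = x.permute(0, 3, 1, 2).reshape(b, t, c * f)  # (batch, time, feats)
        x, _ = self.bigru(self.proj(x))
        # log-probabilities per time step, e.g. for a CTC loss
        return F.log_softmax(self.classifier(x), dim=-1)
```

With a batch of two 128-bin spectrograms of 50 frames, `ASRModel()(torch.randn(2, 1, 128, 50))` yields a `(2, 50, 29)` tensor of per-frame character log-probabilities; decoding without an external language model would then take the CTC best path over these outputs.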
|Ciências Tecnológicas, Ciências da engenharia e tecnologias
Technological sciences, Engineering and technology
|Ciências da engenharia e tecnologias
Engineering and technology
|info:eu-repo/grantAgreement/Agência para o Investimento e Comércio Externo de Portugal, E.P.E/Regime Contratual de Investimento/POCI-01-0247-FEDER-041435 (Safe Cities)/Safe Cities - Inovação para Construir Cidades Seguras/Safe Cities
|Article in an International Scientific Journal
|Appears in Collections:
|FEUP - Artigo em Revista Científica Internacional
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.