Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Improving Post-Filtering of Artificial Speech Using Pre-Trained LSTM Neural Networks

Version 1: Received: 16 May 2019 / Approved: 17 May 2019 / Online: 17 May 2019 (16:16:53 CEST)

A peer-reviewed version of this preprint also exists:

Coto-Jiménez, M. Improving Post-Filtering of Artificial Speech Using Pre-Trained LSTM Neural Networks. Biomimetics 2019, 4, 39.

Abstract

Several researchers have explored deep learning-based post-filters to improve the quality of statistical parametric speech synthesis. These post-filters map synthetic speech parameters to those of natural speech, treating each parameter stream separately and aiming to reduce the gap between them. Long Short-Term Memory (LSTM) neural networks have been applied successfully for this purpose, but many aspects of the results and of the process itself can still be improved. In this paper, we introduce a new pre-training approach for LSTM networks, with the objective of enhancing the quality of the synthesized speech, particularly its spectrum, in a more efficient manner. Our approach begins with auto-associative training of a single LSTM network, which is then used to initialize the post-filters. We show the advantages of this initialization for enhancing the Mel-frequency cepstral parameters of synthetic speech. Results show that, in most cases, this initialization enhances the spectrum of statistical parametric speech better than the common random initialization of the networks.
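
To make the described approach concrete, the following is a minimal sketch, not the author's implementation: an LSTM is first trained auto-associatively on natural-speech MFCC sequences, and its weights then initialize a post-filter that is fine-tuned to map synthetic MFCCs to natural ones. It assumes Keras/TensorFlow; the MFCC dimensionality, layer size, sequence length, and placeholder random data are all illustrative assumptions, not values from the paper.

import numpy as np
import tensorflow as tf

N_MFCC = 40    # assumed MFCC dimensionality per frame (not from the paper)
HIDDEN = 256   # assumed LSTM layer size
T = 100        # assumed frames per training sequence

def build_lstm(n_features, hidden):
    # Sequence-to-sequence LSTM mapping one MFCC stream to another.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(None, n_features)),
        tf.keras.layers.LSTM(hidden, return_sequences=True),
        tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(n_features)),
    ])

# Placeholder arrays standing in for aligned MFCC sequences of shape
# (utterances, frames, coefficients); real data would come from a corpus.
natural = np.random.randn(64, T, N_MFCC).astype("float32")
synthetic = np.random.randn(64, T, N_MFCC).astype("float32")

# Step 1: auto-associative pre-training -- the network learns to
# reproduce natural-speech MFCCs from themselves.
pretrain = build_lstm(N_MFCC, HIDDEN)
pretrain.compile(optimizer="adam", loss="mse")
pretrain.fit(natural, natural, epochs=10, verbose=0)

# Step 2: initialize the post-filter with the pre-trained weights,
# then fine-tune it to map synthetic MFCCs to natural MFCCs.
postfilter = build_lstm(N_MFCC, HIDDEN)
postfilter.set_weights(pretrain.get_weights())
postfilter.compile(optimizer="adam", loss="mse")
postfilter.fit(synthetic, natural, epochs=10, verbose=0)

# At synthesis time, the post-filter enhances new synthetic speech:
# enhanced = postfilter.predict(new_synthetic_mfccs)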

Keywords

deep learning, LSTM, machine learning, post-filtering, signal processing, speech synthesis

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
