Gating mechanism based Natural Language Generation for spoken dialogue systems
Journal contribution
Posted on 2019-01-01, authored by Van Khanh Tran and L M Nguyen

Recurrent Neural Network (RNN) based approaches have recently shown promise in tackling Natural Language Generation (NLG) problems. This paper presents an approach that leverages gating mechanisms, incrementally introducing three additional semantic cells into a traditional RNN model: a Refinement cell to filter the sequential inputs before the RNN computation, and an Adjustment cell and an Output cell to select semantic elements and gate a feature vector during generation. The proposed gating-based generators can learn from unaligned data by jointly training both sentence planning and surface realization to generate natural language utterances. We conducted extensive experiments on four different NLG domains, and the results empirically show that the proposed methods not only achieve better performance on all the NLG domains than previous gating-based and attention-based methods, but also obtain highly competitive results compared to a hybrid generator.
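To make the gating idea concrete, below is a minimal sketch of how semantic gating cells can wrap a recurrent generator, assuming a PyTorch implementation. It is not the authors' exact architecture: the layer names, dimensions, and gating equations are illustrative assumptions. A sigmoid "refinement" gate, conditioned on a dialogue-act feature vector, filters each input embedding before the recurrent step, and a second gate modulates the hidden state before the output projection.

```python
# Illustrative sketch of input- and output-side gating for an RNN generator.
# The specific gate formulations here are assumptions, not the paper's model.
import torch
import torch.nn as nn


class GatedGenerator(nn.Module):
    def __init__(self, vocab_size, emb_dim, hid_dim, da_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Refinement gate: computed from the word embedding and the
        # dialogue-act (DA) vector, it rescales the embedding element-wise
        # before it enters the recurrent cell.
        self.refine = nn.Linear(emb_dim + da_dim, emb_dim)
        self.rnn = nn.GRUCell(emb_dim, hid_dim)
        # Output gate: lets the DA vector select which hidden features
        # participate in predicting the next token.
        self.out_gate = nn.Linear(da_dim, hid_dim)
        self.proj = nn.Linear(hid_dim, vocab_size)

    def forward(self, tokens, da_vec):
        # tokens: (batch, seq_len) token ids; da_vec: (batch, da_dim) features.
        batch, seq_len = tokens.shape
        h = tokens.new_zeros(batch, self.rnn.hidden_size, dtype=torch.float)
        logits = []
        for t in range(seq_len):
            e = self.embed(tokens[:, t])
            # Refinement: a gate in (0, 1) filters the input embedding.
            r = torch.sigmoid(self.refine(torch.cat([e, da_vec], dim=-1)))
            h = self.rnn(r * e, h)
            # Output gating before the softmax projection.
            o = torch.sigmoid(self.out_gate(da_vec))
            logits.append(self.proj(o * h))
        return torch.stack(logits, dim=1)  # (batch, seq_len, vocab_size)


if __name__ == "__main__":
    model = GatedGenerator(vocab_size=100, emb_dim=32, hid_dim=64, da_dim=16)
    tokens = torch.randint(0, 100, (2, 5))
    da_vec = torch.randn(2, 16)
    print(model(tokens, da_vec).shape)  # torch.Size([2, 5, 100])
```

Because both gates are conditioned on the dialogue-act vector rather than on the words alone, training the generator end to end jointly learns which semantic slots to express (sentence planning) and how to realize them in the output sequence (surface realization), which is the property the abstract highlights for learning from unaligned data.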