Deakin University

Gating mechanism based Natural Language Generation for spoken dialogue systems

journal contribution
Posted on 2019-01-01. Authored by Van Khanh Tran, L M Nguyen
Recurrent Neural Network (RNN) based approaches have recently shown promise in tackling Natural Language Generation (NLG) problems. This paper presents an approach that leverages gating mechanisms, incrementally introducing three additional semantic cells into a traditional RNN model: a Refinement cell to filter the sequential inputs before the RNN computation, and an Adjustment cell and an Output cell to select semantic elements and gate a feature vector during generation. The proposed gating-based generators can learn from unaligned data by jointly training both sentence planning and surface realization to produce natural language utterances. We conducted extensive experiments on four NLG domains; the results empirically show that the proposed methods not only achieved better performance than previous gating-based and attention-based methods on all domains, but also obtained highly competitive results compared to a hybrid generator.
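To make the gating idea concrete, the following is a minimal sketch of a Refinement-style cell: a sigmoid gate, computed from the current token embedding and the dialogue-act (DA) representation, filters the DA vector element-wise before it would enter the RNN. The function and parameter names (`refinement_gate`, `W_r`, `b_r`) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def refinement_gate(token_emb, da_emb, W_r, b_r):
    """Hypothetical sketch of a Refinement cell: a gate r in (0, 1),
    conditioned on the token embedding and the dialogue-act (DA)
    embedding, scales the DA vector element-wise before it is fed
    into the RNN at each step."""
    r = sigmoid(W_r @ np.concatenate([token_emb, da_emb]) + b_r)
    return r * da_emb  # gated (refined) semantic input

# Toy usage with random weights (real weights would be learned jointly
# with the rest of the generator).
rng = np.random.default_rng(0)
tok = rng.normal(size=4)          # current token embedding
da = rng.normal(size=4)           # dialogue-act embedding
W_r = rng.normal(size=(4, 8))     # gate weights over [tok; da]
b_r = np.zeros(4)                 # gate bias
refined = refinement_gate(tok, da, W_r, b_r)
```

Because the sigmoid gate lies strictly between 0 and 1, each component of the refined vector has smaller magnitude than the original DA component, i.e. the cell can only attenuate (filter) semantic information, never amplify it.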

History

Journal

Neurocomputing

Volume

325

Pagination

48-58

Publisher

Elsevier

Location

Amsterdam, The Netherlands

ISSN

0925-2312

eISSN

1872-8286

Language

eng

Publication classification

C1.1 Refereed article in a scholarly journal

Copyright notice

2018, Elsevier B.V.
