
File(s) under permanent embargo

Enhanced semantic refinement gate for RNN-based neural language generator

conference contribution
posted on 2017-01-01, 00:00 authored by Van Khanh Tran, Van-Tao Nguyen, Le-Minh Nguyen
Natural language generation (NLG) plays an important role in a spoken dialogue system. Recurrent Neural Network (RNN)-based approaches have shown promise in tackling NLG tasks. This paper presents approaches to enhancing the gating mechanism applied to RNN-based natural language generators: an attentive dialog act representation is introduced, and two gating mechanisms are proposed that semantically gate input sequences before the RNN computation. The proposed RNN-based generators can be learned from unaligned data by jointly training both sentence planning and surface realization to produce natural language responses. The models were extensively evaluated on four different NLG domains. The results show that the proposed generators achieved better performance on all NLG domains than the previous generators.
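The core idea of semantically gating input sequences can be sketched in a few lines. The snippet below is an illustrative sketch only, not the paper's actual model: the function name `semantic_gate`, the weight matrix `W_g`, and all dimensions are assumptions, and the sketch shows just one plausible form of gating (a sigmoid gate computed from a dialog-act vector, applied elementwise to a token embedding before it enters the RNN cell).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def semantic_gate(x_t, d, W_g):
    """Gate a token embedding x_t by a dialog-act representation d.

    g = sigmoid(W_g @ d) is a vector in (0, 1); the gated embedding
    g * x_t (elementwise) is what would be fed into the RNN cell.
    Names and shapes here are illustrative assumptions.
    """
    g = sigmoid(W_g @ d)
    return g * x_t

# Toy dimensions: embedding size 4, dialog-act vector size 3.
rng = np.random.default_rng(0)
W_g = rng.normal(size=(4, 3))   # assumed gate parameters
d = rng.normal(size=3)          # dialog-act representation
x_t = rng.normal(size=4)        # input token embedding

x_gated = semantic_gate(x_t, d, W_g)
```

Because the gate values lie strictly between 0 and 1, the gated embedding can only attenuate (never amplify) each component of the input, which is what lets the dialog act suppress semantically irrelevant parts of the sequence before the RNN sees it.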

History

Pagination

172-178

Location

Hue, Vietnam

Start date

2017-10-19

End date

2017-10-21

ISBN-13

978-1-5386-3576-6

Language

eng

Publication classification

E1.1 Full written paper - refereed

Copyright notice

2017, IEEE

Editor/Contributor(s)

Nguyen TT, Le AP, Tojo S, Nguyen LM, Phan XH

Title of proceedings

KSE 2017 : Proceedings of the 9th International Conference on Knowledge and Systems Engineering 2017

Event

Knowledge and Systems Engineering. Conference (9th : 2017 : Hue, Vietnam)

Publisher

Institute of Electrical and Electronics Engineers

Place of publication

Piscataway, N.J.

Series

Knowledge and Systems Engineering Conference
