Dual latent variable model for low-resource natural language generation in dialogue systems

Conference contribution

Authored by Van Khanh Tran and Le-Minh Nguyen

Abstract

Recent deep learning models have shown improved results in natural language generation (NLG) when provided with sufficient annotated data. However, a modest amount of training data may harm such models' performance. Thus, how to build a generator that can exploit as much knowledge as possible from low-resource data is a crucial issue in NLG. This paper presents a variational neural generation model to tackle the NLG problem of having a limited labeled dataset, in which we integrate variational inference into an encoder-decoder generator and introduce a novel auxiliary auto-encoding task with an effective training procedure. Experiments showed that the proposed methods not only outperform previous models when sufficient training data is available but also demonstrate a strong ability to work acceptably well when training data is scarce.
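
To make the described architecture concrete, below is a minimal PyTorch sketch of a variational encoder-decoder generator with an auxiliary auto-encoding loss, in the spirit of the abstract. All module names, dimensions, and the loss weighting are illustrative assumptions, not the authors' exact design.

# Minimal sketch (PyTorch) of a variational encoder-decoder with an
# auxiliary auto-encoding objective. Dimensions and the auxiliary head
# are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariationalGenerator(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128, latent_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Posterior network: maps the encoder summary to Gaussian parameters.
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        # The decoder is conditioned on the latent sample via its initial state.
        self.latent_to_hidden = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)
        # Auxiliary auto-encoding head: reconstruct the encoder summary from z
        # (an assumed, simplified stand-in for the paper's auxiliary objective).
        self.aux_decoder = nn.Linear(latent_dim, hidden_dim)

    def forward(self, src, tgt):
        _, h_enc = self.encoder(self.embed(src))   # h_enc: (1, batch, hidden)
        summary = h_enc.squeeze(0)
        mu, logvar = self.to_mu(summary), self.to_logvar(summary)
        # Reparameterization trick: z = mu + sigma * eps.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        h0 = torch.tanh(self.latent_to_hidden(z)).unsqueeze(0)
        dec_out, _ = self.decoder(self.embed(tgt), h0)
        logits = self.out(dec_out)
        # KL divergence between q(z|x) and the standard normal prior.
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        # Auxiliary auto-encoding loss: z must retain input information.
        aux = F.mse_loss(self.aux_decoder(z), summary.detach())
        return logits, kl, aux

# Usage: total loss = reconstruction NLL + KL + weighted auxiliary loss.
model = VariationalGenerator()
src = torch.randint(0, 1000, (4, 10))
tgt = torch.randint(0, 1000, (4, 12))
logits, kl, aux = model(src, tgt[:, :-1])
nll = F.cross_entropy(logits.reshape(-1, logits.size(-1)), tgt[:, 1:].reshape(-1))
loss = nll + kl + 0.1 * aux   # 0.1 is an illustrative weight
loss.backward()

Under these assumptions, the KL term regularizes the latent space toward the prior, while the auxiliary reconstruction encourages the latent variable to retain input information even when labeled pairs are scarce, which is the low-resource motivation of the paper.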

History

Pagination

21-30

Location

Brussels, Belgium

Start date

2018-10-31

End date

2018-11-01

ISBN-13

978-1-948087-72-8

Language

eng

Publication classification

E1.1 Full written paper - refereed

Copyright notice

2018, Association for Computational Linguistics

Editor/Contributor(s)

[Unknown]

Title of proceedings

CoNLL 2018 : Proceedings of the 22nd Conference on Computational Natural Language Learning

Event

Association for Computational Linguistics. Conference (22nd : 2018 : Brussels, Belgium)

Publisher

Association for Computational Linguistics

Place of publication

Stroudsburg, Pa.

Series

Association for Computational Linguistics Conference
