Deakin University

Faster training of very deep networks via p-norm gates

conference contribution
posted on 2016-12-08, 00:00 authored by Truyen Tran, Trang Thi Minh Pham, Quoc-Dinh Phung, Svetha Venkatesh
A major contributing factor to the recent advances in deep neural networks is the use of structural units that let sensory information and gradients propagate easily. Gating is one such structure, acting as a flow control. Gates are employed in many recent state-of-the-art recurrent models such as LSTM and GRU, and in feedforward models such as Residual Nets and Highway Networks. This enables learning in very deep networks with hundreds of layers and helps achieve record-breaking results in vision (e.g., ImageNet with Residual Nets) and NLP (e.g., machine translation with GRU). However, there is limited work analysing the role of gating in the learning process. In this paper, we propose a flexible p-norm gating scheme that allows user-controllable flow and, as a consequence, improves the learning speed. This scheme subsumes other existing gating schemes, including those in GRU, Highway Networks and Residual Nets, as special cases. Experiments on large sequence and vector datasets demonstrate that the proposed gating scheme improves the learning speed significantly without extra overhead.
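
The abstract does not give the gate equations. As a rough illustration only, the sketch below assumes a highway-style feedforward layer in which the transform gate a1 and the carry gate a2 are coupled by a p-norm constraint a1^p + a2^p = 1, so that p = 1 recovers the convex combination used in GRU and Highway Networks, while a large p lets both paths pass nearly fully, approaching Residual Nets. The class name PNormGateLayer and the specific layer choices (tanh transform, sigmoid gate, PyTorch) are illustrative assumptions, not the authors' exact formulation.

# Minimal sketch of a p-norm gated highway-style layer (assumed formulation,
# not taken from the paper): output = a1 * H(x) + a2 * x with a2 = (1 - a1^p)^(1/p).
import torch
import torch.nn as nn


class PNormGateLayer(nn.Module):
    def __init__(self, dim: int, p: float = 2.0):
        super().__init__()
        self.p = p
        self.transform = nn.Linear(dim, dim)  # candidate non-linear path H(x)
        self.gate = nn.Linear(dim, dim)       # produces the transform gate a1

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = torch.tanh(self.transform(x))
        a1 = torch.sigmoid(self.gate(x))  # transform gate in (0, 1)
        # Carry gate coupled via the assumed p-norm constraint a1^p + a2^p = 1.
        a2 = (1.0 - a1.pow(self.p)).clamp(min=0.0).pow(1.0 / self.p)
        return a1 * h + a2 * x  # p = 1: Highway/GRU-style convex mix; large p: Residual-like


# With p = 1 the two gates sum to 1; with larger p both gates can be close to 1,
# letting more of both the transformed and the raw signal through.
layer = PNormGateLayer(dim=8, p=2.0)
y = layer(torch.randn(4, 8))

Under such a constraint, raising p relaxes the trade-off between the two paths, so more signal and gradient can flow through both the transform and the identity path early in training, which is consistent with the abstract's claim of faster learning without extra overhead.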

History

Event

Pattern Recognition. International Conference (23rd : 2016 : Cancun, Mexico)

Pagination

3542-3547

Publisher

IEEE

Location

Cancun, Mexico

Place of publication

Piscataway, N.J.

Start date

2016-12-04

End date

2016-12-08

ISBN-13

9781509048472

Language

eng

Publication classification

E Conference publication; E1 Full written paper - refereed

Copyright notice

2016, IEEE

Title of proceedings

ICPR 2016: Proceedings of the 23rd International Conference on Pattern Recognition
