
Tensor-variate restricted Boltzmann machines

conference contribution
posted on 2015-01-01, 00:00 authored by Tu Dinh Nguyen, Quoc-Dinh Phung, Truyen Tran, Svetha Venkatesh
Restricted Boltzmann Machines (RBMs) are an important class of latent variable models for representing vector data. An under-explored area is multimode data, where each data point is a matrix or a tensor. Applying standard RBMs to such data requires vectorizing the matrices and tensors, which results in unnecessarily high dimensionality and, at the same time, destroys the inherent higher-order interaction structures. This paper introduces Tensor-variate Restricted Boltzmann Machines (TvRBMs), which generalize RBMs to capture the multiplicative interaction between data modes and the latent variables. TvRBMs are highly compact in that the number of free parameters grows only linearly with the number of modes. We demonstrate the capacity of TvRBMs on three real-world applications: handwritten digit classification, face recognition, and EEG-based alcoholic diagnosis. The learnt features of the model are more discriminative than those of rival methods, resulting in better classification performance.
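To make the compactness claim concrete, the sketch below (Python/NumPy) compares parameter counts for a standard RBM on vectorized input against a CP-style factorization of the visible-hidden weight tensor, which is one way to obtain growth that is linear in the number of modes. This is a hedged illustration of the general idea rather than the authors' exact parameterization; the function names, factor count, and dimensions are hypothetical.

```python
import numpy as np

def vectorized_rbm_params(mode_dims, num_hidden):
    # Standard RBM on the flattened tensor: the weight matrix has
    # prod(mode_dims) * num_hidden entries, multiplicative in the mode sizes.
    return int(np.prod(mode_dims)) * num_hidden

def factored_tvrbm_params(mode_dims, num_hidden, num_factors):
    # Illustrative CP-style factorization (not necessarily the paper's exact
    # scheme): one D_n x F factor matrix per mode plus a K x F matrix for the
    # hidden layer, so the parameter count is linear in the number of modes.
    return (sum(mode_dims) + num_hidden) * num_factors

if __name__ == "__main__":
    dims, K, F = (28, 28), 500, 100  # e.g. 28x28 images, 500 hidden units, 100 factors (hypothetical)
    print(vectorized_rbm_params(dims, K))     # 392000 weights after vectorization
    print(factored_tvrbm_params(dims, K, F))  # 55600 weights with the factorization
```

Under these illustrative settings the factored form uses roughly 7x fewer free parameters, and the gap widens as further modes (e.g. time or channels) are added.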

History

Event

AAAI Conference on Artificial Intelligence (29th : 2015 : Austin, Texas)

Volume

3

Pagination

2887-2893

Publisher

AAAI Press

Location

Austin, Tex.

Place of publication

Palo Alto, Calif.

Start date

2015-01-25

End date

2015-01-30

ISBN-13

9781577357018

Language

eng

Notes

Paper no. 2887

Publication classification

E Conference publication; E1 Full written paper - refereed

Copyright notice

2015, The Authors

Title of proceedings

AAAI 2015: The Proceedings of the 29th AAAI Conference on Artificial Intelligence