Factorial multi-task learning : a Bayesian nonparametric approach

Gupta, Sunil Kumar, Phung, Dinh and Venkatesh, Svetha 2013, Factorial multi-task learning : a Bayesian nonparametric approach, in ICML 2013 : Proceedings of the Machine Learning 2013 International Conference, International Machine Learning Society (IMLS), [Atlanta, Ga.], pp. 1694-1702.


Title Factorial multi-task learning : a Bayesian nonparametric approach
Author(s) Gupta, Sunil Kumar (ORCID: orcid.org/0000-0002-3308-1930)
Phung, Dinh (ORCID: orcid.org/0000-0002-9977-8247)
Venkatesh, Svetha (ORCID: orcid.org/0000-0001-8675-6631)
Conference name Machine Learning. International Conference (30th : 2013 : Atlanta, Ga.)
Conference location Atlanta, Ga.
Conference dates 16 - 21 Jun. 2013
Title of proceedings ICML 2013 : Proceedings of the Machine Learning 2013 International Conference
Editor(s) Dasgupta, Sanjoy
McAllester, David
Publication date 2013
Series JMLR Workshop and Conference Proceedings Vol. 28
Conference series Machine Learning International Conference
Start page 1694
End page 1702
Total pages 9
Publisher International Machine Learning Society (IMLS)
Place of publication [Atlanta, Ga.]
Keyword(s) artificial intelligence
software engineering
Dirichlet process
infinite numbers
joint learning
multitask learning
nonparametric approaches
performance degradation
task relatedness
Summary Multi-task learning is a paradigm shown to improve the performance of related tasks through their joint learning. However, for real-world data it is usually difficult to assess task relatedness, and joint learning with unrelated tasks may lead to serious performance degradation. To this end, we propose a framework that groups tasks based on their relatedness in a subspace and allows a varying degree of relatedness among tasks by sharing the subspace bases across the groups. This provides the flexibility of no sharing when two sets of tasks are unrelated, and of partial or total sharing when the tasks are related. Importantly, the number of task groups and the subspace dimensionality are automatically inferred from the data. To realize our framework, we introduce a novel Bayesian nonparametric prior that extends the traditional hierarchical beta process prior with a Dirichlet process, permitting a potentially infinite number of child beta processes. We apply our model to multi-task regression and classification problems. Experimental results on several synthetic and real datasets show the superiority of our model over other recent multi-task learning methods. Copyright 2013 by the author(s).
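The summary describes using a Dirichlet process to index a potentially infinite set of child beta processes, so the number of task groups need not be fixed in advance. As a rough, standalone illustration of that Dirichlet process building block only (not the authors' factorial model or its inference procedure), a truncated stick-breaking construction of DP mixing weights might be sketched as follows; the concentration parameter, truncation level, and task count here are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_weights(alpha, truncation, rng):
    """Truncated stick-breaking construction of Dirichlet process weights.

    Draws v_k ~ Beta(1, alpha) and sets pi_k = v_k * prod_{j<k} (1 - v_j),
    so the weights decay stochastically and only a few groups get most mass.
    """
    v = rng.beta(1.0, alpha, size=truncation)
    # Length of the stick remaining before each break: 1, (1-v_1), (1-v_1)(1-v_2), ...
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

# Hypothetical example: sample group assignments for 10 tasks from a DP
# truncated at 50 components (renormalised so the weights sum to one).
weights = stick_breaking_weights(alpha=2.0, truncation=50, rng=rng)
weights = weights / weights.sum()
task_groups = rng.choice(50, size=10, p=weights)
print(task_groups)
```

In a nonparametric model of this flavour, the truncation is an inference-time convenience: the prior itself places mass on an unbounded number of components, and only the groups actually supported by the data receive appreciable weight.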
Language eng
Field of Research 080109 Pattern Recognition and Data Mining
Socio Economic Objective 970108 Expanding Knowledge in the Information and Computing Sciences
HERDC Research category E1.1 Full written paper - refereed
HERDC collection year 2013
Copyright notice ©2013, IMLS
Persistent URL http://hdl.handle.net/10536/DRO/DU:30067638

Unless expressly stated otherwise, the copyright for items in DRO is owned by the author, with all rights reserved.

Created: Mon, 24 Nov 2014, 10:41:12 EST

Every reasonable effort has been made to ensure that permission has been obtained for items included in DRO. If you believe that your rights have been infringed by this repository, please contact drosupport@deakin.edu.au.