Deakin University

Multiple task transfer learning with small sample sizes

Version 2 2024-06-03, 17:12
Version 1 2015-08-26, 14:47
journal contribution
posted on 2024-06-03, 17:12 authored by B Saha, Sunil GuptaSunil Gupta, D Phung, Svetha VenkateshSvetha Venkatesh
Prognosis, such as predicting mortality, is common in medicine. When confronted with small numbers of samples, as in rare medical conditions, the task is challenging. We propose a framework for classification when only a small number of samples is available. Conceptually, our solution is a hybrid of multi-task and transfer learning: it employs data samples from source tasks, as in transfer learning, but considers all tasks together, as in multi-task learning. Each task is modelled jointly with other related tasks by directly augmenting its data with data from those tasks. The degree of augmentation depends on task relatedness, which is estimated directly from the data. We apply the model to three diverse real-world data sets (healthcare data, handwritten digit data and face data) and show that our method outperforms several state-of-the-art multi-task learning baselines. We also extend the model to online multi-task learning, where the model parameters are updated incrementally as new data or new tasks arrive. The novelty of our method lies in offering a hybrid multi-task/transfer learning model that exploits sharing across tasks both at the data level and through joint parameter learning.
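The central idea in the abstract — pooling a target task's small data set with source-task samples, weighted by how related each source task is — can be sketched as follows. This is a minimal illustration, not the paper's actual estimator: here relatedness is approximated by cosine similarity of per-task mean feature vectors (an assumption made for brevity; the paper estimates relatedness directly from the data), and a plain weighted logistic regression stands in for the joint parameter learning.

```python
import numpy as np

def task_relatedness(Xs, Xt):
    """Hypothetical relatedness proxy: cosine similarity of the two
    tasks' mean feature vectors (stand-in for the paper's estimate)."""
    ms, mt = Xs.mean(axis=0), Xt.mean(axis=0)
    return float(ms @ mt / (np.linalg.norm(ms) * np.linalg.norm(mt) + 1e-12))

def augment_task(task_id, tasks):
    """Pool the target task's data with all source-task samples,
    giving each source sample a weight equal to task relatedness.
    `tasks` is a list of (X, y) pairs, one per task."""
    Xt, yt = tasks[task_id]
    Xs_all, ys_all, w_all = [Xt], [yt], [np.ones(len(yt))]
    for j, (Xj, yj) in enumerate(tasks):
        if j == task_id:
            continue
        rho = max(task_relatedness(Xj, Xt), 0.0)  # ignore unrelated tasks
        if rho > 0:
            Xs_all.append(Xj)
            ys_all.append(yj)
            w_all.append(np.full(len(yj), rho))
    return np.vstack(Xs_all), np.concatenate(ys_all), np.concatenate(w_all)

def fit_weighted_logreg(X, y, w, lr=0.1, iters=500):
    """Sample-weighted logistic regression via gradient descent."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        beta -= lr * X.T @ (w * (p - y)) / w.sum()
    return beta
```

A target task with only ten samples can then borrow strength from a larger related task: `augment_task` returns the pooled data and weights, and `fit_weighted_logreg` trains a single per-task classifier on them. The online extension described in the abstract would update `beta` incrementally rather than refitting from scratch.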

History

Journal: Knowledge and Information Systems
Volume: 46
Issue: 2
Pagination: 315-342
Location: Berlin, Germany
ISSN: 0219-1377
eISSN: 0219-3116
Language: English
Publication classification: C Journal article, C1 Refereed article in a scholarly journal
Copyright notice: 2016, Springer
Publisher: Springer London Ltd