Deakin University

Meta-transfer learning for emotion recognition

Version 2 2024-06-02, 15:03
Version 1 2023-02-20, 04:01
journal contribution
posted on 2024-06-02, 15:03 authored by D Nguyen, Duc Thanh Nguyen, S Sridharan, S Denman, TT Nguyen, D Dean, C Fookes
Abstract

Deep learning has been widely adopted in automatic emotion recognition and has led to significant progress in the field. However, due to insufficient training data, pre-trained models are limited in their generalisation ability, leading to poor performance on novel test sets. To mitigate this challenge, transfer learning performed by fine-tuning pre-trained models on novel domains has been applied. However, the fine-tuned knowledge may overwrite and/or discard important knowledge learnt in pre-trained models. In this paper, we address this issue by proposing a PathNet-based meta-transfer learning method that is able to (i) transfer emotional knowledge learnt from one visual/audio emotion domain to another domain and (ii) transfer emotional knowledge learnt from multiple audio emotion domains to one another to improve overall emotion recognition accuracy. To show the robustness of our proposed method, extensive experiments on facial expression-based emotion recognition and speech emotion recognition are carried out on three benchmark data sets: SAVEE, EMODB, and eNTERFACE. Experimental results show that our proposed method achieves superior performance compared with existing transfer learning methods.
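
The abstract contrasts the proposed PathNet-based meta-transfer approach with conventional fine-tuning, where adapting a pre-trained model to a new domain can overwrite knowledge learnt on the source domain. The sketch below illustrates only that conventional fine-tuning baseline, not the paper's method; the backbone, feature size, and 7-class target (loosely modelled on the SAVEE emotion categories) are placeholders for illustration.

```python
# Minimal sketch of conventional transfer learning by fine-tuning (the baseline
# the abstract contrasts against), NOT the paper's PathNet-based method.
import torch
import torch.nn as nn

# Hypothetical backbone standing in for a network pre-trained on a source
# emotion domain; in practice its weights would be loaded from a checkpoint.
backbone = nn.Sequential(
    nn.Linear(128, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
)

# Freeze the pre-trained layers so their knowledge is not overwritten.
for p in backbone.parameters():
    p.requires_grad = False

# New classifier head for the target domain (7 emotion classes assumed).
head = nn.Linear(256, 7)
model = nn.Sequential(backbone, head)

optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Dummy target-domain batch: 32 feature vectors with 7-way emotion labels.
x = torch.randn(32, 128)
y = torch.randint(0, 7, (32,))

for _ in range(5):  # a few adaptation steps on the target domain
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```

Freezing the backbone avoids overwriting source-domain knowledge but also limits adaptation; the paper's PathNet-based meta-transfer learning is proposed precisely to transfer knowledge across emotion domains without this trade-off.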

History

Journal

Neural Computing and Applications

Pagination

10535-10549

Location

Berlin, Germany

ISSN

0941-0643

eISSN

1433-3058

Language

eng

Publication classification

C1 Refereed article in a scholarly journal

Publisher

Springer
