Multi-task mid-level feature alignment network for unsupervised cross-dataset person re-identification
Version 2 2024-06-05, 03:32
Version 1 2019-10-03, 08:39
conference contribution
posted on 2024-06-05, 03:32, authored by S Lin, H Li, Chang-Tsun Li, AC Kot

© 2018. The copyright of this document resides with its authors.

Most existing person re-identification (Re-ID) approaches follow a supervised learning framework in which a large number of labelled matching pairs are required for training. Such a setting severely limits their scalability in real-world applications where no labelled samples are available during the training phase. To overcome this limitation, we develop a novel unsupervised Multi-task Mid-level Feature Alignment (MMFA) network for the unsupervised cross-dataset person re-identification task. Under the assumption that the source and target datasets share the same set of mid-level semantic attributes, our proposed model can be jointly optimised on the person identity classification and attribute learning tasks with a cross-dataset mid-level feature alignment regularisation term. In this way, the learned feature representation generalises better from one dataset to another, which further improves person re-identification accuracy. Experimental results on four benchmark datasets demonstrate that our proposed method outperforms the state-of-the-art baselines.
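The joint objective described in the abstract — identity classification, attribute learning, and a cross-dataset mid-level feature alignment regulariser — can be illustrated with a minimal numpy sketch. The specific loss forms, the weighting factor `lam`, and the use of a linear-kernel MMD between mean mid-level features are illustrative assumptions here, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax_ce(logits, labels):
    """Identity-classification loss: softmax cross-entropy (illustrative)."""
    z = logits - logits.max(axis=1, keepdims=True)
    log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return float(-log_p[np.arange(len(labels)), labels].mean())

def attribute_bce(scores, targets):
    """Attribute-learning loss: mean binary cross-entropy over attributes."""
    p = 1.0 / (1.0 + np.exp(-scores))
    eps = 1e-12
    return float(-(targets * np.log(p + eps)
                   + (1.0 - targets) * np.log(1.0 - p + eps)).mean())

def linear_mmd(src, tgt):
    """Cross-dataset alignment term: squared distance between the mean
    mid-level features of the two domains (a linear-kernel MMD; the
    kernel choice is an assumption for this sketch)."""
    diff = src.mean(axis=0) - tgt.mean(axis=0)
    return float(diff @ diff)

def joint_loss(id_logits, id_labels, attr_scores, attr_labels,
               src_mid, tgt_mid, lam=1.0):
    """Multi-task objective: identity + attributes + lam * alignment."""
    return (softmax_ce(id_logits, id_labels)
            + attribute_bce(attr_scores, attr_labels)
            + lam * linear_mmd(src_mid, tgt_mid))

# Toy batch: 8 source images, 10 identities, 5 attributes,
# 16-dimensional mid-level features for both domains.
id_logits = rng.normal(size=(8, 10))
id_labels = rng.integers(0, 10, size=8)
attr_scores = rng.normal(size=(8, 5))
attr_labels = rng.integers(0, 2, size=(8, 5)).astype(float)
src_mid = rng.normal(size=(8, 16))
tgt_mid = rng.normal(size=(8, 16))

loss = joint_loss(id_logits, id_labels, attr_scores, attr_labels,
                  src_mid, tgt_mid)
print(f"joint loss: {loss:.3f}")
```

Only the source domain contributes labelled identity and attribute terms; the unlabelled target domain enters solely through the alignment regulariser, which is what allows unsupervised cross-dataset training.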
History
Location: Newcastle upon Tyne, England
Start date: 2018-09-03
End date: 2018-09-06
Language: eng
Publication classification: E1.1 Full written paper - refereed
Copyright notice: 2018, The Authors
Title of proceedings: BMVC 2018: Proceedings of the 29th British Machine Vision Conference
Event: BMVC 2018 (29th : 2018 : Newcastle upon Tyne, England)
Publisher: BMVC
Place of publication: [Newcastle upon Tyne, England]
Publication URL: