Deakin University
File(s) under permanent embargo

Hilbert Sinkhorn Divergence for Optimal Transport

conference contribution
posted on 2021-01-01, 00:00 authored by Q Li, Z Wang, Gang Li, J Pang, G Xu
The Sinkhorn divergence has become a very popular metric for comparing probability distributions in optimal transport. However, most works use the Sinkhorn divergence in Euclidean space, which greatly limits its applicability to complex data with nonlinear structure. There is therefore a theoretical need to equip the Sinkhorn divergence with the capability of capturing nonlinear structure. We propose a theoretical and computational framework to bridge this gap. In this paper, we extend the Sinkhorn divergence from Euclidean space to a reproducing kernel Hilbert space, which we term the "Hilbert Sinkhorn divergence" (HSD). In particular, we use kernel matrices to derive a closed-form expression of the HSD, which is shown to be a tractable convex optimization problem. We also prove several attractive statistical properties of the proposed HSD, namely strong consistency, asymptotic behavior, and sample complexity. Empirically, our method yields state-of-the-art performance on image classification and topological data analysis.
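The general idea in the abstract can be illustrated with a small sketch: replace the Euclidean ground cost with the squared RKHS distance induced by a kernel, ‖φ(x) − φ(y)‖²_H = k(x,x) + k(y,y) − 2k(x,y), then run standard entropic Sinkhorn iterations on that cost and debias. This is a minimal illustration of a kernelized Sinkhorn divergence built from standard components; all function names are hypothetical, and it is not the paper's closed-form HSD derivation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gram matrix of the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def hilbert_cost(X, Y, gamma=1.0):
    """Squared RKHS distance between feature embeddings:
    ||phi(x) - phi(y)||_H^2 = k(x,x) + k(y,y) - 2 k(x,y).
    For the RBF kernel, k(x,x) = k(y,y) = 1."""
    return 2.0 - 2.0 * rbf_kernel(X, Y, gamma)

def sinkhorn_cost(C, a, b, eps=0.1, n_iter=300):
    """Entropic OT cost <P, C> via Sinkhorn fixed-point iterations."""
    K = np.exp(-C / eps)            # Gibbs kernel of the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)           # alternate scaling updates so that
        u = a / (K @ v)             # P = diag(u) K diag(v) has marginals a, b
    P = u[:, None] * K * v[None, :]
    return float(np.sum(P * C))

def hilbert_sinkhorn_divergence(X, Y, eps=0.1, gamma=1.0):
    """Debiased Sinkhorn divergence with an RKHS-induced ground cost:
    S(X, Y) - (S(X, X) + S(Y, Y)) / 2, using uniform weights."""
    a = np.full(len(X), 1.0 / len(X))
    b = np.full(len(Y), 1.0 / len(Y))
    sxy = sinkhorn_cost(hilbert_cost(X, Y, gamma), a, b, eps)
    sxx = sinkhorn_cost(hilbert_cost(X, X, gamma), a, a, eps)
    syy = sinkhorn_cost(hilbert_cost(Y, Y, gamma), b, b, eps)
    return sxy - 0.5 * (sxx + syy)
```

The debiasing step (subtracting the self-transport terms) makes the divergence vanish when both samples coincide, which is what lets a Sinkhorn-type quantity behave like a discrepancy between distributions rather than a raw entropic cost.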

History

Pagination

3834-3843

Location

Nashville, Tenn.

Start date

2021-06-20

End date

2021-06-25

ISSN

1063-6919

ISBN-13

9781665445092

Language

English

Publication classification

E1 Full written paper - refereed

Title of proceedings

CVPR 2021: Proceedings of the IEEE/CVF Computer Vision and Pattern Recognition 2021 Conference

Event

CVPR Computer Vision and Pattern Recognition : Conference (2021 : Nashville, Tenn.)

Publisher

IEEE

Place of publication

Piscataway, N.J.

Series

IEEE Conference on Computer Vision and Pattern Recognition
