Deakin University

File(s) under embargo

Diffeomorphic Information Neural Estimation

conference contribution
posted on 2023-09-11, 03:39 authored by B. Duong, Thin Nguyen
Mutual Information (MI) and Conditional Mutual Information (CMI) are multi-purpose tools from information theory that naturally measure statistical dependencies between random variables; they are therefore of central interest in several statistical and machine learning tasks, such as conditional independence testing and representation learning. However, estimating CMI, or even MI, is notoriously challenging due to their intractable formulations. In this study, we introduce DINE (Diffeomorphic Information Neural Estimator), a novel approach for estimating the CMI of continuous random variables, inspired by the invariance of CMI under diffeomorphic maps. We show that the variables of interest can be replaced with appropriate surrogates that follow simpler distributions, allowing the CMI to be evaluated efficiently via analytical solutions. We then demonstrate the quality of the proposed estimator against state-of-the-art alternatives on three important tasks: MI estimation, CMI estimation, and the downstream application of conditional independence testing. The empirical evaluations show that DINE consistently outperforms its competitors on all tasks and adapts well to complex, high-dimensional relationships.
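The key property the abstract relies on — invariance of (C)MI under diffeomorphic maps, which lets dependence be measured on surrogates with a simpler distribution — can be illustrated with a toy sketch. This is not the DINE implementation: as an assumption for illustration, it uses a rank (Gaussian-copula) surrogate for a bivariate Gaussian pair, where MI has the closed form -0.5*log(1 - r^2), and checks that warping each marginal through a monotone diffeomorphism (exp, tanh) leaves the estimate unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
rho = 0.8

# Jointly Gaussian pair with correlation rho: MI is known analytically.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

def gaussian_mi(r):
    # Closed-form MI of a bivariate Gaussian with correlation r.
    return -0.5 * np.log(1.0 - r**2)

def ranks(a):
    # Normalized ranks in [0, 1]; invariant under any strictly
    # increasing (diffeomorphic) transform of a.
    return np.argsort(np.argsort(a)) / (len(a) - 1)

def copula_mi(a, b):
    # Rank-based surrogate estimate: Spearman correlation of the ranks,
    # converted to the Pearson correlation of the underlying Gaussian
    # copula via r = 2*sin(pi * rho_s / 6), then the analytical formula.
    rho_s = np.corrcoef(ranks(a), ranks(b))[0, 1]
    return gaussian_mi(2.0 * np.sin(np.pi * rho_s / 6.0))

mi_true = gaussian_mi(rho)                       # analytical ground truth
mi_raw = copula_mi(x, y)                         # estimate on raw data
mi_warped = copula_mi(np.exp(x), np.tanh(y))     # after diffeomorphic maps
```

Because ranks are unchanged by strictly increasing maps, `mi_warped` matches `mi_raw` exactly, mirroring the invariance that motivates replacing the original variables with simpler surrogates.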
Publication classification

E1 Full written paper - refereed

Title of proceedings

Proceedings of the 37th AAAI Conference on Artificial Intelligence, AAAI 2023
