[KDD 2020] Discovering Approximate Functional Dependencies using Smoothed Mutual Information
Aug 13, 2020
We consider the task of discovering the top-k reliable approximate functional dependencies X → Y from high-dimensional data. While naively maximizing mutual information involving high-dimensional entropies over empirical data is subject to false discoveries, correcting the empirical estimator against data sparsity can lead to efficient exact algorithms for robust dependency discovery. Previous approaches focused on correcting by subtracting expected values under different null-hypothesis models. In this paper, we consider a different correction strategy and counter data sparsity using uniform priors and smoothing techniques, which leads to an efficient and robust estimation process. In addition, we derive an admissible and tight bounding function for the smoothed estimator that allows us to efficiently solve the hard search problem for the top-k dependencies via branch-and-bound. Our experiments show that our approach is much faster than previous proposals and leads to the discovery of sparse and informative functional dependencies.
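To illustrate the general idea of countering sparsity with a uniform prior, the sketch below estimates mutual information from a contingency table with additive smoothing. This is a minimal, hypothetical sketch, not the paper's exact estimator: the function name, the pseudo-count parameter `alpha`, and the choice of Laplace-style smoothing are assumptions made for illustration.

```python
import numpy as np

def smoothed_mutual_info(x, y, alpha=0.5):
    """Plug-in estimate of I(X; Y) with a uniform-prior pseudo-count
    added to every cell of the joint contingency table, so that sparse
    data does not inflate the empirical mutual information.

    x, y:  1-D integer-coded arrays of equal length.
    alpha: pseudo-count per cell (illustrative smoothing parameter;
           the paper's estimator may be parameterized differently).
    """
    x = np.asarray(x)
    y = np.asarray(y)
    nx, ny = x.max() + 1, y.max() + 1
    # Joint counts plus a uniform pseudo-count in every cell.
    counts = np.zeros((nx, ny))
    np.add.at(counts, (x, y), 1)
    counts += alpha
    pxy = counts / counts.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    # I(X; Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) )
    return float(np.sum(pxy * np.log(pxy / (px * py))))
```

On a deterministic pair (Y a function of X) the estimate stays close to H(X), while on independent columns the smoothing pulls the estimate toward zero instead of letting sparse counts produce a spurious dependency.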