Commonly used dependence measures, such as linear correlation, the cross-correlogram, or Kendall's τ, cannot capture the complete dependence structure in data unless that structure is restricted to linear, periodic, or monotonic forms. Mutual information (MI) has been frequently utilized to capture the complete dependence structure, including nonlinear dependence. Recently, several methods have been proposed for MI estimation, such as kernel density estimators (KDEs), k-nearest neighbors (KNNs), Edgeworth approximation of differential entropy, and adaptive partitioning of the XY plane. However, outstanding gaps in the current literature have precluded effective automation of these methods, which, in turn, has limited their adoption by application communities. This study attempts to address a key gap in the literature: specifically, the evaluation of the above methods to choose the best one, particularly in terms of robustness for short and noisy data, based on comparisons with theoretical MI values, which can be computed analytically, as well as with linear correlation and Kendall's τ. Here we consider smaller data sizes, such as 50, 100, and 1000; within this study, we characterize 50 and 100 data points as very short and 1000 as short. We consider a broad class of functions, specifically linear, quadratic, periodic, and chaotic, contaminated with artificial noise at varying noise-to-signal ratios. Our results indicate that KDEs are the best choice for very short data at relatively high noise-to-signal levels, whereas KNNs perform best for very short data at relatively low noise levels as well as for short data consistently across noise levels. In addition, the optimal smoothing parameter of a Gaussian kernel appears to be the best choice for KDEs, while three nearest neighbors appear optimal for KNNs.
Thus, in situations where the approximate data sizes are known in advance and exploratory data analysis and/or domain knowledge can be used to provide a priori insights into the noise-to-signal ratios, the results in the paper point to a way forward for automating the process of MI estimation.
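The contrast between MI and linear correlation described above can be illustrated with a minimal sketch. This is not the authors' code; it assumes scikit-learn's KNN-based MI estimator (`mutual_info_regression`, a Kraskov-style estimator) as a stand-in for the KNN method, with k = 3 neighbors as the abstract reports optimal, applied to noisy quadratic data of "short" size (1000 points):

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
n = 1000                                   # "short" data in the paper's terminology
x = rng.uniform(-1.0, 1.0, n)
y = x**2 + 0.1 * rng.standard_normal(n)    # quadratic signal plus artificial noise

# KNN-based MI estimate (Kraskov-style), with k = 3 nearest neighbors
mi = mutual_info_regression(x.reshape(-1, 1), y,
                            n_neighbors=3, random_state=0)[0]

# Linear correlation, which misses the symmetric quadratic dependence
r, _ = pearsonr(x, y)

print(f"MI estimate: {mi:.3f} nats")
print(f"Pearson correlation: {r:.3f}")
```

On such data the MI estimate is clearly positive while the Pearson correlation is near zero, matching the abstract's point that linear measures cannot capture non-monotonic dependence.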
dependence measures, mutual information estimation, short data, noisy data
Nonlinear systems, Prediction theory
American Physical Society
Copyright 2007 The American Physical Society
Khan, Shiraj; Bandyopadhyay, Sharba; Ganguly, Auroop R.; Saigal, Sunil; Erickson, David J. III; Protopopescu, Vladimir; and Ostrouchov, George, "Relative performance of mutual information estimation methods for quantifying the dependence among short and noisy data" (2007). Civil and Environmental Engineering Faculty Publications. Paper 3. http://hdl.handle.net/2047/d20002073