Elements of Information Theory, Thomas M. Cover and Joy A. Thomas, 2006 (John Wiley & Sons), DOI: 10.1002/047174882X - A standard textbook that provides a foundation in information theory, covering entropy, mutual information, and divergence measures.
sklearn.metrics.mutual_info_score, scikit-learn developers, 2023 - Official documentation for calculating mutual information between discrete labels or a contingency matrix, useful for comparing dependencies in data.
scipy.stats.entropy, SciPy developers, 2023 - Official documentation describing the computation of Shannon entropy and Kullback-Leibler divergence for probability distributions.
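As a minimal sketch of how the two library functions documented above might be used together, the example below computes mutual information between two discrete label sequences with sklearn.metrics.mutual_info_score, and Shannon entropy and Kullback-Leibler divergence with scipy.stats.entropy. The sample arrays and variable names are purely illustrative and not taken from either documentation page.

```python
import numpy as np
from scipy.stats import entropy
from sklearn.metrics import mutual_info_score

# Two discrete label sequences whose dependence we want to quantify
# (illustrative data, assumed for this sketch).
x = np.array([0, 0, 1, 1, 2, 2, 0, 1])
y = np.array([0, 1, 1, 1, 2, 2, 0, 2])

# Mutual information between the two label assignments (in nats).
mi = mutual_info_score(x, y)

# Empirical distributions of x and y, then Shannon entropy of x
# and the KL divergence between the two distributions (also in nats).
p_x = np.bincount(x) / len(x)
p_y = np.bincount(y) / len(y)
h_x = entropy(p_x)
kl_xy = entropy(p_x, qk=p_y)

print(f"I(X;Y) = {mi:.3f} nats, H(X) = {h_x:.3f} nats, KL(P_x || P_y) = {kl_xy:.3f} nats")
```

Both functions return values in nats by default; scipy.stats.entropy accepts a base argument if bits are preferred, and mutual_info_score can alternatively be given a precomputed contingency table via its contingency parameter.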