Information Theory

• Entropy, Cross-Entropy, and Information Gain (see the entropy and information-gain sketch after this list)

• Definition of entropy and its significance in measuring uncertainty

• Information gain and its use in decision trees

• Cross-Entropy and Kullback-Leibler Divergence (see the cross-entropy and KL sketch after this list)

• Understanding cross-entropy loss in classification problems

• KL divergence as a measure of difference between probability distributions
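A minimal sketch of the first group of ideas, in plain Python with hypothetical toy labels (the dataset, the split, and the class names are assumptions, not from this outline). Entropy H(Y) = -Σ_i p_i log2 p_i measures the uncertainty of a label distribution, and information gain is the drop in entropy after splitting the data on a feature, which is the criterion decision-tree algorithms such as ID3 use to choose splits.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) = -sum_i p_i * log2(p_i) of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent_labels, child_splits):
    """Information gain = H(parent) - weighted average of the children's entropies."""
    n = len(parent_labels)
    remainder = sum(len(child) / n * entropy(child) for child in child_splits)
    return entropy(parent_labels) - remainder

# Hypothetical toy data: 10 labels, then one candidate binary split on some feature.
parent = ["yes"] * 5 + ["no"] * 5
left   = ["yes"] * 4                # pure subset: entropy 0
right  = ["yes"] * 1 + ["no"] * 5   # mixed subset

print(f"H(parent) = {entropy(parent):.3f} bits")                      # 1.000
print(f"IG(split) = {information_gain(parent, [left, right]):.3f}")   # ~0.610
```

The split is attractive to a tree learner because it removes about 0.61 of the parent's 1 bit of uncertainty; a tree builder would compare this value across candidate features and split on the largest.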
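A companion sketch for the second group, again with made-up numbers: p is a one-hot "true" label distribution and q a hypothetical model output. Cross-entropy H(p, q) = -Σ_i p_i log q_i is the quantity minimized as the classification loss, and KL divergence D_KL(p || q) = Σ_i p_i log(p_i / q_i) measures how far q is from p; the two are related by H(p, q) = H(p) + D_KL(p || q).

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log(q_i), in nats: the loss for true p and predicted q."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p_i * log(p_i / q_i); zero exactly when p and q agree."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical 3-class example: p is a one-hot true label, q is a model's softmax output.
p = [1.0, 0.0, 0.0]
q = [0.7, 0.2, 0.1]

print(f"H(p, q)      = {cross_entropy(p, q):.4f} nats")   # -log(0.7) ~ 0.3567
print(f"D_KL(p || q) = {kl_divergence(p, q):.4f} nats")   # equals H(p, q) here, since H(p) = 0
```

Because p is one-hot, H(p) = 0 and the two quantities coincide; minimizing the cross-entropy loss is then the same as minimizing the KL divergence from the model's predicted distribution to the true one.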
