Lesson 5 of 15
Mutual Information
Mutual information measures how much knowing one variable reduces uncertainty about the other:

I(X; Y) = H(X) - H(X | Y) = H(Y) - H(Y | X)

Equivalently:

I(X; Y) = H(X) + H(Y) - H(X, Y)
Properties
- Non-negative: I(X; Y) ≥ 0 always
- Symmetric: I(X; Y) = I(Y; X)
- Zero for independence: I(X; Y) = 0 iff X and Y are independent
- Maximum: I(X; Y) ≤ min(H(X), H(Y))
Normalized Mutual Information
Mutual information values depend on entropy magnitude, making comparisons across different datasets difficult. Normalized MI scales it to [0, 1]:

NMI(X; Y) = I(X; Y) / min(H(X), H(Y))

If H(X) = 0 or H(Y) = 0, return 0 (no uncertainty to reduce).
Example
For independent uniform variables: H(X, Y) = H(X) + H(Y), so I(X; Y) = 0 and NMI = 0.
For fully correlated variables (X = Y): I(X; Y) = H(X) = H(Y), so NMI = 1.
```python
import math

def mutual_information(joint):
    # Marginal of X: sum across each row of the joint table.
    mx = [sum(row) for row in joint]
    # Marginal of Y: sum down each column.
    n_cols = len(joint[0])
    my = [sum(joint[i][j] for i in range(len(joint))) for j in range(n_cols)]
    # Entropies in bits; skip zero-probability cells to avoid log(0).
    hx = sum(-p * math.log2(p) for p in mx if p > 0)
    hy = sum(-p * math.log2(p) for p in my if p > 0)
    hxy = sum(-p * math.log2(p) for row in joint for p in row if p > 0)
    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    return hx + hy - hxy
```
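As a sanity check, here is a sketch running the computation on the two cases from the example above. The 2×2 joint tables are illustrative choices (fair coins), not part of the lesson's test data, and the function is repeated so the snippet runs on its own:

```python
import math

def entropy(ps):
    # Shannon entropy in bits; zero-probability cells contribute nothing.
    return sum(-p * math.log2(p) for p in ps if p > 0)

def mutual_information(joint):
    mx = [sum(row) for row in joint]        # marginal of X (row sums)
    my = [sum(col) for col in zip(*joint)]  # marginal of Y (column sums)
    flat = [p for row in joint for p in row]
    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    return entropy(mx) + entropy(my) - entropy(flat)

# Independent fair coins: the joint factors into the marginals, so I(X; Y) = 0.
independent = [[0.25, 0.25],
               [0.25, 0.25]]
print(mutual_information(independent))  # 0.0

# Fully correlated fair coins (X = Y): I(X; Y) = H(X) = 1 bit.
correlated = [[0.5, 0.0],
              [0.0, 0.5]]
print(mutual_information(correlated))  # 1.0
```

Note that the diagonal table has zero cells, which is exactly why the entropy sums must skip p = 0.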
Your Task
Implement:

- mutual_information(joint) — I(X; Y) = H(X) + H(Y) - H(X, Y)
- normalized_mi(joint) — I(X; Y) / min(H(X), H(Y)); return 0 if the denominator is zero