Lesson 3 of 15
Joint Entropy
Joint entropy measures the total uncertainty in a pair of random variables taken together:

H(X, Y) = -Σₓ Σᵧ p(x, y) log₂ p(x, y)

This is just Shannon entropy applied to the joint distribution: flatten the probability matrix and sum over all cells.
Marginal Distributions
From a joint distribution p(x, y), you can recover each variable's distribution by summing out the other:

p(x) = Σᵧ p(x, y)        p(y) = Σₓ p(x, y)

In matrix notation: the marginal of X is the row sums, and the marginal of Y is the column sums.
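One way the row-sum and column-sum marginals can be computed in plain Python (a minimal sketch, using `zip(*joint)` to transpose the matrix):

```python
def marginal_x(joint):
    # p(x) = sum over y of p(x, y): sum each row
    return [sum(row) for row in joint]

def marginal_y(joint):
    # p(y) = sum over x of p(x, y): sum each column
    return [sum(col) for col in zip(*joint)]

j = [[0.125, 0.375], [0.25, 0.25]]
print(marginal_x(j))  # [0.5, 0.5]
print(marginal_y(j))  # [0.375, 0.625]
```

Both marginals of a valid joint distribution sum to 1, which is a handy sanity check.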
Key Bound
max(H(X), H(Y)) ≤ H(X, Y) ≤ H(X) + H(Y)

- The lower bound holds with equality when one variable determines the other.
- The upper bound holds with equality when X and Y are independent: H(X, Y) = H(X) + H(Y).
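Both extremes of the bound can be checked numerically. The sketch below (with an assumed helper `H` for entropy of a flat probability list) builds an independent joint as the outer product of two marginals, and a fully dependent joint where X determines Y:

```python
import math

def H(probs):
    # Shannon entropy in bits of a flat list of probabilities
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Independent case: joint is the outer product of the marginals,
# so the joint entropy hits the upper bound H(X) + H(Y).
px, py = [0.5, 0.5], [0.25, 0.75]
joint = [[a * b for b in py] for a in px]
flat = [p for row in joint for p in row]
print(H(flat), H(px) + H(py))  # equal

# Dependent case: X determines Y (mass only on the diagonal),
# so the joint entropy collapses to max(H(X), H(Y)) = 1 bit.
dep = [[0.5, 0.0], [0.0, 0.5]]
flat_dep = [p for row in dep for p in row]
print(H(flat_dep))  # 1.0
```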
Example
For a 2×2 uniform joint distribution (each cell 0.25), there are four equally likely outcomes, so H(X, Y) = log₂(4) = 2 bits:
```python
import math

def joint_entropy(joint_probs):
    result = 0.0
    for row in joint_probs:
        for p in row:
            if p > 0:
                result += -p * math.log2(p)
    return result

j = [[0.25, 0.25], [0.25, 0.25]]
print(joint_entropy(j))  # 2.0
```
Your Task
Implement:
- `joint_entropy(joint_probs)`: entropy of a 2D list (matrix of joint probabilities)
- `marginal_x(joint)`: list of row sums
- `marginal_y(joint)`: list of column sums