Lesson 8 of 15
Jensen-Shannon Divergence
The Jensen-Shannon divergence (JSD) fixes KL divergence's asymmetry by averaging the KL divergence from each distribution to their mixture:

JSD(P ‖ Q) = ½ KL(P ‖ M) + ½ KL(Q ‖ M)

where M = ½(P + Q) is the mixture distribution (element-wise average).
Key Properties
- Symmetric: JSD(P ‖ Q) = JSD(Q ‖ P)
- Bounded: 0 ≤ JSD(P ‖ Q) ≤ 1 when using log base 2
- Square root is a metric: √JSD satisfies the triangle inequality
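These properties can be verified numerically on a few sample distributions. The js_divergence helper here is a minimal sketch (log base 2), written inline so the check is self-contained:

```python
import math

def js_divergence(p, q):
    # JSD(P||Q) = 0.5*KL(P||M) + 0.5*KL(Q||M), with M the element-wise average
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    kl = lambda a, b: sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p, q, r = [0.5, 0.5], [0.25, 0.75], [0.9, 0.1]

# Symmetric: JSD(P||Q) == JSD(Q||P)
assert math.isclose(js_divergence(p, q), js_divergence(q, p))

# Bounded in [0, 1] when using log base 2
assert 0 <= js_divergence(p, q) <= 1

# The square root satisfies the triangle inequality: d(P,R) <= d(P,Q) + d(Q,R)
d = lambda a, b: math.sqrt(js_divergence(a, b))
assert d(p, r) <= d(p, q) + d(q, r)
```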
Interpretation
| JSD | Meaning |
|---|---|
| 0 | P = Q (identical distributions) |
| 1 | P and Q have disjoint supports (completely different) |
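Both endpoints of the table can be checked numerically. The snippet below is an illustrative sketch using a minimal js_divergence helper (log base 2), not the lesson's starter code:

```python
import math

def js_divergence(p, q):
    # JSD(P||Q) = 0.5*KL(P||M) + 0.5*KL(Q||M), with M the element-wise average
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    kl = lambda a, b: sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Identical distributions: JSD = 0
print(js_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0

# Disjoint supports: JSD = 1 (the maximum, in bits)
print(js_divergence([1.0, 0.0], [0.0, 1.0]))  # 1.0
```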
Jensen-Shannon Distance
The square root of JSD is a proper metric, the Jensen-Shannon distance:

JS_distance(P, Q) = √(JSD(P ‖ Q))
```python
import math

def js_divergence(p, q):
    # Mixture distribution M = (P + Q) / 2
    m = [(p[i] + q[i]) / 2 for i in range(len(p))]
    # KL(P || M) and KL(Q || M) in bits (log base 2); skip zero-probability terms
    kl_pm = sum(p[i] * math.log2(p[i] / m[i]) for i in range(len(p)) if p[i] > 0)
    kl_qm = sum(q[i] * math.log2(q[i] / m[i]) for i in range(len(q)) if q[i] > 0)
    return 0.5 * kl_pm + 0.5 * kl_qm

p = [0.5, 0.5]
q = [0.25, 0.75]
print(round(js_divergence(p, q), 4))  # 0.0488
```
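The distance function is then a one-line wrapper that takes the square root of the divergence. A minimal sketch (the js_divergence implementation is repeated so the snippet runs standalone):

```python
import math

def js_divergence(p, q):
    # Same computation as above, repeated so this snippet is self-contained
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    kl = lambda a, b: sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def js_distance(p, q):
    # Jensen-Shannon distance: square root of the divergence (a true metric)
    return math.sqrt(js_divergence(p, q))

p = [0.5, 0.5]
q = [0.25, 0.75]
print(round(js_distance(p, q), 4))  # 0.2209
```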
Your Task
Implement:
- js_divergence(p, q) — return the Jensen-Shannon divergence of p and q (log base 2)
- js_distance(p, q) — return the Jensen-Shannon distance, the square root of the divergence