Lesson 13 of 15

Channel Capacity


Shannon's channel capacity is the maximum rate at which information can be transmitted reliably over a noisy channel:

$$C = \max_{p(X)} I(X; Y) \ \text{bits/channel use}$$
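To make the definition concrete, here is a minimal sketch that computes $I(X;Y)$ from an input distribution and a channel transition matrix (the function name and matrix layout are illustrative, not from the lesson):

```python
import math

def mutual_information(p_x, channel):
    """I(X;Y) in bits, given input distribution p_x and transition
    probabilities channel[x][y] = P(Y=y | X=x)."""
    n_out = len(channel[0])
    # Output marginal: P(Y=y) = sum_x P(x) P(y|x)
    p_y = [sum(p_x[x] * channel[x][y] for x in range(len(p_x)))
           for y in range(n_out)]
    mi = 0.0
    for x, px in enumerate(p_x):
        for y in range(n_out):
            p_xy = px * channel[x][y]
            if p_xy > 0:
                mi += p_xy * math.log2(p_xy / (px * p_y[y]))
    return mi

# BSC with p = 0.1 and a uniform input
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(mutual_information([0.5, 0.5], bsc))  # ≈ 0.531, i.e. 1 - H_b(0.1)
```

Capacity is then the maximum of this quantity over all input distributions $p(X)$; for symmetric channels like the BSC below, the uniform input turns out to be optimal.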

Binary Symmetric Channel (BSC)

The simplest noisy channel flips each bit independently with probability $p$:

$$\text{BSC}(p): \begin{cases} P(Y=0 \mid X=0) = 1-p \\ P(Y=1 \mid X=0) = p \\ P(Y=0 \mid X=1) = p \\ P(Y=1 \mid X=1) = 1-p \end{cases}$$
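A BSC is easy to simulate directly from this definition. The sketch below (function name and the Monte Carlo check are illustrative assumptions) flips each bit with probability $p$ and verifies the empirical flip rate:

```python
import random

def bsc_transmit(bits, p, rng=None):
    """Pass a bit sequence through a BSC(p): each bit is flipped
    independently with probability p."""
    rng = rng or random.Random()
    return [b ^ 1 if rng.random() < p else b for b in bits]

rng = random.Random(0)
sent = [rng.randint(0, 1) for _ in range(100_000)]
received = bsc_transmit(sent, 0.1, rng)
flips = sum(s != r for s, r in zip(sent, received))
print(flips / len(sent))  # close to 0.1
```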

BSC Capacity

The maximum mutual information over BSC($p$) is achieved by the uniform input distribution:

$$C_{\text{BSC}}(p) = 1 - H_b(p)$$

where $H_b(p)$ is the binary entropy function:

$$H_b(p) = -p \log_2 p - (1-p) \log_2 (1-p)$$
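The claim that the uniform input is optimal can be checked numerically. For the BSC, $I(X;Y) = H(Y) - H(Y \mid X) = H_b(P(Y{=}1)) - H_b(p)$, so a sketch (with an assumed grid search over $q = P(X{=}1)$) looks like:

```python
import math

def binary_entropy(p):
    """H_b(p) in bits; 0 at the endpoints."""
    if p <= 0 or p >= 1:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(q, p):
    """I(X;Y) for BSC(p) with input P(X=1) = q: H(Y) - H(Y|X)."""
    p_y1 = q * (1 - p) + (1 - q) * p  # output marginal P(Y=1)
    return binary_entropy(p_y1) - binary_entropy(p)

p = 0.1
best_q = max((q / 100 for q in range(101)),
             key=lambda q: bsc_mutual_information(q, p))
print(best_q)                          # 0.5 — uniform input wins
print(bsc_mutual_information(0.5, p))  # ≈ 0.531 = 1 - H_b(0.1)
```

Because $H(Y \mid X) = H_b(p)$ is fixed by the channel, maximizing $I(X;Y)$ reduces to maximizing $H(Y)$, and $H(Y)$ peaks at 1 bit when the output (hence the input) is uniform.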

Special Cases

| Error probability | Capacity |
| --- | --- |
| $p = 0$ (noiseless) | $C = 1$ bit |
| $p = 0.5$ (pure noise) | $C = 0$ bits |
| $p = 1$ (inverted) | $C = 1$ bit (just flip the output) |
import math

def binary_entropy(p):
    """H_b(p) in bits; defined as 0.0 at and beyond the endpoints."""
    if p <= 0 or p >= 1:
        return 0.0
    return -p * math.log2(p) - (1-p) * math.log2(1-p)

def binary_channel_capacity(p_error):
    """Capacity of BSC(p_error): 1 - H_b(p_error) bits per channel use."""
    return 1 - binary_entropy(p_error)

print(binary_channel_capacity(0.0))   # 1.0
print(binary_channel_capacity(0.5))   # 0.0
print(round(binary_channel_capacity(0.1), 4))  # 0.531

Your Task

Implement:

  • binary_entropy(p) — $H_b(p)$; return 0.0 for $p \leq 0$ or $p \geq 1$
  • binary_channel_capacity(p_error) — $1 - H_b(p)$
  • bsc_capacity(p) — alias for binary_channel_capacity