Lesson 15 of 15
Solving XOR
The XOR Problem
The XOR problem was historically significant. In 1969, Minsky & Papert proved that a single-layer perceptron cannot learn XOR — the function is not linearly separable:
| x₁ | x₂ | XOR |
|---|---|---|
| 0 | 0 | 0 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 0 |
No single line can separate the 0s from the 1s. This insight contributed to the first "AI winter" — until multi-layer networks with backpropagation were rediscovered in the 1980s.
A Two-Layer Solution
A 2-layer network with sigmoid activations can solve XOR exactly. Here is one such set of weights:
Layer 1 (2→2):

```
W1 = [[20, 20], [-20, -20]]
b1 = [-10, 30]
```

Layer 2 (2→1):

```
W2 = [20, 20]
b2 = -30
```
Intuition: neuron 1 learns OR (it fires when either input is 1), neuron 2 learns NAND (it fires unless both inputs are 1). The output neuron fires only when both hidden neurons fire, computing AND(OR, NAND) = XOR.
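You can verify the intuition numerically. The sketch below (assuming NumPy is available, as it typically is in this kind of runtime) prints each hidden neuron's activation over all four inputs:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# The first-layer weights from the lesson.
W1 = np.array([[20.0, 20.0], [-20.0, -20.0]])
b1 = np.array([-10.0, 30.0])

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    hidden = sigmoid(W1 @ np.array(x, dtype=float) + b1)
    # Neuron 1 traces out OR(x1, x2); neuron 2 traces out NAND(x1, x2).
    print(x, np.round(hidden, 3))
```

Rounded to three decimals, neuron 1 prints 0, 1, 1, 1 (OR) and neuron 2 prints 1, 1, 1, 0 (NAND) across the four rows of the truth table.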
Your Task
Implement `forward_xor(inputs, W1, b1, W2, b2)` that:
- Computes `hidden[j] = sigmoid(W1[j]·inputs + b1[j])` for each hidden neuron
- Computes `output = sigmoid(W2·hidden + b2)`
- Returns the scalar output