Lesson 15 of 15

Solving XOR

The XOR Problem

The XOR problem was historically significant. In 1969, Minsky & Papert proved that a single-layer perceptron cannot learn XOR — the function is not linearly separable:

  x1   x2   XOR
   0    0    0
   0    1    1
   1    0    1
   1    1    0

No single line can separate the 0s from the 1s. This insight contributed to the first "AI winter" — until multi-layer networks with backpropagation were rediscovered in the 1980s.

A Two-Layer Solution

A 2-layer network with sigmoid activations can solve XOR exactly. Here is one such set of weights:

Layer 1 (2→2):
  W1 = [[20, 20], [-20, -20]]
  b1 = [-10, 30]

Layer 2 (2→1):
  W2 = [20, 20]
  b2 = -30

Intuition: hidden neuron 1 computes OR, hidden neuron 2 computes NAND. The output neuron ANDs them together: XOR = AND(OR, NAND). (Check neuron 1 on input (1,1): sigmoid(20 + 20 − 10) = sigmoid(30) ≈ 1, so it fires for any input with at least one 1 — that is OR, not AND.)
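You can verify these weights directly. The sketch below (standard library only) runs all four inputs through the two layers and rounds the output; the saturating ±10 to ±30 pre-activations push each sigmoid very close to 0 or 1:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# The hand-picked weights from the text
W1 = [[20, 20], [-20, -20]]  # row 0: OR neuron, row 1: NAND neuron
b1 = [-10, 30]
W2 = [20, 20]                # output neuron: AND of the two hidden units
b2 = -30

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    # Hidden layer: sigmoid of each row's dot product with the input
    h = [sigmoid(W1[j][0] * x1 + W1[j][1] * x2 + b1[j]) for j in range(2)]
    # Output layer: sigmoid of the weighted sum of hidden activations
    out = sigmoid(W2[0] * h[0] + W2[1] * h[1] + b2)
    print(x1, x2, round(out))  # prints 0, 1, 1, 0 — the XOR truth table
```

Every pre-activation has magnitude at least 10, and sigmoid(10) ≈ 0.99995, so the rounded outputs match XOR exactly.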

Your Task

Implement forward_xor(inputs, W1, b1, W2, b2) that:

  1. Computes hidden[j] = sigmoid(W1[j]·inputs + b1[j]) for each hidden neuron
  2. Computes output = sigmoid(W2·hidden + b2)
  3. Returns the scalar output
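If you get stuck, one possible reference implementation of the three steps above looks like this (written generically so it works for any layer sizes, not just 2→2→1):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward_xor(inputs, W1, b1, W2, b2):
    # Step 1: hidden[j] = sigmoid(W1[j] . inputs + b1[j]) for each hidden neuron
    hidden = [
        sigmoid(sum(w * x for w, x in zip(W1[j], inputs)) + b1[j])
        for j in range(len(W1))
    ]
    # Step 2: output = sigmoid(W2 . hidden + b2)
    # Step 3: return the scalar output
    return sigmoid(sum(w * h for w, h in zip(W2, hidden)) + b2)
```

With the weights given earlier, `round(forward_xor([x1, x2], W1, b1, W2, b2))` reproduces the XOR column of the truth table.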