Lesson 13 of 15

Neural Network Forward Pass

A neural network layer takes an input vector $\mathbf{x}$ and applies a weight matrix $W$ and bias vector $\mathbf{b}$:

$$\mathbf{z} = W\mathbf{x} + \mathbf{b}$$

Each row $W_i$ contains the weights for one neuron:

$$z_i = W_i \cdot \mathbf{x} + b_i = \sum_j W_{ij} x_j + b_i$$
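As a sketch, this affine step can be computed with plain Python lists, one output per row of the weight matrix (the helper name `affine` is illustrative, not part of the lesson's required API):

```python
def affine(x, W, b):
    # z_i = sum_j W[i][j] * x[j] + b[i], one value per neuron (row of W)
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

# 2 neurons, 3 inputs
x = [1.0, 2.0, 3.0]
W = [[1.0, 0.0, -1.0],
     [0.5, 0.5, 0.5]]
b = [0.0, 1.0]
print(affine(x, W, b))  # [1 - 3 + 0, 0.5*(1+2+3) + 1] = [-2.0, 4.0]
```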

Then an activation function $\sigma$ (sigmoid or another non-linearity; not to be confused with the $\sigma$ used for standard deviation in statistics) is applied element-wise:

a=σ(z)\mathbf{a} = \sigma(\mathbf{z})

Two-Layer Network

A network with two layers computes:

$$\mathbf{a}^{(1)} = \text{ReLU}(W^{(1)}\mathbf{x} + \mathbf{b}^{(1)})$$

$$\hat{y} = W^{(2)}\mathbf{a}^{(1)} + b^{(2)}$$
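Chaining the two layers is just feeding one layer's activations into the next. A minimal sketch with list-based helpers (`dense` and the example weights are illustrative, not the lesson's required API):

```python
def relu(z):
    # ReLU(z) = max(0, z) for a single number
    return max(0.0, z)

def dense(x, W, b):
    # z_i = W_i . x + b_i for each row W_i of the weight matrix
    return [sum(w * xj for w, xj in zip(row, x)) + bi
            for row, bi in zip(W, b)]

# layer 1: 3 inputs -> 2 hidden units, ReLU activation
W1 = [[1.0, -1.0, 0.0],
      [0.0, 1.0, 1.0]]
b1 = [0.0, -1.0]
# layer 2: 2 hidden units -> 1 linear output
W2 = [[1.0, 2.0]]
b2 = [0.5]

x = [2.0, 1.0, 0.0]
a1 = [relu(z) for z in dense(x, W1, b1)]  # a^(1) = ReLU(W1 x + b1) = [1.0, 0.0]
y_hat = dense(a1, W2, b2)[0]              # y_hat = W2 a1 + b2 = 1.5
print(y_hat)
```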

Your Task

Implement:

  • layer_output(x, W, b) → list: $[W_i \cdot \mathbf{x} + b_i]$ for each neuron row $W_i$
  • relu_layer(x, W, b) → apply ReLU to each element of layer_output(x, W, b)