Lesson 13 of 15
Neural Network Forward Pass
A neural network layer takes an input vector $\mathbf{x}$ and applies a weight matrix $W$ and bias vector $\mathbf{b}$:

$$\mathbf{z} = W\mathbf{x} + \mathbf{b}$$
Each row $W_i$ contains the weights for one neuron:

$$z_i = \sum_j W_{ij}\, x_j + b_i$$
Then an activation function $\sigma$ (the sigmoid or another non-linearity; not to be confused with the $\sigma$ used for standard deviation in statistics) is applied element-wise:

$$\mathbf{a} = \sigma(\mathbf{z})$$
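The linear step followed by the element-wise activation can be sketched in plain Python. The weights, biases, and input below are made-up values for illustration, not part of the lesson:

```python
import math

def sigmoid(z):
    # element-wise on a single number: 1 / (1 + e^(-z))
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical 3-neuron layer with 2 inputs
W = [[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]]
b = [0.0, 0.1, -0.1]
x = [1.0, 2.0]

# linear step: z_i = sum_j W_ij * x_j + b_i for each neuron row
z = [sum(W[i][j] * x[j] for j in range(len(x))) + b[i] for i in range(len(W))]

# activation applied element-wise to the vector z
a = [sigmoid(zi) for zi in z]
```

Each neuron contributes one entry of `z`, so a layer with three rows in `W` produces a three-element output.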
Two-Layer Network
A network with two layers computes:

$$\mathbf{h} = \sigma(W_1\mathbf{x} + \mathbf{b}_1), \qquad \mathbf{y} = \sigma(W_2\mathbf{h} + \mathbf{b}_2)$$
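Chaining two layers just feeds the first layer's output into the second. A minimal sketch, assuming made-up weights (2 inputs, 3 hidden units, 1 output):

```python
import math

def dense(x, W, b):
    # linear step of one layer: z_i = sum_j W_ij * x_j + b_i
    return [sum(wij * xj for wij, xj in zip(row, x)) + bi
            for row, bi in zip(W, b)]

def sigmoid_vec(z):
    # element-wise sigmoid activation over a vector
    return [1.0 / (1.0 + math.exp(-zi)) for zi in z]

# hypothetical weights: 2 inputs -> 3 hidden units -> 1 output
W1 = [[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]]
b1 = [0.0, 0.1, -0.1]
W2 = [[1.0, -1.0, 0.5]]
b2 = [0.2]

x = [1.0, 2.0]
h = sigmoid_vec(dense(x, W1, b1))   # first layer's output...
y = sigmoid_vec(dense(h, W2, b2))   # ...is the second layer's input
```

Note that the column count of `W2` must match the length of `h`, since the second layer consumes the first layer's output.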
Your Task
Implement:
- `layer_output(x, W, b)` → list: computes $z_i = \sum_j W_{ij}\, x_j + b_i$ for each neuron row $W_i$
- `relu_layer(x, W, b)` → apply ReLU to each element of `layer_output(x, W, b)`
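One possible shape for these functions in plain Python (a sketch under the stated signatures; the internal variable names are my own):

```python
def layer_output(x, W, b):
    # z_i = sum_j W_ij * x_j + b_i, computed once per neuron row W_i
    return [sum(wij * xj for wij, xj in zip(row, x)) + bi
            for row, bi in zip(W, b)]

def relu_layer(x, W, b):
    # ReLU(z) = max(0, z), applied element-wise to the layer's output
    return [max(0.0, z) for z in layer_output(x, W, b)]
```

For example, with the identity weight matrix `[[1.0, 0.0], [0.0, 1.0]]` and bias `[1.0, 1.0]`, `layer_output([1.0, 2.0], ...)` simply shifts each input up by one.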