Lesson 9 of 15

Network Forward Pass

Composing Layers

A neural network is a sequence of layers. The forward pass is just function composition:

$$\mathbf{a}^{(0)} = \mathbf{x}$$

$$\mathbf{a}^{(l)} = f^{(l)}(\mathbf{a}^{(l-1)}) \quad \text{for } l = 1, 2, \ldots, L$$

where each $f^{(l)}$ is either a DenseLayer or an ActivationLayer.
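The composition above can be sketched as a simple loop. This is a minimal illustration, not the lesson's API: the layers here are plain functions, and the weight matrix is a random placeholder.

```python
import numpy as np

# A "layer" here is just a function from one activation vector to the next.
# The dense weights below are illustrative placeholders.
def dense(W, b):
    return lambda a: W @ a + b

def relu(a):
    return np.maximum(0.0, a)

rng = np.random.default_rng(0)
layers = [dense(rng.normal(size=(4, 3)), np.zeros(4)), relu]

a = np.array([1.0, -2.0, 0.5])  # a^(0) = x
for f in layers:                 # a^(l) = f^(l)(a^(l-1))
    a = f(a)
print(a.shape)  # (4,)
```

Each iteration replaces the current activation with the next layer's output, which is exactly the recurrence in the equation above.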

Activation Layers

An ActivationLayer simply applies a scalar function element-wise to its input:

$$\mathbf{a} = [f(z_1), f(z_2), \ldots, f(z_m)]$$
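Element-wise application means the scalar function sees one component at a time, so the output has the same shape as the input. A quick sketch using the sigmoid as the scalar function (an illustrative choice, not required by the lesson):

```python
import numpy as np

# Sigmoid applied element-wise: each z_i is mapped independently.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.array([-1.0, 0.0, 1.0])
a = sigmoid(z)  # roughly [0.269, 0.5, 0.731], same shape as z
```

NumPy ufuncs like `np.exp` already broadcast over arrays, so the element-wise behavior comes for free.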

Network Architecture

A typical two-hidden-layer network looks like:

Input → Dense(n, 64) → ReLU → Dense(64, 32) → ReLU → Dense(32, 1) → Sigmoid

Each Dense layer mixes features. Each Activation layer introduces non-linearity.
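To see how the dimensions flow through this architecture, here is a sketch that models each Dense layer as a random weight matrix. The names below are local to this example and deliberately avoid the lesson's `DenseLayer`/`ActivationLayer` classes, which you implement next.

```python
import numpy as np

rng = np.random.default_rng(42)

# Dense(n_in, n_out) modeled as an affine map with placeholder weights.
def dense(n_in, n_out):
    W = rng.normal(size=(n_out, n_in)) * 0.1
    b = np.zeros(n_out)
    return lambda a: W @ a + b

relu = lambda a: np.maximum(0.0, a)
sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))

n = 10  # number of input features (assumed for this sketch)
net = [dense(n, 64), relu, dense(64, 32), relu, dense(32, 1), sigmoid]

a = rng.normal(size=n)
for f in net:
    a = f(a)
print(a.shape)  # (1,): a single value squashed into (0, 1) by the sigmoid
```

The shapes shrink from `n` to 64 to 32 to 1, and the final sigmoid maps the scalar output into (0, 1), as expected for a binary-classification head.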

Your Task

Implement:

  • ActivationLayer(activation) with a forward(inputs) method that applies activation element-wise
  • Network(layers) with a forward(inputs) method that passes data through each layer in sequence