πŸ”„ Backpropagation Visualizer

Watch data flow forward, then gradients flow backward

β‘  Select an Input (XOR Problem)

[0, 0] β†’ Expected: 0
[0, 1] β†’ Expected: 1
[1, 0] β†’ Expected: 1
[1, 1] β†’ Expected: 0
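If you want to follow along in code, the four cases above make a tiny dataset. A minimal Python sketch (the name XOR_DATA is ours, not the visualizer's):

```python
# XOR truth table: the output is 1 exactly when the two inputs differ.
# Each entry pairs an input vector with its expected output.
XOR_DATA = [
    ([0, 0], 0),
    ([0, 1], 1),
    ([1, 0], 1),
    ([1, 1], 0),
]
```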

β‘‘ Current State

Network Output: ?
Expected: 0
Loss (ErrorΒ²): ?
Training Steps: 0
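The Loss (ErrorΒ²) field is just the squared difference between the network's output and the expected value. A one-line sketch:

```python
def squared_error(output: float, expected: float) -> float:
    # Loss (Error^2): near zero when the prediction is close to the target,
    # and it grows quadratically as the prediction drifts away.
    return (output - expected) ** 2
```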

β‘’ Step Through the Process

1: Forward β†’ 2: Calculate Loss β†’ 3: Backward β†’ 4: Update Weights
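Those four buttons map onto one training iteration. Here's a schematic sketch of that shape; the callables passed in are stand-ins for whatever network you plug in, not the visualizer's internals:

```python
def train_step(forward, loss_fn, backward, update, x, expected):
    output = forward(x)                    # 1: Forward
    loss = loss_fn(output, expected)       # 2: Calculate Loss
    grads = backward(x, output, expected)  # 3: Backward
    update(grads)                          # 4: Update Weights
    return loss
```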

β‘£ The Network

Forward pass (data flows right)
Backward pass (gradients flow left)
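For the curious, here is what the two passes look like for a tiny 2-2-1 sigmoid network like the one drawn here. This is a self-contained sketch under our own naming, not the visualizer's source:

```python
import math
import random

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

# A minimal 2-2-1 network (two inputs, two hidden units, one output).
# w1[j][i] connects input i to hidden unit j; w2[j] connects hidden unit j to the output.
random.seed(0)
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b1 = [0.0, 0.0]
w2 = [random.uniform(-1, 1) for _ in range(2)]
b2 = 0.0

def forward(x):
    # Forward pass: data flows right, one layer at a time.
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
    y = sigmoid(w2[0] * h[0] + w2[1] * h[1] + b2)
    return h, y

def backward(x, h, y, expected):
    # Backward pass: gradients flow left via the chain rule.
    # For loss (y - expected)^2 and sigmoid'(z) = y * (1 - y):
    delta_out = 2 * (y - expected) * y * (1 - y)
    grad_w2 = [delta_out * h[j] for j in range(2)]
    grad_b2 = delta_out
    # Each hidden unit's error term reuses delta_out from the layer to its right.
    delta_h = [delta_out * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
    grad_w1 = [[delta_h[j] * x[i] for i in range(2)] for j in range(2)]
    grad_b1 = [delta_h[j] for j in range(2)]
    return grad_w1, grad_b1, grad_w2, grad_b2
```

Note how delta_h is built from delta_out: each layer's gradient reuses the one already computed to its right, which is exactly why the gradients "flow left".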

β‘€ What's Happening

Click "Forward Pass" to start
We'll walk through how the network processes an input, calculates the error, and then figures out how to adjust each weight to reduce that error.
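Continuing the sketches above (XOR_DATA, squared_error, forward, backward), a complete training loop just strings the four steps together. With this seed and learning rate the two-hidden-unit network usually learns XOR, though a net this small can occasionally get stuck; reseeding or adding a hidden unit helps:

```python
def train(epochs: int = 5000, lr: float = 0.5):
    global b2  # b2 is a plain float, so reassigning it needs `global`
    for _ in range(epochs):
        for x, expected in XOR_DATA:
            h, y = forward(x)                                 # 1: Forward
            loss = squared_error(y, expected)                 # 2: Calculate Loss
            gw1, gb1, gw2, gb2 = backward(x, h, y, expected)  # 3: Backward
            for j in range(2):                                # 4: Update Weights
                for i in range(2):
                    w1[j][i] -= lr * gw1[j][i]
                b1[j] -= lr * gb1[j]
                w2[j] -= lr * gw2[j]
            b2 -= lr * gb2

train()
for x, expected in XOR_DATA:
    print(x, "expected", expected, "got", round(forward(x)[1], 3))
```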

β‘₯ Gradients (How Much to Adjust Each Weight)

Positive gradient β†’ increasing this weight would increase the loss β†’ decrease it
Negative gradient β†’ increasing this weight would decrease the loss β†’ increase it
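Both rules are one formula: gradient descent subtracts the gradient (scaled by a learning rate), which automatically moves each weight in the loss-reducing direction. A sketch, with an illustrative learning rate:

```python
LEARNING_RATE = 0.5  # illustrative value, not the visualizer's setting

def update_weight(weight: float, gradient: float) -> float:
    # Subtracting a positive gradient decreases the weight;
    # subtracting a negative gradient increases it.
    return weight - LEARNING_RATE * gradient
```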