Neural Network Backpropagation Visualizer

An interactive visualization tool that demonstrates backpropagation in a simple 2-layer neural network. It helps you understand how neural networks learn by showing the forward and backward passes of computation, along with real-time parameter updates.

Features

  • Interactive visualization of a 2-layer neural network
  • Real-time parameter adjustment through sliders
  • Live computation of forward and backward passes
  • Visualization of gradients and activations
  • Adjustable learning rate
  • Parameter reset functionality
  • Step-by-step training capability

Architecture

The neural network consists of the following (see the code sketch after this list):

  • Input layer (1 neuron)
  • Hidden layer (2 neurons)
  • Output layer (1 neuron)
  • Sigmoid activation function
  • Mean Squared Error (MSE) loss function
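
The sketch below shows this 1-2-1 architecture in plain NumPy. It is only an illustration of the structure described above, assuming the sigmoid is applied at both the hidden and output layers; the parameter names (w1, b1, w2, b2) and the 1/2 factor in the loss are assumptions and may not match backprop.py exactly.

import numpy as np

def sigmoid(z):
    # Sigmoid activation: maps any real-valued input into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative parameters (names are assumptions, not taken from backprop.py)
rng = np.random.default_rng()
w1 = rng.standard_normal(2)   # input -> hidden weights (1 input, 2 hidden neurons)
b1 = rng.standard_normal(2)   # hidden biases
w2 = rng.standard_normal(2)   # hidden -> output weights
b2 = rng.standard_normal()    # output bias

def forward(x):
    # Forward pass: input -> hidden layer (sigmoid) -> output neuron (sigmoid)
    h = sigmoid(w1 * x + b1)             # two hidden activations
    y_hat = sigmoid(np.dot(w2, h) + b2)  # scalar prediction
    return h, y_hat

def mse_loss(y_hat, y):
    # Squared-error loss for a single sample; the 1/2 factor simplifies the gradient
    return 0.5 * (y_hat - y) ** 2

Here x would correspond to the Input x value and y to the Target y value described under Controls.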

Requirements

  • Python 3.x
  • NumPy
  • Tkinter (usually comes with Python)

Installation

  1. Clone this repository:
     git clone https://github.com/yourusername/backpropagation.git
     cd backpropagation
  2. Install the required dependencies:
     pip install numpy

Usage

Run the visualization tool:

python backprop.py

Controls

  • Input x: Set the input value (default: 0.5)
  • Target y: Set the target output value (default: 0.8)
  • LR (α): Set the learning rate for training
  • Reset Params: Randomize all network parameters
  • Train (1 Step): Perform one step of gradient descent

Interactive Features

  • Adjust network parameters (weights and biases) using sliders
  • View real-time updates of:
    • Network activations
    • Parameter values
    • Gradients
    • Loss value
    • Predicted output

How It Works

  1. Forward Pass (see the code sketch after this list):

    • The input is multiplied by the weights and the biases are added
    • The sigmoid activation is applied
    • The final output is computed
    • The loss is calculated against the target
  2. Backward Pass:

    • Gradients are computed for all parameters
    • Parameters are updated using gradient descent
    • Updates are reflected in the visualization
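
To make the two passes concrete, here is a hedged sketch of one training step, reusing the illustrative parameter names from the Architecture sketch above and the same assumptions (sigmoid at both layers, 1/2-scaled MSE). The train_step function and grad_* names are for illustration only and are not necessarily how backprop.py organizes its code.

def train_step(x, y, lr=0.1):
    # One gradient-descent step: forward pass, backward pass, parameter update
    global w1, b1, w2, b2

    # Forward pass
    h = sigmoid(w1 * x + b1)             # hidden activations
    y_hat = sigmoid(np.dot(w2, h) + b2)  # prediction
    loss = 0.5 * (y_hat - y) ** 2        # loss against the target

    # Backward pass: chain rule through the loss and both sigmoid layers
    d_z2 = (y_hat - y) * y_hat * (1.0 - y_hat)  # dL/dz2, using sigmoid'(z) = s(z) * (1 - s(z))
    grad_w2 = d_z2 * h                          # dL/dw2
    grad_b2 = d_z2                              # dL/db2
    d_z1 = d_z2 * w2 * h * (1.0 - h)            # dL/dz1, backpropagated through w2
    grad_w1 = d_z1 * x                          # dL/dw1
    grad_b1 = d_z1                              # dL/db1

    # Gradient-descent update with learning rate lr (the LR (α) control)
    w1 -= lr * grad_w1
    b1 -= lr * grad_b1
    w2 -= lr * grad_w2
    b2 -= lr * grad_b2
    return y_hat, loss

Under these assumptions, calling train_step(0.5, 0.8) mirrors what pressing Train (1 Step) does with the default input and target.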

Contributing

Feel free to submit issues and enhancement requests!

License

This project is open source and available under the MIT License.
