
Neural-Network-backpropagation

Implementation of a neural network from scratch, with Sigmoid, tanh, and ReLU activation functions.

Coded a neural network (NN) with two hidden layers in addition to the input and output layers. Implemented the Sigmoid, tanh, and ReLU activation functions, and the backpropagation algorithm for training the network.
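A minimal sketch of the architecture described above: a fully connected network with two hidden layers, a selectable sigmoid/tanh/ReLU hidden activation, and plain gradient-descent backpropagation. The layer sizes, learning rate, and the XOR toy data are illustrative assumptions, not taken from the repository's code.

```python
import numpy as np

# Activations and their derivatives, each derivative expressed in terms of
# the activation value `a` (so backprop can reuse the forward pass outputs).
def sigmoid(x):   return 1.0 / (1.0 + np.exp(-x))
def d_sigmoid(a): return a * (1.0 - a)
def tanh(x):      return np.tanh(x)
def d_tanh(a):    return 1.0 - a ** 2
def relu(x):      return np.maximum(0.0, x)
def d_relu(a):    return (a > 0).astype(a.dtype)

ACTIVATIONS = {"sigmoid": (sigmoid, d_sigmoid),
               "tanh": (tanh, d_tanh),
               "relu": (relu, d_relu)}

class TwoHiddenLayerNN:
    def __init__(self, n_in, n_h1, n_h2, n_out, activation="sigmoid", seed=0):
        rng = np.random.default_rng(seed)
        self.act, self.d_act = ACTIVATIONS[activation]
        # One weight matrix and bias vector per layer, small random init.
        self.W = [rng.normal(0, 0.5, (n_in, n_h1)),
                  rng.normal(0, 0.5, (n_h1, n_h2)),
                  rng.normal(0, 0.5, (n_h2, n_out))]
        self.b = [np.zeros(n_h1), np.zeros(n_h2), np.zeros(n_out)]

    def forward(self, X):
        # Returns all layer activations: a[0] is the input, a[3] the output.
        # Hidden layers use the chosen activation; the output layer uses
        # sigmoid so predictions land in (0, 1) for classification.
        a = [X]
        for i, (W, b) in enumerate(zip(self.W, self.b)):
            z = a[-1] @ W + b
            a.append(sigmoid(z) if i == 2 else self.act(z))
        return a

    def train(self, X, y, lr=0.5, epochs=5000):
        for _ in range(epochs):
            a = self.forward(X)
            # Output-layer delta for squared error with a sigmoid output.
            delta = (a[-1] - y) * d_sigmoid(a[-1])
            for i in (2, 1, 0):
                grad_W = a[i].T @ delta / len(X)
                grad_b = delta.mean(axis=0)
                if i > 0:
                    # Backpropagate delta through W[i] before updating it.
                    delta = (delta @ self.W[i].T) * self.d_act(a[i])
                self.W[i] -= lr * grad_W
                self.b[i] -= lr * grad_b

    def predict(self, X):
        return (self.forward(X)[-1] > 0.5).astype(int)

# XOR as a quick sanity check of the training loop.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
net = TwoHiddenLayerNN(2, 8, 8, 1, activation="tanh", seed=1)
net.train(X, y)
print(net.predict(X).ravel())
```

Because the derivatives are written in terms of the stored activations, the backward pass needs no recomputation of pre-activation values, which keeps the training loop short.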

The network was then used to make predictions on the three datasets below:

  1. Car Evaluation Dataset: https://archive.ics.uci.edu/ml/datasets/Car+Evaluation
  2. Iris Dataset: https://archive.ics.uci.edu/ml/datasets/Iris
  3. Adult Census Income Dataset: https://archive.ics.uci.edu/ml/datasets/Census+Income
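Datasets such as Car Evaluation consist entirely of categorical attributes, which must be converted to numeric features before they can be fed to the network. A minimal one-hot encoding sketch; the sample rows mimic the UCI attribute values but are illustrative only, and the repository may preprocess differently:

```python
import numpy as np

# Illustrative rows in the Car Evaluation attribute format:
# buying, maint, doors, persons, lug_boot, safety.
rows = [
    ("vhigh", "vhigh", "2", "2", "small", "low"),
    ("high",  "med",   "3", "4", "med",   "high"),
    ("low",   "low",   "5more", "more", "big", "med"),
]

def one_hot_encode(rows):
    # Build a sorted vocabulary per column, then emit one 0/1 indicator
    # feature per (column, value) pair.
    n_cols = len(rows[0])
    vocab = [sorted({r[c] for r in rows}) for c in range(n_cols)]
    features = []
    for r in rows:
        vec = []
        for c, value in enumerate(r):
            vec.extend(1.0 if v == value else 0.0 for v in vocab[c])
        features.append(vec)
    return np.array(features)

X = one_hot_encode(rows)
print(X.shape)  # 3 rows, 6 columns x 3 observed values each -> (3, 18)
```

Each row ends up with exactly one 1 per original column, so the encoded matrix can be passed directly to the network's input layer.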