Machine Learning Part 9: An Introduction to Neural Networks


Neural Networks are the backbone of modern Artificial Intelligence, powering everything from face recognition to self-driving cars. In this post, we’ll explore the basic building block of a neural network: the Perceptron.

The Building Block: The Perceptron 🕸

The perceptron is a simplified mathematical model of a biological neuron. It takes multiple inputs, processes them, and produces a single output.

How it Works:

  1. Inputs: Numerical values from your data.
  2. Weights: Every input has a weight that determines its importance.
  3. Summation: The perceptron multiplies each input by its weight and adds them up.
  4. Activation Function: The sum is passed through a function that decides whether the neuron “fires” (outputs a value).

Example Calculation:

  • Inputs: $x_1 = 5, x_2 = 6$
  • Weights: $w_1 = 2, w_2 = 1$
  • Sum = $(5 \cdot 2) + (6 \cdot 1) = 16$

If our Activation Function says: “If sum > 10, output 1, else 0,” then our perceptron outputs 1.
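The whole calculation fits in a few lines of Python. A minimal sketch, using the step activation from the example (fire when the sum exceeds 10):

```python
def perceptron(inputs, weights, threshold=10):
    """Weighted sum of inputs, passed through a step activation."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# The example above: (5 * 2) + (6 * 1) = 16, and 16 > 10, so the neuron fires.
print(perceptron([5, 6], [2, 1]))  # 1
```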


Adding the Bias 🎋

If all inputs are 0, the sum will always be 0, no matter what the weights are. To prevent this, we add a Bias input (usually a constant 1 with its own weight). This allows the neuron to “fire” even when all other inputs are zero.
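Extending the sketch with a bias is a one-line change: treat the bias as an extra input fixed at 1, multiplied by its own weight.

```python
def perceptron_with_bias(inputs, weights, bias_weight, threshold=0):
    """Weighted sum plus a bias term (a constant input of 1 with its own weight)."""
    total = sum(x * w for x, w in zip(inputs, weights)) + 1 * bias_weight
    return 1 if total > threshold else 0

# Even with all-zero inputs, the bias alone can push the sum over the threshold.
print(perceptron_with_bias([0, 0], [2, 1], bias_weight=3))  # 1
```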

Training the Network 🎋

A Neural Network is just a large collection of these perceptrons organized in layers. When we “train” a network, we are actually adjusting the weights and biases.

  1. Guess: The network makes a prediction using random weights.
  2. Calculate Error: We compare the guess to the actual answer ($\text{Error} = \text{Actual} - \text{Guess}$).
  3. Adjust: We update the weights based on the error. $$\text{New Weight} = \text{Weight} + (\text{Error} \cdot \text{Input} \cdot \text{Learning Rate})$$

The Learning Rate controls how much we change the weight in each step. Too high, and the model becomes unstable. Too low, and it takes forever to learn.
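The guess/error/adjust loop can be sketched for a single perceptron. This is a toy from-scratch example, not a production implementation; the OR data set, epoch count, and learning rate of 0.1 are illustrative choices:

```python
import random

def train(data, epochs=50, lr=0.1):
    """Perceptron learning rule: new_w = w + error * input * learning_rate."""
    random.seed(0)
    # One weight per input, plus a bias weight (its input is fixed at 1).
    weights = [random.uniform(-1, 1) for _ in range(3)]
    for _ in range(epochs):
        for inputs, actual in data:
            xs = list(inputs) + [1]  # append the constant bias input
            guess = predict(xs, weights)          # 1. guess
            error = actual - guess                # 2. calculate error
            weights = [w + error * x * lr         # 3. adjust
                       for w, x in zip(weights, xs)]
    return weights

def predict(xs, weights):
    return 1 if sum(x * w for x, w in zip(xs, weights)) > 0 else 0

# Toy data set: the logical OR function (linearly separable, so this converges).
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
weights = train(data)
for inputs, actual in data:
    print(inputs, predict(list(inputs) + [1], weights), actual)
```

After training, the learned weights classify all four OR cases correctly; try raising `lr` to something like 10 to see the instability mentioned above.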

Summary

Neural Networks learn by trial and error. By processing thousands of examples and constantly adjusting their weights, they can eventually “recognize” complex patterns that traditional algorithms can’t.

Exercise

Try to find a “Neural Network from scratch” tutorial in Python. You’ll be surprised at how simple the math actually is for a single neuron!

Written by

Abdur-Rahmaan Janhangeer

