NumPy Tutorial
Creating NumPy Arrays
NumPy Array Manipulation
Matrices in NumPy
Operations on NumPy Arrays
Reshaping NumPy Arrays
Indexing NumPy Arrays
Arithmetic Operations on NumPy Arrays
Linear Algebra in NumPy
NumPy and Random Data
Sorting and Searching in NumPy Arrays
Universal Functions
Working With Images
Projects and Applications with NumPy
Creating a neural network from scratch provides a deep understanding of its workings. Here, we'll implement a simple feedforward neural network using just NumPy.
First, ensure you have NumPy installed:
pip install numpy
Then, import NumPy:
import numpy as np
We'll use the sigmoid function as our activation function, since it is simple and its derivative is cheap to compute:
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Note: expects x to already be a sigmoid output a = sigmoid(z),
    # since d/dz sigmoid(z) = a * (1 - a)
    return x * (1 - x)
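That convention (passing the activation value rather than the pre-activation) is easy to get wrong, so here is a quick sanity check, assuming the two functions above are already defined:

# Compare the derivative computed from the activation value with the
# analytic derivative of sigmoid at the same point; they should match.
z = 0.5
a = sigmoid(z)                              # activation value
print(sigmoid_derivative(a))                # a * (1 - a)
print(np.exp(-z) / (1 + np.exp(-z)) ** 2)   # analytic derivative, same value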
We'll build a simple network with one hidden layer, keeping the weights and biases as module-level variables:
input_neurons = 3
hidden_neurons = 4
output_neurons = 1

# Weights and biases initialization
weights_input_hidden = np.random.randn(input_neurons, hidden_neurons)
weights_hidden_output = np.random.randn(hidden_neurons, output_neurons)
biases_hidden = np.random.randn(hidden_neurons)
biases_output = np.random.randn(output_neurons)
def feedforward(X):
    hidden_layer_input = np.dot(X, weights_input_hidden) + biases_hidden
    hidden_layer_output = sigmoid(hidden_layer_input)
    output_layer_input = np.dot(hidden_layer_output, weights_hidden_output) + biases_output
    predicted_output = sigmoid(output_layer_input)
    return hidden_layer_output, predicted_output
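Given the shapes above, a batch of N samples flows through as (N, 3) → (N, 4) → (N, 1). A small smoke test, with purely illustrative random inputs:

# Push a batch of 4 random samples through the network and check shapes
X_demo = np.random.randn(4, input_neurons)
hidden_demo, out_demo = feedforward(X_demo)
print(hidden_demo.shape)  # (4, 4)
print(out_demo.shape)     # (4, 1)

With the forward pass in place, backpropagation updates the weights from the prediction error: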
def backpropagation(X, Y, hidden_layer_output, predicted_output, lr=0.1):
    # The weights and biases live at module level, so they must be declared
    # global before being rebound with += inside this function.
    global weights_input_hidden, weights_hidden_output, biases_hidden, biases_output
    # For squared error, (Y - predicted) * sigmoid'(predicted) is the negative
    # gradient at the output layer, so adding the updates below descends the loss
    output_error = Y - predicted_output
    output_delta = output_error * sigmoid_derivative(predicted_output)
    # Propagate the error back through the hidden layer
    hidden_layer_error = output_delta.dot(weights_hidden_output.T)
    hidden_layer_delta = hidden_layer_error * sigmoid_derivative(hidden_layer_output)
    # Update weights and biases
    weights_hidden_output += hidden_layer_output.T.dot(output_delta) * lr
    weights_input_hidden += X.T.dot(hidden_layer_delta) * lr
    biases_output += np.sum(output_delta, axis=0) * lr
    biases_hidden += np.sum(hidden_layer_delta, axis=0) * lr
def train(X, Y, epochs):
    for epoch in range(epochs):
        hidden_layer_output, predicted_output = feedforward(X)
        backpropagation(X, Y, hidden_layer_output, predicted_output)
        if epoch % 1000 == 0:
            loss = np.mean(np.square(Y - predicted_output))
            print(f"Epoch {epoch}, Loss: {loss}")
Let's train the network on a simple logic operation, XOR:
# XOR dataset (third input column is constant 1)
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])
Y = np.array([[0], [1], [1], [0]])

train(X, Y, 10000)

_, predicted_output = feedforward(X)
print("Predicted Output after Training:")
print(predicted_output)
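The outputs are values in (0, 1), so thresholding at 0.5 makes them easy to compare with the XOR targets; the exact numbers vary from run to run because the weights are initialized randomly:

# Round the sigmoid outputs to hard 0/1 predictions
print((predicted_output > 0.5).astype(int))  # ideally [[0], [1], [1], [0]]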
This is a basic implementation of a feedforward neural network using only NumPy. For larger and more complex networks, you would typically use specialized libraries like TensorFlow or PyTorch. However, creating a simple network from scratch helps solidify understanding of core neural network concepts.
Description: Creating a simple feedforward neural network entirely in NumPy, covering the architecture, the sigmoid activation, and the forward pass (no backpropagation or training step).
Code:
import numpy as np

# Neural network architecture
input_size = 3
hidden_size = 4
output_size = 2

# Random initialization of weights and biases
weights_input_hidden = np.random.rand(input_size, hidden_size)
biases_hidden = np.zeros((1, hidden_size))
weights_hidden_output = np.random.rand(hidden_size, output_size)
biases_output = np.zeros((1, output_size))

# Activation function (sigmoid)
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Forward pass
def forward(input_data):
    hidden_layer_input = np.dot(input_data, weights_input_hidden) + biases_hidden
    hidden_layer_output = sigmoid(hidden_layer_input)
    output_layer_input = np.dot(hidden_layer_output, weights_hidden_output) + biases_output
    output = sigmoid(output_layer_input)
    return output

# Example input data
input_data = np.array([[0.8, 0.2, 0.5]])

# Forward pass to get predictions
predictions = forward(input_data)
print("Neural Network Predictions:")
print(predictions)
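Because the weights come from np.random.rand, the printed numbers differ on every run; seeding the generator before the initialization makes them reproducible. Note also that forward accepts a whole batch of rows. A minimal sketch, assuming the definitions above:

# Seed for reproducible weights (place before the weight initialization)
np.random.seed(42)

# forward() handles a batch: one row in, one prediction row out
batch = np.array([[0.8, 0.2, 0.5],
                  [0.1, 0.9, 0.3]])
print(forward(batch).shape)  # (2, 2): two samples, two output neurons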
Description: Constructing a deep learning model with multiple hidden layers using NumPy.
Code:
import numpy as np

# Neural network architecture
input_size = 3
hidden_sizes = [4, 5]
output_size = 2

# Random initialization of weights and biases for each layer
weights_input_hidden1 = np.random.rand(input_size, hidden_sizes[0])
biases_hidden1 = np.zeros((1, hidden_sizes[0]))
weights_hidden1_hidden2 = np.random.rand(hidden_sizes[0], hidden_sizes[1])
biases_hidden2 = np.zeros((1, hidden_sizes[1]))
weights_hidden2_output = np.random.rand(hidden_sizes[1], output_size)
biases_output = np.zeros((1, output_size))

# Activation function (sigmoid)
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Forward pass for a deep network
def forward(input_data):
    hidden1_input = np.dot(input_data, weights_input_hidden1) + biases_hidden1
    hidden1_output = sigmoid(hidden1_input)
    hidden2_input = np.dot(hidden1_output, weights_hidden1_hidden2) + biases_hidden2
    hidden2_output = sigmoid(hidden2_input)
    output_input = np.dot(hidden2_output, weights_hidden2_output) + biases_output
    output = sigmoid(output_input)
    return output

# Example input data
input_data = np.array([[0.8, 0.2, 0.5]])

# Forward pass to get predictions
predictions = forward(input_data)
print("Deep Neural Network Predictions:")
print(predictions)
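Hard-coding one weight matrix per layer quickly gets unwieldy as depth grows. A more scalable pattern, shown here as a sketch under the same assumptions rather than part of the original example, keeps the parameters in lists and loops over them:

# Same deep forward pass, generalized to any list of layer sizes
layer_sizes = [input_size] + hidden_sizes + [output_size]  # [3, 4, 5, 2]
weights = [np.random.rand(m, n) for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros((1, n)) for n in layer_sizes[1:]]

def forward_deep(x):
    for W, b in zip(weights, biases):
        x = sigmoid(np.dot(x, W) + b)
    return x

print(forward_deep(input_data))  # same output shape as above: (1, 2)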
Description: Providing a step-by-step walkthrough of a simple neural network implementation using NumPy.
Code:
# Code for a basic neural network using NumPy
# Walkthrough in comments

import numpy as np

# Define the neural network architecture
input_size = 3
hidden_size = 4
output_size = 2

# Random initialization of weights and biases
weights_input_hidden = np.random.rand(input_size, hidden_size)
biases_hidden = np.zeros((1, hidden_size))
weights_hidden_output = np.random.rand(hidden_size, output_size)
biases_output = np.zeros((1, output_size))

# Activation function (sigmoid)
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Forward pass
def forward(input_data):
    # Input to hidden layer
    hidden_layer_input = np.dot(input_data, weights_input_hidden) + biases_hidden
    # Activation of hidden layer
    hidden_layer_output = sigmoid(hidden_layer_input)
    # Input to output layer
    output_layer_input = np.dot(hidden_layer_output, weights_hidden_output) + biases_output
    # Activation of output layer
    output = sigmoid(output_layer_input)
    return output

# Example input data
input_data = np.array([[0.8, 0.2, 0.5]])

# Forward pass to get predictions
predictions = forward(input_data)

# Print the predictions
print("Neural Network Predictions:")
print(predictions)