In this lesson, we will walk through Python code for the Differential Evolution algorithm. Differential Evolution (DE) is a population-based stochastic optimization algorithm used for solving complex optimization problems. It is inspired by the process of natural selection and evolution. DE iteratively refines the population over multiple generations, aiming to converge toward an optimal or near-optimal solution. The algorithm's effectiveness lies in its ability to efficiently explore high-dimensional search spaces and handle non-differentiable, noisy, or discontinuous objective functions. Let's understand the Python implementation of the DE algorithm.

Implementing Differential Evolution (DE) in Python involves a few key steps. First, define the objective function that you want to optimize. This function should take a vector of variables as input and return a fitness value. Next, set the DE parameters such as population size, number of variables, and the bounds for each variable. Then, create a main loop that iterates over the desired number of generations. Within each generation, perform mutation, crossover, and selection operations to create the offspring population. Finally, evaluate the fitness of the final population and extract the best solution. Python provides powerful libraries like NumPy for vectorized operations, making it convenient to implement the DE algorithm efficiently. Let's have a look at the implementation of the Python code for the Differential Evolution algorithm.

import numpy as np

def sphere_func(x):
    return np.sum(x**2)

def differential_evolution(pop_size, num_vars, lb, ub, num_generations, F, CR):
    # Initialize population
    population = np.random.uniform(lb, ub, (pop_size, num_vars))
    
    # Main loop
    for gen in range(num_generations):
        # Evaluate fitness
        fitness = np.apply_along_axis(sphere_func, 1, population)
        best_fitness = np.min(fitness)
        
        # Create offspring population
        offspring = np.zeros_like(population)
        for i in range(pop_size):
            # Select three distinct individuals, none equal to the current index i
            candidates = [j for j in range(pop_size) if j != i]
            idx = np.random.choice(candidates, 3, replace=False)
            x1, x2, x3 = population[idx]
            
            # Mutation (DE/rand/1), clipped back into the search bounds
            v = np.clip(x1 + F * (x2 - x3), lb, ub)
            
            # Binomial crossover: force at least one gene from the mutant
            mask = np.random.rand(num_vars) < CR
            mask[np.random.randint(num_vars)] = True
            u = np.where(mask, v, population[i])
            
            # Selection
            if sphere_func(u) <= fitness[i]:
                offspring[i] = u
            else:
                offspring[i] = population[i]
        
        # Update population
        population = offspring
        
        # Display current best fitness
        print(f'Generation {gen}: Best Fitness = {best_fitness}')
    
    # Final evaluation
    final_fitness = np.apply_along_axis(sphere_func, 1, population)
    best_solution = population[np.argmin(final_fitness)]
    best_fitness = np.min(final_fitness)
    print(f'Final Best Fitness = {best_fitness}')
    print('Best Solution:')
    print(best_solution)
    return best_solution, best_fitness

# DE Parameters
pop_size = 50
num_vars = 3
lb = -100
ub = 100
num_generations = 1000
F = 0.8  # Scaling factor
CR = 0.9  # Crossover rate

# Run DE algorithm
differential_evolution(pop_size, num_vars, lb, ub, num_generations, F, CR)

Copy this code into your code editor and start playing with the DE algorithm. Define a few more objective functions and see the power of Differential Evolution (DE) for yourself. If you're looking for the MATLAB code of the DE algorithm, click here. In case of any doubt or discussion, please leave a comment below. Happy Learning!
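To make experimenting with other objective functions easier, the objective can be passed in as a parameter instead of being hardcoded. Here is a minimal sketch of that idea, tried on the Rastrigin function, a standard multimodal benchmark. The function names `de` and `rastrigin` and the fixed seed are my own choices for this illustration, not part of the original post.

```python
import numpy as np

def rastrigin(x):
    # Rastrigin function: highly multimodal, global minimum 0 at the origin
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def de(func, num_vars, lb, ub, pop_size=50, num_generations=500,
       F=0.8, CR=0.9, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lb, ub, (pop_size, num_vars))
    fitness = np.array([func(ind) for ind in pop])
    for _ in range(num_generations):
        for i in range(pop_size):
            # Three distinct donors, none equal to the current index i
            idx = rng.choice([j for j in range(pop_size) if j != i],
                             3, replace=False)
            x1, x2, x3 = pop[idx]
            v = np.clip(x1 + F * (x2 - x3), lb, ub)  # DE/rand/1 mutation
            mask = rng.random(num_vars) < CR
            mask[rng.integers(num_vars)] = True      # keep >= 1 mutant gene
            u = np.where(mask, v, pop[i])
            fu = func(u)
            if fu <= fitness[i]:                     # greedy selection
                pop[i], fitness[i] = u, fu
    best = np.argmin(fitness)
    return pop[best], fitness[best]

best_x, best_f = de(rastrigin, num_vars=3, lb=-5.12, ub=5.12)
print(f'Best fitness: {best_f:.6f}')
print('Best solution:', best_x)
```

Because the objective is a plain argument, you can swap in `sphere_func` or any other function of a NumPy vector without touching the solver.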
