In this chapter, we implement the Differential Evolution (DE) algorithm in MATLAB. If you're a beginner and want to understand the basic idea behind Differential Evolution first, see the introduction (click here, link to Introduction to DE). Before implementing DE, it's a good idea to understand the basics of the algorithm, because most other evolutionary algorithms share the same design methodology. Now, let's get to the MATLAB implementation. First, we need to define an objective function; you can change its definition based on your needs. As an example, we use the simple sphere function (applied to a 30-dimensional problem in the code below).
function result = sphere_func(x)
% x is a vector of input values
% result is the value of the sphere function at x
% Compute the sum of squares of elements in x
result = sum(x.^2);
end
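As a quick sanity check, the sphere function is zero at the origin, and you can swap in any objective with the same signature. For instance, here is a sketch of the Rastrigin function, a common multimodal benchmark (the name `rastrigin_func` is our choice for illustration, not part of the original code):

```matlab
function result = rastrigin_func(x)
% x is a vector of input values
% Rastrigin benchmark: f(x) = 10*n + sum(x_i^2 - 10*cos(2*pi*x_i))
% Global minimum f = 0 at x = 0, like the sphere function
n = numel(x);
result = 10 * n + sum(x.^2 - 10 * cos(2 * pi * x));
end
```

Replacing every call to `sphere_func` with `rastrigin_func` is all that is needed; for Rastrigin, bounds of [-5.12, 5.12] are conventional.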
% Save this file as sphere_func.m (the file name must match the function name) in the same working directory where you'll save your DE script
After defining the objective function, it's time to implement the DE algorithm. The optimization process starts by initializing the parameters and a population in the search space, and then applies the mutation, crossover, and selection operators in each generation. DE is a fast, easy-to-implement, and user-friendly meta-heuristic. Let's look at the MATLAB code for the Differential Evolution algorithm.
clc;
clear; % 'clear' is preferred over 'clear all', which also clears cached functions
% Differential Evolution Parameters
pop_size = 50;          % Population size
num_vars = 30;          % Number of decision variables (problem dimension)
lb = -100;              % Lower bound of the search space
ub = 100;               % Upper bound of the search space
num_generations = 1000; % Maximum number of generations
F = 0.5;                % Differential weight (scale factor)
CR = 0.7;               % Crossover probability
% Initialize population uniformly at random within [lb, ub]
population = lb + (ub - lb) * rand(pop_size, num_vars);
trial_population = zeros(pop_size, num_vars);
% Evaluate initial population
fitness = zeros(pop_size, 1);
for i = 1:pop_size
    fitness(i) = sphere_func(population(i, :));
end
% Main loop
for gen = 1:num_generations
    for i = 1:pop_size
        % Mutation (DE/rand/1): pick three distinct vectors, all different from i
        indices = setdiff(1:pop_size, i); % Exclude the current index
        selected_indices = indices(randperm(numel(indices), 3)); % Three distinct random indices (no toolbox required)
        a = population(selected_indices(1), :);
        b = population(selected_indices(2), :);
        c = population(selected_indices(3), :);
        mutant_vector = a + F * (b - c);
        mutant_vector = min(max(mutant_vector, lb), ub); % Clamp the mutant to the search bounds
        % Binomial crossover: at least one component (j_rand) comes from the mutant
        j_rand = randi([1, num_vars]);
        for j = 1:num_vars
            if rand() <= CR || j == j_rand
                trial_population(i, j) = mutant_vector(j);
            else
                trial_population(i, j) = population(i, j);
            end
        end
    end
    % Selection: keep the trial vector only if it improves on the target vector
    for i = 1:pop_size
        trial_fitness = sphere_func(trial_population(i, :));
        if trial_fitness < fitness(i)
            population(i, :) = trial_population(i, :);
            fitness(i) = trial_fitness;
        end
    end
    % Display current best fitness
    best_fitness = min(fitness);
    fprintf('Generation %d: Best Fitness = %f\n', gen, best_fitness);
end
% Final evaluation
[best_fitness, best_idx] = min(fitness); % A single index, even if several members tie for best
best_solution = population(best_idx, :);
fprintf('Final Best Fitness = %f\n', best_fitness);
disp('Best Solution:');
disp(best_solution);
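If you want to see how quickly DE converges, a common addition is to record the best fitness of each generation and plot it on a log scale. The lines below are a sketch, assuming the variable names from the script above (`num_generations`, `best_fitness`, `gen`); insert each fragment at the indicated point in the script:

```matlab
% Before the main loop: preallocate a convergence history
best_history = zeros(num_generations, 1);

% Inside the main loop, right after best_fitness is computed:
%     best_history(gen) = best_fitness;

% After the loop: plot the convergence curve on a log scale
semilogy(1:num_generations, best_history, 'LineWidth', 1.5);
xlabel('Generation');
ylabel('Best fitness (log scale)');
title('DE convergence on the sphere function');
grid on;
```

On a unimodal problem like the sphere function, this curve typically falls off steeply and then flattens as the population concentrates around the optimum.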
Differential Evolution (DE) has been successfully applied in many domains, such as engineering design, finance, robotics, machine learning, and artificial intelligence. DE has been used in the design of antennas, the prediction of stock market trends, the development of autonomous robots, and the training of neural networks. If you're interested in the Python code for the DE algorithm, (click here). If you want more help with the Differential Evolution algorithm, drop a comment in the comment box below. Enjoy learning the Differential Evolution algorithm, and read more about its variants and applications.