## Introduction
A genetic algorithm (GA) is a metaheuristic optimization technique inspired by biological evolution. It repeatedly applies selection, crossover, and mutation to a population of candidate solutions in order to search for optima of an objective function. Because it is a gradient-free, black-box method, a GA can be applied to multimodal functions and discrete optimization problems alike.
## Algorithm Overview
The basic GA flow is:
1. Initialization: Randomly generate \(N\) individuals
2. Fitness evaluation: Compute the objective function for each individual
3. Selection: Select high-fitness individuals as parents
4. Crossover: Generate offspring from parent pairs
5. Mutation: Modify genes of individuals with some probability
6. Replacement: Form the next generation
7. Repeat steps 2-6 until convergence
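The loop above can be written down as a minimal, operator-agnostic skeleton. This is an illustrative sketch (the function `ga_skeleton` and its operator parameters are placeholders introduced here, not part of the implementation later in this article); a concrete version with specific operators follows below.

```python
import random


def ga_skeleton(objective, init, select, crossover, mutate,
                pop_size=20, generations=50):
    """Generic GA loop for minimization; the operators are passed in as functions."""
    population = [init() for _ in range(pop_size)]            # 1. initialization
    for _ in range(generations):                              # 7. repeat
        fitness = [objective(ind) for ind in population]      # 2. fitness evaluation
        new_population = []
        while len(new_population) < pop_size:
            p1 = select(population, fitness)                  # 3. selection
            p2 = select(population, fitness)
            child = mutate(crossover(p1, p2))                 # 4. crossover, 5. mutation
            new_population.append(child)
        population = new_population                           # 6. replacement
    fitness = [objective(ind) for ind in population]
    best = min(range(pop_size), key=lambda i: fitness[i])
    return population[best], fitness[best]
```

Any selection, crossover, and mutation operators with compatible signatures can be plugged in, which is why the same loop works for both continuous and discrete encodings.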
## Selection

### Tournament Selection
Randomly pick \(k\) individuals from the population and select the one with the best fitness. The parameter \(k\) is called the tournament size; larger values increase selection pressure. When the \(k\) entrants are drawn with replacement, the probability that the population's best individual appears in (and therefore wins) a given tournament is:

\[P(\text{best individual selected}) = 1 - \left(1 - \frac{1}{N}\right)^k \tag{1}\]

### Roulette Wheel Selection
Select individuals with probability proportional to fitness. The selection probability for individual \(i\) is:
\[P(i) = \frac{f(x_i)}{\sum_{j=1}^{N} f(x_j)} \tag{2}\]

where \(f(x_i)\) is the fitness of individual \(i\). For minimization problems, a fitness transformation (e.g., \(f' = f_{\max} - f\)) is needed so that better (lower) objective values map to higher selection probabilities.
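Equation (2) together with the \(f' = f_{\max} - f\) transform can be sketched as follows (a minimal standalone example for a minimization problem; the helper name `roulette_selection` is introduced here and is separate from the implementation later in this article):

```python
import numpy as np


def roulette_selection(population, objective_values, rng):
    """Select one individual with probability proportional to the
    transformed fitness f' = f_max - f (Eq. 2, minimization)."""
    f = np.asarray(objective_values, dtype=float)
    fitness = f.max() - f                  # lower objective -> higher fitness
    if fitness.sum() == 0.0:               # all individuals equal: uniform fallback
        probs = np.full(len(f), 1.0 / len(f))
    else:
        probs = fitness / fitness.sum()    # Eq. (2)
    idx = rng.choice(len(population), p=probs)
    return population[idx]
```

Note that under this transform the worst individual gets \(f' = 0\) and is never selected; adding a small positive offset to `fitness` is a common way to keep it in play.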
## Crossover

### BLX-\(\alpha\) Crossover (for Real-Valued Optimization)
For continuous optimization, BLX-\(\alpha\) crossover is widely used. When generating a child from parents \(p_1, p_2\), each dimension \(d\) is computed as:
\[c_d \sim U\left[\min(p_{1,d}, p_{2,d}) - \alpha \cdot I_d, \; \max(p_{1,d}, p_{2,d}) + \alpha \cdot I_d\right] \tag{3}\]

where \(I_d = |p_{1,d} - p_{2,d}|\), \(U[a, b]\) denotes the uniform distribution on \([a, b]\), and \(\alpha\) is typically set to 0.5.
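As a concrete check of Eq. (3), the sampling interval for a single parent pair can be computed directly (the parent values below are arbitrary illustrative numbers):

```python
import numpy as np

rng = np.random.default_rng(0)
p1 = np.array([1.0, -2.0])
p2 = np.array([3.0, 2.0])
alpha = 0.5

I = np.abs(p1 - p2)                    # I_d = |p_{1,d} - p_{2,d}| -> [2., 4.]
low = np.minimum(p1, p2) - alpha * I   # lower bounds -> [0., -4.]
high = np.maximum(p1, p2) + alpha * I  # upper bounds -> [4., 4.]
child = rng.uniform(low, high)         # Eq. (3): one uniform sample per dimension
```

With \(\alpha = 0.5\) each interval is 50% wider on both sides than the parents' span, so children can fall outside the parents' range; this is what lets BLX-\(\alpha\) counteract the shrinkage that averaging-style crossovers cause.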
## Mutation

### Gaussian Mutation
For continuous optimization, adding Gaussian noise is the standard mutation approach:
\[x'_d = x_d + \mathcal{N}(0, \sigma^2) \tag{4}\]

Here \(\sigma\) controls the mutation strength. Adaptive mutation that decreases \(\sigma\) over generations is also effective.
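The adaptive mutation mentioned above can be sketched, for example, with a linear annealing schedule for \(\sigma\) (the schedule and the helper names below are illustrative assumptions, not part of the implementation that follows):

```python
import numpy as np


def adaptive_sigma(sigma0, gen, max_gen, sigma_min=0.01):
    """Linearly anneal the mutation strength sigma from sigma0 down to sigma_min."""
    return sigma_min + (sigma0 - sigma_min) * (1.0 - gen / max_gen)


def gaussian_mutation(x, mutation_rate, sigma, rng):
    """Eq. (4): add N(0, sigma^2) noise to each gene with probability mutation_rate."""
    mask = rng.random(x.shape) < mutation_rate
    return x + mask * rng.normal(0.0, sigma, x.shape)
```

Large \(\sigma\) early on favors exploration; shrinking it over generations shifts the search toward local refinement.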
## Python Implementation: Rastrigin Function Optimization
The Rastrigin function is a multimodal benchmark with global minimum \(f(\mathbf{0}) = 0\):
\[f(\mathbf{x}) = 10n + \sum_{i=1}^{n} \left[x_i^2 - 10\cos(2\pi x_i)\right] \tag{5}\]

```python
import numpy as np
import matplotlib.pyplot as plt


def rastrigin(x):
    """Rastrigin function, vectorized over the last axis.

    Accepts a single individual of shape (dim,) or a population
    of shape (pop_size, dim); note x.shape[-1], not len(x), so the
    10*n term uses the dimensionality in both cases.
    """
    return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=-1)


class GeneticAlgorithm:
    def __init__(self, dim, pop_size=100, bounds=(-5.12, 5.12),
                 crossover_rate=0.8, mutation_rate=0.1, mutation_sigma=0.5,
                 tournament_size=3):
        self.dim = dim
        self.pop_size = pop_size
        self.bounds = bounds
        self.crossover_rate = crossover_rate
        self.mutation_rate = mutation_rate
        self.mutation_sigma = mutation_sigma
        self.tournament_size = tournament_size

    def initialize(self):
        """Generate random initial population"""
        low, high = self.bounds
        return np.random.uniform(low, high, (self.pop_size, self.dim))

    def tournament_selection(self, population, fitness):
        """Tournament selection (minimization: lower fitness is better)"""
        indices = np.random.randint(0, self.pop_size, size=self.tournament_size)
        best = indices[np.argmin(fitness[indices])]
        return population[best]

    def blx_alpha_crossover(self, parent1, parent2, alpha=0.5):
        """BLX-alpha crossover, Eq. (3)"""
        diff = np.abs(parent1 - parent2)
        low = np.minimum(parent1, parent2) - alpha * diff
        high = np.maximum(parent1, parent2) + alpha * diff
        child = np.random.uniform(low, high)
        return np.clip(child, *self.bounds)

    def gaussian_mutation(self, individual):
        """Gaussian mutation, Eq. (4), applied per gene with probability mutation_rate"""
        mask = np.random.random(self.dim) < self.mutation_rate
        noise = np.random.normal(0, self.mutation_sigma, self.dim)
        individual[mask] += noise[mask]
        return np.clip(individual, *self.bounds)

    def run(self, objective, max_generations=200):
        """Run GA and return (best individual, best fitness, history)"""
        population = self.initialize()
        best_history = []
        for gen in range(max_generations):
            fitness = objective(population)
            best_idx = np.argmin(fitness)
            best_history.append(fitness[best_idx])
            new_population = [population[best_idx].copy()]  # Elitism
            while len(new_population) < self.pop_size:
                p1 = self.tournament_selection(population, fitness)
                p2 = self.tournament_selection(population, fitness)
                if np.random.random() < self.crossover_rate:
                    child = self.blx_alpha_crossover(p1, p2)
                else:
                    child = p1.copy()
                child = self.gaussian_mutation(child)
                new_population.append(child)
            population = np.array(new_population[:self.pop_size])
        # Final evaluation after the last generation
        fitness = objective(population)
        best_idx = np.argmin(fitness)
        return population[best_idx], fitness[best_idx], best_history


# --- Run ---
np.random.seed(42)
ga = GeneticAlgorithm(dim=10, pop_size=200)
best_solution, best_fitness, history = ga.run(rastrigin, max_generations=300)
print(f"Best fitness: {best_fitness:.6f}")
print(f"Best solution: {best_solution[:3]}... (first 3 dimensions)")

# --- Convergence plot ---
plt.figure(figsize=(10, 5))
plt.plot(history)
plt.xlabel('Generation')
plt.ylabel('Best Fitness')
plt.title('GA Convergence on Rastrigin Function (10D)')
plt.yscale('log')
plt.grid(True, alpha=0.3)
plt.tight_layout()
plt.show()
```
## Comparison with Other Methods
| Method | Population-based | Gradient-free | Discrete problems | Key feature |
|---|---|---|---|---|
| GA | Yes | Yes | Yes | Combinatorial exploration via crossover |
| CEM | Yes | Yes | Difficult | Iterative distribution update |
| PSO | Yes | Yes | Difficult | Velocity-based particle movement |
| SA | No | Yes | Yes | Probabilistic acceptance via temperature |
## Related Articles
- Cross-Entropy Method: A Practical Monte Carlo Optimization Technique - CEM is also a black-box optimizer but uses distribution updates for exploration.
- Particle Swarm Optimization (PSO) Python Implementation - Another metaheuristic based on swarm intelligence.
- Bayesian Optimization: Fundamentals and Python Implementation - Efficient optimization using Gaussian process surrogate models.
- MPPI (Model Predictive Path Integral) Mathematics - Sampling-based control optimization.
## References
- Goldberg, D. E. (1989). Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley.
- Eshelman, L. J., & Schaffer, J. D. (1993). “Real-coded genetic algorithms and interval-schemata”. Foundations of Genetic Algorithms, 2, 187-202.
- Bäck, T. (1996). Evolutionary Algorithms in Theory and Practice. Oxford University Press.