A heuristic is a common-sense strategy for intelligently moving through the solution space in order to obtain an approximate solution in a reasonable time.
Two main types of heuristics are used: construction heuristics (e.g. greedy methods), which build a solution iteratively, element by element, and descending heuristics, which start from a given solution and improve it until a local optimum is reached.
Heuristics depend on the problem being solved, mainly in how a solution's neighbours are defined and the best (e.g. nearest) one chosen, and thus in how the solution space is traversed.
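To make the two families concrete, here is a minimal Python sketch on a toy travelling-salesman instance (the instance, its size, and all helper names are invented for illustration): a greedy nearest-neighbour construction followed by a 2-opt descent to a local optimum.

```python
import math
import random

# Hypothetical toy instance: 12 random points in the unit square.
random.seed(0)
cities = [(random.random(), random.random()) for _ in range(12)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(tour):
    # Total length of the closed tour (wraps around to the start).
    return sum(dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def greedy_tour(start=0):
    # Construction heuristic: repeatedly visit the nearest unvisited city.
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda c: dist(cities[tour[-1]], cities[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def two_opt_descent(tour):
    # Descending heuristic: reverse segments while this shortens the tour.
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand) < tour_length(tour):
                    tour, improved = cand, True
    return tour

t = two_opt_descent(greedy_tour())
```

The greedy pass produces a feasible tour quickly; the 2-opt descent then shortens it until no segment reversal helps, i.e. until a local optimum for that neighbourhood is reached.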
More advanced heuristics have been developed, giving birth to a new category of algorithms: meta-heuristics.
The goal of a meta-heuristic is to find a global optimum. This is achieved by covering the search space broadly while intensifying the search in promising regions, without getting "trapped" in a local optimum.
Meta-heuristics are often inspired by natural processes.
The main meta-heuristics used to solve problems with discrete variables include:
- simulated annealing, which explores the search space while accepting moves that deteriorate the current solution, so as to escape local optima. As the process advances, the annealing accepts such deteriorations less and less often, which makes it converge towards a (hopefully) global optimum.
- tabu searches, which, contrary to simulated annealing, are deterministic and use the concept of memory. Always moving to the best neighbour of the current solution drives the algorithm towards local optima; by making recently used movements taboo, the search is prevented from cycling back to local optima it has already visited and is pushed towards unexplored regions, ideally towards the global optimum.
- evolutionary algorithms, inspired by Darwin's theory of evolution, which manipulate several solutions at the same time and combine them to create new ones. Working with a population of solutions makes it easier to explore the search space, and the best solutions are favoured as parents of new solutions. This promotes combinations of "good" characteristics, and thus helps find a global optimum.
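A minimal sketch of the first of these, simulated annealing, on a hypothetical one-dimensional discrete objective (the objective function, temperature schedule, and bounds are all invented for illustration):

```python
import math
import random

random.seed(1)

# Hypothetical objective with many local minima over the integers 0..99.
def f(x):
    return (x - 63) ** 2 / 50.0 + 5 * math.cos(x)

def simulated_annealing(x, temp=10.0, cooling=0.999, steps=5000):
    best = x
    for _ in range(steps):
        neighbour = max(0, min(99, x + random.choice([-1, 1])))
        delta = f(neighbour) - f(x)
        # Accept a worse move with probability exp(-delta / temp): this is
        # what lets the search climb out of local minima early on.
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            x = neighbour
        if f(x) < f(best):
            best = x
        temp *= cooling  # cooling: deteriorations accepted less and less
    return best

best = simulated_annealing(0)
```

The cooling schedule is the key design choice: too fast and the search freezes in the first local minimum it meets; too slow and it wanders without converging.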
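Tabu search can be sketched on a hypothetical 0/1 knapsack instance (values, weights, capacity, tenure, and iteration count are invented for illustration), with the neighbourhood defined by flipping one item in or out and the memory holding the most recently flipped items:

```python
from collections import deque

# Hypothetical instance: item values, weights, and a capacity.
values = [10, 13, 7, 8, 12, 4]
weights = [5, 6, 3, 4, 6, 2]
CAPACITY = 14

def score(sel):
    # Total value, with overweight selections heavily penalised.
    w = sum(wi for wi, s in zip(weights, sel) if s)
    v = sum(vi for vi, s in zip(values, sel) if s)
    return v - 100 * max(0, w - CAPACITY)

def tabu_search(iterations=50, tenure=3):
    current = [0] * len(values)
    best, best_score = current[:], score(current)
    tabu = deque(maxlen=tenure)  # memory: recently flipped items are taboo
    for _ in range(iterations):
        # Move to the best non-tabu neighbour, even if it is worse than
        # the current solution: this is what lets the search leave a
        # local optimum without immediately falling back into it.
        candidates = [(score(current[:i] + [1 - current[i]] + current[i + 1:]), i)
                      for i in range(len(values)) if i not in tabu]
        s, i = max(candidates)
        current[i] = 1 - current[i]
        tabu.append(i)
        if s > best_score:
            best, best_score = current[:], s
    return best, best_score

best_sel, best_val = tabu_search()
```

Note that the procedure is fully deterministic: given the same instance and tenure, it always follows the same trajectory, as the text points out.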
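Finally, a minimal evolutionary (genetic) algorithm on a toy problem (the target string, population size, mutation rate, and generation count are invented for illustration), showing the combination of solutions via crossover and the favouring of the best solutions as parents:

```python
import random

random.seed(2)

GENES = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
TARGET = "METAHEURISTIC"  # hypothetical target used as the fitness reference
LENGTH = len(TARGET)

def fitness(ind):
    # Number of characters in the right place: higher is better.
    return sum(a == b for a, b in zip(ind, TARGET))

def random_individual():
    return [random.choice(GENES) for _ in range(LENGTH)]

def crossover(p1, p2):
    # One-point crossover: combine two parent solutions into a new one.
    cut = random.randrange(1, LENGTH)
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=0.05):
    return [random.choice(GENES) if random.random() < rate else g for g in ind]

def evolve(pop_size=60, generations=200):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == LENGTH:
            break
        # Selection: the best half of the population survives and is
        # favoured as parents of the new solutions.
        parents = population[: pop_size // 2]
        population = parents + [
            mutate(crossover(*random.sample(parents, 2)))
            for _ in range(pop_size - len(parents))
        ]
    return "".join(max(population, key=fitness))

best = evolve()
```

Crossover propagates "good" fragments of parent solutions through the population, while mutation keeps injecting diversity so that exploration of the search space does not stall.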