Non-dominated Sorting Genetic Algorithm, Nondominated Sorting Genetic Algorithm, Fast Elitist Non-dominated Sorting Genetic Algorithm, NSGA, NSGA-II
Taxonomy
The Non-dominated Sorting Genetic Algorithm is a Multiple Objective Optimization (MOO) algorithm and is an instance of an Evolutionary Algorithm from the field of Evolutionary Computation. Refer to the broader literature on Multiple Objective Optimization for more information and references. NSGA is an extension of the Genetic Algorithm for multiple objective function optimization. It is related to other Evolutionary Multiple Objective Optimization algorithms (EMOO), also called Multiple Objective Evolutionary Algorithms (MOEA), such as the Vector-Evaluated Genetic Algorithm (VEGA), the Strength Pareto Evolutionary Algorithm (SPEA), and the Pareto Archived Evolution Strategy (PAES). There are two versions of the algorithm: the classical NSGA and the updated and currently canonical form, NSGA-II.
Strategy
The objective of the NSGA algorithm is to improve the adaptive fit of a population of candidate solutions to a Pareto front constrained by a set of objective functions. The algorithm uses an evolutionary process with surrogates for evolutionary operators including selection, genetic crossover, and genetic mutation. The population is sorted into a hierarchy of sub-populations based on the ordering of Pareto dominance. Similarity between members of each sub-group is evaluated on the Pareto front, and the resulting groups and similarity measures are used to promote a diverse front of non-dominated solutions.
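To make the Pareto dominance ordering concrete, the following is a minimal Ruby sketch (not part of the original listing) of a dominance test between two objective vectors under minimization; the pareto_dominates? name is illustrative, and the full code listing later in this section uses a similar dominates function defined over solution hashes.

Pareto dominance check in Ruby:
# Returns true if objective vector a dominates objective vector b under
# minimization: a is no worse in every objective and strictly better in at
# least one.
def pareto_dominates?(a, b)
  no_worse = a.each_index.all? {|i| a[i] <= b[i]}
  strictly_better = a.each_index.any? {|i| a[i] < b[i]}
  no_worse && strictly_better
end

puts pareto_dominates?([1.0, 3.0], [2.0, 3.0])  # => true  (better in f1, equal in f2)
puts pareto_dominates?([1.0, 4.0], [2.0, 3.0])  # => false (each is better in one objective)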
Procedure
Algorithm (below) provides a pseudocode listing of the Non-dominated Sorting Genetic Algorithm II (NSGA-II) for minimizing a cost function. The SortByRankAndDistance function orders the population into a hierarchy of non-dominated Pareto fronts. The CrowdingDistanceAssignment function calculates the average distance between members of each front on the front itself. Refer to Deb et al. for a clear presentation of the pseudocode and an explanation of these functions [Deb2002]. The CrossoverAndMutation function performs the classical crossover and mutation genetic operators of the Genetic Algorithm. Both the SelectParentsByRankAndDistance and SortByRankAndDistance functions discriminate members of the population first by rank (the dominance precedence of the front to which a solution belongs) and then by distance within the front (calculated by CrowdingDistanceAssignment).
Pseudocode for NSGA-II:

Input: Population_size, ProblemSize, P_crossover, P_mutation
Output: Children

Population <- InitializePopulation(Population_size, ProblemSize)
EvaluateAgainstObjectiveFunctions(Population)
FastNondominatedSort(Population)
Selected <- SelectParentsByRank(Population, Population_size)
Children <- CrossoverAndMutation(Selected, P_crossover, P_mutation)
While (NOT StopCondition())
    EvaluateAgainstObjectiveFunctions(Children)
    Union <- Merge(Population, Children)
    Fronts <- FastNondominatedSort(Union)
    Parents <- ∅
    Front_L <- ∅
    For (Front_i in Fronts)
        CrowdingDistanceAssignment(Front_i)
        If (Size(Parents) + Size(Front_i) > Population_size)
            Front_L <- Front_i
            Break()
        Else
            Parents <- Merge(Parents, Front_i)
        End
    End
    If (Size(Parents) < Population_size)
        Front_L <- SortByRankAndDistance(Front_L)
        For (P_1 To P_{Population_size - Size(Parents)})
            Parents <- Merge(Parents, P_i)
        End
    End
    Selected <- SelectParentsByRankAndDistance(Parents, Population_size)
    Population <- Children
    Children <- CrossoverAndMutation(Selected, P_crossover, P_mutation)
End
Return (Children)
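For readers who want CrowdingDistanceAssignment spelled out, below is a minimal Ruby sketch of the calculation as defined in [Deb2002]: the front is sorted by each objective in turn, boundary solutions receive an infinite distance, and interior solutions accumulate the normalized gap between their two neighbours. The function name and the :objectives/:dist hash keys mirror the conventions of the code listing later in this section, which itself uses a simplified variant that does not re-sort the front.

Crowding distance assignment in Ruby:
# Assigns a crowding distance (:dist) to every solution in a single front.
def crowding_distance_assignment(front)
  front.each {|p| p[:dist] = 0.0}
  num_objectives = front.first[:objectives].size
  num_objectives.times do |i|
    sorted = front.sort_by {|p| p[:objectives][i]}
    # boundary solutions are always preferred by giving them infinite distance
    sorted.first[:dist] = sorted.last[:dist] = Float::INFINITY
    range = sorted.last[:objectives][i] - sorted.first[:objectives][i]
    next if range == 0.0
    (1...(sorted.size-1)).each do |j|
      sorted[j][:dist] += (sorted[j+1][:objectives][i] - sorted[j-1][:objectives][i]) / range
    end
  end
end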
Heuristics
- NSGA was designed for and is suited to continuous function multiple objective optimization problem instances.
- A binary representation can be used in conjunction with classical genetic operators such as one-point crossover and point mutation.
- A real-valued representation is recommended for continuous function optimization problems, in turn requiring representation-specific genetic operators such as Simulated Binary Crossover (SBX) and polynomial mutation [Deb1995]; a sketch of SBX is given below.
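As an illustration of the first of these operators, the following is a minimal Ruby sketch of Simulated Binary Crossover for a single real-valued decision variable, following the spread-factor formulation of [Deb1995]. The sbx_pair name, the default distribution index eta = 20.0, and the clamping of children to the variable's bounds are illustrative assumptions rather than details prescribed by the text above.

Simulated Binary Crossover in Ruby:
# SBX for one real-valued variable; a larger eta keeps children closer to parents.
def sbx_pair(p1, p2, bounds, eta=20.0)
  u = rand()
  # spread factor: contracting for u <= 0.5, expanding otherwise
  beta = (u <= 0.5) ? (2.0*u)**(1.0/(eta+1.0)) : (1.0/(2.0*(1.0-u)))**(1.0/(eta+1.0))
  c1 = 0.5 * ((1.0+beta)*p1 + (1.0-beta)*p2)
  c2 = 0.5 * ((1.0-beta)*p1 + (1.0+beta)*p2)
  min, max = bounds
  [c1, c2].map {|c| [[c, min].max, max].min}  # keep children within the variable's bounds
end

# Example: cross two parent values drawn from the domain [-10, 10].
child1, child2 = sbx_pair(1.5, -3.2, [-10, 10])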
Code Listing
Listing (below) provides an example of the Non-dominated Sorting Genetic Algorithm II (NSGA-II) implemented in the Ruby Programming Language. The demonstration problem is an instance of continuous multiple objective function optimization called SCH (problem one in [Deb2002]). The problem seeks the minimum of two functions, f1(x) = x^2 and f2(x) = (x-2)^2, with the single decision variable bounded to -10 <= x <= 10 in this implementation. The Pareto-optimal solutions for this problem lie in the range 0 <= x <= 2. The algorithm is an implementation of NSGA-II based on the presentation by Deb et al. [Deb2002]. The algorithm uses a binary string representation (16 bits per objective function parameter) that is decoded and rescaled to the function domain. The implementation uses a uniform crossover operator and point mutations with a fixed mutation rate of 1/L, where L is the number of bits in a solution’s binary string.
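As a worked example of the decoding step just described (the specific numbers are illustrative, not taken from the source text): with 16 bits per parameter and bounds [-10, 10], a parameter's bit string is first read as an integer s between 0 and 2^16 - 1 = 65535, and then rescaled as x = -10 + ((10 - (-10)) / 65535) * s. The all-zeros string therefore decodes to -10, the all-ones string to 10, and s = 32768 to approximately 0.00015, near the middle of the domain.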
NSGA-II in Ruby:
# objective functions for the SCH problem (both minimized)
def objective1(vector)
  return vector.inject(0.0) {|sum, x| sum + (x**2.0)}
end

def objective2(vector)
  return vector.inject(0.0) {|sum, x| sum + ((x-2.0)**2.0)}
end

# decode a binary string into a vector of real values, one per parameter
def decode(bitstring, search_space, bits_per_param)
  vector = []
  search_space.each_with_index do |bounds, i|
    off, sum = i*bits_per_param, 0.0
    param = bitstring[off...(off+bits_per_param)].reverse
    param.size.times do |j|
      sum += ((param[j].chr=='1') ? 1.0 : 0.0) * (2.0 ** j.to_f)
    end
    min, max = bounds
    vector << min + ((max-min)/((2.0**bits_per_param.to_f)-1.0)) * sum
  end
  return vector
end

def random_bitstring(num_bits)
  return (0...num_bits).inject(""){|s,i| s<<((rand<0.5) ? "1" : "0")}
end

def point_mutation(bitstring, rate=1.0/bitstring.size)
  child = ""
  bitstring.size.times do |i|
    bit = bitstring[i].chr
    child << ((rand()<rate) ? ((bit=='1') ? "0" : "1") : bit)
  end
  return child
end

# uniform crossover of two parent bit strings
def crossover(parent1, parent2, rate)
  return ""+parent1 if rand()>=rate
  child = ""
  parent1.size.times do |i|
    child << ((rand()<0.5) ? parent1[i].chr : parent2[i].chr)
  end
  return child
end

def reproduce(selected, pop_size, p_cross)
  children = []
  selected.each_with_index do |p1, i|
    p2 = (i.modulo(2)==0) ? selected[i+1] : selected[i-1]
    p2 = selected[0] if i == selected.size-1
    child = {}
    child[:bitstring] = crossover(p1[:bitstring], p2[:bitstring], p_cross)
    child[:bitstring] = point_mutation(child[:bitstring])
    children << child
    break if children.size >= pop_size
  end
  return children
end

def calculate_objectives(pop, search_space, bits_per_param)
  pop.each do |p|
    p[:vector] = decode(p[:bitstring], search_space, bits_per_param)
    p[:objectives] = [objective1(p[:vector]), objective2(p[:vector])]
  end
end

# p1 dominates p2 if it is no worse in every objective
def dominates(p1, p2)
  p1[:objectives].each_index do |i|
    return false if p1[:objectives][i] > p2[:objectives][i]
  end
  return true
end

# sort the population into a hierarchy of non-dominated fronts (rank 0, 1, ...)
def fast_nondominated_sort(pop)
  fronts = Array.new(1){[]}
  pop.each do |p1|
    p1[:dom_count], p1[:dom_set] = 0, []
    pop.each do |p2|
      if dominates(p1, p2)
        p1[:dom_set] << p2
      elsif dominates(p2, p1)
        p1[:dom_count] += 1
      end
    end
    if p1[:dom_count] == 0
      p1[:rank] = 0
      fronts.first << p1
    end
  end
  curr = 0
  begin
    next_front = []
    fronts[curr].each do |p1|
      p1[:dom_set].each do |p2|
        p2[:dom_count] -= 1
        if p2[:dom_count] == 0
          p2[:rank] = (curr+1)
          next_front << p2
        end
      end
    end
    curr += 1
    fronts << next_front if !next_front.empty?
  end while curr < fronts.size
  return fronts
end

# assign a crowding distance (:dist) to each member of a front
def calculate_crowding_distance(pop)
  pop.each {|p| p[:dist] = 0.0}
  num_obs = pop.first[:objectives].size
  num_obs.times do |i|
    min = pop.min{|x,y| x[:objectives][i]<=>y[:objectives][i]}
    max = pop.max{|x,y| x[:objectives][i]<=>y[:objectives][i]}
    rge = max[:objectives][i] - min[:objectives][i]
    pop.first[:dist], pop.last[:dist] = 1.0/0.0, 1.0/0.0
    next if rge == 0.0
    (1...(pop.size-1)).each do |j|
      pop[j][:dist] += (pop[j+1][:objectives][i]-pop[j-1][:objectives][i]) / rge
    end
  end
end

# prefer lower rank, then larger crowding distance
def crowded_comparison_operator(x,y)
  return y[:dist]<=>x[:dist] if x[:rank] == y[:rank]
  return x[:rank]<=>y[:rank]
end

# binary tournament comparison used during parent selection
def better(x,y)
  if !x[:dist].nil? and x[:rank] == y[:rank]
    return (x[:dist]>y[:dist]) ? x : y
  end
  return (x[:rank]<y[:rank]) ? x : y
end

# fill the next parent population front-by-front, splitting the last front by distance
def select_parents(fronts, pop_size)
  fronts.each {|f| calculate_crowding_distance(f)}
  offspring, last_front = [], 0
  fronts.each do |front|
    break if (offspring.size+front.size) > pop_size
    front.each {|p| offspring << p}
    last_front += 1
  end
  if (remaining = pop_size-offspring.size) > 0
    fronts[last_front].sort! {|x,y| crowded_comparison_operator(x,y)}
    offspring += fronts[last_front][0...remaining]
  end
  return offspring
end

# scalarization used only for reporting a representative "best" solution
def weighted_sum(x)
  return x[:objectives].inject(0.0) {|sum, v| sum+v}
end

def search(search_space, max_gens, pop_size, p_cross, bits_per_param=16)
  pop = Array.new(pop_size) do |i|
    {:bitstring=>random_bitstring(search_space.size*bits_per_param)}
  end
  calculate_objectives(pop, search_space, bits_per_param)
  fast_nondominated_sort(pop)
  selected = Array.new(pop_size) do
    better(pop[rand(pop_size)], pop[rand(pop_size)])
  end
  children = reproduce(selected, pop_size, p_cross)
  calculate_objectives(children, search_space, bits_per_param)
  max_gens.times do |gen|
    union = pop + children
    fronts = fast_nondominated_sort(union)
    parents = select_parents(fronts, pop_size)
    selected = Array.new(pop_size) do
      better(parents[rand(pop_size)], parents[rand(pop_size)])
    end
    pop = children
    children = reproduce(selected, pop_size, p_cross)
    calculate_objectives(children, search_space, bits_per_param)
    best = parents.sort!{|x,y| weighted_sum(x)<=>weighted_sum(y)}.first
    best_s = "[x=#{best[:vector]}, objs=#{best[:objectives].join(', ')}]"
    puts " > gen=#{gen+1}, fronts=#{fronts.size}, best=#{best_s}"
  end
  union = pop + children
  fronts = fast_nondominated_sort(union)
  parents = select_parents(fronts, pop_size)
  return parents
end

if __FILE__ == $0
  # problem configuration
  problem_size = 1
  search_space = Array.new(problem_size) {|i| [-10, 10]}
  # algorithm configuration
  max_gens = 50
  pop_size = 100
  p_cross = 0.98
  # execute the algorithm
  pop = search(search_space, max_gens, pop_size, p_cross)
  puts "done!"
end
References
Primary Sources
Srinivas and Deb proposed the NSGA, inspired by Goldberg’s notion of a non-dominated sorting procedure [Srinivas1994]. Goldberg proposed a non-dominated sorting procedure in his book while considering the biases in the Pareto optimal solutions provided by VEGA [Goldberg1989]. Srinivas and Deb’s NSGA used the sorting procedure as a ranking selection method and a fitness sharing niching method to maintain stable sub-populations across the Pareto front. Deb et al. later extended NSGA to address three criticisms of the approach: its time complexity, its lack of elitism, and its need for a sharing parameter in the fitness sharing niching method [Deb2000] [Deb2002].
Learn More
Deb provides in-depth coverage of Evolutionary Multiple Objective Optimization algorithms in his book, including a detailed description of the NSGA in Chapter 5 [Deb2001].
Bibliography
[Deb1995] K. Deb and R. B. Agrawal, “Simulated binary crossover for continuous search space”, Complex Systems, 1995.
[Deb2000] K. Deb, S. Agrawal, A. Pratap, and T. Meyarivan, “A Fast Elitist Non-dominated Sorting Genetic Algorithm for Multi-Objective Optimization: NSGA-II”, Parallel Problem Solving from Nature PPSN VI, 2000.
[Deb2001] K. Deb, “Multi-Objective Optimization Using Evolutionary Algorithms”, John Wiley and Sons, 2001.
[Deb2002] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, “A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II”, IEEE Transactions on Evolutionary Computation, 2002.
[Goldberg1989] D. E. Goldberg, “Genetic Algorithms in Search, Optimization, and Machine Learning”, Addison-Wesley, 1989.
[Srinivas1994] N. Srinivas and K. Deb, “Muiltiobjective Optimization Using Nondominated Sorting in Genetic Algorithms”, Evolutionary Computation, 1994.
Source: http://www.cleveralgorithms.com/nature-inspired/evolution/nsga.html