In [[computer science]], '''differential evolution''' (DE) is a method that [[optimization (mathematics)|optimizes]] a problem by [[iterative method|iteratively]] trying to improve a [[candidate solution]] with regard to a given measure of quality. Such methods are commonly known as [[metaheuristic]]s as they make few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. However, metaheuristics such as DE do not guarantee an optimal solution is ever found.
 
DE is used for multidimensional real-valued [[function (mathematics)|functions]] but does not use the [[gradient]] of the problem being optimized, which means DE does not require the optimization problem to be [[differentiable]], as is required by classic optimization methods such as [[gradient descent]] and [[quasi-newton methods|quasi-Newton methods]]. DE can therefore also be used on optimization problems that are not even [[:wikt:continuous|continuous]], are noisy, change over time, etc.<ref name=elediadereview/>
 
DE optimizes a problem by maintaining a population of candidate solutions, creating new candidate solutions by combining existing ones according to simple formulae, and keeping whichever candidate solution has the best score or fitness on the optimization problem at hand. In this way the optimization problem is treated as a black box that merely provides a measure of quality for a given candidate solution; the gradient is therefore not needed.
 
DE is originally due to Storn and Price.<ref name=storn97differential/><ref name=storn96usage/> Books have been published on theoretical and practical aspects of using DE in [[parallel computing]], [[multiobjective optimization]], and [[constrained optimization]]; these books also contain surveys of application areas.<ref name=price05differential/><ref name=feoktistov06differential/><ref name=chakraborty08advances/>
 
== Algorithm ==
<!-- There is no need for extra pseudo-code when this algorithm description is made as detailed as it is -->
 
A basic variant of the DE algorithm works by having a population of [[candidate solutions]] (called agents). These agents are moved around in the search-space by using simple mathematical [[formula]]e to combine the positions of existing agents from the population. If the new position of an agent is an improvement it is accepted and forms part of the population, otherwise the new position is simply discarded. The process is repeated and by doing so it is hoped, but not guaranteed, that a satisfactory solution will eventually be discovered.
 
Formally, let <math>f: \Bbb{R}^n \to \Bbb{R}</math> be the cost function to be minimized (or the fitness function to be maximized). The function takes a candidate solution as its argument, in the form of a [[Row vector|vector]] of [[real number]]s, and produces a real number as output which indicates the fitness of the given candidate solution. The [[gradient]] of <math>f</math> is not known. The goal is to find a solution <math>m</math> for which <math>f(m) \leq f(p)</math> for all <math>p</math> in the search-space, which would mean <math>m</math> is the global minimum. Maximization can be performed by considering the function <math>h := -f</math> instead.
 
Let <math>\mathbf{x} \in \Bbb{R}^n</math> designate a candidate solution (agent) in the population. The basic DE algorithm can then be described as follows:
 
* Initialize all agents <math>\mathbf{x}</math> with random positions in the search-space.
* Until a termination criterion is met (e.g. number of iterations performed, or adequate fitness reached), repeat the following:
** For each agent <math>\mathbf{x}</math> in the population do:
*** Pick three agents <math>\mathbf{a},\mathbf{b}</math>, and <math>\mathbf{c}</math> from the population at random; they must be distinct from each other as well as from agent <math>\mathbf{x}</math>.
*** Pick a random index <math>R \in \{1, \ldots, n\}</math> (<math>n</math> being the dimensionality of the problem to be optimized).
*** Compute the agent's potentially new position <math>\mathbf{y} = [y_1, \ldots, y_n]</math> as follows:
**** For each <math>i</math>, pick a uniformly distributed number <math>r_i \sim U(0,1)</math>.
**** If <math>r_i < \text{CR}</math> or <math>i = R</math>, then set <math>y_i = a_i + F \times (b_i-c_i)</math>; otherwise set <math>y_i = x_i</math>.
**** (In essence, the new position is the outcome of binary crossover of agent <math>\mathbf{x}</math> with the intermediate agent <math>\mathbf{z} = \mathbf{a} + F \times (\mathbf{b}-\mathbf{c})</math>.)
*** If <math>f(\mathbf{y}) < f(\mathbf{x})</math> then replace the agent in the population with the improved candidate solution, that is, replace <math>\mathbf{x}</math> with <math>\mathbf{y}</math> in the population.
* Pick the agent from the population that has the highest fitness or lowest cost and return it as the best found candidate solution.
 
Note that <math>F \in [0,2]</math> is called the ''differential weight'' and <math>\text{CR} \in [0,1]</math> is called the ''crossover probability''; both of these parameters are selectable by the practitioner, along with the population size <math>\text{NP} \geq 4</math> (see below).
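
To make the above description concrete, the following minimal Python sketch implements the basic scheme just described (often written DE/rand/1/bin in the literature). The function name, the box-constrained initialization, and the fixed iteration budget are illustrative choices rather than part of the original formulation.

<syntaxhighlight lang="python">
import random

def differential_evolution(f, bounds, NP=20, F=0.8, CR=0.9, max_iter=1000):
    """Minimize f over a box given as a list of (low, high) bounds per
    dimension, using the basic DE scheme described above."""
    n = len(bounds)
    # Initialize all agents with random positions in the search-space.
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(NP)]
    cost = [f(x) for x in pop]
    for _ in range(max_iter):                     # termination: iteration budget
        for i in range(NP):
            x = pop[i]
            # Pick three distinct agents a, b, c, all different from x.
            a, b, c = random.sample([j for j in range(NP) if j != i], 3)
            R = random.randrange(n)               # index that always takes the mutant value
            y = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                 if (random.random() < CR or d == R) else x[d]
                 for d in range(n)]
            fy = f(y)
            if fy < cost[i]:                      # greedy selection: keep the better agent
                pop[i], cost[i] = y, fy
    best = min(range(NP), key=cost.__getitem__)   # lowest-cost agent found
    return pop[best], cost[best]
</syntaxhighlight>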
 
== Parameter selection ==
 
[[Image:DE Meta-Fitness Landscape (Sphere and Rosenbrock).JPG|thumb|Performance landscape showing how the basic DE performs in aggregate on the Sphere and Rosenbrock benchmark problems when varying the two DE parameters <math>\text{NP}</math> and <math>F</math>, while keeping <math>\text{CR} = 0.9</math> fixed.]]
 
The choice of DE parameters <math>F, \text{CR}</math> and <math>\text{NP}</math> can have a large impact on optimization performance. Selecting the DE parameters that yield good performance has therefore been the subject of much research. [[Rules of thumb]] for parameter selection were devised by Storn et al.<ref name=storn96usage/><ref name=price05differential/> and Liu and Lampinen.<ref name=liu02setting/> Mathematical convergence analysis regarding parameter selection was done by Zaharie.<ref name=zaharie02critical/> [[Meta-optimization]] of the DE parameters was done by Pedersen<ref name=pedersen08thesis/><ref name=pedersen10good-de/> and Zhang et al.<ref name=zhang11fitting/>
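
As an illustration, the sketch from the previous section can be run on the Sphere benchmark with starting values along the lines of the rules of thumb cited above (for example <math>F = 0.8</math>, <math>\text{CR} = 0.9</math>, and roughly ten agents per problem dimension); these settings are common starting points, not universally optimal.

<syntaxhighlight lang="python">
# Sphere benchmark: f(x) = sum of x_i^2, with global minimum 0 at the origin.
sphere = lambda x: sum(v * v for v in x)

bounds = [(-5.0, 5.0)] * 10                       # 10-dimensional search-space
best_x, best_f = differential_evolution(sphere, bounds, NP=100, F=0.8, CR=0.9)
print(best_f)                                     # should be close to 0
</syntaxhighlight>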
 
== Variants ==
 
<!-- Please add proper reference to conference-paper / journal-paper / tech report / master's or phd thesis. Please only add representative works. There must be hundreds of DE variants and Wikipedia is not the proper place to list them all. -->
 
Variants of the DE algorithm are continually being developed in an effort to improve optimization performance. Many different schemes for performing crossover and mutation of agents are possible in the basic algorithm given above; see, e.g., Storn.<ref name=storn96usage/> More advanced DE variants are also being developed, with a popular research trend being to perturb or adapt the DE parameters during optimization; see, e.g., Price et al.,<ref name=price05differential/> Liu and Lampinen,<ref name=liu05fuzzy/> Qin and Suganthan,<ref name=qin05selfadaptive/> Civicioglu<ref name=civici/> and Brest et al.<ref name=brest06selfadapting/>
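
As one small example of such a scheme, the following sketch shows the mutation step of the variant commonly written DE/best/1, in which the base vector is the current best agent rather than a randomly chosen one. The helper below is illustrative, intended to be substituted for the <math>\mathbf{a} + F \times (\mathbf{b}-\mathbf{c})</math> step in the basic algorithm; it is not taken from any specific reference above.

<syntaxhighlight lang="python">
import random

def mutant_best_1(pop, cost, F, i):
    """DE/best/1 mutation: perturb the current best agent with a scaled
    difference of two other randomly chosen agents (both distinct from i)."""
    best = min(range(len(pop)), key=cost.__getitem__)
    b, c = random.sample([j for j in range(len(pop)) if j != i], 2)
    return [pop[best][d] + F * (pop[b][d] - pop[c][d])
            for d in range(len(pop[0]))]
</syntaxhighlight>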


== See also ==
<!-- Please only add optimizers conceptually related to DE. -->
* [[CMA-ES]]
* [[Artificial bee colony algorithm]]
* [[Evolution strategy]]
* [[Genetic algorithm]]
* [[Differential search algorithm]]<ref name=civici/>
* [[Biogeography-based optimization]]
== References ==
{{Reflist|refs=
<ref name=storn97differential>
{{cite journal
|last1=Storn
|first1=R.
|last2=Price
|first2=K.
|title=Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces
|journal=Journal of Global Optimization
|year=1997
|volume=11
|pages=341&ndash;359
|doi=10.1023/A:1008202821328
}}
</ref>
<ref name=storn96usage>
{{cite conference
|last=Storn
|first=R.
|title=On the usage of differential evolution for function optimization
|booktitle=Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS)
|year=1996
|pages=519&ndash;523
}}
</ref>
<ref name=price05differential>
{{cite book
|title=Differential Evolution: A Practical Approach to Global Optimization
|url=http://www.springer.com/computer/theoretical+computer+science/foundations+of+computations/book/978-3-540-20950-8
|last1=Price
|first1=K.
|last2=Storn
|first2=R.M.
|last3=Lampinen
|first3=J.A.
|year=2005
|publisher=Springer
|isbn=978-3-540-20950-8
}}
</ref>
<ref name=feoktistov06differential>
{{cite book
|title=Differential Evolution: In Search of Solutions
|url=http://www.springer.com/mathematics/book/978-0-387-36895-5
|last=Feoktistov
|first=V.
|year=2006
|publisher=Springer
|isbn=978-0-387-36895-5
}}
</ref>
<ref name=chakraborty08advances>
{{citation
|title=Advances in Differential Evolution
|url=http://www.springer.com/engineering/book/978-3-540-68827-3
|editor-last=Chakraborty
|editor-first=U.K.
|year=2008
|publisher=Springer
|isbn=978-3-540-68827-3
}}
</ref>
<ref name=liu02setting>
{{cite conference
|title=On setting the control parameter of the differential evolution method
|booktitle=Proceedings of the 8th International Conference on Soft Computing (MENDEL)
|last1=Liu
|first1=J.
|last2=Lampinen
|first2=J.
|year=2002
|pages=11&ndash;18
|location=Brno, Czech Republic
}}
</ref>
<ref name=zaharie02critical>
{{cite conference
|title=Critical values for the control parameters of differential evolution algorithms
|booktitle=Proceedings of the 8th International Conference on Soft Computing (MENDEL)
|last=Zaharie
|first=D.
|year=2002
|pages=62&ndash;67
|location=Brno, Czech Republic
}}
</ref>
<ref name=brest06selfadapting>
{{cite journal
|last1=Brest
|first1=J.
|last2=Greiner
|first2=S.
|last3=Boskovic
|first3=B.
|last4=Mernik
|first4=M.
|last5=Zumer
|first5=V.
|title=Self-adapting control parameters in differential evolution: a comparative study on numerical benchmark functions
|journal=IEEE Transactions on Evolutionary Computation
|year=2006
|volume=10
|issue=6
|pages=646&ndash;657
|doi=10.1109/tevc.2006.872133
}}
</ref>
<ref name=qin05selfadaptive>
{{cite conference
|last1=Qin
|first1=A.K.
|last2=Suganthan
|first2=P.N.
|title=Self-adaptive differential evolution algorithm for numerical optimization
|booktitle=Proceedings of the IEEE congress on evolutionary computation (CEC)
|year=2005
|pages=1785&ndash;1791
}}
</ref>
<ref name=liu05fuzzy>
{{cite journal
|last1=Liu
|first1=J.
|last2=Lampinen
|first2=J.
|title=A fuzzy adaptive differential evolution algorithm
|journal=Soft Computing
|year=2005
|volume=9
|issue=6
|pages=448&ndash;462
|doi=10.1007/s00500-004-0363-x
}}
</ref>
<ref name=pedersen08thesis>
{{cite book
|type=PhD thesis
|title=Tuning & Simplifying Heuristical Optimization
|url=http://www.hvass-labs.org/people/magnus/thesis/pedersen08thesis.pdf
|last=Pedersen
|first=M.E.H.
|year=2010
|publisher=University of Southampton, School of Engineering Sciences, Computational Engineering and Design Group
}}
</ref>
<ref name=pedersen10good-de>
{{Cite journal
|last=Pedersen
|first=M.E.H.
|url=http://www.hvass-labs.org/people/magnus/publications/pedersen10good-de.pdf
|title=Good parameters for differential evolution
|journal=Technical Report HL1002
|publisher=Hvass Laboratories
|year=2010
}}
</ref>
<ref name=zhang11fitting>
{{cite conference
|last1=Zhang
|first1=X.
|last2=Jiang
|first2=X.
|last3=Scott
|first3=P.J.
|title=A Minimax Fitting Algorithm for Ultra-Precision Aspheric Surfaces
|booktitle=The 13th International Conference on Metrology and Properties of Engineering Surfaces
|year=2011
}}
</ref>
<ref name=civici>
{{cite journal
|last=Civicioglu
|first=P.
|title=Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm
|journal=Computers & Geosciences
|year=2012
|volume=46
|pages=229&ndash;247
|doi=10.1016/j.cageo.2011.12.011
}}
</ref>
<ref name=elediadereview>
{{Cite journal
|last1=Rocca
|first1=P.
|last2=Oliveri
|first2=G.
|last3=Massa
|first3=A.
|title=Differential Evolution as Applied to Electromagnetics
|journal=IEEE Antennas and Propagation Magazine
|year=2011
|volume=53
|issue=1
|pages=38&ndash;49
|doi=10.1109/MAP.2011.5773566
}}
</ref>


}}


== External links ==
* [http://www.icsi.berkeley.edu/~storn/code.html Storn's Homepage on DE] featuring source code for several programming languages.
* [http://www.sciencedirect.com/science/article/pii/S0957417410010493 Fast DE Algorithm] A Fast Differential Evolution Algorithm using k-Nearest Neighbour Predictor.
* [http://www.sciencedirect.com/science/article/pii/S0957417413000857 MODE Application] Parameter Estimation of a Pressure Swing Adsorption Model for Air Separation Using Multi-objective Optimisation and Support Vector Regression Model.


{{Major subfields of optimization}}


{{DEFAULTSORT:Differential Evolution}}
[[Category:Optimization algorithms and methods]]
[[Category:Evolutionary algorithms]]
[[Category:Mathematical optimization]]
[[Category:Operations research]]
