Secant method

[[Image:Secant method.svg|thumb|300px|The first two iterations of the secant method. The red curve shows the function ''f'' and the blue lines are the secants. For this particular case, the secant method will not converge.]]
 
In [[numerical analysis]], the '''secant method''' is a [[root-finding algorithm]] that uses a succession of [[Root of a function|root]]s of [[secant line]]s to better approximate a root of a [[Function (mathematics)|function]] ''f''. The secant method can be thought of as a [[finite difference]] approximation of [[Newton's method]]. However, the method was developed independently of Newton's method, and predated the latter by over 3,000 years.<ref>{{Citation |last=Papakonstantinou |first=J. |title=The Historical Development of the Secant Method in 1-D |url=http://citation.allacademic.com/meta/p_mla_apa_research_citation/2/0/0/0/4/p200044_index.html | accessdate = 2011-06-29}}</ref>
 
==The method==
The secant method is defined by the [[recurrence relation]]
:<math>
x_n
=x_{n-1}-f(x_{n-1})\frac{x_{n-1}-x_{n-2}}{f(x_{n-1})-f(x_{n-2})}
=\frac{x_{n-2}f(x_{n-1})-x_{n-1}f(x_{n-2})}{f(x_{n-1})-f(x_{n-2})}
</math>
 
As can be seen from the recurrence relation, the secant method requires two initial values, ''x''<sub>0</sub> and ''x''<sub>1</sub>, which should ideally be chosen to lie close to the root.
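
As a simple illustrative example (not part of the original formulation above), take <math>f(x) = x^2 - 2</math>, whose positive root is <math>\sqrt{2} \approx 1.4142</math>, and the starting values <math>x_0 = 1</math> and <math>x_1 = 2</math>. One application of the recurrence gives
:<math>x_2 = x_1 - f(x_1)\frac{x_1 - x_0}{f(x_1) - f(x_0)} = 2 - 2\cdot\frac{2 - 1}{2 - (-1)} = \frac{4}{3} \approx 1.333,</math>
which is already closer to the root than either starting value.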
 
==Derivation of the method==
Starting with initial values <math>x_0</math> and <math>x_1</math>, we construct a line through the points <math>(~x_0,~f(x_0)~)</math> and <math>(~x_1,~f(x_1)~)</math>, as demonstrated in the picture on the right. In point-slope form, this line has the equation
 
:<math>y = \frac{f(x_1)-f(x_0)}{x_1-x_0}(x-x_1) + f(x_1)</math>
   
We find the root of this line &ndash; the value of <math>x</math> such that <math>y=0</math> &ndash; by solving the following equation for <math>x</math>:
 
:<math>0 = \frac{f(x_1)-f(x_0)}{x_1-x_0}(x-x_1) + f(x_1)</math>
   
The solution is
 
:<math>x = x_1 - f(x_1)\frac{x_1-x_0}{f(x_1)-f(x_0)}</math>
 
We then use this new value of <math>x</math> as <math>x_2</math> and repeat the process using <math>x_1</math> and <math>x_2</math> instead of <math>x_0</math> and <math>x_1</math>. We continue this process, solving for <math>x_3</math>, <math>x_4</math>, etc., until we reach a sufficiently high level of precision (a sufficiently small difference between <math>x_n</math> and <math>x_{n-1}</math>).
 
:<math>x_2 = x_1 - f(x_1)\frac{x_1-x_0}{f(x_1)-f(x_0)}</math>
 
:<math>x_3 = x_2 - f(x_2)\frac{x_2-x_1}{f(x_2)-f(x_1)}</math>
 
:...
 
:<math>x_n = x_{n-1} - f(x_{n-1})\frac{x_{n-1}-x_{n-2}}{f(x_{n-1})-f(x_{n-2})}</math>
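
A minimal Matlab sketch of this iteration, stopping once successive iterates agree to within a chosen tolerance, could look as follows; the function (borrowed from the computational example later in this article), the starting values, and the tolerance are illustrative choices rather than part of the method itself.
<source lang=Matlab>
% Secant iteration with a simple stopping criterion (illustrative sketch).
f = @(x) x^2 - 612;      % example function; replace with the f of interest
x0 = 10; x1 = 30;        % two initial values
tol = 1e-10;             % stop when successive iterates differ by less than this
while abs(x1 - x0) > tol
    x2 = x1 - f(x1)*(x1 - x0)/(f(x1) - f(x0));   % the secant recurrence
    x0 = x1;             % shift the pair of iterates
    x1 = x2;
end
root = x1
</source>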
 
==Convergence==
 
The iterates <math>x_n</math> of the secant method converge to a root of <math>f</math> if the initial values <math>x_0</math> and <math>x_1</math> are sufficiently close to the root. The [[order of convergence]] is α, where
:<math> \alpha = \frac{1+\sqrt{5}}{2} \approx 1.618 </math>
is the [[golden ratio]]. In particular, the convergence is superlinear, but not quite [[quadratic convergence|quadratic]].
 
This result only holds under some technical conditions, namely that <math>f</math> be twice continuously differentiable and the root in question be simple (i.e., with multiplicity 1).
 
If the initial values are not close enough to the root, then there is no guarantee that the secant method converges. There is no general definition of "close enough", but the criterion has to do with how "wiggly" the function is on the interval <math>[~x_0,~x_1~]</math>. For example, if <math>f</math> is differentiable on that interval and there is a point where <math>f^\prime = 0</math> on the interval, then the algorithm may not converge.
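
The order of convergence can also be observed numerically. The sketch below estimates α from successive errors <math>e_n = |x_n - r|</math> via the ratio <math>\log(e_{n+1}/e_n)/\log(e_n/e_{n-1})</math>; the test function <math>f(x) = x^2 - 2</math>, the starting values, and the number of iterations are illustrative choices. The later estimates settle near the golden ratio, although this crude estimator itself converges slowly and oscillates around that value.
<source lang=Matlab>
% Empirical estimate of the order of convergence (illustrative sketch).
f = @(x) x^2 - 2;        % test function with known root sqrt(2)
r = sqrt(2);             % exact root, used only to measure the error
x = [2 1.5];             % two initial values
for i = 3:6
    x(i) = x(i-1) - f(x(i-1))*(x(i-1) - x(i-2))/(f(x(i-1)) - f(x(i-2)));
end
e = abs(x - r);                                          % errors |x_n - r|
alpha = log(e(3:end)./e(2:end-1)) ./ log(e(2:end-1)./e(1:end-2))
</source>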
 
==Comparison with other root-finding methods==
 
The secant method does not require that the root remain bracketed, as the [[bisection method]] does, and hence it does not always converge. The [[false position method]] (or ''regula falsi'') uses the same formula as the secant method. However, it does not apply the formula to <math>x_{n-1}</math> and <math>x_n</math>, as the secant method does, but to <math>x_n</math> and to the last iterate <math>x_k</math> such that <math>f(x_k)</math> and <math>f(x_n)</math> have opposite signs. As a result, the false position method always converges, provided that the initial values bracket a root.
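
For comparison, a minimal sketch of the false position method in Matlab is given below; the update formula is the same as in the secant method, but the two retained points always enclose the root. The function (again <math>f(x) = x^2 - 612</math>), the initial bracket, and the fixed number of iterations are illustrative choices.
<source lang=Matlab>
% False position (regula falsi) sketch: secant formula applied to a bracket.
f = @(x) x^2 - 612;
a = 10;  b = 30;                        % initial bracket with f(a) < 0 < f(b)
for i = 1:20
    c = b - f(b)*(b - a)/(f(b) - f(a)); % same formula as the secant method
    if sign(f(c)) == sign(f(a))
        a = c;                          % root lies in [c, b]
    else
        b = c;                          % root lies in [a, c]
    end
end
root = c
</source>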
 
The recurrence formula of the secant method can be derived from the formula for [[Newton's method]]
:<math> x_{n} = x_{n-1} - \frac{f(x_{n-1})}{f^\prime(x_{n-1})} </math>
by using the [[finite difference]] approximation
:<math> f^\prime(x_{n-1}) \approx \frac{f(x_{n-1}) - f(x_{n-2})}{x_{n-1} - x_{n-2}}</math>.
If we compare Newton's method with the secant method, we see that Newton's method converges faster (order 2 against α ≈ 1.618). However, Newton's method requires the evaluation of both <math>f</math> and its derivative <math>f^\prime</math> at every step, while the secant method only requires the evaluation of <math>f</math>. Therefore, the secant method may occasionally be faster in practice. For instance, if we assume that evaluating <math>f</math> takes as much time as evaluating its derivative and we neglect all other costs, we can do two steps of the secant method (decreasing the logarithm of the error by a factor α² ≈ 2.6) for the same cost as one step of Newton's method (decreasing the logarithm of the error by a factor 2), so the secant method is faster. If, however, the derivative can be evaluated in parallel with the function itself, Newton's method becomes the faster of the two in elapsed time, even though it uses more function evaluations in total.
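
The sketch below runs the two methods side by side on <math>f(x) = x^2 - 612</math>; the hand-coded derivative, the starting values, and the number of iterations are illustrative choices. Each Newton step evaluates both <math>f</math> and <math>f^\prime</math>, while each secant step needs only one new evaluation of <math>f</math> when the previous values are reused.
<source lang=Matlab>
% Newton's method versus the secant method on f(x) = x^2 - 612 (sketch).
f  = @(x) x^2 - 612;
df = @(x) 2*x;                 % derivative, needed only by Newton's method
xn = 10;                       % Newton iterate
x0 = 10;  x1 = 30;             % secant iterates
f0 = f(x0);  f1 = f(x1);       % cached values, so each secant step needs one new f
for k = 1:6
    xn = xn - f(xn)/df(xn);               % Newton: one f and one f' evaluation
    x2 = x1 - f1*(x1 - x0)/(f1 - f0);     % secant: reuses the cached f0, f1
    x0 = x1;  f0 = f1;
    x1 = x2;  f1 = f(x1);                 % the single new evaluation of f
end
[xn, x1]                       % both approach sqrt(612) = 24.7386...
</source>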
 
==Generalizations==
 
[[Broyden's method]] is a generalization of the secant method to more than one dimension.
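
As a rough illustration only, the sketch below applies Broyden's method to a two-equation system; the example system, the starting point, and the initialization of the Jacobian approximation with the exact Jacobian at the starting point are assumptions made for this sketch. The derivative in Newton's method is replaced by a matrix <math>B</math> that is corrected by a secant-like rank-one update at each step.
<source lang=Matlab>
% Broyden's method sketch for a two-dimensional system (illustrative only).
F = @(v) [v(1)^2 + v(2)^2 - 2; v(1) - v(2)];   % example system with a root at (1, 1)
J = @(v) [2*v(1), 2*v(2); 1, -1];              % exact Jacobian, used only at the start
x = [2; 0];                                    % starting point
B = J(x);                                      % initial Jacobian approximation
for k = 1:25
    Fx = F(x);
    if norm(Fx) < 1e-10, break, end            % stop once the residual is small
    s = -B \ Fx;                               % quasi-Newton step
    xnew = x + s;
    y = F(xnew) - Fx;                          % change in F along the step
    B = B + ((y - B*s) * s') / (s'*s);         % secant-like rank-one update of B
    x = xnew;
end
x                                              % approaches the root [1; 1]
</source>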
 
==A computational example==
The secant method is applied to find a root of the function ''f''(''x'') = ''x''<sup>2</sup> − 612. Here is an implementation in the [[Matlab]] language. Since the positive root of ''f'' is <math>\sqrt{612} \approx 24.7386</math>, we expect the iteration to converge to approximately that value.
<source lang=Matlab>
f = @(x) x^2 - 612;      % function whose root is sought
x(1) = 10;               % first initial value
x(2) = 30;               % second initial value
for i = 3:7              % five secant iterations
    x(i) = x(i-1) - f(x(i-1))*(x(i-1) - x(i-2))/(f(x(i-1)) - f(x(i-2)));
end
root = x(7)              % final iterate, approximately 24.7386
</source>

The following graph shows the function ''f'' in red and the last secant line in bold blue. In the graph, the ''x''-intercept of the secant line seems to be a good approximation of the root of ''f''.

[[Image:Secant method example code result.svg|center]]
 
==Notes==
{{reflist}}
 
== See also ==
* [[False position method]]
 
==References==
* {{citation | last1=Kaw | first1=Autar | last2=Kalu | first2=Egwu | year=2008 | title=Numerical Methods with Applications | edition=1st | url=http://www.autarkaw.com/books/numericalmethods/index.html }}.
* {{cite book| title=Numerical analysis for applied science|first1=Myron B. |last1=Allen |first2=Eli L. |last2=Isaacson | pages=188–195| isbn=978-0-471-55266-6| year=1998| publisher=[[John Wiley & Sons]]| url=http://books.google.co.uk/books?id=PpB9cjOxQAQC}}
 
==External links==
* [https://s3.amazonaws.com/torkian/torkian/Site/Research/Entries/2008/2/28_Root-finding_algorithm_Java_Code_(_Secant%2C_Bisection%2C_Newton_).html  Java code By Behzad Torkian]
* [http://math.fullerton.edu/mathews/a2001/Animations/RootFinding/SecantMethod/SecantMethod.html Animations for the secant method]
* [http://numericalmethods.eng.usf.edu/topics/secant_method.html Secant Method] Notes, PPT, Mathcad, Maple, Mathematica, Matlab at [http://numericalmethods.eng.usf.edu Holistic Numerical Methods Institute]
* {{MathWorld|urlname=SecantMethod|title=Secant Method}}
* [http://math.fullerton.edu/mathews/n2003/SecantMethodMod.html Module for Secant Method by John H. Mathews]
* [http://catc.ac.ir/mazlumi/jscodes/secant.php Online root finding of a polynomial-Secant method] by Farhad Mazlumi
 
[[Category:Root-finding algorithms]]
