{{for|other uses of "differential" in mathematics|Differential (mathematics)}}
{{Calculus |Differential}}

In [[calculus]], the '''differential''' represents the principal part of the change in a function ''y'' = ''f''(''x'') with respect to changes in the independent variable. The differential ''dy'' is defined by
:<math>dy = f'(x)\,dx,</math>
where <math>f'(x)</math> is the [[derivative]] of ''f'' with respect to ''x'', and ''dx'' is an additional real [[variable (mathematics)|variable]] (so that ''dy'' is a function of ''x'' and ''dx''). The notation is such that the equation

:<math>dy = \frac{dy}{dx}\, dx</math>

holds, where the derivative is represented in the [[Leibniz notation]] ''dy''/''dx'', and this is consistent with regarding the derivative as the quotient of the differentials. One also writes

:<math>df(x) = f'(x)\,dx.</math>

The precise meaning of the variables ''dy'' and ''dx'' depends on the context of the application and the required level of mathematical rigor. The domain of these variables may take on a particular geometrical significance if the differential is regarded as a particular [[differential form]], or analytical significance if the differential is regarded as a [[linear approximation]] to the increment of a function. In physical applications, the variables ''dx'' and ''dy'' are often constrained to be very small ("[[infinitesimal]]").
==History and usage==
The differential was first introduced via an intuitive or heuristic definition by [[Gottfried Wilhelm Leibniz]], who thought of the differential ''dy'' as an infinitely small (or [[infinitesimal]]) change in the value ''y'' of the function, corresponding to an infinitely small change ''dx'' in the function's argument ''x''. For that reason, the instantaneous rate of change of ''y'' with respect to ''x'', which is the value of the [[derivative]] of the function, is denoted by the fraction

: <math> \frac{dy}{dx} </math>

in what is called the [[Leibniz notation]] for derivatives. The quotient ''dy''/''dx'' is not infinitely small; rather it is a [[real number]].

The use of infinitesimals in this form was widely criticized, for instance in the famous pamphlet ''[[The Analyst]]'' by Bishop Berkeley. [[Augustin-Louis Cauchy]] ([[#CITEREFCauchy1823|1823]]) defined the differential without appeal to the atomism of Leibniz's infinitesimals.<ref>For a detailed historical account of the differential, see {{harvnb|Boyer|1959}}, especially page 275 for Cauchy's contribution on the subject. An abbreviated account appears in {{harvnb|Kline|1972|loc=Chapter 40}}.</ref><ref>Cauchy explicitly denied the possibility of actual infinitesimal and infinite quantities {{harv|Boyer|1959|pp=273–275}}, and took the radically different point of view that "a variable quantity becomes infinitely small when its numerical value decreases indefinitely in such a way as to converge to zero" ({{harvnb|Cauchy|1823|p=12}}; translation from {{harvnb|Boyer|1959|p=273}}).</ref> Instead, Cauchy, following [[Jean le Rond d'Alembert|d'Alembert]], inverted the logical order of Leibniz and his successors: the derivative itself became the fundamental object, defined as a [[limit (mathematics)|limit]] of difference quotients, and the differentials were then defined in terms of it. That is, one was free to ''define'' the differential ''dy'' by an expression
:<math>dy = f'(x)\,dx</math>
in which ''dy'' and ''dx'' are simply new variables taking finite real values,<ref>{{harvnb|Boyer|1959|p=275}}</ref> not fixed infinitesimals as they had been for Leibniz.<ref>{{harvnb|Boyer|1959|p=12}}: "The differentials as thus defined are only new ''variables'', and not fixed infinitesimals..."</ref>

According to {{harvtxt|Boyer|1959|p=12}}, Cauchy's approach was a significant logical improvement over the infinitesimal approach of Leibniz because, instead of invoking the metaphysical notion of infinitesimals, the quantities ''dy'' and ''dx'' could now be manipulated in exactly the same manner as any other real quantities in a meaningful way. Cauchy's overall conceptual approach to differentials remains the standard one in modern analytical treatments,<ref>{{harvnb|Courant|1937i|loc=II, §9}}: "Here we remark merely in passing that it is possible to use this approximate representation of the increment Δ''y'' by the linear expression ''hƒ''(''x'') to construct a logically satisfactory definition of a "differential", as was done by Cauchy in particular."</ref> although the final word on rigor, a fully modern notion of the limit, was ultimately due to [[Karl Weierstrass]].<ref>{{harvnb|Boyer|1959|p=284}}</ref>

In physical treatments, such as those applied to the theory of [[thermodynamics]], the infinitesimal view still prevails. {{harvtxt|Courant|John|1999|p=184}} reconcile the physical use of infinitesimal differentials with the mathematical impossibility of them as follows. The differentials represent finite non-zero values that are smaller than the degree of accuracy required for the particular purpose for which they are intended. Thus "physical infinitesimals" need not appeal to a corresponding mathematical infinitesimal in order to have a precise sense.

Following twentieth-century developments in [[mathematical analysis]] and [[differential geometry]], it became clear that the notion of the differential of a function could be extended in a variety of ways. In [[real analysis]], it is more desirable to deal directly with the differential as the principal part of the increment of a function. This leads directly to the notion that the differential of a function at a point is a [[linear functional]] of an increment Δ''x''. This approach allows the differential (as a linear map) to be developed for a variety of more sophisticated spaces, ultimately giving rise to such notions as the [[Fréchet derivative|Fréchet]] or [[Gâteaux derivative]]. Likewise, in [[differential geometry]], the differential of a function at a point is a linear function of a [[tangent vector]] (an "infinitely small displacement"), which exhibits it as a kind of one-form: the [[exterior derivative]] of the function. In [[non-standard calculus]], differentials are regarded as infinitesimals, which can themselves be put on a rigorous footing (see [[differential (infinitesimal)]]).
==Definition==

[[File:Sentido geometrico del diferencial de una funcion.png|thumb|The differential of a function ''ƒ''(''x'') at a point ''x''<sub>0</sub>.]]
The differential is defined in modern treatments of differential calculus as follows.<ref>See, for instance, the influential treatises of {{harvnb|Courant|1937i}}, {{harvnb|Kline|1977}}, {{harvnb|Goursat|1904}}, and {{harvnb|Hardy|1908}}. Tertiary sources for this definition include also {{harvnb|Tolstov|2001}} and {{harvnb|Ito|1993|loc=§106}}.</ref> The differential of a function ''f''(''x'') of a single real variable ''x'' is the function ''df'' of two independent real variables ''x'' and Δ''x'' given by

:<math>df(x, \Delta x) \stackrel{\rm{def}}{=} f'(x)\,\Delta x.</math>

One or both of the arguments may be suppressed, i.e., one may see ''df''(''x'') or simply ''df''. If ''y'' = ''f''(''x''), the differential may also be written as ''dy''. Since ''dx''(''x'', Δ''x'') = Δ''x'', it is conventional to write ''dx'' = Δ''x'', so that the following equality holds:

:<math>df(x) = f'(x) \, dx</math>
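For example, if ''f''(''x'') = ''x''<sup>2</sup>, then the differential of ''f'' is the function of ''x'' and Δ''x'' given by

:<math>df(x,\Delta x) = 2x\,\Delta x, \qquad\text{that is,}\quad df = 2x\,dx.</math>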
This notion of differential is broadly applicable when a [[linear approximation]] to a function is sought, in which the value of the increment Δ''x'' is small enough. More precisely, if ''f'' is a [[differentiable function]] at ''x'', then the difference in ''y''-values

:<math>\Delta y \stackrel{\rm{def}}{=} f(x+\Delta x) - f(x)</math>

satisfies

:<math>\Delta y = f'(x)\,\Delta x + \varepsilon = df(x) + \varepsilon\,</math>

where the error ε in the approximation satisfies ε/Δ''x'' → 0 as Δ''x'' → 0. In other words, one has the approximate identity

:<math>\Delta y \approx dy</math>

in which the error can be made as small as desired relative to Δ''x'' by constraining Δ''x'' to be sufficiently small; that is to say,
:<math>\frac{\Delta y - dy}{\Delta x}\to 0</math>
as Δ''x'' → 0. For this reason, the differential of a function is known as the [[principal part|principal (linear) part]] in the increment of a function: the differential is a [[linear function]] of the increment Δ''x'', and although the error ε may be nonlinear, it tends to zero rapidly as Δ''x'' tends to zero.
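For instance, if ''f''(''x'') = ''x''<sup>2</sup>, ''x'' = 3 and Δ''x'' = 0.1, then the differential gives
:<math>dy = f'(3)\,\Delta x = 6\times 0.1 = 0.6,</math>
while the actual increment is Δ''y'' = 3.1<sup>2</sup> − 3<sup>2</sup> = 0.61; the error ε = 0.01 satisfies ε/Δ''x'' = 0.1.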
==Differentials in several variables==
Following {{harvtxt|Goursat|1904|loc=I, §15}}, for functions of more than one independent variable,

: <math> y = f(x_1,\dots,x_n), \, </math>

the '''partial differential''' of ''y'' with respect to any one of the variables ''x''<sub>1</sub> is the principal part of the change in ''y'' resulting from a change ''dx''<sub>1</sub> in that one variable. The partial differential is therefore

: <math> \frac{\partial y}{\partial x_1} dx_1 </math>

involving the [[partial derivative]] of ''y'' with respect to ''x''<sub>1</sub>. The sum of the partial differentials with respect to all of the independent variables is the '''total differential'''

: <math> dy = \frac{\partial y}{\partial x_1} dx_1 + \cdots + \frac{\partial y}{\partial x_n} dx_n, </math>

which is the principal part of the change in ''y'' resulting from changes in the independent variables ''x''<sub>''i''</sub>.
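For example, if ''y'' = ''x''<sub>1</sub><sup>2</sup>''x''<sub>2</sub>, then the total differential is
:<math>dy = \frac{\partial y}{\partial x_1}\, dx_1 + \frac{\partial y}{\partial x_2}\, dx_2 = 2x_1 x_2\, dx_1 + x_1^2\, dx_2.</math>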
More precisely, in the context of multivariable calculus, following {{harvtxt|Courant|1937ii}}, if ''f'' is a differentiable function, then by the [[Fréchet derivative|definition of differentiability]], the increment

:<math>\begin{align}
\Delta y &{}\stackrel{\mathrm{def}}{=} f(x_1+\Delta x_1, \dots, x_n+\Delta x_n) - f(x_1,\dots,x_n)\\
&{}= \frac{\partial y}{\partial x_1} \Delta x_1 + \cdots + \frac{\partial y}{\partial x_n} \Delta x_n + \varepsilon_1\Delta x_1 +\cdots+\varepsilon_n\Delta x_n
\end{align}</math>

where the error terms ε<sub>''i''</sub> tend to zero as the increments Δ''x''<sub>''i''</sub> jointly tend to zero. The total differential is then rigorously defined as

:<math>dy = \frac{\partial y}{\partial x_1} \Delta x_1 + \cdots + \frac{\partial y}{\partial x_n} \Delta x_n.</math>

Since, with this definition,
:<math>dx_i(\Delta x_1,\dots,\Delta x_n) = \Delta x_i,</math>
one has
:<math>dy = \frac{\partial y}{\partial x_1}\,d x_1 + \cdots + \frac{\partial y}{\partial x_n}\,d x_n.</math>

As in the case of one variable, the approximate identity

:<math>dy \approx \Delta y</math>

holds, in which the total error can be made as small as desired relative to <math>\sqrt{\Delta x_1^2+\cdots +\Delta x_n^2}</math> by confining attention to sufficiently small increments.
==Higher-order differentials==
Higher-order differentials of a function ''y'' = ''f''(''x'') of a single variable ''x'' can be defined via:<ref>{{harvnb|Cauchy|1823}}. See also, for instance, {{harvnb|Goursat|1904|loc=I, §14}}.</ref>
:<math>d^2y = d(dy) = d(f'(x)dx) = f''(x)\,(dx)^2,</math>
and, in general,
:<math>d^ny = f^{(n)}(x)\,(dx)^n.</math>
Informally, this justifies Leibniz's notation for higher-order derivatives
:<math>f^{(n)}(x) = \frac{d^n f}{dx^n}.</math>
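For example, if ''y'' = ''x''<sup>3</sup>, then
:<math>d^2y = 6x\,(dx)^2, \qquad d^3y = 6\,(dx)^3, \qquad d^ny = 0\ \text{for}\ n > 3.</math>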
When the independent variable ''x'' itself is permitted to depend on other variables, then the expression becomes more complicated, as it must also include higher-order differentials of ''x'' itself. Thus, for instance,
:<math>
\begin{align}
d^2 y &= f''(x)\,(dx)^2 + f'(x)d^2x\\
d^3 y &= f'''(x)\, (dx)^3 + 3f''(x)dx\,d^2x + f'(x)d^3x
\end{align}</math>
and so forth.

Similar considerations apply to defining higher-order differentials of functions of several variables. For example, if ''f'' is a function of two variables ''x'' and ''y'', then
:<math>d^nf = \sum_{k=0}^n \binom{n}{k}\frac{\partial^n f}{\partial x^k \partial y^{n-k}}(dx)^k(dy)^{n-k},</math>
where <math>\scriptstyle{\binom{n}{k}}</math> is a [[binomial coefficient]]. In more variables, an analogous expression holds, but with an appropriate [[multinomial coefficient|multinomial]] expansion rather than a binomial expansion.<ref>{{harvnb|Goursat|1904|loc=I, §14}}</ref>
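For instance, if ''f''(''x'', ''y'') = ''x''<sup>2</sup>''y'' and the variables ''x'' and ''y'' are independent, then
:<math>d^2f = \frac{\partial^2 f}{\partial x^2}(dx)^2 + 2\frac{\partial^2 f}{\partial x\,\partial y}\,dx\,dy + \frac{\partial^2 f}{\partial y^2}(dy)^2 = 2y\,(dx)^2 + 4x\,dx\,dy.</math>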
Higher-order differentials in several variables also become more complicated when the independent variables are themselves allowed to depend on other variables. For instance, for a function ''f'' of ''x'' and ''y'' which are allowed to depend on auxiliary variables, one has
:<math>d^2f = \left(\frac{\partial^2f}{\partial x^2}(dx)^2+2\frac{\partial^2f}{\partial x\partial y}dx\,dy + \frac{\partial^2f}{\partial y^2}(dy)^2\right) + \frac{\partial f}{\partial x}d^2x + \frac{\partial f}{\partial y}d^2y.</math>

Because of this notational infelicity, the use of higher-order differentials was roundly criticized by {{harvtxt|Hadamard|1935}}, who concluded:
:Enfin, que signifie ou que représente l'égalité
::<math>d^2z = r\,dx^2 + 2s\,dx\,dy + t\,dy^2\,?</math>
:A mon avis, rien du tout.

That is: ''Finally, what is meant, or represented, by the equality [...]? In my opinion, nothing at all.'' In spite of this skepticism, higher-order differentials did emerge as an important tool in analysis.<ref>In particular to [[infinite dimensional holomorphy]] {{harv|Hille|Phillips|1974}} and [[numerical analysis]] via the calculus of [[finite differences]].</ref>

In these contexts, the ''n''th order differential of the function ''f'' applied to an increment Δ''x'' is defined by
:<math>d^nf(x,\Delta x) = \left.\frac{d^n}{dt^n} f(x+t\Delta x)\right|_{t=0}</math>
or an equivalent expression, such as
:<math>\lim_{t\to 0}\frac{\Delta^n_{t\Delta x} f}{t^n}</math>
where <math>\Delta^n_{t\Delta x} f</math> is an ''n''th [[forward difference]] with increment ''t''Δ''x''.
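For example, for ''f''(''x'') = ''x''<sup>3</sup> and ''n'' = 2, this definition recovers the earlier formula <math>d^2y = f''(x)\,(dx)^2</math>:
:<math>d^2f(x,\Delta x) = \left.\frac{d^2}{dt^2}(x+t\Delta x)^3\right|_{t=0} = 6x\,(\Delta x)^2.</math>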
This definition makes sense as well if ''f'' is a function of several variables (for simplicity taken here as a vector argument). Then the ''n''th differential defined in this way is a [[homogeneous function]] of degree ''n'' in the vector increment Δ''x''. Furthermore, the [[Taylor series]] of ''f'' at the point ''x'' is given by
:<math>f(x+\Delta x)\sim f(x) + df(x,\Delta x) + \frac{1}{2}d^2f(x,\Delta x) + \cdots + \frac{1}{n!}d^nf(x,\Delta x) + \cdots</math>
The higher-order [[Gâteaux derivative]] generalizes these considerations to infinite-dimensional spaces.

==Properties==
A number of properties of the differential follow in a straightforward manner from the corresponding properties of the derivative, partial derivative, and total derivative. These include:<ref>{{harvnb|Goursat|1904|loc=I, §17}}</ref>

* [[Linearity]]: For constants ''a'' and ''b'' and differentiable functions ''f'' and ''g'',
::<math>d(af+bg) = a\,df + b\,dg.</math>
* [[Product rule]]: For two differentiable functions ''f'' and ''g'',
::<math>d(fg) = f\,dg+g\,df.</math>

An operation ''d'' with these two properties is known in [[abstract algebra]] as a [[derivation (abstract algebra)|derivation]]. They imply the [[power rule]]
::<math> d( f^n ) = n f^{n-1} df. </math>
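For example, since <math>d(\sin x) = \cos x\,dx</math>, the product rule gives
::<math>d(x^2\sin x) = x^2\,d(\sin x) + \sin x\,d(x^2) = \left(x^2\cos x + 2x\sin x\right)dx.</math>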
In addition, various forms of the [[chain rule]] hold, in increasing level of generality:<ref>{{harvnb|Goursat|1904|loc=I, §§14,16}}</ref>

* If ''y'' = ''f''(''u'') is a differentiable function of the variable ''u'' and ''u'' = ''g''(''x'') is a differentiable function of ''x'', then
::<math>dy = f'(u)\,du = f'(g(x))g'(x)\,dx.</math>

* If ''y'' = ''f''(''x''<sub>1</sub>, ..., ''x''<sub>''n''</sub>) and all of the variables ''x''<sub>1</sub>, ..., ''x''<sub>''n''</sub> depend on another variable ''t'', then by the [[Chain rule#Chain rule for several variables|chain rule for partial derivatives]], one has

:: <math>\begin{align}
dy &= \frac{dy}{dt}dt \\
&= \frac{\partial y}{\partial x_1} dx_1 + \cdots + \frac{\partial y}{\partial x_n} dx_n\\
&= \frac{\partial y}{\partial x_1} \frac{dx_1}{dt}\,dt + \cdots + \frac{\partial y}{\partial x_n} \frac{dx_n}{dt}\,dt.
\end{align}</math>

:Heuristically, the chain rule for several variables can itself be understood by dividing through both sides of this equation by the infinitely small quantity ''dt''.

* More general analogous expressions hold, in which the intermediate variables ''x''<sub>''i''</sub> depend on more than one variable.
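For example, if ''y'' = ''x''<sub>1</sub>''x''<sub>2</sub>, where ''x''<sub>1</sub> = ''s'' + ''t'' and ''x''<sub>2</sub> = ''st'' each depend on the two variables ''s'' and ''t'', then
:<math>dy = x_2\,dx_1 + x_1\,dx_2 = st\,(ds+dt) + (s+t)(t\,ds+s\,dt) = (2st+t^2)\,ds + (s^2+2st)\,dt,</math>
which agrees with the partial derivatives of ''y'' = ''s''<sup>2</sup>''t'' + ''st''<sup>2</sup> with respect to ''s'' and ''t''.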
==General formulation==
{{See also|Fréchet derivative|Gâteaux derivative}}
A consistent notion of differential can be developed for a function ''f'': '''R'''<sup>''n''</sup> → '''R'''<sup>''m''</sup> between two [[Euclidean space]]s. Let '''x''', Δ'''x''' ∈ '''R'''<sup>''n''</sup> be a pair of [[Euclidean vector]]s. The increment in the function ''f'' is
:<math>\Delta f = f(\mathbf{x}+\Delta\mathbf{x}) - f(\mathbf{x}).</math>
If there exists an ''m'' × ''n'' [[matrix (mathematics)|matrix]] ''A'' such that
:<math>\Delta f = A\Delta\mathbf{x} + \|\Delta\mathbf{x}\|\boldsymbol{\varepsilon}</math>
in which the vector '''''ε''''' → 0 as Δ'''x''' → 0, then ''f'' is by definition differentiable at the point '''x'''. The matrix ''A'' is sometimes known as the [[Jacobian matrix]], and the [[linear transformation]] that associates to the increment Δ'''x''' ∈ '''R'''<sup>''n''</sup> the vector ''A''Δ'''x''' ∈ '''R'''<sup>''m''</sup> is, in this general setting, known as the differential ''df''('''x''') of ''f'' at the point '''x'''. This is precisely the [[Fréchet derivative]], and the same construction can be made to work for a function between any [[Banach space]]s.
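For example, if ''f'': '''R'''<sup>2</sup> → '''R'''<sup>2</sup> is given by ''f''(''x'', ''y'') = (''x''<sup>2</sup>, ''xy''), then its differential at (''x'', ''y'') is the linear map represented by the Jacobian matrix, since the increment decomposes into a linear part plus a remainder of smaller order than <math>\|\Delta\mathbf{x}\|</math>:
:<math>\Delta f = \begin{pmatrix}2x & 0\\ y & x\end{pmatrix}\begin{pmatrix}\Delta x\\ \Delta y\end{pmatrix} + \begin{pmatrix}(\Delta x)^2\\ \Delta x\,\Delta y\end{pmatrix}.</math>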
Another fruitful point of view is to define the differential directly as a kind of [[directional derivative]]:

:<math>df(\mathbf{x},\mathbf{h}) = \lim_{t\to 0}\frac{f(\mathbf{x}+t\mathbf{h})-f(\mathbf{x})}{t} = \left.\frac{d}{dt}f(\mathbf{x}+t\mathbf{h})\right|_{t=0},</math>

which is the approach already taken for defining higher-order differentials (and is most nearly the definition set forth by Cauchy). If ''t'' represents time and '''x''' position, then '''h''' represents a velocity instead of a displacement as we have heretofore regarded it. This yields yet another refinement of the notion of differential: that it should be a linear function of a kinematic velocity. The set of all velocities through a given point of space is known as the [[tangent space]], and so ''df'' gives a linear function on the tangent space: a [[differential form]]. With this interpretation, the differential of ''f'' is known as the [[exterior derivative]], and has broad application in [[differential geometry]] because the notion of velocities and the tangent space makes sense on any [[differentiable manifold]]. If, in addition, the output value of ''f'' also represents a position (in a Euclidean space), then a dimensional analysis confirms that the output value of ''df'' must be a velocity. If one treats the differential in this manner, then it is known as the [[pushforward (differential)|pushforward]] since it "pushes" velocities from a source space into velocities in a target space.
==Other approaches==
{{Main|Differential (infinitesimal)}}
Although the notion of having an infinitesimal increment ''dx'' is not well-defined in modern [[mathematical analysis]], a variety of techniques exist for defining the [[differential (infinitesimal)|infinitesimal differential]] so that the differential of a function can be handled in a manner that does not clash with the [[Leibniz notation]]. These include:

* Defining the differential as a kind of [[differential form]], specifically the [[exterior derivative]] of a function. The infinitesimal increments are then identified with vectors in the [[tangent space]] at a point. This approach is popular in [[differential geometry]] and related fields, because it readily generalizes to mappings between [[differentiable manifold]]s.
* Differentials as [[nilpotent]] elements of [[commutative ring]]s. This approach is popular in [[algebraic geometry]].<ref>{{Harvnb|Eisenbud|Harris|1998}}.</ref>
* Differentials in smooth models of set theory. This approach is known as [[synthetic differential geometry]] or [[smooth infinitesimal analysis]] and is closely related to the algebraic geometric approach, except that ideas from [[topos theory]] are used to ''hide'' the mechanisms by which nilpotent infinitesimals are introduced.<ref>See {{Harvnb|Kock|2006}} and {{Harvnb|Moerdijk|Reyes|1991}}.</ref>
* Differentials as infinitesimals in [[hyperreal number]] systems, which are extensions of the real numbers which contain invertible infinitesimals and infinitely large numbers. This is the approach of [[nonstandard analysis]] pioneered by [[Abraham Robinson]].<ref name="nonstd">See {{Harvnb|Robinson|1996}} and {{Harvnb|Keisler|1986}}.</ref>
== Examples and applications ==
Differentials may be effectively used in [[numerical analysis]] to study the propagation of experimental errors in a calculation, and thus the overall [[numerical stability]] of a problem {{harv|Courant|1937i}}. Suppose that the variable ''x'' represents the outcome of an experiment and ''y'' is the result of a numerical computation applied to ''x''. The question is to what extent errors in the measurement of ''x'' influence the outcome of the computation of ''y''. If ''x'' is known to within Δ''x'' of its true value, then [[Taylor's theorem]] gives the following estimate on the error Δ''y'' in the computation of ''y'':
:<math>\Delta y = f'(x)\Delta x + \frac{(\Delta x)^2}{2}f''(\xi)</math>
where ξ = ''x'' + θΔ''x'' for some 0 < θ < 1. If Δ''x'' is small, then the second-order term is negligible, so that Δ''y'' is, for practical purposes, well approximated by ''dy'' = ''f'''(''x'')Δ''x''.
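For example, suppose ''y'' = ''x''<sup>2</sup> and ''x'' is measured to be 10 with an error of at most Δ''x'' = 0.1. The exact error in ''y'' is then 10.1<sup>2</sup> − 10<sup>2</sup> = 2.01, of which all but the second-order term 0.01 is captured by the differential estimate
:<math>dy = f'(10)\,\Delta x = 20 \times 0.1 = 2.</math>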
The differential is often useful for rewriting a [[differential equation]]

: <math> \frac{dy}{dx} = g(x) </math>

in the form

: <math> dy = g(x)\,dx, </math>

in particular when one wants to [[separation of variables|separate the variables]].
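For example, the equation ''dy''/''dx'' = 2''xy'' may be rewritten as
: <math> \frac{dy}{y} = 2x\,dx, </math>
and each side can then be integrated separately, giving <math>\ln|y| = x^2 + C</math>, that is, <math>y = Ae^{x^2}</math>.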
==Notes==
<references/>

== References ==
*{{Citation | last1=Boyer | first1=Carl B. | author1-link=Carl Benjamin Boyer | title=The history of the calculus and its conceptual development | publisher=[[Dover Publications]] | location=New York | mr=0124178 | year=1959}}.
*{{citation|first=Augustin-Louis|last=Cauchy|authorlink=Augustin-Louis Cauchy|chapter=<!--Quatrième leçon: Différentialles des fonctions d'une seule variable-->|title=Résumé des Leçons données à l'Ecole royale polytechnique sur les applications du calcul infinitésimal|year=1823|url=http://math-doc.ujf-grenoble.fr/cgi-bin/oeitem?id=OE_CAUCHY_2_4_9_0}}.
*{{Citation | last1=Courant | first1=Richard |authorlink=Richard Courant | title=Differential and integral calculus. Vol. I | publisher=[[John Wiley & Sons]] | location=New York | series=Wiley Classics Library | isbn=978-0-471-60842-4 | mr=1009558 | year=1937i|publication-date=1988}}.
*{{Citation | last1=Courant | first1=Richard | authorlink=Richard Courant |title=Differential and integral calculus. Vol. II | publisher=[[John Wiley & Sons]] | location=New York | series=Wiley Classics Library | isbn=978-0-471-60840-0 | mr=1009559 | year=1937ii|publication-date=1988}}.
*{{Citation | last1=Courant | first1=Richard | authorlink1=Richard Courant| last2=John | first2=Fritz |authorlink2=Fritz John| title=Introduction to Calculus and Analysis Volume 1|series=Classics in Mathematics| publisher=[[Springer-Verlag]] | location=Berlin, New York | isbn=3-540-65058-X | year=1999 | mr=1746554 }}
* {{Citation| author1-link=David Eisenbud|first1=David|last1=Eisenbud|author2-link=Joe Harris (mathematician)|first2=Joe|last2=Harris| year = 1998 |title = The Geometry of Schemes| publisher = Springer-Verlag| isbn = 0-387-98637-5}}.
*{{Citation | last1=Fréchet | first1=Maurice | author1-link= Maurice Fréchet | title=La notion de différentielle dans l'analyse générale | mr=1509268 | year=1925 | journal=Annales Scientifiques de l'École Normale Supérieure. Troisième Série | issn=0012-9593 | volume=42 | pages=293–323}}.
*{{Citation | last1=Goursat | first1=Édouard | authorlink=Édouard Goursat|title=A course in mathematical analysis: Vol 1: Derivatives and differentials, definite integrals, expansion in series, applications to geometry| publisher=[[Dover Publications]] | location=New York | others= E. R. Hedrick | mr=0106155 | year=1904 | publication-date=1959|url=http://www.archive.org/details/coursemathanalys01gourrich}}.
*{{citation|last=Hadamard|first=Jacques|authorlink=Jacques Hadamard|title=La notion de différentiel dans l'enseignement|journal=Mathematical Gazette|volume=XIX|year=1935|issue=236|pages=341–342|jstor=3606323}}.
*{{Citation | last1=Hardy | first1=Godfrey Harold | author1-link=G. H. Hardy | title=A Course of Pure Mathematics | publisher=[[Cambridge University Press]] | isbn=978-0-521-09227-2 | year=1908}}.
*{{Citation | last1=Hille | first1=Einar | authorlink1=Einar Hille | last2=Phillips | first2=Ralph S. | authorlink2=Ralph Phillips (mathematician) | title=Functional analysis and semi-groups | publisher=[[American Mathematical Society]] | location=Providence, R.I. | mr=0423094 | year=1974}}.
*{{Citation | last1=Ito | first1=Kiyosi | title=Encyclopedic Dictionary of Mathematics | publisher=[[MIT Press]] | edition=2nd | isbn=978-0-262-59020-4 | year=1993}}.
*{{citation|chapter=Chapter 13: Differentials and the law of the mean|title=Calculus: An intuitive and physical approach|first=Morris|last=Kline|authorlink=Morris Kline|publisher=John Wiley and Sons|year=1977}}.
*{{Citation | last1=Kline | first1=Morris | author1-link=Morris Kline | title=Mathematical thought from ancient to modern times | year=1972 | publisher=[[Oxford University Press]] | edition=3rd | isbn=978-0-19-506136-9 | publication-date=1990}}
* {{Citation |author-link=Howard Jerome Keisler|first=H. Jerome|last=Keisler|title=Elementary Calculus: An Infinitesimal Approach|edition=2nd|year=1986|url=http://www.math.wisc.edu/~keisler/calc.html}}.
* {{Citation | first=Anders|last= Kock|url=http://home.imf.au.dk/kock/sdg99.pdf|title= Synthetic Differential Geometry|publisher= Cambridge University Press|edition= 2nd|year=2006}}.
* {{Citation | last1=Moerdijk|first1= I.|authorlink1=Ieke Moerdijk|last2=Reyes|first2=G.E.|title=Models for Smooth Infinitesimal Analysis|publisher= Springer-Verlag|year= 1991}}.
* {{Citation | last1=Robinson | first1=Abraham | author1-link=Abraham Robinson | title=Non-standard analysis | publisher=[[Princeton University Press]] | isbn=978-0-691-04490-3 | year=1996}}.
*{{springer|id=D/d031810|title=Differential|first=G.P.|last=Tolstov}}.

==External links==
*[http://demonstrations.wolfram.com/DifferentialOfAFunction/ Differential Of A Function] at Wolfram Demonstrations Project

{{DEFAULTSORT:Differential Of A Function}}

[[Category:Differential calculus]]
[[Category:Generalizations of the derivative]]
[[Category:Linear operators in calculus]]