In [[mathematics]], '''constraint counting''' is a crude but often useful way of counting the number of ''free functions'' needed to specify a solution to a [[partial differential equation]].
==Einstein strength==

Everyone knows that [[wikiquote:Albert Einstein|Albert Einstein said]] that a [[physical theory]] should be ''as simple as possible, but no simpler''. But not everyone knows that he had a quantitative idea in mind.{{fact|date=December 2007}}
Consider a second order partial differential equation in three variables, such as the two-dimensional [[wave equation]]

:<math> u_{tt} = u_{xx} + u_{yy}. </math>

It is often profitable to think of such an equation as a ''rewrite rule'' allowing us to rewrite arbitrary partial derivatives of the function <math>u(t,x,y)</math> using fewer partials than would be needed for an arbitrary function. For example, if <math>u</math> satisfies the wave equation, we can rewrite

:<math> u_{tyt} = u_{tty} = u_{xxy} + u_{yyy}, </math>

where in the first equality we appealed to the fact that ''partial derivatives commute''.

Einstein asked: how much ''redundancy'' can we eliminate in this fashion, for a given partial differential equation?
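The rewrite rule can be checked symbolically on any concrete solution. The following sketch verifies it for one plane wave; the use of the sympy library and the particular solution <math>u = \sin(3x+4y+5t)</math> are illustrative assumptions, not part of the original discussion:

```python
import sympy as sp

t, x, y = sp.symbols('t x y')
# A plane-wave solution of u_tt = u_xx + u_yy (note 5**2 == 3**2 + 4**2)
u = sp.sin(3*x + 4*y + 5*t)
assert sp.simplify(sp.diff(u, t, 2) - sp.diff(u, x, 2) - sp.diff(u, y, 2)) == 0

# The rewrite rule in action: u_tyt equals u_xxy + u_yyy for this solution
lhs = sp.diff(u, t, y, t)
rhs = sp.diff(u, x, x, y) + sp.diff(u, y, 3)
assert sp.simplify(lhs - rhs) == 0
```

Any other sufficiently smooth solution of the wave equation would serve equally well here.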
===Linear equations===

To answer this in the important special case of a [[linear]] partial differential equation, Einstein asked: how many of the partial derivatives of a solution can be [[linearly independent]]? It is convenient to record his answer using an [[ordinary generating function]]

:<math>s(\xi) = \sum_{k=0}^\infty s_k \xi^k, </math>

where <math>s_k</math> is a natural number counting the linearly independent partial derivatives of order ''k'' of an arbitrary function in the solution space of the equation in question.
Einstein observed that whenever a function satisfies some partial differential equation, we can use the corresponding rewrite rule to eliminate some of its partial derivatives, because ''further mixed partials have necessarily become linearly dependent''. Specifically, the power series counting the variety of ''arbitrary'' functions of three variables (no constraints) is

:<math>f(\xi) = \frac{1}{(1-\xi)^3} = 1 + 3 \xi + 6 \xi^2 + 10 \xi^3 + \dots,</math>

but the power series counting those in the solution space of some second order p.d.e. is

:<math>g(\xi) = \frac{1-\xi^2}{(1-\xi)^3} = 1 + 3 \xi + 5 \xi^2 + 7 \xi^3 + \dots, </math>

which records that we can eliminate ''one'' second order partial <math>u_{tt}</math>, ''three'' third order partials <math>u_{ttt}, \, u_{ttx}, \, u_{tty}</math>, and so forth.
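These coefficients can be reproduced with a computer algebra system; this sketch assumes the sympy library (the article itself specifies no software):

```python
import sympy as sp

xi = sp.symbols('xi')
f = 1/(1 - xi)**3            # o.g.f. for arbitrary functions of three variables
g = (1 - xi**2)/(1 - xi)**3  # o.g.f. for solutions of a second order p.d.e.

f_series = sp.series(f, xi, 0, 4).removeO()
g_series = sp.series(g, xi, 0, 4).removeO()
f_coeffs = [f_series.coeff(xi, k) for k in range(4)]
g_coeffs = [g_series.coeff(xi, k) for k in range(4)]
print(f_coeffs)  # [1, 3, 6, 10]
print(g_coeffs)  # [1, 3, 5, 7]
# number of partials eliminated at each order: one second order, three third order
print([a - b for a, b in zip(f_coeffs, g_coeffs)])  # [0, 0, 1, 3]
```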
More generally, the o.g.f. for an arbitrary function of ''n'' variables is

:<math>s[n](\xi) = \frac{1}{(1-\xi)^n} = 1 + n \, \xi + \binom{n+1}{2} \, \xi^2 + \binom{n+2}{3} \, \xi^3 + \dots, </math>

where the coefficient of <math>\xi^k</math> is the [[binomial coefficient]] <math>\binom{n+k-1}{k}</math>, counting the distinct partial derivatives of order ''k'', and the power series for a function required to satisfy a linear ''m''-th order equation is

:<math>g(\xi) = \frac{1-\xi^m}{(1-\xi)^n}. </math>
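The binomial coefficient pattern, and the factorization used in the next step, can also be verified symbolically; this is a sketch assuming the sympy library:

```python
import sympy as sp

xi = sp.symbols('xi')

# Coefficient of xi^k in 1/(1 - xi)^n is the binomial coefficient C(n+k-1, k)
for n in (2, 3, 4):
    expansion = sp.series(1/(1 - xi)**n, xi, 0, 6).removeO()
    for k in range(6):
        assert expansion.coeff(xi, k) == sp.binomial(n + k - 1, k)

# The second order, three variable o.g.f. factors as (1 + xi)/(1 - xi)^2
assert sp.simplify((1 - xi**2)/(1 - xi)**3 - (1 + xi)/(1 - xi)**2) == 0
```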
Next,

:<math> \frac{1-\xi^2}{(1-\xi)^3} = \frac{1 + \xi}{(1-\xi)^2}, </math>

which can be interpreted to predict that a solution to a second order linear p.d.e. in ''three'' variables is expressible by two ''freely chosen'' functions of ''two'' variables, one of which is used immediately, and the second only after taking a ''first derivative'', in order to express the solution.
===General solution of initial value problem===

To verify this prediction, recall the solution of the [[initial value problem]]

:<math>u_{tt} = u_{xx} + u_{yy}, \; u(0,x,y) = p(x,y), \; u_t(0,x,y) = q(x,y). </math>

Applying the [[Laplace transform]] <math>u(t,x,y) \mapsto [Lu](\omega,x,y)</math> with respect to time gives

:<math> -\omega^2 \, [Lu] + \omega \, p(x,y) + q(x,y) + [Lu]_{xx} + [Lu]_{yy} = 0. </math>

Applying the [[Fourier transform]] <math>[Lu](\omega,x,y) \mapsto [FLu](\omega,m,n)</math> to the two spatial variables gives
:<math> -\omega^2 \, [FLu] + \omega \, [Fp] + [Fq] - (m^2+n^2) \, [FLu] = 0, </math>

or

:<math>[FLu](\omega,m,n) = \frac{ \omega \, [Fp](m,n) + [Fq](m,n)}{\omega^2 + m^2 + n^2}. </math>
Applying the inverse Laplace transform gives

:<math> [Fu](t,m,n) = [Fp](m,n) \, \cos( \sqrt{m^2+n^2} \, t ) + \frac{ [Fq](m,n) \, \sin (\sqrt{m^2+n^2} \, t) }{\sqrt{m^2+n^2}}. </math>
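This transform-domain solution can be checked directly: it should satisfy the ordinary differential equation obtained from the wave equation after Fourier transforming the spatial variables, together with the transformed initial data. A sketch assuming the sympy library, with <math>[Fp]</math>, <math>[Fq]</math> abbreviated to symbols P, Q:

```python
import sympy as sp

t, m, n, P, Q = sp.symbols('t m n P Q', positive=True)
k = sp.sqrt(m**2 + n**2)

# Claimed inverse Laplace transform [Fu](t,m,n), with P = [Fp], Q = [Fq]
Fu = P*sp.cos(k*t) + Q*sp.sin(k*t)/k

# It satisfies the transformed wave equation F_tt = -(m^2 + n^2) F ...
assert sp.simplify(sp.diff(Fu, t, 2) + (m**2 + n**2)*Fu) == 0
# ... and matches the transformed initial data
assert Fu.subs(t, 0) == P
assert sp.diff(Fu, t).subs(t, 0) == Q
```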
Applying the inverse Fourier transform gives

:<math>u(t,x,y) = Q(t,x,y) + P_t(t,x,y), </math>

where

:<math>P(t,x,y) = \frac{1}{2\pi} \, \int_{(x-x')^2 + (y-y')^2 < t^2} \frac{p(x',y') \, dx' \, dy'}{ \left[ t^2-(x-x')^2-(y-y')^2 \right]^{1/2}}, </math>

:<math>Q(t,x,y) = \frac{1}{2\pi} \, \int_{(x-x')^2 + (y-y')^2 < t^2} \frac{q(x',y') \, dx' \, dy'}{ \left[ t^2-(x-x')^2-(y-y')^2 \right]^{1/2}}. </math>

Here, ''p'', ''q'' are arbitrary (sufficiently smooth) functions of two variables, so (due to their modest time dependence) the integrals ''P'', ''Q'' also count as "freely chosen" functions of two variables; as promised, one of them is differentiated once before being added to the other to express the general solution of the initial value problem for the two-dimensional wave equation.
===Quasilinear equations===

In the case of a nonlinear equation, it will only rarely be possible to obtain the general solution in closed form. However, if the equation is ''quasilinear'' (linear in the highest order derivatives), then we can still obtain approximate information similar to the above: specifying a member of the solution space will be, "modulo nonlinear quibbles", equivalent to specifying a certain number of functions in a smaller number of variables. The number of these functions is the ''Einstein strength'' of the p.d.e. In the simple example above, the strength is two, although in this case we were able to obtain more precise information.
==References==

*{{cite journal | author=Siklos, S. T. C. | title=Counting solutions of Einstein's equation | journal=Class. Quant. Grav. | year=1996 | volume=13 | pages=1931–1948 | doi=10.1088/0264-9381/13/7/021 | issue=7}} Application of constraint counting to Riemannian geometry and to general relativity.

[[Category:Combinatorics]]
[[Category:Partial differential equations]]
[[Category:Riemannian geometry]]