Capillary number: Difference between revisions

From formulasearchengine
en>ZéroBot
 
Changed imprecise term "shear viscosity" to technical term "dynamic viscosity".
In [[mathematics]], an '''invariant subspace''' of a [[linear mapping]]
 
:''T'' : ''V'' &rarr; ''V''
 
from some [[vector space]] ''V'' to itself is a [[linear subspace|subspace]] ''W'' of ''V'' such that ''T''(''W'') is contained in ''W''. An invariant subspace of ''T'' is also said to be ''' ''T'' invariant'''.
 
If ''W'' is ''T''-invariant, we can restrict ''T'' to ''W'' to arrive at a new linear mapping
 
:''T''|''W'' : ''W'' &rarr; ''W''.
 
Next we give a few immediate examples of invariant subspaces.
 
Certainly ''V'' itself, and the subspace {0}, are trivially invariant subspaces for every linear operator ''T'' : ''V'' → ''V''. For certain linear operators there is no ''non-trivial'' invariant subspace; consider for instance a [[rotation (mathematics)|rotation]] of a two-dimensional real vector space.
 
Let '''''v''''' be an [[eigenvector]] of ''T'', i.e. ''T'' '''''v''''' = λ'''''v'''''. Then ''W'' = [[linear span|span]] {'''''v'''''} is ''T'' invariant. As a consequence of the [[fundamental theorem of algebra]], every linear operator on a [[complex number|complex]] finite-[[dimension (vector space)|dimensional]] vector space with dimension at least 2 has an eigenvector. Therefore every such linear operator has a non-trivial invariant subspace. The fact that the complex numbers are [[algebraically closed]] is required here. Comparing with the previous example, one can see that the invariant subspaces of a linear transformation are dependent upon the underlying scalar field of ''V''.
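This eigenvector example can be verified concretely. The sketch below, in plain Python with matrices as nested lists, uses a 2×2 matrix chosen purely for illustration (it does not come from the text):

```python
def mat_vec(T, v):
    """Apply the matrix T (a list of rows) to the vector v."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in T]

# Illustrative 2x2 matrix with eigenvalues 2 and 3.
T = [[2, 1],
     [0, 3]]

v = [1, 1]           # eigenvector for the eigenvalue 3
Tv = mat_vec(T, v)
print(Tv)            # [3, 3] = 3 * v, so span{v} is T-invariant
```

Applying ''T'' to any multiple of '''''v''''' again lands in span{'''''v'''''}, which is exactly the invariance condition.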
 
An '''invariant vector''' ([[Fixed point (mathematics)|fixed point]] of ''T''), other than 0, spans an invariant subspace of dimension 1. An invariant subspace of dimension 1 will be acted on by ''T'' by a scalar, and consists of invariant vectors if and only if that scalar is 1. 
 
As the above examples indicate, the invariant subspaces of a given linear transformation ''T'' shed light on the structure of ''T''. When ''V'' is a finite dimensional vector space over an algebraically closed field, linear transformations acting on ''V'' are characterized (up to similarity) by the [[Jordan canonical form]], which decomposes ''V'' into invariant subspaces of ''T''. Many fundamental questions regarding ''T'' can be translated to questions about invariant subspaces of ''T''.
 
More generally, invariant subspaces are defined for sets of operators as subspaces invariant for each operator in the set. Let ''L''(''V'') denote the algebra of linear transformations on ''V'', and Lat(''T'') be the family of subspaces invariant under ''T'' ∈ ''L''(''V''). (The "Lat" notation refers to the fact that Lat(''T'') forms a [[lattice (order)|lattice]]; see discussion below.) Given a nonempty set Σ ⊂ ''L''(''V''), one considers the subspaces invariant under each ''T'' ∈ Σ. In symbols,
 
:<math>\mbox{Lat}(\Sigma) = \bigcap_{T \in \Sigma} \mbox{Lat}( T ) \;.</math>
 
For instance, it is clear that if Σ = ''L''(''V''), then  Lat(Σ) = { {0}, ''V''}.
 
Given a [[Group representation|representation]] of a group ''G'' on a vector space ''V'', we have a linear transformation ''T''(''g'') : ''V'' → ''V'' for every element ''g'' of ''G''. If a subspace ''W'' of ''V'' is invariant with respect to all these transformations, then it is a [[group representation|subrepresentation]] and the group ''G'' acts on ''W'' in a natural way.
 
As another example, let ''T'' ∈ ''L''(''V'') and Σ be the algebra generated by {1, ''T''}, where 1 is the identity operator. Then Lat(''T'') = Lat(Σ). Because ''T'' lies in Σ trivially, Lat(Σ) ⊂ Lat(''T''). On the other hand, Σ consists of polynomials in 1 and ''T'', therefore the reverse inclusion holds as well.
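The reverse inclusion rests on the fact that a ''T''-invariant subspace is invariant under every polynomial in ''T''. A quick numerical sanity check (the matrix, polynomial, and subspace below are illustrative choices, not from the text):

```python
def mat_vec(T, v):
    """Apply the matrix T (a list of rows) to the vector v."""
    return [sum(row[j] * v[j] for j in range(len(v))) for row in T]

T = [[2, 1],
     [0, 3]]
e1 = [1, 0]                      # span{e1} is T-invariant: T e1 = 2 e1

# Apply p(T) = T^2 + T to e1 step by step; each application stays in span{e1}.
Te1 = mat_vec(T, e1)             # [2, 0]
T2e1 = mat_vec(T, Te1)           # [4, 0]
pTe1 = [a + b for a, b in zip(T2e1, Te1)]
print(pTe1)                      # [6, 0], still in span{e1}
```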
 
== Matrix representation ==
 
Over a finite dimensional vector space every linear transformation ''T'' : ''V'' → ''V'' can be represented by a [[matrix (math)|matrix]] once a [[basis (linear algebra)|basis]] of ''V'' has been chosen.
 
Suppose now ''W'' is a ''T'' invariant subspace. Pick a basis ''C'' = {'''''v'''''<sub>1</sub>, ..., '''''v'''''<sub>''k''</sub>} of ''W'' and complete it to a basis ''B'' of ''V''. Then, with respect to this basis, the matrix representation of ''T'' takes the form:
 
:<math> T = \begin{bmatrix} T_{11} & T_{12} \\ 0 & T_{22} \end{bmatrix} </math>
 
where the upper-left block ''T''<sub>11</sub> is the restriction of ''T'' to ''W''.
 
In other words, given an invariant subspace ''W'' of ''T'', ''V'' can be decomposed into the [[direct sum of vector spaces|direct sum]]
 
:<math>V = W \oplus W'.</math>
 
Viewing ''T'' as an operator matrix
 
:<math>
T = \begin{bmatrix} T_{11} & T_{12} \\ T_{21} & T_{22} \end{bmatrix} : \begin{matrix}W \\ \oplus \\ W' \end{matrix} \rightarrow \begin{matrix}W \\ \oplus \\ W' \end{matrix},
</math>
 
it is clear that ''T''<sub>21</sub>: ''W'' → ''W' '' must be zero.
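The block-triangular form can be observed directly by a change of basis. In the sketch below (all matrices are illustrative choices of mine), the invariant subspace is the span of an eigenvector, the basis is completed by hand, and conjugation by the change-of-basis matrix exhibits the zero ''T''<sub>21</sub> block:

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

T = [[3, 1],
     [0, 2]]

# W = span{(1, -1)} is T-invariant: (1, -1) is an eigenvector for eigenvalue 2.
# Complete to a basis of V; the columns of S are (1, -1) and (0, 1).
S = [[1, 0],
     [-1, 1]]
S_inv = [[1, 0],      # inverse computed by hand for this 2x2 matrix
         [1, 1]]

T_new_basis = mat_mul(S_inv, mat_mul(T, S))
print(T_new_basis)    # [[2, 1], [0, 3]]: the T21 block is zero, T11 = [2] is T|W
```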
 
Determining whether a given subspace ''W'' is invariant under ''T'' is ostensibly a problem of geometric nature. Matrix representation allows one to phrase this problem algebraically. The [[projection operator]] ''P'' onto ''W'' is defined by
''P''(''w'' + ''w' '') = ''w'', where ''w'' ∈ ''W'' and ''w' '' ∈ ''W' ''. The projection ''P'' has matrix representation
 
:<math>
P = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} : \begin{matrix}W \\ \oplus \\ W' \end{matrix} \rightarrow \begin{matrix}W \\ \oplus \\ W' \end{matrix}.
</math>
 
A straightforward calculation shows that ''W'' = Ran ''P'', the range of ''P'', is invariant under ''T'' if and only if ''PTP'' = ''TP''. In other words, a subspace ''W'' being an element of Lat(''T'') is equivalent to the corresponding projection satisfying the relation ''PTP'' = ''TP''.
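The criterion ''PTP'' = ''TP'' is easy to test numerically. In this sketch (matrices chosen for illustration), the projection onto an invariant subspace satisfies the relation, while the projection onto a non-invariant subspace fails it:

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

T = [[3, 1],
     [0, 2]]

# P projects onto span{e1}, which is T-invariant (T e1 = 3 e1).
P = [[1, 0],
     [0, 0]]
print(mat_mul(P, mat_mul(T, P)) == mat_mul(T, P))   # True: PTP == TP

# Q projects onto span{e2}, which is NOT T-invariant (T e2 = (1, 2)).
Q = [[0, 0],
     [0, 1]]
print(mat_mul(Q, mat_mul(T, Q)) == mat_mul(T, Q))   # False
```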
 
If ''P'' is a projection (i.e. ''P''<sup>2</sup> = ''P''), so is 1 - ''P'', where 1 is the identity operator. It follows from the above that ''TP = PT'' if and only if both Ran ''P'' and Ran (1 - ''P'') are invariant under ''T''. In that case, ''T'' has matrix representation
 
:<math>
T = \begin{bmatrix} T_{11} & 0 \\ 0 & T_{22} \end{bmatrix} : \begin{matrix} \mbox{Ran}P \\ \oplus \\ \mbox{Ran}(1-P) \end{matrix} \rightarrow  \begin{matrix} \mbox{Ran}P \\ \oplus \\ \mbox{Ran}(1-P) \end{matrix} \;.
</math>
 
Colloquially, a projection that commutes with ''T'' "diagonalizes" ''T''.
 
== Invariant subspace problem ==
:{{main|Invariant subspace problem}}
 
The invariant subspace problem concerns the case where ''V'' is a separable [[Hilbert space]] over the [[complex number]]s, of dimension > 1, and ''T'' is a [[bounded operator]]. The problem is to decide whether every such ''T'' has a non-trivial, closed, invariant subspace. This problem is unsolved {{As of|2013|lc=on}}.
 
In the more general case where ''V'' is hypothesized to be a [[Banach space]], there is an example of an [[Invariant subspace problem|operator without an invariant subspace]] due to [[Per Enflo]] (1976).  A [[Invariant subspace problem|concrete example]] of an operator without an invariant subspace was produced in 1985 by [[Charles Read (mathematician)|Charles Read]].
 
== Invariant-subspace lattice ==
 
Given a nonempty Σ ⊂ ''L''(''V''), the subspaces invariant under each element of Σ form a [[lattice (order)|lattice]], sometimes called the '''invariant-subspace lattice''' of Σ and denoted by Lat(Σ).
 
The lattice operations are defined in a natural way: for a family Σ' ⊂ Lat(Σ), the ''meet'' operation is defined by
 
:<math>\bigwedge_{W \in \Sigma'} W = \bigcap_{W \in \Sigma'} W</math>
 
while the ''join'' operation is
 
:<math>\bigvee_{W \in \Sigma'} W = \mbox{span} \bigcup_{W \in \Sigma'} W \;.</math>
 
A minimal element in Lat(Σ) is said to be a '''minimal invariant subspace'''.
 
== Fundamental theorem of noncommutative algebra ==
 
Just as the fundamental theorem of algebra ensures that every linear transformation acting on a finite dimensional complex vector space has a nontrivial invariant subspace, the ''fundamental theorem of noncommutative algebra'' asserts that Lat(Σ) contains nontrivial elements for certain Σ.
 
'''Theorem (Burnside)''' Assume ''V'' is a complex vector space of finite dimension. For every proper subalgebra Σ of ''L''(''V''), Lat(Σ) contains a nontrivial element.
 
Burnside's theorem is of fundamental importance in linear algebra. One consequence is that every commuting family in ''L''(''V'') can be simultaneously upper-triangularized.
 
A nonempty Σ ⊂ ''L''(''V'') is said to be '''triangularizable''' if there exists a basis {''e''<sub>1</sub>...''e<sub>n</sub>''} of ''V'' such that
 
:<math>\mbox{span} \{ e_1, \cdots, e_k \} \in \mbox{Lat}(\Sigma) \quad \mbox{for } 1 \leq k \leq n \;.</math>
 
In other words, Σ is triangularizable if there exists a basis such that every element of Σ has an upper-triangular matrix representation in that basis. It follows from Burnside's theorem that every commutative algebra Σ in ''L''(''V'') is triangularizable. Hence every commuting family in ''L''(''V'') can be simultaneously upper-triangularized.
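As a minimal illustration of simultaneous triangularization (the two matrices below are my own choices, not from the text), a pair of commuting matrices shares a chain of invariant subspaces; here both are already upper triangular in the standard basis, so span{''e''<sub>1</sub>} lies in Lat(Σ):

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 1],
     [0, 1]]
B = [[2, 3],
     [0, 2]]

# A and B commute ...
print(mat_mul(A, B) == mat_mul(B, A))   # True
# ... and both are upper triangular in the standard basis, so
# span{e1} and span{e1, e2} = V form a common chain of invariant subspaces.
print(A[1][0] == 0 and B[1][0] == 0)    # True
```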
 
== Left ideals ==
 
If ''A'' is an algebra, one can define a [[regular representation|''left regular representation'']] Φ of ''A'' on itself by Φ(''a'')''b'' = ''ab''; the map Φ is a homomorphism from ''A'' to ''L''(''A''), the algebra of linear transformations on ''A''.
 
The invariant subspaces of Φ are precisely the left ideals of ''A''. A left ideal ''M'' of ''A'' gives a subrepresentation of ''A'' on ''M''.
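For a concrete instance (example mine, not from the text): in the algebra of 2×2 matrices, the set of matrices whose second column is zero is a left ideal, i.e. an invariant subspace of the left regular representation:

```python
def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# m has second column zero; a is an arbitrary element of the algebra.
m = [[5, 0],
     [7, 0]]
a = [[1, 2],
     [3, 4]]

am = mat_mul(a, m)
print(am)   # [[19, 0], [43, 0]]: the second column is still zero, so a*m stays in the ideal
```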
 
Let ''M'' be a left ideal of ''A'' and consider the quotient vector space ''A''/''M''. The left regular representation Φ then descends to a representation Φ' on ''A''/''M'': if [''b''] denotes an equivalence class in ''A''/''M'', then Φ'(''a'')[''b''] = [''ab'']. The kernel of the representation Φ' is the set {''a'' ∈ ''A''| ''ab'' ∈ ''M'' for all ''b'' ∈ ''A''}.
 
The representation Φ' is [[irreducible representation|irreducible]] if and only if ''M'' is a maximal left ideal, since a subspace ''V'' ⊂ ''A''/''M'' is invariant under {Φ'(''a'')| ''a'' ∈ ''A''} if and only if its preimage under the quotient map, ''V'' + ''M'', is a left ideal in ''A''.
 
==See also==
* [[Invariant manifold]]
 
==Bibliography==
* {{cite book
|author=Yuri A. Abramovich and [[Charalambos D. Aliprantis]]
|title=An Invitation to Operator Theory
|publisher=[http://www.ams.org/bookstore-getitem/item=GSM-50 American Mathematical Society]
|year=2002
|isbn=978-0-8218-2146-6}}
*{{cite book
|author=Beauzamy, Bernard
|title=Introduction to Operator Theory and Invariant Subspaces
|year=1988
|publisher=North Holland
}}
* {{cite book
|author=[[Per Enflo|Enflo, Per]] and Lomonosov, Victor
|chapter=Some aspects of the invariant subspace problem
|title=Handbook of the geometry of Banach spaces
|volume=I
|pages=533–559
|publisher=North-Holland
|location=Amsterdam|year=2001
}}
*{{cite book
|title=Invariant Subspaces of Matrices with Applications
|author=Israel Gohberg, Peter Lancaster, and Leiba Rodman
|edition=Reprint of the 1986 Wiley edition, with list of [[errata]] and new preface
|series=Classics in Applied Mathematics
|volume=51
|publisher=[http://www.ec-securehost.com/SIAM/CL51.html Society for Industrial and Applied Mathematics (SIAM)]
|year=2006
|pages=xxii+692
|isbn=978-0-89871-608-5
}}
* Yurii I. Lyubich. ''Introduction to the Theory of Banach Representations of Groups''. Translated from the 1985 Russian-language edition (Kharkov, Ukraine). Birkhäuser Verlag. 1988.
* {{cite book
|author=Heydar Radjavi and Peter Rosenthal
|title=Invariant Subspaces
|year=2003
|edition=Update of the 1973 Springer-Verlag edition
|isbn=0-486-42822-2
|publisher=[http://store.doverpublications.com/0486428222.html Dover]
}}
 
[[Category:Linear algebra]]
[[Category:Operator theory]]
[[Category:Representation theory]]

Revision as of 18:47, 31 December 2013
