# Theorems and definitions in linear algebra

## Vector spaces

A vector space (or linear space) V over a field F consists of a set on which two operations (called addition and scalar multiplication, respectively) are defined so that for each pair of elements x, y in V there is a unique element x + y in V, and for each element a in F and each element x in V there is a unique element ax in V, such that the following conditions hold.

### Subspaces

A subspace W of a vector space V over a field F is a subset of V that is closed under addition and scalar multiplication. That is, for all x, y in W and any c in F, ${\displaystyle cx+y}$ is in W.

## Linear transformations and matrices

P.S. Background notions used later in this chapter: coefficients of differential equations, differentiability of complex functions, vector spaces of functions, differential operators, auxiliary polynomials, powers of complex numbers, and the exponential function.

### ${\displaystyle {\color {Blue}~2.1}}$ N(T) and R(T) are subspaces

Let V and W be vector spaces and T: V→W be linear. Then N(T) and R(T) are subspaces of V and W, respectively.

### ${\displaystyle {\color {Blue}~2.2}}$ R(T)= span of T(basis in V)

Let V and W be vector spaces, and let T: V→W be linear. If ${\displaystyle \beta =\{v_{1},v_{2},\ldots ,v_{n}\}}$ is a basis for V, then

${\displaystyle \mathrm {R} (T)=\mathrm {span} (T(\beta ))=\mathrm {span} (\{T(v_{1}),T(v_{2}),\ldots ,T(v_{n})\})}$.

### ${\displaystyle {\color {Blue}~2.3}}$ Dimension theorem

Let V and W be vector spaces, and let T: V → W be linear. If V is finite-dimensional, then

${\displaystyle \mathrm {nullity} (T)+\mathrm {rank} (T)=\dim(V).}$
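As a sanity check, the dimension theorem can be verified on a concrete matrix. A minimal Python sketch (the matrix A is a hypothetical example standing in for T, and the rank is computed by Gaussian elimination over the rationals):

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination over the rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0                                    # next pivot row
    for c in range(len(m[0])):
        pivot = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pivot is None:
            continue                         # no pivot in this column
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# hypothetical T = L_A on F^3; the third row is row1 + row2, so rank is 2
A = [[1, 0, 1],
     [0, 1, 1],
     [1, 1, 2]]
# (1, 1, -1) lies in N(L_A), so nullity is at least 1:
assert all(sum(row[j] * v for j, v in enumerate([1, 1, -1])) == 0 for row in A)
print(rank(A))    # 2, so nullity(T) = 3 - 2 = 1 and rank + nullity = dim(F^3)
```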

### ${\displaystyle {\color {Blue}~2.4}}$ one-to-one ⇔ N(T) = {0}

Let V and W be vector spaces, and let T: V→W be linear. Then T is one-to-one if and only if N(T)={0}.

### ${\displaystyle {\color {Blue}~2.5}}$ one-to-one ⇔ onto ⇔ rank(T) = dim(V)

Let V and W be vector spaces of equal (finite) dimension, and let T: V→W be linear. Then the following are equivalent.

(a) T is one-to-one.
(b) T is onto.
(c) rank(T) = dim(V).

### ${\displaystyle {\color {Blue}~2.6}}$ ∀ ${\displaystyle \{w_{1},w_{2},\ldots ,w_{n}\}}$ ∃ exactly one T (defined on a basis)

Let V and W be vector spaces over F, and suppose that ${\displaystyle \{v_{1},v_{2},\ldots ,v_{n}\}}$ is a basis for V. For ${\displaystyle w_{1},w_{2},\ldots ,w_{n}}$ in W, there exists exactly one linear transformation T: V→W such that ${\displaystyle \mathrm {T} (v_{i})=w_{i}}$ for ${\displaystyle i=1,2,\ldots ,n.}$
Corollary. Let V and W be vector spaces, and suppose that V has a finite basis ${\displaystyle {v_{1},v_{2},\ldots ,v_{n}}}$. If U, T: V→W are linear and ${\displaystyle U(v_{i})=T(v_{i})}$ for ${\displaystyle i=1,2,\ldots ,n,}$ then U=T.

### ${\displaystyle {\color {Blue}~2.7}}$ ${\displaystyle {\mathcal {L}}}$(V, W) is a vector space

Let V and W be vector spaces over a field F, and let T, U: V→W be linear.

(a) For all ${\displaystyle a}$∈F, ${\displaystyle a\mathrm {T} +\mathrm {U} }$ is linear.
(b) Using the operations of addition and scalar multiplication in the preceding definition, the collection of all linear transformations from V to W is a vector space over F.

### ${\displaystyle {\color {Blue}~2.8}}$ linearity of matrix representation of linear transformation

Let V and W be finite-dimensional vector spaces with ordered bases β and γ, respectively, and let T, U: V→W be linear transformations. Then

(a)${\displaystyle [T+U]_{\beta }^{\gamma }=[T]_{\beta }^{\gamma }+[U]_{\beta }^{\gamma }}$ and
(b)${\displaystyle [aT]_{\beta }^{\gamma }=a[T]_{\beta }^{\gamma }}$ for all scalars ${\displaystyle a}$.

### ${\displaystyle {\color {Blue}~2.9}}$ composition law of linear operators

Let V, W, and Z be vector spaces over the same field F, and let T: V→W and U: W→Z be linear. Then UT: V→Z is linear.

### ${\displaystyle {\color {Blue}~2.10}}$ laws of linear operators

Let V be a vector space, and let T, U1, U2 ∈ ${\displaystyle {\mathcal {L}}}$(V). Then
(a) T(U1+U2)=TU1+TU2 and (U1+U2)T=U1T+U2T
(b) T(U1U2)=(TU1)U2
(c) TI=IT=T
(d) ${\displaystyle a}$(U1U2)=(${\displaystyle a}$U1)U2=U1(${\displaystyle a}$U2) for all scalars ${\displaystyle a}$.

### ${\displaystyle {\color {Blue}~2.11}}$ [UT]αγ=[U]βγ[T]αβ

Let V, W, and Z be finite-dimensional vector spaces with ordered bases α, β, and γ, respectively. Let T: V→W and U: W→Z be linear transformations. Then

${\displaystyle [UT]_{\alpha }^{\gamma }=[U]_{\beta }^{\gamma }[T]_{\alpha }^{\beta }}$.

Corollary. Let V be a finite-dimensional vector space with an ordered basis β. Let T,U∈${\displaystyle {\mathcal {L}}}$(V). Then [UT]β=[U]β[T]β.

### ${\displaystyle {\color {Blue}~2.12}}$ law of matrix

Let A be an m×n matrix, B and C be n×p matrices, and D and E be q×m matrices. Then

(a) A(B+C)=AB+AC and (D+E)A=DA+EA.
(b) ${\displaystyle a}$(AB)=(${\displaystyle a}$A)B=A(${\displaystyle a}$B) for any scalar ${\displaystyle a}$.
(c) ImA=A=AIn.
(d) If V is an n-dimensional vector space with an ordered basis β, then [IV]β=In.

Corollary. Let A be an m×n matrix, B1,B2,...,Bk be n×p matrices, C1,C2,...,Ck be q×m matrices, and ${\displaystyle a_{1},a_{2},\ldots ,a_{k}}$ be scalars. Then

${\displaystyle A{\Bigg (}\sum _{i=1}^{k}a_{i}B_{i}{\Bigg )}=\sum _{i=1}^{k}a_{i}AB_{i}}$

and

${\displaystyle {\Bigg (}\sum _{i=1}^{k}a_{i}C_{i}{\Bigg )}A=\sum _{i=1}^{k}a_{i}C_{i}A}$.

### ${\displaystyle {\color {Blue}~2.13}}$ law of column multiplication

Let A be an m×n matrix and B be an n×p matrix. For each ${\displaystyle j(1\leq j\leq p)}$ let ${\displaystyle u_{j}}$ and ${\displaystyle v_{j}}$ denote the jth columns of AB and B, respectively. Then
(a) ${\displaystyle u_{j}=Av_{j}}$
(b) ${\displaystyle v_{j}=Be_{j}}$, where ${\displaystyle e_{j}}$ is the jth standard vector of Fp.
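The column rule (a) can be checked directly. A small Python sketch with hypothetical matrices and a plain-list matrix product:

```python
def matmul(A, B):
    """Entry (i, j) of AB is the sum over k of A[i][k] * B[k][j]."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

A = [[1, 2], [3, 4], [5, 6]]           # 3x2, hypothetical
B = [[7, 8, 9], [10, 11, 12]]          # 2x3
AB = matmul(A, B)
for j in range(3):
    col_AB = [row[j] for row in AB]    # u_j, the jth column of AB
    v_j = [row[j] for row in B]        # v_j, the jth column of B
    assert col_AB == matvec(A, v_j)    # (a): u_j = A v_j
print("u_j = A v_j holds for every column j")
```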

### ${\displaystyle {\color {Blue}~2.14}}$ [T(u)]γ=[T]βγ[u]β

Let V and W be finite-dimensional vector spaces having ordered bases β and γ, respectively, and let T: V→W be linear. Then, for each u ∈ V, we have

${\displaystyle [T(u)]_{\gamma }=[T]_{\beta }^{\gamma }[u]_{\beta }}$.

### ${\displaystyle {\color {Blue}~2.15}}$ laws of LA

Let A be an m×n matrix with entries from F. Then the left-multiplication transformation LA: Fn→Fm is linear. Furthermore, if B is any other m×n matrix (with entries from F) and β and γ are the standard ordered bases for Fn and Fm, respectively, then we have the following properties.
(a) ${\displaystyle [L_{A}]_{\beta }^{\gamma }=A}$.
(b) LA=LB if and only if A=B.
(c) LA+B=LA+LB and L${\displaystyle a}$A=${\displaystyle a}$LA for all ${\displaystyle a}$∈F.
(d) If T:Fn→Fm is linear, then there exists a unique m×n matrix C such that T=LC. In fact, ${\displaystyle \mathrm {C} =[T]_{\beta }^{\gamma }}$.
(e) If E is an n×p matrix, then LAE=LALE.
(f) If m=n, then ${\displaystyle L_{I_{n}}=I_{F^{n}}}$.

### ${\displaystyle {\color {Blue}~2.16}}$ A(BC)=(AB)C

Let A,B, and C be matrices such that A(BC) is defined. Then A(BC)=(AB)C; that is, matrix multiplication is associative.
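Associativity is easy to confirm on concrete matrices. A short Python sketch (the rectangular matrices are hypothetical examples):

```python
def matmul(A, B):
    """Entry (i, j) of AB is the sum over k of A[i][k] * B[k][j]."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# hypothetical matrices with shapes 2x3, 3x2, 2x2, so A(BC) is defined
A = [[1, 2, 3], [4, 5, 6]]
B = [[1, 0], [0, 1], [1, 1]]
C = [[2, 1], [0, 3]]
left = matmul(matmul(A, B), C)     # (AB)C
right = matmul(A, matmul(B, C))    # A(BC)
assert left == right
print(left)                        # [[8, 19], [20, 43]]
```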

### ${\displaystyle {\color {Blue}~2.17}}$ T-1 is linear

Let V and W be vector spaces, and let T:V→W be linear and invertible. Then T−1: W →V is linear.

### ${\displaystyle {\color {Blue}~2.18}}$ [T-1]γβ=([T]βγ)-1

Let V and W be finite-dimensional vector spaces with ordered bases β and γ, respectively. Let T:V→W be linear. Then T is invertible if and only if ${\displaystyle [T]_{\beta }^{\gamma }}$ is invertible. Furthermore, ${\displaystyle [T^{-1}]_{\gamma }^{\beta }=([T]_{\beta }^{\gamma })^{-1}}$

Lemma. Let T be an invertible linear transformation from V to W. Then V is finite-dimensional if and only if W is finite-dimensional. In this case, dim(V)=dim(W).

Corollary 1. Let V be a finite-dimensional vector space with an ordered basis β, and let T:V→V be linear. Then T is invertible if and only if [T]β is invertible. Furthermore, [T−1]β=([T]β)−1.

Corollary 2. Let A be an n×n matrix. Then A is invertible if and only if LA is invertible. Furthermore, (LA)−1=LA−1.

### ${\displaystyle {\color {Blue}~2.19}}$ V is isomorphic to W ⇔ dim(V)=dim(W)

Let V and W be finite-dimensional vector spaces (over the same field). Then V is isomorphic to W if and only if dim(V)=dim(W).

Corollary. Let V be a vector space over F. Then V is isomorphic to Fn if and only if dim(V)=n.

### ${\displaystyle {\color {Blue}~2.20}}$ Φ: ${\displaystyle {\mathcal {L}}}$(V,W)→Mm×n(F) is an isomorphism

Let V and W be finite-dimensional vector spaces over F of dimensions n and m, respectively, and let β and γ be ordered bases for V and W, respectively. Then the function ${\displaystyle ~\Phi }$: ${\displaystyle {\mathcal {L}}}$(V,W)→Mm×n(F), defined by ${\displaystyle ~\Phi (T)=[T]_{\beta }^{\gamma }}$ for T∈${\displaystyle {\mathcal {L}}}$(V,W), is an isomorphism.

Corollary. Let V and W be finite-dimensional vector spaces of dimension n and m, respectively. Then ${\displaystyle {\mathcal {L}}}$(V,W) is finite-dimensional of dimension mn.

### ${\displaystyle {\color {Blue}~2.21}}$ Φβ is an isomorphism

For any finite-dimensional vector space V with ordered basis β, Φβ is an isomorphism.

### ${\displaystyle {\color {Blue}~2.22}}$ change of coordinate matrix Q = [IV]β'β

Let β and β' be two ordered bases for a finite-dimensional vector space V, and let ${\displaystyle Q=[I_{V}]_{\beta '}^{\beta }}$. Then
(a) ${\displaystyle Q}$ is invertible.
(b) For any ${\displaystyle v\in }$ V, ${\displaystyle ~[v]_{\beta }=Q[v]_{\beta '}}$.

### ${\displaystyle {\color {Blue}~2.23}}$ [T]β'=Q-1[T]βQ

Let T be a linear operator on a finite-dimensional vector space V, and let β and β' be two ordered bases for V. Suppose that Q is the change of coordinate matrix that changes β'-coordinates into β-coordinates. Then

${\displaystyle ~[T]_{\beta '}=Q^{-1}[T]_{\beta }Q}$.

Corollary. Let A∈Mn×n(F), and let γ be an ordered basis for Fn. Then [LA]γ=Q−1AQ, where Q is the n×n matrix whose jth column is the jth vector of γ.
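A concrete instance of the corollary: take a hypothetical 2×2 matrix A and let γ consist of eigenvectors of A, so that Q−1AQ comes out diagonal. A Python sketch in exact rational arithmetic:

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b, c, d = M[0][0], M[0][1], M[1][0], M[1][1]
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [0, 3]]           # hypothetical operator L_A on F^2
Q = [[1, 1], [0, 1]]           # columns: gamma = {(1,0), (1,1)}, eigenvectors of A
LA_gamma = matmul(inv2(Q), matmul(A, Q))
print([[int(x) for x in row] for row in LA_gamma])   # [[2, 0], [0, 3]]
```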

### ${\displaystyle {\color {Blue}~2.27}}$ p(D)(x)=0 ⇒ x(k) exists for every k∈N (solutions lie in C∞)

Any solution to a homogeneous linear differential equation with constant coefficients has derivatives of all orders; that is, if ${\displaystyle x}$ is a solution to such an equation, then ${\displaystyle x^{(k)}}$ exists for every positive integer k.

### ${\displaystyle {\color {Blue}~2.28}}$ {solutions}= N(p(D))

The set of all solutions to a homogeneous linear differential equation with constant coefficients coincides with the null space of p(D), where p(t) is the auxiliary polynomial of the equation.

Corollary. The set of all solutions to a homogeneous linear differential equation with constant coefficients is a subspace of ${\displaystyle \mathrm {C} ^{\infty }}$.

### ${\displaystyle {\color {Blue}~2.29}}$ derivative of exponential function

For any exponential function ${\displaystyle f(t)=e^{ct},f'(t)=ce^{ct}}$.

### ${\displaystyle {\color {Blue}~2.30}}$ {e-a0t} is a basis of N(D+a0I)

The solution space of the differential equation

${\displaystyle y'+a_{0}y=0}$

is of dimension 1 and has ${\displaystyle \{e^{-a_{0}t}\}}$ as a basis.

Corollary. For any complex number c, the null space of the differential operator D-cI has {${\displaystyle e^{ct}}$} as a basis.

### ${\displaystyle {\color {Blue}~2.31}}$ ${\displaystyle e^{ct}}$ is a solution

Let p(t) be the auxiliary polynomial for a homogeneous linear differential equation with constant coefficients. For any complex number c, if c is a zero of p(t), then ${\displaystyle e^{ct}}$ is a solution to the differential equation.

### ${\displaystyle {\color {Blue}~2.32}}$ dim(N(p(D)))=n

For any differential operator p(D) of order n, the null space of p(D) is an n-dimensional subspace of ${\displaystyle \mathrm {C} ^{\infty }}$.

Lemma 1. The differential operator D−cI: ${\displaystyle \mathrm {C} ^{\infty }}$→${\displaystyle \mathrm {C} ^{\infty }}$ is onto for any complex number c.

Lemma 2. Let V be a vector space, and suppose that T and U are linear operators on V such that U is onto and the null spaces of T and U are finite-dimensional. Then the null space of TU is finite-dimensional, and

dim(N(TU))=dim(N(T))+dim(N(U)).

Corollary. The solution space of any nth-order homogeneous linear differential equation with constant coefficients is an n-dimensional subspace of ${\displaystyle \mathrm {C} ^{\infty }}$.

### ${\displaystyle {\color {Blue}~2.33}}$ the functions ecit are linearly independent (ci distinct)

Given n distinct complex numbers ${\displaystyle c_{1},c_{2},\ldots ,c_{n}}$, the set of exponential functions ${\displaystyle \{e^{c_{1}t},e^{c_{2}t},\ldots ,e^{c_{n}t}\}}$ is linearly independent.

Corollary. For any nth-order homogeneous linear differential equation with constant coefficients, if the auxiliary polynomial has n distinct zeros ${\displaystyle c_{1},c_{2},\ldots ,c_{n}}$, then ${\displaystyle \{e^{c_{1}t},e^{c_{2}t},\ldots ,e^{c_{n}t}\}}$ is a basis for the solution space of the differential equation.

Lemma. For a given complex number c and positive integer n, suppose that (t−c)n is the auxiliary polynomial of a homogeneous linear differential equation with constant coefficients. Then the set

${\displaystyle \beta =\{e^{ct},te^{ct},\ldots ,t^{n-1}e^{ct}\}}$

is a basis for the solution space of the equation.

### ${\displaystyle {\color {Blue}~2.34}}$ general solution of homogeneous linear differential equation

Given a homogeneous linear differential equation with constant coefficients and auxiliary polynomial

${\displaystyle (t-c_{1})^{n_{1}}(t-c_{2})^{n_{2}}\cdots (t-c_{k})^{n_{k}},}$

where ${\displaystyle n_{1},n_{2},\ldots ,n_{k}}$ are positive integers and ${\displaystyle c_{1},c_{2},\ldots ,c_{k}}$ are distinct complex numbers, the following set is a basis for the solution space of the equation:

${\displaystyle \{e^{c_{1}t},te^{c_{1}t},\ldots ,t^{n_{1}-1}e^{c_{1}t},\ldots ,e^{c_{k}t},te^{c_{k}t},\ldots ,t^{n_{k}-1}e^{c_{k}t}\}}$.

## Elementary matrix operations and systems of linear equations

### Elementary matrix operations

1. Any two rows of a matrix may be interchanged.
2. Any row may be multiplied by a nonzero scalar (positive or negative).
3. A multiple of one row may be added to (or subtracted from) another row.

### Rank of a matrix

The rank of a matrix A is the number of pivot columns in the reduced row echelon form of A.

## Determinants

If

${\displaystyle A={\begin{pmatrix}a&b\\c&d\\\end{pmatrix}}}$

is a 2×2 matrix with entries from a field F, then we define the determinant of A, denoted det(A) or |A|, to be the scalar ${\displaystyle ad-bc}$.

• Theorem 1: the determinant is a linear function of each row.
• Theorem 2: nonzero determinant ⇔ invertible matrix

Theorem 1: The function det: M2×2(F) → F is a linear function of each row of a 2×2 matrix when the other row is held fixed. That is, if ${\displaystyle u,v,}$ and ${\displaystyle w}$ are in F² and ${\displaystyle k}$ is a scalar, then

${\displaystyle \det {\begin{pmatrix}u+kv\\w\\\end{pmatrix}}=\det {\begin{pmatrix}u\\w\\\end{pmatrix}}+k\det {\begin{pmatrix}v\\w\\\end{pmatrix}}}$

and

${\displaystyle \det {\begin{pmatrix}w\\u+kv\\\end{pmatrix}}=\det {\begin{pmatrix}w\\u\\\end{pmatrix}}+k\det {\begin{pmatrix}w\\v\\\end{pmatrix}}}$

Theorem 2: Let A ${\displaystyle \in }$ M2×2(F). Then the determinant of A is nonzero if and only if A is invertible. Moreover, if A is invertible, then

${\displaystyle A^{-1}={\frac {1}{\det(A)}}{\begin{pmatrix}A_{22}&-A_{12}\\-A_{21}&A_{11}\\\end{pmatrix}}}$
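Theorem 2's formula translates directly into code. A Python sketch in exact rational arithmetic (`det2`, `inv2`, and the example matrix are names introduced here for illustration):

```python
from fractions import Fraction

def det2(A):
    """Determinant ad - bc of a 2x2 matrix."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def inv2(A):
    """Inverse from Theorem 2; defined exactly when det2(A) != 0."""
    d = Fraction(det2(A))
    if d == 0:
        raise ValueError("matrix is not invertible")
    return [[A[1][1] / d, -A[0][1] / d],
            [-A[1][0] / d, A[0][0] / d]]

def matmul2(X, Y):
    return [[X[0][0] * Y[0][0] + X[0][1] * Y[1][0], X[0][0] * Y[0][1] + X[0][1] * Y[1][1]],
            [X[1][0] * Y[0][0] + X[1][1] * Y[1][0], X[1][0] * Y[0][1] + X[1][1] * Y[1][1]]]

A = [[1, 2], [3, 4]]                        # hypothetical example; det = -2 != 0
assert matmul2(A, inv2(A)) == [[1, 0], [0, 1]]   # A * A^{-1} = I
print(det2(A))                              # -2
```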

## Diagonalization

Characteristic polynomial of a linear operator/matrix

### ${\displaystyle {\color {Blue}~5.1}}$ diagonalizable ⇔ basis of eigenvectors

A linear operator T on a finite-dimensional vector space V is diagonalizable if and only if there exists an ordered basis β for V consisting of eigenvectors of T. Furthermore, if T is diagonalizable, ${\displaystyle \beta =\{v_{1},v_{2},\ldots ,v_{n}\}}$ is an ordered basis of eigenvectors of T, and D = [T]β, then D is a diagonal matrix and ${\displaystyle D_{jj}}$ is the eigenvalue corresponding to ${\displaystyle v_{j}}$ for ${\displaystyle 1\leq j\leq n}$.

### ${\displaystyle {\color {Blue}~5.2}}$ eigenvalue⇔det(A-λIn)=0

Let A∈Mn×n(F). Then a scalar λ is an eigenvalue of A if and only if det(A−λIn)=0.

### ${\displaystyle {\color {Blue}~5.3}}$ characteristic polynomial

Let A∈Mn×n(F).
(a) The characteristic polynomial of A is a polynomial of degree n with leading coefficient ${\displaystyle (-1)^{n}}$.
(b) A has at most n distinct eigenvalues.
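For a 2×2 matrix the characteristic polynomial is t² − (tr A)t + det A, so the eigenvalues follow from the quadratic formula. A Python sketch on a hypothetical symmetric example:

```python
import math

A = [[2, 1], [1, 2]]               # hypothetical symmetric 2x2 example
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
# det(A - tI) = t^2 - (tr A)t + det A: degree 2, hence at most 2 distinct eigenvalues
disc = tr * tr - 4 * det
eigs = sorted([(tr - math.sqrt(disc)) / 2, (tr + math.sqrt(disc)) / 2])
print(eigs)                        # [1.0, 3.0]
for lam in eigs:                   # each root makes det(A - lam*I) vanish, as in 5.2
    d = (A[0][0] - lam) * (A[1][1] - lam) - A[0][1] * A[1][0]
    assert abs(d) < 1e-9
```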

### ${\displaystyle {\color {Blue}~5.4}}$ υ to λ⇔υ∈N(T-λI)

Let T be a linear operator on a vector space V, and let λ be an eigenvalue of T.
A vector υ∈V is an eigenvector of T corresponding to λ if and only if υ≠0 and υ∈N(T-λI).

### ${\displaystyle {\color {Blue}~5.5}}$ vi to λi⇔vi is linearly independent

Let T be a linear operator on a vector space V, and let ${\displaystyle \lambda _{1},\lambda _{2},\ldots ,\lambda _{k}}$ be distinct eigenvalues of T. If ${\displaystyle v_{1},v_{2},\ldots ,v_{k}}$ are eigenvectors of T such that ${\displaystyle \lambda _{i}}$ corresponds to ${\displaystyle v_{i}}$ (${\displaystyle 1\leq i\leq k}$), then {${\displaystyle v_{1},v_{2},\ldots ,v_{k}}$} is linearly independent.

### ${\displaystyle {\color {Blue}~5.6}}$ characteristic polynomial splits

The characteristic polynomial of any diagonalizable linear operator splits.

### ${\displaystyle {\color {Blue}~5.7}}$ 1 ≤ dim(Eλ) ≤ m

Let T be a linear operator on a finite-dimensional vector space V, and let λ be an eigenvalue of T having multiplicity ${\displaystyle m}$. Then ${\displaystyle 1\leq \dim(E_{\lambda })\leq m}$.

### ${\displaystyle {\color {Blue}~5.8}}$ S = S1 ∪ S2 ∪ ...∪ Sk is linearly independent

Let T be a linear operator on a vector space V, and let ${\displaystyle \lambda _{1},\lambda _{2},\ldots ,\lambda _{k},}$ be distinct eigenvalues of T. For each ${\displaystyle i=1,2,\ldots ,k,}$ let ${\displaystyle S_{i}}$ be a finite linearly independent subset of the eigenspace ${\displaystyle E_{\lambda _{i}}}$. Then ${\displaystyle S=S_{1}\cup S_{2}\cup \cdots \cup S_{k}}$ is a linearly independent subset of V.

### ${\displaystyle {\color {Blue}~5.9}}$ ⇔T is diagonalizable

Let T be a linear operator on a finite-dimensional vector space V such that the characteristic polynomial of T splits. Let ${\displaystyle \lambda _{1},\lambda _{2},\ldots ,\lambda _{k}}$ be the distinct eigenvalues of T. Then
(a) T is diagonalizable if and only if the multiplicity of ${\displaystyle \lambda _{i}}$ is equal to ${\displaystyle \dim(E_{\lambda _{i}})}$ for all ${\displaystyle i}$.
(b) If T is diagonalizable and ${\displaystyle \beta _{i}}$ is an ordered basis for ${\displaystyle E_{\lambda _{i}}}$ for each ${\displaystyle i}$, then ${\displaystyle \beta =\beta _{1}\cup \beta _{2}\cup \cdots \cup \beta _{k}}$ is an ordered basis for V consisting of eigenvectors of T.

Test for diagonalization

## Inner product spaces

### ${\displaystyle {\color {Blue}~6.2}}$ law of norm

Let V be an inner product space over F. Then for all x, y ∈ V and c ∈ F, the following statements are true.
(a) ${\displaystyle \|cx\|=|c|\cdot \|x\|}$.
(b) ${\displaystyle \|x\|=0}$ if and only if ${\displaystyle x=0}$. In any case, ${\displaystyle \|x\|\geq 0}$.
(c) (Cauchy–Schwarz Inequality) ${\displaystyle |\langle x,y\rangle |\leq \|x\|\cdot \|y\|}$.
(d) (Triangle Inequality) ${\displaystyle \|x+y\|\leq \|x\|+\|y\|}$.
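The Cauchy–Schwarz and triangle inequalities can be spot-checked on random vectors with the standard inner product on R⁴. A Python sketch (the dimension and sample count are arbitrary choices):

```python
import math
import random

def inner(x, y):
    """Standard inner product on R^n."""
    return sum(a * b for a, b in zip(x, y))

def norm(x):
    return math.sqrt(inner(x, x))

random.seed(0)
for _ in range(100):
    x = [random.uniform(-1, 1) for _ in range(4)]
    y = [random.uniform(-1, 1) for _ in range(4)]
    xy = [a + b for a, b in zip(x, y)]
    assert abs(inner(x, y)) <= norm(x) * norm(y) + 1e-12    # Cauchy-Schwarz
    assert norm(xy) <= norm(x) + norm(y) + 1e-12            # triangle inequality
print("both inequalities hold on 100 random pairs")
```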

### ${\displaystyle {\color {Blue}~6.3}}$ span of orthogonal subset

Let V be an inner product space and ${\displaystyle S=\{v_{1},v_{2},\ldots ,v_{k}\}}$ be an orthogonal subset of V consisting of nonzero vectors. If ${\displaystyle y}$∈span(S), then

${\displaystyle y=\sum _{i=1}^{k}{\langle y,v_{i}\rangle \over \|v_{i}\|^{2}}v_{i}}$.

### ${\displaystyle {\color {Blue}~6.4}}$ Gram-Schmidt process

Let V be an inner product space and S=${\displaystyle \{w_{1},w_{2},\ldots ,w_{n}\}}$ be a linearly independent subset of V. Define S'=${\displaystyle \{v_{1},v_{2},\ldots ,v_{n}\}}$, where ${\displaystyle v_{1}=w_{1}}$ and

${\displaystyle v_{k}=w_{k}-\sum _{j=1}^{k-1}{\langle w_{k},v_{j}\rangle \over \|v_{j}\|^{2}}v_{j}}$

Then S' is an orthogonal set of nonzero vectors such that span(S')=span(S).
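The Gram–Schmidt recursion above translates almost line for line into code. A Python sketch in exact rational arithmetic, on a hypothetical pair of independent vectors:

```python
from fractions import Fraction

def inner(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(S):
    """Orthogonalize a linearly independent list of vectors, as in Theorem 6.4."""
    out = []
    for w in S:
        v = [Fraction(a) for a in w]
        for u in out:
            coef = inner(v, u) / inner(u, u)      # <w_k, v_j> / ||v_j||^2
            v = [a - coef * b for a, b in zip(v, u)]
        out.append(v)
    return out

# hypothetical input: two linearly independent vectors in F^3
v1, v2 = gram_schmidt([[1, 1, 0], [1, 0, 1]])
assert inner(v1, v2) == 0        # the output is orthogonal
print(v2)                        # v2 = (1/2, -1/2, 1), orthogonal to v1 = (1, 1, 0)
```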

### ${\displaystyle {\color {Blue}~6.5}}$ orthonormal basis

Let V be a nonzero finite-dimensional inner product space. Then V has an orthonormal basis β. Furthermore, if β =${\displaystyle \{v_{1},v_{2},\ldots ,v_{n}\}}$ and x∈V, then

${\displaystyle x=\sum _{i=1}^{n}\langle x,v_{i}\rangle v_{i}}$.

Corollary. Let V be a finite-dimensional inner product space with an orthonormal basis β =${\displaystyle \{v_{1},v_{2},\ldots ,v_{n}\}}$. Let T be a linear operator on V, and let A=[T]β. Then for any ${\displaystyle i}$ and ${\displaystyle j}$, ${\displaystyle A_{ij}=\langle T(v_{j}),v_{i}\rangle }$.

### ${\displaystyle {\color {Blue}~6.6}}$ W⊥ by orthonormal basis

Let W be a finite-dimensional subspace of an inner product space V, and let ${\displaystyle y}$∈V. Then there exist unique vectors ${\displaystyle u}$∈W and ${\displaystyle z}$∈W⊥ such that ${\displaystyle y=u+z}$. Furthermore, if ${\displaystyle \{v_{1},v_{2},\ldots ,v_{k}\}}$ is an orthonormal basis for W, then

${\displaystyle u=\sum _{i=1}^{k}\langle y,v_{i}\rangle v_{i}}$.

Corollary. In the notation of Theorem 6.6, the vector ${\displaystyle u}$ is the unique vector in W that is "closest" to ${\displaystyle y}$; that is, for any ${\displaystyle x}$∈W, ${\displaystyle \|y-x\|\geq \|y-u\|}$, and this inequality is an equality if and only if ${\displaystyle x=u}$.
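The decomposition y = u + z can be computed directly from an orthonormal basis of W. A Python sketch with a hypothetical plane W in R³:

```python
import math

def inner(x, y):
    return sum(a * b for a, b in zip(x, y))

# hypothetical orthonormal basis for a plane W in R^3
s = 1 / math.sqrt(2)
v1 = [s, s, 0.0]
v2 = [0.0, 0.0, 1.0]
y = [1.0, 2.0, 3.0]

# u = sum_i <y, v_i> v_i, the orthogonal projection of y onto W
u = [0.0, 0.0, 0.0]
for v in (v1, v2):
    c = inner(y, v)
    u = [a + c * b for a, b in zip(u, v)]

z = [a - b for a, b in zip(y, u)]    # y = u + z with z in the orthogonal complement
assert all(abs(inner(z, v)) < 1e-12 for v in (v1, v2))
print(u)                              # approximately [1.5, 1.5, 3.0]
```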

### ${\displaystyle {\color {Blue}~6.7}}$ properties of orthonormal set

Suppose that ${\displaystyle S=\{v_{1},v_{2},\ldots ,v_{k}\}}$ is an orthonormal set in an ${\displaystyle n}$-dimensional inner product space V. Then
(a) S can be extended to an orthonormal basis ${\displaystyle \{v_{1},v_{2},\ldots ,v_{k},v_{k+1},\ldots ,v_{n}\}}$ for V.
(b) If W=span(S), then ${\displaystyle S_{1}=\{v_{k+1},v_{k+2},\ldots ,v_{n}\}}$ is an orthonormal basis for W⊥ (using the preceding notation).
(c) If W is any subspace of V, then dim(V)=dim(W)+dim(W⊥).

### ${\displaystyle {\color {Blue}~6.8}}$ linear functional representation inner product

Let V be a finite-dimensional inner product space over F, and let ${\displaystyle g}$:V→F be a linear transformation. Then there exists a unique vector ${\displaystyle y}$∈ V such that ${\displaystyle {\rm {{g}(x)=\langle x,y\rangle }}}$ for all ${\displaystyle x}$∈ V.

### ${\displaystyle {\color {Blue}~6.9}}$ definition of T*

Let V be a finite-dimensional inner product space, and let T be a linear operator on V. Then there exists a unique function T*: V→V such that ${\displaystyle \langle \mathrm {T} (x),y\rangle =\langle x,\mathrm {T} ^{*}(y)\rangle }$ for all ${\displaystyle x,y}$∈V. Furthermore, T* is linear.

### ${\displaystyle {\color {Blue}~6.10}}$ [T*]β=[T]*β

Let V be a finite-dimensional inner product space, and let β be an orthonormal basis for V. If T is a linear operator on V, then

${\displaystyle [T^{*}]_{\beta }=[T]_{\beta }^{*}}$.

### ${\displaystyle {\color {Blue}~6.11}}$ properties of T*

Let V be an inner product space, and let T and U be linear operators on V. Then
(a) (T+U)*=T*+U*;
(b) (${\displaystyle c}$T)*=${\displaystyle {\bar {c}}}$ T* for any c∈ F;
(c) (TU)*=U*T*;
(d) T**=T;
(e) I*=I.

Corollary. Let A and B be n×n matrices. Then
(a) (A+B)*=A*+B*;
(b) (${\displaystyle c}$A)*=${\displaystyle {\bar {c}}}$ A* for any ${\displaystyle c}$∈ F;
(c) (AB)*=B*A*;
(d) A**=A;
(e) I*=I.

### ${\displaystyle {\color {Blue}~6.12}}$ Least squares approximation

Let A ∈ Mm×n(F) and ${\displaystyle y}$∈Fm. Then there exists ${\displaystyle x_{0}}$∈Fn such that ${\displaystyle (A^{*}A)x_{0}=A^{*}y}$ and ${\displaystyle \|Ax_{0}-y\|\leq \|Ax-y\|}$ for all x∈Fn.

Lemma 1. Let A ∈ Mm×n(F), ${\displaystyle x}$∈Fn, and ${\displaystyle y}$∈Fm. Then

${\displaystyle \langle Ax,y\rangle _{m}=\langle x,A^{*}y\rangle _{n}}$.

Lemma 2. Let A ∈ Mm×n(F). Then rank(A*A)=rank(A).

Corollary.(of lemma 2) If A is an m×n matrix such that rank(A)=n, then A*A is invertible.
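Theorem 6.12 underlies least-squares fitting: over R, A* is the transpose, and the normal equations (A*A)x₀ = A*y give the best fit. A Python sketch fitting a line to three hypothetical data points, solving the 2×2 normal system with the 2×2 inverse formula from the Determinants section:

```python
from fractions import Fraction

def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

# hypothetical data: fit y ~ a + b*t through (0,1), (1,2), (2,4)
A = [[1, 0], [1, 1], [1, 2]]
y = [1, 2, 4]
At = transpose(A)                # over R, A* is the transpose
G = matmul(At, A)                # A*A: 2x2, invertible since rank(A) = 2 (cor. of lemma 2)
rhs = matvec(At, y)              # A*y
det = Fraction(G[0][0] * G[1][1] - G[0][1] * G[1][0])
a = (G[1][1] * rhs[0] - G[0][1] * rhs[1]) / det     # Cramer's rule on G x0 = rhs
b = (G[0][0] * rhs[1] - G[1][0] * rhs[0]) / det
print(a, b)                      # 5/6 3/2
```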

### ${\displaystyle {\color {Blue}~6.13}}$ Minimal solutions to systems of linear equations

Let A ∈ Mm×n(F) and b∈ Fm. Suppose that ${\displaystyle Ax=b}$ is consistent. Then the following statements are true.
(a) There exists exactly one minimal solution ${\displaystyle s}$ of ${\displaystyle Ax=b}$, and ${\displaystyle s}$∈R(LA*).
(b) The vector ${\displaystyle s}$ is the only solution to ${\displaystyle Ax=b}$ that lies in R(LA*); that is, if ${\displaystyle u}$ satisfies ${\displaystyle (AA*)u=b}$, then ${\displaystyle s=A*u}$.

## References

• Linear Algebra, 4th edition, by Stephen H. Friedberg, Arnold J. Insel, and Lawrence E. Spence. ISBN 7-04-016733-6
• Linear Algebra 3rd edition, by Serge Lang (UTM) ISBN 0-387-96412-6
• Linear Algebra and Its Applications 4th edition, by Gilbert Strang ISBN 0-03-010567-6