Let $S$ be a vector space. A function that assigns a real number $\langle\mathbf{x}, \mathbf{y}\rangle$ to every pair of vectors $\mathbf{x}, \mathbf{y}$ in $S$ is said to be an inner product if it satisfies the following conditions:

(i) $\langle\mathbf{x}, \mathbf{y}\rangle=\langle\mathbf{y}, \mathbf{x}\rangle$.
(ii) $\langle\mathbf{x}, \mathbf{x}\rangle \geq 0$ and equality holds if and only if $\mathbf{x}=\mathbf{0}$.
(iii) $\langle c \mathbf{x}, \mathbf{y}\rangle=c\langle\mathbf{x}, \mathbf{y}\rangle$.
(iv) $\langle\mathbf{x}+\mathbf{y}, \mathbf{z}\rangle=\langle\mathbf{x}, \mathbf{z}\rangle+\langle\mathbf{y}, \mathbf{z}\rangle$.
In $R^{n},\langle\mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\prime} \mathbf{y}=x_{1} y_{1}+\cdots+x_{n} y_{n}$ is easily seen to be an inner product. We will work with this inner product while dealing with $R^{n}$ and its subspaces, unless indicated otherwise.

For a vector $\mathbf{x}$, the positive square root of the inner product $\langle\mathbf{x}, \mathbf{x}\rangle$ is called the norm of $\mathbf{x}$, denoted by $|\mathbf{x}|$. Vectors $\mathbf{x}, \mathbf{y}$ are said to be orthogonal or perpendicular if $\langle\mathbf{x}, \mathbf{y}\rangle=0$, in which case we write $\mathbf{x} \perp \mathbf{y}$.
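These definitions are straightforward to check numerically. As a minimal sketch (not part of the text's development), the following Python functions implement the standard inner product on $R^n$, the norm, and the orthogonality test; the tolerance parameter is an assumption needed only because of floating-point arithmetic:

```python
import math

def inner(x, y):
    """Standard inner product on R^n: <x, y> = x1*y1 + ... + xn*yn."""
    return sum(xi * yi for xi, yi in zip(x, y))

def norm(x):
    """Norm of x: the positive square root of <x, x>."""
    return math.sqrt(inner(x, x))

def orthogonal(x, y, tol=1e-12):
    """x is perpendicular to y iff <x, y> = 0 (up to floating-point tolerance)."""
    return abs(inner(x, y)) < tol
```

For example, `inner([1, 2], [3, 4])` gives $1\cdot 3 + 2\cdot 4 = 11$, and `orthogonal([1, 0], [0, 1])` is `True`.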

5.1. If $\mathbf{x}_1, \ldots, \mathbf{x}_m$ are pairwise orthogonal nonzero vectors, then they are linearly independent.
PROOF. $\quad$ Suppose $c_1 \mathbf{x}_1+\cdots+c_m \mathbf{x}_m=\mathbf{0}$. Then
$$
\left\langle c_1 \mathbf{x}_1+\cdots+c_m \mathbf{x}_m, \mathbf{x}_1\right\rangle=0
$$
and hence,
$$
\sum_{i=1}^{m} c_i\left\langle\mathbf{x}_i, \mathbf{x}_1\right\rangle=0 .
$$
Since the vectors $\mathbf{x}_1, \ldots, \mathbf{x}_m$ are pairwise orthogonal, it follows that $c_1\left\langle\mathbf{x}_1, \mathbf{x}_1\right\rangle=0$, and since $\mathbf{x}_1$ is nonzero, $c_1=0$. Similarly, we can show that each $c_i$ is zero. Therefore, the vectors are linearly independent.

A set of vectors $\mathbf{x}_1, \ldots, \mathbf{x}_m$ is said to form an orthonormal basis for the vector space $S$ if the set is a basis for $S$ and, furthermore, $\left\langle\mathbf{x}_i, \mathbf{x}_j\right\rangle$ is 0 if $i \neq j$ and 1 if $i=j$.

We now describe the Gram-Schmidt procedure, which produces an orthonormal basis starting with a given basis $\mathbf{x}_1, \ldots, \mathbf{x}_n$.
Set $\mathbf{y}_1=\mathbf{x}_1$. Having defined $\mathbf{y}_1, \ldots, \mathbf{y}_{i-1}$, we define
$$
\mathbf{y}_i=\mathbf{x}_i-a_{i, i-1} \mathbf{y}_{i-1}-\cdots-a_{i 1} \mathbf{y}_1,
$$
where $a_{i, i-1}, \ldots, a_{i 1}$ are chosen so that $\mathbf{y}_i$ is orthogonal to $\mathbf{y}_1, \ldots, \mathbf{y}_{i-1}$. Thus we must solve $\left\langle\mathbf{y}_i, \mathbf{y}_j\right\rangle=0, j=1, \ldots, i-1$. This leads to
$$
\left\langle\mathbf{x}_i-a_{i, i-1} \mathbf{y}_{i-1}-\cdots-a_{i 1} \mathbf{y}_1, \mathbf{y}_j\right\rangle=0, \quad j=1, \ldots, i-1,
$$

which gives
$$
\left\langle\mathbf{x}_i, \mathbf{y}_j\right\rangle-\sum_{k=1}^{i-1} a_{i k}\left\langle\mathbf{y}_k, \mathbf{y}_j\right\rangle=0, \quad j=1, \ldots, i-1 .
$$
Now, since $\mathbf{y}_1, \ldots, \mathbf{y}_{i-1}$ is an orthogonal set, we get
$$
\left\langle\mathbf{x}_i, \mathbf{y}_j\right\rangle-a_{i j}\left\langle\mathbf{y}_j, \mathbf{y}_j\right\rangle=0
$$

and hence,
$$
a_{i j}=\frac{\left\langle\mathbf{x}_i, \mathbf{y}_j\right\rangle}{\left\langle\mathbf{y}_j, \mathbf{y}_j\right\rangle}, \quad j=1, \ldots, i-1 .
$$
The process is continued to obtain the basis $\mathbf{y}_1, \ldots, \mathbf{y}_n$ of pairwise orthogonal vectors. Since $\mathbf{x}_1, \ldots, \mathbf{x}_n$ are linearly independent, each $\mathbf{y}_i$ is nonzero. Now if we set $\mathbf{z}_i=\frac{\mathbf{y}_i}{\left|\mathbf{y}_i\right|}$, then $\mathbf{z}_1, \ldots, \mathbf{z}_n$ is an orthonormal basis. Note that the linear span of $\mathbf{z}_1, \ldots, \mathbf{z}_i$ equals the linear span of $\mathbf{x}_1, \ldots, \mathbf{x}_i$ for each $i$.
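The procedure above translates directly into code. The following sketch (a classical Gram-Schmidt in plain Python, offered as an illustration rather than a prescribed implementation) computes each $\mathbf{y}_i$ by subtracting $a_{ij}\mathbf{y}_j$ for $j<i$ with $a_{ij}=\langle\mathbf{x}_i, \mathbf{y}_j\rangle/\langle\mathbf{y}_j, \mathbf{y}_j\rangle$, then normalizes to obtain $\mathbf{z}_i=\mathbf{y}_i/|\mathbf{y}_i|$:

```python
import math

def gram_schmidt(xs):
    """Orthonormalize a list of linearly independent vectors (lists of floats).

    Follows the text's procedure: y_i = x_i - a_{i,i-1} y_{i-1} - ... - a_{i1} y_1
    with a_{ij} = <x_i, y_j> / <y_j, y_j>, then z_i = y_i / |y_i|.
    """
    inner = lambda u, v: sum(a * b for a, b in zip(u, v))
    ys = []
    for x in xs:
        y = list(x)
        for yj in ys:
            a = inner(x, yj) / inner(yj, yj)          # coefficient a_{ij}
            y = [yi - a * yji for yi, yji in zip(y, yj)]
        ys.append(y)
    # Normalize each y_i to obtain the orthonormal basis z_1, ..., z_n.
    return [[yi / math.sqrt(inner(y, y)) for yi in y] for y in ys]
```

Running `gram_schmidt` on any basis of $R^3$, for instance `[[1, 1, 0], [1, 0, 1], [0, 1, 1]]`, yields vectors that are pairwise orthogonal with unit norm, and the span of the first $i$ outputs equals the span of the first $i$ inputs.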

We remark that given a set of linearly independent vectors $\mathbf{x}_1, \ldots, \mathbf{x}_m$, the Gram-Schmidt procedure described above can be used to produce a pairwise orthogonal set $\mathbf{y}_1, \ldots, \mathbf{y}_m$ such that $\mathbf{y}_i$ is a linear combination of $\mathbf{x}_1, \ldots, \mathbf{x}_i$, $i=1, \ldots, m$. This fact is used in the proof of the next result.

Let $W$ be a set (not necessarily a subspace) of vectors in a vector space $S$. We define
$$
W^{\perp}=\{\mathbf{x}: \mathbf{x} \in S,\langle\mathbf{x}, \mathbf{y}\rangle=0 \text { for all } \mathbf{y} \in W\} .
$$
It follows from the definitions that $W^{\perp}$ is a subspace of $S$.
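Membership in $W^{\perp}$ is a pointwise condition, so it can be tested directly. A minimal sketch (the function name is our own, not from the text), assuming $W$ is given as a finite list of vectors:

```python
def in_orthogonal_complement(x, W, tol=1e-12):
    """Test whether x lies in W-perp: <x, y> = 0 for every y in W."""
    inner = lambda u, v: sum(a * b for a, b in zip(u, v))
    return all(abs(inner(x, y)) < tol for y in W)
```

For example, with $W = \{(1,0,0), (0,1,0)\}$ in $R^3$, the vector $(0,0,5)$ lies in $W^{\perp}$ while $(1,0,0)$ does not.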
