linearalgebra.me safeguards your study-abroad journey. We have built a solid reputation for linear algebra assignment writing, guaranteeing reliable, high-quality, and original linear algebra work. Our experts are highly experienced in linear algebra, so assignments of every kind related to linear algebra go without saying.

• Numerical analysis
• Advanced linear algebra
• Matrix theory
• Optimization theory
• Linear programming
• Approximation theory

## Linear algebra assignment help | Invariance and reducibility

In this subsection, we consider some situations in which the complexity of a linear mapping may be ‘reduced’ somewhat.

Definition 2.11 Let $T \in L(U)$ and $V$ be a subspace of $U$. We say that $V$ is an invariant subspace of $T$ if $T(V) \subset V$.

Given $T \in L(U)$, it is clear that the null-space $N(T)$ and range $R(T)$ of $T$ are both invariant subspaces of $T$.

To see how the knowledge about an invariant subspace reduces the complexity of a linear mapping, we assume that $V$ is a nontrivial invariant subspace of $T \in L(U)$, where $U$ is $n$-dimensional. Let $\{u_{1}, \ldots, u_{k}\}$ be any basis of $V$. We extend it to get a basis of $U$, say $\{u_{1}, \ldots, u_{k}, u_{k+1}, \ldots, u_{n}\}$. With respect to such a basis, we have
$$T\left(u_{i}\right)=\sum_{i^{\prime}=1}^{k} b_{i^{\prime} i} u_{i^{\prime}}, \quad i=1, \ldots, k, \qquad T\left(u_{j}\right)=\sum_{j^{\prime}=1}^{n} c_{j^{\prime} j} u_{j^{\prime}}, \quad j=k+1, \ldots, n, \tag{2.5.1}$$
where $B=\left(b_{i^{\prime} i}\right) \in \mathbb{F}(k, k)$ and $C=\left(c_{j^{\prime} j}\right) \in \mathbb{F}(n, n-k)$. With respect to this basis, the associated matrix $A \in \mathbb{F}(n, n)$ becomes
$$A=\left(\begin{array}{cc} B & C_{1} \\ 0 & C_{2} \end{array}\right), \quad \left(\begin{array}{c} C_{1} \\ C_{2} \end{array}\right)=C, \tag{2.5.2}$$
Such a matrix is sometimes referred to as boxed upper triangular.
Thus, we see that a linear mapping $T$ over a finite-dimensional vector space $U$ has a nontrivial invariant subspace if and only if there is a basis of $U$ so that the associated matrix of $T$ with respect to this basis is boxed upper triangular.
For the matrix $A$ given in (2.5.2), the vanishing of the entries in the lower-left portion of the matrix indeed reduces its complexity. We have seen clearly that such a ‘reduction’ happens because of the invariance property
$$T\left(\operatorname{Span}\{u_{1}, \ldots, u_{k}\}\right) \subset \operatorname{Span}\{u_{1}, \ldots, u_{k}\} .$$
Consequently, if we impose the additional invariance property
$$T\left(\operatorname{Span}\{u_{k+1}, \ldots, u_{n}\}\right) \subset \operatorname{Span}\{u_{k+1}, \ldots, u_{n}\},$$
then $c_{j^{\prime} j}=0$ for $j^{\prime}=1, \ldots, k$ in (2.5.1) or $C_{1}=0$ in (2.5.2), which further reduces the complexity of the matrix $A$.
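The block structure in (2.5.2) can be illustrated numerically. The following is a minimal sketch assuming NumPy; the basis vectors $u_1, u_2, u_3$ and the blocks $B$, $C_1$, $C_2$ are illustrative choices, not taken from the text.

```python
import numpy as np

# Hypothetical example in R^3: V = Span{u1, u2} with the basis below
# is made T-invariant by construction.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([1.0, 1.0, 0.0])
u3 = np.array([0.0, 0.0, 1.0])          # extends {u1, u2} to a basis of R^3

S = np.column_stack([u1, u2, u3])        # change-of-basis matrix (adapted basis)

# Matrix of T in the adapted basis: boxed upper triangular, as in (2.5.2),
# with B = [[2,1],[0,3]], C1 = [[5],[6]], C2 = [[4]].
A = np.array([[2.0, 1.0, 5.0],
              [0.0, 3.0, 6.0],
              [0.0, 0.0, 4.0]])

A0 = S @ A @ np.linalg.inv(S)            # matrix of T in the standard basis

# Changing back to the adapted basis recovers the block form:
A_adapted = np.linalg.inv(S) @ A0 @ S
print(np.allclose(A_adapted, A))         # True
print(np.allclose(A_adapted[2, :2], 0))  # lower-left block vanishes: True
```

The point of the sketch is that the vanishing lower-left block is a property of the basis, not of $T$ itself: in the standard basis the matrix $A_0$ has no visible block structure, while in a basis adapted to the invariant subspace it does.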

The above investigation motivates the concept of reducibility of a linear mapping: $T \in L(U)$ is said to be reducible if there are nontrivial invariant subspaces $V$ and $W$ of $T$ such that $U = V \oplus W$.

## Linear algebra assignment help | Projections

In this subsection, we study an important family of reducible linear mappings called projections.

Definition 2.15 Let $V$ and $W$ be two complementary subspaces of $U$; that is, $U=V \oplus W$. For any $u \in U$, express $u$ uniquely as $u=v+w$, $v \in V$, $w \in W$, and define the mapping $P: U \rightarrow U$ by
$$P(u)=v .$$
Then $P \in L(U)$ and is called the projection of $U$ onto $V$ along $W$.
We need to check that the mapping $P$ defined in Definition 2.15 is indeed linear. To see this, we take $u_{1}, u_{2} \in U$ and express them as $u_{1}=v_{1}+w_{1}$, $u_{2}=v_{2}+w_{2}$, for unique $v_{1}, v_{2} \in V$, $w_{1}, w_{2} \in W$. Hence $P\left(u_{1}\right)=v_{1}$, $P\left(u_{2}\right)=v_{2}$. On the other hand, from $u_{1}+u_{2}=\left(v_{1}+v_{2}\right)+\left(w_{1}+w_{2}\right)$, we get $P\left(u_{1}+u_{2}\right)=v_{1}+v_{2}$. Thus $P\left(u_{1}+u_{2}\right)=P\left(u_{1}\right)+P\left(u_{2}\right)$. Moreover, for any $a \in \mathbb{F}$ and $u \in U$, write $u=v+w$ for unique $v \in V$, $w \in W$. Then $P(u)=v$ and $a u=a v+a w$ give us $P(a u)=a v=a P(u)$. So $P \in L(U)$ as claimed.

From Definition 2.15, we see that for $v \in V$ we have $P(v)=v$. Thus $P(P(u))=P(u)$ for any $u \in U$. In other words, the projection $P$ satisfies the special property $P \circ P=P$. For notational convenience, we shall use $T^{k}$ to denote the $k$-fold composition $T \circ \cdots \circ T$ for any $T \in L(U)$. With this notation, we see that a projection $P$ satisfies the condition $P^{2}=P$. Any linear mapping satisfying such a condition is called idempotent.
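The idempotency $P^{2}=P$ is easy to check numerically. Here is a minimal sketch assuming NumPy; the subspaces $V$, $W$ and the vector $u$ are hypothetical choices for illustration.

```python
import numpy as np

# Hypothetical example in R^2: V = Span{(1,0)}, W = Span{(1,1)}, so R^2 = V ⊕ W.
v = np.array([1.0, 0.0])
w = np.array([1.0, 1.0])
S = np.column_stack([v, w])           # basis adapted to the splitting

# In the (v, w) basis the projection onto V along W is diag(1, 0);
# conjugating gives its matrix P in the standard basis.
P = S @ np.diag([1.0, 0.0]) @ np.linalg.inv(S)

u = np.array([3.0, 2.0])              # u = 1*v + 2*w, so its V-component is v
print(P @ u)                          # [1. 0.]
print(np.allclose(P @ P, P))          # idempotent: True
```

Note that $P$ here is not an orthogonal projection: the component along $W$ is discarded even though $W$ is not perpendicular to $V$, exactly as in Definition 2.15.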

We now show that idempotency characterizes projections: a linear mapping is a projection if and only if it is idempotent.
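The converse direction can be sketched as follows (a standard argument, filled in here since the excerpt does not spell it out):

```latex
% Sketch: if P \in L(U) satisfies P^2 = P, then P is the projection
% of U onto V = R(P) along W = N(P).
\begin{itemize}
  \item For any $u \in U$, $u = P(u) + (u - P(u))$, where $P(u) \in R(P)$ and
        $P(u - P(u)) = P(u) - P^{2}(u) = 0$, so $u - P(u) \in N(P)$.
        Hence $U = R(P) + N(P)$.
  \item If $v \in R(P) \cap N(P)$, write $v = P(x)$; then
        $v = P(x) = P^{2}(x) = P(v) = 0$. Hence the sum is direct:
        $U = R(P) \oplus N(P)$.
  \item For $u = v + w$ with $v \in R(P)$, $w \in N(P)$, we get
        $P(u) = P(v) + P(w) = v$, which is exactly Definition 2.15
        with $V = R(P)$ and $W = N(P)$.
\end{itemize}
```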



## Given all this, how do you learn linear algebra well, and how do you secure a high grade?

1.1 Marking up the textbook

[A common misconception about highlighting] Marking key points does not mean highlighting every bold passage, let alone every definition. Linear algebra has so many concepts that many students compulsively mark each definition neatly with a highlighter until the entire book is "key points" — then how do you review for finals? In my view, the points worth marking are:

A. Parts you do not understand, or that are obscure or unfamiliar. This matters: some definitions are simple, but their proofs are strange. I mark obscure definitions and proof techniques. When reading, cover the answer to every worked example and attempt it yourself; if you get stuck, you are unfamiliar with that example's method, so mark it as well.

B. What the teacher summarizes or emphasizes in class. Not much to say here; just follow the teacher.

C. Knowledge points that you find fuzzy while doing problems yourself.

1.2 Taking notes

1.3 Understanding the relations between definitions