“bookmt” — 2006/8/8 — 12:58 — page 158 — #170
3. PRODUCTS OF GROUPS

3.2.5. Consider the set $G$ of $n$-by-$n$ matrices with entries in $\{0, \pm 1\}$ that have exactly one nonzero entry in each row and column. These are called signed permutation matrices. Show that $G$ is a group, and that $G$ is a semidirect product of $S_n$ and the group of diagonal matrices with entries in $\{\pm 1\}$. $S_n$ acts on the group of diagonal matrices by permutation of the diagonal entries.
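The case $n = 2$ of Exercise 3.2.5 is small enough to check by brute force. The following sketch (the helper names are mine, not the text's) enumerates all 2-by-2 signed permutation matrices and verifies that there are $|S_2| \cdot |\{\pm 1\}^2| = 8$ of them, and that the set is closed under matrix multiplication:

```python
from itertools import product

def matmul(a, b):
    """Multiply two square integer matrices given as tuples of tuples."""
    n = len(a)
    return tuple(tuple(sum(a[i][k] * b[k][j] for k in range(n))
                       for j in range(n)) for i in range(n))

def is_signed_permutation(m):
    """Exactly one nonzero entry (equal to +1 or -1) in each row and column."""
    n = len(m)
    rows_ok = all(sum(1 for x in row if x != 0) == 1 and
                  all(x in (-1, 0, 1) for x in row) for row in m)
    cols_ok = all(sum(1 for i in range(n) if m[i][j] != 0) == 1
                  for j in range(n))
    return rows_ok and cols_ok

# Enumerate all 2x2 signed permutation matrices by brute force.
entries = [-1, 0, 1]
G = [tuple(tuple(r) for r in ((a, b), (c, d)))
     for a, b, c, d in product(entries, repeat=4)
     if is_signed_permutation(((a, b), (c, d)))]

print(len(G))  # 8, matching |S_2| * |{+-1}^2| = 2 * 4

# Closure under multiplication (one of the group axioms to verify).
closed = all(matmul(g, h) in G for g in G for h in G)
print(closed)  # True
```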
One final example shows that direct products and semidirect products do not exhaust the ways in which a normal subgroup $N$ and the quotient group $G/N$ can be fit together to form a group $G$:

3.2.6. $\mathbb{Z}_4$ has a subgroup isomorphic to $\mathbb{Z}_2$, namely the subgroup generated by $[2]$. The quotient $\mathbb{Z}_4/\mathbb{Z}_2$ is also isomorphic to $\mathbb{Z}_2$. Nevertheless, $\mathbb{Z}_4$ is not a direct or semidirect product of two copies of $\mathbb{Z}_2$.
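The obstruction in Exercise 3.2.6 can be checked computationally: since $\mathrm{Aut}(\mathbb{Z}_2)$ is trivial, any direct or semidirect product of two copies of $\mathbb{Z}_2$ is isomorphic to $\mathbb{Z}_2 \times \mathbb{Z}_2$, in which every element has order at most 2, whereas $[1] \in \mathbb{Z}_4$ has order 4. A minimal sketch:

```python
# In Z4 (addition mod 4), the element 1 has order 4.
order4 = next(k for k in range(1, 5) if (1 * k) % 4 == 0)
print(order4)  # 4

# Any group built from two copies of Z2 (direct or semidirect) is
# Z2 x Z2, because Aut(Z2) is trivial; there every element squares
# to the identity: (a, b) + (a, b) = (0, 0).
all_order_le_2 = all(((a + a) % 2, (b + b) % 2) == (0, 0)
                     for a in range(2) for b in range(2))
print(all_order_le_2)  # True
```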
3.3. Vector Spaces
You can use your experience with group theory to gain a new appreciation of linear algebra. In this section, $K$ denotes one of the fields $\mathbb{Q}$, $\mathbb{R}$, $\mathbb{C}$, or $\mathbb{Z}_p$, or any other favorite field of yours.
Definition 3.3.1. A vector space $V$ over a field $K$ is an abelian group together with a product $K \times V \to V$, $(\alpha, v) \mapsto \alpha v$, satisfying the following conditions:
(a) $1v = v$ for all $v \in V$.
(b) $(\alpha\beta)v = \alpha(\beta v)$ for all $\alpha, \beta \in K$ and $v \in V$.
(c) $\alpha(v + w) = \alpha v + \alpha w$ for all $\alpha \in K$ and $v, w \in V$.
(d) $(\alpha + \beta)v = \alpha v + \beta v$ for all $\alpha, \beta \in K$ and $v \in V$.
Compare this definition with that contained in your linear algebra text; notice that we were able to state the definition more concisely by referring to the notion of an abelian group.
A vector space over $K$ is also called a $K$-vector space. A vector space over $\mathbb{R}$ is also called a real vector space, and a vector space over $\mathbb{C}$ a complex vector space.
Example 3.3.2.
(a) $K^n$ is a vector space over $K$, and any vector subspace of $K^n$ is a vector space over $K$.
(b) The set of $K$-valued functions on a set $X$ is a vector space over $K$, with pointwise addition of functions and the usual multiplication of functions by scalars.
(c) The set of continuous real-valued functions on $[0, 1]$ (or, in fact, on any other metric or topological space) is a vector space over $\mathbb{R}$ with pointwise addition of functions and the usual multiplication of functions by scalars.
(d) The set of polynomials $K[x]$ is a vector space over $K$, as is the set of polynomials of degree at most $n$, for any natural number $n$.
Let’s make a few elementary deductions from the vector space axioms:
Note that the distributive law $\alpha(v + w) = \alpha v + \alpha w$ says that the map $L_\alpha: v \mapsto \alpha v$ is a group homomorphism of $(V, +)$ to itself. It follows that $L_\alpha(0) = 0$ and $L_\alpha(-v) = -L_\alpha(v)$ for any $v \in V$. This translates to $\alpha 0 = 0$ and $\alpha(-v) = -(\alpha v)$.
Similarly, $(\alpha + \beta)v = \alpha v + \beta v$ says that $R_v: \alpha \mapsto \alpha v$ is a group homomorphism of $(K, +)$ to $(V, +)$. Consequently, $0v = 0$ and $(-\alpha)v = -(\alpha v)$. In particular, $(-1)v = -(1v) = -v$.
Lemma 3.3.3. Let $V$ be a vector space over the field $K$. Then for all $\alpha \in K$ and $v \in V$:
(a) $0v = \alpha 0 = 0$.
(b) $\alpha(-v) = -(\alpha v) = (-\alpha)v$.
(c) $(-1)v = -v$.
(d) If $\alpha \ne 0$ and $v \ne 0$, then $\alpha v \ne 0$.

Proof. Parts (a) through (c) were proved above. For (d), suppose $\alpha \ne 0$ but $\alpha v = 0$. Then
$0 = \alpha^{-1} 0 = \alpha^{-1}(\alpha v) = (\alpha^{-1}\alpha)v = 1v = v,$
contradicting the assumption $v \ne 0$. ∎
Definition 3.3.4. Let $V$ and $W$ be vector spaces over $K$. A map $T: V \to W$ is called a linear transformation or linear map if $T(x + y) = T(x) + T(y)$ for all $x, y \in V$, and $T(\alpha x) = \alpha T(x)$ for all $\alpha \in K$ and $x \in V$. An endomorphism of a vector space $V$ is a linear transformation $T: V \to V$.
The kernel of a linear transformation $T: V \to W$ is $\{v \in V : T(v) = 0\}$. The range of $T$ is $T(V)$.
Example 3.3.5.
(a) Fix a polynomial $f(x) \in K[x]$. The map $g(x) \mapsto f(x)g(x)$ is a linear transformation from $K[x]$ into $K[x]$.
(b) The formal derivative $\sum_k \alpha_k x^k \mapsto \sum_k k \alpha_k x^{k-1}$ is a linear transformation from $K[x]$ into $K[x]$.
(c) Let $V$ denote the complex vector space of $\mathbb{C}$-valued continuous functions on the interval $[0, 1]$. The map $f \mapsto f(1/2)$ is a linear transformation from $V$ to $\mathbb{C}$.
(d) Let $V$ denote the complex vector space of $\mathbb{C}$-valued continuous functions on the interval $[0, 1]$, and let $g \in V$. The map $f \mapsto \int_0^1 f(t)g(t)\,dt$ is a linear transformation from $V$ to $\mathbb{C}$.
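Example (b) can be checked concretely by representing polynomials as coefficient lists. The sketch below (the helper names are mine, not the text's) verifies additivity and homogeneity of the formal derivative on polynomials with integer coefficients:

```python
def deriv(p):
    """Formal derivative of a polynomial given by its coefficient list
    [a0, a1, a2, ...], i.e. p(x) = a0 + a1*x + a2*x^2 + ..."""
    return [k * p[k] for k in range(1, len(p))]

def add(p, q):
    """Add two polynomials, padding the shorter coefficient list with zeros."""
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def scale(c, p):
    """Multiply a polynomial by the scalar c."""
    return [c * a for a in p]

p = [1, 2, 0, 5]   # 1 + 2x + 5x^3
q = [0, 3, 4]      # 3x + 4x^2

# Linearity: D(p + q) = D(p) + D(q) and D(c*p) = c*D(p).
print(deriv(add(p, q)) == add(deriv(p), deriv(q)))  # True
print(deriv(scale(7, p)) == scale(7, deriv(p)))     # True
```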
Linear transformations are the homomorphisms in the theory of vector spaces; in fact, a linear transformation $T: V \to W$ between vector spaces is a homomorphism of abelian groups that additionally satisfies $T(\alpha v) = \alpha T(v)$ for all $\alpha \in K$ and $v \in V$. A linear isomorphism between vector spaces is a bijective linear transformation between them.
Definition 3.3.6. A subspace of a vector space V is a (nonempty) subset that is a vector space with the operations inherited from V .
As with groups, we have a criterion for a subset of a vector space to be a subspace, in terms of closure under the vector space operations:
Proposition 3.3.7. For a nonempty subset of a vector space to be a subspace, it suffices that the subset be closed under addition and under scalar multiplication.
Proof. Exercise 3.3.3. ∎
Again as with groups, the kernel of a vector space homomorphism (linear transformation) is a subspace of the domain, and the range of a vector space homomorphism is a subspace of the codomain.
Proposition 3.3.8. Let $T: V \to W$ be a linear map between vector spaces. Then the range of $T$ is a subspace of $W$, and the kernel of $T$ is a subspace of $V$.
Proof. Exercise 3.3.5. ∎
Quotients and homomorphism theorems

If $V$ is a vector space over $K$ and $W$ is a subspace, then in particular $W$ is a subgroup of the abelian group $V$, so we can form the quotient group $V/W$, whose elements are cosets $v + W$ of $W$ in $V$. The additive group operation in $V/W$ is $(x + W) + (y + W) = (x + y) + W$. Now attempt to define a multiplication by scalars on $V/W$ in the obvious way: $\alpha(v + W) = \alpha v + W$. We have to check that this is well defined. But this follows from the closure of $W$ under scalar multiplication; namely, if $v + W = v' + W$, then $\alpha v - \alpha v' = \alpha(v - v') \in \alpha W \subseteq W$. Thus $\alpha v + W = \alpha v' + W$, and the scalar multiplication on $V/W$ is well defined.
Theorem 3.3.9. If $W$ is a subspace of a vector space $V$ over $K$, then $V/W$ has the structure of a vector space, and the quotient map $\pi: v \mapsto v + W$ is a surjective linear map from $V$ to $V/W$ with kernel equal to $W$.
Proof. We know that $V/W$ has the structure of an abelian group and that, moreover, there is a well-defined product $K \times V/W \to V/W$ given by $\alpha(v + W) = \alpha v + W$. It is straightforward to check the remaining vector space axioms. Let us include one verification for the sake of illustration. For $\alpha \in K$ and $v_1, v_2 \in V$,
$\alpha((v_1 + W) + (v_2 + W)) = \alpha((v_1 + v_2) + W) = \alpha(v_1 + v_2) + W = (\alpha v_1 + \alpha v_2) + W = (\alpha v_1 + W) + (\alpha v_2 + W) = \alpha(v_1 + W) + \alpha(v_2 + W).$
Finally, the quotient map is already known to be a group homomorphism. To check that it is linear, we only need to verify that $\pi(\alpha v) = \alpha \pi(v)$ for $v \in V$ and $\alpha \in K$. But this is immediate from the definition of the product: $\alpha v + W = \alpha(v + W)$. ∎
$V/W$ is called the quotient vector space, and $v \mapsto v + W$ the quotient map or quotient homomorphism. We have a homomorphism theorem for vector spaces that is analogous to, and in fact follows from, the homomorphism theorem for groups.
Theorem 3.3.10. (Homomorphism theorem for vector spaces). Let $T: V \to \overline{V}$ be a surjective linear map of vector spaces with kernel $N$. Let $\pi: V \to V/N$ be the quotient map. There is a linear isomorphism $\tilde{T}: V/N \to \overline{V}$ satisfying $\tilde{T} \circ \pi = T$. (See the following diagram.)
[Diagram: $T: V \to \overline{V}$, with the quotient map $\pi: V \to V/N$ and the isomorphism $\tilde{T}: V/N \xrightarrow{\cong} \overline{V}$.]

Proof. The homomorphism theorem for groups (Theorem 2.7.6) gives us an isomorphism of abelian groups $\tilde{T}$ satisfying $\tilde{T} \circ \pi = T$. We have only to verify that $\tilde{T}$ also respects multiplication by scalars. But this follows at once from the definitions: $\tilde{T}(\alpha(x + N)) = \tilde{T}(\alpha x + N) = T(\alpha x) = \alpha T(x) = \alpha \tilde{T}(x + N)$. ∎
The next three propositions are analogues for vector spaces and linear transformations of results that we have established for groups and group homomorphisms in Section 2.7. Each is proved by adapting the proof from the group situation. Some of the details are left to you.
Proposition 3.3.11. (Correspondence theorem for vector spaces) Let $T: V \to \overline{V}$ be a surjective linear map, with kernel $N$. Then $\overline{M} \mapsto T^{-1}(\overline{M})$ is a bijection between subspaces of $\overline{V}$ and subspaces of $V$ containing $N$.
Proof. According to Proposition 2.7.12, $B \mapsto T^{-1}(B)$ is a bijection between the subgroups of $\overline{V}$ and the subgroups of $V$ containing $N$. I leave it as an exercise to verify that $B$ is a vector subspace of $\overline{V}$ if, and only if, $T^{-1}(B)$ is a vector subspace of $V$; see Exercise 3.3.6. ∎
Proposition 3.3.12. Let $T: V \to \overline{V}$ be a surjective linear transformation with kernel $N$. Let $\overline{M}$ be a subspace of $\overline{V}$ and let $M = T^{-1}(\overline{M})$. Then $x + M \mapsto T(x) + \overline{M}$ defines a linear isomorphism of $V/M$ onto $\overline{V}/\overline{M}$. Equivalently,
$(V/N)/(M/N) \cong V/M$
as vector spaces.
Proof. By Proposition 2.7.13, the map $x + M \mapsto T(x) + \overline{M}$ is a group isomorphism from $V/M$ to $\overline{V}/\overline{M}$. But the map also respects multiplication by elements of $K$, as
$\alpha(v + M) = \alpha v + M \mapsto T(\alpha v) + \overline{M} = \alpha T(v) + \overline{M} = \alpha(T(v) + \overline{M}).$
We can identify $\overline{V}$ with $V/N$, by the homomorphism theorem for vector spaces, and this identification carries the subspace $\overline{M}$ to the image of $M$ in $V/N$, namely $M/N$. Therefore
$(V/N)/(M/N) \cong \overline{V}/\overline{M} \cong V/M.$ ∎
Proposition 3.3.13. (Factorization theorem for vector spaces) Let $V$ and $\overline{V}$ be vector spaces over a field $K$, and let $T: V \to \overline{V}$ be a surjective linear map with kernel $M$. Let $N \subseteq M$ be a vector subspace, and let $\pi: V \to V/N$ denote the quotient map. Then there is a surjective linear map $\tilde{T}: V/N \to \overline{V}$ such that $\tilde{T} \circ \pi = T$. (See the following diagram.) The kernel of $\tilde{T}$ is $M/N \subseteq V/N$.
[Diagram: $T: V \to \overline{V}$, with the quotient map $\pi: V \to V/N$ and the induced map $\tilde{T}: V/N \to \overline{V}$.]

Proof. By Proposition 2.7.14, $\tilde{T}: v + N \mapsto T(v)$ defines a group homomorphism from $V/N$ onto $\overline{V}$ with kernel $M/N$. We only have to check that this map respects multiplication by elements of $K$. This follows from the computation
$\tilde{T}(\alpha(v + N)) = \tilde{T}(\alpha v + N) = T(\alpha v) = \alpha T(v) = \alpha \tilde{T}(v + N).$ ∎
Proposition 3.3.14. (Diamond isomorphism theorem for vector spaces) Let $A$ and $N$ be subspaces of a vector space $V$. Let $\pi$ denote the quotient map $\pi: V \to V/N$. Then $\pi^{-1}(\pi(A)) = A + N$ is a subspace of $V$ containing both $A$ and $N$. Furthermore,
$(A + N)/N \cong \pi(A) \cong A/(A \cap N).$
Proof. Exercise 3.3.8. ∎
Bases and dimension
We now consider span, linear independence, bases and dimension for abstract vector spaces.
Definition 3.3.15. A linear combination of a subset $S$ of a vector space $V$ is any element of $V$ of the form $\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_s v_s$, where for all $i$, $\alpha_i \in K$ and $v_i \in S$. The span of $S$ is the set of all linear combinations of $S$. We denote the span of $S$ by $\operatorname{span}(S)$.
The span of the empty set is the set containing only the zero vector, $\{0\}$.
Definition 3.3.16. A subset $S$ of a vector space $V$ is linearly independent if for all natural numbers $s$, for all $\alpha = (\alpha_1, \dots, \alpha_s)^t \in K^s$, and for all sequences $(v_1, \dots, v_s)$ of distinct vectors in $S$, if $\alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_s v_s = 0$, then $\alpha = 0$. Otherwise, $S$ is linearly dependent.
Note that a linearly independent set cannot contain the zero vector. The empty set is linearly independent, since there are no sequences of its elements!
Example 3.3.17. Define $e_n(x) = e^{inx}$ for $n$ an integer and $x \in \mathbb{R}$. Then $\{e_n : n \in \mathbb{Z}\}$ is a linearly independent subset of the (complex) vector space of $\mathbb{C}$-valued functions on $\mathbb{R}$. To show this, we have to prove that for all natural numbers $s$, any set consisting of $s$ of the functions $e_n$ is linearly independent. We prove this statement by induction on $s$.

For $s = 1$, suppose $\alpha \in \mathbb{C}$, $n_1 \in \mathbb{Z}$, and $\alpha e_{n_1} = 0$. Evaluating at $x = 0$ gives $0 = \alpha e^{i n_1 \cdot 0} = \alpha$. This shows that $\{e_{n_1}\}$ is linearly independent.

Now fix $s > 1$ and suppose that any set consisting of fewer than $s$ of the functions $e_n$ is linearly independent. Let $n_1, \dots, n_s$ be distinct integers, $\alpha_1, \dots, \alpha_s \in \mathbb{C}$, and suppose that
$\alpha_1 e_{n_1} + \cdots + \alpha_s e_{n_s} = 0.$
Notice that $e_n e_m = e_{n+m}$ and $e_0 = 1$. Also, the $e_n$ are differentiable, with $(e_n)' = i n\, e_n$. Multiplying our equation by $e_{-n_1}$ and rearranging gives
$-\alpha_1 = \alpha_2 e_{n_2 - n_1} + \cdots + \alpha_s e_{n_s - n_1}.$   (3.3.1)
Now we can differentiate to get
$0 = i(n_2 - n_1)\alpha_2 e_{n_2 - n_1} + \cdots + i(n_s - n_1)\alpha_s e_{n_s - n_1}.$
The integers $n_j - n_1$ for $2 \le j \le s$ are all nonzero and distinct, so the induction hypothesis entails $\alpha_2 = \cdots = \alpha_s = 0$. But then Equation (3.3.1) gives $\alpha_1 = 0$ as well.
Definition 3.3.18. Let V be a vector space over K. A subset of V is called a basis of V if the set is linearly independent and has span equal to V .
Example 3.3.19.
(a) The set $\{1, x, x^2, \dots, x^n\}$ is a basis of the vector space (over $K$) of polynomials in $K[x]$ of degree at most $n$.
(b) The set $\{1, x, x^2, \dots\}$ is a basis of $K[x]$.
Lemma 3.3.20. Suppose $V$ is a vector space over $K$, and $A \subseteq B \subseteq V$ are subsets with $\operatorname{span}(A) = V$ and $B$ linearly independent. Then $A = B$.

Proof. Suppose that $A$ is a proper subset of $B$ and $v \in B \setminus A$. Since $A$ spans $V$, we can write $v$ as a linear combination of elements of $A$. This gives a linear relation
$v - \sum_j \alpha_j v_j = 0$
with $v_j \in A$. But this relation contradicts the linear independence of $B$. ∎
Lemma 3.3.21. Suppose $V$ is a vector space over $K$, and $A \subseteq V$ is a linearly dependent subset. Then $A$ has a proper subset $A'$ with $\operatorname{span}(A') = \operatorname{span}(A)$.
Proof. Since $A$ is linearly dependent, there is a linear relation $\alpha_1 v_1 + \cdots + \alpha_n v_n = 0$ with $v_j \in A$ and $\alpha_1 \ne 0$. Therefore,
$v_1 = -(1/\alpha_1)(\alpha_2 v_2 + \cdots + \alpha_n v_n).$
Let $A' = A \setminus \{v_1\}$. Then $v_1 \in \operatorname{span}(A') \implies A \subseteq \operatorname{span}(A') \implies \operatorname{span}(A) \subseteq \operatorname{span}(A') \implies \operatorname{span}(A) = \operatorname{span}(A')$. ∎
Proposition 3.3.22. Let B be a subset of a vector space V over K. The following properties are equivalent: (a) B is a basis of V .
(b) B is a minimal spanning set for V . That is, B spans V and no proper subset of B spans V .
(c) B is a maximal linearly independent subset of V . That is, B is linearly independent and no subset of V properly containing B is linearly independent.
Proof. The implications (a) $\implies$ (b) and (a) $\implies$ (c) both follow from Lemma 3.3.20. If $B$ is a minimal spanning set, then $B$ is linearly independent, by Lemma 3.3.21, so $B$ is a basis.
Finally, if $B$ is a maximal linearly independent set and $v \in V \setminus B$, then $\{v\} \cup B$ is linearly dependent, so we have a linear relation
$\beta v + \sum_i \alpha_i v_i = 0,$
with not all coefficients equal to zero and $v_i \in B$. Note that $\beta \ne 0$, since otherwise we would have a nontrivial linear relation among elements of $B$. Solving, we obtain
$v = -(1/\beta)\sum_i \alpha_i v_i,$
so $v \in \operatorname{span}(B)$. It follows that $\operatorname{span}(B) = V$. Thus, a maximal linearly independent set is spanning, and therefore is a basis. ∎
Definition 3.3.23. A vector space $V$ is said to be finite-dimensional if it has a finite spanning set. Otherwise, $V$ is said to be infinite-dimensional.
Proposition 3.3.24. If V is finite dimensional, then V has a finite basis.
In fact, any finite spanning set has a subset that is a basis.
i
i
i
i
i
“bookmt” — 2006/8/8 — 12:58 — page 167 — #179
i
i
3.3. VECTOR SPACES
167
Proof. Suppose that $V$ is finite-dimensional and that $S$ is a finite subset with $\operatorname{span}(S) = V$. Since $S$ is finite, $S$ has a subset $B$ that is a minimal spanning set. By Proposition 3.3.22, $B$ is a basis of $V$. ∎
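The proof suggests an algorithm: walk through a finite spanning set and keep each vector that enlarges the span of those kept so far. A sketch over $\mathbb{Q}$ with exact rational arithmetic (the function names are mine; the rank computation is ordinary Gaussian elimination):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of row vectors over Q, by Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def extract_basis(spanning):
    """Greedily keep each vector that enlarges the span (Prop. 3.3.24)."""
    basis = []
    for v in spanning:
        if rank(basis + [v]) > rank(basis):
            basis.append(v)
    return basis

# A redundant spanning set for a 2-dimensional subspace of Q^3.
S = [(1, 2, 0), (2, 4, 0), (0, 1, 1), (1, 3, 1)]
B = extract_basis(S)
print(B)        # [(1, 2, 0), (0, 1, 1)]
print(rank(B))  # 2
```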
Let $V$ be a vector space over $K$. Represent elements of the vector space $V^n$ by 1-by-$n$ matrices (row "vectors") with entries in $V$. For any $n$-by-$s$ matrix $C$ with entries in $K$, right multiplication by $C$ gives a linear map from $V^n$ to $V^s$. Namely, if $C = (c_{i,j})$, then
$[v_1, \dots, v_n]\,C = \Big[\sum_i c_{i,1} v_i, \dots, \sum_i c_{i,s} v_i\Big].$
If $B$ is an $s$-by-$t$ matrix over $K$, then the linear map implemented by $CB$ is the composition of the linear maps implemented by $C$ and by $B$:
$[v_1, \dots, v_n]\,(CB) = ([v_1, \dots, v_n]\,C)\,B,$
as follows by a familiar computation. If $\{v_1, \dots, v_n\}$ is linearly independent and $[v_1, \dots, v_n]\,C = 0$, then $C$ is the zero matrix. See Exercise 3.3.10.
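The identity $[v_1, \dots, v_n](CB) = ([v_1, \dots, v_n]C)B$ can be spot-checked numerically. A sketch with vectors in $\mathbb{Q}^2$ (all helper names are mine):

```python
def vec_add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def vec_scale(c, v):
    return tuple(c * a for a in v)

def row_times_matrix(vs, C):
    """Compute [v1,...,vn] C, where vs is a row of vectors and C is n-by-s."""
    n, s = len(C), len(C[0])
    zero = tuple(0 for _ in vs[0])
    out = []
    for j in range(s):
        w = zero
        for i in range(n):
            w = vec_add(w, vec_scale(C[i][j], vs[i]))
        out.append(w)
    return out

def mat_mul(C, B):
    """Ordinary matrix product of scalar matrices."""
    return [[sum(C[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(C))]

vs = [(1, 0), (2, 3), (0, 1)]   # a row of 3 vectors in Q^2
C = [[1, 2], [0, 1], [3, 0]]    # 3-by-2
B = [[2, 0, 1], [1, 1, 0]]      # 2-by-3

lhs = row_times_matrix(row_times_matrix(vs, C), B)   # ([v]C)B
rhs = row_times_matrix(vs, mat_mul(C, B))            # [v](CB)
print(lhs == rhs)  # True
```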
Proposition 3.3.25. Let $V$ be a finite-dimensional vector space with a spanning set $X = \{x_1, \dots, x_n\}$. Let $Y = \{y_1, \dots, y_s\}$ be a linearly independent subset of $V$. Then $s \le n$.
Proof. Since $X$ is spanning, we can write each vector $y_j$ as a linear combination of elements of $X$:
$y_j = \sum_i c_{i,j} x_i.$
These $s$ equations can be written as a single matrix equation
$[y_1, \dots, y_s] = [x_1, \dots, x_n]\,C,$
where $C$ is the $n$-by-$s$ matrix $C = (c_{i,j})$. If $s > n$ (so that $C$ has more columns than rows), then $\ker(C) \ne \{0\}$; that is, there is a nonzero $a = (\alpha_1, \dots, \alpha_s)^t \in K^s$ such that $Ca = 0$. But then
$\sum_i \alpha_i y_i = [y_1, \dots, y_s]\,a = ([x_1, \dots, x_n]\,C)\,a = [x_1, \dots, x_n]\,(Ca) = 0,$
contradicting the linear independence of $\{y_1, \dots, y_s\}$. ∎
Corollary 3.3.26. Any two bases of a finite-dimensional vector space have the same cardinality.
Proof. It follows from Proposition 3.3.25 that any basis of a finite-dimensional vector space is finite. If a finite-dimensional vector space has two bases $X$ and $Y$, then $|Y| \le |X|$, since $Y$ is linearly independent and $X$ is spanning. But, reversing the roles of $X$ and $Y$, we also have $|X| \le |Y|$. ∎
Definition 3.3.27. The unique cardinality of a basis of a finite-dimensional vector space $V$ is called the dimension of $V$ and denoted $\dim(V)$. If $V$ is infinite-dimensional, we write $\dim(V) = \infty$.
Corollary 3.3.28. Let $W$ be a subspace of a finite-dimensional vector space $V$.
(a) Any linearly independent subset of $W$ is contained in a basis of $W$.
(b) $W$ is finite-dimensional, and $\dim(W) \le \dim(V)$.
(c) Any basis of $W$ is contained in a basis of $V$.

Proof. Let $Y$ be a linearly independent subset of $W$. Since no linearly independent subset of $W$ has more than $\dim(V)$ elements, by Proposition 3.3.25, $Y$ is contained in a linearly independent set $B$ that is maximal among linearly independent subsets of $W$. By Proposition 3.3.22, $B$ is a basis of $W$. This proves (a). Point (b) follows, since $W$ has a basis whose cardinality is no more than $\dim(V)$.
Point (a) applies in particular to $V$: any linearly independent subset of $V$ is contained in a basis of $V$. Therefore, a basis of $W$ is contained in a basis of $V$. ∎
Remark 3.3.29. It follows from Zorn's lemma² that every vector space has a basis. In fact, by Zorn's lemma, any linearly independent set $Y$ in a vector space $V$ is contained in a maximal linearly independent set $B$. By Proposition 3.3.22, $B$ is a basis of $V$.

²Zorn's lemma is an axiom of set theory equivalent to the Axiom of Choice.
Remark 3.3.30. The zero vector space, with one element 0, is zero dimensional. The empty set is its unique basis.
An ordered basis of a finite-dimensional vector space is a finite sequence whose entries are the elements of a basis listed without repetition; that is, an ordered basis is just a basis endowed with a particular linear order. Corresponding to an ordered basis $B = (v_1, \dots, v_n)$ of a vector space $V$ over $K$, we have a linear isomorphism $S_B: V \to K^n$ given by
$S_B: \sum_i \alpha_i v_i \mapsto \sum_i \alpha_i \hat{e}_i = (\alpha_1, \alpha_2, \dots, \alpha_n)^t,$
where $(\hat{e}_1, \dots, \hat{e}_n)$ is the standard ordered basis of $K^n$. $S_B(v)$ is called the coordinate vector of $v$ with respect to $B$.
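The coordinate vector $S_B(v)$ can be computed by solving the linear system whose columns are the basis vectors. A sketch over $\mathbb{Q}$ with exact arithmetic (the function name is mine; it assumes the input is an ordered basis of $\mathbb{Q}^n$, so the system is square and nonsingular):

```python
from fractions import Fraction

def coordinates(basis, v):
    """Solve sum_i alpha_i * basis[i] = v by Gaussian elimination over Q,
    returning the coordinate vector (alpha_1, ..., alpha_n)."""
    n = len(basis)
    # Augmented system: columns are the basis vectors, right side is v.
    m = [[Fraction(basis[j][i]) for j in range(n)] + [Fraction(v[i])]
         for i in range(len(v))]
    for col in range(n):
        pivot = next(i for i in range(col, len(m)) if m[i][col] != 0)
        m[col], m[pivot] = m[pivot], m[col]
        m[col] = [x / m[col][col] for x in m[col]]
        for i in range(len(m)):
            if i != col and m[i][col] != 0:
                m[i] = [a - m[i][col] * b for a, b in zip(m[i], m[col])]
    return tuple(m[i][n] for i in range(n))

# Ordered basis B = ((1, 1), (1, -1)) of Q^2, and v = (3, 1).
B = [(1, 1), (1, -1)]
v = (3, 1)
alpha = coordinates(B, v)
print(alpha)  # (Fraction(2, 1), Fraction(1, 1)): v = 2*(1,1) + 1*(1,-1)
```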
Proposition 3.3.31. Any two $n$-dimensional vector spaces over $K$ are linearly isomorphic.

Proof. The case $n = 0$ is left to the reader. For $n \ge 1$, any two $n$-dimensional vector spaces over $K$ are each isomorphic to $K^n$, and hence isomorphic to each other. ∎
This proposition reveals that (finite–dimensional) vector spaces are not very interesting, as they are completely classified by their dimension. That is why the actual subject of finite–dimensional linear algebra is not vector spaces but rather linear maps, which have more interesting structure than vector spaces themselves.
Proposition 3.3.32. (The universal property of bases.) Let $V$ be a vector space over $K$ and let $S$ be a basis of $V$. Then any function $f: S \to W$ from $S$ into a vector space $W$ extends uniquely to a linear map $T: V \to W$.
Proof. We will assume that $S = \{v_1, \dots, v_n\}$ is finite, in order to simplify the notation, although the result is equally valid if $S$ is infinite.
Let $f: S \to W$ be a function. Any element $v \in V$ has a unique expression as a linear combination of elements of $S$, $v = \sum_i \alpha_i v_i$. There is only one possible way to define $T(v)$, namely $T(v) = \sum_i \alpha_i f(v_i)$. It is then straightforward to check that $T$ is linear. ∎
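The proof's formula $T(v) = \sum_i \alpha_i f(v_i)$ is easy to realize concretely when $S$ is the standard basis of $\mathbb{Q}^n$, since then the coordinates $\alpha_i$ are just the entries of $v$. A sketch (the names are mine, not the text's):

```python
def extend_linearly(f_values):
    """Given the values f(e_i) on the standard basis of Q^n (as a list of
    tuples), return the unique linear map T with T(e_i) = f_values[i]."""
    def T(v):
        # v = sum_i v[i] * e_i, so T(v) = sum_i v[i] * f(e_i).
        m = len(f_values[0])
        return tuple(sum(v[i] * f_values[i][j] for i in range(len(v)))
                     for j in range(m))
    return T

# Prescribe arbitrary values on the standard basis of Q^2, target Q^3.
f_vals = [(1, 0, 2), (0, 1, 1)]   # f(e1), f(e2)
T = extend_linearly(f_vals)

print(T((1, 0)))   # (1, 0, 2): T agrees with f on the basis
print(T((3, 4)))   # (3, 4, 10) = 3*f(e1) + 4*f(e2)

# Additivity check:
u, w = (1, 2), (5, -1)
print(T(tuple(a + b for a, b in zip(u, w))) ==
      tuple(a + b for a, b in zip(T(u), T(w))))  # True
```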
Direct sums and complements
The (external) direct sum of several vector spaces $V_1, V_2, \dots, V_n$ over a field $K$ is the Cartesian product $V_1 \times V_2 \times \cdots \times V_n$ with component-by-component operations:
$(a_1, a_2, \dots, a_n) + (b_1, b_2, \dots, b_n) = (a_1 + b_1, a_2 + b_2, \dots, a_n + b_n)$
and
$\alpha(a_1, a_2, \dots, a_n) = (\alpha a_1, \alpha a_2, \dots, \alpha a_n),$
for $a_i, b_i \in V_i$ and $\alpha \in K$. The direct sum is denoted by $V_1 \oplus V_2 \oplus \cdots \oplus V_n$.
How can we recognize that a vector space $V$ is isomorphic to the direct sum of several subspaces $A_1, A_2, \dots, A_n$? It is necessary and sufficient that $V$ be isomorphic to the direct product of the $A_i$, regarded as abelian groups.
Proposition 3.3.33. Let $V$ be a vector space over a field $K$ with subspaces $A_1, \dots, A_s$ such that $V = A_1 + \cdots + A_s$. Then the following conditions are equivalent:
(a) $(a_1, \dots, a_s) \mapsto a_1 + \cdots + a_s$ is a group isomorphism of $A_1 \times \cdots \times A_s$ onto $V$.
(b) $(a_1, \dots, a_s) \mapsto a_1 + \cdots + a_s$ is a linear isomorphism of $A_1 \oplus \cdots \oplus A_s$ onto $V$.
(c) Each element $x \in V$ can be expressed as a sum $x = a_1 + \cdots + a_s$, with $a_i \in A_i$ for all $i$, in exactly one way.
(d) If $0 = a_1 + \cdots + a_s$, with $a_i \in A_i$ for all $i$, then $a_i = 0$ for all $i$.

Proof. The equivalence of (a), (c), and (d) is by Proposition 3.5.1. Clearly (b) implies (a). We have only to show that if (a) holds, then the map $(a_1, \dots, a_s) \mapsto a_1 + \cdots + a_s$ respects multiplication by elements of $K$. This is immediate from the computation
$\alpha(a_1, \dots, a_s) = (\alpha a_1, \dots, \alpha a_s) \mapsto \alpha a_1 + \cdots + \alpha a_s = \alpha(a_1 + \cdots + a_s).$ ∎
If the conditions of the proposition are satisfied, we say that $V$ is the internal direct sum of the subspaces $A_i$, and we write $V = A_1 \oplus \cdots \oplus A_s$. In particular, if $M$ and $N$ are subspaces of $V$ such that $M + N = V$ and $M \cap N = \{0\}$, then $V = M \oplus N$.
Let $N$ be a subspace of a vector space $V$. A subspace $M$ of $V$ is said to be a complement of $N$ if $V = M \oplus N$. Subspaces of finite-dimensional vector spaces always have a complement, as we shall now explain.
Proposition 3.3.34. Let $T: V \to W$ be a surjective linear map of a finite-dimensional vector space $V$ onto a vector space $W$. Then $T$ admits a right inverse; that is, there exists a linear map $S: W \to V$ such that $T \circ S = \mathrm{id}_W$.
Proof. First, let's check that $W$ is finite-dimensional, with dimension no greater than $\dim(V)$. If $\{v_1, \dots, v_n\}$ is a basis of $V$, then $\{T(v_1), \dots, T(v_n)\}$ is a spanning subset of $W$, so it contains a basis of $W$ as a subset.
Now let $\{w_1, \dots, w_s\}$ be a basis of $W$. For each basis element $w_i$, let $x_i$ be a preimage of $w_i$ in $V$ (i.e., choose $x_i$ such that $T(x_i) = w_i$). The map $w_i \mapsto x_i$ extends uniquely to a linear map $S: W \to V$, defined by $S(\sum_i \alpha_i w_i) = \sum_i \alpha_i x_i$, according to Proposition 3.3.32. We have $T \circ S(\sum_i \alpha_i w_i) = T(\sum_i \alpha_i x_i) = \sum_i \alpha_i T(x_i) = \sum_i \alpha_i w_i$. Thus $T \circ S = \mathrm{id}_W$. ∎
In the situation of the previous proposition, let $W'$ denote the image of $S$. I claim that
$V = \ker(T) \oplus W' \cong \ker(T) \oplus W.$
Suppose $v \in \ker(T) \cap W'$. Since $v \in W'$, there is a $w \in W$ such that $v = S(w)$. But then $0 = T(v) = T(S(w)) = w$, and therefore $v = S(w) = S(0) = 0$. This shows that $\ker(T) \cap W' = \{0\}$. For any $v \in V$, we can write $v = S \circ T(v) + (v - S \circ T(v))$. The first summand is evidently in $W'$, and the second is in the kernel of $T$, as $T(v) = T \circ S \circ T(v)$. This shows that $\ker(T) + W' = V$. We have shown that $V = \ker(T) \oplus W'$.
Finally, note that $S$ is an isomorphism of $W$ onto $W'$, so we also have $V \cong \ker(T) \oplus W$. We have shown the following:
Proposition 3.3.35. If $T: V \to W$ is a linear map and $V$ is finite-dimensional, then $V \cong \ker(T) \oplus \operatorname{range}(T)$. In particular, $\dim(V) = \dim(\ker(T)) + \dim(\operatorname{range}(T))$.
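Over a finite field, the dimension count in Proposition 3.3.35 can be verified by exhaustive enumeration: $|\ker(T)| \cdot |\operatorname{range}(T)| = |V|$, so the dimensions (base-$p$ logarithms of the cardinalities) add up to $\dim(V)$. A sketch over $\mathbb{Z}_5$ (the particular map $T$ is my own example, not from the text):

```python
from itertools import product
import math

p = 5  # work over the field Z_5

def apply(T, v):
    """Apply the matrix T (rows over Z_p) to the vector v in Z_p^n."""
    return tuple(sum(r[i] * v[i] for i in range(len(v))) % p for r in T)

# A linear map T: (Z_5)^3 -> (Z_5)^2 of rank 1 (second row is twice the first).
T = [[1, 2, 0],
     [2, 4, 0]]

domain = list(product(range(p), repeat=3))
kernel = [v for v in domain if apply(T, v) == (0, 0)]
image = {apply(T, v) for v in domain}

# dim ker = log_5 |kernel|, dim range = log_5 |image|; they should sum to 3.
dim_ker = round(math.log(len(kernel), p))
dim_range = round(math.log(len(image), p))
print(dim_ker, dim_range)        # 2 1
print(dim_ker + dim_range == 3)  # True
```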
Now let $V$ be a finite-dimensional vector space and let $N$ be a subspace. The quotient map $\pi: V \to V/N$ is a surjective linear map with kernel $N$. Let $S$ be a right inverse of $\pi$, as in the proposition, and let $M$ be the image of $S$. The preceding discussion shows that $V = N \oplus M \cong N \oplus V/N$. We have proved the following:
Proposition 3.3.36. Let $V$ be a finite-dimensional vector space and let $N$ be a subspace. Then $V \cong N \oplus V/N$. In particular, $\dim(V) = \dim(N) + \dim(V/N)$.
Corollary 3.3.37. Let $V$ be a finite-dimensional vector space and let $N$ be a subspace. Then there exists a subspace $M$ of $V$ such that $V = N \oplus M$.
Warning: Complements of a subspace are not unique. For example, both $\{(0, 0, c) : c \in \mathbb{R}\}$ and $\{(0, c, c) : c \in \mathbb{R}\}$ are complements of $\{(a, b, 0) : a, b \in \mathbb{R}\}$ in $\mathbb{R}^3$.
Exercises 3.3

3.3.1. Show that the intersection of an arbitrary family of linear subspaces of a vector space is a linear subspace.
3.3.2. Let $S$ be a subset of a vector space $V$. Show that $\operatorname{span}(S) = \operatorname{span}(\operatorname{span}(S))$. Show that $\operatorname{span}(S)$ is the unique smallest linear subspace of $V$ containing $S$ as a subset, and that it is the intersection of all linear subspaces of $V$ that contain $S$ as a subset.
3.3.3. Prove Proposition 3.3.7.
3.3.4. Show that any composition of linear transformations is linear. Show that the inverse of a linear isomorphism is linear.
3.3.5. Let $T: V \to W$ be a linear map between vector spaces. Show that the range of $T$ is a subspace of $W$ and the kernel of $T$ is a subspace of $V$.
3.3.6. Prove Proposition 3.3.11.
3.3.7. Give another proof of Proposition 3.3.12 by adapting the proof of Proposition 2.7.13 rather than by using that proposition.
3.3.8. Prove Proposition 3.3.14 by using Proposition 2.7.18, or by adapting the proof of that proposition.
3.3.9. Let $A$ and $B$ be finite-dimensional subspaces of a not necessarily finite-dimensional vector space $V$. Show that $A + B$ is finite-dimensional and that $\dim(A + B) + \dim(A \cap B) = \dim(A) + \dim(B)$.
3.3.10. Let $V$ be a vector space over $K$.
(a) Let $A$ and $B$ be matrices over $K$ of size $n$-by-$s$ and $s$-by-$t$, respectively. Show that for $[v_1, \dots, v_n] \in V^n$,
$[v_1, \dots, v_n]\,(AB) = ([v_1, \dots, v_n]\,A)\,B.$
(b) Show that if $\{v_1, \dots, v_n\}$ is a linearly independent subset of $V$ and $[v_1, \dots, v_n]\,A = 0$, then $A = 0$.
3.3.11. Show that the following conditions are equivalent for a vector space V :