The concept of invertibility is introduced quite early in the study of functions. Fortunately, many of the intrinsic properties of functions are shared by their inverses. For example, in calculus we learn that the properties of being continuous or differentiable are generally retained by the inverse functions. We see in this section (Theorem 2.17) that the inverse of a linear transformation is also linear. This result greatly aids us in the study of inverses of matrices. As one might expect from Section 2.3, the inverse of the left-multiplication transformation L_A (when it exists) can be used to determine properties of the inverse of the matrix A.
In the remainder of this section, we apply many of the results about invertibility to the concept of isomorphism. We will see that finite-dimensional vector spaces (over F) of equal dimension may be identified. These ideas will be made precise shortly.
The facts about inverse functions presented in Appendix B are, of course, true for linear transformations. Nevertheless, we repeat some of the definitions for use in this section.
Let V and W be vector spaces, and let T: V→W be linear. A function U: W→V is said to be an inverse of T if TU = I_W and UT = I_V. If T has an inverse, then T is said to be invertible. As noted in Appendix B, if T is invertible, then the inverse of T is unique; we denote it by T⁻¹.
The following facts hold for invertible functions T and U.
1. (TU)⁻¹ = U⁻¹T⁻¹.
2. (T⁻¹)⁻¹ = T; in particular, T⁻¹ is invertible.
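For matrices, which represent invertible linear maps on finite-dimensional spaces, these two identities can be checked numerically. A minimal sketch using numpy (the matrices here are arbitrary invertible examples, not taken from the text):

```python
import numpy as np

# Two arbitrary invertible 2x2 matrices standing in for T and U.
T = np.array([[2.0, 1.0], [1.0, 1.0]])
U = np.array([[1.0, 3.0], [0.0, 1.0]])

# Fact 1: (TU)^(-1) equals U^(-1) T^(-1) -- note the reversed order.
lhs = np.linalg.inv(T @ U)
rhs = np.linalg.inv(U) @ np.linalg.inv(T)
assert np.allclose(lhs, rhs)

# Fact 2: inverting twice recovers the original map.
assert np.allclose(np.linalg.inv(np.linalg.inv(T)), T)
```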
We often use the fact that a function is invertible if and only if it is both one-to-one and onto. We can therefore restate Theorem 2.5 as follows.
Let T: V→W be a linear transformation, where V and W are finite-dimensional spaces of equal dimension. Then T is invertible if and only if rank(T) = dim(V).
Let T: P₁(R)→R² be the linear transformation defined by T(a + bx) = (a, a + b). The reader can verify directly that T⁻¹: R²→P₁(R) is defined by T⁻¹(c, d) = c + (d − c)x. Observe that T⁻¹ is also linear. As Theorem 2.17 demonstrates, this is true in general.
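The claim that T and T⁻¹ undo one another can be checked mechanically. A quick sketch in Python, representing the polynomial a + bx by the pair (a, b); the formulas used are the ones commonly given for this example (T(a + bx) = (a, a + b)):

```python
def T(a, b):
    """T(a + bx) = (a, a + b)."""
    return (a, a + b)

def T_inv(c, d):
    """T^(-1)(c, d) = c + (d - c)x, returned as the coefficient pair (c, d - c)."""
    return (c, d - c)

# T_inv undoes T, and T undoes T_inv, on sample inputs.
for a, b in [(0, 0), (1, 2), (-3, 5)]:
    assert T_inv(*T(a, b)) == (a, b)
for c, d in [(0, 0), (1, 2), (-3, 5)]:
    assert T(*T_inv(c, d)) == (c, d)
```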
Let V and W be vector spaces, and let T: V→W be linear and invertible. Then T⁻¹: W→V is linear.
Proof.
Let y₁, y₂ ∈ W and c ∈ F. Because T is onto and one-to-one, there exist unique vectors x₁ and x₂ such that T(x₁) = y₁ and T(x₂) = y₂. Thus x₁ = T⁻¹(y₁) and x₂ = T⁻¹(y₂); so
T⁻¹(cy₁ + y₂) = T⁻¹(cT(x₁) + T(x₂)) = T⁻¹(T(cx₁ + x₂)) = cx₁ + x₂ = cT⁻¹(y₁) + T⁻¹(y₂).
Hence T⁻¹ is linear.
Let T be an invertible linear transformation from V to W. Then V is finite-dimensional if and only if W is finite-dimensional. In this case, dim(V)=dim(W)
Proof.
Suppose that V is finite-dimensional. Let β = {x₁, x₂, …, xₙ} be a basis for V. By Theorem 2.2 (p. 68), T(β) spans R(T) = W; hence W is finite-dimensional. A similar argument, using T⁻¹, shows that if W is finite-dimensional, then so is V.
Now suppose that V and W are finite-dimensional. Because T is one-to-one and onto, we have
nullity(T) = 0 and rank(T) = dim(R(T)) = dim(W).
So by the dimension theorem (p. 70), it follows that dim(V) = dim(W).
It now follows immediately from Theorem 2.5 (p. 71) that if T is a linear transformation between vector spaces of equal (finite) dimension, then the conditions of being invertible, one-to-one, and onto are all equivalent.
We are now ready to define the inverse of a matrix. The reader should note the analogy with the inverse of a linear transformation.
Let A be an n×n matrix. Then A is invertible if there exists an n×n matrix B such that AB = BA = I.
If A is invertible, then the matrix B such that AB = BA = I is unique. (If C were another such matrix, then C = CI = C(AB) = (CA)B = IB = B.) The matrix B is called the inverse of A and is denoted by A⁻¹.
The reader should verify that the inverse of
[5 7; 2 3] is [3 −7; −2 5].
In Section 3.2, we will learn a technique for computing the inverse of a matrix. At this point, we develop a number of results that relate the inverses of matrices to the inverses of linear transformations.
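Although the general technique for computing inverses comes later, a pair of candidate inverses can always be verified by multiplying them and comparing with the identity. A sketch in numpy (the particular 2×2 pair below is an assumed illustration):

```python
import numpy as np

A = np.array([[5, 7],
              [2, 3]])
B = np.array([[3, -7],
              [-2, 5]])

# B is the inverse of A exactly when AB = BA = I.
I = np.eye(2)
assert np.allclose(A @ B, I)
assert np.allclose(B @ A, I)
```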
Let V and W be finite-dimensional vector spaces with ordered bases β and γ, respectively, and let T: V→W be linear. Then T is invertible if and only if [T]_γ^β is invertible. Furthermore, [T⁻¹]_β^γ = ([T]_γ^β)⁻¹.
Proof.
Suppose that T is invertible. By the Corollary to Theorem 2.17, we have dim(V) = dim(W); let n = dim(V), so that [T]_γ^β is an n×n matrix. Now T⁻¹: W→V satisfies TT⁻¹ = I_W and T⁻¹T = I_V. Thus
I_n = [I_V]_β = [T⁻¹T]_β = [T⁻¹]_β^γ [T]_γ^β.
Similarly, [T]_γ^β [T⁻¹]_β^γ = I_n. So [T]_γ^β is invertible, and ([T]_γ^β)⁻¹ = [T⁻¹]_β^γ.
Now suppose that A = [T]_γ^β is invertible. Then there exists an n×n matrix B such that AB = BA = I_n. By Theorem 2.6 (p. 72), there exists a linear transformation U: W→V such that
U(w_j) = Σ_{i=1}^{n} B_ij v_i for j = 1, 2, …, n,
where γ = {w₁, w₂, …, wₙ} and β = {v₁, v₂, …, vₙ}. It follows that [U]_β^γ = B. To show that U = T⁻¹, observe that
[UT]_β = [U]_β^γ [T]_γ^β = BA = I_n = [I_V]_β
by Theorem 2.11 (p. 89). So UT = I_V, and similarly TU = I_W. Hence T is invertible with T⁻¹ = U.
Let β and γ be the standard ordered bases of P₁(R) and R², respectively. For T as in Example 1, we have
[T]_γ^β = [1 0; 1 1] and [T⁻¹]_β^γ = [1 0; −1 1].
It can be verified by matrix multiplication that each matrix is the inverse of the other.
Let V be a finite-dimensional vector space with an ordered basis β, and let T: V→V be linear. Then T is invertible if and only if [T]_β is invertible. Furthermore, [T⁻¹]_β = ([T]_β)⁻¹.
Proof.
Exercise.
Let A be an n×n matrix. Then A is invertible if and only if L_A is invertible. Furthermore, (L_A)⁻¹ = L_{A⁻¹}.
Proof.
Exercise.
The notion of invertibility may be used to formalize what the reader may already have observed: certain vector spaces strongly resemble one another except for the form of their vectors. For example, in the case of M₂×₂(F) and F⁴, if we associate to each matrix [a b; c d] the 4-tuple (a, b, c, d), we see that sums and scalar products associate in a similar manner; that is, in terms of the vector space structure, these two vector spaces may be considered identical or isomorphic.
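This association can be sketched in code; the point is that flattening a 2×2 matrix into a 4-tuple preserves sums and scalar products and is reversible (the function name `flatten` is illustrative, not from the text):

```python
import numpy as np

def flatten(M):
    """Send [[a, b], [c, d]] to the 4-vector (a, b, c, d)."""
    return M.reshape(4)

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
c = 3

# The map is linear: it respects sums and scalar products ...
assert np.array_equal(flatten(A + B), flatten(A) + flatten(B))
assert np.array_equal(flatten(c * A), c * flatten(A))
# ... and it is invertible: reshaping back recovers the matrix.
assert np.array_equal(flatten(A).reshape(2, 2), A)
```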
Let V and W be vector spaces. We say that V is isomorphic to W if there exists a linear transformation T: V→W that is invertible. Such a linear transformation is called an isomorphism from V onto W.
We leave as an exercise (see Exercise 13) the proof that “is isomorphic to” is an equivalence relation. (See Appendix A.) So we need only say that V and W are isomorphic.
Define T: F²→P₁(F) by T(a₁, a₂) = a₁ + a₂x. It is easily checked that T is an isomorphism; so F² is isomorphic to P₁(F).
Define T: P₃(R)→M₂×₂(R) by
T(f) = [f(1) f(2); f(3) f(4)].
It is easily verified that T is linear. By use of the Lagrange interpolation formula in Section 1.6, it can be shown (compare with Exercise 22) that T(f) = O only when f is the zero polynomial. Thus T is one-to-one. Moreover, because dim(P₃(R)) = dim(M₂×₂(R)), it follows that T is invertible by Theorem 2.5 (p. 71). Hence P₃(R) is isomorphic to M₂×₂(R).
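The one-to-one claim can also be seen concretely: with respect to the basis {1, x, x², x³}, evaluating a cubic at four distinct points is given by a 4×4 Vandermonde-type matrix, and that matrix is invertible. A numerical sketch (the evaluation points 1, 2, 3, 4 are assumed here):

```python
import numpy as np

# Row for x evaluates a0 + a1*x + a2*x^2 + a3*x^3 at that point.
V = np.array([[x**j for j in range(4)] for x in [1, 2, 3, 4]], dtype=float)

# A nonzero determinant means the evaluation map is invertible,
# so only the zero polynomial maps to the zero matrix.
assert abs(np.linalg.det(V)) > 1e-9
```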
In each of Examples 4 and 5, the reader may have observed that isomorphic vector spaces have equal dimensions. As the next theorem shows, this is no coincidence.
Let V and W be finite-dimensional vector spaces (over the same field). Then V is isomorphic to W if and only if dim(V)=dim(W)
Proof.
Suppose that V is isomorphic to W and that T: V→W is an isomorphism. Because T is invertible, the corollary to Theorem 2.17 gives dim(V) = dim(W).
Now suppose that dim(V) = dim(W), and let β = {v₁, v₂, …, vₙ} and γ = {w₁, w₂, …, wₙ} be bases for V and W, respectively. By Theorem 2.6 (p. 72), there exists a linear transformation T: V→W such that T(vᵢ) = wᵢ for i = 1, 2, …, n. Using Theorem 2.2 (p. 68), we have R(T) = span(T(β)) = span(γ) = W. So T is onto. From Theorem 2.5 (p. 71), we have that T is also one-to-one. Hence T is an isomorphism.
By the lemma to Theorem 2.18, if V and W are isomorphic, then either both of V and W are finite-dimensional or both are infinite-dimensional.
Let V be a vector space over F. Then V is isomorphic to Fⁿ if and only if dim(V) = n.
Up to this point, we have associated linear transformations with their matrix representations. We are now in a position to prove that, as a vector space, the collection of all linear transformations between two given vector spaces may be identified with the appropriate vector space of m×n matrices.
Let V and W be finite-dimensional vector spaces over F of dimensions n and m, respectively, and let β and γ be ordered bases for V and W, respectively. Then the function Φ_γ^β: L(V, W)→M_{m×n}(F), defined by Φ_γ^β(T) = [T]_γ^β for T ∈ L(V, W), is an isomorphism.
Proof.
By Theorem 2.8 (p. 83), Φ_γ^β is linear. Hence we must show that Φ_γ^β is one-to-one and onto. This is accomplished if we show that for every m×n matrix A, there exists a unique linear transformation T: V→W such that Φ_γ^β(T) = A. Let β = {v₁, v₂, …, vₙ}, γ = {w₁, w₂, …, w_m}, and let A be a given m×n matrix. By Theorem 2.6 (p. 72), there exists a unique linear transformation T: V→W such that
T(v_j) = Σ_{i=1}^{m} A_ij w_i for 1 ≤ j ≤ n.
But this means that [T]_γ^β = A, or Φ_γ^β(T) = A. So Φ_γ^β is an isomorphism.
Let V and W be finite-dimensional vector spaces of dimensions n and m, respectively. Then L(V, W) is finite-dimensional of dimension mn.
Proof.
The proof follows from Theorems 2.20 and 2.19 and the fact that dim(M_{m×n}(F)) = mn.
We conclude this section with a result that allows us to see more clearly the relationship between linear transformations defined on abstract finite-dimensional vector spaces and linear transformations from Fⁿ to Fᵐ.
We begin by naming the transformation x ↦ [x]_β that was introduced in Section 2.2.
Let β be an ordered basis for an n-dimensional vector space V over the field F. The standard representation of V with respect to β is the function ϕ_β: V→Fⁿ defined by ϕ_β(x) = [x]_β for each x ∈ V.
Let β = {(1, 0), (0, 1)} and γ = {(1, 2), (3, 4)} be ordered bases for R². For x = (1, −2), we have
ϕ_β(x) = [x]_β = (1, −2) and ϕ_γ(x) = [x]_γ = (−5, 2).
We observed earlier that ϕ_β is a linear transformation. The next theorem tells us much more.
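Computing [x]_γ for a non-standard basis amounts to solving a linear system whose coefficient columns are the basis vectors. A sketch in numpy, using the basis γ = {(1, 2), (3, 4)} for illustration:

```python
import numpy as np

# Columns of G are the vectors of the ordered basis gamma.
G = np.array([[1.0, 3.0],
              [2.0, 4.0]])

x = np.array([1.0, -2.0])

# phi_gamma(x) = [x]_gamma is the solution of G @ coords = x.
coords = np.linalg.solve(G, x)
assert np.allclose(coords, [-5.0, 2.0])   # x = -5*(1,2) + 2*(3,4)
assert np.allclose(G @ coords, x)
```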
For any finite-dimensional vector space V with ordered basis β, ϕ_β is an isomorphism.
Proof.
Exercise.
This theorem provides us with an alternate proof that an n-dimensional vector space is isomorphic to Fⁿ (see the corollary to Theorem 2.19).
Let V and W be vector spaces of dimension n and m, respectively, and let T: V→W be a linear transformation. Define A = [T]_γ^β, where β and γ are arbitrary ordered bases of V and W, respectively. We are now able to use ϕ_β and ϕ_γ to study the relationship between the transformations T and L_A: Fⁿ→Fᵐ.
Let us first consider Figure 2.2. Notice that there are two composites of linear transformations that map V into Fᵐ:
1. Map V into Fⁿ with ϕ_β and follow this transformation with L_A; this yields the composite L_A ϕ_β.
2. Map V into W with T and follow it by ϕ_γ to obtain the composite ϕ_γ T.
These two composites are depicted by the dashed arrows in the diagram. By a simple reformulation of Theorem 2.14 (p. 92), we may conclude that
L_A ϕ_β = ϕ_γ T;
that is, the diagram “commutes.” Heuristically, this relationship indicates that after V and W are identified with Fⁿ and Fᵐ via ϕ_β and ϕ_γ, respectively, we may “identify” T with L_A.
Recall the linear transformation T: P₃(R)→P₂(R) defined by T(f(x)) = f′(x) (see Example 4 of Section 2.2). Let β and γ be the standard ordered bases for P₃(R) and P₂(R), respectively, and let ϕ_β: P₃(R)→R⁴ and ϕ_γ: P₂(R)→R³ be the corresponding standard representations. If A = [T]_γ^β, then
A = [0 1 0 0; 0 0 2 0; 0 0 0 3].
Consider the polynomial p(x) = 2 + x − 3x² + 5x³. We show that L_A ϕ_β(p(x)) = ϕ_γ T(p(x)). Now
L_A ϕ_β(p(x)) = A (2, 1, −3, 5)ᵗ = (1, −6, 15)ᵗ.
But since T(p(x)) = p′(x) = 1 − 6x + 15x², we have
ϕ_γ T(p(x)) = (1, −6, 15)ᵗ.
So L_A ϕ_β(p(x)) = ϕ_γ T(p(x)).
Try repeating Example 7 with different polynomials p(x).
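The computation in Example 7 can be replayed in a few lines. A sketch assuming the standard bases for P₃(R) and P₂(R), with A the matrix of the derivative map:

```python
import numpy as np

# Matrix of T(f) = f' with respect to {1, x, x^2, x^3} and {1, x, x^2}.
A = np.array([[0, 1, 0, 0],
              [0, 0, 2, 0],
              [0, 0, 0, 3]])

# Coordinates of p(x) = 2 + x - 3x^2 + 5x^3 relative to beta.
p_beta = np.array([2, 1, -3, 5])

# L_A phi_beta(p): multiply the coordinate vector by A.
lhs = A @ p_beta

# phi_gamma T(p): coordinates of p'(x) = 1 - 6x + 15x^2 relative to gamma.
rhs = np.array([1, -6, 15])

assert np.array_equal(lhs, rhs)  # the diagram commutes for this p
```

Trying other polynomials p(x), as the text suggests, only changes the coordinate vectors; the equality persists.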
Label the following statements as true or false. In each part, V and W are vector spaces with ordered (finite) bases α and β, respectively, T: V→W is linear, and A and B are matrices.
(a) ([T]βα)−1=[T−1]βα
(b) T is invertible if and only if T is one-to-one and onto.
(c) T = L_A, where A = [T]_β^α.
(d) M₂×₃(F) is isomorphic to F⁵.
(e) Pn(F)
(f) AB = I implies that A and B are invertible.
(g) If A is invertible, then (A−1)−1=A
(h) A is invertible if and only if L_A is invertible.
(i) A must be square in order to possess an inverse.
For each of the following linear transformations T, determine whether T is invertible and justify your answer.
(a) T:R2→R3
(b) T:R2→R3
(c) T:R3→R3
(d) T:P3(R)→P2(R)
(e) T:M2×2(R)→P2(R)
(f) T:M2×2(R)→M2×2(R)
Which of the following pairs of vector spaces are isomorphic? Justify your answers.
(a) F³ and P₃(F)
(b) F⁴ and P₃(F)
(c) M₂×₂(R) and P₃(R)
(d) V = {A ∈ M₂×₂(R): tr(A) = 0} and R⁴
Let A and B be n×n invertible matrices. Prove that AB is invertible and (AB)⁻¹ = B⁻¹A⁻¹.
† Let A be invertible. Prove that Aᵗ is invertible and (Aᵗ)⁻¹ = (A⁻¹)ᵗ.
Prove that if A is invertible and AB = O, then B = O.
Let A be an n×n matrix.
(a) Suppose that A² = O. Prove that A is not invertible.
(b) Suppose that AB = O for some nonzero n×n matrix B. Could A be invertible? Justify your answer.
Prove Corollaries 1 and 2 of Theorem 2.18.
† Let A and B be n×n matrices such that AB is invertible.
(a) Prove that A and B are invertible. Hint: See Exercise 12 of Section 2.3.
(b) Give an example to show that a product of nonsquare matrices can be invertible even though the factors, by definition, are not.
† Let A and B be n×n matrices such that AB = Iₙ.
(a) Use Exercise 9 to conclude that A and B are invertible.
(b) Prove that A = B⁻¹ (and hence B = A⁻¹).
(c) State and prove analogous results for linear transformations defined on finite-dimensional vector spaces.
Verify that the transformation in Example 5 is one-to-one.
Prove Theorem 2.21.
Let ∼ mean “is isomorphic to.” Prove that ∼ is an equivalence relation on the class of vector spaces over F.
Let
V = {[a a+b; 0 c] : a, b, c ∈ F}.
Construct an isomorphism from V to F³.
Let V and W be n-dimensional vector spaces, and let T: V→W be a linear transformation. Suppose that β is a basis for V. Prove that T is an isomorphism if and only if T(β) is a basis for W.
Let B be an n×n invertible matrix. Define Φ: M_{n×n}(F)→M_{n×n}(F) by Φ(A) = B⁻¹AB. Prove that Φ is an isomorphism.
† Let V and W be finite-dimensional vector spaces and T: V→W be an isomorphism. Let V₀ be a subspace of V.
(a) Prove that T(V₀) is a subspace of W.
(b) Prove that dim(V₀) = dim(T(V₀)).
Repeat Example 7 with the polynomial p(x)=1+x+2x2+x3
In Example 5 of Section 2.1, the mapping T: M₂×₂(R)→M₂×₂(R) defined by T(M) = Mᵗ for each M ∈ M₂×₂(R) was shown to be linear. Let β be the standard ordered basis for M₂×₂(R).
(a) Compute [T]β
(b) Verify that L_A ϕ_β(M) = ϕ_β T(M), where A = [T]_β.
† Let T: V→W be a linear transformation from an n-dimensional vector space V to an m-dimensional vector space W. Let β and γ be ordered bases for V and W, respectively. Prove that rank(T) = rank(L_A) and that nullity(T) = nullity(L_A), where A = [T]_γ^β.
Let V and W be finite-dimensional vector spaces with ordered bases β = {v₁, v₂, …, vₙ} and γ = {w₁, w₂, …, w_m}, respectively. By Theorem 2.6 (p. 72), there exist linear transformations T_ij: V→W such that
T_ij(v_k) = w_i if k = j, and T_ij(v_k) = 0 if k ≠ j.
First prove that {T_ij: 1 ≤ i ≤ m, 1 ≤ j ≤ n} is a basis for L(V, W). Then let M^ij denote the m×n matrix with 1 in the ith row and jth column and 0 elsewhere, and prove that [T_ij]_γ^β = M^ij.
Let c₀, c₁, …, cₙ be distinct scalars from an infinite field F. Define T: Pₙ(F)→F^{n+1} by T(f) = (f(c₀), f(c₁), …, f(cₙ)). Prove that T is an isomorphism. Hint: Use the Lagrange polynomials associated with c₀, c₁, …, cₙ.
Let W denote the vector space of all sequences in F that have only a finite number of nonzero terms (defined in Exercise 18 of Section 1.6), and let Z = P(F). Define T: W→Z by
T(σ) = Σ_{i=0}^{n} σ(i)xⁱ,
where n is the largest integer such that σ(n) ≠ 0. Prove that T is an isomorphism.
The following exercise requires familiarity with the concept of quotient space defined in Exercise 31 of Section 1.3 and with Exercise 42 of Section 2.1.
Let V and Z be vector spaces and T: V→Z be a linear transformation that is onto. Define the mapping
T̄(v + N(T)) = T(v)
for any coset v + N(T) in V/N(T).
(a) Prove that T̄ is well defined; that is, prove that if v + N(T) = v′ + N(T), then T(v) = T(v′).
(b) Prove that T̄ is linear.
(c) Prove that T̄ is an isomorphism.
(d) Prove that the diagram shown in Figure 2.3 commutes; that is, prove that T = T̄η.
Let V be a nonzero vector space over a field F, and suppose that S is a basis for V. (By the corollary to Theorem 1.13 (p. 61) in Section 1.7, every vector space has a basis.) Let C(S, F) denote the vector space of all functions f ∈ F(S, F) such that f(s) = 0 for all but a finite number of vectors in S. Let Ψ: C(S, F)→V be defined by Ψ(f) = 0 if f is the zero function, and
Ψ(f) = Σ_{s∈S, f(s)≠0} f(s)s
otherwise. Prove that Ψ is an isomorphism. Thus every nonzero vector space can be viewed as a space of functions.