LINEAR ALGEBRA AND
ITS APPLICATIONS
Name: Yi
Student ID:
Grade:
1. Definitions
(1) Pivot position in a matrix;
(2) Echelon Form;
(3) Elementary operations;
(4) Onto mapping and one-to-one mapping;
(5) Linear independence.
2. Describe the row reduction algorithm which produces a matrix in reduced echelon form.
3. Find the matrix that corresponds to the composite transformation of a scaling by 0.3, a rotation of ψ, and finally a translation that adds (-0.5, 2) to each point of a figure.
4. Find a basis for the null space of the matrix
5. Find a basis for Col A of the matrix
6. Let a and b be positive numbers. Find the area of the region E bounded by the ellipse whose equation is x1^2/a^2 + x2^2/b^2 = 1.
7. Provide twenty statements for the invertible matrix theorem.
8. Show and prove the Gram-Schmidt process.
9. Show and prove the diagonalization theorem.
10. Prove that the eigenvectors corresponding to distinct eigenvalues are linearly independent.
Answers:
1. Definitions
(1) Pivot position in a matrix:
A pivot position in a matrix A is a location in A that corresponds to a leading 1 in the reduced echelon form of A. A pivot column is a column of A that contains a pivot position.
(2) Echelon Form:
A rectangular matrix is in echelon form (or row echelon form) if it has the following three properties:
1. All nonzero rows are above any rows of all zeros.
2. Each leading entry of a row is in a column to the right of the leading entry of the row above it.
3. All entries in a column below a leading entry are zeros.
If a matrix in echelon form satisfies the following additional conditions, then it is in reduced echelon form (or reduced row echelon form):
4. The leading entry in each nonzero row is 1.
5. Each leading 1 is the only nonzero entry in its column.
(3) Elementary operations:
Elementary operations can refer to elementary row operations or elementary column operations.
There are three types of elementary matrices, which correspond to three types of row operations (respectively, column operations):
1. (Replacement) Replace one row by the sum of itself and a multiple of another row.
2. (Interchange) Interchange two rows.
3. (Scaling) Multiply all entries in a row by a nonzero constant.
(4) Onto mapping and one-to-one mapping:
A mapping T : Rn → Rm is said to be onto Rm if each b in Rm is the image of at least one x in Rn.
A mapping T : Rn → Rm is said to be one-to-one if each b in Rm is the image of at most one x in Rn.
(5) Linear independence:
An indexed set of vectors {v1, . . . , vp} in Rn is said to be linearly independent if the vector equation
x1v1+x2v2+ . . . +xpvp = 0
has only the trivial solution. The set {v1, . . . , vp} is said to be linearly dependent if there exist weights c1, . . . , cp, not all zero, such that
c1v1+c2v2+ . . . +cpvp = 0
2. Describe the row reduction algorithm which produces a matrix in reduced echelon form.
Solution:
Step 1:
Begin with the leftmost nonzero column. This is a pivot column. The pivot position is at the top.
Step 2:
Select a nonzero entry in the pivot column as a pivot. If necessary, interchange rows to move this entry into the pivot position.
Step 3:
Use row replacement operations to create zeros in all positions below the pivot.
Step 4:
Cover (or ignore) the row containing the pivot position and cover all rows, if any, above it. Apply steps 1-3 to the submatrix that remains. Repeat the process until there are no more nonzero rows to modify.
Step 5:
Beginning with the rightmost pivot and working upward and to the left, create zeros above each pivot. If a pivot is not 1, make it 1 by a scaling operation.
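The five steps above can be sketched in Python. This is a minimal illustration using exact Fraction arithmetic (not a library routine): the forward phase covers steps 1-4, the backward phase covers step 5.

```python
from fractions import Fraction

def rref(A):
    """Row-reduce a matrix (list of lists) to reduced echelon form:
    forward phase (steps 1-4), then backward phase (step 5)."""
    M = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(M), len(M[0])
    pivot_row = 0
    pivot_cols = []
    for col in range(cols):
        # Steps 1-2: find a nonzero entry at or below pivot_row; interchange.
        pivot = next((r for r in range(pivot_row, rows) if M[r][col] != 0), None)
        if pivot is None:
            continue
        M[pivot_row], M[pivot] = M[pivot], M[pivot_row]
        # Step 3: row replacement to create zeros below the pivot.
        for r in range(pivot_row + 1, rows):
            factor = M[r][col] / M[pivot_row][col]
            M[r] = [a - factor * b for a, b in zip(M[r], M[pivot_row])]
        pivot_cols.append(col)
        pivot_row += 1          # Step 4: recurse on the submatrix below.
        if pivot_row == rows:
            break
    # Step 5: backward phase -- scale each pivot to 1 and clear above it.
    for i in reversed(range(len(pivot_cols))):
        col = pivot_cols[i]
        M[i] = [a / M[i][col] for a in M[i]]
        for r in range(i):
            factor = M[r][col]
            M[r] = [a - factor * b for a, b in zip(M[r], M[i])]
    return M
```

For example, `rref([[1, 2], [3, 4]])` yields the identity matrix, while `rref([[1, 2], [2, 4]])` leaves the dependent second row as zeros.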
3. Find the matrix that corresponds to the composite transformation of a scaling by 0.3, a rotation of ψ, and finally a translation that adds (-0.5, 2) to each point of a figure.
Solution:
If ψ = π/2, then sin ψ = 1 and cos ψ = 0. Using homogeneous coordinates, the scaling, rotation, and translation matrices are

S = [0.3   0    0        R = [0  -1   0        T = [1   0  -0.5
      0   0.3   0             1   0   0             0   1    2
      0    0    1]            0   0   1]            0   0    1]

The matrix for the composite transformation (scale first, then rotate, then translate, so the factors multiply right to left) is

T R S = [ 0    -0.3   -0.5
          0.3    0      2
          0      0      1 ]
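The composition can be checked numerically with NumPy, assuming the rotation angle π/2 used in the solution:

```python
import numpy as np

# Homogeneous-coordinate matrices for each stage.
S = np.array([[0.3, 0.0, 0.0],
              [0.0, 0.3, 0.0],
              [0.0, 0.0, 1.0]])       # scale by 0.3
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])      # rotate by pi/2 (cos = 0, sin = 1)
T = np.array([[1.0, 0.0, -0.5],
              [0.0, 1.0,  2.0],
              [0.0, 0.0,  1.0]])      # translate by (-0.5, 2)

# Scaling first, then rotation, then translation => multiply right to left.
M = T @ R @ S
```

Applying M to the point (1, 0) in homogeneous form (1, 0, 1) scales it to (0.3, 0), rotates it to (0, 0.3), and translates it to (-0.5, 2.3).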
4. Find a basis for the null space of the matrix
Solution:
First, write the solution of Ax = 0 in parametric vector form:

A ~ [1  -2   0  -1   3
     0   0   1   2  -2
     0   0   0   0   0],
x1-2x2 -x4+3x5=0
x3+2x4-2x5=0
0=0
The general solution is x1=2x2+x4-3x5, x3=-2x4+2x5, with x2, x4, and x5 free.
x = (x1, x2, x3, x4, x5)
  = x2(2, 1, 0, 0, 0) + x4(1, 0, -2, 1, 0) + x5(-3, 0, 2, 0, 1)
  = x2u + x4v + x5w                                              (1)
Equation (1) shows that Nul A coincides with the set of all linear combinations of u, v and w. That is, {u, v, w} generates Nul A. In fact, this construction of u, v and w automatically makes them linearly independent, because (1) shows that 0 = x2u + x4v + x5w only if the weights x2, x4, and x5 are all zero. So {u, v, w} is a basis for Nul A.
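The construction can be checked with SymPy. Since the original matrix did not survive in this document, the sketch starts from the reduced echelon form recovered above; any matrix row equivalent to it has the same null space.

```python
from sympy import Matrix

# The reduced echelon form recovered in the solution above.
A = Matrix([[1, -2, 0, -1,  3],
            [0,  0, 1,  2, -2],
            [0,  0, 0,  0,  0]])

# nullspace() returns one basis vector per free variable (x2, x4, x5).
basis = A.nullspace()
```

Each returned vector satisfies Av = 0, and the vector for the free variable x2 is exactly u = (2, 1, 0, 0, 0) from equation (1).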
5. Find a basis for Col A of the matrix
Solution:
A ~ , so the rank of A is 3.
The pivot columns of A itself (not of its echelon form) form a basis for Col A:
u = , v = and w =
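The pivot-column method can be sketched with SymPy. The matrix below is a hypothetical stand-in of rank 3, since the original matrix did not survive in this document; what matters is the method: the pivot columns of A itself form a basis for Col A.

```python
from sympy import Matrix

# Hypothetical stand-in matrix of rank 3 (column 1 is twice column 0).
A = Matrix([[1, 2, 0,  4],
            [2, 4, 1,  3],
            [3, 6, 2,  2],
            [4, 8, 0, 17]])

# rref() returns (reduced matrix, tuple of pivot column indices).
pivot_cols = A.rref()[1]
basis = [A.col(j) for j in pivot_cols]   # columns of A itself, not of the rref
```

Here the pivot columns are 0, 2, and 3, so those three columns of A form a basis for Col A.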
6. Let a and b be positive numbers. Find the area of the region E bounded by the ellipse whose equation is x1^2/a^2 + x2^2/b^2 = 1.
Solution:
We claim that E is the image of the unit disk D under the linear transformation T determined by the matrix

A = [a  0
     0  b],

because if u = (u1, u2), x = (x1, x2), and x = Au, then
u1 = x1/a and u2 = x2/b
It follows that u is in the unit disk, with u1^2 + u2^2 <= 1, if and only if x is in E, with x1^2/a^2 + x2^2/b^2 <= 1. Then we have
{area of ellipse} = {area of T(D)}
                  = |det A| · {area of D}
                  = ab · π(1)^2 = πab
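The formula area(E) = πab can be sanity-checked with a small Monte Carlo estimate; the values of a and b below are chosen arbitrarily for the check.

```python
import math
import random

a, b = 3.0, 2.0
random.seed(0)
n = 200_000

# Sample uniformly from the bounding box [-a, a] x [-b, b] and count
# how many points land inside the ellipse x1^2/a^2 + x2^2/b^2 <= 1.
hits = 0
for _ in range(n):
    x1 = random.uniform(-a, a)
    x2 = random.uniform(-b, b)
    if x1 * x1 / (a * a) + x2 * x2 / (b * b) <= 1.0:
        hits += 1

# Hit fraction times the area of the bounding box.
estimate = hits / n * (2 * a) * (2 * b)
exact = math.pi * a * b
```

With 200,000 samples the estimate lands within a few hundredths of πab ≈ 18.85.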
7. Provide twenty statements for the invertible matrix theorem.
Let A be a square n×n matrix. Then the following statements are equivalent. That is, for a given A, the statements are either all true or all false.
a. A is an invertible matrix.
b. A is row equivalent to the identity matrix.
c. A has n pivot positions.
d. The equation Ax = 0 has only the trivial solution.
e. The columns of A form a linearly independent set.
f. The linear transformation x ↦ Ax is one-to-one.
g. The equation Ax = b has at least one solution for each b in Rn.
h. The columns of A span Rn.
i. The linear transformation x ↦ Ax maps Rn onto Rn.
j. There is an n×n matrix C such that CA = I.
k. There is an n×n matrix D such that AD = I.
l. AT is an invertible matrix.
m. The columns of A form a basis of Rn.
n. Col A = Rn.
o. dim Col A = n.
p. rank A = n.
q. Nul A = {0}.
r. dim Nul A = 0.
s. The number 0 is not an eigenvalue of A.
t. The determinant of A is not zero.
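The point of the theorem is that these statements hold or fail together. A quick numerical illustration on one arbitrary example matrix:

```python
import numpy as np

# An arbitrary invertible 2x2 matrix; several IMT statements checked at once.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

det_nonzero = abs(np.linalg.det(A)) > 1e-12          # t: det A != 0
full_rank = np.linalg.matrix_rank(A) == 2            # p: rank A = n
A_inv = np.linalg.inv(A)                             # a: A is invertible
x = np.linalg.solve(A, np.array([1.0, 0.0]))         # g: Ax = b has a solution
```

If any one of these checks failed (say, det A = 0), the theorem guarantees all of them would fail.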
8. Show and prove the Gram-Schmidt process.
Solution:
The Gram-Schmidt process:
Given a basis {x1, . . . , xp} for a subspace W of Rn, define
v1 = x1
v2 = x2 - [(x2·v1)/(v1·v1)] v1
v3 = x3 - [(x3·v1)/(v1·v1)] v1 - [(x3·v2)/(v2·v2)] v2
  .
  .
  .
vp = xp - [(xp·v1)/(v1·v1)] v1 - . . . - [(xp·vp-1)/(vp-1·vp-1)] vp-1
Then {v1, . . . , vp} is an orthogonal basis for W. In addition
Span {v1, . . . , vk} = Span {x1, . . . , xk} for 1 ≤ k ≤ p
PROOF
For 1 ≤ k ≤ p, let Wk = Span {x1, . . . , xk}. Set v1 = x1, so that Span {v1} = Span {x1}. Suppose, for some k < p, we have constructed v1, . . . , vk so that {v1, . . . , vk} is an orthogonal basis for Wk. Define
vk+1 = xk+1 - projWk xk+1
By the Orthogonal Decomposition Theorem, vk+1 is orthogonal to Wk. Note that projWk xk+1 is in Wk and hence also in Wk+1. Since xk+1 is in Wk+1, so is vk+1 (because Wk+1 is a subspace and is closed under subtraction). Furthermore, vk+1 ≠ 0 because xk+1 is not in Wk = Span {x1, . . . , xk}. Hence {v1, . . . , vk+1} is an orthogonal set of nonzero vectors in the (k+1)-dimensional space Wk+1. By the Basis Theorem, this set is an orthogonal basis for Wk+1. Hence Wk+1 = Span {v1, . . . , vk+1}. When k + 1 = p, the process stops.
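The process itself is short to code. A minimal NumPy sketch of the formula vk = xk - Σj [(xk·vj)/(vj·vj)] vj, assuming the input columns are linearly independent:

```python
import numpy as np

def gram_schmidt(X):
    """Orthogonalize the columns of X exactly as in the process above.
    Assumes the columns of X are linearly independent (no v_k is zero)."""
    V = []
    for x in X.T:                     # iterate over columns of X
        v = x.astype(float)           # start from x_k ...
        for u in V:
            v = v - (x @ u) / (u @ u) * u   # ... subtract its projections
        V.append(v)
    return np.column_stack(V)
```

For example, with columns x1 = (1, 1, 0) and x2 = (1, 0, 1), the process keeps v1 = x1 and produces v2 = (0.5, -0.5, 1), which is orthogonal to v1.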
9. Show and prove the diagonalization theorem.
Solution:
diagonalization theorem:
If A is symmetric, then any two eigenvectors from different eigenspaces are orthogonal.
PROOF
Let v1 and v2 be eigenvectors that correspond to distinct eigenvalues, say, λ1 and λ2. To show that v1 · v2 = 0, compute
λ1 v1 · v2 = (λ1 v1)^T v2 = (Av1)^T v2     (since v1 is an eigenvector)
           = v1^T A^T v2 = v1^T (Av2)      (since A^T = A)
           = v1^T (λ2 v2) = λ2 v1 · v2     (since v2 is an eigenvector)
Hence (λ1 - λ2) v1 · v2 = 0, but λ1 - λ2 ≠ 0, so v1 · v2 = 0.
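The key identity in the proof, (Av1)·v2 = v1·(Av2) for symmetric A, and the resulting orthogonality can be illustrated numerically; the matrix below is an arbitrary symmetric example with distinct eigenvalues 1, 2, and 4.

```python
import numpy as np

# An arbitrary symmetric matrix with distinct eigenvalues (1, 2, and 4).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is NumPy's routine for symmetric matrices: eigenvalues come back
# in ascending order, eigenvectors as orthonormal columns.
eigvals, eigvecs = np.linalg.eigh(A)
```

Since the eigenvalues are distinct, each eigenvector comes from a different eigenspace, and the columns of `eigvecs` are mutually orthogonal, as the theorem predicts.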
10. Prove that the eigenvectors corresponding to distinct eigenvalues are linearly independent.
Solution:
If v1, . . . , vr are eigenvectors that correspond to distinct eigenvalues λ1, . . . , λr of an n×n matrix A, then the set {v1, . . . , vr} is linearly independent.
Suppose {v1, . . . , vr} is linearly dependent. Since v1 is nonzero, the theorem on the Characterization of Linearly Dependent Sets says that one of the vectors in the set is a linear combination of the preceding vectors. Let p be the least index such that vp+1 is a linear combination of the preceding (linearly independent) vectors. Then there exist scalars c1, . . . , cp such that
c1v1 + c2v2 + . . . + cpvp = vp+1                                 (1)
Multiplying both sides of (1) by A and using the fact that Avk = λkvk for each k, we obtain
c1λ1v1 + c2λ2v2 + . . . + cpλpvp = λp+1vp+1                       (2)
Multiplying both sides of (1) by λp+1 and subtracting the result from (2), we have
c1(λ1 - λp+1)v1 + . . . + cp(λp - λp+1)vp = 0                     (3)
Since {v1, . . . , vp} is linearly independent, the weights in (3) are all zero. But none of the factors λi - λp+1 are zero, because the eigenvalues are distinct. Hence ci = 0 for i = 1, . . . , p. But then (1) says that vp+1 = 0, which is impossible, since an eigenvector is nonzero. Hence {v1, . . . , vr} cannot be linearly dependent and therefore must be linearly independent.
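A numeric illustration of the theorem (the matrix below is an arbitrary example): a matrix with distinct eigenvalues yields a full-rank eigenvector matrix, i.e. linearly independent eigenvectors.

```python
import numpy as np

# A triangular matrix has its eigenvalues on the diagonal; here 1, 3, and 5
# are distinct, so the eigenvectors must be linearly independent.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

eigvals, eigvecs = np.linalg.eig(A)
rank = np.linalg.matrix_rank(eigvecs)   # full rank <=> independent columns
```

Rank 3 for the 3×3 eigenvector matrix confirms the independence the theorem guarantees.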