Harvard University, FAS
Mathematics Math21b, Fall 2003

Linear Algebra
and Differential Equations

Course Head: Oliver Knill
Office: SciCtr 434
Email: knill@math.harvard.edu

Some Challenge Problems

These problems are for fun only.

1. Week: 1) Can you find the number of types of 3x3 matrices? (See Problems 20-23 in Section 1.2.) Try to find a way that allows you to enumerate all possible types for a general (m x n) matrix. Hint: Order them according to the rank.
2) Try to find a proof that rref(A) is uniquely defined. That is, if two matrices are in reduced row echelon form and both are obtained from A by row reduction, then these two matrices must be the same. Hint: You can find this problem, together with some hints, in the book.
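For part 1), a small computer experiment can suggest the answer. Here is a sketch (my own, not part of the course materials) that uses sympy's rref to record which pivot-column patterns occur among 3x3 matrices with entries 0 or 1; grouping by pivot columns refines the ordering by rank suggested in the hint.

    from itertools import product
    from sympy import Matrix

    patterns = set()
    for entries in product([0, 1], repeat=9):       # all 3x3 matrices with entries 0 or 1
        _, pivots = Matrix(3, 3, entries).rref()    # rref() returns (R, pivot columns)
        patterns.add(pivots)
    for p in sorted(patterns):
        print(p)                                    # the pivot-column sets that occur
    print(len(patterns), "patterns")
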
2. Week: Crack the Bretscher code using a plaintext attack (compare the movie "Enigma"). Suppose x is a vector containing the plain text and y = Ax is the encoded text, where A is a secret (n x n) matrix. How many "message text - encoded text" pairs (x,y) do you have to know in general to find the matrix A, and how do you find it? Hint: This is a problem with n^2 unknowns, the coefficients of the matrix, which is the key of the code.
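A numerical sketch of the idea (my own illustration, assuming numpy): stack n known plaintext vectors as the columns of a matrix X and the corresponding encoded vectors as the columns of Y = AX; if X is invertible, the key is recovered as A = Y X^(-1).

    import numpy as np

    n = 4
    rng = np.random.default_rng(0)
    A_secret = rng.integers(0, 10, size=(n, n))   # the unknown key
    X = rng.integers(0, 10, size=(n, n))          # n known plaintext vectors as columns
    Y = A_secret @ X                              # the corresponding encoded vectors
    A_recovered = Y @ np.linalg.inv(X)            # needs det(X) != 0, which is generic
    print(np.allclose(A_recovered, A_secret))     # True
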
3. Week:
1) You have seen that the rotation dilation matrices

    |  a   -b |
A = |         |
    |  b    a |

behave like complex numbers w = a+ib in the sense that if z = x+iy corresponds to v = (x,y), then the product wz = (ax-by) + i(ay+bx) corresponds to Av.
In particular, the matrix

    |  0   -1 |
J = |         |
    |  1    0 |

corresponds to the square root of -1. Indeed, you can check that J^2 = -I. This representation of complex numbers as matrices takes a bit of the mystery out of these objects.

If t is a real number, find a closed form for the matrix e^(Jt) = I + Jt + J^2 t^2/2! + J^3 t^3/3! + ...
Hint: Write down the first few terms in the series and watch what happens with each matrix entry. Use that

e^x = 1 + x + x^2/2! + x^3/3! + ...
cos(x) = 1 - x^2/2! + x^4/4! - ...
sin(x) = x - x^3/3! + x^5/5! - ...
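As a numerical companion to the hint (my own sketch, assuming numpy), one can sum the first terms of the series for e^(Jt) and compare the entries with cos(t) and sin(t):

    import numpy as np

    J = np.array([[0.0, -1.0], [1.0, 0.0]])
    t = 0.7
    S, term = np.eye(2), np.eye(2)
    for k in range(1, 30):
        term = term @ (J * t) / k        # next term (Jt)^k / k!
        S = S + term
    print(S)                             # compare the entries with cos(0.7) and sin(0.7)
    print(np.cos(t), np.sin(t))
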
2) Verify that for a square (n x n) matrix A for which all entries are smaller than 1/n, one has
I + A + A^2 + A^3 + ... = (I-A)^(-1), where I is the identity matrix.
Hint: The condition that all entries are smaller than 1/n is only needed to assure that (I-A) is invertible. Verify first formally that the equation is true, then verify that (I-A) has only 0 in the kernel.
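A numerical illustration (my own sketch, assuming numpy): for a 3x3 matrix with entries well below 1/3 in size, the partial sums I + A + A^2 + ... approach (I-A)^(-1).

    import numpy as np

    n = 3
    rng = np.random.default_rng(1)
    A = 0.9 * rng.uniform(-1.0 / n, 1.0 / n, size=(n, n))   # entries safely below 1/n in size
    S, term = np.eye(n), np.eye(n)
    for _ in range(200):                                    # partial sums of I + A + A^2 + ...
        term = term @ A
        S = S + term
    print(np.allclose(S, np.linalg.inv(np.eye(n) - A)))     # True
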
4. Week: 1) Let S be the linear transformation on R^n which maps e1 to e2, e2 to e3, and so on, until en is mapped to e1. For each k between 1 and n-1, find the dimension of the kernel of the transformation T = I + S + ... + S^k. Hint: Experiment first with n = 3, 4 and write down the matrix in each case.

    | 1 0 0 |         | 1 0 1 |            | 1 1 1 |
I = | 0 1 0 |   I+S = | 1 1 0 |  I+S+S^2 = | 1 1 1 |
    | 0 0 1 |         | 0 1 1 |            | 1 1 1 |
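The experiment from the hint can also be automated; here is a sketch of mine (assuming numpy) that computes dim ker(I + S + ... + S^k) via rank-nullity for a few values of n:

    import numpy as np

    def shift(n):
        S = np.zeros((n, n))
        for j in range(n):
            S[(j + 1) % n, j] = 1.0                      # S maps e_j to e_(j+1), cyclically
        return S

    for n in (3, 4, 5, 6):
        S, T = shift(n), np.eye(n)
        for k in range(1, n):
            T = T + np.linalg.matrix_power(S, k)         # T = I + S + ... + S^k
            print(n, k, n - np.linalg.matrix_rank(T))    # dim ker(T) = n - rank(T)
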

2) Let X be the space of polynomials of degree smaller than or equal to 3. Let T be the linear transformation which maps f in X to the vector [f(-1), f(0), f(1)]. For example, T(x^3 + 2x^2 - x + 1) = [3,1,3]. Find a basis for the kernel of this transformation. Hint: Just write down what it means that a polynomial is in the kernel.
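A sympy sketch for part 2) (my own): a cubic a + b x + c x^2 + d x^3 lies in the kernel exactly when it vanishes at x = -1, 0, 1, which is a linear system for the coefficients.

    from sympy import symbols, solve

    a, b, c, d, x = symbols('a b c d x')
    f = a + b*x + c*x**2 + d*x**3
    equations = [f.subs(x, t) for t in (-1, 0, 1)]      # f(-1) = f(0) = f(1) = 0
    print(solve(equations, [a, b, c, d], dict=True))    # coefficients in terms of a free parameter
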
5. Week: The space Pn of polynomials of degree smaller than or equal to n forms a linear space, because one can add and scale such polynomials in the same way as vectors. The polynomials v0(x) = 1, v1(x) = x, v2(x) = x^2, ..., vn(x) = x^n form a standard basis of this space. A dot product p.q can be defined as the integral of p(x) q(x) from x = -1 to x = 1. Perform, in the case n = 3, the Gram-Schmidt orthogonalisation with this dot product, starting with the standard basis. Hint: Compare your result with the Legendre polynomials p0(x) = 1, p1(x) = x, p2(x) = (3x^2-1)/2, p3(x) = (5x^3-3x)/2.
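A symbolic sketch of this computation (mine, assuming sympy): Gram-Schmidt on 1, x, x^2, x^3 with the dot product p.q = integral of p(x) q(x) over [-1,1], without normalizing the lengths.

    from sympy import symbols, integrate, simplify

    x = symbols('x')

    def dot(p, q):
        return integrate(p * q, (x, -1, 1))

    ortho = []
    for v in [1, x, x**2, x**3]:
        w = v - sum(dot(v, u) / dot(u, u) * u for u in ortho)   # subtract projections
        ortho.append(simplify(w))
    print(ortho)     # scalar multiples of the Legendre polynomials
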
6. Week: A matrix is called skew-symmetric if A^T = -A. A general skew-symmetric (3x3) matrix is
        | 0  a  b  |
    A = |-a  0  c  |
        |-b -c  0  |
If A is a matrix, define its exponential as e^A = I + A + A^2/2! + A^3/3! + ... For (1x1) matrices, this is the usual exponential function. Verify that if A is a skew-symmetric matrix, then e^A is an orthogonal matrix.
Hint: Check first that e^(A+B) is the matrix product of e^A with e^B if A and B commute, and conclude from this that e^(-A) is the inverse of e^A. Then look at the transpose of e^A.
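A numerical check (my own sketch, assuming scipy): for a skew-symmetric A, the matrix e^A should satisfy (e^A)^T e^A = I.

    import numpy as np
    from scipy.linalg import expm

    a, b, c = 0.3, -1.2, 0.5
    A = np.array([[0, a, b],
                  [-a, 0, c],
                  [-b, -c, 0]])                   # a skew-symmetric matrix
    Q = expm(A)
    print(np.allclose(Q.T @ Q, np.eye(3)))        # True: e^A is orthogonal
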
7. Week: If you look at orbits of a discrete dynamical system given by a (2 x 2) matrix A, you notice that the points lie on curves which are left invariant by the linear map. For example, for a rotation in the plane, the curves are circles. For a diagonal matrix with diagonal entries 2 and 3, the curves resemble hyperbolas. Can you find a formula for these curves in the case
        | 1    -1 | 
    A = | 1     0 |
Hint: Find two eigenvectors of A and conjugate the matrix A to a diagonal matrix B = S^(-1) A S. Figure out first what curves you have in the case of a diagonal matrix B.
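A small experiment before diving into the algebra (my sketch, assuming numpy): the eigenvalues of this A have absolute value 1, and the orbit of a sample point is even periodic here; plotting several orbits suggests the shape of the invariant curves.

    import numpy as np

    A = np.array([[1.0, -1.0], [1.0, 0.0]])
    print(np.abs(np.linalg.eig(A)[0]))      # both eigenvalues have absolute value 1

    v = np.array([1.0, 0.0])
    for _ in range(7):                      # plot these points to guess the invariant curve
        print(v)
        v = A @ v
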
8. Week: You have seen that a Markov matrix
        | p1,1  ...  p1,n | 
    A = | p2,1  ...  p2,n | 
        |   .         .   | 
        | pn,1  ...  pn,n | 
(all entries are positive and the sum of each column is 1) has an eigenvalue 1. We saw this by looking at the transpose A^T, which has the eigenvector [1,1,...,1]^T with eigenvalue 1; because A^T and A have the same eigenvalues, 1 is also an eigenvalue of A. The eigenvector v to the eigenvalue 1, when scaled so that the sum of its entries is 1, can be interpreted as a stable equilibrium probability distribution of the Markov chain.

Problem.
a) Verify that A^n is again a Markov matrix.
b) Verify that A^n stays bounded.
c) Why does A have no eigenvalue larger than 1?
d) Verify that the algebraic and geometric multiplicity of the eigenvalue 1 is 1.
e) Prove that any matrix A with strictly positive entries has a maximal eigenvalue with multiplicity 1.
Hint: For d), note that A maps the positive 2^n-ant (quadrant in case n=2, octant in case n=3) strictly into itself. Conclude that an eigenvector has to lie in that 2^n-ant. Then assume that there are two different eigenvectors and conclude from this that you would also find an eigenvector with eigenvalue 1 on the boundary of the 2^n-ant.
Remark. With e), you have proven a fancy result called the Perron-Frobenius theorem.
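A numpy sketch (my own illustration) for parts a) and b): the columns of A^n still sum to 1, and A^n converges to a matrix all of whose columns equal the equilibrium distribution.

    import numpy as np

    A = np.array([[0.5, 0.2, 0.3],
                  [0.3, 0.6, 0.1],
                  [0.2, 0.2, 0.6]])          # columns sum to 1, all entries positive
    P = np.linalg.matrix_power(A, 50)
    print(P.sum(axis=0))                     # the column sums of A^50 are still 1
    print(P)                                 # all columns are (nearly) identical

    eigenvalues, eigenvectors = np.linalg.eig(A)
    v = np.real(eigenvectors[:, np.argmin(np.abs(eigenvalues - 1))])
    print(v / v.sum())                       # the equilibrium distribution, matching the columns
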
9. Week: For differential equations d/dt x = A x, where A is a 2x2 matrix with two distinct nonzero real eigenvalues, we had the following qualitatively different pictures at the equilibrium point:
1) both eigenvalues are negative: the origin is asymptotically stable.
2) one eigenvalue is negative, the other positive: there are two eigendirections, one stable, one unstable. The solution curves away from these eigendirections lie on hyperbola-like curves. The origin is unstable.
3) both eigenvalues are positive: all solutions run away from the equilibrium point.
What possible pictures do we have for differential equations d/dt x = A x, where A is a 3 x 3 matrix with distinct nonzero eigenvalues?
Remark. Try to draw qualitative pictures of the phase space in each case.
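To build intuition for the real-eigenvalue cases, here is a sketch of mine (assuming scipy; the complex cases are left to the drawing): follow a solution x(t) = e^(At) x0 for diagonal matrices realizing each sign pattern of three distinct nonzero real eigenvalues.

    import numpy as np
    from scipy.linalg import expm

    patterns = {"- - -": (-1, -2, -3),
                "- - +": (-1, -2, 3),
                "- + +": (-1, 2, 3),
                "+ + +": (1, 2, 3)}
    x0 = np.array([1.0, 1.0, 1.0])
    for name, eigs in patterns.items():
        x = expm(np.diag(eigs) * 2.0) @ x0       # the solution at time t = 2
        print(name, np.round(x, 3))              # which components decay, which blow up?
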
10. Week: In quantum mechanics, the operator Q f(x) = x f(x) on smooth functions is called the position operator, and the operator P = i D, where D denotes differentiation, is called the momentum operator. The linear transformation H = P^2 + Q^2 is called the Hamiltonian of the quantum harmonic oscillator.
a) Verify the commutation relation P Q - Q P = i.
b) Define the annihilation operator A = Q - i P = x + D and the creation operator A* = Q + i P = x - D. Show that H = A* A = A A* - 2.
c) Verify that the vacuum exp(-x^2/2) is in the kernel of A and therefore in the kernel of H.
d) Verify that if f is an eigenfunction of H with eigenvalue L, then A* f is an eigenfunction of H with eigenvalue L+2.
e) Verify that all the even numbers 0,2,4,6,8,... are eigenvalues of H.
Remark. In this exercise you construct the n-particle states of the quantum harmonic oscillator. These states are all products of a polynomial with the vacuum function. This quantum system is the quantum analogue of the harmonic oscillator with energy H(x,p) = x^2 + p^2, the sum of the potential energy and the kinetic energy (usually, one divides everything by 2). The classical oscillator can have any energy E, with solutions x(t) = E^(1/2) cos(t), p(t) = E^(1/2) sin(t). For the quantum oscillator the possible energies are discrete or "quantized". To show that we have found all energies, one would still have to show that the eigenfunctions form an orthonormal basis in a suitable linear space of functions.
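A symbolic sketch of parts a), c), and e) (my own, assuming sympy): the operators are realized as maps on functions of x, with D = d/dx, A = x + D, A* = x - D, and H taken here as A* A, the operator whose eigenvalues appear in e).

    from sympy import symbols, Function, diff, exp, simplify, I

    x = symbols('x')
    f = Function('f')(x)

    D = lambda g: diff(g, x)
    Q = lambda g: x * g
    P = lambda g: I * D(g)
    A = lambda g: Q(g) + D(g)                      # annihilation operator x + D
    Astar = lambda g: Q(g) - D(g)                  # creation operator x - D
    H = lambda g: Astar(A(g))                      # H = A* A

    print(simplify(P(Q(f)) - Q(P(f)) - I*f))       # 0, the commutation relation P Q - Q P = i

    state = exp(-x**2 / 2)                         # the vacuum
    for n in range(4):
        print(n, simplify(H(state) - 2*n*state))   # 0 each time: eigenvalue 2n
        state = Astar(state)                       # climb the ladder with the creation operator
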
11. Week: Develop Fourier series for functions f(x,y) of two variables which satisfy f(x+2pi,y) = f(x,y+2pi) = f(x,y). An inner product is defined by taking 1/pi^2 times the integral of f(x,y) g(x,y) over the square [-pi,pi] x [-pi,pi]. Verify that the functions cos(n x) cos(m y), sin(n x) sin(m y), cos(n x) sin(m y), sin(n x) cos(m y), 1/2, cos(n x) 2^(-1/2), sin(n x) 2^(-1/2), cos(n y) 2^(-1/2), sin(n y) 2^(-1/2) form an orthonormal basis. Write down the Fourier series. What does the Fourier series for functions f(x,y) = -f(-x,y) = f(x,-y) look like? Write down the general solution of the heat equation f_t = f_xx + f_yy for such functions. Hint: The Fourier series is now a sum indexed by two integers m, n.
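A symbolic orthonormality check for a few of these functions (my sketch, assuming sympy, and assuming the inner product carries the factor 1/pi^2 as in the one-variable case):

    from sympy import symbols, integrate, cos, sin, sqrt, pi, Rational

    x, y = symbols('x y')

    def dot(f, g):
        return integrate(integrate(f * g, (x, -pi, pi)), (y, -pi, pi)) / pi**2

    basis = [Rational(1, 2), cos(x)/sqrt(2), sin(2*x)/sqrt(2), cos(y)/sqrt(2),
             cos(x)*cos(y), sin(x)*cos(2*y), cos(3*x)*sin(y)]
    gram = [[dot(f, g) for g in basis] for f in basis]
    print(gram)     # the identity matrix: these functions are orthonormal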


Please send comments to knill@math.harvard.edu

