Fall 2004 

These problems are for fun only. 
1. Week:  1) Can you find the number of types of 3x3 matrices? (See problems 20-23 in 1.2.) Try to find a method which allows you to enumerate all possible types for a general (m x n) matrix.  Hint: Order according to the rank. 
2) Try to find a proof that rref(A) is uniquely defined. That is, if you have two matrices which are both in row reduced echelon form and both are obtained from A by row reduction, then these two matrices must be the same.  Hint: You can find this problem, together with some hints, in the book. 
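A small numerical sanity check (a sketch, not a proof) using sympy: a matrix B obtained from A by some invertible row operations has the same rref as A.

```python
# Sanity check of uniqueness: two matrices related by row operations
# reduce to the same row reduced echelon form.
from sympy import Matrix

A = Matrix([[1, 2, 3], [2, 4, 7], [1, 0, 1]])
# B is obtained from A by invertible row operations:
# swap two rows, scale a row, add a multiple of one row to another.
B = Matrix.vstack(A.row(2), 3 * A.row(1), A.row(0) - 2 * A.row(2))
same = A.rref()[0] == B.rref()[0]
print(same)
```

This of course only tests one example; the proof has to argue from the defining properties of the echelon form.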
3. Week: 
1) If t is a real number, find a closed form for the matrix e^{J t} = I + J t + J^{2} t^{2}/2! + J^{3} t^{3}/3! + ..., where

     | 0 -1 |
 J = | 1  0 |
Hint: Write down the first few terms in the series and watch what happens with
each matrix entry. Use that e^{x} = 1 + x + x^{2}/2! + x^{3}/3! + ..., cos(x) = 1 - x^{2}/2! + x^{4}/4! - ..., sin(x) = x - x^{3}/3! + x^{5}/5! - .... 
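A numerical sketch of what the hint predicts, assuming J is the matrix [[0,-1],[1,0]] with J^{2} = -I: the partial sums of the series should approach the rotation matrix by angle t.

```python
# Sum the exponential series for J t and compare with the rotation matrix.
import numpy as np

J = np.array([[0.0, -1.0], [1.0, 0.0]])   # assumed: J^2 = -I
t = 0.7
term = np.eye(2)            # J^0 t^0 / 0!
total = term.copy()
for k in range(1, 30):      # partial sum of I + Jt + (Jt)^2/2! + ...
    term = term @ (J * t) / k
    total += term
rotation = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
ok = np.allclose(total, rotation)
print(ok)
```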

2) Verify that for a square (n x n) matrix A for which all entries are smaller than 1/n,
one has I + A + A^{2} + A^{3} + ... = (I-A)^{-1}, where I is the identity matrix. 
Hint: The condition that all entries are smaller than 1/n is only needed to assure that (I-A) is invertible. Verify first formally that the equation is true, then verify that (I-A) has only 0 in the kernel. 
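A numerical sketch of the geometric series identity, for one matrix with entries below 1/n:

```python
# Partial sums of I + A + A^2 + ... approach (I - A)^{-1} for small entries.
import numpy as np

n = 3
A = np.full((n, n), 0.2)          # all entries 0.2 < 1/n = 1/3
partial = np.eye(n)
power = np.eye(n)
for _ in range(200):              # partial sum up to A^200
    power = power @ A
    partial += power
ok = np.allclose(partial, np.linalg.inv(np.eye(n) - A))
print(ok)
```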
4. Week:  1) Let S be the linear transformation on R^{n} which maps e_{1} to e_{2}, e_{2} to e_{3}, and so on, until e_{n} to e_{1}. For each k between 1 and n-1, find the dimension of the kernel of the transformation T = I + S + ... + S^{k}. 
Hint: Experiment first with n=3,4 and write down the matrix in each case.
     | 1 0 0 |         | 1 0 1 |                | 1 1 1 |
 I = | 0 1 0 |   I+S = | 1 1 0 |   I+S+S^{2} =  | 1 1 1 |
     | 0 0 1 |         | 0 1 1 |                | 1 1 1 |
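The experiment can also be run numerically: build the cyclic shift S and read off dim ker(I + S + ... + S^{k}) from the rank, using rank-nullity.

```python
# Dimension of ker(I + S + ... + S^k) for the cyclic shift S on R^n.
import numpy as np

def kernel_dim(n, k):
    S = np.roll(np.eye(n), 1, axis=0)    # S e_1 = e_2, ..., S e_n = e_1
    T = sum(np.linalg.matrix_power(S, j) for j in range(k + 1))
    return n - np.linalg.matrix_rank(T)  # rank-nullity theorem

for n in (3, 4):
    print(n, [kernel_dim(n, k) for k in range(1, n)])
```

The eigenvalues of S are the n-th roots of unity, which explains the pattern in the output.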
2) Let X be the space of polynomials of degree smaller or equal to 3. Let T be the linear transformation which maps f in X to the vector [f(-1), f(0), f(1)]. For example, T(x^{3}+2x^{2}-x+1) = [3,1,3]. Find a basis for the kernel of this transformation.  Hint: Just write down what it means that a polynomial is in the kernel. 
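Following the hint symbolically: a cubic a x^3 + b x^2 + c x + d is in the kernel exactly when f(-1) = f(0) = f(1) = 0, which is a system of three linear equations.

```python
# Solve the three linear kernel conditions for a general cubic with sympy.
from sympy import symbols, solve

x, a, b, c, d = symbols('x a b c d')
f = a * x**3 + b * x**2 + c * x + d
sol = solve([f.subs(x, -1), f.subs(x, 0), f.subs(x, 1)], [b, c, d])
# a stays free: the kernel is one-dimensional, spanned by x^3 - x.
print(sol)
```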
5. Week:  The space P_{n} of polynomials of degree smaller or equal to n forms a linear space, because one can add and scale such polynomials in the same way as vectors. The polynomials v_{0}(x)=1, v_{1}(x)=x, v_{2}(x)=x^{2}, ..., v_{n}(x)=x^{n} form a standard basis in this space. A dot product p.q can be defined as the integral of p(x) q(x) from x=-1 to x=1. Perform in the case n=3 the Gram-Schmidt orthogonalisation with this dot product, starting with the standard basis.  Hint: Compare your result with the Legendre polynomials p_{0}(x)=1, p_{1}(x)=x, p_{2}(x)=(3x^{2}-1)/2, p_{3}(x)=(5x^{3}-3x)/2. 
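The orthogonalisation can be checked symbolically. A sketch of Gram-Schmidt (without normalizing) on 1, x, x^2, x^3 with the integral dot product:

```python
# Gram-Schmidt with <p,q> = integral of p*q over [-1,1], done in sympy.
from sympy import symbols, integrate, simplify

x = symbols('x')
def dot(p, q):
    return integrate(p * q, (x, -1, 1))

ortho = []
for v in [1, x, x**2, x**3]:
    w = v
    for u in ortho:
        w -= dot(v, u) / dot(u, u) * u   # subtract the projection onto u
    ortho.append(simplify(w))
print(ortho)   # multiples of the Legendre polynomials
```

The results 1, x, x^2 - 1/3, x^3 - 3x/5 are scalar multiples of the Legendre polynomials from the hint.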
6. Week: 
A matrix is called skew-symmetric if A^{T} = -A. A general
skew-symmetric (3x3) matrix is

     |  0  a  b |
 A = | -a  0  c |
     | -b -c  0 |

If A is a matrix, define its exponential as e^{A} = I + A + A^{2}/2! + A^{3}/3! + ... For (1x1) matrices, this is the usual exponential function. Verify that if A is a skew-symmetric matrix, then e^{A} is an orthogonal matrix.  Hint: Check first that e^{A+B} is the matrix product of e^{A} with e^{B} if A and B commute, and conclude from this that e^{-A} is the inverse of e^{A}. Then look at the transpose of e^{A}. 
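A numerical sketch: summing the exponential series for one skew-symmetric A and checking that the result Q satisfies Q^T Q = I.

```python
# exp(A) for a skew-symmetric A should be orthogonal.
import numpy as np

def matrix_exp(A, terms=40):
    total = np.eye(len(A))
    term = np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k      # term is now A^k / k!
        total += term
    return total

a, b, c = 0.3, -1.1, 0.7
A = np.array([[0, a, b], [-a, 0, c], [-b, -c, 0]])   # A^T = -A
Q = matrix_exp(A)
ok = np.allclose(Q.T @ Q, np.eye(3))
print(ok)
```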
7. Week: 
If you look at orbits of a discrete dynamical system given
by a (2 x 2) matrix A, you notice that the points lie on curves
which are left invariant by the linear map. For example,
for a rotation in the plane, the curves are circles. For a
diagonal matrix with diagonal entries 2 and 1/2, the curves are the hyperbolas xy = const.
Can you find a formula for these curves in the case

     | 1 1 |
 A = | 1 0 |

Hint: Find two eigenvectors of A and conjugate the matrix A to a diagonal matrix B = S^{-1} A S. Figure out first what curves you have in the case of a diagonal matrix B. 
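A numerical sketch of what to look for (the quadratic form below is an assumption to be verified by hand): q(x,y) = x^2 - xy - y^2 satisfies q(Av) = -q(v) for A = [[1,1],[1,0]], so orbits stay on the curves |q| = const.

```python
# Check that |x^2 - x y - y^2| is constant along an orbit of A.
import numpy as np

A = np.array([[1.0, 1.0], [1.0, 0.0]])

def q(v):
    x, y = v
    return x * x - x * y - y * y    # invariant up to sign under A

v = np.array([2.0, 1.0])
values = []
for _ in range(8):
    values.append(abs(q(v)))
    v = A @ v
ok = np.allclose(values, values[0])
print(ok)
```

In the eigenbasis of A this quadratic form becomes a multiple of uv, which recovers the hyperbolas from the diagonal case.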
8. Week: 
You have seen that a Markov matrix
     | p_{1,1} ... p_{1,n} |
 A = | p_{2,1} ... p_{2,n} |
     |    .    ...    .    |
     | p_{n,1} ... p_{n,n} |

(all entries are positive and the sum of each column is 1) has an eigenvalue 1. We have seen this by looking at the transpose A^{T}, which has the eigenvector [1,1,...,1]^{T} with eigenvalue 1; because A^{T} and A have the same eigenvalues, 1 is also an eigenvalue of A. The eigenvector v to the eigenvalue 1, when scaled so that the sum of its entries is 1, can be interpreted as a stable equilibrium probability distribution of the Markov chain. Problem. a) Verify that A^{n} is again a Markov matrix. b) Verify that A^{n} stays bounded. c) Why does A have no eigenvalue larger than 1? d) Verify that the algebraic and geometric multiplicity of the eigenvalue 1 is 1. e) Prove that any matrix A with strictly positive entries has a maximal eigenvalue with multiplicity 1. 
Hint: For d), note that A maps the positive 2^{n}-ant
(quadrant in case n=2, octant in case n=3)
strictly into itself. Conclude that an eigenvector has to be in that 2^{n}-ant.
Assume then that there are two different eigenvectors and conclude from this that you
would then also find an eigenvector with eigenvalue 1 in the boundary of the 2^{n}-ant. Remark. With e), you have proven a fancy result called the Perron-Frobenius theorem. 
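Parts a) and b) can be explored numerically on a small example: powers of a Markov matrix stay Markov and converge to a rank-one matrix whose columns are the equilibrium distribution.

```python
# Powers of a 2x2 Markov matrix and its equilibrium distribution.
import numpy as np

A = np.array([[0.7, 0.2], [0.3, 0.8]])      # columns sum to 1
An = np.linalg.matrix_power(A, 50)
cols_sum_to_one = np.allclose(An.sum(axis=0), 1.0)

# Eigenvector to the eigenvalue 1, scaled to a probability distribution.
w, V = np.linalg.eig(A)
v = V[:, np.argmax(w.real)].real
v = v / v.sum()
equilibrium = np.allclose(An, np.column_stack([v, v]))
print(cols_sum_to_one, equilibrium)
```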
9. Week: 
For differential equations d/dt x = A x where A is a 2x2 matrix
with different nonzero eigenvalues, we had the following qualitatively
different pictures at the equilibrium point: 1) All eigenvalues are negative: the origin is asymptotically stable. 2) One eigenvalue is negative, the other positive: there are two eigendirections, one stable, one unstable. The solution curves away from these eigendirections lie on hyperbolas. The origin is unstable. 3) Both eigenvalues are positive: all solutions run away from the equilibrium point. What possible pictures do we have for differential equations d/dt x = A x, where A is a 3 x 3 matrix with distinct nonzero eigenvalues?  Remark. Try to draw qualitative pictures of the phase space in each case. 
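For diagonal matrices realizing each real sign pattern, a small numeric sketch confirms how many independent directions decay, which is what distinguishes the pictures:

```python
# Count decaying directions of e^{At} for diagonal A with given eigenvalues.
import numpy as np

def stable_dimension(eigs, t=50.0):
    # For diagonal A, e^{At} e_i decays exactly when the i-th eigenvalue < 0.
    return sum(1 for L in eigs if np.exp(L * t) < 1e-6)

cases = {(-1, -2, -3): 3,   # sink: asymptotically stable
         (-1, -2, 3): 2,    # saddle with a stable plane
         (-1, 2, 3): 1,     # saddle with a stable line
         (1, 2, 3): 0}      # source: unstable
ok = all(stable_dimension(e) == d for e, d in cases.items())
print(ok)
```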
10. Week: 
In quantum mechanics, the operator Q f(x) = x f(x) on smooth functions is called the
position operator, and the operator P = -i D (where D = d/dx) is the momentum operator.
The linear transformation H = P^{2} + Q^{2} is called the Hamiltonian
of the quantum harmonic oscillator. a) Verify the commutation relation P Q - Q P = -i. b) Define the annihilation operator A = Q + i P = x + D and the creation operator A^{*} = Q - i P = x - D. Show that H = A^{*} A + 1 = A A^{*} - 1. c) Verify that the vacuum exp(-x^{2}/2) is in the kernel of A and is therefore an eigenfunction of H with eigenvalue 1. d) Verify that if f is an eigenfunction of H to the eigenvalue L, then A^{*} f is an eigenfunction of H to the eigenvalue L+2. e) Verify that all the odd numbers 1,3,5,7,... are eigenvalues of H.  Remark. In this exercise you construct n-particle states of the quantum harmonic oscillator. These states are all the product of a polynomial with the vacuum function. This quantum system is the quantum analogue of the harmonic oscillator with energy H(x,p) = x^{2} + p^{2}, the sum of the potential energy and kinetic energy (usually, one divides everything by 2). The classical oscillator can have any energy E, with solutions x(t)=E^{1/2} cos(t), p(t)=E^{1/2} sin(t). For the quantum oscillator, the possible energies are discrete or "quantized". To show that we have found all energies, one would still have to show that the eigenfunctions form an orthonormal basis in some linear space of functions. 
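A symbolic sketch with sympy, using the conventions P = -i d/dx, so that A = x + D, A^{*} = x - D and H f = -f'' + x^{2} f:

```python
# Check the annihilation/creation structure of the harmonic oscillator.
from sympy import symbols, exp, diff, simplify

x = symbols('x')
def A(f):       # annihilation operator x + D
    return x * f + diff(f, x)
def Astar(f):   # creation operator x - D
    return x * f - diff(f, x)
def H(f):       # Hamiltonian P^2 + Q^2, i.e. H f = -f'' + x^2 f
    return -diff(f, x, 2) + x**2 * f

vac = exp(-x**2 / 2)
vac_killed = simplify(A(vac)) == 0        # vacuum is annihilated by A
f1 = Astar(vac)                           # first excited state 2 x exp(-x^2/2)
ground = simplify(H(vac) - vac) == 0      # H-eigenvalue 1
excited = simplify(H(f1) - 3 * f1) == 0   # creation raised the eigenvalue by 2
print(vac_killed, ground, excited)
```

Note that with these conventions H = A^{*}A + 1, so the vacuum carries H-eigenvalue 1; some texts instead work with the shifted operator A^{*}A, whose spectrum starts at 0.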
11. Week:  Develop Fourier series for functions f(x,y) of two variables which satisfy f(x+2pi,y)=f(x,y+2pi)=f(x,y). An inner product is defined by (1/pi^{2}) times the integral over the square [-pi,pi] x [-pi,pi]. Verify that cos(n x) cos(m y), sin(n x) sin(m y), cos(n x) sin(m y), sin(n x) cos(m y), 1/2, cos(n x) 2^{-1/2}, sin(n x) 2^{-1/2}, cos(n y) 2^{-1/2}, sin(n y) 2^{-1/2} form an orthonormal basis. Write down the Fourier series. What does the Fourier series look like for functions satisfying f(-x,y)=f(x,-y)=f(x,y)? Write down the general solution of the heat equation f_{t}=f_{xx}+f_{yy} for such functions.  Hint: The Fourier series is now a sum indexed by two integers m,n. 
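A symbolic spot check of a few of the orthonormality relations, assuming the normalized inner product <f,g> = (1/pi^2) times the double integral of f g over [-pi,pi] x [-pi,pi]:

```python
# Spot-check orthonormality of the 2D Fourier basis with sympy.
from sympy import symbols, integrate, cos, sin, pi, sqrt, Rational

x, y = symbols('x y')
def dot(f, g):
    return integrate(integrate(f * g, (x, -pi, pi)), (y, -pi, pi)) / pi**2

half = Rational(1, 2)
ok = (dot(half, half) == 1
      and dot(cos(x) * cos(2 * y), cos(x) * cos(2 * y)) == 1
      and dot(cos(x) / sqrt(2), cos(x) / sqrt(2)) == 1
      and dot(cos(x) * cos(2 * y), sin(x) * sin(y)) == 0)
print(ok)
```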
Please send comments to knill@math.harvard.edu 