There is a take-home exam being handed out Monday and due Friday at the beginning of class (due to a polite request). However, the exam should not take that long, and the homework for the following Monday is still due on Monday (though I think you will find the homework particularly straightforward).

By the way, I thought for the final that I'd hand it out Fri. Mar 3 and let you do it at any time before Mon. Mar 13 at noon. (The exam will be about the same length as our other exams; I'm doing this to ensure that everyone can find some window of time in which to do it.) If this is a problem for some reason, please inform me as soon as possible.

This Week's Homework: Please turn in Section 3.4: 5, 6, 8, 9 and 4.4: 1, 2((a),(d),(g)), 3((a),(f)), 6

Exam comments:

1.
Note the "with(linalg):" added to the Maple code below.

2.
In 1(d) perhaps it should say "...use parts 1(b) and 1(c) to find...".

3.
In the extra credit 2(c) the second expression for $\sin(x)$ in the hint should of course be a $\cos(x)$.

4.
"Magical" in part 2(h). To understand the meaning of the magic it may be necessary to try i and j. The matrix in this basis will be "magical" if you will find the computation in part 2(j) strangely easy. Your job is to articulate what quality of this matrix made the computation easy. (The warning is: even if you get the correct answer you may fail to be able to articulate what makes it so darn magical until you try the next two parts.)

5.
In the extra credit, notice that all i's used to index these sums should be n's.

6.
In part 2(j) you are computing $(T2)^N$, which is the original matrix raised to the Nth power (i.e. in the original basis). You will have to ask yourself how to exploit the magic in 2(h) and the fact proved in 2(i) to do this.
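A hedged reminder of the general mechanism (the matrix P here is my notation, not the exam's): if A and B represent the same transformation in two different bases, so that $A = P B P^{-1}$ for some invertible change-of-basis matrix P, then

\begin{displaymath}A^N = (P B P^{-1})^N = P B^N P^{-1}, \end{displaymath}

since all the interior $P^{-1} P$ factors cancel. A "magically" simple matrix in the new basis thus makes the Nth power in the original basis easy.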

7.
In problem one $C^{\infty}({\bf R})$ is the vector space of real valued (though this doesn't matter) functions on the real line with the property that all their derivatives exist and are continuous. (For the purposes of this problem it is fine to simply view it as a vector space of functions that contains the polynomials and N(D). Certainly no detailed understanding of this space, or indeed of differential equations, is needed to do this problem.)

8.
The definition of reflection:

Let u be a unit vector in the inner product space (V,<-,->). Let the reflection through $(span\{u\})^{\perp}$ be the map from the inner product space to itself given by

\begin{displaymath}R_u(v) = v - 2 \left<v,u\right> u. \end{displaymath}
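For concreteness, here is a minimal Maple sketch of this formula in ${\bf R}^3$ with the standard inner product (the particular u and v are my own illustrative choices, not from the exam):

with(linalg):
u := vector([1,0,0]);              # a unit vector
v := vector([2,3,4]);
evalm(v - 2*innerprod(v,u)*u);     # returns [-2, 3, 4]

Reflecting through $(span\{u\})^{\perp}$ flips the component of v along u and leaves the perpendicular part alone.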

9.
In response to "Is 2(k) really that easy?". Answer: Yes.

10.
The hint in 1(a) concerns the analog of this fact which we proved (recently) for linear systems (see section 3.3).

Proposed Topics For this Week: On Monday we will finish our initial discussion of transition matrices and probability vectors. In particular we need to understand how to relate transition matrices to graphs and understand all the interpretations involved. For the record:

definitions: A transition matrix is a square matrix with all positive entries and all columns summing to one. Each application of the transition matrix will be referred to as the passing of a time "click". A probability vector is a column vector with all positive entries summing to one. A probability vector will often represent for us a population's distribution among the set of "nodes" used to define the transition matrix, and will sometimes be referred to as the population distribution. A "quantity" will be a row vector.
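As a minimal illustration (the numbers are my own toy example, not one from lecture), here is a 2-by-2 transition matrix moving a probability vector forward one click in Maple:

with(linalg):
T := array([[9/10,1/5],[1/10,4/5]]);   # each column sums to one
p := array([[1/2],[1/2]]);             # a probability vector
multiply(T,p);                         # one click later: [11/20, 9/20]

Notice the output still has positive entries summing to one, as any population distribution must.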

On Wednesday we will discuss determinants. Despite my personal affection for determinants, we will only be carefully covering section 4.4 out of chapter 4, which is a summary of properties of determinants. In the process we will only experience a sort of "greatest hits" of the arguments involved. On Friday we will introduce eigenvectors and eigenvalues and give lots of examples.
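As a small preview (a hedged sketch; the matrix is my own choosing), Maple's linalg package computes both directly:

with(linalg):
A := array([[2,1],[1,2]]);
det(A);          # returns 3
eigenvals(A);    # returns 1, 3
eigenvects(A);   # eigenvalues together with their eigenvectors

We will, of course, learn to do these computations by hand before leaning on Maple.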

The X-session Topic

Using Maple. We will look at how to use Maple to solve some basic problems and discuss the possible algorithms one could use to do these things. Participation is optional.

Some Maple Code:

Here is some Maple code from last Friday's and this Monday's lectures. Note that row reduction (gaussjord) is being used to construct T1's inverse, labeled In. The most confusing bit of code is the picking off of the necessary columns of the reduced augmented matrix: col returns those columns as a sequence, matrix builds a matrix whose rows are those columns, and you must transpose it to recover the inverse.

with(linalg):

# The transition matrix from lecture; note each column sums to one.
T1 := array( [[0,1/4,0,1/3,0],[1/3,1/4,1/10,0,1/6], [1/3,1/4,1/2,2/3,1/6],[1/3,0,3/10,0,1/6],[0,1/4,1/10,0,1/2]] );

# An initial probability vector (a 5 x 1 column).
v := array([[3/10],[2/10],[2/10],[1/10],[2/10]]);

# The population distribution after one, two, and five time "clicks".
evalf(multiply(T1,v));
evalf(multiply(T1,T1,v));
evalf(multiply(T1,T1,T1,T1,T1,v));

# Build T1's inverse by row reducing the augmented matrix [T1 | I5].
I5 := diag(1,1,1,1,1);
A1 := augment(T1,I5);
G1 := gaussjord(A1, 'r');

# Columns 6 through 10 of G1 hold the inverse. col returns them as a
# sequence, matrix makes them the rows of a matrix, and transpose
# puts them back as columns.
In := transpose(matrix([col(G1,6..10)]));

# Run the chain one click backwards.
multiply(In,v);
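As a sanity check (a hedged aside, not part of the lecture code), linalg's built-in inverse should agree with the row-reduction construction above:

equal(In, inverse(T1));   # should return true

The row-reduction route has the advantage of exhibiting the algorithm rather than treating the inverse as a black box.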





Math 24 Winter 2000
2000-02-24