By the way, I thought for the final that I'd hand it out Fri., Mar. 3 and let you do it any time before noon on Mon., Mar. 13. (The exam will be about the same length as our other exams; I'm doing this to ensure that everyone can find some window of time in which to do it.) If this is a problem for some reason, please inform me as soon as possible.
This Week's Homework: Please turn in Section 3.4: 5, 6, 8, 9 and Section 4.4: 1, 2((a),(d),(g)), 3((a),(f)), 6.
Exam comments:
Let u be a unit vector in the inner product space (V, <-,->). Let R_u, the reflection through the hyperplane perpendicular to u, be the map from the inner product space V to itself given by R_u(v) = v - 2<v,u>u.
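For instance, in R^2 with the dot product and u = (1,0), the formula gives R_u(x,y) = (x,y) - 2x(1,0) = (-x,y), i.e. reflection across the y-axis, the line perpendicular to u.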
Proposed Topics for This Week: On Monday we will finish our initial discussion of transition matrices and probability vectors. In particular we need to understand how to relate transition matrices to graphs and understand all the interpretations involved. For the record:
Definitions: A transition matrix is a square matrix with all nonnegative entries and all columns summing to one. Each application of the transition matrix will be referred to as the passing of a time "click". A probability vector is a column vector with all nonnegative entries summing to one. A probability vector will often represent for us a population's distribution among the set of "nodes" used to define the transition matrix, and will sometimes be referred to as the population distribution. A "quantity" will be a row vector.
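As a quick illustration of these definitions, here is a minimal Maple sketch using the same linalg package as the lecture code below; the 2x2 matrix T and the vector p are made-up examples, not from lecture.
with(linalg):
# A transition matrix: nonnegative entries, each column sums to one.
T := matrix([[9/10,1/2],[1/10,1/2]]);
# A probability vector: nonnegative entries summing to one.
p := matrix([[1/4],[3/4]]);
# One time click: the new population distribution is T.p.
multiply(T,p);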
On Wednesday we will discuss determinants. Despite my personal affection for determinants, we will carefully cover only Section 4.4 out of Chapter 4, which is a summary of the properties of determinants; in the process we will experience only a sort of "greatest hits" of the arguments involved. On Friday we will introduce eigenvectors and eigenvalues and give lots of examples.
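To preview both topics in Maple, here is a small sketch; the 2x2 matrix A is a made-up example, and det and eigenvects are the relevant linalg commands.
with(linalg):
A := matrix([[2,1],[1,2]]);
# The determinant of A (here 3).
det(A);
# The eigenvalues of A (1 and 3), their multiplicities, and bases of eigenvectors.
eigenvects(A);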
The X-session Topic
Using Maple. We will look at how to use Maple to solve some basic problems and discuss the possible algorithms one could use to do these things. Participation is optional.
Some Maple Code:
Here is some Maple code from last Friday's and this Monday's lectures. Note that row reduction (gaussjord) is used to construct T1's inverse, labeled In. The most confusing bit of code is picking off the necessary columns of the reduced augmented matrix: col hands those columns to matrix as rows, so the result must be transposed to recover the inverse.
with(linalg):
# The 5x5 transition matrix from lecture; each column sums to one.
T1 := array( [[0,1/4,0,1/3,0],[1/3,1/4,1/10,0,1/6], [1/3,1/4,1/2,2/3,1/6],[1/3,0,3/10,0,1/6],[0,1/4,1/10,0,1/2]] );
# The initial probability vector (population distribution).
v := array([[3/10],[2/10],[2/10],[1/10],[2/10]]);
# The distribution after one, two, and five time clicks.
evalf(multiply(T1,v));
evalf(multiply(T1,T1,v));
evalf(multiply(T1,T1,T1,T1,T1,v));
# Build T1's inverse by row reducing the augmented matrix [T1 | I5].
I5 := diag(1,1,1,1,1);
A1 := augment(T1,I5);
G1 := gaussjord(A1, 'r');
# Columns 6..10 of the reduced matrix, transposed, give the inverse In.
In := transpose(matrix([col(G1,6..10)]));
multiply(In,v);
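Since In is T1's inverse, multiply(In,v) runs the population distribution back one time click; as a quick sanity check, multiply(In,T1) should return the 5x5 identity matrix.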