Last week I discussed the principles I plan to apply in redesigning my linear algebra course with modern best practices in mind. The next step is to decide on the core topics. But rather than starting from a list of topics, I will work backward from student outcomes. First, I want the students to be able to construct and write meaningful mathematical proofs; this means dedicating time to explaining the thinking process behind generating proofs. Second, I want the students to see the development of the subject from first principles up to a major milestone. In linear algebra, that milestone is the classification of finite-dimensional vector spaces. Few mathematical disciplines can boast such a clean and efficient classification of their objects of study.
With these considerations in mind, here is the list of topics I plan to cover:
- Linear systems: augmented matrices, reduced row-echelon form, Gauss–Jordan elimination, and nonsingular matrices. Prove that there are only three types of solution sets: empty, a single solution, or infinitely many solutions.
- Column vectors: vector operations, linear combinations, linear independence, and spanning sets. Gently introduce the students to the notion of vectors and linear combinations by using explicit objects such as column vectors.
- Matrices: matrix multiplication, matrix inverses and their relation to linear systems, and column and row spaces. Connect properties of matrices with solutions of linear systems.
- Vector spaces: abstract vector spaces, subspaces, bases, and dimension. Use column vectors and matrices as simple visual examples of vector spaces. Introduce the rank of a matrix, and connect it with solutions of linear systems.
- Determinants: properties of determinants and geometric interpretation. Introduce elementary matrices, and use them to prove key properties of determinants.
- Eigenvalues and eigenvectors: computing eigenvalues and eigenvectors, their basic properties, similarity, and diagonalization. Show that the similarity relation is an equivalence relation.
- Linear transformations: kernel and range, injective, surjective, and invertible linear transformations. Prove that a linear transformation of column vectors is multiplication by a fixed matrix.
- Representations: tie together all the notions discussed up to this point. Prove that finite-dimensional vector spaces (over a fixed field) are classified up to isomorphism by a single number, their dimension. Introduce matrix representations of linear transformations, and connect properties of linear transformations with properties of their matrix representations. Prove that composition of linear transformations corresponds to multiplication of matrices. Discuss change of basis.
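To give a taste of how the first bullet can be made concrete, here is a minimal sketch of Gauss–Jordan elimination over the rationals. (The function name `rref` and the use of Python's `Fraction` for exact arithmetic are my own choices for this illustration, not part of the course materials.)

```python
from fractions import Fraction

def rref(matrix):
    """Return the reduced row-echelon form of a matrix via Gauss-Jordan elimination."""
    m = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(m), len(m[0])
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Find a row with a nonzero entry in this column to serve as the pivot.
        pivot = next((r for r in range(pivot_row, rows) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[pivot_row], m[pivot] = m[pivot], m[pivot_row]
        # Scale the pivot row so its leading entry is 1.
        lead = m[pivot_row][col]
        m[pivot_row] = [x / lead for x in m[pivot_row]]
        # Eliminate this column from every other row.
        for r in range(rows):
            if r != pivot_row:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

# Augmented matrix of the system x + 2y = 5, 3x + 4y = 6;
# the RREF reads off the unique solution x = -4, y = 9/2.
print(rref([[1, 2, 5], [3, 4, 6]]))
```

Reading off the pivot pattern of the reduced matrix is exactly how one proves the trichotomy of solution sets mentioned above.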
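For the eigenvalue bullet, the 2×2 case can be sketched directly from the characteristic polynomial λ² − (trace)λ + det. (The function name and the restriction to a real spectrum are choices made purely for this example.)

```python
from math import sqrt

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]]: roots of t^2 - (trace)t + det = 0."""
    trace, det = a + d, a * d - b * c
    disc = trace * trace - 4 * det
    if disc < 0:
        raise ValueError("complex eigenvalues")  # keep this sketch to the real case
    r = sqrt(disc)
    return ((trace + r) / 2, (trace - r) / 2)

# [[2, 1], [1, 2]] has trace 4 and determinant 3, so eigenvalues 3 and 1.
print(eigenvalues_2x2(2, 1, 1, 2))  # (3.0, 1.0)
```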
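Along the same lines, the claim that a linear transformation of column vectors is multiplication by a fixed matrix can be demonstrated in a few lines: the columns of the matrix are the images of the standard basis vectors. (The helper names `matrix_of` and `apply` are mine, purely for illustration.)

```python
def matrix_of(T, n):
    """Matrix of a linear map T on R^n: column j is T of the j-th standard basis vector."""
    basis = [[1 if i == j else 0 for i in range(n)] for j in range(n)]
    cols = [T(e) for e in basis]
    # Transpose the list of columns into a list of rows.
    return [[cols[j][i] for j in range(n)] for i in range(n)]

def apply(A, v):
    """Matrix-vector product."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def T(v):
    # A shear: T(x, y) = (x + 2y, y).
    return [v[0] + 2 * v[1], v[1]]

A = matrix_of(T, 2)
print(A)                               # [[1, 2], [0, 1]]
print(apply(A, [3, 4]) == T([3, 4]))   # True
```

The same construction, with an arbitrary basis in place of the standard one, is the matrix representation discussed in the last bullet.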
What I will not be able to cover are the LU decomposition and inner products. This is a tradeoff I am willing to make in order to prove the classification theorem for finite-dimensional vector spaces.
In the next post, I will summarize my current thoughts on using software to solve linear algebra problems numerically. I still have not completely made up my mind on how to implement this aspect of the class, and I welcome any comments and suggestions.