Review: Strictly speaking, a linear transformation is a function that takes vectors as inputs and produces vectors as outputs. It can be pictured as squeezing and stretching space in a way that keeps grid lines parallel and evenly spaced and leaves the origin fixed. The key point is that a linear transformation is completely determined by where it sends the basis vectors of the space. In two dimensions the basis vectors are i-hat and j-hat, because any other vector can be written as a linear combination of them: the vector (x, y) is x times i-hat plus y times j-hat.

There is a neat corollary to the fact that, after a linear transformation, the grid lines remain parallel and evenly spaced: the transformed version of the vector (x, y) is x times the transformed i-hat plus y times the transformed j-hat. That means you can just write down where i-hat and j-hat land and from that work out where any vector (x, y) goes. Conventionally, we take the transformed i-hat and j-hat as the columns of a matrix, and define the matrix-vector product as the sum of those two columns scaled by x and y.
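In symbols, if i-hat lands at (a, c) and j-hat lands at (b, d), that definition reads (a standard formulation, spelled out here since the paragraph above only states it in words):

```latex
\begin{bmatrix} a & b \\ c & d \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix}
= x \begin{bmatrix} a \\ c \end{bmatrix}
+ y \begin{bmatrix} b \\ d \end{bmatrix}
= \begin{bmatrix} a x + b y \\ c x + d y \end{bmatrix}
```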

So the matrix represents a particular linear transformation.

And multiplying a matrix by a vector is just applying the linear transformation to that vector.
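A minimal numerical sketch of this idea in NumPy; the particular matrix is an illustrative choice of mine, not one taken from the text:

```python
import numpy as np

# Columns of M record where the transformation sends i-hat and j-hat.
M = np.array([[3, 1],
              [0, 2]])  # i-hat -> (3, 0), j-hat -> (1, 2)

x, y = 2.0, -1.0
v = np.array([x, y])

# Matrix-vector product: x times the first column plus y times the second.
by_columns = x * M[:, 0] + y * M[:, 1]

assert np.allclose(M @ v, by_columns)
print(M @ v)  # -> [ 5. -2.]
```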

A lot of the time we want to describe one transformation followed by another. Say you rotate the whole plane 90 degrees counterclockwise and then apply a shear: what happens? The overall effect, from start to finish, is another linear transformation, clearly distinct from both the rotation and the shear. This new linear transformation is often called the composite of the two individual transformations, and like any other linear transformation we can describe it completely by tracking where i-hat and j-hat end up and recording them as the columns of a matrix.

In this case, the final position of i-hat after the two transformations is (1, 1), and that becomes the first column of the matrix; likewise, the final position of j-hat after the two transformations is (-1, 0), and that becomes the second column.
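Working backward from those landing spots, the rotation is presumably the standard 90-degree counterclockwise rotation matrix and the shear the standard horizontal shear (my reconstruction; the section does not write the two matrices out). The composite is then their product, with the shear on the left since it is applied second:

```latex
\underbrace{\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}}_{\text{shear}}
\underbrace{\begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}}_{\text{rotation}}
= \begin{bmatrix} 1 & -1 \\ 1 & 0 \end{bmatrix}
```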

This new matrix captures the overall effect of rotating and then shearing, but as a single action rather than two successive ones.

Here’s a way to think about this new matrix. If you take some vector, rotate it, and then shear it, the long way to compute the result is to first multiply the vector by the rotation matrix, and then multiply that result by the shear matrix; note that the product is read right to left, since the matrix on the right is applied first. Numerically, this is exactly what it means to rotate and then shear a given vector. But whatever vector you pick, the result should be exactly the same as applying the composite transformation directly, because the new matrix is supposed to capture the overall effect of rotating and then shearing.
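A quick NumPy check of that claim, using the rotation and shear matrices as reconstructed above and an arbitrary test vector:

```python
import numpy as np

rotation = np.array([[0, -1],
                     [1,  0]])  # 90 degrees counterclockwise
shear = np.array([[1, 1],
                  [0, 1]])      # pushes j-hat to (1, 1)

composite = shear @ rotation    # right to left: rotate first, then shear

v = np.array([2.0, 3.0])        # any vector works

# Rotating then shearing step by step matches one application of the composite.
assert np.allclose(shear @ (rotation @ v), composite @ v)
print(composite)                # [[ 1 -1]
                                #  [ 1  0]]
```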

Based on the figure above, I think it makes sense to call this new matrix the product of the original two matrices.

The multiplication of two matrices has a geometric meaning: applying one linear transformation and then another, in succession.

Computing the composite transformation
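The general rule follows from the same column-tracking idea: the first column of the product is the left matrix applied to the right matrix's first column (where i-hat lands), and likewise for the second column. Written out for 2x2 matrices (a standard formula, filled in here since the section only names the method):

```latex
\begin{bmatrix} a & b \\ c & d \end{bmatrix}
\begin{bmatrix} e & f \\ g & h \end{bmatrix}
= \begin{bmatrix} ae + bg & af + bh \\ ce + dg & cf + dh \end{bmatrix}
```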

Does the order in which matrices are multiplied matter? Take a shear, which keeps i-hat in place at (1, 0) and pushes j-hat to the right to (1, 1), and a 90-degree counterclockwise rotation, which sends i-hat to (0, 1) and j-hat to (-1, 0). If you shear first and then rotate, i-hat lands at (0, 1) and j-hat at (-1, 1), so the two end up pointing in nearly the same direction. If you rotate first and then shear, i-hat lands at (1, 1) and j-hat at (-1, 0), pointing in quite different directions. The overall effects differ, so the order of the product clearly matters. And notice that we reasoned purely in terms of transformations, visualized in the mind's eye, without doing any matrix multiplication at all; see the check after this paragraph.

Exercise: prove that matrix multiplication is associative.
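The same comparison in NumPy, with the two matrices as reconstructed above:

```python
import numpy as np

shear = np.array([[1, 1],
                  [0, 1]])
rotation = np.array([[0, -1],
                     [1,  0]])

print(rotation @ shear)  # shear first, then rotate: columns (0, 1) and (-1, 1)
print(shear @ rotation)  # rotate first, then shear: columns (1, 1) and (-1, 0)

# The two composites differ, so matrix multiplication is not commutative.
assert not np.allclose(rotation @ shear, shear @ rotation)
```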

Matrix multiplication is associative: (AB)C = A(BC). Thinking in terms of transformations makes this almost trivial: both groupings describe the same sequence of actions, first C, then B, then A. The parentheses only change how we bundle the transformations, not the order in which they are applied.
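The same argument written symbolically, applying both sides to an arbitrary vector v (my own spelling-out of the step the exercise asks for, using only the fact that a product applied to a vector means applying the right factor first):

```latex
\bigl((AB)C\bigr)\vec{v}
= (AB)(C\vec{v})
= A\bigl(B(C\vec{v})\bigr)
= A\bigl((BC)\vec{v}\bigr)
= \bigl(A(BC)\bigr)\vec{v}
```

Since the two sides agree on every vector, the matrices (AB)C and A(BC) are equal.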