Article is under construction
An interesting fact about matrices is that any matrix can always be factored as a product of orthogonal and diagonal matrices. This is commonly known as the matrix's SVD (Singular Value Decomposition). SVD has lots of applications because of this property, which I won't get into here.
Because of the SVD factorization, we can express a matrix as a sum of rank-one outer products, each made from a left singular vector, a right singular vector, and the corresponding singular value. When we multiply the matrix by a vector x, by linearity we can apply each outer-product matrix to x separately and sum the results.
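A minimal NumPy sketch of this idea (the matrix A here is just a made-up example): we take the SVD, rebuild the matrix as a sum of scaled outer products, and check that multiplying a vector through each outer product and summing gives the same answer as the ordinary matrix-vector product.

```python
import numpy as np

# A small example matrix (arbitrary values, chosen for illustration).
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

U, S, Vt = np.linalg.svd(A, full_matrices=False)

# A equals the sum of sigma_i * outer(u_i, v_i).
A_rebuilt = sum(S[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(S)))
assert np.allclose(A, A_rebuilt)

# By linearity, A @ x equals applying each rank-one map to x and summing:
# each term projects x onto v_i, scales by sigma_i, and outputs along u_i.
x = np.array([1.0, -2.0])
Ax_via_outer = sum(S[i] * U[:, i] * (Vt[i, :] @ x) for i in range(len(S)))
assert np.allclose(A @ x, Ax_via_outer)
```

Note how each term reads as "project onto v_i, scale by sigma_i, output along u_i", which is exactly the linear-map view of an outer product.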
The interpretation of outer products as linear maps was mainly inspired by my NLP professor's fascination with outer products: https://www.cs.columbia.edu/~johnhew/fun-linear-transformations.html.
The singular values then measure how much each direction is scaled: the right singular vector with the largest singular value is the direction of the input space that gets stretched the most (the most important direction) when mapped through the matrix.
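A quick sanity check of this claim (with a randomly generated matrix, just for illustration): the largest singular value is the maximum stretch ||Av|| over unit vectors v, attained at the first right singular vector, and no other unit vector gets stretched more.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))  # arbitrary example matrix

U, S, Vt = np.linalg.svd(A, full_matrices=False)

# The first right singular vector is stretched by exactly sigma_1.
v1 = Vt[0, :]
assert np.isclose(np.linalg.norm(A @ v1), S[0])

# No random unit vector gets stretched by more than sigma_1.
vs = rng.standard_normal((1000, 2))
vs /= np.linalg.norm(vs, axis=1, keepdims=True)
stretches = np.linalg.norm(vs @ A.T, axis=1)
assert stretches.max() <= S[0] + 1e-9
```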