Matrices
- Linear Transformations
- Matrix Multiplication and Composition
- Properties
- Non-Linear Transformations
- Non-Square Matrices
- Matrix Transpose
Linear Transformations
Linear Transformations and Matrices
Prerequisite knowledge: vectors and basis vectors; linear dependence and independence
A linear transformation is a way to move space around such that it fulfills these two conditions:
- All grid lines must remain lines
- Origin (0,0) must remain fixed
A transformation is just another name for a function f(x): it is best described as a movement of each input to its output.
The following are linear transformations, where the grey grid is the original:
The following are non-linear transformations:
In 3D
When the above two conditions are fulfilled, we can deduce where ANY vector lands, as long as we have a record of where i-hat and j-hat land. In 2D space, this only requires those two basis vectors.
Example of i-hat and j-hat moving to their new positions, with the result computed as a linear combination:
A 2x2 matrix is created from two vectors. The columns of the matrix are the transformed versions of the basis vectors, and applying the matrix to a vector yields the corresponding linear combination of those columns.
Example of a 90-degree counter-clockwise rotation matrix applied to a vector (shown in yellow):
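A quick sketch of that idea in plain Python (the vector (1, 2) is illustrative): the transformed vector is just the linear combination of the columns, which record where the basis vectors land.

```python
# A 90-degree counter-clockwise rotation sends i-hat (1,0) to (0,1)
# and j-hat (0,1) to (-1,0); those landing spots are the matrix columns.
i_hat = (0.0, 1.0)   # where i-hat lands (first column)
j_hat = (-1.0, 0.0)  # where j-hat lands (second column)

def apply(v):
    """Apply the transformation: x * (new i-hat) + y * (new j-hat)."""
    x, y = v
    return (x * i_hat[0] + y * j_hat[0],
            x * i_hat[1] + y * j_hat[1])

print(apply((1.0, 2.0)))  # the vector (1, 2) rotates to (-2.0, 1.0)
```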
This is an example of a matrix with linearly dependent columns:
When you see a matrix, interpret it as a certain transformation of space.
Matrix operations perform these transformations: matrix multiplication handles rotation, shear, scale and reflection, while translation requires vector addition, since it moves the origin and is therefore not a linear transformation.
Matrices can encode geometric operations such as rotation, reflection and scaling. Thus, if a collection of vectors represents the vertices of a three-dimensional geometric model in Computer Aided Design software, then multiplying these vectors individually by a pre-defined rotation matrix will output new vectors that represent the locations of the rotated vertices. This is the basis of modern 3D computer graphics.
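A minimal sketch of that idea, assuming a rotation about the z-axis and an illustrative set of vertices (real CAD and graphics pipelines use optimized libraries, but the math is the same):

```python
import math

def rotate_z(vertices, theta):
    """Rotate each (x, y, z) vertex about the z-axis by angle theta.

    Uses the standard z-axis rotation matrix
        [c -s 0]
        [s  c 0]
        [0  0 1]
    where c = cos(theta), s = sin(theta).
    """
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in vertices]

square_face = [(1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0)]
rotated = rotate_z(square_face, math.pi / 2)  # quarter turn about z
```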
Matrix Multiplication and Composition
In order to describe the effect of multiple sequential matrix transformations, such as a rotation and then a shear, we multiply the individual matrices to create a new matrix, called the composition, which performs both actions at once.
Order of the Matrices Matters
The following demonstrates that if the order is reversed, the output vector will be different:
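A small plain-Python check of this, using a 90-degree rotation and a horizontal shear as the two transformations (the matrices are illustrative):

```python
def matmul(A, B):
    """2x2 matrix product A @ B: applies B first, then A."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

rotation = [[0, -1], [1, 0]]  # 90-degree counter-clockwise rotation
shear    = [[1, 1], [0, 1]]   # horizontal shear

shear_then_rotate = matmul(rotation, shear)  # [[0, -1], [1, 1]]
rotate_then_shear = matmul(shear, rotation)  # [[1, -1], [1, 0]]
print(shear_then_rotate != rotate_then_shear)  # True: order matters
```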
Matrix Multiplication is Associative
Matrix multiplication is also associative, meaning that the placement of parentheses does not matter: the end result will always be the same, as long as the right-to-left order of the matrices is preserved.
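This can be verified numerically; a sketch with three illustrative 2x2 matrices:

```python
def matmul(A, B):
    """2x2 matrix product A @ B."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, -1], [1, 0]]  # rotation
B = [[1, 1], [0, 1]]   # shear
C = [[2, 0], [0, 2]]   # uniform scale

# (AB)C and A(BC): different grouping, same right-to-left order,
# so the composed transformation is identical.
assert matmul(matmul(A, B), C) == matmul(A, matmul(B, C))
```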
Properties
Matrices are rectangular arrays consisting of numbers and are an example of 2nd-order tensors. If $m$ and $n$ are positive integers, that is $m, n \in \mathbb{N}$, then the $m \times n$ matrix contains $mn$ numbers, with $m$ rows and $n$ columns.
If all of the scalars in a matrix are real-valued then a matrix is denoted with uppercase boldface letters, such as $\boldsymbol{A} \in \mathbb{R}^{m \times n}$. That is, the matrix lives in an $m \times n$-dimensional real-valued vector space. Hence matrices are really vectors that are just written in a two-dimensional table-like manner.
Its components are now identified by two indices $i$ and $j$: $i$ represents the index of the matrix row, while $j$ represents the index of the matrix column. Each component of $\boldsymbol{A}$ is identified by $a_{ij}$.
The full $m \times n$ matrix can be written as:
$$\boldsymbol{A} = \begin{bmatrix}
a_{11} & a_{12} & a_{13} & \ldots & a_{1n} \\
a_{21} & a_{22} & a_{23} & \ldots & a_{2n} \\
a_{31} & a_{32} & a_{33} & \ldots & a_{3n} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & a_{m3} & \ldots & a_{mn}
\end{bmatrix}$$
It is often useful to abbreviate the full matrix component display into the following expression:
$$\boldsymbol{A} = [a_{ij}]_{m \times n}$$
Where $a_{ij}$ is referred to as the $(i,j)$-element of the matrix $\boldsymbol{A}$. The subscript $m \times n$ can be dropped if the dimension of the matrix is clear from the context.
Note that a column vector is a size $m \times 1$ matrix, since it has $m$ rows and 1 column. Unless otherwise specified all vectors will be considered to be column vectors.
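In code, this row-by-column layout maps naturally onto a list of rows; a small illustrative sketch (note that Python uses 0-based indices, while the mathematical notation is 1-based):

```python
# A 2x3 matrix (m = 2 rows, n = 3 columns) stored as a list of rows;
# A[i][j] is the (i, j)-element in 0-based indexing.
A = [[1, 2, 3],
     [4, 5, 6]]

m, n = len(A), len(A[0])
assert (m, n) == (2, 3)   # shape: m rows, n columns, m*n = 6 entries
assert A[0][2] == 3       # a_13 in the 1-based notation
assert A[1][0] == 4       # a_21

# A column vector is an m x 1 matrix: one entry per row.
v = [[7], [8]]
assert (len(v), len(v[0])) == (2, 1)
```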
Matrices represent a type of function known as a linear map. Based on rules that will be outlined in subsequent articles, it is possible to define multiplication operations between matrices or between matrices and vectors. Such operations are immensely important across the physical sciences, quantitative finance, computer science and machine learning.
In deep learning, neural network weights are stored as matrices, while feature inputs are stored as vectors. Formulating the problem in terms of linear algebra allows compact handling of these computations. By casting the problem in terms of tensors and utilising the machinery of linear algebra, rapid training times on modern GPU hardware can be obtained.
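A toy sketch of that formulation in pure Python (the helper name `dense` and the weights are illustrative; real frameworks use optimized tensor libraries): a single dense layer computes y = Wx + b, a matrix-vector product plus a bias vector.

```python
def dense(W, b, x):
    """One dense layer: y = W x + b.

    W is stored as a list of rows (outputs x inputs),
    x is the feature vector, b is the bias vector.
    """
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

W = [[0.5, -1.0],
     [2.0,  0.0]]
b = [1.0, -1.0]
print(dense(W, b, [2.0, 1.0]))  # [1.0, 3.0]
```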
Non-Linear Transformations
Definition
- Matrix Exponentiation: raising a matrix to a power greater than one, as in $A^2$ or $A^n$, is not a linear operation on the matrix itself, since $(A + B)^2 \ne A^2 + B^2$ in general. (Note that $A^2$ as a map on vectors is still linear: it is simply $A$ composed with itself.)
- Matrix Logarithm: taking the logarithm of a matrix, denoted $\log(A)$, is likewise not a linear operation on $A$.
- Element-Wise Operations: operations that apply nonlinear functions element-wise to the matrix entries, such as taking the square root or applying trigonometric functions to individual elements, are not linear transformations.
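A quick numerical check that matrix squaring is not linear in the matrix, using two illustrative 2x2 matrices: the cross terms AB and BA prevent $(A+B)^2$ from equalling $A^2 + B^2$.

```python
def matmul(A, B):
    """2x2 matrix product A @ B."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def madd(A, B):
    """2x2 element-wise matrix sum A + B."""
    return [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]

lhs = matmul(madd(A, B), madd(A, B))          # (A + B)^2
rhs = madd(matmul(A, A), matmul(B, B))        # A^2 + B^2
print(lhs != rhs)  # True: squaring is not linear in the matrix
```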