Mathematics for Natural Sciences Notes
Notes for every topic covered in Mathematics for Natural Sciences, Part IA. The notes emphasise the most important aspects of each topic - specifically the material which is commonly addressed in examination questions.

Where lecture notes have been vague, I have provided further explanation and insight, and have organised the material in a logical fashion.

I have also tried to avoid long and laborious algebraic explanations, instead opting for a more intuitive/physical approach (since this ...


Notes for Matrices

Suffix notation:

Suffix notation is a useful form of notation for dealing with tensors. It expresses these tensors in terms of their elements:

  • $a_i$ for a vector

  • $A_{ij}$ for a matrix

  • $\alpha_{ijk}$ for a third-rank tensor

Operations involving tensors can therefore be expressed as summations over their elements, such that:

For two vectors:

$$\mathbf{a} \cdot \mathbf{b} = a_1b_1 + a_2b_2 + \ldots + a_nb_n$$

$$\therefore \mathbf{a} \cdot \mathbf{b} = \sum_{i}a_ib_i$$

For a matrix and a vector, each element of the resulting vector is a sum over one row:

$$(\mathbf{A}\mathbf{b})_1 = \sum_{j}A_{1j}b_j,\quad (\mathbf{A}\mathbf{b})_2 = \sum_{j}A_{2j}b_j,\quad \ldots,\quad (\mathbf{A}\mathbf{b})_n = \sum_{j}A_{nj}b_j$$

$$\therefore (\mathbf{A}\mathbf{b})_i = \sum_{j}A_{ij}b_j$$

The summation convention is then applied: whenever an index is repeated in a product, it is summed over implicitly, so the sigma notation is dropped:

$$\mathbf{a} \cdot \mathbf{b} = a_ib_i,\qquad (\mathbf{A}\mathbf{b})_i = A_{ij}b_j$$

It is important to note that $a_ib_i$ represents a scalar quantity. Thus, in order to specify a vector quantity such as a cross product in suffix notation, it is necessary to specify the $i$th element of the resulting vector: $(\mathbf{a} \times \mathbf{b})_i$.

Matrix multiplication in suffix notation:

In suffix notation, the product of two matrices is specified element by element: $(\mathbf{AB})_{ij}$ denotes the element in the $i$th row and $j$th column of the product $\mathbf{AB}$.

Introducing some arbitrary vector $\mathbf{x}$, and noting that matrix multiplication is associative:

$$(\mathbf{AB})\mathbf{x} = \mathbf{A}(\mathbf{B}\mathbf{x})$$

$$[(\mathbf{AB})\mathbf{x}]_i = [\mathbf{A}(\mathbf{B}\mathbf{x})]_i = A_{ik}(\mathbf{B}\mathbf{x})_k = A_{ik}B_{kj}x_j$$

$$(\mathbf{AB})_{ij}x_j = A_{ik}B_{kj}x_j$$

$$\boxed{\mathbf{\therefore}\left( \mathbf{AB} \right)_{\mathbf{ij}}\mathbf{=}\mathbf{A}_{\mathbf{ik}}\mathbf{B}_{\mathbf{kj}}}$$

where $k$ is repeated and thus summed over implicitly.

This rule extends, by applying the same reasoning repeatedly, to products of more matrices; for four matrices:

$$\boxed{\left( \mathbf{ABCD} \right)_{\mathbf{ij}}\mathbf{=}\mathbf{A}_{\mathbf{ik}}\mathbf{B}_{\mathbf{kl}}\mathbf{C}_{\mathbf{lm}}\mathbf{D}_{\mathbf{mj}}}$$
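The boxed rule can be checked numerically. The following is a minimal Python sketch (the 2×2 matrices are arbitrary examples, not from the notes): each element $(\mathbf{AB})_{ij}$ is computed as $A_{ik}B_{kj}$ with the repeated index $k$ summed over.

```python
# Verify (AB)_ij = A_ik B_kj (implicit sum over the repeated index k)
# using arbitrary 2x2 example matrices.
A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
n = 2

# Build the product element by element via the summation convention:
AB = [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]

print(AB)  # [[19, 22], [43, 50]]
```

This agrees with the usual row-times-column matrix multiplication, as the derivation above requires.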


The Kronecker delta:

The Kronecker delta is defined as:

$$\boxed{\delta_{ij} = \begin{cases} 0, & i \neq j \\ 1, & i = j \end{cases}}$$

It can be expressed in terms of the vector differential operator:

$$\left.\frac{\partial x_j}{\partial x_i}\right|_{x_k} = \partial_i x_j,\qquad k \neq i$$

which is equal to $1$ if $i = j$; but if $i \neq j$, then $x_j$ is one of the variables held constant, and thus $\partial_i x_j = 0$.

$$\boxed{\therefore \delta_{ij} \equiv \partial_i x_j}$$

where $\mathbf{x}$ is a vector.

Applying the Kronecker delta:

Applying the Kronecker delta to some vector v leads to the following result:

$$\boxed{\delta_{ij}\mathbf{v}_{\mathbf{j}}\mathbf{=}\mathbf{v}_{\mathbf{i}}}$$

Since if $i = j$, $\delta_{ij} = 1$, but if $i \neq j$, $\delta_{ij} = 0$. Thus the Kronecker delta is often said to 'change the suffix' of a vector.
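The suffix-changing behaviour can be seen directly in a short Python sketch (the vector `v` is an arbitrary example): contracting with $\delta_{ij}$ returns the same vector with the free index relabelled.

```python
# delta_ij v_j = v_i: contracting a vector with the Kronecker delta
# relabels the suffix and leaves the vector unchanged.
def delta(i, j):
    return 1 if i == j else 0

v = [2.0, -1.0, 3.0]  # arbitrary example vector
w = [sum(delta(i, j) * v[j] for j in range(3)) for i in range(3)]
print(w)  # [2.0, -1.0, 3.0] -- identical to v
```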

The Levi-Civita symbol:

The Levi-Civita symbol is defined as:

$$\boxed{\epsilon_{ijk} = \begin{cases} 1, & ijk\ \text{a cyclic permutation} \\ -1, & ijk\ \text{an anticyclic permutation} \\ 0, & \text{any two indices equal} \end{cases}}$$

It can thus be said that:

$$\boxed{\left( \mathbf{a} \times \mathbf{b} \right)_{i} = \epsilon_{ijk}a_{j}b_{k}}$$

Which can be verified in three dimensions as follows:

Let $i = 1$:

$$(\mathbf{a} \times \mathbf{b})_1 = \epsilon_{123}a_2b_3 + \epsilon_{132}a_3b_2 + \text{zeroes (equal indices)}$$

$$(\mathbf{a} \times \mathbf{b})_1 = a_2b_3 - a_3b_2$$

Let $i = 2$:

$$(\mathbf{a} \times \mathbf{b})_2 = \epsilon_{213}a_1b_3 + \epsilon_{231}a_3b_1 + \text{zeroes (equal indices)}$$

$$(\mathbf{a} \times \mathbf{b})_2 = a_3b_1 - a_1b_3$$

Let $i = 3$:

$$(\mathbf{a} \times \mathbf{b})_3 = \epsilon_{312}a_1b_2 + \epsilon_{321}a_2b_1 + \text{zeroes (equal indices)}$$

$$(\mathbf{a} \times \mathbf{b})_3 = a_1b_2 - a_2b_1$$
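The same verification can be done numerically. Below is a Python sketch (the vectors and the helper `eps` are illustrative): the three components of $\epsilon_{ijk}a_jb_k$ reproduce the familiar cross-product formula.

```python
# (a x b)_i = eps_ijk a_j b_k, with implicit sums over j and k.
def eps(i, j, k):
    # Levi-Civita symbol, indices running 0..2.
    if (i, j, k) in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
        return 1   # cyclic permutation
    if (i, j, k) in [(0, 2, 1), (2, 1, 0), (1, 0, 2)]:
        return -1  # anticyclic permutation
    return 0       # repeated indices

a = [1, 2, 3]  # arbitrary example vectors
b = [4, 5, 6]
cross = [sum(eps(i, j, k) * a[j] * b[k]
             for j in range(3) for k in range(3))
         for i in range(3)]
print(cross)  # [-3, 6, -3]
```

The result matches $(a_2b_3 - a_3b_2,\ a_3b_1 - a_1b_3,\ a_1b_2 - a_2b_1)$ computed by hand.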

Relation between δij and ϵijk:

The following relation exists between the two operators:

$$\boxed{\mathbf{\epsilon}_{\mathbf{ijk}}\mathbf{\epsilon}_{\mathbf{ilm}}\mathbf{=}\mathbf{\delta}_{\mathbf{jl}}\mathbf{\delta}_{\mathbf{km}}\mathbf{-}\mathbf{\delta}_{\mathbf{jm}}\mathbf{\delta}_{\mathbf{kl}}}$$

which can be remembered as 'same minus different': $\delta_{jl}$ pairs the two second indices $j$ and $l$, $\delta_{km}$ pairs the two third indices $k$ and $m$, while $\delta_{jm}$ and $\delta_{kl}$ mix 'different' indices.
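Because all indices run over only three values, the identity can be confirmed by brute force. A Python sketch (helper names are illustrative):

```python
# Brute-force check of eps_ijk eps_ilm = delta_jl delta_km - delta_jm delta_kl
# over all values of the free indices j, k, l, m (indices run 0..2).
def eps(i, j, k):
    if (i, j, k) in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
        return 1   # cyclic
    if (i, j, k) in [(0, 2, 1), (2, 1, 0), (1, 0, 2)]:
        return -1  # anticyclic
    return 0       # repeated indices

def delta(i, j):
    return 1 if i == j else 0

ok = all(
    sum(eps(i, j, k) * eps(i, l, m) for i in range(3))  # implicit sum over i
    == delta(j, l) * delta(k, m) - delta(j, m) * delta(k, l)
    for j in range(3) for k in range(3)
    for l in range(3) for m in range(3)
)
print(ok)  # True
```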


The inner/scalar product:

An inner/scalar product can be defined on any kind of vector space, but it is additional structure: it is not part of the axioms which define a vector space. In three dimensions, the inner/scalar product for real and complex vectors and matrices is defined in the following ways (in suffix notation):

  • Column vectors — real: $\mathbf{x} \cdot \mathbf{y} = x_iy_i$; complex: $\mathbf{x} \cdot \mathbf{y} = x_i^*y_i$

  • Matrices — real: $\mathbf{A} \cdot \mathbf{B} = \mathbf{A}^T\mathbf{B}$; complex: $\mathbf{A} \cdot \mathbf{B} = (\mathbf{A}^T)^*\mathbf{B} = \mathbf{A}^{\dagger}\mathbf{B}$

Properties of the inner/scalar product:

The scalar product has the following properties:

  1. $\mathbf{x} \cdot \mathbf{x} = x_ix_i = |\mathbf{x}|^2$, which is real and positive (for $\mathbf{x} \neq \mathbf{0}$). It defines the magnitude of $\mathbf{x}$ as:

$$\left| \mathbf{x} \right| = \sqrt{\mathbf{x} \cdot \mathbf{x}}$$

  2. If $\mathbf{x} \cdot \mathbf{y} = 0$, $\mathbf{x}$ and $\mathbf{y}$ are said to be orthogonal. For column vectors in three dimensions, this means that the vectors are perpendicular, but the inner product definition is much broader, encompassing not only higher-order tensors but also functions.

  3. A basis $\{\mathbf{e}_1, \mathbf{e}_2, \ldots, \mathbf{e}_n\}$ which satisfies $\mathbf{e}_i \cdot \mathbf{e}_i = 1$ and $\mathbf{e}_i \cdot \mathbf{e}_j = 0$ for $i \neq j$ is said to be orthonormal, and can be written in terms of the Kronecker delta as:

$$\mathbf{e}_i \cdot \mathbf{e}_j = \delta_{ij}$$
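The complex definition $\mathbf{x} \cdot \mathbf{y} = x_i^*y_i$ and the fact that $\mathbf{x} \cdot \mathbf{x}$ is real and positive can be illustrated in Python (the vectors are arbitrary examples):

```python
# Complex inner product x.y = x_i^* y_i: conjugate the first vector.
x = [1 + 2j, 3 - 1j]  # arbitrary example vectors
y = [2 - 1j, 1j]

inner = sum(xi.conjugate() * yi for xi, yi in zip(x, y))
norm_sq = sum(xi.conjugate() * xi for xi in x)  # x.x

print(inner)    # (-1-2j): complex in general
print(norm_sq)  # (15+0j): x.x is always real and positive
```

Note that without the conjugate, $x_ix_i$ for a complex vector would not in general be real, which is why the complex definition differs from the real one.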

Basic nomenclature:

The following are basic properties of matrices:

Transpose $\mathbf{M}^T$ of $\mathbf{M}$:

$\mathbf{M}^T$ is the matrix given by interchanging the rows and columns of $\mathbf{M}$:

$$\boxed{\left( \mathbf{M}^{T} \right)_{ij} = M_{ji}}$$

Properties:

  1. $(\mathbf{M}^T)^T = \mathbf{M}$

  2. $(\mathbf{AB})^T = \mathbf{B}^T\mathbf{A}^T$

Identity matrix $\mathbf{I}$:

The diagonal matrix with only 1s along the diagonal:

$$\boxed{\mathbf{I} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix},\qquad I_{ij} = \delta_{ij}}$$

When pre- or post-multiplied with any other matrix $\mathbf{M}$ of appropriate dimension, it returns $\mathbf{M}$ unchanged.

Symmetric matrix $\mathbf{S}$:

A square matrix which satisfies:

$$\boxed{\mathbf{S}^{T} = \mathbf{S}}$$

$$S_{ij} = S_{ji}$$

Any square matrix can be written as the sum of a symmetric and an antisymmetric part, just as any function can be decomposed into a sum of even and odd parts:

$$M = \frac{1}{2}\left( M + M^{T} \right) + \frac{1}{2}\left( M - M^{T} \right)$$

$$S = \frac{1}{2}\left( M + M^{T} \right),\qquad A = \frac{1}{2}\left( M - M^{T} \right)$$
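The decomposition can be checked numerically. A Python sketch using an arbitrary example matrix:

```python
# Decompose an arbitrary square matrix M into symmetric and
# antisymmetric parts: M = S + A, with S^T = S and A^T = -A.
M = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
n = 3

S = [[(M[i][j] + M[j][i]) / 2 for j in range(n)] for i in range(n)]
A = [[(M[i][j] - M[j][i]) / 2 for j in range(n)] for i in range(n)]

# S is symmetric, A is antisymmetric, and they sum back to M:
assert all(S[i][j] == S[j][i] for i in range(n) for j in range(n))
assert all(A[i][j] == -A[j][i] for i in range(n) for j in range(n))
assert all(S[i][j] + A[i][j] == M[i][j] for i in range(n) for j in range(n))
print("M = S + A verified")
```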
