Matrices are mathematical tools that store and manipulate data as tables organized in rows and columns. Matrices are used in many areas of mathematics, such as linear algebra, geometry, physics, and computer science.

A matrix can be described as a rectangular array containing numbers, called “elements” or “terms”, arranged in rows and columns. Matrices can be of different sizes, depending on the number of rows and columns they contain.

There are different types of matrices, like diagonal matrices, triangular matrices, symmetric matrices, and orthogonal matrices. Matrices can also be combined to form new matrices using operations like addition, subtraction, and multiplication.

Matrices are a powerful tool for solving systems of linear equations, performing geometric transformations, representing graphs, and much more. They are also a key part of matrix computations, which play a crucial role in physics, computer science, statistics, artificial intelligence, etc.

Definitions & notation

Matrix of dimension m x n: rectangular array of numbers composed of m rows and n columns
Square matrix of order n: n rows and n columns
Null matrix of order n: all zero coefficients
Identity matrix: noted \(I_n\), 1 on the diagonal and 0 elsewhere. For example, \(I_3=\left[ \begin{array}{cc} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{array} \right] \)
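The notes are language-agnostic, so as an illustrative sketch (NumPy is an assumption, not part of the notes), the special matrices above can be built directly:

```python
import numpy as np

I3 = np.eye(3)                  # identity matrix I_3: 1 on the diagonal, 0 elsewhere
Z3 = np.zeros((3, 3))           # null matrix of order 3: all coefficients zero
A = np.arange(6).reshape(2, 3)  # a 2 x 3 matrix: m=2 rows, n=3 columns

print(A.shape)  # (2, 3) -> dimension m x n
```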

Operations

Addition: Sum of 2 matrices with the same dimension m x n
Example:
$$C=A+B=\left[ \begin{array}{cc} 1 & 2 & -5 \\ 2 & -3 & 3 \end{array} \right] + \left[ \begin{array}{cc} 2 & -1 & 3 \\ -2 & 1 & 1 \end{array} \right] = \left[ \begin{array}{cc} 3 & 1 & -2 \\ 0 & -2 & 4 \end{array} \right]$$
Multiplication by a real: given a real \(\lambda\) and a matrix A, every entry of A is multiplied by \(\lambda\)
Example: $$ B=\lambda.A=5* \left[ \begin{array}{cc} 1 & 2 & -5 \\ 2 & -3 & 3 \end{array} \right]=\left[ \begin{array}{cc} 5 & 10 & -25 \\ 10 & -15 & 15 \end{array} \right]$$
Difference: \(A-B = A+(-1).B\), where \(-B=(-1)*B\) is the opposite matrix of B
Multiplication:
A row matrix by a column matrix:
$$A*B=\sum_{j=1}^{n} a_{1j} b_{j1}$$ (number of columns of A = number of rows of B)
Example: $$A*B= \left[ \begin{array}{cc} 1 & -2 & 5 \end{array} \right] * \left[ \begin{array}{cc} 4 \\ 1 \\ -3 \end{array} \right] =1*4+(-2)*1+5*(-3)=-13$$
Matrix m x n by matrix n x p: A*B=C, with C of dimension m x p (number of columns of A = number of rows of B)
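The operations above can be checked with NumPy (an assumed choice of language), reusing the example matrices from the text:

```python
import numpy as np

A = np.array([[1, 2, -5], [2, -3, 3]])
B = np.array([[2, -1, 3], [-2, 1, 1]])

C = A + B   # elementwise sum; both operands must have the same dimension m x n
D = 5 * A   # multiplication by a real: every entry scaled

row = np.array([[1, -2, 5]])      # 1 x 3 row matrix
col = np.array([[4], [1], [-3]])  # 3 x 1 column matrix
p = row @ col                     # sum of a_{1j} * b_{j1} -> [[-13]]
```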

Properties: Addition & Multiplication by a real

Commutativity of addition: A+B=B+A
Associativity of addition :(A+B)+C=A+(B+C)=A+B+C
Distributivity of multiplication by a real over addition: \(\lambda(A+B)=\lambda.A + \lambda.B\)
Distributivity of multiplication over addition of reals: \((\lambda+\mu)A=\lambda.A+\mu.A\)
Other: \(\lambda(\mu.A)=(\lambda.\mu)A=\lambda.\mu.A\)

Properties: multiplication of square matrices

A,B,C square matrices of order n; \(\lambda\) a real
Associativity: \((AB)C=A(BC)\)
Compatibility with multiplication by a real: \(\lambda(AB)=(\lambda.A)B=A(\lambda.B)\)
Distributivity of multiplication over addition: \(A(B+C)=AB+AC\)
Neutral element : \(A*I_n=I_n*A=A\)
Non-commutativity: in general, \(AB \neq BA\)
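Non-commutativity is easy to exhibit with a concrete pair of matrices; a small NumPy check (assumed language):

```python
import numpy as np

A = np.array([[1, 1], [0, 1]])
B = np.array([[1, 0], [1, 1]])

AB = A @ B  # [[2, 1], [1, 1]]
BA = B @ A  # [[1, 1], [1, 2]]
# AB != BA, while A * I_n = I_n * A = A always holds
```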

Inverse Matrix

A square matrix A of order n is invertible (or regular) if there exists a square matrix A' of order n such that \(A.A'=I_n\) and \(A'.A=I_n\). The matrix A' is called the inverse matrix of A, denoted \(A^{-1}\).
The inverse of \(\lambda.A\) is \(\frac{1}{\lambda} A^{-1}\); \((AB)^{-1}=B^{-1} A^{-1}\)
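A quick numerical check of the rule \((AB)^{-1}=B^{-1}A^{-1}\), sketched with NumPy (an assumption):

```python
import numpy as np

A = np.array([[2., 1.], [1., 1.]])  # det = 1, invertible
B = np.array([[1., 2.], [0., 1.]])  # det = 1, invertible

inv_AB = np.linalg.inv(A @ B)
prod = np.linalg.inv(B) @ np.linalg.inv(A)  # note the reversed order
```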

Determinant

A square matrix A of order 2 (or higher) is invertible if and only if its determinant is nonzero.
$$A=\left[ \begin{array}{cc} a & b \\ c & d \end{array} \right]$$ Determinant:
$$ det(A)=\left| \begin{array}{cc} a_{11} & a_{12} \\ a_{21} & a_{22} \end{array} \right|=a_{11}.a_{22}-a_{12}.a_{21}=ad-bc $$ Inverse matrix:
$$A^{-1}=\frac{1}{det(A)} \left( \begin{array}{cc} d & -b \\ -c & a \end{array} \right)$$ Matrix of order 3:
$$A=\left( \begin{array}{cc} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{array} \right)$$ Determinant (expansion along the first row): $$det(A) = \left| \begin{array}{cc} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{array} \right|= a_{11} \left| \begin{array}{cc} a_{22} & a_{23} \\ a_{32} & a_{33} \end{array} \right| - a_{12} \left| \begin{array}{cc} a_{21} & a_{23} \\ a_{31} & a_{33} \end{array} \right| + a_{13} \left| \begin{array}{cc} a_{21} & a_{22} \\ a_{31} & a_{32} \end{array} \right|$$ Inverse matrix: $$A^{-1}=\frac{1}{det(A)} \left( \begin{array}{cc} a_{22}.a_{33}-a_{32}.a_{23} & -(a_{12}.a_{33}-a_{32}.a_{13}) & a_{12}.a_{23}-a_{22}.a_{13} \\ -(a_{21}.a_{33}-a_{31}.a_{23}) & a_{11}.a_{33}-a_{31}.a_{13} & -(a_{11}.a_{23}-a_{21}.a_{13}) \\ a_{21}.a_{32}-a_{31}.a_{22} & -(a_{11}.a_{32}-a_{31}.a_{12}) & a_{11}.a_{22}-a_{21}.a_{12} \end{array} \right)$$
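The 2 x 2 formulas above can be implemented directly and compared against a library routine; a NumPy sketch (the library choice is an assumption):

```python
import numpy as np

A = np.array([[3., 1.], [4., 2.]])
a, b, c, d = A.ravel()

det = a * d - b * c  # 3*2 - 1*4 = 2, nonzero so A is invertible
A_inv = (1 / det) * np.array([[d, -b], [-c, a]])  # closed-form 2x2 inverse
```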

Properties

$$det(A^T)=det(A)$$ $$det(A.B)=det(A)*det(B)$$ $$det(I)=1$$ $$det(A^{-1})=\frac{1}{det(A)}$$ If A is orthogonal, \(det(A)= \pm 1\)
If A has a zero row or 2 identical rows, then \(det(A)=0\)
If A has a zero column or 2 identical columns, then \(det(A)=0\)
Interchanging 2 rows of A only changes the sign of \(det(A)\)
Interchanging 2 columns of A only changes the sign of \(det(A)\)
Multiplying a row of A by a scalar \(\alpha\): \(det(A')= \alpha.det(A)\)
Multiplying a column of A by a scalar \(\alpha\): \(det(A')= \alpha.det(A)\)
Adding a scalar multiple of a row of A to another row does not change \(det(A)\)
Adding a scalar multiple of a column of A to another column does not change \(det(A)\)
If A is diagonal, then its determinant is equal to the product of the elements of the diagonal
If A is upper triangular: $$det(A)= \prod_{i=1}^{n} a_{ii}$$ If A is lower triangular: $$det(A)= \prod_{i=1}^{n} a_{ii}$$ If A is block diagonal (or block upper/lower triangular): \(det(A)=det(A_{11}) * … *det(A_{nn})\)
Block matrix:
$$\text{if } A \in \mathbb{R}^{m \times m} \text{ is invertible, } det\left[ \begin{array}{cc} A & B \\ C & D \end{array} \right]=det(A).det(D-C.A^{-1}.B)$$ $$\text{if } D \in \mathbb{R}^{m \times m} \text{ is invertible, } det\left[ \begin{array}{cc} A & B \\ C & D \end{array} \right]=det(D).det(A-B.D^{-1}.C)$$
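The product rule and the block (Schur-complement) formula can be verified on small concrete matrices; a NumPy sketch (the specific matrices are arbitrary illustrative choices):

```python
import numpy as np

A = np.array([[2., 1.], [1., 3.]])  # invertible block (det = 5)
B = np.array([[0., 1.], [1., 0.]])
C = np.array([[1., 0.], [0., 1.]])
D = np.array([[4., 0.], [0., 4.]])

M = np.block([[A, B], [C, D]])  # 4 x 4 block matrix
schur = np.linalg.det(A) * np.linalg.det(D - C @ np.linalg.inv(A) @ B)
```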

Diagonal matrix

A diagonal matrix D is a square matrix all of whose coefficients outside the main diagonal are zero
Example: Diagonal matrix of order 4
$$D=\left( \begin{array}{cc} 2 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 3 \end{array} \right) \rightarrow D^n=\left( \begin{array}{cc} 2^n & 0 & 0 & 0 \\ 0 & (-1)^n & 0 & 0 \\ 0 & 0 & 1^n & 0 \\ 0 & 0 & 0 & 3^n \end{array} \right)$$
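Powers of a diagonal matrix reduce to powers of the diagonal entries, as a NumPy check (assumed language) on the example above shows:

```python
import numpy as np

D = np.diag([2., -1., 1., 3.])
D5 = np.linalg.matrix_power(D, 5)          # full matrix power
expected = np.diag([2.**5, (-1.)**5, 1.**5, 3.**5])  # entrywise powers on the diagonal
```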

Triangular matrix

An upper triangular matrix T is a square matrix where all terms below the diagonal are zero.
A lower triangular matrix T is a square matrix where all terms above the diagonal are zero.
A triangular matrix is strictly triangular when its diagonal is zero.
The nth power of a triangular matrix T is again a triangular matrix.
If T is a strictly triangular matrix of order n, then \(T^k=0\) for all \(k \geq n\).
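The nilpotency of a strictly triangular matrix can be observed directly; a NumPy sketch (assumed language) with n = 3:

```python
import numpy as np

# strictly upper triangular 3 x 3: zero diagonal
T = np.array([[0, 1, 2],
              [0, 0, 3],
              [0, 0, 0]])

T2 = np.linalg.matrix_power(T, 2)  # still strictly triangular, almost empty
T3 = np.linalg.matrix_power(T, 3)  # order n = 3 -> T^3 is the zero matrix
```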

Sparse matrices

Matrices with many zero coefficients: they facilitate calculations because it is possible to compute “by blocks”
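As one illustration of block-wise computation (a NumPy sketch; the block-diagonal example is an assumption, not from the notes), the determinant of a block-diagonal matrix factors over its blocks:

```python
import numpy as np

# a block-diagonal matrix: many zeros outside the two blocks
A1 = np.array([[1., 2.], [3., 4.]])
A2 = np.array([[5.]])
A = np.block([[A1, np.zeros((2, 1))],
              [np.zeros((1, 2)), A2]])

# determinant computed block by block: det(A1) * det(A2) = -2 * 5
det_blockwise = np.linalg.det(A1) * np.linalg.det(A2)
```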

Diagonalization of a matrix of order 2

A square matrix A is diagonalizable if there exists an invertible square matrix P and a diagonal matrix D such that \(A=P*D*P^{-1} \rightarrow A^n=P*D^n*P^{-1}\)
$$[diag(D_{ii})]^{-1}=diag(\frac{1}{D_{ii}})$$ A is orthogonal if \(A^{-1}=A^T\)
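Diagonalization makes matrix powers cheap: once \(A=P D P^{-1}\), only the diagonal of D is raised to the power n. A NumPy sketch (assumed language), using a symmetric matrix so that P is orthogonal and \(P^{-1}=P^T\):

```python
import numpy as np

A = np.array([[2., 1.], [1., 2.]])  # symmetric -> diagonalizable, eigenvalues 1 and 3
eigvals, P = np.linalg.eigh(A)      # columns of P are orthonormal eigenvectors
D = np.diag(eigvals)

A10_fast = P @ np.linalg.matrix_power(D, 10) @ P.T  # A^10 via P * D^10 * P^{-1}
A10 = np.linalg.matrix_power(A, 10)                 # direct computation
```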

Notation

Diagonal if \(a_{ij} = 0\) for \(i \neq j\)
Upper triangular if \(a_{ij} = 0\) for \(i > j\)
Lower triangular if \(a_{ij} = 0\) for \(i < j \)
Tridiagonal if \(a_{ij} = 0\) for \( |i-j| > 1\)
Pentadiagonal if \(a_{ij} = 0\) for \( |i-j| > 2\)
Upper Hessenberg if \(a_{ij} = 0\) for \(i-j > 1\)
Lower Hessenberg if \(a_{ij} = 0\) for \(j-i > 1\)
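All of these shapes are band conditions of the form \(a_{ij}=0\) outside a band around the diagonal; a small hypothetical helper (the function name `is_banded` is my own, not from the notes) makes the pattern explicit:

```python
import numpy as np

def is_banded(A, lower, upper):
    """True if a_ij == 0 whenever i - j > lower or j - i > upper.
    diagonal: (0, 0); tridiagonal: (1, 1); upper Hessenberg: (1, n-1)."""
    i, j = np.indices(A.shape)
    outside = (i - j > lower) | (j - i > upper)
    return bool(np.all(A[outside] == 0))

T = np.array([[1, 2, 0],
              [3, 4, 5],
              [0, 6, 7]])  # tridiagonal: zero whenever |i - j| > 1
```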

Transpose

The transpose of a matrix A, denoted \(A^T\), is the matrix for which the (i,j)th entry is the (j,i)th entry of A: \((A^T)_{ij} = a_{ji} \)
Example:
$$A = \left[ \begin{array}{cc} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \end{array} \right] = \left[ \begin{array}{cc} 1 & 2 & 0 \\ 4 & 3 & -1 \end{array} \right] \rightarrow A^T = \left[ \begin{array}{cc} 1 & 4 \\ 2 & 3 \\ 0 & -1 \end{array} \right]$$ The Hermitian transpose, denoted \(A^H\), has (i,j)th entry \( (A^H)_{ij} = \overline{\rm a_{ji}}\), where \( \overline{\rm a_{ij}}\) is the conjugate of the complex number \(a_{ij} = \alpha + i \beta \)
$$det(A^H)=\overline{\rm det(A)}\text{ (if }A \in \mathbb{C}^{n \times n})$$ $$(A^T)^T=A$$ $$(\lambda .A)^T=\lambda .A^T$$ $$(A+B)^T=A^T+B^T$$ $$(AB)^T=B^T.A^T $$ $$(A^T)^{-1}=(A^{-1})^T$$
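The reversal of order in \((AB)^T=B^T A^T\) is forced by the dimensions; a NumPy check (assumed language) with the 2 x 3 example matrix from above:

```python
import numpy as np

A = np.array([[1, 2, 0], [4, 3, -1]])   # 2 x 3
B = np.array([[1, 0], [2, 1], [0, 3]])  # 3 x 2, so A @ B is defined (2 x 2)

lhs = (A @ B).T   # 2 x 2
rhs = B.T @ A.T   # (2 x 3) @ (3 x 2): only this order is conformable
```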

Jacobian Matrix

Let E, F, G be 3 \(\mathbb{R}\)-vector spaces different from 0. Let \(dim E = n\) and \(dim F = p\)
Let \(U \subset E\) be a nonempty open set, \(a \in U \) and \(f : U \rightarrow F\)
Let \(B_E=(e_1,e_2,…,e_n)\) be a basis of E and \(B_F=(\varepsilon_1,\varepsilon_2,…,\varepsilon_p)\) be a basis of F
Definition: Suppose f is differentiable at a. The Jacobian matrix of f at a relative to the bases \(B_E\) and \(B_F\) is the matrix of the differential \(df(a)\) of f at a relative to the bases \(B_E\) and \(B_F\). It is denoted \(J_f(a)\). If \(p=n\), the real \(det(J_f(a))\) is the Jacobian of f at a, denoted \(Jac_f(a)\)
$$J_f(a)=mat(df(a),B_E,B_F) = mat(D_1 f(a),… ,D_n f(a))=\left(\frac{\partial f_i}{\partial x_j}(a)\right)_{\begin{array}{cc} 1 \leq i \leq p \\ 1 \leq j \leq n \end{array}}$$ $$J_f(a)= \left( \begin{array}{cc} \frac{\partial f_1}{\partial x_1}(a) & … & \frac{\partial f_1}{\partial x_n}(a) \\ … & … & … \\ \frac{\partial f_p}{\partial x_1}(a) & … & \frac{\partial f_p}{\partial x_n}(a) \end{array} \right)$$ Operations:
f + g is differentiable at a and \(d(f+g)(a)=df(a)+dg(a) \rightarrow J_{f+g}(a)=J_f(a) + J_g(a) \)
$$\forall \lambda \in \mathbb{R}, \lambda f \text{ is differentiable at a and }d(\lambda f)(a)= \lambda df(a) \rightarrow J_{\lambda f}(a)= \lambda.J_f (a)$$ $$J_{g \circ f} (a)=J_g (f(a))*J_f (a)$$
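A Jacobian can be approximated numerically column by column with central differences and compared to the analytic partial derivatives; a sketch (the function f and the helper `jacobian_fd` are illustrative choices, not from the notes):

```python
import numpy as np

def f(x):
    # f : R^2 -> R^2, f(x, y) = (x*y, x + y^2)
    return np.array([x[0] * x[1], x[0] + x[1] ** 2])

def jacobian_fd(f, a, h=1e-6):
    """Numerical Jacobian by central differences: J[i, j] = d f_i / d x_j at a."""
    a = np.asarray(a, dtype=float)
    cols = []
    for j in range(a.size):
        e = np.zeros_like(a)
        e[j] = h
        cols.append((f(a + e) - f(a - e)) / (2 * h))  # j-th column: D_j f(a)
    return np.column_stack(cols)

a = np.array([2.0, 3.0])
J = jacobian_fd(f, a)
# analytic Jacobian: [[y, x], [1, 2y]] -> [[3, 2], [1, 6]] at a = (2, 3)
```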
