Chapter 3 Matrices
The essence of Mathematics lies in its freedom.  CANTOR
3.1 Introduction
The knowledge of matrices is necessary in various branches of mathematics. Matrices are one of the most powerful tools in mathematics. This mathematical tool simplifies our work to a great extent when compared with other straightforward methods. The evolution of the concept of matrices is the result of an attempt to obtain compact and simple methods of solving systems of linear equations. Matrices are not only used as a representation of the coefficients in systems of linear equations, but the utility of matrices far exceeds that use. Matrix notation and operations are used in electronic spreadsheet programs for personal computers, which in turn are used in different areas of business and science like budgeting, sales projection, cost estimation, analysing the results of an experiment etc. Also, many physical operations such as magnification, rotation and reflection through a plane can be represented mathematically by matrices. Matrices are also used in cryptography. This mathematical tool is used not only in certain branches of sciences, but also in genetics, economics, sociology, modern psychology and industrial management.
In this chapter, we shall find it interesting to become acquainted with the fundamentals of matrix and matrix algebra.
3.2 Matrix
Suppose we wish to express the information that Radha has 15 notebooks. We may express it as [15] with the understanding that the number inside [ ] is the number of notebooks that Radha has. Now, if we have to express that Radha has 15 notebooks and 6 pens, we may express it as $\begin{bmatrix}15 & 6\end{bmatrix}$ with the understanding that the first number inside [ ] is the number of notebooks while the other one is the number of pens possessed by Radha. Let us now suppose that we wish to express the information of possession of notebooks and pens by Radha and her two friends Fauzia and Simran which is as follows:
$$ \begin{array}{llllll} \text { Radha } & \text { has } & 15 & \text { notebooks } & \text { and } & 6 \text { pens, } \\ \text { Fauzia } & \text { has } & 10 & \text { notebooks } & \text { and } & 2 \text { pens, } \\ \text { Simran } & \text { has } & 13 & \text { notebooks } & \text { and } & 5 \text { pens. } \end{array} $$
Now this could be arranged in the tabular form as follows: $$ \begin{array}{lcc} & \text { Notebooks } & \text { Pens } \\ \text { Radha } & 15 & 6 \\ \text { Fauzia } & 10 & 2 \\ \text { Simran } & 13 & 5 \end{array} $$ and this can be expressed as
$$ \begin{bmatrix} 15 & 6 \\ 10 & 2 \\ 13 & 5 \end{bmatrix} $$
or, arranging the same information the other way:
$$ \begin{array}{lccc} & \text { Radha } & \text { Fauzia } & \text { Simran } \\ \text { Notebooks } & 15 & 10 & 13 \\ \text { Pens } & 6 & 2 & 5 \end{array} $$
which can be expressed as:
$$ \begin{bmatrix} 15 & 10 & 13 \\ 6 & 2 & 5 \end{bmatrix} $$
In the first arrangement the entries in the first column represent the number of notebooks possessed by Radha, Fauzia and Simran, respectively and the entries in the second column represent the number of pens possessed by Radha, Fauzia and Simran, respectively. Similarly, in the second arrangement, the entries in the first row represent the number of notebooks possessed by Radha, Fauzia and Simran, respectively. The entries in the second row represent the number of pens possessed by Radha, Fauzia and Simran, respectively. An arrangement or display of the above kind is called a matrix. Formally, we define matrix as:
Definition 1 A matrix is an ordered rectangular array of numbers or functions. The numbers or functions are called the elements or the entries of the matrix.
We denote matrices by capital letters. The following are some examples of matrices:
$$ A=\begin{bmatrix} 2 & 5 \\ 0 & \sqrt{5} \\ 3 & 6 \end{bmatrix}, B=\begin{bmatrix} 2+i & 3 & \frac{1}{2} \\ 3.5 & 1 & 2 \\ \sqrt{3} & 5 & \frac{5}{7} \end{bmatrix}, C=\begin{bmatrix} 1+x & x^{3} & 3 \\ \cos x & \sin x+2 & \tan x \end{bmatrix} $$
In the above examples, the horizontal lines of elements are said to constitute rows of the matrix and the vertical lines of elements are said to constitute columns of the matrix. Thus $A$ has 3 rows and 2 columns, $B$ has 3 rows and 3 columns while $C$ has 2 rows and 3 columns.
3.2.1 Order of a matrix
A matrix having $m$ rows and $n$ columns is called a matrix of order $m \times n$ or simply $m \times n$ matrix (read as an $m$ by $n$ matrix). So referring to the above examples of matrices, we have $A$ as $3 \times 2$ matrix, $B$ as $3 \times 3$ matrix and $C$ as $2 \times 3$ matrix. We observe that $A$ has $3 \times 2=6$ elements, $B$ and $C$ have 9 and 6 elements, respectively.
In general, an $m \times n$ matrix has the following rectangular array:
$ \begin{bmatrix} a_{11} & a_{12} & a_{13} & \cdots & a_{1j} & \cdots & a_{1n} \\ a_{21} & a_{22} & a_{23} & \cdots & a_{2j} & \cdots & a_{2n} \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\ a_{i1} & a_{i2} & a_{i3} & \cdots & a_{ij} & \cdots & a_{in} \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\ a_{m1} & a_{m2} & a_{m3} & \cdots & a_{mj} & \cdots & a_{mn} \end{bmatrix} _{m \times n} $
or $ A=[a_{i j}]_{m \times n} $, where $ 1 \leq i \leq m, 1 \leq j \leq n $; $ i, j \in \mathbf{N} $
Thus the $i^{\text {th }}$ row consists of the elements $a_{i 1}, a_{i 2}, a_{i 3}, \ldots, a_{i n}$, while the $j^{\text {th }}$ column consists of the elements $a_{1 j}, a_{2 j}, a_{3 j}, \ldots, a_{m j}$.
In general $a_{i j}$ is an element lying in the $i^{\text {th }}$ row and $j^{\text {th }}$ column. We can also call it the $(i, j)^{\text {th }}$ element of $A$. The number of elements in an $m \times n$ matrix will be equal to $m n$.
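The row-column indexing just described can be mirrored in a few lines of code. The sketch below is illustrative only (the nested-list representation and the helper name `entry` are not part of the text); it stores a matrix as a list of rows and reads the $(i, j)^{\text{th}}$ element using the 1-based convention used here:

```python
# A 2 x 3 matrix stored as a list of rows (illustrative representation).
A = [
    [1, 2, 3],   # row 1
    [4, 5, 6],   # row 2
]

m = len(A)      # number of rows
n = len(A[0])   # number of columns

def entry(A, i, j):
    """Return a_ij using the 1-based row/column convention of the text."""
    return A[i - 1][j - 1]

print(m, n)            # order of A is m x n
print(entry(A, 2, 3))  # a_23
print(m * n)           # number of elements equals mn
```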
Note In this chapter
1. We shall follow the notation, namely $A=[a_{i j}]_{m \times n}$ to indicate that $A$ is a matrix of order $m \times n$.
2. We shall consider only those matrices whose elements are real numbers or functions taking real values.
We can also represent any point $(x, y)$ in a plane by a matrix (column or row) as $\begin{bmatrix}x \\ y\end{bmatrix}$ (or $[x \quad y]$). For example, point $P(0,1)$ as a matrix representation may be given as
$$ \mathbf{P}=\begin{bmatrix} 0 \\ 1 \end{bmatrix} \text { or }\begin{bmatrix} 0 & 1 \end{bmatrix} $$
Observe that in this way we can also express the vertices of a closed rectilinear figure in the form of a matrix. For example, consider a quadrilateral $A B C D$ with vertices A $(1,0), B(3,2), C(1,3), D(1,2)$.
Now, quadrilateral $ABCD$ in the matrix form can be represented as
$$ X=\begin{bmatrix} 1 & 3 & 1 & 1 \\ 0 & 2 & 3 & 2 \end{bmatrix}_{2 \times 4} \quad \text{or} \quad Y=\begin{bmatrix} 1 & 0 \\ 3 & 2 \\ 1 & 3 \\ 1 & 2 \end{bmatrix}_{4 \times 2} $$
Thus, matrices can be used as representation of vertices of geometrical figures in a plane.
3.3 Types of Matrices
In this section, we shall discuss different types of matrices.
(i) Column matrix
A matrix is said to be a column matrix if it has only one column.
For example, $A=\begin{bmatrix}0 \\ \sqrt{3} \\ -1 \\ 1 / 2\end{bmatrix}$ is a column matrix of order $4 \times 1$.
In general, $A=[a_{i j}]_{m \times 1}$ is a column matrix of order $m \times 1$.
(ii) Row matrix
A matrix is said to be a row matrix if it has only one row.
For example, $B=\begin{bmatrix}-\frac{1}{2} & \sqrt{5} & 2 & 3\end{bmatrix}_{1 \times 4}$ is a row matrix.
In general, $B=[b_{i j}]_{1 \times n}$ is a row matrix of order $1 \times n$.
(iii) Square matrix
A matrix in which the number of rows is equal to the number of columns is said to be a square matrix. Thus an $m \times n$ matrix is said to be a square matrix if $m=n$ and is known as a square matrix of order $n$.
For example $A=\begin{bmatrix}3 & 1 & 0 \\ \frac{3}{2} & 3 \sqrt{2} & 1 \\ 4 & 3 & 1\end{bmatrix}$ is a square matrix of order 3.
In general, $A=[a_{i j}]_{m \times m}$ is a square matrix of order $m$.
Note If $A=[a_{i j}]$ is a square matrix of order $n$, then elements (entries) $a_{11}, a_{22}, \ldots, a_{n n}$ are said to constitute the diagonal of the matrix A. Thus, if $A=\begin{bmatrix}1 & 3 & 1 \\ 2 & 4 & 1 \\ 3 & 5 & 6\end{bmatrix}$.
Then the elements of the diagonal of A are 1, 4, 6 .
(iv) Diagonal matrix
A square matrix $B=[b_{ij}]_ {m\times m} $ is said to be a diagonal matrix if all its non diagonal elements are zero, that is a matrix $B=[b_{ij}]_ {m\times m} $ is said to be a diagonal matrix if $b_{i j}=0$, when $i \neq j$.
For example, $A=[4], B=\begin{bmatrix}1 & 0 \\ 0 & 2\end{bmatrix}, C=\begin{bmatrix}1.1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 3\end{bmatrix}$, are diagonal matrices of order 1,2,3, respectively.
(v) Scalar matrix
A diagonal matrix is said to be a scalar matrix if its diagonal elements are equal, that is, a square matrix $B=[b_{i j}]_{n \times n}$ is said to be a scalar matrix if
$$ \begin{aligned} & b_{i j}=0, \quad \text { when } i \neq j \\ & b_{i j}=k, \quad \text { when } i=j, \text { for some constant } k . \end{aligned} $$
For example
$A=[3], \quad B=\begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix}, \quad C=\begin{bmatrix}\sqrt{3} & 0 & 0 \\ 0 & \sqrt{3} & 0 \\ 0 & 0 & \sqrt{3}\end{bmatrix}$
are scalar matrices of order 1,2 and 3, respectively.
(vi) Identity matrix
A square matrix in which elements in the diagonal are all 1 and rest are all zero is called an identity matrix. In other words, the square matrix $A=[a_{i j}]_{n \times n}$ is an
identity matrix, if $a_{ij}=\begin{cases}1 & \text { if } & i=j \\ 0 & \text { if } & i \neq j\end{cases}$.
We denote the identity matrix of order $n$ by $I_{n}$. When order is clear from the context, we simply write it as I.
For example $[1]$, $\begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix}$, $\begin{bmatrix}1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{bmatrix}$ are identity matrices of order 1, 2 and 3, respectively.
Observe that a scalar matrix is an identity matrix when $k=1$. But every identity matrix is clearly a scalar matrix.
(vii) Zero matrix
A matrix is said to be zero matrix or null matrix if all its elements are zero.
For example, $[0],\begin{bmatrix}0 & 0 \\ 0 & 0\end{bmatrix},\begin{bmatrix}0 & 0 & 0 \\ 0 & 0 & 0\end{bmatrix},[0,0]$ are all zero matrices. We denote zero matrix by $O$. Its order will be clear from the context.
3.3.1 Equality of matrices
Definition 2 Two matrices $A=[a_{i j}]$ and $B=[b_{i j}]$ are said to be equal if
(i) they are of the same order
(ii) each element of $A$ is equal to the corresponding element of $B$, that is $a_{i j}=b_{i j}$ for all $i$ and $j$.
For example, $\begin{bmatrix}2 & 3 \\ 0 & 1\end{bmatrix}$ and $\begin{bmatrix}2 & 3 \\ 0 & 1\end{bmatrix}$ are equal matrices but $\begin{bmatrix}3 & 2 \\ 0 & 1\end{bmatrix}$ and $\begin{bmatrix}2 & 3 \\ 0 & 1\end{bmatrix}$ are not equal matrices. Symbolically, if two matrices $A$ and $B$ are equal, we write $A=B$.
$$ \text { If }\begin{bmatrix} x & y \\ z & a \\ b & c \end{bmatrix}=\begin{bmatrix} 1.5 & 0 \\ 2 & \sqrt{6} \\ 3 & 2 \end{bmatrix} \text {, then } x=1.5, y=0, z=2, a=\sqrt{6}, b=3, c=2 $$
3.4 Operations on Matrices
In this section, we shall introduce certain operations on matrices, namely, addition of matrices, multiplication of a matrix by a scalar, difference and multiplication of matrices.
3.4.1 Addition of matrices
Suppose Fatima has two factories at places A and B. Each factory produces sport shoes for boys and girls in three different price categories labelled 1,2 and 3. The quantities produced by each factory are represented as matrices given below:
Factory at A
$\quad\quad$Boys Girls
$\begin{matrix}1\\2\\3\end{matrix} \begin{bmatrix}80 & 60 \\ 75 & 65 \\ 90 & 85\end{bmatrix}$
Factory at B
$\quad\quad$Boys Girls
$\begin{matrix}1\\2\\3\end{matrix} \begin{bmatrix}90 & 50 \\ 70 & 55 \\ 75 & 75\end{bmatrix}$
Suppose Fatima wants to know the total production of sport shoes in each price category. Then the total production
In category 1 : for boys $(80+90)$, for girls $(60+50)$
In category 2 : for boys $(75+70)$, for girls $(65+55)$
In category 3 : for boys $(90+75)$, for girls $(85+75)$
This can be represented in the matrix form as $\begin{bmatrix}80+90 & 60+50 \\ 75+70 & 65+55 \\ 90+75 & 85+75\end{bmatrix}$.
This new matrix is the sum of the above two matrices. We observe that the sum of two matrices is a matrix obtained by adding the corresponding elements of the given matrices. Furthermore, the two matrices have to be of the same order.
Thus, if $A=\begin{bmatrix}a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23}\end{bmatrix}$ is a $2 \times 3$ matrix and $B=\begin{bmatrix}b_{11} & b_{12} & b_{13} \\ b_{21} & b_{22} & b_{23}\end{bmatrix}$ is another
$2 \times 3$ matrix. Then, we define $A+B=\begin{bmatrix}a_{11}+b_{11} & a_{12}+b_{12} & a_{13}+b_{13} \\ a_{21}+b_{21} & a_{22}+b_{22} & a_{23}+b_{23}\end{bmatrix}$.
In general, if $A=[a_{i j}]$ and $B=[b_{i j}]$ are two matrices of the same order, say $m \times n$, then the sum of the two matrices A and B is defined as a matrix $C = [c _{ij}] _{m \times n} $, where $ c _{i j} = a _{ij} + b _{ij} $, for all possible values of $i$ and $j$.
Note
1. We emphasise that if A and B are not of the same order, then A + B is not defined. For example if $A=\begin{bmatrix}2 & 3 \\ 1 & 0\end{bmatrix}, B=\begin{bmatrix}1 & 2 & 3 \\ 1 & 0 & 1\end{bmatrix}$, then $A+B$ is not defined.
2. We may observe that addition of matrices is an example of binary operation on the set of matrices of the same order.
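The definition of A + B above can also be checked computationally. The following sketch (an illustration outside the text, assuming matrices are stored as lists of rows; `mat_add` is a made-up helper name) adds Fatima's two production matrices and, mirroring Note 1, refuses to add matrices of different orders:

```python
def mat_add(A, B):
    """Element-wise sum; defined only when A and B have the same order."""
    if len(A) != len(B) or len(A[0]) != len(B[0]):
        raise ValueError("A + B is not defined: orders differ")
    return [[a + b for a, b in zip(rowA, rowB)] for rowA, rowB in zip(A, B)]

# Fatima's two factories (rows: categories 1-3; columns: boys, girls)
factory_A = [[80, 60], [75, 65], [90, 85]]
factory_B = [[90, 50], [70, 55], [75, 75]]

print(mat_add(factory_A, factory_B))  # total production in each category
```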
3.4.2 Multiplication of a matrix by a scalar
Now suppose that Fatima has doubled the production at factory A in all categories (refer to 3.4.1).
Previously quantities (in standard units) produced by factory A were
$ \hspace{2mm} \text{Boys Girls} $
$\begin{matrix}1\\2\\3 \end{matrix} \begin{bmatrix}80 & 60\\ 75 & 65 \\90 & 85\end{bmatrix}$
Revised quantities produced by factory $A$ are as given below:
$ \quad \quad $Boys Girls
$\begin{matrix}1\\2\\3\end{matrix} \begin{bmatrix}2 \times 80 & 2 \times 60 \\ 2 \times 75 & 2 \times 65 \\ 2 \times 90 & 2 \times 85\end{bmatrix}$
This can be represented in the matrix form as $\begin{bmatrix}160 & 120 \\ 150 & 130 \\ 180 & 170\end{bmatrix}$. We observe that
the new matrix is obtained by multiplying each element of the previous matrix by 2 .
In general, we may define multiplication of a matrix by a scalar as follows: if $ A=[a_{ij}]_{m\times n} $ is a matrix and k is a scalar, then k A is another matrix which is obtained by multiplying each element of A by the scalar k.
In other words, $ kA = k[a_{ij}]_ {m\times n} $ $ =[k(a _{ij})] _{m\times n} $ that is, $ (i,j)^{th} $ element of kA is $ka _ {ij} $ for all possible values of i and j
For example, if $A=\begin{bmatrix}3 & 1 & 1.5 \\ \sqrt{5} & 7 & 3 \\ 2 & 0 & 5\end{bmatrix}$, then
$$ 3 A=3\begin{bmatrix} 3 & 1 & 1.5 \\ \sqrt{5} & 7 & 3 \\ 2 & 0 & 5 \end{bmatrix}=\begin{bmatrix} 9 & 3 & 4.5 \\ 3 \sqrt{5} & 21 & 9 \\ 6 & 0 & 15 \end{bmatrix} $$
Negative of a matrix The negative of a matrix A is denoted by $-A$. We define $-A=(-1) A$.
For example, let
$$ \begin{aligned} A & =\begin{bmatrix} 3 & 1 \\ 5 & x \end{bmatrix}, \text { then } -A \text { is given by } \\ -A & =(-1) A=(-1)\begin{bmatrix} 3 & 1 \\ 5 & x \end{bmatrix}=\begin{bmatrix} -3 & -1 \\ -5 & -x \end{bmatrix} \end{aligned} $$
Difference of matrices If $A=[a_{i j}], B=[b_{i j}]$ are two matrices of the same order, say $m \times n$, then the difference $A-B$ is defined as a matrix $D=[d_{i j}]$, where $d_{i j}=a_{i j}-b_{i j}$, for all values of $i$ and $j$. In other words, $D=A-B=A+(-1) B$, that is, the sum of the matrix $A$ and the matrix $-B$.
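Scalar multiplication, the negative of a matrix and the difference A − B can all be sketched together. In this illustrative snippet (helper names `scalar_mul` and `mat_sub` are not from the text), A − B is computed exactly as defined, via A + (−1)B:

```python
def scalar_mul(k, A):
    """kA: multiply every entry of A by the scalar k."""
    return [[k * a for a in row] for row in A]

def mat_sub(A, B):
    """A - B, defined as A + (-1)B for matrices of the same order."""
    negB = scalar_mul(-1, B)
    return [[a + b for a, b in zip(rA, rB)] for rA, rB in zip(A, negB)]

A = [[3, 1], [5, 7]]   # illustrative values
B = [[1, 1], [2, 3]]

print(scalar_mul(2, A))   # 2A
print(scalar_mul(-1, A))  # negative of A, i.e. (-1)A
print(mat_sub(A, B))      # A - B
```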
3.4.3 Properties of matrix addition
The addition of matrices satisfies the following properties:
(i) Commutative Law If $A=[a_{i j}], B=[b_{i j}]$ are matrices of the same order, say $m \times n$, then $A+B=B+A$.
Now
$$ \begin{aligned} A+B & =[a_{i j}]+[b_{i j}]=[a_{i j}+b_{i j}] \\ & =[b_{i j}+a_{i j}] \text { (addition of numbers is commutative) } \\ & =([b_{i j}]+[a_{i j}])=B+A \end{aligned} $$
(ii) Associative Law For any three matrices $A=[a_{i j}], B=[b_{i j}], C=[c_{i j}]$ of the same order, say $m \times n,(A+B)+C=A+(B+C)$.
Now
$$ \begin{aligned} (A+B)+C & =([a_{i j}]+[b_{i j}])+[c_{i j}] \\ & =[a_{i j}+b_{i j}]+[c_{i j}]=[(a_{i j}+b_{i j})+c_{i j}] \\ & =[a_{i j}+(b_{i j}+c_{i j})] \quad(\text { Why? }) \\ & =[a_{i j}]+[(b_{i j}+c_{i j})]=[a_{i j}]+([b_{i j}]+[c_{i j}])=A+(B+C) \end{aligned} $$ (iii) Existence of additive identity Let $A=[a_{i j}]$ be an $m \times n$ matrix and $O$ be an $m \times n$ zero matrix, then $A+O=O+A=A$. In other words, $O$ is the additive identity for matrix addition.
(iv) The existence of additive inverse Let $A=[a_{ij}]_{m \times n}$ be any matrix, then we have another matrix $-A=[-a_{ij}]_{m \times n}$ such that $A+(-A)=(-A)+A=O$. So $-A$ is the additive inverse of $A$ or negative of $A$.
3.4.4 Properties of scalar multiplication of a matrix
If $A=[a_{i j}]$ and $B=[b_{i j}]$ be two matrices of the same order, say $m \times n$, and $k$ and $l$ are scalars, then
(i) $k(A+B)=k A+k B$, (ii) $(k+l) A=k A+l A$
(i) $k(A+B)=k([a_{i j}]+[b_{i j}])$
$$ \begin{aligned} & =k[a_{i j}+b_{i j}]=[k(a_{i j}+b_{i j})]=[(k a_{i j})+(k b_{i j})] \\ & =[k a_{i j}]+[k b_{i j}]=k[a_{i j}]+k[b_{i j}]=k A+k B \end{aligned} $$
(ii) $(k+l) A=(k+l)[a_{i j}]$
$$ =[(k+l) a_{i j}]=[k a_{i j}+l a_{i j}]=[k a_{i j}]+[l a_{i j}]=k[a_{i j}]+l[a_{i j}]=k A+l A $$
3.4.5 Multiplication of matrices
Suppose Meera and Nadeem are two friends. Meera wants to buy 2 pens and 5 story books, while Nadeem needs 8 pens and 10 story books. They both go to a shop to enquire about the rates which are quoted as follows:
$$ \text { Pen  ₹ } 5 \text { each, story book  ₹ } 50 \text { each. } $$
How much money does each need to spend? Clearly, Meera needs ₹ $(5 \times 2+50 \times 5)$ that is ₹ 260 , while Nadeem needs $(8 \times 5+50 \times 10)$ ₹, that is ₹ 540 . In terms of matrix representation, we can write the above information as follows:
Requirements $\quad$ Prices per piece (in Rupees) $\quad$ Money needed (in Rupees)
$ \begin{bmatrix} 2 & 5 \\ 8 & 10 \end{bmatrix} \quad\quad\quad\quad\quad\quad\begin{bmatrix} 5 \\ 50 \end{bmatrix} \quad\quad\quad\quad\quad\quad\quad\begin{bmatrix} 5 \times 2+50 \times 5 \\ 8 \times 5+10 \times 50 \end{bmatrix}=\begin{bmatrix} 260 \\ 540 \end{bmatrix} $
Suppose that they enquire about the rates from another shop, quoted as follows:
$$ \text { pen }₹ 4 \text { each, story book }₹ 40 \text { each. } $$
Now, the money required by Meera and Nadeem to make purchases will be respectively ₹ $(4 \times 2+40 \times 5)=₹ 208$ and ₹ $(8 \times 4+10 \times 40)=₹ 432$
Again, the above information can be represented as follows:
Requirements $\quad$Prices per piece (in Rupees) $\quad$Money needed (in Rupees)
$$ \begin{bmatrix} 2 & 5 \\ 8 & 10 \end{bmatrix} \quad\quad\quad\quad\quad\quad\begin{bmatrix} 4 \\ 40 \end{bmatrix} \quad\quad\quad\quad\quad\quad\quad\begin{bmatrix} 4 \times 2+40 \times 5 \\ 8 \times 4+10 \times 40 \end{bmatrix}=\begin{bmatrix} 208 \\ 432 \end{bmatrix} $$
Now, the information in both the cases can be combined and expressed in terms of matrices as follows:
Requirements $\quad$Prices per piece (in Rupees) $\quad$Money needed (in Rupees)
$\begin{aligned} {\begin{bmatrix} 2 & 5 \\ 8 & 10 \end{bmatrix} \quad\quad\quad\quad \begin{bmatrix} 5 & 4 \\ 50 & 40 \end{bmatrix} } \quad\quad\quad\quad\quad \begin{bmatrix} 5 \times 2+5 \times 50 & 4 \times 2+40 \times 5 \\ 8 \times 5+10 \times 50 & 8 \times 4+10 \times 40 \end{bmatrix} \\ = \begin{bmatrix} 260 & 208 \\ 540 & 432 \end{bmatrix} \end{aligned}$
The above is an example of multiplication of matrices. We observe that, for multiplication of two matrices A and B, the number of columns in A should be equal to the number of rows in B. Furthermore for getting the elements of the product matrix, we take rows of A and columns of B, multiply them elementwise and take the sum. Formally, we define multiplication of matrices as follows:
The product of two matrices A and B is defined if the number of columns of A is equal to the number of rows of $B$. Let $A=[a_{i j}]$ be an $m \times n$ matrix and $B=[b_{j k}]$ be an $n \times p$ matrix. Then the product of the matrices $A$ and $B$ is the matrix $C$ of order $m \times p$. To get the $(i, k)^{\text {th }}$ element $c_{i k}$ of the matrix C, we take the $i^{\text {th }}$ row of A and $k^{\text {th }}$ column of $B$, multiply them elementwise and take the sum of all these products. In other words, if $A=[a _ {ij}] _ {m \times n}, B = [b _ {jk} ]_ {n \times p} $, then the $i^{\text {th }}$ row of A is $[a _ {i1}\ a_{i2} \ldots a _ {in}]$ and the $k^{\text {th }}$ column of
B is $\begin{bmatrix}b_{1 k} \\ b_{2 k} \\ \vdots \\ b_{n k}\end{bmatrix}$, then $c_{i k}=a_{i 1} b_{1 k}+a_{i 2} b_{2 k}+a_{i 3} b_{3 k}+\ldots+a_{i n} b_{n k}=\sum_{j=1}^{n} a_{i j} b_{j k}$.
The matrix $C=[c_{i k}]_{m \times p}$ is the product of $A$ and $B$.
For example, if $C=\begin{bmatrix}1 & -1 & 2 \\ 0 & 3 & 4\end{bmatrix}$ and $D=\begin{bmatrix}2 & 7 \\ -1 & 1 \\ 5 & -4\end{bmatrix}$, then the product $C D$ is defined and is given by $C D=\begin{bmatrix}1 & -1 & 2 \\ 0 & 3 & 4\end{bmatrix}\begin{bmatrix}2 & 7 \\ -1 & 1 \\ 5 & -4\end{bmatrix}$. This is a $2 \times 2$ matrix in which each entry is the sum of the products across some row of $C$ with the corresponding entries down some column of $D$. These four computations are
$\begin{aligned} & \text { Entry in } \\ & \text { first row } \\ & \text { first column }\end{aligned}\begin{bmatrix}1 & -1 & 2 \\ 0 & 3 & 4\end{bmatrix}\begin{bmatrix}2 & 7 \\ -1 & 1 \\ 5 & -4\end{bmatrix}=\begin{bmatrix}(1)(2)+(-1)(-1)+(2)(5) & ? \\ ? & ?\end{bmatrix}$
$\begin{aligned} & \text { Entry in } \\ & \text { first row } \\ & \text { second column }\end{aligned}\begin{bmatrix}1 & -1 & 2 \\ 0 & 3 & 4\end{bmatrix}\begin{bmatrix}2 & 7 \\ -1 & 1 \\ 5 & -4\end{bmatrix}=\begin{bmatrix}13 & (1)(7)+(-1)(1)+2(-4) \\ ? & ?\end{bmatrix}$
$\begin{aligned} & \text { Entry in } \\ & \text { second row } \\ & \text { first column }\end{aligned}\begin{bmatrix}1 & -1 & 2 \\ 0 & 3 & 4\end{bmatrix}\begin{bmatrix}2 & 7 \\ -1 & 1 \\ 5 & -4\end{bmatrix}=\begin{bmatrix}13 & -2 \\ 0(2)+3(-1)+4(5) & ?\end{bmatrix}$
$\begin{aligned} & \text { Entry in } \\ & \text { second row } \\ & \text { second column }\end{aligned}\begin{bmatrix}1 & -1 & 2 \\ 0 & 3 & 4\end{bmatrix}\begin{bmatrix}2 & 7 \\ -1 & 1 \\ 5 & -4\end{bmatrix}=\begin{bmatrix}13 & -2 \\ 17 & 0(7)+3(1)+4(-4)\end{bmatrix}$
Thus $CD=\begin{bmatrix}13 & -2 \\ 17 & -13\end{bmatrix}$
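The row-by-column rule can be written as a short routine. In this sketch (`mat_mul` is an illustrative name, not from the text), the earlier shop example is recomputed: requirements times prices gives the money-needed matrix:

```python
def mat_mul(A, B):
    """Row-by-column product; needs cols of A == rows of B."""
    n = len(A[0])
    if n != len(B):
        raise ValueError("AB is not defined")
    p = len(B[0])
    return [[sum(A[i][j] * B[j][k] for j in range(n)) for k in range(p)]
            for i in range(len(A))]

requirements = [[2, 5], [8, 10]]   # pens, story books for Meera, Nadeem
prices = [[5, 4], [50, 40]]        # per-piece prices at the two shops

print(mat_mul(requirements, prices))  # money needed at each shop
```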
Remark If $AB$ is defined, then $BA$ need not be defined. For example, if $A$ is a $2 \times 3$ matrix and $B$ is a $3 \times 3$ matrix, then $AB$ is defined but $BA$ is not defined, because $B$ has 3 columns while $A$ has only 2 (and not 3) rows. If $A, B$ are, respectively, $m \times n, k \times l$ matrices, then both $AB$ and $BA$ are defined if and only if $n=k$ and $l=m$. In particular, if both $A$ and $B$ are square matrices of the same order, then both $AB$ and $BA$ are defined.
Non-commutativity of multiplication of matrices
Now, we shall see by an example that even if $AB$ and $BA$ are both defined, it is not necessary that $AB=BA$.
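One such pair can be checked directly. The matrices below are my own illustrative choice (not necessarily the example the text had in mind); both products are defined, yet they differ:

```python
def mat_mul(A, B):
    """Row-by-column product for matrices stored as lists of rows."""
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

A = [[1, 0], [0, -1]]   # illustrative 2 x 2 matrices
B = [[0, 1], [1, 0]]

print(mat_mul(A, B))  # AB
print(mat_mul(B, A))  # BA -- not the same matrix
```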
3.4.6 Properties of multiplication of matrices
The multiplication of matrices possesses the following properties, which we state without proof.
1. The associative law For any three matrices $A, B$ and $C$. We have $(AB) C=A(BC)$, whenever both sides of the equality are defined.
2. The distributive law For three matrices $A, B$ and $C$.
(i) $A(B+C)=AB+AC$
(ii) $(A+B) C=AC+BC$, whenever both sides of equality are defined.
3. The existence of multiplicative identity For every square matrix A, there exists an identity matrix I of the same order such that $IA=AI=A$.
Now, we shall verify these properties by examples.
3.5 Transpose of a Matrix
In this section, we shall learn about transpose of a matrix and special types of matrices such as symmetric and skew symmetric matrices.
Definition 3 If $A=[a_{i j}]$ is an $m \times n$ matrix, then the matrix obtained by interchanging the rows and columns of A is called the transpose of A. The transpose of the matrix A is denoted by $A^{\prime}$ or $A^{T}$. In other words, if $A=[a _{i j}] _{m \times n}$, then $A^{\prime}=[a _{j i}] _{n \times m}$. For example,
if $A=\begin{bmatrix}3 & 5 \\ \sqrt{3} & 1 \\ 0 & \frac{1}{5}\end{bmatrix}_{3 \times 2}$,
then $A^{\prime}=\begin{bmatrix}3 & \sqrt{3} & 0 \\ 5 & 1 & \frac{1}{5}\end{bmatrix}_{2 \times 3}$
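Interchanging rows and columns is a one-line operation in code. This sketch (values and the helper name `transpose` are illustrative, not the text's) turns a 3 × 2 matrix into its 2 × 3 transpose and confirms property (i) below, $(A')' = A$:

```python
def transpose(A):
    """A': rows of A become the columns of the result."""
    return [list(col) for col in zip(*A)]

A = [[3, 5], [1, 1], [0, 2]]   # a 3 x 2 matrix (illustrative values)

print(transpose(A))            # the 2 x 3 transpose
print(transpose(transpose(A)) == A)  # (A')' = A
```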
3.5.1 Properties of transpose of the matrices
We now state the following properties of transpose of matrices without proof. These may be verified by taking suitable examples.
For any matrices A and B of suitable orders, we have (i) $(A^{\prime})^{\prime}=A$, (ii) $(k A)^{\prime}=k A^{\prime}$ (where $k$ is any constant) (iii) $(A+B)^{\prime}=A^{\prime}+B^{\prime}$ (iv) $(A B)^{\prime}=B^{\prime} A^{\prime}$
3.6 Symmetric and Skew Symmetric Matrices
Definition 4 A square matrix $A=[a_{i j}]$ is said to be symmetric if $A^{\prime}=A$, that is, $[a_{i j}]=[a_{j i}]$ for all possible values of $i$ and $j$.
$$ \text { For example } A=\begin{bmatrix} \sqrt{3} & 2 & 3 \\ 2 & 1.5 & 1 \\ 3 & 1 & 1 \end{bmatrix} \text { is a symmetric matrix as } A^{\prime}=A $$
Definition 5 A square matrix $A=[a_{i j}]$ is said to be a skew symmetric matrix if $A^{\prime}=-A$, that is $a_{j i}=-a_{i j}$ for all possible values of $i$ and $j$. Now, if we put $i=j$, we have $a_{i i}=-a_{i i}$. Therefore $2 a_{i i}=0$ or $a_{i i}=0$ for all $i$'s.
This means that all the diagonal elements of a skew symmetric matrix are zero.
For example, the matrix $B=\begin{bmatrix}0 & e & f \\ -e & 0 & g \\ -f & -g & 0\end{bmatrix}$ is a skew symmetric matrix as $B^{\prime}=-B$
Now, we are going to prove some results of symmetric and skew symmetric matrices.
Theorem 1 For any square matrix $A$ with real number entries, $A+A^{\prime}$ is a symmetric matrix and $A-A^{\prime}$ is a skew symmetric matrix.
Proof Let $B=A+A^{\prime}$, then
$$\begin{aligned} B^{\prime} & =(A+A^{\prime})^{\prime} \\ & =A^{\prime}+(A^{\prime})^{\prime}(\text { as }(A+B)^{\prime}=A^{\prime}+B^{\prime}) \\ & =A^{\prime}+A(\text { as }(A^{\prime})^{\prime}=A) \\ & =A+A^{\prime}(\text { as } A+B=B+A) \\ & =B \end{aligned} $$
Therefore
$B=A+A^{\prime}$ is a symmetric matrix
Now let
$$ C=A-A^{\prime} $$
$$\begin{aligned} C^{\prime} & =(A-A^{\prime})^{\prime}=A^{\prime}-(A^{\prime})^{\prime} \quad \text { (Why?) } \\ & =A^{\prime}-A \quad(\text { Why? }) \\ & =-(A-A^{\prime})=-C \end{aligned} $$
Therefore
$$ C=A-A^{\prime} \text { is a skew symmetric matrix. } $$
Theorem 2 Any square matrix can be expressed as the sum of a symmetric and a skew symmetric matrix.
Proof Let A be a square matrix, then we can write
$$ A=\frac{1}{2}(A+A^{\prime})+\frac{1}{2}(A-A^{\prime}) $$
From the Theorem 1, we know that $(A+A^{\prime})$ is a symmetric matrix and $(A-A^{\prime})$ is a skew symmetric matrix. Since for any matrix $A,(k A)^{\prime}=k A^{\prime}$, it follows that $\frac{1}{2}(A+A^{\prime})$ is a symmetric matrix and $\frac{1}{2}(A-A^{\prime})$ is a skew symmetric matrix. Thus, any square matrix can be expressed as the sum of a symmetric and a skew symmetric matrix.
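Theorem 2's decomposition can be verified numerically for any particular matrix. The sketch below (helper names are illustrative) computes the symmetric part $P = \frac{1}{2}(A+A')$ and the skew part $Q = \frac{1}{2}(A-A')$ of one 2 × 2 matrix and checks all three claims:

```python
def transpose(A):
    return [list(c) for c in zip(*A)]

def add(A, B):
    return [[x + y for x, y in zip(r, s)] for r, s in zip(A, B)]

def scale(k, A):
    return [[k * x for x in row] for row in A]

A = [[1, 4], [2, 3]]                              # illustrative square matrix
P = scale(0.5, add(A, transpose(A)))              # (1/2)(A + A')
Q = scale(0.5, add(A, scale(-1, transpose(A))))   # (1/2)(A - A')

assert transpose(P) == P              # P is symmetric
assert transpose(Q) == scale(-1, Q)   # Q is skew symmetric
assert add(P, Q) == A                 # P + Q recovers A
print(P, Q)
```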
3.7 Invertible Matrices
Definition 6 If $A$ is a square matrix of order $m$, and if there exists another square matrix $B$ of the same order $m$, such that $AB=BA=I$, then $B$ is called the inverse matrix of $A$ and it is denoted by $A^{-1}$. In that case $A$ is said to be invertible.
For example, let $A=\begin{bmatrix}2 & 3 \\ 1 & 2\end{bmatrix}$ and $B=\begin{bmatrix}2 & -3 \\ -1 & 2\end{bmatrix}$ be two matrices.
Now
$$\begin{aligned} AB & =\begin{bmatrix} 2 & 3 \\ 1 & 2 \end{bmatrix}\begin{bmatrix} 2 & -3 \\ -1 & 2 \end{bmatrix} \\ & =\begin{bmatrix} 4-3 & -6+6 \\ 2-2 & -3+4 \end{bmatrix}=\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}=I \end{aligned} $$
Also $BA=\begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix}=I$. Thus $B$ is the inverse of $A$, in other words $B=A^{-1}$ and $A$ is the inverse of $B$, i.e., $A=B^{-1}$
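Whether a candidate B really is the inverse of A reduces to checking both products against I. A minimal sketch (the pair below is chosen for illustration):

```python
def mat_mul(A, B):
    """Row-by-column product for matrices stored as lists of rows."""
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

A = [[2, 3], [1, 2]]
B = [[2, -3], [-1, 2]]   # candidate inverse (illustrative)
I = [[1, 0], [0, 1]]

print(mat_mul(A, B) == I and mat_mul(B, A) == I)  # so B is the inverse of A
```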
Note
1. A rectangular matrix does not possess inverse matrix, since for products $B A$ and $AB$ to be defined and to be equal, it is necessary that matrices $A$ and $B$ should be square matrices of the same order.
2. If $B$ is the inverse of $A$, then $A$ is also the inverse of $B$.
Theorem 3 (Uniqueness of inverse) Inverse of a square matrix, if it exists, is unique.
Proof Let $A=[a_{i j}]$ be a square matrix of order $m$. If possible, let $B$ and $C$ be two inverses of $A$. We shall show that $B=C$.
Since B is the inverse of A
$$ AB=BA=I $$
Since $C$ is also the inverse of $A$
$$ AC=CA=I $$
Thus
$$ B=BI=B(AC)=(BA) C=IC=C $$
Theorem 4 If $A$ and $B$ are invertible matrices of the same order, then $(AB)^{-1}=B^{-1} A^{-1}$.
Proof From the definition of inverse of a matrix, we have
$$ (AB)(AB)^{-1}=I $$
or
$$ A^{-1}(AB)(AB)^{-1}=A^{-1} I \quad(\text { Pre-multiplying both sides by } A^{-1}) $$
or
$(A^{-1} A) B(AB)^{-1}=A^{-1} \quad($ Since $A^{-1} I=A^{-1})$
or
$I B(AB)^{-1}=A^{-1}$
or
$B(AB)^{-1}=A^{-1}$
or
$$ B^{-1} B(AB)^{-1}=B^{-1} A^{-1} $$
or
$I(AB)^{-1}=B^{-1} A^{-1}$
Hence $(AB)^{-1}=B^{-1} A^{-1}$
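The reversal law can be spot-checked numerically. In this sketch the 2 × 2 inverse is computed by the adjugate formula $\frac{1}{ad-bc}\begin{bmatrix}d & -b \\ -c & a\end{bmatrix}$ (a standard fact, not derived in this section); the matrices and helper names are illustrative:

```python
def mul(A, B):
    """Row-by-column product."""
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula (assumes det != 0)."""
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1, 2], [3, 4]]   # both invertible (illustrative values)
B = [[2, 0], [1, 1]]

lhs = inv2(mul(A, B))            # (AB)^{-1}
rhs = mul(inv2(B), inv2(A))      # B^{-1} A^{-1}
print(lhs, rhs)                  # the two sides agree
```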
Summary
◆ A matrix is an ordered rectangular array of numbers or functions.
◆ A matrix having $m$ rows and $n$ columns is called a matrix of order $m \times n$.
◆ $[a_{i j}]_{m \times 1}$ is a column matrix.
◆ $[a_{i j}]_{1 \times n}$ is a row matrix.
◆ An $m \times n$ matrix is a square matrix if $m=n$.
◆ $A=[a _{i j}] _{m \times m}$ is a diagonal matrix if $a _{i j}=0$, when $i \neq j$.
◆ $A=[a_{ij}]_{n \times n}$ is a scalar matrix if $a _ {i j}=0$, when $i \neq j$, and $a _ {ij}=k$ ($k$ is some constant), when $i=j$.
◆ $A=[a_{ij}]_{n \times n}$ is an identity matrix, if $a_{i j}=1$, when $i=j$, and $a_{i j}=0$, when $i \neq j$.
◆ A zero matrix has all its elements as zero.
◆ $A=[a_ {i j}]=[b_ {i j}]=B$ if (i) $A$ and $B$ are of the same order, (ii) $a_{i j}=b_{i j}$ for all possible values of $i$ and $j$.
◆ $k A=k[a_ {i j}]_ {m \times n}=[k(a_{i j})] _{m \times n}$
◆ $-A=(-1) A$
◆ $A-B=A+(-1) B$
◆ $A+B=B+A$
◆ $(A+B)+C=A+(B+C)$, where $A, B$ and $C$ are of the same order.
◆ $k(A+B)=k A+k B$, where $A$ and $B$ are of the same order, $k$ is a constant.
◆ $(k+l) A=k A+l A$, where $k$ and $l$ are constants.
◆ If $A=[a_ {i j}]_ {m \times n}$ and $B=[b_ {j k}]_ {n \times p}$, then $AB=C=[c_ {i k}]_ {m \times p}$, where $c_ {i k}=\sum_ {j=1}^{n} a_ {i j} b_{j k}$
◆ (i) $A(BC)=(AB) C$, (ii) $A(B+C)=AB+AC$, (iii) $(A+B) C=AC+BC$
◆ If $A=[a_ {i j}]_ {m \times n}$, then $A^{\prime}$ or $A^{T}=[a_ {j i}]_ {n \times m}$
◆ (i) $(A^{\prime})^{\prime}=A$, (ii) $(k A)^{\prime}=k A^{\prime}$, (iii) $(A+B)^{\prime}=A^{\prime}+B^{\prime}$, (iv) $(AB)^{\prime}=B^{\prime} A^{\prime}$
◆ $A$ is a symmetric matrix if $A^{\prime}=A$.
◆ $A$ is a skew symmetric matrix if $A^{\prime}=-A$.
◆ Any square matrix can be represented as the sum of a symmetric and a skew symmetric matrix.
◆ If $A$ and $B$ are two square matrices such that $AB=BA=I$, then $B$ is the inverse matrix of $A$, denoted by $A^{-1}$, and $A$ is the inverse of $B$.
◆ Inverse of a square matrix, if it exists, is unique.