Chapter 04 Determinants

Short Answer Type Questions

1. $ \begin{vmatrix} x^{2}-x+1 & x-1 \\ x+1 & x+1\end{vmatrix} $

Solution

Let $\Delta= \begin{vmatrix} x^{2}-x+1 & x-1 \\ x+1 & x+1\end{vmatrix} $

$ \begin{aligned} C_1 \to C_1 & -C_2 \\ & = \begin{vmatrix} x^{2}-2 x+2 & x-1 \\ 0 & x+1 \end{vmatrix} \\ & =(x+1)(x^{2}-2 x+2)-0 \\ & =x^{3}-2 x^{2}+2 x+x^{2}-2 x+2=x^{3}-x^{2}+2 \end{aligned} $
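The closed form can be spot-checked numerically. The sketch below (plain Python, not part of the original solution) evaluates the determinant directly at a few integer points and compares it with $x^{3}-x^{2}+2$:

```python
def det2(m):
    # 2x2 determinant: ad - bc
    (a, b), (c, d) = m
    return a * d - b * c

for x in (-2, 0, 1, 3):
    M = [[x**2 - x + 1, x - 1],
         [x + 1,        x + 1]]
    assert det2(M) == x**3 - x**2 + 2
print("det matches x^3 - x^2 + 2")
```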

2. $ \begin{vmatrix} a+x & y & z \\ x & a+y & z \\ x & y & a+z\end{vmatrix} $

Solution

Let $\Delta= \begin{vmatrix} a+x & y & z \\ x & a+y & z \\ x & y & a+z\end{vmatrix} $

$ C_1 \to C_1+C_2+C_3 $

$ = \begin{vmatrix} a+x+y+z & y & z \\ a+x+y+z & a+y & z \\ a+x+y+z & y & a+z \end{vmatrix} =(a+x+y+z) \begin{vmatrix} 1 & y & z \\ 1 & a+y & z \\ 1 & y & a+z \end{vmatrix} $

(Taking $a+x+y+z$ common)

$R_1 \to R_1-R_2, R_2 \to R_2-R_3$

$ =(a+x+y+z) \begin{vmatrix} 0 & -a & 0 \\ 0 & a & -a \\ 1 & y & a+z \end{vmatrix} $

Expanding along $C_1$

$ =(a+x+y+z)[1(a^{2}-0)]=a^{2}(a+x+y+z) $
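A quick numerical check of the answer $a^{2}(a+x+y+z)$, sketched in plain Python (not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

for a, x, y, z in [(2, 3, 5, 7), (1, -1, 4, 0)]:
    M = [[a + x, y,     z],
         [x,     a + y, z],
         [x,     y,     a + z]]
    assert det3(M) == a**2 * (a + x + y + z)
print("det matches a^2(a + x + y + z)")
```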

3. $ \begin{vmatrix} 0 & x y^{2} & x z^{2} \\ x^{2} y & 0 & y z^{2} \\ x^{2} z & z y^{2} & 0\end{vmatrix} $

Solution

Let $\Delta= \begin{vmatrix} 0 & x y^{2} & x z^{2} \\ x^{2} y & 0 & y z^{2} \\ x^{2} z & z y^{2} & 0\end{vmatrix} $

Taking $x^{2}, y^{2}$ and $z^{2}$ common from $C_1, C_2$ and $C_3$ respectively $=x^{2} y^{2} z^{2} \begin{vmatrix} 0 & x & x \\ y & 0 & y \\ z & z & 0\end{vmatrix} $

Expanding along $R_1$

$ \begin{aligned} & =x^{2} y^{2} z^{2}[0 \begin{vmatrix} 0 & y \\ z & 0 \end{vmatrix} -x \begin{vmatrix} y & y \\ z & 0 \end{vmatrix} +x \begin{vmatrix} y & 0 \\ z & z \end{vmatrix} ] \\ & =x^{2} y^{2} z^{2}[-x(0-y z)+x(y z-0)] \\ & =x^{2} y^{2} z^{2}(x y z+x y z)=x^{2} y^{2} z^{2}(2 x y z)=2 x^{3} y^{3} z^{3} \end{aligned} $
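The value $2x^{3}y^{3}z^{3}$ can be confirmed at sample points with a short Python sketch (not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

for x, y, z in [(1, 2, 3), (2, 1, -1)]:
    M = [[0,        x * y**2, x * z**2],
         [x**2 * y, 0,        y * z**2],
         [x**2 * z, z * y**2, 0       ]]
    assert det3(M) == 2 * x**3 * y**3 * z**3
print("det matches 2 x^3 y^3 z^3")
```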

4. $ \begin{vmatrix} 3 x & -x+y & -x+z \\ x-y & 3 y & z-y \\ x-z & y-z & 3 z\end{vmatrix} $

Solution

Let $\Delta= \begin{vmatrix} 3 x & -x+y & -x+z \\ x-y & 3 y & z-y \\ x-z & y-z & 3 z\end{vmatrix} $

$C_1 \to C_1+C_2+C_3$

$= \begin{vmatrix} x+y+z & -x+y & -x+z \\ x+y+z & 3 y & z-y \\ x+y+z & y-z & 3 z \end{vmatrix} $

Taking $(x+y+z)$ common from $C_1$

$ =(x+y+z) \begin{vmatrix} 1 & -x+y & -x+z \\ 1 & 3 y & z-y \\ 1 & y-z & 3 z \end{vmatrix} $

$R_1 \to R_1-R_2, R_2 \to R_2-R_3$

$=(x+y+z) \begin{vmatrix} 0 & -x-2 y & -x+y \\ 0 & 2 y+z & -y-2 z \\ 1 & y-z & 3 z\end{vmatrix} $

Expanding along $C_1$

$ \begin{aligned} & =(x+y+z)[1 \begin{vmatrix} -x-2 y & -x+y \\ 2 y+z & -y-2 z \end{vmatrix} ] \\ & =(x+y+z)[(-x-2 y)(-y-2 z)-(2 y+z)(-x+y)] \\ & =(x+y+z)(x y+2 z x+2 y^{2}+4 y z+2 x y-2 y^{2}+z x-z y) \\ & =(x+y+z)(3 x y+3 z x+3 y z)=3(x+y+z)(x y+y z+z x) \end{aligned} $
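The factored answer $3(x+y+z)(xy+yz+zx)$ checks out numerically; here is a plain-Python sketch (not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

for x, y, z in [(1, 2, 3), (2, 0, -1)]:
    M = [[3 * x,  -x + y, -x + z],
         [x - y,  3 * y,  z - y ],
         [x - z,  y - z,  3 * z ]]
    assert det3(M) == 3 * (x + y + z) * (x * y + y * z + z * x)
print("det matches 3(x+y+z)(xy+yz+zx)")
```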

5. $ \begin{vmatrix} x+4 & x & x \\ x & x+4 & x \\ x & x & x+4\end{vmatrix} $

Solution

Let $\Delta= \begin{vmatrix} x+4 & x & x \\ x & x+4 & x \\ x & x & x+4\end{vmatrix} $

$ \begin{aligned} C_1 \to C_1 & +C_2+C_3 \\ & = \begin{vmatrix} 3 x+4 & x & x \\ 3 x+4 & x+4 & x \\ 3 x+4 & x & x+4 \end{vmatrix} \end{aligned} $

Taking $(3 x+4)$ common from $C_1$

$ =(3 x+4) \begin{vmatrix} 1 & x & x \\ 1 & x+4 & x \\ 1 & x & x+4 \end{vmatrix} $

$ R_1 \to R_1-R_2, R_2 \to R_2-R_3 $

$ =(3 x+4) \begin{vmatrix} 0 & -4 & 0 \\ 0 & 4 & -4 \\ 1 & x & x+4 \end{vmatrix} $

Expanding along $C_1$

$ =(3 x+4)[1 \begin{vmatrix} -4 & 0 \\ 4 & -4 \end{vmatrix} ]=(3 x+4)(16-0)=16(3 x+4) $
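A numerical spot-check of $16(3x+4)$, sketched in plain Python (not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

for x in (-1, 2, 5):
    M = [[x + 4, x,     x    ],
         [x,     x + 4, x    ],
         [x,     x,     x + 4]]
    assert det3(M) == 16 * (3 * x + 4)
print("det matches 16(3x + 4)")
```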

6. $ \begin{vmatrix} a-b-c & 2 a & 2 a \\ 2 b & b-c-a & 2 b \\ 2 c & 2 c & c-a-b\end{vmatrix} $

Solution

Let $\Delta= \begin{vmatrix} a-b-c & 2 a & 2 a \\ 2 b & b-c-a & 2 b \\ 2 c & 2 c & c-a-b\end{vmatrix} $

$ \begin{aligned} & R_1 \to R_1+R_2+R_3 \\ &= \begin{vmatrix} a+b+c & a+b+c & a+b+c \\ 2 b & b-c-a & 2 b \\ 2 c & 2 c & c-a-b \end{vmatrix} \end{aligned} $

Taking $(a+b+c)$ common from $R_1$

$ \begin{aligned} & =(a+b+c) \begin{vmatrix} 1 & 1 & 1 \\ 2 b & b-c-a & 2 b \\ 2 c & 2 c & c-a-b \end{vmatrix} \\ C_1 \to C_1-C_2, C_2 & \to C_2-C_3 \\ & =(a+b+c) \begin{vmatrix} 0 & 0 & 1 \\ b+c+a & -(b+c+a) & 2 b \\ 0 & a+b+c & c-a-b \end{vmatrix} \end{aligned} $

Taking $(b+c+a)$ from $C_1$ and $C_2$

$ =(a+b+c)^{3} \begin{vmatrix} 0 & 0 & 1 \\ 1 & -1 & 2 b \\ 0 & 1 & c-a-b \end{vmatrix} $

Expanding along $R_1$

$ =(a+b+c)^{3}[1 \begin{vmatrix} 1 & -1 \\ 0 & 1 \end{vmatrix} ]=(a+b+c)^{3} \text{. } $
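The cube $(a+b+c)^{3}$ can be verified at a few integer points (plain Python, not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

for a, b, c in [(1, 2, 3), (1, 0, -1), (2, 2, 2)]:
    M = [[a - b - c, 2 * a,     2 * a    ],
         [2 * b,     b - c - a, 2 * b    ],
         [2 * c,     2 * c,     c - a - b]]
    assert det3(M) == (a + b + c) ** 3
print("det matches (a + b + c)^3")
```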

Using the properties of determinants in Exercises 7 to 9, prove that:

7. $ \begin{vmatrix} y^{2} z^{2} & y z & y+z \\ z^{2} x^{2} & z x & z+x \\ x^{2} y^{2} & x y & x+y\end{vmatrix} =0$

Solution

$\quad$ L.H.S. $= \begin{vmatrix} y^{2} z^{2} & y z & y+z \\ z^{2} x^{2} & z x & z+x \\ x^{2} y^{2} & x y & x+y\end{vmatrix} $

$R_1 \to x R_1, R_2 \to y R_2, R_3 \to z R_3$ and dividing the determinant by $x y z$.

$ =\frac{1}{x y z} \begin{vmatrix} x y^{2} z^{2} & x y z & x y+z x \\ y z^{2} x^{2} & y z x & y z+x y \\ z x^{2} y^{2} & z x y & z x+z y \end{vmatrix} $

Taking $x y z$ common from $C_1$ and $C_2$

$ =\frac{x y z \cdot x y z}{x y z} \begin{vmatrix} y z & 1 & x y+z x \\ z x & 1 & y z+x y \\ x y & 1 & z x+z y \end{vmatrix} $

$C_3 \to C_3+C_1$

$ =x y z \begin{vmatrix} y z & 1 & x y+y z+z x \\ z x & 1 & x y+y z+z x \\ x y & 1 & x y+y z+z x \end{vmatrix} $

Taking $(x y+y z+z x)$ common from $C_3$

$ \begin{aligned} & =(x y z)(x y+y z+z x) \begin{vmatrix} y z & 1 & 1 \\ z x & 1 & 1 \\ x y & 1 & 1 \end{vmatrix} =0 \\ & {[\because \quad C_2 \text{ and } C_3 \text{ are identical }]} \end{aligned} $

L.H.S. $=$ R.H.S. Hence proved.
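That the determinant vanishes identically is easy to confirm numerically (plain Python sketch, not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

for x, y, z in [(1, 2, 3), (2, -1, 4)]:
    M = [[y**2 * z**2, y * z, y + z],
         [z**2 * x**2, z * x, z + x],
         [x**2 * y**2, x * y, x + y]]
    assert det3(M) == 0
print("determinant is identically 0")
```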

8. $ \begin{vmatrix} y+z & z & y \\ z & z+x & x \\ y & x & x+y\end{vmatrix} =4 x y z$

Solution

$\quad$ L.H.S. $= \begin{vmatrix} y+z & z & y \\ z & z+x & x \\ y & x & x+y\end{vmatrix} $

$C_1 \to C_1-(C_2+C_3)$

$ = \begin{vmatrix} 0 & z & y \\ -2 x & z+x & x \\ -2 x & x & x+y \end{vmatrix} $

Taking -2 common from $C_1$

$ \begin{aligned} & =-2 \begin{vmatrix} 0 & z & y \\ x & z+x & x \\ x & x & x+y \end{vmatrix} \\ R_2 \to R_2 & -R_3 \\ & =-2 \begin{vmatrix} 0 & z & y \\ 0 & z & -y \\ x & x & x+y \end{vmatrix} \end{aligned} $

Expanding along $C_1$

$ =-2[x \begin{vmatrix} z & y \\ z & -y \end{vmatrix} ]=-2[x(-z y-y z)]=-2(-2 x y z)=4 x y z= $ R.H.S.

L.H.S. $=$ R.H.S. Hence, proved.
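The identity can also be spot-checked numerically (plain Python, not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

for x, y, z in [(1, 2, 3), (2, 2, 5)]:
    M = [[y + z, z,     y    ],
         [z,     z + x, x    ],
         [y,     x,     x + y]]
    assert det3(M) == 4 * x * y * z
print("det matches 4xyz")
```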

9. $ \begin{vmatrix} a^{2}+2 a & 2 a+1 & 1 \\ 2 a+1 & a+2 & 1 \\ 3 & 3 & 1\end{vmatrix} =(a-1)^{3}$

Solution

L.H.S. $= \begin{vmatrix} a^{2}+2 a & 2 a+1 & 1 \\ 2 a+1 & a+2 & 1 \\ 3 & 3 & 1\end{vmatrix} $

$ \begin{aligned} & R_1 \to R_1-R_2, R_2 \to R_2-R_3 \\ &= \begin{vmatrix} a^{2}-1 & a-1 & 0 \\ 2 a-2 & a-1 & 0 \\ 3 & 3 & 1 \end{vmatrix} = \begin{vmatrix} (a+1)(a-1) & a-1 & 0 \\ 2(a-1) & a-1 & 0 \\ 3 & 3 & 1 \end{vmatrix} \end{aligned} $

Taking $(a-1)$ common from $C_1$ and $C_2$

$ =(a-1)(a-1) \begin{vmatrix} a+1 & 1 & 0 \\ 2 & 1 & 0 \\ 3 & 3 & 1 \end{vmatrix} $

Expanding along $C_3$

$ \begin{aligned} & =(a-1)^{2}[1 \begin{vmatrix} a+1 & 1 \\ 2 & 1 \end{vmatrix} ] \\ & =(a-1)^{2}(a+1-2)=(a-1)^{2}(a-1)=(a-1)^{3}= \text{ R.H.S. } \end{aligned} $

L.H.S. $=$ R.H.S.

Hence, proved.
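A quick numerical confirmation of $(a-1)^{3}$ (plain Python, not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

for a in (0, 3, 5):
    M = [[a**2 + 2 * a, 2 * a + 1, 1],
         [2 * a + 1,    a + 2,     1],
         [3,            3,         1]]
    assert det3(M) == (a - 1) ** 3
print("det matches (a - 1)^3")
```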

10. If $A+B+C=0$, then prove that

$ \begin{vmatrix} 1 & \cos C & \cos B \\ \cos C & 1 & \cos A \\ \cos B & \cos A & 1 \end{vmatrix} =0 $

Solution

L.H.S. $= \begin{vmatrix} 1 & \cos C & \cos B \\ \cos C & 1 & \cos A \\ \cos B & \cos A & 1\end{vmatrix} $

Expanding along $C_1$

$ \begin{aligned} =1 \begin{vmatrix} 1 & \cos A \\ \cos A & 1 \end{vmatrix} -\cos C \begin{vmatrix} \cos C & \cos B \\ \cos A & 1 \end{vmatrix} \\ +\cos B \begin{vmatrix} \cos C & \cos B \\ 1 & \cos A \end{vmatrix} \end{aligned} $

$ \begin{aligned} = & 1(1-\cos ^{2} A)-\cos C(\cos C-\cos A \cos B) \\ & +\cos B(\cos A \cos C-\cos B) \\ = & \sin ^{2} A-\cos ^{2} C+\cos A \cos B \cos C \\ & +\cos A \cos B \cos C-\cos ^{2} B \\ = & \sin ^{2} A-\cos ^{2} B-\cos ^{2} C+2 \cos A \cos B \cos C \\ = & -\cos (A+B) \cdot \cos (A-B)-\cos ^{2} C+2 \cos A \cos B \cos C \\ & \quad[\because \quad \sin ^{2} A-\cos ^{2} B=-\cos (A+B) \cdot \cos (A-B)] \\ = & -\cos (-C) \cdot \cos (A-B)+\cos C(2 \cos A \cos B-\cos C) \end{aligned} $

$ \begin{aligned} = & -\cos C(\cos A \cos B+\sin A \sin B) \\ & +\cos C(2 \cos A \cos B-\cos C) \\ = & -\cos C(\cos A \cos B+\sin A \sin B-2 \cos A \cos B+\cos C) \\ = & -\cos C(-\cos A \cos B+\sin A \sin B+\cos C) \\ = & \cos C(\cos A \cos B-\sin A \sin B-\cos C) \\ = & \cos C[\cos (A+B)-\cos C] \\ = & \cos C[\cos (-C)-\cos C] \\ = & \cos C[\cos C-\cos C]=\cos C \cdot 0=0 \text{ R.H.S. } \end{aligned} $

L.H.S. $=$ R.H.S.

Hence, proved.

11. If the coordinates of the vertices of an equilateral triangle with sides of length ’ $a$ ’ are $(x_1, y_1),(x_2, y_2)$ and $(x_3, y_3)$, then

$ \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} ^{2}=\frac{3 a^{4}}{4} $

Solution

Area of triangle whose vertices are $(x_1, y_1),(x_2, y_2)$ and $(x_3, y_3)$

$=\frac{1}{2} \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1\end{vmatrix} $

Let $\Delta=\frac{1}{2} \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1\end{vmatrix} \Rightarrow \Delta^{2}=\frac{1}{4} \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1\end{vmatrix} ^{2}$

But area of equilateral triangle whose side is ’ $a$ ’ $=\frac{\sqrt{3}}{4} a^{2}$

$ \therefore \quad(\frac{\sqrt{3}}{4} a^{2})^{2}=\frac{1}{4} \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} ^{2} $

$ \Rightarrow \frac{3}{16} a^{4}=\frac{1}{4} \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} ^{2} \Rightarrow \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} ^{2}=\frac{3}{16} a^{4} \times 4=\frac{3}{4} a^{4} $

Hence, proved.

12. Find the value of $\theta$ satisfying $ \begin{vmatrix} 1 & 1 & \sin 3 \theta \\ -4 & 3 & \cos 2 \theta \\ 7 & -7 & -2 \end{vmatrix} =0$.

Solution

Let $\quad \Delta= \begin{vmatrix} 1 & 1 & \sin 3 \theta \\ -4 & 3 & \cos 2 \theta \\ 7 & -7 & -2 \end{vmatrix} =0$

$C_1 \to C_1-C_2$

$\Rightarrow \quad \begin{vmatrix} 0 & 1 & \sin 3 \theta \\ -7 & 3 & \cos 2 \theta \\ 14 & -7 & -2\end{vmatrix} =0$

Taking 7 common from $C_1$

$ \begin{matrix} \Rightarrow & 7 \begin{vmatrix} 0 & 1 & \sin 3 \theta \\ -1 & 3 & \cos 2 \theta \\ 2 & -7 & -2 \end{vmatrix} =0 \\ \Rightarrow & \begin{vmatrix} 0 & 1 & \sin 3 \theta \\ -1 & 3 & \cos 2 \theta \\ 2 & -7 & -2 \end{vmatrix} =0 \end{matrix} $

Expanding along $C_1$

$ \begin{matrix} \Rightarrow & 1 \begin{vmatrix} 1 & \sin 3 \theta \\ -7 & -2 \end{vmatrix} +2 \begin{vmatrix} 1 & \sin 3 \theta \\ 3 & \cos 2 \theta \end{vmatrix} =0 \\ \Rightarrow & -2+7 \sin 3 \theta+2(\cos 2 \theta-3 \sin 3 \theta)=0 \\ \Rightarrow & -2+7 \sin 3 \theta+2 \cos 2 \theta-6 \sin 3 \theta=0 \\ \Rightarrow & -2+2 \cos 2 \theta+\sin 3 \theta=0 \\ \Rightarrow & -2+2(1-2 \sin ^{2} \theta)+3 \sin \theta-4 \sin ^{3} \theta=0 \\ \Rightarrow & -2+2-4 \sin ^{2} \theta+3 \sin \theta-4 \sin ^{3} \theta=0 \end{matrix} $

$ \begin{aligned} & \Rightarrow \quad-4 \sin ^{3} \theta-4 \sin ^{2} \theta+3 \sin \theta=0 \\ & \Rightarrow \quad-\sin \theta(4 \sin ^{2} \theta+4 \sin \theta-3)=0 \\ & \sin \theta=0 \quad \text{ or } \quad 4 \sin ^{2} \theta+4 \sin \theta-3=0 \\ & \therefore \quad \theta=n \pi \quad \text{ or } 4 \sin ^{2} \theta+6 \sin \theta-2 \sin \theta-3=0 \\ & \Rightarrow \quad 2 \sin \theta(2 \sin \theta+3)-1(2 \sin \theta+3)=0 \\ & \Rightarrow \quad(2 \sin \theta+3)(2 \sin \theta-1)=0 \\ & \Rightarrow \quad 2 \sin \theta+3=0 \quad \text{ or } \quad 2 \sin \theta-1=0 \\ & \sin \theta=\frac{-3}{2} \quad \text{ or } \quad \sin \theta=\frac{1}{2} \end{aligned} $

$\sin \theta=\frac{-3}{2}$ is not possible as $-1 \leq \sin \theta \leq 1$

$\therefore \quad \sin \theta=\frac{1}{2} \quad \Rightarrow \quad \sin \theta=\sin \frac{\pi}{6} \quad \Rightarrow \quad \theta=n \pi+(-1)^{n} \cdot \frac{\pi}{6}$

Hence, $\theta=n \pi \quad$ or $\quad n \pi+(-1)^{n} \frac{\pi}{6}$
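Both solution families can be verified numerically by substituting them back into the determinant (plain Python sketch, not part of the original solution):

```python
import math

def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

def f(theta):
    return det3([[1, 1, math.sin(3 * theta)],
                 [-4, 3, math.cos(2 * theta)],
                 [7, -7, -2]])

# both families of roots should make the determinant vanish
for n in range(-2, 3):
    assert abs(f(n * math.pi)) < 1e-9
    assert abs(f(n * math.pi + (-1)**n * math.pi / 6)) < 1e-9
print("theta = n*pi and n*pi + (-1)^n pi/6 both satisfy the equation")
```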

13. If $ \begin{vmatrix} 4-x & 4+x & 4+x \\ 4+x & 4-x & 4+x \\ 4+x & 4+x & 4-x \end{vmatrix} =0$, then find values of $x$.

Solution

Let $\quad \Delta= \begin{vmatrix} 4-x & 4+x & 4+x \\ 4+x & 4-x & 4+x \\ 4+x & 4+x & 4-x \end{vmatrix} =0$

$R_1 \to R_1+R_2+R_3$

$\Rightarrow \quad \begin{vmatrix} 12+x & 12+x & 12+x \\ 4+x & 4-x & 4+x \\ 4+x & 4+x & 4-x\end{vmatrix} =0$

Taking $(12+x)$ common from $R_1$,

$ \begin{aligned} & \Rightarrow \quad(12+x) \begin{vmatrix} 1 & 1 & 1 \\ 4+x & 4-x & 4+x \\ 4+x & 4+x & 4-x \end{vmatrix} =0 \\ & C_1 \to C_1-C_2, C_2 \to C_2-C_3 \end{aligned} $

$ (12+x) \begin{vmatrix} 0 & 0 & 1 \\ 2 x & -2 x & 4+x \\ 0 & 2 x & 4-x \end{vmatrix} =0 $

Expanding along $R_1$

$ \begin{aligned} & \Rightarrow \quad(12+x)[1 \cdot \begin{vmatrix} 2 x & -2 x \\ 0 & 2 x \end{vmatrix} ]=0 \\ & \Rightarrow \quad(12+x)(4 x^{2}-0)=0 \Rightarrow 4 x^{2}(12+x)=0 \\ & \Rightarrow \quad x=-12 \text{ or } x=0 \end{aligned} $
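The factorization $4x^{2}(12+x)$ and both roots can be checked numerically (plain Python, not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

def f(x):
    return det3([[4 - x, 4 + x, 4 + x],
                 [4 + x, 4 - x, 4 + x],
                 [4 + x, 4 + x, 4 - x]])

for x in (-12, -1, 0, 1, 3):
    assert f(x) == 4 * x**2 * (12 + x)  # matches the factored form
assert f(0) == 0 and f(-12) == 0        # the two roots
print("roots x = 0 and x = -12 confirmed")
```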

14. If $a_1, a_2, a_3, \ldots, a_r$ are in G.P., then prove that the determinant

$ \begin{vmatrix} a _{r+1} & a _{r+5} & a _{r+9} \\ a _{r+7} & a _{r+11} & a _{r+15} \\ a _{r+11} & a _{r+17} & a _{r+21} \end{vmatrix} \text{ is independent of } r \text{. } $

Solution

If $a_1, a_2, a_3, \ldots, a_r, \ldots$ are the terms of a G.P., then

$ a_n=AR^{n-1} $

(where $A$ is the first term and $R$ is the common ratio of the G.P. )

$ \begin{aligned} a _{r+1} & =AR^{r+1-1}=AR^{r} ; a _{r+5}=AR^{r+5-1}=AR^{r+4} \\ a _{r+9} & =AR^{r+9-1}=AR^{r+8} ; a _{r+7}=AR^{r+7-1}=AR^{r+6} \\ a _{r+11} & =AR^{r+11-1}=AR^{r+10} ; a _{r+15}=AR^{r+15-1}=AR^{r+14} \\ a _{r+17} & =AR^{r+17-1}=AR^{r+16} ; a _{r+21}=AR^{r+21-1}=AR^{r+20} \end{aligned} $

$\therefore \quad$ The determinant becomes

$ \begin{vmatrix} AR^{r} & AR^{r+4} & AR^{r+8} \\ AR^{r+6} & AR^{r+10} & AR^{r+14} \\ AR^{r+10} & AR^{r+16} & AR^{r+20} \end{vmatrix} $

Taking $AR^{r}, AR^{r+6}$ and $AR^{r+10}$ common from $R_1, R_2$ and $R_3$ respectively.

$ \begin{aligned} & AR^{r} \cdot AR^{r+6} \cdot AR^{r+10} \begin{vmatrix} 1 & R^{4} & R^{8} \\ 1 & R^{4} & R^{8} \\ 1 & R^{6} & R^{10} \end{vmatrix} \\ = & AR^{r} \cdot AR^{r+6} \cdot AR^{r+10} \times 0 \\ = & 0 \quad[\because \quad R_1 \text{ and } R_2 \text{ are identical rows }] \end{aligned} $

Hence, the given determinant is independent of $r$.

15. Show that the points $(a+5, a-4),(a-2, a+3)$ and $(a, a)$ do not lie on a straight line for any value of $a$.

Solution

If the given points lie on a straight line, then the area of the triangle formed by joining the points pairwise is zero.

So, $ \begin{vmatrix} a+5 & a-4 & 1 \\ a-2 & a+3 & 1 \\ a & a & 1\end{vmatrix} $

$R_1 \to R_1-R_2, R_2 \to R_2-R_3$

$\Rightarrow \quad \begin{vmatrix} 7 & -7 & 0 \\ -2 & 3 & 0 \\ a & a & 1\end{vmatrix} $

Expanding along $C_3$

$ 1 \cdot \begin{vmatrix} 7 & -7 \\ -2 & 3 \end{vmatrix} =21-14=7 $

As $7 \neq 0$, the area of the triangle is $\frac{1}{2}|7|=\frac{7}{2} \neq 0$. Hence, the three points do not lie on a straight line for any value of $a$.
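The determinant really is 7 regardless of $a$, as a quick Python check confirms (not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

for a in (-3, 0, 5):
    M = [[a + 5, a - 4, 1],
         [a - 2, a + 3, 1],
         [a,     a,     1]]
    assert det3(M) == 7
print("determinant is 7 for every a")
```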

16. Show that the $\triangle ABC$ is an isosceles triangle if the determinant

$ \Delta= \begin{vmatrix} 1 & 1 & 1 \\ 1+\cos A & 1+\cos B & 1+\cos C \\ \cos ^{2} A+\cos A & \cos ^{2} B+\cos B & \cos ^{2} C+\cos C \end{vmatrix} =0 . $

Solution

$ \begin{vmatrix} 1 & 1 & 1 \\ 1+\cos A & 1+\cos B & 1+\cos C \\ \cos ^{2} A+\cos A & \cos ^{2} B+\cos B & \cos ^{2} C+\cos C \end{vmatrix} =0 . $

$C_1 \to C_1-C_2, C_2 \to C_2-C_3$

$\Rightarrow \begin{vmatrix} 0 & 0 & 1 \\ \cos A-\cos B & \cos B-\cos C & 1+\cos C \\ (\cos ^{2} A+\cos A)-(\cos ^{2} B+\cos B) & (\cos ^{2} B+\cos B)-(\cos ^{2} C+\cos C) & \cos ^{2} C+\cos C \end{vmatrix} =0$

$\Rightarrow \begin{vmatrix} 0 & 0 & 1 \\ \cos A-\cos B & \cos B-\cos C & 1+\cos C \\ (\cos ^{2} A-\cos ^{2} B)+(\cos A-\cos B) & (\cos ^{2} B-\cos ^{2} C)+(\cos B-\cos C) & \cos ^{2} C+\cos C \end{vmatrix} =0$

$\Rightarrow \begin{vmatrix} 0 & 0 & 1 \\ \cos A-\cos B & \cos B-\cos C & 1+\cos C \\ (\cos A+\cos B)(\cos A-\cos B)+(\cos A-\cos B) & (\cos B+\cos C)(\cos B-\cos C)+(\cos B-\cos C) & \cos ^{2} C+\cos C \end{vmatrix} =0$

Taking $(\cos A-\cos B)$ and $(\cos B-\cos C)$ common from $C_1$ and $C_2$ respectively.

$\Rightarrow \quad(\cos A-\cos B)(\cos B-\cos C) \begin{vmatrix} 0 & 0 & 1 \\ 1 & 1 & 1+\cos C \\ \cos A+\cos B+1 & \cos B+\cos C+1 & \cos ^{2} C+\cos C \end{vmatrix} =0$

Expanding along $R_1$

$\Rightarrow(\cos A-\cos B)(\cos B-\cos C)[1 \begin{vmatrix} 1 & 1 \\ \cos A+\cos B+1 & \cos B+\cos C+1 \end{vmatrix} ]=0$

$\Rightarrow(\cos A-\cos B)(\cos B-\cos C)[(\cos B+\cos C+1)-(\cos A+\cos B+1)]=0$

$\Rightarrow(\cos A-\cos B)(\cos B-\cos C)[\cos B+\cos C+1-\cos A-\cos B-1]=0$

$\Rightarrow(\cos A-\cos B)(\cos B-\cos C)(\cos C-\cos A)=0$

$\cos A-\cos B=0$ or $\cos B-\cos C=0$

or $\cos C-\cos A=0$

$\Rightarrow \cos A=\cos B$ or $\cos B=\cos C$ or $\cos C=\cos A$

$\Rightarrow \angle A=\angle B$ or $\angle B=\angle C$ or $\angle C=\angle A$

Hence, $\triangle ABC$ is an isosceles triangle.

17. Find $A^{-1}$ if $A= \begin{bmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{bmatrix} $ and show that $A^{-1}=\frac{A^{2}-3 I}{2}$.

Solution

Here, $\quad A= \begin{bmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{bmatrix} $

$ \begin{aligned} |A| & =0 \begin{vmatrix} 0 & 1 \\ 1 & 0 \end{vmatrix} -1 \begin{vmatrix} 1 & 1 \\ 1 & 0 \end{vmatrix} +1 \begin{vmatrix} 1 & 0 \\ 1 & 1 \end{vmatrix} \\ & =0-1(0-1)+1(1-0) \\ & =1+1=2 \neq 0 \text{ (non-singular matrix.) } \end{aligned} $

Now, co-factors,

$ \begin{aligned} & a _{11}=+ \begin{vmatrix} 0 & 1 \\ 1 & 0 \end{vmatrix} =-1, \quad a _{12}=- \begin{vmatrix} 1 & 1 \\ 1 & 0 \end{vmatrix} =1, \quad a _{13}=+ \begin{vmatrix} 1 & 0 \\ 1 & 1 \end{vmatrix} =1 \\ & a _{21}=- \begin{vmatrix} 1 & 1 \\ 1 & 0 \end{vmatrix} =1, \quad a _{22}=+ \begin{vmatrix} 0 & 1 \\ 1 & 0 \end{vmatrix} =-1, \quad a _{23}=- \begin{vmatrix} 0 & 1 \\ 1 & 1 \end{vmatrix} =1 \end{aligned} $

$ \begin{aligned} & a _{31}=+ \begin{vmatrix} 1 & 1 \\ 0 & 1 \end{vmatrix} =1, \quad a _{32}=- \begin{vmatrix} 0 & 1 \\ 1 & 1 \end{vmatrix} =1, \quad a _{33}=+ \begin{vmatrix} 0 & 1 \\ 1 & 0 \end{vmatrix} =-1 \\ & Adj(A)= \begin{bmatrix} -1 & 1 & 1 \\ 1 & -1 & 1 \\ 1 & 1 & -1 \end{bmatrix} ^{\prime}= \begin{bmatrix} -1 & 1 & 1 \\ 1 & -1 & 1 \\ 1 & 1 & -1 \end{bmatrix} \\ & \therefore \quad A^{-1}=\frac{1}{|A|} Adj(A)=\frac{1}{2} \begin{bmatrix} -1 & 1 & 1 \\ 1 & -1 & 1 \\ 1 & 1 & -1 \end{bmatrix} \\ & \text{ Now, } \quad A^{2}=A \cdot A= \begin{bmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{bmatrix} \begin{bmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{bmatrix} \\ & = \begin{bmatrix} 0+1+1 & 0+0+1 & 0+1+0 \\ 0+0+1 & 1+0+1 & 1+0+0 \\ 0+1+0 & 1+0+0 & 1+1+0 \end{bmatrix} = \begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{bmatrix} \\ & \text{ Hence, } A^{2}= \begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{bmatrix} \end{aligned} $

Now, we have to prove that $A^{-1}=\frac{A^{2}-3 I}{2}$

$ \begin{aligned} \text{ R.H.S. } & =\frac{ \begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{bmatrix} -3 \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} }{2} \\ & =\frac{ \begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{bmatrix} - \begin{bmatrix} 3 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 3 \end{bmatrix} }{2}=\frac{1}{2} \begin{bmatrix} -1 & 1 & 1 \\ 1 & -1 & 1 \\ 1 & 1 & -1 \end{bmatrix} \\ & =A^{-1}=\text{ L.H.S. } \end{aligned} $

Hence, proved.
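The identity $A^{-1}=\frac{A^{2}-3I}{2}$ is equivalent to $A(A^{2}-3I)=2I$, which avoids fractions and is easy to confirm with integer arithmetic (plain Python sketch, not part of the original solution):

```python
def matmul(X, Y):
    # product of two 3x3 integer matrices
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

A2 = matmul(A, A)
# A^{-1} = (A^2 - 3I)/2  <=>  A (A^2 - 3I) = 2I
B = [[A2[i][j] - 3 * I[i][j] for j in range(3)] for i in range(3)]
assert matmul(A, B) == [[2, 0, 0], [0, 2, 0], [0, 0, 2]]
print("A (A^2 - 3I) = 2I verified")
```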

Long Answer Type Questions

18. If $A= \begin{bmatrix} 1 & 2 & 0 \\ -2 & -1 & -2 \\ 0 & -1 & 1 \end{bmatrix} $, find $A^{-1}$. Using $A^{-1}$, solve the system of linear equations $x-2 y=10,2 x-y-z=8,-2 y+z=7$.

Solution

Given that

$ \begin{aligned} A & = \begin{bmatrix} 1 & 2 & 0 \\ -2 & -1 & -2 \\ 0 & -1 & 1 \end{bmatrix} \\ |A| & =1 \begin{vmatrix} -1 & -2 \\ -1 & 1 \end{vmatrix} -2 \begin{vmatrix} -2 & -2 \\ 0 & 1 \end{vmatrix} +0 \begin{vmatrix} -2 & -1 \\ 0 & -1 \end{vmatrix} \\ & =1(-1-2)-2(-2-0)+0 \\ & =-3+4=1 \neq 0 \text{ (non-singular matrix.) } \end{aligned} $

Now co-factors,

$ \begin{aligned} & a _{11}=+ \begin{vmatrix} -1 & -2 \\ -1 & 1 \end{vmatrix} =-3, a _{12}=- \begin{vmatrix} -2 & -2 \\ 0 & 1 \end{vmatrix} =2, a _{13}=+ \begin{vmatrix} -2 & -1 \\ 0 & -1 \end{vmatrix} =2 \\ & a _{21}=- \begin{vmatrix} 2 & 0 \\ -1 & 1 \end{vmatrix} =-2, a _{22}=+ \begin{vmatrix} 1 & 0 \\ 0 & 1 \end{vmatrix} =1, a _{23}=- \begin{vmatrix} 1 & 2 \\ 0 & -1 \end{vmatrix} =1 \\ & a _{31}=+ \begin{vmatrix} 2 & 0 \\ -1 & -2 \end{vmatrix} =-4, a _{32}=- \begin{vmatrix} 1 & 0 \\ -2 & -2 \end{vmatrix} =2, a _{33}=+ \begin{vmatrix} 1 & 2 \\ -2 & -1 \end{vmatrix} =3 \\ & Adj(A)= \begin{bmatrix} -3 & 2 & 2 \\ -2 & 1 & 1 \\ -4 & 2 & 3 \end{bmatrix} ^{\prime}= \begin{bmatrix} -3 & -2 & -4 \\ 2 & 1 & 2 \\ 2 & 1 & 3 \end{bmatrix} \\ & \therefore \quad A^{-1}=\frac{1}{|A|} Adj(A)=\frac{1}{1} \begin{bmatrix} -3 & -2 & -4 \\ 2 & 1 & 2 \\ 2 & 1 & 3 \end{bmatrix} \\ & \Rightarrow \quad A^{-1}= \begin{bmatrix} -3 & -2 & -4 \\ 2 & 1 & 2 \\ 2 & 1 & 3 \end{bmatrix} \end{aligned} $

Now, the system of linear equations is given by $x-2 y=10$, $2 x-y-z=8$ and $-2 y+z=7$, which is in the form of $CX=D$.

$ \begin{bmatrix} 1 & -2 & 0 \\ 2 & -1 & -1 \\ 0 & -2 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 10 \\ 8 \\ 7 \end{bmatrix} $

where $C= \begin{bmatrix} 1 & -2 & 0 \\ 2 & -1 & -1 \\ 0 & -2 & 1\end{bmatrix} , X= \begin{bmatrix} x \\ y \\ z\end{bmatrix} $ and $D= \begin{bmatrix} 10 \\ 8 \\ 7 \end{bmatrix} $

$ \begin{aligned} & \because \quad(A^{T})^{-1}=(A^{-1})^{T} \\ & \therefore \quad C^{T}= \begin{bmatrix} 1 & 2 & 0 \\ -2 & -1 & -2 \\ 0 & -1 & 1 \end{bmatrix} =A \\ & \therefore \begin{bmatrix} x \\ y \\ z \end{bmatrix} =C^{-1} D \Rightarrow \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} -3 & 2 & 2 \\ -2 & 1 & 1 \\ -4 & 2 & 3 \end{bmatrix} \begin{bmatrix} 10 \\ 8 \\ 7 \end{bmatrix} \\ & \Rightarrow \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} -30+16+14 \\ -20+8+7 \\ -40+16+21 \end{bmatrix} \Rightarrow \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 0 \\ -5 \\ -3 \end{bmatrix} \end{aligned} $

Hence, $x=0, y=-5$ and $z=-3$
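Both the computed inverse and the solution can be double-checked numerically (plain Python; `matmul` is a local helper, not a library call):

```python
def matmul(X, Y):
    # product of two 3x3 integer matrices
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A    = [[1, 2, 0], [-2, -1, -2], [0, -1, 1]]
Ainv = [[-3, -2, -4], [2, 1, 2], [2, 1, 3]]
assert matmul(A, Ainv) == [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

# the solution of x - 2y = 10, 2x - y - z = 8, -2y + z = 7
x, y, z = 0, -5, -3
assert (x - 2 * y, 2 * x - y - z, -2 * y + z) == (10, 8, 7)
print("inverse and solution verified")
```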

19. Using matrix method, solve the system of equation

$ 3 x+2 y-2 z=3, x+2 y+3 z=6,2 x-y+z=2 \text{. } $

Solution

Given that

$ \begin{aligned} & 3 x+2 y-2 z=3 \\ & x+2 y+3 z=6 \\ & 2 x-y+z=2 \\ & A= \begin{bmatrix} 3 & 2 & -2 \\ 1 & 2 & 3 \\ 2 & -1 & 1 \end{bmatrix} \text{ and } B= \begin{bmatrix} 3 \\ 6 \\ 2 \end{bmatrix} \\ & |A|=3 \begin{vmatrix} 2 & 3 \\ -1 & 1 \end{vmatrix} -2 \begin{vmatrix} 1 & 3 \\ 2 & 1 \end{vmatrix} -2 \begin{vmatrix} 1 & 2 \\ 2 & -1 \end{vmatrix} \\ & =3(2+3)-2(1-6)-2(-1-4) \\ & =15+10+10=35 \neq 0 \text{ non-singular matrix } \end{aligned} $

Now, co-factors,

$ \begin{aligned} & a _{11}=+ \begin{vmatrix} 2 & 3 \\ -1 & 1 \end{vmatrix} =5, a _{12}=- \begin{vmatrix} 1 & 3 \\ 2 & 1 \end{vmatrix} =5, a _{13}=+ \begin{vmatrix} 1 & 2 \\ 2 & -1 \end{vmatrix} =-5 \\ & a _{21}=- \begin{vmatrix} 2 & -2 \\ -1 & 1 \end{vmatrix} =0, a _{22}=+ \begin{vmatrix} 3 & -2 \\ 2 & 1 \end{vmatrix} =7, a _{23}=- \begin{vmatrix} 3 & 2 \\ 2 & -1 \end{vmatrix} =7 \\ & a _{31}=+ \begin{vmatrix} 2 & -2 \\ 2 & 3 \end{vmatrix} =10, a _{32}=- \begin{vmatrix} 3 & -2 \\ 1 & 3 \end{vmatrix} =-11, a _{33}=+ \begin{vmatrix} 3 & 2 \\ 1 & 2 \end{vmatrix} =4 \\ & Adj(A)= \begin{bmatrix} 5 & 5 & -5 \\ 0 & 7 & 7 \\ 10 & -11 & 4 \end{bmatrix} ^{\prime}= \begin{bmatrix} 5 & 0 & 10 \\ 5 & 7 & -11 \\ -5 & 7 & 4 \end{bmatrix} \end{aligned} $

$ \begin{aligned} & \therefore \quad A^{-1}=\frac{1}{|A|} Adj(A)=\frac{1}{35} \begin{bmatrix} 5 & 0 & 10 \\ 5 & 7 & -11 \\ -5 & 7 & 4 \end{bmatrix} \\ & \text{ Now, } X=A^{-1} B \\ & \therefore \begin{bmatrix} x \\ y \\ z \end{bmatrix} =\frac{1}{35} \begin{bmatrix} 5 & 0 & 10 \\ 5 & 7 & -11 \\ -5 & 7 & 4 \end{bmatrix} \begin{bmatrix} 3 \\ 6 \\ 2 \end{bmatrix} =\frac{1}{35} \begin{bmatrix} 15+0+20 \\ 15+42-22 \\ -15+42+8 \end{bmatrix} =\frac{1}{35} \begin{bmatrix} 35 \\ 35 \\ 35 \end{bmatrix} \\ & { \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} } \end{aligned} $

Hence, $x=1, y=1$ and $z=1$.

20. If $A= \begin{bmatrix} 2 & 2 & -4 \\ -4 & 2 & -4 \\ 2 & -1 & 5\end{bmatrix} $ and $B= \begin{bmatrix} 1 & -1 & 0 \\ 2 & 3 & 4 \\ 0 & 1 & 2 \end{bmatrix} $, then find $B A$ and use this to solve the system of equations $y+2 z=7, x-y=3$ and $2 x+3 y+4 z=17$.

Solution

We have, $A= \begin{bmatrix} 2 & 2 & -4 \\ -4 & 2 & -4 \\ 2 & -1 & 5\end{bmatrix} $ and $B= \begin{bmatrix} 1 & -1 & 0 \\ 2 & 3 & 4 \\ 0 & 1 & 2 \end{bmatrix} $

$ \begin{aligned} BA & = \begin{bmatrix} 1 & -1 & 0 \\ 2 & 3 & 4 \\ 0 & 1 & 2 \end{bmatrix} \begin{bmatrix} 2 & 2 & -4 \\ -4 & 2 & -4 \\ 2 & -1 & 5 \end{bmatrix} \\ & = \begin{bmatrix} 2+4+0 & 2-2+0 & -4+4+0 \\ 4-12+8 & 4+6-4 & -8-12+20 \\ 0-4+4 & 0+2-2 & 0-4+10 \end{bmatrix} \\ & = \begin{bmatrix} 6 & 0 & 0 \\ 0 & 6 & 0 \\ 0 & 0 & 6 \end{bmatrix} =6 \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} =6 I \\ \therefore \quad B^{-1} & =\frac{1}{6} A=\frac{1}{6} \begin{bmatrix} 2 & 2 & -4 \\ -4 & 2 & -4 \\ 2 & -1 & 5 \end{bmatrix} \end{aligned} $

The given equations can be rewritten as,

$ \begin{aligned} & x-y=3,2 x+3 y+4 z=17 \text{ and } y+2 z=7 \\ & \therefore \quad \begin{bmatrix} 1 & -1 & 0 \\ 2 & 3 & 4 \\ 0 & 1 & 2 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 3 \\ 17 \\ 7 \end{bmatrix} \\ & \Rightarrow \quad \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 1 & -1 & 0 \\ 2 & 3 & 4 \\ 0 & 1 & 2 \end{bmatrix} ^{-1} \begin{bmatrix} 3 \\ 17 \\ 7 \end{bmatrix} \\ & =\frac{1}{6} \begin{bmatrix} 2 & 2 & -4 \\ -4 & 2 & -4 \\ 2 & -1 & 5 \end{bmatrix} \begin{bmatrix} 3 \\ 17 \\ 7 \end{bmatrix} \\ & =\frac{1}{6} \begin{bmatrix} 6+34-28 \\ -12+34-28 \\ 6-17+35 \end{bmatrix} =\frac{1}{6} \begin{bmatrix} 12 \\ -6 \\ 24 \end{bmatrix} = \begin{bmatrix} 2 \\ -1 \\ 4 \end{bmatrix} \end{aligned} $

Hence, $x=2, y=-1$ and $z=4$
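The key fact $BA=6I$ and the resulting solution can be confirmed numerically (plain Python sketch, not part of the original solution):

```python
def matmul(X, Y):
    # product of two 3x3 integer matrices
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 2, -4], [-4, 2, -4], [2, -1, 5]]
B = [[1, -1, 0], [2, 3, 4], [0, 1, 2]]
assert matmul(B, A) == [[6, 0, 0], [0, 6, 0], [0, 0, 6]]  # BA = 6I

# hence B^{-1} = A/6, and B [x y z]^T = (3, 17, 7) gives A(3, 17, 7)/6
rhs = [3, 17, 7]
sol = [sum(A[i][k] * rhs[k] for k in range(3)) / 6 for i in range(3)]
assert sol == [2, -1, 4]

x, y, z = sol
assert y + 2 * z == 7 and x - y == 3 and 2 * x + 3 * y + 4 * z == 17
print("BA = 6I and (x, y, z) = (2, -1, 4)")
```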

21. If $a+b+c \neq 0$ and $ \begin{vmatrix} a & b & c \\ b & c & a \\ c & a & b \end{vmatrix} =0$, then prove that $a=b=c$.

Solution

Given that: $a+b+c \neq 0$ and $ \begin{vmatrix} a & b & c \\ b & c & a \\ c & a & b \end{vmatrix} =0$

$C_1 \to C_1+C_2+C_3$

$ \begin{aligned} & \Rightarrow \quad \begin{vmatrix} a+b+c & b & c \\ a+b+c & c & a \\ a+b+c & a & b \end{vmatrix} =0 \\ & \Rightarrow \quad(a+b+c) \begin{vmatrix} 1 & b & c \\ 1 & c & a \\ 1 & a & b \end{vmatrix} =0 \quad \begin{matrix} \text{ (Taking } a+b+c \\ \text{ common from } C_1 \text{ ) } \end{matrix} \\ & \text{Since } a+b+c \neq 0, \quad \begin{vmatrix} 1 & b & c \\ 1 & c & a \\ 1 & a & b \end{vmatrix} =0 \end{aligned} $

$R_1 \to R_1-R_2, R_2 \to R_2-R_3$

$\Rightarrow \quad \begin{vmatrix} 0 & b-c & c-a \\ 0 & c-a & a-b \\ 1 & a & b\end{vmatrix} =0$

Expanding along $C_1$

$ \begin{aligned} & 1 \begin{vmatrix} b-c & c-a \\ c-a & a-b \end{vmatrix} =0 \\ & \Rightarrow \quad(b-c)(a-b)-(c-a)^{2}=0 \\ & \Rightarrow \quad a b-b^{2}-a c+b c-c^{2}-a^{2}+2 a c=0 \\ & \Rightarrow \quad-a^{2}-b^{2}-c^{2}+a b+b c+a c=0 \\ & \Rightarrow \quad a^{2}+b^{2}+c^{2}-a b-b c-a c=0 \\ & \Rightarrow \quad 2 a^{2}+2 b^{2}+2 c^{2}-2 a b-2 b c-2 a c=0 \end{aligned} $

(Multiplying both sides by 2 )

$\Rightarrow(a^{2}+b^{2}-2 a b)+(b^{2}+c^{2}-2 b c)+(a^{2}+c^{2}-2 a c)=0$

$\Rightarrow \quad(a-b)^{2}+(b-c)^{2}+(a-c)^{2}=0$

It is only possible when $(a-b)^{2}=(b-c)^{2}=(a-c)^{2}=0$

$\therefore a=b=c \quad$ Hence, proved.

22. Prove that $ \begin{vmatrix} b c-a^{2} & c a-b^{2} & a b-c^{2} \\ c a-b^{2} & a b-c^{2} & b c-a^{2} \\ a b-c^{2} & b c-a^{2} & a c-b^{2}\end{vmatrix} $ is divisible by $a+b+c$ and find the quotient.

Solution

Let $\Delta= \begin{vmatrix} b c-a^{2} & c a-b^{2} & a b-c^{2} \\ c a-b^{2} & a b-c^{2} & b c-a^{2} \\ a b-c^{2} & b c-a^{2} & a c-b^{2}\end{vmatrix} $

$C_1 \to C_1+C_2+C_3$

$\Rightarrow \quad \begin{vmatrix} a b+b c+a c-a^{2}-b^{2}-c^{2} & c a-b^{2} & a b-c^{2} \\ a b+b c+a c-a^{2}-b^{2}-c^{2} & a b-c^{2} & b c-a^{2} \\ a b+b c+a c-a^{2}-b^{2}-c^{2} & b c-a^{2} & a c-b^{2}\end{vmatrix} $

Taking $a b+b c+a c-a^{2}-b^{2}-c^{2}$ common from $C_1$

$(a b+b c+a c-a^{2}-b^{2}-c^{2}) \begin{vmatrix} 1 & c a-b^{2} & a b-c^{2} \\ 1 & a b-c^{2} & b c-a^{2} \\ 1 & b c-a^{2} & a c-b^{2}\end{vmatrix} $

$R_1 \to R_1-R_2$ and $R_2 \to R_2-R_3$

$ \begin{aligned} & \Rightarrow \quad(a b+b c+a c-a^{2}-b^{2}-c^{2}) \\ & \begin{vmatrix} 0 & c a-b^{2}-a b+c^{2} & a b-c^{2}-b c+a^{2} \\ 0 & a b-c^{2}-b c+a^{2} & b c-a^{2}-a c+b^{2} \\ 1 & b c-a^{2} & a c-b^{2} \end{vmatrix} \\ & \Rightarrow \quad(a b+b c+a c-a^{2}-b^{2}-c^{2}) \\ & \begin{vmatrix} 0 & a(c-b)+(c+b)(c-b) & b(a-c)+(a+c)(a-c) \\ 0 & b(a-c)+(a+c)(a-c) & c(b-a)+(b+a)(b-a) \\ 1 & b c-a^{2} & a c-b^{2} \end{vmatrix} \\ & \Rightarrow \quad(a b+b c+a c-a^{2}-b^{2}-c^{2}) \\ & \begin{vmatrix} 0 & (c-b)(a+b+c) & (a-c)(a+b+c) \\ 0 & (a-c)(a+b+c) & (b-a)(a+b+c) \\ 1 & b c-a^{2} & a c-b^{2} \end{vmatrix} \\ & \Rightarrow \quad(a b+b c+a c-a^{2}-b^{2}-c^{2})(a+b+c)(a+b+c) \\ & \begin{vmatrix} 0 & c-b & a-c \\ 0 & a-c & b-a \\ 1 & b c-a^{2} & a c-b^{2} \end{vmatrix} \\ & \Rightarrow \quad(a+b+c)^{2}(a b+b c+a c-a^{2}-b^{2}-c^{2}) \\ & \begin{vmatrix} 0 & c-b & a-c \\ 0 & a-c & b-a \\ 1 & b c-a^{2} & a c-b^{2} \end{vmatrix} \end{aligned} $

Expanding along $C_1$

$\Rightarrow(a+b+c)^{2}(a b+b c+a c-a^{2}-b^{2}-c^{2})[1 \begin{vmatrix} c-b & a-c \\ a-c & b-a\end{vmatrix} ]$

$\Rightarrow(a+b+c)^{2}(a b+b c+a c-a^{2}-b^{2}-c^{2})[(c-b)(b-a)-(a-c)^{2}]$

$\Rightarrow(a+b+c)^{2}(a b+b c+a c-a^{2}-b^{2}-c^{2})(b c-c a-b^{2}+a b-a^{2}-c^{2}+2 a c)$

$\Rightarrow(a+b+c)^{2}(a b+b c+a c-a^{2}-b^{2}-c^{2})(a b+b c+c a-a^{2}-b^{2}-c^{2})$

$\Rightarrow(a+b+c)^{2}(a b+b c+a c-a^{2}-b^{2}-c^{2})^{2}$

$\Rightarrow(a+b+c)(a+b+c)(a^{2}+b^{2}+c^{2}-a b-b c-a c)^{2}$

Hence, the given determinant is divisible by $a+b+c$ and the quotient is

$ \begin{aligned} & (a+b+c)(a^{2}+b^{2}+c^{2}-a b-b c-a c)^{2} \\ \Rightarrow \quad & (a+b+c)(a^{2}+b^{2}+c^{2}-a b-b c-a c)(a^{2}+b^{2}+c^{2}-a b-b c-a c) \\ \Rightarrow \quad & (a^{3}+b^{3}+c^{3}-3 a b c)(a^{2}+b^{2}+c^{2}-a b-b c-a c) \end{aligned} $

$ \begin{aligned} & \Rightarrow \quad \frac{1}{2}(a^{3}+b^{3}+c^{3}-3 a b c)(2 a^{2}+2 b^{2}+2 c^{2}-2 a b-2 b c-2 a c) \\ & \Rightarrow \quad \frac{1}{2}(a^{3}+b^{3}+c^{3}-3 a b c)[(a-b)^{2}+(b-c)^{2}+(a-c)^{2}] \end{aligned} $
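The factorization $(a+b+c)^{2}(a^{2}+b^{2}+c^{2}-ab-bc-ca)^{2}$, and hence the divisibility by $a+b+c$, can be spot-checked numerically (plain Python, not part of the original solution):

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row
    (m11, m12, m13), (m21, m22, m23), (m31, m32, m33) = m
    return (m11 * (m22 * m33 - m23 * m32)
            - m12 * (m21 * m33 - m23 * m31)
            + m13 * (m21 * m32 - m22 * m31))

for a, b, c in [(1, 2, 3), (2, -1, 4)]:
    M = [[b * c - a**2, c * a - b**2, a * b - c**2],
         [c * a - b**2, a * b - c**2, b * c - a**2],
         [a * b - c**2, b * c - a**2, c * a - b**2]]
    d = det3(M)
    s = a + b + c
    q = a**2 + b**2 + c**2 - a * b - b * c - c * a
    assert d == s**2 * q**2  # (a+b+c)^2 (a^2+b^2+c^2-ab-bc-ca)^2
    assert d % s == 0        # divisible by a+b+c (s is nonzero here)
print("factorization and divisibility verified")
```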

23. If $x+y+z=0$, prove that $ \begin{vmatrix} x a & y b & z c \\ y c & z a & x b \\ z b & x c & y a\end{vmatrix} =x y z \begin{vmatrix} a & b & c \\ c & a & b \\ b & c & a\end{vmatrix} $

Solution

L.H.S.

$ \text{ Let } \Delta= \begin{vmatrix} x a & y b & z c \\ y c & z a & x b \\ z b & x c & y a \end{vmatrix} $

Expanding along $R_1$

$ \begin{aligned} & \Rightarrow \quad x a \begin{vmatrix} z a & x b \\ x c & y a \end{vmatrix} -y b \begin{vmatrix} y c & x b \\ z b & y a \end{vmatrix} +z c \begin{vmatrix} y c & z a \\ z b & x c \end{vmatrix} \\ & \Rightarrow \quad x a(y z a^{2}-x^{2} b c)-y b(y^{2} a c-x z b^{2})+z c(x y c^{2}-z^{2} a b) \\ & \Rightarrow \quad x y z a^{3}-x^{3} a b c-y^{3} a b c+x y z b^{3}+x y z c^{3}-z^{3} a b c \\ & \Rightarrow \quad x y z(a^{3}+b^{3}+c^{3})-a b c(x^{3}+y^{3}+z^{3}) \\ & \Rightarrow \quad x y z(a^{3}+b^{3}+c^{3})-a b c(3 x y z) \\ & \quad[(\because \quad x+y+z=0)(\therefore \quad x^{3}+y^{3}+z^{3}=3 x y z)] \\ & \Rightarrow x y z(a^{3}+b^{3}+c^{3}-3 a b c) \end{aligned} $

R.H.S. $x y z \begin{vmatrix} a & b & c \\ c & a & b \\ b & c & a\end{vmatrix} $

$ R_1 \to R_1+R_2+R_3 $

$ \begin{aligned} & \Rightarrow \quad x y z \begin{vmatrix} a+b+c & a+b+c & a+b+c \\ c & a & b \\ b & c & a \end{vmatrix} \\ & \Rightarrow \quad x y z(a+b+c) \begin{vmatrix} 1 & 1 & 1 \\ c & a & b \\ b & c & a \end{vmatrix} \end{aligned} $

$C_1 \to C_1-C_2, C_2 \to C_2-C_3$

$\Rightarrow \quad x y z(a+b+c) \begin{vmatrix} 0 & 0 & 1 \\ c-a & a-b & b \\ b-c & c-a & a\end{vmatrix} $

Expanding along $R_1$ $$ \begin{aligned} & \Rightarrow \quad x y z(a+b+c)\left[1\left|\begin{array}{cc} c-a & a-b \\ b-c & c-a \end{array}\right|\right] \\ & \Rightarrow \quad x y z(a+b+c)\left[(c-a)^2-(b-c)(a-b)\right] \\ & \Rightarrow \quad x y z(a+b+c)\left(c^2+a^2-2 c a-a b+b^2+a c-b c\right) \\ & \Rightarrow \quad x y z(a+b+c)\left(a^2+b^2+c^2-a b-b c-c a\right) \\ & \Rightarrow \quad x y z\left(a^3+b^3+c^3-3 a b c\right) \\ & \quad\left[a^3+b^3+c^3-3 a b c=(a+b+c)\left(a^2+b^2+c^2-a b-b c-c a\right)\right] \end{aligned} $$ L.H.S. $=$ R.H.S.

Hence, proved.
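The identity can be cross-checked symbolically. The sketch below (assuming the sympy library is available) substitutes $z=-x-y$ to enforce the hypothesis $x+y+z=0$ and confirms the two determinants agree:

```python
import sympy as sp

a, b, c, x, y = sp.symbols('a b c x y')
z = -x - y  # enforce the hypothesis x + y + z = 0

lhs = sp.Matrix([[x*a, y*b, z*c],
                 [y*c, z*a, x*b],
                 [z*b, x*c, y*a]]).det()
rhs = x*y*z * sp.Matrix([[a, b, c],
                         [c, a, b],
                         [b, c, a]]).det()

diff = sp.simplify(lhs - rhs)
print(diff)  # 0
```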

Objective Type Questions (M.C.Q.)

24. If $ \begin{vmatrix} 2 x & 5 \\ 8 & x\end{vmatrix} = \begin{vmatrix} 6 & -2 \\ 7 & 3\end{vmatrix} $, then the value of $x$ is

(a) 3

(b) $\pm 3$

(c) $\pm 6$

(d) 6

Show Answer

Solution

Given that

$ \begin{aligned} & \Rightarrow \quad \begin{vmatrix} 2 x & 5 \\ 8 & x \end{vmatrix} = \begin{vmatrix} 6 & -2 \\ 7 & 3 \end{vmatrix} \\ & \Rightarrow \quad 2 x^{2}-40=18+14 \Rightarrow 2 x^{2}=32+40 \\ & \Rightarrow \quad 2 x^{2}=72 \Rightarrow x^{2}=36 \\ & \therefore \quad x= \pm 6 \end{aligned} $

Hence, the correct option is (c).
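A quick symbolic check of this equation (a sketch assuming sympy is installed) recovers both values of $x$:

```python
import sympy as sp

x = sp.Symbol('x')
# |2x 5; 8 x| = |6 -2; 7 3|  =>  2x^2 - 40 = 32
eq = sp.Eq(sp.Matrix([[2*x, 5], [8, x]]).det(),
           sp.Matrix([[6, -2], [7, 3]]).det())
roots = sp.solve(eq, x)
print(roots)  # [-6, 6]
```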

25. The value of determinant $ \begin{vmatrix} a-b & b+c & a \\ b-a & c+a & b \\ c-a & a+b & c\end{vmatrix} $ is

(a) $a^{3}+b^{3}+c^{3}$

(b) $3 b c$

(c) $a^{3}+b^{3}+c^{3}-3 a b c$

(d) None of these

Show Answer

Solution

Here, we have $ \begin{vmatrix} a-b & b+c & a \\ b-a & c+a & b \\ c-a & a+b & c\end{vmatrix} $

$C_2 \to C_2+C_3$

$ \begin{aligned} & \Rightarrow \begin{vmatrix} a-b & a+b+c & a \\ b-a & a+b+c & b \\ c-a & a+b+c & c \end{vmatrix} \\ & \Rightarrow \quad(a+b+c) \begin{vmatrix} a-b & 1 & a \\ b-a & 1 & b \\ c-a & 1 & c \end{vmatrix} \quad(\text{Taking } a+b+c \text{ common from } C_2) \\ & R_1 \to R_1-R_2, R_2 \to R_2-R_3 \\ & \Rightarrow \quad(a+b+c) \begin{vmatrix} 2(a-b) & 0 & a-b \\ b-c & 0 & b-c \\ c-a & 1 & c \end{vmatrix} \end{aligned} $

Taking $(a-b)$ and $(b-c)$ common from $R_1$ and $R_2$ respectively

$ \Rightarrow \quad(a+b+c)(a-b)(b-c) \begin{vmatrix} 2 & 0 & 1 \\ 1 & 0 & 1 \\ c-a & 1 & c \end{vmatrix} $

Expanding along $C_2$

$ \begin{aligned} & \Rightarrow \quad(a+b+c)(a-b)(b-c)[-1 \begin{vmatrix} 2 & 1 \\ 1 & 1 \end{vmatrix} ] \\ & \Rightarrow \quad(a+b+c)(a-b)(b-c)(-1) \\ & \Rightarrow \quad(a+b+c)(a-b)(c-b) \end{aligned} $

Hence, the correct option is $(d)$.
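The closed form $(a+b+c)(a-b)(c-b)$ can be verified symbolically; the snippet below is a sketch that assumes the sympy library is available:

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
d = sp.Matrix([[a - b, b + c, a],
               [b - a, c + a, b],
               [c - a, a + b, c]]).det()
# compare against the value derived above
diff = sp.simplify(d - (a + b + c)*(a - b)*(c - b))
print(diff)  # 0
```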

26. The area of a triangle with vertices $(-3,0),(3,0)$ and $(0, k)$ is 9 sq units. Then, the value of $k$ will be

(a) 9

(b) 3

(c) -9

(d) 6

Show Answer

Solution

Area of a triangle with vertices $(x_1, y_1),(x_2, y_2)$ and $(x_3, y_3)$ is given by:

$ \begin{matrix} \Delta & =\frac{1}{2} \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix} \Rightarrow \Delta=\frac{1}{2} \begin{vmatrix} -3 & 0 & 1 \\ 3 & 0 & 1 \\ 0 & k & 1 \end{vmatrix} \\ \Rightarrow \quad & =\frac{1}{2}[-3 \begin{vmatrix} 0 & 1 \\ k & 1 \end{vmatrix} -0 \begin{vmatrix} 3 & 1 \\ 0 & 1 \end{vmatrix} +1 \begin{vmatrix} 3 & 0 \\ 0 & k \end{vmatrix} ] \\ \Rightarrow \quad & =\frac{1}{2}[-3(-k)-0+1(3 k)] \\ \Rightarrow \quad & =\frac{1}{2}(3 k+3 k) \quad \Rightarrow \quad \frac{1}{2}(6 k)=3 k \end{matrix} $

Since the area is 9 sq units, $|3 k|=9 \Rightarrow k= \pm 3$; among the given options, $k=3$.

Hence, the correct option is (b).
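The area determinant can be evaluated symbolically as a check (a sketch assuming sympy is available, and taking $k>0$ as in the answer):

```python
import sympy as sp

k = sp.Symbol('k', positive=True)  # assume k > 0, matching the chosen option
area = sp.Rational(1, 2) * sp.Matrix([[-3, 0, 1],
                                      [3, 0, 1],
                                      [0, k, 1]]).det()
sol = sp.solve(sp.Eq(area, 9), k)
print(area, sol)  # 3*k [3]
```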

27. The determinant $ \begin{vmatrix} b^{2}-a b & b-c & b c-a c \\ a b-a^{2} & a-b & b^{2}-a b \\ b c-a c & c-a & a b-a^{2}\end{vmatrix} $ equals

(a) $a b c(b-c)(c-a)(a-b)$

(b) $(b-c)(c-a)(a-b)$

(c) $(a+b+c)(b-c)(c-a)(a-b)$

(d) None of these

Show Answer

Solution

Let

$ \begin{aligned} \Delta & = \begin{vmatrix} b^{2}-a b & b-c & b c-a c \\ a b-a^{2} & a-b & b^{2}-a b \\ b c-a c & c-a & a b-a^{2} \end{vmatrix} \\ & = \begin{vmatrix} b(b-a) & b-c & c(b-a) \\ a(b-a) & a-b & b(b-a) \\ c(b-a) & c-a & a(b-a) \end{vmatrix} \quad(\text{Taking }(b-a) \text{ common from } C_1 \text{ and } C_3) \\ & =(b-a)^{2} \begin{vmatrix} b & b-c & c \\ a & a-b & b \\ c & c-a & a \end{vmatrix} \end{aligned} $

$ C_1 \to C_1-C_3 $

$ \begin{aligned} & =(a-b)^{2} \begin{vmatrix} b-c & b-c & c \\ a-b & a-b & b \\ c-a & c-a & a \end{vmatrix} \quad[\because(b-a)^{2}=(a-b)^{2}] \\ & =(a-b)^{2} \cdot 0=0 \quad(\because C_1 \text{ and } C_2 \text{ are identical columns}) \end{aligned} $

Hence, the correct option is $(d)$.
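That the determinant vanishes identically can be confirmed symbolically (a sketch assuming sympy is installed):

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
d = sp.Matrix([[b**2 - a*b, b - c, b*c - a*c],
               [a*b - a**2, a - b, b**2 - a*b],
               [b*c - a*c, c - a, a*b - a**2]]).det()
print(sp.simplify(d))  # 0
```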

28. The number of distinct real roots of $ \begin{vmatrix} \sin x & \cos x & \cos x \\ \cos x & \sin x & \cos x \\ \cos x & \cos x & \sin x\end{vmatrix} =0$ in the interval $-\frac{\pi}{4} \leq x \leq \frac{\pi}{4}$ is

(a) 0

(b) 2

(c) 1

(d) 3

Show Answer

Solution

Given that

$ \begin{vmatrix} \sin x & \cos x & \cos x \\ \cos x & \sin x & \cos x \\ \cos x & \cos x & \sin x \end{vmatrix} =0 $

$C_1 \to C_1+C_2+C_3$

$ \Rightarrow \begin{vmatrix} 2 \cos x+\sin x & \cos x & \cos x \\ 2 \cos x+\sin x & \sin x & \cos x \\ 2 \cos x+\sin x & \cos x & \sin x \end{vmatrix} =0 $

Taking $2 \cos x+\sin x$ common from $C_1$

$ \begin{aligned} & \Rightarrow \quad(2 \cos x+\sin x) \begin{vmatrix} 1 & \cos x & \cos x \\ 1 & \sin x & \cos x \\ 1 & \cos x & \sin x \end{vmatrix} =0 \\ & R_1 \to R_1-R_2, R_2 \to R_2-R_3 \\ & \Rightarrow(2 \cos x+\sin x) \begin{vmatrix} 0 & \cos x-\sin x & 0 \\ 0 & \sin x-\cos x & \cos x-\sin x \\ 1 & \cos x & \sin x \end{vmatrix} =0 \\ & \Rightarrow(2 \cos x+\sin x)[1 \begin{vmatrix} \cos x-\sin x & 0 \\ \sin x-\cos x & \cos x-\sin x \end{vmatrix} ]=0 \\ & \Rightarrow \quad(2 \cos x+\sin x)(\cos x-\sin x)^{2}=0 \end{aligned} $

Either $2 \cos x+\sin x=0 \Rightarrow \tan x=-2$, which has no solution in $-\frac{\pi}{4} \leq x \leq \frac{\pi}{4}$ (since $|\tan x| \leq 1$ there), or $(\cos x-\sin x)^{2}=0 \Rightarrow \tan x=\tan \frac{\pi}{4} \Rightarrow x=\frac{\pi}{4} \in[\frac{-\pi}{4}, \frac{\pi}{4}]$.

So the equation has exactly one distinct real root, $x=\frac{\pi}{4}$.

Hence, the correct option is (c).
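The key factorisation $(2\cos x+\sin x)(\cos x-\sin x)^{2}$ used above can be verified symbolically (a sketch assuming sympy is available):

```python
import sympy as sp

x = sp.Symbol('x')
d = sp.Matrix([[sp.sin(x), sp.cos(x), sp.cos(x)],
               [sp.cos(x), sp.sin(x), sp.cos(x)],
               [sp.cos(x), sp.cos(x), sp.sin(x)]]).det()
factored = (2*sp.cos(x) + sp.sin(x)) * (sp.cos(x) - sp.sin(x))**2
diff = sp.simplify(d - factored)
print(diff)  # 0
```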

29. If $A, B$ and $C$ are angles of a triangle, then the determinant $ \begin{vmatrix} -1 & \cos C & \cos B \\ \cos C & -1 & \cos A \\ \cos B & \cos A & -1\end{vmatrix} $ is equal to

(a) 0

(b) -1

(c) 1

(d) None of these

Show Answer

Solution

Let $\Delta= \begin{vmatrix} -1 & \cos C & \cos B \\ \cos C & -1 & \cos A \\ \cos B & \cos A & -1\end{vmatrix} $

$C_1 \to a C_1+b C_2+c C_3$

$\Rightarrow \begin{vmatrix} -a+b \cos C+c \cos B & \cos C & \cos B \\ a \cos C-b+c \cos A & -1 & \cos A \\ a \cos B+b \cos A-c & \cos A & -1\end{vmatrix} $

$ \begin{aligned} & \Rightarrow \begin{vmatrix} -a+a & \cos C & \cos B \\ -b+b & -1 & \cos A \\ -c+c & \cos A & -1 \end{vmatrix} \quad \begin{bmatrix} \because \quad \text{ From projection formula } \\ a=b \cos C+c \cos B \\ b=a \cos C+c \cos A \\ c=b \cos A+a \cos B \end{bmatrix} \\ & \Rightarrow \begin{vmatrix} 0 & \cos C & \cos B \\ 0 & -1 & \cos A \\ 0 & \cos A & -1 \end{vmatrix} =0 \end{aligned} $

(Strictly, the operation $C_1 \to a C_1+b C_2+c C_3$ multiplies the determinant by $a$, so this gives $a \Delta=0$; since $a \neq 0$ for a triangle, $\Delta=0$.)

Hence, the correct option is (a).

30. Let $f(t)= \begin{vmatrix} \cos t & t & 1 \\ 2 \sin t & t & 2 t \\ \sin t & t & t \end{vmatrix} $, then $\lim _{t \to 0} \frac{f(t)}{t^{2}}$ is equal to

(a) 0

(b) -1

(c) 2

(d) 3

Show Answer

Solution

We have $f(t)= \begin{vmatrix} \cos t & t & 1 \\ 2 \sin t & t & 2 t \\ \sin t & t & t \end{vmatrix} $

Expanding along $R_1$

$ \begin{aligned} & =\cos t \begin{vmatrix} t & 2 t \\ t & t \end{vmatrix} -t \begin{vmatrix} 2 \sin t & 2 t \\ \sin t & t \end{vmatrix} +1 \begin{vmatrix} 2 \sin t & t \\ \sin t & t \end{vmatrix} \\ & =\cos t(t^{2}-2 t^{2})-t(2 t \sin t-2 t \sin t)+(2 t \sin t-t \sin t) \\ & =-t^{2} \cos t+t \sin t \\ & \therefore \quad \frac{f(t)}{t^{2}}=\frac{-t^{2} \cos t+t \sin t}{t^{2}} \\ & \Rightarrow \quad \frac{f(t)}{t^{2}}=-\cos t+\frac{\sin t}{t} \\ & \Rightarrow \quad \lim _{t \to 0} \frac{f(t)}{t^{2}}=\lim _{t \to 0}(-\cos t)+\lim _{t \to 0} \frac{\sin t}{t}=-1+1=0 \end{aligned} $

Hence, the correct option is (a).
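The limit can be checked directly with a computer algebra system; the snippet below is a sketch assuming sympy is available:

```python
import sympy as sp

t = sp.Symbol('t')
f = sp.Matrix([[sp.cos(t), t, 1],
               [2*sp.sin(t), t, 2*t],
               [sp.sin(t), t, t]]).det()
lim = sp.limit(f / t**2, t, 0)
print(lim)  # 0
```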

31. The maximum value of $\Delta= \begin{vmatrix} 1 & 1 & 1 \\ 1 & 1+\sin \theta & 1 \\ 1+\cos \theta & 1 & 1\end{vmatrix} $ is

( $\theta$ is real number)

(a) $\frac{1}{2}$

(b) $\frac{\sqrt{3}}{2}$

(c) $\sqrt{2}$

(d) $\frac{2 \sqrt{3}}{4}$

Show Answer

Solution

Given that: $\Delta= \begin{vmatrix} 1 & 1 & 1 \\ 1 & 1+\sin \theta & 1 \\ 1+\cos \theta & 1 & 1\end{vmatrix} $

$C_1 \to C_1-C_2, C_2 \to C_2-C_3$

$ = \begin{vmatrix} 0 & 0 & 1 \\ -\sin \theta & \sin \theta & 1 \\ \cos \theta & 0 & 1 \end{vmatrix} $

Expanding along $R_1$

$ \begin{aligned} & =1 \begin{vmatrix} -\sin \theta & \sin \theta \\ \cos \theta & 0 \end{vmatrix} =-\sin \theta \cos \theta \\ & =-\frac{1}{2} \cdot 2 \sin \theta \cos \theta=-\frac{1}{2} \sin 2 \theta \end{aligned} $

Since the minimum value of $\sin 2 \theta$ is $-1$, the maximum value of $-\frac{1}{2} \sin 2 \theta$ is $-\frac{1}{2} \cdot(-1)=\frac{1}{2}$.

Hence, the correct option is $(a)$.

32. If $f(x)= \begin{vmatrix} 0 & x-a & x-b \\ x+a & 0 & x-c \\ x+b & x+c & 0\end{vmatrix} $, then

(a) $f(a)=0$

(b) $f(b)=0$

(c) $f(0)=0$

(d) $f(1)=0$

Show Answer

Solution

Given that: $f(x)= \begin{vmatrix} 0 & x-a & x-b \\ x+a & 0 & x-c \\ x+b & x+c & 0\end{vmatrix} $

$ f(a)= \begin{vmatrix} 0 & 0 & a-b \\ 2 a & 0 & a-c \\ a+b & a+c & 0 \end{vmatrix} $

Expanding along $R_1=(a-b) \begin{vmatrix} 2 a & 0 \\ a+b & a+c\end{vmatrix} $

$ \begin{aligned} & =(a-b)[2 a(a+c)]=(a-b) \cdot 2 a \cdot(a+c) \neq 0 \\ f(b) & = \begin{vmatrix} 0 & b-a & 0 \\ b+a & 0 & b-c \\ 2 b & b+c & 0 \end{vmatrix} \end{aligned} $

Expanding along $R_1$

$ \begin{aligned} & -(b-a) \begin{vmatrix} b+a & b-c \\ 2 b & 0 \end{vmatrix} \\ & =-(b-a)[(-2 b)(b-c)]=2 b(b-a)(b-c) \neq 0 \\ f(0) & = \begin{vmatrix} 0 & -a & -b \\ a & 0 & -c \\ b & c & 0 \end{vmatrix} \end{aligned} $

$ \begin{matrix} \text{ Expanding along } R_1=a \begin{vmatrix} a & -c \\ b & 0 \end{vmatrix} -b \begin{vmatrix} a & 0 \\ b & c \end{vmatrix} \\ =a(b c)-b(a c)=a b c-a b c=0 \end{matrix} $

Hence, the correct option is (c).

33. If $A= \begin{bmatrix} 2 & \lambda & -3 \\ 0 & 2 & 5 \\ 1 & 1 & 3 \end{bmatrix} $, then $A^{-1}$ exists if

(a) $\lambda=2$

(b) $\lambda \neq 2$

(c) $\lambda \neq-2$

(d) None of these

Show Answer

Solution

We have,

$ A= \begin{bmatrix} 2 & \lambda & -3 \\ 0 & 2 & 5 \\ 1 & 1 & 3 \end{bmatrix} \Rightarrow|A|= \begin{vmatrix} 2 & \lambda & -3 \\ 0 & 2 & 5 \\ 1 & 1 & 3 \end{vmatrix} $

Expanding along $R_1=2 \begin{vmatrix} 2 & 5 \\ 1 & 3\end{vmatrix} -\lambda \begin{vmatrix} 0 & 5 \\ 1 & 3\end{vmatrix} -3 \begin{vmatrix} 0 & 2 \\ 1 & 1\end{vmatrix} $

$ \begin{aligned} & =2(6-5)-\lambda(0-5)-3(0-2) \\ & =2+5 \lambda+6=8+5 \lambda \end{aligned} $

If $A^{-1}$ exists then $|A| \neq 0$

$\therefore \quad 8+5 \lambda \neq 0$ so $\lambda \neq \frac{-8}{5}$

Hence, the correct option is $(d)$.
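The determinant and the excluded value of $\lambda$ can be recomputed symbolically as a check (a sketch assuming sympy is installed; `lam` is just a symbol name standing in for $\lambda$):

```python
import sympy as sp

lam = sp.Symbol('lam')  # stands in for lambda
A = sp.Matrix([[2, lam, -3],
               [0, 2, 5],
               [1, 1, 3]])
detA = sp.expand(A.det())
singular = sp.solve(sp.Eq(detA, 0), lam)
print(detA, singular)  # 5*lam + 8 [-8/5]
```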

34. If $A$ and $B$ are invertible matrices, then which of the following is not correct?

(a) $adj A=|A| \cdot A^{-1}$

(b) $det(A)^{-1}=[det(A)]^{-1}$

(c) $(AB)^{-1}=B^{-1} A^{-1}$

(d) $(A+B)^{-1}=B^{-1}+A^{-1}$

Show Answer

Solution

If $A$ and $B$ are two invertible matrices then

(a) $adj A=|A| \cdot A^{-1}$ is correct

(b) $det(A)^{-1}=[det(A)]^{-1}=\frac{1}{det(A)}$ is correct

(c) Also, $(AB)^{-1}=B^{-1} A^{-1}$ is correct

(d) $(A+B)^{-1}=\frac{1}{|A+B|} \cdot adj(A+B)$

$\therefore \quad(A+B)^{-1} \neq B^{-1}+A^{-1}$

Hence, the correct option is $(d)$.

35. If $x, y, z$ are all different from zero and $ \begin{vmatrix} 1+x & 1 & 1 \\ 1 & 1+y & 1 \\ 1 & 1 & 1+z\end{vmatrix} =0$, then the value of $x^{-1}+y^{-1}+z^{-1}$ is

(a) $x y z$

(b) $x^{-1} y^{-1} z^{-1}$

(c) $-x-y-z$

(d) -1

Show Answer

Solution

Given that

$ \begin{vmatrix} 1+x & 1 & 1 \\ 1 & 1+y & 1 \\ 1 & 1 & 1+z \end{vmatrix} =0 $

Taking $x, y$ and $z$ common from $R_1, R_2$ and $R_3$ respectively.

$\Rightarrow \quad x y z \begin{vmatrix} \frac{1}{x}+1 & \frac{1}{x} & \frac{1}{x} \\ \frac{1}{y} & \frac{1}{y}+1 & \frac{1}{y} \\ \frac{1}{z} & \frac{1}{z} & \frac{1}{z}+1\end{vmatrix} =0$

$R_1 \to R_1+R_2+R_3$

$\Rightarrow x y z \begin{vmatrix} \frac{1}{x}+\frac{1}{y}+\frac{1}{z}+1 & \frac{1}{x}+\frac{1}{y}+\frac{1}{z}+1 & \frac{1}{x}+\frac{1}{y}+\frac{1}{z}+1 \\ \frac{1}{y} & \frac{1}{y}+1 & \frac{1}{y} \\ \frac{1}{z} & \frac{1}{z} & \frac{1}{z}+1\end{vmatrix} =0$

Taking $\frac{1}{x}+\frac{1}{y}+\frac{1}{z}+1$ common from $R_1$

$\Rightarrow \quad x y z(\frac{1}{x}+\frac{1}{y}+\frac{1}{z}+1) \begin{vmatrix} 1 & 1 & 1 \\ \frac{1}{y} & \frac{1}{y}+1 & \frac{1}{y} \\ \frac{1}{z} & \frac{1}{z} & \frac{1}{z}+1\end{vmatrix} =0$

$C_1 \to C_1-C_2, C_2 \to C_2-C_3$

$\Rightarrow \quad x y z(\frac{1}{x}+\frac{1}{y}+\frac{1}{z}+1) \begin{vmatrix} 0 & 0 & 1 \\ -1 & 1 & \frac{1}{y} \\ 0 & -1 & \frac{1}{z}+1\end{vmatrix} =0$

Expanding along $R_1$

$ \begin{aligned} & \Rightarrow \quad x y z(\frac{1}{x}+\frac{1}{y}+\frac{1}{z}+1)[1 \begin{vmatrix} -1 & 1 \\ 0 & -1 \end{vmatrix} ]=0 \\ & \Rightarrow \quad x y z(\frac{1}{x}+\frac{1}{y}+\frac{1}{z}+1)(1)=0 \\ & \Rightarrow \quad \frac{1}{x}+\frac{1}{y}+\frac{1}{z}+1=0 \quad(\because x, y, z \neq 0 \Rightarrow x y z \neq 0) \\ & \therefore \quad x^{-1}+y^{-1}+z^{-1}=-1 \end{aligned} $

Hence, the correct option is $(d)$.
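The underlying expansion can be confirmed symbolically; the snippet below (a sketch assuming sympy is available) shows the determinant equals $x y z+x y+y z+z x$, which is $x y z(1+\frac{1}{x}+\frac{1}{y}+\frac{1}{z})$ for non-zero $x, y, z$:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
d = sp.Matrix([[1 + x, 1, 1],
               [1, 1 + y, 1],
               [1, 1, 1 + z]]).det()
# the determinant expands to xyz + xy + yz + zx
diff = sp.simplify(d - (x*y*z + x*y + y*z + z*x))
print(diff)  # 0
```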

36. The value of the determinant $ \begin{vmatrix} x & x+y & x+2 y \\ x+2 y & x & x+y \\ x+y & x+2 y & x\end{vmatrix} $ is

(a) $9 x^{2}(x+y)$

(b) $9 y^{2}(x+y)$

(c) $3 y^{2}(x+y)$

(d) $7 x^{2}(x+y)$

Show Answer

Solution

Let $\Delta= \begin{vmatrix} x & x+y & x+2 y \\ x+2 y & x & x+y \\ x+y & x+2 y & x\end{vmatrix} $

$C_1 \to C_1+C_2+C_3$

$= \begin{vmatrix} 3 x+3 y & x+y & x+2 y \\ 3 x+3 y & x & x+y \\ 3 x+3 y & x+2 y & x\end{vmatrix} $

$=(3 x+3 y) \begin{vmatrix} 1 & x+y & x+2 y \\ 1 & x & x+y \\ 1 & x+2 y & x\end{vmatrix} $

[Taking $(3 x+3 y)$ common from $C_1$ ]

$R_1 \to R_1-R_2, R_2 \to R_2-R_3$

$\Rightarrow 3(x+y) \begin{vmatrix} 0 & y & y \\ 0 & -2 y & y \\ 1 & x+2 y & x\end{vmatrix} $

Expanding along $C_1$

$\Rightarrow \quad 3(x+y)[1 \begin{vmatrix} y & y \\ -2 y & y\end{vmatrix} ]$

$ \Rightarrow 3(x+y)(y^{2}+2 y^{2}) \Rightarrow 3(x+y)(3 y^{2}) \Rightarrow 9 y^{2}(x+y) $

Hence, the correct option is $(b)$.
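As a symbolic cross-check of option (b), assuming the sympy library is available:

```python
import sympy as sp

x, y = sp.symbols('x y')
d = sp.Matrix([[x, x + y, x + 2*y],
               [x + 2*y, x, x + y],
               [x + y, x + 2*y, x]]).det()
diff = sp.simplify(d - 9*y**2*(x + y))
print(diff)  # 0
```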

37. There are two values of $a$ which make the determinant $\Delta= \begin{vmatrix} 1 & -2 & 5 \\ 2 & a & -1 \\ 0 & 4 & 2 a\end{vmatrix} =86$. The sum of these two values is

(a) 4

(b) 5

(c) -4

(d) 9

Show Answer

Solution

Given that, $\Delta= \begin{vmatrix} 1 & -2 & 5 \\ 2 & a & -1 \\ 0 & 4 & 2 a\end{vmatrix} =86$

Expanding along $C_1$

$ \begin{matrix} \Rightarrow & 1 \begin{vmatrix} a & -1 \\ 4 & 2 a \end{vmatrix} -2 \begin{vmatrix} -2 & 5 \\ 4 & 2 a \end{vmatrix} +0 \begin{vmatrix} -2 & 5 \\ a & -1 \end{vmatrix} =86 \\ \Rightarrow & (2 a^{2}+4)-2(-4 a-20) =86 \\ \Rightarrow & 2 a^{2}+4+8 a+40 =86 \\ \Rightarrow & 2 a^{2}+8 a+4+40-86 =0 \\ \Rightarrow & 2 a^{2}+8 a-42 =0 \\ \Rightarrow & a^{2}+4 a-21 =0 \\ \Rightarrow & a^{2}+7 a-3 a-21 =0 \\ \Rightarrow & a(a+7)-3(a+7) =0 \\ \Rightarrow & (a-3)(a+7) =0 \end{matrix} $

$\therefore \quad a=3,-7$

Required sum of the two numbers $=3-7=-4$.

Hence, the correct option is (c).
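The two values and their sum can be recovered symbolically (a sketch assuming sympy is installed):

```python
import sympy as sp

a = sp.Symbol('a')
d = sp.Matrix([[1, -2, 5],
               [2, a, -1],
               [0, 4, 2*a]]).det()
roots = sp.solve(sp.Eq(d, 86), a)
print(roots, sum(roots))  # the two values 3 and -7 sum to -4
```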

Fillers

38. If $A$ is a matrix of order $3 \times 3$, then $|3 A|=$ ……

Show Answer

Solution

We know that for a matrix of order $3 \times 3$,

$ |KA|=K^{3}|A| $

$ \therefore \quad|3 A|=3^{3}|A|=27|\mathbf{A}| $

39. If $A$ is an invertible matrix of order $3 \times 3$, then $|A^{-1}|=$ ……

Show Answer

Solution

We know that for an invertible matrix $A$ of any order, $|A^{-1}|=\frac{1}{|\mathbf{A}|}$.

40. If $x, y, z \in R$, then the value of determinant

$ \begin{vmatrix} (2^{x}+2^{-x})^{2} & (2^{x}-2^{-x})^{2} & 1 \\ (3^{x}+3^{-x})^{2} & (3^{x}-3^{-x})^{2} & 1 \\ (4^{x}+4^{-x})^{2} & (4^{x}-4^{-x})^{2} & 1 \end{vmatrix} \text{ is equal to }…… $

Show Answer

Solution

We have, $ \begin{vmatrix} (2^{x}+2^{-x})^{2} & (2^{x}-2^{-x})^{2} & 1 \\ (3^{x}+3^{-x})^{2} & (3^{x}-3^{-x})^{2} & 1 \\ (4^{x}+4^{-x})^{2} & (4^{x}-4^{-x})^{2} & 1\end{vmatrix} $

$ C_1 \to C_1-C_2 $

$ \begin{aligned} & \Rightarrow \begin{vmatrix} (2^{x}+2^{-x})^{2}-(2^{x}-2^{-x})^{2} & (2^{x}-2^{-x})^{2} & 1 \\ (3^{x}+3^{-x})^{2}-(3^{x}-3^{-x})^{2} & (3^{x}-3^{-x})^{2} & 1 \\ (4^{x}+4^{-x})^{2}-(4^{x}-4^{-x})^{2} & (4^{x}-4^{-x})^{2} & 1 \end{vmatrix} \\ & \Rightarrow \begin{vmatrix} 4 \cdot 2^{x} \cdot 2^{-x} & (2^{x}-2^{-x})^{2} & 1 \\ 4 \cdot 3^{x} \cdot 3^{-x} & (3^{x}-3^{-x})^{2} & 1 \\ 4 \cdot 4^{x} \cdot 4^{-x} & (4^{x}-4^{-x})^{2} & 1 \end{vmatrix} \quad[\because(a+b)^{2}-(a-b)^{2}=4 a b] \\ & \Rightarrow \begin{vmatrix} 4 & (2^{x}-2^{-x})^{2} & 1 \\ 4 & (3^{x}-3^{-x})^{2} & 1 \\ 4 & (4^{x}-4^{-x})^{2} & 1 \end{vmatrix} \\ & \Rightarrow 4 \begin{vmatrix} 1 & (2^{x}-2^{-x})^{2} & 1 \\ 1 & (3^{x}-3^{-x})^{2} & 1 \\ 1 & (4^{x}-4^{-x})^{2} & 1 \end{vmatrix} \end{aligned} $

(Taking 4 common from $C_1$ )

$\Rightarrow \quad 4 \cdot 0=0$

( $\because C_1$ and $C_3$ are identical columns)

41. If $\cos 2 \theta=0$, then $ \begin{vmatrix} 0 & \cos \theta & \sin \theta \\ \cos \theta & \sin \theta & 0 \\ \sin \theta & 0 & \cos \theta\end{vmatrix} ^{2}=$ ……

Show Answer

Solution

Given that: $\quad \cos 2 \theta=0$

$ \begin{aligned} \Rightarrow & & \cos 2 \theta & =\cos \frac{\pi}{2} \Rightarrow 2 \theta=\frac{\pi}{2} \\ & \therefore & \theta & =\frac{\pi}{4} \end{aligned} $

The determinant can be written as

$$ \begin{aligned} & \Rightarrow \begin{vmatrix} 0 & \cos \frac{\pi}{4} & \sin \frac{\pi}{4} \\ \cos \frac{\pi}{4} & \sin \frac{\pi}{4} & 0 \\ \sin \frac{\pi}{4} & 0 & \cos \frac{\pi}{4} \end{vmatrix} ^{2} \Rightarrow \begin{vmatrix} 0 & \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} & \frac{1}{\sqrt{2}} & 0 \\ \frac{1}{\sqrt{2}} & 0 & \frac{1}{\sqrt{2}} \end{vmatrix} ^{2} \\ & \Rightarrow\left[\frac{1}{\sqrt{2}} \cdot \frac{1}{\sqrt{2}} \cdot \frac{1}{\sqrt{2}} \begin{vmatrix} 0 & 1 & 1 \\ 1 & 1 & 0 \\ 1 & 0 & 1 \end{vmatrix} \right]^{2} \quad\left(\text{Taking } \frac{1}{\sqrt{2}} \text{ common from } C_1, C_2 \text{ and } C_3\right) \end{aligned} $$

Expanding along $C_1$,

$$ \Rightarrow\left[\frac{1}{2 \sqrt{2}}\left(-1 \begin{vmatrix} 1 & 1 \\ 0 & 1 \end{vmatrix} +1 \begin{vmatrix} 1 & 1 \\ 1 & 0 \end{vmatrix} \right)\right]^{2} \Rightarrow\left[\frac{1}{2 \sqrt{2}}(-1-1)\right]^{2} \Rightarrow \frac{1}{8} \cdot 4=\frac{\mathbf{1}}{\mathbf{2}} $$
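The value $\frac{1}{2}$ can be confirmed with exact symbolic arithmetic (a sketch assuming sympy is available):

```python
import sympy as sp

th = sp.pi / 4  # cos(2*theta) = 0 gives theta = pi/4
d = sp.Matrix([[0, sp.cos(th), sp.sin(th)],
               [sp.cos(th), sp.sin(th), 0],
               [sp.sin(th), 0, sp.cos(th)]]).det()
val = sp.simplify(d**2)
print(val)  # 1/2
```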

42. If $A$ is a matrix of order $3 \times 3$, then $(A^{2})^{-1}=$ ……

Show Answer

Solution

For any square matrix $A,(A^{2})^{-1}=(A^{-1})^{2}$.

43. If $A$ is a matrix of order $3 \times 3$, then the number of minors in the determinant of $A$ is ……

Show Answer

Solution

The order of a matrix is $3 \times 3$

$\therefore$ Total number of elements $=3 \times 3=9$

Hence, the number of minors in the determinant is 9.

44. The sum of the products of elements of any row with the co-factors of corresponding elements is equal to ……

Show Answer

Solution

The sum of the products of elements of any row with the co-factors of corresponding elements is equal to the value of the determinant of the given matrix.

Let $\Delta= \begin{vmatrix} a _{11} & a _{12} & a _{13} \\ a _{21} & a _{22} & a _{23} \\ a _{31} & a _{32} & a _{33}\end{vmatrix} $

Expanding along $R_1$

$ \begin{aligned} & a _{11} \begin{vmatrix} a _{22} & a _{23} \\ a _{32} & a _{33} \end{vmatrix} -a _{12} \begin{vmatrix} a _{21} & a _{23} \\ a _{31} & a _{33} \end{vmatrix} +a _{13} \begin{vmatrix} a _{21} & a _{22} \\ a _{31} & a _{32} \end{vmatrix} \\ & \Rightarrow a _{11} A _{11}+a _{12} A _{12}+a _{13} A _{13}=\Delta \end{aligned} $

(where $A _{11}, A _{12}$ and $A _{13}$ are the co-factors of the corresponding elements, $A _{ij}=(-1)^{i+j} M _{ij}$ with $M _{ij}$ the minors)

45. If $x=-9$ is a root of $ \begin{vmatrix} x & 3 & 7 \\ 2 & x & 2 \\ 7 & 6 & x\end{vmatrix} =0$, then other two roots are ……

Show Answer

Solution

We have, $ \begin{vmatrix} x & 3 & 7 \\ 2 & x & 2 \\ 7 & 6 & x\end{vmatrix} =0$

Expanding along $R_1$

$ \begin{aligned} & \Rightarrow \quad x \begin{vmatrix} x & 2 \\ 6 & x \end{vmatrix} -3 \begin{vmatrix} 2 & 2 \\ 7 & x \end{vmatrix} +7 \begin{vmatrix} 2 & x \\ 7 & 6 \end{vmatrix} =0 \\ & \Rightarrow \quad x(x^{2}-12)-3(2 x-14)+7(12-7 x)=0 \\ & \Rightarrow \quad x^{3}-12 x-6 x+42+84-49 x=0 \\ & \Rightarrow \quad x^{3}-67 x+126=0 \quad \ldots(1) \end{aligned} $

Any rational root must divide the constant term $126=2 \times 7 \times 9$. Moreover, since the coefficient of $x^{2}$ in eq. (1) is zero, the three roots sum to zero; as $x=-9$ is given as a root, the other two roots must sum to 9. Put $x=2$ in eq. (1):

$ (2)^{3}-67 \times 2+126 \Rightarrow 8-134+126=0 $

Hence, $x=\mathbf{2}$ is one of the other roots.

Now, put $x=7$ in eq. (1)

$ (7)^{3}-67(7)+126 \Rightarrow 343-469+126=0 $

Hence, $x=7$ is the other root. Thus, the other two roots are $\mathbf{2}$ and $\mathbf{7}$.
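The cubic and all three roots can be recovered at once symbolically (a sketch assuming the sympy library is available):

```python
import sympy as sp

x = sp.Symbol('x')
d = sp.Matrix([[x, 3, 7],
               [2, x, 2],
               [7, 6, x]]).det()
poly = sp.expand(d)
roots = sp.solve(sp.Eq(poly, 0), x)
print(poly, roots)  # x**3 - 67*x + 126, roots -9, 2 and 7
```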

46. $ \begin{vmatrix} 0 & x y z & x-z \\ y-x & 0 & y-z \\ z-x & z-y & 0\end{vmatrix} =$ ……

Show Answer

Solution

Let $\Delta= \begin{vmatrix} 0 & x y z & x-z \\ y-x & 0 & y-z \\ z-x & z-y & 0\end{vmatrix} $

$C_1 \to C_1-C_3$

$ = \begin{vmatrix} z-x & x y z & x-z \\ z-x & 0 & y-z \\ z-x & z-y & 0 \end{vmatrix} $

Taking $(z-x)$ common from $C_1$

$ =(z-x) \begin{vmatrix} 1 & x y z & x-z \\ 1 & 0 & y-z \\ 1 & z-y & 0 \end{vmatrix} $

$ R_1 \to R_1-R_2, R_2 \to R_2-R_3 $

$ =(z-x) \begin{vmatrix} 0 & x y z & x-y \\ 0 & y-z & y-z \\ 1 & z-y & 0 \end{vmatrix} $

Taking $(y-z)$ common from $R_2$

$ =(z-x)(y-z) \begin{vmatrix} 0 & x y z & x-y \\ 0 & 1 & 1 \\ 1 & z-y & 0 \end{vmatrix} $

Expanding along $C_1$

$ =(z-x)(y-z)[1 \begin{vmatrix} x y z & x-y \\ 1 & 1 \end{vmatrix} ] $

$ =(z-x)(y-z)(x y z-x+y)=(y-z)(z-x)(y-x+x y z) $
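The closed form can be double-checked symbolically (a sketch assuming sympy is installed):

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
d = sp.Matrix([[0, x*y*z, x - z],
               [y - x, 0, y - z],
               [z - x, z - y, 0]]).det()
diff = sp.simplify(d - (y - z)*(z - x)*(y - x + x*y*z))
print(diff)  # 0
```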

47. If $\quad f(x)= \begin{vmatrix} (1+x)^{17} & (1+x)^{19} & (1+x)^{23} \\ (1+x)^{23} & (1+x)^{29} & (1+x)^{34} \\ (1+x)^{41} & (1+x)^{43} & (1+x)^{47}\end{vmatrix} =A+B x+C x^{2}+\cdots$

then $A=$ ……

Show Answer

Solution

Given that

$ \begin{vmatrix} (1+x)^{17} & (1+x)^{19} & (1+x)^{23} \\ (1+x)^{23} & (1+x)^{29} & (1+x)^{34} \\ (1+x)^{41} & (1+x)^{43} & (1+x)^{47} \end{vmatrix} =A+B x+C x^{2}+\cdots $

Taking $(1+x)^{17},(1+x)^{23}$ and $(1+x)^{41}$ common from $R_1, R_2$ and $R_3$ respectively

$ (1+x)^{17} \cdot(1+x)^{23} \cdot(1+x)^{41} \begin{vmatrix} 1 & (1+x)^{2} & (1+x)^{6} \\ 1 & (1+x)^{6} & (1+x)^{11} \\ 1 & (1+x)^{2} & (1+x)^{6} \end{vmatrix} $

$\Rightarrow \quad(1+x)^{17} \cdot(1+x)^{23} \cdot(1+x)^{41} \cdot 0 \quad(\because R_1$ and $R_3$ are identical$)$

$\therefore \quad 0=A+B x+C x^{2}+\ldots$

By comparing the like terms, we get $A=\mathbf{0}$.
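In fact the determinant vanishes identically, so every coefficient in the expansion is zero. A symbolic check (a sketch assuming sympy is available):

```python
import sympy as sp

x = sp.Symbol('x')
f = sp.Matrix([[(1 + x)**17, (1 + x)**19, (1 + x)**23],
               [(1 + x)**23, (1 + x)**29, (1 + x)**34],
               [(1 + x)**41, (1 + x)**43, (1 + x)**47]]).det()
val = sp.simplify(f)
print(val)  # 0, so A (the constant term) is 0
```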

True/False

48. $(A^{3})^{-1}=(A^{-1})^{3}$, where $A$ is a square matrix and $|A| \neq 0$.

Show Answer

Solution

Since $(A^{K})^{-1}=(A^{-1})^{K}$ where $K \in N$

So,

$ (A^{3})^{-1}=(A^{-1})^{3} \text{ is true } $

49. $(a A)^{-1}=\frac{1}{a} A^{-1}$, where $a$ is any real number and $A$ is a square matrix.

Show Answer

Solution

If $A$ is a non-singular square matrix, then for any non-zero scalar $a$, $a A$ is invertible.

$\therefore \quad(a A) \cdot(\frac{1}{a} A^{-1})=a \cdot \frac{1}{a} \cdot A \cdot A^{-1}=I$

So, $\frac{1}{a} A^{-1}$ is the inverse of $(a A)$

$\Rightarrow \quad(a A)^{-1}=\frac{1}{a} A^{-1}$ is true.

50. $|A^{-1}| \neq|A|^{-1}$, where $A$ is a non-singular matrix.

Show Answer

Solution

False.

Since $|A^{-1}|=|A|^{-1}$ for a non-singular matrix.

51. If $A$ and $B$ are matrices of order 3 with $|A|=5$ and $|B|=3$, then $|3 AB|=27 \times 5 \times 3=405$.

Show Answer

Solution

True.

$ |3 AB|=3^{3}|AB|=27|A||B|=27 \times 5 \times 3=405 \quad[\because|KA|=K^{n}|A|, n=3] $
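Because $|3AB|$ depends only on the two determinants, any concrete matrices with $|A|=5$ and $|B|=3$ confirm the value; here is a sketch assuming sympy is available:

```python
import sympy as sp

# any concrete order-3 matrices with det A = 5 and det B = 3 will do
A = sp.diag(5, 1, 1)
B = sp.diag(3, 1, 1)
result = (3 * A * B).det()
print(result)  # 405
```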

52. If the value of a third order determinant is 12 , then the value of the determinant formed by replacing each element by its co-factor will be 144 .

Show Answer

Solution

True.

Since $|A|=12$

If $A$ is a square matrix of order $n$

then $\quad|Adj A|=|A|^{n-1}$

$\therefore \quad|Adj \, A|=|A|^{3-1}=|A|^{2}=(12)^{2}=144 \quad[\because n=3]$

53. $ \begin{vmatrix} x+1 & x+2 & x+a \\ x+2 & x+3 & x+b \\ x+3 & x+4 & x+c\end{vmatrix} =0$, where $a, b, c$ are in A.P.

Show Answer

Solution

True.

Let

$ \Delta= \begin{vmatrix} x+1 & x+2 & x+a \\ x+2 & x+3 & x+b \\ x+3 & x+4 & x+c \end{vmatrix} $

$R_2 \to 2 R_2-(R_1+R_3)$

$ = \begin{vmatrix} x+1 & x+2 & x+a \\ 0 & 0 & 2 b-(a+c) \\ x+3 & x+4 & x+c \end{vmatrix} $

Since $a, b, c$ are in A.P., $b-a=c-b \Rightarrow 2 b=a+c$, so every entry of $R_2$ is zero:

$ = \begin{vmatrix} x+1 & x+2 & x+a \\ 0 & 0 & 0 \\ x+3 & x+4 & x+c \end{vmatrix} =0 $

54. $|adj A|=|A|^{2}$, where $A$ is a square matrix of order two.

Show Answer

Solution

False.

Since $|adj A|=|A|^{n-1}$ where $n$ is the order of the square matrix.

55. The determinant $ \begin{vmatrix} \sin A & \cos A & \sin A+\cos B \\ \sin B & \cos A & \sin B+\cos B \\ \sin C & \cos A & \sin C+\cos B\end{vmatrix} $ is equal to zero.

Show Answer

Solution

True.

Let $\Delta= \begin{vmatrix} \sin A & \cos A & \sin A+\cos B \\ \sin B & \cos A & \sin B+\cos B \\ \sin C & \cos A & \sin C+\cos B\end{vmatrix} $

Splitting up $C_3$

$= \begin{vmatrix} \sin A & \cos A & \sin A \\ \sin B & \cos A & \sin B \\ \sin C & \cos A & \sin C\end{vmatrix} + \begin{vmatrix} \sin A & \cos A & \cos B \\ \sin B & \cos A & \cos B \\ \sin C & \cos A & \cos B\end{vmatrix} $

$=0+ \begin{vmatrix} \sin A & \cos A & \cos B \\ \sin B & \cos A & \cos B \\ \sin C & \cos A & \cos B\end{vmatrix} \quad[\because C_1$ and $C_3$ of the first determinant are identical$]$

$=\cos A \cos B \begin{vmatrix} \sin A & 1 & 1 \\ \sin B & 1 & 1 \\ \sin C & 1 & 1\end{vmatrix} $

[Taking $\cos A$ and $\cos B$ common from $C_2$ and $C_3$ respectively]

$=\cos A \cos B(0)=0 \quad[\because C_2$ and $C_3$ are identical$]$

56. If the determinant $ \begin{vmatrix} x+a & p+u & l+f \\ y+b & q+v & m+g \\ z+c & r+w & n+h\end{vmatrix} $ splits into exactly $K$ determinants of order 3 , each element of which contains only one term, then the value of $K$ is 8 .

Show Answer

Solution

True.

Let

$ \Delta= \begin{vmatrix} x+a & p+u & l+f \\ y+b & q+v & m+g \\ z+c & r+w & n+h \end{vmatrix} $

Splitting up $C_1$

$ \Rightarrow \begin{vmatrix} x & p+u & l+f \\ y & q+v & m+g \\ z & r+w & n+h \end{vmatrix} + \begin{vmatrix} a & p+u & l+f \\ b & q+v & m+g \\ c & r+w & n+h \end{vmatrix} $

Splitting up $C_2$ in both determinants

$ \begin{aligned} \Rightarrow & \begin{vmatrix} x & p & l+f \\ y & q & m+g \\ z & r & n+h \end{vmatrix} + \begin{vmatrix} x & u & l+f \\ y & v & m+g \\ z & w & n+h \end{vmatrix} + \begin{vmatrix} a & p & l+f \\ b & q & m+g \\ c & r & n+h \end{vmatrix} \\ & + \begin{vmatrix} a & u & l+f \\ b & v & m+g \\ c & w & n+h \end{vmatrix} \end{aligned} $

Similarly by splitting $C_3$ in each determinant, we will get 8 determinants.

57. Let

$ \begin{aligned} \Delta & = \begin{vmatrix} a & p & x \\ b & q & y \\ c & r & z \end{vmatrix} =16 \\ \text{then}\quad \Delta_1 & = \begin{vmatrix} p+x & a+x & a+p \\ q+y & b+y & b+q \\ r+z & c+z & c+r \end{vmatrix} =32 \end{aligned} $

Show Answer

Solution

True.

$ \begin{aligned} & \text{ Given that: } \\ & \Delta= \begin{vmatrix} a & p & x \\ b & q & y \\ c & r & z \end{vmatrix} =16 \\ & \text{ L.H.S. } \Delta_1= \begin{vmatrix} p+x & a+x & a+p \\ q+y & b+y & b+q \\ r+z & c+z & c+r \end{vmatrix} \\ & C_1 \to C_1+C_2+C_3 \\ & = \begin{vmatrix} 2 p+2 x+2 a & a+x & a+p \\ 2 q+2 y+2 b & b+y & b+q \\ 2 r+2 z+2 c & c+z & c+r \end{vmatrix} \\ & =2 \begin{vmatrix} p+x+a & a+x & a+p \\ q+y+b & b+y & b+q \\ r+z+c & c+z & c+r \end{vmatrix} \quad[\text{Taking } 2 \text{ common from } C_1] \end{aligned} $

$C_1 \to C_1-C_2$

$ =2 \begin{vmatrix} p & a+x & a+p \\ q & b+y & b+q \\ r & c+z & c+r \end{vmatrix} $

$C_3 \to C_3-C_1$

$ =2 \begin{vmatrix} p & a+x & a \\ q & b+y & b \\ r & c+z & c \end{vmatrix} $

Splitting up $C_2$

$ \begin{aligned} & =2 \begin{vmatrix} p & a & a \\ q & b & b \\ r & c & c \end{vmatrix} +2 \begin{vmatrix} p & x & a \\ q & y & b \\ r & z & c \end{vmatrix} =2(0)+2 \begin{vmatrix} p & x & a \\ q & y & b \\ r & z & c \end{vmatrix} \\ & =2 \begin{vmatrix} p & x & a \\ q & y & b \\ r & z & c \end{vmatrix} =2 \begin{vmatrix} a & p & x \\ b & q & y \\ c & r & z \end{vmatrix} \quad(\text{interchanging } C_1 \leftrightarrow C_3 \text{ and then } C_2 \leftrightarrow C_3; \text{ two swaps leave the sign unchanged}) \\ & =2 \times 16=32 \end{aligned} $

58. The maximum value of $\begin{vmatrix} 1 & 1 & 1 \\ 1 & (1+\sin \theta) & 1 \\ 1 & 1 & 1+\cos \theta \end{vmatrix}$ is $\frac{1}{2}$.

Show Answer

Solution

True.

Let $\Delta= \begin{vmatrix} 1 & 1 & 1 \\ 1 & (1+\sin \theta) & 1 \\ 1 & 1 & 1+\cos \theta\end{vmatrix} $

$C_1 \to C_1-C_2, C_2 \to C_2-C_3$

$ = \begin{vmatrix} 0 & 0 & 1 \\ -\sin \theta & \sin \theta & 1 \\ 0 & -\cos \theta & 1+\cos \theta \end{vmatrix} $

Expanding along $R_1$

$ \begin{aligned} & =1 \begin{vmatrix} -\sin \theta & \sin \theta \\ 0 & -\cos \theta \end{vmatrix} =\sin \theta \cos \theta-0=\sin \theta \cos \theta \\ & =\frac{1}{2} \cdot 2 \sin \theta \cos \theta=\frac{1}{2} \sin 2 \theta \\ & =\frac{1}{2} \times 1 \quad \text{ [Maximum value of } \sin 2 \theta=1 \text{ ] } \\ & =\frac{1}{2} \end{aligned} $
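Both the simplification to $\sin \theta \cos \theta$ and the maximum value $\frac{1}{2}$ can be verified symbolically; the sketch below assumes the sympy library (with its `calculus.util.maximum` helper) is available:

```python
import sympy as sp
from sympy.calculus.util import maximum

th = sp.Symbol('theta', real=True)
d = sp.Matrix([[1, 1, 1],
               [1, 1 + sp.sin(th), 1],
               [1, 1, 1 + sp.cos(th)]]).det()
diff = sp.simplify(d - sp.sin(th)*sp.cos(th))
print(diff, maximum(d, th))  # 0 1/2
```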


