Let A be an m×n matrix defined by
A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}
i.e., A = [a_{ij}].
The transpose of A is denoted by A^T and is defined by
A^T = \begin{pmatrix} a_{11} & a_{21} & \cdots & a_{m1} \\ a_{12} & a_{22} & \cdots & a_{m2} \\ \vdots & \vdots & & \vdots \\ a_{1n} & a_{2n} & \cdots & a_{mn} \end{pmatrix}
When m = n, we call A a square matrix.
Let A be an n×n square matrix with A^T = -A (a skew-symmetric matrix).
⇒ a_{ji} = -a_{ij} ∀ i, j
⇒ taking i = j: a_{ii} = -a_{ii}
⇒ a_{ii} = 0 ∀ i
⇒ the diagonal entries are zero.
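As a quick numerical check of the conclusion above, the sketch below uses a hypothetical 3×3 skew-symmetric matrix (the entries are illustrative, not from the notes):

```python
import numpy as np

# A hypothetical 3x3 skew-symmetric matrix (A^T = -A).
A = np.array([[ 0.0,  2.0, -1.0],
              [-2.0,  0.0,  4.0],
              [ 1.0, -4.0,  0.0]])

assert np.array_equal(A.T, -A)       # skew-symmetry: A^T = -A
assert np.all(np.diag(A) == 0)       # hence every diagonal entry is zero
```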
If c is a scalar then
c·A = [c·a_{ij}]
(A+B)⊤=A⊤+B⊤
(AB)⊤=B⊤A⊤
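The three rules above can be verified numerically; a minimal sketch with hypothetical random integer matrices:

```python
import numpy as np

# Hypothetical small matrices to illustrate the transpose rules.
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(2, 3))
C = rng.integers(-5, 5, size=(2, 3))   # same shape as A, for the sum rule
B = rng.integers(-5, 5, size=(3, 2))   # shapes compatible for the product AB
c = 3

assert np.array_equal((c * A).T, c * A.T)      # (cA)^T = c A^T
assert np.array_equal((A + C).T, A.T + C.T)    # (A+B)^T = A^T + B^T
assert np.array_equal((A @ B).T, B.T @ A.T)    # (AB)^T = B^T A^T
```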
n = 2: A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}
|A| = a_{11}a_{22} - a_{21}a_{12}
n = 3: A = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}
|A| = a_{11}\begin{vmatrix} a_{22} & a_{23} \\ a_{32} & a_{33} \end{vmatrix} - a_{12}\begin{vmatrix} a_{21} & a_{23} \\ a_{31} & a_{33} \end{vmatrix} + a_{13}\begin{vmatrix} a_{21} & a_{22} \\ a_{31} & a_{32} \end{vmatrix}
The sign attached to the cofactor of a_{ij} is given by (-1)^{i+j}.
Expanding along the third row of A = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}:
|A| = a_{31}\begin{vmatrix} a_{12} & a_{13} \\ a_{22} & a_{23} \end{vmatrix} - a_{32}\begin{vmatrix} a_{11} & a_{13} \\ a_{21} & a_{23} \end{vmatrix} + a_{33}\begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix}
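The fact that expansion along any row gives the same determinant can be checked with a short sketch; the function and the example matrix below are illustrative, not from the notes (0-based indices, so the sign is (-1)^{r+j}):

```python
import numpy as np

def det3_by_row(A, r):
    """Expand a 3x3 determinant along row r (0-based) using cofactors."""
    total = 0.0
    for j in range(3):
        # minor: delete row r and column j
        minor = np.delete(np.delete(A, r, axis=0), j, axis=1)
        total += (-1) ** (r + j) * A[r, j] * np.linalg.det(minor)
    return total

A = np.array([[2.0, 1.0, 3.0],
              [0.0, 4.0, 5.0],
              [1.0, 0.0, 6.0]])

for r in range(3):  # every row gives the same value
    assert np.isclose(det3_by_row(A, r), np.linalg.det(A))
```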
i) detA=detA⊤
ii) If any two rows (or columns) of a determinant are interchanged, then the sign of the determinant changes.
iii) If all the elements of a row (or column) of A are zero, then
detA = 0
iv) If any two rows (or columns) of a matrix A are identical, detA=0
v)
\begin{vmatrix} ka_{11} & ka_{12} & ka_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix} = k\begin{vmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix}
vi)
\begin{vmatrix} a_{11} & a_{12}+x & a_{13} \\ a_{21} & a_{22}+y & a_{23} \\ a_{31} & a_{32}+z & a_{33} \end{vmatrix} = \begin{vmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix} + \begin{vmatrix} a_{11} & x & a_{13} \\ a_{21} & y & a_{23} \\ a_{31} & z & a_{33} \end{vmatrix}
vii) det(AB)=detA⋅detB
viii) det(cA)=cndetA
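Properties i), ii), vii) and viii) can all be spot-checked numerically; a minimal sketch with a hypothetical random matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
c = 2.0
det = np.linalg.det

assert np.isclose(det(A), det(A.T))                   # i)  det A = det A^T
assert np.isclose(det(A[[1, 0, 2]]), -det(A))         # ii) swapping two rows flips the sign
assert np.isclose(det(A @ B), det(A) * det(B))        # vii) det(AB) = det A * det B
assert np.isclose(det(c * A), c ** n * det(A))        # viii) det(cA) = c^n det A
```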
Let A be an n×n matrix. Its inverse is defined by A^{-1} = (1/detA)·adjA, where detA ≠ 0, i.e. A^{-1} exists only if detA ≠ 0.
We call an invertible matrix a non-singular matrix.
adjA = transpose of the cofactor matrix.
For A = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}, the cofactor matrix is \begin{pmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{pmatrix}, so adjA = \begin{pmatrix} A_{11} & A_{21} & A_{31} \\ A_{12} & A_{22} & A_{32} \\ A_{13} & A_{23} & A_{33} \end{pmatrix}
A_{ij} = (-1)^{i+j} × determinant of the submatrix obtained by deleting the i-th row and j-th column.
e.g.,
A_{11} = \begin{vmatrix} a_{22} & a_{23} \\ a_{32} & a_{33} \end{vmatrix}, \quad A_{12} = -\begin{vmatrix} a_{21} & a_{23} \\ a_{31} & a_{33} \end{vmatrix}
Similarly, the other A_{ij} can be calculated.
det(A^{-1}) = 1/detA:
AA^{-1} = I
det(AA^{-1}) = det I = 1
detA · det(A^{-1}) = 1
⇒ det(A^{-1}) = 1/detA
det(adjA) = (detA)^{n-1}, where A is an n×n matrix.
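The adjugate identities above can be checked with a short sketch; the `adjugate` helper and the example matrix are illustrative, not from the notes:

```python
import numpy as np

def adjugate(A):
    """adj A = transpose of the cofactor matrix."""
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n, d = A.shape[0], np.linalg.det(A)

assert np.allclose(adjugate(A) / d, np.linalg.inv(A))          # A^{-1} = adj A / det A
assert np.isclose(np.linalg.det(adjugate(A)), d ** (n - 1))    # det(adj A) = (det A)^{n-1}
assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1.0 / d)    # det(A^{-1}) = 1/det A
```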
Let M be a 3×3 matrix satisfying
M\begin{pmatrix}0\\1\\0\end{pmatrix} = \begin{pmatrix}-1\\2\\3\end{pmatrix}, \quad M\begin{pmatrix}1\\-1\\0\end{pmatrix} = \begin{pmatrix}1\\1\\-1\end{pmatrix}, \quad M\begin{pmatrix}1\\1\\1\end{pmatrix} = \begin{pmatrix}0\\0\\12\end{pmatrix}
Then what is the sum of the diagonal entries of M?
Solution: Write M = \begin{pmatrix} m_{11} & m_{12} & m_{13} \\ m_{21} & m_{22} & m_{23} \\ m_{31} & m_{32} & m_{33} \end{pmatrix}.
We need to find the value of m_{11} + m_{22} + m_{33}.
M\begin{pmatrix}0\\1\\0\end{pmatrix} = \begin{pmatrix}-1\\2\\3\end{pmatrix} ⇒ \begin{pmatrix}m_{12}\\m_{22}\\m_{32}\end{pmatrix} = \begin{pmatrix}-1\\2\\3\end{pmatrix}
⇒ m_{12} = -1, m_{22} = 2, m_{32} = 3
M\begin{pmatrix}1\\-1\\0\end{pmatrix} = \begin{pmatrix}1\\1\\-1\end{pmatrix} ⇒ \begin{pmatrix}m_{11}-m_{12}\\m_{21}-m_{22}\\m_{31}-m_{32}\end{pmatrix} = \begin{pmatrix}1\\1\\-1\end{pmatrix}
⇒ m_{11} - m_{12} = 1 ⇒ m_{11} = 1 + m_{12} = 1 - 1 = 0
m_{21} - m_{22} = 1 ⇒ m_{21} = 1 + 2 = 3
m_{31} - m_{32} = -1 ⇒ m_{31} = -1 + m_{32} = -1 + 3 = 2
M\begin{pmatrix}1\\1\\1\end{pmatrix} = \begin{pmatrix}0\\0\\12\end{pmatrix} ⇒ \begin{pmatrix}m_{11}+m_{12}+m_{13}\\m_{21}+m_{22}+m_{23}\\m_{31}+m_{32}+m_{33}\end{pmatrix} = \begin{pmatrix}0\\0\\12\end{pmatrix}
m_{31} + m_{32} + m_{33} = 12
⇒ m_{33} = 12 - m_{31} - m_{32} = 12 - 2 - 3 = 7
∴ m_{11} + m_{22} + m_{33} = 0 + 2 + 7 = 9
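The answer can be confirmed numerically: stacking the three given input vectors as columns of a matrix V and the outputs as columns of W gives MV = W, so M = WV^{-1} (V is invertible here). A sketch:

```python
import numpy as np

# Input vectors as columns of V, output vectors as columns of W: M @ V = W.
V = np.array([[0.0,  1.0, 1.0],
              [1.0, -1.0, 1.0],
              [0.0,  0.0, 1.0]])
W = np.array([[-1.0,  1.0,  0.0],
              [ 2.0,  1.0,  0.0],
              [ 3.0, -1.0, 12.0]])

M = W @ np.linalg.inv(V)              # recover M
assert np.isclose(np.trace(M), 9.0)   # m11 + m22 + m33 = 0 + 2 + 7 = 9
```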
Let ω ≠ 1 be a cube root of unity and S be the set of all non-singular matrices of the form
\begin{pmatrix} 1 & a & b \\ ω & 1 & c \\ ω^2 & ω & 1 \end{pmatrix}
where each of a, b, c is either ω or ω^2. Then what is the cardinality of S?
Solution: S is the set of all non-singular matrices of the given form, so we need
\begin{vmatrix} 1 & a & b \\ ω & 1 & c \\ ω^2 & ω & 1 \end{vmatrix} ≠ 0
Expanding along the first row:
1(1 - ωc) - a(ω - ω^2 c) + b(ω^2 - ω^2) = (1 - ωc) - aω(1 - ωc)
= (1 - ωc)(1 - aω) ≠ 0
⇒ 1 - ωc ≠ 0 and 1 - aω ≠ 0
Since ω^3 = 1 and each of a, c can take only the value ω or ω^2, aω = 1 exactly when a = ω^2, and ωc = 1 exactly when c = ω^2.
⇒ a = ω, c = ω,
whereas b can be either ω or ω^2.
S = \left\{ \begin{pmatrix} 1 & ω & ω \\ ω & 1 & ω \\ ω^2 & ω & 1 \end{pmatrix}, \begin{pmatrix} 1 & ω & ω^2 \\ ω & 1 & ω \\ ω^2 & ω & 1 \end{pmatrix} \right\}
⇒ Cardinality of S = 2
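The count can be verified by brute force over all eight choices of (a, b, c), using ω = e^{2πi/3} as the cube root of unity; a sketch:

```python
import itertools
import numpy as np

w = np.exp(2j * np.pi / 3)   # a primitive cube root of unity, w != 1

count = 0
for a, b, c in itertools.product([w, w**2], repeat=3):
    A = np.array([[1,    a, b],
                  [w,    1, c],
                  [w**2, w, 1]])
    if not np.isclose(np.linalg.det(A), 0):   # non-singular
        count += 1

assert count == 2   # cardinality of S
```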
Let P = [a_{ij}] be a 3×3 matrix and let Q = [b_{ij}], where
b_{ij} = 2^{i+j} a_{ij}, 1 ≤ i, j ≤ 3.
If the determinant of P is 2, what is the value of detQ?
Solution: Q = \begin{pmatrix} 4a_{11} & 8a_{12} & 16a_{13} \\ 8a_{21} & 16a_{22} & 32a_{23} \\ 16a_{31} & 32a_{32} & 64a_{33} \end{pmatrix}
detQ = \begin{vmatrix} 4a_{11} & 8a_{12} & 16a_{13} \\ 8a_{21} & 16a_{22} & 32a_{23} \\ 16a_{31} & 32a_{32} & 64a_{33} \end{vmatrix} = 4 × 8 × 16 \begin{vmatrix} a_{11} & 2a_{12} & 4a_{13} \\ a_{21} & 2a_{22} & 4a_{23} \\ a_{31} & 2a_{32} & 4a_{33} \end{vmatrix}
Taking 4, 8, 16 out of the three rows, then 2 and 4 out of the second and third columns:
detQ = 2^2 × 2^3 × 2^4 × 2 × 2^2 × \begin{vmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix} = 2^{12} × detP
⇒ detQ = 2^{12} × 2 = 2^{13}
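The ratio detQ/detP = 2^{12} holds for any 3×3 matrix P, which a sketch with a hypothetical random P can confirm (indices are 0-based in the code, hence the +2 in the exponent):

```python
import numpy as np

rng = np.random.default_rng(2)
P = rng.standard_normal((3, 3))

# b_ij = 2^{i+j} a_ij with 1-based i, j  ->  exponent i+j+2 for 0-based indices
Q = np.array([[2 ** (i + j + 2) * P[i, j] for j in range(3)] for i in range(3)])

ratio = np.linalg.det(Q) / np.linalg.det(P)
assert np.isclose(ratio, 2 ** 12)   # so det P = 2 gives det Q = 2^13
```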
Let P be a 3×3 matrix such that P^T = 2P + I, where I is the 3×3 identity matrix. Consider a column matrix X = \begin{pmatrix}x\\y\\z\end{pmatrix} ≠ \begin{pmatrix}0\\0\\0\end{pmatrix}.
Then which of the following is true?
i) PX = \begin{pmatrix}0\\0\\0\end{pmatrix}  ii) PX = X
iii) PX = 2X  iv) PX = -X
Solution: P^T = 2P + I … (1)
Taking the transpose of both sides: (P^T)^T = (2P + I)^T
P = 2P^T + I … (2)
Subtracting (2) from (1): P^T - P = 2(P - P^T)
⇒ 3(P^T - P) = 0 (the zero matrix)
⇒ P^T = P
From (1): P = 2P + I ⇒ P = -I
i) PX = \begin{pmatrix}0\\0\\0\end{pmatrix} ⇒ -IX = \begin{pmatrix}0\\0\\0\end{pmatrix} ⇒ X = \begin{pmatrix}0\\0\\0\end{pmatrix}, but X ≠ \begin{pmatrix}0\\0\\0\end{pmatrix}
⇒ i) is not true.
Since P = -I, PX = -IX = -X, so iv) is true.
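The conclusion P = -I can be checked directly against the given relation, using a hypothetical non-zero column X; a sketch:

```python
import numpy as np

P = -np.eye(3)   # the matrix forced by P^T = 2P + I
assert np.array_equal(P.T, 2 * P + np.eye(3))   # satisfies the given relation

X = np.array([[1.0], [2.0], [3.0]])   # any non-zero column works
assert np.array_equal(P @ X, -X)      # PX = -X, i.e. option iv)
```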