A matrix A satisfies Aᵀ = −A and is 4×4; det(A) can be
A Always zero
B Any real
C Possibly nonzero
D Always one
A skew-symmetric matrix of even order can have nonzero determinant (unlike odd order where det must be 0). So a 4×4 skew-symmetric matrix may be invertible.
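The even-order claim is easy to check numerically. A minimal NumPy sketch, using an illustrative 4×4 block matrix (this particular matrix is a hand-picked example, not the only one):

```python
import numpy as np

# A hypothetical 4x4 skew-symmetric matrix built from two 2x2
# rotation-like blocks; chosen only to illustrate the claim.
A = np.array([
    [ 0.0, 1.0,  0.0, 0.0],
    [-1.0, 0.0,  0.0, 0.0],
    [ 0.0, 0.0,  0.0, 1.0],
    [ 0.0, 0.0, -1.0, 0.0],
])

assert np.allclose(A.T, -A)   # confirm skew-symmetry
det = np.linalg.det(A)        # nonzero here: this 4x4 example is invertible
```

For this matrix det(A) = 1, so an even-order skew-symmetric matrix can indeed be invertible.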
If A is 2×2 and det(A)=3, then det(AᵀA) equals
A 9
B 3
C 6
D 1/9
det(AᵀA)=det(Aᵀ)det(A)=det(A)·det(A)=det(A)². With det(A)=3, the value becomes 3²=9. This holds for any square A; invertibility is not required.
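The identity can be verified numerically; a minimal sketch with an illustrative 2×2 matrix whose determinant is 3 (the entries are chosen by hand):

```python
import numpy as np

# Illustrative 2x2 matrix with det(A) = 3*1 - 0*5 = 3.
A = np.array([[3.0, 0.0],
              [5.0, 1.0]])

lhs = np.linalg.det(A.T @ A)   # det(A^T A)
rhs = np.linalg.det(A) ** 2    # det(A)^2
```

Both sides come out to 9, matching det(A)² = 3².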
If A is 3×3 and det(A)=2, then det(3A) equals
A 6
B 54
C 18
D 8
For an n×n matrix, det(kA)=kⁿ det(A). Here n=3 and k=3, so det(3A)=3³·2=27·2=54; the scalar factor enters to the power n.
If A is 4×4 and det(A)=2, then det(3A) equals
A 6
B 18
C 54
D 162
For 4×4 matrix, det(3A)=3⁴ det(A)=81·2=162. The power equals the order of the square matrix, so scaling grows quickly with dimension.
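The scaling rule det(kA)=kⁿ det(A) can be checked directly; a minimal sketch with an illustrative 4×4 triangular matrix whose determinant is 2:

```python
import numpy as np

# Illustrative 4x4 upper-triangular matrix; its determinant is the
# product of the diagonal entries 2, 1, 1, 1 = 2.
A = np.triu(np.ones((4, 4)))
A[0, 0] = 2.0

scaled = np.linalg.det(3 * A)        # det(3A)
expected = 3**4 * np.linalg.det(A)   # 3^4 * 2 = 162
```

Both values agree at 162, confirming that the exponent is the order of the matrix.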
If det(A)=5 and det(B)=0 for same order, then det(AB) is
A 0
B 5
C 1
D 25
det(AB)=det(A)det(B). If det(B)=0, the product becomes 0. That means AB is singular even if A is invertible, because B already collapses the space onto a lower dimension.
If det(AB)=0 and A is invertible, then det(B) is
A 1
B det(A)
C 0
D Cannot decide
det(AB)=det(A)det(B). If A invertible, det(A)≠0. So for product to be zero, det(B) must be zero. Hence B must be singular.
If AB=0 and A is invertible, then B must be
A Zero matrix
B Identity matrix
C Diagonal matrix
D Symmetric matrix
Multiply AB=0 on the left by A⁻¹: A⁻¹AB = A⁻¹0 gives IB=0, so B=0. Invertibility allows cancellation, forcing the only solution.
If A is invertible and AX=AY, then X equals
A A
B X+Y
C Cannot decide
D Y
From AX=AY, multiply by A⁻¹ on left: A⁻¹AX = A⁻¹AY, giving IX=IY, so X=Y. This is a standard cancellation property for invertible matrices.
If XA=YA and A is invertible, then X equals
A Aᵀ
B 0 matrix
C Y
D adj(A)
Multiply XA=YA on the right by A⁻¹: XAA⁻¹ = YAA⁻¹, giving XI=YI, so X=Y. Right cancellation works when A is invertible.
If A is 3×3 and det(A)=2, then det(A⁻²) equals
A 1/4
B 4
C 1/2
D 2
det(A⁻¹)=1/det(A)=1/2. Then det(A⁻²)=det((A⁻¹)²)=(det(A⁻¹))²=(1/2)²=1/4. Determinant respects powers.
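A quick numerical check of det(A⁻²)=1/det(A)²; a minimal sketch using an illustrative 3×3 diagonal matrix with determinant 2:

```python
import numpy as np

# Illustrative 3x3 diagonal matrix with det(A) = 2*1*1 = 2.
A = np.diag([2.0, 1.0, 1.0])

A_inv2 = np.linalg.inv(A) @ np.linalg.inv(A)  # A^{-2}
val = np.linalg.det(A_inv2)                   # should be (1/2)^2 = 0.25
```

The result is 1/4, matching (det(A⁻¹))² = (1/2)².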
If A is 2×2 and det(A)=−2, then det(A³) equals
A 8
B −2
C 4
D −8
det(A³)=(det(A))³. Here det(A)=−2, so (−2)³=−8. Determinant turns matrix multiplication into number multiplication, including sign.
If A is invertible, then (A⁻¹B)ᵀ equals
A Bᵀ(A⁻¹)ᵀ
B AᵀBᵀ
C (A⁻¹)ᵀBᵀ
D BᵀAᵀ
Transpose of product reverses order: (PQ)ᵀ=QᵀPᵀ. Here P=A⁻¹ and Q=B, so (A⁻¹B)ᵀ=Bᵀ(A⁻¹)ᵀ.
If A is invertible, then (A⁻¹)ᵀ equals
A Aᵀ
B −A⁻¹
C (Aᵀ)⁻¹
D adj(A)
Inverse and transpose satisfy (A⁻¹)ᵀ=(Aᵀ)⁻¹. It comes from transposing AA⁻¹=I to get (A⁻¹)ᵀAᵀ=I, showing (A⁻¹)ᵀ is inverse of Aᵀ.
If A is 2×2 and A²=I, then det(A) must be
A 0
B ±1
C 1
D Any real
Take determinant on both sides: det(A²)=det(I)=1. But det(A²)=(det(A))², so (det(A))²=1, giving det(A)=±1.
If A is idempotent (A²=A), possible eigenvalues are
A 0 or 1
B −1 or 1
C Any real
D 2 only
If Av=λv with v≠0, then A²v=λ²v. But A²=A gives λ²v=λv, so (λ²−λ)v=0, and since v≠0, λ(λ−1)=0. Thus the eigenvalues of an idempotent matrix are 0 or 1.
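A rank-1 orthogonal projection is a standard idempotent example; a minimal sketch (the vector v is an arbitrary illustrative choice):

```python
import numpy as np

# A rank-1 projection P = v v^T / (v^T v) satisfies P @ P == P.
v = np.array([[1.0], [2.0], [2.0]])
P = (v @ v.T) / (v.T @ v)

assert np.allclose(P @ P, P)                  # idempotent
eigvals = np.sort(np.linalg.eigvals(P).real)  # eigenvalues of P, sorted
```

The computed eigenvalues are 0, 0, 1, i.e. only 0 and 1 appear, as the argument predicts.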
If A is nilpotent, then det(A) must be
A Nonzero
B One
C −1
D Zero
Nilpotent means Aᵏ=0 for some k. Taking determinant gives det(Aᵏ)=(det(A))ᵏ=det(0)=0, so det(A)=0. Hence nilpotent matrices are singular.
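Strictly upper-triangular matrices are the classic nilpotent example; a minimal sketch with illustrative entries:

```python
import numpy as np

# A strictly upper-triangular matrix is nilpotent: here N^3 = 0.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

assert np.allclose(np.linalg.matrix_power(N, 3), 0)  # nilpotency
det_N = np.linalg.det(N)                             # must be 0
```

det(N) is 0, consistent with (det(N))³ = det(N³) = det(0) = 0.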
If A is nilpotent and nonzero, then A is
A Invertible
B Orthogonal
C Singular
D Scalar
A nonzero nilpotent matrix has determinant 0, so it cannot be invertible. It collapses vectors after repeated application, indicating rank deficiency and singularity.
If det(A)=0, then adj(A) is always
A Not always zero
B Zero matrix
C Nonzero always
D Identity matrix
When det(A)=0, A has no inverse, but adj(A) may still be nonzero; this happens exactly when rank(A)=n−1. adj(A) is the zero matrix only when rank(A)≤n−2, since then every (n−1)×(n−1) minor vanishes.
If A is 3×3 and rank(A)=1, then adj(A) is
A Always identity
B Always zero
C Always diagonal
D Always symmetric
For 3×3, adj(A) is formed from 2×2 cofactors. If rank(A)≤1, all 2×2 minors are zero, so every cofactor is zero, making adj(A) the zero matrix.
If A is 3×3 and rank(A)=2, then adj(A) is
A Zero always
B Identity always
C Nonzero possible
D Singular impossible
For 3×3 with rank 2, some 2×2 minors can be nonzero, so adj(A) can be nonzero. However det(A)=0 still, so A remains singular.
If A is 3×3, det(A)=0 but adj(A)≠0, then rank(A) is
A 2
B 0
C 1
D 3
For 3×3, adj(A) is zero when rank ≤1. If det(A)=0 and adj(A) is not zero, the rank must be exactly 2. This is a common rank–adjoint link.
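The rank–adjoint link can be checked with a small cofactor-based adjugate (written by hand here, since NumPy has no built-in adjugate; the two test matrices are illustrative choices):

```python
import numpy as np

def adjugate(A):
    """Adjugate via cofactors; works even when A is singular."""
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T  # adj(A) is the transpose of the cofactor matrix

# Rank-2 example: row3 = row1 + row2, so det = 0 but adj != 0.
A2 = np.array([[1.0, 2.0, 3.0],
               [4.0, 5.0, 6.0],
               [5.0, 7.0, 9.0]])

# Rank-1 example: every 2x2 minor vanishes, so adj = 0.
A1 = np.outer([1.0, 2.0, 3.0], [1.0, 1.0, 1.0])
```

Here adj(A2) is nonzero while det(A2)=0 (rank 2), and adj(A1) is the zero matrix (rank 1), matching the statements in the last few questions.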
If A is 2×2 with det(A)=0, then adj(A) is
A Always zero
B Never zero
C Identity always
D Can be nonzero
For the 2×2 matrix [[a b],[c d]], adj(A)=[[d −b],[−c a]]. Even if det(A)=ad−bc=0, adj(A) is nonzero whenever A itself is nonzero; it vanishes only for the zero matrix.
Which row operation multiplies the determinant by k
A Row swap
B Add multiple row
C Multiply one row
D Column swap
Multiplying a single row by k multiplies the determinant by k. Row swap changes sign, and adding a multiple of one row to another does not change determinant.
If det(A)=7 and one row is multiplied by −2, new det is
A −14
B 14
C −7
D 7/2
Multiplying one row by −2 scales determinant by −2. So new determinant becomes (−2)×7=−14. Only that row scaling changes determinant magnitude and sign.
If a single row is multiplied by 0, determinant becomes
A Same det
B 0
C −det
D det²
If a row is multiplied by 0, it becomes all zeros and the determinant becomes zero: the determinant is linear in each row, so that row contributes a factor of 0. Such a matrix is singular.
If two rows are interchanged and then one row multiplied by 3, determinant factor is
A 3
B −1/3
C −3
D 1
Row interchange multiplies determinant by −1. Multiplying one row by 3 multiplies by 3. Combined effect is (−1)×3=−3 times the original determinant.
If A is 3×3 and you add 5 times row1 to row2, det becomes
A D
B 5D
C −D
D 0
Operation R2 → R2 + 5R1 does not change determinant. This property helps simplify determinants and solve systems using elimination without altering determinant value.
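The three row-operation rules above (swap flips the sign, scaling one row scales the determinant, adding a multiple of one row to another changes nothing) can be verified in one sketch; the matrix is an illustrative choice:

```python
import numpy as np

# Illustrative 3x3 matrix; d is its determinant.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
d = np.linalg.det(A)

# Swap rows 0 and 1: determinant flips sign.
swap = A[[1, 0, 2], :]

# Scale row 0 by 3: determinant is multiplied by 3.
scale = A.copy()
scale[0] *= 3

# R2 -> R2 + 5*R1: determinant is unchanged.
add = A.copy()
add[1] += 5 * A[0]
```

Computing the three determinants gives −d, 3d, and d respectively.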
If a matrix has det(A)=0, then A⁻¹ is
A Defined
B Equal to Aᵀ
C Equal to adj(A)
D Not defined
Inverse exists only when det(A)≠0. With det(A)=0, matrix is singular and cannot be inverted. Adjoint exists, but division by det(A) fails.
If A is invertible, then A⁻¹A equals
A I
B A
C 0
D Aᵀ
Inverse satisfies both A⁻¹A=I and AA⁻¹=I. Identity matrix acts like 1 for matrices, confirming that inverse “undoes” the effect of A from either side.
If A is 2×2 with det(A)=1, then det(A⁻¹)=
A −1
B 0
C 1
D 2
det(A⁻¹)=1/det(A). If det(A)=1, then det(A⁻¹)=1. This is common for orthogonal and rotation-like matrices where determinant magnitude is 1.
If A is 3×3 and det(A)=−2, then det(A⁻¹)=
A 1/2
B −1/2
C −2
D 2
Determinant of inverse is reciprocal: det(A⁻¹)=1/det(A). With det(A)=−2, det(A⁻¹)=−1/2. Sign remains negative because reciprocal keeps sign.
If AB is invertible, then A and B are
A Both invertible
B Both singular
C A invertible only
D B invertible only
If det(AB)≠0, then det(A)det(B)≠0. That means det(A)≠0 and det(B)≠0, so both A and B must be invertible square matrices.
If A is invertible and B is singular, then AB is
A Always invertible
B Always identity
C Cannot decide
D Always singular
det(AB)=det(A)det(B). If B is singular, det(B)=0, so det(AB)=0. Hence AB is singular, even though A is invertible.
If A is singular and B is invertible, then BA is
A Always invertible
B Always diagonal
C Always singular
D Cannot decide
det(BA)=det(B)det(A). If A is singular, det(A)=0, so det(BA)=0. Multiplying by an invertible matrix cannot “fix” singularity.
In Gauss–Jordan inverse method, you start with
A [A|I]
B [A|0]
C [I|A]
D [Aᵀ|I]
To find A⁻¹, you form the augmented matrix [A|I] and apply row operations to convert A into I. Then the right side becomes A⁻¹, if A is invertible.
If Gauss–Jordan reduces [A|I] to [I|X], then X equals
A A
B adj(A)
C Aᵀ
D A⁻¹
Row operations applied to [A|I] correspond to left-multiplying by elementary matrices. When the left side becomes I, the same operations have transformed I into A⁻¹, so the right side equals the inverse.
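The Gauss–Jordan procedure can be sketched directly. This is a minimal implementation with partial pivoting (the function name and the 2×2 test matrix are illustrative choices, not a library API):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Reduce [A | I] to [I | A^{-1}] by row operations (a sketch)."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])  # augmented matrix [A | I]
    for col in range(n):
        # Partial pivoting: bring the largest entry in this column up.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if np.isclose(M[pivot, col], 0.0):
            raise ValueError("matrix is singular")
        M[[col, pivot]] = M[[pivot, col]]        # row swap
        M[col] /= M[col, col]                    # scale pivot row to 1
        for r in range(n):                       # eliminate other rows
            if r != col:
                M[r] -= M[r, col] * M[col]
    return M[:, n:]                              # right half is A^{-1}

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])                       # det = 1
```

For this A the routine returns [[3, −1], [−5, 2]], which matches the 2×2 inverse formula since det(A)=1.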
A 3×3 system has unique solution when
A det(A)≠0
B det(A)=0
C trace(A)=0
D rank(A)<3
For square system AX=B, det(A)≠0 means A is invertible. Then solution is unique and given by X=A⁻¹B. If det(A)=0, uniqueness is not guaranteed.
If A is 3×3 and det(A)=0, then the system AX=B can have
A Only unique
B Only no solution
C No or infinite
D Only infinite
Singular A implies no inverse, so system cannot have a guaranteed unique solution. Depending on consistency (rank test), it may have no solution or infinitely many solutions.
For 3×3, the cofactor sign pattern starts with
A − + −
B + − +
C + + +
D − − −
Cofactor sign is (−1)^{i+j}. In first row, signs are +, −, +. Second row is −, +, − and third row is +, −, +, forming a checkerboard pattern.
If det(A)=4, then det(adj(A)) for 4×4 equals
A 64
B 4
C 16
D 256
For 4×4, det(adj(A))=(det(A))³. With det(A)=4, 4³=64. This is a standard determinant–adjoint identity for square matrices.
If A is 4×4 and det(A)=4, then det(A⁻¹) is
A 4
B 1/16
C 16
D 1/4
det(A⁻¹)=1/det(A). For det(A)=4, det(A⁻¹)=1/4. This holds for any invertible square matrix, regardless of dimension.
If A is 2×2 and det(A)=2, then det(adj(A)) is
A 4
B 1/2
C 2
D 1
For n×n, det(adj(A))=(det(A))^(n−1). For 2×2, exponent is 1, so det(adj(A))=det(A)=2. This matches basic adjoint properties.
If A is 3×3 and det(A)=2, then det(adj(A)) is
A 4
B 2
C 6
D 8
For a 3×3 matrix, det(adj(A))=(det(A))². With det(A)=2, that becomes 2²=4. The identity det(adj(A))=(det(A))^(n−1) holds whether A is invertible or singular.
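For invertible A, the relation A·adj(A)=det(A)·I gives adj(A)=det(A)·A⁻¹, which makes the identity easy to check numerically; a minimal sketch with an illustrative 3×3 matrix of determinant 2:

```python
import numpy as np

# Illustrative 3x3 matrix with det(A) = 2; for invertible A,
# adj(A) = det(A) * A^{-1}.
A = np.diag([2.0, 1.0, 1.0])
d = np.linalg.det(A)
adj = d * np.linalg.inv(A)

lhs = np.linalg.det(adj)   # det(adj(A))
rhs = d ** (3 - 1)         # det(A)^(n-1) = 2^2 = 4
```

Both sides come out to 4, matching det(adj(A))=(det(A))^(n−1) for n=3.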
If A is invertible, then det(AᵀA⁻¹) equals
A det(A)²
B 1
C det(A)
D −1
det(AᵀA⁻¹)=det(Aᵀ)det(A⁻¹)=det(A)·(1/det(A))=1. This is a neat identity combining transpose and inverse determinant rules.
If A is invertible, then det((Aᵀ)⁻¹) equals
A det(A)
B det(A)²
C 1/det(A)
D −det(A)
(Aᵀ)⁻¹ has determinant 1/det(Aᵀ). Since det(Aᵀ)=det(A), determinant becomes 1/det(A). This is true for every invertible square matrix.
A 2×2 matrix has inverse using formula only if
A det ≠ 0
B trace ≠ 0
C diagonal ≠ 0
D symmetric
The inverse formula A⁻¹ = (1/det(A)) adj(A) requires division by det(A). If det(A)=0, division is impossible and inverse does not exist.
If A is diagonal with entries 2,3,5 then det(A) is
A 10
B 15
C 0
D 30
Determinant of a diagonal matrix equals product of its diagonal entries. So det(A)=2×3×5=30. This is a quick method for triangular or diagonal matrices.
If A is diagonal with entries 2,3,5 then det(A⁻¹) is
A 1/30
B 30
C 1/10
D 0
If A is invertible, det(A⁻¹)=1/det(A). Since det(A)=30, det(A⁻¹)=1/30. For diagonal matrices, inverse is also diagonal with reciprocal entries.
If A is 3×3 and det(A)=1, then det(2A⁻¹) equals
A 2
B 4
C 8
D 1/8
det(2A⁻¹)=2³ det(A⁻¹). Here det(A⁻¹)=1/det(A)=1. So det(2A⁻¹)=8×1=8. Use scaling and inverse determinant rules.
If A is 3×3 and det(A)=−1, then det(2A⁻¹) equals
A 8
B −8
C −2
D 2
det(A⁻¹)=1/det(A)=−1. Then det(2A⁻¹)=2³·(−1)=8·(−1)=−8. The scaling contributes 2³=8, and the sign comes from the determinant of the inverse.
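The combined scaling-and-inverse rule can be confirmed in one line; a minimal sketch with an illustrative 3×3 diagonal matrix of determinant −1:

```python
import numpy as np

# Illustrative 3x3 diagonal matrix with det(A) = -1.
A = np.diag([-1.0, 1.0, 1.0])

# det(2 A^{-1}) = 2^3 * (1 / det(A)) = 8 * (-1) = -8
val = np.linalg.det(2 * np.linalg.inv(A))
```

The computed value is −8, as the rule det(kA⁻¹)=kⁿ/det(A) predicts.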