Chapter 13: Vector Spaces and Linear Transformations (Set-3)

Which subset of R^2 is a subspace?

A Line not through the origin
B Line through origin
C Circle centered origin
D First quadrant only

For nonzero vectors u, v, linear dependence means

A u⋅v = 0
B ∥u∥ = ∥v∥
C u = cv for some scalar c
D u + v = 0 always

If S = {v1, v2, v3} spans R^2, then S must be

A Linearly independent
B Orthonormal always
C Same as standard
D Linearly dependent

If dim(V) = n, any set of n + 1 vectors in V is

A Dependent
B Independent
C A basis
D A subspace

If a set has exactly n vectors in n-dimensional V and spans V, then it is

A Dependent set
B Zero subspace
C A basis
D Not possible

If a set has n independent vectors in n-dimensional V, then it

A Spans V
B Spans kernel only
C Spans image only
D Cannot exist

The coordinate vector [v]_B is defined relative to

A Any spanning set
B An ordered basis
C Any subspace
D Any inner product

If B is a basis, the map v ↦ [v]_B is

A Not one-one
B Not onto
C An isomorphism
D Nonlinear

If V = U ⊕ W, every v ∈ V can be written

A Only as u − w
B Only as uw
C Not as a sum
D Uniquely as u + w

If U ∩ W ≠ {0}, uniqueness of the u + w decomposition is

A Not guaranteed
B Always guaranteed
C Depends on norms
D Always impossible

In the quotient space V/W, two cosets are equal when

A Representatives are equal
B Their lengths match
C Their representatives differ by an element of W
D Determinants match

If W = ker(π) for the projection π : V → V/W, then π is

A One-one
B Onto
C Zero map
D Not linear

For T to induce a linear map on the quotient V/W, W typically must be

A Invariant under T
B Orthogonal to VV
C Equal to VV
D Finite set

The first isomorphism theorem says V/ker(T) is isomorphic to

A ker(T)
B V always
C Im(T)
D F

If T : V → W is linear and V is finite-dimensional, then

A rank + nullity = dim(V)
B rank + nullity = dim(W)
C rank = dim(V) always
D nullity = 0 always

If T : R^4 → R^2 has rank 2, then the nullity is

A 2
B 6
C 4
D 0
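The rank–nullity theorem behind this question can be checked numerically. A minimal sketch with NumPy, using a rank-2 matrix of my own choosing to represent a map R^4 → R^2:

```python
import numpy as np

# A hypothetical 2x4 matrix representing T: R^4 -> R^2 with rank 2.
A = np.array([[1.0, 0.0, 2.0, 3.0],
              [0.0, 1.0, 4.0, 5.0]])

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank   # rank + nullity = dim(domain) = 4
print(rank, nullity)          # rank 2, so nullity 2
```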

If T : R^3 → R^3 is onto, then its rank is

A 2
B 1
C 0
D 3

For a linear operator on R^3, injective implies

A Not onto
B Onto
C Rank is 1
D Nullity is 3

If a linear operator has nontrivial kernel, then it is

A Always onto
B Always diagonal
C Not one-one
D Always invertible

If T is invertible, then its matrix in any basis is

A Nonsingular
B Zero matrix
C Always diagonal
D Always symmetric

In matrix form, null space corresponds to solutions of

A Ax = b always
B A^T x = 0 only
C x^T A = 0
D Ax = 0

If Ax = b is consistent, then its solution set is

A Always unique
B Always empty
C One solution + null space
D Always a basis
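The "one solution plus the null space" structure can be illustrated directly. A sketch with NumPy, assuming a small consistent rank-deficient system of my own construction:

```python
import numpy as np

# Hypothetical consistent system with a one-dimensional null space.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])           # rank 1, nullity 1
b = np.array([3.0, 6.0])             # consistent: b lies in the column space

x_p = np.linalg.lstsq(A, b, rcond=None)[0]   # one particular solution
n = np.array([-2.0, 1.0])                    # spans the null space: A @ n = 0

# Shifting a particular solution by any null-space vector still solves Ax = b.
for t in (0.0, 1.0, -3.5):
    assert np.allclose(A @ (x_p + t * n), b)
```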

Rank of a matrix equals the dimension of its

A Column space
B Null space
C Diagonal entries
D Eigenvalue set

Rank of a matrix also equals

A Number of columns
B Number of pivots
C Number of rows always
D Trace value

A change-of-basis matrix is used to

A Find eigenvalues
B Compute determinants
C Convert coordinates
D Build quotient

If A represents T in the standard bases, then T(e_i) equals

A Column i
B Row i
C Diagonal entry i
D Determinant i

A similarity transformation has the form

A PAP^-1 only
B PA + P^-1
C P^-1 A P
D A^-1 P

If A and B are similar, then they must share

A Row reductions
B Pivot positions
C Entry sums
D Eigenvalues
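That similar matrices share eigenvalues is easy to verify numerically. A sketch with NumPy, where A and the invertible P are arbitrary matrices of my own choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])           # eigenvalues 2 and 3 (triangular)
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])           # any invertible matrix

B = np.linalg.inv(P) @ A @ P         # similarity transform P^-1 A P

# Similar matrices have the same characteristic polynomial, hence eigenvalues.
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(B)))
```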

A nonzero vector v is an eigenvector of A if

A Av is orthogonal to v
B Av is parallel to v
C Av is always zero
D Av is a unit vector

If λ = 0 is an eigenvalue, then

A A is singular
B A is invertible
C A is diagonal
D A is orthogonal

The eigenspace for λ is a subspace of

A The scalar field
B The quotient space
C The domain space
D The dual space

If A is n × n, then det(A − λI) is a polynomial of degree

A n
B n + 1
C 2n
D 1

For a 2 × 2 matrix, the trace equals

A Product of diagonal
B Sum of diagonal
C Sum of eigenvectors
D Rank plus nullity

Determinant equals

A Sum of eigenvalues
B Dimension of kernel
C Number of pivots
D Product of eigenvalues

If the eigenvalues of an n × n matrix are all distinct, then it is

A Never diagonalizable
B Always singular
C Diagonalizable
D Always nilpotent
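The distinct-eigenvalues case can be checked by reconstructing A from its eigendecomposition. A sketch with NumPy, using an example matrix of my own with eigenvalues 5 and 2:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # char. poly. x^2 - 7x + 10: eigenvalues 5 and 2

w, P = np.linalg.eig(A)      # columns of P are eigenvectors
D = np.diag(w)

# Distinct eigenvalues give independent eigenvectors, so P is invertible
# and A = P D P^-1, i.e. A is diagonalizable.
assert np.allclose(A, P @ D @ np.linalg.inv(P))
```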

If the geometric multiplicity of some eigenvalue is less than its algebraic multiplicity, then the matrix is

A Not diagonalizable
B Always diagonalizable
C Always invertible
D Always symmetric

A symmetric real matrix has

A Complex eigenvalues only
B No eigenvectors
C Zero determinant always
D Real eigenvalues

The spectral theorem says a real symmetric matrix can be

A Row-reduced only
B Made triangular only
C Orthogonally diagonalized
D Turned into zero
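Orthogonal diagonalization of a symmetric matrix can be demonstrated with NumPy's `eigh`. A sketch, assuming a small symmetric matrix of my own choosing:

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # real symmetric, eigenvalues 1 and 3

w, Q = np.linalg.eigh(S)         # eigh is for symmetric/Hermitian matrices

# Q is orthogonal (Q^T Q = I) and Q^T S Q is the diagonal matrix of eigenvalues.
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(Q.T @ S @ Q, np.diag(w))
```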

The companion matrix is associated with a

A Dot product space
B Polynomial equation
C Coset operation
D Projection theorem

A matrix with A^k = 0 for some k is called

A Idempotent
B Orthogonal
C Nilpotent
D Invertible

The eigenvalues of a projection matrix are

A 0 or 1
B 1 or 2
C -1 or 1
D All distinct
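Why a projection has eigenvalues 0 and 1 follows from idempotence (P^2 = P). A numerical sketch, projecting onto a line spanned by a vector a of my own choosing:

```python
import numpy as np

# Orthogonal projection onto the line spanned by a: P = a a^T / (a^T a).
a = np.array([[1.0], [2.0]])
P = (a @ a.T) / (a.T @ a)

assert np.allclose(P @ P, P)                 # idempotent: P^2 = P
w = np.linalg.eigvals(P)
assert np.allclose(np.sort(w), [0.0, 1.0])   # eigenvalues are 0 and 1
```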

Gram–Schmidt produces vectors that are

A Linearly dependent
B Always eigenvectors
C Always in kernel
D Mutually orthogonal
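The Gram–Schmidt process itself fits in a few lines. A sketch of the classical version (a simple illustration, not a numerically robust implementation):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: returns an orthonormal list spanning the input."""
    basis = []
    for v in vectors:
        # Subtract the components of v along the already-built orthonormal vectors.
        w = v - sum(np.dot(v, q) * q for q in basis)
        if not np.allclose(w, 0.0):          # skip dependent inputs
            basis.append(w / np.linalg.norm(w))
    return basis

qs = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                   np.array([1.0, 0.0, 1.0])])

# The outputs are mutually orthogonal unit vectors.
assert abs(np.dot(qs[0], qs[1])) < 1e-9
assert all(np.isclose(np.linalg.norm(q), 1.0) for q in qs)
```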

For finite-dimensional V, the dual space dimension equals

A Zero always
B Original space dimension
C Twice the dimension
D Depends on basis order

A linear functional is determined by its values on

A Any subset
B Only zero vector
C A basis
D Only eigenvectors

Jordan form is mainly used when matrix is

A Not diagonalizable
B Always diagonalizable
C Always symmetric
D Always orthogonal

An eigenvalue of A is a root of

A Minimal basis
B Row space equation
C Characteristic polynomial
D Gram matrix

If A is invertible, then 0 is

A Always an eigenvalue
B Not an eigenvalue
C Always a root
D Always repeated

Stability of x′ = Ax depends mainly on

A Matrix trace only
B Vector norms only
C Basis choice only
D Eigenvalue real parts
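The link between eigenvalue real parts and stability can be seen by evolving a solution of x′ = Ax. A sketch with NumPy, using a diagonalizable example matrix of my own with eigenvalues −1 and −3 (so exp(At) can be built from the eigendecomposition):

```python
import numpy as np

# Hypothetical stable system: both eigenvalues have negative real part.
A = np.array([[-1.0, 2.0],
              [0.0, -3.0]])

w, P = np.linalg.eig(A)
assert all(lam.real < 0 for lam in w)   # negative real parts => stable

# A is diagonalizable, so exp(At) = P exp(Dt) P^-1 and x(t) = exp(At) x0.
x0 = np.array([1.0, 1.0])
x5 = (P @ np.diag(np.exp(w * 5.0)) @ np.linalg.inv(P) @ x0).real

# Solutions decay toward zero as t grows.
assert np.linalg.norm(x5) < 0.05 * np.linalg.norm(x0)
```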

A linear map preserves

A Distances always
B Angles always
C Addition and scaling
D Determinants always

Orthogonality basics are defined using

A Inner product
B Determinant
C Quotient operation
D Matrix rank
