Chapter 13: Vector Spaces and Linear Transformations (Set-4)

For W = {(x, y) ∈ ℝ² : x + y = 0}, which is true?

A Not closed under addition
B Not closed under scalar multiplication
C Subspace of ℝ²
D Does not contain the zero vector

For S = {(x, y) : x + y = 1}, which is correct?

A Not a subspace
B A subspace of ℝ²
C Equals the zero subspace
D Closed under scaling

If v ∈ span(S), then v can be expressed as

A An infinite product only
B A sum of dot products
C A determinant expression
D A finite linear combination

If {v₁, v₂, v₃} is independent, then {v₁, v₂} is

A Dependent
B Not defined
C Independent
D Always spanning

If {v₁, v₂} is dependent, then adding v₃ makes {v₁, v₂, v₃}

A Independent
B Dependent
C A basis always
D Orthonormal

In a finite-dimensional space, if a set spans V, then it contains a

A Basis subset
B Kernel subset
C Quotient subset
D Orthogonal subset

In a finite-dimensional space, any independent set can be extended to a

A Kernel only
B Zero set
C Basis
D Quotient space

If U, W are subspaces, then dim(U + W) equals

A dim U + dim W + dim(U ∩ W)
B dim U + dim W − dim(U ∩ W)
C dim U − dim W
D dim(U ∩ W)

If U ∩ W = {0}, then dim(U + W) becomes

A dim U + dim W
B dim U − dim W
C dim(U ∩ W)
D Always zero
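The dimension formula dim(U + W) = dim U + dim W − dim(U ∩ W) can be verified numerically by comparing ranks of stacked bases; a minimal NumPy sketch, where U = span{e₁, e₂} and W = span{e₂, e₃} in ℝ³ are illustrative choices:

```python
import numpy as np

# Rows are basis vectors: U = span{e1, e2}, W = span{e2, e3} in R^3.
U = np.array([[1, 0, 0],
              [0, 1, 0]])
W = np.array([[0, 1, 0],
              [0, 0, 1]])

dim_U = np.linalg.matrix_rank(U)                    # 2
dim_W = np.linalg.matrix_rank(W)                    # 2
dim_sum = np.linalg.matrix_rank(np.vstack([U, W]))  # dim(U + W) = 3
dim_int = dim_U + dim_W - dim_sum                   # dim(U ∩ W) = 1, the line span{e2}

assert dim_sum == dim_U + dim_W - dim_int
```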

A vector in U ∩ W belongs to

A Only U
B Only W
C Neither set
D Both U and W

In the quotient V/W, coset addition is

A (v + W) + (u + W) = (v + u) + W
B (v + W) + (u + W) = (vu) + W
C (v + W) + (u + W) = (v − u) + W
D Not defined

To ensure coset operations are valid, we need

A Orthogonality
B Symmetry of W
C Well-definedness
D Determinant nonzero

The natural projection π : V → V/W is always

A Injective
B Surjective
C Zero map
D Not linear

The kernel of π(v) = v + W is

A V/W
B {0} only
C V
D W

If T : V → W is linear, then T(0) must be

A 1
B Undefined
C 0
D Any vector

If T is linear, then T(−v) equals

A −T(v)
B T(v)
C 0 always
D T(v)⁻¹

If T is linear and T(v) = T(w), then v − w lies in

A Im(T)
B V/W
C ker(T)
D The dual space

If T is injective, then T(v) = T(w) implies

A v = w
B v = −w
C v + w = 0
D Always many solutions

If T is surjective, then for each y in the codomain there exists

A A unique x always
B No preimage
C Only the zero preimage
D Some x with T(x) = y

The rank of T equals the dimension of

A ker(T)
B The domain, always
C Im(T)
D The quotient space

For T : ℝ⁵ → ℝ³, the rank can be at most

A 5
B 3
C 8
D 15

If T : ℝ⁵ → ℝ³ has rank 3, then the nullity is

A 2
B 3
C 5
D 8
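The rank–nullity theorem for a map ℝ⁵ → ℝ³ can be checked with NumPy; the matrix below is an illustrative full-row-rank example:

```python
import numpy as np

# Illustrative 3x5 matrix with a pivot in every row: rank 3.
A = np.array([[1, 0, 0, 1, 2],
              [0, 1, 0, 3, 4],
              [0, 0, 1, 5, 6]])

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank  # rank-nullity: dim(domain) = rank + nullity, so 5 = 3 + 2

assert (rank, nullity) == (3, 2)
```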

If matrix A is m × n, then A represents a map

A ℝᵐ → ℝⁿ
B ℝⁿ → ℝⁿ
C ℝⁿ → ℝᵐ
D Scalars to vectors

If A has a pivot in every column, then Ax = 0 has

A Only trivial solution
B Infinitely many solutions
C No solutions
D Exactly two solutions
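The pivot condition amounts to rank(A) equaling the number of columns, which forces the trivial solution; a quick NumPy sketch with an illustrative 3 × 2 matrix:

```python
import numpy as np

# Illustrative 3x2 matrix with a pivot in every column: rank = number of columns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
assert np.linalg.matrix_rank(A) == A.shape[1]

# With full column rank, the only solution of Ax = 0 is x = 0:
x, *_ = np.linalg.lstsq(A, np.zeros(3), rcond=None)
assert np.allclose(x, 0)
```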

If A has a free variable, then the nullity is

A Exactly 0
B Negative
C At least 1
D Always equals rank

For a linear operator T, the matrix of T⁻¹ is

A Transpose matrix
B Inverse matrix
C Adjoint always
D Row-reduced form

If A is invertible, then its columns are

A Linearly independent
B Always orthogonal
C Always equal length
D Always nonnegative

If A is n × n and its rank is n, then the nullity is

A n
B 2n
C 1
D 0

Eigenvalues satisfy which equation?

A det(A + λI) = 1
B det(A − λI) = 0
C A = λ
D A² = λI always
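The characteristic equation det(A − λI) = 0 can be verified numerically; a small NumPy sketch with an illustrative symmetric matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # illustrative symmetric matrix; eigenvalues 1 and 3

for lam in np.linalg.eigvals(A):
    # Each eigenvalue makes A - lam*I singular:
    assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-9
```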

If v≠0v=0 and (A−λI)v=0(A−λI)v=0, then vv is

A Row vector only
B Pivot column
C Eigenvector for λλ
D Kernel of AA

If λ is an eigenvalue, then the eigenspace dimension equals

A Nullity of A − λI
B Rank of A − λI
C Trace of A − λI
D Determinant of A − λI

A matrix is diagonalizable if the total number of independent eigenvectors is

A 1
B 0
C n
D n²

If a matrix has a repeated eigenvalue, it is diagonalizable when

A Trace is nonzero
B Enough independent eigenvectors exist
C Determinant is one
D It is triangular
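The "enough eigenvectors" criterion can be tested by comparing eigenspace dimensions; a NumPy sketch contrasting a Jordan block with a scalar matrix (both illustrative examples):

```python
import numpy as np

def eigenspace_dim(A, lam):
    # Geometric multiplicity = nullity of A - lam*I.
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

J = np.array([[2.0, 1.0],
              [0.0, 2.0]])  # Jordan block: eigenvalue 2 repeated, 1D eigenspace
D = np.array([[2.0, 0.0],
              [0.0, 2.0]])  # scalar matrix: eigenvalue 2 repeated, 2D eigenspace

assert eigenspace_dim(J, 2.0) == 1  # too few eigenvectors: not diagonalizable
assert eigenspace_dim(D, 2.0) == 2  # full set of eigenvectors: diagonalizable
```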

Coefficients of characteristic polynomial relate to

A Trace and determinant
B Rank and nullity
C Basis and dimension
D Dot product only

Cayley–Hamilton can help compute

A Vector lengths
B Basis size
C Higher matrix powers
D Coset count
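For a 2 × 2 matrix, Cayley–Hamilton says A² − tr(A)A + det(A)I = 0, which reduces every power of A to a linear combination of A and I; a NumPy sketch with an illustrative matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])  # illustrative matrix
t, d = np.trace(A), np.linalg.det(A)

# Cayley-Hamilton: A^2 = tr(A)*A - det(A)*I
assert np.allclose(t * A - d * np.eye(2), A @ A)

# Multiplying through by A: A^3 = (t^2 - d)*A - t*d*I
assert np.allclose((t**2 - d) * A - t * d * np.eye(2), A @ A @ A)
```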

If A is similar to B, then their characteristic polynomials are

A The same
B Always different
C Same degree only
D Unrelated

In ℝ³, the cross product is NOT needed to define

A Area formula
B Normal direction
C Torque concept
D Vector space axioms

In a function space, a typical linear combination looks like

A f(x)g(x)
B f(x)/g(x)
C af(x) + bg(x)
D sin(f)

A linear transformation between vector spaces must preserve

A Linear combinations
B Only lengths
C Only angles
D Only determinants

If P is the projection onto a subspace U, then Im(P) equals

A ker(P)
B U
C The whole space
D The dual space

For a projection P, the kernel represents vectors

A Sent to unit
B Sent to eigenvalue
C Sent to zero
D Sent to trace

Orthogonal diagonalization requires matrix to be

A Real symmetric
B Any invertible
C Any triangular
D Any nilpotent

Singular values of A relate to

A The trace of A
B The eigenvalues of AᵀA
C The determinant of A
D The nullity of A only

A matrix is idempotent if

A A² = 0
B Aᵀ = A⁻¹
C A = I only
D A² = A

If A is nilpotent, then all eigenvalues are

A 0
B 1
C -1
D Nonzero

If A is upper triangular, the determinant equals

A Sum of the diagonal
B Rank of A
C Product of the diagonal
D Nullity of A
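The triangular-determinant rule is easy to confirm numerically; an illustrative NumPy check:

```python
import numpy as np

A = np.array([[2.0, 5.0, 1.0],
              [0.0, 3.0, 7.0],
              [0.0, 0.0, 4.0]])  # illustrative upper-triangular matrix

# Determinant equals the product of the diagonal entries: 2 * 3 * 4 = 24.
assert np.isclose(np.linalg.det(A), np.prod(np.diag(A)))
```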

If A and B are equivalent (related by row/column operations), then they must share

A Same eigenvalues
B Same rank
C Same trace
D Same determinant

The solution space of Ax = 0 is a

A Subspace
B Coset only
C Empty set
D Nonlinear curve

The solution set of Ax = b (when consistent) is a

A Always a subspace
B Always empty
C Coset of null space
D Always a basis
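The coset structure of a consistent system can be seen directly: every solution is a particular solution plus something in the null space. A NumPy sketch for the illustrative single equation x + y = 2:

```python
import numpy as np

# One consistent equation x + y = 2 in R^2.
A = np.array([[1.0, 1.0]])
b = np.array([2.0])

x_p = np.array([2.0, 0.0])  # a particular solution
n = np.array([1.0, -1.0])   # spans null(A): A @ n = 0

# The full solution set is the coset x_p + null(A);
# every point x_p + t*n solves Ax = b:
for t in (-1.0, 0.0, 3.5):
    assert np.allclose(A @ (x_p + t * n), b)
```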

A linear operator that scales every vector by a factor c has eigenvalues

A 0 only
B 1 only
C ±1 only
D c only
