We first define a vector, an $n \times 1$ collection of data:
$$x = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ \vdots \\ x_n \end{pmatrix},$$
and a matrix, an $n \times p$ collection of data:
$$X = \begin{pmatrix} x_{11} & x_{12} & \cdots & x_{1p} \\ x_{21} & x_{22} & \cdots & x_{2p} \\ \vdots & \vdots & \ddots & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{np} \end{pmatrix}.$$
Just in case: $n \times p$ means $n$ rows and $p$ columns.
We define the transpose of a matrix (continuing the example above) as
$$X' = \begin{pmatrix} x_{11} & x_{21} & \cdots & x_{n1} \\ x_{12} & x_{22} & \cdots & x_{n2} \\ \vdots & \vdots & \ddots & \vdots \\ x_{1p} & x_{2p} & \cdots & x_{np} \end{pmatrix},$$
which goes from $n \times p$ to $p \times n$; we also denote the transpose as $X^T$ or $X'$.
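As a quick sanity check on the shape rule, here is a minimal NumPy sketch (NumPy is assumed here; the notes themselves do not use code) showing that transposing swaps rows and columns:

```python
import numpy as np

# A hypothetical 2x3 matrix (n = 2 rows, p = 3 columns)
A = np.array([[1, 2, 3],
              [4, 5, 6]])

# The transpose is 3x2 (p x n): entry (i, j) of A.T is entry (j, i) of A
At = A.T
print(A.shape)    # (2, 3)
print(At.shape)   # (3, 2)
print(At[0, 1])   # 4, i.e. A[1, 0]
```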
Since we have vectors, we also have the space they belong to: the vector space. In most courses, vector spaces are defined over the field $\mathbb{R}$, and more specifically over the $n$-dimensional Euclidean space $\mathbb{R}^n$.
A vector space also comes with operations on those vectors.
Addition: if $x, y$ are both $n \times 1$ vectors, then we can add them entrywise: $x + y = (x_1 + y_1, x_2 + y_2, \ldots, x_n + y_n)'$.
Scalar multiplication: if $c \in \mathbb{R}$ and $x$ is an $n \times 1$ vector, then $cx = (cx_1, cx_2, \ldots, cx_n)'$.
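Both operations act entry by entry, which NumPy arrays mirror directly; a minimal sketch (NumPy assumed, example values hypothetical):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# Entrywise addition: (x + y)_i = x_i + y_i
print(x + y)   # [5. 7. 9.]

# Scalar multiplication: (c x)_i = c * x_i
c = 2.0
print(c * x)   # [2. 4. 6.]
```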
Inner product: let $x, y$ be $p \times 1$ vectors; the inner (dot) product $x \cdot y$ is defined as $\langle x, y\rangle = \sum_{i=1}^{p} x_i y_i = x'y = y'x$.
The inner product is bilinear: $\langle ax + by, cz + dw\rangle = ac\langle x, z\rangle + bc\langle y, z\rangle + ad\langle x, w\rangle + bd\langle y, w\rangle$.
Let $A$ be a $p \times p$ matrix; then $\langle Ax, y\rangle = \langle x, A'y\rangle$.
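The symmetry $x'y = y'x$ and the identity $\langle Ax, y\rangle = \langle x, A'y\rangle$ can be checked numerically; a minimal NumPy sketch with randomly drawn (hypothetical) data:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(3)
y = rng.standard_normal(3)
A = rng.standard_normal((3, 3))

# <x, y> = sum_i x_i y_i = x'y = y'x
assert np.isclose(x @ y, y @ x)

# <Ax, y> = <x, A'y>: moving A across the inner product transposes it
assert np.isclose((A @ x) @ y, x @ (A.T @ y))
```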
We define the length (norm) of a vector $x$ as $L_x = \|x\| = \sqrt{\langle x, x\rangle}$; if $y = cx$, then $L_y = |c| L_x$.
We can also obtain a unit vector $\bar{x}$, with $\|\bar{x}\| = 1$, by setting $\bar{x} = L_x^{-1} x = x / \sqrt{\langle x, x\rangle}$.
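The norm, the scaling rule $L_{cx} = |c| L_x$, and normalization can all be illustrated with one small NumPy sketch (NumPy and the example vector are assumptions, not from the notes):

```python
import numpy as np

x = np.array([3.0, 4.0])

# L_x = ||x|| = sqrt(<x, x>)
Lx = np.sqrt(x @ x)
print(Lx)   # 5.0

# Scaling by c scales the norm by |c|
c = -2.0
assert np.isclose(np.linalg.norm(c * x), abs(c) * Lx)

# The unit vector x / L_x has norm 1
xbar = x / Lx
assert np.isclose(np.linalg.norm(xbar), 1.0)
```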
Cross product: let $x, y$ be $3 \times 1$ vectors (the cross product is specific to three dimensions); the cross product $x \times y$ is defined as $\|x\| \|y\| \sin(\theta)\, n$, where:
$n$ is the unit vector perpendicular to the plane containing $x$ and $y$;
$\theta$ is the angle between $x$ and $y$, where $\cos(\theta) = \frac{\langle x, y\rangle}{L_x L_y}$; in particular, $\langle x, y\rangle = 0 \implies x \perp y$, i.e. $x$ and $y$ are perpendicular to each other.
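The angle formula and the cross-product definition fit together as follows; a minimal NumPy sketch (the specific vectors are hypothetical):

```python
import numpy as np

x = np.array([1.0, 0.0, 0.0])
y = np.array([1.0, 1.0, 0.0])

# cos(theta) = <x, y> / (L_x L_y)
cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
theta = np.arccos(cos_theta)   # pi/4 here

# ||x x y|| = ||x|| ||y|| sin(theta)
cross = np.cross(x, y)
assert np.isclose(np.linalg.norm(cross),
                  np.linalg.norm(x) * np.linalg.norm(y) * np.sin(theta))

# The cross product is perpendicular to both inputs
assert np.isclose(cross @ x, 0.0) and np.isclose(cross @ y, 0.0)
```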
We can also project one vector onto another. The projection of vector $x$ on $y$ is $\frac{\langle x, y\rangle}{\langle y, y\rangle}\, y = \langle x, \bar{y}\rangle\, \bar{y}$, and its length is $L_x |\cos(\theta)|$.
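A defining property of the projection is that the residual $x - \mathrm{proj}_y(x)$ is perpendicular to $y$; a minimal NumPy sketch with hypothetical vectors:

```python
import numpy as np

x = np.array([2.0, 3.0])
y = np.array([1.0, 0.0])

# Projection of x on y: (<x, y> / <y, y>) y
proj = (x @ y) / (y @ y) * y
print(proj)   # [2. 0.]

# The residual x - proj is perpendicular to y
assert np.isclose((x - proj) @ y, 0.0)

# Its length equals L_x |cos(theta)|
cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
assert np.isclose(np.linalg.norm(proj), np.linalg.norm(x) * abs(cos_theta))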
Cauchy–Schwarz inequality: $(x'y)^2 \le (x'x)(y'y)$.
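The inequality can be checked numerically, and equality occurs exactly when one vector is a scalar multiple of the other; a minimal NumPy sketch with randomly drawn (hypothetical) data:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(5)
y = rng.standard_normal(5)

# (x'y)^2 <= (x'x)(y'y)
assert (x @ y) ** 2 <= (x @ x) * (y @ y)

# Equality holds when y is a scalar multiple of x
z = 3.0 * x
assert np.isclose((x @ z) ** 2, (x @ x) * (z @ z))
```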
We say $x_1, x_2, \ldots, x_k$ are linearly independent if $c_1 x_1 + c_2 x_2 + \cdots + c_k x_k = 0 \iff c_1 = c_2 = \cdots = c_k = 0$. Otherwise, they are linearly dependent.
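One practical way to test this definition: stack the vectors as columns of a matrix, which has full column rank exactly when the vectors are linearly independent. A minimal NumPy sketch (the vectors are hypothetical, and $v_3 = v_1 + v_2$ by construction):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 0.0])   # v3 = v1 + v2, so {v1, v2, v3} is dependent

# Two columns, rank 2 = number of columns -> independent
r2 = np.linalg.matrix_rank(np.column_stack([v1, v2]))
print(r2)   # 2

# Three columns, rank 2 < 3 -> dependent
r3 = np.linalg.matrix_rank(np.column_stack([v1, v2, v3]))
print(r3)   # 2
```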