Part 17 : Projections

We have covered projection in Dot Product. Now, we will take a deep dive into projections and the projection matrix.

Consider two vectors in a 3D vector space: a and b.

Say we have a point R on vector a that is closest to vector b.

We can construct a vector r to the point R, having the same direction as vector a.

As the new vector r shares its direction with vector a, it can be represented as the product of vector a with some scalar quantity x, that is, r = xa.

The vector from the point R to the tip of b is e = b − r.

[Figure: the error vector e = b − r]

From the figure, we can see that e is perpendicular to a (R is the closest point on a to b, so the shortest path from b meets a at a right angle).

So

aᵀe = 0, that is, aᵀ(b − xa) = 0  (see orthogonal vectors)

This equation can be simplified further to solve for the value of x.

aᵀb − x aᵀa = 0, which gives x = aᵀb / aᵀa

We know that

xa = ax  (scalar multiplication is commutative)

Now we can substitute the value of x back into r = xa and simplify.

r = ax = a (aᵀb / aᵀa)
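As a quick numerical check, here is a minimal NumPy sketch of this result. The specific values of a and b are arbitrary examples; the code computes x = aᵀb / aᵀa, forms r = xa, and confirms that the error e = b − r is perpendicular to a:

```python
import numpy as np

# Arbitrary example vectors (any non-zero a will do)
a = np.array([2.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 1.0])

# Scalar coefficient: x = (a^T b) / (a^T a)
x = np.dot(a, b) / np.dot(a, a)

# Projection of b onto a: r = x * a
r = x * a

# The error e = b - r should be perpendicular to a
e = b - r
print("x =", x)                 # 7/9 ≈ 0.778
print("r =", r)
print("a . e =", np.dot(a, e))  # ~0, up to floating-point error
```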

Coming back to the figure

If vector r is the projection of vector b onto vector a, then it can be represented as

r = Pb, where P is the projection matrix

and

P = a aᵀ / (aᵀa)  (this comes from regrouping r = a (aᵀb / aᵀa) as r = (a aᵀ / (aᵀa)) b)
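To see the projection matrix itself, here is a small sketch (reusing the same example vectors) that builds P = a aᵀ / (aᵀa) with np.outer and checks that Pb matches the r computed from the scalar formula:

```python
import numpy as np

a = np.array([2.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 1.0])

# Projection matrix: P = (a a^T) / (a^T a)
P = np.outer(a, a) / np.dot(a, a)

# r = P b should equal r = (a^T b / a^T a) * a
r_from_P = P @ b
r_direct = (np.dot(a, b) / np.dot(a, a)) * a

print(P)
print(np.allclose(r_from_P, r_direct))  # True
```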

Properties of Projection Matrix

The projection matrix has some amazing properties like

Pᵀ = P  (the projection matrix is symmetric)

and

P² = P  (the square of the projection matrix is itself)

Matrices that have this property are called idempotent matrices.

So, if we project a vector twice, the result will be the same as projecting it once.
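Both properties (and the project-twice claim) are easy to verify numerically. A minimal sketch, again using the example vectors from above:

```python
import numpy as np

a = np.array([2.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 1.0])

P = np.outer(a, a) / np.dot(a, a)

# Symmetric: P^T == P
print(np.allclose(P.T, P))              # True

# Idempotent: P @ P == P
print(np.allclose(P @ P, P))            # True

# Projecting twice gives the same result as projecting once
print(np.allclose(P @ (P @ b), P @ b))  # True
```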

Why use projections?

Suppose we have a matrix A composed of two column vectors, a1 and a2.

[Figure: the column space of A, spanned by the columns a1 and a2]

Now, we have to find a solution of

Ax = b

where vector b does not lie in the column space of matrix A.

[Figure: vector b lies outside the column space of A]

It looks like there is no solution to this equation, because there is no way vector b can be represented as a linear combination of a1 and a2.

But we can project vector b onto the column space of A.

[Figure: vector p is the projection of vector b onto the column space of matrix A]

Vectors p, a1, and a2 all lie in the same subspace (the column space of A).

Therefore, vector p can be represented as a linear combination of vectors a1 and a2.

p = x1·a1 + x2·a2  (x1 and x2 are the coefficients multiplied with the vectors)

This can be written in matrix form:

p = [a1  a2] [x1, x2]ᵀ, that is, p = Ax

We can solve for the coefficient vector x, and we get the best possible answer to a problem we previously thought was unsolvable.
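Here is a minimal NumPy sketch of this idea. The columns a1, a2 and the vector b are arbitrary example values, and the coefficients are found by solving the normal equations AᵀA x = Aᵀb, which is one standard way to compute this projection (an assumption on my part, not a step spelled out above):

```python
import numpy as np

# Arbitrary example: two independent columns and a vector b
# that does not lie in their span (the column space of A)
a1 = np.array([1.0, 0.0, 0.0])
a2 = np.array([0.0, 1.0, 0.0])
A = np.column_stack([a1, a2])
b = np.array([2.0, 3.0, 4.0])   # the third component sticks out of the plane

# Solve the normal equations A^T A x = A^T b for the coefficients x1, x2
x = np.linalg.solve(A.T @ A, A.T @ b)

# p = x1*a1 + x2*a2 = A x is the projection of b onto the column space of A
p = A @ x

print("x =", x)                        # [2. 3.]
print("p =", p)                        # [2. 3. 0.], lies in the column space
print("A^T (b - p) =", A.T @ (b - p))  # ~[0. 0.], the error is orthogonal
```

The same x can also be obtained with np.linalg.lstsq(A, b, rcond=None), which solves the corresponding least-squares problem directly.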

Read Part 18 : Norms

You can view the complete series here

Connect with me on LinkedIn.
