Orthogonal Projections: Unlocking the Power of Vector Spaces
Dive into the world of orthogonal projections and discover their crucial role in linear algebra. Master key concepts, formulas, and applications to elevate your mathematical prowess.

Intros
  1. Orthogonal Projections Overview:
  2. Orthogonal Projections Overview:
    The Orthogonal Decomposition Theorem
    • Write $y$ as the sum of two vectors $\hat{y}$ and $z$
    • Orthogonal basis → $\hat{y} = \frac{y \cdot v_1}{v_1 \cdot v_1}v_1 + \cdots + \frac{y \cdot v_p}{v_p \cdot v_p}v_p$
    • Orthonormal basis → $\hat{y} = (y \cdot v_1)v_1 + \cdots + (y \cdot v_p)v_p$
    $z = y - \hat{y}$
  3. Orthogonal Projections Overview:
    Property of Orthogonal Projections
    • $\text{proj}_S \, y = y$
    • Only works if $y$ is in $S$
Examples
  1. The Orthogonal Decomposition Theorem
    Assume that $\{v_1, v_2, v_3\}$ is an orthogonal basis for $\Bbb{R}^n$. Write $y$ as the sum of two vectors, one in Span$\{v_1\}$, and one in Span$\{v_2, v_3\}$. You are given that:
    [vectors $v_1$, $v_2$, $v_3$ and $y$, as shown in Equation 4 below]
    Notes

    Orthogonal projections


    What is an orthogonal projection?

    The orthogonal projection of a vector onto another is, just as the name says, the projection of the first vector onto the second one. You may be wondering what that is supposed to mean; well, for a better explanation, let us show you graphically in the next figure:

    Figure 1: Projection of a vector onto another

    Notice that on the right-hand side of the figure we can see how we added the vector which is the orthogonal projection of $a$ onto $b$. The word orthogonal (which, as you already know, means there is perpendicularity involved) comes from the angle made by the projection and the normal line connecting the projection and the original vector. For this case, this normal line (the dashed line in Figure 1) is the component of vector $a$ that is orthogonal to $b$, and its length can be written as $\parallel a - \hat{a} \parallel$, where $\hat{a}$ denotes the projection of $a$ onto $b$.

    Orthogonal projection vector


    A formal orthogonal projection definition would be that it is the projection of a vector onto the line spanned by another vector. In other words, and taking Figure 1 in mind, the projection of vector $a$ falls along the line of vector $b$, and so the projection of vector $a$ is a vector parallel to vector $b$.

    And so, if we consider a subspace spanned by the vector $v$, then the orthogonal projection of $y$ onto $v$ is denoted $\hat{y}$ and can be calculated with the next equation:

    $\text{proj}_L \, y = \hat{y} = \frac{y \, \cdot \, v}{v \, \cdot \, v} \, v$
    Equation 1: Projection of y onto v

    Where $\hat{y}$ is called the orthogonal projection vector, and so Equation 1 may be referred to (in general) as the orthogonal projection formula.

    Figure 2: Vector y and its projection onto v


    Notice that the component of $y$ orthogonal to $v$ is equal to $z = y - \hat{y}$.
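
    To make Equation 1 concrete, here is a minimal sketch in Python (assuming NumPy is available; the vectors below are illustrative values, not the ones in the figures):

    ```python
    import numpy as np

    def proj_onto_vector(y, v):
        """Orthogonal projection of y onto the line spanned by v (Equation 1)."""
        return (np.dot(y, v) / np.dot(v, v)) * v

    # Illustrative values (not taken from the lesson's figures)
    y = np.array([3.0, 4.0])
    v = np.array([1.0, 0.0])
    y_hat = proj_onto_vector(y, v)   # projection of y onto v -> [3. 0.]
    z = y - y_hat                    # component of y orthogonal to v -> [0. 4.]
    print(np.dot(z, v))              # 0.0, confirming z is orthogonal to v
    ```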

    Now let us talk about orthogonal projections onto a subspace: not onto another vector, but onto a plane. For that:

    Figure 3: Vector y and its projection onto plane S


    Let $S$ be a subspace in $\Bbb{R}^n$. Then each vector $y$ in $\Bbb{R}^n$ can be written as: $y = \hat{y} + z$

    Where $\hat{y}$ is in $S$ and $z = y - \hat{y}$ is in $S^{\perp}$. And so, $\hat{y}$ is the orthogonal projection of $y$ onto $S$.
    Therefore, if we want to calculate $\hat{y}$, we need to check whether $\{v_1, \ldots, v_p\}$ is an orthogonal basis of $S$. If the vectors from the subset form an orthogonal set, and hence an orthogonal basis (which, remember, can be checked by computing the dot product of every pair of vectors in the set), then we can calculate the projection of $y$ onto $S$ as:

    $\text{proj}_S \, y = \hat{y} = \frac{y \, \cdot \, v_1}{v_1 \, \cdot \, v_1} v_1 + \frac{y \, \cdot \, v_2}{v_2 \, \cdot \, v_2} v_2 + \cdots + \frac{y \, \cdot \, v_p}{v_p \, \cdot \, v_p} v_p$
    Equation 2: Orthogonal projection of y onto S for an orthogonal basis


    However, if $\{v_1, \ldots, v_p\}$ is an orthonormal basis of $S$, then the equation changes a little bit:

    $\text{proj}_S \, y = \hat{y} = (y \cdot v_1)v_1 \, + \, (y \cdot v_2)v_2 \, + \cdots + \, (y \cdot v_p)v_p$
    Equation 3: Orthogonal projection of y onto S for an orthonormal basis

    Remember that an orthonormal basis is a basis formed by a set of vectors which are orthogonal to each other AND, at the same time, are all unit vectors themselves.

    How to find orthogonal projection


    The steps to find the orthogonal projection of vector $y$ onto a subspace are as follows (a code sketch of the whole procedure appears after the list):
    1. Verify that the set of vectors provided is either an orthogonal basis or an orthonormal basis
      1. If it is an orthogonal basis, continue to step 2
      2. If it is an orthonormal basis, continue to step 3

    2. Having an orthogonal basis containing a set of vectors $\{v_1, \ldots, v_p\}$, compute the projection of $y$ onto $S$ by solving the formula found in Equation 2. In order to do that, follow the next steps:
      1. Calculate the dot products $y \cdot v_1, \ldots, y \cdot v_p$
      2. Calculate the dot products $v_1 \cdot v_1, \ldots, v_p \cdot v_p$
      3. Compute the divisions $\frac{y \, \cdot \, v_1}{v_1 \, \cdot \, v_1}, \ldots, \frac{y \, \cdot \, v_p}{v_p \, \cdot \, v_p}$
      4. Multiply each result from the last step by its corresponding vector $v_1, \ldots, v_p$
      5. Add all of the resulting vectors together to find the final projection vector.

    3. Having an orthonormal basis containing a set of vectors $\{v_1, \ldots, v_p\}$, compute the projection of $y$ onto $S$ by solving the formula found in Equation 3. In order to do that, follow the next steps:
      1. Calculate the dot products $y \cdot v_1, \ldots, y \cdot v_p$
      2. Multiply each result from the last step by its corresponding vector $v_1, \ldots, v_p$
      3. Add all of the resulting vectors together to find the final projection vector.
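
    The whole procedure can be condensed into a short Python sketch (a minimal sketch assuming NumPy; the names `is_orthogonal_set` and `proj_onto_subspace` are ours, not the lesson's):

    ```python
    import numpy as np

    def is_orthogonal_set(vectors, tol=1e-12):
        """Step 1: every pair of distinct vectors must have dot product zero."""
        return all(abs(np.dot(u, w)) < tol
                   for i, u in enumerate(vectors)
                   for w in vectors[i + 1:])

    def proj_onto_subspace(y, basis, orthonormal=False):
        """Steps 2-3: orthogonal projection of y onto S = Span(basis).

        Uses Equation 2 for an orthogonal basis and Equation 3 when the
        basis is also orthonormal (each v . v is then equal to 1)."""
        y_hat = np.zeros_like(y, dtype=float)
        for v in basis:
            if orthonormal:
                y_hat += np.dot(y, v) * v                   # Equation 3
            else:
                y_hat += (np.dot(y, v) / np.dot(v, v)) * v  # Equation 2
        return y_hat
    ```

    For an orthonormal basis both branches return the same vector, since each $v_i \cdot v_i = 1$; the `orthonormal=True` branch simply skips the redundant divisions.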

    And now you are ready to solve some exercise problems!


    Orthogonal projection examples


    Example 1

    Assume that $\{v_1, v_2, v_3\}$ is an orthogonal basis for $\Bbb{R}^n$. Write $y$ as a sum of two vectors, one in Span$\{v_1\}$ and one in Span$\{v_2, v_3\}$. The vectors $v_1, v_2, v_3$ and $y$ are defined as follows:

    Equation 4: Vectors $v_1$, $v_2$, $v_3$ and $y$ (from the arithmetic in Equation 5 below, $v_1 = (1, 2, 2)$ and $y = (4, 3, 4)$)

    For this first problem we are already assuming that the vectors provided form an orthogonal basis for $\Bbb{R}^n$; that means the vectors are orthogonal to each other and linearly independent. Therefore the spans Span$\{v_1\}$ and Span$\{v_2, v_3\}$ have bases $\{v_1\}$ and $\{v_2, v_3\}$, each containing orthogonal (and thus linearly independent) vectors; these characteristics make them orthogonal bases!
    Therefore, we can already be sure that we can use Equation 2 to solve for $\hat{y}$ in BOTH cases: projecting $y$ onto Span$\{v_1\}$ and projecting $y$ onto Span$\{v_2, v_3\}$.

    If we need to write $y$ as a sum of two vectors, remember from Figures 2 and 3 that $y = \hat{y} + z$.
    And so, we calculate $\hat{y}$ first and then obtain $z = y - \hat{y}$.
    We will work only on the first part of the problem, writing $y$ using its projection onto Span$\{v_1\}$, and leave the second case for you to solve on your own.

    So let us calculate $\hat{y}$!
    For this case we have only one vector in the basis of the span $S$, and so the formula goes as:

    $\hat{y} = \frac{y \cdot v_1}{v_1 \cdot v_1} v_1 = \frac{(4)(1)+(3)(2)+(4)(2)}{(1)(1)+(2)(2)+(2)(2)} v_1 = \frac{4+6+8}{1+4+4} v_1 = \frac{18}{9} v_1 = 2v_1$

    Equation 5: Projection of y onto S: $\hat{y} = 2v_1 = (2, 4, 4)$

    With that we can now write $y$ as a sum of two vectors, one of them in Span$\{v_1\}$, as follows:

    Equation 6: Writing y as a sum of two vectors, one in Span{v1}

    And what is $z$? Easy! We can calculate it just to see what it is:

    Equation 7: Component of y orthogonal to S: $z = y - \hat{y} = (4, 3, 4) - (2, 4, 4) = (2, -1, 0)$
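
    As a quick numeric check of Example 1, here is a sketch assuming NumPy, using the values of $v_1$ and $y$ recovered from the arithmetic in Equation 5:

    ```python
    import numpy as np

    v1 = np.array([1.0, 2.0, 2.0])   # from the arithmetic in Equation 5
    y  = np.array([4.0, 3.0, 4.0])

    y_hat = (np.dot(y, v1) / np.dot(v1, v1)) * v1   # (18/9) * v1 = 2 * v1
    z = y - y_hat                    # component of y orthogonal to Span{v1}
    print(y_hat)                     # [2. 4. 4.]
    print(z)                         # [ 2. -1.  0.]
    print(np.dot(z, v1))             # 0.0, so z really is orthogonal to v1
    ```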

    Example 2

    Verify that $\{v_1, v_2\}$ is an orthonormal set, and then determine the orthogonal projection of $y$ onto Span$\{v_1, v_2\}$.

    Equation 8: Vectors $v_1$, $v_2$ and $y$ (from the computations below, $v_1 = (\frac{1}{\sqrt{2}}, 0, \frac{1}{\sqrt{2}})$ and $v_2 = (-\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}})$)

    To verify if the set $\{v_1, v_2\}$ is orthonormal, we first check if the vectors in the set are orthogonal to each other by computing their dot product:

    Equation 9: Dot product of vectors v1 and v2: $v_1 \cdot v_2 = (\frac{1}{\sqrt{2}})(-\frac{1}{\sqrt{3}}) + (0)(\frac{1}{\sqrt{3}}) + (\frac{1}{\sqrt{2}})(\frac{1}{\sqrt{3}}) = -\frac{1}{\sqrt{6}} + \frac{1}{\sqrt{6}} = 0$

    Since the dot product yielded a result of zero, the vectors are orthogonal to each other. The second condition for the set to be an orthonormal set is that its vectors are unit vectors; thus, let us check if their magnitudes equal one.

      $\parallel v_1 \parallel \; = \; \sqrt{\left(\frac{1}{\sqrt{2}}\right)^2 + (0)^2 + \left(\frac{1}{\sqrt{2}}\right)^2} = \sqrt{\frac{1}{2} + \frac{1}{2}} = \sqrt{1} = 1$

      $\parallel v_2 \parallel \; = \; \sqrt{\left(-\frac{1}{\sqrt{3}}\right)^2 + \left(\frac{1}{\sqrt{3}}\right)^2 + \left(\frac{1}{\sqrt{3}}\right)^2} = \sqrt{\frac{1}{3} + \frac{1}{3} + \frac{1}{3}} = \sqrt{1} = 1$
    Equation 10: Magnitudes of vectors v1 and v2

    And so, we have an orthonormal set, since we just proved that the vectors $v_1$ and $v_2$ are orthogonal unit vectors. Now we have to find the orthogonal projection of $y$ onto Span$\{v_1, v_2\}$.

    $\text{proj}_S \, y = \hat{y} = (y \cdot v_1)v_1 \, + \, (y \cdot v_2)v_2$
    Equation: Orthogonal projection of y onto S for an orthonormal basis

    Using this equation, we plug in the values that we have for vectors $v_1, v_2$ and $y$ in order to calculate the projection vector $\hat{y}$:

    Equation 11: Computing the orthogonal projection of y onto Span{v1, v2}
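
    A small NumPy sketch can double-check the orthonormality verification above ($v_1$ and $v_2$ are the values recovered from Equation 10; the lesson's $y$ is only given as an image, so the final projection line is left as a comment):

    ```python
    import numpy as np

    v1 = np.array([1/np.sqrt(2), 0.0, 1/np.sqrt(2)])
    v2 = np.array([-1/np.sqrt(3), 1/np.sqrt(3), 1/np.sqrt(3)])

    print(np.isclose(np.dot(v1, v2), 0.0))       # True: v1 and v2 are orthogonal
    print(np.isclose(np.linalg.norm(v1), 1.0))   # True: v1 is a unit vector
    print(np.isclose(np.linalg.norm(v2), 1.0))   # True: v2 is a unit vector

    # With the orthonormal set confirmed, Equation 3 gives the projection:
    # y_hat = np.dot(y, v1) * v1 + np.dot(y, v2) * v2
    ```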


    Example 3

    Find the best approximation of $y$ by vectors of the form $c_1 v_1 \, + \, c_2 v_2$, where:

    Equation 12: Vectors y, v1 and v2

    Having vectors of the form $c_1 v_1 + c_2 v_2$ means that we have a linear combination, which is the same as having a span of vectors, in this case the span of vectors $v_1$ and $v_2$. And so, we can define the subspace $S$ such that:

    $S = \text{Span}\{v_1, v_2\} = \{c_1 v_1 + c_2 v_2\}$
    Equation 13: S as the span of the vectors v1 and v2

    Now, the first thing to do in order to find the best approximation of $y$ is to check if the basis provided is an orthogonal basis; for that, we obtain the inner product of the two vectors in the basis:

    Equation 14: Inner product of the vectors inside the basis of S

    And given that the set is an orthogonal set, due to the inner product above resulting in zero, we can now finally compute the vector $\hat{y}$ (which is the best approximation of $y$) by using the projection formula shown in Equation 2, as shown below:

    $\text{proj}_S \, y = \hat{y} = \frac{y \, \cdot \, v_1}{v_1 \, \cdot \, v_1} v_1 + \frac{y \, \cdot \, v_2}{v_2 \, \cdot \, v_2} v_2$

    Equation 15: Orthogonal projection of y onto S for an orthogonal basis
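
    Since the vectors in Equation 12 are only given as an image, here is a hedged sketch with hypothetical stand-ins, showing how the coefficients $c_1$ and $c_2$ fall straight out of Equation 2:

    ```python
    import numpy as np

    # Hypothetical stand-ins (the lesson's actual v1, v2, y are in Equation 12's image)
    v1 = np.array([2.0, 5.0, -1.0])
    v2 = np.array([-2.0, 1.0, 1.0])   # v1 . v2 = -4 + 5 - 1 = 0: an orthogonal basis
    y  = np.array([1.0, 2.0, 3.0])

    c1 = np.dot(y, v1) / np.dot(v1, v1)   # coefficient of v1 in Equation 2
    c2 = np.dot(y, v2) / np.dot(v2, v2)   # coefficient of v2 in Equation 2
    y_hat = c1 * v1 + c2 * v2             # best approximation of y in Span{v1, v2}
    print(c1, c2, y_hat)
    ```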

    Example 4

    Find the closest point to $y$ in the subspace $S$ spanned by $v_1$ and $v_2$:

    Equation 16: Vectors $v_1$, $v_2$ and $y$ (from the arithmetic in Equation 17 below, $v_1 = (7, -1, -4)$ and $v_2 = (1, -1, 2)$)

    The closest point to $y$ is simply $\hat{y}$, since it is at the shortest distance from $y$ compared to any other vector in $S$, given that it is the best approximation of $y$ itself. And so, the purpose of this problem is to calculate $\hat{y}$.
    For that we will use Equation 2 once more in order to calculate the orthogonal projection of $y$ onto $S$, but we need a little more information before we can use that equation.

    First we need to check if we have an orthogonal basis for $S$.
    Here we have a subspace spanned by $v_1$ and $v_2$, which means that we have the linear combinations of $v_1$ and $v_2$ (where $v_1$ and $v_2$ are linearly independent), so that $S = \text{Span}\{v_1, v_2\}$ has basis $\{v_1, v_2\}$. In order to check if this basis is an orthogonal basis, we have to see if $v_1$ and $v_2$ are orthogonal to each other; therefore, we compute their dot product!

    $v_1 \, \cdot \, v_2 = (7)(1) + (-1)(-1) + (-4)(2) = 7 + 1 - 8 = 0$
    Equation 17: Dot product of vectors v1 and v2

    And so, we have an orthogonal basis, since the dot product above yielded a result of zero.
    So now we can calculate $\hat{y}$ using Equation 2:

    $\text{proj}_S \, y = \hat{y} = \frac{y \, \cdot \, v_1}{v_1 \, \cdot \, v_1} v_1 + \frac{y \, \cdot \, v_2}{v_2 \, \cdot \, v_2} v_2$

    Equation 18: Orthogonal projection of y onto S for an orthogonal basis
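
    A sketch of this computation in NumPy ($v_1$ and $v_2$ are the values recovered from the arithmetic in Equation 17; $y$ is a hypothetical stand-in, since the lesson's value is in an image):

    ```python
    import numpy as np

    v1 = np.array([7.0, -1.0, -4.0])   # from the arithmetic in Equation 17
    v2 = np.array([1.0, -1.0, 2.0])
    y  = np.array([1.0, 1.0, 1.0])     # hypothetical stand-in for the lesson's y

    # Equation 2 with two basis vectors gives the closest point in S to y
    y_hat = (np.dot(y, v1) / np.dot(v1, v1)) * v1 \
          + (np.dot(y, v2) / np.dot(v2, v2)) * v2
    print(y_hat)
    ```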

    Example 5

    Find the closest distance from $y$ to $S = \text{Span}\{v_1, v_2\}$ if $v_1, v_2$ and $y$ are defined as below:

    Equation 19: Vectors $v_1$, $v_2$ and $y$ (from the arithmetic in Equation 20 below, $v_1 = (-1, 1, 0)$ and $v_2 = (1, 1, 0)$)

    In this case, the closest distance from $y$ to $S$ can be graphically represented below:

    Figure 4: Closest distance from y to the span S

    Therefore the closest distance is equal to the magnitude of the difference between the vectors $y$ and $\hat{y}$; that is, the closest distance $= \; \parallel y - \hat{y} \parallel$.
    So if we want to calculate the closest distance, we need to compute $\hat{y}$, and for that we first need to check that we have an orthogonal basis; so we check by calculating the dot product of $v_1$ and $v_2$:

    $v_1 \cdot v_2 = (-1)(1) + (1)(1) + (0)(0) = -1 + 1 = 0$
    Equation 20: Dot product of v1 and v2

    Knowing that we have an orthogonal basis due to the result above, we can now compute $\hat{y}$:

    Equation 21: Finding $\hat{y}$

    And now that we have the vector $\hat{y}$, we can finally compute the length $\parallel y - \hat{y} \parallel$:

    Equation 22: Closest distance from y to S = Span{v1, v2}
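
    The distance computation can be sketched the same way ($v_1$ and $v_2$ are the values recovered from the arithmetic in Equation 20; $y$ is a hypothetical stand-in):

    ```python
    import numpy as np

    v1 = np.array([-1.0, 1.0, 0.0])   # from the arithmetic in Equation 20
    v2 = np.array([1.0, 1.0, 0.0])
    y  = np.array([2.0, 3.0, 4.0])    # hypothetical stand-in for the lesson's y

    y_hat = (np.dot(y, v1) / np.dot(v1, v1)) * v1 \
          + (np.dot(y, v2) / np.dot(v2, v2)) * v2
    dist = np.linalg.norm(y - y_hat)  # closest distance from y to S
    print(y_hat, dist)                # Span{v1, v2} is the xy-plane here, so dist = 4.0
    ```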

    ***

    And so, we have arrived at the end of our lesson; we hope you enjoyed it, and see you in the next topic!
    The Orthogonal Decomposition Theorem
    Let $S$ be a subspace in $\Bbb{R}^n$. Then each vector $y$ in $\Bbb{R}^n$ can be written as:

    $y = \hat{y} + z$

    where $\hat{y}$ is in $S$ and $z$ is in $S^{\perp}$. Note that $\hat{y}$ is the orthogonal projection of $y$ onto $S$.

    If $\{v_1, \cdots, v_p\}$ is an orthogonal basis of $S$, then

    $\text{proj}_S \, y = \hat{y} = \frac{y \cdot v_1}{v_1 \cdot v_1}v_1 + \frac{y \cdot v_2}{v_2 \cdot v_2}v_2 + \cdots + \frac{y \cdot v_p}{v_p \cdot v_p}v_p$

    However, if $\{v_1, \cdots, v_p\}$ is an orthonormal basis of $S$, then

    $\text{proj}_S \, y = \hat{y} = (y \cdot v_1)v_1 + (y \cdot v_2)v_2 + \cdots + (y \cdot v_p)v_p$

    Property of Orthogonal Projection
    If $\{v_1, \cdots, v_p\}$ is an orthogonal basis for $S$ and if $y$ happens to be in $S$, then
    $\text{proj}_S \, y = y$

    In other words, if $y$ is in $S = \text{Span}\{v_1, \cdots, v_p\}$, then $\text{proj}_S \, y = y$.

    The Best Approximation Theorem
    Let $S$ be a subspace of $\Bbb{R}^n$. Also, let $y$ be a vector in $\Bbb{R}^n$, and $\hat{y}$ be the orthogonal projection of $y$ onto $S$. Then $\hat{y}$ is the closest point in $S$ to $y$, because

    $\lVert y - \hat{y} \rVert < \lVert y - u \rVert$

    where $u$ is any vector in $S$ distinct from $\hat{y}$.
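
    A quick numeric illustration of the theorem (a minimal sketch assuming NumPy, with hypothetical values: an orthogonal basis for $S$ and a few random vectors $u$ in $S$):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    v1 = np.array([1.0, 2.0, 2.0])    # hypothetical orthogonal basis for S
    v2 = np.array([2.0, 1.0, -2.0])   # v1 . v2 = 2 + 2 - 4 = 0
    y  = np.array([4.0, 3.0, 4.0])    # hypothetical y

    y_hat = (np.dot(y, v1) / np.dot(v1, v1)) * v1 \
          + (np.dot(y, v2) / np.dot(v2, v2)) * v2

    # No u in S gets closer to y than y_hat does (equality only if u == y_hat)
    for _ in range(5):
        c1, c2 = rng.normal(size=2)
        u = c1 * v1 + c2 * v2         # a random vector in S
        assert np.linalg.norm(y - y_hat) <= np.linalg.norm(y - u)
    print("y_hat is the closest point in S to y")
    ```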