The concept of a vector subspace is very simple.
It is a subset of vectors that is closed under addition and scalar multiplication.
A classic example is a plane in three-dimensional space.
A plane is a subspace because if you combine vectors
lying on the plane, you will remain in the plane.
You will never be able to leave the plane and reach into the third dimension.
This is actually the theme of a very famous book called Flatland,
published at the end of the 19th century,
which I encourage you to check out if you don't know it already.
We can extend the concept of subspace to more
complicated vector spaces such as function vector spaces.
Take, for instance, the set of symmetric (even) functions over an interval centered at zero.
This is a subspace in the sense that,
whenever we take a linear combination of symmetric functions over this interval,
we will always end up with a symmetric function.
Here, for example, you have cos(5t).
We take a second function, cos(5πt); both are symmetric around 0.
And if we sum them together, the result, of course, will be symmetric as well.
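This closure property is easy to check numerically. The sketch below samples the two cosines mentioned above on a grid symmetric around zero and verifies that each function, and their sum, equals its own mirror image (the grid and sample count are arbitrary choices for illustration):

```python
import numpy as np

# Sample both functions on a grid symmetric around 0.
t = np.linspace(-1, 1, 201)
f = np.cos(5 * t)           # even: f(-t) == f(t)
g = np.cos(5 * np.pi * t)   # even as well
h = f + g                   # a linear combination of the two

# Each function equals its mirror image, and so does the sum:
print(np.allclose(f, f[::-1]))  # True
print(np.allclose(g, g[::-1]))  # True
print(np.allclose(h, h[::-1]))  # True
```

Reversing the sample array (`f[::-1]`) corresponds to evaluating the function at −t, so these checks test evenness directly.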
Subspaces have their own bases, so, for instance,
consider the plane in three-dimensional space.
So the three basis vectors in the canonical basis are e0,
e1 and e2, and if we consider just this plane here,
our basis for the plane will be e0 and e1.
And here we get to the problem of approximation.
Suppose you have a full-fledged vector x in your vector space,
in this case a vector in three-dimensional space,
and a subspace S; let's take, for instance, the plane here.
And the question is: how do I approximate x using only elements from this subspace?
Intuitively, the answer is: I take the projection of this vector
onto the subspace, and this will be my approximation.
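This geometric picture can be sketched numerically. Assuming the plane spanned by the canonical vectors e0 and e1, projecting a 3D vector onto it simply keeps the first two components; the example vector is my own choice:

```python
import numpy as np

x = np.array([2.0, -1.0, 3.0])   # a vector in 3D space
e0 = np.array([1.0, 0.0, 0.0])   # basis for the plane (subspace S)
e1 = np.array([0.0, 1.0, 0.0])

# Orthogonal projection: keep the components of x along e0 and e1.
x_hat = np.dot(x, e0) * e0 + np.dot(x, e1) * e1
print(x_hat)  # [ 2. -1.  0.]

# The approximation error is orthogonal to the plane:
print(np.dot(x - x_hat, e0), np.dot(x - x_hat, e1))  # 0.0 0.0
```

The error vector x − x̂ points purely along the third dimension, which is exactly why the projection is the best approximation within the plane.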
The power of the vector space paradigm for signal processing is that we can extend
this geometric intuition to arbitrarily complex vector spaces.
So now we will define the concept of orthogonal projection for
approximation in abstract terms.
Consider an orthonormal basis for the subspace S; call it {s_k}.
The orthogonal projection of the vector x onto this subspace is defined as x̂ = Σ_k ⟨s_k, x⟩ s_k.
This is like a partial basis expansion, where the expansion coefficients are the
inner products between the original vector and the basis vectors of the subspace.
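As a minimal sketch of this abstract definition, the helper below (a hypothetical function name, not from the lecture) computes the projection x̂ = Σ_k ⟨s_k, x⟩ s_k for any list of orthonormal basis vectors; the subspace of R⁴ is an assumed example:

```python
import numpy as np

def project(x, basis):
    """Orthogonal projection of x onto span(basis),
    where basis is a list of orthonormal vectors s_k."""
    return sum(np.dot(s, x) * s for s in basis)

# An orthonormal basis for a 2D subspace of R^4 (assumed example):
s0 = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2)
s1 = np.array([0.0, 0.0, 1.0, -1.0]) / np.sqrt(2)

x = np.array([3.0, 1.0, 2.0, 2.0])
x_hat = project(x, [s0, s1])
print(x_hat)  # [2. 2. 0. 0.]
```

Note that orthonormality of the basis is essential here; for a non-orthonormal basis the coefficients would have to be computed differently.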