The Gram-Schmidt Algorithm
In any inner product space, we can choose the basis in which to
work. It often greatly simplifies calculations to work in an
orthogonal basis. For one thing, if S = {v_{1}, v_{2}, …, v_{n}} is an orthogonal basis for an inner product space V,
then it is a simple matter to express any vector w ∈ V as a linear
combination of the vectors in S:
w = [<w,v_{1}> / ||v_{1}||^{2}] v_{1} + [<w,v_{2}> / ||v_{2}||^{2}] v_{2} + … + [<w,v_{n}> / ||v_{n}||^{2}] v_{n}.

That is, w has coordinates

( <w,v_{1}> / ||v_{1}||^{2}, <w,v_{2}> / ||v_{2}||^{2}, …, <w,v_{n}> / ||v_{n}||^{2} )^{T}

relative to the basis S.
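Because the basis is orthogonal, each coordinate can be computed independently with a single inner product, with no system of equations to solve. A minimal sketch in Python, assuming the Euclidean inner product on R^{n} (the function names are illustrative):

```python
def dot(a, b):
    """Euclidean inner product of two vectors given as lists of numbers."""
    return sum(x * y for x, y in zip(a, b))

def coordinates(w, basis):
    """Coordinates of w relative to an ORTHOGONAL basis:
    the i-th coordinate is <w, v_i> / ||v_i||^2."""
    return [dot(w, v) / dot(v, v) for v in basis]

# {(1, 1), (1, -1)} is an orthogonal basis for R^2.
coords = coordinates([3, 1], [[1, 1], [1, -1]])
# coords == [2.0, 1.0], since (3, 1) = 2(1, 1) + 1(1, -1)
```

Note that this formula is only valid for an orthogonal basis; for a general basis, the coordinates would require solving a linear system.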


Given an arbitrary basis {u_{1}, u_{2}, …, u_{n}} for an
n-dimensional inner product space V, the Gram-Schmidt
algorithm constructs an orthogonal basis {v_{1}, v_{2}, …, v_{n}} for V:
Step 1: Let v_{1} = u_{1}.

Step 2: Let v_{2} = u_{2} - proj_{W_{1}}u_{2} = u_{2} - [<u_{2},v_{1}> / ||v_{1}||^{2}] v_{1}, where W_{1} is the space spanned by v_{1}, and
proj_{W_{1}}u_{2} is the orthogonal projection of u_{2} on W_{1}.

Step 3: Let v_{3} = u_{3} - proj_{W_{2}}u_{3} = u_{3} - [<u_{3},v_{1}> / ||v_{1}||^{2}] v_{1} - [<u_{3},v_{2}> / ||v_{2}||^{2}] v_{2}, where W_{2} is
the space spanned by v_{1} and v_{2}.

Step 4: Let v_{4} = u_{4} - proj_{W_{3}}u_{4} = u_{4} - [<u_{4},v_{1}> / ||v_{1}||^{2}] v_{1} - [<u_{4},v_{2}> / ||v_{2}||^{2}] v_{2} - [<u_{4},v_{3}> / ||v_{3}||^{2}] v_{3}, where W_{3} is
the space spanned by v_{1}, v_{2} and v_{3}.


...

Continue this process up to v_{n}. The resulting orthogonal set
{v_{1}, v_{2}, …, v_{n}} consists of n linearly
independent vectors in V and so forms an orthogonal basis for V.
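The steps above all follow one pattern: from each u_{k}, subtract its projection onto every previously constructed v_{i}. A minimal sketch in Python, assuming the Euclidean inner product (the function names are illustrative):

```python
def dot(a, b):
    """Euclidean inner product of two vectors given as lists of numbers."""
    return sum(x * y for x, y in zip(a, b))

def gram_schmidt(basis):
    """Orthogonalize a list of linearly independent vectors:
    v_k = u_k - sum over i < k of [<u_k, v_i> / ||v_i||^2] v_i."""
    ortho = []
    for u in basis:
        v = list(u)
        for w in ortho:
            coeff = dot(u, w) / dot(w, w)  # <u_k, v_i> / ||v_i||^2
            v = [vi - coeff * wi for vi, wi in zip(v, w)]
        ortho.append(v)
    return ortho

# Orthogonalize the basis used in the example below:
# gram_schmidt([[1, 1, 1], [1, 0, 1], [1, -1, 2]])
# returns [[1, 1, 1], [1/3, -2/3, 1/3], [-1/2, 0, 1/2]] (up to rounding)
```

This is the "classical" formulation that mirrors the steps above; in floating-point arithmetic, the variant that projects against the partially updated v (modified Gram-Schmidt) is numerically more stable.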
Notes
- To obtain an orthonormal basis for an inner product space
V, use the Gram-Schmidt algorithm to construct an orthogonal basis.
Then simply normalize each vector in the basis.
- For R^{n} with the Euclidean inner product (dot product), we
of course already know of the orthonormal basis {(1,0,0,…,0), (0,1,0,…,0), …, (0,…,0,1)}. For more abstract spaces, however, the existence of an
orthonormal basis is not obvious. The Gram-Schmidt algorithm is
powerful in that it not only guarantees the existence of an
orthonormal basis for any inner product space, but actually gives the
construction of such a basis.
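Normalizing the orthogonal basis is a single extra pass: divide each vector by its length. A minimal Python sketch, assuming the Euclidean norm (the helper name is illustrative):

```python
import math

def normalize(v):
    """Scale v to unit length: v / ||v||."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

# {(1, 1, 1), (1, -1, 0)} is an orthogonal set; normalizing each
# vector yields an orthonormal set spanning the same subspace.
orthonormal = [normalize(v) for v in [[1, 1, 1], [1, -1, 0]]]
```

Normalizing after orthogonalizing does not disturb orthogonality, since scaling a vector only scales its inner products.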
Example
Let V = R^{3} with the Euclidean inner product. We will apply the
Gram-Schmidt algorithm to orthogonalize the basis {(1,1,1), (1,0,1), (1,-1,2)}.

Step 1: v_{1} = (1,1,1).
Step 2: v_{2} = (1,0,1) - [ (1,0,1)·(1,1,1) / ||(1,1,1)||^{2} ] (1,1,1)

= (1,0,1) - (2/3)(1,1,1)

= (1/3, -2/3, 1/3).
Step 3: v_{3} = (1,-1,2) - [ (1,-1,2)·(1,1,1) / ||(1,1,1)||^{2} ] (1,1,1) - [ (1,-1,2)·(1/3,-2/3,1/3) / ||(1/3,-2/3,1/3)||^{2} ] (1/3,-2/3,1/3)

= (1,-1,2) - (2/3)(1,1,1) - (5/2)(1/3,-2/3,1/3)

= (-1/2, 0, 1/2).
You can verify that {(1,1,1), (1/3,-2/3,1/3), (-1/2,0,1/2)} forms an orthogonal
basis for R^{3}. Normalizing the vectors in the orthogonal basis,
we obtain the orthonormal basis





{ (√3/3, √3/3, √3/3), (√6/6, -√6/3, √6/6), (-√2/2, 0, √2/2) }.
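As a quick numerical check, a short Python sketch (assuming the Euclidean dot product) can confirm that the final basis is orthonormal, i.e. that the pairwise inner products vanish and every vector has norm 1:

```python
import math

def dot(a, b):
    """Euclidean inner product of two vectors given as lists of numbers."""
    return sum(x * y for x, y in zip(a, b))

s3, s6, s2 = math.sqrt(3), math.sqrt(6), math.sqrt(2)
Q = [
    [s3 / 3, s3 / 3, s3 / 3],   # normalized (1, 1, 1)
    [s6 / 6, -s6 / 3, s6 / 6],  # normalized (1/3, -2/3, 1/3)
    [-s2 / 2, 0.0, s2 / 2],     # normalized (-1/2, 0, 1/2)
]

# <q_i, q_j> should be 1 when i == j and 0 otherwise.
for i in range(3):
    for j in range(3):
        expected = 1.0 if i == j else 0.0
        assert abs(dot(Q[i], Q[j]) - expected) < 1e-12
```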

Key Concept
Given an arbitrary basis {u_{1}, u_{2}, …, u_{n}}
for an n-dimensional inner product space V, the Gram-Schmidt
algorithm constructs an orthogonal basis {v_{1}, v_{2}, …, v_{n}} for V:
Step 1: Let v_{1} = u_{1}.

Step 2: Let v_{2} = u_{2} - [<u_{2},v_{1}> / ||v_{1}||^{2}] v_{1}.

Step 3: Let v_{3} = u_{3} - [<u_{3},v_{1}> / ||v_{1}||^{2}] v_{1} - [<u_{3},v_{2}> / ||v_{2}||^{2}] v_{2}.
