Decomposition of Vector Spaces

by ayushkhaitan3437


Let E_1, E_2,\dots, E_s be linear transformations on an n-dimensional vector space V such that I=E_1+E_2+\dots+E_s and E_iE_j=0 for i\neq j. Then V=E_1V\oplus E_2V\oplus\dots\oplus E_sV.

How does this happen? Take the expression I=E_1+E_2+\dots+E_s and apply both sides to any v\in V. We see that v=E_1v+E_2v+\dots+E_sv. Hence any vector v can be expressed as a sum of elements of E_iV for i\in\{1,2,\dots,s\}.
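As a quick numerical sanity check (my own illustration, not part of the original argument), take two complementary coordinate projections on R^3 and verify both hypotheses and the resulting splitting of a vector:

```python
import numpy as np

# Two projections onto complementary coordinate subspaces of R^3.
# The matrices are an illustrative choice; any family with
# I = E1 + E2 and E1 E2 = E2 E1 = 0 behaves the same way.
E1 = np.diag([1.0, 1.0, 0.0])  # projects onto span(e1, e2)
E2 = np.diag([0.0, 0.0, 1.0])  # projects onto span(e3)

I = np.eye(3)
assert np.allclose(E1 + E2, I)   # I = E1 + E2
assert np.allclose(E1 @ E2, 0)   # E_i E_j = 0 for i != j

# Applying I = E1 + E2 to v splits it into components in E1 V and E2 V.
v = np.array([3.0, -1.0, 5.0])
assert np.allclose(v, E1 @ v + E2 @ v)
```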

Why do we have a direct sum decomposition? Let v_1+v_2+\dots+v_s=0 with v_i\in E_iV. Consider E_k(v_1+v_2+\dots+v_s)=E_k(0)=0. For any v_i with i\neq k, we have v_i=E_iv for some v\in V, so E_kv_i=E_kE_iv=0\cdot v=0. Hence the equation above reduces to E_kv_k=0. Now v_k=E_kv' for some v'\in V, so E_kv_k=E_k^2v'. Note that E_k^2=E_k (just multiply the expression I=E_1+E_2+\dots+E_s by E_k on both sides; all cross terms E_kE_i with i\neq k vanish). Hence 0=E_kv_k=E_kv'=v_k. This holds for all k\in\{1,2,\dots,s\}, so every v_i=0, which proves that the sum is direct: V=E_1V\oplus E_2V\oplus\dots\oplus E_sV.
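The idempotence step can be seen concretely even for non-orthogonal projections. Here is a sketch with an oblique pair on R^2 (the matrices are my own example): multiplying I = E1 + E2 by E1 kills the cross term and forces E1^2 = E1, exactly as in the proof.

```python
import numpy as np

# An oblique (non-symmetric) pair of projections on R^2 -- my own example,
# chosen so that I = E1 + E2 and E1 E2 = E2 E1 = 0.
E1 = np.array([[1.0, 1.0],
               [0.0, 0.0]])
E2 = np.array([[0.0, -1.0],
               [0.0,  1.0]])

assert np.allclose(E1 + E2, np.eye(2))
assert np.allclose(E1 @ E2, 0) and np.allclose(E2 @ E1, 0)

# Multiplying I = E1 + E2 on the left by E1 kills the cross term E1 E2,
# leaving E1 = E1^2 -- the idempotence used in the direct-sum argument.
assert np.allclose(E1, E1 @ (E1 + E2))
assert np.allclose(E1 @ E1, E1)
```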

Why is all this relevant? Because using the minimal polynomial f(x)=p_1(x)^{e_1}\dots p_s(x)^{e_s} of any transformation T\in L(V,V), we can construct such E_i's satisfying the above two conditions, and can hence decompose the vector space as a direct sum of s subspaces. Moreover, these subspaces have the additional property that they're T-invariant. Writing f_i(x)=f(x)/p_i(x)^{e_i}, the polynomials f_1,\dots,f_s have no common factor, so there exist polynomials g_i with g_1f_1+\dots+g_sf_s=1; each E_i is then g_i(T)f_i(T). Since each E_i is a polynomial in T, it commutes with T, which is why the subspaces E_iV are T-invariant.
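A small concrete instance of this construction (the matrix T is my own illustrative choice): when the minimal polynomial has distinct linear factors, say (x-1)(x-2), the recipe E_i = g_i(T)f_i(T) reduces to Lagrange interpolation at the eigenvalues.

```python
import numpy as np

# T has minimal polynomial (x - 1)(x - 2); an illustrative choice.
T = np.array([[1.0, 1.0],
              [0.0, 2.0]])
I = np.eye(2)

# With distinct linear factors the projections become
# E_i = prod_{j != i} (T - lambda_j I) / (lambda_i - lambda_j):
E1 = (T - 2 * I) / (1 - 2)   # projection onto ker(T - I)
E2 = (T - 1 * I) / (2 - 1)   # projection onto ker(T - 2I)

assert np.allclose(E1 + E2, I)    # I = E1 + E2
assert np.allclose(E1 @ E2, 0)    # E_i E_j = 0 for i != j
assert np.allclose(E1 @ E1, E1)   # idempotent

# Each E_i is a polynomial in T, so it commutes with T,
# and the subspaces E_i V are T-invariant.
assert np.allclose(T @ E1, E1 @ T)
```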
