Abstract:
One of the most widely used dimensionality reduction methods is Principal Component
Analysis (PCA). However, PCA is sensitive to outliers, and even a small number of them
can bias the resulting subspace. A number of techniques exist for robustifying PCA; we
focus on the version introduced in [30]. The numerical optimization technique in [30]
relied on the Iteratively Reweighted Least Squares (IRLS) method. In the present paper
we adopt the Conjugate Gradient Descent algorithm with orthogonal matrix constraints
from [18] to solve the resulting nonconvex matrix optimization problem. We discuss the
computational and convergence issues that arise and compare the effectiveness of the
two methods.