Kernel Machine Based Feature Extraction
Algorithms for Regression Problems
Csaba Szepesvári, András Kocsor,
Kornél Kovács
In this paper we consider two novel kernel machine based feature
extraction algorithms in a regression setting. The first method is
derived from the principles underlying the recently introduced
Maximum Margin Discrimination Analysis (MMDA) algorithm. Here, however,
it is shown that the orthogonalization principle employed by the original
MMDA algorithm can be motivated by the well-known ambiguity decomposition,
thus providing firm grounds for the good performance of the algorithm.
The second algorithm combines kernel machines with average derivative
estimation and is derived from the assumption that the true regression
function depends only on a subspace of the original input space. The
proposed algorithms are evaluated in preliminary experiments conducted
on artificial and real datasets.
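
For reference, the ambiguity decomposition invoked for the first algorithm is the standard one for convex combinations of predictors (the notation below is illustrative and not taken from the paper): if predictors $f_1,\dots,f_K$ are combined as $\bar f(x)=\sum_{k} w_k f_k(x)$ with $w_k\ge 0$ and $\sum_{k} w_k = 1$, then
\[
\bigl(\bar f(x) - y\bigr)^2
  \;=\; \sum_{k=1}^{K} w_k \bigl(f_k(x) - y\bigr)^2
  \;-\; \sum_{k=1}^{K} w_k \bigl(f_k(x) - \bar f(x)\bigr)^2 ,
\]
so that, for a fixed level of individual accuracy, increasing the diversity (ambiguity) of the combined components lowers the error of the combination; this is the sense in which an orthogonalization step can be beneficial.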
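
Likewise, a minimal sketch of the subspace assumption behind the second algorithm (again with illustrative notation): the regression function is assumed to have the form
\[
f(x) \;=\; g\bigl(B^{\top} x\bigr), \qquad B \in \mathbb{R}^{d \times k}, \; k < d ,
\]
so that $\nabla f(x) = B\,\nabla g\bigl(B^{\top} x\bigr)$ lies in the column space of $B$ for every $x$. Consequently the average derivative $\delta = \mathbb{E}\bigl[\nabla f(X)\bigr]$, and estimated gradients more generally, point into the relevant subspace, which is the property exploited by average derivative estimation.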