Classification using a sparse combination
of basis functions
Kornél Kovács, András Kocsor
Combinations of basis functions are applied here to generate and solve a convex reformulation of several well-known machine learning algorithms, such as certain variants of boosting methods and Support Vector Machines. We call such a reformulation a Convex Networks (CN) approach. As we prove, the nonlinear Gauss-Seidel iteration process for solving the CN problem converges globally and quickly. A major property of the CN solution is its sparsity, measured as the number of basis functions with nonzero coefficients. The sparsity of the method can be controlled effectively by heuristics inspired by techniques from linear algebra. Numerical results and comparisons on publicly available datasets demonstrate the effectiveness of the proposed methods. As a consequence, the CN approach can perform learning tasks using far fewer basis functions and generate sparse solutions.
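To illustrate the coordinate-wise (nonlinear Gauss-Seidel) style of iteration referred to above, the sketch below cyclically updates one basis-function coefficient at a time while minimizing a convex, sparsity-inducing objective. The basis choice, the particular objective, and all names (rbf_basis, fit_cn, lam, gamma) are illustrative assumptions, not the authors' formulation.

# Minimal sketch (not the authors' implementation): coordinate-wise
# (Gauss-Seidel style) minimization of a convex regularized squared loss
# over basis-function coefficients alpha, using a Gaussian (RBF) basis.
# The soft-thresholding step drives many coefficients to exactly zero,
# which is the kind of sparsity the abstract describes.
import numpy as np

def rbf_basis(X, centers, gamma=1.0):
    """Evaluate Gaussian basis functions centered at 'centers' on X."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)                     # shape (n_samples, n_basis)

def fit_cn(X, y, centers, lam=0.1, gamma=1.0, sweeps=100, tol=1e-6):
    """Minimize 0.5*||y - Phi alpha||^2 + lam*||alpha||_1 by cycling
    through the coefficients, one exact scalar update at a time."""
    Phi = rbf_basis(X, centers, gamma)
    n, m = Phi.shape
    alpha = np.zeros(m)
    r = y - Phi @ alpha                            # current residual
    col_sq = (Phi ** 2).sum(axis=0)
    for _ in range(sweeps):
        max_change = 0.0
        for j in range(m):
            if col_sq[j] == 0.0:
                continue
            old = alpha[j]
            rho = Phi[:, j] @ r + col_sq[j] * old  # partial correlation
            new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            if new != old:
                r -= (new - old) * Phi[:, j]       # keep residual in sync
                alpha[j] = new
                max_change = max(max_change, abs(new - old))
        if max_change < tol:                       # negligible change: stop
            break
    return alpha

A smaller lam yields denser solutions, while a larger lam zeroes out more coefficients, mimicking the kind of sparsity control the abstract attributes to the CN heuristics.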