THESIS
2013
ix, 36 pages : illustrations ; 30 cm
Abstract
In this thesis, we considered two high-dimensional regression problems. In the first part, we studied a linear regression problem in a high-dimensional setting where the covariates can be ordered in some meaningful way. We proposed a so-called spline-lasso (with thresholding) to better capture the different effects within the influential grouped variables as well as to improve the feature selection ability.
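The abstract does not spell out the spline-lasso objective. As a minimal sketch only, assuming tuning parameters $\lambda_1, \lambda_2$ and a second-difference smoothness term that are illustrative rather than taken from the thesis, a penalty of this general flavour couples a lasso term with a spline-like roughness term on the ordered coefficients:

\[
\hat{\beta} \;=\; \arg\min_{\beta}\;
\tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2
\;+\; \lambda_1 \sum_{j=1}^{p} \lvert \beta_j \rvert
\;+\; \tfrac{\lambda_2}{2} \sum_{j=2}^{p-1}
\bigl( \beta_{j-1} - 2\beta_j + \beta_{j+1} \bigr)^2 .
\]

Under such a penalty, adjacent coefficients of the ordered covariates are encouraged to vary smoothly while the lasso term keeps the fit sparse; a subsequent thresholding step can zero out small estimated coefficients to sharpen variable selection.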
In the second part, we considered a binary classification problem and proposed a new classification method to find a separating hyperplane. The method assigned a distance-based decaying weight to each training observation: typically, a correctly classified point near the separating hyperplane would receive a higher weight than one far away from the boundary, while misclassified points would be penalized so that the classification rule would enjoy good generalization performance. To solve for the optimal separating hyperplane, the majorization-minimization (MM) algorithm was applied.
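As a minimal sketch of how one MM iteration could look for such a weighted hyperplane problem, assuming a linear decision function $f(x) = w^{\top} x + b$, a decay parameter $\gamma > 0$, a hinge loss, and a ridge term, all of which are illustrative choices rather than the thesis's exact formulation: at iteration $t$ the weights are frozen at the current fit and a weighted convex problem is solved,

\[
v_i^{(t)} = \exp\!\bigl(-\gamma\,\lvert w^{(t)\top} x_i + b^{(t)} \rvert\bigr),
\qquad
(w^{(t+1)}, b^{(t+1)}) \;=\; \arg\min_{w,b}\;
\sum_{i=1}^{n} v_i^{(t)}\,\bigl[\,1 - y_i\,(w^{\top} x_i + b)\,\bigr]_{+}
\;+\; \lambda \lVert w \rVert_2^2 .
\]

In an MM scheme, each step minimizes a surrogate that matches the original objective at the current iterate and bounds it from above elsewhere, so the objective value is non-increasing across iterations; the frozen-weight problem above is one natural way such a surrogate could arise.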