
Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

**Semi-supervised regression using Hessian energy with an application to semi-supervised dimensionality reduction.** / Kim, Kwang In; Steinke, Florian; Hein, Matthias.


Kim, KI, Steinke, F & Hein, M 2010, Semi-supervised regression using Hessian energy with an application to semi-supervised dimensionality reduction. in *Advances in Neural Information Processing Systems (NIPS).* MPI for Biological Cybernetics, Germany, pp. 979-987. <http://www.ml.uni-saarland.de/Publications/KimSteHei-SSRUsingHessianEnergy.pdf>

Kim, K. I., Steinke, F., & Hein, M. (2010). Semi-supervised regression using Hessian energy with an application to semi-supervised dimensionality reduction. In *Advances in Neural Information Processing Systems (NIPS)* (pp. 979-987). MPI for Biological Cybernetics. http://www.ml.uni-saarland.de/Publications/KimSteHei-SSRUsingHessianEnergy.pdf

Kim KI, Steinke F, Hein M. Semi-supervised regression using Hessian energy with an application to semi-supervised dimensionality reduction. In Advances in Neural Information Processing Systems (NIPS). Germany: MPI for Biological Cybernetics. 2010. p. 979-987.

@inproceedings{147a89e68c8d4a7c9944c5ecc9be6a76,

title = "Semi-supervised regression using Hessian energy with an application to semi-supervised dimensionality reduction",

abstract = "Semi-supervised regression based on the graph Laplacian suffers from the fact that the solution is biased towards a constant and the lack of extrapolating power. Based on these observations, we propose to use the second-order Hessian energy for semi-supervised regression which overcomes both these problems. If the data lies on or close to a low-dimensional submanifold in feature space, the Hessian energy prefers functions whose values vary linearly with respect to geodesic distance. We first derive the Hessian energy for smooth manifolds and continue to give a stable estimation procedure for the common case where only samples of the underlying manifold are given. The preference of {\textquoteleft}linear{\textquoteright} functions on manifolds renders the Hessian energy particularly suited for the task of semi-supervised dimensionality reduction, where the goal is to find a user-defined embedding function given some labeled points which varies smoothly (and ideally linearly) along the manifold. The experimental results suggest superior performance of our method compared with semi-supervised regression using Laplacian regularization or standard supervised regression techniques applied to this task.",

author = "Kim, {Kwang In} and Florian Steinke and Matthias Hein",

year = "2010",

language = "English",

pages = "979--987",

booktitle = "Advances in Neural Information Processing Systems (NIPS)",

publisher = "MPI for Biological Cybernetics",

}

TY - GEN

T1 - Semi-supervised regression using Hessian energy with an application to semi-supervised dimensionality reduction

AU - Kim, Kwang In

AU - Steinke, Florian

AU - Hein, Matthias

PY - 2010

Y1 - 2010

N2 - Semi-supervised regression based on the graph Laplacian suffers from the fact that the solution is biased towards a constant and the lack of extrapolating power. Based on these observations, we propose to use the second-order Hessian energy for semi-supervised regression which overcomes both these problems. If the data lies on or close to a low-dimensional submanifold in feature space, the Hessian energy prefers functions whose values vary linearly with respect to geodesic distance. We first derive the Hessian energy for smooth manifolds and continue to give a stable estimation procedure for the common case where only samples of the underlying manifold are given. The preference of ‘linear’ functions on manifolds renders the Hessian energy particularly suited for the task of semi-supervised dimensionality reduction, where the goal is to find a user-defined embedding function given some labeled points which varies smoothly (and ideally linearly) along the manifold. The experimental results suggest superior performance of our method compared with semi-supervised regression using Laplacian regularization or standard supervised regression techniques applied to this task.

AB - Semi-supervised regression based on the graph Laplacian suffers from the fact that the solution is biased towards a constant and the lack of extrapolating power. Based on these observations, we propose to use the second-order Hessian energy for semi-supervised regression which overcomes both these problems. If the data lies on or close to a low-dimensional submanifold in feature space, the Hessian energy prefers functions whose values vary linearly with respect to geodesic distance. We first derive the Hessian energy for smooth manifolds and continue to give a stable estimation procedure for the common case where only samples of the underlying manifold are given. The preference of ‘linear’ functions on manifolds renders the Hessian energy particularly suited for the task of semi-supervised dimensionality reduction, where the goal is to find a user-defined embedding function given some labeled points which varies smoothly (and ideally linearly) along the manifold. The experimental results suggest superior performance of our method compared with semi-supervised regression using Laplacian regularization or standard supervised regression techniques applied to this task.

M3 - Conference contribution/Paper

SP - 979

EP - 987

BT - Advances in Neural Information Processing Systems (NIPS)

PB - MPI for Biological Cybernetics

CY - Germany

ER -