
Electronic data

  • TactileMeshSaliency_SIGGRAPH

    Rights statement: "© ACM, 2016. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM Transactions on Graphics, Vol. 35, No. 4, July 2016, http://doi.acm.org/10.1145/2897824.2925927"

    Accepted author manuscript, 11.9 MB, PDF document

Links

Text available via DOI: https://doi.org/10.1145/2897824.2925927


Tactile mesh saliency

Research output: Contribution to journal › Journal article › peer-review

Published

Standard

Tactile mesh saliency. / Lau, Manfred; Dev, Kapil; Shi, Weiqi; Dorsey, Julie; Rushmeier, Holly.

In: ACM Transactions on Graphics, Vol. 35, No. 4, a52, 11.07.2016.


Harvard

Lau, M, Dev, K, Shi, W, Dorsey, J & Rushmeier, H 2016, 'Tactile mesh saliency', ACM Transactions on Graphics, vol. 35, no. 4, a52. https://doi.org/10.1145/2897824.2925927

APA

Lau, M., Dev, K., Shi, W., Dorsey, J., & Rushmeier, H. (2016). Tactile mesh saliency. ACM Transactions on Graphics, 35(4), [a52]. https://doi.org/10.1145/2897824.2925927

Vancouver

Lau M, Dev K, Shi W, Dorsey J, Rushmeier H. Tactile mesh saliency. ACM Transactions on Graphics. 2016 Jul 11;35(4). a52. https://doi.org/10.1145/2897824.2925927

Author

Lau, Manfred ; Dev, Kapil ; Shi, Weiqi ; Dorsey, Julie ; Rushmeier, Holly. / Tactile mesh saliency. In: ACM Transactions on Graphics. 2016 ; Vol. 35, No. 4.

Bibtex

@article{f4d3b3c062a54f11a710e5965f23683d,
title = "Tactile mesh saliency",
abstract = "While the concept of visual saliency has been previously explored in the areas of mesh and image processing, saliency detection also applies to other sensory stimuli. In this paper, we explore the problem of tactile mesh saliency, where we define salient points on a virtual mesh as those that a human is more likely to grasp, press, or touch if the mesh were a real-world object. We solve the problem of taking as input a 3D mesh and computing the relative tactile saliency of every mesh vertex. Since it is difficult to manually define a tactile saliency measure, we introduce a crowdsourcing and learning framework. It is typically easy for humans to provide relative rankings of saliency between vertices rather than absolute values. We thereby collect crowdsourced data of such relative rankings and take a learning-to-rank approach. We develop a new formulation to combine deep learning and learning-to-rank methods to compute a tactile saliency measure. We demonstrate our framework with a variety of 3D meshes and various applications including material suggestion for rendering and fabrication.",
keywords = "saliency, deep learning, perception, crowdsourcing, fabrication material suggestion",
author = "Manfred Lau and Kapil Dev and Weiqi Shi and Julie Dorsey and Holly Rushmeier",
year = "2016",
month = jul,
day = "11",
doi = "10.1145/2897824.2925927",
language = "English",
volume = "35",
journal = "ACM Transactions on Graphics",
issn = "0730-0301",
publisher = "Association for Computing Machinery (ACM)",
number = "4",
}

RIS

TY - JOUR

T1 - Tactile mesh saliency

AU - Lau, Manfred

AU - Dev, Kapil

AU - Shi, Weiqi

AU - Dorsey, Julie

AU - Rushmeier, Holly

PY - 2016/7/11

Y1 - 2016/7/11

N2 - While the concept of visual saliency has been previously explored in the areas of mesh and image processing, saliency detection also applies to other sensory stimuli. In this paper, we explore the problem of tactile mesh saliency, where we define salient points on a virtual mesh as those that a human is more likely to grasp, press, or touch if the mesh were a real-world object. We solve the problem of taking as input a 3D mesh and computing the relative tactile saliency of every mesh vertex. Since it is difficult to manually define a tactile saliency measure, we introduce a crowdsourcing and learning framework. It is typically easy for humans to provide relative rankings of saliency between vertices rather than absolute values. We thereby collect crowdsourced data of such relative rankings and take a learning-to-rank approach. We develop a new formulation to combine deep learning and learning-to-rank methods to compute a tactile saliency measure. We demonstrate our framework with a variety of 3D meshes and various applications including material suggestion for rendering and fabrication.

AB - While the concept of visual saliency has been previously explored in the areas of mesh and image processing, saliency detection also applies to other sensory stimuli. In this paper, we explore the problem of tactile mesh saliency, where we define salient points on a virtual mesh as those that a human is more likely to grasp, press, or touch if the mesh were a real-world object. We solve the problem of taking as input a 3D mesh and computing the relative tactile saliency of every mesh vertex. Since it is difficult to manually define a tactile saliency measure, we introduce a crowdsourcing and learning framework. It is typically easy for humans to provide relative rankings of saliency between vertices rather than absolute values. We thereby collect crowdsourced data of such relative rankings and take a learning-to-rank approach. We develop a new formulation to combine deep learning and learning-to-rank methods to compute a tactile saliency measure. We demonstrate our framework with a variety of 3D meshes and various applications including material suggestion for rendering and fabrication.

KW - saliency

KW - deep learning

KW - perception

KW - crowdsourcing

KW - fabrication material suggestion

U2 - 10.1145/2897824.2925927

DO - 10.1145/2897824.2925927

M3 - Journal article

VL - 35

JO - ACM Transactions on Graphics

JF - ACM Transactions on Graphics

SN - 0730-0301

IS - 4

M1 - a52

ER -
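
The abstract describes learning per-vertex saliency scores from crowdsourced relative rankings, i.e. a learning-to-rank formulation. The sketch below is not the authors' method (which combines a deep network over mesh patches with learning-to-rank); it is a minimal, self-contained illustration of the underlying pairwise idea, using a RankNet-style logistic loss on synthetic per-vertex features. All names, the linear scoring model, and the toy data are assumptions made for illustration only.

```python
import math
import random

random.seed(0)

# Toy stand-in for per-vertex geometric features; the paper learns from
# 3D mesh patches with a deep network, but a linear scorer suffices to
# illustrate the pairwise learning-to-rank principle.
n_vertices, n_features = 200, 4
X = [[random.gauss(0, 1) for _ in range(n_features)] for _ in range(n_vertices)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Hidden "true" saliency, used only to simulate crowdsourced pair labels.
w_true = [1.5, -2.0, 0.5, 0.0]
s_true = [dot(x, w_true) for x in X]

# Crowdsourced-style training data: pairs (i, j) where vertex i was
# judged more salient (more likely to be grasped/pressed) than vertex j.
pairs = []
for _ in range(2000):
    i, j = random.sample(range(n_vertices), 2)
    pairs.append((i, j) if s_true[i] > s_true[j] else (j, i))

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Linear score f(x) = w.x, trained by gradient descent on the
# RankNet-style pairwise logistic loss  L = -log sigmoid(f_i - f_j).
w = [0.0] * n_features
lr = 0.1
for _ in range(200):
    grad = [0.0] * n_features
    for i, j in pairs:
        p = sigmoid(dot(X[i], w) - dot(X[j], w))  # P(i ranked above j)
        g = p - 1.0                               # dL / d(f_i - f_j)
        for k in range(n_features):
            grad[k] += g * (X[i][k] - X[j][k])
    for k in range(n_features):
        w[k] -= lr * grad[k] / len(pairs)

# Fraction of training pairs the learned scores order correctly.
scores = [dot(x, w) for x in X]
correct = sum(scores[i] > scores[j] for i, j in pairs)
print(f"pairwise ranking accuracy: {correct / len(pairs):.2f}")
```

The key property this demonstrates is the one the abstract relies on: annotators only provide relative judgments between pairs of vertices, yet optimizing a pairwise loss still recovers a consistent absolute saliency ordering.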