Tactile Mesh Saliency: a brief synopsis

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Publication status: Published
Publication date: 15/09/2016
Host publication: Computer Graphics and Visual Computing (CGVC)
Editors: Cagatay Turkay, Tao Ruan Wan
Publisher: The Eurographics Association
ISBN (print): 9783038680222
Original language: English

Abstract

This work has previously been published [LDS 16], and this extended abstract provides a synopsis for further discussion at the UK CGVC 2016 conference. We introduce the concept of tactile mesh saliency, where tactile salient points on a virtual mesh are those that a human is more likely to grasp, press, or touch if the mesh were a real-world object. We solve the problem of taking a 3D mesh as input and computing the tactile saliency of every mesh vertex. The key to solving this problem lies in a new formulation that combines deep learning and learning-to-rank methods to compute a tactile saliency measure. Finally, we discuss possibilities for future work.
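
To make the learning-to-rank idea in the abstract concrete, the sketch below shows one common way such a formulation can be set up: a small neural network scores each mesh vertex from a per-vertex feature vector, and a pairwise (RankNet-style) loss pushes the score of a vertex annotated as more touchable above its partner in each ranked pair. This is a minimal illustrative sketch, not the authors' implementation; the network architecture, feature dimension, and all names below are assumptions.

```python
# Illustrative sketch only: a per-vertex scoring network trained with a
# pairwise learning-to-rank loss. Feature extraction, architecture, and
# hyperparameters are assumed, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VertexSaliencyNet(nn.Module):
    """Maps a per-vertex feature vector to a scalar tactile saliency score."""

    def __init__(self, feature_dim: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feature_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, vertex_features: torch.Tensor) -> torch.Tensor:
        # vertex_features: (num_vertices, feature_dim) -> (num_vertices,)
        return self.mlp(vertex_features).squeeze(-1)


def pairwise_rank_loss(score_hi: torch.Tensor, score_lo: torch.Tensor) -> torch.Tensor:
    """RankNet-style loss: encourages score_hi > score_lo for each ranked pair."""
    return F.softplus(score_lo - score_hi).mean()


# Toy training step on random data, just to show the shapes involved.
model = VertexSaliencyNet(feature_dim=16)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

features = torch.randn(1000, 16)          # assumed features for 1000 mesh vertices
hi_idx = torch.randint(0, 1000, (256,))   # vertices ranked as more tactilely salient
lo_idx = torch.randint(0, 1000, (256,))   # their less-salient counterparts

optimizer.zero_grad()
scores = model(features)
loss = pairwise_rank_loss(scores[hi_idx], scores[lo_idx])
loss.backward()
optimizer.step()
```

At inference time, running the trained network over all vertex features would yield a saliency score per vertex, which could then be visualised directly on the mesh.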