We are looking for a student interested in writing their thesis on modeling natural human hand motion for grasp animation. Gleechi provides VirtualGrasp, the first software solution that makes it possible to animate hands that move and interact freely and realistically in games and Virtual Reality. Interested? Keep scrolling and read the full description.
About Gleechi:
Gleechi is a Stockholm-based startup that has developed the first software that makes it possible to animate hands that move and interact freely and realistically in games and Virtual Reality. The technology is based on 8 years of robotics research, and the company's customers now include one of the 10 largest VR developers in the world as well as a world-leading automation company. Gleechi has received several awards, including Super Startup of 2015 by Veckans Affärer and ALMI Invest, and winner of the European competition EIT Digital Idea Challenge 2015.
Video demo: https://www.youtube.com/watch?v=xkCt17JHEzY
Introduction:
With the recent growth of virtual reality (VR) applications, there is a demand for highly immersive environments in which the avatar the user embodies reflects their actions in the virtual world as precisely as possible. The main way humans interact with the world is by grasping objects with their hands. Until now, the visual representation of grasping in VR has been handled only by very simple means: attaching a rigid hand to the object without adapting to its shape, manually animating a sparse set of grasps for pre-defined objects, or simply not showing hands at all. Initial experiments have shown that hands that are too human-like, or hands that do not match the players' expectations in appearance or behavior, often lead to a loss of the feeling of presence (i.e. the players no longer feel they are really in the game). This effect is closely related to the "Uncanny Valley": when features look and move almost, but not exactly, like those of natural beings, they provoke a response of revulsion in the observer.
Description:
Gleechi provides a software solution called VirtualGrasp which makes it possible to animate grasping interactions in real time based on the constraints of the virtual world (such as the shape of the objects, the kinematics of the hand, etc.). This solution is not a hand tracking algorithm, but a tool that animates a given hand model. In VR applications, the important measures of success for such a system are hand and finger motions that both satisfy the physical constraints imposed by the object and look natural and realistic to the human eye. The first is easy to measure; the second, however, is difficult to quantify. We believe a data-driven approach exploiting machine learning techniques is a good way to quantify the "realism" and "naturalness" of the grasps. Such an approach also provides a foundation to synthesize grasps toward this end.
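To make the idea concrete, here is a minimal, hypothetical sketch (in Python, using scikit-learn) of how a data-driven "naturalness" score could work: fit a density model to recorded human grasp postures and score a candidate grasp by its log-likelihood. The model choice (a Gaussian mixture), the 20-DOF hand representation, and the synthetic placeholder data are illustrative assumptions only, not part of VirtualGrasp.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Placeholder for a data set of recorded human grasp postures:
    # one row per grasp, one column per joint angle (20 DOFs assumed).
    rng = np.random.default_rng(1)
    human_grasps = rng.normal(size=(1000, 20))

    # Fit a density model of what "natural" hand postures look like.
    model = GaussianMixture(n_components=5, covariance_type="full", random_state=1)
    model.fit(human_grasps)

    # Score a candidate grasp produced by the animation system:
    # a higher log-likelihood means the posture is closer to the human data.
    candidate = rng.normal(size=(1, 20))
    naturalness = model.score_samples(candidate)[0]
    print("naturalness (log-likelihood):", naturalness)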
As background, the human hand is a complex organ consisting of many joints with intricate mechanical structure and dynamic properties. This makes animating natural hand motion very difficult, because a large number of interdependent degrees of freedom (DOFs) have to be controlled. However, studies have shown that when interacting with and grasping objects, human hand and finger motions are highly coordinated, which leaves the effective control space fairly low-dimensional [1]. This fact has been exploited in robotic grasp control [2] and in grasp synthesis for animation [3], to name just a few examples. The goal of this thesis is to apply an unsupervised machine learning approach to identify a low-dimensional space of hand and finger motion, construct a motion model of object grasping, and apply this model to grasp evaluation and synthesis.
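As a hypothetical illustration of the first step of this goal, the sketch below uses principal component analysis, an unsupervised technique in the spirit of the postural synergies of [1] and the eigengrasps of [2], to extract a low-dimensional space from hand joint-angle data and map points in that space back to full hand postures. The joint count, number of components, and synthetic placeholder data are assumptions made only for this example.

    import numpy as np
    from sklearn.decomposition import PCA

    # Placeholder for recorded grasp postures: one row per grasp,
    # one column per joint angle (a 20-DOF hand model is assumed).
    rng = np.random.default_rng(0)
    postures = rng.normal(size=(500, 20))

    # Find the few "synergies" (principal components) that explain
    # most of the variation across grasps.
    pca = PCA(n_components=3)
    latent = pca.fit_transform(postures)  # 500 x 3 low-dimensional codes
    print("explained variance:", pca.explained_variance_ratio_.sum())

    # Any point in the 3-D synergy space maps back to a full 20-DOF posture,
    # which is what makes such a space useful for grasp synthesis and evaluation.
    reconstructed = pca.inverse_transform(latent)
    print("mean reconstruction error:", np.abs(reconstructed - postures).mean())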
Therefore, we propose a thesis to study “Modeling Natural Human Hand Motion For Grasp Animation”.
Tasks:
- Identify a low-dimensional space of human hand and finger motion using an unsupervised machine learning approach.
- Construct a motion model of object grasping in this space.
- Apply the model to the evaluation and synthesis of grasp animations.
Supervisor at Gleechi: Dr. Dan Song
References:
[1] Postural hand synergies for tool use, Journal of Neuroscience, 1998: https://www.researchgate.net/publication/13463480_Postural_hand_synergies_for_tool_use
[2] Dexterous grasping via eigengrasps: a low-dimensional approach to a high-complexity problem, RSS 2007: http://projects.csail.mit.edu/manipulation/rss07/paper__dexterous_grasping_via_eigengrasps_a_low_dimensional_approach_to_a_high_complexity_problem__ciocarlie.pdf
[3] Grasp synthesis from low-dimensional probabilistic grasp models, Computer Animation and Virtual Worlds, 2008: https://www.yumpu.com/en/document/view/48655778/grasp-synthesis-from-low-dimensional-probabilistic-grasp-models