Finger Sculpting with Digital Clay
College of Computing and
Georgia Institute of Technology
(In collaboration with Chris Shaw; with students Schwa Gargus, Ignacio Llamas, and Byungmoon Kim; and with faculty members Mark Allen, Wayne Book, Imme Ebert-Uphoff, Ari Glezer, John Goldthwaite, Beth Mynatt, David Rosen, and their students and postdocs who participate in the Digital Clay ITR project sponsored by the NSF.)
The ambitious interdisciplinary Digital Clay project, funded by an NSF/ITR grant, brings together faculty members and students from Mechanical and Electrical Engineering and Computer Science. It is focused on building a computer-controlled physical surface capable of changing its shape to reflect changes in a digital 3D model, of sensing the pressure of bare human hands, and of intelligently reacting to this pressure by opposing a force that conveys prescribed stiffness properties and by altering its shape and the associated 3D digital model to support a variety of intuitive shape editing operations. Thus our exploration of the Digital Clay should lead to a new generation of haptic devices for bare-hand, touch-based input and output interaction with 3D shapes. We plan to explore applications in Computer Aided Design, Architecture, Art, Medical Training, and assistance to the visually impaired. For instance, Digital Clay will allow two remotely located users to simultaneously sculpt the same object. Each user will have a version of the Digital Clay that takes the form of a shared 3D surface model. Finger pressures of one user will alter the local shape, which will alter the computer model, which will be transmitted to the other location, which will alter the shape there. The full-scale realization of the Digital Clay will involve a very large number of miniature valves, sensors, hydraulic actuators, and articulated structural elements.
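The project's actual control laws are not specified above; as a minimal sketch, the standard way a haptic device "opposes a force that conveys prescribed stiffness" is a penalty-based damped spring, where the force grows with penetration depth. The function name, the stiffness and damping values, and the one-dimensional simplification below are all illustrative assumptions, not the project's implementation.

```python
def contact_force(tool_depth, velocity, stiffness=400.0, damping=2.0):
    """Penalty-style haptic rendering: oppose penetration with a damped spring.

    tool_depth: how far the fingertip has penetrated the surface
                (metres, positive inside the surface).
    velocity:   penetration speed (m/s, positive when pushing deeper).
    Returns the outward-directed opposing force magnitude in newtons.
    Illustrative sketch only; gains are placeholders, not project values.
    """
    if tool_depth <= 0.0:
        return 0.0  # no contact, no force
    # Spring term renders the prescribed stiffness; the damping term
    # steadies the high-rate haptic update loop against oscillation.
    return stiffness * tool_depth + damping * max(velocity, 0.0)
```

Varying the `stiffness` gain per region is one plausible way a surface could convey different material properties under the same contact model.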
The Finger Sculpting effort, supported in part by a Seed Grant from Georgia Tech's GVU Center, involves Jarek Rossignac, Chris Shaw, and graduate students Schwa Gargus, Ignacio Llamas, and Byungmoon Kim, all from the College of Computing. It is focused on developing the interface model between the human finger and the surface, starting from the nature of the contact: How will the surface know that you are pushing it to create a depression? How will it know that you want to pull it to create a protrusion? How much resistance should it offer? How does it know that you want to stop editing its shape and simply explore it? We have developed models for this contact behavior and have simulated them on a Phantom force-feedback haptic device, so that we can validate and fine-tune the models before the actual Digital Clay hardware is operational.

The next question is: How do you control the shape of the surface? We have explored an intuitive approach that mimics grabbing a different portion of the surface with each hand and then simultaneously dragging and rotating the associated coordinate systems with your hands to impose two simultaneous sets of constraints, each involving six degrees of freedom. We have developed a solver that deforms the surface to match all twelve constraints in real time and offers the intuitive look and feel of an elastic material. To validate this solution before the Digital Clay is operational, we have developed the "Twister" environment, which uses two Polhemus trackers to let the operator use both hands to simultaneously stretch, twist, and bend the surface of a virtual 3D model. Once a prototype of the Digital Clay becomes available, the operator will touch its surface directly with bare hands. We plan to use these prototypes and early versions of the Digital Clay to assess the usability and tangible benefits of our two-hand finger sculpting paradigm and of working with a digitally controlled real surface rather than a virtual model of it.
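The solver behind the two-hand deformation is not detailed above; as a rough sketch of the general idea, each hand imposes a rigid 6-DOF motion (a rotation plus a translation) at its grab point, and every surface vertex follows a distance-weighted blend of the two motions. The inverse-distance weighting, the Rodrigues rotation helper, and all function names below are illustrative assumptions and stand in for the project's actual elastic solver.

```python
import numpy as np

def rigid_transform(angle, axis, translation):
    """Build one hand's 6-DOF motion: a Rodrigues rotation about `axis`
    by `angle` radians, plus a translation. Returns the pair (R, t)."""
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    R = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
    return R, np.asarray(translation, float)

def two_handle_deform(verts, anchors, transforms, falloff=1.0):
    """Blend two rigid 6-DOF handle motions over the surface vertices.

    verts:      (n, 3) array of vertex positions.
    anchors:    the two grab points, one per hand.
    transforms: two (R, t) pairs, one per hand.
    Each vertex gets inverse-distance weights to the two anchors,
    normalized to sum to 1, then follows the weighted combination of
    the two rigid motions (each rotating about its own anchor).
    Illustrative sketch, not the project's constraint solver.
    """
    verts = np.asarray(verts, float)
    d = np.stack([np.linalg.norm(verts - a, axis=1) for a in anchors], axis=1)
    w = 1.0 / (d / falloff + 1e-9)          # closer anchor -> stronger pull
    w = w / w.sum(axis=1, keepdims=True)    # the two influences sum to 1
    out = np.zeros_like(verts)
    for k, (a, (R, t)) in enumerate(zip(anchors, transforms)):
        moved = (verts - a) @ R.T + a + t   # rotate about the anchor, then translate
        out += w[:, k:k + 1] * moved
    return out
```

A vertex at one anchor follows that hand's motion almost exactly, while vertices in between interpolate smoothly, which is the qualitative behavior the twelve-constraint solver is described as providing.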
Jarek Rossignac is Full Professor in the College of Computing at the Georgia Institute of Technology.