Hello from 3Gear Systems!
We're a three-person team based out of San Francisco trying to fundamentally change the way people interact with computers. We're excited to kick things off on our blog by announcing the release of a software development kit (SDK) for adding gestures to your applications.
The story so far
It's easy to forget that the mouse is over 40 years old. While today's mice are smoother and have more buttons, they haven't changed all that much. The biggest innovation in UI since the mouse has been the touchscreen, which, like the mouse, treats your hand as if it's one big pointing finger (or at most two fingers sliding around pictures under glass).
But your hands can do so much more than point at things! They can grab things, turn things over, assemble things, animate things, etc.
At 3Gear, we're creating technology that uses your entire hand (fingers, thumbs, wrists and all) for user interaction. This is especially useful when you're doing something 3D, say assembling 3D parts in computer-aided design (CAD), flying through a 3D medical MRI scan, or playing 3D games. With the rise of 3D printing and the Maker community, we're particularly interested in making it easier to create in 3D.
Making the Kinect "finger-precise"
We're using 3D camera hardware (e.g., the Microsoft Kinect) to make this possible. However, existing Kinect software only works with large, full-body motions. We've developed software that creates a finger-precise representation of what your hands are doing, capturing tiny motions of your index finger and subtle movements of the wrist. This means your applications can use small, comfortable gestures such as pinching and pointing rather than sweeping arm motions.
To make this work, we had to develop new computer graphics algorithms for reconstructing the precise pose of the user's hands from 3D cameras. A key component of the algorithm is a database of pre-computed 3D images, one for each possible hand configuration in the workspace. The database is efficiently sampled and indexed to enable extremely fast searches. At run-time, the images from the 3D cameras are used to "look up" the pose of the hand in the database. This way, the user's hand pose can be determined within milliseconds: fast enough for interactive applications, and short enough to avoid any perceptible lag.
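To make the lookup idea concrete, here's a minimal Python sketch of a pose database queried by nearest-neighbor search. Everything in it is illustrative: the 26-parameter pose vector, the descriptor size, and the KD-tree index are stand-ins chosen for the sketch, not the data structures our system actually uses.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Offline stage: sample candidate hand poses across the workspace and
# render each one to a 3D (depth) image, reduced here to a compact
# feature vector.  Random data stands in for the renderer in this sketch.
NUM_POSES, DESCRIPTOR_DIM = 50_000, 48
poses = rng.uniform(-1.0, 1.0, size=(NUM_POSES, 26))        # e.g., a 26-DOF hand model
descriptors = rng.normal(size=(NUM_POSES, DESCRIPTOR_DIM))  # features of rendered images

# Index the descriptors once so that run-time queries are fast.
index = cKDTree(descriptors)

def look_up_pose(camera_descriptor):
    """Run-time stage: reduce the live camera image to the same kind of
    descriptor and return the pose of its nearest database neighbor."""
    _, i = index.query(camera_descriptor, k=1)
    return poses[i]

# One simulated camera frame:
frame = rng.normal(size=DESCRIPTOR_DIM)
print(look_up_pose(frame).shape)  # (26,): joint parameters for this frame
```

Most of the engineering effort goes into the parts this sketch fakes: how the pose space is sampled, how 3D images are reduced to descriptors, and how the index is built.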
Gestures you can use all day
Before founding 3Gear, Rob spent a lot of his PhD working on tracking hands with a color patchwork glove. He eventually abandoned the idea because it was so hard to get people to put on gloves. Above all, our input system is designed to be practical and comfortable: we want to fit into your workflow rather than interrupt it, and we've been careful to build an input device that you can use all day. For instance, we mount the cameras above the desk so that the hands can be tracked well even a couple of centimeters above the keyboard, with the forearms resting on the desk, avoiding the so-called "gorilla arm" problem. You don't have to wave your arm around much to interact with the system; it can pick up motions on the order of millimeters.
Try our software development kit (SDK)!
Today, we're releasing a "public beta" version of our SDK that allows you to quickly incorporate 3Gear's technology into your applications or invent new uses for gestural user interfaces.
Here's what our software can do:
- Intuitive 3D manipulation. Our 3D input technology provides 1-to-1 3D control of virtual objects: users can grab objects and move them around in 3D with their hands (see the sketch after this list).
- Touchless (aseptic) control. Whether the user is a surgeon in an operating room or a chemist in a pharmaceutical lab, sometimes it's just not practical to touch a computer. Our technology offers precise, touch-free control for sterile and messy jobs alike.
- Runs on commodity hardware. Our input system currently uses two Kinect cameras and an aluminum frame for mounting the cameras. All of the components are available off-the-shelf right now.
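To give a flavor of what client code looks like, here's a self-contained sketch of the grab-and-move interaction from the list above. Every identifier in it (HandPose, handle_frame, and so on) is invented for this post rather than taken from the SDK.

```python
from dataclasses import dataclass
from typing import Tuple

# All identifiers below are illustrative stand-ins, not the SDK's API.

@dataclass
class HandPose:
    pinching: bool
    index_tip: Tuple[float, float, float]  # (x, y, z) in mm above the desk

def handle_frame(pose: HandPose, obj_position):
    """Map a pinch to 'grab': while pinching, the object follows the
    fingertip 1-to-1; otherwise it stays put."""
    return pose.index_tip if pose.pinching else obj_position

# Simulated tracker output standing in for live camera frames:
frames = [
    HandPose(pinching=False, index_tip=(0.0, 0.0, 50.0)),
    HandPose(pinching=True,  index_tip=(10.0, 5.0, 52.0)),
    HandPose(pinching=True,  index_tip=(40.0, 20.0, 60.0)),
]
position = (0.0, 0.0, 0.0)
for frame in frames:
    position = handle_frame(frame, position)
print(position)  # (40.0, 20.0, 60.0): the object followed the pinching hand
```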
And here are the limitations:
- Our software requires each new user to go through a short (five-minute) calibration step. Each user only has to do this once.
- It's good at recognizing the set of useful gestures covered in the calibration, but it can't track arbitrary hand gestures quite yet.
We're working hard to relax these limitations. We're also actively trying out new camera tech and frame designs — we know it looks a little clunky right now.
Our software is free for both non-commercial and commercial applications until the end of the beta period (November 30th, 2012). After the beta period, we will continue to offer a free version of the software for researchers, hobbyists, and small commercial entities (i.e., those with annual turnover of US$100,000 or less).
About us
Okay, so since this is our first post, here's a little more about us. We're just three people right now, but we've all been thinking about computer graphics and human-computer interaction (HCI) for a while:
Rob Wang wrote his doctoral dissertation at MIT on tracking colorful things using computer vision. Most notably, he created a color patchwork glove and a set of algorithms for tracking a user's hands in real time with a webcam.
Chris Twigg figured out how to step back in time while getting his PhD at Carnegie Mellon University. Before founding 3Gear Systems, Chris pushed the envelope of digital visual effects at Industrial Light and Magic R&D.
Kenrick Kin got his PhD from Cal but spent much of his time in grad school at Pixar Animation Studios inventing ways to use multitouch screens to build rich 3D environments for computer-animated films.
We're supported by K9 Ventures, Uj Ventures, and a research grant from the National Science Foundation.
“Kinect” is a registered trademark of Microsoft.