Wednesday, April 2, 2014

From color gloves to tiny cameras: 3Gear raises $1.9M

We started 3Gear Systems with the singular goal of creating a gesture recognition system that actually works. We've been disappointed by what's currently available, and felt that with our backgrounds, we could build something a lot better—something reliable enough and powerful enough to become a basic part of interacting with a computer. Since we founded the company in March 2012, we've stayed lean and 100% focused on solving the hard computer vision and human-computer interaction problems involved in building the world's best gesture recognition technology.

Above: our latest work uses a tiny front-facing depth camera to precisely track a user's gestures. Pictured, the evolution of our system: MIT thesis work on color gloves; the original two-Kinect hand tracker; a single depth camera tracking arbitrary gestures.

What started as one of our co-founders' PhD projects at MIT evolved into a system that used two Kinect cameras mounted on a gantry-like frame, then into one that used a single camera, and now into one that uses a tiny camera sitting on your laptop or beneath your monitor. When we started, our technology could track a hand in six poses and required fifteen minutes to calibrate each new user. Today our system can track arbitrary 10-finger gestures and takes under a second to calibrate. Along the way, we've designed a catalog of gestures around activities our users care about, including browsing the web, watching videos, playing games and manipulating 3D models.

We're proud to announce today that we've raised $1.9M in a seed round led by K9 Ventures, with participation from Intel Capital, CrunchFund, Ovo Fund (Eric Chen), Safa Rashtchy and others. We're excited about the progress we've made with our technology, but are even more excited about what the future holds as we begin building our technology into consumer products.

We're growing our team. Come join us to build a more powerful way to interact with computers.

Tuesday, January 21, 2014

State of Gesture: New release (v0.9.32), PrimeSense acquisition, pmdtec partnership, new camera support

A lot has happened in the past four months: we've made huge improvements to tracking quality and stability; PrimeSense was acquired by Apple; we entered into a partnership with pmdtec; and we're broadening support to new cameras.

New release (v0.9.32)

Above: Our latest release offers significantly more robust tracking
First things first: we're releasing a new version of our SDK today (v0.9.32). It's got plenty of improvements, but the biggest one is significantly more robust tracking. You'll notice a lot less flip-flopping of rotations, and a lot more rock-solid hand position / orientation tracking. In addition, we upgraded support to the latest PrimeSense firmware / OpenNI, lowered the latency on PrimeSense and Asus sensors, and expanded the API to include finger joint angles. This release also combines the previous GesturalApps and GesturalUserInterface packages into a single (smaller!) package -- which we call Nimble SDK. Please give it a try, and let us know what you think.
Download v0.9.32 now
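The release above mentions that the API now exposes finger joint angles. The actual Nimble SDK calls aren't shown in this post, so here is a purely hypothetical sketch of how a client might consume per-finger joint angles -- assuming the SDK reports a few flexion angles in degrees per finger, with 0 meaning fully extended (all names below are illustrative, not the real API):

```python
# Hypothetical consumer of per-finger joint angles.
# Assumes three flexion angles (degrees) per finger, 0 = fully extended.
# Names are illustrative; they are not the actual Nimble SDK API.

def finger_curl(joint_angles):
    """Average flexion of one finger, 0.0 (straight) to 1.0 (fully curled)."""
    # Treat ~90 degrees per joint as fully curled.
    return min(sum(joint_angles) / (90.0 * len(joint_angles)), 1.0)

def is_fist(hand, threshold=0.6):
    """True when every non-thumb finger is mostly curled."""
    fingers = [hand[f] for f in ("index", "middle", "ring", "pinky")]
    return all(finger_curl(angles) >= threshold for angles in fingers)

open_hand = {f: (5.0, 5.0, 5.0) for f in ("thumb", "index", "middle", "ring", "pinky")}
fist = {f: (80.0, 85.0, 70.0) for f in ("thumb", "index", "middle", "ring", "pinky")}
print(is_fist(open_hand))  # False
print(is_fist(fist))       # True
```

Deriving discrete gestures (fist, pinch, point) from continuous joint angles in your own code is exactly the kind of thing the expanded API enables.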

PrimeSense acquisition

After months of rumors, Apple acknowledged its acquisition of PrimeSense (the company behind the first Microsoft Kinect, the Asus Xtion and the Carmine sensors) on November 24th. (Congratulations, PrimeSense!) Unfortunately, this means we won't be able to supply the Carmine 1.09 short-range cameras for our DevKit anymore, although the Asus and Kinect cameras are still widely available. Even before the acquisition, we had started investigating other sensors and partnerships with other sensor makers. One of these is a small German company called pmdtec technologies...

pmdtec partnership

Today, we're happy to announce a partnership with pmdtec technologies to deploy our gesture tracking middleware and applications on their (impressively tiny and accurate) sensor. Together, our finger-tracking software and pmdtec's hardware combine to form the Nimble UX platform. We're targeting consumer applications for this platform, including browser control, image and 3D manipulation. While the target customers of the platform are PC OEMs, let us know if you're interested in developing for the platform, and we'll try to get you a developer kit.
Image credit: Engadget. Our partnership with pmdtec puts us on their impressively small and accurate sensors.

New camera support coming soon!

In addition to continuing support for all PrimeSense (Carmine, Asus Xtion, Kinect 360, Kinect for Windows) cameras and the new pmdtec (Pico S, Pico XS) sensors, we're investigating support for the SoftKinetic / Creative cameras and the new Kinect for Windows. Let us know which of these cameras is most interesting to you.
As always, feel free to drop us a line; we're always open to suggestions for new features and improvements that would help you.
Wishing you a happy new year,
The 3Gear Team

Friday, September 13, 2013

AMD processor support, smoother installation, performance improvements (v0.9.30)

We've gotten a lot of requests for better support for AMD processors (Athlon II, Phenom, FX, etc.), and we deliver on them in this version. This involved rewriting the parts of our code that used SSSE3 and SSE4.1 instructions to use only SSE2. The rewrite actually improved performance overall (even on Intel Core i7 processors). We're pretty happy with the result and encourage you to try out our latest version!

Another improvement in this version is a smoother installation process. For people who don't have administrative access on their machines, this version is a big deal. Now, once our software has been installed, it can be configured and run by any user on the local machine without permission issues or stomping on someone else's data.

As always, you can grab our latest version from our download page.

Tuesday, August 27, 2013

Tracking arbitrary 10-finger motion, robustness, latency improvements (v0.9.29)

This version marks the biggest improvement in both functionality and robustness since we released our single-camera DevKit in April. Today, we're announcing experimental support for arbitrary 10-finger hand tracking. Here's a 30-second video illustrating what we mean by "arbitrary":

You can learn more about arbitrary 10-finger tracking mode in the installation notes included in the download.

There have also been a number of robustness improvements since v0.9.27. In particular, the hands are much better behaved when touching and when the hand occludes the arm. Furthermore, we are better at detecting when the left and right hands are swapped and will swap them back.

We've also made some significant gains in reducing latency. This started in v0.9.26, when we cut 50ms off of most example applications on the Mac, and continues in this version, where HandViewer is 20ms faster. Take a look at the responsiveness in this new recording of HandViewer, Heart Demo and Cricket Defender.

Lastly, we've added some convenience features. It's easier than ever to get started using the Heart Demo and Cricket Defender, which have a built-in 1-second hand calibration step now. We're also providing access to the raw depth data in the API.
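This release's raw depth access means you can run your own processing alongside our tracking. The frame format isn't specified in this post, so the sketch below assumes the common convention for PrimeSense-class sensors: per-pixel distances in millimeters, with 0 meaning "no reading". The function names are our own illustration, not the SDK's:

```python
# Illustrative processing of a raw depth frame, assuming per-pixel
# distances in millimeters (typical for PrimeSense-class sensors),
# with 0 meaning "no reading". Not the actual Nimble SDK calls.

def near_mask(depth_row, max_mm=600):
    """Flag pixels closer than max_mm -- a crude foreground segmentation."""
    return [0 < d <= max_mm for d in depth_row]

def nearest_point(depth_frame):
    """(row, col, depth) of the closest valid pixel in a 2-D frame."""
    best = None
    for r, row in enumerate(depth_frame):
        for c, d in enumerate(row):
            if d > 0 and (best is None or d < best[2]):
                best = (r, c, d)
    return best

frame = [
    [0,   950, 940],
    [520, 510, 930],
    [530, 0,   920],
]
print(nearest_point(frame))  # (1, 1, 510)
print(near_mask(frame[1]))   # [True, True, False]
```

Simple depth thresholds like this are often enough to prototype touchless buttons or proximity triggers on top of the raw stream.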

This is all to say that you should really download our latest version here.

For existing users of our software: you'll need to re-sample your databases for this version. For most setups, this means running "handcalibration.bat sample" from the GesturalUserInterface directory; running "camerasetup.bat" to recalibrate the camera(s) will also do the trick.

Tuesday, July 2, 2013

One-second hand calibration, more accurate hand orientation tracking (v0.9.27)

We're delivering the fastest (and most robust) hand calibration routine yet in this version. It takes about one second to calibrate your hands now, and we have calibration-mode on by default. This is especially useful when many people are cycling through using the system -- e.g. doctors in an operating room or users of a public kiosk.
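To give a feel for why one-shot calibration can be fast, here is a toy sketch of the underlying idea: scale a reference hand model by the ratio of the user's observed hand size to the template's. This is purely illustrative -- the real calibration fits a full 3-D hand model, and all numbers and names below are made up for the example:

```python
# Toy sketch of one-shot hand-scale calibration: scale a reference hand
# model by (observed span / template span). Purely illustrative -- the
# real system fits a full 3-D hand model, not a single measurement.

TEMPLATE_SPAN_MM = 190.0  # assumed span of the reference hand model

def scale_factor(observed_span_mm):
    return observed_span_mm / TEMPLATE_SPAN_MM

def calibrate(template_lengths_mm, observed_span_mm):
    """Scale each template bone length to fit the observed user."""
    s = scale_factor(observed_span_mm)
    return {bone: length * s for bone, length in template_lengths_mm.items()}

template = {"palm": 95.0, "index_proximal": 45.0, "index_middle": 25.0}
user = calibrate(template, observed_span_mm=171.0)  # a smaller hand
print(round(user["palm"], 1))  # 85.5
```

Because the measurement reduces to a single ratio, it can be taken from one good frame -- which is what makes sub-second, per-user calibration plausible in settings like operating rooms and kiosks.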

The hand orientation tracking is also significantly more accurate than before. We've fixed some bugs that were causing the hands to flip along the arm axis occasionally.

Finally, the HandTrackingServer on Windows behaves more intelligently. It no longer stomps on top of another running instance of HandTrackingServer or launches multiple copies of DebugViewer.

Every version, we try to push the accuracy of the hand tracking a bit further. We've found empirically that the new scale calibration and hand-orientation improvements make a big difference, and we hope you'll enjoy trying out these updates too!

Monday, June 17, 2013

Performance improvements, bug fixes and much easier installation (v0.9.26)

We've made some big performance improvements and fixed some major installation hurdles in v0.9.26, along with bug fixes and hand-tracking improvements. As always, you can grab the latest version from our download page.

The Mac OS X applications are nearly twice as responsive now (a 50ms improvement in most cases). In particular, games such as CricketDefender, Xylophone and Slingshot refresh and render faster.

Installation on both Mac and Windows is also significantly easier. Windows now has a GUI installer designed to catch missing requirements and other pitfalls. The Mac package includes libusb and removes the dependency on some unnecessary libraries (OpenCV's highgui) that were causing problems for OS X 10.7 users.

These improvements, combined with the usual bug fixes and hand-tracking quality improvements, make upgrading to v0.9.26 well worth it!

Let us know if this helps, and keep us posted about what you would like to see in the next version!

Thursday, May 2, 2013

Play Minecraft, Control Google Maps with Gestures; Release v0.9.24


We've been hard at work connecting our technology to existing real-world applications. Now you can do this too, without any programming, using the included Dashboard application. Here are two one-minute videos showing what we mean:

Navigate Reddit, Control Google Maps, three.js demos using gestures

Play Minecraft, Team Fortress 2, Portal using gestures.

This isn't just a proof of concept; it's actual footage of working code. These demos work by controlling the Windows / Mac OS X mouse cursor. It's a quick and dirty way to prototype with gestures any interaction you currently drive with the mouse.
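The general technique behind this kind of cursor control is simple: map a normalized hand position into screen pixels, then smooth it to reduce jitter. The sketch below illustrates that idea with an exponential moving average; it is our own illustration with assumed screen dimensions, not the Dashboard's actual implementation:

```python
# Mapping a normalized hand position to screen pixels, with exponential
# smoothing to tame tracking jitter. Illustrative only -- not the
# Dashboard's actual code; screen dimensions are assumed.

SCREEN_W, SCREEN_H = 1920, 1080

def to_screen(nx, ny):
    """Map normalized (0..1) hand coordinates to pixel coordinates, clamped."""
    x = max(0.0, min(1.0, nx)) * (SCREEN_W - 1)
    y = max(0.0, min(1.0, ny)) * (SCREEN_H - 1)
    return x, y

class Smoother:
    """Exponential moving average: alpha near 1 tracks fast, near 0 is steady."""
    def __init__(self, alpha=0.4):
        self.alpha = alpha
        self.pos = None

    def update(self, x, y):
        if self.pos is None:
            self.pos = (x, y)
        else:
            px, py = self.pos
            a = self.alpha
            self.pos = (a * x + (1 - a) * px, a * y + (1 - a) * py)
        return self.pos

s = Smoother(alpha=0.5)
for nx, ny in [(0.5, 0.5), (0.52, 0.5), (0.9, 0.5)]:
    x, y = s.update(*to_screen(nx, ny))
```

The alpha parameter is the usual latency-vs-stability trade-off: heavier smoothing feels steadier but lags the hand, which matters for fast games like Team Fortress 2.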

For more information check out our (hardware) DevKit and our technology. It's never been easier to add gestures to your applications.

A full list of the v0.9.24 changes follows:

  • Significant improvements to the Dashboard application
  • Bug fixes to the way we load Kinect/OpenNI/OpenNI2 libraries in both Mac and Windows
  • Debug Viewer now reconnects to HandTrackingServer automatically