Thursday, May 2, 2013

Play Minecraft, Control Google Maps with Gestures; Release v0.9.24

We've been hard at work connecting our technology to real-world existing applications. Now, you can do this too, without any programming, using the included Dashboard application. Here are two 1-minute videos showing off what we mean:

Navigate Reddit, Control Google Maps, three.js demos using gestures

Play Minecraft, Team Fortress 2, Portal using gestures.


This isn't just a proof of concept; it's actual footage of working code. These demos work by controlling the Windows / Mac OS X mouse cursor. It's a quick and dirty way to prototype with gestures any interaction you currently drive with the mouse.
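The core of this cursor-driving trick is mapping a tracked hand position to screen pixels. Here's a minimal sketch of that mapping; the function name, axis conventions, and interaction-box extents are illustrative assumptions, not the SDK's actual API:

```python
def hand_to_cursor(hand_xyz, screen_w=1920, screen_h=1080,
                   x_range=(-0.3, 0.3), y_range=(0.0, 0.4)):
    """Map a tracked hand position (meters, camera space) to pixels.

    hand_xyz: (x, y, z) with x left/right and y up/down.
    The interaction-box extents x_range/y_range are hypothetical.
    """
    x, y, _ = hand_xyz
    # Normalize into [0, 1] and clamp so the cursor stays on screen.
    u = min(max((x - x_range[0]) / (x_range[1] - x_range[0]), 0.0), 1.0)
    v = min(max((y - y_range[0]) / (y_range[1] - y_range[0]), 0.0), 1.0)
    # Screen y grows downward, so flip v.
    return int(u * (screen_w - 1)), int((1.0 - v) * (screen_h - 1))
```

Feeding the result to whatever OS-level mouse API you like (on each tracking frame) is all it takes to drive an unmodified application.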

For more information check out our (hardware) DevKit and our technology. It's never been easier to add gestures to your applications.

A full list of the v0.9.24 changes follows:

  • Significant improvements to the Dashboard application
  • Bug fixes to the way we load Kinect/OpenNI/OpenNI2 libraries in both Mac and Windows
  • Debug Viewer now reconnects to HandTrackingServer automatically

Wednesday, April 3, 2013

Single-camera support, Hardware DevKit and more

We've been busy over the last couple of months and are announcing some exciting new features today:
Check out the new Development Kit
  • Single-camera support makes our system more portable and much easier to set up.
  • Our new (Hardware) DevKit is now the preferred way to use our system. It bundles the best sensor (PrimeSense short-range sensor) and the best frame together in a single easy-to-install package. Get yours today!
  • The new Quick-start mode lets you use our system immediately, without going through the longer hand calibration process.
  • The new Dashboard application makes it easy to control existing applications without writing a single line of code by driving the mouse and keyboard.

More details about the v0.9.22 release:

Single-Camera Support

Setting up and calibrating two cameras has been a nightmare for a lot of you. With single-camera mode, you still get robust and accurate tracking, but in a more portable, compact package. Switching to single-camera mode is as easy as flipping a switch in the config file. Add the line,

NumCameras: 1

to your config.yml file. Re-position and re-calibrate your camera (camerasetup.bat / camerasetup.sh) and you're all set. If you're serious about single-camera use, check out our DevKit, which is optimized for that case.

3Gear Development Kit

Our new DevKit has all the hardware and software you need to get started, without all the USB, driver and compatibility issues. Our developer kit puts the best hardware in a single package that's fully tested to play well on both Mac and Windows.

Quick-Start Mode

Quick-start mode is a way to use our system without going through the longer hand calibration process. It's especially useful when there are many users of the same installation or for giving demos. The legacy hand calibration process is still available, but we're transitioning towards the user-friendly quick-start mode, which is suitable for most cases.

Dashboard

The dashboard is a way to control existing applications by driving the mouse and keyboard -- no programming necessary. We support several mappings, including XY, XZ and finger, as well as (experimental) swipe gestures.
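The difference between the mappings is just which two hand axes drive the 2D cursor. A minimal sketch, assuming illustrative axis conventions (not necessarily the Dashboard's actual ones):

```python
def project_hand(hand_xyz, mapping="XY"):
    """Pick the two hand axes that drive the 2D cursor.

    "XY": side-to-side and up-down, like a touchscreen in the air.
    "XZ": side-to-side and toward/away from the screen, like a desk
    surface. Axis conventions here are assumptions for illustration.
    """
    x, y, z = hand_xyz
    if mapping == "XY":
        return (x, y)
    if mapping == "XZ":
        return (x, z)
    raise ValueError("unknown mapping: %r" % mapping)
```

A "finger" mapping would do the same with a fingertip position instead of the whole-hand position.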

Tuesday, January 22, 2013

VLC Media Player Demo

One of the great things about in-the-air hand gestures is that they're not mutually exclusive with mouse and keyboard input devices. Hand gestures can be a great addition to the desktop experience. Say you're playing music while you work or browse the web. If the next song isn't one you care for and you'd like to skip it, or the volume is too low and you want to turn it up so you can bounce around in your chair, you could use gestures to adjust the player running in the background. No need to alt-tab and switch context from your current application!

This video demonstrates possible gestural controls for VLC Media Player. It's playing NASA clips, but the controls would work equally well with a music playlist.


Summary of the gestures:

  • Play/pause - Double pinch
  • Seeking forward/backward - Pinch and drag to the right/left
  • Volume up/down - Pinch and rotate clockwise/counter-clockwise
  • Next/Previous clip - Spread hand flick right/left
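One way to wire a gesture set like this to a player is a simple dispatch table from recognized gestures to keystrokes. The gesture names below are hypothetical, and the keystroke strings are only illustrative stand-ins for VLC shortcuts; an actual controller would send them via an OS-level synthetic-keyboard mechanism:

```python
# Hypothetical gesture names mapped to illustrative keystroke strings.
GESTURE_TO_KEYSTROKE = {
    "double_pinch":       "space",        # play/pause
    "pinch_drag_right":   "shift+right",  # seek forward
    "pinch_drag_left":    "shift+left",   # seek backward
    "pinch_rotate_cw":    "ctrl+up",      # volume up
    "pinch_rotate_ccw":   "ctrl+down",    # volume down
    "spread_flick_right": "n",            # next clip
    "spread_flick_left":  "p",            # previous clip
}

def keystroke_for(gesture):
    """Return the keystroke for a recognized gesture, or None."""
    return GESTURE_TO_KEYSTROKE.get(gesture)
```

Keeping the mapping in a table like this makes it trivial to rebind gestures per application.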

This is just one possible gesture set. All the gestures can be operated with either hand, but since we can identify which hand you are using, it is also possible to assign operations to specific hands.

Monday, January 7, 2013

Mac OS X Support

The heart exploration demo running on a 15" MacBook Pro.

Good news everyone!

As many of you have requested, we've started supporting the 3Gear Systems SDK for the Mac. You can now add hand gesture controls to your shiny Mac OS X devices! Download the latest Mac OS X SDK here. For system requirements and details on how to set up the SDK, please refer to our Mac installation guide.

Note: Due to limitations of the Asus Xtion Pro and Kinect for Windows, only the Xbox 360 Kinect sensors are currently supported on the Mac.

Also, please refer to our licensing page for information on how to obtain a license.

Happy coding!

Thursday, January 3, 2013

Licensing

Now that the public beta period is over, you might be wondering how to get a license for our hand tracking SDK. We offer three types of licenses: a 30-day evaluation license, a commercial license, and an academic license. For more specific details on how to get one of these licenses, please refer to our licensing page. We hope to hear from you soon!

Monday, November 12, 2012

FlightGear Demo

Ah, the miracle of simulated flight. We take a stab at creating 3D gesture controls for piloting an aircraft in FlightGear, a free, open source flight simulator.

Here's a video detailing the different gestures and the operations that they map to. Disclaimer: I'm not a pilot, or even a seasoned simulated pilot, so pardon my reckless maneuvers.


Let me give a more detailed explanation of the gestures.

Startup controls
We mapped the operations to each hand, based on how one might operate the controls in a real cockpit. The throttle and the parking brake can only be accessed by pinching with the right hand. The physical interaction space in front of the monitor was divided such that pinching in the rightmost area of the interaction space could trigger only the throttle. Moving the hand forward and back mimics the physical act of pushing a throttle forward and back. Pinching in the center of the interaction space accesses the parking brake. We use a rotation of the hand to reflect the motion of the parking brake being released or set. Finally, the ignition is located on the left side of the flight controls, so we use a left hand pinch to access the ignition. Rotating clockwise closely mimics the act of turning a key.
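The zone logic above boils down to checking which hand is pinching and where. A minimal sketch, with hypothetical event fields and illustrative zone boundaries (not the demo's actual code):

```python
def startup_control(hand, pinch, x_right=0.15, x_left=-0.15):
    """Decide which startup control a pinch engages.

    hand: "left" or "right". pinch: dict with "active" (bool) and
    "x" (meters, positive toward the right of the interaction space).
    The field names and zone boundaries are assumptions.
    """
    if not pinch["active"]:
        return None
    x = pinch["x"]
    if hand == "right" and x > x_right:
        return "throttle"        # rightmost zone, right hand only
    if hand == "right" and x_left <= x <= x_right:
        return "parking_brake"   # center zone, right hand only
    if hand == "left":
        return "ignition"        # left hand anywhere
    return None
```

Once a control is engaged, the hand's forward/back motion or rotation drives the control's value, as described above.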

Flight controls
Flight controls only become engaged when both hands perform a pointing gesture. I admit that a real pilot probably doesn't point his or her fingers while piloting a plane, but this is a demonstration of the different hand poses we are working on exposing to the developer. Robust recognition of different hand poses would allow the developer to use hand pose to set the mode of the application. The two hands control three things: ailerons, elevators, and rudders, which control the roll, pitch, and yaw of the aircraft respectively.


In summary, the gestures are as follows:
  • Throttle - A pinch with the right hand, pushing forward and back, in the right side of the interaction space
  • Parking brake - A pinch with the right hand, rotating clockwise to release the brake and counter-clockwise to set the brake, in the center of the interaction space
  • Ignition - A pinch with the left hand, rotating clockwise
  • Ailerons - Pointing with both hands, moving the hands like turning a vertical steering wheel
  • Elevators - Pointing with both hands, rotating both hands upwards or downwards
  • Rudders - Pointing with both hands, moving the hands like turning a horizontal steering wheel
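The two-handed flight controls can be sketched as simple differences between the hand positions. This is a rough approximation under assumed axis conventions (the real demo could equally use hand orientation, e.g. for the elevators):

```python
def flight_axes(left, right):
    """Derive raw roll/pitch/yaw inputs from two pointing hands.

    left, right: (x, y, z) hand positions; axis conventions and the
    pitch approximation (mean height instead of hand tilt) are
    assumptions for illustration.
    """
    lx, ly, lz = left
    rx, ry, rz = right
    roll = ry - ly            # right hand higher -> roll right,
                              # like turning a vertical steering wheel
    pitch = (ly + ry) / 2.0   # both hands raised -> pitch up
    yaw = lz - rz             # left hand forward -> yaw right,
                              # like turning a horizontal wheel
    return roll, pitch, yaw
```

A real mapping would scale, dead-zone, and clamp these raw offsets before feeding them to the simulator.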

Kenrick

Friday, November 2, 2012

Recipe Demo

Hi!

I thought I'd share a little demo I put together using our hand tracking system. One of the advantages of a touchless interface is that it keeps your hands away from dirty equipment, and your equipment away from dirty hands. For example, a surgeon who has scrubbed in does not want to handle unsterile equipment such as a computer. Conversely, the surgeon does not want to contaminate equipment with the byproducts of surgery. Rather than constantly breaking scrubs or dictating instructions to another person, the surgeon could manipulate images and patient data through a touchless interface. Or imagine someone in waste management who needs to operate a computer, or someone in the food industry using a shared computer touched by countless individuals. A touchless interface would keep the equipment and the user's hands clean.

Okay, so maybe you're not a surgeon and you don't work in the dirty jobs industry. But, have you ever cooked while following a recipe on a computer? You probably didn't want to get your food-covered hands all over your nice devices. I've cobbled together a simple demo that allows a budding pastry chef to mark off the steps in a recipe for chocolate chip cookies.

The gestures are simple. The Swipe and Lock gesture crosses off items on the list - with the right hand, the user can perform a pinch in the air and drag to the right to begin crossing off a step in the recipe. Then, rotating the hand clockwise roughly 90 degrees confirms the cross-off. A pinch and drag to the left undoes the cross-off. A pinch and drag up or down traverses the steps. The user can also view the ingredients list by using the left hand to perform a double-pinch (like a double-click) and dragging to the right to reveal the second panel.
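The arm-confirm-cancel flow of Swipe and Lock is essentially a tiny state machine. A minimal sketch, with hypothetical event names and a 90-degree threshold taken from the description above:

```python
class SwipeAndLock:
    """Tiny state machine for the swipe-and-lock cross-off gesture.

    Event names and the rotation threshold are assumptions, not the
    demo's actual API. A pinch-drag right arms a cross-off, rotating
    ~90 degrees clockwise confirms it, and a pinch-drag left cancels.
    """
    def __init__(self):
        self.state = "idle"

    def on_event(self, event, amount=0.0):
        """Feed one gesture event; return an action or None."""
        if self.state == "idle" and event == "pinch_drag_right":
            self.state = "armed"            # cross-off started
        elif self.state == "armed":
            if event == "rotate_cw" and amount >= 90.0:
                self.state = "idle"
                return "cross_off"          # locked in
            if event == "pinch_drag_left":
                self.state = "idle"
                return "cancel"             # backed out
        return None
```

Requiring the explicit lock rotation is what keeps stray drags from accidentally crossing off steps.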

Check it out:


Following a recipe is a cute example, but there are other applications for checklists such as in a laboratory setting where a scientist might want to cross off steps in a procedure that requires aseptic techniques. Or in the operating room, the doctor might use a checklist as a cognitive aid while performing a procedure.

Kenrick