When I first heard of the Leap Motion Controller I immediately went searching for reviews to see how it was being received. Every review I found focused on (1) how well you can control the computer, and (2) the limited amount of software for the device. If that is what you are looking for, I can add my take to this discussion: controlling the computer with gestures is more time-consuming, and more detailed software would make it more fun for the general public.
We need to remember that we have spent an incredible amount of time learning how to use the mouse, and that the input system is designed around the mouse, not gesture input, so proficiency will come in time. Also, the Leap Motion Controller is being launched by a start-up company, not by Apple or Microsoft, so, with continued support, we will see better software.
In my testing I am focusing on how well the device does what it does, and not so much on the software. My concern is that people will try out the Leap Motion Controller, and then put it away as a neat toy. What I have attempted to do is to give better details on the detection data available and maybe entice more developers to take on the challenge to break new ground here.
I am not going to spend a lot of time discussing the software side of this device. I had the opportunity to download a large number of apps and try them out. As a fellow Benchmark Reviews contributing editor said, “There is a ways to go with the software, however, the Leap Motion Controller basically performed as advertised.”
And that is what I found. When playing games that involved 3D movement, throwing, shooting, flying, dodging or the like, the Leap Motion Controller really does shine. There is a ways to go on the physical input side of things, especially typing and grabbing objects, but I am sure that will come.
I would like to see software developed to teach the user to progress from rough to fine control using this device. To refer back to the mouse: early on, playing solitaire was seen as a way to build up speed and control. I would suggest that a similar game be developed so that touch control, motion, and grasping could be practiced, letting users experience the differences between the Leap Motion Controller and a simple mouse. I think it would go a long way toward selling this idea.
HP Pavilion g6-2288ca Notebook
Motherboard: ACPI x64-based PC (Mobile)
System Memory: 7650 MB
Processor: Mobile Quad-Core AMD A10-4600M, 2700 MHz
Audio: IDT 92HD87B2/4 High Definition Audio Controller
Video: AMD Radeon HD 7660G
Disk Drive 1: TOSHIBA MQ01ABD075 SATA Disk Device
Monitor: 15.5″ LED, 1280×800
Operating System: Microsoft Windows 8
Other Peripherals Used:
- Screen 1: Vision Quest VQL 19WSD
- Screen 2: View Sonic VA2703
Testing the Leap Motion Controller
What I will focus on is the input and data representation generated directly from the Leap Motion Controller. I was lucky that, while exploring the Leap Motion Control Panel, I found the Diagnostic Visualizer on the troubleshooting tab.
This program turns out to be very close to a raw feed of data, and it runs very light. As I am writing, I have the Visualizer running and it is using about 9.2 MB of RAM and about 1.5% of the CPU. The Diagnostic Visualizer has a very nice visual presentation where the hands are depicted by circles, the fingers by vectors, and the palm orientation by a large arrow out of the center of each hand.
Hand gestures are displayed as vectors showing direction in 3D, and circular motion can be in any plane or orientation. In addition, each finger is tracked separately and has rotary motion relative to the knuckles, as seen below.
Above you can see, in red, the effect of bending a finger. This brings up another interesting effect that 3D motion using your hand or a stylus creates. When seen in a 2D screen view, a circle can be traced so that it shows up pretty close to what a circle should be. If you rotate your view 90 degrees, as shown below, movement in the 3rd dimension becomes evident. So when using these gestures as input, the developer must be cognizant of this fact.
This is not too surprising once you consider that every movement we make will not be perfectly 2D, because every joint in the human body operates in a rotary fashion. Why bring this up? Well, when you are trying to manipulate something in 2D you may find that as you try to move linearly, the object will move in a path that has unforeseen consequences. Of course, with every challenge is an opportunity. In this case true 3D art can be made possible, and depth can also come into play for game functioning.
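As a rough sketch of the compensation a developer might apply, the out-of-plane component of a traced path can be removed by projecting each point onto the intended working plane. The function name and sample path below are illustrative only; they are not part of the Leap Motion SDK.

```python
# Hypothetical sketch: flatten a 3D hand path onto a 2D working plane,
# discarding the out-of-plane wobble described above.

def project_to_plane(points, normal):
    """Project 3D points onto the plane through the origin with the
    given unit normal, removing motion along the normal."""
    nx, ny, nz = normal
    projected = []
    for x, y, z in points:
        d = x * nx + y * ny + z * nz  # signed distance along the normal
        projected.append((x - d * nx, y - d * ny, z - d * nz))
    return projected

# A "circle" traced with some depth wobble (z should be 0 in the plane z=0).
path = [(1.0, 0.0, 0.3), (0.0, 1.0, -0.2), (-1.0, 0.0, 0.1)]
flat = project_to_plane(path, (0.0, 0.0, 1.0))
# Every projected point now lies in the z = 0 plane.
```

With the depth wobble stripped out, a traced circle stays a circle no matter how the user's wrist rotates during the stroke.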
You may be wondering how accurately objects are tracked with the Leap Motion Controller. The above image is a screen capture of the dynamic tracking data that can be displayed with the Diagnostic Visualizer. This data represents the position, in space, of each finger, with a speed associated with each location. I was unable to determine what units are used, but if they are close to millimetres I would question the need to carry the tracking info to the third decimal place, given the general shakiness of the human hand.
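To illustrate the point about decimal places, a developer could quantize incoming coordinates to a step comparable to natural hand tremor. The step value below is an assumed figure for illustration, not a measured property of the device.

```python
def quantize(value_mm, step_mm=0.5):
    """Snap a coordinate to the nearest step. If natural hand tremor is
    on the order of the step size, finer digits carry no usable signal."""
    return round(value_mm / step_mm) * step_mm

print(quantize(103.728))  # -> 103.5
```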
From the same image I have isolated this graph. While it is not clear exactly what is being displayed, it is related to the data stream. As I mentioned above, this device is tracking multiple objects, measuring data, and calculating motion at an incredible rate. What this graph shows is that the signal is pretty rough, so, once again, the developers must incorporate some data smoothing so that movement can be smooth and not jumpy. It is still extremely impressive. Below is one example where multiple echoes or phantom objects appear; some filtering or predictive movement analysis may assist in reducing the effect.
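One common way to incorporate the kind of smoothing called for above is an exponential moving average over the raw position samples. This is a generic sketch with invented sample data, not the filtering Leap Motion actually uses.

```python
def smooth(samples, alpha=0.3):
    """Exponential moving average: each output blends the new sample
    with the previous smoothed value, damping frame-to-frame jitter."""
    out = []
    prev = samples[0]
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev
        out.append(prev)
    return out

raw = [10.0, 10.8, 9.7, 10.3, 12.5, 10.1]   # jittery position samples
print(smooth(raw))
```

A lower `alpha` gives smoother but laggier motion; tuning that trade-off is exactly the judgment call a gesture developer has to make.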
The image above is of one hand, in this case the green circle. At times, there can be multiple “hands” seen when only one or two are present. What is causing this is not apparent. One possible cause is other IR sources in the room, because this device uses IR to illuminate the objects in the viewing area. Another may be random data points, but I am sure that as this technology matures these issues will be addressed.
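A simple persistence filter is one hypothetical way to suppress such phantoms: a track is only accepted once its ID has survived several consecutive frames. The frame representation below is invented for illustration and does not reflect the actual Leap Motion data format.

```python
def confirmed_tracks(frames, min_frames=3):
    """Keep only track IDs seen in at least min_frames consecutive
    frames; one-frame IR echoes or phantom objects are dropped."""
    streak = {}
    confirmed = set()
    for ids in frames:
        for tid in ids:
            streak[tid] = streak.get(tid, 0) + 1
            if streak[tid] >= min_frames:
                confirmed.add(tid)
        # reset the streak of any ID that vanished this frame
        for tid in list(streak):
            if tid not in ids:
                del streak[tid]
    return confirmed

frames = [{1}, {1, 7}, {1}, {1, 9}, {1}]   # 7 and 9 are one-frame phantoms
print(confirmed_tracks(frames))            # only hand 1 persists
```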
The final issue that I would like to highlight is the inability to “calibrate” your Leap Motion Controller to your hand in the environment where you are working. As the device is activated, a “compensation” takes place for exterior IR sources, but the only re-calibration tool provided is for the sensor itself.
To me it would make sense to provide the operator a way of defining the envelope of performance so that extraneous factors are minimized. For example, reducing sensitivity so only active fingers are detected: there are cases where the sensor “sees” other digits even when they are folded firmly into the palm, creating extraneous tracks that can upset the precision of the movement. Another example: when lateral movement is required, being able to set a sweep area would be handy. The sweep area would treat motion in that direction, within a certain range, as a straight, flat move in 2D.
I am not an expert on this type of sensing. I am just trying to indicate directions where, with some sort of control over the interpretation of the device data, the developer can make the motion correlate to a particular action. I apologize if I sound like I am nitpicking. I am not.
The Leap Motion Controller represents an extremely impressive and truly functional entry into the arena of 3D motion detection and control. The testing has shown that the detection field is accurate and the movement variations are, maybe not limitless, but certainly quite numerous.
I have produced a short (5.25 min) video with some of the testing above to show my results: