
Difference between revisions of "Gallery"

From OpenKinect
m (Reference to websites in the intro...)
(Constrained all videos to 320x ; borderless tables for compact formatting; pls replicate for further updates; needs to be split...)
 
=== 3D Depth Sculpting using copies of my own body and other objects ===

{|
|{{#ev:youtube|LKjzbyBpkM8|320}}
|valign="top"|
* I became intrigued with the possibility of sculpting in 3D using the Kinect by periodically recording the nearest object to the camera in each part of the image. By taking multiple 3D snapshots at different times and then merging them to show the closest object, I can create a 3D sculpture that I can walk through. Here I merge multiple 3D video streams. http://www.youtube.com/watch?v=LKjzbyBpkM8 (A minimal sketch of this closest-object merge appears after this table.)
|-
|{{#ev:youtube|inim0xWiR0o|320}}
|valign="top"|
* Here I use a "depth strobe", where I grab snapshots 1-2x a second. http://www.youtube.com/watch?v=inim0xWiR0o
|-
|{{#ev:youtube|abS7G5ZT17c|320}}
|valign="top"|
* Here I take a single snapshot with some furniture and then remove it, allowing me to wander through the ghost of the furniture. http://www.youtube.com/watch?v=abS7G5ZT17c
|-
|{{#ev:youtube|rBLYsB9BBSk|320}}
|valign="top"|
* Here I update the 3D background continuously, creating a slur of objects. http://www.youtube.com/watch?v=rBLYsB9BBSk Created in Python using the Python wrapper.
|}
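The merge described above is essentially a per-pixel running minimum over successive depth frames. Below is a minimal sketch of that idea using the libfreenect Python wrapper; the snapshot interval, the 2047 "no data" substitute and the way the result is saved are illustrative assumptions, not the author's actual code.

<pre>
# Sketch: merge depth snapshots by keeping, for every pixel, the closest
# (smallest-depth) reading seen so far -- a per-pixel running minimum.
import time
import numpy as np
import freenect  # libfreenect Python wrapper

sculpture = None          # running "closest object" depth image
SNAPSHOT_INTERVAL = 0.5   # "depth strobe": grab a snapshot 1-2x a second

while True:
    depth, _ = freenect.sync_get_depth()   # 11-bit depth frame, 0 = no reading
    depth = depth.astype(np.uint16)
    valid = depth > 0                       # ignore pixels with no depth

    if sculpture is None:
        sculpture = np.where(valid, depth, np.uint16(2047))
    else:
        # keep whichever reading is nearer to the camera
        sculpture = np.where(valid & (depth < sculpture), depth, sculpture)

    np.save('sculpture.npy', sculpture)     # inspect or render offline
    time.sleep(SNAPSHOT_INTERVAL)
</pre>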
 
  
 
=== People Detection ===

{|
|{{#ev:youtube|x--xlKWBTAE|320}}
|valign="top"|
* Real-time people detection using two Kinects. http://www.youtube.com/watch?v=x--xlKWBTAE
|}
  
 
=== Florian Echtler (floemuc) ===

{|
|{{#ev:youtube|ho6Yhz21BJI|320}}
|valign="top"|
* "Multi-touch" interactions http://www.youtube.com/watch?v=ho6Yhz21BJI Created with the libTISCH multitouch library: http://tisch.sf.net/
|}
  
 
=== Pete & Matt ===

{|
|{{#ev:youtube|VgLp-KyK5g8|320}}
|valign="top"|
* Fun with a "Kinect piano"! A vastly improved piano: http://www.youtube.com/watch?v=VgLp-KyK5g8
|-
|{{#ev:youtube|ppHcj15LypM|320}}
|valign="top"|
* Original version for posterity: http://www.youtube.com/watch?v=ppHcj15LypM
|}
  
=== okreylos ===

{|
|{{#ev:youtube|7QrnwoO1-8A|320}}
|valign="top"|
* 3D depth video capture http://www.youtube.com/watch?v=7QrnwoO1-8A and measuring objects in 3D http://idav.ucdavis.edu/~okreylos/ResDev/Kinect/
|}
 
  
 
=== koabi's work ===

* http://www.youtube.com/profile?user=DerKorb#g/u
 
 
  
 
=== Diarmuid Wrenne diarmuid@bluekulu.com ===

{|
|{{#ev:youtube|qy7lKS6L75w|320}}
|valign="top"|
* A second version of the green screen demo. This one does movies too.
|-
|{{#ev:youtube|4yp37U-YHv4|320}}
|valign="top"|
* 3D point cloud rendered with boxes with scaling. Also superimposes 3D models into the scene (a rifle, of course) and uses OSC from an iPad to control the scene. http://www.youtube.com/watch?v=4yp37U-YHv4
|-
|{{#ev:youtube|T8bkAQ-VxXg|320}}
|valign="top"|
* 3D point cloud viewer. http://www.youtube.com/watch?v=T8bkAQ-VxXg
|-
|{{#ev:youtube|bO3YwW3WajI|320}}
|valign="top"|
* Extracting rotation data from real objects and mapping it to new virtual ones. Shows how I can extract the rotation of objects seen by the Kinect and use it to change the orientation of virtual objects within the Box2d space, creating a virtual bat out of a real one! Note that I have mirrored the color video stream so that it acts more like a mirror than a webcam, letting me overlay the 2D graphics onto the camera images for more realism. http://www.youtube.com/watch?v=bO3YwW3WajI
|-
|{{#ev:youtube|uvP2u2yOcNw|320}}
|valign="top"|
* Driving Quake Live with a Kinect. It uses OpenKinect, the Python bindings and web.py on the Linux box to expose nearest-point data. The iMac runs Quake and a custom Java program that calls the Linux web server, using java.awt.Robot to generate mouse and keystroke events. http://www.youtube.com/watch?v=uvP2u2yOcNw Sorry about the resolution, but I'll try to upload a better one later. (A minimal sketch of such a nearest-point web server follows this table.)
|-
|{{#ev:youtube|pR46sXjEtzE|320}}
|valign="top"|
* Openframeworks, box2d, opencv and ofxkinect. This uses the depth map to determine the closest point to the Kinect and uses that point to draw a line that is part of the Box2d world. The line can then be moved around with your hand or a magic wand (in my case a roll of string!) so that other objects within the 2D world can be manipulated. Works well. http://www.youtube.com/watch?v=pR46sXjEtzE
|-
|{{#ev:youtube|NlrKcpUPtwM|320}}
|valign="top"|
* Openframeworks, Box2d and opencv. Uses the blobs generated by opencv contours to create a Box2d object that manipulates other Box2d objects. Works OK, but filtering the blobs is quite error prone. http://www.youtube.com/watch?v=NlrKcpUPtwM
|-
|{{#ev:youtube|PYq9gkdpiS8|320}}
|valign="top"|
* Using web.py to web-enable a Kinect on Linux. http://www.youtube.com/watch?v=PYq9gkdpiS8
|}
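The "nearest point over web.py" piece above boils down to finding the minimum of the depth image and serving it over HTTP. A hedged sketch of that idea follows; the /nearest URL, the JSON field names and the 2047 "no reading" substitute are illustrative assumptions, not Diarmuid's actual code.

<pre>
# Sketch: expose the nearest depth point over HTTP with web.py so that a
# remote machine (e.g. the iMac running Quake) can poll it.
import json
import numpy as np
import freenect  # libfreenect Python wrapper
import web

urls = ('/nearest', 'Nearest')

class Nearest:
    def GET(self):
        depth, _ = freenect.sync_get_depth()
        depth = depth.astype(np.uint16)
        depth[depth == 0] = 2047                  # treat "no reading" as far away
        y, x = np.unravel_index(np.argmin(depth), depth.shape)
        web.header('Content-Type', 'application/json')
        return json.dumps({'x': int(x), 'y': int(y), 'depth': int(depth[y, x])})

if __name__ == '__main__':
    web.application(urls, globals()).run()        # listens on port 8080 by default
</pre>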
 
  
 
=== Kyle point cloud viewer ===

* Initial render in Processing: http://www.flickr.com/photos/kylemcdonald/5167174610/
* Viewer substituting glview.c: http://www.openframeworks.cc/forum/viewtopic.php?p=24884#p24884
* Advanced viewer with DOF using ofxKinect: http://www.openframeworks.cc/forum/viewtopic.php?p=24958#p24958

=== cclaan ===

{|
|{{#ev:vimeo|16788233|320}}
|valign="top"|
* Point cloud plus color. http://vimeo.com/16788233
|}
  
 
=== yankeyan ===

{|
|{{#ev:youtube|fQ59dXOo63o|320}}
|valign="top"|
* Awesome object recognition! http://www.youtube.com/watch?v=fQ59dXOo63o and http://www.youtube.com/watch?v=cRBozGoa69s
|-
|{{#ev:youtube|cRBozGoa69s|320}}
|-
|{{#ev:youtube|3EeJCln5KYg|320}}
|valign="top"|
* Lightsaber tracking and rendering http://www.youtube.com/watch?v=3EeJCln5KYg
|}
  
 
=== Memo Akten ===

{|
|{{#ev:vimeo|16818988|320}}
|valign="top"|
* Drawing in 3D http://vimeo.com/16818988
|}
  
 
=== L14M333 ===

{|
|{{#ev:youtube|KsvQpUnlp-Y|320}}
|valign="top"|
* Moving the motor with a GUI while showing video streams: http://www.youtube.com/watch?v=KsvQpUnlp-Y
|-
|{{#ev:youtube|GWvcgZkADUU|320}}
|valign="top"|
* Plotting accelerometer data to a graph and calibration: http://www.youtube.com/watch?v=GWvcgZkADUU (uses CL NUI; will switch later)
|-
|{{#ev:youtube|-3-TA6URf9M|320}}
|valign="top"|
* Using the Kinect as a mouse: http://www.youtube.com/watch?v=-3-TA6URf9M (uses CL NUI; will switch later)
|-
|{{#ev:youtube|wBoVrZZlmJ0|320}}
|valign="top"|
* Kinect used to control the mouse (v2), with source: http://www.youtube.com/watch?v=wBoVrZZlmJ0&hd=1
|-
|{{#ev:youtube|tlHzqeIhDiQ|320}}
|valign="top"|
* Turning a TV into a touchscreen TV using Kinect: http://www.youtube.com/watch?v=tlHzqeIhDiQ
|-
|{{#ev:youtube|JlwWp8ItCVA|320}}
|valign="top"|
* Another KinectTouch demo with improved, fully commented source: http://www.youtube.com/watch?v=JlwWp8ItCVA&hd=1
|}

* Single object detection and improved framerate: http://tinyurl.com/handtrack
* Get my latest source code from: http://get.essexconsolerepairs.co.uk/kinecttouch

=== arne_ ===

Full image of the IR dot field:

* http://livingplace.informatik.haw-hamburg.de/blog/wp-content/uploads/2010/11/kinect1.png
  
 
=== kode80apps ===

{|
|{{#ev:youtube|Y-f_oMOvNAk|320}}
|valign="top"|
* Head tracking for 3D vision using OpenKinect software. http://www.youtube.com/watch?v=Y-f_oMOvNAk
|-
|{{#ev:youtube|CddvvlP4UNM|320}}
|valign="top"|
* http://www.youtube.com/watch?v=CddvvlP4UNM
|}
  
 
=== James Patten ===

{|
|{{#ev:youtube|Bth0TkRLVtk|320}}
|valign="top"|
* Multitouch tracking on arbitrary planes with Kinect http://www.youtube.com/watch?v=Bth0TkRLVtk
|}
  
 
=== cruxphotography ===

{|
|{{#ev:youtube|e8h6DL0Dc84|320}}
|valign="top"|
* Box cloud http://www.youtube.com/watch?v=e8h6DL0Dc84
|}
  
 
=== Daniel Reetz ===

Removing the IR cut filter from your camera and examining the Kinect IR field, including a high-res image:

* http://www.futurepicture.org/?p=97
  
 
=== Emily Gobeille - Theo Watson ===

{|
|{{#ev:vimeo|16985224|320}}
|valign="top"|
* Digital puppetry with Kinect http://vimeo.com/16985224
|}
  
 
=== DustyDingo ===

Two Kinects facing the same direction, experiments:
* [http://atommuell.mum.jku.at/~aurel/two_kinects_same_direction.png]
* [http://atommuell.mum.jku.at/~aurel/two_kinects_same_direction_c_both.png]
* [http://atommuell.mum.jku.at/~aurel/two_kinects_same_direction_c_single.png]

Rolling shutter experiments:
* Setup: http://atommuell.mum.jku.at/~aurel/kinect_roling_shuter_proto_2.jpg

The top image is a plain Kinect; the other one has a rolling shutter in front of the laser projector. Both Kinects operate at the same time.
* http://atommuell.mum.jku.at/~aurel/two_kinects_same_direction_c_both_one_proj_shutter.png

Holding the Kinect automatically level:
* [http://atommuell.mum.jku.at/~aurel/hold_level_short.avi]
  
 
=== yoda-- ===

* Fist detection http://yoda-jm.performous.org/freenect/in2.png
  
 
=== Zephod / Stijn Kuipers ===

{|
|{{#ev:vimeo|17007842|320}}
|valign="top"|
* Windows RGB-to-Z projection http://vimeo.com/17007842
|}
  
 
=== Zappadoc / URBI Implementation ===

* URBI/Urbiscript implementation http://www.urbiforge.org/index.php/Modules/UKinect
  
 
=== netpro2k / Dominick D'Aniello ===

{|
|{{#ev:vimeo|17023522|320}}
|valign="top"|
* Quick background removal and parallax demo http://vimeo.com/17023522
|-
|{{#ev:vimeo|17045326|320}}
|valign="top"|
* 3D object manipulation http://vimeo.com/17045326
|-
|{{#ev:vimeo|17075573|320}}
|valign="top"|
* Kinect Kart http://vimeo.com/17075573
|}
  
 
=== evoluce ===

{|
|{{#ev:youtube|M-wLOfjVfVc|320}}
|valign="top"|
* Controlling Windows 7 applications with Kinect http://www.youtube.com/watch?v=M-wLOfjVfVc
|}
  
 
=== [[User:marcan|marcan]] ===

{|
|{{#ev:youtube|rKhW-cvpkks|320}}
|valign="top"|
* The video that started it all: http://www.youtube.com/watch?v=rKhW-cvpkks
|-
|{{#ev:youtube|Q1heqFVrQGU|320}}
|valign="top"|
* Laser projection on a moving surface in perspective, using the Kinect for tracking: http://www.youtube.com/watch?v=Q1heqFVrQGU
|}
  
 
=== [[User:imekinox|imekinox]] ===

{|
|{{#ev:youtube|_9sDY6WO4mY|320}}
|valign="top"|
* "AS3 first communication" http://www.youtube.com/watch?v=_9sDY6WO4mY
|-
|{{#ev:youtube|SvfDvGaY348|320}}
|valign="top"|
* "AS3 second phase, motor & accelerometers" http://www.youtube.com/watch?v=SvfDvGaY348
|-
|{{#ev:youtube|yFy_NYcWDMk|320}}
|valign="top"|
* "AS3 Very simple hand tracking" http://www.youtube.com/watch?v=yFy_NYcWDMk
|-
|{{#ev:youtube|HWIRSQuuqm0|320}}
|valign="top"|
* "AS3 Multiple hand tracking" http://www.youtube.com/watch?v=HWIRSQuuqm0
|}

=== Eric Gradman (egradman) ===

{|
|{{#ev:youtube|uaci4dcZxYE|320}}
|valign="top"|
* "Standard gravity" art project using libfreenect and Python http://www.youtube.com/watch?v=uaci4dcZxYE
|}
 
  
=== Scott Byrns ===

{|
|{{#ev:youtube|oIyTCvi-XSI|320}}
|valign="top"|
* Point cloud in Java. http://www.youtube.com/watch?v=oIyTCvi-XSI
|}

* Poor man's object tracking in Java. http://www.screencast.com/users/mscottb/folders/Jing/media/fca9fb5e-2053-4614-b1f6-047ebbd4a8ad
 
  
 
=== [[User:benxtan|Ben X Tan]] ===

{|
|{{#ev:youtube|z1Wp0HEYxSg|320}}
|valign="top"|
* Actually, this isn't using OpenKinect. I'm using CL NUI at the moment, which is in C#, but I will switch over to the libfreenect C++ version on Mac soon...
* Midi Controller
** v0.1 - http://www.youtube.com/watch?v=z1Wp0HEYxSg
*** http://www.youtube.com/watch?v=OilcNuX404o - TIM885885 using my program to play Chinese music. Actually sounds musical!
** v0.2 - http://www.youtube.com/watch?v=fCQwNLKJlCU
*** Now shows a heat map and plays the C major scale over a 2-octave range. Will be configurable later.
** v0.3 - No video yet, but I just got it to talk to Ableton Live!
|-
|{{#ev:youtube|OilcNuX404o|320}}
|
|-
|{{#ev:youtube|fCQwNLKJlCU|320}}
|
|}
  
 
=== [[User:jmpelletier|Jean-Marc Pelletier]] - Nenad Popov - Andrew Roth ===

* jit.freenect.grab: A Max/Jitter external to grab data from the Kinect. http://jmpelletier.com/freenect/ https://github.com/jmpelletier/jit.freenect.grab

{|
|{{#ev:youtube|WIJA46ocia0|320}}
|valign="top"|
* http://www.youtube.com/watch?v=WIJA46ocia0
|-
|{{#ev:youtube|wS8wyIYn77w|320}}
|valign="top"|
* http://www.youtube.com/watch?v=wS8wyIYn77w
|}
  
 
=== Phillipp Robbel ===

{|
|{{#ev:youtube|dRPEns8MS2o|320}}
|valign="top"|
* Early experiments with a Microsoft Kinect depth camera on a mobile robot base. http://www.youtube.com/watch?v=dRPEns8MS2o
|}
  
 
=== Jasper Brekelmans ===

{|
|{{#ev:youtube|bZi-1NmQAGM|320}}
|valign="top"|
* Kinect 3D PointCloud Scanner. http://www.youtube.com/watch?v=bZi-1NmQAGM&feature=player_embedded
|}
 
=== Thomas Hansen ===

{|
|{{#ev:vimeo|17085043|320}}
|valign="top"|
* Segmenting point cloud pixels based on depth ranges. http://vimeo.com/17085043 (A minimal depth-range masking sketch follows this table.)
|}
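Segmenting by depth range amounts to masking pixels whose depth falls inside a band. A small illustrative sketch follows; the band limits and the freenect capture call are assumptions, not the code used in the video.

<pre>
# Sketch: split a depth frame into "near", "mid" and "far" layers by depth range.
import numpy as np
import freenect  # libfreenect Python wrapper

depth, _ = freenect.sync_get_depth()   # raw 11-bit depth values
layers = {
    'near': (depth > 0) & (depth < 600),
    'mid':  (depth >= 600) & (depth < 850),
    'far':  (depth >= 850) & (depth < 1024),
}
for name, mask in layers.items():
    # each boolean mask selects the point cloud pixels belonging to that range
    print(name, int(mask.sum()), 'pixels')
</pre>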
  
 
=== Preston Holmes (ptone) ===

{|
|{{#ev:youtube|Kg0Rvj-Seto|320}}
|valign="top"|
* Control of DMX-based lighting via Kinect. http://www.youtube.com/watch?v=Kg0Rvj-Seto
|-
|{{#ev:youtube|u2-e9YMdadE|320}}
|valign="top"|
* Controlling GarageBand: http://www.youtube.com/watch?v=u2-e9YMdadE
|}
  
 
=== Peter Nash (pedronash) ===

{|
|{{#ev:youtube|2lF8OJoWogY|320}}
|valign="top"|
* Playing around with Processing and Java - seeing what's fun. My experience so far is mostly in smart calibration. http://www.youtube.com/watch?v=2lF8OJoWogY
|}
  
 
=== Willow Garage ===

{|
|{{#ev:youtube|rYUFu64VXkg|320}}
|valign="top"|
* Several projects including multiple Kinect integration and teleoperation of robots. Read more at http://www.willowgarage.com/blog/2010/11/22/kinect-ros-moving-forward-quickly video: http://www.youtube.com/watch?v=rYUFu64VXkg
|}
  
 
=== Henry Chu ===

{|
|{{#ev:vimeo|16959492|320}}
|valign="top"|
* Reflection - art exhibit. http://vimeo.com/16959492
|-
|{{#ev:vimeo|17040999|320}}
|valign="top"|
* Interacting with a three-second delayed shadow of yourself. http://vimeo.com/17040999
|}
  
=== [[User:pkmital|Parag K Mital]] ===

{|
|{{#ev:vimeo|16964263|320}}
|valign="top"|
* Sound modulating point clouds: http://vimeo.com/16964263
|}
  
 
=== [[User:CADET|CADET]] ===

{|
|{{#ev:vimeo|17135419|320}}
|valign="top"|
* Making the open-source engine 2Real for full-body interaction and mixed-reality applications. Kinect Generative Game Prototype http://www.vimeo.com/17135419
|-
|{{#ev:vimeo|17220351|320}}
|valign="top"|
* Kinect skeleton library, first baby steps http://www.vimeo.com/17220351 (we just started). Center for Advances in Digital Entertainment Technologies http://www.cadet.at
|}
  
 
=== Achim Kern ===

{|
|{{#ev:vimeo|17211596|320}}
|valign="top"|
* Quick test using Kinect to control a digital pin-board in TouchDesigner. http://vimeo.com/17211596
|}
  
 
=== [[User:nzjrs|John Stowers]] ===

{|
|{{#ev:youtube|4EGQZsu-kkk|320}}
|}
  
 
=== Dongchul Kim ([[User:t9t9|T9T9]]) ===

{|
|{{#ev:youtube|_rIV3FMbekM|320}}
|valign="top"|
* OpenKinect group in Korea: http://openKinect.co.kr - 3D Cube Clouds http://www.youtube.com/watch?v=_rIV3FMbekM
|-
|{{#ev:youtube|9XwPpTOPY8k|320}}
|valign="top"|
* Hand Printing http://www.youtube.com/watch?v=9XwPpTOPY8k
|}
  
 
=== Integration in MRPT: Very simple visual SLAM ===

{|
|{{#ev:youtube|GLaPwKbrbIM|320}}
|valign="top"|
* '''Part 1''': Example code for how to grab Kinect observations, do multi-threading, convert range data to a 3D point cloud and render it in real time. More info [http://www.mrpt.org/Kinect_and_MRPT here]. (A minimal depth-to-point-cloud sketch appears after this table.)
|-
|{{#ev:youtube|dGVnPvgqu3M|320}}
|valign="top"|
* '''Part 2''': Very simple demonstration of how to grab Kinect observations, perform visual feature tracking and build a 3D map (SLAM). More info [http://www.mrpt.org/Kinect_and_MRPT here].
|}
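Converting Kinect range data to a 3D point cloud is a per-pixel back-projection through the depth camera intrinsics. The Python sketch below illustrates the idea only; the intrinsics and the raw-value-to-metres formula are commonly quoted approximations, not MRPT's calibration, and the capture call assumes the libfreenect Python wrapper.

<pre>
# Sketch: back-project a Kinect depth frame into an Nx3 point cloud (metres).
import numpy as np
import freenect  # libfreenect Python wrapper

# Approximate depth-camera intrinsics (focal lengths and principal point, pixels).
FX, FY, CX, CY = 594.21, 591.04, 339.5, 242.7

def raw_to_metres(raw):
    # widely used approximation for the 11-bit raw values; 0 / saturated = invalid
    with np.errstate(divide='ignore'):
        z = 1.0 / (raw * -0.0030711016 + 3.3309495161)
    z[(raw <= 0) | (raw >= 1024)] = 0.0
    return z

depth, _ = freenect.sync_get_depth()
z = raw_to_metres(depth.astype(np.float64))
v, u = np.mgrid[0:depth.shape[0], 0:depth.shape[1]]
x = (u - CX) * z / FX
y = (v - CY) * z / FY
points = np.dstack((x, y, z)).reshape(-1, 3)
points = points[points[:, 2] > 0]   # drop pixels with no valid depth
print(points.shape[0], 'points')
</pre>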
  
 
=== Kinect running a website ===

{|
|{{#ev:youtube|1QjYIrNWxks|320}}
|valign="top"|
* First Kinect-compatible website http://www.youtube.com/watch?v=1QjYIrNWxks
|}
  
 
=== Martin Kaltenbrunner ===

{|
|{{#ev:youtube|vZSEEnMP6pg|320}}
|valign="top"|
* TuioKinect tracks simple hand gestures using the Kinect controller and sends control data based on the [http://tuio.org/ TUIO] protocol. This is a preliminary proof-of-concept implementation, which still needs several improvements to become fully usable. Nevertheless, it should work out of the box with most TUIO client applications. You can download the source code and a Mac binary from its [http://code.google.com/p/tuiokinect/ project page].
|-
|{{#ev:youtube|DUCZ5L5Y578|320}}
|valign="top"|
* The Therenect is a virtual Theremin, which defines two virtual antenna points that allow controlling the pitch and volume of a simple oscillator. The distance to these points can be controlled by freely moving the hand in three dimensions or by reshaping the hand, which allows gestures quite similar to playing an actual Theremin. (A minimal distance-to-pitch mapping sketch follows this table.)
|}
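The Therenect mapping described above, distance to one virtual antenna controlling pitch and distance to the other controlling volume, can be prototyped in a few lines. The sketch below is purely illustrative; the antenna positions, frequency range, reach limits and the hand coordinate source are placeholder assumptions, not the Therenect's actual implementation.

<pre>
# Sketch: map the distance from a tracked hand point to two virtual "antennas"
# onto oscillator pitch and volume, Theremin-style.
import numpy as np

PITCH_ANTENNA = np.array([0.3, 0.0, 1.0])    # metres, camera coordinates
VOLUME_ANTENNA = np.array([-0.3, 0.0, 1.0])

def theremin_params(hand_xyz,
                    pitch_range=(220.0, 880.0),   # Hz
                    reach=(0.05, 0.7)):           # metres of useful hand travel
    hand = np.asarray(hand_xyz, dtype=float)
    d_pitch = np.linalg.norm(hand - PITCH_ANTENNA)
    d_vol = np.linalg.norm(hand - VOLUME_ANTENNA)

    def norm(d):
        return np.clip((d - reach[0]) / (reach[1] - reach[0]), 0.0, 1.0)

    # closer to the pitch antenna -> higher pitch; closer to the volume antenna -> louder
    freq = pitch_range[0] + (1.0 - norm(d_pitch)) * (pitch_range[1] - pitch_range[0])
    volume = 1.0 - norm(d_vol)
    return freq, volume

print(theremin_params([0.25, 0.05, 0.9]))   # e.g. a hand position from a tracker
</pre>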
  
 
=== Switch On The Code ===

* Kinect getting started tutorial using libfreenect and the C# wrapper.
* Demonstrates how to display the RGB data and depth data as a grayscale depth map (a rough Python equivalent is sketched below).

http://www.switchonthecode.com/tutorials/kinect-tutorial-hacking-101
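The depth-to-grayscale step amounts to scaling the 11-bit depth values into the 0-255 range. The tutorial itself uses the C# wrapper; the sketch below is a rough Python equivalent assuming the libfreenect Python wrapper and OpenCV, with purely illustrative window handling.

<pre>
# Sketch: show the RGB stream and the depth stream as an 8-bit grayscale image.
import cv2
import numpy as np
import freenect  # libfreenect Python wrapper

while True:
    rgb, _ = freenect.sync_get_video()    # 640x480x3, RGB order
    depth, _ = freenect.sync_get_depth()  # 640x480, 11-bit values (0-2047)

    gray = (depth.astype(np.float32) / 2047.0 * 255.0).astype(np.uint8)

    cv2.imshow('rgb', cv2.cvtColor(rgb, cv2.COLOR_RGB2BGR))
    cv2.imshow('depth', gray)
    if cv2.waitKey(10) == 27:             # Esc to quit
        break

cv2.destroyAllWindows()
</pre>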
  
 
=== RGBDemo: 3D Scene Reconstruction and people tracking/height estimation ===

{|
|{{#ev:youtube|2ml8GiUPTao|320}}
|valign="top"|
* Freehand scanning of a room using image-based relative pose estimation.
|-
|{{#ev:youtube|nnCDOKLuu0g|320}}
|valign="top"|
* People detection with height estimation: http://www.youtube.com/watch?v=nnCDOKLuu0g
|}

Software available at http://nicolas.burrus.name/index.php/Research/KinectRgbDemoV4
  
 
=== GeckoSystems Mobile Robot ===

{|
|{{#ev:youtube|kn93BS44Das|320}}
|valign="top"|
* A GeckoSystems Carebot(tm) has been retrofitted with a pair of Kinects to improve data acquisition for its autonomous navigation software, GeckoNav.
|}
  
 
=====3D Body Scan Using Multiple Kinects - Data and Interference Analysis=====

{|
|{{#ev:youtube|9MwS_nk9n2A|320}}
|valign="top"|
* We simulated a full body scanner using data from 4 views. http://www.youtube.com/watch?v=9MwS_nk9n2A
|}
  
 
=== MikuMikuDance ===

{|
|{{#ev:youtube|JQvLt7DQhaI|320}}
|valign="top"|
* MikuMikuDance is a free dance simulator that uses its own file format for 3D human characters and performs a dance by editing the position of "bones" inside the model, frame by frame.
* It is a companion program to a non-free singing text-to-speech program called Vocaloid.
* The later versions can match the 3D animated character's position to Kinect motion data.
|}
  
 
=== Gestural Interaction for Training Simulations ===

{|
|{{#ev:youtube|-lLCu22INT8|320}}
|valign="top"|
* We used the Microsoft Kinect to create a simple free-handed interface for navigating a 3D world and performing triage.
* We also developed a walking system for physical exertion.
* We actually use the OpenNI software with the driver package provided by avin and the Unity wrapper provided by tinkerer.
** http://www.youtube.com/watch?v=-lLCu22INT8
|}

Revision as of 08:38, 20 February 2011

This page collects videos of various people doing tests and experiments with OpenKinect software, and Kinect-themed websites.
