This page collects videos of various people doing tests and experiments with OpenKinect software, along with Kinect-themed websites:

== Kinect-Themed Sites ==

* http://www.reddit.com/r/openkinect - OpenKinect news and discussion community
* http://www.kinecthacks.com - KinectHacks.com, a source for Kinect hacking community news, walkthroughs and downloads
* http://kinect-hacks.net - Kinect hacks, news, gossip and videos
* http://www.kinectHacks.net - Kinect hacking blog and forum
* http://www.freenect.com - Kinect hacking news
* http://www.modmykinect.com - "The innovation begins."
* http://www.Bodensee-Deal.de - Kinect hacking news in German
* http://www.kinectxbox360.es - Kinect news, hacks and games in Spanish
* http://www.kinectHacks.nl - Dutch Kinect hacking blog and forum
* http://live4d.de/index.php?/XBox-360-Kinect-Sensor-Hacks-Blog/Kinect-Sensor-Hacks-News/ - German Kinect hacking news site

== Demos - Page 1 ==

=== Augmented Reality, Smart Glasses and 2 MS Kinects using 3GearSystem and OpenKinect ===
{|
|{{#ev:youtube|8FggsGUK5iA|320}}
|valign="top"|
* Uses two MS Kinects with 3GearSystem on top of the OpenKinect drivers for hand tracking
* Maps the hand tracking into the RGB world using the RGB cameras
* Coordinates augmented reality and hand tracking from the Kinects
* Allows 3D hand pinch and 3D multi-hand actions, such as pinch-to-zoom in 3D
|}

=== 3D Facial Performance Capture using Kinect ===
{|
|{{#ev:youtube|nYsqNnDA1l4|320}}
|valign="top"|
* Uses the Kinect to capture data (green) of a markerless moving face (distance about 1 m, very limited coverage)
* Maps the rigid and non-rigid face motion to an animatable 3D face model (purple)
* No use of the color information yet
|}

=== Blink Solution brings Vodafone 3G's Super Zoozoo to life ===
{|
|{{#ev:youtube|7AsbEnWYiDw|320}}
|valign="top"|
* What do you get when you combine Super Zoozoo with a Microsoft Kinect, an awesome programmer, a digital artist, a creative genius and some kick-ass visuals?
* Thanks to Blink Solution's hunger for experimentation and a desire to stretch the boundaries of technology and art, we bring you this video. http://digitalanalog.in/2011/03/25/blink-solution-brings-vodafone-3gs-super-zoozoo-to-life/
|}

=== 3D Depth Sculpting using copies of my own body and other objects ===
{|
|{{#ev:youtube|LKjzbyBpkM8|320}}
|valign="top"|
* I became intrigued with the possibility of sculpting in 3D using the Kinect by periodically recording the nearest object to the camera in each part of the image. By taking multiple 3D snapshots at different times and then merging them to show the closest object, I can create a 3D sculpture that I can walk through. Here I merge multiple 3D video streams (a rough sketch of this merging idea appears after this table). http://www.youtube.com/watch?v=LKjzbyBpkM8
|-
|{{#ev:youtube|inim0xWiR0o|320}}
|valign="top"|
* Here I use a "depth strobe", grabbing snapshots once or twice a second.
|-
|{{#ev:youtube|abS7G5ZT17c|320}}
|valign="top"|
* Here I take a single snapshot with some furniture and then remove it, allowing me to wander through the ghost of the furniture. http://www.youtube.com/watch?v=abS7G5ZT17c
|-
|{{#ev:youtube|rBLYsB9BBSk|320}}
|valign="top"|
* Here I update the 3D background continuously, creating a slur of objects. http://www.youtube.com/watch?v=rBLYsB9BBSk Created in Python using the Python wrapper.
|}

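The snapshot-merging trick above boils down to a per-pixel minimum over successive depth frames. The following is a minimal, illustrative sketch of that idea (not the author's actual code), assuming the libfreenect Python wrapper's synchronous interface and NumPy:

<pre>
# Keep-the-closest-surface merge over periodic depth snapshots.
# Assumes the libfreenect Python wrapper (freenect) and NumPy.
import time

import freenect
import numpy as np

def grab_depth():
    """Grab one raw 11-bit depth frame from the default Kinect."""
    depth, _timestamp = freenect.sync_get_depth()
    return depth

def sculpt(num_snapshots=20, interval=0.5):
    """Merge periodic snapshots, keeping the nearest depth value per pixel."""
    sculpture = grab_depth().copy()
    for _ in range(num_snapshots - 1):
        time.sleep(interval)          # "depth strobe": 1-2 snapshots per second
        frame = grab_depth()
        # Smaller raw depth values are closer to the camera, so keep the minimum.
        np.minimum(sculpture, frame, out=sculpture)
    return sculpture

if __name__ == "__main__":
    merged = sculpt()
    print("merged depth map:", merged.shape, merged.min(), merged.max())
</pre>

Rendering the merged array as a point cloud, as in the videos, is left out here; the core of the effect is just the running minimum.
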
=== People Detection ===
{|
|{{#ev:youtube|x--xlKWBTAE|320}}
|valign="top"|
* Real-time people detection using two Kinects. http://www.youtube.com/watch?v=x--xlKWBTAE
|}
 
=== Florian Echtler (floemuc) ===
{|
|{{#ev:youtube|ho6Yhz21BJI|320}}
|valign="top"|
* "Multi-touch" interactions http://www.youtube.com/watch?v=ho6Yhz21BJI Created with the libTISCH multitouch library: http://tisch.sf.net/
|}
 
=== Pete & Matt ===
{|
|{{#ev:youtube|a0mdgdQfa-Q|320}}
|valign="top"|
* Fun with a "Kinect piano"! A vastly improved piano: http://www.youtube.com/watch?v=a0mdgdQfa-Q
|-
|{{#ev:youtube|ppHcj15LypM|320}}
|valign="top"|
* Original version for posterity: http://www.youtube.com/watch?v=ppHcj15LypM
|}
 
=== okreylos ===
{|
|{{#ev:youtube|7QrnwoO1-8A|320}}
|valign="top"|
* 3D depth video capture http://www.youtube.com/watch?v=7QrnwoO1-8A and measuring objects in 3D http://idav.ucdavis.edu/~okreylos/ResDev/Kinect/
|}
 
=== koabi's work ===
* http://www.youtube.com/profile?user=DerKorb#g/u

=== Diarmuid Wrenne diarmuid@bluekulu.com ===
{|
|{{#ev:youtube|qy7lKS6L75w|320}}
|valign="top"|
* A second version of the green-screen demo. This one does movies too.
|-
|{{#ev:youtube|4yp37U-YHv4|320}}
|valign="top"|
* 3D point cloud rendered with boxes, with scaling. Also superimposes 3D models into the scene (a rifle, of course), and uses OSC from an iPad to control the scene. http://www.youtube.com/watch?v=4yp37U-YHv4
|-
|{{#ev:youtube|T8bkAQ-VxXg|320}}
|valign="top"|
* 3D point cloud viewer. http://www.youtube.com/watch?v=T8bkAQ-VxXg
|-
|{{#ev:youtube|bO3YwW3WajI|320}}
|valign="top"|
* Extracting rotation data from real objects and mapping it to new virtual ones. Shows how I can extract the rotation of objects seen by the Kinect and use it to change the orientation of virtual objects within the Box2D space, creating a virtual bat out of a real one. Notice that I have mirrored the color video stream so that it acts more like a mirror than a webcam, letting me overlay the 2D graphics onto the camera images for more realism. http://www.youtube.com/watch?v=bO3YwW3WajI
|-
|{{#ev:youtube|uvP2u2yOcNw|320}}
|valign="top"|
* Driving Quake Live with a Kinect. It uses OpenKinect, the Python bindings and web.py on the Linux box to expose nearest-point data (a minimal sketch of that idea follows this table). The iMac runs Quake and a custom Java program that calls the Linux web server and uses java.awt.Robot to generate mouse and keystroke events. http://www.youtube.com/watch?v=uvP2u2yOcNw Sorry about the resolution, but I'll try to upload a better one later.
|-
|{{#ev:youtube|pR46sXjEtzE|320}}
|valign="top"|
* openFrameworks, Box2D, OpenCV and ofxKinect. This uses the depth map to determine the closest point to the Kinect, then uses that point to draw a line that is part of the Box2D world. The line can be moved around by moving your hand or a magic wand (in my case a roll of string!) so that other objects within the 2D world can be manipulated. Works well. http://www.youtube.com/watch?v=pR46sXjEtzE
|-
|{{#ev:youtube|NlrKcpUPtwM|320}}
|valign="top"|
* openFrameworks, Box2D and OpenCV. Uses the blobs generated by OpenCV contours to create a Box2D object that manipulates other Box2D objects. Works OK, but filtering the blobs is quite error-prone. http://www.youtube.com/watch?v=NlrKcpUPtwM
|-
|{{#ev:youtube|PYq9gkdpiS8|320}}
|valign="top"|
* Using web.py to web-enable a Kinect on Linux. http://www.youtube.com/watch?v=PYq9gkdpiS8
|}

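For the demos above that expose the Kinect over HTTP, the core idea is a tiny web service that reads a depth frame and reports the nearest point. Here is a rough, illustrative sketch of that pattern (the URL and JSON field names are assumptions, not the actual API used in the videos), assuming the libfreenect Python wrapper, NumPy and web.py:

<pre>
# Expose the Kinect's nearest point over HTTP with web.py (illustrative sketch).
import json

import freenect
import numpy as np
import web

urls = ("/nearest", "Nearest")

class Nearest:
    def GET(self):
        depth, _ = freenect.sync_get_depth()          # raw 11-bit depth frame
        valid = np.where(depth < 2047, depth, 2047)   # 2047 means "no reading"
        y, x = np.unravel_index(np.argmin(valid), valid.shape)
        web.header("Content-Type", "application/json")
        return json.dumps({"x": int(x), "y": int(y), "depth": int(valid[y, x])})

if __name__ == "__main__":
    # Serves on port 8080 by default; a client such as the java.awt.Robot
    # bridge described above can poll /nearest to drive mouse and key events.
    web.application(urls, globals()).run()
</pre>
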
  
 
=== Kyle point cloud viewer ===
* Initial render in Processing: http://www.flickr.com/photos/kylemcdonald/5167174610/
* Viewer substituting glview.c: http://www.openframeworks.cc/forum/viewtopic.php?p=24884#p24884
* Advanced viewer with DOF using ofxKinect: http://www.openframeworks.cc/forum/viewtopic.php?p=24958#p24958

=== cclaan ===
{|
|{{#ev:vimeo|16788233|320}}
|valign="top"|
* Point cloud plus color. http://vimeo.com/16788233
|}

{|
|{{#ev:vimeo|25852368|320}}
|valign="top"|
* 3D augmented reality video on iPad http://vimeo.com/25852368
|}
 
=== yankeyan ===
{|
|{{#ev:youtube|fQ59dXOo63o|320}}
|valign="top"|
* Awesome object recognition! http://www.youtube.com/watch?v=fQ59dXOo63o and http://www.youtube.com/watch?v=cRBozGoa69s
|-
|{{#ev:youtube|cRBozGoa69s|320}}
|-
|{{#ev:youtube|3EeJCln5KYg|320}}
|valign="top"|
* Lightsaber tracking and rendering http://www.youtube.com/watch?v=3EeJCln5KYg
|}
 
=== Memo Akten ===
{|
|{{#ev:vimeo|16818988|320}}
|valign="top"|
* Drawing in 3D http://vimeo.com/16818988
|}
 
=== L14M333 ===
{|
|{{#ev:youtube|KsvQpUnlp-Y|320}}
|valign="top"|
* Moving the motor with a GUI while showing the video streams: http://www.youtube.com/watch?v=KsvQpUnlp-Y
|-
|{{#ev:youtube|GWvcgZkADUU|320}}
|valign="top"|
* Plotting accelerometer data to a graph, plus calibration: http://www.youtube.com/watch?v=GWvcgZkADUU (uses CL NUI; will switch later)
|-
|{{#ev:youtube|-3-TA6URf9M|320}}
|valign="top"|
* Using the Kinect as a mouse: http://www.youtube.com/watch?v=-3-TA6URf9M (uses CL NUI; will switch later; a rough sketch of the closest-point-to-cursor idea appears at the end of this section)
|-
|{{#ev:youtube|wBoVrZZlmJ0|320}}
|valign="top"|
* Kinect used to control the mouse (v2), with source: http://www.youtube.com/watch?v=wBoVrZZlmJ0&hd=1
|-
|{{#ev:youtube|tlHzqeIhDiQ|320}}
|valign="top"|
* Turning a TV into a touchscreen using the Kinect: http://www.youtube.com/watch?v=tlHzqeIhDiQ
|-
|{{#ev:youtube|JlwWp8ItCVA|320}}
|valign="top"|
* Another KinectTouch demo, with improved, fully commented source: http://www.youtube.com/watch?v=JlwWp8ItCVA&hd=1
|}

* Single object detection and improved framerate: http://tinyurl.com/handtrack
* Get my latest source code from: http://get.essexconsolerepairs.co.uk/kinecttouch

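The mouse-control demos above boil down to tracking the closest point in the depth image and mapping it to screen coordinates. The videos use CL NUI on Windows; the sketch below is an assumed equivalent using the libfreenect Python wrapper and NumPy, and it only prints the target coordinates, leaving actual cursor injection to whatever input library you prefer:

<pre>
# Map the Kinect's closest depth point to screen coordinates
# (illustrative sketch; not the original CL NUI-based code).
import time

import freenect
import numpy as np

SCREEN_W, SCREEN_H = 1920, 1080        # assumed target screen resolution

def closest_point(depth):
    """Return (x, y) of the nearest valid depth pixel."""
    valid = np.where(depth < 2047, depth, 2047)   # 2047 means "no reading"
    y, x = np.unravel_index(np.argmin(valid), valid.shape)
    return x, y

if __name__ == "__main__":
    while True:
        depth, _ = freenect.sync_get_depth()
        x, y = closest_point(depth)
        h, w = depth.shape
        # Mirror horizontally so moving your hand right moves the cursor right.
        sx = int((w - 1 - x) * (SCREEN_W - 1) / (w - 1))
        sy = int(y * (SCREEN_H - 1) / (h - 1))
        print("cursor target:", sx, sy)
        time.sleep(0.03)               # roughly 30 updates per second
</pre>
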
  
 
=== arne_ ===
Full image of the IR dot field:
* http://livingplace.informatik.haw-hamburg.de/blog/wp-content/uploads/2010/11/kinect1.png
  
 
=== kode80apps ===
{|
|{{#ev:youtube|Y-f_oMOvNAk|320}}
|valign="top"|
* Head tracking for 3D vision using OpenKinect software. http://www.youtube.com/watch?v=Y-f_oMOvNAk
|-
|{{#ev:youtube|CddvvlP4UNM|320}}
|valign="top"|
* http://www.youtube.com/watch?v=CddvvlP4UNM
|}
  
 
=== James Patten ===
{|
|{{#ev:youtube|Bth0TkRLVtk|320}}
|valign="top"|
* Multitouch tracking on arbitrary planes with the Kinect http://www.youtube.com/watch?v=Bth0TkRLVtk
|}
  
 
=== cruxphotography ===
{|
|{{#ev:youtube|e8h6DL0Dc84|320}}
|valign="top"|
* Box cloud http://www.youtube.com/watch?v=e8h6DL0Dc84
|}
  
 
=== Daniel Reetz ===
Removing the IR cut filter from your camera and examining the Kinect IR field, including a high-resolution image:
* http://www.futurepicture.org/?p=97
  
 
=== Emily Gobeille - Theo Watson ===
{|
|{{#ev:vimeo|16985224|320}}
|valign="top"|
* Digital puppetry with the Kinect http://vimeo.com/16985224
|}
  
 
=== DustyDingo ===
Two Kinects facing the same direction, experiments:
* [http://atommuell.mum.jku.at/~aurel/two_kinects_same_direction.png]
* [http://atommuell.mum.jku.at/~aurel/two_kinects_same_direction_c_both.png]
* [http://atommuell.mum.jku.at/~aurel/two_kinects_same_direction_c_single.png]

Rolling shutter experiments:
* Setup: http://atommuell.mum.jku.at/~aurel/kinect_roling_shuter_proto_2.jpg

The top image is a plain Kinect; the other one has a rolling shutter in front of the laser projector. Both Kinects operate at the same time.
* http://atommuell.mum.jku.at/~aurel/two_kinects_same_direction_c_both_one_proj_shutter.png

Holding the Kinect automatically level:
* [http://atommuell.mum.jku.at/~aurel/hold_level_short.avi]
  
=== Zephod / Stijn Kuipers ===
{|
|{{#ev:vimeo|17007842|320}}
|valign="top"|
* Windows RGB-to-Z projection http://vimeo.com/17007842
|}

=== Sebastian Frisch ===
{|
|{{#ev:vimeo|21247205|320}}
|valign="top"|
* Asteroids 3D, an interaction game built with ofxKinect and OpenCV http://vimeo.com/21247205
|}
  
=== Kinect Earth ===
{|
|{{#ev:vimeo|21563702|320}}
|valign="top"|
* Kinect Earth: controlling Google Earth in a browser window with Kinect Core Vision http://vimeo.com/21563702
|}

=== Kinect Juggle ===
{|
|{{#ev:youtube|9AXpnlkqPCI|320}}
|valign="top"|
* Virtual juggling with OpenNI http://www.youtube.com/watch?v=9AXpnlkqPCI
* Source code: http://kinect-juggle.googlecode.com
|}
  
=== Aquila 1.7b - iCub teleoperation ===
{|
|{{#ev:youtube|JDKu4q6dr84|320}}
|valign="top"|
* Website: http://www.martinpeniak.com
* Source code: http://sourceforge.net/projects/aquila/
|}
  
== Demos - Page 2 ==
[[Gallery_2|Next Page]]
