Tag: multi-touch

Low-cost multi-touch surfaces using a Wiimote and IR light pens

Via Hack a day:

Johnny Lee’s back again with his Wiimote interactive whiteboard. Commercial versions of these things are expensive and heavy. His technique doesn’t even need a projector, just a computer, a Wiimote and a simple IR emitting pen. The pen is just a stylus with an infrared LED in the tip.

Johnny Lee is back again indeed: I posted about his method to track your fingers using a Wiimote earlier. This time he uses the Wiimote's infrared camera to track light pens (pens that emit infrared light at the tip) on a surface to create an interactive whiteboard. It's really nice that he can use any surface: a projector in combination with an ordinary projection screen, a wall or a desk. If you don't have a projector, you could turn any LCD display into a tablet surface.

Since the Wiimote can track up to four different points, these surfaces are also multi-touch. This means you can have multi-touch interaction on any projected image. It would be interesting to combine this with a steerable projector system.
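The core trick behind using any surface is a four-point calibration: you touch the pen to four known screen corners, and from the four camera-coordinate readings the software computes a projective (homography) transform from the Wiimote camera's view to screen space. A minimal pure-Python sketch of that idea, assuming made-up calibration values (this is my reconstruction, not Lee's actual code):

```python
def solve_homography(src, dst):
    """Solve for the projective transform (bottom-right entry fixed to 1)
    mapping four source points onto four destination points."""
    # Each correspondence (x, y) -> (u, v) yields two linear equations
    # in the 8 unknown homography coefficients.
    rows, rhs = [], []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); rhs.append(u)
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y]); rhs.append(v)
    # Gaussian elimination with partial pivoting on the 8x8 system.
    n = 8
    M = [row + [r] for row, r in zip(rows, rhs)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return h  # coefficients [a, b, c, d, e, f, g, h]

def camera_to_screen(h, x, y):
    """Map a raw camera coordinate to screen space using the homography."""
    a, b, c, d, e, f, g, hh = h
    w = g * x + hh * y + 1.0
    return ((a * x + b * y + c) / w, (d * x + e * y + f) / w)

# Hypothetical calibration: four camera readings of the screen corners.
cam_corners = [(100, 100), (900, 120), (880, 700), (120, 680)]
screen_corners = [(0, 0), (1280, 0), (1280, 800), (0, 800)]
H = solve_homography(cam_corners, screen_corners)
```

After calibration, every IR blob the camera reports can be pushed through `camera_to_screen` to get a cursor position, which is why the surface itself can be anything.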

[youtube:http://www.youtube.com/watch?v=5s5EvhHy7eQ]

The source code is available. I will definitely keep an eye on his Wii projects page.

Using a Wiimote to realize the Minority Report user interface

Via Gizmodo:

This Wiimote hack is one of the more astounding mods we’ve seen to Nintendo’s pride and joy, but even more remarkably, it’s really only taking advantage of the Wiimote’s IR and Bluetooth capabilities to create what may be the multitouch mecca — multitouch without the touch. So would you wear little reflective rings on your fingers to have tactile control of your television screen? We would. In a heartbeat. And then we’d call Captain Planet to kick some ass when we’re finished watching 30 Rock.

Very cool stuff. Since almost everyone at our institute has a Wii nowadays (including me), this should not be too hard to recreate ourselves.
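Once the Wiimote reports the positions of two tracked markers frame after frame, the "touchless" gestures fall out of simple geometry: the midpoint shift gives a pan, the distance ratio a zoom factor, and the angle difference a rotation. A sketch of that per-frame computation (function name and coordinate conventions are mine, not from the hack's code):

```python
import math

def two_point_gesture(prev, curr):
    """Derive pan, zoom and rotation from two tracked IR points
    across consecutive frames.

    prev, curr: ((x1, y1), (x2, y2)) positions of the two markers.
    Returns (pan, zoom, rotation): midpoint translation, scale ratio,
    and relative twist in radians."""
    (p1, p2), (c1, c2) = prev, curr
    mid = lambda a, b: ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    angle = lambda a, b: math.atan2(b[1] - a[1], b[0] - a[0])
    pm, cm = mid(p1, p2), mid(c1, c2)
    pan = (cm[0] - pm[0], cm[1] - pm[1])      # how far the hand moved
    zoom = dist(c1, c2) / dist(p1, p2)        # fingers spreading or pinching
    rotation = angle(c1, c2) - angle(p1, p2)  # twist between the frames
    return pan, zoom, rotation
```

Feeding this the raw blob coordinates each frame is enough for the zoom-and-rotate photo manipulation seen in the video; the tricky part in practice is keeping track of which blob is which finger when one briefly leaves the camera's view.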

[youtube:http://www.youtube.com/watch?v=0awjPUkBXOU]

The author of the video is Johnny Lee, who works at Carnegie Mellon. I just had a quick look through his impressive list of publications (UIST, SIGGRAPH, DIS, CHI, etc.) and found an interesting paper on how one can predict the task a user is currently performing by analyzing their EEG signals. That one goes on my reading list on general sensing techniques (I hope I find some time soon to start reading papers again).