The joke that started it all….
The average gamer is not the only one excited about the possibilities the Xbox Kinect offers. Researchers have embraced Microsoft’s piece of ingenuity as well. Today’s new ‘webcam’ is driving advances in science, and the possibilities seem endless.
Overtaken by your own April Fools’ joke – it happened to Google in 2011. Google launched Gmail Motion, a spoof program that lets you operate your e-mail by waving your hands in front of the webcam: lick an imaginary stamp, paste it on an equally imaginary letter, and the program sends a new e-mail. Hilarious of course, but it became much funnier when some hobbyists actually made it work – and even with technology from Google’s biggest rival: Microsoft’s Xbox Kinect.
For those who don’t know the Kinect: it is an add-on for the popular Xbox 360 game console. It looks like an elongated webcam with three eyes, but under that sleek exterior lies an arsenal of advanced technology. The device records sound, movement and depth, allowing you to play games with your body instead of a controller. The Kinect was Microsoft’s answer to the competition, in particular to the motion controller that rival Nintendo introduced with the Wii. The device was an instant hit: in the first ten days it went over the counter a million times (!), good for a Guinness World Record in the category ‘fastest-selling piece of consumer electronics ever’.
Fun for gamers, then, but people outside the game world soon discovered the value of the Kinect as well. Additional encouragement came from do-it-yourself electronics manufacturer Adafruit Industries, which offered $1,000 to the first hacker who got the Kinect’s combination of a camera, depth sensor and a pair of microphones to work on a PC. Once that was done, the possibilities were endless – and not only for hackers with too much free time. Scientists can now use the Kinect in applications that require registering movement and depth: from robotic eyes to equipment for mapping glaciers!
Kinect is a device that is genuinely changing our future! In June 2011, Microsoft released a Windows 7-compatible test version of the software behind the successful Kinect motion-sensing game device, hoping that developers would build a variety of “hands-free” features for standard PCs.
Motion capture can easily be applied to machines and robots. It opens up possibilities such as sending a diving robot to an oil leak and having someone, perhaps even on the other side of the planet, control its movements!
Hatsune Miku and friends are among the most famous singers in Asia. And yet they are animations. Their concerts in the largest arenas and concert halls sell out in minutes….
The music is performed live. The singing comes live from a keyboard in which all the voice sounds are programmed. And Hatsune herself is live thanks to Kinect: the dancer behind her can react live to the music and singing, so Hatsune can interact with the audience, but also with props and background video.
Imagine Justin Bieber being able to sing in five arenas at the same time: he stands in front of a Kinect device in his own bedroom, and without ever leaving his home he can perform all over the world. The band can be in a studio on the other side of the planet; all Bieber has to do is push his bed aside and start dancing and singing….
In earlier blogs I wrote about the opportunities for holograms: it’s the start of a whole new virtual world, a new generation…
Remember those very expensive cars that are programmed to drive all by themselves? It takes teams years and millions of dollars to develop such a car.
Chaotic Moon Labs’ “Board of Awesomeness” is intended as a technology teaser to show how perceptive computing can transform the way we look at user experiences. The project combines a Microsoft Kinect device, a Samsung Windows 8 tablet, a motorized longboard, and some standard and custom hardware to create a longboard that watches the user to determine what to do, rather than having the operator use a wired or wireless controller. It uses video recognition, speech recognition, localization data, accelerometer data and other factors to work out what the user wants, and lets the board follow the operator’s commands without additional aid.
Soon the first cars may appear that have a Kinect on all sides and just a simple laptop that controls the whole driving…
Helping the blind
Two computer science students from the University of Pennsylvania, Eric Berdinis and Jeff Kiske, have hacked together a very impressive tactile feedback system for the visually impaired using a Microsoft Kinect device and a number of vibration actuators. The Kinecthesia is a belt-worn camera system that detects the location and distance of objects in front of the wearer using depth information from the Kinect sensor. This information is processed on a BeagleBoard open computer platform and then used to drive six vibration motors located to the left, center and right of the user. The video below shows a demo of the system in use and gives a quick explanation of its operation.
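The core idea – split the depth image into zones and vibrate each zone’s motor harder as obstacles get closer – can be sketched in a few lines of Python. The function, thresholds and zone layout below are my own illustrative assumptions, not the students’ actual code:

```python
import numpy as np

def depth_to_motor_levels(depth_frame, near_mm=500, far_mm=3000):
    """Map a depth frame (millimeters) to three vibration levels in 0.0-1.0
    for left, center and right motors: the closer the nearest obstacle in a
    zone, the stronger that zone's vibration."""
    h, w = depth_frame.shape
    zones = [depth_frame[:, : w // 3],            # left third of the view
             depth_frame[:, w // 3 : 2 * w // 3], # center third
             depth_frame[:, 2 * w // 3 :]]        # right third
    levels = []
    for zone in zones:
        valid = zone[(zone >= near_mm) & (zone <= far_mm)]
        if valid.size == 0:
            levels.append(0.0)   # nothing in range: motor off
            continue
        nearest = valid.min()
        # Linear ramp: obstacle at near_mm -> 1.0, at far_mm -> 0.0
        levels.append(float((far_mm - nearest) / (far_mm - near_mm)))
    return levels  # [left, center, right]

# Synthetic 480x640 frame: everything out of range except a close object left
frame = np.full((480, 640), 4000, dtype=np.uint16)
frame[:, :100] = 600   # obstacle roughly 0.6 m away on the left
left, center, right = depth_to_motor_levels(frame)
```

On real hardware the frame would come from the Kinect’s depth stream and the levels would drive the belt’s PWM outputs; here a synthetic frame stands in for the sensor.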
A team at Sunnybrook has come up with a novel medical use for the Xbox Kinect. The following video shows how doctors at Sunnybrook Health Sciences Centre in Canada are using the Kinect to solve the “touching dirty things” problem: because they can simply gesture in the air, they can control the computer and adjust images without ever breaking the sterile field.
Watch the video. It’s amazing, and shows where medicine might go with a little help from Kinect. Welcome Kinect, the newest member of the surgical team.
Kinect can also help you when shopping. Women can find out whether a dress looks good on them without even trying it on. It could speed shopping up a little. Or not, because now they can check all the dresses in the shop…
With more high-tech options the possibilities are endless….. Watch this and be amazed!
While the military is spending billions and billions on the most advanced radar systems and software to fly their unmanned vehicles, some geeks achieved a similar result with the little piece of equipment called Kinect.
Data transport via light
And now it gets really spooky: data can travel via light. Point the camera of your phone or tablet at a light source, and the very fast flickering of that light (made possible by LEDs) can feed your device data at a speed of up to 800 megabits per second!
So what can this Kinect device do?
Kinect is smarter than your average webcam. First, it has excellent sight thanks to its secret trick: flooding the room with invisible infrared light. The camera sees you wonderfully thanks to this infrared. And, coupled with some advanced software running on the 360, it can track 48 points of your body in real time, for up to two players simultaneously.
Kinect is equipped with a microphone so you can talk to the 360. Also, it doesn’t just see you in IR; it can also film you in full RGB color, recognizing your face to automatically sign you in. And its tilt? Fully motorized to track you!
Kinect is truly impressive in our early hands-on tests, no doubt. It can track your full body as you spike a volleyball, or it can just watch your hands as you mime a steering wheel. But there is a perpetual, slight lag, and frame rates can suffer even in the relatively simple games – most probably because Kinect requires the Xbox 360 to process all of its data; there is no internal processor in the final build of the device.
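Under the hood, the depth camera does not report meters but raw 11-bit disparity values. The OpenKinect community fitted an approximate conversion curve to real measurements – an unofficial approximation, not Microsoft’s calibration – which looks like this in Python:

```python
import math

def raw_to_meters(raw):
    """Approximate conversion of a raw 11-bit Kinect depth value to meters,
    using the curve fitted by OpenKinect contributors. Only meaningful in
    the sensor's usable range, roughly 0.5 m to 5 m."""
    return 0.1236 * math.tan(raw / 2842.5 + 1.1863)

d = raw_to_meters(600)   # a bit over 0.7 m for this raw reading
```

Larger raw values mean objects farther away, and the tangent makes the resolution coarser with distance, which is why the sensor is most accurate up close.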
The Kinect SDK for scientific, commercial and entertainment use
The SDK includes Windows 7-compatible PC drivers for the Kinect device. It exposes the Kinect’s capabilities so developers can build their own applications, and includes the following features:
- Raw sensor streams: Access to low-level streams from the depth sensor, color camera sensor, and four-element microphone array.
- Skeletal tracking: The capability to track the skeleton image of one or two people moving within the Kinect field of view for gesture-driven applications.
- Advanced audio capabilities: Audio processing includes sophisticated acoustic noise suppression and echo cancellation, beamforming to identify the current sound source, and integration with the Windows Speech Recognition API.
- Sample code and documentation.
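To give an idea of what a “gesture-driven application” on top of the skeletal tracking stream looks like, here is a minimal Python sketch. The joint dictionary is a simplified stand-in of my own invention – the real SDK delivers its 20-joint skeletons through native C++/C# APIs – but the logic (compare joint positions per frame) is the same:

```python
def detect_raised_hands(joints):
    """Given a skeleton as a dict of joint name -> (x, y, z) in meters
    (a simplified stand-in for the SDK's skeletal stream), return the
    names of any hands raised above the head."""
    head_y = joints["head"][1]
    return [hand for hand in ("hand_left", "hand_right")
            if joints[hand][1] > head_y]

# One synthetic skeleton frame
skeleton = {
    "head":       (0.00, 1.60, 2.0),
    "hand_left":  (-0.30, 1.10, 2.0),  # hanging by the hip
    "hand_right": (0.25, 1.85, 2.0),   # raised above the head
}
raised = detect_raised_hands(skeleton)
print(raised)  # ['hand_right']
```

A real application would run a check like this on every frame of the skeletal stream and fire an action (say, “next slide”) when the gesture appears.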
There is already a lot of open source software available, for example on this site: http://openkinect.org/wiki/Main_Page …