updated 08:51 pm EDT, Sat June 9, 2012
System is not ready for retail, tracks items to within five inches
University of Virginia researchers have developed a Microsoft Kinect-based system that keeps track of items as they are moved about the home. The project, Kinsight, uses several Kinect sensor bars running in parallel to feed input to a server running the software. While radio-frequency item-locator technology already exists, the researchers say the Kinect setup costs between one-quarter and one-sixteenth as much as RFID equipment covering a similarly sized area.
Although the software and Kinect combination is still experimental and not ready for retail distribution, it performed well in a real-world simulation. The system doesn't track items in real time. Rather, the software tracks humans, then looks for objects that have changed place in the areas where people moved.
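The track-then-diff idea can be illustrated with a minimal sketch. The function name, the zone-based snapshots, and the dictionaries below are all hypothetical simplifications -- the actual Kinsight pipeline operates on Kinect depth frames, not labeled zones -- but the logic is the same: only attribute an object's move to places a person actually visited.

```python
def find_moved_objects(before, after, visited_zones):
    """Report objects that changed place in zones a person passed through.

    before/after map an object name to a zone name; visited_zones is the
    set of zones where human motion was observed between the snapshots.
    """
    moved = {}
    for obj, old_zone in before.items():
        new_zone = after.get(obj, old_zone)
        # Only attribute a move if the change overlaps where a person went.
        if new_zone != old_zone and {old_zone, new_zone} & visited_zones:
            moved[obj] = (old_zone, new_zone)
    return moved

before = {"keys": "hallway", "mug": "kitchen", "book": "desk"}
after = {"keys": "kitchen", "mug": "kitchen", "book": "sofa"}

# The keys' move is attributed (a person visited both zones); the book's
# move is ignored because nobody was seen near the desk or sofa.
print(find_moved_objects(before, after, {"hallway", "kitchen"}))
# → {'keys': ('hallway', 'kitchen')}
```

Restricting the object search to zones with observed human activity is what lets the system avoid continuously scanning the whole scene.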
Artificial intelligence routines were coded to help the computer learn the appearance and typical context of items -- a user's car keys, for example, are unlikely to be found in the bathtub. To test the system, the scientists labeled 48 items and identified 80 possible locations around a simulated home.
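One plausible way to combine appearance with context is to weight a visual-similarity score by a learned prior over item/location pairs. The sketch below is an assumption on my part -- the function, the score values, and the probability floor are illustrative, not Kinsight's published method -- but it captures how a location prior can break a visual tie like "keys in the bathtub":

```python
def identify(location, appearance_scores, location_prior, floor=0.01):
    """Rank candidate identities for an object detected at `location`.

    appearance_scores: item -> visual similarity in [0, 1]
    location_prior: (item, location) -> learned probability of the item
    appearing there; unseen pairs fall back to a small floor probability.
    Returns the best-scoring candidate.
    """
    return max(
        appearance_scores,
        key=lambda item: appearance_scores[item]
                         * location_prior.get((item, location), floor),
    )

# Two visually similar candidates for an object seen in the bathtub.
appearance = {"car_keys": 0.8, "soap_bar": 0.7}
prior = {("soap_bar", "bathtub"): 0.6, ("car_keys", "bathtub"): 0.001}

print(identify("bathtub", appearance, prior))
# → soap_bar  (0.7 * 0.6 = 0.42 beats 0.8 * 0.001 = 0.0008)
```

Even though the keys score slightly higher on appearance alone, the context prior makes the soap bar the far more plausible identification.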
Volunteers moved the items around at random. The Kinect's range is somewhat limited -- about 15 feet under optimal conditions -- which led to some issues with the software: when items were very small or far away, it misidentified them. The team said these problems could be addressed by adding more sensors or adopting better depth-sensing cameras.
Hardware requirements were fairly modest. The experiments used an unspecified number of Kinect sensors connected to a 2.3GHz Intel Core i5 laptop with 4GB of RAM. Overall, the Kinsight system tracked item locations to within an error of approximately 13cm (5.1 inches).
The Kinect has been a popular device for research and "hacks." A NUI Group forum member constructed a touch table that uses a Kinect sensor and a projector to make it appear that the user is manipulating objects floating in space. The Queen's University Human Media Lab has developed a hologram-like teleconferencing system with six Kinect sensors as the centerpiece of the technology. The patent application for the Kinect suggested that the hardware would have the precision to recognize American Sign Language, but as shipped, the first iteration of the technology is incapable of doing so.