updated 08:35 am EDT, Thu September 4, 2008
Apple Multi-Touch Fusion
Apple has been mulling a multi-touch system that would fold voice and visual data into the control scheme for its devices, according to a new US patent filing. Arguing that some actions are better performed with senses beyond touch alone, the company suggests in one implementation that voice could alter certain properties of an object the user is touching. A visual editor could let a user speak color changes for an object selected with a finger, while a word processor could let users drag and select text with their fingers but dictate font changes aloud.
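The filing does not disclose any API, but the touch-plus-voice idea can be illustrated with a minimal sketch: touch establishes which object is selected, and a subsequent spoken command modifies a property of that object. All class and method names below are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Shape:
    """A hypothetical on-screen object selectable by touch."""
    name: str
    color: str = "black"

class FusionEditor:
    """Illustrative sketch of touch/voice fusion (names are assumptions,
    not from the patent): touch selects, voice modifies."""

    def __init__(self):
        self.selected = None

    def on_touch(self, shape):
        # Touch input establishes the target of later voice commands.
        self.selected = shape

    def on_voice(self, command):
        # A spoken command like "color red" alters the touched object.
        if self.selected is None:
            return
        verb, _, arg = command.partition(" ")
        if verb == "color" and arg:
            self.selected.color = arg

editor = FusionEditor()
circle = Shape("circle")
editor.on_touch(circle)       # user taps the circle
editor.on_voice("color red")  # user says "color red"
print(circle.color)           # red
```

The design point is that neither channel alone is enough: the voice command carries the new value, while the touch gesture disambiguates which object it applies to.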
The proposed technology would also allow cameras or other visual sensors to intelligently recognize the user's intentions, such as through the user's gaze at a particular object onscreen, the profile of the user's hands relative to the display, or the mood expressed by the user's face. A user's visible frustration could tell the computer that it is misinterpreting a command, Apple says.
Additional information could also come from more advanced data generated by the user's grip, including modifying a touched object based on the angle and motion of the device itself, the size of fingers and hands, and biological data such as body temperature or heart rate.
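As a rough sketch of how grip and motion data might modify a touched object, one could imagine an input pipeline deriving, say, a drawing brush size from both finger width and device tilt. The function name, the scaling rule, and the tilt range below are all assumptions for illustration; the filing describes no specific formula.

```python
import math

def adjusted_brush_size(finger_width_mm, tilt_deg):
    """Hypothetical fusion of touch and motion data: derive a brush
    size from the measured finger width, then attenuate it as the
    device tilts away from the user (tilt clamped to 0-60 degrees)."""
    base = finger_width_mm * 2.0
    tilt = max(0.0, min(60.0, tilt_deg))
    return base * math.cos(math.radians(tilt))

print(round(adjusted_brush_size(8.0, 0.0), 2))   # 16.0 (device flat)
print(round(adjusted_brush_size(8.0, 60.0), 2))  # 8.0 (device tilted)
```

The idea mirrors the filing's suggestion: the same touch can produce different results depending on sensor data such as the angle and motion of the device or the size of the user's fingers.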
Apple describes the possibilities for these systems broadly, saying they could apply to cellphones, computers, and portable media players, among other devices. It notes specifically, however, that the camera needed for the visual aspects of the system could be the iSight camera built into one of its MacBook computers, and it provides an illustration suggesting that an iMac or a stand-alone display with a camera could also be used.
The inventors are listed as Wayne Westerman and John Elias, founders of the pioneering multi-touch firm FingerWorks, which Apple ultimately acquired and which has been the source of several of Apple's multi-touch patents.