On September 7, the US Patent & Trademark Office published Apple’s patent application titled “Multi-functional hand-held device” originally filed in March 2006.
In light of Apple’s September 12th “Showtime” media event – this patent holds the potential of providing us with valuable insights into a new iPod and/or other hand-held device that could debut on Tuesday. Telephony is covered in this patent. The unique aspect of this chameleon-style unit, however, is that it holds the potential of changing its UI to match the functionality in use. The patent notes:
“Before a particular device functionality can be used, it typically must be selected for use. The selection can come in a variety of forms. For example, the selection may be made via a main menu that includes soft buttons or icons that, when selected, activate the device functionality associated with the soft button. During activation, the GUI for that particular device is brought into view on the display (see FIGS. 9-17 – below) and the software associated with the device is installed, loaded or activated. From that point on, the multi-functional device operates like the selected device.”
This powerful patent covers the following topics: Touch Screen, Touch Sensitive Housing, Display Actuator, Multi-Functionality, Form Factor, One-Handed vs. Two-Handed Operation, Footprint/Size, Full Screen Display, Limited Number of Mechanical Actuators, Adaptability, GUI Based on Functionality, Switching Between Devices (GUI), Operating at Least Two Functionalities Simultaneously, Configurable GUI (User Preferences), Input Devices, Pressure or Force Sensing Devices, Force Sensitive Housing, Motion Actuated Input Device, Mechanical Actuators, Microphone, Image Sensor, Touch Gestures, 3-D Spatial Gestures, Perform Action Based on Multiple Inputs, Differentiating Between Light and Hard Touches, Example of a New Touch Vocabulary, Speaker, Audio/Tactile Feedback Devices, Communication Devices (wired & wireless) and Change UI Based on Received Communication Signals.
Apple’s Abstract: Disclosed herein is a multi-functional hand-held device capable of configuring user inputs based on how the device is to be used. Preferably, the multi-functional hand-held device has at most only a few physical buttons, keys, or switches so that its display size can be substantially increased. The multi-functional hand-held device also incorporates a variety of input mechanisms, including touch sensitive screens, touch sensitive housings, display actuators, audio input, etc. The device also incorporates a user-configurable GUI for each of the multiple functions of the devices.
Apple’s Summary: Disclosed herein is a multi-functional hand-held device capable of configuring user inputs based on how the device is to be used. Preferably, the multi-functional hand-held device has at most only a few physical buttons, keys, or switches so that its display size can be substantially increased. In other words, by eliminating physical buttons, keys, or switches from a front surface of an electronic device, additional surface area becomes available for a larger display. Ultimately this strategy would allow a substantially full screen display. As used herein, a full screen display is a display that consumes, or at least dominates, a surface (e.g., front surface) of an electronic device. Various embodiments of a multi-functional hand-held device are discussed below with reference to FIGS. 2-28.
Electronic device manufacturers have discovered the advantages of combining separate hand-held electronic devices to form multi-function devices. By having a single multi-function device, a user is not burdened with carrying, purchasing, and maintaining multiple devices. Further, the user is not limited in the operations that can be performed, i.e., the user can perform different operations with a single device that would have otherwise required the use of different devices.
FIGS. 1A-1F are diagrams of various electronic devices.
As used herein, the term “multi-functional” is used to define a device that has the capabilities of two or more traditional devices in a single device. The multi-functional device may, for example, include two or more of the following device functionalities: PDA, cell phone, music player, video player, game player, digital camera, handtop, Internet terminal [and/or] GPS or remote control. For each new device functionality that is added to a single device, the complexity and size of the device tends to increase. Therefore, with hand-held devices, there is typically a trade-off between keeping the footprint small and complexity low while still maximizing the functionality of the device.
In some cases, combining devices may result in redundant hardware components, which allows components to be used for multiple different device functionalities. In other cases, certain hardware components are distinct to each device and therefore additional space and connectivity must be made available. Furthermore, each device functionality typically has its own programming or application software and, therefore, the multifunction device must be designed with enough memory to accommodate all the various software components.
A personal digital assistant (PDA) is a mobile hand-held device that provides computing and information storage and retrieval capabilities for personal and/or business use. PDAs are generally capable of tracking names, addresses, phone numbers and appointments. They are also often capable of taking notes, performing calculations, paging, data messaging, and electronic mail. PDAs may also include functionality for playing simple games, music, and other media files. Examples of PDAs include the Palm Pilot and Blackberry.
Like most hand-held devices, PDAs typically include a display and various input devices. The input devices may include a stylus and touch screen that work in combination with a handwriting recognition program, keypads, mini-keyboards, navigation pads, and/or soft or fixed function buttons.
Cell phones are mobile telephones that allow a user to connect to other telephones using a cellular network. Cell phones typically include a transceiver for transmitting and receiving telephone calls, controls such as a navigation pad for traversing through a display, a keypad for making numeric entries (and in some cases alphabetic entries), and soft or fixed function buttons. For example, in many cell phones one fixed function button is used for starting a call and another fixed function button is used for ending a call.
Media players come in a variety of forms. Music players are generally configured to store, process and output music. Music players can be based on the MP3 or AAC format, which is a compression system for music. Music Players typically include a microprocessor, memory, display, audio jack, data port and playback controls. The playback controls typically include features such as menu, play/pause, next, previous, volume up, and volume down. Video players are similar to music players in most respects. In some cases, they may include a data storage device for receiving a removable storage medium such as a DVD. The iPod [RTM.] media player manufactured by Apple Computer, Inc. of Cupertino, Calif. is one example of a media player.
Handtops are general purpose computers similar to laptops, but in a smaller form factor. Handtops typically include a display and a full keyboard.
Simplified diagram of multi-functional hand-held device
FIG. 2 is a simplified diagram of a multi-functional hand-held device 100. The multi-functional hand-held device 100 integrates at least two devices 102 into a single device. Each device 102 includes both hardware and software components 104 and 106, which are integrated into multi-functional hand-held device 100. It should be pointed out that the multi-functional hand-held device 100 is not limited to only two devices, and may in fact integrate any number of devices.
Multi-functional device 100 also includes switch 110, which allows multi-functional device 100 to be switched from one device operating mode to another device operating mode. For example, switch 110 may allow a user to cycle through cell phone, media player, and PDA modes of operation. Once a particular operating mode is selected, the multi-functional device 100 operates as the selected device. For example, the programming related to the selected device is activated for use by the multi-functional hand-held device. The programming may include reconfiguring the UI based on the selected device so that the inputs made by the user correlate to the device in use. For example, the functions of any physical buttons, switches or dials as well as soft buttons, switches or dials can be reconfigured to correspond to the selected device.
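The mode-switching behavior described for switch 110 can be sketched in a few lines. This is an illustrative mock-up, not the patent's implementation; the class, mode names, and control maps below are all hypothetical:

```python
# Hypothetical sketch of switch 110 from FIG. 2: pressing the switch
# cycles through operating modes, and the control map is reconfigured so
# the same physical inputs trigger mode-specific actions. All names
# (MultiFunctionDevice, MODES, CONTROL_MAPS) are illustrative.

MODES = ["cell phone", "media player", "PDA"]

CONTROL_MAPS = {
    "cell phone":   {"button_a": "start call", "button_b": "end call"},
    "media player": {"button_a": "play/pause", "button_b": "next track"},
    "PDA":          {"button_a": "new note",   "button_b": "contacts"},
}

class MultiFunctionDevice:
    def __init__(self):
        self.mode_index = 0  # start in the first operating mode

    @property
    def mode(self):
        return MODES[self.mode_index]

    def press_switch(self):
        # Cycle to the next operating mode, as switch 110 does.
        self.mode_index = (self.mode_index + 1) % len(MODES)

    def press(self, button):
        # The same physical button performs a mode-specific action.
        return CONTROL_MAPS[self.mode][button]

device = MultiFunctionDevice()
print(device.press("button_a"))  # "start call" in cell phone mode
device.press_switch()            # switch to media player mode
print(device.press("button_a"))  # "play/pause" in media player mode
```

The key idea the patent describes is that the input hardware stays fixed while the mapping from inputs to functions is swapped per mode.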
However, the operating modes of multi-functional hand-held device 100 need not be completely independent. In many cases, it will be desirable to allow the multiple functionalities to interact with each other. For example, a user may look up a telephone number of a contact in the PDA and pass this number to the phone to be dialed.
Patent FIG. 3 is a perspective view of a substantially full screen multi-functional hand-held device 120 with a limited number of buttons. There are no physical buttons on the front and side surfaces 124 and 126. The front surface is used entirely for the display 122. Further, because the sides 126 are used for grasping the device 120, it may be preferred to leave the sides free from buttons to prevent accidental actions in the event a user inadvertently presses a button while supporting the device. Although the top surface 128 and bottom surface 130 would not typically be used to hold the device, these surfaces are not ideal locations for buttons that are often actuated because it would be awkward to reach these buttons when operating the device with one hand.
The top surface 128 may be reserved for buttons that have limited action and generic functions that are cross-functional, for example, power and hold switches. The top and bottom surfaces 128 and 130 are also well suited for placement of I/O and communication ports. The top surface 128 may, for example, include a headset/microphone jack and an antenna, and the bottom surface 130 may include power and data ports.
As shown in FIG. 4, the hand-held device 120 includes a button 140 in the upper region on the side surface 126 of the hand-held device 120. Because the button 140 is in the upper region, it tends to be out of the way of the grasping hand and therefore accidental activation is substantially eliminated. The upper button may be configured to switch the functionality of the multi-functional device, i.e., button 140 may be switch 110 of FIG. 2. For example, by pressing the button 140, a new device functionality is activated, and the current device functionality is deactivated. Although the term button is used, it should be appreciated that the button 140 may correspond to a dial, wheel, switch and/or the like.
Various Modes Available using Virtual Controls
FIG. 10 is a diagram of a GUI 180 that is used in a cell phone mode. As shown, the GUI 180 is divided into a standard region 152 and a control region 154. Located inside the control region 154 are a virtual keypad 182, a virtual navigation pad 184 and two virtual buttons 186.
FIG. 11 is a diagram of a GUI 190 that is used in a music player mode. As shown, the GUI 190 is divided into a standard region 152 and a control region 154. Located inside the control region 154 are a virtual scroll wheel 192 and five virtual buttons 194. Additional details on a virtual scroll wheel are provided in U.S. patent application Ser. No. 11/038,590, titled “Mode-Based Graphical User Interfaces for Touch Sensitive Input Devices,” filed on Jan. 18, 2005.
FIG. 12 is a diagram of a GUI 200 that is used in a video player mode. As shown, the GUI 200 is divided into a standard region 152 and a control region 154. Located inside the control region 154 are a plurality of virtual buttons 202. Alternatively, the controls may appear and disappear as needed since the video player is primarily used in conjunction with a full screen viewing mode.
FIG. 13 is a diagram of a GUI 210 that is used in a game player mode. As shown, the GUI 210 is divided into a standard region 152 and two control regions 154A and 154B on the sides of the standard region 152. The left side control region 154A includes a navigation or directional pad 212, and the right side control region includes four virtual buttons 214 (or vice versa, depending on whether the user is left- or right-handed).
FIG. 14 is a diagram of a GUI 220 that is used in a camera mode. As shown, the GUI 220 is divided into a standard region 152 and a control region 154. The standard region 152 may represent the view finder. Located inside the control region 154 are various buttons 222 including for example picture click, zoom, flash, etc. A navigation pad 224 may also be included so that the pictures can be scrolled through or for menu navigation.
FIG. 15 is a diagram of a GUI 230 that is used in a GPS receiver mode. As shown, the GUI 230 is divided into a standard region 152 and a control region 154. Located inside the control region 154 are various buttons 222 including for example zoom, pan, etc. A navigation pad 224 may also be included.
FIG. 16 is a diagram of a GUI 240 that is used in a hand top mode. As shown, the GUI 240 is divided into a standard region 152 and a control region 154. Located inside the control region 154 is a virtual keyboard 242.
FIG. 17 is a diagram of a GUI 250 that is used in a remote control mode. As shown, the GUI 250 is divided into a standard region 152 and a control region 154. Located inside the control region 154 are various keys and buttons 252 associated with controlling a remote device such as a TV, DVD player, A/V amplifier, VHS, CD player, etc.
FIG. 18 illustrates an exemplary main menu GUI 260 of a multi-functional device. As shown, the GUI 260 includes icons/buttons 262 for launching each of the various device functionalities. In this particular example, the main menu page 260 includes a PDA button 262A, a cell phone button 262B, a music player button 262C, a game player button 262D, a video player button 262E, a GPS button 262F, a remote control button 262G, a camera button 262H and a handtop button 262I. The various buttons 262 are virtual buttons. When a button is pressed, the main page for the selected functionality (e.g., as shown in FIGS. 9-17) is brought into view on the display. To select another device, the user simply selects a soft home button 264 located in the GUI of each device to return to the main menu page 260, and thereafter selects the desired functionality in the main menu page 260.
The selection of alternative functionalities may also be accomplished by flipping (or scrolling) through the various GUIs until the desired GUI is found. For example, the different GUIs may be incrementally brought into view page after page (or frame after frame) when a next (flip) command signal is generated (e.g., slide show effect). The transition between pages may be widely varied. The transition may be from side to side, top to bottom or center to center. The transition may also include fading in and out, popping in and out, or enlarging and reducing. The command signal may be generated by a physical or virtual button or wheel. Using a button, each press may cause a new page to be displayed. Using a wheel, a predetermined amount of rotation may cause a new page to be displayed.
The command signal may also be generated in a variety of other ways. For example, the command signal may also be generated by gestures initiated on the touch screen. For example, sliding a finger (or stylus) across the display may cause a new page to be displayed. If slid to the right, the next page may be displayed. If slid to the left, the previous page may be displayed. The command signal may also be generated by 3D device gestures created when the entire hand-held device is moved spatially. By way of example, shaking the device may cause a new page to be displayed.
The command signal may also be generated by forces that are applied to the device. By way of example, squeezing the device may cause a new page to be displayed. The command signal may also be generated by sensing the orientation of the device either relative to the ground, as sensed by accelerometers, or relative to a compass direction indicated by an internal compass. For example, if the device is at 0 degrees, a first page is displayed, at 90 degrees a second page is displayed, at 180 degrees a third page is displayed and at 270 degrees a fourth page is displayed.
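The orientation-based paging described above is essentially a snap-to-quadrant lookup. The following is a minimal hypothetical sketch; the page names and the choice of snapping to the nearest 90-degree increment are assumptions, since the patent only gives the four example angles:

```python
# Hypothetical sketch of orientation-based page selection: an orientation
# reading (from accelerometers or a compass) is snapped to the nearest
# quadrant (0/90/180/270 degrees) and mapped to a page. The page names
# are illustrative, not from the patent.

PAGES = {0: "first page", 90: "second page", 180: "third page", 270: "fourth page"}

def page_for_orientation(degrees):
    """Snap an orientation reading to 0/90/180/270 and return its page."""
    quadrant = round((degrees % 360) / 90) * 90 % 360
    return PAGES[quadrant]

print(page_for_orientation(5))    # near 0 degrees  -> "first page"
print(page_for_orientation(175))  # near 180 degrees -> "third page"
```

In practice a real implementation would also debounce noisy sensor readings so small wobbles near a quadrant boundary do not flip pages repeatedly.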
The command signal may also be generated by monitoring a user’s voice (i.e., voice recognition). If the user calls out “PHONE,” the page associated with the phone is displayed, if the user calls out “PDA,” the page associated with the PDA is displayed.
The command signal may also be generated by monitoring incoming signals from other systems (whether transmitted wirelessly or via a cable). For example, if a call is received, the device may automatically configure the system as a phone. Alternatively, it may only present a control panel for taking or passing on the call.
As an alternative to integrating functionalities, the device may be configured to keep the various modes separate. That is, the device does not merge the functionality together (integrated layers and GUIs), but instead keeps them distinct from one another. In some cases, by keeping different functionalities distinct, user confusion may be reduced.
FIG. 25 is a diagram of a touch method 400 for implementing this technique. The method 400 begins at block 402 where one or more touches are detected. The touches include not only x and y components but also z components. The x and y components may be supplied by a touch sensing device such as a touch screen, touch pad, or touch housing. The z component may be provided by force sensors or display actuators located behind the touch surface of the touch sensing device.
Following block 402, the method proceeds to block 404 where a determination is made as to whether the touch is a light or hard touch. The determination is generally based on the force or pressure of the touch (z component). For example, if the force of the touch is smaller than a predetermined threshold then the touch is considered a light touch and if the force of the touch is larger than the predetermined threshold then the touch is considered a hard touch. If it is determined that the touch is a light touch, the method proceeds to block 406 where a passive action associated with the touch is initiated. If it is determined that the touch is hard touch, an active action associated with the touch is performed (block 408).
The touch method may additionally include a block where the one or more touches are classified as a primary touch or a secondary touch. Primary touches are touches that are intended to cause an action while secondary touches are touches that are not intended to cause an action. Gestures are examples of primary touches while a thumb positioned over the touch area to hold the device is an example of a secondary touch. Once the touches are classified as primary or secondary, the secondary touches are filtered out, and the determination of whether a touch is a light or hard touch is made with the primary touches.
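The two steps above — filtering out secondary touches, then thresholding the z-force to distinguish light from hard touches — can be sketched as follows. The threshold value, tuple layout, and action labels are assumptions for illustration; the patent does not specify them:

```python
# Hypothetical sketch of touch method 400 (FIG. 25): secondary touches
# (e.g., a thumb gripping the device) are filtered out, then each
# remaining touch's z-force is compared to a threshold to select a
# passive or active action. Threshold and data layout are illustrative.

FORCE_THRESHOLD = 0.5  # assumed units; the patent specifies no value

def classify_touches(touches):
    """touches: list of (x, y, z_force, is_primary) tuples."""
    actions = []
    for x, y, z, is_primary in touches:
        if not is_primary:
            continue  # secondary touches (holding the device) are ignored
        if z < FORCE_THRESHOLD:
            actions.append("passive")  # light touch (block 406)
        else:
            actions.append("active")   # hard touch (block 408)
    return actions

touches = [
    (10, 20, 0.2, True),   # light primary touch -> passive
    (50, 60, 0.9, True),   # hard primary touch -> active
    (0, 100, 0.8, False),  # thumb holding the device -> filtered out
]
print(classify_touches(touches))  # ['passive', 'active']
```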
Additional Touch Method
FIG. 26 illustrates an additional touch method 500 for implementing this technique. The method begins at block 502 when one or more touches are detected. Thereafter, in block 504, the UI mode is determined. In block 506, a determination is made as to whether the touches are light touches or hard touches. Alternatively, blocks 502 and 504 could be reversed, effectively resulting in an instance of the touch method for each mode. In block 508, the number of distinct touches (e.g., fingers) is determined. In block 510, a determination is made as to whether the touches are stationary or in motion. In block 512, the duration of the touches is determined. In block 514, the locations of the touches are determined. Following blocks 502-514, the method proceeds to block 516 where an action is performed based on the UI mode, the pressure of the touch, the number of touches, whether or not the touch is moving, the duration of the touch, and the touch location. The actions may be passive or active depending on the values of each characteristic.
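Block 516 amounts to dispatching on a combination of touch characteristics. A minimal sketch of such a dispatcher follows; the specific rules, mode names, and thresholds are invented for illustration, since the patent only lists which characteristics feed the decision:

```python
# Hypothetical sketch of block 516 from touch method 500 (FIG. 26):
# an action is chosen from several measured touch characteristics at
# once. The rule table below is illustrative; the patent only says the
# action depends on these inputs, not what the rules are.

def perform_action(ui_mode, pressure, num_touches, moving, duration, location):
    """Pick an action from the characteristics gathered in blocks 504-514."""
    if ui_mode == "music player" and num_touches == 1 and moving:
        return "scroll"           # e.g., a finger dragged on a scroll wheel
    if pressure == "hard" and not moving:
        return "select"           # hard stationary press -> active action
    if pressure == "light" and duration > 1.0:
        return "show info"        # long light touch -> passive action
    return "no-op"

print(perform_action("music player", "light", 1, True, 0.3, (40, 80)))  # scroll
print(perform_action("PDA", "hard", 1, False, 0.2, (10, 10)))           # select
```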
Exemplary Hand-Held Device
FIG. 28 is a block diagram of an exemplary hand-held device 600. The hand-held device 600 typically includes a controller 602 (e.g., CPU) configured to execute instructions and to carry out operations associated with the hand-held device. For example, using instructions retrieved for example from memory, the controller 602 may control the reception and manipulation of input and output data between components of the hand-held device 600. The controller 602 can be implemented on a single chip, multiple chips or multiple electrical components. For example, various architectures can be used for the controller 602, including dedicated or embedded processor, single purpose processor, controller, ASIC, etc. By way of example, the controller may include microprocessors, DSP, A/D converters, D/A converters, compression, decompression, etc.
In most cases, the controller 602 together with an operating system operates to execute computer code and produce and use data. The operating system may correspond to well known operating systems such as OS/2, DOS, UNIX, Linux, and Palm OS, or alternatively to special purpose operating system, such as those used for limited purpose appliance-type devices. The operating system, other computer code and data may reside within a memory block 604 that is operatively coupled to the controller 602. Memory block 604 generally provides a place to store computer code and data that are used by the hand-held device. By way of example, the memory block 604 may include read-only memory (ROM), random-access memory (RAM), hard disk drive (e.g., a micro drive), flash memory, etc. In conjunction with the memory block 604, the hand-held device may include a removable storage device such as an optical disc player that receives and plays DVDs, or card slots for receiving mediums such as memory cards (or memory sticks). Because the form factor of the hand-held device is small, the optical drive may only be configured for mini DVDs.
The hand-held device 600 also includes various input devices 606 that are operatively coupled to the controller 602. The input devices 606 are configured to transfer data from the outside world into the hand-held device 600. As shown, the input devices 606 may correspond to both data entry mechanisms and data capture mechanisms. In particular, the input devices 606 may include touch sensing devices 608 such as touch screens, touch pads and touch sensing surfaces, mechanical actuators 610 such as button or wheels or hold switches (611), motion sensing devices 612 such as accelerometers, force sensing devices 614 such as force sensitive displays and housings, image sensors 616, and microphones 618. The input devices 606 may also include a clickable display actuator 619.
The hand-held device 600 also includes various output devices 620 that are operatively coupled to the controller 602. The output devices 620 are configured to transfer data from the hand-held device 600 to the outside world. The output devices 620 may include a display 622 such as an LCD, speakers or jacks 624, audio/tactile feedback devices 626, light indicators 628, and the like.
The hand-held device 600 also includes various communication devices 630 that are operatively coupled to the controller 602. The communication devices 630 may, for example, include both wired and wireless connectivity selected from I/O ports 632 such as IR, USB, or FireWire ports, GPS receiver 634, and a radio receiver 636.
The hand-held device 600 also includes a battery 650 and possibly a charging system 652. The battery may be charged through a transformer and power cord or through a host device or through a docking station. In the case of a docking station, the charging may be transmitted through electrical ports or possibly through an inductance charging means that does not require a physical electrical connection to be made.
The sole inventor listed on the patent is Steven P. Hotelling who was the inventor of the Chameleon Patent of 2004.
NOTICE: MacNN presents only a brief summary of patents with associated graphic(s) for journalistic news purposes as each such patent application and/or grant is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application and/or grant should be read in its entirety for further details.
Researched and written by Neo.