On July 10, 2008, the US Patent & Trademark Office published a major Apple patent application that generally relates to a system and method of managing, manipulating, and editing media objects, such as graphical objects on a display, by using hand gestures on a touch sensitive device. More importantly, Apple's patent presents a wide array of illustrations that strongly suggest touch technologies are coming to Apple's iMac and a notebook/tablet. Some of the more interesting touch-related illustrations cover aspects such as working with the application dock, unique music and video applications, new ways to work with iPhoto and more.
One of Apple's patents relates to detecting gestures with event sensitive devices (such as a touch/proximity sensitive display) for effecting commands on a computer system. Specifically, gestural inputs of a human hand over a touch/proximity sensitive device can be used to control and manipulate graphical user interface objects, such as opening, moving and viewing them. Gestural inputs over an event sensitive computer desktop application display can be used to effect conventional mouse/trackball actions, such as targeting, selecting, right-clicking and scrolling. Gestural inputs can also invoke the activation of a UI element, after which gestural interactions with the invoked UI element can effect further functions.
In accordance with another embodiment, gestural inputs over a touch sensitive display may be used to effect editing commands for editing image files, such as photo files. The gestural inputs can be recognized via a user interface ("UI") element, such as a slide bar. The gestural inputs via a UI element can be varied by changing the number of touchdown points on the UI element.
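As a rough illustration of how the touchdown count might vary a slide-bar command, here is a minimal Python sketch. The specific command mapping (`zoom`, `brightness`, `contrast`) and the function name are illustrative assumptions, not details from the patent:

```python
def slider_command(num_touches, position):
    """Map a slide-bar gesture to an editing command.

    Hypothetical mapping: the patent only states that the command can
    vary with the number of touchdown points on the UI element.
    `position` is the slider position, clamped to the 0.0-1.0 range.
    """
    commands = {1: "zoom", 2: "brightness", 3: "contrast"}
    action = commands.get(num_touches, "ignore")
    return action, max(0.0, min(1.0, position))
```

For example, a one-finger drag to mid-slider would yield `("zoom", 0.5)`, while a two-finger drag to the same spot would yield a brightness adjustment instead.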
Be a Cool DJ
Apple's patent FIG. 23 illustrates an embodiment of the invention for manipulating the replay and recording of audio or musical files. As shown in FIG. 23 below, a music application (830) can display a pair of virtual turntables (842 and 843), on which two musical records (834 and 835) are playing, each record being either a single or an LP. The records can be graphical representations of digital musical files (e.g., song A and song B) that are being replayed via the music application. In other words, the records can be graphical imprints of the musical files, as if the musical files were imprinted on physical records.
Like a pair of physical turntables, stylus 844 and stylus 845 can be graphical icon indications of a playback queue, the position of which can be varied by touching the queue on a touch sensitive display screen and dragging the icon to the desired position on the graphical record. The moving of the stylus would cause a jump in the playback point of the corresponding song, as on a physical turntable.
Also like a pair of physical turntables, start/stop buttons (838 and 839) can be touched by one or more fingers to toggle the start or stop/pause of song reproduction. Speed variation bars (840 and 841) can be linearly adjusted to control the playback speed of the songs. Windows (831 and 833) can graphically reproduce the frequency representation of the reproduced songs, while window (832) can display the frequency representation of the actual output of the music application, which can be simply one of the songs being reproduced or a mix/combination of the songs. Mixing/pan bar (850) can be manipulated to modulate the mix of the two songs being reproduced.
During song reproduction, the records can be manipulated much like a physical record. For instance, rapid back and forth movement of a record can cause the sound effect of a record "scratching," as disc jockeys often do on physical turntables.
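The turntable behaviors described above can be sketched in a few lines of Python. The 33⅓ rpm reference speed, the record geometry, and the function names are assumptions for illustration; the patent describes the interactions, not any formulas:

```python
import math

RECORD_RPM = 100.0 / 3.0  # assumed 33 1/3 rpm, a standard LP speed


def playback_rate(delta_angle_deg, delta_t):
    """Playback rate implied by how fast the virtual record is spun.

    1.0 is normal speed; negative values play backwards, which is
    what produces the "scratching" effect when a finger drags the
    record rapidly back and forth.
    """
    normal_deg_per_s = RECORD_RPM * 360.0 / 60.0  # ~200 deg/s
    return (delta_angle_deg / delta_t) / normal_deg_per_s


def stylus_seek(stylus_radius, outer_r, inner_r, song_len_s):
    """Convert a stylus drop position (radius from the record center)
    into a playback point, assuming the song plays from the outer
    edge inward, as on a physical record."""
    frac = (outer_r - stylus_radius) / (outer_r - inner_r)
    return max(0.0, min(song_len_s, frac * song_len_s))
```

Dragging the stylus icon halfway between the outer and inner grooves of a three-minute song, for instance, would jump playback to the 90-second mark, while spinning the record at its normal angular speed yields a rate of 1.0.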
Touch Video Applications
Additional editing/playback functions of video files can be implemented using gestural inputs over certain pre-existing control elements. In accordance with a preferred embodiment, non-linear time playback of a video file can be effected by selectively contracting or expanding the playback timeline bar. Specifically, FIG. 21A shows a video application (790), such as a video playback application, displaying video playback (791) along with a progress bar (792), on which a playback queue (793) indicates the time progress of the video playback.
According to a preferred embodiment, the playback queue can be moved forward or backwards on the progress bar to effect fast forward and rewind of the video. The queue can also be held at the same place or otherwise modulated at a non-linear speed to effect variable speed playback or pause of the video. According to a preferred embodiment, the video application can be displayed on a touch sensitive display, and the position of the playback queue can be manipulated by touching the queue with a finger of hand 501 at the location where the queue is displayed on the screen. That is, the playback queue can serve both as a progress indicator and as a UI element for controlling the speed and temporal location of the video playback.
In accordance with a preferred embodiment, the entire progress bar can serve as a UI element whereby a user can effect non-linear playback of the video by expanding or contracting one or more sections of the progress bar. Specifically, as shown in FIG. 21B, the progress bar can be manipulated via a two finger zoom in or zoom out gesture (as discussed above with respect to FIG. 12). In the example shown in FIG. 21B, a zoom in gesture invokes an expansion of the playback time between the 60 minute mark and the 80 minute mark, making playback non-linear: the playback speed of the video can be slowed during the period between the 60 and 80 minute marks. Alternatively, the playback speed of the video can be accelerated between the 0 and 60 minute marks and after the 80 minute mark, whereas the playback speed is normal between the 60 and 80 minute marks.
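The non-linear playback described above amounts to a piecewise-linear map from progress-bar position to media time: expanding a segment of the bar means the cue crosses that segment at a slower media-time rate. A minimal Python sketch, assuming a 120-minute video whose 60-80 minute span has been pinched out to occupy the middle half of the bar (the breakpoint values are illustrative, not from the patent):

```python
def bar_to_time(x, breakpoints):
    """Map a normalized progress-bar position x (0.0-1.0) to media
    time (minutes), using a piecewise-linear warp given as a sorted
    list of (bar_position, media_time) pairs."""
    for (x0, t0), (x1, t1) in zip(breakpoints, breakpoints[1:]):
        if x0 <= x <= x1:
            # linear interpolation within this segment of the bar
            return t0 + (x - x0) / (x1 - x0) * (t1 - t0)
    raise ValueError("bar position outside 0.0-1.0")


# 60-80 minute span expanded to the middle half of the bar: the cue
# advances only 20 minutes of media time across half the bar width.
warp = [(0.0, 0.0), (0.25, 60.0), (0.75, 80.0), (1.0, 120.0)]
```

With this warp, a cue at the bar's midpoint corresponds to the 70-minute mark, so a cue moving at constant screen speed plays the expanded span in slow motion and the rest of the video faster than normal.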
Apple's patent FIG. 21C below illustrates an additional UI element (794) being displayed within the video application 790. In this embodiment, UI element 794 can be a virtual scroll wheel whereby a user can further control the playback speed of the video. In combination with manipulation of the progress bar 792, a user can first designate a section of the video in which playback speed is slowed, and then use the scroll wheel 794 to further modulate the playback queue 793 to control the playback direction and/or speed of the video.
Apple's patent FIG. 21D above illustrates other touch sensitive UI elements that can be added to the video application for editing purposes. For instance, as shown in FIG. 21D, slide bar UI element (796) can be added to detect gestural inputs for invoking level adjustments, such as pan, brightness, contrast, hue, gamma and the like.
UI element 795 can also be displayed within the video application to effect sound editing of the video. Specifically, UI element 795 can include a plurality of level adjustments for recording or playback of different channels of sound or music to be mixed with the video.
Touch Photo Application
Apple's patent FIGS. 19C and 19D show another form of UI element, a virtual scroll wheel (755), for receiving gestural input to scroll the display of the photos. In this embodiment, the virtual scroll wheel can be invoked by a simple gesture of performing a circular touch on the photo with one finger, or a touch-down of three fingers. Once the virtual scroll wheel UI element is presented, the user can "rotate" the virtual scroll wheel to scroll through the photos. In this particular embodiment, the speed of the scrolling is not controlled by how many touchdown points are detected on the scroll wheel, but rather by the speed at which the touchdown point rotates about the center of the virtual scroll wheel.
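The rotation-speed behavior can be sketched as computing the angular velocity of the touchdown point about the wheel's center. This Python sketch assumes simple 2D screen coordinates and is an illustration, not the patent's implementation:

```python
import math


def scroll_speed(center, p_prev, p_now, dt):
    """Scrolling speed from a touchdown point rotating about the
    wheel center, in degrees per second; the sign gives the scroll
    direction. As described, speed depends on how fast the point
    rotates, not on how many fingers are down."""
    a_prev = math.atan2(p_prev[1] - center[1], p_prev[0] - center[0])
    a_now = math.atan2(p_now[1] - center[1], p_now[0] - center[0])
    delta = math.degrees(a_now - a_prev)
    # unwrap so crossing the +/-180 degree boundary isn't a huge jump
    if delta > 180.0:
        delta -= 360.0
    elif delta < -180.0:
        delta += 360.0
    return delta / dt
```

A finger moving a quarter turn around the wheel in half a second would register about 180 degrees per second, which the application could then scale to photos-scrolled-per-second.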
Touch Photo Camera
Apple's patent FIGS. 19E and 19F illustrate the concept of photo manipulation on a display screen of a digital camera. While this may indicate a future Apple camera, it's likely that the concept is meant to work with the camera on the iPhone. In accordance with a preferred embodiment, the display screen of the digital camera can be made of a multi-touch sensitive panel.
Apple’s patent FIG. 19E shows an embodiment where, in a playback mode of the digital camera 780, a detection of a vertically downward swipe gesture input of at least one finger in a touch detection zone 782 invokes a playback scrolling action whereby a next photo can be displayed. In accordance with another embodiment, a downward gestural input on any part of the display 781 can automatically invoke the scrolling action.
Apple's patent FIG. 19F shows an alternative embodiment of FIG. 19E, where a detection of two touches is required in order to invoke playback scrolling. Specifically, a combination of a touchdown point at touchdown zone 783 along with a downward sliding input at or near touchdown zone 782 can invoke a scrolling action to display the next photo. It should be noted that the methods described in FIGS. 19A through 19E are not form factor specific, in that the methods can be implemented on a PC monitor, a laptop monitor, a digital camera, or any type of device having a touch screen.
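The two-touch rule can be sketched as a small recognizer: one touch held still in one zone while another swipes downward in a second zone. The zone rectangles, movement thresholds, and function name below are illustrative assumptions (screen coordinates with y increasing downward), not details from the patent:

```python
def should_scroll(touches, hold_zone, swipe_zone, min_swipe=40.0):
    """Decide whether a two-touch gesture should advance to the next
    photo. `touches` maps touch ids to (start_xy, end_xy) pairs.
    Rule sketched here: one touch must stay put inside hold_zone
    while another touch starting in swipe_zone moves downward by at
    least `min_swipe` pixels."""
    def in_zone(pt, zone):
        x0, y0, x1, y1 = zone
        return x0 <= pt[0] <= x1 and y0 <= pt[1] <= y1

    held = any(in_zone(s, hold_zone) and abs(e[1] - s[1]) < 10.0
               for s, e in touches.values())
    swiped = any(in_zone(s, swipe_zone) and (e[1] - s[1]) >= min_swipe
                 for s, e in touches.values())
    return held and swiped
```

A stationary thumb in the hold zone plus a finger dragging 80 pixels down through the swipe zone would trigger the scroll; either touch alone would not.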
Other Desktop Touch Applications
Other random patent figures below illustrate Apple’s Application Dock (FIG.7K), drag and drop feature, a photo management screen and more.
Apple’s patent was originally filed in June 2007 and published today by the USPTO.
NOTICE: MacNN presents only a brief summary of patents with associated graphic(s) for journalistic news purposes as each such patent application and/or grant is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application and/or grant should be read in its entirety for further details.
Written and researched by Neo.