
Switching back to a previously-interacted-with application



This document describes techniques and apparatuses for switching back to a previously-interacted-with application. In some embodiments, these techniques and apparatuses enable selection of a user interface not currently exposed on a display through a simple gesture that is both easy to use and easy to remember.

Inventors: Chaitanya Dev Sareen, Tsz Yan Wong, Jesse Clay Satterfield, Matthew I. Worley, Bret P. Anderson, Nils A. Sundelin, Patrice L. Miner, Jennifer Nan, Robert J. Jarrett, David A. Matthews
USPTO Application #: 20120304132 - Class: 715/863 (USPTO) - Published: 11/29/2012
Class 715: Data Processing: Presentation Processing Of Document, Operator Interface Processing, And Screen Saver Display Processing > Operator Interface (e.g., Graphical User Interface) > Gesture-based



The Patent Description & Claims data below is from USPTO Patent Application 20120304132, Switching back to a previously-interacted-with application.


BACKGROUND

Conventional techniques for selecting a previously-interacted-with application that is not currently exposed on a display are often confusing, take up valuable display space, cannot be universally applied across different devices, or provide a poor user experience.

Some conventional techniques, for example, enable selection of a previously-interacted-with application through onscreen controls in a task bar, within a floating window, or on a window frame. These onscreen controls, however, take up valuable display real estate and can annoy users by requiring them to find and select the correct control.

Some other conventional techniques enable selection of a previously-interacted-with application through hardware, such as hot keys and buttons. At best these techniques require users to remember what key, key combination, or hardware button to select. Even in the best case users often accidentally select keys or buttons. Further, in many cases hardware-selection techniques cannot be universally applied, as hardware on computing devices can vary by device model, generation, vendor, or manufacturer. In such cases, the techniques either will not work or will work differently across different computing devices. This exacerbates the problem of users needing to remember the correct hardware, as many users have multiple devices and so may need to remember different hardware selections for different devices. Further still, for many computing devices hardware selection forces users to engage a computing device outside the user's normal flow of interaction, such as when a touch-screen device requires a user to change his or her mental and physical orientation from display-based interactions to hardware-based interactions.

Still another conventional technique enables users to select a previously-interacted-with application by selecting to enter an application-selection mode, searching through various gesture-based controls, selecting the desired control, and then selecting to interact with the selected application's user interface once it is presented. As is readily apparent, this technique requires numerous user actions and can provide a poor user experience.

SUMMARY

This document describes techniques and apparatuses for switching back to a previously-interacted-with application. In some embodiments, these techniques and apparatuses enable selection of a user interface not currently exposed on a display through a simple gesture that is both easy to use and easy to remember.

This summary is provided to introduce simplified concepts for switching back to a previously-interacted-with application that are further described below in the Detailed Description. This summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter. Techniques and/or apparatuses for switching back to a previously-interacted-with application are also referred to herein separately or in conjunction as the “techniques” as permitted by the context.

BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments for switching back to a previously-interacted-with application are described with reference to the following drawings. The same numbers are used throughout the drawings to reference like features and components:

FIG. 1 illustrates an example system in which techniques for switching back to a previously-interacted-with application can be implemented.

FIG. 2 illustrates an example method for enabling edge gestures that can be used to select to switch back to a previously-interacted-with application, the edge gestures being approximately perpendicular to an edge in which the gesture begins.

FIG. 3 illustrates an example tablet computing device having a touch-sensitive display presenting an immersive interface.

FIG. 4 illustrates the example immersive interface of FIG. 3 along with example edges.

FIG. 5 illustrates the example immersive interface of FIGS. 3 and 4 along with angular variance lines from a perpendicular line and a line from a start point to a later point of a gesture.

FIG. 6 illustrates the edges of the immersive interface shown in FIG. 4 along with two regions in the right edge.

FIG. 7 illustrates an application-selection interface presented by a system-interface module in response to an edge gesture made over the immersive interface and webpage of FIG. 3.

FIG. 8 illustrates an example method for enabling edge gestures including determining an interface to present based on some factor of the gesture.

FIG. 9 illustrates an example method enabling expansion of, or ceasing presentation of, a user interface presented in response to an edge gesture or presentation of another user interface.

FIG. 10 illustrates a laptop computer having a touch-sensitive display presenting a windows-based email interface and two immersive interfaces.

FIG. 11 illustrates the interfaces of FIG. 10 along with two gestures having a start point, later points, and one or more successive points.

FIG. 12 illustrates the windows-based email interface of FIGS. 10 and 11 along with an email handling interface presented in response to an edge gesture.

FIG. 13 illustrates the interfaces of FIG. 12 along with an additional-email-options interface presented in response to a gesture determined to have a successive point a preset distance from the edge.

FIG. 14 illustrates a method for switching back to a previously-interacted-with application using a queue.

FIG. 15 illustrates an example interaction order in which a user interacts with various applications.

FIG. 16 illustrates the immersive interface of FIG. 3 along with a thumbnail image of a user interface of a prior application.

FIG. 17 illustrates a method for switching back to a previously-interacted-with application, which may or may not use a queue.

FIG. 18 illustrates the immersive interface of FIGS. 3 and 16, two progressive presentations, and two gesture portions.

FIG. 19 illustrates an example device in which techniques for switching back to a previously-interacted-with application can be implemented.

DETAILED DESCRIPTION

Overview

This document describes techniques and apparatuses for switching back to a previously-interacted-with application. These techniques, in some embodiments, enable a user to quickly and easily select prior applications.

Consider a case where a user visits a shopping website, then interacts with a local word-processing document, then visits the shopping website again, then visits a social-networking website, and then visits a web-enabled radio website. Assume that the user, when listening to songs on the radio website, decides to revisit the shopping website. The described techniques and apparatuses enable her to do so quickly and easily with two or even one input. She may simply swipe at an edge of her display to quickly see the social-networking website and then swipe again to revisit the shopping website. The techniques in this example permit the user to revisit prior applications based on which were most-recently interacted with rather than when opened, without taking up valuable display real estate with onscreen controls, and without requiring the user to find or remember a hardware button. Further still, no gesture, other than one starting from an edge, is used by the techniques in this example, thereby permitting applications to use nearly all commonly-available gestures.

This is but one example of the many ways in which the techniques enable switching back to previously-interacted-with applications, others of which are described below.

Example System

FIG. 1 illustrates an example system 100 in which techniques for switching back to a previously-interacted-with application can be embodied. System 100 includes a computing device 102, which is illustrated with six examples: a laptop computer 104, a tablet computer 106, a smart phone 108, a set-top box 110, a desktop computer 112, and a gaming device 114, though other computing devices and systems, such as servers and netbooks, may also be used.

Computing device 102 includes computer processor(s) 116 and computer-readable storage media 118 (media 118). Media 118 includes an operating system 120, windows-based mode module 122, immersive mode module 124, system-interface module 126, gesture handler 128, application manager 130, which includes or has access to application queue 132, and one or more applications 134, each having one or more application user interfaces 136.
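
As one way to picture how application queue 132 could order applications, the following TypeScript sketch keeps them in most-recently-interacted-with order; the class and method names are hypothetical illustrations, not taken from the application itself, and the ordering simply mirrors the shopping/word-processing scenario described above.

    // Hypothetical sketch only: an application queue kept in
    // most-recently-interacted-with order (newest interaction first).
    class ApplicationQueue<App> {
      private queue: App[] = [];

      // Record an interaction: move (or add) the application to the front.
      noteInteraction(app: App): void {
        this.queue = this.queue.filter((a) => a !== app);
        this.queue.unshift(app);
      }

      // Return the nth previously-interacted-with application, where 1 is
      // the application interacted with just before the current one.
      priorApplication(stepsBack = 1): App | undefined {
        return this.queue[stepsBack];
      }
    }

    // Usage mirroring the earlier shopping/word-processing/social/radio example:
    const recent = new ApplicationQueue<string>();
    ["shopping", "word processor", "shopping", "social network", "radio"]
      .forEach((app) => recent.noteInteraction(app));
    console.log(recent.priorApplication(1)); // "social network"
    console.log(recent.priorApplication(2)); // "shopping"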

Computing device 102 also includes or has access to one or more displays 138 and input mechanisms 140. Four example displays are illustrated in FIG. 1. Input mechanisms 140 may include gesture-sensitive sensors and devices, such as touch-based sensors and movement-tracking sensors (e.g., camera-based), as well as mice (free-standing or integral with a keyboard), track pads, and microphones with accompanying voice recognition software, to name a few. Input mechanisms 140 may be separate or integral with displays 138; integral examples include gesture-sensitive displays with integrated touch-sensitive or motion-sensitive sensors.

Windows-based mode module 122 presents application user interfaces 136 through windows having frames. These frames may provide controls through which to interact with an application and/or controls enabling a user to move and resize the window.

Immersive mode module 124 provides an environment by which a user may view and interact with one or more of applications 134 through application user interfaces 136. In some embodiments, this environment presents content of, and enables interaction with, applications with little or no window frame and/or without a need for a user to manage a window frame's layout or primacy relative to other windows (e.g., which window is active or up front) or manually size or position application user interfaces 136.

This environment can be, but is not required to be, hosted and/or surfaced without use of a windows-based desktop environment. Thus, in some cases immersive mode module 124 presents an immersive environment that is not a window (even one without a substantial frame) and precludes usage of desktop-like displays (e.g., a taskbar). Further still, in some embodiments this immersive environment is similar to an operating system in that it is not closeable or capable of being un-installed. While not required, in some cases this immersive environment enables use of all or nearly all of the pixels of a display by applications. Examples of immersive environments are provided below as part of describing the techniques, though they are not exhaustive or intended to limit the techniques described herein.

System-interface module 126 provides one or more interfaces through which interaction with operating system 120 is enabled, such as an application-launching interface, a start menu, or a system tools or options menu, to name just a few.

Operating system 120, modules 122, 124, and 126, as well as gesture handler 128 and application manager 130 can be separate from each other or combined or integrated in any suitable form.

Example Methods

Example methods 200, 800, and 900 address edge gestures and are described prior to methods 1400 and 1700, which address switching back to a previously-interacted-with application. Any one or more of methods 200, 800, and 900 may be used separately from, or combined in whole or in part with, methods 1400 and/or 1700. Using an edge gesture to select to switch back to a prior application is but one example of the ways in which the various methods can be combined or complement one another, though an edge gesture is not required by methods 1400 and/or 1700.

FIG. 2 depicts a method 200 for enabling edge gestures based on the edge gesture being approximately perpendicular to an edge in which the gesture begins. In portions of the following discussion reference may be made to system 100 of FIG. 1, reference to which is made for example only.

Block 202 receives a gesture. This gesture may be received at various parts of a display, such as over a windows-based interface, an immersive interface, or no interface. Further, this gesture may be made and received in various manners, such as a pointer tracking a movement received through a touch pad, mouse, or roller ball, or a physical movement made with arm(s), finger(s), or a stylus received through a motion-sensitive or touch-sensitive mechanism.
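
As a rough illustration of block 202 (and an assumption on our part, since the application does not name any particular input API), a gesture's start point and later points could be buffered from pointer events roughly as follows:

    // Illustrative sketch only: buffer a gesture's start point and later
    // points from pointer events. The browser pointer-event API is assumed
    // purely for illustration; a touch pad, mouse, stylus, or
    // motion-sensitive mechanism could supply the same [x, y] data.
    interface Point { x: number; y: number; }

    let startPoint: Point | null = null;
    const laterPoints: Point[] = [];

    window.addEventListener("pointerdown", (e: PointerEvent) => {
      startPoint = { x: e.clientX, y: e.clientY };
      laterPoints.length = 0;
    });

    window.addEventListener("pointermove", (e: PointerEvent) => {
      if (startPoint) laterPoints.push({ x: e.clientX, y: e.clientY });
    });

    window.addEventListener("pointerup", () => {
      // A gesture handler could now examine startPoint and laterPoints,
      // for example to test whether the start point lies within an edge.
      startPoint = null;
    });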

By way of example consider FIG. 3, which illustrates a tablet computing device 106. Tablet 106 includes a touch-sensitive display 302 shown displaying an immersive interface 304 that includes a webpage 306. As part of an ongoing example, at block 202 gesture handler 128 receives gesture 308 as shown in FIG. 3.

Block 204 determines whether a start point of the gesture is at an edge. As noted above, the edge in question can be an edge of a user interface, whether immersive or windows-based, and/or of a display. In some cases, of course, an edge of a user interface is also an edge of a display. The size of the edge can vary based on various factors about the display or interface. A small display or interface may use an edge that is smaller, in absolute or pixel terms, than that of a large display or interface, and a highly sensitive input mechanism permits a smaller edge as well. Example edges are rectangular, varying between one and twenty pixels in one dimension and extending to the limit of the interface or display in the other dimension, though other sizes and shapes, including convex and concave edges, may instead be used.
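
A minimal sketch of such rectangular edge regions, assuming a fixed twenty-pixel width like the example in FIG. 4 below (the function and constant names are illustrative only, not taken from the application):

    // Illustrative sketch only: rectangular edge regions, each twenty
    // pixels deep and spanning the full interface or display limit in the
    // other dimension.
    type Edge = "left" | "top" | "right" | "bottom";

    const EDGE_WIDTH = 20; // pixels; could vary with display size or input sensitivity

    function edgeAt(x: number, y: number, width: number, height: number): Edge | null {
      if (x <= EDGE_WIDTH) return "left";
      if (x >= width - EDGE_WIDTH) return "right";
      if (y <= EDGE_WIDTH) return "top";
      if (y >= height - EDGE_WIDTH) return "bottom";
      return null; // start point is not within any edge
    }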

Continuing the ongoing example, consider FIG. 4, which illustrates immersive interface 304 and gesture 308 of FIG. 3 as well as left edge 402, top edge 404, right edge 406, and bottom edge 408. For visual clarity webpage 306 is not shown. In this example the dimensions of the interface and display are of a moderate size, between that of smart phones and that of many laptop and desktop displays. Edges 402, 404, 406, and 408 have a small dimension of twenty pixels, an area of each shown bounded by dashed lines at twenty pixels from the display or interface limit at edge limit 410, 412, 414, and 416, respectively.

Gesture handler 128 determines that gesture 308 has a start point 418 and that this start point 418 is within left edge 402. Gesture handler 128 determines the start point in this case by receiving data indicating [X,Y] coordinates in pixels at which gesture 308 begins and comparing the first of these coordinates to those pixels contained within each edge 402-408. Gesture handler 128 often can determine the start point and whether it is in an edge faster than a sample rate, thereby causing little or no performance downgrade from techniques that simply pass gestures directly to an exposed interface over which a gesture is made.

Returning to method 200 generally, if block 204 determines that the start point of the gesture is not at an edge, method 200 proceeds along a “No” path to block 206. Block 206 passes the gesture to an exposed user interface, such as an underlying interface over which the gesture was received. Altering the ongoing example, assume that gesture 308 was determined not to have a start point within an edge. In such a case gesture handler 128 passes buffered data for gesture 308 to immersive user interface 304. After passing the gesture, method 200 ends.
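
Continuing the same assumptions, the decision at blocks 204 and 206 might look roughly like the following; edgeAt is the helper from the earlier sketch, and passGestureTo, exposedInterfaceUnder, and handleEdgeGesture are hypothetical stand-ins for the exposed interface and the remainder of method 200:

    // Illustrative flow for blocks 204/206: if the start point is not in an
    // edge, hand the buffered gesture to the exposed interface beneath it;
    // otherwise continue with edge-gesture handling.
    declare function passGestureTo(target: unknown, gesture: unknown): void;   // hypothetical
    declare function exposedInterfaceUnder(p: Point): unknown;                 // hypothetical
    declare function handleEdgeGesture(edge: Edge, gesture: unknown): void;    // hypothetical

    function onGestureStart(start: Point, later: Point[], width: number, height: number): void {
      const edge = edgeAt(start.x, start.y, width, height);
      if (edge === null) {
        // "No" path (block 206): pass the gesture to the underlying interface.
        passGestureTo(exposedInterfaceUnder(start), { start, later });
        return;
      }
      // "Yes" path: continue with the remainder of method 200.
      handleEdgeGesture(edge, { start, later });
    }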





Patent Info
Application #: US 20120304132 A1
Publish Date: 11/29/2012
Document #: 13118302
File Date: 05/27/2011
USPTO Class: 715/863
Other USPTO Classes:
International Class: G06F 3/033
Drawings: 20


