Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications

Abstract: A method includes executing a gesture with a user-manipulated physical object in the vicinity of a device; generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to at least one object, such as an object displayed by the device. ...


Assignee: Nokia Corporation
Inventors: Zoran Radivojevic, Yanming Zou, Kong Qiao Wang, Roope Tapio Takala, Vuokko Tuulikki Lantz, Reijo Lehtiniemi, Jukka Iimari Rantala, Ramin Vatanparast
USPTO Application #: 20120056804 - Class: 345156 (USPTO) - Published: 03/08/2012 - Class 345




The Patent Description & Claims data below is from USPTO Patent Application 20120056804, Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications.


TECHNICAL FIELD

The teachings in accordance with the exemplary embodiments of this invention relate generally to user interfaces to electronic devices and, more specifically, relate to manually activated user input devices, methods, systems and computer program products.

BACKGROUND

Input devices employed in the converging multimedia electronics industry are becoming increasingly important. The human-computing terminal interface has long challenged systems designers, yet has not significantly evolved since the advent of the mouse several decades ago. This is a particularly challenging problem in the area of mobile and wireless devices, where the objectives of device miniaturization and usability directly conflict with one another. A natural and intelligent interaction between humans and computing terminals (CT) can be achieved if the simplest modalities, such as finger movement and/or user gestures, are used to provide basic input information to the CT (non-limiting examples of which can include multimedia terminals, communication terminals, display dominated systems (DDS) and devices, gaming devices and laptop computers).

Technology related to input devices has conventionally relied on a set of electro-mechanical switches (such as the classic keyboard). Such an approach requires a relatively large area for a set of switches (keyboard keys), which are usually dedicated to only one operation. A more advanced solution is offered by touch screen displays, where touch sensitive switches are embedded into the display itself, such as in Active Matrix LCD with Integrated Optical Touch Screen (AMLCD) technology. In this approach the “single button” trend is evolving towards that of a “distributed sensor system” that may be embedded into the device and/or even directly into the display itself (AMLCD). The physical operation of such a sensor-based input device can be based on mechanical movement of different materials, changes in electrical conductivity/capacitance, or influences by an electrostatic field or optical properties (such as a finger shadow/reflection from the surface). With regard to AMLCD technology, reference may be made to the documents: 56.3, W. den Boer et al., “Active Matrix LCD with Integrated Optical Touch Screen”, SID 03 Digest (Baltimore, 2003), pgs. 1494-1497, and 59.3, A. Abileah et al., “Integrated Optical Touch Panel in a 14.1″ AMLCD”, SID 04 Digest, v. 35, Issue 1, pgs. 1544-1547, both of which are incorporated by reference herein in their entireties.

Reference may also be made to U.S. Pat. No. 7,009,663 B2 (Mar. 7, 2006), entitled “Integrated Optical Light Sensitive Active Matrix Liquid Crystal display”, A. Abileah et al., and U.S. Pat. No. 7,053,967 B2 (May 30, 2006), entitled “Light Sensitive Display”, A. Abileah et al. (both assigned to Planar Systems, Inc.), which are incorporated by reference herein in their entireties.

The current trend in the development of multimedia device equipment involves hardware miniaturization together with a demand to provide a large input capacity. If the input device can be miniaturized then more space can be allocated for the visualization component(s), particularly in display dominated concept (DDC) devices. The situation in gaming devices is even more challenging, since improvements in the input devices may provide new design freedom and additional game-related functionalities.

Examples of current user input devices include those based on touch-motion, as in certain music storage and playback devices, and certain personal digital assistant (PDA) and similar devices that are capable of recognizing handwritten letters and commands.

Also of interest may be certain structured light based systems, such as those described in U.S. Pat. No. 6,690,354 B2 (Feb. 10, 2004), entitled “Method for Enhancing Performance in a System Utilizing an Array of Sensors that Sense at Least Two Dimensions”, Sze; U.S. Pat. No. 6,710,770 (Mar. 23, 2004), entitled “Quasi-Three-Dimensional Method and Apparatus to Detect and Localize Interaction of User-Object and Virtual Transfer Device”, Tomasi et al.; and U.S. Pat. No. 7,050,177 B2 (May 23, 2006), entitled “Method and Apparatus for Approximating Depth of an Object's Placement Onto a Monitored Region with Applications to Virtual Interface Devices”, Tomasi et al. (all assigned to Canesta, Inc.), which are incorporated by reference herein in their entireties.

SUMMARY OF THE EXEMPLARY EMBODIMENTS

The foregoing and other problems are overcome, and other advantages are realized, in accordance with the non-limiting and exemplary embodiments of this invention.

In accordance with one aspect thereof the exemplary embodiments of this invention provide a method that includes executing a gesture with a user-manipulated physical object in the vicinity of a device; generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to at least one object displayed by the device.

In accordance with another aspect thereof the exemplary embodiments of this invention provide a computer program product embodied in a computer readable medium, execution of the computer program product by at least one data processor resulting in operations that comprise, in response to a user executing a gesture with a user-manipulated physical object in the vicinity of a device, generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to information displayed to the user.

In accordance with a further aspect thereof the exemplary embodiments of this invention provide a device that comprises a unit to display information; an imaging system to generate data that is descriptive of the presence of a user-manipulated object when executing a gesture; and a data processor to interpret the data as pertaining to displayed information.

In accordance with a further aspect thereof the exemplary embodiments of this invention provide a method that includes, in response to a user employing at least one finger to form a gesture in the vicinity of a device, generating data that is descriptive of a presence of the at least one finger in forming the gesture; and interpreting the data as pertaining to at least one object that appears on a display screen.

In accordance with a still further aspect thereof the exemplary embodiments of this invention provide an apparatus that includes a display to visualize information; a sensor arrangement that is responsive to the user executing a gesture with a user-manipulated physical object in the vicinity of a surface of the apparatus, the sensor arrangement having an output to provide data descriptive of the presence of the user-manipulated object when executing the gesture; and a unit having an input coupled to the output of the sensor arrangement and operating to interpret the data to identify the executed gesture, and to interpret the identified gesture as pertaining in some manner to visualized information.

BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing and other aspects of the teachings of this invention are made more evident in the following Detailed Description, when read in conjunction with the attached Drawing Figures, wherein:

FIG. 1A shows a device that incorporates a plurality of ultrasonic transducers (USTs) as user input devices;

FIG. 1B is a simplified block diagram of the device of FIG. 1A;

FIG. 2A shows a further exemplary embodiment of this invention where the USTs are incorporated into a device that embodies a mini-projector;

FIG. 2B is a simplified block diagram of the mini-projector device of FIG. 2A;

FIGS. 3A, 3B, collectively referred to as FIG. 3, FIGS. 4A-4D, collectively referred to as FIG. 4, FIGS. 5A, 5B, collectively referred to as FIG. 5, and FIG. 6 depict exemplary finger-based gestures that may be used to select various commands for execution in accordance with exemplary embodiments of this invention;

FIG. 7 shows the principles of the ultrasonic observation of finger distance;

FIGS. 8A-8D, collectively referred to as FIG. 8, show exemplary finger-based gestures that may be used to select various commands for execution in accordance with further exemplary embodiments of this invention;

FIG. 9 is a logic flow diagram depicting an exemplary finger detection process executed by the device shown in FIG. 10B, and that is suitable for capturing the finger-based gestures shown in FIGS. 8 and 10A;

FIG. 10A shows an example of the sensing of multiple points of simultaneous touch detected by device of FIG. 10B;

FIG. 10B is a simplified block diagram of a device having a display capable of generating an image of one or more fingertips; and

FIG. 11 is a logic flow diagram that depicts a method in accordance with the exemplary embodiments of this invention.

DETAILED DESCRIPTION

Reference is made to FIGS. 1A and 1B, collectively referred to as FIG. 1. FIG. 1A shows a device 10, such as a display dominated device having at least one visual display 12 capable of visualizing information, that incorporates a plurality of ultrasonic transducers (USTs) 14A, 14B and 14C (collectively referred to as USTs 14) as user input devices, while FIG. 1B is a simplified block diagram of the device of FIG. 1A. Note in FIG. 1B that the device 10 is assumed to include a data processor (DP) 16 coupled to a memory (MEM) 18 that stores a program 18A suitable for use in implementing this exemplary embodiment of the invention. The device 10 may be or may include, as non-limiting examples, a PDA, a wireless communications device, a gaming device, an Internet appliance, a remote control device (such as one suitable for use with a TV set or with public interactive billboards), a music storage and playback device, a projector, a video storage and playback device, a multimedia device, a computer such as a desktop or a laptop computer, or in general any type of electronic device that includes a user interface for presenting information to a user (such as a display screen or display surface) and for receiving commands and/or input information from the user.

In the exemplary embodiment of FIG. 1 the three USTs 14 are arrayed on a surface 10A of the device 10 and enable the use of triangulation to detect the locations in three dimensional space of the user's fingers 20A, 20B (referred to also as finger a, finger b). The device 10 exploits the ultrasonic field established in the vicinity of the surface of the device 10 by the USTs 14 to provide a perception technology that enables the device 10 to perceive and react to finger position, and possibly movement, in real time.

In general, a given UST 14 uses high frequency sound energy to conduct examinations and make measurements. To illustrate the general principle, a typical pulse/echo set-up configuration is shown in FIG. 7. A typical UST system includes several functional units, such as a pulser/receiver 15A and the ultrasonic transducer 15B. The pulser/receiver 15A is an electronic device that can produce an electrical pulse and receive the returned signal. Driven by the pulser portion, the transducer 15B generates high frequency ultrasonic energy. The sound energy is introduced and propagates through the air in the form of waves. When there is a discontinuity (such as a finger movement) in the wave path, part of the energy is reflected back from the discontinuity. The reflected wave signal is transformed into an electrical signal by the transducer 15B and is processed to provide the distance from the transducer 15B to the discontinuity (based on a round trip time-of-flight measurement, as is well known). The reflected signal strength may be displayed versus the time from signal generation to when an echo was received. Both the phase and intensity change of the reflected signal may also be exploited to measure finger-transducer distances.
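The round-trip time-of-flight measurement described above reduces to halving the echo delay multiplied by the speed of sound. A minimal sketch follows; the function name and the fixed speed-of-sound constant are illustrative assumptions, not taken from the patent.

```python
# Sketch of the round-trip time-of-flight distance measurement: the pulse
# travels to the reflecting discontinuity and back, so the one-way distance
# is half the total path length. The constant assumes air at roughly 20 C.

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air, m/s

def echo_distance(round_trip_time_s: float) -> float:
    """Distance from the transducer to the reflecting discontinuity."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

# An echo received 1 ms after the pulse corresponds to roughly 17 cm.
```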

When the user's finger(s), or more generally hand(s), enter the scanned field in front of the device 10, the UST 14 system measures the distances to the individual fingers. The three UST 14 sensors (which in some exemplary embodiments may have a fixed relative position on the CT) are capable of providing individual finger-sensor distance measurements (a1, a2, a3, b1, b2, b3). Note that the device 10 may be implemented with fewer than three UST 14 sensors; however, by providing the third UST sensor it is possible to use finger movement for execution and basic operational commands (such as, but not limited to, Select, Copy, Paste, Move and Delete) by observation of a change in direction of the finger movement in three dimensional (3D) space. The device 10 may also be implemented using more than three UST 14 sensors, in the form of, for example, a UST sensor array, if higher spatial detection resolution is needed.
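The triangulation step can be sketched as classical trilateration from the three finger-sensor distances (a1, a2, a3). The sensor layout assumed below (three coplanar sensors at S1 = (0, 0, 0), S2 = (d, 0, 0), S3 = (0, d, 0) on the device surface, finger in front at z > 0) is purely illustrative; the patent does not fix a geometry.

```python
import math

def finger_position(r1: float, r2: float, r3: float, d: float = 0.1):
    """Trilaterate a fingertip from three sensor-finger distances.

    Assumed layout (illustrative, not from the patent): sensors at
    S1 = (0, 0, 0), S2 = (d, 0, 0) and S3 = (0, d, 0) on the device
    surface (z = 0), with the finger in front of the device (z >= 0).
    Subtracting the sphere equations pairwise eliminates the quadratic
    terms and yields x and y directly; z then follows from the first
    sphere equation.
    """
    x = (r1**2 - r2**2 + d**2) / (2.0 * d)
    y = (r1**2 - r3**2 + d**2) / (2.0 * d)
    z_sq = r1**2 - x**2 - y**2
    if z_sq < 0.0:
        raise ValueError("inconsistent distance measurements")
    return x, y, math.sqrt(z_sq)
```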

In general, it is typically desirable to limit the range of the detection mechanism so that it encompasses a fairly limited volume of space (which may be considered to define a ‘working envelope’) in the vicinity of the sensing surface (whether the sensors be UST sensors or other types of sensors) of the device 10 so as not to, for example, generate unintended inputs due to the presence and/or movement of background objects, such as other parts of the user\'s body. Typically the sensing range will be less than about a meter, and more typically the value will be about, for example, 10-20 cm (or less). The maximum sensing range may typically be a function of the sensor technology. For example, the UST embodiments of this invention may typically have a greater detection/sensing range than the AMLCD embodiments discussed below. As can be appreciated, when the user places a finger or fingers, or a hand or hands, within the vicinity of the device 10, “within the vicinity of the device” or sensing surface will be a volume of space, or a plane or more generally a surface, contained within the maximum useful sensing range of the sensing device(s) both in depth (away from the sensing surface) and lateral extent (within an area capable of being sensed from the sensing surface).
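A working-envelope check of the kind described above could be as simple as rejecting any detected position that falls outside a bounded volume in front of the sensing surface. The bounds below are illustrative assumptions; the text only suggests a depth on the order of 10-20 cm.

```python
def in_working_envelope(x: float, y: float, z: float,
                        width: float = 0.2, height: float = 0.15,
                        max_depth: float = 0.15) -> bool:
    """Accept a detection only inside the bounded sensing volume, so that
    background objects (e.g. other parts of the user's body) do not
    generate unintended inputs. All bounds are in metres and illustrative.
    """
    return (0.0 <= x <= width and
            0.0 <= y <= height and
            0.0 <= z <= max_depth)
```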

Note in FIG. 1A that the detected finger position may be translated and presented to the user by displaying two pointers (e.g., two crosses) 12A, 12B on the display 12.
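One way to translate the detected finger position into the on-screen pointers (crosses) is to scale the lateral coordinates of the sensed volume onto the display raster. This mapping is an illustrative sketch; all dimensions and names are assumptions, not from the patent.

```python
def finger_to_pointer(x: float, y: float,
                      sensed_w: float = 0.2, sensed_h: float = 0.15,
                      screen_w: int = 320, screen_h: int = 240):
    """Map lateral finger coordinates (metres, within the sensed area)
    to pixel coordinates for a displayed pointer. Coordinates are
    clamped so the pointer never leaves the screen."""
    nx = min(max(x / sensed_w, 0.0), 1.0)
    ny = min(max(y / sensed_h, 0.0), 1.0)
    return int(nx * (screen_w - 1)), int(ny * (screen_h - 1))
```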

The described UST 14 system may serve to track the finger position of the user in 3D space and in real time. Visualization of the tracking (which may be used to provide perceptual feedback to the user) can be performed by showing one or more of the pointers 12A, 12B on the display 12. This technique provides visual coordination to the user, and facilitates the manipulation of objects presented on the display 12 (such as icons and command bars). Furthermore, if a standard set of characters is shown on the display 12 the user may be provided with typewriting (keyboarding) capabilities, where a classical keyboard is replaced by a virtual keyboard. Tactile feedback (which appears in mechanical keyboards) can be replaced by, for example, short blinking of a finger “shadow” on the display 12 for indicating that a particular key has been accepted and the character inputted or a corresponding command executed. Furthermore, sound effects may be added to confirm that a certain command has been accepted.

In some applications, instead of detecting particular fingers, the entire hand (or some portion of it) can be detected. In other words, a displayed pointer (e.g., 12A) can be associated with the center of gravity of the hand, which is then used to drive/navigate the pointer. Such a configuration may significantly simplify the overall hardware and software requirements, and is particularly suitable in those cases where only single pointer navigation/control is required.
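The center-of-gravity approach above can be sketched directly: average the detected hand points and use the result as the single pointer position. The function name and point representation are illustrative assumptions.

```python
def hand_centroid(points):
    """Reduce a set of detected 3D hand points (x, y, z tuples) to one
    pointer position by taking their center of gravity, i.e. the
    arithmetic mean of the coordinates."""
    n = len(points)
    return (sum(p[0] for p in points) / n,
            sum(p[1] for p in points) / n,
            sum(p[2] for p in points) / n)
```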

FIG. 2A shows a further exemplary embodiment of this invention in which the UST 14 system is incorporated into a device that embodies a mini-projector 30, while FIG. 2B is a simplified block diagram of the mini-projector device 30 of FIG. 2A. Components that also appear in FIG. 1 are numbered accordingly. The mini-projector device 30 includes a projector or projection engine 32 coupled to the DP 16, and projects an image 34 for viewing by the user. For the purposes of this invention the image 34 may be considered to be on a “display screen” or a “display surface”. Pointers 34A, 34B corresponding to the locations of the user's fingers 20A, 20B can be displayed as well. The mini-projector device 30 may be linked via a wired or wireless interface 36, such as a Bluetooth transceiver, to a phone or other multimedia device 38, and may display data sourced by the device 38. The same or a similar UST 14 scanning concept may be employed as in FIG. 1. Furthermore, the resulting user input system based on finger/hand placement and/or movement, combined with the projector engine 32, may be exploited for use in, for example, advanced gaming concepts that combine a large projected image and user gesture-based input. The use of a gesture-based language with the larger format displayed image 34 enables enhancements to be made to gaming concepts, as well as the design of games based on dynamic user movements in 3D.

The use of real-time finger tracking and the presentation of attributed pointers on the display/projector image 12/34 can be used to determine basic object-oriented or gesture-oriented commands. Commands such as Select, Copy, Paste, Move, Delete and Switch may be applied to different displayed objects (such as icons, boxes, scroll-bars and files). These may be classified as object-oriented and gesture/browsing oriented operations, as follows in accordance with several non-limiting examples.

Object-Oriented:

Select: Finger 1 at a display corner or some reserved area; Finger 2 moves slowly under a displayed object to be selected.

Copy: when selected, click by a single finger on the object.

Paste: fast double click by a single finger.

Move: move slowly two fingers located on the moving object.

Delete: double (fast) click by two fingers on a previously selected object.
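A recognizer's output could be mapped onto these object-oriented commands with a simple dispatch table. The gesture encoding below (tuples of symbolic finger events) is purely an illustrative assumption; the patent does not specify any such representation.

```python
# Hypothetical mapping from recognized finger-gesture patterns to the
# object-oriented commands listed above. The event names are invented labels.
OBJECT_COMMANDS = {
    ("finger1_in_reserved_area", "finger2_slow_under_object"): "Select",
    ("single_finger_click_on_selected",): "Copy",
    ("single_finger_fast_double_click",): "Paste",
    ("two_fingers_slow_move_on_object",): "Move",
    ("two_finger_fast_double_click_on_selected",): "Delete",
}

def interpret_gesture(events):
    """Return the command for a recognized gesture (a sequence of finger
    events), or None if the sequence matches no command pattern."""
    return OBJECT_COMMANDS.get(tuple(events))
```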

Industry Class: Computer graphics processing, operator interface processing, and selective visual display systems
Patent Information

Application #: US 20120056804 A1
Publish Date: 03/08/2012
Document #: 13295340
File Date: 11/14/2011
USPTO Class: 345156
International Class: G09G 5/00
Drawings: 10


