Haptic response system and method of use



An apparatus and method for assessing a hazard associated with an object are disclosed. The apparatus includes a haptic input/output device coupled to a computer with haptic modeling software and a display device. A virtual object and a virtual passageway are displayed on the display device. The virtual passageway includes a haptic layer along a surface thereof. Force applied by a user to the haptic input/output device causes a cursor on the display device to move the virtual object into the virtual passageway. An interaction of the virtual object with the haptic layer generates a virtual contact force which may be determined by the user sensing a corresponding tactile feedback force generated by the haptic input/output device and/or by the computer processor. The magnitude of the virtual contact force may be used to assess a hazard associated with the virtual object.

Assignee: Labtest International, Inc. d/b/a Intertek Consumer Goods North America - Oak Brook, IL, US
Inventors: Robert Altkorn, Xiao Chen, Scott Milkovich, John Owens, Brian William Rider, Eugene Rider, Daniel Stool
USPTO Application #: 20120278711 - Class: 715/701 - Published: 11/01/2012
Data Processing: Presentation Processing Of Document, Operator Interface Processing, And Screen Saver Display Processing > Operator Interface (e.g., Graphical User Interface) > Force Feedback Interaction

The Patent Description & Claims data below is from USPTO Patent Application 20120278711, Haptic response system and method of use.


CROSS REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Application No. 10/941,088, which claims priority to U.S. Provisional Application No. 60/502,983, filed on Sep. 16, 2003, both of which are incorporated herein by reference in their entirety.

FIELD OF THE INVENTION

The invention relates to hazard assessment simulators, and more particularly to a haptic response system and method of use which enables a user to assess a hazard, such as a choking, aspiration, or blockage hazard, in humans caused by an inanimate object.

DESCRIPTION OF RELATED ART

Haptic, or force feedback, technology includes hardware and associated software that allows a user to physically feel objects existing in a virtual (e.g., computational) environment. Haptic hardware integrates force sensors and motors or actuators and is often shaped to simulate specific tools, such as surgical devices or sculpting tools. In haptic technology, haptic hardware replaces conventional tactile computer input devices such as a mouse, trackball, or keyboard. The force sensors measure a magnitude and direction of forces applied by a user and input these measurements to a computer. Software installed on the computer converts the inputted measurements into movement of one or more virtual objects that are displayed on a display device, calculates one or more interactions between objects, and outputs the interactions as computer signals. The motors or actuators in each input/output device resist forces applied by a user, or apply forces to the user, pursuant to the signals received from the computer.
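The loop described above — sensors measure the force the user applies, software maps that input to motion of a virtual object, interactions are computed, and the device's actuators render a feedback force — can be pictured with the following minimal sketch. The HapticDevice class, its methods, and the gains are hypothetical stand-ins, not the API of any particular haptic device.

```python
import numpy as np

class HapticDevice:
    """Hypothetical stand-in for a force-sensing input/output device."""
    def read_force(self):
        # Magnitude and direction of the force the user applies (N).
        return np.array([0.1, 0.0, 0.0])
    def command_force(self, force):
        # Drive the motors/actuators to resist or push back on the user.
        print("feedback force:", force)

def compute_interaction(position):
    # Placeholder: a real system computes contact forces against virtual geometry.
    return np.zeros(3)

def haptic_loop(device, steps=3, gain=0.01, dt=1e-3):
    position = np.zeros(3)                 # virtual object position (m)
    for _ in range(steps):
        applied = device.read_force()      # user input, measured by the force sensors
        position += gain * applied * dt    # map the input force to object motion
        contact = compute_interaction(position)
        device.command_force(-contact)     # feedback force rendered to the user
    return position

haptic_loop(HapticDevice())
```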

Various haptic hardware devices have been developed. Illustratively, known haptic hardware devices include a MagLev Wrist developed by Carnegie Mellon University, an Eye Surgery Simulator developed by Georgia Tech University, a Laparoscopic Impulse Engine developed by Immersion Corporation, and a Cybergrasp Force Feedback Glove developed by Virtual Technologies, Inc.

Haptic technologies have been applied to various disciplines, including the training of surgeons in minimally invasive surgery and other medical procedures. Specific medical procedures for which haptic technologies have been developed include, for example, bronchoscopy, urinary tract endoscopy, epidural injections, cardiovascular surgery, and gynecology. These technologies are specifically designed to mimic the interaction between a surgical instrument and a part of the human body. Currently, however, such haptic systems may not accurately model the forces experienced during actual surgery or performance of a medical procedure for various reasons, the foremost being inaccurate modeling techniques. For example, these known haptic models do not account for variations in size, shape, and elasticity across different population groups. Thus, the modeling is generally a “gross” calculation of a particular body part and its interactions with a surgical tool, without taking into account variables that may exist between persons.

Additionally, the known haptic surgical simulators do not provide body parts that are dimensionally sized and imbued with specific material properties unique to persons within a particular age group. Consequently, such simulators cannot generate anatomically correct models of parts of the human body that are statistically representative of a particular sector of the population.

Moreover, surgical haptic response simulators are generally modeled to show an interaction strictly between a surgical tool and a body part. Such interaction is limited to the human manipulation of a surgical instrument (e.g., cutting and moving), ranging from incisions in the skin to removal of body parts such as a spleen, cataracts, etc. These systems do not model objects that involve no human interaction, such as, for example, objects that were accidentally swallowed. Additionally, these simulators are primarily concerned with modeling the treatment and repair of body parts, not with determining how inanimate objects interact with the human body in a way that creates an injury hazard, such as causing a blockage within a passageway located inside the body.

Other haptic applications include virtual assembly path planning and virtual maintenance path planning. Virtual assembly path planning haptic technologies permit users to manipulate or simulate tools and components within a virtual environment to verify that an assembly process may be successfully completed. Similarly, virtual maintenance path planning technologies permit users to manipulate tools and components within a virtual environment to confirm that a broken component may be removed and replaced by a working component. Consequently, the haptic training systems used in virtual assembly path planning and virtual maintenance path planning simulate mechanical systems that exist outside the human body. As such, they are not concerned with, nor configured to show, interactions with a part of the human body.

SUMMARY OF THE INVENTION

In one embodiment, the invention provides a virtual haptic response system and method of use that enable a user to assess a choking, ingestion, blocking, insertion, aspiration, or any other physical hazard in humans caused by an inanimate object. As an example, the virtual haptic response system and method of use enable assessment of a hazard associated with an insertion of a manufactured, or yet to be manufactured, object into a human passageway. Illustratively, the object may be a toy or other article intended for use by children, as well as other consumer products intended for use by teenagers and adults. The hazards may be assessed using an anatomically correct, virtual model of a passageway, such as, but not limited to, a nasal pharynx, an oral cavity, an oral pharynx, a trachea, a hypo-pharynx, and an esophagus, and accurate, realistic tactile force feedback generated by a haptic input/output device. Additionally, the virtual model of the passageway may be dimensionally sized and imbued with specific material properties unique to persons within a particular age group. Consequently, the dimensions and material properties modeled by the virtual model of the passageway may statistically represent a particular sector of the population.

Thus, an embodiment of the invention is directed to a virtual computer model, tangibly embodied in computer executable instructions, which simulates on a display device a virtual object modeled after a particular real object, a virtual passageway modeled after a particular real human passageway, and an interaction between them. An interaction occurs when the virtual object and the virtual passageway are positioned proximate to or in contact with each other. Intensities of a force or forces generated by the interaction may be calculated and analyzed to determine whether the virtual object poses a hazard to the virtual passageway. Once calculated, the values of the generated force or forces may be processed so that one or more areas of the virtual object and/or the virtual passageway visibly deform and/or turn a non-anatomical color in response thereto.
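As one illustration of how a calculated force value might drive the visible response described above, the sketch below maps a contact-force magnitude to a non-anatomical highlight color and a hazard flag. The 2.0 N threshold and the color blend are assumed values for illustration only and are not taken from the patent.

```python
def force_to_color(force_n, hazard_threshold_n=2.0):
    """Map a contact-force magnitude (N) to an RGB color and a hazard flag."""
    level = min(force_n / hazard_threshold_n, 1.0)   # normalize and clamp to [0, 1]
    # Blend from an anatomical pink toward saturated red as the force grows.
    r, g, b = 255, int(192 * (1 - level)), int(203 * (1 - level))
    is_hazard = force_n >= hazard_threshold_n
    return (r, g, b), is_hazard

print(force_to_color(0.5))   # ((255, 144, 152), False) -- mild contact
print(force_to_color(3.0))   # ((255, 0, 0), True)      -- flagged as a hazard
```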

In one embodiment, one or more forces generated by the interaction are output as computer signals to an input/output device manipulated by a user. In response, one or more actuators within the input/output device generate one or more feedback forces that simulate an intensity level of one or more real forces that would be exerted if an interaction occurred between the real object and the real passageway. The force feedback enables the user to determine whether the virtual object is capable of traversing the virtual passageway, and if not, where in the virtual passageway the virtual object is likely to lodge. The intensity of one or more calculated forces may be displayed on the display device by color variations and/or alphanumeric data.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1A is a cut-away, profile view of a virtual human head showing placement of a virtual object within a virtual airway, according to one embodiment of the invention;

FIG. 1B is a perspective view of an apparatus useable with an embodiment of the invention;

FIG. 2 is a screenshot of magnetic resonance images (MRI) used in embodiments of the invention to create the virtual human body part shown in FIG. 1A;

FIG. 3A is a screenshot illustrating a three-dimensional, frontal view of a human skull constructed using data and measurements obtained from the magnetic resonance images of FIG. 2, according to one embodiment of the invention;

FIG. 3B is a cut-away profile view of a human head illustrating construction of reference layers, according to one embodiment of the invention;

FIG. 3C is a screenshot illustrating a three-dimensional, frontal view of a child's head with left and right side skin layers, according to one embodiment of the invention;

FIG. 4 is a screenshot illustrating four representative views of a model of a hypopharynx that may define an airspace used in the haptic modeling system, according to one embodiment of the invention;

FIG. 5 is a screenshot of an exemplary interface used in an embodiment of the invention to select a type of virtual object;

FIG. 6A is a three-dimensional virtual view of human internal organs illustrating an interaction with a virtual object positioned therein;

FIG. 6B is a screenshot of an exemplary interface used in an embodiment of the invention to adjust one or more spring-constant and/or mass values in one or more spring-mass models;

FIG. 7A is a cross-sectional, side view of a diagram used to illustrate a virtual object and a virtual passageway, according to one embodiment of the invention;

FIG. 7B is an end view of a diagram used to illustrate a virtual object and a virtual passageway, according to one embodiment of the invention;

FIG. 7C is a cross-sectional, side view of a diagram used to illustrate an interaction between a virtual object and a virtual passageway, according to one embodiment of the invention;

FIG. 7D is an end view of a diagram used to illustrate an interaction between a virtual object and a virtual passageway, according to one embodiment of the invention;

FIG. 8 is a flowchart of an exemplary method according to one embodiment of the invention;

FIG. 9 is a flowchart of another exemplary method according to one embodiment of the invention;

FIG. 10 is a flowchart of yet another exemplary method according to one embodiment of the invention; and

FIG. 11 is a flowchart of yet another exemplary method according to one embodiment of the invention.

DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION

In one embodiment, the invention provides a virtual haptic response system and method that enables a user to visually and tactilely assess a choking, ingestion, blocking, insertion, aspiration, or other hazard associated with an insertion of a manufactured, or yet to be manufactured, object into a human passageway. Illustratively, the object may be a toy or other article intended for use by children, as well as other consumer or food products intended for use by any age group. The object may be modeled by a virtual object that includes the dimensions and material properties of the real object.

Hazards associated with the object may be assessed by interacting the virtual object with an anatomically correct, virtual passageway that models a real human passageway, such as, but not limited to, a nasal pharynx, an oral cavity, an oral pharynx, a trachea, a hypo-pharynx, and an esophagus. The virtual passageway may be dimensionally sized and imbued with specific material properties unique to persons within a particular age group. Additionally, the dimensions and material properties modeled by the virtual passageway may be statistically obtained to represent a particular sector of the population.

In one embodiment, a haptic input/output device is connected to a display device through a computer. The display device displays a two-dimensional or three-dimensional view of a virtual object and a virtual passageway, both of which may model the exact or substantially exact dimensions and material characteristics of a real object and a real passageway, respectively. The display device may also indicate a magnitude of a force caused by an interaction of the virtual object with the virtual passageway. Additionally, the haptic input/output device may generate a tactile force that enables a user to feel the interaction of the virtual object with the virtual passageway in order to assist in assessing a degree of hazard associated with the virtual object. Optionally, assessment of the hazard may be performed by the computer itself using computational techniques. Simulating a design of an object being considered for manufacture and testing it for hazards in the manner described herein enables the designer and/or manufacturer to modify the object's dimensions and/or material properties early in the design cycle, which reduces costs and saves time.

System of the Invention

FIG. 1A is a cut-away, profile view of a virtual human head 100 showing placement of a virtual object 105 within a virtual passageway 110. The head 100 includes a haptic layer 103, which may be a virtual, complex, computer-generated surface that forms an interface between the virtual passageway 110 and corresponding portions of the head 100. The haptic layer 103 may be formed using, for example, Boolean subtraction to remove a volume having the size and shape of a normal or expanded passageway from the head 100. The haptic layer 103 may be used to calculate a magnitude of a contact force (or contact forces) exerted between the virtual object 105 and the virtual passageway 110.
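One common way such a contact force could be computed — offered here only as an assumed illustration, not as the patent's specific algorithm — is a penalty formulation in which the force grows with the depth of penetration of the virtual object into the haptic layer. The stiffness value below is a placeholder.

```python
import numpy as np

def contact_force(object_point, layer_point, layer_normal, stiffness=800.0):
    """Return the contact force (N) exerted by the haptic layer on the object.

    object_point : position of a point on the object surface (m)
    layer_point  : nearest point on the haptic layer (m)
    layer_normal : outward unit normal of the haptic layer at that point
    """
    penetration = np.dot(layer_point - object_point, layer_normal)
    if penetration <= 0.0:                    # no interpenetration, no force
        return np.zeros(3)
    return stiffness * penetration * layer_normal

# Example: an object point 2 mm inside the layer along the +z normal direction.
f = contact_force(np.array([0.0, 0.0, -0.002]), np.zeros(3), np.array([0.0, 0.0, 1.0]))
print(f)   # [0.  0.  1.6]
```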

Additionally, as shown in FIG. 1A, the haptic layer 103 may be positioned to correspond to an inner surface of a passageway. For example, the haptic layer 103 may be positioned on an inner surface of a nasal passageway 102. Similarly, another haptic layer 103 may be positioned on an inner surface of an oral passageway 101. Although the haptic layer 103 may not be visible to a user, the user may deduce its position by a tactile force generated by a haptic input/output device whenever the virtual object interacts with the haptic layer 103. The haptic layer may be toggled on and off; when toggled off, no force feedback is provided.

The virtual object 105 may be, for example, a computer-generated model of a real object. Accordingly, the virtual object 105 may have the exact dimensions and material properties (modulus of elasticity, Poisson's ratio, density, texture, friction, etc.) of a natural object, a manufactured object, or a yet to be manufactured object. The dimensions and material properties of the real object may be obtained from reference sources or experimentally measured data. These properties may be linear or non-linear, isotropic or anisotropic, homogeneous or inhomogeneous. Once obtained, the dimensions and material properties of the virtual object and virtual passageway may be imported or otherwise input into a computer program that creates the virtual object 105 and the virtual passageway 110, respectively. In one embodiment, an optional handle 107 connected to the object 105 is provided so that a user can more clearly see an interaction of the object 105 with the virtual passageway 110. Additionally, the handle 107 may be used to manipulate the virtual object 105 through a portion of the virtual passageway 110, or to position the virtual object 105 at any particular location within the virtual passageway 110 for hazard assessment. In an implementation, the virtual object 105 may be created using the FreeForm® Concept™ software produced by SensAble Technologies, Inc. of Woburn, Mass., or other graphical software programs.
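A minimal sketch of how an object's dimensions and material properties might be organized before being input into the modeling program is shown below. The field names and the example values are hypothetical placeholders, not data from the patent or from any specific product.

```python
from dataclasses import dataclass

@dataclass
class VirtualObjectProperties:
    dimensions_mm: tuple          # bounding dimensions (length, width, height)
    elastic_modulus_pa: float     # modulus of elasticity
    poissons_ratio: float
    density_kg_m3: float
    friction_coefficient: float
    isotropic: bool = True
    homogeneous: bool = True

# Example: a small, rigid spherical toy part (all values are placeholders).
marble = VirtualObjectProperties(
    dimensions_mm=(16.0, 16.0, 16.0),
    elastic_modulus_pa=6.4e10,
    poissons_ratio=0.2,
    density_kg_m3=2500.0,
    friction_coefficient=0.4,
)
print(marble)
```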

The virtual passageway 110 may be, for example, a computer-generated model of a nasal pharynx, an oral cavity, an oral pharynx, a trachea, a hypopharynx, an esophagus, or other anatomical entity, such as an ear canal, a lumen, an intestine, the lungs, or another passageway. In an implementation, the virtual passageway 110 will accurately represent, anatomically, a human passageway. This accurate representation will include the interaction of tissue, bone, and muscle groups associated with the passageway. The dimensions of such tissue, bone, and muscle groups may be determined using MRI modeling, CT modeling, statistical modeling, or other empirical data discussed below. The material properties of such tissue, bone, and muscle groups may be determined by direct measurement, from compilations such as H. Yamada, Strength of Biological Materials, Williams & Wilkins, Baltimore, Md., 1970, which is incorporated herein by reference in its entirety, or from statistical modeling of data from a single source or multiple sources. In one embodiment, the virtual passageway will comprise the haptic layer in order to provide feedback and modeling according to an aspect of the invention. The model of FIG. 1A, as well as any of the remaining models of the invention, may further include one or more reference layers 109A and 109B.

In an implementation, the reference layers are computer-generated artistic or mathematical renderings of certain anatomical features that may have a fixed shape. In an embodiment, the reference layers may include haptic properties, such that a user will feel resistance (e.g., feedback force) when passing the virtual object through one or more of the reference layers. In an embodiment, one or more of the reference layers may be toggled off to permit placement of the virtual object at any particular location of the virtual passageway, and then toggled back on to provide a cumulative resistance that, combined with the resistance provided by the haptic layer, realistically simulates the force(s) exerted by and on the virtual object. Alternatively, once the virtual object is positioned, only the reference layer(s) may be toggled on to permit determination of the resistance(s) provided by the tissues which surround the virtual passageway. The one or more reference layers 109A and 109B may be simultaneously displayed with a visible or invisible haptic layer 103 to provide frames of reference to a user and to enable the user to better understand and visualize relevant anatomy. Additionally, the reference layers may be toggled on and off separately, or simultaneously.
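The toggling behavior described above can be pictured as a set of layers that each contribute resistance only while active, with the feedback force being their sum. The sketch below is an assumed illustration: the layer names, stiffness value, and linear resistance model are placeholders, not the patent's force model.

```python
class Layer:
    def __init__(self, name, active=True):
        self.name, self.active = name, active
    def resistance(self, penetration_m, stiffness=300.0):
        # A layer resists only while toggled on; resistance grows with penetration.
        return stiffness * max(penetration_m, 0.0) if self.active else 0.0

layers = [Layer("haptic layer"),
          Layer("skull reference", active=False),
          Layer("mandible reference", active=False)]

def total_resistance(penetration_m):
    return sum(layer.resistance(penetration_m) for layer in layers)

print(total_resistance(0.001))   # only the haptic layer contributes
layers[1].active = True          # toggle the skull reference layer back on
print(total_resistance(0.001))   # cumulative resistance increases
```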

One or more of the reference layers 109A and 109B may be created using data imported from MRI and CT scans, together or separately with other input data that is either experimentally measured or obtained from reference sources. A combination of MRI and CT scans is preferable because MRI scans offer excellent soft tissue discrimination, and CT scans offer excellent bone discrimination. In an implementation, the reference layers 109A and 109B, and the haptic layer 103, may be created using multiple software applications and then imported into the FreeForm® or other development environment.

In one embodiment, high resolution CT and MRI scans, in DICOM or other format, are imported into a software program that allows each scan to be viewed individually, and which recreates an approximate volumetric representation of a head or other body part using a polygonal or other finite element mesh that may serve as the basis for a virtual spring-mass damper model or other mathematical method of modeling the material properties of tissues. One such software program is the Mimics software program, manufactured by Materialise, Inc. of Ann Arbor, Mich. However, other software programs may be used with the invention.
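To make the spring-mass damper idea concrete, the sketch below advances a single tissue node, attached to its rest position by a spring and a damper, under an external force. The mass, stiffness, and damping coefficients are assumed for illustration only; a real mesh would couple many such nodes.

```python
import numpy as np

def step_node(pos, vel, rest_pos, external_force,
              mass=0.01, k=50.0, c=0.5, dt=1e-3):
    """Advance one tissue node by one time step (semi-implicit Euler)."""
    spring = -k * (pos - rest_pos)      # restoring force toward the rest position
    damping = -c * vel                  # velocity-proportional damping
    accel = (spring + damping + external_force) / mass
    vel = vel + accel * dt
    pos = pos + vel * dt
    return pos, vel

pos, vel = np.zeros(3), np.zeros(3)
for _ in range(100):                    # push the node with a constant 0.1 N force
    pos, vel = step_node(pos, vel, np.zeros(3), np.array([0.1, 0.0, 0.0]))
print(pos)                              # displaced along +x, resisted by the spring
```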

Once the data from the CT and MRI scans is imported, reference layers that correspond to specific anatomical entities, such as the skull layer 109A and the mandible layer 109B, may be isolated by “filtering” one or more CT or MRI images. Filtering may include highlighting only those areas of the image that correspond to a specific range of gray shades. After filtering, the selected anatomical entities are exported by the software program in .stl (stereolithography) format, and imported into sculpting or general geometrical modeling software such as FreeForm® Concept™ Software, where the quality of the images may be improved and extraneous tissue identical in density to the desired anatomical entity may be removed, according to a user's preference.
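The gray-shade filtering step can be sketched as a simple range threshold over a slice's pixel values, producing a mask for one anatomical entity. The threshold range and the synthetic data below are illustrative only and are not values used by Mimics or by the patent.

```python
import numpy as np

def filter_gray_range(slice_gray, lo, hi):
    """Return a boolean mask of pixels whose gray value falls in [lo, hi]."""
    return (slice_gray >= lo) & (slice_gray <= hi)

# Example with a synthetic 8-bit slice; bone-like values assumed around 200-255.
ct_slice = np.random.randint(0, 256, size=(512, 512), dtype=np.uint8)
bone_mask = filter_gray_range(ct_slice, 200, 255)
print(bone_mask.sum(), "pixels selected for the bone reference layer")
```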

In an implementation, one or more pre-assembled virtual objects 105, virtual passageways 110, reference layers 109A and 109B, and haptic layers 103 may be stored in one or more databases for later retrieval by a user. This enables a user to select from among a range of choices. For example, embodiments of pre-assembled virtual objects 105 may include square, round, rectangular, polygonal, and other shaped objects of various sizes, textures, and rigidity. Additionally, embodiments of virtual passageways 110 may include anatomically correct, non-anatomically correct, and statistically characterized passageways.

In one embodiment, the virtual passageway 110 is not only anatomically correct, but it also includes anatomical measurements and/or material properties that have been statistically correlated to correspond to a particular age group. For example, the virtual passageway 110 shown in FIG. 1A may represent a passageway having the dimensions and material properties most likely to be found in the 75th percentile of children ages 3 years to 6 years. Naturally, the invention is not limited to this percentile or age group, but may include any other percentile or age group.

One technique for creating a statistically characterized virtual passageway 110 may include obtaining detailed external and internal anatomical measurements for different age groups and different demographic groups of children, teenagers, or adults. Illustratively, external anatomical measurements such as height and various facial dimensions for children in different age and demographic groups may be obtained from existing reference sources. In some cases, the dimensions of internal passageways may also be obtained from existing reference sources. However, in some cases, existing studies of human passageways may not provide sufficient data to form a statistical basis for embodiments of the present invention. Accordingly, in one implementation, internal passageway dimensions from CT and MRI scans may be obtained and compared with measurements of external anatomical features in the same CT and MRI scans to find an external anatomical feature that correlates with an internal anatomical feature. The best-correlated pair of external and internal features may then be used to statistically calculate the passageway's size percentile within a particular population group.
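The correlation-and-percentile idea might look like the following sketch, in which an external feature measured in each scan is regressed against the corresponding internal passageway dimension, and the fit is then used to predict the internal dimension at a known external percentile. All data values, including the 75th-percentile external measurement, are synthetic placeholders.

```python
import numpy as np

external = np.array([128.0, 131.5, 135.2, 138.0, 141.3])   # external feature (mm), one per scan
internal = np.array([ 10.1,  10.6,  11.0,  11.4,  11.9])   # internal passageway dimension (mm)

r = np.corrcoef(external, internal)[0, 1]          # Pearson correlation of the feature pair
slope, intercept = np.polyfit(external, internal, 1)

# Predict the internal dimension for a child whose external measurement sits at a
# known population percentile (value assumed for illustration).
external_75th = 136.0
predicted_internal = slope * external_75th + intercept
print(f"r={r:.3f}, predicted internal dimension={predicted_internal:.1f} mm")
```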



Download full PDF for full patent description/claims.

Patent Info
Application #: US 20120278711 A1
Publish Date: 11/01/2012
Document #: 13/540,210
File Date: 07/02/2012
USPTO Class: 715/701
International Class: G06F 3/048
Drawings: 13


