Wearable computing device with indirect bone-conduction speaker



Exemplary wearable computing systems may include a head-mounted display that is configured to provide indirect bone-conduction audio. For example, an exemplary head-mounted display may include at least one vibration transducer that is configured to vibrate at least a portion of the head-mounted display based on the audio signal. The vibration transducer is configured such that when the head-mounted display is worn, the vibration transducer vibrates the head-mounted display without directly vibrating a wearer. However, the head-mounted display structure vibrationally couples to a bone structure of the wearer, such that vibrations from the vibration transducer may be indirectly transferred to the wearer's bone structure.
Related Terms: Wearable Computing, Audio, Transducer, Computing Device, Wearable

Assignee: Google Inc., Mountain View, CA, US
USPTO Application #: 20130022220 - Published: 01/24/2013 - Class: 381/151
Class 381: Electrical Audio Signal Processing Systems And Devices > Electro-acoustic Audio Transducer > Body Contact Wave Transfer (e.g., Bone Conduction Earphone, Larynx Microphone)



Inventors: Jianchun Dong, Liang-yu Tom Chi, Mitchell Heinrich, Leng Ooi



The Patent Description & Claims data below is from USPTO Patent Application 20130022220, “Wearable computing device with indirect bone-conduction speaker.”


BACKGROUND

Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.

Computing devices such as personal computers, laptop computers, tablet computers, cellular phones, and countless types of Internet-capable devices are increasingly prevalent in numerous aspects of modern life. Over time, the manner in which these devices are providing information to users is becoming more intelligent, more efficient, more intuitive, and/or less obtrusive.

The trend toward miniaturization of computing hardware, peripherals, as well as of sensors, detectors, and image and audio processors, among other technologies, has helped open up a field sometimes referred to as “wearable computing.” In the area of image and visual processing and production, in particular, it has become possible to consider wearable displays that place a very small image display element close enough to a wearer's (or user's) eye(s) such that the displayed image fills or nearly fills the field of view, and appears as a normal-sized image, such as might be displayed on a traditional image display device. The relevant technology may be referred to as “near-eye displays.”

Near-eye displays are fundamental components of wearable displays, also sometimes called “head-mounted displays” (HMDs). A head-mounted display places a graphic display or displays close to one or both eyes of a wearer. To generate the images on a display, a computer processing system may be used. Such displays may occupy a wearer's entire field of view, or only occupy part of a wearer's field of view. Further, head-mounted displays may be as small as a pair of glasses or as large as a helmet.

SUMMARY

In one aspect, an exemplary wearable-computing system may include: (a) one or more optical elements; (b) a support structure comprising a front section and at least one side section, wherein the support structure is configured to support the one or more optical elements; (c) an audio interface configured to receive an audio signal; and (d) at least one vibration transducer located on the at least one side section, wherein the at least one vibration transducer is configured to vibrate at least a portion of the support structure based on the audio signal. In this exemplary wearable-computing system, the vibration transducer is configured such that when the support structure is worn, the vibration transducer vibrates the support structure without directly vibrating a wearer. Further, the support structure is configured such that when worn, the support structure vibrationally couples to a bone structure of the wearer.

In another aspect, an exemplary wearable-computing system may include: (a) a support structure comprising a front section and at least one side section, wherein the support structure is configured to support one or more optical elements; (b) a means for receiving an audio signal; and (c) a means for vibrating at least a portion of the support structure based on the audio signal, wherein the means for vibrating is located on the at least one side section. In this exemplary wearable-computing system, the means for vibrating is configured such that when the support structure is worn, the means for vibrating vibrates the support structure without directly vibrating a wearer. Further, the support structure is configured such that when worn, the support structure vibrationally couples to a bone structure of the wearer.
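
For readers who find it helpful to see the summary's components laid out concretely, the following Python sketch composes the parts named above (optical elements, support structure, audio interface, vibration transducer). All class names and the pass-through signal handling are illustrative assumptions, not structures defined by the patent.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class VibrationTransducer:
    # Located on a side section; vibrates the support structure, not the wearer.
    location: str = "side-arm"

    def vibrate(self, audio_samples: List[float]) -> List[float]:
        # In this sketch, "vibrating" is just passing the audio signal through.
        return audio_samples


@dataclass
class SupportStructure:
    front_section: str = "lens-frames and center support"
    side_sections: List[str] = field(default_factory=lambda: ["left side-arm", "right side-arm"])
    transducers: List[VibrationTransducer] = field(default_factory=list)


@dataclass
class AudioInterface:
    def receive(self) -> List[float]:
        # Placeholder audio signal (e.g., from a microphone or aux-in connector).
        return [0.0, 0.1, -0.1, 0.05]


@dataclass
class WearableComputingSystem:
    optical_elements: List[str]
    support: SupportStructure
    audio_in: AudioInterface

    def play(self) -> None:
        signal = self.audio_in.receive()
        for transducer in self.support.transducers:
            # The transducer vibrates the frame; the frame, when worn,
            # couples the vibration to the wearer's bone structure.
            transducer.vibrate(signal)


hmd = WearableComputingSystem(
    optical_elements=["left optical element", "right optical element"],
    support=SupportStructure(transducers=[VibrationTransducer()]),
    audio_in=AudioInterface(),
)
hmd.play()
```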

These as well as other aspects, advantages, and alternatives, will become apparent to those of ordinary skill in the art by reading the following detailed description, with reference where appropriate to the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates a wearable computing system according to an exemplary embodiment.

FIG. 2 illustrates an alternate view of the wearable computing system of FIG. 1.

FIG. 3 illustrates an exemplary schematic drawing of a wearable computing system.

FIG. 4 is a simplified illustration of a head-mounted display configured for indirect bone-conduction audio, according to an exemplary embodiment.

FIG. 5 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.

FIG. 6 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.

DETAILED DESCRIPTION

Exemplary methods and systems are described herein. It should be understood that the word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment or feature described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or features. The exemplary embodiments described herein are not meant to be limiting. It will be readily understood that certain aspects of the disclosed systems and methods can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.

I. Overview

The disclosure generally involves a wearable computing system with a head-mounted display (HMD), and in particular, an HMD having at least one vibration transducer that functions as a speaker. An exemplary HMD may employ vibration transducers that are commonly referred to as bone-conduction transducers. However, standard applications of bone-conduction transducers involve direct transfer of sound to the inner ear by attaching the transducer directly to the bone (or a pad that is adjacent to the bone). An exemplary HMD, on the other hand, may include a bone-conduction transducer (or another type of vibration transducer) that transfers sound to the wearer's ear via “indirect bone conduction.”

More specifically, an exemplary HMD may include a vibration transducer that does not vibrationally couple to the wearer's bone structure (e.g., a vibration transducer that is located so as to avoid substantial contact with the wearer when the HMD is worn). Instead, the vibration transducer is configured to vibrate the frame of the HMD. The HMD frame is in turn vibrationally coupled to the wearer's bone structure. As such, the HMD frame transfers vibration to the wearer's bone structure such that sound is perceived in the wearer's inner ear. In this arrangement, the vibration transducer does not directly vibrate the wearer, and thus may be said to function as an “indirect” bone-conduction speaker.
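
To make the indirect signal path concrete, the toy model below treats the transducer-to-frame and frame-to-bone couplings as simple gain stages. The coupling coefficients and function names are invented for illustration only; the patent does not specify any such values.

```python
def indirect_bone_conduction(audio, frame_coupling=0.8, bone_coupling=0.6):
    """Attenuate the drive signal through two mechanical couplings:
    transducer -> HMD frame, then HMD frame -> wearer's bone structure.
    The coupling values are arbitrary illustrative constants."""
    frame_vibration = [s * frame_coupling for s in audio]      # transducer drives the frame
    perceived = [s * bone_coupling for s in frame_vibration]   # frame couples to the bone
    return perceived


if __name__ == "__main__":
    drive_signal = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
    print(indirect_bone_conduction(drive_signal))
```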

In an exemplary embodiment, the vibration transducer may be placed at a location on the HMD that does not contact the wearer. For example, on a glasses-style HMD, a vibration transducer may be located on a side-arm of the HMD, near where the side-arm connects to the front of the HMD. Further, in an exemplary embodiment, the HMD may be configured such that when worn, there is space (e.g., air) between the portion of the HMD where the vibration transducer is located and the wearer. As such, the portion of the HMD that contacts and vibrationally couples to the wearer may be located away from the vibration transducer.

In another aspect, because the vibration transducer vibrates the frame of the HMD instead of directly vibrating a wearer, the frame may transmit the audio signal through the air as well. In some embodiments, the airborne audio signal may be heard by the wearer, and may actually enhance the sound perceived via indirect bone conduction. At the same time, this airborne audio signal may be much quieter than the airborne audio signals emitted by traditional diaphragm speakers, and thus may provide more privacy to the wearer.

In a further aspect, one or more couplers may be attached to the HMD frame to enhance the fit of the HMD to the wearer and to help transfer vibrations from the frame to the wearer's bone structure. For example, a fitting piece, which may be moldable and/or made of rubber or silicone gel, may be attached to the HMD frame. The fitting piece may be attached to the HMD frame in various ways. For instance, a fitting piece may be located behind the wearer's temple and directly above the ear, or in the pit behind the wearer's ear lobe, among other locations.

II. Exemplary Wearable Computing Devices

Systems and devices in which exemplary embodiments may be implemented will now be described in greater detail. In general, an exemplary system may be implemented in or may take the form of a wearable computer (i.e., a wearable-computing device). In an exemplary embodiment, a wearable computer takes the form of or includes an HMD. However, an exemplary system may also be implemented in or take the form of other devices, such as a mobile phone, among others. Further, an exemplary system may take the form of a non-transitory computer-readable medium, which has program instructions stored thereon that are executable by a processor to provide the functionality described herein. An exemplary system may also take the form of a device such as a wearable computer or mobile phone, or a subsystem of such a device, which includes such a non-transitory computer-readable medium having such program instructions stored thereon.

In a further aspect, an HMD may generally be any display device that is worn on the head and places a display in front of one or both eyes of the wearer. An HMD may take various forms such as a helmet or eyeglasses. As such, references to “eyeglasses” herein should be understood to refer to an HMD that generally takes the form of eyeglasses. Further, features and functions described in reference to “eyeglasses” herein may apply equally to any other kind of HMD.

FIG. 1 illustrates a wearable computing system according to an exemplary embodiment. The wearable computing system is shown in the form of eyeglasses 102. However, other types of wearable computing devices could additionally or alternatively be used. The eyeglasses 102 include a support structure that is configured to support one or more optical elements.

In general, the support structure of an exemplary HMD may include a front section and at least one side section. In FIG. 1, the support structure has a front section that includes lens-frames 104 and 106 and a center frame support 108. Further, in the illustrated embodiment, side-arms 114 and 116 serve as a first and a second side section of the support structure for eyeglasses 102. It should be understood that the front section and the at least one side section may vary in form, depending upon the implementation.

Herein, the support structure of an exemplary HMD may also be referred to as the “frame” of the HMD. For example, the support structure of eyeglasses 102, which includes lens-frames 104 and 106, center frame support 108, and side-arms 114 and 116, may also be referred to as the “frame” of eyeglasses 102.

The frame of the eyeglasses 102 may function to secure eyeglasses 102 to a user's face via a user's nose and ears. More specifically, the side-arms 114 and 116 are each projections that extend away from the frame elements 104 and 106, respectively, and are positioned behind a user's ears to secure the eyeglasses 102 to the user. The side-arms 114 and 116 may further secure the eyeglasses 102 to the user by extending around a rear portion of the user's head. Additionally or alternatively, for example, the system 100 may connect to or be affixed within a head-mounted helmet structure. Other possibilities exist as well.

In an exemplary embodiment, each of the frame elements 104, 106, and 108 and the side-arms 114 and 116 may be formed of a solid structure of plastic or metal, or may be formed of a hollow structure of similar material so as to allow wiring and component interconnects to be internally routed through the eyeglasses 102. Other materials or combinations of materials are also possible. Further, the size, shape, and structure of eyeglasses 102, and the components thereof, may vary depending upon the implementation.

Further, each of the optical elements 110 and 112 may be formed of any material that can suitably display a projected image or graphic. Each of the optical elements 110 and 112 may also be sufficiently transparent to allow a user to see through the optical element. Combining these features of the optical elements can facilitate an augmented reality or heads-up display where the projected image or graphic is superimposed over a real-world view as perceived by the user through the optical elements.

The system 100 may also include an on-board computing system 118, a video camera 120, a sensor 122, and finger-operable touchpads 124, 126. The on-board computing system 118 is shown to be positioned on the side-arm 114 of the eyeglasses 102; however, the on-board computing system 118 may be provided on other parts of the eyeglasses 102. The on-board computing system 118 may include a processor and memory, for example. The on-board computing system 118 may be configured to receive and analyze data from the video camera 120 and the finger-operable touchpads 124, 126 (and possibly from other sensory devices, user interfaces, or both) and generate images for output from the optical elements 110 and 112.
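
A minimal sketch of the data flow just described (camera and touchpad input in, rendered images out) is shown below, assuming hypothetical helper names such as read_camera_frame() and render_overlay(); none of these are APIs from the patent.

```python
import time


def read_camera_frame():
    # Placeholder for a frame from video camera 120.
    return {"pixels": b"", "timestamp": time.time()}


def read_touchpad_events():
    # Placeholder for (x, y, pressure) samples from touchpads 124 and 126.
    return []


def render_overlay(frame, events):
    # Combine sensory input into images for optical elements 110 and 112.
    return {"display_110": "overlay", "display_112": "overlay", "source": frame["timestamp"]}


def main_loop(iterations=3):
    for _ in range(iterations):
        frame = read_camera_frame()
        events = read_touchpad_events()
        image = render_overlay(frame, events)
        # An actual system would push 'image' to the projectors here.
        time.sleep(0.01)


main_loop()
```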

The video camera 120 is shown to be positioned on the side-arm 114 of the eyeglasses 102; however, the video camera 120 may be provided on other parts of the eyeglasses 102. The video camera 120 may be configured to capture images at various resolutions or at different frame rates. Many video cameras with a small form factor, such as those used in cell phones or webcams, may be incorporated into an example of the system 100. Although FIG. 1 illustrates one video camera 120, more video cameras may be used, and each may be configured to capture the same view, or to capture different views. For example, the video camera 120 may be forward-facing to capture at least a portion of the real-world view perceived by the user. This forward-facing image captured by the video camera 120 may then be used to generate an augmented reality where computer-generated images appear to interact with the real-world view perceived by the user.

The sensor 122 is shown mounted on the side-arm 116 of the eyeglasses 102; however, the sensor 122 may be provided on other parts of the eyeglasses 102. The sensor 122 may include one or more of a gyroscope or an accelerometer, for example. Other sensing devices may be included within the sensor 122 or other sensing functions may be performed by the sensor 122.

In an exemplary embodiment, sensors such as sensor 122 may be configured to detect head movement by a wearer of eyeglasses 102. For instance, a gyroscope and/or accelerometer may be arranged to detect head movements, and may be configured to output head-movement data. This head-movement data may then be used to carry out functions of an exemplary method.
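
As one way to picture how raw sensor readings become head-movement data, the sketch below integrates gyroscope angular-rate samples over time. The integration scheme and sample values are assumptions for illustration, not part of the disclosure.

```python
def integrate_gyro(samples, dt=0.01):
    """samples: angular-rate readings (rad/s) about one axis; returns the
    accumulated head rotation (rad) after each sample."""
    angle = 0.0
    trajectory = []
    for rate in samples:
        angle += rate * dt  # simple rectangular integration
        trajectory.append(angle)
    return trajectory


# Example: a brief head turn to one side and back.
print(integrate_gyro([0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]))
```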

The finger-operable touchpads 124, 126 are shown mounted on the side-arms 114, 116 of the eyeglasses 102. Each of the finger-operable touchpads 124, 126 may be used by a user to input commands. The finger-operable touchpads 124, 126 may sense at least one of a position and a movement of a finger via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities. The finger-operable touchpads 124, 126 may be capable of sensing finger movement in a direction parallel or planar to the pad surface, in a direction normal to the pad surface, or both, and may also be capable of sensing a level of pressure applied. The finger-operable touchpads 124, 126 may be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers. Edges of the finger-operable touchpads 124, 126 may be formed to have a raised, indented, or roughened surface, so as to provide tactile feedback to a user when the user's finger reaches the edge of the finger-operable touchpads 124, 126. Each of the finger-operable touchpads 124, 126 may be operated independently, and may provide a different function.
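
The sketch below illustrates one plausible way such touch input could be represented and how edge proximity (where the tactile feedback described above would apply) might be detected. The TouchEvent fields, pad dimensions, and edge margin are all hypothetical.

```python
from dataclasses import dataclass

PAD_WIDTH, PAD_HEIGHT = 40.0, 12.0  # assumed pad dimensions in millimetres
EDGE_MARGIN = 2.0                   # assumed width of the raised/indented edge region


@dataclass
class TouchEvent:
    x: float         # position parallel to the pad surface
    y: float
    pressure: float  # level of pressure applied


def near_edge(evt: TouchEvent) -> bool:
    """True when the finger reaches the edge region that provides tactile feedback."""
    return (evt.x < EDGE_MARGIN or evt.x > PAD_WIDTH - EDGE_MARGIN or
            evt.y < EDGE_MARGIN or evt.y > PAD_HEIGHT - EDGE_MARGIN)


print(near_edge(TouchEvent(x=39.0, y=6.0, pressure=0.3)))  # -> True
```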

FIG. 2 illustrates an alternate view of the wearable computing system of FIG. 1. As shown in FIG. 2, the optical elements 110 and 112 may act as display elements. The eyeglasses 102 may include a first projector 128 coupled to an inside surface of the side-arm 116 and configured to project a display 130 onto an inside surface of the optical element 112. Additionally or alternatively, a second projector 132 may be coupled to an inside surface of the side-arm 114 and configured to project a display 134 onto an inside surface of the optical element 110.

The optical elements 110 and 112 may act as a combiner in a light projection system and may include a coating that reflects the light projected onto them from the projectors 128 and 132. In some embodiments, a special coating may not be used (e.g., when the projectors 128 and 132 are scanning laser devices).

In alternative embodiments, other types of display elements may also be used. For example, the optical elements 110, 112 themselves may include: a transparent or semi-transparent matrix display, such as an electroluminescent display or a liquid crystal display; one or more waveguides for delivering an image to the user's eyes; or other optical elements capable of delivering an in-focus near-to-eye image to the user. A corresponding display driver may be disposed within the frame elements 104 and 106 for driving such a matrix display. Alternatively or additionally, a laser or LED source and scanning system could be used to draw a raster display directly onto the retina of one or more of the user's eyes. Other possibilities exist as well.

While FIGS. 1 and 2 show two touchpads and two display elements, it should be understood that many exemplary methods and systems may be implemented in wearable computing devices with only one touchpad and/or with only one optical element having a display element. It is also possible that exemplary methods and systems may be implemented in wearable computing devices with more than two touchpads.

FIG. 3 illustrates an exemplary schematic drawing of a wearable computing system. In particular, a computing device 138 communicates using a communication link 140 (e.g., a wired or wireless connection) to a remote device 142. The computing device 138 may be any type of device that can receive data and display information corresponding to or associated with the data. For example, the computing device 138 may be a heads-up display system, such as the eyeglasses 102 described with reference to FIGS. 1 and 2.

Thus, the computing device 138 may include a display system 144 comprising a processor 146 and a display 148. The display 148 may be, for example, an optical see-through display, an optical see-around display, or a video see-through display. The processor 146 may receive data from the remote device 142, and configure the data for display on the display 148. The processor 146 may be any type of processor, such as a micro-processor or a digital signal processor, for example.

The computing device 138 may further include on-board data storage, such as memory 150 coupled to the processor 146. The memory 150 may store software that can be accessed and executed by the processor 146, for example.

The remote device 142 may be any type of computing device or transmitter including a laptop computer, a mobile telephone, etc., that is configured to transmit data to the device 138. The remote device 142 and the device 138 may contain hardware to enable the communication link 140, such as processors, transmitters, receivers, antennas, etc.

In FIG. 3, the communication link 140 is illustrated as a wireless connection; however, wired connections may also be used. For example, the communication link 140 may be a wired link via a serial bus such as a universal serial bus or a parallel bus. A wired connection may be a proprietary connection as well. The communication link 140 may also be a wireless connection using, e.g., Bluetooth® radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), cellular technology (such as GSM, CDMA, UMTS, EV-DO, WiMAX, or LTE), or Zigbee® technology, among other possibilities. The remote device 142 may be accessible via the Internet and may comprise a computing cluster associated with a particular web service (e.g., social-networking, photo sharing, address book, etc.).
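
The data flow of FIG. 3 can be summarized in a short sketch: a remote device supplies data, and the computing device's processor configures it for the display. The RemoteDevice and ComputingDevice classes are illustrative stand-ins, and the communication link is modeled as a direct function call rather than an actual Bluetooth, 802.11, or wired transport.

```python
class RemoteDevice:
    """Stands in for remote device 142 (e.g., a phone or web service)."""

    def fetch(self):
        return {"type": "notification", "text": "New message"}


class ComputingDevice:
    """Stands in for computing device 138 with processor 146 and display 148."""

    def __init__(self, remote):
        self.remote = remote

    def configure_for_display(self, data):
        # Processor 146 reshapes remote data into something display 148 can show.
        return f"[{data['type'].upper()}] {data['text']}"

    def update(self):
        data = self.remote.fetch()  # received over communication link 140
        return self.configure_for_display(data)


print(ComputingDevice(RemoteDevice()).update())
```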

III. Exemplary HMDs Configured for Indirect Bone Conduction

A. Exemplary HMD with Vibration Transducer

FIG. 4 is a simplified illustration of an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment. As shown, HMD 400 includes two optical elements 402, 404. The frame of HMD 400 includes two side arms 408-L and 408-R, two lens-frames 409-L and 409-R, and a nose bridge 407. The side arms 408-L and 408-R are arranged to fit behind a wearer's ears, and together with the nose bridge 407 they hold the optical elements 402 and 404 in front of the wearer's eyes via attachments to the lens-frames 409-L and 409-R.

Further, HMD 400 may include various audio sources, from which audio signals may be acquired. For example, HMD 400 includes a microphone 410. Further, HMD 400 may additionally or alternatively include an internal audio playback device. For example, an on-board computing system (not shown) may be configured to play digital audio files. Yet further, HMD 400 may be configured to acquire an audio signal from an auxiliary audio playback device 412, such as a portable digital audio player, smartphone, home stereo, car stereo, and/or personal computer. Other audio sources are also possible.

An exemplary HMD may also include one or more audio interfaces for receiving audio signals from various audio sources, such as those described above. For example, HMD 400 may include an interface for receiving an audio signal from microphone 410. Further, HMD 400 may include an interface 411 for receiving an audio signal from the auxiliary audio playback device 412 (e.g., an “aux in” input). The interface to the auxiliary audio playback device 412 may be a tip-ring-sleeve (TRS) connector, or may take another form. HMD 400 may additionally or alternatively include an interface to an internal audio playback device. For example, an on-board computing system (not shown) may be configured to process digital audio files and output audio signals to a speaker or speakers. Other audio interfaces are also possible.
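
A hedged sketch of the audio-source routing described above follows; the AudioSource protocol and the Microphone, AuxInput, and InternalPlayer classes are hypothetical names chosen only to mirror the microphone, aux-in, and internal-playback options in the text.

```python
from typing import List, Protocol


class AudioSource(Protocol):
    def read(self) -> List[float]: ...


class Microphone:                    # e.g., microphone 410
    def read(self) -> List[float]:
        return [0.01, -0.02, 0.03]   # placeholder captured samples


class AuxInput:                      # e.g., a TRS "aux in" connector (interface 411)
    def read(self) -> List[float]:
        return [0.2, 0.1, -0.1]


class InternalPlayer:                # on-board playback of digital audio files
    def read(self) -> List[float]:
        return [0.5, 0.4, 0.3]


def acquire_audio(source: AudioSource) -> List[float]:
    """Route whichever source is active toward the vibration transducer(s)."""
    return source.read()


for source in (Microphone(), AuxInput(), InternalPlayer()):
    print(acquire_audio(source))
```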



Download the full PDF for the complete patent description and claims.

Patent Info
Application #: US 20130022220 A1
Publish Date: 01/24/2013
Document #: 13/269,935
File Date: 10/10/2011
USPTO Class: 381/151
Other USPTO Classes: —
International Class: H04R 1/00
Drawings: 6

