Object information obtaining device, display method, and non-transitory computer-readable storage medium



An object information obtaining device includes a light source which emits light, an acoustic wave detecting unit which detects a photoacoustic wave generated by irradiation of an object with the light, and outputs an electric signal in response to detection of the photoacoustic wave, and a processing unit configured to perform two or more types of processing to photoacoustic signal data based on the electric signal to obtain object information corresponding to each of the two or more types of processing, and to display on a display unit the object information corresponding to at least one processing selected by a user out of the two or more types of processing.
Related Terms: Irradiation, Acoustic Wave

Assignee: Canon Kabushiki Kaisha, Tokyo, JP
USPTO Application #: 20140182383 - Class: 73655 (USPTO)
Measuring And Testing > Vibration > Sensing Apparatus > With Light Beam Indicator



Inventors: Koichi Suzuki, Hiroshi Abe



The Patent Description & Claims data below is from USPTO Patent Application 20140182383, Object information obtaining device, display method, and non-transitory computer-readable storage medium.


BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to technology for obtaining object information based on a photoacoustic wave generated by irradiating an object with light.

2. Description of the Related Art

Photoacoustic imaging (PAI) is an optical imaging technique based on the photoacoustic effect. In photoacoustic imaging, for example, an object such as a living body is irradiated with pulsed light, and a light absorber such as a blood vessel absorbs energy of the pulsed light to generate a photoacoustic wave. An acoustic wave detecting unit detects the photoacoustic wave generated by the photoacoustic effect. Then, the detection signal output from the acoustic wave detecting unit is analyzed, by image processing for example, and object information is obtained.

As an example of photoacoustic imaging, Non-Patent Document 1 (Xu et al., "Universal back-projection algorithm for photoacoustic computed tomography", Physical Review E 71, 016706 (2005)) discloses obtaining an initial sound pressure distribution as the object information by applying universal back-projection reconstruction processing (hereinafter referred to as "UBP processing") to the detection signal of the photoacoustic wave.

SUMMARY OF THE INVENTION

An object information obtaining device disclosed in this specification is provided with a light source configured to emit light, an acoustic wave detecting unit configured to detect a photoacoustic wave generated by irradiation of an object with the light, and to output an electric signal in response to detection of the acoustic wave, and a processing unit configured to perform two or more types of processing to photoacoustic signal data based on the electric signal to obtain object information corresponding to each of the two or more types of processing, and to display on a display unit the object information corresponding to at least one processing selected by a user out of the two or more types of processing. Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a view illustrating an object information obtaining device according to this embodiment.

FIG. 2 is a view illustrating a processing unit according to this embodiment in detail.

FIG. 3 is a view illustrating a flow of a method of obtaining object information according to this embodiment.

FIG. 4A is a view illustrating a simulation model according to this embodiment.

FIG. 4B is a view illustrating a simulation result of a Fourier domain reconstruction processing according to this embodiment.

FIG. 4C is a view illustrating a simulation result of a time domain reconstruction processing according to this embodiment.

FIG. 4D is a view illustrating a simulation result of a model base reconstruction processing according to this embodiment.

FIG. 5 is a view illustrating a flow of a method of obtaining object information according to Example 1 of the present invention.

FIG. 6 is a view illustrating a processing unit according to Example 1 of the present invention in detail.

FIG. 7 is a view illustrating a screen displayed on a display according to Example 1 of the present invention.

FIG. 8 is a view illustrating a flow of a method of obtaining object information according to Example 2 of the present invention.

FIG. 9 is a view illustrating a screen displayed on a display according to Example 2 of the present invention.

DESCRIPTION OF THE EMBODIMENTS

Object information according to one embodiment includes initial sound pressure of a photoacoustic wave generated by a photoacoustic effect, optical energy absorption density derived from the initial sound pressure, an absorption coefficient, density of a substance forming tissue and the like. Herein, density of a substance may be determined by levels of oxygen saturation, oxyhemoglobin density, deoxyhemoglobin density, total hemoglobin density and the like. The total hemoglobin density is a sum of the oxyhemoglobin density and the deoxyhemoglobin density.
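
As a small illustration of the quantities just listed (a sketch, assuming per-position oxy- and deoxyhemoglobin density maps are already available as NumPy arrays), total hemoglobin density and oxygen saturation can be derived as follows.

```python
import numpy as np

def hemoglobin_maps(c_oxy, c_deoxy, eps=1e-12):
    """Derive total hemoglobin density and oxygen saturation distributions.

    c_oxy, c_deoxy : arrays of oxy-/deoxyhemoglobin density at each position.
    Oxygen saturation is the oxyhemoglobin fraction of the total hemoglobin.
    """
    c_total = c_oxy + c_deoxy                  # total hemoglobin density
    so2 = c_oxy / np.maximum(c_total, eps)     # guard against division by zero
    return c_total, so2
```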

The object information in this embodiment may be not numerical data but distribution information at each position in the object. That is to say, distribution information such as an absorption coefficient distribution or an oxygen saturation distribution may be used as the object information.

From a diagnostic viewpoint, further improvement is desired over a method that displays only the object information obtained by one specific type of processing (the UBP reconstruction processing), as disclosed in Non-Patent Document 1.

For example, a real image corresponding to the object might be displayed differently depending on the type of processing. Therefore, the usefulness of an observation object in diagnosis might also differ depending on the type of processing.

A virtual image referred to as an artifact might be present in a diagnostic image obtained through reconstruction processing. The artifact might preclude appropriate diagnosis. As is known, artifacts appear differently in a reconstructed image depending on the type of reconstruction processing.

Therefore, display of object information obtained by the specific processing alone might be insufficient at the time of diagnosis.

In accordance with at least one embodiment of the present invention, at least one type of processing applied to photoacoustic signal data (also referred to as raw data) is selected by the user from two or more types of processing. Accordingly, the user may confirm the object information obtained by the desired processing and may selectively use the image corresponding to the processing judged to be useful for the symptom under diagnosis.

With an object information obtaining device capable of executing only one specific type of processing, there are cases in which processing requiring a long processing time must be executed even though the user wants to see a diagnostic result quickly. With such a device there also are cases in which processing based on a simple model must be executed even though the user wants to observe detailed information and would accept a long processing time.

Therefore, according to an embodiment disclosed herein, the user may also select the desired processing in consideration of the processing time acceptable to the user. That is to say, according to this embodiment, the user may select the object information corresponding to the processing the user judges to be highly useful within the acceptable processing time.

The present embodiment is hereinafter described with reference to the drawings. In the drawings, the same reference sign is assigned to the same component, and the description thereof is not repeated.

A basic configuration of the object information obtaining device (information obtaining apparatus) according to this embodiment illustrated in FIG. 1 is first described.

The object information obtaining device illustrated in FIG. 1 includes a light source 110, an optical system 120, an acoustic wave detecting unit 130, a processing unit 140 as a computer, an input unit 150, and a display unit 160 in order to obtain information of a living body 100 as the object.

FIG. 2 is a block diagram illustrating relevant parts of a computer, which is an example of a data processing apparatus including the processing unit 140 and peripheral elements of the processing unit 140. As illustrated in FIG. 2, the processing unit 140 is provided with an arithmetic unit 141 and a storage unit 142. An example of the processing unit 140 includes, but is not limited to, a microprocessor chip such as a CPU (central processing unit) or an MPU (micro processing unit). An example of the storage unit 142 includes, but is not limited to, RAM or ROM memory.

The arithmetic unit 141 controls the operation of each component of the object information obtaining device through a data network 200. The arithmetic unit 141 reads a program, saved in the storage unit 142, describing the processing steps of the method of obtaining object information described later, and causes the object information obtaining device to execute the method.

Each component of the object information obtaining device according to this embodiment is hereinafter described in detail.

(Light Source 110)

The light source 110 is preferably a pulse light source capable of emitting light pulses lasting a few nanoseconds to a few microseconds. Specifically, the light source 110 is preferably capable of emitting light having a pulse width of approximately 10 nanoseconds in order to efficiently generate the photoacoustic wave. The wavelength of the light emitted by the light source 110 is desirably a wavelength at which the light propagates into the object. Specifically, when the object is a living body, such as a human or animal body, a preferable wavelength is not shorter than 500 nm and not longer than 1500 nm.

A laser and a light-emitting diode are examples of light sources that may be used in some embodiments disclosed herein. As the laser, various lasers such as a solid-state laser, a gas laser, a dye laser, and a semiconductor laser may be used. Examples of the laser used in this embodiment include an alexandrite laser, an yttrium-aluminum-garnet laser, a titanium-sapphire laser, and the like.

(Optical System 120)

The light emitted from the light source 110 is typically guided to the living body 100 while being shaped into a desired light distribution by optical components such as lenses and mirrors. It is also possible to propagate the pulsed light by using a waveguide or an optical fiber. Optical components used to shape the light distribution include, for example, a mirror reflecting the light, a lens collecting and magnifying the light or changing its focusing shape, a prism dispersing, refracting, and reflecting the light, an optical fiber propagating the light, a diffusion plate dispersing the light, and other like optical components or combinations thereof. Any type or number of such optical components may be used as long as the object is irradiated with the light emitted from the light source 110 in the desired manner.

However, when the light emitted by the light source 110 can be guided directly to the object in the desired form, the optical system 120 may be omitted.

(Acoustic Wave Detecting Unit 130)

The acoustic wave detecting unit 130 is provided with one or more opto-acoustic transducers and a housing enclosing the transducer(s). An opto-acoustic transducer, as used herein, is an element capable of detecting an acoustic wave.

The transducer receives an acoustic wave such as the photoacoustic wave or an ultrasonic echo and transforms it into an electric signal (an analog signal). Any transducer may be used as long as it is configured to receive the acoustic wave. Examples of transducers include a transducer using the piezoelectric phenomenon, a transducer using optical resonance, a transducer using a change in capacitance, and other like transducers. The acoustic wave detecting unit 130 is preferably provided with a plurality of transducers arranged in an array.

(Processing Unit 140)

The processing unit 140 is provided with the arithmetic unit 141 and the storage unit 142 as illustrated in FIG. 2.

The arithmetic unit 141 is typically formed of an arithmetic element such as a CPU, a GPU, an A/D converter, an FPGA (field programmable gate array) card, or an ASIC (application specific integrated circuit) chip. Meanwhile, the arithmetic unit 141 may be formed not of one arithmetic element alone but of a plurality of arithmetic elements. Any arithmetic element may be used to perform the disclosed processing.

The storage unit 142 is typically formed of a storage medium such as a ROM memory, a RAM memory, a hard disk drive, or a combination thereof. That is, the storage unit 142 may be formed not only of one storage medium but also of a plurality of storage media.

The arithmetic unit 141 may make a gain adjustment that increases or decreases the amplification gain according to the time that elapses from the irradiation of the light until the acoustic wave arrives at an element of the acoustic wave detecting unit 130, in order to obtain an image with uniform contrast regardless of depth in the living body.
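
This kind of depth-dependent gain adjustment resembles the time gain compensation used in ultrasound imaging. The following sketch is an illustrative assumption rather than the patent's specific correction: each time sample is scaled by a gain that grows with the elapsed propagation time, and hence with depth.

```python
import numpy as np

def depth_gain_compensation(signal, dt, gain_rate_db_per_us):
    """Amplify later (deeper) samples more strongly than earlier ones.

    signal              : 1-D array of samples versus time after irradiation.
    dt                  : sampling interval in seconds.
    gain_rate_db_per_us : assumed gain increase per microsecond of delay.
    """
    t_us = np.arange(signal.size) * dt * 1e6             # elapsed time in microseconds
    gain = 10.0 ** (gain_rate_db_per_us * t_us / 20.0)   # dB -> linear amplitude gain
    return signal * gain
```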

The arithmetic unit 141 may control light emission timing of the pulsed light emitted from the light source 110, and may also control operation start timing of the acoustic wave detecting unit 130 by using the pulsed light as a trigger signal. The arithmetic unit 141 may control display operations of the display unit 160.

The arithmetic unit 141 is preferably configured to process a plurality of detection signals simultaneously in a pipelined manner when a plurality of detection signals is obtained from the acoustic wave detecting unit 130. This shortens the time that elapses before the object information is obtained.
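
As an illustration only (the patent does not prescribe an implementation), processing many channel signals concurrently could be sketched with Python's standard concurrent.futures module; the per-channel routine below is a placeholder for whatever per-signal processing is actually performed.

```python
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def process_channel(signal):
    """Placeholder per-channel processing (here: removing the DC offset)."""
    return signal - np.mean(signal)

def process_all_channels(channel_signals, max_workers=4):
    """Process the detection signals of many transducer channels concurrently."""
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(process_channel, channel_signals))
```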

Preferably, each processing operation performed by the processing unit 140 may be saved in the storage unit 142 as part of the program to be executed by the arithmetic unit 141. The storage unit 142 in which the program is saved is a non-transitory computer-readable recording medium.

The processing unit 140 and the acoustic wave detecting unit 130 may be provided as an integrated unit. In that case, a processing unit provided on the acoustic wave detecting unit may perform part of the signal processing, and a processing unit provided outside the acoustic wave detecting unit may perform the remainder of the signal processing. In this case, the processing unit provided on the acoustic wave detecting unit and the processing unit provided outside the acoustic wave detecting unit may be collectively referred to as the processing unit according to this embodiment.

(Input Unit 150)

The input unit 150 is a user interface (I/F) configured to accept an operation (e.g., input) by the user. Information input by the user is input from the input unit 150 to the processing unit 140.

For example, a pointing device such as a mouse, a keyboard, a graphics tablet, or the like may be adopted as the input unit 150. A mechanical device such as a button or a dial provided on the object information obtaining device, or another I/F device, may also be adopted as the input unit 150. When a touch panel display is used as the display unit 160, the display unit 160 may also function as the input unit 150.

Naturally, the input unit 150 may be provided as a user I/F disposed separately from the object information obtaining device and connected thereto via the data network 200.

(Display Unit 160)

The display unit 160 is a device which displays the object information output from the processing unit 140.

Although a liquid crystal display (LCD) or the like is typically used as the display unit 160, another type of display such as a plasma display, an organic EL display, or an FED may also be used. It is also possible to integrally form the input unit 150 and the display unit 160 by adopting a touch panel display as the display unit 160.

The display unit 160 may also be provided separately from the object information obtaining device according to this embodiment.

Next, the method of obtaining object information according to this embodiment using the object information obtaining device illustrated in FIGS. 1 and 2 is described with reference to the flow illustrated in FIG. 3. The flow illustrated in FIG. 3 is an example of an algorithm executed by the processing unit 140.

(S301: Step of Obtaining Photoacoustic Signal Data)

At step S301, the light emitted by the light source 110 is applied to the living body 100 as pulse light 121 through the optical system 120. Then, a light absorber 101 absorbs the pulse light 121 and a photoacoustic wave 102 is generated by the photoacoustic effect.

Next, the acoustic wave detecting unit 130 transforms the photoacoustic wave 102 into an electric signal (an analog signal) and outputs it to the processing unit 140. The arithmetic unit 141 saves the electric signal output from the acoustic wave detecting unit 130 in the storage unit 142 as the photoacoustic signal data.

In this embodiment, the data obtained by saving the electric signal output from the acoustic wave detecting unit 130 in the storage unit 142 serves as the photoacoustic signal data. The photoacoustic signal data may be read from the storage unit 142 by the arithmetic unit 141 and used in the two or more types of processing described later.

The electric signal output from the acoustic wave detecting unit 130 is typically amplified and subjected to A/D conversion before being saved in the storage unit 142 as the photoacoustic signal data. The electric signal output from the acoustic wave detecting unit 130 may also be averaged before being saved in the storage unit 142 as the photoacoustic signal data.
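
As a hedged sketch of the averaging just mentioned (not the device firmware; the array layout is an assumption), the following averages repeated digitized acquisitions of the same channels to suppress uncorrelated noise before the data are saved.

```python
import numpy as np

def average_acquisitions(acquisitions):
    """Average repeated digitized acquisitions of the same photoacoustic signals.

    acquisitions : array of shape (n_repeats, n_channels, n_samples), already
                   amplified and A/D converted.
    Returns the averaged photoacoustic signal data to be saved for later
    processing; averaging reduces uncorrelated noise roughly by sqrt(n_repeats).
    """
    acquisitions = np.asarray(acquisitions, dtype=np.float64)
    return acquisitions.mean(axis=0)
```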

The photoacoustic signal data is saved in the storage unit 142 in this manner. The arithmetic unit 141 may use photoacoustic signal data that includes the same data, corresponding to the photoacoustic wave detected at a certain time, in the plurality of types of processing described later.

In photoacoustic imaging, it is possible to apply different types of processing to photoacoustic signal data that includes the same data obtained by detecting the photoacoustic wave at a certain time. Accordingly, the object information at that same time corresponding to each of the different types of processing may be obtained.

That is, out of the pieces of object information at the same time obtained by applying each of the two or more types of processing to photoacoustic signal data that includes the same data, the object information corresponding to the desired processing may be selectively displayed.

The arithmetic unit 141 may also obtain the object information corresponding to each type of processing by performing the two or more types of processing on photoacoustic signal data that does not include the same data.

(S302: Step of Selecting Information of Desired Processing from Two or More Types of Processing)

At step S302, the user selects the desired processing from two or more types of processing by using the input unit 150. Then, the input unit 150 outputs the information of the processing selected by the user to the processing unit 140. At that time, the information of the selected processing is saved in the storage unit 142.

An example of the input unit 150 for the user to select the desired processing from the two or more types of processing is hereinafter described. That is, an example of a method of inputting the information of the desired processing by the user is described.

For example, the user may select the desired processing by pressing a mechanical button as the input unit 150 corresponding to each of the two or more types of processing. Alternatively, the user may select the desired processing by turning a mechanical dial as the input unit 150 corresponding to each of the two or more types of processing.

As another example, the user may also select the desired processing by selecting an item indicating the processing displayed on the display unit 160 by means of a pointing device (mouse), the keyboard, or the like as the input unit 150. The display unit 160 may display the items indicating the processing next to one another as icons or display them as a menu. The item related to the processing may always be displayed beside the image of the object information, or may be displayed only when the user performs some operation by using the input unit 150. For example, the display unit 160 may be configured such that the item indicating the processing is displayed by a click of the mechanical button provided on the mouse as the input unit 150.

The selection method is not limited to those described above; any method may be adopted as long as the user can select the desired processing out of the two or more types of processing.
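
However the selection is made at the user interface, the selected items must be mapped to the corresponding routines inside the processing unit. The sketch below assumes each type of processing is implemented as a function and uses a hypothetical name-to-routine registry; the names and placeholder routines are not from the patent.

```python
# Placeholder routines; real implementations would perform the signal
# processing and reconstruction algorithms described later.
def run_time_domain(data):
    return data          # placeholder

def run_fourier_domain(data):
    return data          # placeholder

def run_model_based(data):
    return data          # placeholder

# Hypothetical registry mapping displayed item names to processing routines.
PROCESSING_REGISTRY = {
    "time_domain": run_time_domain,
    "fourier_domain": run_fourier_domain,
    "model_based": run_model_based,
}

def select_processing(user_choices):
    """Return the routines for the one or more types selected by the user."""
    unknown = [name for name in user_choices if name not in PROCESSING_REGISTRY]
    if unknown:
        raise ValueError(f"Unknown processing selected: {unknown}")
    return [PROCESSING_REGISTRY[name] for name in user_choices]
```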

The object information obtaining device is preferably configured such that the progress of each type of processing is visually presented to the user. For example, the progress may be presented by displaying a progress bar or a predicted calculation completion time on the display unit 160. It is also possible to use a circular progress mark in which the angle of a colored sector grows as the processing advances. Alternatively, the color of the item corresponding to the processing may change according to the progress status, such as completion of the processing, or the progress status may be displayed in text near the item.

The object information obtaining device according to this embodiment is preferably configured such that the progress of the processing can be grasped and the user can stop the processing currently being calculated at any time. Such a configuration allows the user to start a different processing operation when the user sees the progress bar and determines that the processing currently being calculated is not suitable (e.g., the processing is taking too long, the result is degraded by a processing error, or the type of processing was chosen in error).
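
A cancellable, progress-reporting loop of the kind described might look like the following sketch (an assumed structure, not the patent's implementation): the work is split into chunks so that progress can be reported and a stop request honored between chunks.

```python
def run_with_progress(chunks, process_chunk, report_progress, stop_requested):
    """Process work in chunks, reporting progress and honoring a stop request.

    chunks          : list of work units (e.g. groups of voxels to reconstruct).
    process_chunk   : function applied to each chunk.
    report_progress : callback taking the completed fraction in [0, 1].
    stop_requested  : zero-argument callable returning True if the user aborted.
    """
    results = []
    total = len(chunks)
    for i, chunk in enumerate(chunks, start=1):
        if stop_requested():
            break                        # the user cancelled the running processing
        results.append(process_chunk(chunk))
        report_progress(i / total)       # e.g. update a progress bar on the display
    return results
```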

Image reconstruction processing selected by default may be set in advance in a file in the storage unit 142. In this case, the arithmetic unit 141 may read the default processing at the beginning of step S302 and execute it if the user does not select other processing. The user may also intentionally select the processing set by default.

The desired processing selected by the user may be at least one type of processing. In this embodiment, at least two types of processing may be selected from three or more types of processing. The object information obtaining device according to this embodiment may also be configured such that a plurality of combinations of at least two types of processing may be selected. This gives the user a high degree of freedom in selecting the desired processing and makes it possible to display object information useful for diagnosis.

(S303: Step of Obtaining Object Information by Performing Desired Processing)

At step S303, the arithmetic unit 141 obtains the object information by performing the desired processing selected at step S302 based on the photoacoustic signal data saved in the storage unit 142. Herein, the object information obtained by performing the desired processing is referred to as "object information corresponding to the desired processing".

Meanwhile, the arithmetic unit 141 may read the program describing the algorithm of the processing from the storage unit 142 and apply the processing to the photoacoustic signal data to obtain the object information.

In this embodiment, three-dimensional voxel data or two-dimensional pixel data may be obtained as the object information by the processing.

Herein, the processing according to this embodiment means every kind of processing performed in transforming the photoacoustic signal data into object information having pathological value. For example, the processing according to this embodiment includes signal processing, such as probe response correction processing and noise removal processing, that generates different photoacoustic signal data from the photoacoustic signal data stored in the storage unit 142. It also includes reconstruction processing, such as time domain reconstruction processing, Fourier domain reconstruction processing, and model base reconstruction processing, that generates the object information from the photoacoustic signal data stored in the storage unit 142. It further includes image processing, such as resolution improvement processing, that generates different object information from the object information generated by the above-described reconstruction processing.
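
Because the processing just described spans signal processing, reconstruction, and image processing stages, one way to organize it in software (a hedged sketch with placeholder stage functions, not the patent's implementation) is as a pipeline whose selected stages are applied in order.

```python
import numpy as np

def probe_response_correction(raw):
    return raw                      # placeholder signal processing stage

def time_domain_reconstruction(raw):
    return np.abs(raw)              # placeholder reconstruction stage

def resolution_improvement(image):
    return image                    # placeholder image processing stage

def run_pipeline(photoacoustic_signal_data, stages):
    """Apply the selected stages in order (signal processing -> reconstruction
    -> image processing) to obtain the object information."""
    result = photoacoustic_signal_data
    for stage in stages:
        result = stage(result)
    return result

# Example combination a user might select.
pipeline = [probe_response_correction, time_domain_reconstruction,
            resolution_improvement]
```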

An example of each processing is hereinafter described.

The probe response correction processing (hereinafter referred to as "BD processing"), a type of signal processing according to this embodiment, corrects signal deterioration due to the band limitation of the probe by applying processing based on a blind deconvolution algorithm to the photoacoustic signal data (refer to Patent Document 1 (Japanese Patent Application Laid-Open No. 2012-135462)). When the photoacoustic wave is transformed into the electric signal by the acoustic wave detecting unit 130, the limited receiving bandwidth of the acoustic wave detecting unit 130 may change the waveform of the electric signal and generate ringing. This ringing produces an artifact in the vicinity of the light absorber in the image and deteriorates resolution. Probe response correction reduces the ringing caused by the acoustic wave detecting unit, thereby reducing the artifact and improving the resolution.
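
The patent's BD processing is a blind deconvolution, whose details are in Patent Document 1. As a deliberately simplified, non-blind stand-in for the idea of undoing the probe's band-limited response, the sketch below applies Wiener deconvolution with an assumed (measured or modeled) probe impulse response.

```python
import numpy as np

def wiener_deconvolve(signal, impulse_response, noise_to_signal=1e-2):
    """Simplified probe-response correction by (non-blind) Wiener deconvolution.

    signal           : 1-D recorded photoacoustic signal of one channel.
    impulse_response : assumed band-limited impulse response of the probe.
    noise_to_signal  : regularization constant (noise-to-signal power ratio).
    """
    n = signal.size
    s = np.fft.rfft(signal, n)
    h = np.fft.rfft(impulse_response, n)
    g = np.conj(h) / (np.abs(h) ** 2 + noise_to_signal)  # Wiener inverse filter
    return np.fft.irfft(s * g, n)
```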

The noise removal processing (hereinafter referred to as "wavelet processing"), another type of signal processing according to this embodiment, removes the noise component of the photoacoustic signal data through basis pursuit with a wavelet function. Under ideal conditions, the waveform of the signal resulting from the photoacoustic wave is known to be an N-shaped waveform (refer to Non-Patent Document 2 (Sergey A. Ermilov, Reda Gharieb, Andre Conjusteau, Tom Miller, Ketan Mehta, and Alexander A. Oraevsky, "Data processing and quasi-3D optoacoustic imaging of tumors in the breast using a linear arc-shaped array of ultrasonic transducers", Proc. of SPIE, Vol. 6856)). On the other hand, random noise with an irregular waveform, introduced from the electric system of the device and the like, is superimposed on the signal resulting from the photoacoustic wave. Therefore, the signal resulting from the noise is discriminated from the signal resulting from the photoacoustic wave by applying a discrete wavelet transform to the photoacoustic signal data and removing coefficients having small absolute values from the result. The wavelet processing is highly effective when the signal resulting from the photoacoustic wave has a waveform close to the ideal waveform. On the other hand, when the frequency of the photoacoustic wave differs significantly from the bandwidth of the acoustic wave detecting unit, when the noise is too large, or when a plurality of waveforms are superimposed due to a feature of the object, the image quality improvement obtained by the wavelet processing may be small.
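
A hedged sketch along the lines described, using the PyWavelets package, is shown below; the wavelet family, decomposition level, and threshold rule are assumptions for illustration rather than the patent's choices.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=4, k=3.0):
    """Suppress small-magnitude wavelet coefficients attributed to random noise.

    The threshold is k times a robust noise estimate taken from the
    finest-scale detail coefficients (median absolute deviation / 0.6745).
    """
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise level estimate
    threshold = k * sigma
    denoised = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft")
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: signal.size]
```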

The time domain reconstruction processing (hereinafter referred to as "TD processing") estimates the acoustic source by superimposing the acoustic signals in real space, using the property that the photoacoustic wave is a spherical wave, to generate the voxel data (refer to Patent Document 2 (Japanese Patent Application Laid-Open No. 2010-35806)). The TD processing specifically includes the UBP processing disclosed in Non-Patent Document 1. Because the TD processing is performed in real space, effects of the measurement system are easily incorporated, as compared with the Fourier domain reconstruction processing and the like described later. For example, it is possible to reduce side-lobe artifacts by applying weighted correction processing, such as solid-angle weighting, that takes the state of the acoustic wave detecting unit 130 into consideration.
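
The UBP algorithm of Non-Patent Document 1 includes solid-angle weighting and a time-derivative term; the following is a deliberately simplified 2-D delay-and-sum back-projection (uniform weights, assumed geometry and constant speed of sound), intended only to illustrate the time-domain idea of superimposing spherical-wave contributions.

```python
import numpy as np

def delay_and_sum_2d(signals, sensor_xy, grid_xy, c=1500.0, dt=25e-9):
    """Simplified time-domain back-projection (delay-and-sum) in 2-D.

    signals   : array (n_sensors, n_samples) of detected photoacoustic signals.
    sensor_xy : array (n_sensors, 2) of sensor positions in meters.
    grid_xy   : array (n_pixels, 2) of reconstruction pixel positions in meters.
    c         : assumed speed of sound in m/s.
    dt        : sampling interval in seconds.
    """
    n_sensors, n_samples = signals.shape
    image = np.zeros(len(grid_xy))
    for s in range(n_sensors):
        dist = np.linalg.norm(grid_xy - sensor_xy[s], axis=1)  # pixel-to-sensor distance
        idx = np.round(dist / (c * dt)).astype(int)            # arrival-time sample index
        valid = idx < n_samples
        image[valid] += signals[s, idx[valid]]
    return image / n_sensors
```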



Patent Info
Application #: US 20140182383 A1
Publish Date: 07/03/2014
Document #: 14134957
File Date: 12/19/2013
USPTO Class: 73655
International Class: 01H9/00
Drawings: 10

