Image sensor and electronic device including the same

An image sensor includes a sensing device including a pixel array having a plurality of unit pixels, the sensing device being configured to generate pixel data in response to an incident light signal having information of an image of an object and information of an ambient light; an image data generation unit configured to generate image data corresponding to the object based on the pixel data; and an illuminance data generation unit configured to generate illuminance data corresponding to the ambient light based on the pixel data.

USPTO Application #: 20140055635 - Class: 348/222.1


Inventors: Jong-seok Seo

CROSS-REFERENCE TO RELATED APPLICATION(S)

This application claims priority under 35 USC §119 to Korean Patent Application No. 10-2012-0091084, filed on Aug. 21, 2012 in the Korean Intellectual Property Office (KIPO), the contents of which are herein incorporated by reference in their entirety.

BACKGROUND

1. Technical Field

Example embodiments relate to an image sensor, and more particularly to an image sensor that is able to measure illuminance and an electronic device including the image sensor.

2. Description of the Related Art

Charge coupled device (CCD) image sensors and complementary metal oxide semiconductor (CMOS) image sensors have been used as devices for capturing an image of an object. Image sensors convert light signals into electric signals. Recently, as various kinds of electronic devices have come to include image sensors, image sensors are increasingly required to measure the illuminance of ambient light as well as to capture an image of an object.

SUMMARY

Some example embodiments provide an image sensor that is able to capture an image of an object and to measure the illuminance of ambient light.

Some example embodiments provide an electronic device including the image sensor.

According to example embodiments, an image sensor includes a sensing unit, an image data generation unit, and an illuminance data generation unit. The sensing unit includes a pixel array having a plurality of unit pixels, and generates pixel data in response to an incident light signal having information of an image of an object and information of ambient light. The image data generation unit generates image data corresponding to the object based on the pixel data. The illuminance data generation unit generates illuminance data corresponding to the ambient light based on the pixel data.

In example embodiments, the illuminance data generation unit may generate the illuminance data based on an angle of view of the image sensor that is used for generating the image data.

The image data and the illuminance data may be generated at substantially the same time.

In example embodiments, the pixel array may include a plurality of ambient light sensing units each of which includes at least two adjacent unit pixels among the plurality of unit pixels, and the illuminance data generation unit may select effective sensing units among the plurality of ambient light sensing units by performing a crop operation and a sub-sampling operation on the pixel array, and generate the illuminance data based on effective pixel data, which correspond to the effective sensing units, among the pixel data.

The illuminance data generation unit may include a spectrum response compensation unit configured to extract the effective pixel data among the pixel data, and to generate luminance data based on the effective pixel data, an automatic exposure adjustment unit configured to generate an exposure control signal, which is used for controlling an exposure time of the plurality of unit pixels, based on the luminance data, and a calculation unit configured to generate the illuminance data based on the luminance data and the exposure control signal.

The pixel data may include a plurality of pixel values corresponding to the plurality of unit pixels, respectively, and the luminance data may include a plurality of luminance values corresponding to the effective sensing units, respectively. The spectrum response compensation unit may generate a first luminance value corresponding to a first effective sensing unit based on pixel values, which correspond to unit pixels included in the first effective sensing unit, and gains for the unit pixels included in the first effective sensing unit.

The illuminance data may be proportional to a sum of the plurality of luminance values and inversely proportional to the exposure time of the plurality of unit pixels.
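As a rough numerical illustration of the two relationships above, the following Python sketch computes a luminance value for one effective sensing unit as a gain-weighted sum of its pixel values, and an illuminance figure proportional to the sum of the luminance values and inversely proportional to the exposure time. The gains, unit count, and scale constant are assumed values, not taken from the application.

    # Hedged sketch: the gains and the scale constant k are hypothetical values.
    def luminance_of_unit(pixel_values, gains):
        """Gain-weighted combination of the pixel values of one effective sensing unit."""
        return sum(g * p for g, p in zip(gains, pixel_values))

    def illuminance(luminance_values, exposure_time_s, k=1.0):
        """Proportional to the sum of luminance values, inversely proportional to exposure time."""
        return k * sum(luminance_values) / exposure_time_s

    # Example: one 2x2 effective sensing unit with R, G, G, B pixel values.
    lum = luminance_of_unit([120, 200, 198, 90], gains=[0.30, 0.59, 0.59, 0.11])
    lux = illuminance([lum] * 16, exposure_time_s=0.01)  # 16 effective units, 10 ms exposure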

The illuminance data generation unit may further include a control unit configured to control the spectrum response compensation unit, the automatic exposure adjustment unit and the calculation unit.

Unit pixels included in a same effective sensing unit may include at least one of a red filter, a green filter and a blue filter.

Unit pixels included in a same effective sensing unit may include at least one of a yellow filter, a magenta filter and a cyan filter.

In example embodiments, the image data generation unit and the illuminance data generation unit may be embodied in one data processing unit.

In example embodiments, the image sensor may further comprise a mode selection unit configured to activate one of the image data generation unit and the illuminance data generation unit in response to a mode selection signal.

In example embodiments, the sensing unit may further include a correlated double sampling (CDS) unit configured to generate a plurality of CDS signals by performing a CDS operation on a plurality of analog pixel signals provided from the pixel array, and an analog-to-digital conversion unit configured to generate the pixel data by digitizing the plurality of CDS signals.

In example embodiments, the image sensor may be a complementary metal oxide semiconductor (CMOS) image sensor.

According to example embodiments, an electronic device includes an image sensor and a display device. The image sensor generates image data corresponding to an object and illuminance data corresponding to ambient light in response to an incident light signal having information of an image of the object and information of the ambient light. The display device displays the object based on the image data and the illuminance data. The image sensor includes a sensing unit, an image data generation unit, and an illuminance data generation unit. The sensing unit includes a pixel array having a plurality of unit pixels, and generates pixel data in response to the incident light signal. The image data generation unit generates the image data based on the pixel data. The illuminance data generation unit generates the illuminance data based on the pixel data.

According to at least one example embodiment, an image sensor may include a sensing device including a pixel array having a plurality of unit pixels, the sensing device being configured to generate pixel data in response to an incident light signal having information of an image of an object and information of an ambient light; an image data generation unit configured to generate image data corresponding to the object based on the pixel data; and an illuminance data generation unit configured to generate illuminance data corresponding to the ambient light based on the pixel data.

The illuminance data generation unit may be configured to generate the illuminance data based on an angle of view of the image sensor that is used for generating the image data.

The image data generation unit and the illuminance data generation unit may be configured such that the image data and the illuminance data are generated at substantially the same time.

The pixel array may include a plurality of ambient light sensing units each of which includes at least two adjacent unit pixels among the plurality of unit pixels, and the illuminance data generation unit may be configured to select effective sensing units among the plurality of ambient light sensing units by performing a crop operation and a sub-sampling operation on the pixel array, and configured to generate the illuminance data based on effective pixel data, which correspond to the effective sensing units, among the pixel data.

The illuminance data generation unit may include a spectrum response compensation unit configured to extract the effective pixel data among the pixel data, and to generate luminance data based on the effective pixel data; an automatic exposure adjustment unit configured to generate an exposure control signal based on the luminance data; and a calculation unit configured to generate the illuminance data based on the luminance data and the exposure control signal, the sensing device being configured to control an exposure time of the plurality of unit pixels based on the exposure control signal.

The pixel data may include a plurality of pixel values corresponding to the plurality of unit pixels, respectively, the luminance data may include a plurality of luminance values corresponding to the effective sensing units, respectively, and the spectrum response compensation unit may be configured to generate a first luminance value corresponding to a first effective sensing unit based on pixel values, which correspond to unit pixels included in the first effective sensing unit, and gains for the unit pixels included in the first effective sensing unit.

The illuminance data may be proportional to a sum of the plurality of luminance values and inversely proportional to the exposure time of the plurality of unit pixels.

The illuminance data generation unit may further include a control unit configured to control the spectrum response compensation unit, the automatic exposure adjustment unit and the calculation unit.

Unit pixels included in a same effective sensing unit may include at least one of a red filter, a green filter and a blue filter.

Unit pixels included in a same effective sensing unit may include at least one of a yellow filter, a magenta filter and a cyan filter.

The image data generation unit and the illuminance data generation unit may be embodied in one data processing unit.

The image sensor may further include a mode selection unit configured to activate one of the image data generation unit and the illuminance data generation unit in response to a mode selection signal.

The sensing unit may further include a correlated double sampling (CDS) unit configured to generate a plurality of CDS signals by performing a CDS operation on a plurality of analog pixel signals provided from the pixel array; and an analog-to-digital conversion unit configured to generate the pixel data by digitizing the plurality of CDS signals.

The image sensor may be a complementary metal oxide semiconductor (CMOS) image sensor.

According to at least one example embodiment, an electronic device may include an image sensor configured to generate image data corresponding to an object and illuminance data corresponding to ambient light in response to an incident light signal having information of an image of the object and information of the ambient light, the image sensor including a sensing unit that includes a pixel array having a plurality of unit pixels and that generates pixel data in response to the incident light signal, an image data generation unit configured to generate the image data based on the pixel data, and an illuminance data generation unit configured to generate the illuminance data based on the pixel data; and the electronic device may further include a display device configured to display the object based on the image data and the illuminance data.

According to at least one example embodiment, an image capture device may include a pixel array including a plurality of pixels, the pixel array being configured to convert light incident on the pixel array into pixel data, the incident light including light corresponding to an object and ambient light; an image data generation unit configured to generate image data corresponding to the object based on the pixel data; and an illuminance data generation unit configured to generate illuminance data corresponding to the ambient light based on the pixel data.

The illuminance data generation unit may be configured to generate the illuminance data based on an angle of view of the image sensor that is used for generating the image data.

The pixel array may include a plurality of ambient light sensing units, each of which includes at least two adjacent pixels among the plurality of pixels, the illuminance data generation unit may be configured to select, as effective sensing units, a subset of the plurality of ambient light sensing units by performing a crop operation and a sub-sampling operation on the pixel array, and the illuminance data generation unit may be configured to generate the illuminance data based on effective data, the effective data being data, from among the pixel data, that corresponds to the pixels of the effective sensing units.

BRIEF DESCRIPTION OF THE DRAWINGS

The above and other features and advantages of example embodiments will become more apparent by describing in detail example embodiments with reference to the attached drawings. The accompanying drawings are intended to depict example embodiments and should not be interpreted to limit the intended scope of the claims. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted.

FIG. 1 is a block diagram illustrating an image sensor according to example embodiments.

FIG. 2 is a block diagram illustrating an example of a sensing unit included in the image sensor of FIG. 1.

FIG. 3 is a circuit diagram illustrating an example of a unit pixel included in a sensing unit of FIG. 2.

FIG. 4 is a block diagram illustrating an example of an illuminance data generation unit included in the image sensor of FIG. 1.

FIGS. 5, 6A, 6B, 6C, 6D, 7A and 7B are diagrams for describing an operation of a spectrum response compensation unit included in an illuminance data generation unit of FIG. 4.

FIG. 8 is a block diagram illustrating an image sensor according to example embodiments.

FIG. 9 is a block diagram illustrating an image sensor according to example embodiments.

FIG. 10 is a flow chart illustrating a method of driving an image sensor according to example embodiments.

FIG. 11 is a flow chart illustrating an example of a step of generating illuminance data of FIG. 10.

FIG. 12 is a block diagram illustrating an electronic device according to example embodiments.

FIG. 13 is a block diagram illustrating an example of an interface used in the electronic device of FIG. 12.

DETAILED DESCRIPTION OF THE EMBODIMENTS

Detailed example embodiments are disclosed herein. However, specific structural and functional details disclosed herein are merely representative for purposes of describing example embodiments. Example embodiments may be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.

Accordingly, while example embodiments are capable of various modifications and alternative forms, embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that there is no intent to limit example embodiments to the particular forms disclosed, but to the contrary, example embodiments are to cover all modifications, equivalents, and alternatives falling within the scope of example embodiments. Like numbers refer to like elements throughout the description of the figures.

It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.

It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it may be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., “between” versus “directly between”, “adjacent” versus “directly adjacent”, etc.).

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises”, “comprising”, “includes” and/or “including”, when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two figures shown in succession may in fact be executed substantially concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.

FIG. 1 is a block diagram illustrating an image sensor according to example embodiments.

Referring to FIG. 1, an image sensor 100 includes a sensing unit 120, an image data generation unit 140 and an illuminance data generation unit 160.

The sensing unit 120 includes a pixel array having a plurality of unit pixels. The sensing unit 120 generates pixel data PDAT in response to an incident light signal LS that arrives at the pixel array. The incident light signal LS has information of an image of an object and information of ambient light. The pixel data PDAT may be digital data. The pixel data PDAT may include a plurality of pixel values corresponding to the plurality of unit pixels, respectively.

In some example embodiments, the image sensor 100 may be a complementary metal oxide semiconductor (CMOS) image sensor. Hereinafter, the various exemplary embodiments will be described based on a CMOS image sensor. However, it is understood that the image sensor 100 may be other types of image sensors, including a charge-coupled device (CCD) image sensor, without departing from the scope of the present teachings.

The image data generation unit 140 generates image data IMG corresponding to the object based on the pixel data PDAT. For example, the image data generation unit 140 may generate the image data IMG by performing image interpolation, color correction, white balance adjustment, gamma correction, color conversion, etc. on the pixel data PDAT.
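For illustration only, the following Python sketch shows two of the listed operations, white balance adjustment and gamma correction, applied to normalized pixel data; the channel gains and the gamma value are assumptions rather than parameters of the image data generation unit 140.

    # Minimal sketch of two of the listed steps on an RGB image normalized to [0, 1];
    # the per-channel gains and the gamma value are hypothetical.
    import numpy as np

    def white_balance(rgb, gains=(1.1, 1.0, 1.3)):
        """Scale each color channel by a per-channel gain."""
        return np.clip(rgb * np.asarray(gains), 0.0, 1.0)

    def gamma_correct(rgb, gamma=2.2):
        """Apply a simple display gamma correction."""
        return rgb ** (1.0 / gamma)

    raw = np.random.rand(480, 640, 3)        # stand-in for interpolated pixel data
    img = gamma_correct(white_balance(raw))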

The illuminance data generation unit 160 generates illuminance data ILM corresponding to the ambient light based on the pixel data PDAT. The illuminance data ILM may correspond to an illuminance value of the ambient light. In some example embodiments, the illuminance data generation unit 160 may generate the illuminance data ILM based on an angle of view (AOV) of the image sensor 100 that is used for generating the image data IMG. That is, the image data generation unit 140 and the illuminance data generation unit 160 may generate the image data IMG and the illuminance data ILM, respectively, under the same angle-of-view (AOV) condition of the image sensor 100. In this case, the image data IMG and the illuminance data ILM may be generated at substantially the same time.

Recently, image sensors have been required to measure the illuminance of ambient light as well as to capture an image of an object. For this purpose, an image sensor generally includes a first sensing unit for capturing an image of an object and a second sensing unit for measuring the illuminance of ambient light. In this case, the size and the manufacturing cost of the image sensor may increase.

The image sensor 100 according to example embodiments includes a single sensing unit 120. That is, the image sensor 100 generates both the image data IMG corresponding to the object and the illuminance data ILM corresponding to the ambient light based on the pixel data PDAT generated from the single sensing unit 120. In addition, the image sensor 100 may generate both the image data IMG and the illuminance data ILM under the same angle-of-view (AOV) condition. That is, the image sensor 100 may generate the image data IMG and the illuminance data ILM at substantially the same time without changing parameters of the image sensor 100. Therefore, the image sensor 100 may be able to capture an image of the object and to measure the illuminance of the ambient light without increasing the size or the manufacturing cost of the image sensor 100.

FIG. 2 is a block diagram illustrating an example of a sensing unit included in the image sensor of FIG. 1.

Referring to FIG. 2, the sensing unit 120 includes a pixel array 121. The sensing unit 120 may further include a row driver 125, a correlated double sampling (CDS) unit 126, an analog-to-digital conversion (ADC) unit 127 and a timing controller 129.

The pixel array 121 includes a plurality of unit pixels 122 arranged in rows and columns. The pixel array 121 may generate a plurality of analog pixel signals AS in response to the incident light signal LS having information of an image of the object and information of the ambient light.

FIG. 3 is a circuit diagram illustrating an example of a unit pixel included in a sensing unit of FIG. 2.

Referring to FIG. 3, a unit pixel 122 may include a photoelectric conversion unit 131 and a signal generation circuit 132.

The photoelectric conversion unit 131 may perform photoelectric conversion. That is, the photoelectric conversion unit 131 may convert an incident light signal into photo-charges during an integration mode. For example, when the image sensor 100 is a CMOS image sensor, information of an image of an object to be captured and information of an ambient light may be obtained by collecting charge carriers (e.g., electron-hole pairs) generated from the photoelectric conversion unit 131 in response to the incident light signal passed through an open shutter of the image sensor 100 during the integration mode.

During a readout mode, the signal generation circuit 132 may generate a pixel output signal VOUT based on the photo-charges generated by the photoelectric conversion. For example, when the image sensor 100 is a CMOS image sensor, the shutter is closed during the readout mode, which follows the integration mode, and the pixel output signal VOUT may be generated based on the information of the image of the object and the information of the ambient light that was obtained in the form of charge carriers during the integration mode.

The unit pixel 122 may have various structures including, for example, a one-transistor structure, a three-transistor structure, a four-transistor structure, a five-transistor structure, a structure in which some transistors are shared by multiple unit pixels, etc. FIG. 3 illustrates the four-transistor structure, according to an exemplary embodiment, for purposes of discussion. The signal generation circuit 132 may include a transfer transistor 133, a reset transistor 135, a drive transistor 136, and a selective transistor 137. The signal generation circuit 132 may also include a floating diffusion (FD) node 134.

The transfer transistor 133 may include a first electrode connected to the photoelectric conversion unit 131, a second electrode connected to the FD node 134, and a gate electrode to which a transfer signal TX is applied. The reset transistor 135 may include a first electrode to which a power supply voltage VDD is applied, a second electrode connected to the FD node 134, and a gate electrode to which a reset signal RST is applied. The drive transistor 136 may include a first terminal to which the power supply voltage VDD is applied, a gate electrode connected to the FD node 134, and a second electrode connected to the selective transistor 137. The selective transistor 137 may include a first electrode connected to the second electrode of the drive transistor 136, a gate electrode to which a select signal SEL is applied, and a second electrode from which the pixel output signal VOUT is output.
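The following Python sketch is a purely behavioral illustration of the reset, transfer, and readout sequence performed by such a four-transistor pixel; the supply voltage, conversion gain, and charge count are assumed values, and the transistor behavior is idealized.

    # Behavioral sketch of a 4T pixel readout; all numeric values are hypothetical
    # and the transistors are idealized.
    VDD = 3.3                  # supply voltage, volts
    CONVERSION_GAIN = 50e-6    # assumed volts per electron at the FD node

    def read_pixel(photo_charges):
        """Return (reset_level, signal_level) as seen on the pixel output VOUT."""
        fd = VDD                                 # RST asserted: FD node 134 reset toward VDD
        reset_level = fd                         # reset level read out via drive/select transistors
        fd -= CONVERSION_GAIN * photo_charges    # TX asserted: photo-charges transferred to FD
        signal_level = fd                        # signal level read out via drive/select transistors
        return reset_level, signal_level

    reset_level, signal_level = read_pixel(photo_charges=20_000)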

Referring again to FIG. 2, the pixel array 121 may include a plurality of ambient light sensing units 123. Each of the plurality of ambient light sensing units 123 may include at least two adjacent unit pixels among the plurality of unit pixels 122. For example, each of the plurality of ambient light sensing units 123 may include four unit pixels arranged in a 2×2 formation. The illuminance data generation unit 160 included in the image sensor 100 of FIG. 1 may generate the illuminance data ILM based on the plurality of ambient light sensing units 123. A structure and an operation of the illuminance data generation unit 160 will be described below with reference to FIG. 4.

The row driver 125, the CDS unit 126, the ADC unit 127 and the timing controller 129 may form a signal processing unit of the sensing unit 120. The signal processing unit may generate the pixel data PDAT, which is digital data, by processing the plurality of analog pixel signals AS.

The row driver 125 may be connected to each row of the pixel array 121. The row driver 125 may generate driving signals to drive each row. For example, the row driver 125 may drive the plurality of unit pixels included in the pixel array 121 row by row.

The CDS unit 126 may generate a plurality of CDS signals SS by performing a CDS operation on the plurality of analog pixel signals AS provided from the pixel array 121. For example, the CDS unit 126 may perform the CDS operation by obtaining a difference between a voltage level representing a reset component of each pixel signal and a voltage level representing an image component and an ambient light component of each pixel signal, to generate the plurality of CDS signals SS corresponding to effective signal components. The CDS unit 126 may include a plurality of CDS circuits connected to column lines of the pixel array 121, respectively, and output the plurality of CDS signals SS corresponding to columns of the pixel array 121, respectively.

The ADC unit 127 may generate the pixel data PDAT by digitizing the plurality of CDS signals SS. The ADC unit 127 may include a counter and a buffer unit. The counter may generate counting signals by performing a counting operation with respect to reset and image components of the pixel signals, and provide the counting signals to the buffer unit. The buffer unit may include a plurality of latch circuits connected to the column lines, respectively, latch the counting signals using the plurality of latch circuits, and output the latched counting signals as the pixel data PDAT.
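As a simplified illustration of the CDS operation and the counter-based digitization described in the two preceding paragraphs, the following Python sketch subtracts the reset component from a sampled pixel signal and counts ramp steps to produce a digital pixel value; the ramp step, counter width, and sampled levels are assumptions.

    # Simplified sketch of CDS followed by a counter-based (single-slope) conversion;
    # the ramp step, counter width, and sampled voltage levels are hypothetical.
    RAMP_STEP = 0.001    # volts per counter tick (assumed)
    MAX_COUNT = 4095     # assumed counter range

    def correlated_double_sample(reset_level, signal_level):
        """CDS: subtract the reset component to keep only the effective signal component."""
        return reset_level - signal_level

    def single_slope_adc(voltage):
        """Count ramp steps until the ramp reaches the sampled voltage."""
        count, ramp = 0, 0.0
        while ramp < voltage and count < MAX_COUNT:
            ramp += RAMP_STEP
            count += 1
        return count

    cds_signal = correlated_double_sample(reset_level=2.8, signal_level=1.9)
    pixel_value = single_slope_adc(cds_signal)   # one value of the pixel data PDAT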

The timing controller 129 may control the row driver 125, the CDS unit 126, and the ADC unit 127. The timing controller 129 may provide control signals, such as a clock signal, a timing control signal, etc., to the row driver 125, the CDS unit 126, and the ADC unit 127. In some example embodiments, the timing controller 129 may include a logic control circuit, a phase locked loop (PLL) circuit, a timing control circuit, a communication interface circuit, etc.

The sensing unit 120 may further include a voltage generation unit that generates various voltage signals, such as a reference voltage, a ramp voltage, etc.

In the example embodiment of FIG. 2, the sensing unit 120 may perform analog double sampling. In other example embodiments, the sensing unit 120 may perform digital double sampling, in which an analog reset signal and an analog data signal are converted into digital signals and a difference between the two digital signals is obtained to represent an effective signal component. In still other example embodiments, the sensing unit 120 may perform dual correlated double sampling, in which both analog double sampling and digital double sampling are performed.

FIG. 4 is a block diagram illustrating an example of an illuminance data generation unit included in the image sensor of FIG. 1.

Referring to FIGS. 2 and 4, the illuminance data generation unit 160 may include a spectrum response compensation (SRC) unit 162, an automatic exposure (AE) adjustment unit 164 and a calculation unit 166. The illuminance data generation unit 160 may further include a control unit 168.

The illuminance data generation unit 160 may select effective sensing units among the plurality of ambient light sensing units 123 of FIG. 2 by performing a crop operation and a sub-sampling operation on the pixel array 121, and generate the illuminance data ILM based on effective pixel data, which correspond to the effective sensing units, among the pixel data PDAT.

The spectrum response compensation unit 162 may extract the effective pixel data among the pixel data PDAT based on the crop operation and the sub-sampling operation, and generate luminance data LDAT based on the effective pixel data.
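To make the crop and sub-sampling idea concrete, the following Python sketch selects 2×2 effective sensing units from a cropped, sub-sampled pixel array and collects their effective pixel data; the crop window, sub-sampling stride, and unit size are illustrative assumptions, not parameters from the application.

    # Hedged sketch: the crop window, sub-sampling stride, and 2x2 sensing-unit size
    # are assumptions chosen for illustration.
    import numpy as np

    def select_effective_units(pixel_data, crop, stride=4, unit=2):
        """Crop the pixel array, then pick 2x2 sensing units on a sub-sampled grid.

        Returns a list of ((row, col), block) pairs, where each block holds the
        effective pixel data of one effective sensing unit.
        """
        r0, r1, c0, c1 = crop
        cropped = pixel_data[r0:r1, c0:c1]
        units = []
        for r in range(0, cropped.shape[0] - unit + 1, stride):
            for c in range(0, cropped.shape[1] - unit + 1, stride):
                units.append(((r, c), cropped[r:r + unit, c:c + unit]))
        return units

    pdat = np.random.randint(0, 1024, size=(480, 640))     # stand-in pixel data PDAT
    effective = select_effective_units(pdat, crop=(40, 440, 60, 580))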



Download full PDF for full patent description/claims.

Patent Info
Application #: US 20140055635 A1
Publish Date: 02/27/2014
Document #: 13943033
File Date: 07/16/2013
USPTO Class: 348/222.1
International Class: H04N 5/235
Drawings: 11

