GSoC: automatic brightness adjustment
Mike Dean
miketdean at gmail.com
Sun Apr 8 12:25:00 BST 2012
A "real" light sensor is typically just a CdS cell. CdS, Cadmium Sulfide,
reacts to light by changing its resistance. The circuit is very simple
(put it in series with a resistor to form a voltage divider, and read the
resultant voltage), and very cheap. That, of course, is why we'd use them
for ambient light sensors were it not for RoHS complaining about filling up
landfills with Cadmium (it's a heavy metal). We have no perfect solution
to replace CdS cells, so we typically use photodiodes which, unfortunately,
aren't nearly as good for this sort of thing.
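To make the divider concrete, here is a small C++ sketch of the
arithmetic (the component values are illustrative, not from any
particular design):

    // Voltage divider: CdS cell on the high side, fixed resistor to
    // ground, ADC reads the midpoint. Bright light -> low CdS
    // resistance -> output rises toward the supply voltage.
    #include <cstdio>

    double divider_out(double v_supply, double r_fixed, double r_cds)
    {
        return v_supply * r_fixed / (r_fixed + r_cds);
    }

    int main()
    {
        // Typical CdS cells swing from ~100 ohms in sunlight to
        // around a megohm in the dark.
        std::printf("bright: %.2f V\n", divider_out(3.3, 10e3, 100.0)); // ~3.27 V
        std::printf("dark:   %.2f V\n", divider_out(3.3, 10e3, 1e6));   // ~0.03 V
        return 0;
    }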
Being able to sense light intensity with just one output value from such
sensors isn't very useful if you want an image; hence, CCD and CMOS imagers
were created for just such tasks. They are composed of MANY individual
light sensors behind a lens and an RGB color filter.
Now that the nonsense responses have been answered, let's get to the real
issue. If you have a webcam which takes 5-10 seconds to turn on, it's
almost certainly limited by the device driver and the operating system.
The imager itself will typically be able to produce an image in 200
microseconds or less (depending on the imager type). From there, it goes
into the AFE (Analog Front End). The AFE is a purpose-built chip capable
of passing the video stream through with minimal delay, and will be on (if
it even includes a low-power/standby mode) before the sensor is ready.
The AFE digitizes the signal and outputs a bitstream ready to be used by
the video processor. The job of the video processor in a webcam is
typically to do color correction and conversion to a compressed format.
The video processor will be powered up first, as it has to signal the
other chips to wake up.
Now, in a hardware design, I would either tap into the analog signal or
process the bitstream from the AFE to average all the pixels to give a
brightness reading. Since you're writing software, you're limited to
processing the bitstream. To keep the processing cost to a minimum,
simply configure the webcam to output a bitstream at its lowest
resolution. Vector instructions can then perform the averaging in a
minimal number of clock cycles.
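As a concrete sketch (assuming a YUYV 4:2:2 frame such as V4L2 capture
commonly delivers; the function name is just illustrative), the averaging
loop is simple enough that modern compilers will auto-vectorize it:

    #include <cstddef>
    #include <cstdint>

    // Mean brightness in [0, 255]. In YUYV, every even byte is a
    // luma (Y) sample, so sum those and divide by the pixel count.
    double mean_brightness_yuyv(const uint8_t* frame,
                                size_t width, size_t height)
    {
        const size_t pixels = width * height;
        uint64_t sum = 0;
        for (size_t i = 0; i < pixels; ++i)
            sum += frame[2 * i];   // Y samples sit at even offsets
        return pixels ? static_cast<double>(sum) / pixels : 0.0;
    }

At 160x120 that is under 20,000 additions per sample, which is negligible
even on a low-end CPU.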
Keeping the imager on should not be necessary unless there is a limitation
in the driver chips used. Dig into the documentation for the chips, and
you may find a way to put them into an appropriate mode for this task. An
alternative, though ill-fitting, resort would be to use the LEDs on the
laptop as light sensors: every LED is also a weak photodiode, so you can
reverse-bias it and time how quickly the photocurrent discharges its
junction capacitance. This would require custom driver code for each
laptop design, however, and is therefore unusable as a generic solution.
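For the curious, the classic measurement loop looks roughly like this.
The gpio_* functions are hypothetical placeholders; real code would have
to poke whatever GPIO or embedded-controller registers drive that
particular LED:

    #include <cstdint>

    // Hypothetical hardware stubs -- real code would poke platform
    // GPIO/EC registers; dummies here so the sketch compiles.
    static void gpio_set_output(int /*pin*/, bool /*level*/) {}
    static void gpio_set_input(int /*pin*/) {}
    static bool gpio_read(int /*pin*/) { return false; }

    // Raw darkness reading: a higher count means darker ambient light.
    uint32_t led_light_reading(int anode_pin, int cathode_pin)
    {
        // 1. Reverse-bias the LED to charge its junction capacitance.
        gpio_set_output(anode_pin, false);
        gpio_set_output(cathode_pin, true);

        // 2. Float the cathode and count iterations until the
        //    photocurrent drains the junction below the logic-low
        //    threshold; bright light discharges it quickly.
        gpio_set_input(cathode_pin);
        uint32_t count = 0;
        while (gpio_read(cathode_pin) && count < 1000000)
            ++count;
        return count;
    }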
The idea that the bitstream would be impossible to share is outrageous.
This is nothing more than an excuse for laziness. I would also like to
see some data to back up the "shorten battery life by an hour" figure that
someone pulled out of thin air. FACTS, people, FACTS. Idle speculation
will get you nowhere in life.
Oh, and an Arduino is grossly overkill for something so simple. More
importantly, it requires an additional piece of hardware, which is
inconvenient and cumbersome.
On Thu, Apr 5, 2012 at 2:24 PM, Simon Oosthoek <somlist at xs4all.nl> wrote:
> On 05/04/12 16:39, Dario Freddi wrote:
> > Any other engineers want to give out their opinion? :D (joking)
>
> Sure ;-)
> On my laptop, the webcam is covered by a bit of paper (not joking, some
> of us are paranoid ;-)
>
> But why not design and build a real ambient light sensor with a usb
> interface and use that? It probably isn't that complicated with an
> arduino and some hobbying...
>
> Cheers
>
> Ir. Simon (going back into deep-lurk mode)
>