A "real" light sensor is typically just a CdS cell. CdS, Cadmium Sulfide, reacts to light by changing its resistance. The circuit is very simple (put it in series with a resistor to form a voltage divider, and read the resultant voltage), and very cheap. That, of course, is why we'd use them for ambient light sensors were it not for RoHS complaining about filling up landfills with Cadmium (it's a heavy metal). We have no perfect solution to replace CdS cells, so we typically use photodiodes which, unfortunately, aren't nearly as good for this sort of thing.<div>
Being able to sense light intensity with just one output value from such sensors isn't very useful if you want an image; hence, CCD and CMOS imagers were created for just such tasks. They are composed of MANY individual light sensors behind a lens and an RGB color filter.
Now that the nonsense responses have been answered, let's get to the real issue. If you have a webcam which takes 5-10 seconds to turn on, it's almost certainly limited by the device driver and the operating system. The imager itself will typically be able to produce an image in 200 microseconds or less (depending on the imager type). From there, it goes into the AFE (Analog Front End). The AFE is a purpose-built chip capable of passing the video stream through with minimal delay, and will be on (if it even includes a low-power/standby mode) before the sensor is ready. The AFE digitizes the signal and outputs a bitstream ready to be used by the video processor. The job of the video processor in a webcam is typically to do the color correction and conversion to a compressed format. The video processor will be powered up first, as it has to signal the other chips to wake up.
Now, in a hardware design, I would either tap into the analog signal or process the bitstream from the AFE to average all the pixels to give a brightness reading. Since you're writing software, you're limited to processing the bitstream. To keep the processing power to a minimum, simply configure the webcam to output a bitstream at its lowest resolution. Vector instructions can then perform this task in a minimum number of clock cycles.
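As a rough illustration of the software side (assuming OpenCV and NumPy are installed, and that device index 0 and 160x120 are reasonable guesses for your webcam's lowest mode), averaging a frame boils down to something like:

    import cv2

    cap = cv2.VideoCapture(0)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 160)   # request the lowest resolution the driver offers
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 120)

    ok, frame = cap.read()
    if ok:
        # frame is a NumPy array; mean() averages every pixel of every channel
        brightness = frame.mean() / 255.0    # 0.0 = pitch black, 1.0 = fully saturated
        print("ambient brightness:", brightness)

    cap.release()

NumPy's mean() already runs through vectorised code, so this stays cheap even if you poll it every few seconds.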
Keeping the imager on should not be necessary unless there is a limitation in the driver chips used. Dig into the documentation for the chips, and you may find a way to put them into an appropriate mode for this task. An alternative, though ill-fitting, resort would be to use the LEDs on the laptop as light sensors: every LED also acts as a weak photodiode, so you can reverse-bias it to charge its junction capacitance and then time how quickly photocurrent bleeds that charge away. This would require custom driver code for each laptop design, however, and is therefore unusable as a generic solution.
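For completeness, the timing trick looks roughly like this; the gpio_* helpers are hypothetical placeholders for platform-specific pin access, which is exactly the per-laptop driver code that makes it impractical:

    import time

    def gpio_mode(pin, mode):
        raise NotImplementedError("needs platform-specific driver code")

    def gpio_write(pin, level):
        raise NotImplementedError("needs platform-specific driver code")

    def gpio_read(pin):
        raise NotImplementedError("needs platform-specific driver code")

    def led_brightness(anode, cathode):
        # 1. Reverse-bias the LED to charge its junction capacitance
        gpio_mode(anode, "out");   gpio_write(anode, 0)
        gpio_mode(cathode, "out"); gpio_write(cathode, 1)
        # 2. Float the cathode and time how long photocurrent takes to bleed it off
        gpio_mode(cathode, "in")
        start = time.monotonic()
        while gpio_read(cathode):
            pass
        # 3. Brighter light means more photocurrent, hence a shorter discharge time
        return 1.0 / (time.monotonic() - start)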
The idea that the bitstream would be impossible to share is outrageous. This is nothing more than an excuse for laziness. I would also like to see some data to back up the "shorten battery life by an hour" figure that someone pulled out of thin air. FACTS people, FACTS. Idle speculation will get you nowhere in life.
Oh, and an Arduino is grossly overkill for something so simple. More importantly, it requires an additional piece of hardware, which is very inconvenient and cumbersome.
<div><br><div class="gmail_quote">On Thu, Apr 5, 2012 at 2:24 PM, Simon Oosthoek <span dir="ltr"><<a href="mailto:somlist@xs4all.nl">somlist@xs4all.nl</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<div class="im">On 05/04/12 16:39, Dario Freddi wrote:<br>
> Any other engineers want to give out their opinion? :D (joking)<br><br>
</div>Sure ;-)<br>
On my laptop, the webcam is covered by a bit of paper (not joking, some<br>
of us are paranoid ;-)<br>
<br>
But why not design and build a real ambient light sensor with a usb<br>
interface and use that? It probably isn't that complicated with an<br>
arduino and some hobbying...<br>
<br>
Cheers<br>
<br>
Ir. Simon (going back into deep-lurk mode)<br>
</blockquote></div><br></div>