Foveon took as its mission another radically simple idea Mead loves: "Use all the light."
Don't cameras already use all the light that enters the lens? Film cameras do, but digital cameras, with few exceptions, don't. As Mead puts it, "They throw away two-thirds of the light." That makes sense only if you understand how a typical image sensor works. It's basically a rectangle of silicon on which millions of microscopic light-sensitive pixels (technically they're not pixels, but that's what these light-sensing points have come to be called in the digital-camera business) are arranged in a grid. Pixels can't sense color. So a checkerboard of tiny red, green, or blue filters must be bonded to the surface of the sensor so that each pixel lets in one of the three primary colors of light. In so doing, it blocks out the other two.
By comparing each pixel's single-color reading with those of its neighbors, software can derive the values of the two missing colors at each site. That takes roughly 100 calculations per pixel. In a four-megapixel camera, a size commonly available today, that adds up to some 400 million calculations per image. The process is called interpolation, and Mead has a less kind name for it.
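To see what that interpolation involves, here is a deliberately simplified sketch of the idea (not Foveon's code, nor any camera maker's actual algorithm): each site on an RGGB "Bayer" mosaic records one color, and the two missing colors are estimated by averaging the nearest same-color neighbors.

```python
# Illustrative bilinear demosaicing over an RGGB Bayer mosaic.
# Real cameras use far more elaborate interpolation, but the
# principle is the same: guess the two colors each pixel never saw.

def bayer_color(row, col):
    """Which color an RGGB mosaic records at (row, col)."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

def demosaic(mosaic):
    """Build a full RGB image from a single-channel Bayer mosaic."""
    h, w = len(mosaic), len(mosaic[0])
    channel = {"R": 0, "G": 1, "B": 2}
    rgb = [[[0.0, 0.0, 0.0] for _ in range(w)] for _ in range(h)]
    for r in range(h):
        for c in range(w):
            for ch in "RGB":
                if bayer_color(r, c) == ch:
                    # This color was actually measured here.
                    rgb[r][c][channel[ch]] = mosaic[r][c]
                else:
                    # Interpolate: average same-color sites in a 3x3 window.
                    vals = [mosaic[rr][cc]
                            for rr in range(max(0, r - 1), min(h, r + 2))
                            for cc in range(max(0, c - 1), min(w, c + 2))
                            if bayer_color(rr, cc) == ch]
                    rgb[r][c][channel[ch]] = sum(vals) / len(vals)
    return rgb

# A uniform gray scene: every site reads 100, so interpolation
# should reproduce (100, 100, 100) at every pixel.
flat = [[100.0] * 4 for _ in range(4)]
image = demosaic(flat)
```

On a flat gray scene the guesses are perfect; at sharp edges and fine detail they are not, which is why interpolation can produce color fringes and moiré artifacts.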
"It's a hack," he says. "They have to do all this guesswork to figure out what they threw away. They end up with a lot of data, but two-thirds of it is made up. We end up with the same amount of data, except ours is real."
That is because X3 does what until now only film has been able to do: in one exposure, on one image plane, measure all three primary colors of light at every point on the picture.