The iPhone 7 Plus’s stunner of a camera has reinvigorated the smartphone industry’s pursuit of dual-sensor tech. Sony, a major smartphone camera supplier, said dual-lens camera modules would appear in a number of high-end handsets in the coming months. LG’s top-of-the-line G5 sports a dual-sensor camera, as does Huawei’s P9. And phonemakers aren’t the only companies jumping aboard. Another is chipmaker Qualcomm, which on Wednesday took the wraps off its latest dual-camera project: Clear Sight.
Qualcomm describes Clear Sight as a “processing solution” for phones with dual cameras, and that’s more or less the gist. The platform comprises a hardware module, a Snapdragon 820 or 821 processor, and a “computational low light … algorithm,” and it leverages dual cameras to produce photos with better contrast, sharpness, and clarity than either camera could deliver on its own. The key is the method of capture: while one sensor grabs a monochrome image, the other captures a full-color frame in tandem.
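Qualcomm hasn’t published the details of its algorithm, but the basic idea of mono-plus-color fusion can be sketched in a few lines. The function name and the luminance-blend approach below are assumptions for illustration, not Clear Sight’s actual pipeline:

```python
import numpy as np

def fuse_mono_color(mono, color, weight=0.7):
    """Toy mono + color fusion sketch (illustrative only).

    mono:   HxW float array in [0, 1] -- the monochrome sensor's
            higher-signal luminance capture.
    color:  HxWx3 float RGB array in [0, 1] -- the color sensor's frame.
    weight: fraction of the final luminance taken from the mono sensor.
    """
    # Rec. 601 luma of the color frame
    luma = color @ np.array([0.299, 0.587, 0.114])
    # Blend the cleaner mono luminance into the color frame's luminance
    fused_luma = weight * mono + (1.0 - weight) * luma
    # Rescale each pixel's RGB so its luma matches the fused luminance,
    # keeping the color sensor's hue and saturation
    scale = fused_luma / np.maximum(luma, 1e-6)
    return np.clip(color * scale[..., None], 0.0, 1.0)
```

In this simplified view, the monochrome sensor contributes detail and low-light signal while the color sensor contributes chroma; a real ISP would also have to align the two frames, since the lenses sit a few millimeters apart.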
Related: Should your next phone have a dual-camera setup?
The improvements are reportedly measurable. The black-and-white sensor captures three times more light than its color-capturing companion, Qualcomm said, while the color sensor nabs a much more nuanced color spectrum than it otherwise could. There’s a downside in that few phones on the market pack the requisite hardware — one of the newest, LG’s V20, specifically lacks support because its sensors have “incompatible lens angles” — but Qualcomm said it’s working with manufacturers to build Clear Sight into future smartphones.
Clear Sight is an expansion of the imaging platform that Qualcomm launched last October. The chipmaker rolled out specialized hardware, a dual-image signal processor (ISP) dubbed Spectra, destined for two-sensor camera phones like the ZTE Axon and Axon Pro, HTC One M8, and HTC Butterfly 3. It facilitates image processing between the twin shooters, of course, but enables a bevy of features besides. One, “refocus,” allows the adjustment of an image’s focus after the fact by capturing two focal lengths simultaneously, one per camera. Another, “segmentation,” maps a pic’s foreground and background focus planes. And a third, fast autofocus, boosts the speed at which the cameras lock onto focus planes.
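The “refocus” idea above can be sketched with a toy composite: given two frames focused at different distances and a per-pixel depth map derived from the two cameras, each pixel is taken from the frame in which it is sharpest. This is hypothetical illustration code, not Spectra’s real implementation:

```python
import numpy as np

def refocus(near_frame, far_frame, depth, threshold):
    """Toy after-the-fact refocus sketch (illustrative only).

    near_frame: HxWx3 frame focused on nearby subjects.
    far_frame:  HxWx3 frame focused on the background.
    depth:      HxW per-pixel depth map (assumed derived from the
                two cameras' parallax).
    threshold:  depth at which to switch between the two captures --
                varying it "moves" the focus after the shot.
    """
    mask = depth < threshold            # True where the scene is close
    return np.where(mask[..., None], near_frame, far_frame)
```

The same depth mask is, in effect, the “segmentation” feature: pixels on the near side of the threshold form the foreground plane, the rest the background.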
They’re far different approaches from the one Apple’s taken with the iPhone 7 Plus. The handset’s module combines image data from two rear-facing sensors — a wide-angle lens and a 2x telephoto — to deliver optical zoom without loss of detail. In the near future, it’ll gain another unique trick: the ability to simulate bokeh, the soft background blur of a shallow depth of field. Apple said that functionality will roll out in October.
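One plausible way a wide-plus-telephoto module delivers “lossless” zoom is by switching lenses at the telephoto’s native magnification. This is an assumed behavior sketched for illustration, not Apple’s published implementation:

```python
def pick_lens(zoom, tele_factor=2.0):
    """Toy dual-lens zoom selection (assumed behavior, illustrative only).

    Below the telephoto's native factor, the wide lens covers the shot
    with a digital crop; at or beyond it, the telephoto takes over so
    optical magnification does the work and only the remainder is cropped.
    Returns (lens_name, residual_digital_crop_factor).
    """
    if zoom < tele_factor:
        return ("wide", zoom)
    return ("tele", zoom / tele_factor)
```

For example, a 2x request lands entirely on the telephoto with no digital crop, which is why detail is preserved at that zoom level.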