Sunday, October 22, 2023

Single-photon Imaging at ICCV’23 in Paris

U. Toronto presented their award-winning paper on passive high-speed single-photon imaging. It’s a nice shot in the arm for this exciting and rapidly developing research area! Congratulations!

WISION Lab presented four papers on various aspects of single-photon imaging:

a) SoDaCam: If you capture every incident photon, you can computationally recreate a variety of cameras (normal cameras, HDR cameras, event cameras, high-speed cameras, etc.) — after the fact — all from photons captured by one single-photon camera. This takes the idea of software-defined cameras to its limit: from digital cameras (pixels) and light-field cameras (rays) to individual photons. Are we getting closer to the goal of a single, all-purpose camera?
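As a rough illustration of the idea (this is a toy sketch, not the paper's actual algorithm; all sizes, thresholds, and the simulated "photon cube" are made up), one can emulate several camera types after the fact from a stack of binary single-photon frames:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy photon cube: T binary frames from a single-photon camera
# (1 = photon detected in that frame at that pixel).
T, H, W = 2000, 32, 32
flux = rng.uniform(0.005, 0.2, size=(H, W))          # per-frame detection probability
cube = rng.random((T, H, W)) < flux                  # binary photon detections

# "Normal" camera: sum photons over a chosen exposure window, chosen after capture.
exposure = cube[:500].sum(axis=0)

# High-speed camera: many short exposure windows give a fast frame sequence.
highspeed = cube.reshape(40, 50, H, W).sum(axis=1)   # 40 frames, 50-bin windows

# HDR-style flux estimate: invert the binary detection model,
# which avoids saturation at high photon rates.
p = cube.mean(axis=0)
flux_hat = -np.log(np.clip(1.0 - p, 1e-6, None))     # MLE under Poisson arrivals

# Event-camera emulation: threshold log-intensity changes between windows.
w = cube.reshape(40, 50, H, W).mean(axis=1)
logw = np.log(np.clip(w, 1e-3, None))
events = np.abs(np.diff(logw, axis=0)) > 0.5         # "events" where change is large
```

The point is that all four outputs come from the same raw photon data; the choice of camera is deferred to software.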

b) Eulerian Single-Photon Vision: Single-photon cameras are known to be power-, compute-, and bandwidth-hungry. This work shows that it’s possible to do fast low-level vision (edge detection, optical flow) in extreme low light with single-photon cameras on a tight computational and memory budget.
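To give a flavor of how cheap such processing can be (a minimal sketch, not the paper's pipeline; the scene, filter constant, and threshold are all invented for illustration), a recursive temporal filter costs one multiply-add per pixel per frame and never stores the photon stream:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy low-light scene: a bright square on a dark background,
# observed as noisy binary single-photon frames.
H, W, T = 64, 64, 800
flux = np.full((H, W), 0.02)
flux[16:48, 16:48] = 0.15                            # bright square

state = np.zeros((H, W))                             # running intensity estimate
alpha = 0.005                                        # filter "memory" (illustrative)
for _ in range(T):
    frame = (rng.random((H, W)) < flux).astype(float)
    state += alpha * (frame - state)                 # one multiply-add per pixel

# Cheap edge map from the filtered estimate.
gy, gx = np.gradient(state)
edges = np.hypot(gx, gy) > 0.04
```

The memory footprint is a single running image, regardless of how many photon frames stream through.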

c) Panoramas from Photons: Can we capture scene representations under extreme conditions (ultra low light, a rapidly moving camera)? How many of us have been prompted to slow down when making a panorama with our iPhones? Wouldn’t it be nice to take a high-quality gigapixel panorama of a starry night sky? This paper tackles classical vision problems (motion estimation, feature correspondence) in the extreme regime of single-photon imaging. It also features the largest single-photon panorama captured so far!

d) Compressive Single-Photon Cameras: Another major challenge faced by single-photon cameras is that of data deluge. These sensors capture orders of magnitude more data than conventional cameras, making it challenging, if not impossible, for downstream algorithms to keep up. This is especially critical in real-time and resource-constrained applications (e.g., robot navigation, autonomous driving). This work takes steps to address this problem by designing novel compressed representations of raw single-photon data. The key is to compute these representations on the fly as photons come in, thus achieving 1-2 orders of magnitude compression with only lightweight computational operations that can be implemented on-chip.
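A hedged sketch of the streaming idea (the binary codes and sizes below are made up for illustration and are not the representations from the paper): instead of storing every photon frame, maintain a handful of coded sums that are updated as each frame arrives.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1024-frame binary photon stream compressed to 16 coded sums per pixel (64x).
T, K, H, W = 1024, 16, 8, 8
codes = rng.integers(0, 2, size=(K, T)).astype(float)  # illustrative random binary codes
codes[0] = 1.0                       # keep one all-ones code: total counts stay recoverable
flux = rng.uniform(0.01, 0.3, size=(H, W))

sketch = np.zeros((K, H, W))
for t in range(T):
    frame = (rng.random((H, W)) < flux).astype(float)  # photons arriving this frame
    sketch += codes[:, t, None, None] * frame          # streaming update; frame is discarded

# Downstream algorithms see K values per pixel instead of T frames;
# e.g., the all-ones code directly gives a flux estimate.
flux_est = sketch[0] / T
```

Each update is an add per code per pixel, which is the kind of lightweight operation the paper argues can run on-chip as photons arrive.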