Spectrum Inspired Low-light Image Translation for Saliency Detection


Date: 18th Apr 2024

Time: 03:30 PM

Venue: SSB-233 (ALC)



Saliency detection methods are crucial in various practical applications, including robot navigation and satellite imagery analysis. However, the effectiveness of current techniques diminishes in low-light environments, as they are predominantly trained on well-lit image datasets. Addressing this challenge typically involves acquiring a new dataset specifically tailored to low-light conditions, a process that demands meticulous pixel-level annotations and consumes substantial time and resources. In this work, we propose a novel approach that transforms well-lit images into proxy low-light images using classical band-pass filtering techniques in the Fourier domain. Unlike popular deep learning approaches that require learning thousands of parameters and enormous amounts of training data, the proposed transformation is fast, simple, and easy to extend to tasks such as low-light depth estimation. Through extensive experimentation, we demonstrate that saliency detection and depth estimation networks trained on our simulated proxy low-light images outperform those trained using conventional methods when applied to real low-light scenarios. This research provides a practical and effective solution for enhancing saliency detection and depth estimation in low-light settings, paving the way for more robust and versatile applications.
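The core idea above can be sketched in code. The following is a minimal illustration, not the authors' actual method: it band-pass filters an image in the Fourier domain and attenuates brightness to produce a proxy low-light image. All parameter values (`low_cut`, `high_cut`, `dark_gain`) are illustrative assumptions.

```python
import numpy as np

def proxy_low_light(image, low_cut=0.02, high_cut=0.5, dark_gain=0.3):
    """Hypothetical sketch of a spectrum-based low-light simulation.

    `image` is a 2-D grayscale array with values in [0, 1]. The cutoff
    frequencies and darkening gain are illustrative, not from the talk.
    """
    h, w = image.shape
    # Centered 2-D frequency spectrum of the image
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    # Normalized radial frequency of each Fourier coefficient
    fy = np.fft.fftshift(np.fft.fftfreq(h))
    fx = np.fft.fftshift(np.fft.fftfreq(w))
    radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
    # Band-pass mask: keep mid-band frequencies only
    mask = (radius >= low_cut) & (radius <= high_cut)
    # Keep the DC term so mean brightness survives the filtering step
    mask[h // 2, w // 2] = True
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask)).real
    # Global attenuation to mimic reduced illumination
    return np.clip(filtered * dark_gain, 0.0, 1.0)
```

Because the transformation involves only FFTs and elementwise operations, it runs in a fraction of a second per image, with no parameters to learn, which is what makes it attractive as a data-generation step for training downstream networks.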


Kitty Varghese

Computer Science and Engineering