Night Photography Challenge
Welcome to the "Night Photography" challenge, part of the NTIRE workshop at CVPR 2024. The challenge seeks to advance image processing methods for night photography by addressing the complexities of rendering images captured at night, this year using raw mobile phone captures, and by evaluating the results on both perceptual quality and computational efficiency.

Motivation for this challenge

Cameras rely on onboard processing to transform raw sensor images into final, polished photographs, which are then encoded in a standard color space such as sRGB. However, capturing images at night presents distinct challenges not typically encountered in daylight shots. Unlike daytime images, where assuming a single global illuminant is often adequate, night images frequently feature multiple illuminants, many of which are visible in the scene. This complexity makes it difficult to determine the optimal illumination correction for rendering night images. Additionally, the tone curves and photo-finishing techniques commonly used for daytime images may not be suitable for night photography. Furthermore, widely employed image metrics such as SSIM and LPIPS may not effectively evaluate night images. The absence of established best practices and the limited research on night photography pose significant challenges. The main objective of this challenge is to stimulate research and advance image processing methods tailored specifically to night photography.
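
For intuition, the single-global-illuminant assumption mentioned above can be made concrete with a toy rendering routine. The following is a minimal sketch (gray-world white balance plus a fixed gamma), not the challenge baseline; the function name and parameters are illustrative only.

    import numpy as np

    def render_with_global_illuminant(raw_rgb: np.ndarray) -> np.ndarray:
        """Toy daytime-style rendering: one global white-balance gain plus a gamma curve.

        raw_rgb: float32 array in [0, 1], shape (H, W, 3), already demosaicked.
        (Illustrative sketch only; not the challenge baseline.)
        """
        # Gray-world estimate: assume the average scene color is achromatic.
        illuminant = raw_rgb.reshape(-1, 3).mean(axis=0)          # per-channel mean
        gains = illuminant.mean() / np.maximum(illuminant, 1e-6)  # scale channels toward gray
        balanced = np.clip(raw_rgb * gains, 0.0, 1.0)
        # Simple global tone curve (sRGB-like gamma); production pipelines use tuned curves.
        return balanced ** (1.0 / 2.2)

With multiple visible light sources in a night scene, a single gain vector like this cannot correct every region at once, which is exactly the difficulty described above.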

This year's challenge diverges from its predecessors by incorporating raw images captured by mobile phones. This shift is prompted by the widespread demand for processing such images and by their distinctive characteristics compared with the conventional camera images used in previous challenges. Notable differences, such as increased noise and stronger vignetting, make this year's challenge particularly interesting. Moreover, because of the limited computational resources of mobile phones, an additional ranking list will be introduced this year: it will rank the top-performing solutions from the regular list, which is based solely on average quality as measured by the mean opinion score, according to their execution time.
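
To illustrate one of these differences, mobile raw frames often benefit from explicit vignetting compensation before further rendering. Below is a minimal sketch of a radial gain correction; the quadratic falloff model and the strength parameter are assumptions for illustration, not values taken from the challenge data or metadata.

    import numpy as np

    def correct_vignetting(raw: np.ndarray, strength: float = 0.3) -> np.ndarray:
        """Compensate radial falloff with a simple quadratic gain map.

        raw: float array in [0, 1], shape (H, W) or (H, W, C).
        strength: hypothetical tuning parameter controlling corner brightening.
        """
        h, w = raw.shape[:2]
        y, x = np.mgrid[0:h, 0:w]
        # Normalized distance from the image center (assumed optical center).
        r = np.hypot((x - w / 2) / (w / 2), (y - h / 2) / (h / 2))
        gain = 1.0 + strength * r ** 2  # brighten corners more than the center
        if raw.ndim == 3:
            gain = gain[..., None]
        return np.clip(raw * gain, 0.0, 1.0)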

Challenge Goal and Uniqueness

This challenge addresses a specific issue within computer vision: there are no ground-truth images. The objective is to devise a methodology for generating authentic and visually captivating photographs of nighttime scenes, a challenging problem that remains largely unsolved. Night photography not only holds technical significance in domains such as surveillance and security, but also has a place in art, since it can produce stunning and awe-inspiring images. Submissions will be evaluated with mean opinion scores from observers, complemented by a computational cost analysis, thereby also addressing the efficiency of the developed procedures.

[Example renderings: state-of-the-art (SotA) image and baseline image]

Challenge data

The participants in this challenge will be granted access to raw-RGB images of night scenes, captured with a Huawei Mate 40 Pro sensor and encoded as 16-bit PNG files. Accompanying these images will be additional metadata provided in JSON files. The challenge will commence with the provision of the initial images, which participants will use to develop and test their algorithms. Data access will be granted upon registration; further information can be found on the registration form at the bottom of the page. Additional images will be made available throughout the challenge, with details given in the evaluation and leaderboard section. As an added resource, the organizers have also provided code for a baseline algorithm, as well as a demonstration, on GitHub.
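
As a convenience, loading one sample might look like the sketch below, assuming the PNG/JSON pairing described above; the reader library and metadata handling are assumptions, and the actual field names should be taken from the provided JSON files.

    import json
    import numpy as np
    import imageio.v3 as iio  # any 16-bit-capable PNG reader works here

    def load_sample(png_path: str, json_path: str):
        """Read a 16-bit raw-RGB PNG and its accompanying JSON metadata."""
        raw = iio.imread(png_path).astype(np.float32) / 65535.0  # scale to [0, 1]
        with open(json_path) as f:
            meta = json.load(f)  # field names vary; consult the provided files
        return raw, meta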

Evaluation/Leaderboard

The evaluation of the participants' solutions in this challenge will consist of four checkpoints: three validation checkpoints during the contest and one final checkpoint at its conclusion. Only the final checkpoint is mandatory; the validation checkpoints are optional. As such, new participants may join the challenge at any time before the final submission deadline, provided they fulfill the other requirements of the challenge.

Mean opinion scores will be obtained through visual comparison, carried out using Yandex Tasks (similar to Mechanical Turk). Yandex Tasks users will rank the solutions in a forced-choice manner. It is important to note that Yandex Tasks primarily relies on observers from Eastern Europe and Central Asia to perform the image ranking, so there may be a cultural bias in the image aesthetics preferred by the observers. Yandex Tasks users will not be made aware of the identity of the participants. An example of this evaluation process can be found at a specified link.
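
For intuition on how forced-choice votes can be turned into a ranking, here is a minimal win-rate aggregation sketch. It is not the organizers' official aggregation (which may, for example, use a Bradley-Terry-style fit); all names are hypothetical.

    from collections import Counter

    def win_rates(comparisons):
        """Aggregate forced-choice votes into a per-method score.

        comparisons: iterable of (winner, loser) method names, one per observer vote.
        Returns each method's fraction of won comparisons.
        """
        wins, totals = Counter(), Counter()
        for winner, loser in comparisons:
            wins[winner] += 1
            totals[winner] += 1
            totals[loser] += 1
        return {m: wins[m] / totals[m] for m in totals}

    # Example: three votes over two methods; A wins 2/3 of its comparisons, B wins 1/3.
    print(win_rates([("A", "B"), ("A", "B"), ("B", "A")]))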

The results obtained during the validation checkpoints will provide feedback to the challenge teams on the quality of their solutions. During each validation checkpoint, 125 new test images will be provided to the participants. Each participating team will be able to submit up to two distinct solution image sets, each consisting of exactly 125 images; this is intended to help participants compare the behavior of different solutions.

For the final assessment, your submission should comprise a Docker container with the executable solution, which we will use to generate results for the final test set on our end. It should also include the results for the 3rd validation set, generated by the solution inside the Docker container; this is solely to confirm that we are running the submitted solution as its authors intended. The final test set will be kept confidential until after the submission deadline; however, either an encrypted version of the test set or its hash will be disclosed in advance.
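
As a rough illustration of the expected batch behavior of such a container, a hypothetical Python entrypoint is sketched below. The mount points, file names, output format, and invocation are assumptions; the actual interface will follow the submission rules.

    """Hypothetical container entrypoint: render every raw PNG in an input directory."""
    import sys
    from pathlib import Path

    import numpy as np
    import imageio.v3 as iio

    def render(raw: np.ndarray) -> np.ndarray:
        # Placeholder for the team's actual rendering algorithm.
        srgb = np.clip(raw.astype(np.float32) / 65535.0, 0.0, 1.0) ** (1.0 / 2.2)
        return (srgb * 255.0).astype(np.uint8)

    def main(in_dir: str = "/data/input", out_dir: str = "/data/output") -> None:
        out = Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        for png in sorted(Path(in_dir).glob("*.png")):
            result = render(iio.imread(png))
            iio.imwrite(out / f"{png.stem}.png", result)  # output format per the submission rules

    if __name__ == "__main__":
        main(*sys.argv[1:3])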

During solution submission, participants will have the option to make their Docker container open, i.e., publicly available after the challenge. Doing so is a prerequisite for receiving a winner certificate, should they win one of the first three places, and for being included in the final report.

It is important to note that the evaluation of this challenge is subjective in nature, and the challenge organizers have designed it to be as fair as possible, given the nature of the task. The challenge organizers reserve the right to modify the evaluation procedures as necessary to improve the fairness of the challenge.

Only solutions that use open components, available to all participants at the start of the competition, are allowed. Proprietary solutions will be disqualified.

The solutions will be executed on the computer with the following configuration:

CPU: Intel(R) Core(TM) i7-4790 CPU @ 3.60GHz
RAM: 16.0 GB
GPU: MSI GeForce RTX 2060 12 GB

The solutions should produce images of size 1024×768 for horizontal images and 768×1024 for vertical ones.
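
A quick self-check along these lines can catch resolution mistakes before a set is submitted; the directory layout and file format assumed below are hypothetical.

    from pathlib import Path
    import imageio.v3 as iio

    # Allowed (height, width) pairs: 1024x768 landscape and 768x1024 portrait.
    ALLOWED = {(768, 1024), (1024, 768)}

    def check_result_dir(result_dir: str) -> None:
        """Warn about any rendered image whose resolution is not one of the allowed sizes."""
        for path in sorted(Path(result_dir).glob("*.png")):  # adjust the pattern to your output format
            h, w = iio.imread(path).shape[:2]
            if (h, w) not in ALLOWED:
                print(f"{path.name}: unexpected size {w}x{h}")

    check_result_dir("results/validation_3")  # hypothetical folder name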

Submissions

The submission rules will be described here around the time the challenge data becomes available.

Timeline

* Please note that the timeline may change slightly, so it is advised to check the challenge web page periodically.

Registered Teams

The list of 50+ registered teams is available here.

Reporting

In order to be eligible for the prizes, participants will be required to submit their code and a report describing their solution, in the form of a short paper, during submission. If this report is not submitted, participants who would otherwise win a prize will be passed over.

Prizes

Winners will receive a winner certificate and will have the opportunity to submit their paper to NTIRE 2024 and to participate in the joint challenge report, which will also be submitted to the CVPR workshop.

There will also be monetary prizes in US dollars. The first three solutions according to the quality score obtained by using the Yandex Tasks platform will be awarded $1000, $650, and $350, respectively.

Additionally, the top 5 solutions according to the quality score obtained using the Yandex Tasks platform will be re-ranked according to the execution time required to render the images. The first three solutions after this re-ranking, i.e., the three fastest high-quality solutions, will be awarded $1000, $650, and $350, respectively.
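
In other words, the efficiency prizes come from a two-stage sort: first by quality, then by runtime within the top five. A minimal sketch with hypothetical field names:

    def efficiency_ranking(solutions):
        """Sort the five best-quality solutions by execution time.

        solutions: list of dicts with hypothetical keys "name", "mos" (mean opinion
        score, higher is better), and "runtime" (seconds per image, lower is better).
        """
        top5_by_quality = sorted(solutions, key=lambda s: s["mos"], reverse=True)[:5]
        return sorted(top5_by_quality, key=lambda s: s["runtime"])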

Q&A

If you still have any questions, please send an email to nightphotochallenge@gmail.com.

Organizers

Russian Academy of Sciences - IITP (Moscow, Russia)
  • Egor Ershov
  • Maria Efimova
  • Dima Iarchuk
  • Oleg Karasev
  • Sergey Korchagin
  • Artyom Panshin
  • Alexandr Startsev
  • Arseniy Terekhin
  • Daniil Vladimirov
  • Ekaterina Zaychenkova
  • Lev Shepelev
Gideon Brothers (Croatia)
  • Nikola Banić
Swiss Federal Institute of Technology in Zurich (Switzerland) and University of Würzburg (Germany)
  • Radu Timofte

Sponsorship

AIRI Institute