
How would you algorithmically detect, for any given photo, whether the sun was shining when the picture was taken?

Examples

A sample from this webcam at a mountain top:

sunshine example

Clearly the sun is shining.

In this other sample it's far less obvious:

cloudy example

One could probably detect fairly easily whether it's foggy by trying to identify the tiny spire on the chapel in the center. However, knowing very little about image processing, I'd be surprised if there was an algorithm (or combination of algorithms) that could reliably tell whether there's sunshine or not.

Marcel Stör
  • I think it fits CS.SE very well. It would not fit if you asked about an ImageMagick script that sorts your photos depending on sunshine. – frafl Apr 05 '13 at 15:07
  • Do you mean detecting whether the sun was shining at the time or not, or detecting where (orientation) the sun was shining? – Paresh Apr 05 '13 at 15:40
  • @Paresh, whether the sun was shining at the time the photo was taken (think web cam). – Marcel Stör Apr 06 '13 at 11:55
  • There is this lovely anecdote about neural network follies: the Pentagon tried to detect enemy tanks in photos with a neural network, but in the end they had a multi-million-dollar mainframe computer that could distinguish photos that were taken on a cloudy day from ones that were taken on a sunny day. :-) – uli Apr 06 '13 at 14:04
  • Can we assume a calibrated camera, i.e. can we trust colors? – Raphael Apr 07 '13 at 11:52
  • @Raphael, I don't know as not all cameras are under my control. Let's assume you can trust colors. – Marcel Stör Apr 07 '13 at 12:27
  • I commented before reading decden's answer below; I think what he proposes is reasonable: sunlight can be distinguished from many other light sources by its spectrum. – Raphael Apr 07 '13 at 12:32

1 Answer


If you can access the metadata, you could apply a number of heuristics:

  1. Check the white-balance setting the camera chose for the photo. It indicates the color temperature of the light at the time the photo was taken. Sunlight is usually around 5500 Kelvin; indoor lighting and cloudy days usually have different color temperatures.

  2. Check the exposure settings. Usually when the sun shines there is more light available, which influences the following settings:

    1. A faster shutter speed (a shorter exposure time)
    2. A lower ISO setting
    3. A higher f-number (a smaller aperture)

    In the old days of photography, the Sunny 16 rule was used to estimate exposure; you could use it, together with the settings above, to check whether the photo was taken in bright sunlight.

  3. Check if a flash fired or not.

  4. Check the time the photo was taken. Assuming the user has set the camera's clock correctly, you can immediately rule out photos shot at night.
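Taken together, these metadata heuristics could be sketched as a simple scoring function. This is only an illustration: the function name, the thresholds (1/250 s shutter, ISO 200, f/8, daytime between 06:00 and 20:00) and the field names are my own assumptions, not calibrated values — real photos would need tuning per camera:

```python
# Sketch of the metadata heuristics above. All thresholds are illustrative
# assumptions. `exif` is a plain dict of already-decoded EXIF fields, e.g.
# as obtained from Pillow's Image.getexif().
from datetime import datetime
from fractions import Fraction

def sunny_score(exif):
    """Return a rough 0..4 score: how many heuristics point to sunshine."""
    score = 0
    # 2.1/2.2/2.3: bright light -> fast shutter, low ISO, small aperture.
    if exif.get("ExposureTime", 1) <= Fraction(1, 250):
        score += 1
    if exif.get("ISOSpeedRatings", 1600) <= 200:
        score += 1
    if exif.get("FNumber", 1.4) >= 8:
        score += 1
    # 4: night-time timestamps rule sunshine out entirely.
    when = exif.get("DateTimeOriginal")
    if when is not None:
        hour = datetime.strptime(when, "%Y:%m:%d %H:%M:%S").hour
        if not 6 <= hour <= 20:
            return 0
    # 3: in EXIF, bit 0 of the Flash tag means "flash fired";
    # no flash argues for sufficient ambient light.
    if not exif.get("Flash", 0) & 1:
        score += 1
    return score

# A bright outdoor exposure (1/500 s, ISO 100, f/11, no flash, midday):
daylight = {"ExposureTime": Fraction(1, 500), "ISOSpeedRatings": 100,
            "FNumber": 11, "Flash": 0,
            "DateTimeOriginal": "2013:04:05 12:30:00"}
print(sunny_score(daylight))  # -> 4
```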

If, however, you want to approach this problem purely from an image-processing point of view: sunlight usually produces higher contrast and harsher shadow edges, so histogram analysis and detection of hard edges might give a good indication.
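The contrast-and-edges idea could be sketched in a few lines of NumPy. The two statistics here (RMS contrast and mean gradient magnitude) are one plausible choice among many, and any decision thresholds on top of them would need tuning on real webcam frames:

```python
# Minimal sketch of the image-processing heuristic: sunlit scenes tend to
# show higher global contrast and stronger edges (harsh shadows).
import numpy as np

def contrast_and_edges(gray):
    """gray: 2-D float array in [0, 1].
    Returns (RMS contrast, mean gradient magnitude)."""
    rms_contrast = gray.std()
    gy, gx = np.gradient(gray)           # per-pixel intensity gradients
    edge_strength = np.hypot(gx, gy).mean()
    return rms_contrast, edge_strength

# Synthetic "sunny" frame: bright half, hard shadow boundary.
sunny = np.zeros((100, 100)); sunny[:, 50:] = 1.0
# Synthetic "overcast" frame: flat mid-grey, no edges.
overcast = np.full((100, 100), 0.5)

print(contrast_and_edges(sunny))     # high contrast, visible edge response
print(contrast_and_edges(overcast))  # zero contrast, zero edge response
```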

Assuming the photos you are processing are in a raw image format, you could apply the same white-balance trick described above. This does not work for standard images like JPEG or PNG, however, because the image processor in the camera already compensates for the shift in color temperature and bakes the result into the final image, merely recording the white-balance setting in the metadata.
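On raw (linear, uncorrected) data, a crude version of that white-balance trick is the gray-world assumption: average the channels and look at the red/blue ratio, since warm indoor light pushes red up relative to blue while ~5500 K daylight is closer to balanced. The 1.2 cut-off below is an illustrative assumption:

```python
# Gray-world sketch: estimate the illuminant's warmth from channel means.
# Only meaningful on linear raw data, not on already white-balanced JPEGs.
import numpy as np

def red_blue_ratio(rgb):
    """rgb: H x W x 3 float array of linear sensor values."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    return means[0] / means[2]          # R mean divided by B mean

tungsten = np.tile([0.8, 0.6, 0.4], (10, 10, 1))  # warm indoor cast
daylight = np.tile([0.6, 0.6, 0.6], (10, 10, 1))  # neutral scene
print(red_blue_ratio(tungsten) > 1.2)  # -> True  (warm light suspected)
print(red_blue_ratio(daylight) > 1.2)  # -> False (consistent with daylight)
```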

decden
  • I was rolling on the floor when I read "Check the time the photo was taken". However, I'm not sure the OP wants to reconstruct what the camera did. – frafl Apr 05 '13 at 20:41
  • @frafl, I added two examples – Marcel Stör Apr 06 '13 at 12:16
  • @frafl A heuristic is any approach to problem solving, learning, or discovery that employs a practical method not guaranteed to be optimal or perfect, but sufficient for the immediate goals. Checking the time the photo was taken (if the location is even partially known) would be a perfectly acceptable heuristic. – Andy Gee Apr 21 '16 at 10:05