I have a picture consisting of a two-dimensional array of ordered triples (red, green, blue) of real numbers from 0 to 1. I'm looking for something like a norm on pictures which expresses the range of colors used.
The idea is that a grayscale image should have norm 0 and an image with pure red, green, and blue should have norm 1. Here's the key part: a colorized image (black and red, for example, instead of black and white) should have norm 0 just like a grayscale image. So if all colors are linear combinations of two colors, the norm is 0, and the norm measures the extent to which a third basis vector is needed to represent the colors used (even if only for one pixel).
Any ideas on how to formalize this?
My first instinct is to change bases to (hue, saturation, lightness) and look at the maximum difference of hues (mod 1). But then two pixels with colors $(\varepsilon, 0, 0)$ and $(0, \varepsilon/2, \varepsilon/2)$ would seem very distant, when in fact they represent nearly identical (near-black) colors.
It seems natural at this point to transform the problem into one of geometry and look at the color bicone, but what is the most natural metric to use here? Euclidean? $L^1$? Something else?
Of course other approaches would be welcome.
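To make the "third basis vector" idea concrete, here is one candidate I've sketched (my own assumption, not necessarily the right formalization): treat the pixels as a $3 \times N$ matrix of color vectors and take the ratio of its smallest to largest singular value. This is 0 exactly when every color lies in a 2-D linear span (grayscale, or a black-and-red colorization), and 1 for an image with equal parts pure red, green, and blue.

```python
import numpy as np

def color_range_norm(image):
    """Ratio of smallest to largest singular value of the pixel matrix.

    image: (H, W, 3) array of RGB values in [0, 1].
    Returns 0 when all colors lie in the linear span of two colors
    (e.g. grayscale or a black-and-red colorization), and 1 for an
    image with equal numbers of pure red, green, and blue pixels.
    """
    pixels = image.reshape(-1, 3).T               # 3 x N matrix of color vectors
    s = np.linalg.svd(pixels, compute_uv=False)   # s[0] >= s[1] >= s[2] >= 0
    if s[0] == 0.0:                               # all-black image
        return 0.0
    return s[2] / s[0]
```

One caveat: singular values aggregate over all pixels, so a single stray pixel outside the 2-D span moves $\sigma_3$ only slightly; my "even if only for one pixel" requirement would instead call for something like the maximum distance of any pixel from the best-fit plane through the origin.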