I’m trying to measure image sharpness with a script, so that I can compare sharpness across different devices.
- I have a test pattern of alternating pure white and pure black stripes (a small generator sketch follows this list)
- the pattern is printed in high definition on matte paper, so there is no reflection or texture
- I take the picture under exactly the same conditions every time
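For clarity, this is roughly how such a stripe chart can be generated for printing. It is a minimal sketch assuming NumPy and Pillow; the dimensions and stripe width below are illustrative values, not my actual setup:

```python
import numpy as np
from PIL import Image

def make_stripe_chart(width=2000, height=1000, stripe_px=50):
    """Alternating pure-black / pure-white vertical stripes."""
    x = np.arange(width)
    row = ((x // stripe_px) % 2 == 0).astype(np.uint8) * 255  # 0 or 255 only
    return Image.fromarray(np.tile(row, (height, 1)), mode="L")

make_stripe_chart().save("stripe_chart.png")  # file sent to the printer
```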
When I take a picture with a device, I then analyse it pixel by pixel. Of course, the picture will never be perfect, but since the testing conditions are always the same, I should be able to compare sharpness between devices.
So what I measure:
- the number of pure white pixels (usually close to 0)
- the number of pure black pixels (usually close to 0)
- the number of grey pixels
- the number of pixels of any other color
For each of those categories, I record the number of unique colors (in LAB) as well as the total pixel count for the category.
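To make the classification concrete, here is a simplified sketch of it (not my exact script). It assumes scikit-image's rgb2lab for the LAB conversion, and the tolerance thresholds L_WHITE, L_BLACK and CHROMA_MAX are illustrative values:

```python
import numpy as np
from PIL import Image
from skimage.color import rgb2lab

L_WHITE, L_BLACK, CHROMA_MAX = 99.0, 1.0, 2.0  # illustrative tolerances

def classify(path):
    rgb = np.asarray(Image.open(path).convert("RGB"))
    lab = rgb2lab(rgb)                      # L in 0..100; a/b near 0 for greys
    L, a, b = lab[..., 0], lab[..., 1], lab[..., 2]
    chroma = np.hypot(a, b)                 # distance from the neutral axis
    neutral = chroma <= CHROMA_MAX
    white = neutral & (L >= L_WHITE)
    black = neutral & (L <= L_BLACK)
    grey = neutral & ~white & ~black
    other = ~neutral

    stats = {}
    for name, mask in (("white", white), ("black", black),
                       ("grey", grey), ("other", other)):
        vals = np.round(lab[mask])          # quantise LAB so colors compare
        stats[name] = {"pixels": int(mask.sum()),
                       "unique_colors": len(np.unique(vals, axis=0))}
    return stats
```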
Comparing what I see with my eyes against what the pixels are saying, I notice some common trends, but also several directions I could take:
- the number of distinct greys seems to be a good indicator of sharpness (fewer distinct greys = more sharpness)
- the split between light greys (LAB L > 50) and dark greys (LAB L < 50) also seems to hint at sharpness/contrast: when the difference between the two is big, sharpness is better
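In code, those two heuristics could be computed roughly like this. It is a sketch reusing the lab array and grey mask from the snippet above, and the gap between the mean L of the two grey populations is only one possible reading of the second heuristic (comparing the two pixel counts instead would be another):

```python
import numpy as np

def grey_metrics(lab, grey_mask):
    """lab: (H, W, 3) LAB image; grey_mask: boolean mask of grey pixels."""
    greys = np.round(lab[grey_mask])            # quantised grey pixels
    n_distinct = len(np.unique(greys, axis=0))  # fewer distinct greys => sharper

    L = greys[:, 0]
    light, dark = L[L > 50], L[L < 50]
    # gap between the light-grey and dark-grey populations; a larger gap
    # suggests the transition pixels hug black/white, i.e. a sharper edge
    gap = light.mean() - dark.mean() if len(light) and len(dark) else 0.0
    return n_distinct, gap
```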
Do you have ideas for which criteria I should use to measure sharpness?