I’m trying to reproduce an MTF chart programmatically, but I’m stuck at a certain point.
I’m starting with a clean bar chart like this:
I then take a picture of this pattern with a smartphone (printed on HD-quality paper, no gloss, no texture, …) and I get something like this (there is a bit of distortion):
I used the following page to understand how MTF is calculated:
With my script, I basically scan the vertical lines one by one: for each line I go through its pixels from top to bottom and record the minimum and maximum luminance (I convert each pixel to LAB and take the “L” value).
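To make the question self-contained, here is a minimal sketch of that scanning step. It uses a synthetic pattern instead of the real photo (each column holds a vertical sinusoid whose frequency grows from left to right) and plain grayscale values instead of the LAB “L” channel — both are stand-ins, not my actual code:

```python
import numpy as np

# Synthetic stand-in for the chart photo: every column contains a vertical
# sinusoid, and the frequency increases from left to right. The real script
# would load the photo and keep the L channel of its LAB conversion instead.
width, height = 64, 32
y = np.arange(height)[:, None]                  # row index, broadcast per column
freq = np.linspace(0.05, 0.45, width)[None, :]  # cycles per pixel, per column
img = 127.5 + 100.0 * np.sin(2.0 * np.pi * freq * y)

# Scan every vertical line: minimum and maximum luminance over the column.
vmin = img.min(axis=0)
vmax = img.max(axis=0)
```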
For the first step (amplitude), I get the following graph for my “perfect” example:
And when I do the same with the picture I took, I get something like this:
So far, this seems in line with what I see on the Imatest page, except that my values seem inverted and that I go above 100, even though I only apply the following formula: C(f) = (Vmax − Vmin) / (Vmax + Vmin).
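One thing worth ruling out: if the formula is implemented exactly as it was written without parentheses, operator precedence turns it into a different expression entirely, which would already explain values far above 100. Hypothetical numbers:

```python
# Illustrative values only. Python (like most languages) evaluates the
# division before the additions/subtractions, so the unparenthesized form
# is parsed as vmax - (vmin / vmax) + vmin, not as the contrast ratio.
vmax, vmin = 200.0, 30.0
wrong = vmax - vmin / vmax + vmin      # 200 - 0.15 + 30 = 229.85
right = (vmax - vmin) / (vmax + vmin)  # 170 / 230, about 0.739
```

The parenthesized version is always between 0 and 1 (for non-negative luminance), so multiplying it by 100 can never exceed 100.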
Now my problem is that I can’t get to the MTF formula. If I apply MTF(f) = 100% × C(f) / C(0), I get exactly the same graph. My maths are rusty, and maybe I’m missing something somewhere.
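For what it’s worth, my understanding is that C(0) is a single constant (the contrast at the lowest frequency, or of the unprinted target), so dividing by it only rescales the curve — the shape staying the same might actually be expected. A sketch with made-up numbers:

```python
import numpy as np

# Hypothetical contrast values, ordered from lowest to highest frequency.
contrast = np.array([0.95, 0.93, 0.85, 0.60, 0.30, 0.10])

# C(0) is one constant, not a per-frequency value. Dividing by it pins
# the low-frequency end of the curve at 100% without changing its shape.
c0 = contrast[0]
mtf = 100.0 * contrast / c0
```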
What I’m trying to achieve is an easy-to-understand sharpness score. MTF is a good starting point, but in the end I would like to produce a single number, like an average. I know it’s not accurate, because the center is sharper than the edges, but at least it would allow ranking sharpness in a more or less neutral way.
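To make concrete what I mean by a single score (made-up numbers; MTF50 is a common single-number summary of an MTF curve, and I believe it is one of the metrics Imatest reports):

```python
import numpy as np

# Made-up MTF curve: frequencies and MTF percentages.
freqs = np.linspace(0.0, 0.5, 6)  # cycles per pixel
mtf = np.array([100.0, 95.0, 80.0, 55.0, 25.0, 8.0])

# Simplest score: the average MTF over the measured frequencies.
score = mtf.mean()  # 60.5 with these numbers

# A common alternative: MTF50, the first frequency at which the
# curve drops below 50%.
mtf50 = freqs[np.argmax(mtf < 50.0)]  # 0.4 cycles/pixel here
```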
Any idea on how I can build the MTF chart or how I could assign a final score?