Visually, that looks basically correct to me. In your unadjusted image, on my color-calibrated monitor, the steps between the gray patches come at uneven intervals; in the adjusted image, they are more distinct.
Is the loss of color saturation a natural consequence of applying a gamma curve, and if so, what can be done to compensate for this effect?
No. It is a consequence of not setting correct black and white levels. You should set those before applying the gamma curve.
The gamma curve you applied has the following form:
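In code, a power-law gamma adjustment of this shape can be sketched like this (a minimal sketch, assuming values normalized to [0, 1] and a gamma of 2.2 for illustration; the exact curve you used may differ):

```python
import numpy as np

def apply_gamma(image, gamma=2.2):
    """Apply a power-law gamma curve to a normalized float image.

    image: array of values in [0, 1]; gamma > 1 brightens midtones.
    """
    return np.clip(image, 0.0, 1.0) ** (1.0 / gamma)

# 18% mid-gray (0.18) maps to roughly 0.46 with gamma 2.2,
# which is why midtones bunch up toward the middle of the histogram.
mid = apply_gamma(np.array([0.18]))
```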
The faint line shows the histogram of the image's values. Once you've applied this curve, you get a histogram like this:
This is basically a dull, "flat" curve — that is, even though it isn't numerically linear, it is perceptually close to flat. You can see, though, that the values are all bunched in the middle. That is very functional, and maybe it's what you want for image processing, but in general it's not what we want visually. You may simply want to increase the contrast by dragging in the black and white points, like this:
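Dragging the black and white points in amounts to a linear levels stretch. A minimal sketch of that operation (the cut-off values 30 and 225 here are made up for illustration):

```python
import numpy as np

def stretch_levels(image, black=30, white=225):
    """Linearly remap [black, white] to [0, 255], clipping values outside.

    image: uint8 array; black/white: the new black and white points.
    Anything at or below `black` becomes 0; at or above `white` becomes 255.
    """
    img = image.astype(np.float64)
    out = (img - black) / (white - black) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)
```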
Which gives a picture like this:
… much less washed out.
Your camera may have an adjustable black level, and you may want to raise it slightly. You also need to choose a suitable white level for your camera's conversion. You may want to check the dcraw source to see what it does by default. (Spoiler: dcraw sets the white level at the 99th percentile of the histogram.)
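The percentile idea can be sketched in a few lines (an illustration of the heuristic, not dcraw's actual code):

```python
import numpy as np

def auto_white_level(raw_values, percentile=99):
    """Pick a white level so the brightest ~1% of pixels clip to white.

    raw_values: flat array of raw sensor values.
    """
    return np.percentile(raw_values, percentile)

# With values 0..100, the 99th percentile lands at 99, so the
# brightest handful of pixels would clip to full white.
level = auto_white_level(np.arange(101))
```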
Incidentally, this result has a histogram like this:
which, as you can see, extends to the extremes of the range. Since I'm doing this on an 8-bit image in an 8-bit space, you can also see that the values are getting sparse. For "real work" you would want to operate at a higher bit depth (and preferably apply a single combined transformation rather than a series of them).
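That sparseness is easy to demonstrate: pushing all 256 possible 8-bit values through a curve and rounding back to 8 bits leaves fewer than 256 distinct levels, so gaps open up in the histogram. A quick check (using a gamma of 2.2 as the example curve):

```python
import numpy as np

# All 256 possible 8-bit input values, normalized to [0, 1].
levels = np.arange(256, dtype=np.float64) / 255.0

# Apply a gamma curve and quantize back to 8 bits.
curved = (levels ** (1.0 / 2.2) * 255.0).round().astype(np.uint8)

# Fewer than 256 distinct output values survive the round trip:
# the compressed highlight region merges neighboring inputs.
distinct = len(np.unique(curved))
```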
This is still a bit flat, though. For visual purposes, we may want to apply an s-curve to give it more punch:
… or something like that.
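One common way to sketch an s-curve is the smoothstep function (just one possible choice; assumes values normalized to [0, 1]):

```python
import numpy as np

def s_curve(x):
    """Smoothstep s-curve: darkens shadows, brightens highlights,
    and steepens the midtones while keeping 0 and 1 fixed.
    """
    x = np.clip(x, 0.0, 1.0)
    return x * x * (3.0 - 2.0 * x)
```

The endpoints and the midpoint stay put, so black, white, and mid-gray are unchanged, but values below 0.5 are pushed darker and values above 0.5 are pushed brighter, which is exactly the extra contrast the flat result is missing.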