Reference request: Divergence between random variables after transformation

Let $X$ and $Y$ be random variables with laws $\mu_X$ and $\mu_Y$, and let $d$ be some $f$-divergence (e.g. KL, total variation, Hellinger). Writing $d(X,Y)$ for the divergence between $\mu_X$ and $\mu_Y$, are there known (upper or lower) bounds on $d(g(X),g(Y))$ in terms of $g$ and $d(X,Y)$?
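
For concreteness, the definition I have in mind is the standard one: for convex $f$ with $f(1)=0$, and assuming $\mu_X \ll \mu_Y$,
$$ d(X,Y) = D_f(\mu_X \,\|\, \mu_Y) = \int f\!\left(\frac{d\mu_X}{d\mu_Y}\right) d\mu_Y, $$
so that e.g. $f(t) = t\log t$ gives KL and $f(t) = \tfrac12|t-1|$ gives total variation.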

Of course, the most natural candidate for these bounds would be an expression involving $d(X,Y)$: for example, taking $g$ to be constant shows that there is no lower bound of the form $d(X,Y)\le C\cdot d(g(X),g(Y))$ (in that case $g(X)$ and $g(Y)$ have the same law, so the right-hand side vanishes while $d(X,Y)$ can be positive), but maybe there is an upper bound.

(There is nothing really special about divergences here; $d$ could also be a metric such as Wasserstein. Divergences do have a nice representation via densities, in which case the usual Jacobian change of variables seems like an appealing tool, but I have not gotten far with it yet; a sketch of the first step is below.)
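
Here is a minimal sketch of that first step, under the (restrictive) assumption that $g$ is a diffeomorphism of $\mathbb{R}^n$ and that $X$, $Y$ have densities $p_X$, $p_Y$. The transformed density is
$$ p_{g(X)}(y) = p_X\big(g^{-1}(y)\big)\,\big|\det Dg^{-1}(y)\big|, $$
and the same Jacobian factor appears in $p_{g(Y)}$, so it cancels in the ratio $p_{g(X)}/p_{g(Y)}$; substituting $y = g(x)$ then gives
$$ D_f\big(g(X)\,\|\,g(Y)\big) = \int f\!\left(\frac{p_X(x)}{p_Y(x)}\right) p_Y(x)\,dx = D_f(X\,\|\,Y), $$
i.e. an invertible smooth $g$ leaves the divergence unchanged, so the interesting case is non-injective $g$.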

Clarification: In its most general form, such a bound would of course also depend on $f$. I would be satisfied with interesting bounds that hold for one of the “usual” choices of $f$, such as those mentioned above.