# mg.metric geometry – Banach fixed point theorem / convergence squeeze

I am currently investigating an iterative learning algorithm and its convergence time. If we iterate $$x_{t+1} = g(x_t)$$ and let $$\epsilon := |x_t - x^*|$$ be our desired error bound from the fixed point $$x^*$$, then we have
$$t \geq \ln\left( \frac{\epsilon(1-L)}{|g(x_0) - x_0|} \right) \Big/ \ln(L)$$
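For concreteness, here is a small sanity check of this bound on a toy contraction (the choice $g(x) = x/2 + 1$, with $L = 1/2$ and fixed point $x^* = 2$, is purely illustrative):

```python
import math

def g(x):
    # illustrative contraction: Lipschitz constant L = 0.5, fixed point x* = 2
    return 0.5 * x + 1.0

L = 0.5
x0 = 0.0
eps = 1e-6

# a-priori bound:  t >= ln(eps * (1 - L) / |g(x0) - x0|) / ln(L)
t_bound = math.log(eps * (1 - L) / abs(g(x0) - x0)) / math.log(L)

# actual number of iterations until |x_t - x*| <= eps
x, t = x0, 0
while abs(x - 2.0) > eps:
    x = g(x)
    t += 1

print(math.ceil(t_bound), t)  # → 21 21 for this toy example
```

In this linear case the bound is tight, since $|x_t - x^*|$ shrinks by exactly the factor $L$ each step.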
where $$L < 1$$ is the Lipschitz constant of $$g$$. My question is this: our function is of the form $$g(x) = \frac{1 - s(x)}{C \cdot s(x)}$$ where $$C$$ is some constant and $$s(x)$$ is a function that is not always known explicitly. If I can verify that the unknown $$s(x)$$ is sandwiched between two polynomials, does this guarantee that the convergence time of $$g(x)$$ can be bounded as well? For example, if I prove
$$C_1 x^{k_1} \leq s(x) \leq C_2 x^{k_2}$$
then can I say
$$\text{Conv. time of } C_1 x^{k_1} \leq \text{Conv. time of } s(x) \leq \text{Conv. time of } C_2 x^{k_2}$$
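(where "Conv. time of $h$" means the convergence time of the iteration with $s$ replaced by $h$). This is the kind of numerical check I have in mind; all choices here are purely illustrative and not my real $s$, and the monomial bounds are degenerate ones with $k_1 = k_2 = 0$:

```python
def conv_time(g, x0, x_star, eps=1e-6, max_iter=10_000):
    """Number of iterations of x <- g(x) until |x - x_star| <= eps."""
    x, t = x0, 0
    while abs(x - x_star) > eps and t < max_iter:
        x = g(x)
        t += 1
    return t

C = 4.0  # illustrative constant

def s(x):
    # illustrative s; on [0, 1] it satisfies 1/2 <= s(x) <= 1,
    # i.e. C1 * x**k1 <= s(x) <= C2 * x**k2 with C1=1/2, C2=1, k1=k2=0
    return 1.0 / (1.0 + x)

def s_lower(x):
    return 0.5

def s_upper(x):
    return 1.0

def make_g(s_fn):
    return lambda x: (1.0 - s_fn(x)) / (C * s_fn(x))

# with s(x) = 1/(1+x) and C = 4, g simplifies to g(x) = x/4 (fixed point 0);
# the two bounding iterations are constant maps with fixed points 1/4 and 0
print(conv_time(make_g(s_lower), 1.0, 0.25),
      conv_time(make_g(s),       1.0, 0.0),
      conv_time(make_g(s_upper), 1.0, 0.0))
```

I would be happy with either a proof of the sandwich inequality under suitable hypotheses or a characterization of when it fails.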