Selection principles – can I drop a variable from a logistic regression, score on the other variables, and then add it back in a different model?

What I have is a logistic regression in which the first variable has a large (though not extreme) coefficient, i.e.

ln(p/(1-p)) = C + B1*V1 + B2*V2 + B3*V3 + B4*V4 + B5*V5

Because this coefficient is so large, one strategy is to remove the variable, although we could also try to dampen its influence within the model above. I've considered ridge regression, but I think the following method is much simpler and quicker, if it indeed works.

My question is: can I fit the following model, with the B1*V1 term removed:

ln(p/(1-p)) = C + B2*V2 + B3*V3 + B4*V4 + B5*V5

Then convert the resulting log-odds to a score based on the PDO (points to double the odds) and a base score, and feed that score into a new model alongside the variable that was dropped, i.e.
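For concreteness, the standard PDO scaling I have in mind looks like this (a minimal sketch; the base score of 600 at 50:1 odds and a PDO of 20 are assumed illustrative values, not from any particular scorecard):

```python
import math

def log_odds_to_score(log_odds, base_score=600.0, base_odds=50.0, pdo=20.0):
    """Convert a model's log-odds into a scorecard-style points score.

    `factor` scales the log-odds so the score moves by `pdo` points
    each time the odds double; `offset` anchors `base_score` at
    `base_odds`. Parameter defaults are assumed for illustration.
    """
    factor = pdo / math.log(2)
    offset = base_score - factor * math.log(base_odds)
    # Convention assumed here: log_odds = ln(good/bad), so higher
    # odds of a "good" outcome give a higher score.
    return offset + factor * log_odds
```

By construction, odds of 50:1 map to 600 points and doubling the odds to 100:1 adds exactly one PDO, i.e. 620 points.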

ln(p/(1-p)) = C + B1.1*(score calculated from the first model) + B1*V1

Would this result in the same model? I suspect not, because the iterative fitting process would have fewer coefficients to optimise, and the score from the first model would likely be far more predictive than the single variable we dropped and re-entered in the second model.
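To see why the two approaches generally differ, here is a small simulation (a sketch with made-up coefficients and only three predictors; `fit_logit` is a bare-bones Newton/IRLS fit, not production code). The second-stage model can only scale the reduced model's coefficients jointly, so when the dropped variable is correlated with the others it cannot reproduce the full fit, and its log-likelihood is never better:

```python
import numpy as np

def fit_logit(X, y, iters=50):
    """Bare-bones logistic regression (intercept + X) via Newton/IRLS."""
    X1 = np.column_stack([np.ones(len(X)), X])
    b = np.zeros(X1.shape[1])
    for _ in range(iters):
        eta = np.clip(X1 @ b, -30, 30)
        p = 1.0 / (1.0 + np.exp(-eta))
        w = p * (1.0 - p)
        b = b + np.linalg.solve(X1.T @ (X1 * w[:, None]), X1.T @ (y - p))
    return b

def loglik(X, y, b):
    eta = np.clip(np.column_stack([np.ones(len(X)), X]) @ b, -30, 30)
    p = 1.0 / (1.0 + np.exp(-eta))
    return float(np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)))

# Simulated data: V1 has a large coefficient and is correlated with V2.
rng = np.random.default_rng(0)
n = 5000
v1 = rng.normal(size=n)
v2 = 0.7 * v1 + rng.normal(size=n)      # correlated with the dropped variable
v3 = rng.normal(size=n)
X = np.column_stack([v1, v2, v3])
eta = -0.5 + 2.0 * v1 + 1.0 * v2 - 1.0 * v3    # C, B1, B2, B3 (made up)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

full = fit_logit(X, y)                   # all variables at once
reduced = fit_logit(X[:, 1:], y)         # V1 dropped
score = reduced[0] + X[:, 1:] @ reduced[1:]    # stage-1 linear predictor
two_stage = fit_logit(np.column_stack([score, v1]), y)  # score + V1

# Coefficients on V2/V3 implied by the two-stage model: B1.1 * reduced.
implied = two_stage[1] * reduced[1:]
ll_full = loglik(X, y, full)
ll_two = loglik(np.column_stack([score, v1]), y, two_stage)
```

Because the two-stage predictor is a special case of the full model's linear predictor, `ll_full >= ll_two` always holds, and here the implied V2/V3 coefficients land well away from the full fit's, confirming the two procedures are not equivalent.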