performance – Intentionally using "worse" machines to develop a game?

I am developing a game intended for desktop computers.

I am afraid of designing my game to be too costly in terms of performance, and therefore I am hesitant to develop on a machine that might be more powerful than the target devices.

However, I don't yet know what the target devices will look like when my game is finished in five years.

How do indie developers choose a development machine so that they aren't misled by its performance?

Here is an example to make it clearer:

If a developer had access to a quantum computer, I guess they shouldn't use it; otherwise they would lose the ability to estimate performance on the target devices.

Or is the "developer machine selection" approach not how it's done, and instead there is simply a fixed tri-count budget for everything?

I remember that for RE4, there was simply a hard tri-count limit for meshes.
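For what it's worth, I imagine the hard-limit approach could be as simple as a budget table checked at asset-import time, independent of the development machine. A minimal sketch (all names, categories, and numbers are made up for illustration):

```python
# Hypothetical per-mesh triangle budget check, as it might run in an
# asset-import pipeline. Categories and limits are invented examples.

TRI_BUDGETS = {
    "hero": 30_000,        # main characters get the largest budget
    "prop": 2_000,         # small scenery objects
    "terrain_chunk": 10_000,
}

def check_tri_budget(mesh_name: str, category: str, tri_count: int) -> list[str]:
    """Return a list of warnings; an empty list means the mesh passes."""
    warnings = []
    budget = TRI_BUDGETS.get(category)
    if budget is None:
        warnings.append(f"{mesh_name}: unknown category '{category}'")
    elif tri_count > budget:
        warnings.append(
            f"{mesh_name}: {tri_count} tris exceeds {category} budget of {budget}"
        )
    return warnings

# Example: a prop that blows its budget
print(check_tri_budget("barrel_01", "prop", 3500))
```

The appeal of this style of budget is that it's enforced by the content pipeline, so it holds no matter how fast the developer's own machine happens to be.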

Thank you!