# complexity theory – Is the multiplicative constant in Big O notation ignored because of the Linear Speedup Theorem?

I just want to know whether Big O notation was designed as a consequence of the Linear Speedup Theorem. My guess is that the answer is yes.

For example, if we did not have the Linear Speedup Theorem, would we use a different measure of time/space complexity? That is, multiplicative constants would then make a difference: $$f(n) = 100n$$ would not be equivalent to $$g(n) = 10^{82}\,n$$. In that case Big O notation would not be useful, and we would presumably measure algorithms some other way.
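For context, here is one standard textbook formulation of the theorem for multitape Turing machines (the exact additive term varies between textbooks, so take this as a representative statement rather than the canonical one):

```latex
\[
  L \in \mathrm{TIME}\bigl(f(n)\bigr)
  \;\Longrightarrow\;
  L \in \mathrm{TIME}\bigl(c \cdot f(n) + n + 2\bigr)
  \qquad \text{for every constant } c > 0.
\]
```

Applied to the example above: starting from $$g(n) = 10^{82}\,n$$ and choosing $$c = 10^{-82}$$ yields a machine running in roughly $$n + n + 2 = 2n + 2$$ steps, so the huge constant is provably removable on this machine model, which is one motivation for ignoring constants in the notation.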

To the downvoters: thank you for reading. Please leave a comment below so I can improve the question, or, in the worst case, delete it.