# runtime analysis – How to analyse the worst-case time complexity of this algorithm (a mix of Bubble Sort and Merge Sort)?

Suppose I have a sorting algorithm that sorts a list of integers. When the input size (the number of elements) $$n$$ is odd, it sorts using Bubble Sort; when $$n$$ is even, it uses Merge Sort. How do we perform a worst-case time complexity analysis for this algorithm?
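To make the algorithm concrete, here is a minimal Python sketch of what I have in mind (the function names are mine, and the Bubble/Merge Sort implementations are just the textbook versions):

```python
def bubble_sort(a):
    # Textbook Bubble Sort: O(n^2) comparisons in the worst case.
    a = list(a)
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

def merge_sort(a):
    # Textbook top-down Merge Sort: O(n log n) in all cases.
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

def hybrid_sort(a):
    # The algorithm from the question: dispatch on the parity of n.
    return bubble_sort(a) if len(a) % 2 == 1 else merge_sort(a)
```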

The context in which this question came up is the analysis of the MAX-HEAPIFY algorithm given in CLRS (3rd edition) on page 154. In the worst-case analysis, the author assumes some arbitrary input size $$n$$ and then concludes that the worst case occurs when the bottom-most level of the heap is exactly half full. This threw me off, since in various texts and articles $$n$$ is assumed to be fixed when performing worst-case analysis (and likewise for best- or average-case analysis), and for a fixed $$n$$ the number of elements at the bottom-most level of a heap of $$n$$ nodes is also fixed. In that light, I concocted this algorithm so that the worst case depends on $$n$$ itself.
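For what it's worth, the fullness of the bottom level really is determined by $$n$$: a heap with $$n$$ nodes has $$2^{\lfloor \lg n \rfloor} - 1$$ nodes on the full levels above the bottom, and the remaining nodes sit on the bottom level. A small helper (the name is mine) makes this computable:

```python
def bottom_level_count(n):
    # A heap (complete binary tree) with n >= 1 nodes has full levels
    # 0 .. floor(lg n) - 1, holding 2**floor(lg n) - 1 nodes in total;
    # whatever is left over occupies the bottom-most level.
    full_above = 2 ** (n.bit_length() - 1) - 1
    return n - full_above
```

So the bottom level's occupancy varies with $$n$$, which is exactly why CLRS can speak of "the $$n$$ for which the bottom level is half full" as a worst case over input sizes.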

My intuition tells me that the worst-case time complexity of this algorithm is $$\mathcal{O}(n^2)$$, since that is the worst-case runtime of Bubble Sort. But I want to know the precise mathematical formulation of the worst-case time complexity analysis for an arbitrary algorithm. Any insight would be much appreciated.
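To state my intuition in symbols, the definition of worst-case running time that I have seen (e.g. in CLRS) takes the maximum over all inputs of a fixed size $$n$$, so for this hybrid algorithm I would expect something like:

```latex
T_{\mathrm{worst}}(n) \;=\; \max_{x \,:\, |x| = n} T(x)
\;=\;
\begin{cases}
\Theta(n^2)      & \text{if } n \text{ is odd (Bubble Sort branch)},\\[2pt]
\Theta(n \log n) & \text{if } n \text{ is even (Merge Sort branch)},
\end{cases}
```

which is $$\mathcal{O}(n^2)$$ overall, and this is the formulation I would like to see made rigorous.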