Cheers, I am implementing a hash map using separate chaining with AVL trees as buckets, but I am having trouble proving some time complexities. I need to find the worst-case and average time complexities, both amortized and real, so I have four different complexity combinations (*average time amortized*, *average time real*, *worst-case amortized*, *worst-case real time*). My main problem is that I don’t know in which combination I should count the rehashing that I do.

I’ll now explain my approach in more detail:

When inserting an item, I first hash its key and insert it into the tree at the hashed position. That insertion should take O(log k) time, where k <= n is the number of values inside that bucket, which makes it O(log n) in the worst case. My problem is with the rehash. I set the maximum load factor of the hash map to 0.9, which is a common choice for separate chaining.
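To make the setup concrete, here is a minimal sketch of the bucket insertion described above. The names (`node`, `tree_insert`, `tree_contains`) are hypothetical, and an unbalanced BST stands in for the AVL tree, so the rebalancing rotations are omitted; in the real AVL bucket each insert costs O(log k) for k keys in that bucket.

```c
#include <stdlib.h>

/* One node of a bucket's tree (hypothetical layout; AVL height field omitted). */
typedef struct node {
    int key;
    struct node *left, *right;
} node;

/* Insert key into the bucket's tree; *added is set to 1 if a new node was made.
   Cost is O(height) = O(log k) once AVL balancing is applied. */
node *tree_insert(node *root, int key, int *added) {
    if (!root) {
        node *n = calloc(1, sizeof *n);
        n->key = key;
        *added = 1;
        return n;
    }
    if (key < root->key)
        root->left = tree_insert(root->left, key, added);
    else if (key > root->key)
        root->right = tree_insert(root->right, key, added);
    return root;  /* AVL rebalancing would happen here */
}

/* Standard BST lookup, also O(height). */
int tree_contains(const node *root, int key) {
    while (root) {
        if (key == root->key) return 1;
        root = key < root->key ? root->left : root->right;
    }
    return 0;
}
```

A `hashmap_insert` would hash the key to pick a bucket, call `tree_insert` on that bucket’s root, and trigger the rehash when size/capacity exceeds 0.9.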

In the rehash:

First I allocate a new hash map with double the capacity and create an empty tree for each position, which is one pass over the new capacity. I then traverse the old hash map: for every tree, I visit every node and re-insert it by calling `hashmap_insert` again, so each key hashes to its correct new position. To traverse a tree, I first get its first (leftmost) node, and for every node I call a function `tree_next`, which, given a node, returns its in-order successor in O(log n) worst-case time, where n is the number of elements in that tree. However, a full in-order traversal crosses each tree edge at most twice, so the n calls to `tree_next` cost O(n) in total, i.e. O(1) per call on average over the traversal. When does that O(1) apply, though? When we study the amortized time, or the average? Or when studying their combination (average-time amortized)?
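The traversal cost claimed above can be checked directly. Below is a sketch (hypothetical names; plain BST with parent pointers instead of AVL, which does not affect the argument) where a global `steps` counter tallies every pointer move across a tree edge. A full sweep via `tree_next` crosses each of the k-1 edges at most twice, so `steps` stays within about 2k even though a single `tree_next` call can cost O(log k).

```c
#include <stdlib.h>

/* Node with a parent pointer so tree_next can find the in-order successor. */
typedef struct tnode {
    int key;
    struct tnode *left, *right, *parent;
} tnode;

long steps = 0;  /* counts pointer moves across tree edges */

/* Plain BST insert (stand-in for AVL insert; no rebalancing). */
tnode *bst_insert(tnode *root, tnode *n) {
    tnode *p = NULL, **link = &root;
    while (*link) {
        p = *link;
        link = n->key < p->key ? &p->left : &p->right;
    }
    n->parent = p;
    *link = n;
    return root;
}

/* Leftmost node = first node in in-order. */
tnode *tree_first(tnode *n) {
    while (n && n->left) { n = n->left; steps++; }
    return n;
}

/* In-order successor: O(height) worst case for one call,
   but O(1) amortized over a full traversal. */
tnode *tree_next(tnode *n) {
    if (n->right) {             /* successor = leftmost of right subtree */
        n = n->right; steps++;
        while (n->left) { n = n->left; steps++; }
        return n;
    }
    while (n->parent && n == n->parent->right) { n = n->parent; steps++; }
    steps++;                    /* final move up to the successor (or past the root) */
    return n->parent;
}
```

Running a full traversal of a 7-node tree and asserting `steps <= 2 * 7` passes, which matches the “each edge at most twice” bound; this is why the total rehash traversal work per tree is linear in its size.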

I would love some help! Thanks!