You could take an implementation of an AVL tree in your language of choice and extend the nodes with the information what the minimum and maximum key in their subtree is (i.e. by introducing minkey and maxkey fields).
Maintaining these fields should be fairly easy. For a new node, simply set minkey = maxkey = key. On insert, whenever you make the decision to follow the left or the right path, update the minkey or maxkey field of the current node accordingly. Do the same on delete and during rotations. This way, you'll always have the minimum and maximum of the whole tree available at the root node.
An added benefit is that lookups could be slightly faster: a lookup can be terminated as soon as the searched key is smaller than minkey or greater than maxkey of the current subtree.
But it is not apparent to me that you actually need this: finding the minimum/maximum key in a well-balanced search tree (such as an AVL tree) is O(log n) anyway, so the whole thing may not be worth the effort.
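To make the idea concrete, here is a minimal sketch of the descent-time field update and the early-terminating lookup. It is deliberately a plain BST insert, not a full AVL implementation: the rebalancing and the field fix-up after rotations are omitted, and the `Node`/`insert`/`contains` names are illustrative, not from any particular library.

```cpp
#include <algorithm>

// Each node records the minimum and maximum key of its subtree.
struct Node {
    int key, minkey, maxkey;
    Node *left = nullptr, *right = nullptr;
    explicit Node(int k) : key(k), minkey(k), maxkey(k) {}  // new node: minkey = maxkey = key
};

// Plain BST insert (AVL rebalancing omitted): the fields are updated
// on the way down, at every node whose subtree the new key enters.
Node* insert(Node* node, int key) {
    if (node == nullptr) return new Node(key);
    node->minkey = std::min(node->minkey, key);
    node->maxkey = std::max(node->maxkey, key);
    if (key < node->key)
        node->left = insert(node->left, key);
    else if (key > node->key)
        node->right = insert(node->right, key);
    return node;
}

// Lookup that terminates early once the key falls outside [minkey, maxkey].
bool contains(const Node* node, int key) {
    while (node != nullptr) {
        if (key < node->minkey || key > node->maxkey)
            return false;  // key cannot be anywhere in this subtree
        if (key == node->key) return true;
        node = (key < node->key) ? node->left : node->right;
    }
    return false;
}
```

With this in place, the minimum and maximum of the whole tree are simply `root->minkey` and `root->maxkey`, read in O(1).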
If you need the fast insert/delete of an unordered_map (i.e. a hash map), you can at least reach amortized constant time for min and max by maintaining the following data structure:
class umapminmax {
    unordered_map<int, int> themap;
    int min, max;
};
You must insert and delete only through this class so it can keep track of the minimum and maximum. Whenever the minimum or the maximum key gets deleted, you need to scan all your keys to find the new minimum/maximum. This scan is O(n). But if it doesn't happen too often, the amortized cost of accessing the minimum/maximum will still be constant.
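A fleshed-out sketch of that wrapper might look as follows. The method names and the sentinel initialization are illustrative choices, not part of the original class outline:

```cpp
#include <unordered_map>
#include <algorithm>
#include <limits>

// Hash map wrapper that tracks the minimum and maximum key.
// Insert keeps the fields up to date in O(1); erasing the current
// minimum or maximum triggers a full O(n) rescan.
class umapminmax {
public:
    void insert(int key, int value) {
        themap[key] = value;
        min_ = std::min(min_, key);
        max_ = std::max(max_, key);
    }
    void erase(int key) {
        themap.erase(key);
        if (key == min_ || key == max_) rescan();  // O(n), hopefully rare
    }
    int min() const { return min_; }
    int max() const { return max_; }
private:
    void rescan() {
        min_ = std::numeric_limits<int>::max();
        max_ = std::numeric_limits<int>::min();
        for (const auto& kv : themap) {
            min_ = std::min(min_, kv.first);
            max_ = std::max(max_, kv.first);
        }
    }
    std::unordered_map<int, int> themap;
    int min_ = std::numeric_limits<int>::max();  // sentinels for the empty map
    int max_ = std::numeric_limits<int>::min();
};
```

Note that min()/max() are only meaningful while the map is non-empty; handling the empty case is left out of the sketch.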
If you know, OTOH, that the minimum/maximum key is likely to be removed often, then the O(log n) solution with a search tree is probably the fastest you can get.
Still, from how you describe your application, even the following is imaginable:
- The maximum is removed often, but a new maximum frequently gets inserted before you next ask for the current maximum.
If this is the case, you can apply a bit more logic: say the maximum key gets deleted. Instead of scanning the map for the new maximum, set a flag (another new field) indicating that the stored maximum is no longer valid. Then, when a value with a key greater than or equal to the invalid maximum happens to be inserted, you can record that key as the new maximum and reset the flag so that it indicates the maximum is valid again.
When retrieving the maximum, however, you must consult the flag and perform the O(n) scan if it says the stored maximum is invalid.
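The lazy variant for the maximum could be sketched like this (again, the class and method names are illustrative; the same pattern applies symmetrically to the minimum):

```cpp
#include <unordered_map>
#include <algorithm>
#include <limits>

// Hash map wrapper with a lazily maintained maximum: deleting the
// maximum only marks it stale. A later insert of a key >= the stale
// maximum revalidates it for free; the O(n) scan is deferred until the
// maximum is actually requested while the flag is still set.
class lazymaxmap {
public:
    void insert(int key, int value) {
        themap[key] = value;
        if (key >= max_) {       // also revalidates a stale maximum
            max_ = key;
            max_valid_ = true;
        }
    }
    void erase(int key) {
        themap.erase(key);
        if (key == max_) max_valid_ = false;  // defer the rescan
    }
    int max() {
        if (!max_valid_) {       // pay the O(n) cost only now
            max_ = std::numeric_limits<int>::min();
            for (const auto& kv : themap) max_ = std::max(max_, kv.first);
            max_valid_ = true;
        }
        return max_;
    }
private:
    std::unordered_map<int, int> themap;
    int max_ = std::numeric_limits<int>::min();
    bool max_valid_ = true;
};
```

In the favorable workload described above — delete the maximum, then insert a new maximum before the next query — the scan never actually runs.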
You see, you really need a model implementation and some statistics gathered from real-world data to decide which variant pays off.