In Part 1, I showed how you can increment a “binary” counter in `O(1)` time if you use a redundant number system. Now I am going to show you how you can apply this simple abstraction to deamortize Binomial Heaps (here is the Wikipedia page for Binomial Heaps if you want a refresher).

Where does the amortization come in for Binomial Heaps? Well, all the usual time bounds are indeed stated as worst-case bounds. But, as is commonly taught, you can get a sharper bound for Insert if you do an amortized analysis. The argument is particularly simple if we use the binary counter abstraction.

We will encode the configuration of the binomial trees using a binary counter `d_{log n}…d_1d_0`. We set `d_i` to be 1 iff a binomial tree of rank `i` is present. Inserting into a binomial heap is now precisely incrementing this counter. Linking two binomial trees of rank `i` to form a binomial tree of rank `i+1` is exactly converting a 2 in `d_i` into a 1 in `d_{i+1}`. I guess there must be a reason why it’s called a binomial heap.
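To make the correspondence concrete, here is a minimal sketch (my own illustration, not code from the post) of a binomial heap whose spine literally is a binary counter: slot `i` holds the rank-`i` tree or `None` — exactly the digit `d_i` — and `insert` is the usual increment loop, with linking playing the role of the carry. The names `Tree`, `link`, and `insert` are mine.

```python
class Tree:
    """Binomial tree: a root plus child trees of ranks rank-1, ..., 0."""
    def __init__(self, key, children=None):
        self.key = key
        self.children = children or []

    @property
    def rank(self):
        return len(self.children)

def link(a, b):
    """Link two trees of equal rank into one of the next rank (min key on top)."""
    if b.key < a.key:
        a, b = b, a
    return Tree(a.key, a.children + [b])  # rank grows by exactly one

def insert(trees, key):
    """Insert = increment: carry (link) while the current digit is 1."""
    carry = Tree(key)                     # the new rank-0 tree is the incoming 1
    i = 0
    while i < len(trees) and trees[i] is not None:
        carry = link(carry, trees[i])     # two 1s at digit i become a carry
        trees[i] = None                   # digit i resets to 0
        i += 1
    if i == len(trees):
        trees.append(None)
    trees[i] = carry                      # the first 0 turns into a 1
```

After `n` insertions, the occupied slots spell out the binary representation of `n`, least significant digit first.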

The amortized analysis can be done using the credit invariant method. We argue that over any sequence of `n` insertions, the digits change at most `2n` times. For each rank (digit), we open a bank account. Each insertion pays two dollars: one spent immediately on the actual insertion, and one deposited into the account of the first 0 that turns into a 1—observe that exactly one 0 turns into a 1 in each increment, since the carry propagation stops immediately afterwards. The dollar deposited in the account of this 1 will be used to turn it back into a 0 during a future insertion.
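The accounting is easy to check empirically. The following throwaway simulation (mine, not the author's) counts every digit change over `n` increments of a plain LSB-first binary counter and verifies that the running total never exceeds `2n`.

```python
def increment(digits):
    """Increment an LSB-first binary counter; return the number of digit changes."""
    changes = 0
    i = 0
    while i < len(digits) and digits[i] == 1:
        digits[i] = 0        # a 1 turns back into a 0, paid by its deposited dollar
        changes += 1
        i += 1
    if i == len(digits):
        digits.append(0)
    digits[i] = 1            # exactly one 0 turns into a 1 per increment
    return changes + 1

digits, total = [], 0
for n in range(1, 1001):
    total += increment(digits)
    assert total <= 2 * n    # the credit invariant bound holds at every step
```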

One can check that we always have enough money to pay for all digit changes during any insertion: every 1-to-0 flip is paid for by the dollar deposited when that 1 was created. As we never go bankrupt, we conclude that 2 dollars per insertion is enough. In other words, insertion actually takes “only” amortized `O(1)` time.

Equipped with our technology from Part 1, we can now implement insertion in worst-case `O(1)` time simply by using a redundant counter. Indeed, we will allow two binomial trees of rank `i` to coexist for a while, until the counter needs to link the two when fixing its regularity. Simple enough.
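Here is how I would sketch the deamortized insert, under my assumptions about the Part 1 counter: each digit `d_i` takes values in {0, 1, 2}, represented as a bucket of at most two rank-`i` trees, and every insert first “fixes” the lowest 2 (one link) and then adds a new rank-0 tree, so no carry cascade can ever occur. The names below (`fix_lowest_two`, `insert_deamortized`) are mine, and I scan for the lowest 2 for brevity—a real implementation would track the positions of the 2s (e.g., on a stack) so that the fix is genuinely `O(1)`.

```python
class Tree:
    """Binomial tree: a root plus child trees of ranks rank-1, ..., 0."""
    def __init__(self, key, children=None):
        self.key = key
        self.children = children or []

def link(a, b):
    """Link two trees of equal rank into one of the next rank (min key on top)."""
    if b.key < a.key:
        a, b = b, a
    return Tree(a.key, a.children + [b])

def fix_lowest_two(slots):
    """Turn the lowest digit 2 into a 0, carrying one linked tree upward.
    (The scan is O(log n); see the lead-in for how to make this O(1).)"""
    for i, bucket in enumerate(slots):
        if len(bucket) == 2:
            a, b = bucket
            bucket.clear()                    # d_i: 2 -> 0
            if i + 1 == len(slots):
                slots.append([])
            slots[i + 1].append(link(a, b))   # d_{i+1} gains a 1
            return

def insert_deamortized(slots, key):
    """Worst-case insert: at most one link, then d_0 += 1."""
    fix_lowest_two(slots)
    if not slots:
        slots.append([])
    slots[0].append(Tree(key))                # the new rank-0 tree
```

Fixing the lowest 2 before adding guarantees `d_0 ≤ 1` at the moment of insertion, so the new tree never pushes any digit past 2.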

For those of you who want to know how redundant counters were invented in the first place, see

- Michael J. Clancy and Donald E. Knuth. *A Programming and Problem-Solving Seminar*. Technical Report STAN-CS-77-606, Computer Science Department, Stanford University, April 1977.

Note that over the years many extensions to redundant counters have been developed, and in fact the terminology used in this series is “modern”. We will talk more about those in the future.