
From Tower of Hanoi to Counting Bits


By Susam Pal on 25 Sep 2011

Tower of Hanoi

A few weeks ago, I watched Rise of the Planet of the Apes. The movie showed a genetically engineered chimpanzee trying to solve a puzzle involving four discs, initially stacked in ascending order of size on one of three pegs. The chimpanzee was supposed to transfer the entire stack to one of the other pegs, moving only one disc at a time, and never placing a larger disc on a smaller one.

The problem was called the Lucas' Tower in the movie. I have always known this problem as the Tower of Hanoi puzzle. The minimum number of moves required to solve the problem is \( 2^n - 1 \), where \( n \) is the number of discs. In the movie, the chimpanzee solved the problem in 15 moves, the minimum number of moves required when there are 4 discs.

Referring to the problem as the Lucas' Tower made me wonder why it was called so instead of calling it the Tower of Hanoi. I guessed it was probably because the puzzle was invented by the French mathematician Édouard Lucas. Later when I checked the Wikipedia article on this topic, I realized I was right about this. In fact, the article mentioned that there is another version of this problem known as the Tower of Brahma that involves 64 discs made of pure gold and three diamond needles. According to a legend, a group of Brahmin priests are working at the problem and the world will end when the last move of the puzzle is completed. Now, even if they make one move every second, it'll take 18 446 744 073 709 551 615 seconds to complete the puzzle. That's about 585 billion years. The article also had this nice animation of a solution involving four discs.

[Figure: Animated solution of the Tower of Hanoi puzzle, created by André Karwath (original source)]

I'll not discuss the solution of this puzzle in this blog post. There are plenty of articles on the web, including the Wikipedia article, that describe why it takes a minimum of \( 2^n - 1 \) moves to solve the puzzle when there are \( n \) discs involved. In this post, I'll talk about an interesting result I discovered while playing with this puzzle one afternoon.

Binary Numbers

If we denote the minimum number of moves required to solve the Tower of Hanoi puzzle with \( n \) discs as \( T_n \), then \( T_n \), when expressed in binary, is the largest possible \( n \)-bit integer. For example, \( T_4 = 15_{10} = 1111_2 \). That makes sense because \( T_n = 2^n - 1 \) is indeed the maximum possible \( n \)-bit integer, with all \( n \) bits set to 1.
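
To see the pattern concretely, here is a quick Python check of the claim above (my own snippet, not part of the original post):

```python
# T_n = 2**n - 1 is the all-ones n-bit integer: its binary form is n ones.
for n in range(1, 6):
    t = 2 ** n - 1
    print(n, t, bin(t))
# Prints:
# 1 1 0b1
# 2 3 0b11
# 3 7 0b111
# 4 15 0b1111
# 5 31 0b11111
```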

While playing with the values of \( T_n \) for different values of \( n \), I stumbled upon an interesting result which I will pose as a problem in a later section below.

Assumptions

Before proceeding to the problem, let us define the bit-length of an integer to eliminate any possibility of ambiguity:

  • A positive integer \( x \) is said to be an \( n \)-bit integer if and only if the minimum number of bits required to express the integer is \( n \), or equivalently, \( \lfloor \log_2 x \rfloor + 1 = n \).

We will be dealing with arbitrary precision integers (bignums) in the problem, so let us also make a few assumptions:

  • Addition or subtraction of an \( m \)-bit integer and an \( n \)-bit integer (\( m \le n \)) takes \( O(n) \) time.
  • Counting the number of 1-bits in an \( n \)-bit integer takes \( O(n) \) time.

The definition, along with the assumptions, leads to the following conclusions (a short Python illustration follows the list):

  • Adding or subtracting two integers \( a \) and \( b \) takes \( O(\log(\max(a, b))) \) time.
  • Counting the number of 1-bits in an integer \( a \) takes \( O(\log a) \) time.
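
In Python, for instance, both primitives assumed above are available directly on arbitrary-precision integers; here is a minimal sketch (note that int.bit_count() exists only on Python 3.10 and later, so a bin()-based count is shown as well):

```python
# Bit-length and 1-bit count of an arbitrary-precision integer in Python.
x = 2 ** 100 - 1             # a 100-bit integer with all 100 bits set to 1

print(x.bit_length())        # 100, i.e. floor(log2(x)) + 1
print(bin(x).count("1"))     # 100; works on any Python 3 version
print(x.bit_count())         # 100; Python 3.10+ only
```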

Binary Puzzle

What is the most efficient way to compute the number of 1-bits in

\[ T_1 + T_2 + \cdots + T_n \]

where \( n \) is a positive integer, each \( T_i = 2^i - 1 \) for integers \( 1 \le i \le n \), and efficiency is measured in terms of time and space complexity?

The naive approach involves adding all the \( n \) integers and counting the number of 1-bits in the sum. It takes \( O(n^2) \) time to add the \( n \) integers. The sum is an \( (n+1) \)-bit integer, so it takes \( O(n) \) time to count the number of 1-bits in the sum. Since the sum is \( n + 1 \) bits long, it takes \( O(n) \) memory to store the sum. If \( n \) is as large as, say, \( 2^{64} \), it takes 2 exbibytes plus one more bit of memory to store the sum.
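
For reference, here is a minimal Python sketch of that naive approach (the function name is mine, not from the original post):

```python
def naive_bitcount_of_sum(n):
    """Add T_1 + T_2 + ... + T_n directly, then count the 1-bits in the sum."""
    total = 0
    for i in range(1, n + 1):
        total += 2 ** i - 1           # T_i = 2**i - 1
    return bin(total).count("1")      # the sum is n + 1 bits long

print(naive_bitcount_of_sum(4))       # 1 + 3 + 7 + 15 = 26 = 11010_2, so 3
```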

We can find a much more efficient solution by looking at what the binary representation of the sum looks like. We first arrive at a closed-form expression for the sum:

\[
\begin{aligned}
T_1 + T_2 + \cdots + T_n &= (2 - 1) + (2^2 - 1) + \cdots + (2^n - 1) \\
&= (2 + 2^2 + \cdots + 2^n) - n \\
&= (2^{n+1} - 2) - n \\
&= (2^{n+1} - 1) - (n + 1).
\end{aligned}
\]

Now \( 2^{n+1} - 1 \) is an \( (n+1) \)-bit number with all its bits set to 1. Subtracting \( n + 1 \) from it is therefore equivalent to performing the following operation on their binary representations: for each 1-bit in \( n + 1 \), set the corresponding bit in \( 2^{n+1} - 1 \) to 0. No borrows ever occur because every bit of the minuend is 1. For example, with \( n = 4 \), we have \( 2^5 - 1 = 11111_2 \) and \( n + 1 = 5 = 101_2 \), so the difference is \( 11010_2 = 26 \), which is \( 11111_2 \) with the two 1-bits of 5 cleared.
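
This bit-clearing view is easy to verify with Python's bitwise operators; the snippet below is my own illustration of the identity, not part of the original derivation:

```python
# Subtracting n + 1 from the all-ones number 2**(n+1) - 1 simply clears, in
# that number, every bit that is set in n + 1; no borrows can occur because
# every bit of the minuend is 1.
for n in range(1, 64):
    all_ones = 2 ** (n + 1) - 1
    assert all_ones - (n + 1) == all_ones & ~(n + 1)
```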

If we use the notation bitcount(n) to represent the number of 1-bits in the binary representation of a positive integer n, then we get

\[ \operatorname{bitcount}(T_1 + T_2 + \cdots + T_n) = (n + 1) - \operatorname{bitcount}(n + 1). \]

Now the computation involves counting the number of 1-bits in \( n + 1 \), which takes \( O(\log n) \) time, and subtracting this count from \( n + 1 \), which also takes \( O(\log n) \) time. Further, the largest number that we keep in memory is \( n + 1 \), which occupies \( O(\log n) \) space. Therefore, the entire problem can be solved in \( O(\log n) \) time with \( O(\log n) \) space.

For \( n = 2^{64} \), what would have taken 2 exbibytes and 1 bit of memory with the naive approach now requires only 8 bytes and 1 bit of memory.
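
Here is a minimal Python sketch of the fast method (again with names of my own choosing), cross-checked against direct summation for small values of \( n \):

```python
def fast_bitcount_of_sum(n):
    """bitcount(T_1 + ... + T_n) = (n + 1) - bitcount(n + 1)."""
    m = n + 1
    return m - bin(m).count("1")      # O(log n) time and space

# Cross-check against direct summation for small n.
for n in range(1, 200):
    direct = bin(sum(2 ** i - 1 for i in range(1, n + 1))).count("1")
    assert fast_bitcount_of_sum(n) == direct

print(fast_bitcount_of_sum(2 ** 64))  # instant even for n = 2**64
```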

