Problem 11

This is the archived version of this course from the Spring 2013 semester. You might find something more recent by visiting my teaching page.

Unweighting a die

Due: January 18
Points: 5-7

You are playing a game in which you repeatedly have three choices, A, B, or C, and you are supposed to pick one of them at random by rolling a die. As a 3-sided die is geometrically impossible, you will have to "simulate" the 3-sided die by doing some other kind of random process. For example, if you had a fair six-sided die, you could simply roll that once, then take 1 or 2 to mean "A", 3 or 4 to mean "B", and 5 or 6 to mean "C". Since each of these three outcomes would be equally likely, the simulation works.

But instead, you are given a "weighted" six-sided die, with the following probabilities of rolling each number from 1-6:

Side    Probability of rolling it
1       1/6
2       1/9
3       1/9
4       1/9
5       1/4
6       1/4

First, calculate the entropies involved.

Deliverable 1 (1 point): Calculate the entropy in one roll of a fair 3-sided die, and in one roll of the weighted 6-sided die.
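As a sanity check on your by-hand calculation, Shannon entropy is easy to compute numerically. Here is a sketch in Python (the function and variable names are my own):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution
    given as a list of probabilities summing to 1."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair 3-sided die: each side has probability 1/3,
# so the entropy is exactly log2(3) bits.
fair_three_sided = [1/3, 1/3, 1/3]

# The weighted six-sided die from the table above.
weighted_six_sided = [1/6, 1/9, 1/9, 1/9, 1/4, 1/4]
```

Calling `shannon_entropy` on each list should agree with your hand calculation.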

Next, develop an algorithm to simulate a "fair" 3-sided die by repeatedly rolling this weighted 6-sided die. It is OK if you (sometimes) need more than one roll of the weighted die to give one roll of the 3-sided one.
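To illustrate the kind of algorithm being asked for, here is one simple but deliberately wasteful baseline, sketched in Python (names are my own, and this is not the algorithm you should submit): sides 2, 3, and 4 each come up with probability 1/9, so if we throw away every other roll, the surviving rolls are uniform over three outcomes.

```python
import random

# The weighted die from the table: sides 1..6 with these probabilities.
WEIGHTS = [1/6, 1/9, 1/9, 1/9, 1/4, 1/4]

def roll_weighted():
    """One roll of the weighted six-sided die; returns 1..6."""
    return random.choices(range(1, 7), weights=WEIGHTS)[0]

def fair_three(roll=roll_weighted):
    """Simulate a fair 3-sided die by rejection sampling.

    Sides 2, 3, and 4 are equally likely (1/9 each), so
    conditioned on landing in {2, 3, 4} the result is uniform;
    any other side is discarded and we roll again.
    """
    while True:
        r = roll()
        if r in (2, 3, 4):
            return "ABC"[r - 2]
```

The acceptance probability here is 3 × 1/9 = 1/3, so this baseline expects 3 weighted rolls per output, which wastes a great deal of entropy. Your algorithm should do better.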

Deliverable 2 (2 points): Give pseudocode for your algorithm, and show why it is correct.

Finally, show how efficiently your algorithm makes use of its randomness. To do this, first determine the expected number of weighted-die rolls that your algorithm makes per "fair" output. Multiply this by the entropy in each weighted-die roll to get the amount of entropy your algorithm uses per output, on average. To compute the entropy loss, subtract the entropy out from the entropy in, and divide by the entropy in.

For example, if the entropy in each fair 3-sided roll output is 2 bits (it isn't), and the entropy in each weighted 6-sided roll input is 3 bits (it isn't), and the algorithm needs 4 weighted rolls on average to compute each output (it shouldn't), then the entropy loss is \(\frac{12-2}{12}\), a little more than 83%.
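The arithmetic in that example can be packaged as a small helper (a sketch; the name and parameters are my own):

```python
def entropy_loss(h_in_per_roll, rolls_per_output, h_out):
    """Fraction of input entropy wasted per simulated output.

    h_in_per_roll:    entropy (bits) of one weighted-die roll
    rolls_per_output: expected weighted rolls per fair output
    h_out:            entropy (bits) of one fair output
    """
    h_in = h_in_per_roll * rolls_per_output
    return (h_in - h_out) / h_in

# The (made-up) numbers from the example above:
entropy_loss(3, 4, 2)  # (12 - 2) / 12, a little more than 83%
```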

Deliverable 3 (2 points): Calculation of the entropy loss for your algorithm.

Of course, you should try to make the entropy loss as small as possible.

Bonus points: 1 point for entropy loss less than 50%, 2 for entropy loss less than 25%.