Coin-Flipping Magic
Nadia Benbernou
Erik D. Demaine
Martin L. Demaine
Benjamin Rossman
Preliminary draft. The latest version of this paper can be found at http://erikdemaine.org/papers/CoinFlipping/
MIT Computer Science and Artificial Intelligence Laboratory, 32 Vassar St., Cambridge, MA 02139, USA, {nbenbern,
edemaine,mdemaine,brossman}@mit.edu
THH, HTT, THT, TTH, TTT). How do we navigate to two of these states (HHH or TTT) using just three
moves (and often fewer)? Motivated by this simple question, this paper studies several generalizations and
variations of this magic trick.
Equalizing four coins:
1. Flip the first coin.
2. Flip the second coin.
3. Flip the first coin.
4. Flip the third coin.
5. Flip the first coin.
6. Flip the second coin.
7. Flip the first coin.
Figure 2: Equalizing four coins with at most seven flips.
Results. We begin in Section 2 with a simple generalization of the trick to n coins. This generalization, and the original trick, turn out to be easy to analyze: they are equivalent to Hamiltonian paths in an (n - 1)-dimensional hypercube graph, and the classic Gray code gives one such solution. Not surprisingly, the number of moves required grows exponentially with n. More interesting is that we save an entire factor of two by having two goal states, all heads and all tails. Namely, the worst-case optimal number of blindfolded flips is 2^{n-1} - 1. The analysis also easily generalizes to k-sided dice instead of coins: in Section 3, we show that the worst-case optimal number of blindfolded operations is k^{n-1} - 1. This family of tricks is thus really most impressive for n = 3 coins, where the number 3 of flips is really quite small; beyond n = 3 or k = 2, the number of flips grows quickly beyond feasibility. For sake of illustration, however, Figure 2 shows a magic-trick sequence for four coins.
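As a quick sanity check of Figure 2, the seven-flip sequence can be verified by brute force over all 2^4 starting configurations. The short Python sketch below is our illustration (not code from the paper); it confirms that every start reaches all-heads or all-tails within seven flips, and that some start really needs all seven.

# Brute-force check of the Figure 2 sequence: flip coins 1, 2, 1, 3, 1, 2, 1 (1-indexed).
# A configuration of 4 coins is a 4-bit integer; bit i = 1 means coin i+1 shows tails.
SEQUENCE = [1, 2, 1, 3, 1, 2, 1]

def equalized(state, n=4):
    return state == 0 or state == (1 << n) - 1      # all heads or all tails

def flips_needed(start, moves, n=4):
    """Number of flips performed before the coins are all the same."""
    state = start
    for step, coin in enumerate(moves):
        if equalized(state, n):
            return step
        state ^= 1 << (coin - 1)                    # flip the chosen coin
    assert equalized(state, n)
    return len(moves)

worst = max(flips_needed(s, SEQUENCE) for s in range(16))
print("worst case:", worst)                         # prints 7 = 2^(4-1) - 1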
Balancing six coins:
1. Flip the second coin.
2. Flip the fifth coin.
3. Flip the third coin.
4. Flip the first coin.
5. Flip the fourth coin.
Figure 3: Equalizing the numbers of heads and tails in six coins using at most five flips.
One solution to this exponential growth is to change the goal from the two all-heads and all-tails states to some larger collection of states. In Section 4, we consider a natural such goal: equalize the number of heads and tails. This goal is exponentially easier to achieve. Within just n - 1 coin flips, the magician can force the numbers of heads and tails to be equal. The algorithm is simple: just flip one coin at a time, in any order, until the goal has been reached. Figure 3 shows an example for n = 6. Although not obvious, this algorithm will succeed before every coin has been flipped once. Furthermore, by flipping each coin independently at random in each move, the magician expects to require only around √n moves. The practicality of this type of trick clearly scales to much larger n.
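To illustrate the deterministic strategy just described, the following Python sketch (ours, not from the paper) flips one coin at a time in a fixed order and stops as soon as the numbers of heads and tails agree; exhausting all 2^6 starts for n = 6 shows the worst case is n - 1 = 5 flips, matching the bound claimed above.

from itertools import product

def flips_to_balance(coins, order):
    """Flip coins one at a time in the given order; return the number of flips
    performed before #heads equals #tails (0 if already balanced)."""
    coins = list(coins)
    n = len(coins)
    for step, i in enumerate(order):
        if 2 * sum(coins) == n:          # balanced: as many 1s (heads) as 0s (tails)
            return step
        coins[i] = 1 - coins[i]          # flip coin i
    assert 2 * sum(coins) == n
    return len(order)

n = 6
order = [1, 4, 2, 0, 3]                  # Figure 3's order (0-indexed); any order works
worst = max(flips_to_balance(c, order) for c in product([0, 1], repeat=n))
print("worst case for n = 6:", worst)    # prints 5 = n - 1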
Varying number of coins flipped:
1. Flip the left and middle coins.
2. Flip the left coin.
3. Flip the left and middle coins.
Exactly two coins flipped:
1. Flip the left and middle coins.
2. Flip the middle and right coins.
3. Flip the left and middle coins.
Figure 4: Alternate solutions to equalizing three coins with at most three moves, flipping more than one coin in some moves.
Returning to the goal of equalizing all the coins, the next generalization we consider is to allow the magician to flip more than one coin at once, between asking whether the coins are yet all the same. This flexibility cannot help the magician to equalize the coins any faster than 2^{n-1} - 1 moves, but it can help obscure what the magician is doing. For example, if the magician had to repeat the three-coin trick several times, it might help to try some of the variations in Figure 4. Under what conditions can the magician still equalize n coins, ideally in the same number 2^{n-1} - 1 of moves? Obviously, flipping n - 1 of the coins is equivalent to flipping just one coin. On the negative side, in Section 5, we show that the sequence of flips cannot vary arbitrarily: if the spectator is allowed to choose how many coins the magician should flip in each move, then the magician can succeed only if n ≤ 4 (no matter how many moves are permitted). On the positive side, in Section 6, we show that it is possible to flip most fixed numbers of coins in every move and achieve the optimal worst case of 2^{n-1} - 1 moves. This result is fairly technical but interesting in the way that it generalizes Gray codes.
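The "exactly two coins" solution in Figure 4 can also be found mechanically. For three coins there are only 3^3 = 27 sequences of three moves that each flip exactly two coins, so a brute-force search (a small illustration of ours, not code from the paper) can list every sequence that equalizes all starting configurations and confirm that Figure 4's is among them.

from itertools import combinations, product

def solves(moves, n):
    """True if the move sequence equalizes every starting configuration of n coins."""
    all_tails = (1 << n) - 1
    for start in range(1 << n):
        state, done = start, start in (0, all_tails)
        for move in moves:
            if done:
                break
            state ^= move                              # flip the coins in this move
            done = state in (0, all_tails)
        if not done:
            return False
    return True

n = 3
pair_masks = [(1 << i) | (1 << j) for i, j in combinations(range(n), 2)]
solutions = [seq for seq in product(pair_masks, repeat=3) if solves(seq, n)]
figure4 = (0b011, 0b110, 0b011)          # {left, middle}, {middle, right}, {left, middle}
print(len(solutions), figure4 in solutions)            # Figure 4's sequence is found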
Equalizing four rotating coins:
1. Flip the north and south coins.
2. Flip the north and east coins.
3. Flip the north and south coins.
4. Flip the north coin.
5. Flip the north and south coins.
6. Flip the north and east coins.
7. Flip the north and south coins.
Figure 5: Equalizing four coins that the
spectator can rotate at each stage, using at
most seven moves.
The final variation we consider allows the spectator to re-arrange the coins in certain ways after each move. Again this can only make the magician's job harder. In each move, the magician specifies a subset of coins to flip, but before the spectator actually flips them, the spectator can re-arrange the coins according to certain permutations. Then, after flipping these coins, the spectator reveals whether all coins are the same, and if not, the trick continues. In Section 7, we characterize the exact group structures of allowed re-arrangements that still permit the magician to equalize the coins. For example, if 2^k coins are arranged on a table, then the spectator can rotate and/or flip the table in each move, and still the magician can perform the trick. Figure 5 shows the solution for four coins arranged in a square which the spectator can rotate by 0°, 90°, 180°, or 270°. (Interestingly, the optimal number of moves for a specific sequence of coins gives rise to the notion of word depth, now studied in the context of linear codes in information theory [Etz97, LFW00].)
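Figure 5 can be checked exhaustively: simulate every starting configuration, let the spectator rotate adversarially before each flip, and require that every branch reaches an all-equal configuration at some point. The Python sketch below is our illustration (the paper gives no code) of that check for the four-coin square.

from functools import lru_cache

N = 4                                    # coins at positions north, east, south, west
ALL_SAME = (0, (1 << N) - 1)
# Figure 5's moves as bit masks over positions (N, E, S, W) = bits (0, 1, 2, 3).
NS, NE, N_ONLY = 0b0101, 0b0011, 0b0001
MOVES = (NS, NE, NS, N_ONLY, NS, NE, NS)

def rotate(state, r):
    """Rotate the table by r quarter turns (cyclic shift of the 4 position bits)."""
    return ((state << r) | (state >> (N - r))) & ((1 << N) - 1) if r else state

@lru_cache(maxsize=None)
def magician_wins(state, step):
    """True if the remaining moves force an all-equal configuration no matter
    how the spectator rotates before each flip."""
    if state in ALL_SAME:
        return True
    if step == len(MOVES):
        return False
    return all(magician_wins(rotate(state, r) ^ MOVES[step], step + 1)
               for r in range(4))

print(all(magician_wins(s, 0) for s in range(1 << N)))   # prints True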
Our general scenario considers an arbitrary group of permutations, instead of just a rotating circular table. This scenario was also essentially solved by Ehrenborg and Skinner [ES95]: they characterize performability in terms of the chain structure of the group. Our characterization is simpler: the group must have a number of elements equal to an exact power of two. Our proof is also simpler, showing a connection to invariant flags from group representation theory. It also uses the most sophisticated mathematical tools among proofs in this paper.
2 n Coins
Figure 6: The graph corresponding to the three-coin trick: the 3-dimensional binary cube.
The simplest generalization is to consider n coins instead of three. The goal is to make the coins all the same (all heads or all tails) by a sequence of single coin flips, where after each flip the magician asks "are the coins all the same yet?"
We can visualize this problem as navigating a graph, where each vertex corresponds to a possible state of all the coins, and an edge corresponds to flipping a single coin. This graph is the well-known n-dimensional binary hypercube; Figure 6 shows the case n = 3. In general, the n-dimensional binary hypercube has 2^n vertices, one for each possible binary string of length n (where 0 bits correspond to heads and 1 bits correspond to tails), and has an edge between two vertices whose binary strings differ in exactly one bit.
In the magic trick, the spectator chooses an arbitrary start vertex, and the magician's goal is to reach one of two goal vertices: 00...0 (all heads) or 11...1 (all tails). At each step, the magician can pick which edge to move along: flipping the ith coin corresponds to flipping the ith bit. The only feedback is when the magician hits one of the goal vertices.
An equivalent but more useful viewpoint is to map coin configurations onto the binary hypercube by defining a bit in a binary string to be 0 if that coin is the same orientation (heads/tails) as the spectator's original choice, and 1 if the coin is different from the spectator's original choice. In this view, the magician always starts at the same vertex 00...0. The two goal configurations g and ḡ are now the unknown part; the only knowledge is that they are inversions of each other (with 0s turned into 1s and vice versa).
In order for the magician to be sure of visiting one of the two solution states, the chosen path (sequence of coin flips) must visit either v or its inversion v̄ for every vertex v in the hypercube. There are 2^n total vertices, so the path must visit 2^n/2 = 2^{n-1} different vertices, and hence traverse at least 2^{n-1} - 1 edges. This argument proves a lower bound of 2^{n-1} - 1 flips in the worst-case execution of the magic trick.
To see that 2^{n-1} - 1 flips also suffice in the worst case, it suffices to find a Hamiltonian path in any (n - 1)-dimensional subcube of the n-dimensional cube, dropping whichever dimension we prefer. (A Hamiltonian path visits each vertex exactly once.) The spectator sets this dimension arbitrarily to heads or tails, and the Hamiltonian path explores all possible values for the remaining n - 1 bits, so eventually we will reach the configuration in which all bits match the dropped bit. The subcube has 2^{n-1} total vertices, so the presumed Hamiltonian path has exactly 2^{n-1} - 1 edges as desired.
The final piece of the puzzle is that n-dimensional cubes actually have Hamiltonian paths. This fact is well-known. One such path is given by the binary Gray code, also known as the reflected binary code [Gra53]. This code/path can be constructed recursively as follows. The n-bit Gray code first visits all strings starting with 0 in the order given by the (n - 1)-bit Gray code among the remaining bits; then it visits all strings starting with 1 in the reverse of the order given by the (n - 1)-bit Gray code among the remaining bits. For example, the 1-bit Gray code is just 0, 1; the 2-bit Gray code is 00, 01, 11, 10; and the 3-bit Gray code is 000, 001, 011, 010, 110, 111, 101, 100. Figure 6 illustrates this last path.
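To make the construction concrete, the following Python sketch (our illustration, not code from the paper) generates the flip sequence induced by the reflected Gray code on n - 1 of the coins and checks that it equalizes all n coins within 2^{n-1} - 1 flips.

def gray_flip_sequence(m):
    """0-based indices of the bit flipped between consecutive m-bit Gray codewords.
    The reflected construction gives a palindromic sequence of 2^m - 1 flips."""
    if m == 0:
        return []
    inner = gray_flip_sequence(m - 1)
    return inner + [m - 1] + inner

def worst_case_flips(n):
    """Flip coins 0..n-2 along the Gray path; coin n-1 is never touched."""
    moves = gray_flip_sequence(n - 1)
    worst = 0
    for start in range(1 << n):
        state, steps = start, 0
        for coin in moves:
            if state in (0, (1 << n) - 1):   # all heads or all tails
                break
            state ^= 1 << coin
            steps += 1
        assert state in (0, (1 << n) - 1)
        worst = max(worst, steps)
    return worst

for n in range(2, 6):
    print(n, worst_case_flips(n), 2 ** (n - 1) - 1)   # the last two numbers agree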
Theorem 1 The optimal sequence of flips guaranteed to eventually make n coins all heads or all tails uses exactly 2^{n-1} - 1 flips in the worst case.
3 n Dice
One natural extension of flipping coins is to rolling k-sided dice. Suppose we have n dice, each die has k faces, and each face is labeled uniquely with an integer 0, 1, ..., k - 1. The spectator arranges each die with a particular face up. As before, the magician is blindfolded. In a single move, the magician can increment or decrement any single die by 1 (wrapping around modulo k). At the end of such a move, the magician asks whether all the dice display the same value face up. The magician's goal is to reach such a configuration.
As we show in this section, our analysis extends fairly easily to show that the magician can succeed in k^{n-1} - 1 steps. A configuration of n dice becomes a k-ary string of n digits between 0 and k - 1. In the most useful viewpoint, a digit of 0 represents the same as the original state chosen by the spectator, and a digit of i represents that the die value is i larger (modulo k) than the original die value. Thus (0, 0, ..., 0) represents the initial configuration chosen by the spectator, and the k goal states g_0, g_1, ..., g_{k-1} have the property that g_i corresponds to adding i to each entry of g_0 (modulo k).
The analogous graph here is the k-ary n-dimensional torus. Figure 7 shows the case n = k = 3. In general, the vertices correspond to k-ary strings of length n, and edges connect two vertices a = (a_1, a_2, ..., a_n) and b = (b_1, b_2, ..., b_n) that differ by exactly 1 (modulo k) in exactly one position: b = (a_1, a_2, ..., a_{i-1}, a_i ± 1, a_{i+1}, ..., a_n).
Figure 7: 3-ary 3-dimensional torus.
Again we drop an arbitrary digit/dimension, and focus on the resulting k-ary (n - 1)-dimensional subtorus. The magic trick becomes equivalent to finding a Hamiltonian path in this subtorus. Such a path exists by a natural generalization of the Gray code [Gua98]. Visiting all configurations of the other dice will eventually match the value of the dropped dimension.
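The same check works for dice. The sketch below (ours, not from the paper) uses the standard reflected k-ary Gray code, in which consecutive words differ by +1 or -1 in a single digit, as a stand-in for the generalized Gray codes of [Gua98]; walking it on n - 1 of the dice equalizes any start within k^{n-1} - 1 moves.

from itertools import product

def kary_gray(m, k):
    """Reflected k-ary Gray code: consecutive m-digit words differ by +-1 in one digit."""
    if m == 0:
        return [()]
    prev = kary_gray(m - 1, k)
    return [(d,) + w for d in range(k)
                     for w in (prev if d % 2 == 0 else prev[::-1])]

def worst_case_moves(n, k):
    """Walk the Gray path on dice 1..n-1, leaving die 0 untouched; each step is one
    increment or decrement (mod k) of a single die."""
    path = kary_gray(n - 1, k)
    worst = 0
    for start in product(range(k), repeat=n):
        steps = 0
        for offsets in path:
            config = (start[0],) + tuple((start[i + 1] + offsets[i]) % k
                                         for i in range(n - 1))
            if len(set(config)) == 1:        # all dice show the same face
                break
            steps += 1
        else:
            raise AssertionError("the path missed the goal")
        worst = max(worst, steps)
    return worst

print(worst_case_moves(3, 3), 3 ** 2 - 1)    # prints 8 8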
Theorem 2 The optimal sequence of die increments/decrements guaranteed to eventually make n k-sided dice all the same uses exactly k^{n-1} - 1 moves in the worst case.
4 Equal Numbers of Heads and Tails
In this section, we explore the variation of the magic trick in which the goal is to equalize the numbers of heads and tails, instead of equalizing the coins themselves. We consider two strategies: a fast randomized strategy, and a simple deterministic strategy that achieves a balanced configuration in linear time.
Call a configuration balanced if it has an equal number of heads and tails. Throughout, we assume n is even, although we could adapt these results to the case of n odd by expanding the definition of balanced configuration to allow the numbers of heads and tails to differ by at most 1.
4.1 Randomized Strategy
In the randomized strategy, in each move, the magician flips each coin with probability 1/2. Thus the magician flips around half the coins in each move, effectively randomizing the entire configuration. We show that the magician reaches a balanced configuration in around √n such moves:
Theorem 3 Using the randomized strategy, the magician balances n coins within O(√n) moves with high probability.
Proof: By Stirling's approximation,
√(2πn) (n/e)^n e^{1/(12n+1)} < n! < √(2πn) (n/e)^n e^{1/(12n)}.
Thus
(n choose n/2) = n! / ((n/2)! (n/2)!)
              ≥ [√(2πn) (n/e)^n e^{1/(12n+1)}] / [√(πn) ((n/2)/e)^{n/2} e^{1/(6n)}]^2
              = √(2/(πn)) · 2^n · e^{-(9n+1)/(3n(12n+1))}.
Hence,
Pr{reaching a balanced configuration in one move} = (n choose n/2) / 2^n ≥ √(2/(πn)) · e^{-(9n+1)/(3n(12n+1))} > 0.7/√n.
Next we upper bound the probability of never reaching a goal state within t steps:
(1 - Pr{reaching goal state in one step})^t ≤ (1 - 0.7/√n)^t ≤ e^{-0.7 t/√n},
using the fact that (1 - x)^t ≤ e^{-xt} for all 0 ≤ x ≤ 1 and t ≥ 0. Hence the probability of obtaining a balanced configuration within t steps is at least 1 - e^{-0.7 t/√n}. Therefore, within t = √n steps, we reach a balanced configuration with constant probability, and within t = (c/0.7)√n steps, we reach one with probability at least 1 - e^{-c}.
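The constants above can be eyeballed with a quick Monte Carlo experiment. The sketch below (ours, not the paper's code) repeatedly flips each coin with probability 1/2 per move and reports the empirical mean number of moves until balance; since every move produces a fresh uniform configuration, the waiting time is geometric with mean 2^n / (n choose n/2), which is about √(πn/2) and consistent with the 0.7/√n per-move bound.

import math
import random

def moves_to_balance(n, rng):
    """Flip each coin independently with probability 1/2 per move and count moves
    until exactly n/2 heads; start from the worst case (all heads)."""
    heads, moves = n, 0
    while heads != n // 2:
        heads = sum(rng.random() < 0.5 for _ in range(n))   # fresh uniform configuration
        moves += 1
    return moves

n, trials = 100, 2000
rng = random.Random(0)
empirical = sum(moves_to_balance(n, rng) for _ in range(trials)) / trials
exact = 2 ** n / math.comb(n, n // 2)          # mean of the geometric waiting time
print(round(empirical, 2), round(exact, 2), round(math.sqrt(math.pi * n / 2), 2))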
Ḡ_k = ⟨Ḡ_{k-1}, flip all but the (n - k + 1)st coin, Ḡ_{k-1}⟩.
In the base case, G_0 = Ḡ_0 = ⟨⟩. Validity of the solution follows as in Section 2; indeed, for any starting configuration, the number of moves performed before the configuration becomes all-heads or all-tails is the same in sequences G_k and Ḡ_k.
Every move flips the first coin, so the destiny of a configuration is determined by its parity and the parity of n: if n and the number of coins equal to the first coin (say heads) have the same parity, then the configuration is destined for that value (heads); and if n and the (heads) count have opposite parity, then the destiny is the opposite value (tails). To see why this is true, consider the hypercube viewpoint where 0s represent coins matching the initial configuration and 1s represent flipped coins in an execution of G_k (not Ḡ_k). Then, at all times, the number of 1 bits in the configuration has the same parity as the number of steps made so far. At the same time, every move in Ḡ_k flips the first coin, so the first coin in the current configuration matches its original value precisely when there have been an even number of steps so far. Thus, when we reach a target configuration of all-heads or all-tails, it will match the original first coin precisely if there have been an even number of steps so far, which is equivalent to there being an even number of 1 bits in the G_k view, which means that the initial and target configurations differ in an even number of bits. In this case, the initial and target configurations have the same parity of coins equal to their respective first coins; but, in the target configuration, all coins match the first coin, so in particular n has the same parity as the number of coins equal to the first coin. We have thus shown this property to be equivalent to the target configuration matching the initial first coin.
It remains to verify the flipping claims. Flipping the first j coins for even j < k preserves the parity of the number of heads as well as the number of tails, but inverts the first coin, so inverts the destiny. Flipping the first j coins for odd j < k changes the parity of the number of heads as well as the number of tails, and inverts the first coin, which together preserve the destiny. □
With this base case in hand, we complete the induction to conclude Theorem 8. In the nonbase case,
n > k + 2. There are three cases to consider:
Case 1: Both n and k are odd. By induction, we obtain a solution sequence σ′ of length 2^{n-2} - 1 for n′ = n - 1. We construct a solution σ for n as follows: σ = ⟨σ′, flip the first k coins, σ′⟩. This solution has length |σ| = 2 |σ′| + 1 = 2^{n-1} - 1.
Next we prove that sequence σ solves the trick. Consider any configuration on n coins, and assume by symmetry that the last n - 1 of its coins are destined for heads in σ′. If the first coin is initially heads, then the first copy of σ′ completes the trick; otherwise, the prefix σ′ will not complete the trick, at which point the magician flips the first k coins. This move has the effect of flipping the first coin to heads as well as flipping the first k - 1 coins of the σ′ subproblem, which is destiny-preserving because k - 1 is even and (n - 1)k is even. Therefore, during the second σ′, the magician will arrive at the all-heads configuration.
Now we verify the destiny claims. Note that the destiny of a configuration in σ equals the destiny of the last n - 1 coins in σ′, so we can apply induction almost directly. Flipping the first j coins for even j < k flips the first j - 1 of the last n - 1 coins, which inverts destiny by induction because j - 1 is even and (n - 1)k is even. Similarly, flipping the first j coins for odd j < k preserves destiny by induction. Finally, flipping coins 2, 3, ..., k flips the first k - 1 coins of the last n - 1 coins, which preserves destiny by induction because k - 1 is even.
Case 2: For n even and k odd, by induction we again obtain a solution sequence σ′ of length 2^{n-2} - 1 for n′ = n - 1, viewed as acting on only the last n - 1 coins. We construct a solution σ for n as follows:
σ = ⟨σ′, flip coins 1, 3, 4, ..., k + 1, σ′⟩.
Again σ has length 2^{n-1} - 1. Flipping coins 1, 3, 4, ..., k + 1 has the effect of flipping coins 2, 3, ..., k of the last n - 1 coins, which by induction is destiny-preserving because (n - 1)k is odd. Thus, if the destiny of σ′ does not match the first coin, then it will match the newly flipped first coin during the second σ′. As before, destiny in σ matches destiny in σ′ on the last n - 1 coins. Flipping the first j coins for even j < k flips the first j - 1 of the last n - 1 coins, which preserves destiny by induction because j - 1 is odd and (n - 1)k is odd. Similarly, flipping the first j coins for odd j < k inverts destiny by induction.
Case 3: For n odd and k even, by induction we obtain a solution sequence σ′ of length 2^{n-3} - 1 for n′ = n - 2, which we view as acting on only the last n - 2 coins. Then we construct a solution σ for n as follows:
σ = ⟨σ′, flip coins 1, 3, 4, ..., k + 1, σ′, ...⟩.
This solution has length |σ| = 4 |σ′| + 3 = 2^{n-1} - 1. Restricting attention to the first two coins, σ first flips both coins, then flips the first coin only, then flips both coins again. Together these enumerate all possibilities for the first two coins. Restricting to the last n - 2 coins, these moves correspond to flipping the first k - 1 coins, coins 2, 3, ..., k, and again the first k - 1 coins. By induction, all three of these operations preserve destiny because k - 1 and (n - 2)k are odd. Therefore all three executions of σ′
A flag on V is a chain of subspaces {0} = W_0 ⊂ W_1 ⊂ ... ⊂ W_{n-1} ⊂ W_n = V, where dim(W_i) = i for i = 0, 1, ..., n. A flag W_0 ⊂ W_1 ⊂ ... ⊂ W_{n-1} ⊂ W_n is G-invariant if each W_i is G-invariant.
Next we describe the known connection between G-invariant flags and 2-groups:
Lemma 12 [Miy71] If G is a 2-group and W is any F_2-representation of G, then there is a G-invariant flag on W.
Finally we show the connection between G-invariant flags and the magic trick. For simplicity, we show here how to perform a more specific version of the trick: make the coins all heads. This version requires 2^n - 1 moves. A slight modification allows the all-tails case and reduces the number of moves to 2^{n-1} - 1. The characterization of valid groups G remains unaffected. These move bounds are optimal because in particular we are solving the regular n-coin game (with the trivial group G = {1}). Take a G-invariant flag {0} = W_0 ⊂ W_1 ⊂ ... ⊂ W_n = V, as guaranteed by Lemma 12, and for each i choose a vector w^{(i)} ∈ W_i \ W_{i-1}. Define move sequences ρ_0 = ⟨⟩ and ρ_i = ⟨ρ_{i-1}, w^{(i)}, ρ_{i-1}⟩ for i = 1, 2, ..., n. By a simple induction, ρ_i consists of 2^i - 1 moves. The magician's strategy is ρ_n, with 2^n - 1 moves.
We prove by induction on i that ρ_i brings any initial configuration v ∈ W_i to the all-heads configuration 0. Then, in particular, ρ_n brings any v ∈ W_n = V to 0. In the base case, i = 0 and v ∈ W_0 = {0},
so the magician has already won. In the induction step, i > 0, and there are two cases: v ∈ W_{i-1} and v ∈ W_i \ W_{i-1}. If v ∈ W_{i-1}, then by induction the prefix ρ_{i-1} of ρ_i brings v to 0. Otherwise, we analyze the three parts of ρ_i separately. In the prefix ρ_{i-1} of ρ_i, each move transforms the current configuration v into g(v) + w^{(j)}, where g ∈ G is the spectator's re-arrangement and 1 ≤ j < i. Because W_i is G-invariant, v ∈ W_i implies g(v) ∈ W_i. Because W_{i-1} is G-invariant, v ∉ W_{i-1} implies g(v) ∉ W_{i-1}. Because w^{(j)} ∈ W_{i-1} for j < i, v ∈ W_i \ W_{i-1} implies g(v) + w^{(j)} ∈ W_i \ W_{i-1}. (In contrapositive, g(v) + w^{(j)} ∈ W_{i-1} implies (g(v) + w^{(j)}) - w^{(j)} = g(v) ∈ W_{i-1}, and by G-invariance v ∈ W_{i-1}.) Therefore the configuration v remains in W_i \ W_{i-1} throughout the prefix ρ_{i-1} of ρ_i. Next ρ_i takes the resulting configuration v and applies w^{(i)} ∈ W_i \ W_{i-1}, so the resulting configuration v + w^{(i)} drops to W_{i-1}. (A simple counting argument shows that v = x - w^{(i)} for some x ∈ W_{i-1}, and hence v + w^{(i)} = x ∈ W_{i-1}.) Finally, by induction, the second copy of ρ_{i-1} brings the configuration to 0. □
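To make the flag machinery concrete, here is a small sketch (ours; the paper gives no code, and the particular flag below is just one valid choice) for n = 4 coins with G the group of quarter-turn rotations. It builds a rotation-invariant flag over F_2^4, forms the move sequence ρ_4 of 2^4 - 1 = 15 moves, and verifies by exhaustive game-tree search that the sequence forces all heads against every rotation strategy.

from functools import lru_cache

N = 4                                    # positions N, E, S, W = bits 0..3
MASK = (1 << N) - 1

def rotate(v, r):
    """Quarter-turn rotations of the table act on F_2^4 by cyclic bit shifts."""
    return ((v << r) | (v >> (N - r))) & MASK if r % 4 else v

# A rotation-invariant flag {0} = W_0 < W_1 < W_2 < W_3 < W_4 = F_2^4, given by
# representatives w[i] in W_i \ W_{i-1}:  W_i = span(w[1], ..., w[i]).
w = [None, 0b1111, 0b0101, 0b0011, 0b0001]

def rho(i):
    """rho_0 = <>, rho_i = <rho_{i-1}, w[i], rho_{i-1}>; rho_i has 2^i - 1 moves."""
    return [] if i == 0 else rho(i - 1) + [w[i]] + rho(i - 1)

MOVES = tuple(rho(N))                    # 15 moves

@lru_cache(maxsize=None)
def forces_all_heads(v, step):
    """All-heads (v = 0) is reached on every branch of spectator rotations."""
    if v == 0:
        return True
    if step == len(MOVES):
        return False
    return all(forces_all_heads(rotate(v, r) ^ MOVES[step], step + 1) for r in range(4))

print(all(forces_all_heads(v, 0) for v in range(1 << N)))   # prints True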
Acknowledgments. We thank Patricia Cahn, Joseph O'Rourke, and Gail Parsloe for helpful initial discussions about these problems. We thank the participants of Gathering for Gardner 8 for pointing us to [LR81]; and Noga Alon, Simon Litsyn, and Madhu Sudan for pointing us to [ABCO88].
References
[ABCO88] N. Alon, E. E. Bergmann, D. Coppersmith, and A. M. Odlyzko. Balancing sets of vectors. IEEE Transactions on Information Theory, 34(1), January 1988.
[ES95] Richard Ehrenborg and Chris M. Skinner. The blind bartender's problem. Journal of Combinatorial Theory, Series A, 70(2):249–266, May 1995.
[Etz97] Tuvi Etzion. The depth distribution – a new characterization for linear codes. IEEE Transactions on Information Theory, 43(4):1361–1363, July 1997.
[Ful80] Karl Fulves. The Children's Magic Kit: 16 Easy-to-do Tricks Complete with Cardboard Punchouts. Dover Publications, Inc., New York, 1980.
[Gar91] Martin Gardner. The rotating table and other problems. In Fractal Music, Hypercards and More...: Mathematical Recreations from Scientific American Magazine. W. H. Freeman & Co., October 1991. Based on Mathematical Games: About rectangling rectangles, parodying Poe and many another pleasing problem, Scientific American, 240(2):16–24, February 1979, and the answers in Mathematical Games: On altering the past, delaying the future and other ways of tampering with time, Scientific American, 240(3):21–30, March 1979.
[Gra53] F. Gray. Pulse code communication. U.S. Patent 2,632,058, March 1953. http://patft.uspto.gov/netacgi/nph-Parser?patentnumber=2632058.
[Gua98] Dah-Jyh Guan. Generalized Gray codes with applications. Proc. Natl. Sci. Counc., 22:841–848, 1998.
[Joh98] Richard Johnsonbaugh. A discrete intermediate value theorem. The College Mathematics Journal, 29(1):42, January 1998.
[LFW00] Yuan Luo, Fang-Wei Fu, and Victor K.-W. Wei. On the depth distribution of linear codes. IEEE Transactions on Information Theory, 46(6):2197–2203, September 2000.
[LR81] William T. Laaser and Lyle Ramshaw. Probing the rotating table. In D. A. Klarner, editor, The Mathematical Gardner. Wadsworth, Belmont, California, 1981. Republished in 1998 by Dover in Mathematical Recreations: A Collection in Honor of Martin Gardner, pages 285–307.
[LW80] Ted Lewis and Stephen Willard. The rotating table. Mathematics Magazine, 53(3):174–179, May 1980.
[Miy71] Takehito Miyata. Invariants of certain groups I. Nagoya Mathematical Journal, 41:69–73, 1971.
[Roo02] Eric Roode. Coin puzzle. Posting to Philadelphia Perl Mongers mailing list, May 7, 2002. http://lists.netisland.net/archives/phlpm/phlpm-2002/msg00137.html.
[YEM93] Reuven Bar Yehuda, Tuvi Etzion, and Shlomo Moran. Rotating-table games and derivatives of words. Theoretical Computer Science, 108(2):311–329, February 1993.