
Coin-Flipping Magic

Nadia Benbernou

Erik D. Demaine

Martin L. Demaine

Benjamin Rossman

Prepared in honor of Martin Gardner for Gathering for Gardner 8


Abstract
This paper analyzes a variety of generalizations of a coin-flipping magic trick invented independently by Martin Gardner and Karl Fulves. In the original trick, a blindfolded magician asks the spectator to flip three coins, forcing them into an all-equal state by surprisingly few moves. We generalize to any number of coins, coins/dice with more than two sides, and multiple flips at once. Next we consider a generalization described by Martin Gardner in which the spectator can re-arrange the coins in certain ways in between each flip. Finally we consider the variation in which the magician equalizes the number of heads and tails, which can be achieved exponentially faster.
1 Introduction
Figure 1: Three pennies.
The trick. Thank you, dear reader, for volunteering for a magic trick. May I ask, do you have a few coins that we could use for the trick? If not, you can borrow mine from Figure 1. Please get out three coins; they can be different denominations. Now please arrange the coins in a line from left to right. Very good. Now I will blindfold myself and look away. I guarantee that I cannot see the coins.
To get started, I would like you to flip over some of the coins. You can flip over as many or as few as you like. The only rule is that the coins should not be all heads or all tails. Let me know when you are finished. Good, let us proceed.
I am now visualizing your coins in my mind. With you acting as my hand, I will make the coins all the same: all heads or all tails.
Please flip over the left coin. Are the coins now all the same? One third of my readers will shout "yes!" and be blown away by my omniscience. For the rest of you, the trick continues.
Please flip over the middle coin. Very good. Now are the coins the same? Another third of my readers will be surprised by my fast success. For the rest of you, the trick continues.
Let me see; I must have made a mistake in visualizing your coins. Ah yes, I see. I shouldn't have flipped the left coin in the first place. Please flip over the left coin, back the way it was. Now, I tell you, the coins are all the same. Feel free to check my blindfold.
Background. This self-working magic trick appears in Karl Fulves's book The Children's Magic Kit [Ful80, page 15]. According to that book, the trick was independently devised by Martin Gardner and Fulves, based on an idea of Sam Schwartz. It works with coins or cards, over the telephone or the radio. At first we, and presumably many spectators, find it surprising that just three blindfolded flips are enough to equalize the three coins. Indeed, there are 2 × 2 × 2 = 8 possible states of the coins (HHH, HHT, HTH, THH, HTT, THT, TTH, TTT). How do we navigate to two of these states (HHH or TTT) using just three moves (and often fewer)? Motivated by this simple question, this paper studies several generalizations and variations of this magic trick.
(Preliminary draft. The latest version of this paper can be found at http://erikdemaine.org/papers/CoinFlipping/. Authors' address: MIT Computer Science and Artificial Intelligence Laboratory, 32 Vassar St., Cambridge, MA 02139, USA, {nbenbern, edemaine, mdemaine, brossman}@mit.edu.)
Equalizing four coins:
1. Flip the first coin.
2. Flip the second coin.
3. Flip the first coin.
4. Flip the third coin.
5. Flip the first coin.
6. Flip the second coin.
7. Flip the first coin.
Figure 2: Equalizing four coins with at most seven flips.
Results. We begin in Section 2 with a simple generalization of the trick to n coins. This generalization, and the original trick, turn out to be easy to analyze: they are equivalent to Hamiltonian paths in an (n − 1)-dimensional hypercube graph, and the classic Gray code gives one such solution. Not surprisingly, the number of moves required grows exponentially with n. More interesting is that we save an entire factor of two by having two goal states, all heads and all tails. Namely, the worst-case optimal number of blindfolded flips is 2^{n−1} − 1. The analysis also easily generalizes to k-sided dice instead of coins: in Section 3, we show that the worst-case optimal number of blindfolded operations is k^{n−1} − 1. This family of tricks is thus really most impressive for n = 3 coins, where the number 3 of flips is really quite small; beyond n = 3 or k = 2, the number of flips grows quickly beyond feasibility. For the sake of illustration, however, Figure 2 shows a magic-trick sequence for four coins.
Balancing six coins:
1. Flip the second coin.
2. Flip the fifth coin.
3. Flip the third coin.
4. Flip the first coin.
5. Flip the fourth coin.
Figure 3: Equalizing the numbers of heads and tails in six coins using at most five flips.
One solution to this exponential growth is to change the goal from the two all-heads and all-tails states to some larger collection of states. In Section 4, we consider a natural such goal: equalize the number of heads and tails. This goal is exponentially easier to achieve. Within just n − 1 coin flips, the magician can force the numbers of heads and tails to be equal. The algorithm is simple: just flip one coin at a time, in any order, until the goal has been reached. Figure 3 shows an example for n = 6. Although not obvious, this algorithm will succeed before every coin has been flipped once. Furthermore, by randomly flipping all the coins in each move, the magician expects to require only around √n moves. The practicality of this type of trick clearly scales to much larger n.
Varying number of coins flipped:
1. Flip the left and middle coins.
2. Flip the left coin.
3. Flip the left and middle coins.
Exactly two coins flipped:
1. Flip the left and middle coins.
2. Flip the middle and right coins.
3. Flip the left and middle coins.
Figure 4: Alternate solutions to equalizing three coins with at most three moves, flipping more than one coin in some moves.
Returning to the goal of equalizing all the coins, the next generalization we consider is to allow the magician to flip more than one coin at once, between asking whether the coins are yet all the same. This flexibility cannot help the magician to equalize the coins any faster than 2^{n−1} − 1 moves, but it can help obscure what the magician is doing. For example, if the magician had to repeat the three-coin trick several times, it might help to try some of the variations in Figure 4. Under what conditions can the magician still equalize n coins, ideally in the same number 2^{n−1} − 1 of moves? Obviously, flipping n − 1 of the coins is equivalent to flipping just one coin. On the negative side, in Section 5, we show that the sequence of flips cannot vary arbitrarily: if the spectator is allowed to choose how many coins the magician should flip in each move, then the magician can succeed only if n ≤ 4 (no matter how many moves are permitted). On the positive side, in Section 6, we show that it is possible to flip most fixed numbers of coins in every move and achieve the optimal worst case of 2^{n−1} − 1 moves. This result is fairly technical but interesting in the way that it generalizes Gray codes.
Equalizing four rotating coins:
1. Flip the north and south coins.
2. Flip the north and east coins.
3. Flip the north and south coins.
4. Flip the north coin.
5. Flip the north and south coins.
6. Flip the north and east coins.
7. Flip the north and south coins.
Figure 5: Equalizing four coins that the spectator can rotate at each stage, using at most seven moves.
The final variation we consider allows the spectator to re-arrange the coins in certain ways after each move. Again this can only make the magician's job harder. In each move, the magician specifies a subset of coins to flip, but before the spectator actually flips them, the spectator can re-arrange the coins according to certain permutations. Then, after flipping these coins, the spectator reveals whether all coins are the same, and if not, the trick continues. In Section 7, we characterize the exact group structures of allowed re-arrangements that still permit the magician to equalize the coins. For example, if 2^k coins are arranged on a table, then the spectator can rotate and/or flip the table in each move, and still the magician can perform the trick. Figure 5 shows the solution for four coins arranged in a square which the spectator can rotate by 0°, 90°, 180°, or 270° during each move.
The four-coin magic trick of Figure 5 goes back to a March 1979 letter from Miner S. Keeler to Martin Gardner [Gar91], and was disseminated more recently by Eric Roode [Roo02]. Keeler's letter was inspired by an article of [Gar91] about a somewhat weaker magic trick where, after the spectator turns the table arbitrarily, the magician can feel two coins before deciding which to flip. This weaker trick has been generalized to n coins on a rotating circular table and k hands: the trick can be performed if and only if k ≥ (1 − 1/p)n, where p is the largest prime divisor of n [LW80, LR81]. The fully blind trick we consider, without the ability to feel coins, was first generalized beyond Keeler's four-coin trick to n dice, each with k sides, on a rotating circular table: the trick can be performed if and only if k and n are powers of a common prime
[YEM93]. Interestingly, this characterization remains the same even if the magician can see the dice at all times (but the spectator can still turn the table before actually flipping coins); however, the worst-case number of moves reduces from k^n − 1 to n + (α − 1)(n − p^{α−1}), where k = p^β and n = p^α. (Interestingly, the optimal number of moves for a specific sequence of coins gives rise to the notion of word depth, now studied in the context of linear codes in information theory [Etz97, LFW00].)
Our general scenario considers an arbitrary group of permutations, instead of just a rotating circular table. This scenario was also essentially solved by Ehrenborg and Skinner [ES95]: they characterize performability in terms of the chain structure of the group. Our characterization is simpler: the group must have a number of elements equal to an exact power of two. Our proof is also simpler, showing a connection to invariant flags from group representation theory. It also uses the most sophisticated mathematical tools among proofs in this paper.
2 n Coins
Figure 6: The graph corresponding to the three-coin trick: the 3-dimensional binary cube.
The simplest generalization is to consider n coins instead of three. The goal is to make the coins all the same (all heads or all tails) by a sequence of single coin flips, where after each flip the magician asks "are the coins all the same yet?"
We can visualize this problem as navigating a graph, where each vertex corresponds to a possible state of all the coins, and an edge corresponds to flipping a single coin. This graph is the well-known n-dimensional binary hypercube; Figure 6 shows the case n = 3. In general, the n-dimensional binary hypercube has 2^n vertices, one for each possible binary string of length n (where 0 bits correspond to heads and 1 bits correspond to tails), and has an edge between two vertices whose binary strings differ in exactly one bit.
In the magic trick, the spectator chooses an arbitrary start vertex, and the magician's goal is to reach one of two goal vertices: 00···0 (all heads) or 11···1 (all tails). At each step, the magician can pick which edge to move along: flipping the ith coin corresponds to flipping the ith bit. The only feedback is when the magician hits one of the goal vertices.
An equivalent but more useful viewpoint is to map coin configurations onto the binary hypercube by defining a bit in a binary string to be 0 if that coin is the same orientation (heads/tails) as the spectator's original choice, and 1 if the coin is different from the spectator's original choice. In this view, the magician always starts at the same vertex 00···0. The two goal configurations g and ḡ are now the unknown part; the only knowledge is that they are inversions of each other (with 0s turned into 1s and vice versa).
In order for the magician to be sure of visiting one of the two solution states, the chosen path (sequence of coin flips) must visit either v or its inversion v̄ for every vertex v in the hypercube. There are 2^n total vertices, so the path must visit 2^n/2 = 2^{n−1} different vertices. This argument proves a worst-case lower bound of 2^{n−1} − 1 flips in the worst-case execution of the magic trick.
To see that 2^{n−1} − 1 flips also suffice in the worst case, it suffices to find a Hamiltonian path in any (n − 1)-dimensional subcube of the n-dimensional cube, dropping whichever dimension we prefer. (A Hamiltonian path visits each vertex exactly once.) The spectator sets this dimension arbitrarily to heads or tails, and the Hamiltonian path explores all possible values for the remaining n − 1 bits, so eventually we will reach the configuration in which all bits match the dropped bit. The subcube has 2^{n−1} total vertices, so the presumed Hamiltonian path has exactly 2^{n−1} − 1 edges as desired.
The final piece of the puzzle is that n-dimensional cubes actually have Hamiltonian paths. This fact is well known. One such path is given by the binary Gray code, also known as the reflected binary code [Gra53]. This code/path can be constructed recursively as follows. The n-bit Gray code first visits all strings starting with 0 in the order given by the (n − 1)-bit Gray code among the remaining bits; then it visits all strings starting with 1 in the reverse of the order given by the (n − 1)-bit Gray code among the remaining bits. For example, the 1-bit Gray code is just 0, 1; the 2-bit Gray code is 00, 01, 11, 10; and the 3-bit Gray code is 000, 001, 011, 010, 110, 111, 101, 100. Figure 6 illustrates this last path.
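To make the construction concrete, here is a short Python sketch (ours, not from the paper) that turns the reflected Gray code into the trick's flip sequence and exhaustively checks that every starting configuration of n coins is equalized within 2^{n−1} − 1 flips. Coins are 1-indexed, and coin n plays the role of the dropped dimension.

```python
def gray_flips(m):
    # positions (1-indexed) flipped between consecutive words of the m-bit reflected Gray code
    if m == 0:
        return []
    return gray_flips(m - 1) + [m] + gray_flips(m - 1)

def equalizing_sequence(n):
    # walk a Hamiltonian path of the (n-1)-subcube spanned by coins 1..n-1: 2^(n-1) - 1 flips
    return gray_flips(n - 1)

def works(n, flips):
    # exhaustive check: from every start, some prefix of `flips` makes the coins all equal
    for start in range(1 << n):
        state, goals = start, (0, (1 << n) - 1)
        if state in goals:
            continue
        for coin in flips:
            state ^= 1 << (coin - 1)
            if state in goals:
                break
        else:
            return False
    return True

print(equalizing_sequence(3), all(works(n, equalizing_sequence(n)) for n in range(2, 8)))
# [1, 2, 1] True   (for n = 3 this is exactly the flip-left, flip-middle, flip-left trick)
```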
Theorem 1 The optimal sequence of flips guaranteed to eventually make n coins all heads or all tails uses exactly 2^{n−1} − 1 flips in the worst case.
3 n Dice
One natural extension of flipping coins is to rolling k-sided dice. Suppose we have n dice, each die has k faces, and each face is labeled uniquely with an integer 0, 1, . . . , k − 1. The spectator arranges each die with a particular face up. As before, the magician is blindfolded. In a single move, the magician can increment or decrement any single die by 1 (wrapping around from k − 1 to 0). At the end of such a move, the magician asks whether all the dice display the same value face up. The magician's goal is to reach such a configuration.
As we show in this section, our analysis extends fairly easily to show that the magician can succeed in k^{n−1} − 1 steps. A configuration of n dice becomes a k-ary string of n digits between 0 and k − 1. In the most useful viewpoint, a digit of 0 represents the same as the original state chosen by the spectator, and a digit of i represents that the die value is i larger (modulo k) than the original die value. Thus (0, 0, . . . , 0) represents the initial configuration chosen by the spectator, and the k goal states g_0, g_1, . . . , g_{k−1} have the property that g_i corresponds to adding i to each entry of g_0 (modulo k).
The analogous graph here is the k-ary n-dimensional torus. Figure 7 shows the case n = k = 3. In general, the vertices correspond to k-ary strings of length n, and edges connect two vertices a = (a_1, a_2, . . . , a_n) and b = (b_1, b_2, . . . , b_n) that differ by exactly 1 (modulo k) in exactly one position i: b = (a_1, a_2, . . . , a_{i−1}, a_i ± 1, a_{i+1}, . . . , a_n).
Figure 7: 3-ary 3-dimensional torus.
Again we drop an arbitrary digit/dimension, and focus on the resulting k-ary (n − 1)-dimensional subtorus. The magic trick becomes equivalent to finding a Hamiltonian path in this subtorus. Such a path exists by a natural generalization of the Gray code [Gua98]. Visiting all configurations of the other dice will eventually match the value of the dropped dimension.
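The reflected construction generalizes directly to k-ary digits. The sketch below (our illustration; [Gua98] gives the general treatment) lists such a code for two 3-sided dice and extracts the corresponding single-die increments and decrements.

```python
def kary_gray(m, k):
    # m-digit k-ary reflected Gray code: consecutive words differ by exactly +/-1 in one digit
    if m == 0:
        return [[]]
    tail = kary_gray(m - 1, k)
    words = []
    for d in range(k):
        block = tail if d % 2 == 0 else tail[::-1]   # reverse every other block
        words += [[d] + w for w in block]
    return words

code = kary_gray(2, 3)        # a Hamiltonian path through all 9 states of two 3-sided dice
moves = [next((i, b - a) for i, (a, b) in enumerate(zip(u, v)) if a != b)
         for u, v in zip(code, code[1:])]
print(code)    # [[0, 0], [0, 1], [0, 2], [1, 2], [1, 1], [1, 0], [2, 0], [2, 1], [2, 2]]
print(moves)   # each entry is (die index, +1 or -1): a single increment or decrement
```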
Theorem 2 The optimal sequence of die increments/decrements guaranteed to eventually make n k-sided dice all the same uses exactly k^{n−1} − 1 moves in the worst case.
4 Equal Numbers of Heads and Tails
In this section, we explore the variation of the magic trick in which the goal is to equalize the numbers of heads and tails, instead of equalizing the coins themselves. We consider two strategies: a fast randomized strategy, and a simple deterministic strategy that achieves a balanced configuration in linear time.
Call a configuration balanced if it has an equal number of heads and tails. Throughout, we assume n is even, although we could adapt these results to the case of n odd by expanding the definition of balanced configuration to allow the numbers of heads and tails to differ by at most 1.
4.1 Randomized Strategy
In the randomized strategy, in each move, the magician flips each coin with probability 1/2. Thus the magician flips around half the coins in each move, effectively randomizing the entire configuration. We show that the magician reaches a balanced configuration in around √n such moves:
Theorem 3 Using the randomized strategy, the magician balances n coins within O(√n) steps with constant probability, and within O(√n lg n) steps with probability 1 − O(1/n^c) for any desired c > 0.
Proof: For any two configurations a and b on n coins, a single move from a reaches b with probability 1/2^n. Hence each move uniformly samples the configuration space. The number of balanced configurations is $\binom{n}{n/2}$, so the probability of reaching a balanced configuration in each step is $\binom{n}{n/2}/2^n$. We simply need to determine the number of such trials before one succeeds.
First we lower bound the probability of success using Stirling's formula:
$$\sqrt{2\pi n}\,(n/e)^n\, e^{1/(12n+1)} \;<\; n! \;<\; \sqrt{2\pi n}\,(n/e)^n\, e^{1/(12n)}.$$
Thus
$$\binom{n}{n/2} \;=\; \frac{n!}{(n/2)!\,(n/2)!} \;\ge\; \frac{\sqrt{2\pi n}\,(n/e)^n\, e^{1/(12n+1)}}{\bigl[\sqrt{\pi n}\,\bigl(\tfrac{n/2}{e}\bigr)^{n/2} e^{1/(6n)}\bigr]^2} \;=\; \sqrt{\frac{2}{\pi n}}\; 2^n\, e^{-\frac{9n+1}{3n(12n+1)}}.$$
Hence,
$$\Pr\{\text{reaching a balanced configuration in one move}\} \;=\; \frac{\binom{n}{n/2}}{2^n} \;\ge\; \sqrt{\frac{2}{\pi n}}\; e^{-\frac{9n+1}{3n(12n+1)}} \;>\; 0.7/\sqrt{n}.$$
Next we upper bound the probability of never reaching a goal state within t steps:
$$\bigl(1 - \Pr\{\text{reaching a goal state in one step}\}\bigr)^t \;\le\; \bigl(1 - 0.7/\sqrt{n}\bigr)^t \;\le\; e^{-0.7\,t/\sqrt{n}},$$
using the fact that (1 − x)^t ≤ e^{−xt} for all 0 ≤ x ≤ 1 and t ≥ 0. Hence the probability of obtaining a balanced configuration within t steps is at least 1 − e^{−0.7 t/√n}. Therefore, within t = √n steps, we reach a balanced configuration with constant probability, and within t = (c/0.7) √n ln n steps, we reach a balanced configuration with probability 1 − 1/n^c for any constant c > 0. □
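As a quick sanity check of Theorem 3 (a simulation sketch, not part of the proof), the following snippet runs the randomized strategy repeatedly and compares the average number of moves against √(πn/2), the mean of a geometric distribution whose per-move success probability is roughly √(2/(πn)).

```python
import math
import random

def moves_to_balance(n):
    # randomized strategy: flip every coin independently with probability 1/2 each move
    coins = [0] * n                      # say the spectator hands us all heads
    moves = 0
    while sum(coins) != n // 2:          # stop once #heads == #tails
        coins = [c ^ random.getrandbits(1) for c in coins]
        moves += 1
    return moves

n, trials = 100, 2000
average = sum(moves_to_balance(n) for _ in range(trials)) / trials
print(average, math.sqrt(math.pi * n / 2))   # both are around 12.5 for n = 100
```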
4.2 Deterministic Strategy
At first glance, a fast deterministic strategy may not be obvious. Nonetheless, our deterministic strategy is simple: the magician flips the first coin, then the second coin, then the third, and so on (or in any permutation thereof), until reaching a balanced configuration. With the strategy in hand, its analysis is a straightforward continuity argument:
Theorem 4 Using the deterministic strategy, the magician balances n coins in at most n − 1 coin flips.
Proof: Let d_i denote the number of heads minus the number of tails after the ith coin flip in the deterministic strategy. In particular, d_0 is the imbalance of the initial configuration. If we reach n flips, we would have flipped all coins, so d_n = −d_0. Thus d_0 and d_n have opposite signs (or are possibly both 0). We also know that |d_i − d_{i−1}| = 2, and every d_i is even because n is even. By the discrete intermediate value theorem [Joh98], d_i = 0 for some i with 0 ≤ i < n. Thus the magician reaches a balanced configuration after i ≤ n − 1 flips. □
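The strategy and the n − 1 bound are easy to confirm by brute force. Here is a small sketch (assuming n is even, with 0 for heads and 1 for tails) that flips coins left to right and verifies that no 6-coin start needs more than 5 flips.

```python
def flips_to_balance(coins):
    # deterministic strategy: flip the next coin (left to right) until #heads == #tails
    n = len(coins)
    flips = 0
    while coins.count(1) != n // 2:
        coins[flips] ^= 1
        flips += 1
    return flips

# exhaustive check of Theorem 4 for n = 6: the worst case is exactly n - 1 = 5 flips
print(max(flips_to_balance([b >> i & 1 for i in range(6)]) for b in range(64)))   # 5
```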
This deterministic strategy is fast, but still takes the square of the time required by the randomized strategy. In contrast, for equalizing all coins, randomization would help by only a constant factor in expectation. Is there a better deterministic strategy for reaching a balanced configuration? The answer is negative, even for strategies that flip multiple coins in a single move, using results from coding theory:
Theorem 5 Every deterministic strategy for balancing n coins makes at least n − 1 moves in the worst case.
Proof: A general deterministic strategy can be viewed as a list of k bit vectors s_0, s_1, . . . , s_{k−1}, where 0s represent coins in their original state and 1s represent coins flipped from their original state. For example, our (n − 1)-flip strategy is given by the n vectors
$$s_i = (\underbrace{1, 1, \ldots, 1}_{i},\ \underbrace{0, 0, \ldots, 0}_{n-i}), \qquad 0 \le i < n.$$
For a strategy to balance any initial configuration given by a bit vector x, where 0s represent heads and 1s represent tails, x ⊕ s_i must have exactly n/2 1s for some i, where ⊕ denotes bitwise XOR (addition modulo 2). In other words, every bit vector x must differ in exactly n/2 bits from some s_i. Alon et al. [ABCO88] proved that the optimal such balancing set of vectors s_0, s_1, . . . , s_{k−1} consists of exactly n vectors, and therefore our (n − 1)-flip strategy is optimal. □
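The balancing-set view is also easy to experiment with. The sketch below (our illustration, not code from [ABCO88]) checks that the n staircase vectors of the (n − 1)-flip strategy balance every configuration for n = 6, while no proper prefix of them does, consistent with the lower bound.

```python
from itertools import product

def is_balancing(n, vectors):
    # every configuration x must lie at Hamming distance exactly n/2 from some vector s_i
    return all(any(sum(a ^ b for a, b in zip(x, s)) == n // 2 for s in vectors)
               for x in product((0, 1), repeat=n))

n = 6
staircase = [tuple(1 if j < i else 0 for j in range(n)) for i in range(n)]   # s_0, ..., s_{n-1}
print(is_balancing(n, staircase))                                # True
print(any(is_balancing(n, staircase[:m]) for m in range(n)))     # False: fewer vectors fail
```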
5 Flipping More Coins At Once
Returning to the magic trick of flipping n coins to become all the same, another generalization is to allow the magician the additional flexibility of flipping more than one coin at once. The number of coins flipped per move might be a constant value (as considered in the next section), or might change from move to move (as in Figure 4, left). In either case, we let k denote the number of coins flipped in a move.
In this section, we consider what happens when the spectator gets to choose how many coins the magician must flip in each move. Obviously, if n is even, then the spectator must choose odd values for k, or else the magician could never get out of the odd parity class. But even then the magician is in trouble. We provide a complete answer to when the magician can still succeed:
Lemma 6 If n ≥ 5, the magician is doomed.
Proof: The spectator uses the following strategy: if the distance between the current configuration and the all-heads or all-tails configuration is 1, then the spectator tells the magician to flip three coins. Otherwise, the spectator tells the magician to flip one coin. Because n ≥ 5, being at distance 1 from one target configuration means being at distance at least 4 from the other target configuration, and 4 > 3, so the magician can never hit either target configuration. The spectator always says odd numbers, so this strategy satisfies the constraint when n is even. □
Lemma 7 If n = 3 or n = 4, the magician can succeed.
Proof: As mentioned above, flipping k or n − k coins are dual to each other. For n = 3 or n = 4, the spectator can only ask to flip 1 or n − 1 coins. Thus the magician effectively has the same control as when flipping one coin at a time. More precisely, if the spectator says to flip 1 coin, the magician flips the next coin in the k = 1 strategy. If the spectator says to flip n − 1 coins, the magician flips all coins except the next coin in the k = 1 strategy. This transformation has effectively the same behavior because the two targets are bitwise negations of each other. □
Despite this relatively negative news, it would be interesting to characterize the sequences of k values for which the magician can win. Such a characterization would provide the magician with additional flexibility and variability for the equalizing trick. In the next section, we make partial progress toward this goal by showing that the magician can succeed for most fixed values of k.
6 Flipping Exactly k Coins at Once
In this section, we characterize when the magician can equalize n coins by flipping exactly k coins in each move. Naturally, we must have 0 < k < n, because both 0-flip and n-flip moves cannot equalize a not-already-equal configuration. Also, as observed in the previous section, we cannot have both n and k even, because then we could never change an odd-parity configuration into the needed even parity of an all-equal configuration. We show that these basic conditions suffice for the magician:
Theorem 8 The magic trick with k-flip moves can be performed if and only if 0 < k < n and either n or k is odd. The optimal solution sequence uses exactly 2^{n−1} − 1 moves in the worst case.
A lower bound of 2^{n−1} − 1 follows in the same way as in Section 2. Again we can view the trick on the n-dimensional hypercube, where 0 represents a bit unchanged from its initial configuration and 1 represents a changed bit. The difference is that now moves (edges) connect two configurations that differ in exactly k bits. The lower bound of 2^{n−1} − 1 follows because we need to visit every bit string or its complement among 2^n possibilities.
Our construction of a (2^{n−1} − 1)-move solution is by induction on n. If k is even, we can consider only odd values of n. The base cases are thus when n = k + 1 for both even and odd k. The n = k + 1 case has k = n − 1, so it is effectively equivalent to k = 1 from Section 2. We will, however, need to prove some additional properties about this solution.
It seems difficult to work with general solutions for smaller values of n, so we strengthen our induction hypothesis. Given a solution to a trick, we call a configuration destined for heads if the solution transforms that configuration into the all-heads configuration (and never all-tails), and destined for tails if it transforms into all-tails (and never all-heads). (Because our solutions are always optimal length, they only ever reach one all-heads configuration or one all-tails configuration, never both, even if run in entirety.) We call a transformation destiny-preserving if every configuration on n coins has the same destiny before and after applying the transformation. A transformation is destiny-inverting if every configuration on n coins has the opposite destiny before and after applying the transformation. Now the stronger inductive statement is the following:
1. for nk even, flipping the first j coins for even j < k preserves destiny, while flipping the first j coins for odd j < k inverts destiny; and
2. for nk odd, flipping the first j coins for even j < k inverts destiny, while flipping the first j coins for odd j < k preserves destiny, and flipping coins 2, 3, . . . , k preserves destiny.
To get this stronger induction hypothesis started, we begin with the base case:
Lemma 9 For any k > 0, the k-flip trick with n = k + 1 coins has a solution sequence of length 2^k − 1 such that flipping the first j coins for even j < k preserves destiny, while flipping the first j coins for odd j < k inverts destiny.
Proof: The construction follows the Gray code of Section 2. That flip sequence, ignoring the first coin, can be described recursively by
G_k = G_{k−1}, flip the (n − k + 1)st coin, G_{k−1}.
To flip n − 1 coins in each move, we invert this sequence into
Ḡ_k = Ḡ_{k−1}, flip all but the (n − k + 1)st coin, Ḡ_{k−1}.
In the base case, G_0 = Ḡ_0 = ∅ (the empty sequence). Validity of the solution follows as in Section 2; indeed, for any starting configuration, the number of moves performed before the configuration becomes all-heads or all-tails is the same in sequences G_k and Ḡ_k.
Every move flips the first coin, so the destiny of a configuration is determined by its parity and the parity of n: if n and the number of coins equal to the first coin (say heads) have the same parity, then the configuration is destined for that value (heads); and if n and the (heads) count have opposite parity, then the destiny is the opposite value (tails). To see why this is true, consider the hypercube viewpoint where 0s represent coins matching the initial configuration and 1s represent flipped coins in an execution of G_k (not Ḡ_k). Then, at all times, the number of 1 bits in the configuration has the same parity as the number of steps made so far. At the same time, every move in Ḡ_k flips the first coin, so the first coin in the current configuration matches its original value precisely when there have been an even number of steps so far. Thus, when we reach a target configuration of all-heads or all-tails, it will match the original first coin precisely if there have been an even number of steps so far, which is equivalent to there being an even number of 1 bits in the G_k view, which means that the initial and target configurations differ in an even number of bits. In this case, the initial and target configurations have the same parity of coins equal to their respective first coins; but, in the target configuration, all coins match the first coin, so in particular n has the same parity as the number of coins equal to the first coin. We have thus shown this property to be equivalent to the target configuration matching the initial first coin.
It remains to verify the flipping claims. Flipping the first j coins for even j < k preserves the parity of the number of heads as well as the number of tails, but inverts the first coin, so inverts the destiny. Flipping the first j coins for odd j < k changes the parity of the number of heads as well as the number of tails, and inverts the first coin, which together preserve the destiny. □
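For concreteness, here is a Python sketch (our own, with 1-indexed coins) of the base-case sequence Ḡ_k, together with a brute-force check that each of its 2^k − 1 moves flips exactly k coins and that every starting configuration of n = k + 1 coins is equalized.

```python
def g_moves(n, k):
    # G_k: single-coin Gray-code flips on coins 2..n, never touching coin 1
    if k == 0:
        return []
    return g_moves(n, k - 1) + [n - k + 1] + g_moves(n, k - 1)

def gbar_moves(k):
    # base case n = k + 1: each move flips all coins except the one G_k would flip
    n = k + 1
    return [[c for c in range(1, n + 1) if c != skip] for skip in g_moves(n, k)]

def equalizes(n, moves):
    # brute-force check: from every start, some prefix of `moves` makes all coins equal
    goals = (0, (1 << n) - 1)
    for start in range(1 << n):
        state, done = start, start in goals
        for move in moves:
            if done:
                break
            for c in move:
                state ^= 1 << (c - 1)
            done = state in goals
        if not done:
            return False
    return True

for k in range(1, 5):
    moves = gbar_moves(k)
    assert len(moves) == 2 ** k - 1 and all(len(m) == k for m in moves)
    assert equalizes(k + 1, moves)
```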
With this base case in hand, we complete the induction to conclude Theorem 8. In the non-base case, n ≥ k + 2. There are three cases to consider:
Case 1: Both n and k are odd. By induction, we obtain a solution sequence σ′ of length 2^{n−2} − 1 for n′ = n − 1 satisfying the destiny claims. We view σ′ as acting on only the last n − 1 of our n coins. Then we construct a solution for n as follows:
σ = σ′, flip the first k coins, σ′.
This solution has length |σ| = 2 |σ′| + 1 = 2^{n−1} − 1.
Next we prove that sequence σ solves the trick. Consider any configuration on n coins, and assume by symmetry that the last n − 1 of its coins are destined for heads in σ′. If the first coin is also heads, then the magician arrives at the all-heads configuration within the first σ′ prefix of σ. If the first coin is tails, then the σ′ prefix will not complete the trick, at which point the magician flips the first k coins. This move has the effect of flipping the first coin to heads as well as flipping the first k − 1 coins of the σ′ subproblem, which is destiny-preserving because k − 1 is even and (n − 1)k is even. Therefore, during the second σ′, the magician will arrive at the all-heads configuration.
Now we verify the destiny claims. Note that the destiny of a configuration in σ equals the destiny of the last n − 1 coins in σ′, so we can apply induction almost directly. Flipping the first j coins for even j < k flips the first j − 1 of the last n − 1 coins, which inverts destiny by induction because j − 1 is odd and (n − 1)k is even. Similarly, flipping the first j coins for odd j < k preserves destiny by induction. Finally, flipping coins 2, 3, . . . , k flips the first k − 1 of the last n − 1 coins, which preserves destiny by induction because k − 1 is even.
Case 2: For n even and k odd, by induction we again obtain a solution sequence σ′ of length 2^{n−2} − 1 for n′ = n − 1, viewed as acting on only the last n − 1 coins. We construct a solution for n as follows:
σ = σ′, flip coins 1, 3, 4, . . . , k + 1, σ′.
Again σ has length 2^{n−1} − 1. Flipping coins 1, 3, 4, . . . , k + 1 has the effect of flipping coins 2, 3, . . . , k of the last n − 1 coins, which by induction is destiny-preserving because (n − 1)k is odd. Thus, if the destiny of σ′ does not match the first coin, then it will match the newly flipped first coin during the second σ′. As before, destiny in σ matches destiny in σ′ on the last n − 1 coins. Flipping the first j coins for even j < k flips the first j − 1 of the last n − 1 coins, which preserves destiny by induction because j − 1 is odd and (n − 1)k is odd. Similarly, flipping the first j coins for odd j < k inverts destiny by induction.
Case 3: For n odd and k even, by induction we obtain a solution sequence σ′ of length 2^{n−3} − 1 for n′ = n − 2, which we view as acting on only the last n − 2 coins. Then we construct a solution for n as follows:
σ = σ′, flip the first k coins, σ′, flip coins 1, 3, 4, . . . , k + 1, σ′, flip the first k coins, σ′.
This solution has length |σ| = 4 |σ′| + 3 = 2^{n−1} − 1. Restricting attention to the first two coins, σ first flips both coins, then flips the first coin only, then flips both coins again. Together these enumerate all possibilities for the first two coins. Restricting to the last n − 2 coins, these moves correspond to flipping the first k − 1 coins, coins 2, 3, . . . , k, and again the first k − 1 coins. By induction, all three of these operations preserve destiny because k − 1 and (n − 2)k are odd. Therefore all three executions of σ′ produce the same target configuration (all-heads or all-tails), which will eventually match one of the combinations of the first two coins. Flipping the first j coins for even j < k flips the first j − 2 of the last n − 2 coins, which preserves destiny by induction because j − 2 is even and (n − 1)k is odd. Similarly, flipping the first j coins for odd j < k inverts destiny by induction.
This concludes the inductive proof of Theorem 8.
7 Permuting Coins Between Flips
Our final variation of the coin-flipping magic trick is parameterized by a group G of permutations on {1, 2, . . . , n}. We start with n coins, labeled 1, 2, . . . , n, in initial orientations decided by the spectator. At each step, the blindfolded magician can choose an arbitrary collection of coins to flip. Prior to flipping the coins, the spectator chooses an arbitrary permutation from the permutation group G, and re-arranges the coins according to that permutation. The spectator then flips the coins at the locations specified by the magician. The magician then asks "are the coins all the same?" and the trick ends if the answer is positive.
Whether the magician has a winning strategy depends on the permutation group G. In this section, we will characterize exactly which permutation groups allow the magician to perform such a trick. Our characterization also applies to the superficially easier version of the trick where the spectator flips the coins specified by the magician before permuting the coins, because we consider deterministic strategies.
Our characterization of valid groups turns out to match the existing notion of 2-groups. A group G is a 2-group if the number |G| of its group elements is an exact power of 2. The simplest example of such a group is the cyclic group C_{2^k} of order 2^k, that is, a rotating table of coins with 2^k coins. Another simple example is the dihedral group D_{2^k} of symmetries of a regular 2^k-gon (acting as a permutation group on the vertices), that is, allowing the spectator to confuse left (counterclockwise on the table) from right (clockwise on the table). A more sophisticated example is the iterated wreath product of k copies of the group S_2 of permutations on two elements. This group can be viewed as permutations on the 2^k leaves of a perfect binary tree, generated by reversal of the leaves beneath any internal node in the tree. Of course, we can also obtain a 2-group by taking direct products of 2-groups.
Theorem 10 The magician can successfully perform the n-coin trick with permutation group G if and only if G is a 2-group. In this case, the worst-case optimal solution sequence makes exactly 2^{n−1} − 1 moves.
To prove this theorem, it is convenient to speak in the language of group representations. For a group G and a field F, an n-dimensional F-representation of G is an n-dimensional F-vector space V together with a left action of G on V such that g(v + αw) = gv + α(gw) for all g ∈ G, v, w ∈ V, and α ∈ F. (A left action is a function G × V → V such that (gh)v = g(hv) for all g, h ∈ G and v ∈ V.)
In the context of our magic trick, we have a permutation group G on the coins; call the coins 1, 2, . . . , n for simplicity. The vector space V = (F_2)^n represents all possible configurations of the n coins. We consider the F_2-representation of G on V defined by g(v_1, v_2, . . . , v_n) = (v_{g(1)}, v_{g(2)}, . . . , v_{g(n)}). In other words, a group action g simply permutes the coins. In this algebraic language, we can view one move in the magic trick as follows. Suppose the current configuration of coins is v = (v_1, v_2, . . . , v_n) ∈ V, where v_i is 0 if the ith coin is heads and 1 if it is tails. The blindfolded magician specifies a vector w = (w_1, w_2, . . . , w_n) ∈ V, where w_i is 1 if the magician specifies to flip the ith coin and 0 otherwise. The spectator then picks a permutation g ∈ G, applies that permutation to v, and applies the flips specified by w to g(v). Hence the resulting configuration is g(v) + w. If g(v) + w = (0, 0, . . . , 0) ∈ V (all heads) or g(v) + w = (1, 1, . . . , 1) ∈ V (all tails), then the magician has succeeded.
Our proof of Theorem 10 consists of three lemmas. The first lemma shows that, if G is not a 2-group, then the magician cannot guarantee a successful performance of the trick. Next we define the notion of a G-invariant flag. The second lemma shows that the existence of a G-invariant flag on V implies a winning strategy for the magician. The third lemma establishes that V has a G-invariant flag if G is a 2-group. Together, these three lemmas prove the theorem.
Lemma 11 If G is not a 2-group, then the magician is doomed.
Proof: Suppose G is not a 2-group, i.e., |G| is not a power of 2. Thus there is an odd prime p that divides |G|. By Cauchy's group theorem, there is a permutation g ∈ G of order p, i.e., for which g^p is the smallest power of g that equals the identity permutation. The order of a permutation is the least common multiple of its cycle lengths in its disjoint-cycle decomposition, and p is prime, so there must in fact be a cycle of length p, i.e., a coin i ∈ {1, 2, . . . , n} such that i, g(i), g^2(i), . . . , g^{p−1}(i) are all distinct, while g^p(i) = i. We can assume by renaming some of the coins that this cycle appears among the first p coins: i = 1, g(i) = 2, g^2(i) = 3, . . . , g^{p−1}(i) = p.
We define the set X of "trouble" configurations to consist of configurations in which the first three coins are not all equal, i.e., X = {x ∈ V : (x_1, x_2, x_3) ∉ {(0, 0, 0), (1, 1, 1)}}. The spectator chooses a configuration in X as the initial configuration. We next give a strategy for the spectator that guarantees staying within X, no matter how the magician moves. This strategy then foils the magician, because not all the coins can be equal if the first three coins are never equal.
Consider any trouble configuration x ∈ X and magician move w ∈ V. We need to show that the spectator has a move h ∈ G resulting in configuration h(x) + w ∈ X. Look at the magician's moves for the first three coins: w_1, w_2, w_3. There are eight possibilities for these three bits. We can factor out a symmetry by letting a ∈ {0, 1} be arbitrary and letting b = 1 − a. Then the three bits have four possible patterns: aaa, aab, aba, and abb. The aaa pattern flips none or all of the first three coins, which means they remain not all equal, and thus the configuration remains in X if the spectator chooses the identity permutation (i.e., does not permute the coins). Three patterns remain: aab, aba, and abb.
The cyclic sequence x_1, x_2, . . . , x_p of p bits forming the p-cycle in g consists of an odd number of bits. Because x ∈ X, at least one of these bits is 0 and at least one is 1. Thus both the patterns ccd and eff must occur in the cyclic sequence, where d = 1 − c and f = 1 − e. Now, if w_1, w_2, w_3 has pattern aba or abb, we use the ccd pattern; and if w_1, w_2, w_3 has pattern aab, we use the eff pattern. In either case, say the chosen pattern appears as (x_{k+1}, x_{k+2}, x_{k+3}), where k ∈ {0, 1, . . . , p − 1}. The spectator then chooses h = g^k, so that h(x) puts the pattern in positions 1, 2, 3. Thus h(x) + w sums the two patterns, resulting in aba + ccd = ghh, abb + ccd = ghg, or aab + eff = iji, where the second letter of each result differs from the first. In all cases, h(x) + w ∈ X. □
The next two lemmas use the notion of a G-invariant flag. A subspace W ⊆ V is G-invariant if gv ∈ W for all v ∈ W and all g ∈ G. A flag on V is a chain of subspaces {0} = W_0 ⊂ W_1 ⊂ · · · ⊂ W_{n−1} ⊂ W_n = V where dim(W_i) = i for i = 0, 1, . . . , n. A flag W_0 ⊂ W_1 ⊂ · · · ⊂ W_{n−1} ⊂ W_n is G-invariant if each W_i is G-invariant.
Next we describe the known connection between G-invariant flags and 2-groups:
Lemma 12 [Miy71] If G is a 2-group and W is any F_2-representation of G, then there is a G-invariant flag on W.
Finally we show the connection between G-invariant flags and the magic trick. For simplicity, we show here how to perform a more specific version of the trick: make the coins all heads. This version requires 2^n − 1 moves. A slight modification allows the all-tails case and reduces the number of moves to 2^{n−1} − 1. The characterization of valid groups G remains unaffected. These move bounds are optimal because in particular we are solving the regular n-coin game (with the trivial one-element group G) from Section 2.
Lemma 13 If V has a G-invariant flag, then the magician can make all coins heads in 2^n − 1 moves.
Proof: Suppose W_0 ⊂ W_1 ⊂ · · · ⊂ W_{n−1} ⊂ W_n is a G-invariant flag on V. Choose any element w^{(i)} ∈ W_i \ W_{i−1} for each i = 1, 2, . . . , n. Define the move sequences σ_1, σ_2, . . . , σ_n recursively by σ_0 = ∅ and σ_i = σ_{i−1}, w^{(i)}, σ_{i−1} for i = 1, 2, . . . , n. By a simple induction, σ_i consists of 2^i − 1 moves. The magician's strategy is σ_n, with 2^n − 1 moves.
We prove by induction on i that σ_i brings any initial configuration v ∈ W_i to the all-heads configuration 0. Then, in particular, σ_n brings any v ∈ W_n = V to 0. In the base case, i = 0 and v ∈ W_0 = {0}, so the magician has already won. In the induction step, i > 0, and there are two cases: v ∈ W_{i−1} and v ∈ W_i \ W_{i−1}. If v ∈ W_{i−1}, then by induction the prefix σ_{i−1} of σ_i brings v to 0. Otherwise, we analyze the three parts of σ_i separately. In the prefix σ_{i−1} of σ_i, each move transforms the current configuration v′ into g(v′) + w^{(j)} where 1 ≤ j < i. Because W_i is G-invariant, v′ ∈ W_i implies g(v′) ∈ W_i. Because W_{i−1} is G-invariant, v′ ∉ W_{i−1} implies g(v′) ∉ W_{i−1}. Because w^{(j)} ∈ W_{i−1} for j < i, v′ ∈ W_i \ W_{i−1} implies g(v′) + w^{(j)} ∈ W_i \ W_{i−1}. (In contrapositive, g(v′) + w^{(j)} ∈ W_{i−1} implies (g(v′) + w^{(j)}) − w^{(j)} = g(v′) ∈ W_{i−1}, and by G-invariance v′ ∈ W_{i−1}.) Therefore the configuration remains in W_i \ W_{i−1} throughout the prefix σ_{i−1} of σ_i. Next σ_i takes the resulting configuration v′′ and applies w^{(i)} ∈ W_i \ W_{i−1}, so the resulting configuration v′′ + w^{(i)} drops to W_{i−1}. (A simple counting argument shows that v′′ = x − w^{(i)} for some x ∈ W_{i−1}, and hence v′′ + w^{(i)} = x ∈ W_{i−1}.) Finally, by induction, the second copy of σ_{i−1} brings the configuration to 0. □
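To illustrate Lemma 13 (and Figure 5) concretely, here is a Python sketch for n = 4 coins on a rotating table, so G = C_4. Positions 1-4 stand for north, east, south, and west; the invariant flag and its coset representatives are one hand-picked choice, not the only one. The check tracks, move by move, every configuration the spectator could still be keeping away from the goal.

```python
from itertools import product

N = 4
ROTATIONS = [lambda v, r=r: v[r:] + v[:r] for r in range(N)]    # the cyclic group C_4

def forces(moves, goals):
    # for every start and every adversarial rotation sequence, some prefix reaches a goal
    alive = {v for v in product((0, 1), repeat=N) if v not in goals}
    for w in moves:
        alive = {tuple(x ^ f for x, f in zip(g(v), w)) for v in alive for g in ROTATIONS}
        alive -= goals
    return not alive

# Figure 5: equalize four coins on a rotating table within seven moves.
NS, NE, Nonly = (1, 0, 1, 0), (1, 1, 0, 0), (1, 0, 0, 0)
print(forces([NS, NE, NS, Nonly, NS, NE, NS], {(0,) * N, (1,) * N}))    # True

# Lemma 13: sigma_n built from representatives w[i] of W_i \ W_{i-1} for the C_4-invariant
# flag  span{1111}  <  span{1111, 1010}  <  {even-weight vectors}  <  V.
w = [(1, 1, 1, 1), (1, 0, 1, 0), (1, 1, 0, 0), (1, 0, 0, 0)]
def sigma(i):
    return [] if i == 0 else sigma(i - 1) + [w[i - 1]] + sigma(i - 1)
print(len(sigma(4)), forces(sigma(4), {(0,) * N}))    # 15 True: all heads is forced
```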
Acknowledgments. We thank Patricia Cahn, Joseph O'Rourke, and Gail Parsloe for helpful initial discussions about these problems. We thank the participants of Gathering for Gardner 8 for pointing us to [LR81]; and Noga Alon, Simon Litsyn, and Madhu Sudan for pointing us to [ABCO88].
References
[ABCO88] N. Alon, E. E. Bergmann, D. Coppersmith, and A. M. Odlyzko. Balancing sets of vectors. IEEE Transactions on Information Theory, 34(1), January 1988.
[ES95] Richard Ehrenborg and Chris M. Skinner. The blind bartender's problem. Journal of Combinatorial Theory, Series A, 70(2):249–266, May 1995.
[Etz97] Tuvi Etzion. The depth distribution: a new characterization for linear codes. IEEE Transactions on Information Theory, 43(4):1361–1363, July 1997.
[Ful80] Karl Fulves. The Children's Magic Kit: 16 Easy-to-do Tricks Complete with Cardboard Punchouts. Dover Publications, Inc., New York, 1980.
[Gar91] Martin Gardner. The rotating table and other problems. In Fractal Music, Hypercards and More...: Mathematical Recreations from Scientific American Magazine. W. H. Freeman & Co., October 1991. Based on "Mathematical Games: About rectangling rectangles, parodying Poe and many another pleasing problem", Scientific American, 240(2):16–24, February 1979, and the answers in "Mathematical Games: On altering the past, delaying the future and other ways of tampering with time", Scientific American, 240(3):21–30, March 1979.
[Gra53] F. Gray. Pulse code communication. U.S. Patent 2,632,058, March 1953. http://patft.uspto.gov/netacgi/nph-Parser?patentnumber=2632058.
[Gua98] Dah-Jyh Guan. Generalized Gray codes with applications. Proc. Natl. Sci. Counc., 22:841–848, 1998.
[Joh98] Richard Johnsonbaugh. A discrete intermediate value theorem. The College Mathematics Journal, 29(1):42, January 1998.
[LFW00] Yuan Luo, Fang-Wei Fu, and Victor K.-W. Wei. On the depth distribution of linear codes. IEEE Transactions on Information Theory, 46(6):2197–2203, September 2000.
[LR81] William T. Laaser and Lyle Ramshaw. Probing the rotating table. In D. A. Klarner, editor, The Mathematical Gardner. Wadsworth, Belmont, California, 1981. Republished in 1998 by Dover in Mathematical Recreations: A Collection in Honor of Martin Gardner, pages 285–307.
[LW80] Ted Lewis and Stephen Willard. The rotating table. Mathematics Magazine, 53(3):174–179, May 1980.
[Miy71] Takehito Miyata. Invariants of certain groups I. Nagoya Mathematical Journal, 41:69–73, 1971.
[Roo02] Eric Roode. Coin puzzle. Posting to Philadelphia Perl Mongers mailing list, May 7, 2002. http://lists.netisland.net/archives/phlpm/phlpm-2002/msg00137.html.
[YEM93] Reuven Bar Yehuda, Tuvi Etzion, and Shlomo Moran. Rotating-table games and derivatives of words. Theoretical Computer Science, 108(2):311–329, February 1993.