Guaranteed Scoring Games

The class of Guaranteed Scoring Games (GS) consists of two-player combinatorial games with the property that the Normal-play games (Conway et al.) are order-embedded into GS. They include, as subclasses, the scoring games considered by Milnor (1953), Ettinger (1996) and Johnson (2014). We present the structure of GS and the techniques needed to analyze a sum of guaranteed games. Firstly, GS form a partially ordered monoid, via Right- and Left-stops defined over the reals, with disjunctive sum as the operation. In fact, the structure is a quotient monoid with partially ordered congruence classes. We show that there are four reductions that, when applied in any order, give a unique representative for each congruence class. The monoid is not a group, but in this paper we prove that if a game has an inverse, it is obtained by 'switching the players'. The order relation between two games is defined by comparing their stops in any disjunctive sum. Here, we demonstrate how to compare the games via a finite algorithm instead, extending ideas of Ettinger, and also Siegel (2013).


Introduction
Combinatorial Game Theory (CGT) studies two-player games (the players are called Left and Right) with perfect information and no chance device. A common, almost defining, feature is that these games often decompose into sub-components, and a player is only allowed to move in one of these at each stage of play. This situation is called a disjunctive sum of games. It is also commonplace to allow addition of games with similar and well-defined properties; games in such a family need not have the same rule sets.
The convention we wish to study has the winner as the player with the best score. This convention includes rule sets such as dots-&-boxes, go and mancala. A general, useful theory has been elusive and, to our current knowledge, only four approaches appear in the literature. Milnor [11], see also Hanner [8], considers dicot games (both players have a move from any non-terminal position) with nonnegative incentive. In games with nonnegative incentive, a move never worsens the player's score; that is, zugzwang games, where neither player wishes to move, do not appear. Ettinger [5,6] considers all dicot games. Stewart [15] defines a comprehensive class, but it has few useful algebraic properties. Johnson [9] considers another subclass of dicot games in which, for any position, the lengths of all branches of the game tree have the same parity.
We study the class of Guaranteed Scoring Games, GS, which was introduced in [10]. This class has a partial order relation, ≥, which together with the disjunctive sum operation induces a congruence relation (∼, +). The resulting quotient monoid inherits partially ordered congruence classes, and it is the purpose of this paper to continue the study of these classes. In [10], it was shown that Normal-play games (see Remark 1) can be order-embedded in a natural way and that a positive incentive for games without Right or Left options is an obstacle to the order embedding. It was also demonstrated how to compare games with numbers using waiting moves (images of Normal-play integers) and pass-allowed stops. Intuitively, this class of games has the property that the players want the component games to continue; every game in which at least one player cannot move has non-positive incentive.
Here we show, among other properties of GS, that there is a constructive way to decide the order relation between two games G and H. It requires only G, H and a special type of simplistic games that we call 'waiting moves': games whose sole purpose is to give one of the players an extra number of moves, but with no change in score.
The paper is organized as follows. Section 2 gives the background on GS. Section 2.3 presents results on pass-allowed stops and waiting moves. Section 3.1 proves four reductions that simplify games. Section 3.3 proves that applying these reductions leads to a unique game. The proofs require extending Siegel's 'linked' concept for misère games to scoring games, which is done in Section 3.2. Section 4 shows that the Conjugate Property holds in GS. In Section 5 we give a brief introduction to the Scoring Games Calculator.
Remark 1. Other famous winning conditions in CGT consider who moves last. Normal-play games, in which the first player who cannot move loses, find their origins in the analysis of nim [3]; see also [7,14]. Conway developed the first encompassing theory; see [2,4]. A comprehensive Misère theory, in which the first player who cannot move wins, has not yet been developed, but large strides have been made for impartial games; see [12]. A related winning convention arises in the Maker-Breaker (or Maker-Maker) games, usually played on a graph: one player wishes to create a structure and the opponent wants to stop this (or both want to create a structure), as in hex or generalized tic-tac-toe. See Beck [1] for more details.

Background
For any combinatorial game G (regardless of the winning condition) there are two players who, by convention, are called Left (female) and Right (male). From G, a position that some player can move to (in a single move) is an option of G. The Left options are those to which Left can move and the corresponding set is denoted by G^L. An element of G^L is often denoted by G^L. Similarly, there is a set of Right options denoted by G^R, with a typical game G^R. There is no requirement that G^L and G^R be disjoint. A game can be recursively defined in terms of its options. We will use the representation G = ⟨G^L | G^R⟩ (so as to distinguish these games from Normal-play games, where the convention is {G^L | G^R}). The followers of G are defined recursively: G and all its options are followers of G, and each follower of a follower of G is a follower of G. The proper followers of G are the followers except for G itself. The game tree of a position G then consists of all the followers of G drawn recursively: i.e., the options of a follower H of G are the children of H in the tree.
Positions fall into two kinds: atomic positions, in which at most one player can move, and non-atomic positions, in which both players can move. A position with no Left options is called left-atomic, and one with no Right options is right-atomic. A game with no options at all is called purely-atomic; such games are both left-atomic and right-atomic.
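The atomic/non-atomic distinction maps naturally onto a sum type. As a minimal illustration (ours, not the data type of the Scoring Games Calculator of Section 5), each side of a position is either an atom carrying a score or a non-empty list of options:

    -- Each side of a position: an atom (the score if that player is to move
    -- and that side is empty) or the list of that player's options.
    data Side = Atom Double | Opts [Game]
    data Game = Game { leftSide :: Side, rightSide :: Side }

    -- A purely-atomic game ⟨∅^s | ∅^s⟩ (the number s).
    number :: Double -> Game
    number s = Game (Atom s) (Atom s)

For instance, number 0 represents the game 0 = ⟨∅^0 | ∅^0⟩ discussed below.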

Introduction to Guaranteed Scoring Games
In scoring combinatorial games, the score of a game is determined at the end of the game, that is, when the player to move has no option.
Definition 2 (Game termination). Let G be a left-atomic game. We write G^L = ∅^ℓ, ℓ ∈ ℝ, to indicate that, if Left is to move, the game is over and the score is the real number ℓ. Similarly, if G is right-atomic then G^R = ∅^r, and if it is Right's move then there are no Right options and the score is r ∈ ℝ. Left wins if the score is positive, Right wins if the score is negative, and it is a tie if the score is zero.
Since the game ⟨∅^s | ∅^s⟩ results in a score of s regardless of whose turn it is, we call this game (the number) s. We refer to the adorned empty set, ∅^s, s ∈ ℝ, as an atom or, if needed for specificity, the s-atom. By an atom in a game G, we mean an atom of some atomic follower of G. By an atom in a set of games we mean an atom in one of the games in that set. In the general scoring universe, denoted by S (see also [10,15]), there is no restriction on the form of the atomic games.
Definition 3. A game G ∈ S is guaranteed if, for every follower of G: 1. if the follower is right-atomic with atom ∅^r, then each atom ∅^s in it satisfies s ≤ r; 2. if the follower is left-atomic with atom ∅^ℓ, then each atom ∅^s in it satisfies ℓ ≤ s.

The class of Guaranteed Scoring Games, GS, can be defined directly as the class that contains all guaranteed games. We give an equivalent recursive definition. Definition 4. Let GS_0 be the set of birthday 0 guaranteed games. These are of the form {⟨∅^ℓ | ∅^r⟩ : ℓ, r ∈ ℝ, ℓ ≤ r}. Suppose that G and H are sets of guaranteed games of birthday less than i. The non-atomic games of the form ⟨G | H⟩, together with the atomic games of the forms ⟨∅^ℓ | ∅^r⟩, ⟨∅^ℓ | H⟩ and ⟨G | ∅^r⟩, are the games in GS_i. For i > 0, if G ∈ GS_i \ GS_{i−1}, then G is said to have birthday i and we write b(G) = i.
It follows that GS = ∪_{i≥0} GS_i, with notation as in Definition 4. The birthday of a game corresponds to the depth of its game tree. This stratification into birthdays is very useful for proofs by induction.
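In terms of the Game type sketched above, the birthday is just the height of the game tree. A short sketch (the data declarations are repeated so that the fragment stands alone):

    data Side = Atom Double | Opts [Game]
    data Game = Game { leftSide :: Side, rightSide :: Side }

    -- The birthday b(G): the depth of the game tree; atoms add no depth.
    birthday :: Game -> Int
    birthday (Game l r) = maximum (0 : [1 + birthday g | g <- opts l ++ opts r])
      where opts (Atom _)  = []
            opts (Opts gs) = gs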
A player may be faced with several component games/positions, and if there is at least one in which he can move, then he has an option and the game is not over yet. A move in a disjunctive sum of positions is a move in exactly one of the component positions, and the other ones remain unchanged. It is then the other player's turn to move. We formalize this in the next definition by listing all the possible cases. The distinction between the two uses of +, the disjunctive sum of games and the addition of real numbers, will always be clear from the context. If G = {G_1, . . . , G_m} is a set of games and H is a single game, then G + H = {G_1 + H, . . . , G_m + H} if G is non-empty; otherwise G + H is not defined and will be removed from any list of games.
An intuitively obvious fact is worth highlighting at this point: Left has no move in G + H if and only if Left has no move in either of G and H, and similarly for Right. This motivates the case analysis in the following definition.

Definition 5. Let G, H ∈ S. Then G + H = ⟨(G + H)^L | (G + H)^R⟩, where
(G + H)^L = ∅^{ℓ1+ℓ2}, if G^L = ∅^{ℓ1} and H^L = ∅^{ℓ2}; otherwise (G + H)^L = (G^L + H) ∪ (G + H^L), in which case at least one of G^L and H^L is non-empty;
(G + H)^R = ∅^{r1+r2}, if G^R = ∅^{r1} and H^R = ∅^{r2}; otherwise (G + H)^R = (G^R + H) ∪ (G + H^R), in which case at least one of G^R and H^R is non-empty.
Note that, in the last equality, if there are no Left options in G, then G^L + H gets removed, unless both G^L and H^L are atoms, in which case the first item applies.

Theorem 6. (GS, +) is a commutative monoid: 1. if G, H ∈ GS then G + H ∈ GS; 2. the operation + is commutative; 3. the operation + is associative; 4. the identity of (GS, +) is 0.

Proof. In all cases, the proof is by induction on the sum of the birthdays of the positions.
1. Suppose that G + H is left-atomic. Then both G = ⟨∅^g | G^R⟩ and H = ⟨∅^h | H^R⟩ are left-atomic. Since both games are guaranteed, each s-atom in G satisfies g ≤ s and each t-atom in H satisfies h ≤ t. Therefore g + h ≤ min{s + t}, and so G + H = ⟨∅^{g+h} | (G + H)^R⟩ is also guaranteed; this case includes the possibility that (G + H)^R is the (s + t)-atom. The case where G + H is right-atomic is symmetric. Finally, suppose that both (G + H)^L and (G + H)^R are non-empty sets of games. Both players have moves in G + H that, by induction, are games of GS. So G + H ∈ GS.
2. If the positions are atomic, commutativity is immediate. The other cases are analogous, using induction and the fact that the addition of real numbers is commutative.
3. The base case, where the positions are atomic, is just a consequence of the fact that the addition of real numbers is associative.
The other cases are analogous, using induction and the fact that the addition of real numbers is associative.
4. It follows directly from the definition of the disjunctive sum that G + 0 = 0 + G = G, so the identity of (GS, +) is 0.
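The case analysis of Definition 5 translates directly into code. The following is a hedged sketch (ours, not the SGC's implementation of the sum of games mentioned in Section 5): on each side, two atoms add, and otherwise the mover chooses a component to play in:

    data Side = Atom Double | Opts [Game]
    data Game = Game { leftSide :: Side, rightSide :: Side }

    disjSum :: Game -> Game -> Game
    disjSum g h = Game (side leftSide) (side rightSide)
      where
        -- One side of G + H, for the projection leftSide or rightSide.
        side proj = case (proj g, proj h) of
          (Atom a, Atom b) -> Atom (a + b)        -- both atomic: scores add
          (ga, hb)         -> Opts ([ disjSum g' h | g' <- opts ga ]
                                 ++ [ disjSum g h' | h' <- opts hb ])
        opts (Atom _)  = []
        opts (Opts gs) = gs

When exactly one of the two sides is an atom, that atom contributes no moves, matching the removal convention stated after Definition 5.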
When analyzing games, the following observation, which follows from the definition of the disjunctive sum, is useful for human players.
Observation 7 (Number Translation). Let G ∈ GS and x ∈ ℝ. Since the number x has no options, G + x is the game whose game tree is that of G with each atom ∅^s replaced by ∅^{s+x}.
Next, we give the fundamental definitions for comparing games.
Definition 8. For a game G ∈ GS, the Left-stop is Ls(G) = ℓ if G^L = ∅^ℓ, and Ls(G) = max{Rs(G^L) : G^L ∈ G^L} otherwise; the Right-stop is Rs(G) = r if G^R = ∅^r, and Rs(G) = min{Ls(G^R) : G^R ∈ G^R} otherwise.

Definition 9. Let G, H ∈ GS. Then G ≥ H if, for all X ∈ GS, Ls(G + X) ≥ Ls(H + X) and Rs(G + X) ≥ Rs(H + X). We write G ∼ H if G ≥ H and H ≥ G, and G ≻ H if G ≥ H and G ≁ H.

Theorem 10. The relation ≥ is reflexive and transitive.

Proof. Both assertions follow directly from the definitions and the fact that the reals are totally ordered.
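The stops are directly computable from Definition 8; a sketch in the same style as the earlier fragments:

    data Side = Atom Double | Opts [Game]
    data Game = Game { leftSide :: Side, rightSide :: Side }

    -- Definition 8: Left-stop and Right-stop.
    leftStop, rightStop :: Game -> Double
    leftStop g = case leftSide g of
      Atom l  -> l
      Opts gs -> maximum (map rightStop gs)  -- Left moves to her best option
    rightStop g = case rightSide g of
      Atom r  -> r
      Opts gs -> minimum (map leftStop gs)   -- Right moves to his best option

For example, for G = ⟨∅^1 | 2⟩, leftStop returns 1.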
Theorem 10 shows that the monoid (GS, +) can be regarded as the algebraic structure (GS, +, ≥). The next three results show that (GS, +) modulo ∼ is a quotient monoid, and that in fact (∼, +) is a congruence relation; the additive structure on the equivalence classes of (GS, +) modulo ∼ is inherited from (GS, +). (A natural function from the congruence classes to the outcomes can be obtained via the unique representatives which define the canonical forms, as discussed in Section 3.3.) Lemma 11. Let G, H ∈ GS. If G ≥ H then G + J ≥ H + J for any J ∈ GS.
Proof. Consider any game J ∈ GS. Since G ≥ H, it follows that Ls(G + (J + X)) ≥ Ls(H + (J + X)) for any X ∈ GS. Since the disjunctive sum is associative, this inequality is the same as Ls((G + J) + X) ≥ Ls((H + J) + X). The same argument gives Rs((G + J) + X) ≥ Rs((H + J) + X) and thus, since X is arbitrary, G + J ≥ H + J.

Corollary 12. Let G, H, J, K ∈ GS. If G ≥ J and H ≥ K, then G + H ≥ J + K.

Corollary 13. Let G, X, Y ∈ GS. If X ∼ Y, then G + X ∼ G + Y.

Proof. Since X ∼ Y means X ≥ Y and Y ≥ X, the result follows by applying Corollary 12 twice.
The conjugate of a game G, denoted ↔G, is defined recursively by interchanging the options of Left and Right and negating the scores:
↔G = ⟨∅^{−r} | ∅^{−ℓ}⟩ if G = ⟨∅^ℓ | ∅^r⟩; ↔G = ⟨↔G^R | ∅^{−ℓ}⟩ if G = ⟨∅^ℓ | G^R⟩; ↔G = ⟨∅^{−r} | ↔G^L⟩ if G = ⟨G^L | ∅^r⟩; and ↔G = ⟨↔G^R | ↔G^L⟩ otherwise, where ↔G = {↔G′ : G′ ∈ G} for a set of games G. It is easy to see that if a game is guaranteed, then so is its conjugate. As mentioned earlier, conjugation amounts to interchanging Left and Right. The next two results will be useful in proving the Conjugate Property in Section 4.
Lemma 14. Let G, H ∈ GS. If G ≻ 0 and H ≥ 0, then G + H ≻ 0.

Proof. By Corollary 12, we already know that G + H ≥ 0. So, it is enough to show that G + H ≁ 0. Since G ≻ 0, without loss of generality we may assume that there is some X ∈ GS with Ls(G + X) > Ls(X). Since H ≥ 0, we get Ls(G + H + X) ≥ Ls(G + X) > Ls(X), and hence G + H ≁ 0.

Relation between Normal-play and Guaranteed Games
One of the main results in [10] is that Normal-play games are order-embedded in GS.
Definition 16. For a Normal-play game G, let Ĝ be the scoring game obtained by replacing each empty set, ∅, in G by the atom ∅^0. This operation retains the game tree structure. For example, the leaves of a Normal-play game tree are labelled 0 = {∅ | ∅}, which is replaced by 0̂ = ⟨∅^0 | ∅^0⟩ for the scoring game.

Theorem 17 ([10]). The map G ↦ Ĝ is an order embedding; that is, G ≥ H in Normal-play if and only if Ĝ ≥ Ĥ in guaranteed games.

Let n be an integer. The games n̂ are called waiting moves. For example, 0̂ = ⟨∅^0 | ∅^0⟩ = 0, 1̂ = ⟨0̂ | ∅^0⟩ and 2̂ = ⟨1̂ | ∅^0⟩. Regardless of play, the score of a waiting move is 0, but in a game G + 1̂, Left has the ability to force Right to play consecutive moves in the G component.
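Waiting moves are easy to construct explicitly; a sketch with the same Game type (n̂ consists of n consecutive Left moves, with every score 0):

    data Side = Atom Double | Opts [Game]
    data Game = Game { leftSide :: Side, rightSide :: Side }

    -- The image n̂ of a nonnegative Normal-play integer: n Left waiting moves.
    hatN :: Int -> Game
    hatN 0 = Game (Atom 0) (Atom 0)
    hatN n = Game (Opts [hatN (n - 1)]) (Atom 0)

The conjugate −n̂, which gives the waiting moves to Right instead, is obtained by swapping the two sides.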
The ability to pass may appear to be beneficial for a player. This is true in GS but not necessarily in the general universe of scoring games. For example, let G = ⟨∅^1 | ⟨∅^{−9} | ∅^9⟩⟩ and note that G ∉ GS. Clearly Left wins playing first. In G + 1̂, Left has no move in G and she must play her waiting move, 1̂. Right then plays to ⟨∅^{−9} | ∅^9⟩. Now Left has no move and the score is −9, a Right win.
There are useful inequalities relating Normal-play and Scoring games.
Definition 18. Let G ∈ GS, and let G_x be as G, but with each atom replaced by ∅^x. Let max(G) = max{s : ∅^s is an atom in G} and min(G) = min{s : ∅^s is an atom in G}.
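The bounds max(G), min(G) and the substitution G_x are one-pass tree traversals; a sketch:

    data Side = Atom Double | Opts [Game]
    data Game = Game { leftSide :: Side, rightSide :: Side }

    -- All atoms occurring in G; max(G) and min(G) are its extremes.
    atoms :: Game -> [Double]
    atoms (Game l r) = side l ++ side r
      where side (Atom a)  = [a]
            side (Opts gs) = concatMap atoms gs

    -- G_x: the game G with every atom replaced by ∅^x.
    replaceAtoms :: Double -> Game -> Game
    replaceAtoms x (Game l r) = Game (side l) (side r)
      where side (Atom _)  = Atom x
            side (Opts gs) = Opts (map (replaceAtoms x) gs)

Then max(G) = maximum (atoms g) and min(G) = minimum (atoms g).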
Theorem 19. Let G ∈ GS and let N = b(G). Then:
1. G_{min(G)} ≤ G ≤ G_{max(G)};
2. min(G) − N̂ ≤ G ≤ max(G) + N̂;
3. G − n̂ ≤ G ≤ G + n̂ for every integer n ≥ 0.

Proof. For part 1, for any X, we establish the inequalities Ls(G_{min(G)} + X) ≤ Ls(G + X) and Rs(G_{min(G)} + X) ≤ Rs(G + X). First, if the game G + X is purely atomic, then so is G_{min(G)} + X, and the inequalities are trivial, given Definition 18. Consider the game (G_{min(G)} + X)^L, obtained after an optimal move by Left. Ignoring the scores, Left can make exactly the same move in the game G + X, to say (G + X)^L. Because we maintain an identical tree structure in the respective games, by induction we get
Ls(G_{min(G)} + X) = Rs((G_{min(G)} + X)^L) ≤ Rs((G + X)^L) ≤ Ls(G + X).
To prove the inequality for the Right-stops, we consider the game (G + X)^R, obtained after an optimal move by Right. Ignoring the scores, Right can make exactly the same move in the game G_{min(G)} + X, to say (G_{min(G)} + X)^R. Therefore, by induction,
Rs(G_{min(G)} + X) ≤ Ls((G_{min(G)} + X)^R) ≤ Ls((G + X)^R) = Rs(G + X).
For part 2, it suffices to prove that min(G) − N̂ ≤ G (the proof of the second inequality is similar). It is easy to see that N̂ − N̂ ∼ 0. Therefore, it suffices to prove that min(G) ≤ G + N̂, which holds if and only if min(G) ≤ Ls(G + N̂), and the latter is easy to see. Part 3 follows by the definition of waiting moves.

Pass-allowed stops and Waiting moves
The following three points about the stops are immediate from the definitions but we state them explicitly since they will appear in many proofs.
Observation 20. Given a game G ∈ GS, (i) Ls(G) ≥ Rs(G^L) for all G^L, and there is some G^L for which Ls(G) = Rs(G^L); (ii) Rs(G) ≤ Ls(G^R) for all G^R, and there is some G^R for which Rs(G) = Ls(G^R); (iii) Ls(G + s) = Ls(G) + s for any number s.
The next result indicates that we only need to consider one of Ls and Rs for game comparison in GS. However, in the sequel, the proofs that use induction on the birthdays need the inequalities for both the Left- and Right-stops, because we must consider games with a fixed birthday. However, Theorem 21 enables a simple proof of Lemma 42.

Theorem 21. Let G, H ∈ GS. Then G ≥ H if and only if Rs(G + X) ≥ Rs(H + X) for all X ∈ GS.

Proof. The proof depends on the following result.

Claim 1: If Ls(G + X) < Ls(H + X) for some X ∈ GS, then there is a game Y ∈ GS, with X as a Right option, such that
Rs(G + Y) ≤ Ls(G + X) < Ls(H + X) ≤ Rs(H + Y).
Here the middle inequality follows from the assumption about X and, since X is a Right option of Y, the first inequality follows from Observation 20 (ii); the last inequality holds by the choice of the remaining options of Y.

End of the proof of Claim 1.
In the next definition, "pass-allowed" means that one player has an arbitrary number of waiting moves available in another component. The 'overline' indicates that Left can pass and the 'underline' that Right can pass; thus L̄s(G) and R̄s(G) denote the pass-allowed stops when Left may pass, and L̲s(G) and R̲s(G) those when Right may pass. Note that, in L̄s(G), Left can even start by passing.
Since Left begins, Right does not require more than b(G) waiting moves before Left has run out of moves in G. Hence L̲s(G) = Ls(G − n̂) for every n ≥ b(G). This proves the first claim, and the claim for the Right-stop is analogous. In the global game G + H, suppose that Right responds in H to Left's first move in G; then, for the rest of the game, Left can copy each local move in the global setting and has enough waiting moves to achieve a score of L̲s(G) + Rs(H). Since she has other strategies as well, we have L̲s(G) + Rs(H) ≤ Ls(G + H). The other inequality is proved analogously.
The results for the rest of the paper are sometimes stated only for Left. The proofs for Right are the same with the roles of Left and Right interchanged.
Definition 26. Let s ∈ ℝ and G ∈ GS. The game G is left-s-protected if Ls(G) ≥ s and, for all G^R, there exists G^RL such that G^RL is left-s-protected. Similarly, G is right-s-protected if Rs(G) ≤ s and, for all G^L, there exists G^LR such that G^LR is right-s-protected.
In [10] we prove a necessary and sufficient condition for a game to be greater than or equal to a number.
Theorem 27 (A Generalized Ettinger's Theorem [10]). Let s ∈ ℝ and G ∈ GS. Then G ≥ s if and only if G is left-s-protected.

Reductions and Canonical Form
The reduction results, Theorems 30, 32, and 34, give conditions under which the options of a game can be modified to result in a game in the same equivalence class. In all cases, it is easy to check that the new game is also in GS. Theorem 35 requires an explicit check that the modified game is a guaranteed game. In Normal-play games, the reduction procedures result in a unique game, which also has minimum birthday, called the 'canonical form'. Johnson notes that, both for the scoring games he studied and for those studied by Ettinger, there may be many equivalent games of minimum birthday. The same is true for guaranteed games. However, Theorem 35 gives a reduction that, while it does not necessarily reduce the birthday, does lead to a unique reduced game.
The results in this section will often involve showing that G ≥ H or G ∼ H for games G, H which have the same Right options and differ only slightly in their Left options. Strategically, one would expect that only the non-common Left options need to be considered in inductive proofs, that is, the positions of (G^L \ H^L) ∪ (H^L \ G^L). The next lemma shows that this is true.
Lemma 28. Let F and K be guaranteed games with the same sets of Right options and, in case this set is empty, with identical right-atoms. Let X be a guaranteed game.

1. If Ls(F + X^R) = Ls(K + X^R) for all X^R ∈ X^R, then Rs(F + X) = Rs(K + X).

2. If Rs(F + X^L) ≤ Rs(K + X^L) for all X^L ∈ X^L, and Rs(F^L + X) = Ls(F + X) for some F^L ∈ F^L ∩ K^L, then Ls(F + X) ≤ Ls(K + X).
Proof. Part 1: We prove the '≥' inequality; '≤' then follows by symmetry. If Right's best move in F + X is in the X component, then Rs(F + X) = Ls(F + X^R) = Ls(K + X^R) ≥ min{Ls((K + X)^R)} = Rs(K + X). Otherwise, if Right's best move is in the F component, then he achieves a score at least as good in K + X by mimicking. If there are no Right options in F + X, then neither are there any in K + X; then, by assumption, the right-atom in F + X is identical to the right-atom in K + X, and hence the Right-stops are identical.
The proof of part 2 is very similar to that of part 1, since the respective Right-stops are obtained via a common option.
For example, in part 2 of Lemma 28, if Rs(F^L + X) = Ls(F + X) for some F^L ∈ F^L \ K^L, then the inequality Ls(F + X) ≤ Ls(K + X) does not follow directly. As we will see later in this section, when it holds, it is by some other property of the games F and K.
The next result re-affirms that, provided a player has at least one option, adding another option cannot do any harm. This is not true if the player has no options. For example, consider G = ⟨∅^1 | 2⟩, and add the Left option −1 to G to obtain H = ⟨−1 | 2⟩. Since Ls(G) = 1 and Ls(H) = −1, we have H ≱ G.
Lemma 29 (Monotone principle). Let G ∈ GS with G^L non-empty, and let A be a game such that H = ⟨G^L ∪ {A} | G^R⟩ is in GS. Then H ≥ G.

Proof. The proof is clear, since Left never has to use the new option.

Reductions
We first consider the most straightforward reduction, that of removing dominated options. For this to be possible we require at least two Left options.

Theorem 30 (Domination). Let G ∈ GS and suppose that A, B ∈ G^L with A ≤ B. Then G ∼ H, where H = ⟨G^L \ {A} | G^R⟩.
Proof. Note that H ∈ GS, because H is not atomic (at least B is a Left option) and G ∈ GS. By the monotone principle, Lemma 29, G ≥ H. Therefore we only have to prove that H ≥ G. For this, we need to show that Ls(H + X) ≥ Ls(G + X) and Rs(H + X) ≥ Rs(G + X) for all X. We proceed by induction on the birthday of X. Fix X ∈ GS. By induction, for each X^R ∈ X^R, we know that Ls(H + X^R) ≥ Ls(G + X^R) and hence, since G ≥ H, that Ls(H + X^R) = Ls(G + X^R). Thus, from Lemma 28(1), it follows that Rs(H + X) = Rs(G + X).
Now consider the Left-stops. By induction, for each X^L ∈ X^L, we know that Rs(H + X^L) ≥ Rs(G + X^L), that is, the first condition of Lemma 28(2) is satisfied (with F = G and K = H). By assumption, the only non-common option is A ∈ G^L \ H^L. Therefore, by Lemma 28(2), it suffices to study the case Ls(G + X) = Rs(A + X). Since A ≤ B, we get Ls(H + X) ≥ Rs(B + X) ≥ Rs(A + X) = Ls(G + X). Hence H ≥ G, and so H ∼ G.
We remind the reader that while we only define the following concepts from Left's perspective, the corresponding Right concepts are defined analogously.
Definition 31. For a game G, suppose there are followers A ∈ G^L and B ∈ A^R with B ≤ G. Then the Left option A is reversible and, sometimes, to be specific, A is said to be reversible through its Right option B. In addition, B is called a reversing option for A and, if B^L is non-empty, then B^L is a replacement set for A. In this case, A is said to be non-atomic-reversible. If the reversing option is left-atomic, that is, if B^L = ∅^ℓ, then A is said to be atomic-reversible.
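Detecting reversibility requires deciding the order relation, for which Section 3.2 develops a finite procedure. Treating that comparison as a parameter, a small sketch:

    data Side = Atom Double | Opts [Game]
    data Game = Game { leftSide :: Side, rightSide :: Side }

    opts :: Side -> [Game]
    opts (Atom _)  = []
    opts (Opts gs) = gs

    -- The reversing options of the Left option a of g (Definition 31); the
    -- oracle geq, deciding >= in GS, is assumed to be supplied by the caller.
    reversingOptions :: (Game -> Game -> Bool) -> Game -> Game -> [Game]
    reversingOptions geq g a = [ b | b <- opts (rightSide a), g `geq` b ]

The option a is atomic-reversible precisely when some returned b is left-atomic.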
If Left were to play a reversible option, then Right has a move that retains or improves his situation. Indeed, this is the basis for the second reduction. In Normal-play games, bypassing a reversible option means replacing it by its replacement set, even if the replacement set is empty. This results in a simpler game equal to the original. In GS, there are more cases to consider. We begin by showing that, if the replacement set is non-empty, then bypassing a reversible option does result in a new but equal game. In Theorem 34, we then treat the case of an atomic-reversible option.
Theorem 32 (Reversibility 1). Let G ∈ GS and suppose that A is a Left option of G, reversible through B ∈ A^R, where the replacement set B^L is non-empty. Then G ∼ H, where H = ⟨(G^L \ {A}) ∪ B^L | G^R⟩.

Proof. Consider G, A, B as in the statement of the theorem and recall that, since B is a reversing Right option, G ≥ B. Moreover, there is a replacement set B^L, so we let H = ⟨(G^L \ {A}) ∪ B^L | G^R⟩. We need to prove that H ∼ G, i.e., Ls(G + X) = Ls(H + X) and Rs(G + X) = Rs(H + X) for all X. We proceed by induction on the birthday of X.
Fix X. Note that B^L, G^L and H^L are non-empty, so that B + X, G + X and H + X all have Left options. Moreover, A + X has Right options.
For the Right-stops: by induction we have that Ls(G + X^R) = Ls(H + X^R) for any X^R ∈ X^R. Thus, by Lemma 28(1), we have Rs(G + X) = Rs(H + X).
For the Left-stops, and within the induction, we first prove a necessary inequality.

Claim 1: Ls(H + X) ≥ Ls(B + X) and Rs(H + X) ≥ Rs(B + X).

For the Left-stops: if Left's best move in B + X is to some B^L + X then, since B^L ⊆ H^L, Ls(H + X) ≥ Rs(B^L + X) = Ls(B + X); if it is to B + X^L then, by induction and G ≥ B, Ls(H + X) ≥ Rs(H + X^L) = Rs(G + X^L) ≥ Rs(B + X^L) = Ls(B + X). For the Right-stops: by the argument before the claim, Rs(H + X) = Rs(G + X). Since G ≥ B, then Rs(G + X) ≥ Rs(B + X) and thus Rs(H + X) ≥ Rs(B + X). This concludes the proof of Claim 1.
By induction we have that Rs(G + X^L) = Rs(H + X^L) for any X^L ∈ X^L, which gives the first assumption of Lemma 28(2). It remains to consider the cases where the second assumption does not hold.
First, we consider Ls(G + X). By Lemma 28(2), the remaining case to consider is Ls(G + X) = Rs(A + X). Since B ∈ A^R, we have Rs(A + X) ≤ Ls(B + X). By Claim 1, we know that Ls(H + X) ≥ Ls(B + X). By combining these inequalities we obtain Ls(G + X) ≤ Ls(H + X).
Secondly, we consider Ls(H + X). The only possibly non-common option is C ∈ B^L, with C ∈ H^L \ G^L, where, by Lemma 28(2), we may assume that Ls(H + X) = Rs(C + X). Moreover, G ≥ B, and thus Ls(H + X) = Rs(C + X) ≤ Ls(B + X) ≤ Ls(G + X).
For the next reduction theorem, there is no replacement set, because the reversing option is left-atomic. We first prove a strategic fact about atomic-reversible options: nobody wants to play to one!

Lemma 33 (Weak Avoidance Property). Let G ∈ GS and let A be an atomic-reversible Left option of G. For any game X, if X^L ≠ ∅ then there is an X^L such that Rs(A + X) ≤ Rs(G + X^L).
Proof. Let A be an atomic-reversible Left option of G and let B ∈ A^R be a reversing option for A. Assume that X has a Left option.

By definition, G ≥ B and B = ⟨∅^ℓ | B^R⟩. Since B is left-atomic and X^L ≠ ∅, Left's moves in B + X are in the X component. Consequently,
Rs(A + X) ≤ Ls(B + X) = Rs(B + X^L), for some X^L,
where the first inequality holds because B is a Right option of A. Finally, since G ≥ B, we get Rs(B + X^L) ≤ Rs(G + X^L), which proves the lemma.

The next reduction is about replacing a Left atomic-reversible option A in a game G. There are two cases. If Left has a 'good' move other than A, then A can be eliminated. Otherwise, we can only simplify A.
Theorem 34 (Atomic Reversibility). Let G ∈ GS and suppose that A ∈ G^L is reversible through B = ⟨∅^ℓ | B^R⟩.

1. If Ls(G) = Rs(C) for some C ∈ G^L \ {A}, then G ∼ H, where H = ⟨G^L \ {A} | G^R⟩.

2. Otherwise, G ∼ H, where H = ⟨(G^L \ {A}) ∪ {⟨∅^ℓ | B⟩} | G^R⟩.
Proof. Let A ∈ G^L and B ∈ A^R be as in the statement of the theorem, with G ≥ B. First an observation: let n be the birthday of G; since B is a proper follower of G, the birthday of B is less than n. Since G ≥ B, from Lemma 23 we have

Claim 1: Rs(A) ≤ ℓ ≤ Ls(G).

The proof of the equality in both parts will proceed by induction on the birthday of X. Again, in both parts, let H be the game that we wish to show is equal to G. We have, by induction, that Ls(G + X^R) = Ls(H + X^R) and, since G^R = H^R, from Lemma 28(1) it then follows that Rs(G + X) = Rs(H + X).
It remains to show that Ls(G + X) = Ls(H + X) in both parts.

Part 1.
The assumption is that there exists C ∈ G^L \ {A} with Ls(G) = Rs(C).
Note that both G + X and H + X have Left options, since C is in both G^L and H^L. From Lemma 29 we have G ≥ H, and thus it remains to show that Ls(H + X) ≥ Ls(G + X).
By Lemma 28(2), we need only consider the case Ls(G + X) = Rs(A + X). Note that X must be left-atomic; otherwise, by Lemma 33, there would exist X^L ∈ X^L with Rs(A + X) ≤ Rs(G + X^L), and then, by induction, Ls(G + X) = Rs(A + X) ≤ Rs(H + X^L) ≤ Ls(H + X). Therefore, we may assume that X = ⟨∅^x | X^R⟩. In this case, since C ≠ A is the best pass-allowed Left move in G, this is also true for H. We now have the string of inequalities
Ls(H + X) ≥ Ls(H) + x = Ls(G) + x ≥ ℓ + x,
where the first inequalities are from Corollary 25, and the last inequality is by Claim 1. Since B is a Right option of A, we also have that
Ls(G + X) = Rs(A + X) ≤ Ls(B + X) = ℓ + x.
Thus Ls(G + X) ≤ Ls(H + X), and this completes the proof of part 1 of the theorem.

Part 2. In this case, the Right's-pass-allowed Left-stop of G is obtained only through A.
Recall that it only remains to show that Ls(G + X) = Ls(H + X), and that, by Lemma 28, we only need to consider the non-common options in the respective games.
Secondly, suppose that Ls(G + X) = Rs(A + X). Note that if X has a Left option then, by Lemma 33, there exists some X^L ∈ X^L such that Ls(G + X) = Rs(G + X^L). By induction, then, Rs(G + X^L) = Rs(H + X^L) ≤ Ls(H + X). Therefore, we may assume that X = ⟨∅^x | X^R⟩. Since B is a Right option of A, we have the string of inequalities
Ls(G + X) = Rs(A + X) ≤ Ls(B + X) = ℓ + x.
To show that Ls(H + X) ≥ ℓ + x, we note that it suffices for Left to move in the H component to ⟨∅^ℓ | B⟩ ∈ H^L, since all scores in B = ⟨∅^ℓ | B^R⟩ are at least ℓ. Thus, by Lemma 28(2), we now have Ls(G + X) ≤ Ls(H + X).
From this, together with the conclusion of the previous paragraph, we have Ls(G + X) = Ls(H + X).
Suppose that G ∈ GS has an atomic-reversible option A ∈ G^L, with the reversing option B = ⟨∅^ℓ | B^R⟩. Given the reduction in Theorem 34(2), a remaining problem of atomic reversibility is to find a simplest substitution for B. In Section 3.3, we will show that the following result solves this problem.

Theorem 35. Let G ∈ GS and let A ∈ G^L be atomic-reversible, with reversing option B = ⟨∅^ℓ | B^R⟩.
1. There exists a smallest nonnegative integer n such that G ≥ ℓ − n̂ and, if G^L \ {A} is non-empty, then G ∼ ⟨ℓ − (n+1)̂, G^L \ {A} | G^R⟩.
2. If G^L = {A}, then G ∼ ⟨∅^ℓ | G^R⟩, provided that this game is guaranteed.

Proof. Case 1: Let m = b(B). By assumption G ≥ B and, by Theorem 19(2), B ≥ ℓ − m̂, and thus G ≥ ℓ − m̂. Since m is a nonnegative integer, the existence part is clear. Let n be the minimum nonnegative integer such that G ≥ ℓ − n̂.
Let K = ℓ − (n+1)̂ which, upon expanding, is ⟨∅^ℓ | ℓ − n̂⟩; let H = ⟨K, G^L \ {A} | G^R⟩; and let G′ = ⟨K, G^L | G^R⟩. By Lemma 29 and the definition of n, we have G′ ≥ G ≥ ℓ − n̂. Hence ℓ − n̂ is a reversing game in both G and G′, and both A and K are atomic-reversible Left options in G′.
Since G satisfies part 2 of Theorem 34, Claim 1 in Theorem 34 can be strengthened; hence ℓ = Ls(G) = Rs(A). We also have that Rs(K) = ℓ. It is now easy to see that Ls(G′) = ℓ. Thus we have two atomic-reversible Left options in G′, and so we can apply part 1 of Theorem 34. We get that G′ ∼ G, since K is an atomic-reversible Left option in G′. Moreover, G′ ∼ H, since A is also atomic-reversible. This finishes the proof of Case 1.
Case 2: This is the case where G^L = {A}. We put H = ⟨∅^ℓ | G^R⟩ ∈ GS. To prove G ∼ H, we proceed by induction on the birthday of the distinguishing game X.
From Lemma 28(1) and induction, we have that Rs(G + X) = Rs(H + X) for any X ∈ GS.
For the Left-stops, from Case 1, we know that G ∼ ⟨ℓ − (n+1)̂ | G^R⟩. Therefore, in the case X = ⟨∅^x | ∅^y⟩, it is easy to see that Ls(H + X) = ℓ + x ≥ Ls(G + X), since y ≥ x. Moreover, we also have Ls(G + X) = Rs(A + X) ≤ Ls(B + X) = ℓ + x, which thus proves equality.
If X^L = ∅^x and X^R ≠ ∅, then Ls(G + X) = Rs(ℓ − (n+1)̂ + X), and it is clear that Right can obtain the score ℓ + x by playing to ℓ − n̂ + X. Since both games are left-atomic and in GS, Rs(ℓ − (n+1)̂ + X) ≥ ℓ + x, so, in fact, equality holds. Hence, in this case, we get Ls(G + X) = ℓ + x = Ls(H + X).
If X^L ≠ ∅ then, by Lemma 33 (weak avoidance), there is some X^L such that Rs(A + X) ≤ Rs(G + X^L). Therefore, Ls(G + X) = max{Rs(G + X^L) : X^L ∈ X^L}. Also, Ls(H + X) = max{Rs(H + X^L) : X^L ∈ X^L}, since there is no Left move in H. By induction, Rs(H + X^L) = Rs(G + X^L) and, consequently, Ls(G + X) = Ls(H + X). In the reduced forms above, ℓ is a real number and n ≥ 0 is an integer (as given in Theorem 35), providing a number of waiting moves for Right. We have the following definition.
Definition 36. A game G ∈ GS is said to be reduced if none of Theorems 30, 32, 34, or 35 can be applied to G to obtain an equivalent game with different sets of options.
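The substitute option of Theorem 35 is easy to build explicitly; a sketch (ℓ − n̂ consists of n Right waiting moves with every score ℓ, and the substitute is K = ⟨∅^ℓ | ℓ − n̂⟩):

    data Side = Atom Double | Opts [Game]
    data Game = Game { leftSide :: Side, rightSide :: Side }

    -- The game ℓ − n̂: n consecutive Right waiting moves, all scores ℓ.
    waitR :: Double -> Int -> Game
    waitR l 0 = Game (Atom l) (Atom l)
    waitR l n = Game (Atom l) (Opts [waitR l (n - 1)])

    -- Theorem 35's substitute K = ℓ − (n+1)̂ = ⟨∅^ℓ | ℓ − n̂⟩.
    substituteK :: Double -> Int -> Game
    substituteK l n = Game (Atom l) (Opts [waitR l n])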

Constructive Game Comparison
We wish to prove that, for a given guaranteed scoring game, there is one unique reduced game representing the full congruence class: a canonical form. To this purpose, in this subsection we first develop another major tool (also to be used in Section 4), constructive game comparison. The existence of a canonical form is far from obvious, as the order of the reductions can vary. In Normal-play, the proof of uniqueness uses the fact that if G ∼ H then G − H ∼ 0. However, in (guaranteed) scoring play, G ∼ H does not imply G + ↔H ∼ 0. We use an idea, 'linked', adapted from Siegel [13], which only uses the partial order. To fully adapt it for guaranteed games, we require a generalization of Theorem 27 (which in its turn is a generalization of Ettinger's [6] theorem for dicot games).

Recall that the conjugate ↔G is obtained by applying the conjugate to the respective options and negating the atoms; if, for example, G^R = ∅^r, then ↔G = ⟨∅^{−r} | ↔G^L⟩.

Definition 37. Let G ∈ GS and let m(G) = max{|t| : ∅^t is an atom in G}. Let r, s be two nonnegative real numbers. The (r, s)-adjoint of G (or just adjoint) is
G•_{r,s} = ↔G + ⟨∅^{−m(G)−r−1} | ∅^{m(G)+s+1}⟩.
Since −m(G) − r − 1 ≤ m(G) + s + 1, it follows that G•_{r,s} ∈ GS.

Theorem 38. Given G ∈ GS and two nonnegative real numbers r, s, then Ls(G + G•_{r,s}) < −r and Rs(G + G•_{r,s}) > s.
Proof. In the game G + ↔G + ⟨∅^{−m(G)−r−1} | ∅^{m(G)+s+1}⟩, the second player can mirror each move in the G + ↔G component, and there are no other moves since the remaining component is purely atomic. Therefore,
Ls(G + G•_{r,s}) ≤ m(G) − m(G) − r − 1 = −r − 1 < −r.
The bound for the Right-stop is obtained similarly.
Observation 39. If r = s = 0 in Definition 37, then Theorem 38 gives the particular case Ls(G + G•_{0,0}) < 0 and Rs(G + G•_{0,0}) > 0. This will suffice in the proof of Lemma 43 below. Thus we will use the somewhat simpler notation G• for the (0, 0)-adjoint of G.
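Both the conjugate and the adjoint are direct tree transformations; a self-contained sketch (disjSum as in Section 2):

    data Side = Atom Double | Opts [Game]
    data Game = Game { leftSide :: Side, rightSide :: Side }

    -- The conjugate ↔G: swap the players and negate every score.
    conj :: Game -> Game
    conj (Game l r) = Game (swap r) (swap l)
      where swap (Atom a)  = Atom (negate a)
            swap (Opts gs) = Opts (map conj gs)

    -- m(G) = max{|t| : ∅^t an atom in G}.
    mOf :: Game -> Double
    mOf (Game l r) = maximum (side l ++ side r)
      where side (Atom a)  = [abs a]
            side (Opts gs) = map mOf gs

    disjSum :: Game -> Game -> Game
    disjSum g h = Game (side leftSide) (side rightSide)
      where side proj = case (proj g, proj h) of
              (Atom a, Atom b) -> Atom (a + b)
              (ga, hb)         -> Opts ([ disjSum g' h | g' <- opts ga ]
                                     ++ [ disjSum g h' | h' <- opts hb ])
            opts (Atom _)  = []
            opts (Opts gs) = gs

    -- The (r,s)-adjoint of Definition 37.
    adjoint :: Double -> Double -> Game -> Game
    adjoint r s g = disjSum (conj g)
                            (Game (Atom (negate m - r - 1)) (Atom (m + s + 1)))
      where m = mOf g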
Definition 40. Let G, H ∈ GS. We say that H is linked to G (by T ) if there exists some T ∈ GS such that Ls(H + T ) < 0 < Rs(G + T ).
Note that, if H is linked to G, it is not necessarily true that G is linked to H.

Lemma 41. Let G, H ∈ GS. If G ≥ H, then G is not linked to any H^L ∈ H^L, and no G^R ∈ G^R is linked to H.

Lemma 42. Let G, H ∈ GS and suppose that G ≱ H. Then:
1. there exists X ∈ GS such that Ls(G + X) < 0 < Ls(H + X);
2. there exists Y ∈ GS such that Rs(G + Y) < 0 < Rs(H + Y).
Proof. By assumption, there exists X such that Ls(G + X) < Ls(H + X), or there exists Y such that Rs(G + Y) < Rs(H + Y). By Theorem 21 (the claim in its proof), violations of both kinds exist. Suppose that there exists Z such that α = Ls(G + Z) < Ls(H + Z) = β. Let X = Z − (α + β)/2. By Observation 7, Ls(G + X) = α − (α + β)/2 < 0 < β − (α + β)/2 = Ls(H + X). The argument for the Right-stops is analogous.

Lemma 43. Let G, H ∈ GS. Then G is linked to H if and only if there is no G^L ∈ G^L with G^L ≥ H and no H^R ∈ H^R with G ≥ H^R.

Proof (sketch of the construction). Suppose that no G^L ∈ G^L satisfies G^L ≥ H and no H^R ∈ H^R satisfies G ≥ H^R. Write G^L = {G^{L1}, . . . , G^{Lk}} and H^R = {H^{R1}, . . . , H^{Rℓ}}, including the case that either or both are atoms. By Lemma 42, for each i, 1 ≤ i ≤ k, we can define X_i such that Ls(G^{Li} + X_i) < 0 < Ls(H + X_i), and, for each j, 1 ≤ j ≤ ℓ, we can define Y_j such that Rs(G + Y_j) < 0 < Rs(H^{Rj} + Y_j). Here G^R• denotes the set of (0, 0)-adjoints of the games in G^R. A linking game T is now built with the games Y_j, together with suitable adjoints, as Left options, and the games X_i, together with suitable adjoints, as Right options. Consider next G + T with T^L in the first line of the definition. We get that Ls(G + T) < 0 because either Left moves to some G^{Li} + T and Right answers with T → X_i, or Left moves to some G + Y_j; the last case follows because there are Left options in G, so the second line of the definition of T^R applies. In every case, Ls(G + T) < 0. The argument for Rs(H + T) > 0 is analogous. Therefore, Ls(G + T) < 0 < Rs(H + T) and G is linked to H by T.
In the following result, we extend Theorem 27 by using the linked results. From an algorithmic point of view, when comparing games G and H, it ultimately removes the need to consider G + X and H + X for all X.

Theorem 44. Let G, H ∈ GS. Then G ≥ H if and only if the following three conditions hold:
1. Ls(G) ≥ Ls(H) and Rs(G) ≥ Rs(H), including the pass-allowed versions of these inequalities;
2. for all H^L ∈ H^L, G is not linked to H^L;
3. for all G^R ∈ G^R, G^R is not linked to H.
Proof. (⇒) Condition 1 follows directly from G ≥ H. Consider H^L ∈ H^L. Because G ≥ H, by Lemma 41, G is not linked to H^L. Therefore, by Lemma 43, we have ∃G^L ∈ G^L : G^L ≥ H^L or ∃H^LR ∈ H^LR : G ≥ H^LR. The proof of part 3 is similar.
(⇐) Assume 1, 2 and 3, and also suppose that G ≱ H. By the definition of the partial order, there is a distinguishing game X such that either Ls(G + X) < Ls(H + X) or Rs(G + X) < Rs(H + X). Choose X to be of the smallest birthday such that Ls(G + X) < Ls(H + X). There are three cases:
(a) Ls(H + X) = Rs(H^L + X), for some H^L ∈ H^L. By condition 2 and Lemma 43, either there is G^L with G^L ≥ H^L, and then Ls(G + X) ≥ Rs(G^L + X) ≥ Rs(H^L + X) = Ls(H + X), or there is H^LR with G ≥ H^LR, and then Ls(H + X) = Rs(H^L + X) ≤ Ls(H^LR + X) ≤ Ls(G + X). Either way, this contradicts Ls(G + X) < Ls(H + X).
(b) Both H and X are left-atomic, say H^L = ∅^h and X^L = ∅^x. In this case, Ls(H + X) = h + x. On the other hand, Ls(G + X) ≥ Ls(G) + Rs(X) (this last inequality holds by Theorem 24). Also, Ls(G) + Rs(X) ≥ Ls(H) + x, because Ls(G) ≥ Ls(H) and, by X ∈ GS, Definition 3(2), Rs(X) ≥ x. Finally, Ls(H) + x = h + x because Ls(H) is trivially equal to h. This contradicts Ls(G + X) < Ls(H + X).
(c) Ls(H + X) = Rs(H + X^L), for some X^L ∈ X^L. By the 'smallest birthday' assumption, Rs(G + X^L) ≥ Rs(H + X^L). Therefore, Ls(G + X) ≥ Rs(G + X^L) ≥ Rs(H + X^L) = Ls(H + X). Once more, we contradict Ls(G + X) < Ls(H + X).
For the case where Rs(G + X) < Rs(H + X), the argument for the Right-stops is similar. Hence, we have shown that G ≥ H.
Note that we can derive the known result, Theorem 27, as a simple corollary of Theorem 44, by letting H = s be a number.
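Theorem 44 suggests a finite recursive comparison. The sketch below is ours: condition 1 is represented by the ordinary stops as a stand-in (the theorem's condition also involves the pass-allowed stops, whose implementation we omit), while conditions 2 and 3 are coded through the characterization of 'linked' in Lemma 43:

    data Side = Atom Double | Opts [Game]
    data Game = Game { leftSide :: Side, rightSide :: Side }

    opts :: Side -> [Game]
    opts (Atom _)  = []
    opts (Opts gs) = gs

    leftStop, rightStop :: Game -> Double
    leftStop g = case leftSide g of
      Atom l  -> l
      Opts gs -> maximum (map rightStop gs)
    rightStop g = case rightSide g of
      Atom r  -> r
      Opts gs -> minimum (map leftStop gs)

    -- Recursive comparison in the spirit of Theorem 44; stopCondition is a
    -- placeholder for the full condition 1.
    geq :: Game -> Game -> Bool
    geq g h = stopCondition
           && all notLinkedToLeft (opts (leftSide h))    -- condition 2
           && all rightNotLinked  (opts (rightSide g))   -- condition 3
      where
        stopCondition = leftStop g >= leftStop h && rightStop g >= rightStop h
        -- Lemma 43: G not linked to H^L iff some G^L >= H^L or G >= some H^LR.
        notLinkedToLeft hL = any (`geq` hL) (opts (leftSide g))
                          || any (g `geq`) (opts (rightSide hL))
        -- Lemma 43: G^R not linked to H iff some G^RL >= H or some H^R <= G^R.
        rightNotLinked gR = any (`geq` h) (opts (leftSide gR))
                         || any (gR `geq`) (opts (rightSide h))

The recursion terminates because every call strictly decreases the combined birthday of its arguments.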

Uniqueness of Reduced Forms
We are now able to prove the existence of a unique reduced form for each congruence class of games. We let ≅ denote 'identical to'; that is, for G, H ∈ GS, G ≅ H if they have identical game tree structures and, given this structure, each atom in G corresponds to an identical atom, in precisely the same position, in the game H.
Theorem 45. Let G, H ∈ GS be games in reduced form. If G ∼ H, then G ≅ H.

Proof. The proof is divided into two cases.
Case 1: Observe that part 2 of Theorem 34 (the atomic-reversibility theorem) applies, because if A were as in part 1 of that theorem, then it would have been reversed out (contradicting the assumptions on G and H). Therefore, A is the only Left option of H with Ls(H) = Rs(A).
If, for every G^L ∈ G^L, we had Ls(H) ≠ Rs(G^L), then Ls(G) ≠ Ls(H), which contradicts G ∼ H. Thus, there is some A′ ∈ G^L with Ls(H) = Rs(A′) and, from the pairwise correspondence of the non-atomic-reversible options, it also follows that A′ is atomic-reversible. Therefore, we may assume that A = a − (n+1)̂ and that A′ = a′ − (m+1)̂, for some real numbers a, a′ and some nonnegative integers n, m.
Since Rs(A′) = Rs(A), we get a = a′. That m = n follows from Theorem 35(1), the definition of the minimal nonnegative integer, since A^R = a − n̂ and A′^R = a′ − m̂ are reversing options. Therefore A ≅ A′ and, again, if there were another Left option G^L ∈ G^L with Ls(G) = Rs(G^L), then it would have been reversed out, because of the assumption of reduced form. Hence A′ is the only such Left option in G.

Case 2:
The only Left option of H is A = ⟨∅^h | h − n̂⟩, for some real number h and nonnegative integer n; that is, H = ⟨⟨∅^h | h − n̂⟩ | H^R⟩. Since H cannot be reduced further, by the second part of Theorem 35, it follows that ⟨∅^h | H^R⟩ ∉ GS. Thus there must exist an s-atom, with s < h, in an atomic follower of H^R.
Consider the Left options of G. By the pairwise correspondence of the non-atomic-reversible options, since H^L has none, neither has G^L. So, if G^L has options, they are atomic-reversible.
First, suppose that G = ⟨∅^h | G^R⟩. The non-atomic-reversible Right options of G and H are paired (the conclusion of Part 2 of this proof). Since G ∈ GS, the atom ∅^s is not in any non-atomic-reversible Right option of G, and hence ∅^s is not in any non-atomic-reversible Right option of H. Thus, either H^R = ∅^s, or H has a Right atomic-reversible option ⟨s + m̂ | ∅^s⟩. In the latter case, by Theorem 34(2) (with Left and Right interchanged), Rs(H) = s. Thus, in both cases, Rs(H) = s, from which it follows that Rs(G) = s which, in turn, implies that ∅^s is in G^R. This again contradicts G ∈ GS. Therefore, G has a Left option which, as above, is atomic-reversible and identical to A. In all cases, we have shown that H^L is identical to G^L. The proof for H^R and G^R is similar. Consequently, G ≅ H.
The next result is immediate. It allows us to talk about the canonical form of a game/congruence class.
Corollary 46. Each congruence class of GS contains a unique game in reduced form, its canonical form. In canonical form, an atomic-reversible Left option takes the shape ⟨∅^ℓ | ℓ − n̂⟩, where n is the minimum nonnegative integer such that G ≥ ℓ − n̂, and where ℓ = Ls(B) is a real number (B being the reversing option).

The Conjugate Property

We now prove that if a game in GS has an additive inverse, then that inverse is obtained by 'switching the players'.

Theorem 47 (Conjugate Property). Let G, H ∈ GS. If G + H ∼ 0, then H ∼ ↔G.

Proof.
(i) Suppose first that there is some Right option in H. We prove four claims.
(a) R̄s(G) ≥ ℓ. Since G ≥ ℓ − n̂, we get G + n̂ ≥ ℓ. Hence R̄s(G) ≥ Rs(G + n̂) ≥ ℓ, where the first inequality holds because Left can pass.
(b) There exists an atomic-reversible option H^R ∈ H^R. Suppose not; we will argue that this implies Rs(G + H) > 0, contradicting G + H ∼ 0 (Theorem 27). Because H has no atomic-reversible Right option, we saw in Case 2a that for all H^R there exists a non-atomic-reversible G^L such that G^L + H^R ∼ 0. By induction, G^L ∼ −H^R. Because A = ℓ − (n+1)̂ is an atomic-reversible option in G^L and, by Theorem 34(2), the Left-stop of G is obtained only through A, we get Rs(G^L) < ℓ and hence
Ls(H^R) = Ls(−G^L) = −Rs(G^L) > −ℓ,   (1)
where the first equality is by the definition of the conjugate of a game. This holds for all H^R ∈ H^R and so Rs(H) > −ℓ. Therefore, by Theorem 24, Rs(G + H) ≥ R̄s(G) + Rs(H) > ℓ − ℓ = 0, and the claim is proved.
(c) The atomic-reversible Right option of H is −ℓ + (m+1)̂ (where m is minimum such that H ≤ −ℓ + m̂).
This follows from inequality (1). (d) Finally, m = n. Consider the integers n and m as previously defined. They are minimal such that G ≥ ℓ − n̂ and H ≤ −ℓ + m̂, respectively. If n ≠ m, say n < m, then from G ≥ ℓ − n̂, adding H to both sides gives 0 ≥ H + ℓ − n̂, that is, H ≤ −ℓ + n̂. This is a contradiction (m is not minimal). Hence, we must have m = n.
Thus, we have proved that if A = ℓ − n̂ (in reduced form) is a Left atomic-reversible option of G, then there is an H^R ∈ H^R with H^R = −ℓ + n̂ = −A.
(ii) Since A ∈ G^L is an atomic-reversible option, H^R is not an atom.
First, if it were true that H^R = ∅^{−s} for some real number s, then this would force s = ℓ. This follows by an argument similar to that in 2b(i.c). Suppose therefore that H^R = ∅^{−ℓ}. In this case, A = ℓ − (n+1)̂ is the only Left option of G; any other options would be non-atomic-reversible (by domination), paired in H^R (by Case 2a), but there are none. Now, the non-atomic-reversible options of H^L and G^R are paired and, since G ∈ GS, ℓ is less than or equal to all the scores in the games of G^R. Since n ≥ 0, by Theorem 35, G^L could be replaced by ∅^ℓ, contradicting that G is in reduced form.
We have seen that each G^L has a corresponding −G^L in the set of Right options of H. This finishes the proof.
As a final comment, not every game is invertible and we do not have a full characterization of invertible games. We do know that zugzwang games do not have inverses.

A Scoring Games Calculator
The translation of a guaranteed game position to its canonical scoring value is not a trivial computational task and cannot be done manually except for very simple examples. A computer program is required for more complex positions. The Scoring Games Calculator (SGC) is such a program. It is implemented as a set of Haskell modules that run on an interpreter, available in any Haskell distribution, or embedded in a program that imports these modules.
The SGC has two main modules, Scoring and Position, that act as containers of two data types: Game and Position. The first module deals with scoring game values and the second with board positions given a ruleset.
Game values represent values from the set S, like ⟨1 | ∅^3⟩. This type includes an extensive list of Haskell functions that mirror the mathematical functions presented in this article. One simple example is the predicate guaranteed, which checks whether a game value in S is also in GS. Another operation is the sum of games, which takes two values in GS and computes their disjunctive sum.
Position values represent board positions. Type Position is an abstract type. It encloses a set of services useful for all games, like reading a position from a file or converting a position to its scoring value. These functions only work when a concrete ruleset is implemented. Given a game, say Diskonnect, there should be a module Diskonnect that imports module Position and implements the Diskonnect ruleset. Almost all the effort of defining a new game is focused on the implementation of the function moves which, given a board position and the next player, returns the list of all possible next positions. With this, Position is able to construct a game tree for a given board position and to translate that position into its scoring value.
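As an illustration only (the names below are hypothetical assumptions, not the SGC's actual interface), a ruleset module in this style could expose something like:

    -- Hypothetical sketch of a ruleset interface; Player, Ruleset and the
    -- signatures are illustrative, not the SGC's API.
    data Player = L | R

    class Ruleset p where
      -- All positions reachable by the given player in one move.
      moves :: p -> Player -> [p]
      -- The score of a terminated position (the player to move has no option).
      score :: p -> Double

Given such an instance, the game tree and the scoring value of a position can be built by folding moves recursively, as the Position module does.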
The scoring universe, together with its main theorems concerning reductions and comparisons, has a strong recursive structure that fits quite well into a functional programming language like Haskell. Not all mathematical definitions translate simply into functions, but some do. For example, the implementation of left-r-protected mirrors its definition quite closely:

    lrp :: NumberData -> Game -> Bool
    lrp r g = ls_d g >= r &&
              for_all [ for_any [ lrp r gRL | gRL <- leftOp gR ] | gR <- rightOp g ]

where ls_d is Ls and the syntax [f x | x <- list] defines list comprehensions. The SGC includes too many functions to be described here. Currently, the following guaranteed rulesets are implemented: Diskonnect, Kobber, TakeSmall and TakeTall.