The Rewster: Type Preserving Rewrite Rules for the Coq Proof Assistant
Yann Leray, Gaëtan Gilbert, Nicolas Tabareau, and Théo Winterhalter
Abstract
In dependently typed proof assistants, users can declare axioms to extend the ambient logic locally
with new principles and propositional equalities governing them. Additionally, rewrite rules have
recently been proposed to allow users to extend the logic with new definitional equalities, enabling
them to handle new principles with a computational behaviour. While axioms can only break
consistency, the addition of arbitrary rewrite rules can break other important metatheoretical
properties such as type preservation. In this paper, we present an implementation of rewrite rules on
top of the Coq proof assistant, together with a modular criterion to ensure that the added rewrite
rules preserve typing. This criterion, based on bidirectional type checking, is formally expressed in
PCUIC – the type theory of Coq recently developed in the MetaCoq project.
Keywords and phrases type theory, dependent types, rewrite rules, type preservation, Coq
1 Introduction
Dependently typed languages are the basis of major proof assistants like Agda [11], Coq [15]
and Lean [5]. In these systems, since general equality is undecidable, it is split into a decidable
fragment called definitional and the complete but undecidable propositional equality. While
propositional equality can be extended through axioms, definitional equality is defined upfront
and limited in a way that can hinder proof developments and usability. Rewrite rules, as
recently proposed by Cockx et al. [3], can mitigate these limitations, by allowing users
to extend definitional equality in a powerful way through custom reductions. A standard
example one can write using our implementation is parallel plus, an addition which can
reduce on constructors from both sides, something which cannot be defined using a fixpoint.
Symbol pplus : N → N → N.
Infix "++" := pplus.
Rewrite Rules pplus_rew :=
| 0 ++ ?n ⇝ ?n | S ?m ++ ?n ⇝ S (?m ++ ?n)
| ?m ++ 0 ⇝ ?m | ?m ++ S ?n ⇝ S (?m ++ ?n).
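For instance, assuming the declarations above are accepted, one can expect reductions on open terms such as the following (expected results shown in comments):
Eval compute in (fun n : N ⇒ 0 ++ n). (* fun n : N ⇒ n *)
Eval compute in (fun n : N ⇒ n ++ 0). (* fun n : N ⇒ n *)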
The symbol pplus behaves as standard addition, but with additional definitional equalities
that only hold propositionally for standard addition. For instance, n ++ 0 and 0 ++ n are
both definitionally equal to n. This example shows the definition of a function that could
already be defined, albeit with fewer definitional properties, but it is also possible to extend
the logic of Coq with new powerful constructions like inductive-recursive types using rewrite
rules. As an illustration, consider the definition of (a simple version of) a universe of types:
Axiom U : Type.
Symbol El : U → Type.
Symbols (N : U) (Pi : ∀ a : U, (El a → U) → U).
Symbol U_rect : ∀(P : U → Type), P N → (∀ a b, (∀ A : El a, P (b A)) → P (Pi a b)) → ∀ u, P u.
Rewrite Rules U_rect_rew :=
| U_rect ?P ?pN ?pPi N ⇝ ?pN
| U_rect ?P ?pN ?pPi (Pi ?a ?b) ⇝ ?pPi ?a ?b (fun (A : El ?a) ⇒ U_rect ?P ?pN ?pPi (?b A)).
Rewrite Rules El_red :=
| El N ⇝ N
| El (Pi ?a ?b) ⇝ ∀ (A : El ?a), El (?b A).
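Since El computes on codes, functions over this universe can be written without transporting along propositional equalities. For instance, the following definition (a hypothetical sketch assuming the declarations above; the name apply is only illustrative) type checks because El (Pi a b) reduces to ∀ (A : El a), El (b A):
Definition apply (a : U) (b : El a → U) (f : El (Pi a b)) (x : El a) : El (b x) := f x.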
Without rewrite rules, this construction can be fully postulated with axioms using proposi-
tional equality, but in practice neither the elimination principle U_rect nor function El will
compute, which makes the construction much less convenient to use. From this point of
view, strictly positive indexed inductive types currently available in Coq can be seen as one
particular class of inductive constructions that have been anticipated, justified and provided
natively in the system. Rewrite rules allow users to go outside this fragment, for instance by
defining a non-strictly positive inductive type.¹
However, contrary to axioms, which can only endanger consistency, the power of rewrite
rules needs to be kept in check, or important metatheoretical properties can be broken. The
work of Cockx et al. highlights that the first property rewrite rules must satisfy is
confluence; otherwise subject reduction can be broken and type checking quickly becomes
undecidable. They develop a checkable criterion, namely the triangle property, to determine
whether a rewriting system is confluent [3].
To preserve subject reduction of the complete theory, rewrite rules must also verify a
second property, type preservation, that is, if t : T and t rewrites to t′, then t′ : T. In [3], type
preservation of rewrite rules is simply postulated, and one of the main contributions of this
paper is to present a criterion to decide type preservation. A simple and intuitive check for
type preservation is to require that the equality induced by the rewrite rule should be well
typed for every possible instance (∀⃗x, p = r). There are however problems with this criterion
as terms don’t have a unique type in Coq because of the subtyping introduced by universe
cumulativity (Type@{u} is a subtype of Type@{v} when u ⩽u v). Let us see a few examples
of how rewrite rules may break type preservation in ways that are difficult to notice.
▶ Example 1. If we consider an identity function on types, which simply returns its argument,
cumulativity should mandate that the domain universe be smaller than the codomain universe.
However, in the following example with a rewrite rule, this restriction is not enforced.
Symbol id@{u v} : Type@{u} → Type@{v}.
Rewrite Rule id_rew := id ?T ⇝ ?T.
Universe a.
Check id Type@{a} : id Type@{a}.
Here, the term id Type@{a} can be given type Type@{v} for any v, while the type annotation
id Type@{a} reduces to Type@{a} by the rewrite rule, so the Check is accepted and effectively
yields Type@{a} : Type@{a}. The naive check does not detect the problem because it only
mandates that both universes share a common bigger universe – something which is always
verified. Thus, the check doesn't forbid this rule, which nonetheless breaks consistency by
allowing the existence of a term U : U [8, 9].
¹ The fact that general non-strictly positive inductive types are unsafe is due to Coquand and Paulin [4], but the argument uses impredicativity.
This paper demonstrates that establishing a sound criterion for type checking, in the absence
of unique types, requires an asymmetry between the two sides of the rewrite rules. Indeed,
having a common type is not enough; the correct criterion instead ensures that the
right-hand side of a rewrite rule checks against the principal type inferred from the pattern
on the left-hand side.
Another issue with the naive criterion is that the type of the variables ⃗x that appear in
the pattern might also not be unique, and just testing for one possibility that works is not
enough.
▶ Example 2. Let us consider a symbol which extracts the domains of product types, which
can be defined as follows:
Symbol dom : Prop → Prop.
Rewrite Rule dom_rew := dom (∀ (x : ?A), ?B) ⇝ ?A.
Eval compute in dom (∀ (x : N), True). (* N : Prop *)
The problem here is that no information tells us what sort ?A has in the pattern in general,
since the domain of a product type may have any sort quality (SProp, Prop or Type). Our
naive criterion, however, fails to detect the issue because the equality is well typed when
both ?A and ?B have type Prop, since then both the pattern and the replacement term will
have that type, and this problematic rule can then be accepted.
The correct criterion will make use of typing in patterns to try to find equalities between
pattern variables, so they can be used when typing the replacement term. We can however
also face an issue if we use the existing unification algorithm as is on patterns, since it uses
shortcut heuristics which don’t always assume the worst.
The rewrite rule D _ (C _ ?b) ⇝ ?b looks enticing, if the box is not to be considered as a
truncation. However, we don’t know the behaviour of I at that point, and as we add rule
I _ ⇝ false, our first rule loses its type preservation property, so it should not in fact have
been accepted in the first place.
The last meta-theoretical property that is not endangered by adding axioms but may break
with additional rewrite rules is termination. However, non-termination cannot introduce
proofs of ⊥; these can only come from the declaration of symbols and rules which are
inconsistent propositionally from the start [3, Section 6.4]. It does not break subject
reduction either, and we can derive a type checker that is valid irrespective of termination,
in the sense that if it answers, the answer is correct. Non-termination can at worst result
in type checking divergence. Therefore, we can show subject reduction with our criterion
without relying on termination.
A, B, P, T, c, t, … ::= x                     variable
                      | ?x{⃗t}                 metavariable
                      | s                      sort
                      | A                      axiom
                      | C                      constructor
                      | I                      inductive
                      | t t′                   application
                      | λ(x : A : s), t        abstraction
                      | ∀(x : A : s), (B : s′) product
                      | case c return P : s with ⃗t   case destruction
                      | fix f : T := t         fixpoint

s, s′ ::= □^q_u                                sort
q     ::= SProp | Prop | Type                  qualities

Figure 1 The syntax of PCUIC with metavariables.
In this paper, we present how rewrite rules can be added to PCUIC, the theory of the
kernel of Coq as defined in the MetaCoq project [13, 14], and extend the work of Cockx
et al. [3] by providing a criterion for type preservation in the presence of cumulativity. Rewrite
rules, as described in this paper and using the syntax demonstrated in the examples, have
been implemented and recently integrated into the development branch of the Coq proof
assistant.² This means they are expected to be available in the upcoming release 8.20 of Coq.
² https://github.com/coq/coq/pull/18038
2.1 Syntax
PCUIC, whose syntax is given in Figure 1, is a formal description of the kernel of Coq,
as defined in the MetaCoq project [14]. It is a dependent type theory with inductive
types, where inductive elimination is split between fixpoints and case analysis. Its sorts
are of three different so-called qualities: the usual Type hierarchy, with universe levels u
as index, and two additional sorts Prop and SProp. These two last sorts are impredicative,
and SProp also has definitional uniqueness of proofs, i.e., any two elements of a type in
SProp are convertible. Universes are ordered with ⩽u, but when the quality of a sort is
Prop or SProp, its universe becomes irrelevant. This defines an ordering ⩽s on sorts with
□^q_u ⩽s □^{q′}_{u′} ⟺ q = q′ ∧ (q ∈ {SProp, Prop} ∨ u ⩽u u′). We also order □^Prop_u <s □^Type_{u′}.
One should note that λ-abstractions, products and cases are heavily annotated; the type
annotation on the domains is standard for PCUIC, but the additional sort annotations are
only required for the proof of our type preservation criterion and should be considered as
virtual. They can be unambiguously obtained from the typing judgment on the term and
will also be omitted when they aren’t useful.
To account for higher-order rewrite rules, the syntax needs to be extended with higher-
order variables – henceforth called metavariables. A metavariable ?x{⃗t} is meant to live
in an extended context, where said context extension is to be instantiated by the context
instantiation ⃗t that the metavariable carries. While regular variable substitution will be
denoted as t[τ ], metasubstitution – substitution for metavariables – will be denoted as
t{σ}. Regular substitutions do nothing to metavariables; they only propagate, pointwise, to the context
instantiation: (?x{⃗t})[τ] = ?x{⃗t[τ]}. Dually, metasubstitutions do nothing to variables
and operate solely on metavariables as (?x{⃗t}){σ} = σ(?x)[⃗t{σ}]; they also propagate
transparently to subterms on all other grammar constructions. Note that metasubstitutions
need to be defined on all metavariables of the substituted term, a condition that will be
verified for metasubstitutions appearing in rewrite rules.
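As a small worked example, if ?x lives in the context extension (y : N) and σ maps ?x to S y, then (?x{0}){σ} = (S y)[y := 0{σ}] = S 0, while a regular substitution τ = [y := 0] leaves the metavariable untouched and only acts on its instantiation: (?x{y})[τ] = ?x{0}.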
2.2 Typing
Var:     Σ; Θ ⊢ Γ    (x : T) ∈ Γ    =⇒    Σ; Θ; Γ ⊢ x : T
EnvVar:  Σ; Θ ⊢ Γ    (C : T) ∈ Σ    =⇒    Σ; Θ; Γ ⊢ C : T
Meta:    Σ; Θ ⊢ Γ    (Γ?x ⊢ ?x : A) ∈ Θ    Σ; Θ; Γ ⊢ ⃗t : Γ?x    =⇒    Σ; Θ; Γ ⊢ ?x{⃗t} : A
Sort:    Σ; Θ ⊢ Γ    =⇒    Σ; Θ; Γ ⊢ □^q_u : □^Type_{↑q u}
App:     Σ; Θ; Γ ⊢ f : ∀(x : A), B    Σ; Θ; Γ ⊢ t : A    =⇒    Σ; Θ; Γ ⊢ f t : B[x := t]
Lambda:  Σ; Θ; Γ ⊢ A : □^q_u    Σ; Θ; Γ, x : A ⊢ t : B    =⇒    Σ; Θ; Γ ⊢ λ(x : A : □^q_u), t : ∀(x : A), B
Forall:  Σ; Θ; Γ ⊢ A : □^q_u    Σ; Θ; Γ, x : A ⊢ B : □^{q′}_{u′}    =⇒    Σ; Θ; Γ ⊢ ∀(x : A : □^q_u), (B : □^{q′}_{u′}) : □^{q∗q′}_{u∗u′}
Fix:     Σ; Θ; Γ ⊢ T : □^q_u    Σ; Θ; Γ, f : T ⊢ t : T    t guarded    =⇒    Σ; Θ; Γ ⊢ fix f : T := t : T
Cumul:   Σ; Θ; Γ ⊢ t : A    Σ; Θ; Γ ⊢ A ⩽ B    Σ; Θ; Γ ⊢ B : □^q_u    =⇒    Σ; Θ; Γ ⊢ t : B

β:       Σ; Θ; Γ ⊢ (λ(x : A), t) a → t[x := a]
FixRed:  Σ; Θ; Γ ⊢ fix f := t → t[f := fix f := t]

α-cumulativity:
         Σ; Θ; Γ ⊢ □^q_u ⩽s □^{q′}_{u′}    =⇒    Σ; Θ; Γ ⊢ □^q_u ⩽α □^{q′}_{u′}
         Σ; Θ; Γ ⊢ T : SProp    Σ; Θ; Γ ⊢ t : T    Σ; Θ; Γ ⊢ u : T    =⇒    Σ; Θ; Γ ⊢ t ⩽α u
         ?x ∈ Θ    Σ; Θ; Γ ⊢ ⃗t ≡α ⃗t′    =⇒    Σ; Θ; Γ ⊢ ?x{⃗t} ⩽α ?x{⃗t′}
         Σ; Θ; Γ ⊢ f ⩽α f′    Σ; Θ; Γ ⊢ a ≡α a′    =⇒    Σ; Θ; Γ ⊢ f a ⩽α f′ a′
         Σ; Θ; Γ ⊢ A ≡α A′    Σ; Θ; Γ ⊢ t ⩽α t′    =⇒    Σ; Θ; Γ ⊢ λ(x : A : □^q_u), t ⩽α λ(x : A′ : □^{q′}_{u′}), t′
         Σ; Θ; Γ ⊢ A ≡α A′    Σ; Θ; Γ ⊢ B ⩽α B′    =⇒    Σ; Θ; Γ ⊢ ∀(x : A : □^q_u), (B : □^{q1}_{u1}) ⩽α ∀(x : A′ : □^{q′}_{u′}), (B′ : □^{q1′}_{u1′})

Figure 2 Typing, reduction and α-cumulativity rules of PCUIC with metavariables.
p ::= x                            variable
    | □^q_{?x}                     sort
    | C                            constructor
    | I                            inductive
    | S                            symbol
    | p a                          application
    | λ(x : a : s), p              abstraction
    | ∀(x : a : s), (b : s′)       product
    | case p return a : s with ⃗b   case destruction

a, b  ::= p | ?x                   pattern / pattern variable
q     ::= SProp | Prop | Type | ?x explicit quality / quality variable
s, s′ ::= □^{?y}_{?x}              pattern sort annotation

Figure 3 The syntax of patterns.
2.3 Cumulativity
Our typing judgment contains a cumulativity rule (Cumul), instead of the more traditional
conversion rule. Cumulativity is defined similarly to conversion, except that we allow
′
subtyping on sorts: □qu ⩽s □qu′ . Formally, cumulativity is defined as the transitive closure
⩽ of → ∪ ← ∪ ⩽α where → is reduction, ← antireduction (the relation symmetric to →)
and ⩽α is α-cumulativity. We also define conversion ≡ by replacing α-cumulativity with
α-conversion. The reduction is generated by β and ι-reductions, unfolding of fixpoints (Rule
FixRed) and is allowed to happen in any subterm. α-cumulativity is roughly α-equality
with the cumulativity rule on sorts explained above, closed by congruence. However, most
subterms need to be convertible and not only cumulative in the congruence. Formally,
cumulativity ⩽α is defined with the rules described in Figure 2 and the other congruences
where all subterms need to be related with ≡α , where t ≡α t′ is defined as t ⩽α t′ ∧ t′ ⩽α t.
PCUIC also supports SProp’s proof irrelevance in α-cumulativity. The rule given is more
idealistic than the real presentation with relevance marks [7] but the differences won’t matter
here.
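For instance, □^Type_u ⩽ □^Type_{u′} holds exactly when u ⩽u u′, and □^Prop_u ⩽ □^Prop_{u′} holds for any u and u′ since the universe of a Prop sort is irrelevant; under a product, ∀(x : A), B ⩽ ∀(x : A′), B′ additionally requires the domains A and A′ to be convertible, not merely cumulative.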
3 Rewrite Rules
A rewrite rule is composed of a left-hand side, called pattern, and a right-hand side, called
replacement term, and written p ⇝ r. When it is applied, a term t is matched against the
pattern p to extract subterms at the holes of the pattern to form a metasubstitution σ, which
is then substituted in the replacement term r, resulting in reduction t → r{σ}. As explained
in the introduction, restrictions are needed so that the metatheory of PCUIC does not break.
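For instance, with the pplus rules from the introduction, the term S 0 ++ k matches the pattern S ?m ++ ?n with metasubstitution σ = [?m := 0; ?n := k], so it reduces to r{σ} = S (0 ++ k), which in turn reduces to S k by the rule 0 ++ ?n ⇝ ?n.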
3.1 Pattern
The syntax of patterns is described in Figure 3. To separate them from axioms, which never
reduce, we introduce in PCUIC a new class of constants named symbols, a specific
subclass of axioms on which rewrite rules operate.
Patterns correspond to a subset of terms that need to be rigid. Their holes are denoted
as metavariables (?x, also called pattern variables) in the pattern. Note that, unlike
metavariables, pattern variables don’t carry context instantiations. Thus to see a pattern
A ≼ a {σ1}    t ≼ p {σ2}    =⇒    λ(x : A : □^{q′}_{u′}), t ≼ λ(x : a : □^q_u), p {[q := q′; u := u′] + σ1 + σ2}

A ≼ a {σ1}    B ≼ b {σ2}    =⇒    ∀(x : A : □^{q1′}_{u1′}), (B : □^{q2′}_{u2′}) ≼ ∀(x : a : □^{q1}_{u1}), (b : □^{q2}_{u2}) {[⃗qi := ⃗qi′; ⃗ui := ⃗ui′] + σ1 + σ2}

Figure 4 Pattern matching (excerpt: the rules for λ-abstractions and products).
variable as a term, we have to add the identity instantiation {|∆| := ∆} for ∆ the context in
which the pattern variable lives. This gives us an injection p ↦ p from patterns to terms.
From now on, we will simply use the change of color to represent this injection.
The rigidity requirements manifest as limitations on where holes may appear, so patterns
are split into bona fide patterns, where holes will not be allowed, and argument patterns
where they will be allowed. Since term reduction happens in a stack machine, the reduction
of rewrite rules has to work on the machine representation of terms which have already been
reduced to weak head normal form. For instance, having a hole at the head of eliminations
(e.g., pattern ?f ?a against term (λx f, f x) 0 S) would be unstable (does ?a match with 0
or S?) and would not commute with other reductions (?f can match with S after reductions),
so it must be forbidden. This means that argument patterns may be anywhere but at the
main subterm of an elimination construction, that is an application or a case destruction.
We also have to impose additional restrictions to prove confluence; they are listed and
justified in Section 4.1. Finally, since we don’t want to try all rewrite rules at all steps in
reduction, we restrict patterns such that the head of their main branch needs to be a
symbol, which will be the starting point of pattern matching during reduction.
In the end, each rewrite rule induces the following reduction rule:
Rew:    (p ⇝ r) ∈ Σ    t ≼ p {σ}    =⇒    Σ; Θ; Γ ⊢ t → r{σ}
where t ≼ p {σ} denotes the matching of p against a term t, as described in the following
section.
3.2 Pattern-Matching
Pattern matching is the operation that tries to match a term t against a pattern p, resulting
in a metasubstitution σ when it succeeds. Each pattern constructor can match against the
associated term constructor, trying to match recursively, and pattern variables match against
any term to fill the metasubstitution. It is denoted as t ≼ p {σ} and is formally defined in
Figure 4. Actually, pattern matching can also match universes against universe variables and
sort qualities against quality variables, so the resulting σ contains in fact a metasubstitution,
a universe substitution and a sort quality substitution. These last two substitutions are
defined transparently on terms, only having effect on the quality and universe variables they
carry. Pattern matching verifies one crucial property: ∀ p t σ, t ≼ p {σ} =⇒ t ≡α p{σ}. This
α-conversion would be an equality if not for the fact that some technical information is lost
by pattern matching, such as names of binders.
Note that patterns are as heavily annotated as terms, but in this case it is required in
the normal course of operations. Indeed, during the typing pass, we need to give a type to
argument patterns so they can be provided to all pattern variables. This means that all type
fields need an annotation to name the quality and universe of said type. These annotations
can however not be used during pattern matching, as they contain no computational content;
the quality and universe that are matched from the virtual annotation of terms can be
considered virtual as well. These type annotations are used to avoid the issue presented
in Example 2 where the type of a pattern variable may vary depending on the term being
matched.
PatVar:   (Γ ⊢ ?x : A) ∈ Θ    =⇒    Σ; Θ; Γ ⊢p ?x ◁ A
Conv:     Σ; Θ; Γ ⊢p p ▷ A    Σ; Θ; Γ ⊢p A ⩽p B    =⇒    Σ; Θ; Γ ⊢p p ◁ B
Var:      (x : A) ∈ Γ    =⇒    Σ; Θ; Γ ⊢p x ▷ A
Ax-Symb:  (C : T) ∈ Σ    =⇒    Σ; Θ; Γ ⊢p C ▷ T
Sort:     Σ; Θ; Γ ⊢p □^q_u ▷ □^Type_{↑q u}
App:      Σ; Θ; Γ ⊢p p ▷ ∀(x : A), B    Σ; Θ; Γ ⊢p a ◁ A    =⇒    Σ; Θ; Γ ⊢p p a ▷ B[x := a]
Lambda:   Σ; Θ; Γ ⊢p a ◁ □^q_u    Σ; Θ; Γ, (x : a) ⊢p p ▷ B    =⇒    Σ; Θ; Γ ⊢p λ(x : a : □^q_u), p ▷ ∀(x : a), B
Forall:   Σ; Θ; Γ ⊢p a ◁ □^q_u    Σ; Θ; Γ, (x : a) ⊢p b ◁ □^{q′}_{u′}    =⇒    Σ; Θ; Γ ⊢p ∀(x : a : □^q_u), (b : □^{q′}_{u′}) ▷ □^{q∗q′}_{u∗u′}

Figure 5 Pattern typing rules, with inference (▷) and checking (◁) judgments.
A pattern whose type is in SProp (or equivalently, a pattern which is syntactically irrelevant) can never be
inspected reliably, since the term could always be swapped for any other term of the same
type. Therefore, irrelevant patterns need to be forbidden as well. Note that pattern holes
may appear at irrelevant positions, since they do not inspect the term and thus do not
conflict with the conversion rule, so the ban is specifically on patterns and not on argument
patterns.
Once this is done, the proof of confluence follows exactly the proof done by Cockx et al.,
which was already a variation on the proof done in [14]. The triangle criterion is used to
show that one-step parallel reduction satisfies the triangle lemma saying that for any term t,
there exists an optimally reduced term ρ(t) (that performs all possible reductions in parallel)
such that t ⇛ ρ(t) and for any t ⇛ u, u ⇛ ρ(t). Confluence is then a direct consequence of
this triangle lemma and the fact that parallel reduction entails reduction.
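Concretely, if t ⇛ u1 and t ⇛ u2, the triangle lemma gives u1 ⇛ ρ(t) and u2 ⇛ ρ(t), so one-step parallel reduction enjoys the diamond property; since plain reduction is included in parallel reduction, which is in turn included in iterated reduction, confluence of reduction follows.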
3. We need σ to completely instantiate Θ, so we make use of the fact that Θ contains exactly
one entry per pattern variable of p.
4. Since the types in Θ and Tp can refer to the additional sort annotations in patterns, we
need to consider the virtual parts of t and σ in the following.
It is well known that the typing judgment of PCUIC satisfies environment weakening [14].
In fact, our extension with metavariables changes nothing in the proof (the rule on metavariables
doesn't mention the environment), so the property still stands in our system. Even
better, since our pattern typing judgment resembles regular typing, the same proof can be
adapted to work on it, proving that pattern typing also satisfies environment weakening. Let
us now prove the anti-substitution lemma.
We first need to introduce more elements and strengthen our property so that the
induction can go through. We first split Γ into Γ, ∆ and σ into σ0 , σ such that Γ is fixed
while ∆ is the context extension which corresponds to ∆p as we go into the pattern and σ0
corresponds to the already matched portion of the pattern, which will affect ∆ and Θ while
σ corresponds to the currently matched portion of the pattern. The proof goes by induction
on the typing derivation of p, with induction predicate:
Σ; Θ; ∆p ⊢p p ▷ Tp =⇒
Σ; Θ; ∆p ⊢p p ◁ Tp =⇒
Σ; []; Γ ⊢ σ : Θ{σ0 }
Let us give a representative subset of all cases needed to prove the induction.
Case ?x:
Hypotheses: Σ; []; Γ, ∆ ⊢ t : T , σ = [?x := t], Tp {σ0 } ≡ T , ∆p {σ0 } ≡ ∆
Goal: Σ; []; Γ ⊢ (∆p ⊢ ?x : Tp ){σ0 , σ} which simplifies to Σ; []; Γ, ∆p {σ0 , σ} ⊢ t : Tp {σ0 , σ}
Proof: Neither ∆p nor Tp mention ?x, so they are respectively convertible to ∆ and T by
hypothesis, which proves our goal.
Case p a:
Hypotheses: Σ; Θ; ∆ ⊢p p ▷ ∀(x : Ap), Bp, Σ; Θ; ∆ ⊢p a ◁ Ap, Tp = Bp[x := a],
Σ; []; Γ, ∆ ⊢ f t : T, f ≼ p {σ1}, t ≼ a {σ2}, ∆p{σ0} ≡ ∆.
By inversion of typing on an application, there exists A and B such that ∀(x : A), B ⩽ T ,
Σ; []; Γ, ∆ ⊢ f : ∀x : A, B and Σ; []; Γ, ∆ ⊢ t : A.
Then, by induction hypothesis on p, we get (∀x : Ap , Bp ){σ0 , σ1 } ⩽ ∀x : A, B and
Σ; []; Γ ⊢ σ1 : Θ{σ0 }.
By the definition of substitution and injectivity of products for ⩽, Ap {σ0 , σ1 } ≡ A and
Bp {σ0 , σ1 } ⩽ B. Since ∆p does not contain any metavariable in |σ1 |, ∆p {σ0 , σ1 } ≡ ∆.
This means we can use the induction hypothesis on a and get Σ; []; Γ ⊢ σ2 : Θ{σ0 , σ1 }
Goal: Bp [x := a]{σ} ⩽ B[x := t] and Σ; []; Γ ⊢ σ1 , σ2 : Θ{σ0 }. The second goal is
immediately proved by concatenating the induction hypotheses.
Proof: by commuting the substitution and metasubstitution,
since the pattern has type vector ?n while the replacement term has type vector ?n′ .
However, typing ensures that we always have ?n = ?n′ , so we could extract this equality
during pattern typing, in the same way that unification is used to collect constraints during
the elaboration of a Coq term [16]. This way, we can modify the subject reduction criterion
to take advantage of these additional conversion hypotheses when checking the type of the
right-hand side of the rewrite rule.
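As an illustration of such a forced equality, consider a hypothetical tail-like symbol on length-indexed vectors (this sketch uses Coq's standard Vector.t; the names vtail and vtail_rew are only illustrative):
From Coq Require Vector.
Symbol vtail : ∀ (A : Type) (n : nat), Vector.t A (S n) → Vector.t A n.
Rewrite Rule vtail_rew := vtail ?A ?n (Vector.cons ?A' ?a ?n' ?v) ⇝ ?v.
Here the right-hand side ?v has type Vector.t ?A' ?n', while the rule must produce a term of type Vector.t ?A ?n; typing the pattern forces ?A ≡ ?A' and (by injectivity of S) ?n ≡ ?n', and these extracted equalities are exactly what makes the right-hand side check.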
Red:            Σ; Θ; Γ ⊢ t →∗ t′    Σ; Θ; Γ ⊢ u →∗ u′    Σ; Θ; Γ ⊢p t′ ⩽p u′ ▷ E    =⇒    Σ; Θ; Γ ⊢p t ⩽p u ▷ E
Inj:            C is injective    =⇒    Σ; Θ; Γ ⊢p C ⩽p C ▷ []
Rel:            x ∈ Γ    =⇒    Σ; Θ; Γ ⊢p x ⩽p x ▷ []
Sort:           Σ ⊢ q =q q′ ∧ u ⩽u u′ possible    =⇒    Σ; Θ; Γ ⊢p □^q_u ⩽p □^{q′}_{u′} ▷ [u ⩽u u′; q =q q′]
Lambda/Forall:  Σ; Θ; Γ ⊢p A ≡p B ▷ E    Σ; Θ; Γ, (x : A) ⊢p t ⩽p u ▷ E′    𝔹 ∈ {λ, ∀}    =⇒    Σ; Θ; Γ ⊢p 𝔹(x : A), t ⩽p 𝔹(x : B), u ▷ E ∪ E′
App:            Σ; Θ; Γ ⊢p f ⩽p f′ ▷ E    Σ; Θ; Γ ⊢p a ≡p a′ ▷ E′    Σ; Γ ⊢ f, f′ whne    =⇒    Σ; Θ; Γ ⊢p f a ⩽p f′ a′ ▷ E ∪ E′
Case:           Σ; Θ; Γ ⊢p c ≡p c′ ▷ Ec    Σ; Θ; Γ, Γi ⊢p P ≡p P′ ▷ Er    Σ; Θ; Γ, Γk ⊢p bk ≡p b′k ▷ Eb,k    Σ; Γ ⊢ c, c′ whne    =⇒    Σ; Θ; Γ ⊢p case c return P with ⃗b ⩽p case c′ return P′ with ⃗b′ ▷ Ec ∪ Er ∪ ⋃k Eb,k
PatVar:         Σ; Θ; Γ ⊢p ?x ≡p t ▷ [∆ ⊢ ?x := t]
Figure 6 The rules for conversion judgments ⩽p and ≡p (replace all ⩽p with ≡p in the rules).
We proceed by defining a notion of pattern conversion and cumulativity (≡p and ⩽p ) that
collect the set of equalities E necessarily satisfied by substituted terms: we replace the binary
relation t ⩽p u with the ternary relation t ⩽p u ▷ E, which satisfies the following property:
∀ t, u, E, Σ0; Θ; ∆p ⊢p t ⩽p u ▷ E =⇒
∀ Σ ⊇ Σ0, Ξ, Γ, σ, Σ; Ξ; Γ ⊢ σ : Θ ∧ Σ; Ξ; Γ, ∆p{σ} ⊢ t{σ} ⩽ u{σ}
=⇒ ∀ (∆i ⊢ ti ≡ ui) ∈ E, Σ; Ξ; Γ, ∆i{σ} ⊢ ti{σ} ≡ ui{σ}.
Note that we are still very free in the implementation of ⩽p because, as long as we
don’t claim any equality (E = []), even the full relation would be sound. However, we must
remember that our primary goal is to extract as many enforced equalities as possible.
Our main objective is thus to apply safe inversions of conversion, to make sure that our
equalities necessarily hold:
If t1 {σ} ≡ u and t1 →∗ t2 , then by reflexivity of ≡ and substitutivity t1 {σ} ≡ t2 {σ} and
thus t2 {σ} ≡ u. This means that we are allowed to reduce in ≡p .
When a term is in weak head normal form, its congruence rule is invertible. This includes
products, sorts, λ-abstractions, inductives, constructors and all weak head neutral terms
(whne). Therefore, the general strategy for pattern conversion will be to reduce our
arguments to weak head normal form and then to invert the congruence rule of the head
constructor.
Non-injective symbols may be equipped with any rule in the future, so the congruence
rules of their elimination context (as would-be neutral terms) are not invertible. This
means we cannot extract information in this case and should simply accept with no
equality (Failsafe). This approach is a way to avoid the failure of subject reduction
described in Example 3.
Metavariables may take any (well-typed) value, so the congruence rules of their elimination
context (as would-be neutral terms) are not invertible. However, the conversion can
be extracted as an equality. To keep the procedure easily decidable, we only keep the
equality when the elimination context is empty (?x{∆ := ∆} ≡ t). Also, if we extract
more than one equality for a pattern variable, we will discard all but one.
6 Implementation
Rewrite rules have been implemented and integrated into the Coq proof assistant with pull
request #18038. The criterion for type preservation has been implemented, but pull request
#19290 is yet to be integrated into the Coq proof assistant.
Let us describe a more complete example now possible thanks to rewrite rules: excep-
tions [12]. Previously, one could define an exception as an axiom and merely state how it
behaves against other term constructors; now one can do the following:
Symbol raise : ∀ (A : Type), A.
Rewrite Rules raise_rew :=
| raise (∀ (x : ?A), ?B) ?a ⇝ raise ?B@{x := ?a}
| if raise B as b return ?P then _ else _ ⇝ raise ?P@{b := raise B}
| match raise N as n return ?P with 0 ⇒ _ | S _ ⇒ _ end ⇝ raise ?P@{n := raise N}
| fst (raise (?A ∗ ?B)) ⇝ raise ?A | snd (raise (?A ∗ ?B)) ⇝ raise ?B.
Note that the second and third rules would currently raise a warning complaining about
a potential universe inconsistency, since the return predicates ?P could have a universe level
larger than the one raise expects, but this issue can only happen if one mentions that level
explicitly, and we plan to introduce solutions to make this impossible (and thus make the
rules safe) in the future.
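As a usage sketch (hypothetical, relying only on Coq's standard list type), raise can serve as a default value in otherwise partial functions:
Definition head (A : Type) (l : list A) : A :=
  match l with
  | nil ⇒ raise A
  | cons x _ ⇒ x
  end.
Evaluating head A nil then reduces to raise A, which the rules above can further propagate through functions, booleans, natural numbers and pairs.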
A few extensions were made with respect to the theory exposed in this paper, the first of which is
the support of the full grammar of Coq terms. This notably includes primitive projections
(see the rules on the product type in the example above), universe polymorphism, and sort
polymorphism, where universe instances can be matched against.
An extension has been made so that users can make some symbols unfold fixpoints. In
Coq, fixpoints are guarded to prevent infinite loops and the guard condition operates on
inductive values. This means that fixpoint unfolding may only happen when the guarded
argument has a head constructor. However, a user may want to add new values of an
inductive type that should behave as a constructor in this regard, for instance an exception
as in the example above. To this end, a flag called unfold_fix can be given when declaring a
symbol:
(* Supersedes the declaration shown previously *)
#[unfold_fix] Symbol raise : ∀ (A : Type), A.
(* ... same set of rewrite rules ... *)
Eval compute in raise N + 5. (* raise N *)
In practice, metavariables can be used to bind terms on the left-hand side and mention
them in the right-hand side, but they are also convenient to avoid explicitly describing
sub-patterns that are “forced” and will be found by the equality-extraction mechanism
described in Section 5. To this end, we have added the _ pattern to produce metavariables that are
not relevant to the right-hand side, such as the two branches in the example above.
In this setting, another extension has been added to the type preservation criterion to
mitigate the inference requirements for application and cases: while the theory requires that
the pattern infers respectively a product and an inductive, it is in fact safe to allow inferring a
pattern variable. This way, the user can still put _ at places where a product or an inductive
is expected, and we automatically add an equality between this implicit pattern variable and
the product/inductive with “fresh” pattern variables (respectively for the domain/codomain
and for the arguments to the inductive). Since we cannot generate fresh variables during
type checking, these additional pattern variables have to be added to the pattern from the
start, giving for application a virtual annotation (p : ∀(x : ?A), ?B) a, with typing rule
Σ; Θ; Γ ⊢ p ◁ ∀(x : ?A), ?B    Σ; Θ; Γ ⊢ a ◁ ?A    =⇒    Σ; Θ; Γ ⊢ (p : ∀(x : ?A), ?B) a ▷ ?B{x := a}
7 Related Work
Cockx et al. [3] introduce a system for rewrite rules in dependently typed λ-calculus, which
forms the foundation for this work. Their system employs simpler rewrite rules (lacking
higher-order rules and cumulativity) and doesn’t provide a criterion for type preservation. In
this paper, we extend their setting with higher-order rewrite rules in the presence of cumulativity
and definitional proof-irrelevance. We also provide a decidable criterion to ensure type
preservation which has been implemented in the Coq proof assistant. It is important to
note that the argument by Cockx et al. regarding the consistency of the theory extended
with rewrite rules, under the assumption that these rules are admissible as propositional
equalities, also applies to our system. This is because their argument relies on an encoding
in extensional type theory, and does not depend on the complexity or richness of the rewrite
rules syntax.
8 Conclusion
We introduced and implemented a new framework for users to define their own rewrite
rules within the Coq proof assistant. Additionally, we developed a method to ensure these
user-defined rules maintain type safety, guaranteeing core metatheoretical properties of Coq
such as subject reduction and completeness of type checking.
From a practical point of view, our most immediate goal is to port to Coq the confluence
checker that has been written by Jesper Cockx for Agda, to improve the safety of rewrite
rules. Another meaningful extension is to define a notion of justified rewrite rules, which
associates existing terms with the declared symbols, and proofs of propositional equality on
those terms with the rewrite rules on those symbols. For instance, the parallel addition
pplus is justified by standard addition. This way, justified rewrite rules can
be considered as a safe extension instead of their current status of additional axioms.
References
1 Andrej Bauer and Anja Petković Komel. An extensible equality checking algorithm for
dependent type theories. Logical Methods in Computer Science, Volume 18, Issue 1, January
2022. doi:10.46298/lmcs-18(1:17)2022.
2 Frédéric Blanqui. Type Safety of Rewrite Rules in Dependent Types. In 5th International
Conference on Formal Structures for Computation and Deduction (FSCD 2020). Schloss
Dagstuhl – Leibniz-Zentrum für Informatik, 2020. doi:10.4230/LIPIcs.FSCD.2020.13.
3 Jesper Cockx, Nicolas Tabareau, and Théo Winterhalter. The taming of the rew: A type
theory with computational assumptions. Proceedings of the ACM on Programming Languages,
5(POPL):60:1–60:29, January 2021. doi:10.1145/3434341.
4 Thierry Coquand and Christine Paulin. Inductively defined types. In Per Martin-Löf and
Grigori Mints, editors, COLOG-88, pages 50–66, Berlin, Heidelberg, 1990. Springer.
doi:10.1007/3-540-52335-9_47.
5 Leonardo de Moura and Sebastian Ullrich. The Lean 4 Theorem Prover and Programming
Language. In André Platzer and Geoff Sutcliffe, editors, Automated Deduction – CADE 28, pages