Automated Reasoning in Higher Order Logic

This page was created and is maintained by Chad E Brown.


von Heijenoort, From Frege to Godel. 1967 (Available via Amazon)

An extremely brief guide to the contents:

It all starts with Frege's system and Peano's axiomatization of arithmetic.

Dedekind had to defend his notion of "number".

Around the turn of the century came the paradoxes: the Burali-Forti paradox which was discussed by Cantor, Russell's paradox communicated to Frege and Frege's response, and Richard's paradox.

Meanwhile, Padoa suggested studying questions about deductive theories using interpretations.

Hilbert popularized his program to found mathematics on a logic which can be shown consistent by finitistic methods.

Zermelo placed a spotlight on the axiom of choice by using it to prove every set can be well-ordered (later defending the axiom of choice and his proof). Fraenkel proved the independence of the axiom of choice (for set theory with urelements) by formalizing Russell's socks idea as permutation models.

At around the same time, Konig claimed to have proven that the continuum could not be well-ordered (giving the Zermelo-Konig "paradox" related to Richard's paradox).

Russell avoided the paradoxes by introducing his (ramified) theory of types. Zermelo avoided the paradoxes by providing an axiomatization of set theory (later extended by Fraenkel).

Whitehead and Russell wrote Principia Mathematica which included a new method for using description symbols. Wiener simplified Whitehead and Russell's system by reducing relations to predicates using pairing.

Lowenheim (more or less) proved his theorem giving countable models, which was simplified by Skolem who also studied consequences for Zermelo's set theory.

Post studied the propositional fragment of the system of Principia Mathematica.

Skolem began studying primitive recursive arithmetic as a way to avoid the paradoxes. Later, Ackermann would study wider classes of recursive functions.

Brouwer decried the use of excluded middle in infinite situations and suggested (demanded?) a move towards intuitionism. Later, he showed how to develop intuitionistic "set theory", topology, and analysis.

von Neumann was the first to define the ordinals in the modern way.

Schonfinkel was so amazing that he reduced mathematical logic to combinators S and K and a disjoint classes operator U.

Hilbert studied the role of infinity in mathematics and attempted a proof of the continuum hypothesis.

von Neumann introduced his own axiomatization of set theory based on functions as objects (some of which correspond to sets).

Kolmogorov showed how to interpret classical logic in intuitionistic logic.

Finsler showed how to skirt Richard's paradox by replacing "truth" with "provability", hinting at Godel's Incompleteness Theorems.

Hilbert responded strongly to criticism of his program, especially intuitionistic objections. Weyl responded to Hilbert, and Bernays provided more detail about Hilbert's formalization of arithmetic. Brouwer followed up by suggesting four points on which intuitionism and formalism could enter into a dialogue.

Skolem introduced a proof procedure based on his proof of the Lowenheim-Skolem theorem. Skolem's work may be considered the semantic counterpart of Herbrand's Theorem which gives alternative syntactic characterizations of provability. We can infer completeness of first-order logic by taking these results together. However, Godel proved the completeness result directly, using methods similar to Skolem's proof of the Lowenheim-Skolem theorem.

After Godel proved his Incompleteness Theorems, Herbrand proved the consistency of arithmetic with a restricted induction axiom.

 

Frege (1879). Begriffsschrift, a formula language, modeled upon that of arithmetic, for pure thought. This was only Frege's first publication in logic, but it was revolutionary. Frege introduces a system containing propositional calculus and quantification in which derivations depend on form, not content. He also separates propositions into function and argument instead of subject and predicate. Frege's interest in developing such a system was motivated by the attempt to give a logical analysis of the notion of a sequence. His notation is a departure from the arithmetic notation of Boole and others, and indicates the tree structure of formulae. However, this notation did not become standard because of the difficulty in typesetting.

 

Frege's system takes implication and negation (noting the interdefinability of other propositional operators) as primitive and uses the rule of detachment with some basic propositional axioms. (So, this system is a predecessor of a Hilbert-style system.) (Universal) quantification is introduced using three axioms and the rule of generalization. A substitution rule is also used, but not explicitly stated. Frege also allows quantification over function variables (noting that Phi(A) can be considered as a function of the argument Phi as well as the argument A [the evaluation function!]). This is what allows Russell's paradox to be formulated in the system. Frege points out that the inference rules, while they cannot be expressed in the system, are merely "rules for the use of our signs."
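
For comparison with later Hilbert-style systems (using the standard modern axioms A -> (B -> A) and (A -> (B -> C)) -> ((A -> B) -> (A -> C)) rather than Frege's own), detachment and a pair of propositional axioms already support small derivations such as A -> A:

1. A -> ((A -> A) -> A)                                          [first axiom]
2. (A -> ((A -> A) -> A)) -> ((A -> (A -> A)) -> (A -> A))       [second axiom]
3. (A -> (A -> A)) -> (A -> A)                                   [detachment, 1, 2]
4. A -> (A -> A)                                                 [first axiom]
5. A -> A                                                        [detachment, 3, 4]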

 

In the final part of the book, Frege begins to reconstruct arithmetic by introducing a theory of mathematical sequences.

 

Peano (1889). The principles of arithmetic, presented by a new method. The Peano axioms are introduced, in a symbolic notation closer to that used today (e.g., epsilon is used for class membership, and inverted C for implication). [Also, an infix dot notation is used.] Although Peano axiomatizes his system, he does not give rules of inference, so it cannot be considered a formalization. Also, he defines addition and multiplication recursively, although the system does not contain the tools which allow such a recursive definition. The system includes quantification, which Peano apparently invented independently of Frege. Peano covers more mathematical ground than Frege, but not with Frege's depth. Nonetheless, Peano demonstrated how to express mathematical theories in a symbolic language.
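
In modern notation, the recursive definitions Peano gives are the familiar equations

a + 0 = a,      a + S(b) = S(a + b)
a * 0 = 0,      a * S(b) = (a * b) + a

where S is the successor function; the point noted above is that nothing in the system itself licenses definitions of this recursive form.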

 

Dedekind (1890a). Letter to Keferstein. In 1890, Hans Keferstein published a paper which commented on Dedekind's (1888) book on the notion of number. He proposed "corrections" which revealed misconceptions he had of Dedekind's work. A correspondence between Dedekind and Keferstein, to which this letter belongs, attempted to remove these misconceptions. Keferstein confused an equivalence relation on sets with identity of sets; he thought two different definitions of infinite sets were in conflict when they were, in fact, merely stylistic variants; he proposed a new definition of simply infinite sets that avoided chains but abandoned mathematical induction. Dedekind's letter explains the development and justification of his ideas on the notion of natural number.

 

Burali-Forti (1897/1897a). A question on transfinite numbers/On well-ordered classes. This paper presents the first published statement of a modern paradox and immediately aroused interest in the mathematical world. The Burali-Forti paradox deals with the "greatest ordinal"--which is obtained by assuming the set of ordinals is well-ordered [and, of course, that it is a set!]--which must be a member of the set of ordinals and simultaneously greater than any ordinal in the set.

 

Burali-Forti considered this a proof by contradiction that the set of ordinals is only partially ordered (contradicting a result by Cantor which was soon to be published). Russell's first explanation (1903) was that the set of ordinals, while linearly ordered, is not well-ordered; but this could not be maintained. Jourdain suggested the use of Cantor's distinction between "consistent" and "inconsistent" multiplicities. Russell (1905a) and Zermelo (1908) both took the path of denying the existence of the set of ordinals.

 

Actually, Burali-Forti used a notion he called "perfectly ordered" sets which was not equivalent to Cantor's notion of well-ordered sets. When Burali-Forti realized this, he thought the paradox would vanish, but in fact it was easily recast in terms of well-ordered sets.

 

Cantor (1899). Letter to Dedekind. In this letter, Cantor suggests dealing with Burali-Forti's paradox by denying the sethood of the multiplicity of all ordinals and distinguishing between "consistent" and "inconsistent" multiplicities. This suggests the later distinction by von Neumann made between sets and classes. Cantor does not give any criteria for making the distinction. Actually, the suggestion of distinguishing between "consistent" and "inconsistent" multiplicities dates back to Schroder (1890). Schroder suggested a multiplicity is consistent if its elements are "compatible" and inconsistent otherwise. Schroder, interestingly, introduced this distinction before the modern paradoxes were discovered.

 

Cantor does make a statement that equivalent multiplicities should both be 'sets' or both be inconsistent. This can be considered an early form of the Axiom of Replacement introduced by Fraenkel and Skolem in 1922.

 

Padoa (1900). Logical introduction to any deductive theory. Here Padoa introduces a method for determining when a term in a deductive system is not definable by means of other terms and when a proposition is not derivable from other propositions. This is essentially the method of looking at interpretations (models) of such systems. The method was largely unnoticed until Tarski and Lindenbaum (1926) drew attention to it. Tarski established the method for any system formalized in the theory of types. Beth (1953) dealt with the first-order case [presumably showing the definability theorem], which is more involved because the predicate whose independence is being shown is not in the range of a quantifier. Craig (1957) used his interpolation theorem to give a simpler proof of Beth's result. A. Robinson (1955) had obtained a theorem equivalent to interpolation and derived Beth's result. Schutte (1962) extended interpolation and Beth's theorem to intuitionistic logic.
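
A standard illustration of the method (not Padoa's own example): to see that multiplication is not definable from addition over the rationals, note that the map x |-> 2x is an automorphism of the additive structure but does not preserve multiplication, whereas anything definable from addition alone would have to be preserved by it.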

 

Russell (1902). Letter to Frege. Russell wrote this letter to Frege upon discovery of Russell's paradox (about the set of all sets that do not contain themselves). While Burali-Forti's paradox was of more set-theoretical concern, Russell's paradox was clearly a concern of logic. Russell provided a solution to the problem using his theory of types. Zermelo claimed that he had discovered the paradox independently.

 

Frege (1902). Letter to Russell. Frege responded promptly to Russell's letter. He believed Russell's paradox was the result of treating a function as something complete rather than incomplete, or "unsaturated." That is, we cannot treat functions as objects. Frege received Russell's letter while the second volume of his Basic Laws of Arithmetic was at the printshop. He added an appendix to the book in which he notes the paradox and proposes a restriction to avoid the paradox. Russell endorsed this solution in his book The Principles of Mathematics, but later considered other solutions (1905) before proposing his theory of types (1908a).

 

Hilbert (1904). On the foundations of logic and arithmetic. Hilbert presented this address at the Third International Congress of Mathematicians in August 1904. Hilbert had already provided axioms for geometry and for the real numbers and shown that the consistency of geometry can be reduced to that of the real-number system. One problem from his famous list of problems in 1900 was to show the consistency of the real-number system.

 

The discovery of Russell's paradox made the question of consistency more pressing. In this address, Hilbert attempts a proof of consistency of arithmetic by showing that all "formulas of a certain class have a property (of being 'homogeneous') by showing that the initial formulas have it and the rules transmit it."

 

Hilbert also considers various points of view on mathematical foundations and sketches his intentions to reduce mathematics to a collection of formulas with extralogical symbols. Hilbert returned to the problem of foundations more seriously in the 1920's.

 

Zermelo (1904). Proof that every set can be well-ordered. Zermelo presents a proof, using the axiom of choice, of the well-ordering theorem, which was one of Hilbert's problems. This came only two months after Konig presented a flawed proof (which led to Konig's independent discovery of what was essentially Richard's paradox) that there is no well-ordering of the continuum. Zermelo's proof brought the axiom of choice, which had been stated and rejected by Peano (1899), into the limelight. The proof was controversial, and Zermelo later discussed the reactions and presented a second proof.

 

Richard (1905). The principles of mathematics and the problem of sets. The fact that Konig had a [flawed] proof that the continuum cannot be well-ordered coupled with the fact that Zermelo had proven that all sets can be well-ordered appeared to imply that set theory contained contradictions stemming from the notion of well-ordering. In this letter, Richard (a mathematics teacher in Dijon) presents his paradox. Richard's paradox results from listing "all numbers that are defined by finitely many words" and performing a diagonalization to construct a number, via finitely many words, which cannot be in the list. Richard attributed the paradox to the false assumption that the set of all such numbers is "totally defined." Poincare (1906) agreed with this assessment. Peano (1906a) rejected Richard's paradox as a paradox of linguistics, not mathematics. The paradox is today considered solved by distinguishing between language levels. The diagonal argument was used again by Godel to prove his incompleteness theorem.

 

Konig (1905a). On the foundations of set theory and the continuum problem. This paper was written at about the same time as Richard's, and deals with "finitely definable" numbers. Konig considers that if the set of real numbers is well-ordered, then there must be a first real number which is not "finitely definable" (not listed as a finite ordinal). But this itself may be considered a finite definition. Konig considered this a proof that the continuum cannot be well-ordered. In particular, the Continuum Hypothesis fails. Richard's paradox indicates that this proof cannot be accepted. Accepting Zermelo's proof, we have that there is a first element of the real numbers which is not "finitely definable" which gives what is sometimes called the Zermelo-Konig paradox on the notion of definability.

 

A third paradox related to Richard's and Konig's is Berry's paradox about the "first number not definable in fewer than nineteen syllables." This paradox only involves a finite set of numbers.

 

Russell (1908a). Mathematical logic as based on the theory of types. [Summary of Quine's introduction] Russell discovered his paradox in June 1901 and wrote Frege about the paradox in 1902. In The Principles of Mathematics (1903), Russell writes that "it is the distinction of logical types that is the key to the whole mystery." However, by December 1905, Russell had abandoned the theory of types and presented three theories to overcome the paradoxes: "(1) the zigzag theory ('propositional functions determine classes when they are fairly simple, and only fail to do so when they are complicated and recondite'), (2) the theory of limitation of size ('there is not such a thing as the class of all entities'), and (3) the no-classes theory ('classes and relations are banished altogether')." Russell, at the time, favored the "no-classes" theory in which one talks about sentences and substitutions rather than the class of all objects that fulfill a sentence. Problems arise when one wants to quantify over what were considered classes, and Russell suspected the theory might be inadequate for much of classical mathematics. Soon he returned to the theory of types and published this paper on the theory of types in 1908.

 

Russell partitions the universe into levels, or types. Quantification is limited to objects of a certain type. Russell distinguished between "all" as expressed by bound, or "apparent", variables ranging over objects of a certain type and "any" as expressed by free, or "real", variables which may refer to an object of any type. This leads to a subtle difference between asserting a statement (as a "propositional function") and asserting its universal closure, which carries over into Principia Mathematica.

 

Russell attributes the paradoxes to the "vicious-circle principle": "No totality can contain members defined in terms of itself." In Russell's theory of types a formula containing an apparent variable must be of higher type than the variable. (This leads to a complicated semantics which is closely tied to the syntax, as described in Leivant.) Russell proceeds from the lowest level, the type of individuals, to the next level, the type of "first-order propositions", and so on to so-called "nth-order propositions." From this hierarchy follows a hierarchy of "propositional functions" (sentences with free variables). Russell's theory of types is known as ramified type theory because "the type of a function depends both on the types of its arguments and on the types of the apparent variables contained in it (or in its expression)." However, Russell introduced the axiom of reducibility into his system which cancelled the dependence on the types of the bound variables in the sense that propositional functions are assumed to be co-extensional with a "predicative" function (one in which the types of the apparent variables do not exceed those of the arguments). (Without such an axiom, the system makes for a "constructive set theory like Weyl's or that of intuitionism, both inadequate as a foundation for classical analysis.") "The axiom of reducibility is self-effacing: if it is true, the ramification that it is meant to cope with was pointless to begin with. Russell's failure to appreciate this point is due to his failure to distinguish between propositional functions as notations and propositional functions as attributes and relations." Ramsey (1931) suggested dropping ramification and the axiom of reducibility, noting that the use of ramification is to solve the "semantic" paradoxes rather than the set-theoretic paradoxes.

 

Russell's paper begins by considering a variety of paradoxes: the liar paradox, Russell's paradox, Berry's paradox, the paradox of the least indefinable ordinal, Richard's paradox, and Burali-Forti's paradox. He then distinguishes between "all" and "any", introduces the hierarchy of types, and goes on to describe the system.

 

Zermelo (1908). A new proof of the possibility of a well-ordering. Zermelo presents a new proof of the well-ordering theorem, again using the axiom of choice. In this proof, however, he assumes much less about set theory and derives the set-theoretic results necessary for the proof. Zermelo also discusses objections to the first proof which included a mistrust of Cantor's set theory, a wariness of the principle of choice, and a suspicion that the argument is reminiscent of those leading to the paradoxes. After this second proof, it was generally accepted that the axiom of choice implies the well-ordering theorem.

 

Zermelo (1908a). Investigations in the foundations of set theory I. Zermelo presents the first axiomatization of set theory. Dedekind (1888) had explicitly stated a number of principles of sets (or "systems"). However, the notion of set remained vague, which was intolerable after Burali-Forti and Russell's paradoxes were discovered. The two extremely different solutions to the paradoxes arriving in 1908 were Russell's theory of types and Zermelo's axiomatization of set theory.

 

Zermelo's basic idea for avoiding the paradoxes is to refuse sethood to collections that are "too big", such as the collection of all ordinals. Zermelo considered sets not just as collections but as objects satisfying his axioms. His axiom of separation was probably the most original. However, since the underlying logic is left unspecified, the axiom of separation (which depends upon the notion of "definite property") remained unclear. This flaw was later removed in different ways by Weyl, Fraenkel, Skolem, and von Neumann.

 

Since Cartesian products are not axiomatized, unordered pairs are used to define equivalence of disjoint sets. This equivalence is extended to any two sets by proving that for any set M we can always find an equivalent set disjoint from a given set N. Zermelo recognized that the existence of infinite sets requires a special axiom (unlike Dedekind who attempted to justify the existence of infinite sets). He also recognized the axiom of choice, which he used in his proof of the well-ordering theorem. The axioms are stated and a few theorems (including a form of Schroder-Bernstein and Cantor's theorem) are proven. In a paper published later in 1908, Zermelo showed how finite sets can be used to define the natural numbers.

 

Whitehead and Russell (1910). Incomplete symbols: Descriptions (from the introduction to Principia Mathematica). The logical device of description was used by Peano who introduced the original iota notation in which iota x is the unit set of x and backwards iota y is the sole member of the unit set y. Whitehead and Russell adapted this notation. Frege had also used a notation for the purpose of extracting an object from a unit set. Peano left descriptions unexplained for nonunit classes; Frege considered the operator to return the set of all objects satisfying the condition [semantically this is the identity function on such classes] for nonunit classes; Whitehead and Russell use a device to avoid Peano's incompleteness and Frege's arbitrariness when dealing with such cases. Any sentence containing a description operator can be expanded to a sentence without such an operator. This sentence will be false in the case of nonunit classes. This idea of "contextual definition" (look to the use, not to the meaning) dates back to Jeremy Bentham's (1843) "paraphrasis" as a method to account for fictions (Russell's examples included "the present King of France" and "the round square"). A mathematical application is a differential operator (which can be considered an "incomplete symbol"--of course, in Church's type theory a differential operator could be an object). One problem with Whitehead and Russell's approach is that it requires dealing with scope. An advantage is that the approach allows us to eliminate all individual and function constants in favor of predicates and relations.
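
As a rough illustration of the contextual treatment (not Whitehead and Russell's exact notation), a sentence psi(iota x . phi(x)) containing a description is expanded to

exists x . (phi(x) and (forall y . phi(y) -> y = x) and psi(x))

so that the whole sentence simply comes out false when phi determines no object or more than one object; this is how both Peano's incompleteness and Frege's arbitrariness are avoided.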

 

Wiener (1914). A simplification of the logic of relations. Wiener's thesis (1913) compared Schroder's system to Whitehead and Russell's. While studying Principia Mathematica under Russell between 1913 and 1914, Wiener noted that relations could be reduced to sets by defining ordered pairs. Wiener later recalled (1953) that this discovery "excited no particular approval on the part of Russell." Wiener's definition of the ordered pair <x,y> is essentially {{{x},empty},{{y}}}. Hausdorff (1914) defined the ordered pair as {{a,1},{b,2}} where 1 and 2 are "two distinct objects different from a and b." Kuratowski (1921) introduced the definition {{a,b},{a}} which is still used today. These definitions imply that the ordered pair has type higher than that of the elements by 2.
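
What matters about any such definition is only that it validates the pairing property

<a,b> = <c,d>  if and only if  a = c and b = d,

since that is all the reduction of relations to classes of pairs requires.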

 

Lowenheim (1915). On possibilities in the calculus of relatives. This paper contains a proof (with some omissions) of what is now known as the downward Lowenheim-Skolem theorem. The notation is that of Schroder's whose system encoded Peirce's "logic of relatives." The notation is arithmetical, uses subscripts to show dependence of a relation on arguments, and uses pi and sigma for quantifiers. The proof of the theorem consists of two parts. A preliminary part shows that a first-order formula can be written in a normal form equivalent with respect to satisfiability. This normal form contains a possibly infinite string of existential quantifiers involving "fleeing subscripts" (which play a role analogous to Skolem functions), then a string of universal quantifiers, and finally a quantifier-free formula. Lowenheim justified commuting "forall-exists" to "exists-forall" using the finite case and extending to the infinite case. This is unjustified without some use of the axiom of choice. The next part of the proof uses arguments which were later used by Skolem, Herbrand, and Godel. We start with a set of individuals corresponding to individual variables and free variables in the quantifier-free part of the formula. Then a hierarchy of sets is constructed from the so-called "fleeing subscripts." At each step in the construction the quantifier-free part of the formula can be replaced by what amounts to a conjunction of ground instances of the original quantifier-free part. At each finite step, there is a truth-functional assignment which satisfies this ground formula. From this fact, Lowenheim asserts the existence (without proper justification) of such an assignment for the infinite conjunction obtained at step omega. The gaps in Lowenheim's proof were filled in by Skolem.
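
In later terminology, the "fleeing subscript" device amounts to Skolemization: a formula such as

forall x exists y . R(x,y)

is satisfiable exactly when

forall x . R(x, f(x))

is satisfiable for some new function symbol f. Lowenheim's commutation of "forall-exists" to "exists-forall" is this move, and choosing the values f(x) over an infinite domain is where the axiom of choice enters.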

 

Skolem (1920). Logico-combinatorial investigations in the satisfiability or provability of mathematical propositions: A simplified proof of a theorem by L. Lowenheim and generalizations of the theorem. Skolem uses what is now known as Skolem normal form for satisfiability to provide a new proof for Lowenheim's theorem. With this normal form, Skolem need not consider the infinite strings of existential quantifiers that Lowenheim used. Skolem also fills in the gap at the end of the proof using Dedekind's notion of chain. Skolem generalizes the result to countable sets of formulas. In (1938) Skolem noted that different proofs of Lowenheim-Skolem yield different results: (1) If F is satisfiable, then it is satisfiable in an at most countable domain, and (2) If F has a model D, then there is an at most countable submodel of D. The proof presented in this 1920 paper proves (2). Skolem later (1922) proved the weaker statement (1) without using the axiom of choice. (This 1922 proof may be considered closer to the spirit of Lowenheim's proof.)
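
A formula in Skolem normal form for satisfiability has the shape

forall x1 ... xm exists y1 ... yn . M(x1,...,xm,y1,...,yn)

with M quantifier-free, and every first-order formula can be converted to one in this form that is satisfiable if and only if the original is; this is what lets Skolem avoid Lowenheim's infinite strings of existential quantifiers.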

 

Post (1921). Introduction to a general theory of elementary propositions. Post considers the propositional calculus, derived from a fragment of the system of Principia Mathematica. He presents both the axiomatic and truth-table treatments, and proves the system complete in three different senses. First, the system is complete in the sense that a formula is provable iff it is valid in the truth-table sense. Second, the system is complete ("in the sense of Post") [maximally consistent] in that if we adjoin any wff which is not provable as an axiom, then every wff is provable in the new system. Third, the system is complete (and this is how Post uses the term "complete" in the paper) in the sense that all truth functions can be defined in terms of the primitive truth functions--which he takes as negation and disjunction. He also proves consistency, in the sense that the system does not prove any wff consisting of a single propositional variable.
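
The truth-table side of the paper is easy to make concrete. The following Python sketch (an illustration only, not Post's notation) takes negation and disjunction as the primitive truth functions, defines implication and conjunction from them, and checks a formula under every assignment:

from itertools import product

# Post's primitive truth functions: negation and disjunction.
def neg(p): return not p
def disj(p, q): return p or q

# Other truth functions are definable from the primitives (the third sense of completeness).
def impl(p, q): return disj(neg(p), q)
def conj(p, q): return neg(disj(neg(p), neg(q)))

def valid(f, n):
    # A formula is valid in the truth-table sense when it is true under every assignment.
    return all(f(*vals) for vals in product([True, False], repeat=n))

# Example: contraposition (p -> q) -> (~q -> ~p) is valid.
print(valid(lambda p, q: impl(impl(p, q), impl(neg(q), neg(p))), 2))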

 

Post also considers generalizations by considering "an arbitrary number of (not necessarily binary) primitive truth functions" and by considering an arbitrary finite number of axioms and rules of inference. Finally, he considers m-valued truth tables. Post does not consider the independence of the axioms (a problem Bernays had solved in 1918 but did not publish until 1926).

 

Fraenkel (1922b). The notion "definite" and the independence of the axiom of choice. Fraenkel constructs a permutation [Fraenkel-Mostowski] model in which the axiom of choice fails. This may be considered a formalization of Russell's "socks" example which informally demonstrates how the axiom of choice might fail. (Note that this is set theory with atoms, or "urelements".) In order to carry out the proof, Fraenkel had to make Zermelo's notion of "definite property" (as used in the Axiom of Separation) precise. Fraenkel created a specific notion of "function" to formulate "definite property" precisely. Skolem was also trying to make this notion precise at about the same time. Skolem arrived at the solution of using expressibility by a wff. Skolem [1929] showed Fraenkel's method of using functions could be reduced to his own method of using wffs. von Neumann's method of introducing classes is a third approach. Cohen's forcing [1963] provided a proof which does not use urelements.
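
In outline (using later terminology): take countably many atoms split into pairs P_n = {a_n, b_n}, allow only permutations that move atoms within their own pair, and keep just the hereditarily symmetric sets (those fixed by every permutation fixing some finite set of atoms pointwise). The sequence of pairs survives into the model, but any choice function for it would be disturbed by swapping a_n and b_n for a pair outside its finite support, so the axiom of choice fails.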

 

Skolem (1922). Some remarks on axiomatized set theory. Skolem delivered this address to the Fifth Congress of Scandinavian Mathematicians in Helsinki, August, 1922. He presents a way of making Zermelo's "definite property" notion precise using expressibility by a class of wffs. A new proof of Lowenheim's theorem is given, filling in what Quine [1955] called the "law of infinite conjunction" without using the axiom of choice (Konig's lemma is often used for this purpose today). Skolem notes that the theorem implies there must be countable models of set theory and points out the Lowenheim-Skolem "paradox." Skolem points out that separation is not enough to imply the existence of "large" sets such as aleph_omega and proposes the axiom of replacement. Similar suggestions were made at about the same time by Fraenkel [1922a, 1921, 1922] and Lennes [1922]. Finally, Skolem points out that Zermelo's set theory does not uniquely determine its domain (by constructing a proper submodel of any model) and points out that this may imply certain set-theoretic questions, such as the continuum hypothesis, may be independent of Zermelo's axioms. (An analogy is made to the fact that the field theory axioms do not imply "exists x . x^2 = 2" or its negation.)
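
In modern form the replacement schema says: if phi(x,y) is a formula such that for each x there is at most one y with phi(x,y), then for any set A the image {y | exists x in A . phi(x,y)} is a set. Applied to the map n |-> aleph_n on omega, it yields a set whose union is aleph_omega, which separation alone cannot reach.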

 

Skolem (1923). The foundations of elementary arithmetic established by means of the recursive mode of thought, without the use of apparent variables ranging over infinite domains. This paper develops arithmetic using quantifiers bounded by primitive recursive functions (what Skolem calls the "recursive mode of thought") and allowing proofs by mathematical induction. Note that such bounded quantifiers may be considered shorthand and eliminated. "This arithmetic is now known as primitive recursive arithmetic." Every true instance of a formula in the system is verifiable (we may regard this as a proof of consistency of the system). A proof in the system amounts to a schema for calculating the proof of any instance of the formula proven. Later, Curry [1941] and Goodstein [1941, 1957] developed primitive recursive arithmetic in a logic-free equational calculus. One interesting result Skolem proves is that we can allow primitive recursion over a "course of values" (i.e., depending on all the previous values, not just the value of the predecessor) without increasing the class of functions.
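
The "recursive mode of thought" can be sketched in code (Python here, purely as an illustration): a primitive recursion is a loop driven by a base case g and a step case h, and a course-of-values recursion such as the Fibonacci numbers fits that shape once the finitely many earlier values are carried along.

def prim_rec(g, h):
    # f(0, x) = g(x);  f(n+1, x) = h(n, f(n, x), x)
    def f(n, x):
        acc = g(x)
        for i in range(n):
            acc = h(i, acc, x)
        return acc
    return f

# Addition by primitive recursion: add(0, x) = x; add(n+1, x) = S(add(n, x)).
add = prim_rec(lambda x: x, lambda i, acc, x: acc + 1)

# Course-of-values recursion (fib(n) uses fib(n-1) and fib(n-2)),
# reduced to primitive recursion by recursing on the pair of the last two values.
pair = prim_rec(lambda x: (0, 1), lambda i, acc, x: (acc[1], acc[0] + acc[1]))
fib = lambda n: pair(n, None)[0]

print(add(3, 4), fib(10))   # 7 55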

 

Brouwer (1923b, 1954, and 1954a). On the significance of the principle of excluded middle in mathematics, especially in function theory, Addenda and corrigenda, and Further addenda and corrigenda. Brouwer explains how finite mathematics led to the laws of logic which were later applied to nonfinite mathematics without justification. Using intuitionism, he shows counterexamples of classical theorems (including the example of a sequence whose definition depends on the digits of pi used to contradict Heine-Borel [in R, compact = closed and bounded]). (Elsewhere he gives intuitionistic forms of Bolzano-Weierstrass [infinite bounded sets in R have an accumulation point] and Heine-Borel.) He also splits the classical notion of convergent sequences into several intuitionistic notions: positively convergent sequence, negatively convergent sequence, and nonoscillating sequence. He later created an example of a sequence whose definition depends on at what time a theorem has been proven or disproven.

 

von Neumann (1923). On the introduction of transfinite numbers. For Cantor, the ordinals were the "order types" of the well-ordered sets, where "order types" were given by the equivalence relation induced by bijective order-preserving functions. Russell spoke in terms of equivalence classes. However, in axiomatic set theory, such classes are not sets. von Neumann introduced the alternative of using specific representatives of the equivalence classes. In particular, he introduced the idea of using the set of ordinals which precede the given ordinal, so that the ordering relation coincides with the membership relation. von Neumann carries this out assuming the notions of well-orderedness and similarity, though this is unnecessary because the ordinals can be defined directly using only the membership relation. Finally, he sketches a proof of the "principle of definition by transfinite induction."
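
Concretely, each ordinal is the set of ordinals preceding it: 0 = {}, 1 = {0}, 2 = {0,1}, the successor of alpha is alpha union {alpha}, and a limit ordinal is the union of the ordinals below it; the ordering alpha < beta then coincides with the membership relation alpha in beta.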

 

Schonfinkel (1924). On the building blocks of mathematical logic. Schonfinkel delivered an address on these ideas in 1920, but they were not written up until 1924 by Heinrich Behmann. It was known that Scheffer's stroke ("nand") is complete (for definability) for the propositional calculus. In an attempt to reduce predicate calculus in a similar manner, Schonfinkel introduced the U operator for recognizing disjoint classes. He carries the reduction much further by introducing the S ("fusion") and K ("constant") combinators and ultimately reducing formulas to applicative combinations of S, K and U (without variables). (Note: K was translated as C, even though the German word for constant function is "Konstanzfunktion.") The device of getting rid of (bound) variables dates back to Peirce's 1870 algebra, even before Frege's 1879 work in quantification theory. Schonfinkel treats classes and relations as special cases of functions. By allowing functions to stand as arguments, higher order classes and relations are also obtained. Schonfinkel introduces the idea of "Currying" functions, an idea anticipated by Frege [1893].
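
A small sketch of the reduction (in Python, treating everything as curried one-argument functions as Schonfinkel does; an illustration only):

# K x y = x (the constant combinator); S x y z = x z (y z) (the fusion combinator).
K = lambda x: lambda y: x
S = lambda x: lambda y: lambda z: x(z)(y(z))

# The identity combinator is already definable: I = S K K, since S K K x = K x (K x) = x.
I = S(K)(K)

# "Currying": a two-argument function becomes a one-argument function returning a function.
def curry(f):
    return lambda x: lambda y: f(x, y)

plus = curry(lambda a, b: a + b)
print(I(42), plus(2)(3))   # 42 5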

 

In the end, Schonfinkel also considers the possibility of introducing a primitive function J having the property that JJ is S, JS is K and JK is U. Then applications of this J suffice to represent the class of wffs represented previously by S, K and U. Schonfinkel does not consider axioms for the system (a problem Curry would work on). If the idea of using a single primitive J function is as arbitrary as it seems, then this would probably be reflected by an increased complexity in the axioms for such a system. (Actually, there is a single combinator X described in Barendregt so that XXX reduces to K and X(XX) reduces to S.)


 

Hilbert (1925). On the infinite. This is the text of an address Hilbert delivered in 1925. Hilbert presents his ideas on the foundations of mathematics, noting the role played by the infinite. The paper also presents an attempted proof of the continuum hypothesis. The continuum hypothesis was first stated by Cantor in 1878 and was Hilbert's first problem in his 1900 address. Konig had proven that "the power of the continuum is not a limit of denumerably many smaller cardinal numbers." This was essentially all the progress that had been made when Hilbert made his attempt. Hilbert considers the set of functions from omega to omega (which has the same cardinality as the reals--i.e., 2^aleph_0 = aleph_0^aleph_0) and attempts to find a mapping from omega_1 onto this set. Ackermann's function [a function defined using "double" recursion which is not primitive recursive] is introduced during the proof. Skolem maintained in 1928 that the continuum hypothesis might not be decidable from the axioms of set theory. Later, Godel [1938, 1939, 1940] showed the consistency of the continuum hypothesis (since it is true in his constructible universe), and Cohen [1963, 1963a, 1964] proved the independence of the continuum hypothesis (using forcing). One result of Hilbert's argument was that the hierarchy of number theoretic functions became an object of study.

 

The following is a passage from the text (p. 375-376 of von Heijenoort):

 

"But there is a completely satisfactory way of escaping the paradoxes without committing treason against our science. The considerations that lead us to discover this way and the goals toward which we want to advance are these:

 

(1) We shall carefully investigate those ways of forming notions and those modes of inference that are fruitful; we shall nurse them, support them, and make them usable, wherever there is the slightest promise of success. No one shall be able to drive us from the paradise that Cantor created for us.

 

(2) It is necessary to make inferences everywhere as reliable as they are in ordinary elementary number theory, which no one questions and in which contradictions and paradoxes arise only through our carelessness.

 

Obviously we shall be able to reach these goals only if we succeed in completely clarifying the nature of the infinite."

 

von Neumann (1925). An axiomatization of set theory. This paper gives von Neumann's axiomatization of set theory. von Neumann's set theory includes sets and classes as objects. However, the present paper presents the axiomatization not in terms of sets and classes, but in terms of functions. The universe includes "I-objects" (arguments) and "II-objects" (including characteristic functions of classes). Some objects are I-II-objects. This system is finitely axiomatizable, in contrast to Zermelo's set theory. von Neumann searches for a uniquely characterized minimal model by considering models contained in other models. These "inner models" have been useful for investigating relative consistency. von Neumann's system was simplified, revised, and expanded by R. M. Robinson (1937), Bernays (1937-1954, 1958), and Godel (1940). It is now known as Godel-Bernays or von Neumann-Bernays-Godel or von Neumann-Godel-Bernays set theory.

 

Kolmogorov (1925). On the principle of excluded middle. This paper anticipates Heyting's formalization of intuitionistic logic and provides a method of interpreting classical mathematics in intuitionistic mathematics. Kolmogorov presents two formal systems H and B for propositional calculus with implication and negation. H is classical and B is the minimal calculus (Heyting's system without Axiom h "A -> (-A -> B)"). He extends these systems to include universal quantification with a rule of generalization and axioms for the universal quantifier. Kolmogorov introduces a notion of "pseudotruth" by taking a wff C and recursively double negating each subwff to obtain C*. He then proves that if H|-C, then B|-C*. An analogous result holds if we include the universal quantifier in the systems.
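
In outline (a sketch of the idea, not Kolmogorov's own formulation): the translation double negates at every level, so an atomic A becomes not not A, (A -> B)* becomes not not (A* -> B*), (not A)* becomes not not (not A*), and (forall x . A)* becomes not not (forall x . A*); the theorem is then that any classical proof of C can be turned into a proof of C* in the weaker system.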

 

Note: Classical positive implicational calculus is given by intuitionistic positive implicational calculus + Peirce's law "((A->B)->A)->A" [and considering the Curry-Howard correspondence for simply typed lambda calculus without constants we have that there is no closed term of type "((a->b)->a)->a"--see Hindley 1997].

Note: For an implementation of the Kolmogorov interpretation in Twelf, see the report on the following page.

 

Finsler (1926). Formal proofs and undecidability. Finsler uses the diagonal argument of Richard's paradox (replacing truth with provability) to demonstrate the existence of a formula which is false, but formally undecidable. This result can be interpreted as foreshadowing Godel's Incompleteness Theorem. However, the notions of "formal provability" are different. Finsler distinguishes between "formal" and "conceptual" by considering the "formal" to consist of expressions in a given unambiguous language. He considers the list of binary sequences which are definable by a finite linguistic expression. We can define the antidiagonal sequence, but this definition belongs to the conceptual domain, so there is no contradiction. The antidiagonal is definable, but not formally definable. The same argument applies to proofs. We can list all the proofs which prove either that an infinite sequence contains infinitely many zeros or finitely many zeros. With each proof we can list the infinite sequence to which it refers. Then we form the antidiagonal. There can be no proof that determines whether the antidiagonal contains finitely or infinitely many zeros. However, we can determine that the antidiagonal must, in fact, contain infinitely many zeros by considering the fact that there must be infinitely many proofs that the sequence "1 1 1 . . . " has finitely many zeros.

 

The distinction between "truth" and "provability" can be expressed as follows. We can enumerate the formulas F_i(x) of exactly one free variable x (of an appropriate system) and define a_i,j and b_i,j as follows: a_i,j = 1 if F_i(j) is true and 0 otherwise; b_i,j = 1 if F_i(j) is provable and 0 otherwise. Soundness implies b_i,j <= a_i,j. Richard's paradox implies there is no k so that 1 - a_j,j = a_k,j for all j. However, we may have a k so that 1 - a_j,j = b_k,j. If we have such a k, then we can diagonalize (let j = k) and we have a k so that 1 - a_k,k = b_k,k. This implies that F_k(k) is true but not provable.

 

Brouwer (1927). On the domains of definition of functions. Starting in 1918 Brouwer created intuitionistic "set theory" and reconstructed point-set topology and analysis in the intuitionistic setting. This paper proves that every function (intuitionistically) defined everywhere on [0,1] is uniformly continuous. The argument involves proving the bar theorem (a fundamental theorem on intuitionistic "sets") and a corollary, the fan theorem.

 

Hilbert (1927). The foundations of mathematics. Hilbert describes his formal system in detail and discusses some criticisms. Among the criticisms, he considers Poincare's objection that a consistency proof for a system with induction would come too late, because induction was used to construct the system and prove properties about it. Weyl responded to Hilbert's analysis of this criticism.

 

The following is an excerpt from the paper which discusses the intuitionistic attack on excluded middle (see von Heijenoort, pp. 475-476). Note that the logical epsilon-axiom states A(a) -> A(epsilon(A)).

 

"Intuitionism's sharpest and most passionate challenge is the one it flings at the validity of the principle of excluded middle, for example, in the simplest case, at the validity of the mode of inference according to which, for any assertion containing a number-theoretic variable, either the assertion is correct for all values of the variable or there exists a number for which it is false. The principle of excluded middle is a consequence of the logical epsilon-axiom and has never yet caused the slightest error. It is, moreover, so clear and comprehensible that misuse is precluded. In particular, the principle of excluded middle is not to be blamed in the least for the occurrence of the well-known paradoxes of set theory; rather, these paradoxes are due merely to the introduction of inadmissible and meaningless notions, which are automatically excluded from my proof theory. Existence proofs carried out with the help of the principle of excluded middle usually are especially attractive because of their surprising brevity and elegance. Taking the principle of excluded middle from the mathematician would be the same, say, as proscribing the telescope to the astronomer or to the boxer the use of his fists. To prohibit existence statements and the principle of excluded middle is tantamount to relinquishing the science of mathematics altogether. For, compared with the immense expanse of modern mathematics, what would the wretched remnants mean, the few isolated results, incomplete and unrelated, that the intuitionists have obtained without the use of the logical epsilon-axiom? The theorems of the theory of functions, such as the theory of conformal mapping and the fundamental theorems in the theory of partial differential equations or of Fourier series--to single out only a few examples from our science--are merely ideal propositions in my sense and require the logical epsilon-axiom for their development.

 

"In these circumstances I am astonished that a mathematician should doubt that the principle of excluded middle is strictly valid as a mode of inference. I am even more astonished that, as it seems, a whole community of mathematicians who do the same has now constituted itself. I am most astonished by the fact that even in mathematical circles the power of suggestion of a single man, however full of temperament and inventiveness, is capable of having the most improbable and eccentric effects."

 

The following is the next to the last paragraph of the paper (from von Heijenoort, p. 479):

 

"From my presentation you will recognize that it is the consistency proof that determines the effective scope of my proof theory and in general constitutes its core. The method of W. Ackermann permits a further extension still. For the foundations of ordinary analysis his approach has been developed so far that only the task of carrying out a purely mathematical proof of finiteness remains. Already at this time I should like to assert what the final outcome will be: mathematics is a presuppositionless science. To found it I do not need God, as does Kronecker, or the assumption of a special faculty of our understanding attuned to the principle of mathematical induction, as does Poincare, or the primal intuition of Brouwer, or, finally, as do Russell and Whitehead, axioms of infinity, reducibility, or completeness, which in fact are actual, contentual assumptions that cannot be compensated for by consistency proofs."

 

Weyl (1927). Comments on Hilbert's second lecture on the foundations of mathematics. Weyl declared his support for Brouwer's intuitionism in 1920, but he did not maintain strict allegiance to Brouwer. In this paper, Weyl opposes Hilbert's response to Poincare's criticism concerning the circularity of using induction to prove metamathematical results such as consistency. Hilbert insisted that these metamathematical uses are not really induction, but a procedure which gives each concrete case. Weyl countered, "One may here stress the 'concretely given'; on the other hand, it is just as essential that the contentual arguments in proof theory be carried out in hypothetical generality, on any proof, on any numeral." Induction is what justifies such generality.

 

After comparing the situation in foundations of mathematics to theoretical physics, Weyl suggests that Hilbert's axiom system is not self-evident but arbitrary. He concludes, "If Hilbert's view prevails over intuitionism, as appears to be the case, then I see in this a decisive defeat of the philosophical attitude of pure phenomenology, which thus proves to be insufficient for the understanding of creative science even in the area of cognition that is most primal and most readily open to evidence--mathematics."

 

Bernays (1927). Appendix to Hilbert's lecture "The foundations of mathematics." In Hilbert's [1922] formalization of arithmetic he used a tau function on functions which takes tau(f) to 0 if f is constantly 0 and to the first number a with f(a) not 0 otherwise. Universal and existential quantifiers of first-order arithmetic can be defined using tau. Hilbert sketched a consistency proof for such a system with one primitive function. Ackermann (1924) attempted to transform this sketch into a consistency proof of analysis. At this point, Hilbert and his disciples replaced tau(f) with epsilon(A) for propositional functions A(a), which returns a number for which A holds if it holds for any number. The argument is complicated by the fact that we have several propositional functions which may lead to nested epsilon terms. Ackermann was able to carry out an analogous argument, but at the time of publication realized his proof did not ensure consistency of full analysis. He responded by restricting substitution which decreased the strength of his system so that the consistency proof succeeded. Hilbert's remarks refer to this latter proof and Bernays' paper gives more details of the proof.
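
For orientation (in modern notation): with the epsilon operator, exists x . A(x) can be defined as A(epsilon(A)) and forall x . A(x) as A(epsilon(not A)); the epsilon-axiom A(a) -> A(epsilon(A)) then yields the usual quantifier laws, and the principle of excluded middle for quantified statements follows, as Hilbert remarks in the excerpt above.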

 

Brouwer (1927a). Intuitionistic reflections on formalism. Brouwer lists four points (presented as "insights") on which intuitionism and formalism could enter into a dialogue. The first three insights note the similarity between finitary metamathematics and intuitionistic mathematics. In particular, the third insight is that the principle of excluded middle is the same as "the principle of the solvability of every mathematical problem." The fourth insight is that consistency proofs cannot provide a foundation because of the "vicious circle" such a proof necessarily contains.

 

Ackermann (1928). On Hilbert's construction of the real numbers. The notion of primitive recursion first appears in Dedekind 1888. Skolem [1923] showed that we may allow primitive recursion over a "course of values" without increasing the class of p.r. functions. Hilbert obtained a hierarchy of functions by considering functions of type n. A function of type 1 is a number theoretic function. A function of type i+1 is a functional whose values are natural numbers and one of whose arguments is a function of type i. Ackermann provided an example of a type 2 function, the iteration function defined by

 

rho(f,a,0) = a

rho(f,a,n+1) = f(rho(f,a,n))

 

A number theoretic function of level n is defined by such primitive recursive equations using a function of type n (but none of greater type) for the defining functions g or h of the recursion schema. So, primitive recursive functions are of level 1. Ackermann answered the question of whether there were any genuine functions of higher levels by demonstrating a function of level 2. Ackermann's function is defined by

 

phi(a,b,0) = a + b

phi(a,b,n+1) = rho(lambda c . phi(a,c,n), alpha(a,n), b)

 

where rho is the iteration function defined above and alpha is the p.r. function given by

 

alpha(a,0) = 0

alpha(a,1) = 1

alpha(a,n) = a for n > 1.

 

If phi were of level 1 (i.e., if it were p.r.), then the function phi(a,a,a) would be p.r. Ackermann showed this function grows faster than any p.r. function. phi can be given by equations which do not use rho, but do use primitive recursion on two variables. So, phi is effectively computable, but not p.r. This discovery led to classification of recursion schemas [Peter 1951, 1957] and led Herbrand to search for a notion of effectively calculable functions.
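
The equations above translate directly into a program (Python here, as an illustration only), which makes the claim of effective computability concrete; levels 0, 1, 2, 3 give addition, multiplication, exponentiation, and iterated exponentiation.

def rho(f, a, n):
    # The iteration function: rho(f, a, 0) = a; rho(f, a, n+1) = f(rho(f, a, n)).
    for _ in range(n):
        a = f(a)
    return a

def alpha(a, n):
    # alpha(a, 0) = 0; alpha(a, 1) = 1; alpha(a, n) = a for n > 1.
    return 0 if n == 0 else (1 if n == 1 else a)

def phi(a, b, n):
    # phi(a, b, 0) = a + b; phi(a, b, n+1) = rho(lambda c . phi(a, c, n), alpha(a, n), b).
    if n == 0:
        return a + b
    return rho(lambda c: phi(a, c, n - 1), alpha(a, n - 1), b)

print(phi(2, 3, 0), phi(2, 3, 1), phi(2, 3, 2), phi(2, 3, 3))   # 5 6 8 65536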

 

Ackermann considered these classes of functions in an attempt to clarify Hilbert's proposed proof of the continuum hypothesis, by counting the number-theoretic functions.

 

Skolem (1928). On mathematical logic. This paper is particularly important for the history of automated theorem proving. Skolem presents Boolean algebra, the first-order predicate calculus and touches on the second-order predicate calculus. Instead of concentrating on the axiomatic method as a proof procedure, he follows the idea of the proof of the Lowenheim-Skolem theorem. The wff of interest is put into Skolem normal form (which is equivalent with respect to satisfiability, though this is not shown) and the collection of "nth level" ground (Skolem) terms [the Herbrand universe] is considered. There are finitely many such terms at each level n. If the wff is satisfiable, then the conjunction of all ground instances (over the nth level terms) of the (quantifier-free part of the) wff must be satisfiable, for each n. The contrapositive is that if the wff is unsatisfiable, then for some n we have a truth-functional contradiction. So, Skolem essentially establishes that this proof procedure is sound and complete (in the sense of Quine [1955a, p. 145]). This provides proofs which are cut-free and satisfy the subformula property. Skolem also seems to argue that if a wff is satisfiable then it may be consistently added as an axiom. Skolem's work here is analogous to Herbrand's thesis work [1930]. In fact, we can deduce completeness of first-order logic from the completeness of Skolem's proof procedure (which links to satisfiability) combined with Herbrand's [1930] theorem (which links to provability). Godel proved completeness in 1930.
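
The level-by-level construction of ground terms is easy to sketch in code (Python, with a hypothetical signature of one constant c and one binary function symbol f; an illustration only):

from itertools import product

def next_level(terms, funcs):
    # Close the current set of ground terms under one application of each function symbol.
    new = set(terms)
    for name, arity in funcs:
        for args in product(terms, repeat=arity):
            new.add(name + "(" + ",".join(args) + ")")
    return new

level = {"c"}
for n in range(1, 4):
    level = next_level(level, [("f", 2)])
    print(n, len(level))
# There are finitely many terms at each level; if at some level the conjunction of ground
# instances of the matrix is truth-functionally unsatisfiable, the original wff is unsatisfiable.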

 

Skolem also uses his procedure to approach the decision problem for certain classes of formulas. In particular we can solve problems with a single universal quantifier followed by a string of existential quantifiers. Bernays and Schonfinkel [1928] had solved the special case with the prefix "forall x exists y." At about the same time as Skolem's result, Ackermann proved a stronger result by providing an "upper bound on the number of individuals of the finite domain in which the formula is satisfiable, if it is satisfiable at all." Skolem [1928b] reviewed Ackermann's paper (which used a method similar to Skolem's), simplified Ackermann's arguments, and provided a smaller upper bound. Skolem also shows that we can use the proof procedure to decide wff's with prefix "forall x1, . . ., xm exists y1, . . ., yn" so long as every xi occurs in each atomic part of the matrix of the wff.

 

Herbrand (1930). Investigations in proof theory: The properties of true propositions. This is Chapter 5 of Herbrand's Ph.D. thesis (apparently completed in April 1929). Herbrand studies properties of wff's he calls A, B, and C and attempts to prove the fundamental theorem that each of these properties is equivalent to provability. He follows the fundamental theorem with applications to the decision problem and to the study of consistency of several systems. Errors in some of Herbrand's lemmas were pointed out by Dreben, Andrews, and Aanderaa [1963] and corrected by Dreben and Denton [1966].
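
In its simplest form (for a purely existential prenex formula), the fundamental theorem says that exists x1 ... xn . A(x1,...,xn), with A quantifier-free, is provable if and only if some finite disjunction of instances A(t11,...,t1n) or ... or A(tk1,...,tkn), built from terms of the language, is a tautology; the properties A, B, and C extend this to arbitrary formulas.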

 

Herbrand was interested in Hilbert's program and adopted Hilbert's finitistic point of view towards metamathematics. As a result, Herbrand was less interested in semantic concepts such as validity than in syntactic concepts such as provability (or "truth" as he called it). His work can be seen as a reinterpretation of Lowenheim-Skolem in this context.

 

Gentzen claimed Herbrand's theorem was a special case of his sharpened Hauptsatz, but this was a mistake. Herbrand's theorem does not apply only to formulas in prenex (as Gentzen apparently thought) but to all formulas. Gentzen's theorem only applies to formulas which can be represented as a sequent of prenex formulas. Herbrand also supplies more information about the "middle sequent" than Gentzen does. Gentzen's Hauptsatz on cut-elimination corresponds to Herbrand's elimination of modus ponens. On the other hand, Gentzen's Hauptsatz generalizes to intuitionistic and various modal logics. Herbrand's theorem does not.

 

Godel (1930a). The completeness of the axioms of the functional calculus of logic. This is a rewritten version of Godel's dissertation in which he proved the completeness of the predicate calculus and predicate calculus with identity. Godel proves that every formula of the pure predicate calculus is either refutable or aleph_0-satisfiable (so that the proof also implies Lowenheim-Skolem). The techniques are similar to those used by Lowenheim and Skolem, except Godel includes a step connecting provability with semantic arguments. For the predicate calculus with identity, Godel proves that irrefutable formulas are either finitely satisfiable or aleph_0-satisfiable. He also proves the compactness theorem, which provides a generalization of completeness to infinite sets of formulas. In 1949, Henkin presented a proof of completeness of first-order logic along different lines.
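
The compactness theorem here takes the form: a countably infinite set of formulas is satisfiable provided every finite subset of it is satisfiable; equivalently, whatever follows from an infinite set of axioms already follows from finitely many of them.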

 

Godel (1930b, 1931, 1931a). Some metamathematical results on completeness and consistency, On formally undecidable propositions of Principia mathematica and related systems I, and On completeness and consistency. This includes an abstract presented in 1930, the famous paper by Godel in which he proves his incompleteness theorems, and a note closely connected with the paper. The argument is based on the Liar paradox, except instead of considering a statement expressing "I am not true," he considers a statement expressing "I am not provable." The diagonalization used to construct such a statement is reminiscent of Cantor's diagonal procedure and Richard's paradox.

 

Godel works in a system P with logical axioms equivalent to those of Principia Mathematica (but without ramification--it appears to be monadic omega-order logic) and Peano's arithmetic axioms. Since the individuals are identified with the natural numbers, we have a standard model in which we can consider whether a closed formula is true or false. Godel arithmetizes the language and shows that we can effectively decide the Godel numbers of axioms and proofs. This involves giving a definition of the class of primitive recursive functions (Godel calls them simply "recursive") and showing that the appropriate predicates are in the class. He proves that every primitive recursive predicate is representable in P. He defines omega-consistency. These tools are enough to show his incompleteness theorem by constructing the sentence expressing "I am not provable" and showing that it is neither provable nor disprovable in P (and that such a sentence can be constructed for any omega-consistent extension P_kappa of P). Even though system P_kappa allows reference to higher-order objects, primitive recursive predicates can be represented by arithmetic (first-order) formulas. This implies that there are undecidable arithmetic propositions.
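
Concretely (in modern notation rather than Godel's), the construction yields a sentence G for which P proves G <-> not Prov([G]), where Prov is the arithmetized provability predicate and [G] is the numeral for the Godel number of G. Consistency of P rules out a proof of G, and omega-consistency rules out a proof of not G.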

 

The statement "there exists an unprovable formula of P_kappa" expresses the consistency of P_kappa. We can internalize the first half of Godel's proof to obtain Godel's second incompleteness theorem: If P_kappa is consistent, then P_kappa does not prove "P_kappa is consistent" (i.e., "there exists an unprovable formula of P_kappa").

 

Rosser [1936] extended Godel's result by dropping omega-consistency. Hilbert and Bernays [1939] carried out the details of Godel's second incompleteness theorem for two number theory systems Z_mu and Z. Godel [1934] made precise the notion of general recursive functions as functions uniquely determined by equations among terms involving known (recursive) functions and the unknown function (following a suggestion by Herbrand). Church [1936] showed predicate calculus is undecidable.

 

Of course, Godel's results necessitated a profound change in Hilbert's program.

 

One small but interesting remark made by Godel in the second section of the paper is that we need only consider one-place relations (predicates) in the system P (see Wiener [1914]). This would later be used by Gentzen [1936] when he showed consistency of the simple theory of types. Godel states:

 

"Variables for functions of two or more argument places (relations) need not be included among the primitive signs since we can define relations to be classes of ordered pairs, and ordered pairs to be classes of classes; for example, the ordered pair a, b can be defined to be ((a),(a,b)), where (x,y) denotes the class whose sole elements are x and y, and (x) the class whose sole element is x."

 

And in a footnote:

 

"Nonhomogeneous relations, too, can be defined in this manner; for example, a relation between individuals and classes can be defined to be a class of elements of the form ((x_2),((x_1),x_2)). Every proposition about relations that is provable in PM is provable also when treated in this manner, as is readily seen."

 

Herbrand (1931b). On the consistency of arithmetic. Herbrand proves consistency of a fragment of arithmetic in which induction is applied only to formulas without bound variables or involving only the successor function. Ackermann [1924] and von Neumann [1927] had proved consistency for other such fragments with restricted induction. Herbrand's proof is intended as a contribution to Hilbert's program. Herbrand uses the word "intuitionistic" to describe the metamathematical methods he considers admissible, but this is closer to Hilbert's "finitary" than Brouwer's "intuitionism." Herbrand introduces a class of functions which are the general recursive functions. This is the first appearance of general recursive functions as opposed to primitive recursive functions. He then uses such functions to eliminate the use of his restricted induction axiom.

 

Herbrand had a chance to examine Godel's incompleteness theorem before completing this paper. He explains why Godel's theorem does not apply to the (weak) fragment of arithmetic he considers.

 

The published paper was preceded by a note written by Helmut Hasse and translated in von Heijenoort (pp. 619-620):

 

"Jacques Herbrand, born on 12 February 1908 in Paris, had a fatal accident on 27 July 1931 while mountain climbing in the Alps. The same day the editors received the manuscript published below.

 

"He spent the last six months of his life at German universities, in close contact with a number of German mathematicians and engaged in a lively exchange of ideas with them. They were all profoundly impressed by his noble personality, which was endowed with rich scientific gifts. With him an extraordinarily gifted mind has perished in the bloom of youth. The beautiful and important results that he found in the field of number theory and of mathematical logic and the fruitful ideas that he expressed in mathematical conversations justified the greatest hopes. The science of mathematics has suffered a severe and irreparable loss by his untimely death."