Witold Marciszewski

On advancing frontiers of science. A pragmatist approach

Paper read at the Polish-Austrian Conference Science versus Utopia. Warsaw, November 23-24, 2011.
Published in "Studia Philosophiae Christianae", XLVII nr 4 (2011), pp. 51-71.

ABSTRACT

The pragmatist approach, as stated in this essay, takes into account two features of knowledge, both having an enormous potential for growth: the scope of science, whose frontiers can be advanced indefinitely, and the firmness of its propositions, which grows as once attained frontiers get consolidated. An opposite view may be called limitativist, as it conservatively sticks to some a priori limiting principles which do not allow progress in certain directions. Some of them influence science from outside, like ideological constraints; other ones are found inside science itself.

The latter can be exemplified by principles like these: (1) there can be no action at a distance; (2) there are no necessary truths; (3) there are no abstract objects. The first might have limited physics by enforcing the rejection of the theory of gravitation. The second entails that arithmetical propositions are either devoid of (classical) truth or are not necessary; this would limit arithmetic to the role of a mere calculating machine, without giving any insights into reality. The third principle, for instance, limits logic to the first-order level (since at the second-order level variables range over abstract sets). The history of ideas shows that such limiting principles, had they been obeyed, would have hindered some great achievements of science. This is why we should not acknowledge any such principle as necessarily true, that is, as winning in confrontation with any view contrary to it. Such principles should compete on equal terms with other propositions for as high a degree of epistemic necessity as they prove worthy of.

To the core of the pragmatist approach there belongs treating epistemic necessity as a gradable attribute of propositions. In accordance with ordinary usage, "necessary" is a gradable adjective, having a comparative form. The degree of epistemic necessity of a scientific statement depends on how much it is needed by the rest of the field of knowledge (Quine's metaphor). The greater the damage to knowledge that would be caused by getting rid of the point in question, the greater its epistemic necessity. At the top of such a hierarchy are the laws of logic and arithmetic. Among physical laws, at a very high level there is the law of gravitation, owing both to its universality, that is, a colossal scope of possible applications (advancement of frontiers), and to its having been empirically confirmed in innumerable cases (consolidation of frontiers). Such a success proved possible owing to the bold transgression of the limiting principle 1 (see above), and this has resulted in so high a degree of unavoidability.


The pragmatist approach of this essay can be expressed with the Chinese proverb: Black cat or white cat: if it can catch mice, it's a good cat.

Another motto is suggested in the announcement of a scientific conference: The frontiers of science are by definition continually shifting. Such a continuous shifting is what we call advance of science.


§1. Frontiers versus limits

The kinship of meaning between these terms is misleading, though in some translations both happen to be rendered by the same word, e.g. German "Grenze", Polish "granica". However, there is a significant opposition in their use.

A limit is something static and negative -- it marks a line that is not allowed, or not likely, to be gone beyond. When, for instance, we speak of a sequence of numbers as tending to a limit, we mean a point that cannot be exceeded; the sequence definitely stops at this point (while we do not speak of a number sequence as tending to a frontier). The derivative "limitation" means setting a limit to changes, in particular, changes being progressive. On the other hand, "frontier" means either a dynamic line which can shift forward, or a region to be occupied owing to such a shift. Definitions of frontier as found in dictionaries are as follows.

  1. A region just beyond or at the edge of a settled area.
  2. A wilderness at the edge of a settled area of a country.
  3. An undeveloped field of study; a topic inviting research and development.

Meanings 1 and 2 have evolved in the idiom of American settlers and pioneers in the Wild West, who moved forward with their horses and wagons. Item 3 refers to the intellectual quest of pioneers and discoverers in science. Such an advancing frontier marks successive territorial gains.

In human actions, the existence of a limit means a constraint to stop some moves or prevent some kinds of behaviour. This is something that narrows the scope of our freedom or of our possibilities. There may be limitations imposed on human actions through some human decisions; this is the case with legal systems, monastic rules, military discipline, etc.

Moreover, there are limitations which derive from the natural order, and get perceived and recognized by people. These are usually expressed in the form of rules to control our behaviour. If such a rule is of special importance, somehow fundamental, it is often honored with the name of a principle. Thus we come to an important category which deserves the name of limiting principles.

The choice of this name is no eccentric novelty. Already in 1949 it was introduced into the philosophical vocabulary by the famous British philosopher C. D. Broad. Here is his definition.

"There are certain limiting principles which we unhesitatingly take for granted as the framework within which all our practical activities and our scientific theories are confined. Some of these seem to be self-evident. Others are so overwhelmingly supported by all the empirical facts which fall within the range of ordinary experience and the scientific elaborations of it [...] that it hardly enters our heads to question them. Let us call these Basic Limiting Principles."

See: "The Relevance of Psychical Research to Philosophy", Philosophy 24, pp. 291-309.

I take advantage here of invoking a well-known author, but I do not follow his own list of limiting principles. Broad was most interested in mind-body relations, hence his principles mainly deal with that domain. Here we need a more comprehensive use, covering various domains of science and philosophy. The ordinary meaning of the verb "to limit" makes such a broad use justifiable. Hence I employ the phrase "limiting principles" to denote constraints exercised on our knowledge from outside, by some institutions or ideologies (example LP.1 in §2), as well as those acting within philosophy or science.

Nevertheless, the claim LP.2 was essential in Broad's original research; in the period around 1920, together with Bertrand Russell, he belonged to that small circle of philosophers who understood the revolutionary ideas of the physics of their time.

See: Steve Bayne, "Russell and C. D. Broad on Space", Bertrand Russell Society, 2000; www.hist-analytic.org/russell_and_broad_on_space_apa.htm.

The fact of being subjectively taken for granted does not necessarily render such principles objectively true. Some of them might be right, other ones wrong. If a limiting principle is right, then it helps us to avoid errors; otherwise it puts a limit to progress, that is, holds back the advancing of the frontiers of science.


§2. Some samples of limiting principles

As instructive examples of such limitations concerning science and philosophy, let us consider the following principles.

  • LP.1: The LP that the teaching of the Catholic Church forms a source of limitative principles concerning the development of science and philosophy. This general limitative principle was divided into quite a number of detailed instructions in the basic document of 1864 entitled "The Syllabus of Errors Condemned by Pius IX".

    This document (www.papalencyclicals.net/Pius09/p9syll.htm) lists opinions judged as erroneous; hence, in order to learn a limitative principle from any of them, the sentence in question should be denied. For instance, when the condemned view (item 14) reads "Philosophy is to be treated WITHOUT taking any account of supernatural revelation", the replacement of the negative particle "without" by the positive "with" yields the following (numbered as non-14) limitative principle:

    • non-14) Philosophy is to be treated WITH taking into account the supernatural revelation.

    • non-11) The Church ["not" cancelled] ought to pass judgments on philosophy, and ought NOT [added] to tolerate the errors of philosophy.

    • non-12) The decrees of the Apostolic See and of the Roman congregations DO NOT impede the true progress of science.

    Let us imagine some limitations following from these principles. As for non-14, philosophy of mind could not be developed without maintaining, for instance, the dogma of the soul's immortality; at this point the freedom of research would be limited. According to non-11, the freedom of inquiry would get limited to those philosophical statements which are not regarded by the Church as wrong. According to non-12, it is not allowed, e.g., to assert that the condemnations of Copernicus and Galileo impeded the true progress of science (the condemnation of Copernicus was revoked in 1835).

  • LP.2: The Leibnizian LP: There can be no action at a distance. I call it Leibnizian (for mnemonic reasons) though Leibniz was not alone in blaming the idea of gravitation for violating the principle in question. However, his eminence among the critics seems to justify such naming.

  • LP.3: The Humean LP, shared by the Vienna Circle: No proposition concerning the reality outside language enjoys the status of epistemic necessity, since any proposition is either empirical or mathematical. Being empirical, it is refutable, hence not necessary. Being mathematical, it has no epistemic import, for it does not deal with any reality; hence its necessity is a matter of linguistic convention, unable to grant any cognitive content to mathematical theorems.

  • LP.4: The nominalist LP: higher-order logics should be disregarded for the lack of any objectual reference of their quantified variables.

  • LP.5: The constructivist LP: in order to acknowledge the existence of a mathematical entity, it has to be constructed by appropriate operations of the human mind.

The first item represents limitations imposed on science and philosophy from outside, by authorities having had considerable means to hamper intellectual quests. Why mention such things nowadays, when in our open society such restrictions have lost any compelling power? However, there is an instructive moral in the story. Not so much in the publishing of the Syllabus in 1864, but in the fact that the present practice of the Catholic Church -- with respect to any research -- agrees with the claims condemned in the Syllabus. Since these claims derive from the philosophy of the Enlightenment, it may be said that nowadays we witness the Church converted to the Enlightenment (not on this point alone, but also on human rights, etc.).

This deserves to be regarded as a success of the pragmatic attitude toward science. The Enlightenment belief in the power of reason was mainly due to the astonishing success of Newton's physics, esp. his theory of gravitation. This achievement consisted in a formerly unimaginable range of applications of a scientific theory -- applications extending over the whole universe, from the earth to the most remote stars, over macroscopic regions as well as microscopic ones. Such a pragmatist argument must have convinced the whole academic world, and the whole educated public, of the power of human reason even when acting against theological LPs. And then, the only reasonable move remaining to the Church was to retreat from condemning the autonomy of science.


§3. Newton's gravitation as a "good cat" to advance frontiers of science

The claim LP.2 was regarded by Broad as stating a fundamental limiting principle. This remains in accord with what was asserted by such eminent thinkers as, for instance, Leibniz. Nevertheless, the overt transgression of that principle by Isaac Newton with his theory of gravitation is counted among the greatest achievements in the history of science. An unimaginable set of phenomena gets explained by the simple equation stating that the gravitational force is proportional to the product of the masses of the bodies in question, and inversely proportional to the square of the distance between them. This is the force which plays a decisive role in the whole cosmic scenario from the very beginning of the universe.
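
For the reader's convenience, the equation in modern notation (with G the gravitational constant, m_1 and m_2 the masses, and r the distance between the bodies) reads:

  F = G \frac{m_1 m_2}{r^2}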

Note, however, that it is a force which exerts an instantaneous action at a distance, both features being forbidden by the principle in question. This is why the idea was vehemently objected to by Leibniz. What is more remarkable, the same objections troubled Newton himself; nevertheless, it was his pragmatic attitude which won over fundamentalist scruples. In spite of being deeply uncomfortable with the notion of "action at a distance" which his equation implied, he finally stated:

"It is enough that gravity does really exist and acts according to the laws I have explained, and that it abundantly serves to account for all the motions of celestial bodies"

Quoted after http://en.wikipedia.org/wiki/Newton%27s_law_of_universal_gravitation. Italics mine (WM), to stress the pragmatic attitude which won at last.

Thus the theory of gravitation has practically proved a good cat, even if this cat might have appeared black, that is, undesirable from a theoretical point of view. Had Newton yielded to Leibniz's attack and his own reservations, then his enormously seminal theory, advancing the frontiers of science further than had ever happened before, would have fallen prey to a categorical limiting principle.

To follow the sequel of this story, one should go deeper into Newton's doubts and Leibniz's charges. Let us take a look at the latter.

The very title of Leibniz's text reveals -- in an ironic vein -- the main line of his argument. It reads: "Antibarbarus Physicus pro Philosophia Reali contra renovationem qualitatum scholasticarum et intelligentiarum chimaericarum". Here "barbarian" is to mean "uncultured person", hence Leibniz sees himself as a defender of a higher intellectual culture. This culture amounts to rejecting the scholastic way of thinking characteristic of the Middle Ages (barbarian, in a sense).

Let me recall that schoolmen fancied occult qualities, or occult forces, to explain phenomena, as in that satire by Molière in which a scholastic doctor, asked why opium makes one sleepy, explains quite seriously: "for there is in it the force to make one sleepy". No knowledge about reality ("philosophia realis") is conveyed by such ridiculously superficial explanations. Ironically, Leibniz compares the force of gravitation to such scholastic figments, and speaks against their revival, that is, "contra renovationem qualitatum scholasticarum".

See: Die philosophischen Schriften von Gottfried Wilhelm Leibniz, herausgegeben von C. I. Gerhardt, VII. Band, Georg Olms, Hildesheim 1961, pp. 337-343, passim.

Instead, Leibniz demands of any concept being introduced (here "gravitation") that it be defined in terms of some obvious primitive notions of mechanics, namely those of magnitude, form and movement. These he regarded as the simplest and most obvious in the language of physics, and he blamed the idea of gravitation for not being reducible to those conceptual primitives. Newton had a similar research programme: in other cases he successfully tried to explain the origin of the various forces which acted on bodies, but in the case of gravity he did not succeed in identifying any motion producing the force of gravity.

If so, why did Leibniz and Newton differ so much in their final conclusions? The deep difference lies in their respective philosophies of science. Newton's was spontaneously pragmatist (though the term itself was not in use then), while Leibniz's was fundamentalist, firmly sticking to limiting principles.

The point of this story? It evidences that in some crucial questions it is pragmatism that moves the frontiers of science ahead, sometimes up to the farthest attainable point, as was the case with Newton.

The story has a continuation in Einstein's theory of general relativity, in which gravitation is an attribute of curved spacetime instead of being due to a force propagated between bodies (did this satisfy Leibniz's expectations?). This, however, is a separate issue to be handled by historians of physics, esp. experts in relativity.

Another point in current physics related to action at a distance, even more sophisticated, is that of quantum entanglement. An extensive treatment of this subject, including the problem of teleportation (which sounds like a story about action at a distance), together with Einstein's objections, is lucidly given in the article "Quantum Entanglement and Information" (2010) by Arthur Fine, found in the "Stanford Encyclopedia of Philosophy" (plato.stanford.edu/entries/qt-entangle/). As quantum physics and quantum information go to the furthest frontiers of current science, these themes are evidently highly worth studying.

There was a double enormous surprise in Newton's theory of gravitation: its universality, extending over the whole universe, its whole past and future, as well as the fact that the new theory surpassed all scientific achievements of antiquity; up to the 17th century those had been commonly regarded as insuperable. The latter feature decidedly contributed to that trust in human reason which was to mark the coming age of Enlightenment. To conclude: note that this surprisingly efficient theory, explaining the universe and forwarding the course of civilization, is much due to Newton's pragmatic approach; thus pragmatism has proved its mettle against an unconditional reliance on limitative principles.


§4. Epistemic necessity as a high degree of indispensability

This section is to perform two interrelated tasks: first, to provide another case study of how a limiting principle may slow down the progress of science; second, to use the same study for introducing a concept which would more deeply explain the process of advancing science, to wit the concept of epistemic necessity as a gradable property of propositions.

The advancing of frontiers, say, in the policies of an empire, consists of two actions: first, the conquering forces are to reach toward a point in the terrain to be annexed; second, this new frontier should get consolidated to secure it against the risk of being lost. An intellectual conquest comprises two similar phases. In the case of the law of gravitation it was (1) to propose this law as universal, ruling the whole universe; (2) to gradually check its applications to various kinds of phenomena, and various regions of the universe.

With each such application check successfully passed, this law proved more and more indispensable for understanding reality. The number of phenomena which it explains and predicts grows continually. Nowadays, for instance, we learn owing to it about the initial forming of hydrogen from the plasma left behind by the big bang, about gravitational collapses of stars, etc. "Those things in heaven" (to cite Hamlet) which Newton could not have dreamt of more and more extend the frontiers of the known universe; at the same time, they increasingly confirm the validity of the law, and this amounts to an ever greater consolidation. Both extension and consolidation combine into the advancement of frontiers.

The further such advancing of the law in question proceeds, the more indispensable it grows. Such a status of being an indispensable element of our knowledge deserves to be called epistemic necessity. The adjective "indispensable" means something not to be dispensed with, something that cannot be done away with.

When so defining "epistemic necessity" in terms of "indispensability", one should make it clear whether or not the latter admits of gradation. For it may happen that a product X which satisfies a need perfectly can nevertheless be replaced by a substitute Y. Should we then deny indispensability to X? It depends on a comparative estimate of their merits. Suppose that the substitute Y brings the same result but at a greater cost: for instance, more slowly (at an expense of time), with an additional risk, with less convenience, etc. Then we shall say that X is more indispensable than its substitute Y. In this sense, indispensability proves to be a property capable of being graded. And so gradable is the epistemic necessity of a proposition -- when defined in terms of its indispensability for our knowledge.

When the concept of a necessary proposition gets referred to propositions about some objects, this challenges a limiting principle listed in §2, namely LP.3. This principle claims the non-existence of necessary propositions among those concerned with any domain of reference. This limitation derives from the empiricist contention that every proposition about the world -- called synthetic for its adding a new piece to our knowledge -- must be justified on the basis of sensory experience. Only then does it become capable of being either true or false.

Otherwise, a proposition cannot pretend to be true. Such a detachment from reality -- according to that view -- is characteristic of mathematical propositions: their sole import for science consists in being rules to transform strings of symbols into other strings in a process of computing. If one calls them necessary, this is just in the sense of necessity relative to a linguistic convention; 2+2 equals 4 in virtue of certain conventions, termed meaning postulates, regarding the meanings of the symbols "+", "=", etc. In this approach, necessity is coextensive with the property of being analytic, and so there arises the famous dichotomy synthetic-analytic. Analyticity is conceived as not admitting of any gradation.

Had that Vienna Circle claim been taken seriously, it would have blocked metamathematical research, for instance, inquiries into the completeness of first-order logic or the completeness of arithmetic. For completeness means provability of all the truths of the theory in question; hence it is assumed in such research that mathematical propositions are either true or false.
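
To fix the sense of the term used here, a standard formulation may be added: a theory T is complete with respect to its intended interpretation M when every sentence true in M is provable in T, in symbols:

  \text{for every sentence } \varphi:\quad M \models \varphi \;\Longrightarrow\; T \vdash \varphi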

However, neither Kurt Gödel nor Alfred Tarski was much impressed by this Vienna doctrine. Their studies have confirmed that mathematical statements are capable of receiving the values of truth or falsity. And so their epistemic necessity continues to be a point at issue. This attribute is regarded by some philosophers as coextensive with being a priori, that is, preceding, or being before (the literal translation of "a priori"), any sensory experience.

A thorough analysis of the a priori, frequently referred to in the literature, is given in Morton White's study "The analytic and the synthetic" in his book Toward Reunion in Philosophy, Harvard University Press, Cambridge, 1956.

This view gives rise to the famous old controversy whether mathematical axioms are necessary while not being analytic. The name coined for such instances reads: "synthetic a priori". It is meant to express the point that such sentences add a piece of information to our knowledge (so being synthetic), but without being preceded by any sensory experience (so being a priori). This debate appears far from conclusive, so intricate are the notions and assumptions involved.

Fortunately, the pragmatist approach is free from such perplexities. Once it is taken for granted that epistemic necessity is gradable, we encounter no question of either dichotomy or trichotomy. Instead, there is a scale of degrees of epistemic necessity. Let the totality of our knowledge be represented by a field of force (as pictured by Quine). Points near the edges symbolize narrow generalizations; their removal would not disturb the rest of the field considerably, so readjustments would be relatively easy; this means a low degree of indispensability. Being found in the interior, closer to the center, means for a proposition to possess a broader field of applications (extending up to the edges), hence to enjoy a greater indispensability. Closest to the center are logical and mathematical statements; had they disappeared, the whole structure would collapse and require a total reconstruction, a building anew (provided there were such a chance). These have the rank of the greatest epistemic necessity.

Such a model of knowledge does not imply the existence of an absolute necessity. Even in the circle closest to the center, some revisions are not unthinkable. Even classical propositional logic happens to be readjusted for some purposes, as seen in certain discussions about the law of excluded middle. Anyway, propositional logic, being decidable, belongs to the theories closest to the top of epistemic necessity. Next to such a top would be predicate logic, which has proofs of consistency and completeness but is inferior to sentential logic in lacking decidability.

At that altitude there is a place for arithmetic, though it does not possess the attribute of completeness. As for consistency, it cannot be demonstrated with means which do not exceed the inferential capabilities of arithmetic itself; this can be done only with the means of stronger systems, such as set theory, but those stronger ones, again, cannot have proofs of consistency without using still stronger means (new axioms, or new inference rules, which result in a greater ontological commitment, e.g. acknowledging the existence of sets of sets). Nevertheless, we do firmly believe in the consistency of arithmetic on the strength of many centuries of experience with applying it in innumerable cases. Had arithmetic been inconsistent, then in such an enormously long time an error must have occurred in its applications. To use again the Chinese proverb we started with: if there is any cat which can catch mice with the greatest possible efficiency, such an enormously good cat is arithmetic. With such a pragmatist certificate, arithmetical propositions obtain the status of the highest possible epistemic indispensability.
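
The metamathematical fact behind this remark is Gödel's second incompleteness theorem; in a rough standard rendering:

  \text{if } T \text{ is a consistent, recursively axiomatizable theory containing arithmetic, then } T \nvdash \mathrm{Con}(T)

where Con(T) is the arithmetical sentence expressing the consistency of T.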

Let me sum up this piece of discussion, even at the cost of some repetitions, by quoting a text by W.V.O. Quine which forms an essential part of his pragmatist manifesto.

"Total science is like a field of force whose boundary conditions are experience. A conflict with experience at the periphery occasions readjustments in the interior of the field. Truth values have to be redistributed over some of our statements. Re-evaluation of some statements entails re-evaluation of others, because of their logical interconnections -- the logical laws being in turn simply certain further statements of the system, certain further elements of the field. Having re-evaluated one statement we must re-evaluate some others."

See "Two dogmas of empiricism" in: From a Logical Point of View, Harward University Press, Cambridge, Mass. 1953, p.42 (Section VI). See also: www.ditext.com/quine/quine.html.

Another metaphor of Quine's tells us that the degrees of necessity are like degrees of greyness, instead of forming a black-white dichotomy.

"The lore of our fathers is a fabric of sentences. [...] It is a pale gray lore, black with fact and white with convention. But I have found no substantial reasons for concluding that there are any quite black threads in it, or any white ones."

This statement is found in his article of 1954: "Carnap and Logical Truth" contained in the volume The Ways of Paradox and Other Essays, revised edition, Cambridge, MA: Harvard University Press, 1976, pp. 107-32. This parable is discussed by Yemina Ben-Menahem, "Black, White and Gray: Quine on Convention", Synthese (2005) 146: 245-282.


§5. The inferential and computational power of higher-order logics

The limiting principles LP.4 and LP.5 (Section §2) deserve special interest. Were they obeyed, this would have a disastrous impact on the progress of mathematics and computation. In considering the power of higher-order logics, which are forbidden by LP.4, one should start from a seminal statement by Kurt Gödel. In the paper "Über die Länge von Beweisen" (on the length of proofs, 1936) he pioneered the following idea: [1] some proofs which cannot be carried out in first-order logic (thus giving rise to undecidability) can be carried out in second-order logic, and [2] others, which at the first-order level would require time available neither to humans nor to computers, become tractable in an accessible time when performed at the higher level. What, in turn, is not tractable in a second-order system of logic may prove tractable in a third-order system, and so on.

In his short report Gödel did not give any proof of these statements. The proof was given much later by S. R. Buss.

Samuel R. Buss, "On Gödel's theorems on lengths of proofs I: Number of lines and speedups for arithmetic", Journal of Symbolic Logic 59 (1994), 737-756.
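
Gödel's claim, as made precise in this later literature, can be roughly rendered as follows (a sketch of the standard speed-up formulation, not a quotation of Buss's theorem): for every computable function f there are formulas provable both in n-th order arithmetic S_n and in S_{n+1} such that

  \|\text{shortest proof in } S_n\| \;>\; f\big(\|\text{shortest proof in } S_{n+1}\|\big)

where \|\cdot\| measures the length of a proof.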

A fact much relevant to the issue in question is provided by a remarkable exemplification of second-order logic's capability, found in an article by Boolos.

See: George Boolos, "A curious inference", Journal of Philosophical Logic 16 (1987): 1-12.

He gave a formalized proof of a certain arithmetical theorem in second-order logic. This took the space of about one printed page, hence several thousand single symbols.

On the other hand, in first-order logic no formalized proof is tractable (i.e., computable in practice), either for Boolos or for a computer, since in any case it would require a number of symbols greater than the number of atoms in the observable universe. Boolos estimated that this quantity would be represented by an exponential stack in which a number is raised to the second power 64536 times.
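
To convey how hopeless such a bound is, here is a minimal sketch in Python; reading the estimate above as a power tower of 2s is an assumption of mine, but any similar iterated-exponential reading yields the same moral: already a tower of height 5 outruns every physically writable number, let alone one of height 64536.

  # A minimal sketch of how fast an exponential stack ("power tower") grows.
  # Reading the estimate as a tower of 2s is my assumption.

  def tower(base, height):
      """Return base ** base ** ... ** base, with `height` occurrences of base."""
      result = 1
      for _ in range(height):
          result = base ** result
      return result

  for h in range(1, 5):
      print(h, tower(2, h))   # prints: 1 2, 2 4, 3 16, 4 65536

  # tower(2, 5) == 2**65536 has about 19,729 decimal digits, already far more
  # than the ~10**80 atoms of the observable universe; a stack 64536 levels
  # high is incomparably larger still.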

What about a formalized computer-assisted proof in second-order logic? In print it has to be longer than Boolos's text because of requirements imposed by the software checking correctness. In the literature at least two such proofs are presented, having the size of several tens of printed pages, which is, in fact, a tractable size. Both proofs, given in two different systems of computer-aided reasoning, are found in the following study:

Christoph E. Benzmüller and Chad E. Brown, "The Curious Inference of Boolos in Mizar and OMEGA", Studies in Logic, Grammar and Rhetoric, 10(23), 2007, special volume From Insight to Proof. Festschrift in Honour of Andrzej Trybulec, edited by Roman Matuszewski, Anna Zalewska, University of Białystok, pp. 299-386. On line: http://logika.uwb.edu.pl/studies/vol(10)23.html.

The experience obtained by the said researchers in performing the above task made it possible for them to estimate computer capabilities with respect to a more difficult performance. Let us imagine that a computer system is to be used not for checking a human-made formalized proof, but for devising such a proof by itself -- say, a proof of the same theorem which was investigated by Boolos. The authors see the problem as follows.

"Boolos' example perspicuously demonstrates the limitations of current first-order and higher-order theorem proving technology. With current technology it is not possible to find his proof automatically, even worse, automation seems very far out of reach. Let's first give a high-level description why this is so. Firstly, Boolos' proof needs comprehension principles to be available and it employs different complex instances of them. [...] Secondly, the particular instances of the comprehension axioms cannot be determined by higher-order unification but have to be guessed. However, the required instantiations here are so complex that it is unrealistic to assume that they can be guessed. [...] Here it is where human intuition and creativity comes into play, and the question arises how this kind of creativity can be realised and mirrored in a theorem prover." [Italics - WM].

Christoph E. Benzmüller and Manfred Kerber, "A Challenge for Mechanized Deduction", 2001. The web page originally quoted no longer existed at the time of writing the present paper. The quotation is rewritten from: Witold Marciszewski, "The Gödelian Speed-up and Other Strategies to Address Decidability and Tractability", Studies in Logic, Grammar and Rhetoric, 9(22), 2006, University of Białystok, pp. 9-29. On line: http://logika.uwb.edu.pl/studies/vol22.html.

The reference to the essential role of comprehension principles makes us aware how relevant second-order logic is here. Moreover, the use of this logic requires intuition and invention unavailable to computer systems, which are just the privilege of human minds. Hence it is up to humans to advance the frontiers of knowledge far ahead -- if only they are bold enough not to obey limitative principles like the one banning higher-order logics.
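
For the record, the comprehension schema of second-order logic, whose complex instances are at stake in the quoted passage, reads (with \varphi any formula in which the set variable X does not occur free):

  \exists X\,\forall x\,\big(X(x) \leftrightarrow \varphi(x)\big)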

Next, let me pay attention to a curious fact about the axiom of choice. In spite of various doubts and objections, this statement proves essential and indispensable in automated theorem proving. Hence its common practical acceptance in that circle of researchers.

This is connected with the procedure of skolemization, that is, reduction to Skolem normal form. Owing to this procedure, a reasoner gets rid of quantifiers, and thus the formula in question gets transformed into an expression of the sentential calculus. This, in turn, makes it possible to apply the algorithmic decision procedure of this calculus. Thus we are able to establish algorithmically whether the formula is, or is not, a tautology of predicate logic. As is commonly known, such a procedure fails in some cases: sometimes, when the solution would be in the negative, the algorithm falls into a loop and never stops. Nevertheless, skolemization (or something equivalent, e.g. Hilbert's epsilon operation) is the most efficient procedure for such partial decidability. It requires no guesses, no invention or intuition, and thereby it can be performed by computers.

However, there is a philosophical cost to such a convenience. We have to violate the limiting principle listed as item LP.5 in §2. This principle is not respected by the axiom of choice, for no choice function is defined in it to hint at the criteria of selecting representatives of certain sets to form a new set out of them. The existence of such a function is postulated without identifying its content. This is a supposition necessary for eliminating quantifiers in expressions of the form: (x)(Ey)R(y,x).

In such a simple case (just one universal quantifier) skolemization is performed by replacing the existentially quantified variable y with a term f(x). If there are more universal quantifiers, then the function has correspondingly more arguments. In performing such an instantiation, we do not bother about defining or constructing such a function; we simply assume that it exists. Such arbitrariness may be judged as reckless by philosophically cautious people who prefer to observe the limiting principle LP.5. Nevertheless it renders enormous services in research, and so advances the frontiers of our knowledge.
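
A worked instance, in the notation used above and with fresh function symbols f and g, may make the procedure palpable:

  (x)(Ey)\,R(y,x) \;\leadsto\; (x)\,R(f(x),x) \qquad\qquad (x)(z)(Ey)\,S(x,z,y) \;\leadsto\; (x)(z)\,S(x,z,g(x,z))

The axiom of choice guarantees that such functions exist even when no rule for computing their values can be stated.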


§6. Pragmatic insights ("this should work") beyond common intuitions

The parenthesized phrase is to suggest what I mean by a pragmatic insight as compared with a common intuition. This comparison is needed in order to detect those sources of fallacies which happen to be accepted as limiting principles. I consider here not only those limiting principles which we find in scientific or philosophical literature, but also those appearing in our everyday thinking.

The latter, even if not explicitly stated, limit our understanding of the world. An instructive example is found in fairly common intuitions concerning the free fall of bodies. In spite of passing exams in school physics, there are educated people who believe that -- in any conditions whatever -- heavier bodies are bound to fall faster than lighter ones. Galileo and Newton were able to discard that erroneous perception since they expected from the laws of nature a universal range of applications; and this is hardly available to intuitions born from our everyday experience. In the case in question our observations refer to bodies falling down to earth in the earthly atmosphere, which produces air resistance. In such narrowed conditions, the impression of differences in the speed of falling bodies is not misleading; however, without such a restricting proviso there arises a fallacious limiting principle.
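
The universal law itself makes the error transparent. Combining Newton's second law F = ma with the law of gravitation (a standard derivation, added here for illustration), the mass m of the falling body cancels out:

  m\,a = G\,\frac{M m}{r^2} \quad\Longrightarrow\quad a = G\,\frac{M}{r^2}

so in the absence of air resistance all bodies, heavy or light, fall with the same acceleration.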

The pragmatist attitude is a suitable remedy against such fallacies. It tends to gain insights concerning a large domain of applications in which a hypothesis or a law should work, instead of depending on intuitions spontaneously acquired (though their commonality may induce people to take them for granted). Pragmatism claims that such insights are crucial for advancing frontiers of science.

It has been noticed above (in §3) that the law of gravitation was regarded as lacking sufficient evidence, that is, as not being duly intuitive. Such was the feeling even of Newton himself, not only of his opponents. Nevertheless, Newton accepted it on the basis that "it abundantly serves to account for all the motions of celestial bodies". Now we know that it serves to account for an astonishing number of phenomena both in the macroscale and the microscale. Thus it works! And such an efficient working must have been foreseen by Newton in a bold insight, in spite of the lack of direct evidence.

Some limiting common intuitions were shared by the greatest thinkers, thereby delaying the dawning of ideas which were to advance the frontiers of science. This was, for example, the case of Albert Einstein, who intuitively accepted the limiting principle that any evolution of the universe is impossible. Following this assumption, as if it were indubitable, he "corrected" (in fact, corrupted) the first version of general relativity, and restored it only after Hubble's discovery of the expanding universe. Now we know that this restored original version of general relativity has an enormous impact on the foundations of cosmology.

Let me mention some other examples of conflicting intuitions, those belonging to what may be called "common sense" and those inspiring great discoverers. Among them there is the story of Euclid's fifth postulate; its short and intuitive equivalent was given by Proclus in the form:

Given a line and a point not on the line, it is possible to draw exactly one line through the given point parallel to the line.

For more, see: http://www.gap-system.org/~history/HistTopics/Non-Euclidean_geometry.html.

It was Gauss who worked out the consequences of a geometry in which more than one line can be drawn through a given point parallel to a given line, but he did not publish this revolutionary result, because the views of the academic circles were strongly dominated by the orthodoxy of the limiting principle supported by the authority of Immanuel Kant, who had asserted that Euclidean geometry is an inevitable necessity of thought. Only after Nikolai Lobachevsky in 1829 and János Bolyai in 1832 published systems of geometry like that of Gauss did this discovery come to be known to mathematicians. However, it took time for the new geometry to be duly appreciated, so far was it beyond common intuition, and this fact exercised a strong limiting impact. Full recognition followed when non-Euclidean geometries proved to possess enormous applications in physics; hence the acknowledgment came on pragmatic grounds.

In modern physics there are a lot of paradoxical, counterintuitive statements whose main justification consists in the fact that they work. Let me just mention the particle-wave duality. Waves and particles are intuitively perceived as such different categories of entities that such a duality seems to be evidently nonsensical.

Also mathematical logic and set theory, relatively new mathematical disciplines, happen to get limited by certain intuitions, some of them fairly common, other ones cultivated in some philosophical schools. For instance, the authority of Aristotle, lasting for centuries, limited logic to syllogistic rules (a point firmly asserted also by Immanuel Kant), while in set theory the same authority inhibited the Cantorian idea of actual infinity (Aristotle allowed potential infinity alone). However, modern predicate logic as well as Cantorian set theory have gained the recognition of academic communities owing to their highly successful applications.

For the same reason, Gödel's incompleteness theorem concerning arithmetic has set aside the nominalist contention that mathematics lacks any objectual reference, and so gets limited to being a game played with mere symbols, like chess with chess pieces. Also the nominalistic refusal to acknowledge the existence of sets gets refuted by the enormous efficiency of second-order logic (as discussed in Section §5).


§7. Conclusions

If we try to rank this essay's key concepts according to their significance, the first three places in such a ranking would be scored by the notions of intuition, applications of a theory, and epistemic necessity. The last is to denote the degree of indispensability of a proposition, as measured by the range of its applications, theoretical as well as technological.

In such a way, the notion of intuition gets freed from two extremes. One of them consists in treating it suspiciously as something esoteric that cannot be conceived in terms of sober knowledge; the other -- in treating intuition as an infallible oracle, the cognitive authority of last resort (this point is conspicuous in Kant's doctrine of the synthetic a priori).

The strong and weak sides of intuition are convincingly balanced by the economist and psychologist Daniel Kahneman. His approach has grown highly appreciated, owing to the Nobel Prize (2002), as providing a basis for understanding the psychological factors of economic decisions. Kahneman's idea is concisely rendered in the title of his book Thinking, Fast and Slow (published by Farrar, Straus and Giroux, New York, 2011). Slow thinking amounts to algorithmic, step-by-step proceeding, while fast thinking consists in flashes of intuition emerging somewhere from the resources of subconscious memory. Such speed and creative novelty make intuition indispensable for the efficiency of cognition, but do not grant infallibility.

Failures of intuition, when they happen, are due to the fact that intuitive perceptions result from the unconscious processing of experiences, without the critical assessment which gets feasible only at the level of full consciousness. Moreover, such experiences may have a very narrow scope, like those concerning the fall of bodies considered above in §6; this implies a too narrow set of consequences to be used in tests aiming at verification. As long as one's perspective, for example in physics, does not exceed the scope of everyday experiences alone (as was the case in antiquity and the Middle Ages), such intuitions misleadingly appear to have a high authority, which is like a certificate for them to act as limiting principles.

The development of instruments of research (from Galileo's spyglass up to the Hubble telescope and space probes) makes it possible to discover and measure facts inaccessible to everyday experience. And the creating of new mathematical theories, such as Newton's calculus, enables computations which, on the basis of measurements, check the reliability of hypotheses in vast domains of applications. However, let it be noticed that every theory overcoming old intuitions is based on some other intuitions which remain unquestioned. E.g., the law of gravitation presupposes intuitions of what bodies, space, distance, multiplication, division, and squaring are.

Scientists happen to give up certain intuitions, even those supported by centuries of everyday experience, in the case of their disagreement with a theory enjoying a wide range of theoretical and technological applications. The pragmatist strategy does not need to be defended with philosophical arguments, since the empirical sciences in their practice follow such a strategy in a natural and spontaneous manner.

The same is the case in the mathematical sciences, though the awareness of this fact has progressed less so far. It was Kurt Gödel who brought about a breakthrough in this matter (cp. §5). His leading follower nowadays is Gregory Chaitin who, after Gödel, declares a perspective of everlasting progress of mathematics. This discipline possesses the potential to win ever new computational means, due to its readiness to reform even its own foundations, if needed for such a purpose. Here is Chaitin's statement, much opportune to sum up the contention of this essay, especially at the point stressed with italics by myself.

"Gödel's own belief was that in spite of his incompleteness theorem there is in fact no limit to what mathematicians can achieve by using their intuition and creativity instead of depending only on logic and the axiomatic method. He believed that any important mathematical question could eventually be settled, if necessary by adding new fundamental principles to math, that is, new axioms or postulates. Note however that this implies that the concept of mathematical truth becomes something dynamic that evolves, that changes with time, as opposed to the traditional view that mathematical truth is static and eternal."

See http://www.cs.auckland.ac.nz/CDMTCS/chaitin/charly.html, "Chaitin interview for Simply Gödel website" (9 February 2008).

How to sum up this essay still more concisely? Let me use for help Ockham's famous maxim: Entia non sunt multiplicanda praeter necessitatem. It happens to be regarded as a strongly limiting principle, but on reflection it may prove to mean the opposite. An opportunity for such reflection comes when we try to translate the maxim into English. What might its English counterpart be like? Since the Latin grammar is here ambiguous, the maxim can be interpreted as the following equivalence: Entities should not be multiplied if and only if this is not necessary [in order to understand the world]. "To multiply" means adding new axioms or postulates (as told by Chaitin in the quotation above), since in this way one introduces new objects, and so advances the frontiers of the domain in question. Our equivalence implies the following:

  • If for understanding the world it proves necessary to multiply entities, they should be multiplied.
    Again in Latin:
  • Entia sunt multiplicanda, si ad mundum intelligendum id necesse est.
    Q.E.D.

Witold Marciszewski
http://calculemus.org