Formal Logic and Dialectics
The ability of men and women to think logically is the product of a lengthy process of social evolution. It antedates the invention of formal logic, not by thousands, but by millions of years. Locke already expressed this thought in the 17th century, when he wrote: "God has not been so sparing to men as to make them barely two-legged creatures, and left it to Aristotle to make them rational." Behind Logic, according to Locke, stands "a native faculty to perceive the coherence or incoherence of its ideas." (44)
The categories of logic did not drop from the clouds. These forms have taken shape in the course of the socio-historical development of humankind. They are elementary generalisations of reality, reflected in the minds of men and women. They are drawn from the fact that every object has certain qualities which distinguish it from other objects; that everything exists in certain relations to other things; that objects form larger classes, with which they share specific properties; that certain phenomena cause other phenomena, and so on.
To some extent, as Trotsky remarked, even animals possess the ability to reason and draw certain conclusions from a given situation. In higher mammals, and in particular the apes, this capacity is quite advanced, as the most recent research into bonobo chimpanzees strikingly reveals. However, while the capacity to reason may not be a monopoly of the human species, there is no doubt that, at least in our small corner of the universe, the ability to think rationally has reached its highest point so far in the development of the human intellect.
Abstraction is absolutely necessary. Without it, thought in general would be impossible. The question is: what sort of abstraction? When I abstract from reality, I concentrate on some aspects of a given phenomenon, and leave the others out of account. A good mapmaker, for instance, is not someone who reproduces every detail of every house and paving-stone, and every parked car. Such an amount of detail would destroy the very purpose of the map, which is to make available a convenient scheme of a town or other geographical area. Similarly, the brain early on learns to ignore certain sounds and concentrate on others. If we were not able to do this, the amount of information reaching our ears from all sides would overwhelm the mind completely. Language itself presupposes a high level of abstraction.
The ability to make correct abstractions, which adequately reflect the reality we wish to understand and describe, is the essential prerequisite for scientific thought. The abstractions of formal logic are adequate to express the real world only within quite narrow limits. But they are one-sided and static, and are hopelessly inadequate to deal with complex processes, particularly movement, change and contradictions. The concreteness of an object consists of the sum-total of its aspects and interrelationships, determined by its underlying laws. It is the task of science to uncover these laws, and to get as close as possible to this concrete reality. The whole purpose of cognition is to reflect the objective world and its underlying lawfulness and necessary relationships as faithfully as possible. As Hegel pointed out, "the truth is always concrete."
But here we have a contradiction. It is not possible to arrive at an understanding of the concrete world of nature without first resorting to abstraction. The word abstract comes from the Latin "to take from." By a process of abstraction, we take from the object under consideration certain aspects which we consider important, leaving others to one side. Abstract knowledge is necessarily one-sided because it expresses only one particular side of the phenomenon under consideration, isolated from that which determines the specific nature of the whole. Thus, mathematics deals exclusively with quantitative relations. Since quantity is an extremely important aspect of nature, the abstractions of mathematics have provided us with a powerful instrument for probing her secrets. For this reason, it is tempting to forget their real nature and limitations. Yet they remain one-sided, like all abstractions. We forget this at our peril.
Nature knows quality as well as quantity. To determine the precise relation between the two, and to show how, at a critical point, one turns into the other is absolutely necessary if we wish to understand one of the most fundamental processes in nature. This is one of the most basic concepts of dialectical as opposed to merely formal thought, and one of its most important contributions to science. The deep insights provided by this method, which was long decried as "mysticism," are only now beginning to be understood and appreciated. One-sided abstract thought, as manifested in formal logic, did a colossal disservice to science by excommunicating dialectics. But the actual results of science show that, in the last analysis, dialectical thinking is far closer to the real processes of nature than the linear abstractions of formal logic.
It is necessary to acquire a concrete understanding of the object as an integral system, not as isolated fragments; with all its necessary interconnections, not torn out of context, like a butterfly pinned to a collector’s board; in its life and movement, not as something lifeless and static. Such an approach is in open conflict with the so-called "laws" of formal logic, the most absolute expression of dogmatic thought ever conceived, representing a kind of mental rigor mortis. But nature lives and breathes, and stubbornly resists the embraces of formalistic thinking. "A" is not equal to "A." Subatomic particles are and are not. Linear processes end in chaos. The whole is greater than the sum of its parts. Quantity changes into quality. Evolution itself is not a gradual process, but interrupted by sudden leaps and catastrophes. What can we do about it? Facts are stubborn things.
Without abstraction it is impossible to penetrate the object in "depth," to understand its essential nature and laws of motion. Through the mental work of abstraction, we are able to get beyond the immediate information provided by our senses (sense-perception), and probe deeper. We can break the object down into its constituent parts, isolate them, and study them in detail. We can arrive at an idealised, general conception of the object as a "pure" form, stripped of all secondary features. This is the work of abstraction, an absolutely necessary stage of the process of cognition.
"Thought proceeding from the concrete to the abstract," wrote Lenin, "—provided it is correct (and Kant, like all philosophers, speaks of correct thought)—does not get away from the truth but comes closer to it. The abstraction of matter, of a law of nature, the abstraction of value, etc., in short all scientific (correct, serious, not absurd) abstractions reflect nature more deeply, truly and completely. From living perception to abstract thought, and from this to practice,—such is the dialectical path of the cognition of truth, of the cognition of objective reality." (45)
One of the main features of human thought is that it is not limited to what is, but also deals with what must be. We are constantly making all kinds of logical assumptions about the world we live in. This logic is not learned from books, but is the product of a long period of evolution. Detailed experiments have shown that the rudiments of this logic are acquired by a baby at a very young age, from experience. We reason that if something is true, then something else, for which we have no immediate evidence, must also be true. Such logical thought-processes take place millions of times during all our waking hours, without us even being aware of them. They acquire the force of habit, and even the simplest actions in life would not be possible without them.
The elementary rules of thought are taken for granted by most people. They are a familiar part of life, and are reflected in many proverbs, such as "you can’t have your cake and eat it"—a most important lesson for any child to learn! At a certain point, these rules were written down and systematised. This is the origin of formal logic, for which Aristotle must take the credit, along with so many other things. This was most valuable, since without a knowledge of the elementary rules of logic, thought runs the risk of becoming incoherent. It is necessary to distinguish black from white, and know the difference between a true statement and one that is false. The value of formal logic is, therefore, not in question. The problem is that the categories of formal logic, drawn from quite a limited range of experience and observation, are really valid only within these limits. They do, in fact, cover a great deal of everyday phenomena, but are quite inadequate to deal with more complex processes, involving movement, turbulence, contradiction, and the change from quality to quality.
In an interesting article entitled The Origins of Inference, which appeared in the anthology Making Sense, on the child’s construction of the world, Margaret Donaldson draws attention to one of the problems of ordinary logic—its static character:
"Verbal reasoning commonly appears to be about ‘states of affairs’—the world seen as static, in a cross-section of time. And considered in this way the universe appears to contain no incompatibility: things just are as they are. That object over there is a tree; that cup is blue; that man is taller than that man. Of course these states of affairs preclude infinitely many others, but how do we come to be aware of this? How does the idea of incompatibility arise in our minds? Certainly not directly from our impressions of things-as-they-are."
The same book makes the valid point that the process of knowing is not passive, but active:
"We do not sit around passively waiting for the world to impress its ‘reality’ on us. Instead, as is now widely recognised, we get much of our most basic knowledge through taking action." (46)
Human thought is essentially concrete. The mind does not readily assimilate abstract concepts. We feel most at home with what is immediately before our eyes, or at least with things that can be represented in a concrete way. It is as if the mind requires a crutch in the shape of images. On this, Margaret Donaldson remarks that "even preschool children can frequently reason well about the events in the stories they hear. However, when we move beyond the bounds of human sense there is a dramatic difference. Thinking which does move beyond these bounds, so that it no longer operates within the supportive context of meaningful events, is often called ‘formal’ or ‘abstract.’" (47)
The initial process thus goes from the concrete to the abstract. The object is dismembered, analysed, in order to obtain a detailed knowledge of its parts. But there are dangers in this. The parts cannot be correctly understood apart from their relationship with the whole. It is necessary to return to the object as an integral system, and to grasp the underlying dynamics that condition it as a whole. In this way, the process of cognition moves from the abstract back to the concrete. This is the essence of the dialectical method, which combines analysis with synthesis, induction and deduction.
The whole swindle of idealism is derived from an incorrect understanding of the nature of abstraction. Lenin pointed out that the possibility of idealism is inherent in any abstraction. The abstract concept of a thing is counterposed artificially to the thing itself. It is supposed not only to have an existence of its own, but is said to be superior to crude material reality. The concrete is portrayed as somehow defective, imperfect and impure, as opposed to the Idea which is perfect, absolute and pure. Thus reality is stood on its head.
The ability to think in abstractions marks a colossal conquest of the human intellect. Not only "pure" science, but also engineering would be impossible without abstract thought, which lifts us above the immediate, finite reality of the concrete example, and gives thought a universal character. The unthinking rejection of abstract thought and theory indicates the kind of narrow, Philistine mentality, which imagines itself to be "practical," but, in reality, is impotent. Ultimately, great advances in theory lead to great advances in practice. Nevertheless, all ideas are derived one way or another from the physical world, and, ultimately, must be applied back to it. The validity of any theory must be demonstrated, sooner or later, in practice.
In recent years there has been a healthy reaction against mechanical reductionism, counterposing the need for a holistic approach to science. The term holistic is unfortunate, because of its mystical associations. Nevertheless, in attempting to see things in their movement and interconnections, chaos theory undoubtedly comes close to dialectics. The real relationship between formal logic and dialectics is that between the type of thinking that takes things apart, and looks at them separately, and that which is also able to put them together again and make them work. If thought is to correspond to reality, it must be capable of grasping it as a living whole, with all its contradictions.
What is a Syllogism?
"Logical thinking, formal thinking in general," says Trotsky, "is constructed on the basis of the deductive method, proceeding from a more general syllogism through a number of premises to the necessary conclusion. Such a chain of syllogisms is called a sorites." (48)
Aristotle was the first one to write a systematic account of both dialectics and formal logic, as methods of reasoning. The purpose of formal logic was to provide a framework to distinguish valid from invalid arguments. This he did in the form of syllogisms. There are different forms of syllogism, which are really variations on the same theme.
Aristotle, in his Organon, names ten categories—substance, quantity, quality, relation, place, time, position, state, action, passion—which form the basis of dialectical logic, later given its full expression in the writings of Hegel. This side of Aristotle’s work on logic is frequently ignored. Bertrand Russell, for example, considered these categories to be meaningless. But since logical positivists like Russell have written off practically the whole history of philosophy (except the bits and pieces that coincide with their dogmas) as "meaningless," this should neither surprise nor trouble us too much.
The syllogism is a method of logical reasoning, which may be variously described. The definition given by Aristotle himself was as follows: "A discourse in which, certain things being stated, something other than what is stated follows of necessity from their being so." The simplest definition is given by A. A. Luce: "A syllogism is a triad of connected propositions, so related that one of them, called the Conclusion, necessarily follows from the other two, which are called the Premises." (49)
The mediaeval Schoolmen focused their attention on this kind of formal logic which Aristotle developed in The Prior and Posterior Analytics. It is in this form that Aristotle’s logic came down from the Middle Ages. In practice, the syllogism consists of two premises and a conclusion. The subject and the predicate of the conclusion each occur in one of the premises, together with a third term (the middle) that is found in both premises, but not in the conclusion. The predicate of the conclusion is the major term; the premise in which it is contained is the major premise; the subject of the conclusion is the minor term; and the premise in which it is contained is the minor premise. For example,
a) All men are mortal. (Major premise)
b) Caesar is a man. (Minor premise)
c) Therefore, Caesar is mortal. (Conclusion)
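The structure just described can be modelled in a few lines of code. The sketch below is purely our own illustration (the function name `barbara`, after the traditional name of this syllogistic figure, and the tuple representation are our inventions): it encodes the pattern in which the middle term appears in both premises but drops out of the conclusion.

```python
# Toy model of the categorical syllogism: All M are P; s is M; therefore s is P.
# The tuple representation is purely illustrative.

def barbara(major, minor):
    """major: ('all', M, P), read 'All M are P'.
    minor: ('is', s, M), read 's is an M'.
    Returns the conclusion ('is', s, P) if the middle term matches."""
    _, m1, p = major
    _, s, m2 = minor
    if m1 != m2:
        raise ValueError("middle term does not match; no conclusion follows")
    # The middle term ('man') appears in both premises but not here:
    return ('is', s, p)

major = ('all', 'man', 'mortal')   # All men are mortal. (Major premise)
minor = ('is', 'Caesar', 'man')    # Caesar is a man.    (Minor premise)
print(barbara(major, minor))       # ('is', 'Caesar', 'mortal')
```

Note that the function never looks at what "man" or "mortal" mean; it only checks that the middle term is shared, which is exactly the formal character of the syllogism discussed below.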
This is called an affirmative categorical statement. It gives the impression of being a logical chain of argument, in which each stage is derived inexorably from the previous one. But actually, this is not the case, because "Caesar" is already included in "all men." Kant, like Hegel, regarded the syllogism (that "tedious doctrine," as he called it) with contempt. For him, it was "nothing more than an artifice" in which the conclusions were already surreptitiously introduced into the premises to give a false appearance of reasoning. (50)
Another type of syllogism is conditional in form (if…then), for example: "If an animal is a tiger, it is a carnivore." This is just another way of saying the same thing as the affirmative categorical statement, i.e., all tigers are carnivores. The same in relation to the negative form—"If it’s a fish, it’s not a mammal" is just another way of saying "No fishes are mammals." The formal difference conceals the fact that we have not really advanced a single step.
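The claimed equivalence of the conditional and categorical forms can be checked concretely. In this small sketch (a toy illustration of ours, using arbitrary example sets), the conditional reading "if it's a fish, it's not a mammal" and the categorical reading "no fishes are mammals" come out as one and the same test:

```python
# Two phrasings, one content: the conditional and categorical negative forms
# both assert that the two classes do not overlap. Example sets are arbitrary.
fishes = {'trout', 'salmon'}
mammals = {'whale', 'dog'}
animals = fishes | mammals

# Conditional form: for every animal, if it is a fish, it is not a mammal.
conditional = all(a not in mammals for a in animals if a in fishes)

# Categorical form: no fishes are mammals (the classes are disjoint).
categorical = fishes.isdisjoint(mammals)

assert conditional == categorical  # the two forms agree on any such sets
```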
What this really reveals is the inner connections between things, not just in thought, but in the real world. "A" and "B" are related in certain ways to "C" (the middle) in the premises; therefore, they are related to each other in the conclusion. With great profundity and insight, Hegel showed that what the syllogism expressed was the relation of the particular to the universal. In other words, the syllogism itself is an example of the unity of opposites, the contradiction par excellence; in reality, all things are a "syllogism."
The heyday of the syllogism was in the Middle Ages, when the Schoolmen devoted their entire lives to endless disputations on all manner of obscure theological questions, like the sex of angels. The labyrinthine constructions of formal logic made it appear that they were really involved in a profound discussion, when, in fact, they were arguing about nothing at all. The reason for this lies in the nature of formal logic itself. As the name suggests, it is all about form. The question of the content does not enter into it. This is precisely the chief defect of formal logic, and its Achilles’ heel.
By the time of the Renaissance, that great re-awakening of the human spirit, dissatisfaction with Aristotelian logic was widespread. There was a growing reaction against Aristotle, which was not really fair to this great thinker, but stemmed from the fact that the Church had suppressed all that was worthwhile in his philosophy, and preserved only a lifeless caricature. For Aristotle, the syllogism was only part of the process of reasoning, and not necessarily the most important part. Aristotle also wrote on the dialectic, but this aspect was forgotten. Logic was deprived of all life, and turned, in Hegel’s phrase, into "the lifeless bones of a skeleton."
The revulsion against this lifeless formalism was reflected in the movement towards empiricism, which gave a tremendous impetus to scientific investigation and experiment. However, it is not possible to dispense altogether with forms of thought, and empiricism from the beginning carried the seeds of its own destruction. The only viable alternative to inadequate and incorrect methods of reasoning is to develop adequate and correct ones.
By the end of the Middle Ages the syllogism was discredited everywhere, and subjected to ridicule and abuse. Rabelais, Petrarch and Montaigne all denounced it. But it continued to trundle along, especially in those Catholic lands, untouched by the fresh winds of the Reformation. By the end of the 18th century, logic was in such a bad state that Kant felt obliged to launch a general criticism of the old thought forms in his Critique of Pure Reason.
Hegel was the first one to subject the laws of formal logic to a thoroughgoing critical analysis. In this, he was completing the work commenced by Kant. But whereas Kant only showed the inherent deficiencies and contradictions of traditional logic, Hegel went much further, working out a completely different approach to logic, a dynamic approach, which would include movement and contradiction, which formal logic is powerless to deal with.
Does Logic Teach How to Think?
Dialectics does not pretend to teach people to think. That is the pretentious claim of formal logic, to which Hegel ironically replied that logic no more teaches you to think than physiology teaches you to digest! Men and women thought, and even thought logically, long before they ever heard of logic. The categories of logic, and also dialectics, are derived from actual experience. For all their pretensions, the categories of formal logic do not stand above the crude world of material reality, but are only empty abstractions taken from reality comprehended in a one-sided and static manner, and then arbitrarily applied back to it.
By contrast, the first law of the dialectical method is absolute objectivity. In every case, it is necessary to discover the laws of motion of a given phenomenon by studying it from every point of view. The dialectical method is of great value in approaching things correctly, avoiding elementary philosophical blunders, and making sound scientific hypotheses. In view of the astonishing amount of mysticism that has emerged from arbitrary hypotheses, above all in theoretical physics, this is no mean advantage! But the dialectical method always seeks to derive its categories from a careful study of the facts and processes, not to force the facts into a rigid preconceived straitjacket:
"We all agree," wrote Engels, "that in every field of science, in natural as in historical science, one must proceed from the given facts, in natural science therefore from the various material forms and the various forms of motion of matter; that therefore in theoretical natural science too the inter-connections are not to be built into the facts but to be discovered in them, and when discovered to be verified as far as possible by experiment." (51)
Science is founded on the search for general laws which can explain the workings of nature. Taking its starting point as experience, it does not confine itself to the mere collection of facts, but seeks to generalise on the basis of experience, going from the particular to the universal. The history of science is characterised by an ever-deepening process of approximation. We get closer and closer to the truth, without ever knowing the "whole truth." Ultimately, the test of scientific truth is experiment. "Experiment," says Feynman, "is the sole judge of scientific ‘truth.’" (52)
The validity of forms of thought must, in the last analysis, depend on whether they correspond to the reality of the physical world. This cannot be established a priori, but must be demonstrated through observation and experiment. Formal logic, in contrast to all the natural sciences, is not empirical. Science derives its data from observation of the real world. Logic is supposed to be a priori, unlike all the subject matter with which it deals. There is a glaring contradiction here between form and content. Logic is not supposed to be derived from the real world, yet it is constantly applied to the facts of the real world. What is the relationship between the two sides?
Kant long ago explained that the forms of logic must reflect objective reality, or they would be entirely meaningless:
"When we have reason to consider a judgment necessarily universal…we must consider it objective also, that is, that it expresses not merely a reference of our perception to a subject, but a quality of the object. For there would be no reason for the judgments of other men necessarily agreeing with mine, if it were not the unity of the object to which they all refer, and with which they accord; hence they must all agree with one another." (53)
This idea was developed further by Hegel, who removed the ambiguities present in Kant’s theory of knowledge and logic, and finally put on a sound basis by Marx and Engels:
"Logical schemata," Engels explains, "can only relate to forms of thought; but what we are dealing with here are only forms of being, of the external world, and these forms can never be created and derived by thought out of itself, but only from the external world. But with this the whole relationship is inverted: the principles are not the starting point of the investigation, but its final result; they are not applied to nature and human history, but abstracted from them; it is not nature and the realm of humanity which conform to these principles, but the principles are only valid in so far as they are in conformity with nature and history." (54)
It is an astonishing fact that the basic laws of formal logic worked out by Aristotle have remained fundamentally unchanged for over two thousand years. In this period, we have witnessed a continuous process of change in all spheres of science, technology and human thought. And yet scientists have been content to continue to use essentially the same methodological tools that were used by the mediaeval Schoolmen in the days when science was still on the level of alchemy.
Given the central role played by formal logic in Western thought, it is surprising how little attention is paid to its real content, meaning and history. It is normally taken as something given, self-evident, and fixed for all time. Or it is presented as a convenient convention which reasonable people agree upon, in order to facilitate thought and discourse, rather as people in polite social circles agree upon good table manners. The idea is put forward that the laws of logic are entirely artificial constructions, made up by logicians, in the belief that they will have some application in some field of thought, where they will reveal some truth or other. But why should the laws of logic have any bearing upon anything, if they are only abstract constructions, the arbitrary imaginings of the brain?
On this idea, Trotsky commented ironically:
"To say that people have come to an agreement about the syllogism is almost like saying, or more correctly it is exactly the same as saying, that people came to an agreement to have nostrils in their noses. The syllogism is no less an objective product of organic development, i.e., the biological, anthropological, and social development of humanity than are our various organs, among them our organ of smell." In reality, formal logic is ultimately derived from experience, just as any other way of thinking. From their experience, humans draw certain conclusions, which they apply in their daily life. This applies even to animals, though at a different level.
"The chicken knows that grain is in general useful, necessary, and tasty. It recognises a given piece of grain as that grain—of the wheat—with which it is acquainted and hence draws a logical conclusion by means of its beak. The syllogism of Aristotle is only an articulated expression of those elementary mental conclusions which we observe at every step among animals." (55)
Trotsky once said that the relationship between formal logic and dialectics was similar to the relationship between lower and higher mathematics. The one does not deny the other and continues to be valid within certain limits. Likewise, Newton’s laws, which were dominant for a hundred years, were shown to be false in the world of subatomic particles. More correctly, the old mechanistic physics, which was criticised by Engels, was shown to be one-sided and of limited application.
"The dialectic," writes Trotsky, "is neither fiction nor mysticism, but a science of the forms of our thinking insofar as it is not limited to the daily problems of life but attempts to arrive at an understanding of more complicated and drawn-out processes." (56)
The most common method of formal logic is that of deduction, which attempts to establish the truth of its conclusions by meeting two distinct conditions: a) the conclusion must really flow from the premises; and b) the premises themselves must be true. If both conditions are met, the argument is said to be valid. This is all very comforting. We are here in the familiar and reassuring realm of common sense. "True or false?" "Yes or no?" Our feet are firmly on the ground. We appear to be in possession of "the truth, the whole truth, and nothing but the truth." There is not a lot more to be said. Or is there?
Strictly speaking, from the standpoint of formal logic, it is a matter of indifference whether the premises are true or false. As long as the conclusions can be correctly drawn from its premises, the inference is said to be deductively valid. The important thing is to distinguish between valid and invalid inferences. Thus, from the standpoint of formal logic, the following assertion is deductively valid: All scientists have two heads. Einstein was a scientist. Therefore, Einstein had two heads. The validity of the inference does not depend upon the subject matter in the slightest. In this way, the form is elevated above the content.
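This separation of form from content can be demonstrated mechanically. The following sketch is our own toy construction, not anything from the text: it tests an argument form by brute force over all truth assignments, counting it "valid" exactly when no assignment makes every premise true and the conclusion false. What the propositions mean never enters into it.

```python
from itertools import product

def valid(premises, conclusion):
    """An argument form is valid iff no assignment of truth values
    makes all the premises true and the conclusion false."""
    return all(conclusion(p, q)
               for p, q in product([True, False], repeat=2)
               if all(prem(p, q) for prem in premises))

implies = lambda p, q: (not p) or q   # 'if P then Q' as a truth function

# Modus ponens: from 'if P then Q' and 'P', infer 'Q' -- valid whatever
# P and Q stand for ('all scientists have two heads' included).
assert valid([implies, lambda p, q: p], lambda p, q: q)

# The converse inference ('affirming the consequent') is invalid:
assert not valid([implies, lambda p, q: q], lambda p, q: p)
```

The checker happily certifies the two-headed-Einstein argument as valid, because validity here is a property of the pattern alone, which is precisely the point made above.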
In practice, of course, any mode of reasoning that did not demonstrate the truth of its premises would be worse than useless. The premises must be shown to be true. But this leads us into a contradiction. The process of validating one set of premises automatically raises a new set of questions, which in turn need to be validated. As Hegel points out, every premise gives rise to a new syllogism, and so on ad infinitum. So that what appeared to be very simple turns out to be extremely complex, and contradictory.
The biggest contradiction of all lies in the fundamental premises of formal logic itself. While demanding that everything else under the sun must justify itself in the High Court of the Syllogism, logic becomes utterly confused when asked to justify its own presuppositions. It suddenly loses all its critical faculties, and resorts to appeals to belief, common sense, the "obvious," or, the final philosophical get-out clause—a priori. The fact is that the so-called axioms of logic are unproved formulas. These are taken as the starting point, from which all further formulae (theorems) are deduced, exactly as in classical geometry, where the starting point is provided by Euclid’s principles. They are assumed to be correct, without any proof whatsoever, i.e., we just have to take them on trust.
But what if the basic axioms of formal logic turn out to be false? Then we would be in just the same position as when we gave poor Mr. Einstein an additional head. Is it conceivable that the eternal laws of logic might be flawed? Let us examine the matter more closely. The basic laws of formal logic are:
1) The law of identity ("A" = "A").
2) The law of contradiction ("A" does not equal "not-A").
3) The law of the excluded middle ("A" does not equal "B").
These laws, at first sight, seem eminently sensible. How could anyone quarrel with them? Yet closer analysis shows that these laws are full of problems and contradictions of a philosophical nature. In his Science of Logic, Hegel provides an exhaustive analysis of the Law of Identity, showing it to be one-sided and, therefore, incorrect.
Firstly, let us note that the appearance of a necessary chain of reasoning, in which one step follows from another, is entirely illusory. The law of contradiction merely restates the law of identity in a negative form. The same is true of the law of the excluded middle. All we have is a repetition of the first line in different ways. The whole thing stands or falls on the basis of the law of identity ("A" = "A"). At first sight this is incontrovertible, and, indeed, the source of all rational thought. It is the Holy of Holies of Logic, and not to be called into question. Yet called into question it was, and by one of the greatest minds of all time.
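In the two-valued setting of formal logic, this claim is easy to verify mechanically. The snippet below is our illustration of the point (the authors' argument is philosophical, not computational): for every truth value of "A", the three laws reduce to one and the same triviality.

```python
# For each possible truth value of A, the three classical laws all
# evaluate to the same thing: True, unconditionally.
for A in (True, False):
    identity      = (A == A)            # law of identity
    contradiction = not (A and not A)   # law of contradiction, i.e. the
                                        # law of identity stated negatively
    excluded_mid  = A or not A          # law of the excluded middle
    assert identity and contradiction and excluded_mid
```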
There is a story by Hans Christian Andersen called The Emperor’s New Suit of Clothes, in which a rather foolish emperor is sold a new suit by a swindler, which is supposed to be very beautiful, but invisible. The gullible emperor goes about in his fine new suit, which everyone agrees is exquisite, until one day a little boy points out that the emperor is, in fact, stark naked. Hegel performed a comparable service to philosophy in his critique of formal logic. Its defenders have never forgiven him for it.
The so-called law of identity is, in fact, a tautology. Paradoxically, in traditional logic, this was always regarded as one of the most glaring mistakes which can be committed in defining a concept. It is a logically untenable definition which merely repeats in other words what is already contained in the term to be defined. Let us put this more concretely. A teacher asks his pupil what a cat is, and the pupil proudly informs him that a cat is—a cat. Such an answer would not be considered very intelligent. After all, a sentence is generally intended to say something, and this sentence tells us nothing at all. Yet this not very bright scholar’s definition of a feline quadruped is a perfect expression of the law of identity in all its glory. The young person concerned would immediately be sent to the bottom of the class. Yet for over two thousand years, the most learned professors have been content to treat it as the most profound philosophical truth.
All that the law of identity tells us about something is that it is. We do not get a single step further. We remain on the level of the most general and empty abstraction. For we learn nothing about the concrete reality of the object under consideration, its properties and relationships. A cat is a cat; I am myself; you are you; human nature is human nature; things are as they are. The emptiness of such assertions stands out in all its uncouthness. It is the consummate expression of one-sided, formalistic, dogmatic thinking.
Is the law of identity invalid, then? Not entirely. It has its applications, but these are far more limited in scope than one might think. The laws of formal logic can be useful in clarifying certain concepts, analysing, labelling, cataloguing, defining. They have the merit of neatness and tidiness. This has its place. For normal, simple, everyday phenomena, they hold good. But when dealing with more complex phenomena, involving movement, sudden leaps and qualitative changes, they become wholly inadequate and, in fact, break down completely.
The following extract by Trotsky brilliantly sums up Hegel’s line of argument in relation to the law of identity:
"I will here attempt to sketch the substance of the problem in a very concise form. The Aristotelian logic of the simple syllogism starts from the proposition that ‘A’ is equal to ‘A.’ This postulate is accepted as an axiom for a multitude of practical human actions and elementary generalisations. But in reality ‘A’ is not equal to ‘A.’ This is easy to prove if we observe these two letters under a lens—they are quite different from each other. But, one can object, the question is not of the size or the form of the letters, since they are only symbols for equal quantities, for instance, a pound of sugar. The objection is beside the point; in reality a pound of sugar is never equal to a pound of sugar—a more delicate scale always discloses a difference. Again one can object: but a pound of sugar is equal to itself. Neither is this true—all bodies change uninterruptedly in size, weight, colour, etc. They are never equal to themselves. A sophist will respond that a pound of sugar is equal to itself ‘at any given moment.’ Aside from the extremely dubious practical value of this ‘axiom,’ it does not withstand theoretical criticism either. How should we really conceive the word ‘moment’? If it is an infinitesimal interval of time, then a pound of sugar is subjected during the course of that ‘moment’ to inevitable changes. Or is the ‘moment’ a purely mathematical abstraction, that is, a zero of time? But everything exists in time; and existence itself is an uninterrupted process of transformation; time is consequently a fundamental element of existence. Thus the axiom ‘A’ is equal to ‘A’ signifies that a thing is equal to itself if it does not change, that is, if it does not exist.
"At first glance it could seem that these ‘subtleties’ are useless. In reality they are of decisive significance. The axiom ‘A’ is equal to ‘A’ appears on one hand to be the point of departure for all the errors in our knowledge. To make use of the axiom ‘A’ is equal to ‘A’ with impunity is possible only within certain limits. When quantitative changes in ‘A’ are negligible for the task at hand then we can presume that ‘A’ is equal to ‘A.’ This is, for example, the manner in which a buyer and a seller consider a pound of sugar. We consider the temperature of the sun likewise. Until recently we considered the buying power of the dollar in the same way. But quantitative changes beyond certain limits become converted into qualitative. A pound of sugar subjected to the action of water or kerosene ceases to be a pound of sugar. A dollar in the embrace of a president ceases to be a dollar. To determine at the right moment the critical point where quantity changes into quality is one of the most important and difficult tasks in all the spheres of knowledge including sociology…
"Dialectical thinking is related to vulgar thinking in the same way that a motion picture is related to a still photograph. The motion picture does not outlaw the still photograph but combines a series of them according to the laws of motion. Dialectics does not deny the syllogism, but teaches us to combine syllogisms in such a way as to bring our understanding closer to the eternally changing reality. Hegel in his Logic established a series of laws: change of quantity into quality, development through contradictions, conflict of content and form, interruption of continuity, change of possibility into inevitability, etc., which are just as important for theoretical thought as is the simple syllogism for more elementary tasks." (57)
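Trotsky’s point that "A" equals "A" only "within certain limits" has a familiar computational analogue. In floating-point arithmetic, two nominally equal quantities routinely fail a strict identity test, and programmers instead compare within an explicit tolerance; that is, they treat "A = A" as valid only while the difference is negligible for the task at hand. A minimal sketch:

```python
import math

# Two ways of arriving at "the same" quantity.
a = 0.1 + 0.2
b = 0.3

# A strict "A = A" test fails: the binary representations differ slightly.
print(a == b)  # False

# In practice, equality is asserted only within a tolerance, i.e. "A" is
# treated as equal to "A" so long as the difference is negligible.
print(math.isclose(a, b, rel_tol=1e-9))  # True
```

The tolerance here plays the role of the "certain limits": shrink it far enough and the two quantities cease to be "equal."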
Similarly with the law of the excluded middle, which asserts that it is necessary either to assert or deny, that a thing must be either black or white, either alive or dead, either "A" or "B". It cannot be both at the same time. For normal everyday purposes, we can take this to be true. Indeed, without such assumptions, clear and consistent thought would be impossible. Moreover, what appear to be insignificant errors in theory sooner or later make themselves felt in practice, often with disastrous results. In the same way, a hairline crack in the wing of a jumbo jet may seem insignificant, and, indeed, at low speeds may pass unnoticed. At very high speeds, however, this tiny error can provoke a catastrophe. In Anti-Dühring, Engels explains the deficiencies of the so-called law of the excluded middle:
"To the metaphysician," wrote Engels, "things and their mental images, ideas, are isolated, to be considered one after the other and apart from each other, fixed, rigid objects of investigation given once for all. He thinks in absolutely unmediated antitheses. ‘His communication is “yea, yea; nay, nay”; for whatsoever is more than these cometh of evil.’ For him a thing either exists or does not exist; a thing cannot at the same time be itself and something else. Positive and negative absolutely exclude one another; cause and effect stand in a rigid antithesis one to the other.
"At first sight this way of thinking seems to us most plausible because it is that of so-called sound common sense. Yet sound common sense, respectable fellow that he is in the homely realm of his own four walls, has very wonderful adventures directly he ventures out into the wide world of research. The metaphysical mode of thought, justifiable and even necessary as it is in a number of domains whose extent varies according to the nature of the object, invariably bumps into a limit sooner or later, beyond which it becomes one-sided, restricted, abstract, lost in insoluble contradictions, because in the presence of individual things it forgets their connections; because in the presence of their existence it forgets their coming into being and passing away; because in their state of rest it forgets their motion. It cannot see the wood for the trees. For everyday purposes we know and can definitely say, e.g., whether an animal is alive or not. But, upon closer inquiry, we find that this is sometimes a very complex question, as the jurists very well know. They have cudgelled their brains in vain to discover a rational limit beyond which the killing of the child in its mother’s womb is murder. It is just as impossible to determine the moment of death, for physiology proves that death is not a sudden instantaneous phenomenon, but a very protracted process.
"In like manner, every organic being is every moment the same and not the same; every moment it assimilates matter supplied from without and gets rid of other matter; every moment some cells of its body die and others build themselves anew; in a longer or shorter time the matter of its body is completely renewed and is replaced by other molecules of matter, so that every organic being is always itself, and yet something other than itself." (58)
The relationship between dialectics and formal logic can be compared to the relationship between quantum mechanics and classical mechanics. They do not contradict but complement each other. The laws of classical mechanics still hold good for an immense number of operations. However, they cannot be adequately applied to the world of subatomic particles, involving infinitesimally small quantities and tremendous velocities. Similarly, Einstein did not replace Newton, but merely exposed the limits beyond which Newton’s system did not work.
Formal logic (which has acquired the force of popular prejudice in the form of "common sense") equally holds good for a whole series of everyday experiences. However, the laws of formal logic, which set out from an essentially static view of things, inevitably break down when dealing with more complex, changing and contradictory phenomena. To use the language of chaos theory, the "linear" equations of formal logic cannot cope with the turbulent processes which can be observed throughout nature, society and history. Only the dialectical method will suffice for this purpose.
The deficiencies of traditional logic have been grasped by other philosophers, who are very far from the dialectical standpoint. In general, in the Anglo-Saxon world, there has traditionally been a greater inclination towards empiricism, and inductive reasoning. Nevertheless, science still requires a philosophical framework which will enable it to assess its results and guide its steps through the confused mass of facts and statistics, like Ariadne’s thread in the labyrinth. Mere appeals to "common sense," or the "facts," will not suffice.
Syllogistic thinking, the abstract deductive method, is very much in the French tradition, especially since Descartes. The English tradition was altogether different, being heavily influenced by empiricism. From Britain, this school of thought was early on imported to the United States, where it sank deep roots. Thus, the formal-deductive mode of thought was not at all characteristic of the Anglo-Saxon intellectual tradition. "On the contrary," wrote Trotsky, "it is possible to say that this [school of] thought is distinguished by a sovereign-empirical contempt for the pure syllogism, which did not prevent the English from making colossal conquests in many spheres of scientific investigation. If one really thinks this through as one should, then it is impossible not to arrive at the conclusion that the empirical disregard for the syllogism is a primitive form of dialectical thinking."
Empiricism historically played both a progressive role (in the struggle against religion and mediaeval dogmatism) and a negative one (an excessively narrow interpretation of materialism, resistance to broad theoretical generalisations). Locke’s famous assertion that there is nothing in the intellect which is not derived from the senses contains the germ of a profoundly correct idea, but presented in a one-sided way, which could, and did, have the most harmful consequences on the future development of philosophy. In relation to this, Trotsky wrote shortly before his assassination:
"‘We do not know anything about the world except what is provided through experience.’ This is correct if one does not understand experience in the sense of the direct testimony of our individual five senses. If we reduce the matter to experience in the narrow empirical sense, then it is impossible for us to arrive at any judgment concerning either the origin of the species or, still less, the formation of the earth’s crust. To say that the basis for everything is experience is to say too much or to say nothing at all. Experience is the active interrelationship between subject and object. To analyse experience outside this category, i.e., outside the objective material milieu of the investigator who is counterposed to it and who from another standpoint is a part of this milieu—to do this is to dissolve experience in a formless unity where there is neither object nor subject but only the mystical formula of experience. ‘Experiment’ or ‘experience’ of this kind is peculiar only to a baby in its mother’s womb, but unfortunately the baby is deprived of the opportunity to share the scientific conclusions of its experiment." (59)
The uncertainty principle of quantum mechanics cannot be applied to ordinary objects, but only to atoms and subatomic particles. Subatomic particles obey different laws to those of the "ordinary" world. They move at incredible speeds, 1,500 miles per second, for example. They can move in different directions at the same time. Given this situation, the forms of thought which apply to everyday experience are no longer valid. Formal logic is useless. Its black and white, yes-or-no, take it or leave it categories have no point of contact with this fluid, unstable and contradictory reality. All we can do is to say that it is probably such and such a motion, with an infinite number of possibilities. Far from proceeding from the premises of formal logic, quantum mechanics violates the Law of Identity by asserting the "non-individuality" of individual particles. The Law of Identity cannot apply at this level, because the "identity" of individual particles cannot be fixed. Hence the lengthy controversy over "wave" or "particle." It could not be both! Here "A" turns out to be "not-A," and "A" can indeed be also "B." Hence, the impossibility of "fixing" an electron’s position and velocity in the neat and absolute manner of formal logic. That is a serious problem for formal logic and "common sense," but not for dialectics or for quantum mechanics. An electron has both the qualities of a wave and a particle, and this has been experimentally demonstrated.
In 1932, Heisenberg suggested that the protons inside the nucleus were held together by something he called exchange force. This implied that protons and neutrons were constantly exchanging identity. Any given particle is in a constant state of flux, changing from a proton into a neutron and back again. Only in this way is the nucleus held together. Before a proton can be repelled by another proton, it changes into a neutron, and vice versa. This process in which particles are changed into their opposites takes place uninterruptedly, so that it is impossible to say at any given moment whether a particle is a proton or a neutron. In fact it is both—it is and is not.
The exchange of identities between electrons does not mean a simple change of position, but a complicated process where electron "A" interpenetrates with electron "B" to produce a "mix" of, say, 60% "A" and 40% "B" and vice versa. Later, they may have completely exchanged identities, with all "A" here and all "B" there. The flow would then be reversed in a permanent oscillation, involving a rhythmic interchange of the electrons’ identities, which goes on indefinitely. The old rigid, fixed Law of Identity vanishes altogether in this kind of pulsating identity-in-difference, which underlies all being, and received its scientific expression in Pauli’s principle of exclusion.
Thus, two and a half millennia later, Heraclitus’ principle "everything flows" turns out to be true—literally. Here we have, not only a state of unceasing change and motion, but also a process of universal interconnection, and the unity and interpenetration of opposites. Not only do electrons condition each other, but they actually pass into each other and become transformed into each other. How far removed from the static, unchanging idealist universe of Plato! How does one fix the position of an electron? By looking at it. And how to determine its momentum? By looking at it twice. But in that time, even in an infinitesimally small space of time, the electron has changed, and is no longer what it was. It is something else. It is both a particle (a "thing," a "point") and a wave (a "process," movement, becoming). It is and is not. The old black and white method of formal logic used in classical mechanics cannot give results here because of the very nature of the phenomenon.
In 1963, Japanese physicists suggested that the extremely small particle known as the neutrino changed its identity as it travelled through space at very high speeds. At one point it was an electron-neutrino, at another, a muon-neutrino, at another, a tauon-neutrino, and so on. If this is true, the law of identity, which has already been thoroughly battered, can be said to have received the final coup de grâce. Such a rigid, black-and-white conception is clearly out of its depth when confronted with any one of the complex and contradictory phenomena of nature described by modern science.
In the 19th century, there were a number of attempts to bring logic up to date (George Boole, Ernst Schröder, Gottlob Frege, Bertrand Russell and A. N. Whitehead). But, apart from the introduction of symbols, and a certain tidying up, there is no real change here. Great claims are made, for example by the linguistic philosophers, but there are not many grounds for them. Semantics (which deals with the validity of an argument) was separated from syntax (which deals with the deductibility of the conclusions from axioms and premises). This is supposed to be something new, when, in reality, it is merely a re-hash of the old division, well known to the ancient Greeks, between logic and rhetoric. Modern logic is based on the logical relations among whole sentences. The centre of attention has moved away from the syllogism towards hypothetical and disjunctive arguments. This is hardly a breathtaking leap. One can begin with sentences (judgments) instead of syllogisms. Hegel did this in his Logic. Rather than a great revolution in thought, it is like re-shuffling cards in a pack.
Using a superficial and inexact analogy with physics, the so-called "atomic method" developed by Russell and Wittgenstein (and later repudiated by the latter) tried to divide language into its "atoms." The basic atom of language is supposed to be the simple sentence, out of which compound sentences are constructed. Wittgenstein dreamed of developing a "formal language" for every science—physics, biology, even psychology. Sentences are subjected to a "truth test" based on the old laws of identity, contradiction and the excluded middle. In reality, the basic method remains exactly the same. The "truth value" is a question of "either…or," "yes or no," "true or false." The new logic is referred to as the propositional calculus. But the fact is that this system cannot even deal with arguments formerly handled by the most basic (categorical) syllogism. The mountain has laboured, and brought forth a mouse.
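The claim that propositional calculus cannot even handle the categorical syllogism can be made concrete. Treating whole sentences as opaque letters, "All men are mortal" is just some p, and the inference to "Socrates is mortal" becomes invisible; recovering it requires quantifying over individuals, as predicate logic does. A sketch over a small finite model (the individuals and predicates are invented purely for illustration):

```python
# A toy finite domain with invented predicates.
domain = ["socrates", "plato", "fido"]
is_man = {"socrates": True, "plato": True, "fido": False}
is_mortal = {"socrates": True, "plato": True, "fido": True}

# "All men are mortal": a universally quantified implication over the
# domain, not expressible as a truth-function of whole sentences.
all_men_are_mortal = all((not is_man[x]) or is_mortal[x] for x in domain)

socrates_is_a_man = is_man["socrates"]
socrates_is_mortal = is_mortal["socrates"]

# In this model both premises hold, and so does the conclusion.
print(all_men_are_mortal and socrates_is_a_man)  # True
print(socrates_is_mortal)  # True
```

The quantifier (`all(...)` over the domain) is precisely what the propositional "atoms" lack.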
The fact is that not even the simple sentence is really understood, although it is supposed to be the linguistic equivalent of the "building-blocks of matter." Even the simplest judgment, as Hegel points out, contains a contradiction. "Caesar is a man," "Fido is a dog," "the tree is green," all state that the particular is the universal. Such sentences seem simple, but in fact are not. This is a closed book for formal logic, which remains determined to banish all contradictions, not only from nature and society, but from thought and language as well. Propositional calculus sets out from exactly the same basic postulates as those worked out by Aristotle in the 4th century B.C., namely, the law of identity, the law of (non-)contradiction, the law of the excluded middle, to which is added the law of double negation. Instead of being written with normal letters, they are expressed in symbols, thus:
a) p = p
b) ~(p · ~p)
c) p ∨ ~p
d) p = ~~p
All this looks very nice, but makes not the slightest difference to the content of the syllogism. Moreover, symbolic logic itself is not a new idea. In the 1680s, the ever-fertile brain of the German philosopher Leibniz created a symbolic logic, although he never published it.
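In modern notation the four postulates read: identity, p = p; non-contradiction, ~(p · ~p); excluded middle, p ∨ ~p; and double negation, p = ~~p. Each is a tautology, true under every assignment of truth values, which is exactly what the standard truth-table method checks. A brute-force sketch:

```python
# The four postulates as Boolean functions of p; "=" is rendered as
# logical equivalence (==) and "~" as negation (not).
laws = {
    "identity:        p = p": lambda p: p == p,
    "contradiction:   ~(p & ~p)": lambda p: not (p and not p),
    "excluded middle: p or ~p": lambda p: p or not p,
    "double negation: p = ~~p": lambda p: p == (not (not p)),
}

# A tautology is true under every assignment of truth values.
for name, law in laws.items():
    assert all(law(p) for p in (True, False))
    print(name, "-> tautology")
```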
The introduction of symbols into logic does not carry us a single step further, for the very simple reason that they, in turn, must sooner or later be translated into words and concepts. They have the advantage of being a kind of shorthand, more convenient for some technical operations, computers and so on, but the content remains exactly as before. The bewildering array of mathematical symbols is accompanied by a truly Byzantine jargon, which seems deliberately designed to make logic inaccessible to ordinary mortals, just as the priest-castes of Egypt and Babylon used secret words and occult symbols to keep their knowledge to themselves. The only difference is that they actually did know things that were worth knowing, like the movements of the heavenly bodies, something which cannot be said of modern logicians.
Jargon of this kind is designed to give the impression that formal logic is a science to be reckoned with, since it is quite unintelligible to most people. Sad to say, the scientific value of a body of beliefs is not directly proportionate to the obscurity of its language. If that were the case, every religious mystic in history would be as great a scientist as Newton, Darwin and Einstein, all rolled into one.
In Molière’s comedy, Le Bourgeois Gentilhomme, M. Jourdain was surprised to be told that he had been talking prose all his life, without realising it. Modern logic merely repeats all the old categories, but throws in a few symbols and fancy-sounding terms, in order to hide the fact that absolutely nothing new is being said. Aristotle used "monadic predicates" (expressions that attribute a property to an individual) a long time ago. No doubt, like M. Jourdain, he would have been delighted to discover that he had been using Monadic Predicates all the time, without knowing it. But it would not have made a scrap of difference to what he was actually doing. The use of new labels does not alter the contents of a jar of jam. Nor does the use of jargon enhance the validity of outworn forms of thought.
The sad truth is that, in the 20th century, formal logic reached its limits. Every new advance of science deals it yet another blow. Despite all the formal changes, the basic laws remain the same. One thing is clear. The developments of formal logic over the past hundred years, first by propositional calculus (p.c.), then by lower predicate calculus (l.p.c.), have carried the subject to such a point of refinement that no further development is possible. We have reached the most comprehensive system of formal logic, so that any other additions will certainly not add anything new. Formal logic has said all that it has to say. If the truth were to be told, it reached this stage quite some time ago.
Recently, the ground has shifted from the analysis of arguments to the question of how conclusions are deduced. How are the "theorems of logic" derived? This is quite shaky ground. The basis of formal logic has always been taken for granted in the past. A thorough investigation of the theoretical grounds of formal logic would inevitably result in transforming them into their opposite. Arend Heyting, a leading figure of the Intuitionist school of mathematics, denies the validity of some of the proofs used in classical mathematics. However, most logicians cling desperately to the old laws of formal logic, like a drowning man clutching at a straw:
"We do not believe that there is any non-Aristotelian logic in the sense in which there is a non-Euclidean geometry, that is, a system of logic in which the contraries of the Aristotelian principles of contradiction and the excluded middle are assumed to be true, and valid inferences drawn from them." (60)
There are two main branches of formal logic today—propositional calculus and predicate calculus. Both proceed from axioms, which are assumed to be true "in all possible worlds," under all circumstances. The fundamental test remains freedom from contradiction. Anything contradictory is deemed to be "not valid." This has a certain application, for example, in computers, which are geared to a simple yes or no procedure. In reality, however, all such axioms are tautologies. These empty forms can be filled with almost any content. They are applied in a mechanical and external fashion to any subject. When it comes to simple linear processes, they do their work tolerably well. This is important, because a great many of the processes in nature and society do, in fact, work in this way. But when we come to more complex, contradictory, non-linear phenomena, the laws of formal logic break down. It immediately becomes evident that, far from being universal truths valid "in all possible worlds," they are, as Engels explained, quite limited in their application, and quickly find themselves out of their depth in a whole range of circumstances. Moreover, these are precisely the kind of circumstances which have occupied the attention of science, especially the most innovative parts of it, for most of the 20th century.
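The "freedom from contradiction" test mentioned above has a direct computational form: a formula is consistent if at least one assignment of truth values satisfies it, and a contradiction if none does. A brute-force satisfiability sketch (practical only for small numbers of variables, since the search grows exponentially):

```python
from itertools import product

def satisfiable(formula, n_vars):
    """True if some assignment of truth values makes the formula true."""
    return any(formula(*values)
               for values in product([True, False], repeat=n_vars))

# "p and q" is consistent: the assignment p=True, q=True satisfies it.
print(satisfiable(lambda p, q: p and q, 2))  # True

# "p and not p" is a contradiction: no assignment satisfies it, so the
# formal test rejects it as "not valid".
print(satisfiable(lambda p, q: p and not p, 2))  # False
```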
For reasons of convenience, where the same work is cited several times in immediate sequence we have placed the reference number at the end of the last quote.
(1) Karl Marx and Frederick Engels, Selected Correspondence, Letter to Bloch, 21st-22nd September 1890, henceforth referred to as MESC.
(2) The Economist, 9th January 1982.
(3) W. Rees-Mogg, The Great Reckoning, How the World Will Change in the Depression of the 1990s, p. 445.
(4) Ibid., p. 27, our emphasis.
(5) The Guardian, 9th March, 1995.
(6) Gordon Childe, What Happened in History, p. 19.
(7) Ibid., pp. 19-20.
(8) Sir James Frazer, The Golden Bough, p. 10.
(9) Ibid., p. 105.
(10) Ludwig Feuerbach, The Essence of Christianity, p. 5.
(11) Aristotle, Metaphysics, p. 53.
(12) I. Prigogine and I. Stengers, Order Out of Chaos, Man’s New Dialogue with Nature, p. 4.
(13) Quoted in Margaret Donaldson, Children’s Minds, p. 84.
(14) Oeconomicus, iv, 203, quoted in B. Farrington, Greek Science, pp. 28-9.
(15) Feuerbach, op. cit., pp. 204-5.
(16) Quoted in A. R. Burn, Pelican History of Greece, p. 132.
(17) G. Childe, Man Makes Himself, pp. 107-8.
(18) Trotsky, In Defence of Marxism, p. 66.
(19) Marx, Capital, Vol. 1, p. 19.
(20) David Bohm, Causality and Chance in Modern Physics, p. 1.
(21) R. P. Feynman, Lectures on Physics, chapter 1, p. 8.
(22) Aristotle, op. cit., p. 9.
(23) Engels, Dialectics of Nature, p. 92.
(24) Trotsky, op. cit., pp. 106-7.
(25) M. Waldrop, Complexity, p. 82.
(26) Engels, Dialectics of Nature, pp. 90-1.
(27) Engels, Anti-Dühring, p. 162.
(28) J. Gleick, Chaos, Making a New Science, p. 127.
(29) M. Waldrop, op. cit., p. 65.
(30) D. Bohm, op. cit., p. x.
(31) Engels, Anti-Dühring, p. 163.
(32) I. Stewart, Does God Play Dice? p. 22.
(33) Feynman, op. cit., chapter 2, p. 5.
(34) Engels, Dialectics of Nature, pp. 345-6.
(35) Hegel, Science of Logic, Vol. 1, p. 258.
(36) B. Hoffmann, The Strange Story of the Quantum, p. 159.
(37) Engels, Dialectics of Nature, p. 96.
(38) Ibid., pp. 95-6 and p. 110.
(39) Ibid., p. 108 and p. 107.
(40) Hegel, Phenomenology of Mind, p. 68.
(41) Lenin, Collected Works, Vol. 38, p. 319; henceforth referred to as LCW.
(42) Marx and Engels, Selected Works, Vol. 1, p. 502; henceforth referred to as MESW.
(43) Marx, Capital, Vol. 1, pp. 19-20.
(44) Quoted in A. A. Luce, Logic, p. 8.
(45) LCW, Vol. 38, p. 171.
(46) M. Donaldson, Making Sense, pp. 98-9.
(47) M. Donaldson, Children’s Minds, p. 76.
(48) Trotsky, Writings, 1939-40, p. 400.
(49) A. A. Luce, op. cit., p. 83.
(50) Kant, Critique of Pure Reason, p. 99, footnote.
(51) Engels, The Dialectics of Nature, pp. 64-5.
(52) Feynman, op. cit., chapter 1, p. 2.
(53) Kant, Prolegomena zu einer jeden künftigen Metaphysik, quoted in E. V. Ilyenkov, Dialectical Logic, p. 90.
(54) Engels, Anti-Dühring, p. 43.
(55) Trotsky, Writings, 1939-40, pp. 399 and 400.
(56) Trotsky, In Defence of Marxism, p. 65, our emphasis.
(57) Ibid., pp. 63-6.
(58) Engels, Anti-Dühring, pp. 26-7.
(59) Trotsky, Writings, 1939-40, pp. 401 and 403.
(60) Cohen and Nagel, An Introduction to Logic and the Scientific Method, p. vii.
Thoughts?
Whew! Lots of thoughts, but I counted around 20 PageDowns on my laptop screen... gonna have to break it down in sections. May not get to all of it right now.
Possible contradiction: What is meant by 'social evolution'? Is it the biological evolution of the human body's capability for culture, or is it the cultural evolution that produced language, art, etc.? The first is speaking about genetic evolution, the second about memetic evolution. The quote from Locke is clearly talking about the genetic ability of humans to think intelligently, so if 'social evolution' is intended to mean memetic evolution, then this contradicts Locke's quote.
What is described here is what I would call the ability of conceptualization. This is an aspect of human consciousness, not necessarily human culture. It is a foundation of human culture, though. The 'language' of conceptualization is shared by all humans, and forms the basis of spoken and written human languages. It is the 'language' of the brain, and all humans 'speak' it from birth. It is a biological ability.
Agreed.
This ability, I call 'intuition'. It is the natural biological ability of the brain to come to intelligent *subconscious* decisions. And yes, it is absolutely the foundation for all 'higher' levels of thought such as scientific reasoning. There is a great book called 'On Intelligence' which describes how the brain naturally has this ability to make good predictions.
<klaxon> <klaxon> "Danger danger! Unsupported assertion!"
Disproof in the form of counter-example: Computers are very capable of dealing with complex processes, movement, change, and contradictions. All that is required is a detailed and rich-enough abstract model to enable a computer to deal with such things. Video games, weather simulations, chess and robotic AIs, etc. etc. are daily proving skeptics of logic wrong.
Essentially this paragraph is saying that our minds cannot reflect reality perfectly. A much easier proof of this is the fact that our brains are finite in size and can only hold so much information. There is more information in the universe than can fit in a human brain. Therefore, our rationality is bounded. In fact, the term for it is 'bounded rationality'.
Dialectics is not part of science. It may have inspired individual scientists, but it is incorrect to phrase such as a 'contribution' to science. A real contribution to science would be an actual scientific understanding of how dialectics works, so that ANY scientist could use the technique. Currently, our science is not yet advanced enough to tackle this, but I think the time is quickly approaching; ironically, it will be ushered in by computers, the extreme product of 'formal logic'.
DANGER DANGER! Unsupported assertions abound!
Dialectical thinking provides insights? Yes. Science did a disservice? No. Science shows the world works according to dialectical thinking? No.
The reason science 'excommunicated' dialectical thinking is because it is not reliable. It is not objective. Two scientists, using dialectical thinking, can come up with two radically opposite conclusions. Dialectics is not science. The whole purpose of science is to eliminate disagreement by resorting to evidence and objective, repeatable procedures.
Dialectics is an advanced form of what I call 'intuitive thinking', as opposed to 'rational thinking'. It is more advanced because it produces more insights than mere educated guessing or new-agey thinking. But it is not rational, in the sense that it is not valid to use dialectical thinking as a justification for a particular conclusion, precisely because dialectical thinking can be used to justify ANY conclusion. Dialectics generates rationalizations, not rationality.
Someone who holds dialectical thinking above what the author refers to as 'one-sided' 'formal logic' will make the exact same category of mistakes as someone who holds faith above evidence. They will be influenced by personal bias and will firmly believe fallacious arguments.
Someone who holds dialectics above rationality will make more errors of reasoning, and will ultimately make worse concrete predictions than someone who holds rationality higher than dialectics.
So, that concludes the first part of my analysis: While I find usefulness in dialectical thinking, I disagree with the author's flippant dismissal of (or possibly ignorance of) the actual observed and tested power of rational thinking.
Edit: Let me elaborate on why science did not do a disservice to excommunicate dialectical thinking. The essential point is that *before* the scientific method was developed, people relied on intuitive styles of thinking, like dialectics (though it wasn't called that yet). The problem with intuitive styles of thinking is that they do not produce good predictions, because each person's intuition is different and comes to different conclusions. If you want to know how many seconds it will take for a ball to drop 10 metres, people's intuitions will be all over the map, but if you use a rational process, you can improve the accuracy of predictions 1000 fold and more. For the purposes of making accurate predictions, which is the goal of science, rational processes simply work better and intuitive processes are simply not good enough for the kind of work we need science to do for us.
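The falling-ball example can be worked out in a couple of lines. A minimal sketch (the function name is mine), using the standard free-fall formula t = sqrt(2h/g), which is exactly the kind of rational process that beats intuition's scattered guesses:

```python
import math

# Free-fall time from height h, ignoring air resistance: t = sqrt(2h/g).
def fall_time(height_m, g=9.81):
    return math.sqrt(2 * height_m / g)

print(round(fall_time(10), 2))  # about 1.43 seconds for a 10 m drop
```

Anyone who runs this gets the same answer, which is the whole point: the procedure is objective and repeatable, where intuition is not.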
Dialectical thinking, as an intuitive process, leads to a lot of erroneous conclusions. If you take these conclusions on 'faith' and dismiss the power of rational processes to detect and correct erroneous reasoning, then you are doomed to create your own personal dogma. It's true because it feels true to me! rather than the rational It's true because we have tested it and it works.
If science had not excommunicated dialectical thinking, science would have ground to a halt, with each scientist defending his own pet theory, ignoring evidence, etc. Science is founded on intuition, as I freely admit above. But it tries to correct intuition by correcting for logical fallacies (human brain biases), personal biases, errors in perception, etc. It takes intuition and molds it into a strict process whereby the best of intuition is kept (the natural ability to predict) and the worst of intuition is removed or compensated for.
Wonderist on Facebook — Support the idea of wonderism by 'liking' the Wonderism page — or join the open Wonderism group to take part in the discussion!
Gnu Atheism Facebook group — All gnu-friendly RRS members welcome (including Luminon!) — Try something gnu!
This kind of shrill sophistry doesn't add anything to your argument. To call logic a dogma is ridiculous. You can question the laws of logic all you want. You can even invent your own laws of logic if you like. If they work better than the ones we have now, I will be the first to use them. But unfortunately, there are no better laws. If there were, we would use those instead.
We use the laws of logic because they WORK, not because some guy told us to use them. If you want to solve a problem, you NEED the laws of logic to effectively solve it.
Does it? Do you have examples?
What is it equal to then?
Is quantum mechanics based on dialectics? Or is it based on math and logic?
And we have a rational understanding of chaotic processes. You may have heard of chaos theory, cellular automata, etc.
Dynamics, evolution of cooperation, game theory, etc.
Object-oriented programming, user-interface design, etc.
Punctuated equilibrium, memetics vs. genetics, stable vs. dynamic environmental pressures.
All of these are rational understandings of things that the author claims cannot be adequately understood using a rational process.
Again, unsupported assertions easily disproven with modern counter-examples.
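One of those counter-examples fits in a few lines. The logistic map is a completely deterministic rule that is nevertheless chaotic for r = 4: trajectories from nearly identical starting points diverge until they bear no resemblance to each other. A minimal sketch (function name and parameters are just illustrative):

```python
# The logistic map x -> r*x*(1-x) is fully deterministic, yet for
# r = 4 a tiny difference in the starting point grows exponentially
# until the two trajectories are completely decorrelated.
def trajectory(x, r=4.0, steps=60):
    xs = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

a = trajectory(0.3)        # start at x0 = 0.3
b = trajectory(0.3000001)  # start one ten-millionth away
print(max(abs(p - q) for p, q in zip(a, b)))  # the gap grows large
```

So chaos and sensitive dependence are not beyond rational analysis; they are routine subjects of it.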
This excerpt exhibits an ignorance of how the mind actually works. The so-called 'concrete' 'whole' is ALREADY an abstraction. You do not 'dismember' the whole, the 'whole' has already been constructed from many tiny parts. The brain is made of neurons. Each sensory neuron detects one environmental phenomenon. The brain combines signals from many neurons to form a conception of a 'whole'. But the 'whole' was already many parts. It was human intuition that combined the parts into an abstract whole.
What the excerpt is actually saying is that the rational analysis should be combined with the native intuitive understanding of the 'whole'. But there is nothing privileged about this 'whole', except that we have a natural ability to detect it. That is more a consequence of our evolution than it is a consequence of actual 'wholes' existing independent of our minds which perceive them. We are not losing anything by performing a rational analysis. We are not perverting some sacred 'wholeness'. The wholeness we perceive is because we perceive it, because we are human animals living in an electromagnetic world where some globs of matter tend to stick together with other globs of matter, not because there is some magical 'wholeness' force which we have blindly forgotten about.
The universe really IS made of tiny pieces.
Again, a misunderstanding of reality. The universe IS tiny pieces, and it is only US that decide there are larger 'things'. We are ALL idealists. Is the thing in front of you a computer screen? Or is it billions of atoms? Or is it trillions of quarks? Or is it bazillions of matter quanta? Your notion of 'concrete' is ALREADY an abstraction. Does your computer screen exist? If you answer yes, then you admit to your own inherent idealism. For without this idealism, how would you know if your computer screen were broken? To be broken means to be other than what it should be, but the concept of 'should be' is inherently idealistic.
Good. At least we're on the same page there. With this common ground, we can eventually come to agreement or at least mutual understanding.
Thank you for the reply. I hope it's clear that this isn't my writing. It was taken from the website:
http://www.marxist.com/rircontents.htm
I copied and pasted it to generate a discussion. I also have some disagreements with the authors and I'll respond when you have finished.
Spacebuddha, yes, I did assume it was you that wrote it. Sorry.
Kant or no Kant, this is a silly objection to syllogisms. They are a useful tool. This whole section seems to be an attempt to discredit the idea of a syllogism, as if it represents all that is wrong with logic.
This is a pointless objection. The purpose of a syllogism is not to 'advance steps', it is to transform complex premises into simple conclusions. Why? So that the simple conclusions can be remembered and the complex derivation does not have to be re-created every single time you think of an idea. To say that this is some sort of valid objection against syllogisms is to say that mathematical proofs, which take axioms and derive theorems, are just useless contrivances. Why do you need a theorem if you can just state the axioms? After all, the theorems are just another way of stating the axioms from which they are derived, right?
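The cached-derivation point can be made concrete with a toy encoding of the classic syllogism in sets. This is only a sketch; the function name 'barbara' is just a label for the All-M-are-P / S-is-M / therefore-S-is-P form:

```python
# A syllogism as a cached derivation: check the premises once, then
# store and reuse the simple conclusion without re-deriving it.
def barbara(middle, predicate, subject):
    """If every member of `middle` is in `predicate`, and `subject`
    is a member of `middle`, conclude `subject` is in `predicate`."""
    if not middle <= predicate:          # major premise: All M are P
        raise ValueError("major premise fails")
    if subject not in middle:            # minor premise: S is M
        raise ValueError("minor premise fails")
    return subject in predicate          # guaranteed True by the premises

men = {"Socrates", "Plato"}
mortals = men | {"Fido"}                 # all men are mortal
print(barbara(men, mortals, "Socrates"))  # True: Socrates is mortal
```

Once checked, "Socrates is mortal" can be remembered as a simple fact, exactly like a theorem derived once from axioms.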
This is the kind of weird-ass thinking that dialectics produces.
A particular and a universal are not opposites, so it makes no sense to say syllogisms show the unity of opposites. Dialectics allows you to use fuzzy definitions and equivocations to achieve fast and powerful insights, but it does not guarantee that your insights will be correct, or even that they will be powerful. It only provides a system which, given good information at the beginning, will produce pretty-good insights from this information. But just like 'formal logic', if you start with crappy information, you end up with crappy insights. Garbage in, garbage out.
Love is not a syllogism, by the way. QED.
Dialectics suffers the same problem, so this is not a unique criticism of syllogisms. If your premises are fucked up, your conclusions will be fucked up. No surprise there. Dialectics does no better.
I wonder how it is that you will be able to discern the adequate-and-correct methods from the inadequate-and-incorrect ones. Oh, by checking them against reality? Oh my! That's empiricism! Drat! I thought you had saved dialectics from the nasty requirement that it has to be applicable to reality.
Note to fellow thinkers: I really do not understand the post-modern disgust and derision for reality and the methods we use to understand it. I regularly hear people complaining about things like: Empiricism, evidence, induction, theory, experiment, Occam's Razor, deduction, etc. etc.
Why the feeling of disgust? Why the derision? Do they think that logic is 'icky'? "Eww. That's so logical. Yuck, it MUST be wrong." I don't understand that feeling. If someone could explain it, I would greatly appreciate it.
What the heck does it mean to be discredited? Does it mean that people showed that the syllogism was unable to produce useful results and that there are demonstrably better methods for achieving the same goals? If not, then what the fuck is the purpose of using the word 'discredited'? And why should its being ridiculed and abused have anything to do with its utility?
GREAT post. Kind of sinks the atheists' boat that they can't even prove A is A. Can you spell "epistemologically dependent upon God"?
Q: Why didn't you address (post x) that I made in response to you nine minutes ago???
A: Because I have (a) a job, (b) familial obligations, (c) social obligations, and (d) probably a lot of other atheists responded to the same post you did, since I am practically the token Christian on this site now. Be patient, please.
You've just conceded that you don't know what an axiom is... we don't prove axioms, axioms are prior to any proof, seeing as proofs rely on logical argument!
Axioms like the law of identity are defended through retortion.
And there is no concern in any informed, scholarly circle about the law of identity somehow being in doubt! If you think otherwise, please cite me a scholarly article from an established university where anyone either 1) states that there is a problem 'proving the law of identity" (again, this is a doubly ignorant statement) and/or 2) cite me any established philosophy department that cites 'god' as an 'epistemological foundation' for logic.
Please don't hold your breath while attempting either task. No one who has actually taken a class in logic speaks in the nonsensical terms a presupper uses...
Can you spell outright nonsense? Your claim here is nonsensical for several reasons. First, as I have already demonstrated, it's a non sequitur. Next, 'god' is an incoherent term. Finally, the claim that any statement in logic is 'epistemologically dependent' upon some 'entity' like a 'god' is both rather odd and, in the final analysis, a vacuous statement: you don't say anything about epistemology at all... you just assert that 'logic' somehow 'relies' on some 'entity'... what the hell do you even think you mean here?
"Hitler burned people like Anne Frank, for that we call him evil.
"God" burns Anne Frank eternally. For that, theists call him 'good.'
I've heard you use that word before but I cannot seem to find the definition you mean in online dictionaries and wikipedia. Could you point us to a link explaining how retortion works?
Hi Natural. I am betting that you already know, implicitly, how retortion works and just need to have it verbalized.
http://editthis.info/logic/The_Laws_of_Classical_Logic
Classical logic rests upon a foundation of axioms. The axioms of classical logic are a set of a priori abstractions that humans create in order to make categorical syllogisms; their existence is contingent upon sentient brains. Some may argue, like myself, that these laws have correlates in basic laws of metaphysics [1] and that this accounts for the 'utility' of logic, but it does not follow that logical rules are rules for the universe - they are rules for arguments that may or may not mirror basic metaphysics. The universe isn't subject to any laws of logic at all. The universe merely exists. However, because I feel there is a clear relationship between the metaphysical status of these concepts and their application to logic, I will address both concepts in my definitions.
The Axioms of Classical Logic
All of our syllogisms rely on these laws - that any thing is equal to itself, that tautologies must be true, and that contradictions must be false. Classical logic holds that everything has a definite, non-contradictory nature. A metaphysical law of identity would hold that for a thing to be perceived, or even to exist at all, it must have a definite, non-contradictory nature; but for our purposes, it is enough to say that if A is true, then A is true!
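The three laws can even be checked mechanically over both truth values; a minimal sketch:

```python
# Mechanically checking the three classical laws for both truth values.
for A in (True, False):
    assert A == A                # law of identity: A is A
    assert A or not A            # law of excluded middle
    assert not (A and not A)     # law of non-contradiction
print("the three laws hold for every truth value")
```

Of course, an exhaustive check like this is not a proof of the axioms (the checker itself presupposes them); it just exhibits them concretely.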
Self-Evident Nature of Axioms
Seeing as axioms are the foundation of classical logic, it is not possible to use classical logic to justify them. However, Aristotle held that this is unnecessary: the axioms of classical logic are held to be self-evident. We hold that they are self-evident because all syllogisms rely on them, and because they can be defended through retortion.
A defense through retortion occurs whenever an argument must rely upon the very principle it seeks to overturn. Any attempt to form a syllogism to refute the axioms of classical logic will have to rely on the very axioms it seeks to overturn, leading to a self-refutation (we call this type of self-refutation the stolen concept fallacy; see the Informal Fallacies section for more on this).
A classic example:
Those who invalidate reason ought seriously to consider whether they argue against reason with or without reason. If with reason, then they establish the principle that they are laboring to dethrone: but if they argue without reason (which, in order to be consistent with themselves they must do), they are out of reach of rational conviction, nor do they deserve a rational argument. - Ethan Allen, Revolutionary War hero
[1] Metaphysics is a term also invented by Aristotle, and it has to do with theories concerning how existence itself 'works'. The only grounds required for any basic metaphysical system is the existence of sentient brains. A sentient brain would glean basic metaphysical axioms, necessarily (and at least implicitly), in the process of thought. As for the question of where sentient brains 'come from', this is a question well outside the purview of both logic and metaphysics; it's a question of biology and, perhaps, cosmology.
Note: the above site is my site, although I cite my sources re: retortion. Please feel free to give feedback.