Doubting scientific realism

Mouse asked:

Can you please explain the difference between scientific realism and scientific anti-realism as viewpoints in the philosophy of science? Especially, how do scientific realists and scientific anti-realists differ in their views of 1) scientific facts, 2) scientific hypotheses, 3) scientific laws and 4) scientific theories? Also, can you please comment on how they differ in their views of atoms, electrons and quarks? Would a scientific anti-realists deny the existence of atoms, electrons and quarks?

Answer by Jürgen Lawrenz

Strictly speaking, you should consult a scientific Q&A website for this question. However, as its philosophical aspect interests me, I’ll give a response in those terms, while being mindful of the adage about the bull in the china shop.

Scientific anti-realism is the doctrine which combats the notion that unobservables are ‘real’ things or events in the ‘real’ world. It lays stress on the possibility that instrumental results from experiments may not demonstrate existence conclusively. There is much to be said for this, especially in situations where the instrument itself has a prejudicial influence on the results, as in the realm of the quantum. We dive in and find ludicrous nomenclatures like ‘quarks’ and their ‘colours’ and ‘flavours’ being traded as if they were equivalent to the candies they gave us in kindergarten. To all this John Wheeler gave a consummate answer with his proposition of an “observer-created reality”, which means that the observer must take responsibility for interpreting his or her findings and stop pretending they are objective facts. You might also remember Eugene Wigner’s frustration with the observation of quantum collapse experiments, when each time he looked a different ‘fact’ stared him in the face. We seem to have found ways to ‘get around’ these irritations more recently, but many people would say that this is still instrumentalism, not objective reality.

The philosophical issue here is that (cautiously phrased) we cannot truly bridge the discrepancy between outcomes of an experiment that has a tangible effect in the macroscopic world and the theoretical outcomes of experiments which leave only ambiguous traces like pointer readings or smears on a wall. If you take a sledge-hammer to an ornately wrought glass urn, you will not be able to reconstruct the object from the pulverised remains. But this is what a great deal of subatomic research purports to effect. You see the problem.

As for atoms, there is no need to deny them. They register willingly on spectrometers to support the conviction that they are physical objects. It is different with electrons, which seem to be ‘states of energy’ of an atom and vulnerable to misconception as particulate ‘entities’. What actually happens inside an atom during fusion or fission cannot be made visible, although the outcome is highly visible and destructive. Divisibility in this case encourages the belief that an atom is an entity made of parts; and that the parts are themselves compounded of parts (hence the term ‘particles’); and so it goes down the ladder until we get either to the point where theory has no decimals left with which to resolve the difference between ‘space’ and ‘object’ or else just slides into infinite regress. Indeed, a number of such routines — especially on computers whose rounding-off routines are not positively knowable — rely on ‘re-normalisation’, a sophisticated term for the more common word ‘fudging’.

An anti-realist would certainly deny the reality of all the creatures of this zoo, and one could easily feel sympathy with their point of view. None other than Werner Heisenberg complained in a famous lecture that the questions “what is it made of?” and “are they divisible?” in relation to particles are wrong questions based on bad philosophy, and he wasn’t an anti-realist. Barrow and Tipler wrote a chapter, ‘Are there any laws of physics?’ (in their book The Anthropic Cosmological Principle), in which they claim that the laws of physics exist only in the minds of physicists, which they then plaster over phenomena because they expect to see those laws being enacted. For my part, I feel that the quip “laws are made to be broken” suffers no exceptions. The laws of science are all man-made, and the number of these laws that have toppled over the last two centuries would fill a sizeable garbage bin.

No disrespect to the brilliant scientists working those fields is intended with these remarks. Moreover, I doubt that realists and anti-realists are encamped in mutually hostile quarters and never talk to each other. But I feel (very strongly) that Heisenberg’s point is relevant and that science going alone without philosophy is like a horse and carriage that lost the carriage.

There you have it. Inadequate, but hopefully somewhat illuminating.


Solipsistic Egoism?

Aleksander asked:

Can Max Stirner be considered a defender of “metaphysical solipsism” and “immaterialism”? Is there some relation between Stirner’s philosophy and George Berkeley’s philosophy? Is there some philosopher in history who defended “metaphysical solipsism”?

Answer by Martin Jenkins

To the first and second question — I don’t think so. To my knowledge, Stirner never writes that the world is immaterial, or advocates a variant of solipsism. He is understood as advocating a form of Egoism, although this orthodox conclusion could be due to the translation of Einzige to ‘Ego’ when Stirner wanted to convey a wholly new philosophy.

His text might give the impression of solipsism, as he does emphasise and write in ‘the first person’ — citing ‘I’, the Ego, me. But the impression is misleading.

When Stirner writes about ‘Spooks’, he is not saying that the world is a ‘Spook’ and therefore immaterial. ‘Spook’ is his description of Philosophies such as Christianity, Humanism and Communism which haunt the minds of people, and which are wrongly posited as primary sources from which predicates or definitions of what it is to be human are derived. In this respect, he is following his contemporary Ludwig Feuerbach.

In criticising German Idealism and Hegel in particular, Ludwig Feuerbach observed that it practices the inversion of Subject and Object. Namely, instead of Thought being the predicate or object of the Human subject, the human becomes a predicate or object of Thought, where Thought is taken to be God, Reason, the Concept, Geist and so on.

Stirner continues this approach to Christianity etc, only what follows the inversion is not any universal human nature but only the ‘creative nothing’, a ‘Unique’. The ‘Unique’ is not a solipsist as s/he can associate with others for common aims in a ‘Union of Egos’. So ipso facto, solipsism is ruled out.

Immaterialism?

Is there any relation between Stirner and Berkeley’s immaterialism? Berkeley’s philosophy can be described as a variant of Idealism. Stirner’s philosophy is regarded by some as the furthest development of Hegel’s Absolute Idealism. Indeed, in the early 1840s Stirner frequented meetings of the Young/Left Hegelians held at Hippel’s wine bar in Berlin, along with, amongst others, Frederick Engels.

Hegel held that, historically and politically, Reason would dialectically supersede (aufheben) its previous instantiations which had become unreason. Instrumental in achieving this was the Geist or collective consciousness of a people. That is, the Geist of a people would achieve self-consciousness of the unreason in society and act so as to reform it as instructed by Reason.

Not only does Stirner provide an account of History in the first part of The Ego and Its Own echoing Hegel’s Phenomenology of Spirit, but arguably the collective Geist is transformed into the Ego of individuals; any vestige of Idealism is replaced by materialism, and right becomes, essentially, material might. So even here, Idealism has been negated by materialism.

So Aleksander, I don’t think there is any relation between Stirner and solipsism, or between Stirner and Berkeley’s immaterialism.

Physics vs feeling: Can I really touch anything?

Mouse asked:

I have recently heard that, according to physics, you can never actually touch anything.

This seems clearly false and I feel it should be refuted with philosophy (if not physics).

Can you comment on this?

p.s. See for example futurism.com/why-you-can-never-actually-touch-anything which seems to claim that, according to physics, you can never actually touch anything.

Answer by Jürgen Lawrenz

A philosopher can easily get tired of all these ultra-reductionist and ultimately futile assertions. One begins to wonder what they really seek to show (or prove?), or whether it’s merely one-upmanship on naive auditors.

After all, it must have been obvious to Caveman that the lizard on his hand can’t see him, and that conversely he can’t taste the food this creature eats. So what? Yet the creature feels something of the man’s skin, maybe a kind of leathery landscape that allows it to sink its claws in to hold fast; and the man would feel the pinpricks as well as the warm soft texture of the lizard’s belly. Between the two a connection is established, and that’s what we need to understand about our feelings – not that they are unreal, figments or hallucinations. Fact: there is rapport here that conveys a specific type of information to both nervous systems.

Of course you can place a microscope with stupendous magnification powers on this scene, whereupon you would observe nothing but molecules tumbling around in some oily slosh. Entertaining for a while, but it will not quicken your heartbeat with the wonder of it. Going further down into the realm of physics changes everything again, because here even the molecules are so big that (by all appearances) you could drive a fleet of trucks through the spaces between them. Except that you can’t see this at all, because there is no way of making it visible. This is where those wonderful coloured graphs in our textbooks deceive us, making us believe that neutrons and protons and electrons and the rest of the subnuclear bric-a-brac are hard physical things, whereas in reality there is nothing hard or soft to be found.

So what could your physicist explain? Nothing much that is meaningful to the animal/human context. In particular, he could not deduce from either the chemical slosh or the scattering of neutrons what they might produce in terms of sensations. He and you need the experience of a physical touch first, before any physics elaboration is possible. Without this prior knowledge, it would not be possible to conceive of a moment or location when sensation arises; and the sad truth is that even with this experience behind us, a billionfold swarm of elements still does not yield up the information that physics pretends to convey. And so, to be blunt about it, the message of physics on whether touch is real or not ends up being gobbledygook.

The message from chemistry is slightly more respectable, because chemists deal with things to which we can put a measuring gauge through a microscope. But again: size is of the essence – for in the region of molecules which sport only a few million chemical bits and pieces, you can see something that vaguely resembles cause and effect being realised in the macroscopic world. But this could occur in a Petri dish as well as your fingertip, so that we are still no wiser on the question.

Consider, however, that some of these molecules are alive and act/react the same way cavemen and lizards do. Evidently this is where sensing becomes a phenomenon to which we can relate, namely with microbes initiating a sense of touch, from which we learn that all sensations are variants of the basic sense of touch.

Many steps up the evolutionary ladder, we discover that hearing is the impact of a train of air-driven molecules on your eardrum, and that taste is the tongue’s analysis of physically deposited chemical substances on its surface. The brain, which has evolved to recognise sensory impact as so many species of molecular vibrations, knows how to separate them and pipes this information into your consciousness in the form of subjective feelings.

This subjectivity does not, however, diminish its reality. Your and my reality is actual touch, hard or soft, hot or cold, rough or smooth. How wonderful to be told that all this is due to the energetic flurry of chemical elements and their interpenetration. That’s knowledge too. But emphasising the distance between the atoms of my fingertips and my tabletop for an argument that it precludes actual touch is non-information, because it illicitly couples two different and incompatible dimensions of existence.

The only virtue, in the end, is that chemical know-how enables us at times to bridge these dimensions indirectly, i.e. by administering anaesthetic drugs. Yet even this is not a valid component of the argument, since it only numbs the body’s receptors. Blind people know about this and navigate with the help of sticks, exchanging one touch for another.

To end, a couple of curious instances on this “physics vs feelings” dichotomy.

(1) The poet Goethe, who knew a thing or two about art, wrote up a theory of colour. Some years later it was demolished in the name of science by the physicist Helmholtz. Who was right? Most people incline towards Helmholtz, but there have always been unrepentant lovers of art who ask the more relevant question, “what use is the physics theory to artists?” Answer: None. For colour is a sensation that provokes inter alia an emotional reaction. So does the pin-prick which, according to your physicist, involves no immediate touch.

(2) Arthur Eddington wrote on page 6 of his book The Nature of the Physical World that “the table I write on is mostly empty space”. How so? “There are innumerable electric charges rushing about, but their combined bulk amounts to less than a billionth of the table’s substance”. But (he says) he still maintains his complete trust in the physical reality of the table, leaning on it while writing without expecting the billions of interstices to cause its collapse.

Thus, finally: whether touch is ‘immediate’ is nothing other than a question of how greatly the event is magnified when you look at it. And it stands to reason that at a certain level of magnification, not only the sense of touch but the very meaning of the event disappears.

Sport and philosophy

Florence asked:

Hello! I am looking to do philosophy and politics at university however I am struggling to relate my achievements (outdoor sports/ qualifications, Gold Duke of Edinburgh Award etc) to the subject for my personal statement. What would you suggest for me to get involved within or read/ attend to make myself stand out from the crowd a bit more? Thank you in advance.

Answer by Jürgen Lawrenz

There isn’t much talk about sports among the great philosophers. But as luck would have it, the ancient Greeks were pretty much obsessed with it, for which their Olympic Games are the best testimony. This is not forgetting their statuary art with its obsession with an ideal male physique. There is an ample literature, written by scholars, on sport itself and its reflection in art in ancient Greek and Roman culture.

The fact that Anglo-Saxon colleges and universities have practised and promoted intensive physical exercise over the past 200 years, even to the level of professional commitment, is a return to this emphasis in ancient culture. It is an issue worth pursuing in light of the contempt for sporting activities in the preceding 1000+ years, when the Church despised everything physical and left it to the armed forces of the princes to pursue.

You could tie this in with reading Plato, especially his Republic, which is one of the primary sources for political philosophy of all time. There is plenty of debate and discussion in this work about the physical fitness of those who are chosen for a role in the governing tier of an ideal society — including dietary considerations, physical exercise, ethical issues involved with their spartan regimes, and so on. It would look pretty good on your CV if you can make a convincing case of your familiarity with Plato’s ideas and of how this Greek predilection seeped back into modern culture. The name Thomas Arnold, appointed headmaster of Rugby School in 1828, springs to mind as a catalyst for this modern revival.

There is also a spate of quite recent books on “Philosophy and Sport”. You can find them easily by putting these three words in your web search field. I can’t promise you anything about their contents or relevance to philosophy; but a few actually try to blend sport with social ethics and the influence of sport on politics. For example, the philosopher David Papineau has compiled a list of the five best books on the subject and has written his own texts on it.

Thinking in words – or not?

Jan asked:

What are the arguments for and against the proposition that humans think in words?

Answer by Jürgen Lawrenz

All arguments ‘for’ are driven by philosophical, linguistic, and even religious notions, as well as certain intuitive ideas based on the peculiarity that humans are alone among all species of animals with this capacity.

It is an attractive proposition to say to ourselves, as we think, “I’m thinking with words”, because that’s what we commonly seem to do. I’m doing this right now, thinking as I’m writing these words. By the same token, I am aware — as perhaps we should all be — that before I put pen to paper, there is an idea in my head I wish to express, and this idea is not a sequence of words — rather the words come as I write, as if my thinking mind triggers a process that collects the words ‘on the fly’, so to speak. Exactly the same pertains to speaking, which is precisely why I write a speech down before I deliver it. I cannot trust my mind to supply the right words if I speak without prior preparation, and therefore writing them down is a safeguard against getting stuck or confused, letting wrong words slip out or simply missing something that I wish to say.

All these familiar hiccups are an argument ‘against’. I have known people who can “speak like a book”, but they are rare. And this applies to writing as well. Just look at the most common problem that afflicts writers: They grimace at their text and wonder why the words just don’t seem to reflect what they were intended to say. Tolstoi is reputed to have written War and Peace seven times over; and I think (again with few exceptions) this is the rule. For everyone, speaker or writer, it’s a struggle to find the right words to express their ideas.

This is not forgetting that words do not generally stand alone, but must obey the grammar and syntax of the language and that, importantly, most words must be fitted to this mould specifically, i.e. must occupy a specific place in the sequence, which is not predetermined, but can vary depending on the intended message.

There is enough in the above to show that thinking is not done with words. On the contrary, these struggles testify against it. If we thought with words, why do we make mistakes? It’s illogical to believe that I think words and then can’t speak or write them! So all this points to some faculty that is connected to, but not identical with, the “dictionary” and “grammar primer” in our memory. But we have to be careful to keep ambiguity at bay. This means that, although thinking is not done in words, nor with words, the words and grammar are ‘in reserve’, like infantry, cavalry, artillery etc. lining up for battle. In other words: we must have learnt the words as well as grammar and syntax first, before thinking is possible. And, incidentally, every infant would (if they could!) tell you the same thing.

Therefore the answer to this dilemma is the existence, in our brain, of verbal and motor cortices, all connected to the conceptual faculty and memory, which do this work for us. As I start to think with intention to speak or write, my cortices go hunting for the words, put them in sequence and activate the appropriate muscles — lips and tongue, or the hand driving a pen or tapping a keyboard. All the errors I mentioned above are reminders that it is a far from perfect performance. If we really thought in words, these things would not happen!

To sum up: I am not, generally speaking, convinced that science is in possession of appropriate tools to handle the many subject-related topics on which philosophy thrives; and this includes theories of the mind. But there are exceptions; and on your question we have one of these few, in that neurophysiology has by and large succeeded in unravelling an issue on which, as it turns out, philosophy is not well equipped to offer a plausible explanation from its own stock of concepts.

Occam’s Razor

Felicetta asked:

What philosophical “blade” encourages one to prefer simple explanations when they fit the evidence?

Answer by Jürgen Lawrenz

The “blade” you refer to is called “Occam’s Razor”. It is the name given to two arguments by the English scholastic thinker William of Ockham which stressed the “principle of greatest economy” in the search for truth and insight.

The first of these says:

It is pointless to do with more what can be done with less.

The other points out that:

A plurality should not be assumed without necessity.

Both these sentences are essentially warnings of the dangers of multiplying hypotheses to bolster a proof. An hypothesis is not a certainty; therefore five hypotheses will only render a proposition more uncertain and dissipate focus on the essence of an issue.

In addition, hypotheses are often framed with special nomenclatures requiring a definition, which is tantamount to a separate proof. But if only one of these is uncertain, then the whole ensemble is impaired.

Ockham, who was born in the 13th century, was targeting primarily the reliance of theological “proofs” on syllogistic principles. This method, he said, rests on confusion between concept and denotation — the first is a creature of the mind whereas the other points to something in the world. In metaphysical speculation, however, they are of equal value; therefore syllogisms which rely on supernatural causes run their course without contradiction and end up “proving” arguments that are plain nonsense.

There is a nice little book on this by Stephen Chak Tornay: Ockham: Studies and Sketches. It goes almost without saying that the march of science since the 18th century relies altogether on “Occam’s Razor”; it is nothing less than the First Commandment of scientific research.