Thursday 30 November 2023

Self-Consciousness: the Battle between Science and Philosophy


By David Comfort

Thinkers have debated many questions about the nature of man, the most basic of which are closely related existential ones: What is the self? What is consciousness? A myriad of related questions arises, such as: is it possible for some advanced kind of consciousness to survive the death of its mortal envelope – the self?

Broadly, there are two opposing perspectives: immaterial and material. The first school of thought is dominated by metaphysical philosophers (from the Ancient idealists through to more modern panpsychists); the second by empirical scientists. Metaphysicians regard consciousness as a transcendent faculty of the human mind, separable from self; materialists regard it as a neural phenomenon of the physical brain, inseparable from self. Metaphysicians assert that the body is an idea of the mind; materialists, that the mind is a sensation of the body.

Step back a moment and consider the terminology. Con means ‘together’; scio, ‘to know’. Etymologically, then, consciousness in the self is an overriding sixth sense that analyses and organises phenomena for the purpose of knowing. Moreover, embodied awareness is only as integrating and potent as its five servant senses and neurological horsepower. 

The subject of higher consciousness in animals has always been controversial. Some argue it is nonexistent, while others allow it does exist but as a primitive survivalist awareness. ‘Wary’ creatures survive by being aware of danger. Self-preservation, then, might be said to be the mother of sentience and rudimentary consciousness. But humans seem to possess an advanced form: we are aware of being aware. So, introspective individuals can, at least theoretically, study their own consciousness. But, in doing so, they may find themselves in a reflection-on-reflection-on-reflection rabbit hole leading to infinite regress.

In ancient Greece, the Oracle of Delphi established the goal of all thought with a simple but difficult mandate: Man, know thyself, then thou shalt know the Universe and God. This didn’t mean focusing on the impermanent outer layers of the multi-levelled self – the physical, social, emotional, or psychological – but penetrating to its core: the spirit that transcends mortality and individuality. So, Plato said: ‘All philosophy is training for death.’

Through logic, logos, and purpose-discerning intelligence, telos, Plato and Aristotle identified soul, or what they called psyche or anima, as the eternal essence of the self. Had they known about the later concept of consciousness, they might have considered it no different from the higher self’s anima.

Perhaps the first Western philosopher to venture a tentative definition of consciousness was John Locke. In his Essay Concerning Human Understanding (1690), the British Empiricist identified it as ‘the perception of what passes in a man's own mind’. But since ‘perception’ is basically a synonym for consciousness, the definition is circular. Fifty years earlier, the body/mind dualist, Descartes, had declared: ‘I think therefore I am’. Locke might have rephrased this instead as: ‘I am conscious, therefore I exist’.

Cracking the mystery of consciousness – what it is, how it works, and whether it can outlive its default object, the self – became a primary focus for philosophers at the turn of the 20th century when the phenomenologists, Husserl and Heidegger, made a case for a ‘transcendental’ form, while the existentialist, Sartre, countered by arguing ‘consciousness is self-consciousness’. At the same time, the psychological nature of self was first systematically analysed by Sigmund Freud who introduced the ego-id-superego trinity, and by his young colleague, Carl Jung, who began to plumb the unconscious as a repository of mythic and dream archetypes.

To understand the nature of self and consciousness, their complementary functions in the individual mind must be studied. To organise, understand and predict, the mind divides the world into objects, then analyses them according to their apparent causes. Dividing phenomena requires negation: X is X but not Z. So, equation and negation are the definitive abilities of the discriminating conscious mind. Most importantly, negation creates the two interconnected dimensions of human life and cognition, the building blocks of the self: 3D Space (I am here, not there), and 3D Time (my present is not my past, my past is not my future).

‘The only reason for time is so that everything doesn’t happen at once’, said Einstein, before proceeding to argue that time is not an absolute reality, but a mental construct affected by the motion of the observer relative to the observed. The kinetic present is nothing more than “everything happening at once”: it overwhelms the mind, preempting static objective thought. Without the idea of beginning and end, and without memory of the past and imagination of the future, the mind drowns in the disorder of the here and now.  

So, while the body lives in space, the arena of movement, the mind lives in time, the measure of movement. Since time measures space (in light years), physicists collapse the two dimensions into one: Space–Time. Telescopes are time machines: looking into spatial distance, they peer into the past. When consciousness is ‘heightened’ in mystical or psychedelic states, space distorts or even dissolves, while time slows or even stops.

Modern physics renders imaginary the idea of a centered, stationary object or subject with a fixed point of view. Indeed, the body itself becomes a hive of hyperactive nerve activity. Outside, in global space, it spins at 1,000 mph while riding the earth’s merry-go-round at 67,000 mph around the sun, which itself circles the center of the Milky Way at 450,000 mph. Nevertheless, the self remains body-centric, anchored in an illusory I-am-where-I-am spatial identity. Even so, while pondering consciousness, Locke concluded that the body is not so much ‘physical’ as the conscious sensation of the physical. Hence, even from his empiricist point of view, he regarded the body’s assumed solidity and independent material reality as an unfounded conclusion.

Some regard Heraclitus as one of the first materialists, at least in contrast to the ethereal Platonists. ‘No man ever steps into the same river twice’, he declared. From this fact, he concluded that all being is becoming, an idea that surely applies to the stream of consciousness. So, is everything, both physical and mental, indeed change – impermanence – and hence impossible to pin down and truly identify? In Einstein’s everything-happens-at-once pure present, the question is only valid in abstract time – when comparing a present river or consciousness to a past river or consciousness. In reality, the one-change ‘uni-verse’ is not all change but all motion, the manifestation of energy. To mentally break it into matter-in-motion is to replace a real kinetic with an imaginary static.

Heraclitus’ critic, Parmenides, argued that reality is indivisible as well as timeless, making change an illusion. Much later, Isaac Newton, working on his laws of motion, claimed that time and space are a priori (and thus ‘pre-time’) absolutes. Newton’s contemporary, Bishop Berkeley, challenged the premise in De Motu (On Motion), insisting motion is the absolute God-caused reality, while time and space are, a posteriori, human abstractions relative to it. And today, three centuries later, quantum physicists regard everything as an energy wave or vibration. The idea of the ‘particle’ – and even the seemingly contradictory massless particle, the light photon – helps them escape mental chaos. Yet, as the father of quantum mechanics, Niels Bohr, pointed out, ‘Everything we call real is made of things that cannot be regarded as real. A physicist is just an atom’s way of looking at itself’.

Viewing self-consciousness in this light, then, could the compound term represent something that is not real but, instead, just a conceptual aid or verbal convenience? If more than that, is consciousness the self’s way of looking at itself, or is the self consciousness’s way of looking at itself? Either way, which came first, and which causes which, becomes a chicken-or-egg question – or, if neither came first, a question of simultaneous birth.

Which, though, is it? Imagine consciousness as the mind’s flash camera. Depending on the F-stop and shutter speed, a nano-second separates the click/light flash (present) from an awareness of a developed photo (now representing a past sensory or mental event): that processing and re-cognition delay, that freeze-frame, creates time. Since we experience by being conscious of experiencing, our consciousness Polaroid stores its experience photos in the self’s temporal lobe headquarters. This artificial, subjectivised reality is the basis of the time-bound ego-sphere that dies when an individual’s embodied time is up. After self-purging the mind, some Eastern and Western mystics claim to have returned to the original universe lifeboat and entered eternal, disembodied consciousness. 

Shortly before his death, Einstein wrote to a friend mourning the loss of his young son: ‘A human being is a spatially and temporally limited piece of the whole, what we call the “Universe”. He experiences himself and his feelings as separate from the rest, an optical illusion of his consciousness. The quest for liberation from this bondage [or illusion] is the only object of true religion.’ 

 Individual consciousness expresses itself in symbolic language. The Word. This is the mind camera’s film capturing sense or cerebral experience. So, the hub of the five senses, the head, becomes a micro movie theatre of past and projected future images complete with a running commentary voice by its director: the ego. The ‘I’. The practical outcome of this abstraction is that, when viewing a present object, the mind also sees its composite idea of it based on past perceptions and understandings, an idea expressed by an identifying word – whether cow, cloud, cosmos, or whatever.

The Egyptians were perhaps the first to believe that ‘The Word creates all things’; they referred to their priests’ writing as ‘the speech of the gods’. Early Christians adopted the idea: ‘In the beginning was the Word, and the Word was with God, and the Word was God’ (John 1:1). The first job God gave Adam was to name the Eden animals, and ‘… whatever the man called each living creature, that was its name’ (Genesis 2:20). Thus, language became the foundation of the self’s conceptual universe. Words are structured according to grammar, which itself reflects the mind’s own structure. In physics terms, nouns are substantial and static; verbs are waves and kinetic. Nouns come in cases that indicate their function: subjective, objective, possessive. Verbs come in tenses that indicate their time: present, past, future.

Locke’s notion of consciousness as a seer presupposes a seen and thus creates subject/object dualism. Again, a person’s primary consciousness is self-consciousness. If consciousness is posited as the essence of a man, then he becomes schizophrenic: both the seeing subject – I – and the self-reflected seen object – Me. This divide leads to a daunting question: to comprehend what it is, can the seeing consciousness make itself into a seen object without becoming other than what it intrinsically is – the seer? In fact, self and consciousness seem in such close orbit that it is difficult to know which circles which, or if one reflects the other, or whether they are a two-way mirror.

A person is considered to be an individual, meaning ‘undivided’. Though the self may indeed seem unitary, to understand it, anyone trying to ‘Know Thyself’ becomes a spelunker of its layers, crust to core. The first level: consciousness of the body and its five senses. Second: of desire and emotion. Third: thought. Fourth: spirit, soul, or being. Materialists mostly live in the first and second levels; conceptualists in the third; mystics in the fourth. 

As Schopenhauer pointed out, the engine of the self, for most, is found in the second layer: will and desire. ‘My entire philosophy can be summarised in one expression: the world is the self-knowledge of the Will’, as he told a colleague. Predicated on the future, the ego’s Will creates time and turns life into suspended animation in anticipation of future gratification. Will becomes both a captain of consciousness and its corrective lens, or rose-colored glasses. It concentrates awareness on what it wants, while filtering out or airbrushing what it doesn’t. As time passes, the lens gets thicker, more opaque and more distorted, while the man behind it still insists he has 20/20 vision.

All creatures possess the will to survive and reproduce. Humans go a step further, striving for well-being in love, fortune, fame, and/or power. But desire is the itch that increases the itchiness. Even the rare person who seems to have it all often wants more. In any case, whatever the self wants provides purpose and meaning to its life. In this sense, consciousness, being intentional, is governed by teleology. So, the mystery becomes: after the inanimate-to-animate evolution of things, climaxing in mortal consciousness, where did self-will come from?

The question can’t be answered unless the age-old free will versus determinism debate is resolved. Yet such a resolution seems unlikely since philosophers on both sides of the issue often present as logical conclusions what are ill-concealed presumptions. For centuries, mystics have taught that to be truly free – to achieve transcendental consciousness – one must escape bondage to the time-bound, desire-driven ego with all its attachments and anxieties. The few contemplatives who succeed realise that this self is a causa sui I-llusion. To return to the original primal self born of cosmic force – whether it be called divinity by Westerners or dharma by Easterners – mystics have for ages practiced self-reflection and self-mortification in many forms.

Given that the self (illusory or not) operates conceptually and wilfully in space and time (mental projections or not), it is definable psychologically and philosophically. Any attempt to define consciousness, however, entails formidable problems, since any definition is only as good as the consensus behind it. The more abstract and intangible the word-concept (God, Soul, Being, Truth, etc.), the greater the likelihood of a vague, subjective, and/or arbitrary definition. In the case of consciousness, its definition varies according to the disciplinary bias of the definer: the materialist scientist rejects subjectivity; the metaphysician embraces it. Thus, their definitions will never agree. Both the materialist and the immaterialist bias are problematic in their own right.

A century ago, Einstein energised the material, mechanistic Newtonian universe with E=mc², proving that supposedly ‘solid’ matter is in fact pure energy compressed by invisible forces (gravitational, electromagnetic, and/or nuclear). Indeed, theorists see the early cosmos as pure, unbound energy, with stable atoms only forming hundreds of thousands of years after the Big Bang.
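
To get a rough sense of the scale the equation implies, consider the standard back-of-the-envelope calculation for a single kilogram of matter fully converted into energy:

\[
E = mc^{2} = (1\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^{2} \approx 9 \times 10^{16}\,\mathrm{J}
\]

That is roughly twenty megatons of TNT locked inside something you could hold in one hand – a vivid measure of just how ‘compressed’ that energy is.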

Until the 20th century, scientists mostly studied matter macrocosmically. Then they turned their attention to the microscopic – to what the materialist Democritus first called the atom, Greek for ‘indivisible’. To their amazement, they discovered that it is indeed divisible: into protons, neutrons, and electrons, the first two themselves composed of quarks bound together by gluons. To their alarm, they discovered that the atomic world seemed to operate according to completely different rules than the macro world. With micro reality rendered seemingly random – governed by chance, if not by science’s mortal enemy, chaos or entropy – Einstein protested, ‘God does not play dice!’

More disturbing to the father of relativity was Niels Bohr’s proof of quantum complementarity and the Observer Effect. The first principle states that the position of protean matter can be measured in space, or its speed measured in time – but not both simultaneously. The second states that the observer, through the very act of observation, changes the observed object. In short, what we perceive is never the object in and of itself, but our interaction with it. Thus, the object has no independent reality, making scientific ‘objectivity’ an illusion, at least in the quantum realm.

Taking Bohr’s complementarity and the Observer Effect into account, Werner Heisenberg derived the Uncertainty Principle in 1927. A major blow to the historic goal of science – certainty – it reduced researchers to conjecture based on probability. Kant had argued long before, in The Metaphysical Foundations of Natural Science, that the science of the mind shouldn’t be based on introspection since ‘even the observation itself alters and distorts the state of the object observed’. Given that the synaptic brain is animated by the three quantum forces – electromagnetic, strong and weak nuclear – shouldn’t it be studied according to quantum principles? If so, then in the act of observing self, consciousness cannot know what self is independently, just as consciousness can’t know itself independently through reflection.
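
For reference, the principle Heisenberg derived is usually written as a simple inequality:

\[
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
\]

where Δx is the uncertainty in a particle’s position, Δp the uncertainty in its momentum, and ħ the reduced Planck constant: the more precisely one is pinned down, the more the other must blur.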

Coincidentally, in the same year Heisenberg introduced uncertainty, Heidegger published Being and Time and confessed: ‘Philosophy constantly remains in the perilous neighborhood of supreme uncertainty’. Later, the philosopher, renowned for his obscurity, embraced mystery, writing that ‘Making itself intelligible is suicide for philosophy’. Cynics, sceptics, and fallibilists had said much the same thing centuries before due to the subjectivity of metaphysics and its ambiguous words and concepts. 

Striving for precision and clarity beyond words, scientists invented a new language: mathematics. Using geometry for positions in space, and calculus for movements in time, it seemed an intellectual panacea. ‘Number rules all!’ proclaimed Pythagoras. ‘Mathematics is the language with which God wrote the universe’, seconded Galileo, going so far as to recommend, ‘Measure what is measurable, and make measurable what is not so’. 

But in a universe of numerical quantities and intangible qualities, math can measure and organise by equation only the first, while ignoring the second. True, numbers can represent the degree of a quality – say, pain or spiciness on a 1-to-10 scale – but they can’t reveal the subjective experience of a specific kind of pain or taste, much less of self-consciousness. ‘Laws of numbers assume there are identical things, but in fact nothing is identical with anything else’, asserted Nietzsche, adding that logic itself is far too abstract, arbitrary, and simple to handle the complicated real world of quality or qualia – the unique nature of a thing which philosophers call ‘quiddity’ and Buddhists ‘suchness’.

Charles Darwin, himself an encyclopaedist of the rich variety of nature with its countless evolving species, once observed: ‘A mathematician is a blind man in a dark room looking for a black cat which isn't there’. By contrast, early in his career, Bertrand Russell called math the ‘chief source of the belief in exact truth’. But later, perhaps overwhelmed by surreal, irrational, and imaginary numbers, not to mention the gadflies of infinity and zero, he began to question this exactness. Math, he concluded, ‘may be defined as a subject in which we never know what we are talking about, nor whether what we are saying is true’. Especially where self is concerned: for mathematicians the symbol i represents the imaginary unit, or square root of minus one. 

The contemporary philosopher Daniel Dennett calls consciousness ‘the last surviving mystery… confusing to most sophisticated thinkers’. In his book, ambitiously entitled Consciousness Explained (1991), he defines it as the sum of physical brain activities and calculations. He dismisses subjective qualia as ‘brain pranksters’ and concludes that humans are soulless computing machines no different from ‘complex zombies’ or AI robots. Challenging the notion, the Australian philosopher, David Chalmers, has argued that materialists, in their quest for certainty, ignore the ontological elephant in the room: the ‘hard problem’ of qualia consciousness – ‘what it is like to be a human’, or, more precisely, a unique self. But, since studying self leads to the slippery slope of subjectivity and solipsism, preempting objectivity, consciousness materialists avoid it.

In his Toward a Neurobiological Theory of Consciousness (1990), another consciousness expert, Christof Koch – with his co-author, Francis Crick (the recipient of a Nobel prize, with James Watson, for discovering the structure of DNA) – argued that awareness can be reduced to ‘a pack of neurons’. Later, Koch, a lapsed Catholic, challenged Chalmers: ‘Why don’t you just say that when you have a brain the Holy Ghost comes down and makes you conscious?’ Then he bet his rival that, within 25 years’ time, science would solve the mystery of consciousness by identifying all its neural correlates. In 2023, he graciously conceded defeat.

Since his collaboration with Crick (whom some called the ‘Mephistopheles of Materialism’), Koch had come to support the most synthetic of the four leading math-based neurological theories of consciousness: Integrated Information Theory – or IIT. Developed in 2004 by the Italian psychiatrist and sleep expert, Giulio Tononi, the theory combines materialist and immaterialist elements to conclude that consciousness inheres not just in brains but in all matter in the universe. Today, neuroscientists are equally divided on IIT: supporters call the theory ‘promising’, while detractors contemptuously dismiss it as pseudoscience due to its untestability. But, as is well known, in the history of science experiment has often lagged far behind theory, especially theory born exclusively of higher math.

The root problem becomes clear: in viewing the human mind as no more than a computing physical brain, the strict materialist gets lost in the chips, circuitry, and motherboard, while overlooking the essential thing: the immaterial energy that animates and connects all component parts in synergy. Materialists seem to regard this energy as a product of magic meat in the skull, rather than the other way around. Thus, for physicalists, consciousness-brain-self are inseparable, so death – unconsciousness – is an absolute, and any idea of a numinous afterlife, much less a cosmic consciousness independent of the physical self, becomes occult nonsense.

And so, today, in large part due to the Immaterialist/Materialist divide, philosophy and science are opposed fields. Until Newton, though, there was no such distinction – every philosopher was also a scientist, and the ambition of each was to figure out as much as possible from every perspective. Arguably, the most ambitious, such as the omnivorous Aristotle, strove to understand everything material and immaterial, though they had no illusions about the difficulty. They hoped to arrive at what scientists today call The Theory of Everything. TOE for short. For physicists, a TOE would unify the four recognized forces – gravity, electromagnetism, weak and strong nuclear – proving each to be a different expression of one master force. But an all-encompassing TOE would have to connect every branch of knowledge and even solve the mystery of consciousness.

Einstein, who strove to ‘read the mind of God’, was among the first modern TOE aspirants, and he did not neglect the vehicle of human understanding itself: consciousness. ‘No problem can be solved from the same level of consciousness that created it’, he said – a truism still ignored by many. A student of philosophy, he went on: ‘Reality is merely an illusion, albeit a very persistent one’. As for the strict mechanical approach to physics’ mysteries, the father of relativity, an accomplished violinist, added, ‘What’s the use of describing a Beethoven symphony in terms of air-pressure waves?’

Einstein claimed that the only thing ‘incomprehensible’ about the universe was that it was comprehensible, and he held ‘wonder’ in the highest regard, calling it the ‘most beautiful thing’ and the source of all true science. Since all systems of knowledge depend on open-mindedness to every possibility, inflexible certainty has always been the enemy of progress.

Again, as other physicists and metaphysicians have suggested, self-based ‘reality’ is not truly real but, for many reasons, illusory. Matter – the body – is not solid, but compressed energy constantly changing form, while the space/time foundation of matter is a human invention for cognitive order in a chaotic, kinetic cosmos. Even with it, scientific objectivity is preempted by uncertainty and the observer’s altering effect on the observed. Moreover, abandoning imprecise words and concepts for numeric language – math – has not helped neuroscientists resolve these issues, since applying quantitative measurement and equation to a qualitative, subjective self-consciousness is like trying to force a square peg into a round hole. So, the only solution to the problem is to regard the idea of matter as a product of consciousness, not the other way around. Moreover, if consciousness is regarded as a unifying energy, then, according to the law of the Conservation of Energy, it never dies but reifies itself in ever-changing forms.

As we have seen, self and consciousness are born together and act in concert: consciousness imparts to the self its idea of materiality in space/time, while its mortal envelope – self – imparts to consciousness its focus, will and purpose. And so, to bring our discussion to a conclusion, it seems that if a theory of everything is indeed possible, progress will only be made once materialist scientists and immaterialist philosophers set aside their biases and instead begin to collaborate in a marvellous symbiosis. What that will look like we do not yet really know.



David Comfort’s essays appear in Pleiades, Montreal Review, Evergreen Review, Pennsylvania Literary Journal, Stanford Arts Review, Johns Hopkins' Dr. T.J. Eckleburg Review, Juked and Free Inquiry. He is also the author of The Rock & Roll Book of the Dead (Citadel/Kensington), The Insider’s Guide to Publishing (Writer’s Digest Books), and three other popular nonfiction titles from Simon & Schuster.  

David can be contacted at dbeco@comcast.net.

Monday 21 August 2023

REVIEW ARTICLE: Quantum Mechanics and the Rigor of Angles

 From The Philosopher CXI No. 1 Spring 2023

A young, fresh-faced Werner Heisenberg

Borges, Heisenberg, Kant, and the Ultimate Nature of Reality


William Egginton’s quest to make sense of life, the universe and everything is ambitious but ultimately unsuccessful. Unsuccessful? Yes, I know that sounds harsh, but then Egginton seeks not only to make sense of the mysteries of quantum physics, something the physicists abjectly fail to do, but to finally pin down the essential secrets of reality – something the philosophers likewise have made a fist of over the centuries. 

Part of the reason Egginton himself makes little progress, though, is that he doesn’t see either group as having failed. Rather he sees his role more as a cultural critic, picking out the best bits from the more mediocre.

For Egginton, there is essentially one key issue: whether reality exists ‘out there’, fixed and eternal, or whether it is rather a shifting miasma, a theatre performance in which the actors (say atoms) subtly respond to their audience (you and me and the scientist with a particle detector over there in the corner). Plato, we may summarise, largely emphasises the former view – although he certainly acknowledged the paradoxes it brought with it. Indeed, he suggests in some of his writing that reality is best approached through poetry and the imagination rather than through practical experiments. But Egginton is no great fan of Plato; instead he eulogises Immanuel Kant, whom he often prefaces with the adjective ‘great’.

Actually, many of the traditional pantheon of philosophers are introduced like this: there’s “the great John Stuart Mill”, “the great French thinker” René Descartes, and Hegel, “the greatest German thinker of the nineteenth century”. All of them, though, sit slightly beneath that “great font of German wisdom”, Immanuel Kant. Kant, you see, intuited that the world scientists observe is not entirely independent of their gaze. It is, instead, the product of the way they look at it, coloured by the perceptual spectacles they wear.

It is a good point, but one that could equally well have been attributed to Plato, or Zeno – let alone the “gloomy Scot”, David Hume, author of “that great book, The Treatise on Human Nature”. The danger with this kind of praise for the philosophers is not so much that it is grating (ahem, “greating”), but that it is uncritical. You see, it is important to remember that Kant actually had many views and clearly some of his theories were just plain daft. Famously he thought that all the planets in the solar system had life on them, with their intelligence related to their distance from the sun.

Indeed, in the Critique of Pure Reason, the “famed” Critique of Pure Reason, he occupies himself with the “inhabitants of the planets”, a happy speculation that is, of course, completely groundless. The point is, Kant’s writings should not be consumed uncritically – and while Egginton provides a rather fine overview of the philosopher’s oeuvre, it is flawed by the apparent assumption of the brilliance of all Kant’s words. And Kant is a big part of the book, as the subtitle plainly indicates.

The same issue, with bells on, concerns Jorge Luis Borges. Why should this writer, excellent fabulist as he certainly was, be taken as a guide to quantum mechanics? It’s on the face of it implausible. Especially as no one actually understands quantum physics. That’s not just me sniping. Egginton himself acknowledges the words of the physicist and Nobel laureate Richard Feynman who once wrote: “I think I can safely say that nobody really understands quantum mechanics”. To read Borges as a guide to QM is a bit like reading Winnie the Pooh as a guide to Taoist philosophy, as Benjamin Hoff did in The Tao of Pooh. Only Hoff’s book was a joke!

Mind you, I was recently a speaker on a panel discussing “the nature of the universe”, alongside two quantum physicists, and they insisted that they did understand it. The problem was simply (they said) that average Joes lack the intuitive understanding of the beautiful and complex mathematics underlying the subject. You know, things like the extra dimensions quantum theory works daily with. How many dimensions are there according to quantum physics, you might ask? Well, ten – a mere ten, we might say – dimensions are used to describe superstring theory, while eleven dimensions can describe supergravity and M-theory. But as Wikipedia helpfully explains, “the state-space of quantum mechanics is an infinite-dimensional function space”.

The theoretical physicist Roger Penrose has queried the logical coherence of such airy mathematical talk of multiple dimensions, yet as I say, many “experts” insist that it all makes perfect sense, albeit being hard to explain without complex mathematics. At least Egginton doesn’t go down that rabbit hole. There is next to no maths in this book, even though his third major character, Werner Heisenberg, made his contributions in just this “toe-curling” area. As Egginton puts it: “The uncertainty principle, as it came to be known, showed with inescapable, mathematical precision that … full knowledge of the present moment wasn’t just hard to pin down; it was actually impossible.”

Which point explains why, to paraphrase Borges, the rules that govern the world are the man-made, artificial ones of chess, not the heavenly ones of the angels. So let’s give the last word to Egginton, who has produced an account that is always highly original, often insightful and only, in places, rather difficult and obscure. 

“There is rigor there, indeed. But to see that we are the chess masters who made it, we must let the angels go. And that, it seems, is the hardest task of all.”


Reviewed by Martin Cohen



The Rigor of Angels: Borges, Heisenberg, Kant, and the Ultimate Nature of Reality

By William Egginton 

Pantheon, New York, 2023

Friday 28 April 2023

REVIEW ARTICLE The Experience Machine

 From The Philosopher CXI No. 1 Spring 2023


In this optical illusion, the two orange circles are exactly the same size; however, the one on the right appears larger


REVIEW ARTICLE

The Experience Machine: How Our Minds Predict and Shape Reality


Andy Clark is a professor of ‘cognitive science’ at Sussex University, and in the book he talks briefly about how he started there when the department was entirely novel. Coincidentally, I also remember this excitement, as I studied at the same university around this time, and was offered a choice of modules. The choice was social science, Marxism, or this new thing, ‘cognitive science’. I took the first option and my career has never recovered. Cognitive science, on the other hand, has become highly fashionable. But what is it exactly? I was suspicious then that the subject was really an uncomfortable blend of computing and biology – the study of the mechanisms of the brain.

The thing is, I don’t think the human mind is a computer – far less that you can work out how it operates by studying electrical signals in the brain circuits. 

Clark says that his approach “challenges a once traditional picture of perception, the idea that light, sound, touch and chemical odors activate receptors in eyes, ears, nose and skin, progressively being refined into a richer picture of the wider world”. The new idea, the “new science” as he puts it, “flips that traditional story on its head”. Perceptions are “heavily shaped from the other direction, as predictions formed deep in the brain… alter responses”.

Yet, having offered this radical reversal, Clark brings back the ‘outside-in’ approach by allowing that sense perception “helps correct errors in prediction”. What does ‘error’ mean here? That there is a real world out there that the senses perceive accurately? It seems to be an uncomfortable attempt to ride two horses at once.

“Predictions and prediction errors are increasingly recognised as the core currency of the human brain, and it is in their shifting balances that all human experience takes shape”, adds Clark, undeterred. The brain is, however, in the driving seat, “painting the picture”, with sensory perception “mostly to nudge the brushstrokes”. Switching to computer language, he explains:

“Instead of constantly expending large amounts of energy on processing incoming sensory signals, the bulk of what the brain does is learn and maintain a kind of model of body and world – a model that can then be used, moment by moment, to try to predict the sensory signal.”

As Clark mentions, we come across this usually hidden effect when we look at optical illusions, like the one where two figures are the same height yet one is made to seem much larger by virtue of tricks with the background. Clark goes so far as to say that what we really see are “controlled hallucinations”.

Talking of which, the placebo effect is discussed and Clark notes how studies have found not only that people suffering from back pain benefit both from pills which contain active ingredients and those that don’t, but that this effect survives even when the patients are told the pill has no active ingredients. (Not that Clark goes there, but this certainly points to a possible justification for the infinitesimal treatments of the homeopaths.)

What he does say, however, is that anything which boosts confidence in an intervention will enhance its prospects of success – but here I think he misses the difference between conscious cues and subconscious ones. Odd lacuna? But then he actually argues that “predictive brains” involve the “active construction” of human experience.

A lot of the claims here are offered flat, as “science says”, yet surely deserve some scepticism. Research, for example, claiming that, when shown religious images, “religious subjects rated a sharp pain as less intense than atheists shown the same images”. Or that listing the risks of side effects on medical treatments can “actually bring about the side effects they describe”. Let alone that dentists telling patients that the injection will only be a tiny pin prick reduces the pain experienced. On the contrary! Those words of reassurance signify to many of us that a very nasty pain is about to follow! Okay, maybe my point looks a bit like a joke, but actually, one big concern I have with this account is how it removes the complexity of human thought processes. But that’s cognitive science!

Common-sense notions of causality are also reversed in the phenomenon noted by the German philosopher, Hermann Lotze and, later, William James too, that actions come about because we mentally represent the completed effects of the action. Clark gives the example of pulling the strings on a puppet. We are interested in (say) the puppet waving its hand – not in the details of how the string moves which bit of the puppet in what may be a complicated sequence. 

Likewise, it seems that when we have a drink of water to assuage thirst, we get immediate satisfaction of the thirst, even though the water has not had enough time to have had any physical effect.

More ominously, things like a police officer’s elevated heart rate when investigating a possible threat can be taken by the “predictive brain” as themselves evidence that there really is a threat. (We’ve seen too many cases of such things in America in recent years, with police shooting householders or motorists out of misplaced conviction of a threat.)

Having put the brain in control of our environment, Clark then backtracks and offers examples of how our environment can be adapted to help our brain. Alzheimer’s sufferers, for example, he says, may arrange their homes with lots of visual clues, from written notes to arranged objects, to “take over the functions that were previously played by their biological brains”. A biological part of the brain is replaced by external, physical substitutes. Clark suggests we are all increasingly doing this – relying on calculators to do our maths, on search engines to remember things. “Most of our daily devices, especially our smartphones and other wearables, are already starting to act as woven resources. They are devices whose constant functionality has become factored deep into our brains’ ongoing assumptions of what kinds of operations can be performed and when.”

Actually, this is an idea Clark set out earlier in a paper co-authored with David Chalmers, called ‘The Extended Mind’. Clark mocks the “chauvinism” of those who say such devices cannot be considered part of our ‘minds’, as they are outside our heads. Yet he does not seem to have considered that all our thinking might be better understood as social, particularly given that so much of it is framed in words and concepts that are produced socially and made concrete in human languages.

Towards the end of the book, which is a reasonable place to do it, Clark sums up his theory: “To perceive is to find the predictions that best fit the sensory evidence”. This rather underlines how little philosophy there is in the book. “The sensory evidence” seems to be still there, just as John Locke and the other philosophers supposed centuries ago, steadily being processed by humans. The only new thing is that at a certain level of the conscious mind, the perceptions are being reorganised, largely in line with expectations based on previous experience. Clark declares this is big progress, writing that “understanding the way predictions of many kinds alter and adjust the shape of experience is our biggest clue yet to what experience really is”. But if that’s the takeaway from the book, it’s rather meagre. Plato wrote about perceptual illusions and what they told us about perception, two thousand odd years ago. Cognitive science, it seems, is a new name for a very old study.


Reviewed by Martin Cohen




The Experience Machine: How Our Minds Predict and Shape Reality

By Andy Clark

Pantheon (Penguin Random House), 2023. ISBN 9781524748456



Thursday 16 March 2023

The Alchemy of Political Discourse: A Mix of Facts, Beliefs, Reality, and Uncertainty

The strange philosophical liaison of Martin Heidegger and his student Hannah Arendt mixed two very different worldviews

The Alchemy of Political Discourse:

A mix of facts, beliefs, reality, and uncertainty


By Keith Tidman

Disinformation, ideological branding, and uncertainty abound in political arenas across the globe, leading to confusion, distrust, rifts, and violent partisanship. The shrillness becomes contagious, as people often find it easier to quarrel than to agree on common ground – to arouse ire and fuel division rather than to facilitate the rational exchange of ideas. Yet, there’s a glimmer of hope, if we recall the words of French essayist Marcel Proust, who in 1919 pointed out in In Search of Lost Time that ‘A powerful idea communicates some of its strength to him who challenges it’. Given the frequently toxic clamour on today’s social media and even mainstream platforms, Proust’s observation appeals.

Yet, as we fold together considerations of epistemology and this insight, what facts and beliefs can we entrust ourselves to as we make political choices? Whether in picking our governments or in reacting to global, national, and local controversies? And with what degree of certainty or uncertainty should we take the information we get fed, or that we come across, in our quest to unveil reality?

Today, the various social-media formats and legacy outlets of news and doctrinaire opinion compete to dominate the public stage, where facts and certainty, let alone attempts to illuminate complex issues, may instead add to the confusion while competing for eyes and ears. In the past, truth-seeking colossi like Confucius, Buddha, Plato, and Socrates taught that wisdom entails acknowledging what we don’t know — or at least, what we think we know. Today, in contrast, much of the internet and other sources of news and opinion belong to predictable pundits and influencers, with devotees who may not be the most critical.

Facts are of course indispensable to describing the reality of what’s going on in society and politics, and to judging the need for change. Despite the sometimes-uncertain provenance and pertinence of facts, they serve as essential tinder to fuel predictions, decisions, choices, and change, especially if based on mutually agreed-upon reference points. The expectation is that facts, in a bed of civil communication, translate to the meeting of minds and civil communities. Yet, there is no single way to define facts and assess truth. Instead, the web of relationships that bear on the truth or untruth of ideas enables knowledge and meaning to emerge — creating an understanding, however incomplete, fragmented, and test-worn, of the mosaic that our minds stitch together.

Today, discourse in the public square tramples these considerations; and who gets to decide on reality is disputed, as camps stake out ownership of the politically littered battleground. A better starting place echoes through history, with the 2nd-century Greek Stoic philosopher Epictetus, who counseled: ‘First learn the meaning of what you say, and then speak’. The combustibility and tumultuousness of political exchanges make it all the harder, however, for many to heed this sage advice. Disappointingly, instead, acerbic disputants amplify their differences rather than soberly seeking to bridge divides. 

To these points, in 1620 Francis Bacon wrote in The New Organon and Related Writings:
‘The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there is a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusion may remain inviolate’. 
His point resonates every bit as much today. Put another way, facts get shaded by politically motivated organised groups and individuals armed with high-decibel bullhorns, reflecting political and societal norms chiseled to match the dominant culture.

Consider the case, for example, of migrants arriving in the United States, whose families spread out across the fifty states, usually to escape persecution, violence, and deprivation in their home countries and to seek out a better life for their families. The toxic political atmosphere has been heavy with stridently opposing points of view on the subject. One camp contends that porous borders, resulting from what it argues are lax policies, existentially threaten the fabric of the nation. A ‘fortress America’ is thought to be the solution, involving literal and figurative walls. Another camp contends that, although regulatory measures may well be in order to better manage the influx, migrants add to the nation’s economy and promote notions of multicultural richness. Besides, this camp says, there’s a profound humanitarian aspect to these migrants’ desperation, and that perhaps racism is really at the core of objections.

The heated jostling between the two camps enlarges the existing political gap, widened all the more by increasingly radicalised positions and contests for supremacy in influencing national and local policy. The voices of the pundits, reverberating around the internet and elsewhere, can be thunderous from all parties. The resulting confusion about terms, whether on the matter of migrants or other tense issues, opens the door to a politics of deception, where political gain, harm, or mere puffery feed the many ways that ‘false facts’ — defined variously by each camp, with some deliberately and deeply seeded in the messaging — can be used to mark political ground. As well as to tip the scales in favour of choices based on competing sets of values and norms. Here, charges of brainwashing are cavalierly tossed around by each camp amid the prickly rhetorical parrying.

It’s all a long way from the world described by Immanuel Kant in which ‘All our knowledge begins with the senses, proceeds to the understanding’, and only then ‘ends with reason. There is nothing higher than reason, for working up the material of intuition and comprehending it under the highest unity of thought’ (The Critique of Pure Reason, 1781). He struck a sensible note. But today, in our imperfect world, it doesn’t always seem that reason prevails. A thousand factors and mechanisms corrupt the processes of empirical experience, cognitive understanding, and rational examination and decision-making.

Instead of being the anchors of social discourse, today ‘facts’ are used to propagate dissonance, or to deflect attention from self-serving policy proposals, or simply to disadvantage the ‘other’. Facts fuel jaundiced competition over political power and social control. Liberals complain that this ‘othering’ is rooted in systemic, institutional bias and ranges across race, ethnicity, national origin, religion, indigenous group, language, and socioeconomic segmentation. The view is that marginalisation and disenfranchisement result, which might have been the aim all along. One proposed solution is wider programmes of civic education and moral training, seen as allowing space for a liberal multicultural bedrock, pride of identity, and respect for judicial impartiality.

Such programmes underscore that there is wisdom to be had by first listening to others rather than leaping unapologetically and frenziedly into the polemical fray from the get-go. After all, as Friedrich Nietzsche warned — and as, himself, one of philosophy’s great extremists, he should have known: ‘Extreme positions are not succeeded by moderate ones, but by contrary extreme positions’ (The Will to Power, 1901). Certainly, the urge to say something, no matter the substance or manner, in order simply to oppose seems endemic to the world of social media and to the public square broadly.

Unfortunately, in the process of democratising access to information through modern technology and its globalised character, with opinions at everyone’s fingertips, individuals with manipulative purposes may take advantage of those consumers of information who are disinclined or unprepared to dissect and challenge the messaging. That is, what do media narratives really say, who’s formulating what’s presented to readers and listeners, for whose benign or malign purposes, and who’s entrusted with doing the curating and vetting? Both leftwing and rightwing populism roam freely, whipsawing nations between opposite ideological poles, all along stressing the connective tissue of society.

This political and social messaging, coming from multiple opinion platforms, offers tempting grist for manipulation, to coerce people into conformity or, worse, antisocial actions, like attacks upon authority and democratic institutions, as well as upon long-established cultural norms. Such action violates Aristotelian virtue ethics, which assesses people’s reasons for action, favouring the golden mean between two extremes (Eudemian Ethics). As a general principle, manipulation emphasises group-think allegiance — a hallmark of surrendering personal power — over autonomy, free will, and rational reflection. Manipulators may choose to influence circumstances, indirectly prompting fidelity and behaviours by susceptible individuals and groups. These practices are often the target of moral censure, in part for objectifying victims.

The manipulative, influence-peddling practices may themselves, in their own right, be viewed as immoral according to prescribed sets of rules — called deontological censuring, in accordance with Kant’s notions of obligation. Or the moral focus may, instead, be on the outcomes of such practices — called consequentialist censuring, in accordance with English philosopher John Stuart Mill’s classical utilitarianism. As Mill explains, ‘Actions are right in proportion as they tend to promote happiness, wrong as they tend to produce the reverse of happiness’ (Utilitarianism, 1861). The harms and benefits of heavy-handed, manipulative coercion are comparatively weighed against each other in this censuring calculus.

Returning to the influence and leveraging of ‘facts’, today bogus facts loosely dot the communications landscape, steering beliefs, driving confirmation bias, stoking messianic zeal, stirring identity warfare, and invoking consequential outcomes such as ill-informed voting decisions. The public grapples with discerning which politicians really are honest brokers of high moral character and scrupulous intent, or which are instead a fifth column out to deceive and disrupt. Nor can the public readily know the workings of social media’s algorithms, which compete for the inside track, to modulate and moderate the heart of messaging’s content. There’s a battle underway for power, leverage, and control; and most of us are offered but a bit part, resigned to a largely passive role. 

The political philosopher Hannah Arendt — who moved to the United States after the Second World War, but grew up in Nazi Germany and was a student of Martin Heidegger — weighed in on similar issues of societal and political power dynamics. As well as language’s use to frame and convey critical matters of import to citizens. ‘Power is actualized only where word and deed have not parted company … where words are not used to veil intentions but to disclose realities, and deeds are not used to violate and destroy but to establish relations and create new realities’, she writes in The Human Condition, published in 1958. She prefigures a world in which political coalitions that are rhetorically opposed to one another scuffle to hold everyone accountable for good faith in affixing exact words to exact deeds.

Such dynamics underline the wisdom of Confucius many centuries ago, when he wrote: ‘The beginning of wisdom is to call things by their proper name’. And they likewise explain the French philosopher Voltaire’s insistence in his Dictionnaire Philosophique of 1764 on the need for elucidation: ‘Define your terms, you will permit me again to say, or we shall never understand one another’. Eminently sensible, one might propose. For all these political theorists, it is only through such crystal clarity that citizens can avoid the facts claimed as part of political and societal discourse turning into reasons for petty machinations and dispute.

Further, facts may have multiple dimensions: what one knows, how one uses language to describe what is known, how one confirms or falsifies what is known, and finally what meaning and purpose are attributed. Yet, despite such complexity and conditions, we assume that normally the pursuit of certainty gets us ever closer to reality, and we instinctively rally curiosity, imagination, and questioning to the causes of knowledge and understanding.

And yet, curiosity and discovery amidst such uncertainty have historically proven a powerful boon to human advancement, a philosophical viewpoint that physicist Richard Feynman underscored in a lecture at the Galileo Symposium in 1964: ‘People search for certainty. But there is no certainty’. Adding, however, that ‘We absolutely must leave room for doubt or there is no progress and no learning. There is no learning without having to pose a question’. The 14th-century Japanese Buddhist monk Yoshida Kenko prefigured Feynman’s cautionary viewpoint, saying, as a nod to ephemerality, ‘The most precious thing in life is its uncertainty’ — a note undervalued by those perhaps a bit too ready to indignantly take to the political ramparts over transitory matters.

A full understanding of the world infusing our lives thereby remains characterised by ambiguity — constrained within limits whose rawness might not be uncovered even through earnest, critical scrutiny. As to the point about rational investigation, the English philosopher and mathematician Bertrand Russell posed this sobering qualification regarding its role in intellectual and creative achievement: ‘Reason is a harmonising, controlling force rather than a creative one. Even in the most purely logical realms, it is insight that first arrives at what is new’ (Our Knowledge of the External World, 1914).

In an interview in 1960, in answer to a question posed by interviewer Woodrow Wyatt about the ‘practical use of your sort of philosophy to a man who wants to know how to conduct himself’, Russell goes on: 
‘I think nobody should be certain of anything. If you’re certain, you’re certainly wrong because nothing deserves certainty. So, one ought to hold all one’s beliefs with an element of doubt, and one ought to be able to act vigorously in spite of doubt. One has in practical life to act upon probabilities, and what I should look to philosophy to do is to encourage people to act with vigor without complete certainty’.
There is a well-known hazard called the Dunning-Kruger effect that bears on this uncertainty: a cognitive bias, where people self-deceptively overvalue their knowledge and aptitudes, a situation amplified by media narratives that have us in their crosshairs. People therefore often fail to recognise their limited ability to referee truth in what they see and hear. The effect is to distort public debate. All the more reason for intellectual caution in rendering judgement, and all the less reason for high confidence in political cloistering.

One vulnerability of such cloistering around hardened policy positions involves the tendency to draw generalised conclusions about political and social matters. In his 1739 Treatise of Human Nature, the Enlightenment philosopher, David Hume, stressed the risk of such generalisations in our arriving at beliefs: ‘That instances, of which we have had no experience, must resemble those, of which we have had experience, and that the course of nature continues always uniformly the same’. A similar process happens in the world of communications media, where political positions compete head-to-head. There, too, is a lack of absolute certainty about what is thought to be known. Yet, these closely guarded beliefs are really only hypotheses or drafts, subject to amendment. It’s not uncommon, of course, for political beliefs to need to change with the appearance of new facts; however, discovering the need for change proves challenging, given people’s natural resistance.

Meantime, much research into the psychology of uncertainty has shed light on how people respond to it. For the most part, investigations point to the anxiety and insecurity that arise from uncertainty and to the urgent desire to elude or mitigate it. People adopt behavioural measures to handle the disquiet and sense of vulnerability that uncertainty stirs, a disquiet exacerbated by the steady patter of negative news fed to us by social media and legacy sources, covering everything from political radicalism and violence to social collapse, terrorism, conspiracy theories, and other worries. The mind speedily recalibrates the jeopardy and reacts accordingly. One curious, counterintuitive exception to uncertainty’s well-documented effects is worth noting: in a 2016 Scientific American article, two academics point out that while ‘we now know that it is true that certainty can prompt people to act, it is often uncertainty that prompts people to think’.

In everyday life, contending with ardent and bellicose political platforms, people live with uncertainty about truth and reality. All the more so if they misguidedly conclude that political and societal media narratives have little consequence for their own lives; such narratives are ignored at great risk. Doubt runs through what people assume they know, including the pillars of their own core ideological beliefs. All the while, disputants heatedly pillory one another over the soul of the world. The air remains thick with sharply differing and paradoxical ideas, while people defiantly look past each other, spurning resolution.

Georg Wilhelm Friedrich Hegel offered a solution to the dilemma posed by stridently differing points of view of the kind discussed above, suggesting a way for opposing sides to strive for a compromise. His solution makes use of a deceptively simple triad, often rendered as the ‘abstract’, whereby someone lays out a starting position; the ‘negative’, whereby someone refutes it; and the ‘concrete’, which reconciles the other two legs of the triad, forming a compromise that both sides agree is better than either the beginning argument or the counterargument. The triad is best known in the more memorable formula ‘thesis, antithesis, synthesis’, a form that in fact derives from Hegel’s older contemporary, Johann Gottlieb Fichte. How much more pacific and conciliatory discourse would be if today’s disputing political evangelists, incessantly railing over ideology, policy, reality, right and wrong, and power, instead applied Hegel and Fichte’s path toward resolution. 


 

Tuesday 28 February 2023

REVIEW: The Future of Humankind (2023)

From The Philosopher, Volume CXI No. 1 Spring 2023

The Future of Humankind?
In his new book, John Hands tells us not to worry so much…

The Future of Humankind is a snapshot of current thinking about science more than a real attempt at futurology. It is a deftly written book that contains a lot of fascinating facts and information - while also keeping the reader active and thinking. The reader may not agree with a lot of it, but that is as much a virtue as a fault.

As a gloomy Capricorn, I’m not very optimistic about the future and so I am always happy to read a disaster book, and at first glance, the contents list of John Hands’s tale of what awaits humanity promises just that. But a few pages in, talking about space rocks hitting the Earth, the gloomy reader will already be a bit puzzled at what seems to be Hands’s indefatigable spirit of technological optimism. On this wonderfully awful prospect, we’re told that thanks to NASA scientists:

“…we now know that there are no comparably large asteroids (diameter greater than 5km) in orbits that could potentially hit Earth.”

Well, boo to that! But I’m sure the scientists have their reasons. They usually do, and, like Groucho Marx, they also offer to swap them for other ones later if need be. But I recall from reading about Newton that the ability to predict the movement of things like asteroids is not just difficult (lack of observations) but maybe actually impossible in the long run, due to the so-called three-body problem. This is the finding that, given the initial positions and velocities of three point masses, there is no general mathematical formula for their subsequent motion, and the motion itself can be chaotic: a tiny influence can create a sequence of effects - in the manner of the butterfly wing that causes a hurricane in chaos theory.
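For the curious, that sensitivity is easy to see in a toy numerical experiment - a minimal sketch only, with made-up masses, positions, and velocities rather than real astronomical data. The short Python snippet below integrates two copies of the same three-body system that differ by a one-in-a-billion nudge to a single starting coordinate, and prints how far apart the two ‘predictions’ drift:

import math

G = 1.0  # gravitational constant in arbitrary toy units

def accelerations(bodies):
    # Pairwise Newtonian gravity on point masses: bodies = [(m, x, y, vx, vy), ...]
    accs = []
    for i, (mi, xi, yi, _, _) in enumerate(bodies):
        ax = ay = 0.0
        for j, (mj, xj, yj, _, _) in enumerate(bodies):
            if i == j:
                continue
            dx, dy = xj - xi, yj - yi
            r3 = (dx * dx + dy * dy) ** 1.5
            ax += G * mj * dx / r3
            ay += G * mj * dy / r3
        accs.append((ax, ay))
    return accs

def step(bodies, dt):
    # One crude Euler step: good enough for a qualitative sketch, not for real orbits
    accs = accelerations(bodies)
    return [(m, x + vx * dt, y + vy * dt, vx + ax * dt, vy + ay * dt)
            for (m, x, y, vx, vy), (ax, ay) in zip(bodies, accs)]

# Two identical toy systems, except one starting coordinate is nudged by 1e-9
start = [(1.0, 0.0, 0.0, 0.0, -0.5),
         (1.0, 1.0, 0.0, 0.0, 0.5),
         (0.001, 0.0, 1.0, 0.4, 0.0)]
sys_a = list(start)
sys_b = list(start)
sys_b[0] = (1.0, 1e-9, 0.0, 0.0, -0.5)

dt = 0.001
for n in range(1, 20001):
    sys_a, sys_b = step(sys_a, dt), step(sys_b, dt)
    if n % 5000 == 0:
        gap = math.hypot(sys_a[0][1] - sys_b[0][1], sys_a[0][2] - sys_b[0][2])
        print(f"t = {n * dt:5.1f}   separation between the two predictions: {gap:.3e}")

The initially invisible difference grows as the simulation runs, which is all the ‘butterfly wing’ point amounts to: over long enough timescales, tiny unknowns swamp the prediction.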

More mundanely, there are certainly plenty of sources saying that all the significant asteroids have now been tidily registered and their movements calculated – but almost every year a previously unidentified rock nearly hits the Earth, so I would have thought a little more scepticism is warranted.

A similarly optimistic note is struck a little later on, now in relation to the novel mRNA treatments for the coronavirus, when we’re told:

“What is significant is how effective most of these vaccines have been. Two injections of the Pfizer–BioNTech vaccine spaced 21 days apart proved 95 % effective at preventing Covid-19 in those without prior infection and 100 % effective at preventing severe disease.”

I googled this and it seems that even the rather pro-vax medical journal, The Lancet, reported that while this vaccine’s effectiveness against the Delta variant was 93% after the first month - after the first month, mind! - it declined to just 53% after four months. Against other coronavirus variants, efficacy declined to 67%.

Likewise, a New York State Department of Health study at the end of 2021 found that the effectiveness of Pfizer’s vaccine against Covid infection plummeted from 68% to a ridiculous 12% for young children during the Omicron surge from December 13 through January 24. Protection against hospitalization dropped from 100% to 48% during the same period.

As I say, Hands just seems to be an optimistic soul. Take comets, notoriously mysterious. These, he allows, constitute unknowns, but he says they are behind “less than 1% of all impact events in Earth’s recent geological record” and none at all in historical times, so “it is safe to conclude that impacts from asteroids or comets pose no existential threat to humans”. I don’t quite get where the “so” comes from here. One is reminded of Bertrand Russell’s unfortunate chicken, which is used to receiving a handful of grain every morning from the farmer’s wife - for as long as the chicken can remember - until one fateful morning, the wife wrings its neck.

Okay, what about the risk of nuclear war? At the time of writing this review, Russian television is full of pundits threatening to wipe out Britain and America with nuclear-tipped hypersonic missiles. The risk of nuclear war seems very real. Yet here too, Hands is optimistic. He says that treaties plus international opinion mean the risk of Armageddon is steadily diminishing.

Talking of nuclear power, I have to take particular umbrage at the account of the risk from nuclear reactors. Hands uses UN sources to reassure us that hardly anyone died after either the Chernobyl or the Fukushima partial meltdowns. Yet I looked in detail at these for my own book on nuclear energy, and I found that the UN account was woefully skewed towards defending the “peaceful use” of nuclear energy, while neutral reports found convincing evidence of a huge toll, particularly after Chernobyl - a toll solidly recorded in hospital records, as well as a more speculative but potentially very significant toll worldwide due to things like plutonium particles entering ocean food chains. Now, neither of these nuclear disasters actually qualifies as apocalyptic, but Hands neglects how very close - a matter of hours - both plants came to much greater explosions, which it is generally agreed could have caused global radiation poisoning.

I am more sympathetic to the conclusion of a long chapter on the dangers from population explosion, climate change, or a combination of the two. The short story is that, again, Hands is optimistic, concluding there is a negligible probability that either will result in the extinction of the human species. Okay, on this I agree! But again, is this really demonstrated, or is it rather a rosy assessment based on cherry-picked data?

A more unusual doomsday topic is the supposed threat of humans being replaced by robots. It’s a good account, this, but again, the rosy assurance at the end that we really do not need to fear that “artificially intelligent machines built by humans will exceed human-level intelligence and thereafter bring about the extinction of the human species” seems to go beyond the evidence, not least because surely we do not know at this point what the capabilities of machines will be within a relatively short timescale.

Perhaps the more significant part of the book, certainly the part I like best, is entitled “Reflections and conclusions”. It is here that Hands details his working methods and describes what he found when he attempted to discuss the doomsday scenarios with the relevant “experts”. As philosophers of science like Thomas Kuhn and Paul Feyerabend could have predicted, he seemed to find that within each field “most” of the experts cohered around one opinion, but there were invariably one or two outsiders with dissident views. The message from philosophy of science, though, is that this is not because most people are right and one or two are laggards. It is because scientific debates are rooted in dominant ‘paradigms’, and acceptance and career success in a field require researchers to conform. Yet the point about these paradigms is that they change. To the point: identifying the firm opinions of the majority of scientists is not the route to certainty it pretends to be. Next year, the majority of scientists may think something different. That is how science works. 

At this point, I might mention that there was another book in this general area a few years ago, Why Science is Wrong... About Almost Everything, by entrepreneur Alex Tsakiris, which noted that the great majority of material in textbooks from the previous generation, written with such confidence then, is now equally confidently considered to be plain wrong. So we should be very sceptical about the value of surveys of scientific opinion. To be fair, Hands does himself describe, towards the end of the book, some of the strange cases of erroneous scientific predictions - for example, that “heavier than air” machines could not fly - but this is not the message of the bulk of the book.

Instead, there is not enough sense here of this need for caution about the pronouncements of ‘experts’. This contrasts with the caution about political claims and pressure groups. For example, Hands cites the case of XR (Extinction Rebellion) cofounder Gail Bradbrook, who was quoted in October 2019 as saying that 97% of the world’s species, including humans, would perish within her daughter’s lifetime unless everyone on the planet stopped producing CO2 by 2025, as evidence of the dangers of allowing your opinions to drive your analysis. Yet the same danger seems to have shaped this book, even if the opinions here are backed by respectable authorities. 

In Part Two of this look into the future of the human species, Hands again takes a brief Cook’s Tour of the current theories, such as ‘colonizing space’ or ‘using technology to extend the healthspans of individuals’, but here the optimism gives way to a more sceptical approach. Indeed, he eventually concludes that few of these things stand much chance of ever coming to be.

Some ideas do seem rather wild, flying in the face of our current understanding of physical laws – such as the limit set by the speed of light. But then, in a closing chapter, Hands allows himself to step out of the straitjacket of what we know to speculate that “the next stage of human evolution” could be a new kind of consciousness. He writes:

“I speculate that, in its fourth stage of evolution, the human cosmic consciousness will be able to comprehend such a higher reality. Furthermore, the human cosmic consciousness may well constitute that higher reality and be the cause of all the physical and chemical laws and parameters that enable it to evolve in an eternal, continuous cycle of self-creation. That is, it forms a cosmic consciousness that underlies everything and from which everything unfolds.”

It’s a nice idea, and it is the more intriguing for coming as part of an otherwise, as I say, determinedly “scientific” account. But to me, it seems to be driven more by optimism than by anything as mundane as the evidence.


Reviewed by Martin Cohen


THE FUTURE OF HUMANKIND: Why We Should Be Optimistic
By John Hands 
Castleton, 2023
ISBN 978-0993371943

 





Kryptonite of the Übermensch

From The Philosopher, Volume CXI No. 1 Spring 2023
 

Kryptonite of the Übermensch

The Purgatory of Soulmates Schopenhauer and Nietzsche 

By David Comfort



When Friedrich Nietzsche stumbled on a copy of Arthur Schopenhauer’s The World as Will and Representation, at the tender age of twenty-one, he was electrified. “Here every line shouted renunciation, negation,” the precocious philology student at the University of Leipzig later wrote. “Here I saw a mirror in which I spied [my] own mind in frightful grandeur.” 

But then, in many ways, Nietzsche and Schopenhauer, though separated by fifty years, were brothers. Both German iconoclasts were classical scholars, anti-establishment polemicists, depressives, bachelors, misogynists, syphilitics, self-exiles, animal lovers, skilled musicians (Schopenhauer flute, Nietzsche piano) and lovers of melodramatic opera to the point of weeping.

Most importantly, both believed that two thousand years of metaphysics had to be recast by shifting focus from divinity and empty abstraction, towards authentic living human experience and motivation, no matter how lowly or perverse it might seem. Accordingly, they were among the first philosophers who thought and wrote passionately – like human beings, not logic-driven automata. Their writings went on to inspire both the Existentialists with their “Existence-before-Essence” mantra and Freud’s science of psychoanalysis with its tripartite Ego-Id-Superego distinction.

In fact, while describing egotism as the fundamental incentive in all life, Schopenhauer wrote in On the Basis of Morality:
“A man prefers the entire world’s destruction sooner than his own… and is capable of slaying another, merely to smear his boots with the victim’s fat.” 
In his own case, he insisted that he had checked his ego at the door for the good of humanity. As a young man he had tried to fast-track his career by ingratiating himself with the then ascendant star of Germany, Goethe, a member of Johanna Schopenhauer’s – his famous mother’s – literary salon. He critiqued and provided an addendum to Goethe’s highly acclaimed colour theory, informing “His Excellency,” as he addressed him, that he was adding the ‘Apex’ to the genius’s ‘Pyramid’. When Goethe responded with faint praise, the young Schopenhauer obstinately declared, “Posterity will erect a monument to me!” 

The prodigy’s next career move was to throw shade on the esteemed philosopher Georg Hegel, whom he called “Monsieur Know-Nothing,” a “charlatan,” a “swaggerer” and (most famously) “a cuttlefish that creates a cloud of obscurity around itself.” On his appointment to the University of Berlin, where Hegel was professor, Schopenhauer scheduled his lectures at the same time as the great man’s and was furious when only a few students showed up. Humiliated, he left Berlin but, shortly after his departure, welcomed the news that cholera had claimed Hegel.

Hoping to discredit, if not entirely replace, Hegel’s abstruse theocentric system, Schopenhauer, after much difficulty, managed to publish his anthropocentric manifesto The World as Will and Representation (later carried into World War One in his knapsack by the young German soldier, Adolf Hitler). Though Schopenhauer was convinced the eight-hundred-page treatise would revolutionise two millennia of philosophical thought, he was criticised for parroting his grad school professors, Johann Gottlieb Fichte and Friedrich Schelling. Outraged, the outlaw insisted not only that his work was completely original but that those of his predecessors were nothing more than “great bloated soap bubbles,” as David Cartwright recalls in his classic biography of the philosopher. Perhaps it was comments like this that led another celebrated megalomaniac, the Austrian Ludwig Wittgenstein (whose own students called him “God”), to say many years later: “Schopenhauer has quite a crude mind ... where real depth starts, his comes to an end.”  

But then, as far as Schopenhauer was concerned, his works were pearls before swine. “Talent hits a target no one else can hit; Genius hits a target no one else can see,” he pompously pointed out in The World as Will and Representation. The result was that soon even his few supporters were calling him a “chained, barking dog.” Responding that he would, in fact, prefer to die if not for the companionship of dogs, Schopenhauer wandered restlessly from city to city with his beloved poodles, Atma and Atma. (Eccentrically, he called all his dogs by the same name, Atma, and nicknamed them all Butz – Atma being the Hindu word for the universal soul from which all individual souls arise.)

Fearing that her son might drown himself as his father had, Frau Schopenhauer sent him urgent letters fretting about his poor health, “dark disposition” and isolation. Ignoring her meddling, and insisting that “to live alone is the fate of all great souls,” Arthur continued to travel solo – his only companions the dogs, a Buddha statue which he called his “crucifix surrogate” and a copy of the Upanishads, the Hindu scriptures that he called “the consolation of my life.” But when critics claimed his work owed much to the holy texts, he boasted that his writings were unprecedented and nothing less than the legendary Philosopher’s Stone itself: the absolute and divine truth sought by alchemists for millennia. Even so, fearing continued rejection and obscurity, he started to think that the highest form of asceticism was voluntary starvation.

Suicide had been a hotly debated topic in the ancient world, as he well knew. Stoics such as Seneca, Epictetus, and Marcus Aurelius considered it an “honorable” option. Socrates had obviously agreed, preferring the hemlock to exile. Plato, however, whom Schopenhauer called “divine,” saw both sides of the matter and straddled the fence. Aristotle condemned suicide as ignoble – a cowardly act. Actually, Schopenhauer considered Kant “marvellous” until his hero seconded Aristotle, declaring that self-destruction violated one’s divinely mandated “duty to live.” But then, Schopenhauer reserved his utmost contempt for theist optimists, arguing that the perverse will-driven life “wasn’t worth living.” Nevertheless, thanks to late, hard-won recognition, he managed to soldier on.

Toward the end of his life, Schopenhauer, now regarding himself as a prophet, insisted his doctrine was inspired by the “spirit of truth and the Holy Ghost.” He had at last gained worshipful ‘apostles,’ eight by his count, with no traitorous Judas among them. Once, when they had to meet without him, he reminded them of the Gospel of Matthew (18:20): “Where two or three gather in my name, there am I with them.” Presumably, he was joking, having once declared, “A sense of humor is the only divine quality of a man.”  

Like Buddha, Schopenhauer said compassion was the source of all virtue. He defined the emotion as feeling another’s pain in one’s own body. “Extreme egoists,” he wrote in Parerga and Paralipomena, “ignore the misery that their unchecked self-interest produces, and malicious persons delight in the wretchedness of others.” Earlier in his life, however, he had shown little vicarious feeling for the pain of his seamstress neighbour, who sued him for assaulting and permanently disabling her after he flew into a rage during an argument on the shared landing of their apartment block; instead, he wrote of his relief when she died and he no longer had to make the court-mandated payments to her. Likewise, when one of his first critics, F.E. Beneke, committed suicide, his gratification seemed to eclipse his compassion. As his biographer David Cartwright wrote, “His ability to hold a grudge was elephantine.” 

As for his love life, the misogynist had for years satisfied his ‘animal’ urges with prostitutes, and likely contracted syphilis. By the time he reached his seventies, he suffered from rheumatism, deafness, and nervous disorders all of which combined to make handwriting almost impossible. Soon, palpitations, shortness of breath, and inflammation of the lungs took him to bed. “I have always hoped to die easily,” he wrote in his diary and so, apparently, he did.

Suffering from a fear of being buried alive – a not uncommon phobia at the time – Schopenhauer had instructed that his body lie in state for days. And though the philosopher remained a vehement opponent of Christianity, in accordance with his dying wish a Lutheran minister presided over his funeral and delivered the eulogy. 

*     * *     * * *     * *     * 

Another Lutheran minister, Carl Nietzsche, died of ‘softening of the brain’ when his precocious son, Friedrich – named after King Friedrich Wilhelm IV of Prussia – was five years old. Schopenhauer was laid to rest eleven years later, while Friedrich, planning to become a pastor like his late father, was attending boarding school in a medieval monastery.

After graduating with honors, Nietzsche abandoned his plans for the ministry while attending the University of Bonn, and was soon perusing The World as Will and Representation. Later, as a young professor of philology at the University of Basel, he led a Schopenhauer chat group which he compared to “the first Christians.” One of them described the “master” as having “a god-like brow that appears to rise to infinity, framed by beautiful white hair under white eyebrows like those of the Olympian Zeus.” Nietzsche himself wrote admiringly of Schopenhauer’s theory of ‘Will,’ his asceticism-to-salvation doctrine, and shared his contempt for Hegel’s “cheap optimism and brain-rotting obscurity.” Later, Martin Heidegger, who wrote three books about Hegel, confessed in Contributions to Philosophy, “Making itself intelligible is suicide for philosophy.”

As with Schopenhauer, Nietzsche’s first career move was to hitch his wagon to a star: in his case, that of Richard Wagner. But, while the young philosopher was courting the favor of Germany’s lionised composer, he fell in unrequited love with Wagner’s wife and muse, Cosima, the illegitimate daughter of Franz Liszt. His falling out with the Wagners was followed by an even more devastating split from a Russian femme fatale by the name of Lou Andreas-Salome. He had hoped the brilliant young beauty would become a worshipful disciple and wife, but, fiercely independent, Salome decided that the supposedly ‘lofty’ philosopher was in fact a “mean egotist... concealing filthy intentions of a wild marriage.” As Curtis Cate reveals in his book entitled simply Friedrich Nietzsche, after she revealed her feelings, Nietzsche – who had formerly praised her as “the most gifted and reflective” of all his acquaintances – described her to a confidante as:
“…a sterile, dirty, evil-smelling she-ape with the false breasts – a calamity!”
Mortally wounded, he wrote to his sister, Elisabeth (who later gave his walking stick to Hitler): 
“I am too proud to believe that any human being could love me: that would pre-suppose he knew who I was. Equally little do I believe that I will ever love someone: that would presuppose that I found – wonder of wonders – a human being of my rank – don’t forget that… I find the founder of Christendom superficial in comparison to myself.”
On the rebound from the Salome calamity, he dashed out the first sections of Thus Spoke Zarathustra in a matter of weeks, describing the manuscript as a “bloodletting” inspired by her rejection, “turning my humiliation into gold.” Though he only printed forty copies and distributed a mere handful to associates, he called his masterpiece “the Fifth Gospel,” a replacement for “dead Christianity” and one of the two or three most important books in human history. Its critical recognition, though belated, “unleashed the floodgates of Nietzsche’s self-infatuation and megalomaniacal fantasies,” noted his biographer, Curtis Cate.

Meanwhile, having written God’s obituary in The Gay Science, Nietzsche declared that a replacement deity was necessary: the Übermensch or Superman. To escape nihilism, he argued that a new morality was also imperative. He laid this out in Zarathustra, which he described as “a new holy book that challenges all existing religions.” In a later work, Ecce Homo (a reference to Pilate’s words presenting the defendant Christ to the crowd), containing such chapters as “Why I am so Wise” and “Why I am so Clever,” he claimed that his gospel was written “by God Himself” and that the authors of the Bible and the Vedas were “unworthy of unlatching my shoes.” (Ironically, Zarathustra was a second coming of the Iranian prophet Zoroaster, an evangelist for absolute good and evil – a doctrine Nietzsche ridiculed in his follow-up title, Beyond Good and Evil.)

The ‘Madman’ hero of The Gay Science (a book also translated as Joyous Wisdom) carried a Diogenes lantern in the morning sunshine and cried: “I am looking for God! Where has God gone? We have killed him - you and I!”

Despite Nietzsche’s claims, like Schopenhauer’s, of complete originality, the Almighty’s decease had been announced a few years earlier in The Philosophy of Redemption (1875) by the poet-philosopher Philipp Mainländer, who had written: “God has died and his death was the life of the world.” Mainländer looked forward not to resurrection, reincarnation, Heaven, Hell, or Purgatory, but to nothingness. For him, “the supreme principle of morality” was that “non-being is better than being”; Schopenhauer’s Will-to-Live was really a Will-to-Die, making death “salvation.” At age thirty-four, Mainländer hanged himself – using a pile of unsold copies of Redemption as a platform. Disgusted that his predecessor hadn’t found the courage of the Greek martyr Sisyphus, Nietzsche called Mainländer a “dilettante” and a “sickeningly sentimental apostle of virginity.” 

Still, a dead God invited questions. Did Nietzsche or Mainländer mean dead to them, or objectively – dead to everybody? And who or what was dead: the Christian Trinity, the Abrahamic One-and-Only, or the idea of divinity generally? Finally, since the definitive characteristic of any god is immortality, how could one be dead, much less a murder victim?

Though Nietzsche provided no solid answers, he had no illusions about the implications of his conviction. If God were indeed dead, then weren’t morality, salvation, and immortality DOA too? Moreover, without divinity, death itself becomes the only inescapable Absolute, rendering life meaningless. Denouncing Kant, Hegel, and other “old maids with theologian’s blood” who perpetuated the “romantic hypochondria” of the Church, Nietzsche insisted that man should bravely press on alone and defiantly. “How shall we comfort ourselves?” he demanded. “Must we not become gods simply to appear worthy of it?”

Indeed. And through his own prophet, Zarathustra, successor to ‘The Madman,’ Nietzsche introduced the Übermensch. Going beyond good and evil, this Superman – disdaining not only the Christian “master/slave morality” but “decadent” compassion too – would become a law unto himself. In so doing, he would avoid hopelessness and nihilism by immersing himself in the dynamic present world of his own life.

In the context of his Will to Power doctrine, the self-declared Superman identified two kinds of egotism: ‘Good’ or ‘holy selfishness’ and ‘Bad’ or ‘the unholy’. Like Schopenhauer, Nietzsche considered his own to be of the first kind. He, too, prided himself on his ascetic life: having no close friends, living anonymously in cheap hotels, and avoiding “loud, shiny” things. To describe the second kind of egotism, he turned Luke 18:14 on its head, replacing “All those who exalt themselves will be humbled” with “All those who humble themselves wish to be exalted” – apparently not considering that this might apply to him.

Both philosophers suffered from chronic depression, a curse which, long before, Aristotle had called the all-too-common fate of insatiable minds. Was the despondency of the duo the inevitable by-product of their unflinching pessimism about the human condition, the other way around, or two sides of the same coin? In any case, Nietzsche called his melancholy “the worst of all penalties that exists on earth.” He feared that his few friends might regard him as a “crazy person driven half mad by solitude” and told one “not to worry too much if I kill myself.” For years he had been plagued by migraines, nausea, and seizures. He was convinced that lightning and thunderstorms triggered the attacks, and that his sanity depended on clear weather. He self-medicated with opium, hashish, cocaine, and chloral hydrate (a potent sedative used in asylums). In a moment of rare levity, he wrote in Twilight of the Idols: “Man does not strive for happiness; only the Englishman does.” Though he may never have experienced such pedestrian felicity, he was fortunate enough to be bipolar: his depression was sometimes swept away by sudden, unprovoked ecstasies especially during his solitary mountain marches under cloudless skies. When the darkness returned, he tried to steel himself: “What does not kill me makes me stronger!”

But, by his own admission, the Superman had been fighting “monsters” – Minotaurs in his cerebral labyrinth – all his life and, in the process, trying not to become one. As even the comparatively upbeat Kant said: “Only the descent into the hell of self-knowledge can pave the way to godliness.”

The problem, by Nietzsche’s own admission, was that “If you gaze long into the abyss, the abyss gazes back into you.” So, even before losing his mind, the abyss-gazer confessed: “The thought of suicide is a great consolation: by means of it one gets through many a dark night.” The story goes that one morning, after seeing a horse being brutally beaten in the street, he collapsed, weeping, and was soon carried to a sanatorium in Basel.

Before being committed, Nietzsche had tried to ballast his ship with his Pythagoras-inspired doctrines of Amor Fati and the ‘Eternal Return’. Like the Buddhists, he regarded time as a timeless circle which always returned to itself. In the face of such inevitability, the challenge of “the great soul” was, he thought, to embrace the process, to love one’s fate and believe everything happens for the best. “As long as I remain attached to my ego, willing eternal return remains beyond my grasp,” he wrote. “Only by becoming myself the eternal joy of becoming, can I guarantee for myself eternal life… Happiness consists in dwelling in the realm that is beyond death.” Ironically, in the days before his collapse, his megalomania was peaking. He wrote to a friend that he was the reincarnation of, among others, Buddha, Dionysus, and Napoleon. “I have also been on the Cross,” he added, signing the letter the “Crucified One.” Having for years endured the slings and arrows of his critics, family and “friends,” had he seen himself in the unmercifully beaten horse?

Nietzsche spent the last ten years of his life insane, at first in asylums. Initially, he introduced himself to visitors as the German emperor and Cosima Wagner’s husband. “There is no name that is treated with such reverence as mine!” he reminded everyone. In a letter signed ‘Nietzsche Caesar,’ he told the playwright August Strindberg (struggling with his own mental crisis at the time) that he had ordered the Emperor’s execution, and would have the Pope thrown in jail for good measure. Later, he became Dionysus, the “dying-and-rising” son of Zeus and Persephone, Queen of the Underworld. Like the god of wine, revelry, and madness, Nietzsche stripped himself naked and spent his days dancing, singing, and howling.

When he calmed down, as Curtis Cate recalls, he repeatedly wept: “I am dead because I am stupid, or I am stupid because I am dead.” Following many small strokes, the author of The Birth of Tragedy reverted to infantilism: according to another biographer, Julian Young, he played with dolls and toys, drank his own urine from his boot, and smeared his cell walls with his own faeces. After he had completely lost his mind, the Übermensch’s last words to his long-suffering caretaker were said to be: “Mutter, ich bin dumm” – “Mother, I am dumb.”

Early on, he had predicted his premature and tragic demise. Spoke Zarathustra: 
“And if one day my cleverness abandons me – ah, how it loves to fly away! – may my pride go on flying with my folly.” 
In the end, the prophet even went on to predict his creator’s martyrdom: “Your soul will be dead even before your body.” 

At his funeral, attended only by a handful of friends, his sister, Elisabeth, followed his strict instructions: “Promise me that when I die … no priest utter falsehoods at my graveside. Let me descend into my tomb as an honest pagan.”

After honoring her brother’s final wish, Elisabeth burst into tears, crying, “Zarathustra is dead!” And his eulogizer, Dr. Ernst Horneffer, beholding the philosopher in his final sleep, declared: “He looks like a dead god. Truly he does!”

*     * *     * * *     * *     * 

Like Schopenhauer and Nietzsche, philosophers throughout history have addressed themselves not to the commoner but to the intellectual aristocracy. Speaking for his colleagues dedicated, like himself, to the “Know Thyself” Delphic mandate, Aristotle said that the “great-souled” man was aware of his own superiority. The first Greek Superman, Pythagoras, had declared himself the son of Apollo, the god of wisdom; and Empedocles, also claiming divine birth, jumped into Mount Etna to prove his immortality.

Many later metaphysicians asserted, explicitly or implicitly, the importance of ego-loss for spiritual development, or at least for avoiding execution by the Church. Ironically, the Buddhist sympathisers Schopenhauer and Nietzsche, evangelists for ego-loss, were both, as we have seen, self-deifiers. Before them, in pursuit of truth and self-knowledge, philosophers had tried to erase their reason-clouding passions. But the anger and alienation of the Übermensches were so irrepressible that their passions drove their thoughts. Nietzsche, who called himself “dynamite,” was dedicated to “living dangerously” as a “warrior” rather than a “saint of knowledge.” So he wrote in “blood” and, when first reading Schopenhauer, recognised a blood brother.

Indeed, Schopenhauer, who argued that “truth is best observed in the nude,” had mocked most philosophies as the emperor’s new clothes. He denounced the Church Scholastics through Leibniz and Hegel as “cloud castle” builders and cerebral automata, utterly out of touch with real men and the real world. It was a matter of pride, if not hubris, that both he and his successor sought solitude, “so as not to drink out of everybody’s cistern,” as Nietzsche put it. So it was no wonder that they aroused their colleagues’ contempt and alarm. Said the original dynamiter, the father of Cynicism, Diogenes, who had plagued Plato: “Of what use is a philosopher who doesn't hurt anybody's feelings?”

Again, the primary target of the Übermensches’ polemics was Christianity. God’s A-team – Saints Augustine, Anselm, and Aquinas – usurped Greek philosophy and turned it into evangelist theology by cherry-picking Plato and Aristotle and postulating flatulent ontological proofs. One of the few theists to risk heresy by criticising the Aquinas Mensa group was Erasmus. “They smother me with dogmas,” complained the Renaissance humanist. “They are surrounded with a bodyguard of definitions, conclusions, corollaries, propositions explicit and propositions implicit; they are looking in utter darkness for that which has no existence whatsoever!” Later, Carl Jung put it even more bluntly in his memoir, calling the Scholastics “more lifeless than a desert… who knew only by hearsay that elephants existed.”  

The two human, all-too-human new humanists more than agreed. Since the pre-Socratics, most thinkers had struggled to identify the essence of life and humanity with lofty concepts while completely ignoring the inner experience and consciousness of the individual man independent of God.

It was no mistake that Schopenhauer called Christianity “a masterpiece of animal training” while Nietzsche, the self-described Anti-Christ, called it “dishonorable,” “cowardly” and “an incredibly cunning form of hypocrisy.” What maddened both most was the papist party line that “God is a necessary being,” parroted even by Kant, whom they otherwise admired. But, for them, this necessity was based not on truth but on an overriding human need for three things. First, for an absolute, eternal Creator, the uncaused cause of and explanation for everything. Second, for an all-knowing, all-good moral ruler. Third, and most importantly, for a deliverer from death and guarantor of immortality.

Ordinary meek mortals may have needed such things, but not the Übermensches! Even the supposedly ‘dogma-free’ thinker Descartes, whose philosophy was based on doubt, in the end confirmed the existence of an omniscient, benevolent God. In trying to identify man’s essence, wasn’t the Frenchman’s famous conclusion, I think therefore I am, really just rationalist window-dressing for a fideist I believe therefore I am?

Unafraid of the Vatican and the academic thought police, Schopenhauer and Nietzsche threw both propositions out the window, declaring I Will therefore I am. For most men, except the most extraordinary, insisted Schopenhauer, “Reason is the slave of [willful] passion,” not the other way around. Outside the ivory tower of the metaphysicians, among real humanity, the argument is hard to challenge. As for thinking, if it is indeed man’s essence, why do so few do it? And, even among the few who do, why do so many wind up espousing self-serving absurdities?

But in asserting the primacy of Will, the Übermensches created a problem for themselves, since neither really believed in Free Will. Both were Determinists. Like most Greeks, Nietzsche believed in the Fates – thus his Amor Fati idea. Schopenhauer sophistically argued for Free Will/Determinist Compatibilism, saying: “Man can do what he wills but he cannot will what he wills.” But the assertion raises the question: if man doesn’t will what he wills, then who or what does?

The answer is obvious for the monotheist. That ‘who’ is the all-powerful, all-knowing, eternal God who had creation all worked out from the beginning. Thus, one would expect all Christians to be Determinists. But, dispensing with consistency in favor of necessity once again, most (save Lutherans and Calvinists) were devout Free-Willers. The reason is simple: in a Determinist creation, man – lacking intent and, hence, true selfhood – is relieved of responsibility for his actions and is no longer a moral agent. Prayer also becomes an exercise in futility. Furthermore, strict Determinism renders the very concepts of human good and evil absurd, effectively invalidating all moral-mandate religions. Worst of all, freedom denial can, in effect, become a slippery slope to fatalism, if not nihilism.

The Übermensches were on that very slope even while denying they were on it. Schopenhauer called salvation the denial of the selfish Will-to-Live in favor of a Will-to-Truth. But how can truth or self-knowledge be gained if it hasn’t already been predetermined? Nietzsche said happiness – salvation – would continue to elude him until he freely chose to love his fate, no matter how purgatorial. But how could he freely choose anything in a God- or Dharma-dominated life?

Rather than address these issues, both philosophers reasserted the necessity of man filling the void left by a dead or imaginary God of convenience. In their case, this led to self-deification, if not megalomaniacal self-idolatry. In the ancient world this was called fatal pride, or hubris, the flaw which became the gravest of the medieval Seven Deadly Sins. The first of the legendary proud was Prometheus, cursed by Zeus for bringing mankind knowledge. Similarly, God had cursed the snake in Eden for peddling the deadly fruit, then driven Adam and Eve from paradise lest they eat of the Tree of Life and become gods “like one of us” (Genesis 3:22).

In a real sense, the Übermensches – especially Nietzsche – were sons of Prometheus, chained to the mountain rock, their livers eaten daily by Zeus’s eagle. Their Christian predecessors had avoided the same fate only by surrendering to the Almighty, each crying to the heavens like Job, “Have mercy, forgive me, I am only a man and your servant!”

But the Supermen, waiting for no rescue by Hercules or a deus ex machina, instead clung defiantly to their Philosopher’s Stone. Steeling themselves, the classical scholars remembered the words of Prometheus, which Hermes, the mediator between gods and men, called “the ravings of a madman”: 
“Let Zeus hurl his blazing bolts with thunder and with earthquake… None of all this will bend My Will!”

 

About the author 

David Comfort’s essays appear in Free Inquiry, The Montreal Review, Pleiades, and Stanford Arts Review, and he is the author of books including The Rock and Roll Book of the Dead: The Fatal Journeys of Rock’s Seven Immortals, a study of the tempestuous lives and tragic ends of Elvis Presley, John Lennon, Jimi Hendrix, Jim Morrison, Janis Joplin, Jerry Garcia, and Kurt Cobain.

Address for correspondence: David Comfort <dbeco@comcast.net>