Thursday 30 November 2023
Self-Consciousness: the Battle between Science and Philosophy
Monday 21 August 2023
REVIEW ARTICLE: Quantum Mechanics and the Rigor of Angles
From The Philosopher CXI No. 1 Spring 2023
A young, fresh-faced Werner Heisenberg
Borges, Heisenberg, Kant, and the Ultimate Nature of Reality
William Egginton’s quest to make sense of life, the universe and everything is ambitious but ultimately unsuccessful. Unsuccessful? Yes, I know that sounds harsh, but then Egginton seeks not only to make sense of the mysteries of quantum physics, something the physicists abjectly fail to do, but finally to pin down the essential secrets of reality – something the philosophers have likewise made a poor fist of over the centuries.
Part of the reason Egginton himself makes little progress is that he doesn’t see either group as having failed. Rather, he sees his role as that of a cultural critic, picking out the best bits from the more mediocre.
For Egginton, there is essentially one key issue: whether reality exists ‘out there’, fixed and eternal, or whether it is rather a shifting miasma, a theatre performance in which the actors (say atoms) subtly respond to their audience (you and me and the scientist with a particle detector over there in the corner). Plato, we may summarise, largely emphasises the former view – although he certainly acknowledged the paradoxes it brought with it. Indeed, he suggests in some of his writing that reality is best approached through poetry and the imagination rather than through practical experiments. But Egginton is no great fan of Plato; instead he eulogises Immanuel Kant, whose name he often prefaces with the adjective ‘great’.
Actually, many of the traditional pantheon of philosophers are introduced like this: there’s “the great John Stuart Mill”, “the great French thinker” René Descartes, and Hegel, “the greatest German thinker of the nineteenth century”. All of them, though, rank slightly beneath that “great font of German wisdom”, Immanuel Kant. Kant, you see, intuited that the world scientists observe is not entirely independent of their gaze. It is, instead, the product of the way they look at it, coloured by the perceptual spectacles they wear.
It is a good point, but one that could equally well have been attributed to Plato, or Zeno – let alone the “gloomy Scot”, David Hume, author of “that great book, The Treatise on Human Nature”. The danger with this kind of praise for the philosophers is not so much that it is grating (ahem, “greating”), but that it is uncritical. You see, it is important to remember that Kant actually held many views, and clearly some of his theories were just plain daft. Famously, he thought that all the planets in the solar system had life on them, with their inhabitants’ intelligence related to their distance from the sun.
Indeed, in the “famed” Critique of Pure Reason, he occupies himself with the “inhabitants of the planets”, a happy speculation that is, of course, completely groundless. The point is, Kant’s writings should not be consumed uncritically – and while Egginton provides a rather fine overview of the philosopher’s oeuvre, it is flawed by the apparent assumption of the brilliance of all Kant’s words. And Kant is a big part of the book, as the subtitle plainly indicates.
The same issue, with bells on, concerns Jorge Luis Borges. Why should this writer, excellent fabulist as he certainly was, be taken as a guide to quantum mechanics? It’s on the face of it implausible. Especially as no one actually understands quantum physics. That’s not just me sniping. Egginton himself acknowledges the words of the physicist and Nobel laureate Richard Feynman, who once wrote: “I think I can safely say that nobody understands quantum mechanics”. To read Borges as a guide to QM is a bit like reading Winnie the Pooh as a guide to Taoist philosophy, as Benjamin Hoff did in The Tao of Pooh. Only Hoff’s book was a joke!
Mind you, I was recently a speaker on a panel discussing “the nature of the universe”, alongside two quantum physicists, and they insisted that they did understand it. The problem was simply (they said) that average Joes lack an intuitive understanding of the beautiful and complex mathematics underlying the subject. You know, things like the extra dimensions quantum theory works with daily. How many dimensions are there, according to quantum physics, you might ask? Well, ten (a mere ten, we might say) dimensions are used to describe superstring theory, while eleven dimensions can describe supergravity and M-theory. But as Wikipedia helpfully explains, “the state-space of quantum mechanics is an infinite-dimensional function space”.
The theoretical physicist Roger Penrose has queried the logical coherence of such airy mathematical talk of multiple dimensions, yet as I say, many “experts” insist that it all makes perfect sense, albeit hard to explain without complex mathematics. At least Egginton doesn’t go down that rabbit hole. There is next to no maths in this book, even though his third major character, Werner Heisenberg, made his contributions in just this “toe-curling” area. As Egginton puts it: “The uncertainty principle, as it came to be known, showed with inescapable, mathematical precision that … full knowledge of the present moment wasn’t just hard to pin down; it was actually impossible.”
Which point explains why, to paraphrase Borges, the rules that govern the world are the man-made, artificial ones of chess, not the heavenly ones of the angels. So let’s give the last word to Egginton, who has produced an account that is always highly original, often insightful and only, in places, rather difficult and obscure.
“There is rigor there, indeed. But to see that we are the chess masters who made it, we must let the angels go. And that, it seems, is the hardest task of all.”
The Rigor of Angels: Borges, Heisenberg, Kant, and the Ultimate Nature of Reality
By William Egginton
Pantheon, New York, 2023
Friday 28 April 2023
REVIEW ARTICLE: The Experience Machine
From The Philosopher CXI No. 1 Spring 2023
In this optical illusion, the two orange circles are exactly the same size; however, the one on the right appears larger
The Experience Machine: How Our Minds Predict and Shape Reality
Andy Clark is a professor of ‘cognitive science’ at Sussex University and talks briefly about how he started there when the department was entirely novel. Coincidentally, I also remember this excitement, as I studied at the same university around that time and was offered a choice of modules. The choice was social science, Marxism, or this new thing, ‘cognitive science’. I took the first option and my career has never recovered. Cognitive science, on the other hand, has become highly fashionable. But what is it exactly? I was suspicious then that the subject was really an uncomfortable blend of computing and biology – the study of the mechanisms of the brain.
The thing is, I don’t think the human mind is a computer – far less that you can work out how it operates by studying electrical signals in the brain circuits.
Clark says that his approach “challenges a once traditional picture of perception, the idea that light, sound, touch and chemical odors activate receptors in eyes, ears, nose and skin, progressively being refined into a richer picture of the wider world”. The new idea, the “new science” as he puts it, “flips that traditional story on its head”. Perceptions are “heavily shaped from the other direction, as predictions formed deep in the brain… alter responses”.
Yet, having offered this radical reversal, Clark brings back the ‘outside-in’ approach by allowing that sense perception “helps correct errors in prediction”. What does ‘error’ mean here? That there is a real world out there that the senses perceive accurately? It seems an uncomfortable attempt to ride two horses at once.
“Predictions and prediction errors are increasingly recognised as the core currency of the human brain, and it is in their shifting balances that all human experience takes shape,” adds Clark, undeterred. The brain is, however, in the driving seat, “painting the picture”, with sensory perception serving “mostly to nudge the brushstrokes”. Switching to computer language, he explains:
“Instead of constantly expending large amounts of energy on processing incoming sensory signals, the bulk of what the brain does is learn and maintain a kind of model of body and world – a model that can then be used, moment by moment, to try to predict the sensory signal.”
As Clark mentions, we come across this usually hidden effect when we look at optical illusions, like the one where two figures are the same height yet one is made to seem much larger by virtue of tricks with the background. Clark goes so far as to say that what we really see are “controlled hallucinations”.
Talking of which, the placebo effect is discussed and Clark notes how studies have found not only that people suffering from back pain benefit both from pills which contain active ingredients and those that don’t, but that this effect survives even when the patients are told the pill has no active ingredients. (Not that Clark goes there, but this certainly points to a possible justification for the infinitesimal treatments of the homeopaths.)
What he does say, however, is that anything which boosts confidence in an intervention will enhance its prospects of success – but here I think he misses the difference between conscious cues and subconscious ones. Odd lacuna? But then he actually argues that “predictive brains” involve the “active construction” of human experience.
A lot of the claims here are offered flat, as “science says”, yet surely deserve some scepticism. Research, for example, finding that, when shown religious images, “religious subjects rated a sharp pain as less intense than atheists shown the same images”. Or that listing the risks of side effects on medical treatments can “actually bring about the side effects they describe”. Let alone that dentists telling patients that the injection will only be a tiny pin prick reduces the pain experienced. On the contrary! Those words of reassurance signify to many of us that a very nasty pain is about to follow! Okay, maybe my point looks a bit like a joke, but actually, one big concern I have with this account is how it removes the complexity of human thought processes. But that’s cognitive science!
Common-sense notions of causality are also reversed in the phenomenon noted by the German philosopher, Hermann Lotze and, later, William James too, that actions come about because we mentally represent the completed effects of the action. Clark gives the example of pulling the strings on a puppet. We are interested in (say) the puppet waving its hand – not in the details of how the string moves which bit of the puppet in what may be a complicated sequence.
Likewise, it seems that when we have a drink of water to assuage thirst, we get immediate satisfaction of the thirst, even though the water has not had enough time to have had any physical effect.
More ominously, things like a police officer’s elevated heart rate when investigating a possible threat can be taken by the “predictive brain” as themselves evidence that there really is a threat. (We’ve seen too many cases of such things in America in recent years, with police shooting householders or motorists out of a misplaced conviction of a threat.)
Having put the brain in control of our environment, Clark then backtracks and offers examples of how our environment can be adapted to help our brain. Alzheimer’s sufferers, for example, he says, may arrange their homes with lots of visual clues, from written notes to carefully arranged objects, to “take over the functions that were previously played by their biological brains”. A biological part of the brain is replaced by external, physical substitutes. Clark suggests we are all increasingly doing this – relying on calculators to do our maths, on search engines to remember things. “Most of our daily devices, especially our smartphones and other wearables, are already starting to act as woven resources. They are devices whose constant functionality has become factored deep into our brains’ ongoing assumptions of what kinds of operations can be performed and when.”

Actually, this is an idea Clark set out earlier in ‘The Extended Mind’, a celebrated paper co-authored with David Chalmers. Clark mocks the “chauvinism” of those who say such devices cannot be considered part of our ‘minds’, as they are outside our heads. Yet he does not seem to have considered that all our thinking might be better understood as social, particularly given that so much of it is framed in words and concepts that are produced socially and made concrete in human languages.
Towards the end of the book, which is a reasonable place to do it, Clark sums up his theory: “To perceive is to find the predictions that best fit the sensory evidence”. This rather underlines how little philosophy there is in the book. “The sensory evidence” seems to be still there, just as John Locke and the other philosophers supposed centuries ago, steadily being processed by humans. The only new thing is that at a certain level of the conscious mind, the perceptions are being reorganised, largely in line with expectations based on previous experience. Clark declares this is big progress, writing that “understanding the way predictions of many kinds alter and adjust the shape of experience is our biggest clue yet to what experience really is”. But if that’s the takeaway from the book, it’s rather meagre. Plato wrote about perceptual illusions and what they told us about perception, two thousand odd years ago. Cognitive science, it seems, is a new name for a very old study.
Reviewed by Martin Cohen
The Experience Machine: How Our Minds Predict and Shape Reality
By Andy Clark
Pantheon (Penguin–Random House) 2023 ISBN 9781524748456
Thursday 16 March 2023
The Alchemy of Political Discourse: A Mix of Facts, Beliefs, Reality, and Uncertainty
The strange philosophical liaison of Martin Heidegger and his student Hannah Arendt mixed two very different worldviews
‘The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there is a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusion may remain inviolate’.
– Francis Bacon, Novum Organum
‘I think nobody should be certain of anything. If you’re certain, you’re certainly wrong because nothing deserves certainty. So, one ought to hold all one’s beliefs with an element of doubt, and one ought to be able to act vigorously in spite of doubt. One has in practical life to act upon probabilities, and what I should look to philosophy to do is to encourage people to act with vigor without complete certainty’.
– Bertrand Russell
Tuesday 28 February 2023
REVIEW: The Future of Humankind (2023)
From The Philosopher CXI No. 1 Spring 2023
The Future of Humankind?
The Future of Humankind is a snapshot of current thinking about science, more than a real attempt at futurology. It is a deftly written book that contains a lot of fascinating facts and information – while also keeping the reader active and thinking. The reader may not agree with a lot of it, but that’s as much a virtue as a fault.
As a gloomy Capricorn, I’m not very optimistic about the future and so I am always happy to read a disaster book, and at first glance the contents list of John Hands’s tale of what awaits humanity promises just that. But a few pages in, talking about space rocks hitting the Earth, the gloomy reader will already be a bit puzzled at what seems to be Hands’s indefatigable spirit of technological optimism. On this wonderfully awful prospect, we’re told that thanks to NASA scientists:
“…we now know that there are no comparably large asteroids (diameter greater than 5km) in orbits that could potentially hit Earth.”
Well, boo to that! But I’m sure the scientists have their reasons. They usually do – which, like Groucho Marx, they also offer to swap for other ones later if need be. But I recall from reading about Newton that the ability to predict the movement of things like asteroids is not just difficult (lack of observations) but maybe actually impossible, due to the so-called three-body problem. This is that taking the initial positions and velocities of three point masses and solving for their subsequent motion admits no general closed-form solution. A tiny influence can create a sequence of effects – in the manner of the butterfly wing that causes a hurricane in chaos theory.
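This chaotic sensitivity is easy to demonstrate numerically. The sketch below is my own illustration, not anything from Hands’s book: it integrates two copies of a toy planar three-body system that differ only by a one-in-a-billion nudge to one starting coordinate. The masses, initial conditions and step sizes are arbitrary choices for the demonstration, and a small “softening” term keeps the crude integrator out of trouble during close encounters.

```python
# Illustrative sketch (not from the book): sensitivity to initial conditions
# in a toy planar three-body problem. Two runs differ only by a 1e-9 nudge
# to one body's starting x-coordinate.
import math

def accelerations(bodies, G=1.0, soft=1e-6):
    """Newtonian gravity with a small softening term to avoid singularities."""
    acc = []
    for i, (xi, yi, _, _) in enumerate(bodies):
        ax = ay = 0.0
        for j, (xj, yj, _, _) in enumerate(bodies):
            if i != j:
                dx, dy = xj - xi, yj - yi
                r3 = (dx * dx + dy * dy + soft) ** 1.5
                ax += G * dx / r3
                ay += G * dy / r3
        acc.append((ax, ay))
    return acc

def simulate(nudge, steps=20000, dt=0.001):
    """Integrate three unit masses with a simple Euler scheme."""
    bodies = [[-1.0, 0.0, 0.0, -0.5],
              [1.0 + nudge, 0.0, 0.0, 0.5],
              [0.0, 0.5, 0.8, 0.0]]
    for _ in range(steps):
        acc = accelerations(bodies)
        bodies = [[x + vx * dt, y + vy * dt, vx + ax * dt, vy + ay * dt]
                  for (x, y, vx, vy), (ax, ay) in zip(bodies, acc)]
    return bodies

base = simulate(0.0)
nudged = simulate(1e-9)
# How far apart are the two predictions for the first body's position?
drift = math.hypot(base[0][0] - nudged[0][0], base[0][1] - nudged[0][1])
print(f"final separation after a 1e-9 nudge: {drift:.3e}")
```

With chaotic dynamics the final separation typically ends up many orders of magnitude larger than the initial nudge – the numerical face of the butterfly wing, and the reason long-range forecasts of orbital close encounters deserve a pinch of salt.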
More mundanely, there are certainly plenty of sources saying all the significant asteroids have now been tidily registered and their movements calculated – but almost every year a previously unidentified rock narrowly misses the Earth, so I would have thought a little more scepticism was warranted.
A similarly optimistic note is struck a little later on, now in relation to the novel mRNA treatments for the coronavirus, when we’re told:
“What is significant is how effective most of these vaccines have been. Two injections of the Pfizer–BioNTech vaccine spaced 21 days apart proved 95% effective at preventing Covid-19 in those without prior infection and 100% effective at preventing severe disease.”
I googled this and it seems that even the rather pro-vax medical journal, The Lancet, reported that this vaccine’s effectiveness against the Delta variant, 93% after the first month (after the first month, mind!), declined to just 53% after four months. Against other coronavirus variants, efficacy declined to 67%.
Likewise, a New York State Department of Health study at the end of 2021 found that the effectiveness of Pfizer’s vaccine against Covid infection plummeted from 68% to a ridiculous 12% for young children during the omicron surge from December 13 through January 24. Protection against hospitalization dropped from 100% to 48% during the same period.
As I say, Hands just seems to be an optimistic soul. Take comets, notoriously mysterious. These, he allows, constitute unknowns, but he says they are behind “less than 1% of all impact events in Earth’s recent geological record” and none at all in historical times, so “it is safe to conclude that impacts from asteroids or comets pose no existential threat to humans”. I don’t quite get where the “so” comes from here. One is reminded of Bertrand Russell’s unfortunate chicken, which is used to receiving a handful of grain every morning from the farmer’s wife – for as long as the chicken can remember – until one fateful morning the wife wrings its neck.
Okay, what about the risk of nuclear war? At the time of writing this review, Russian television is full of pundits threatening to wipe out Britain and America with nuclear-tipped hypersonic missiles. The risk of nuclear war seems very real. Yet here too Hands is optimistic. He says that treaties plus international opinion mean the risk of Armageddon diminishes steadily.
Talking of nuclear, I have to take particular umbrage with the account of the risk from nuclear reactors. Hands uses UN sources to reassure us that hardly anyone died after either the Chernobyl or Fukushima partial meltdowns. Yet I looked in detail at these for my own book on nuclear energy, and I found that the UN account was woefully skewed towards defending the “peaceful use” of nuclear energy, while neutral reports found convincing evidence of a huge toll, particularly after Chernobyl – a toll solidly recorded in hospital records, as well as a more speculative but potentially very significant toll worldwide due to things like plutonium particles entering ocean food chains. Now, neither of these nuclear disasters actually qualifies as apocalyptic, but Hands neglects how very close – a matter of hours – both plants came to much greater explosions, which it is generally agreed could have caused global radiation poisoning.
I am more sympathetic to the conclusion to a long chapter on the dangers from either population explosion or climate change or a combination of the two. The short story is that, again, Hands is optimistic, concluding there is a negligible probability either will result in the extinction of the human species. Okay, on this I agree! But again, is this really demonstrated or rather a rosy assessment based on cherry-picked data?
A more unusual doomsday topic is that of the supposed threat of humans being replaced by robots. It’s a good account, this, but again the rosy assurance at the end that we really do not need to fear that “artificially intelligent machines built by humans will exceed human-level intelligence and thereafter bring about the extinction of the human species” seems to go beyond the evidence, not least because surely we do not know at this point in time what the capabilities of machines will be within a relatively short timescale.
Perhaps the most significant part of the book, certainly the part I like best, is entitled “Reflections and conclusions”. It is here that Hands details his working methods and describes what he found when he attempted to discuss the doomsday scenarios with the relevant “experts”. As philosophers of science like Thomas Kuhn and Paul Feyerabend could have predicted, he seemed to find that within each field “most” of the experts cohered around one opinion, but there were invariably one or two outsiders with dissident views. The message from the philosophy of science, though, is that this is not because most people are right and one or two are laggards. It is because scientific debates are rooted in dominant ‘paradigms’. Acceptance and career success in a field require researchers to conform. Yet the point about these paradigms is that they change. To the point: identifying the firm opinions of the majority of scientists is not the route to certainty it pretends to be. Next year, the majority of scientists may think something different. That is how science works.
At this point, I might mention that there was another book in this general area a few years ago, Why Science is Wrong... About Almost Everything, by entrepreneur Alex Tsakiris, which noted that the great majority of material in textbooks from the previous generation, written with such confidence then, is now equally confidently considered to be plain wrong. So we should be very sceptical about the value of surveys of scientific opinion. To be fair, Hands does himself describe, towards the end of the book, some of the strange cases of erroneous scientific predictions – for example, that “heavier than air” machines cannot fly – but this is not the message of the bulk of the book.
Instead, there’s not enough sense here of the need for caution about the pronouncements of ‘experts’. This contrasts with the caution about political claims and pressure groups. For example, Hands cites the case of XR (Extinction Rebellion) co-founder Gail Bradbrook – quoted, in October 2019, as saying that 97% of the world’s species, including humans, would perish within her daughter’s lifetime unless everyone on the planet stopped producing CO2 by 2025 – as evidence of the dangers of allowing your opinions to drive your analysis. Yet the same danger seems to have shaped this book, even if the opinions are backed by respectable authorities.
In Part Two of this look into the future of the human species, Hands takes a brief Cook’s Tour of the current theories, such as ‘colonizing space’ or ‘using technology to extend the healthspans of individuals’, but here the optimism changes to a more sceptical approach. Indeed, he eventually concludes that few of these things stand much chance of ever coming to be.
Some ideas do seem rather wild, and to conflict with our current understanding of physical laws – such as the limit set by the speed of light. But then, in a closing chapter, Hands allows himself to step outside the straitjacket of what we know to speculate that “the next stage of human evolution” could be a new kind of consciousness. He writes:
“I speculate that, in its fourth stage of evolution, the human cosmic consciousness will be able to comprehend such a higher reality. Furthermore, the human cosmic consciousness may well constitute that higher reality and be the cause of all the physical and chemical laws and parameters that enable it to evolve in an eternal, continuous cycle of self-creation. That is, it forms a cosmic consciousness that underlies everything and from which everything unfolds.”
It’s a nice idea, and it comes more intriguingly as part of an otherwise, as I say, determinedly “scientific” account. But to me, it seems to be driven more by optimism than anything as mundane as the evidence.
Reviewed by Martin Cohen
THE FUTURE OF HUMANKIND: Why We Should Be Optimistic
By John Hands
Castleton, 2023
ISBN 978-0993371943
Kryptonite of the Übermensch
When Friedrich Nietzsche stumbled on a copy of Arthur Schopenhauer’s The World as Will and Representation at the tender age of twenty-one, he was electrified. “Here every line shouted renunciation, negation,” the precocious philology student at the University of Leipzig later wrote. “Here I saw a mirror in which I spied [my] own mind in frightful grandeur.”
But then, in many ways, Nietzsche and Schopenhauer, though separated by fifty years, were brothers. Both German iconoclasts were classical scholars, anti-establishment polemicists, depressives, bachelors, misogynists, syphilitics, self-exiles, animal lovers, skilled musicians (Schopenhauer flute, Nietzsche piano) and lovers of melodramatic opera to the point of weeping.
Most importantly, both believed that two thousand years of metaphysics had to be recast by shifting focus from divinity and empty abstraction, towards authentic living human experience and motivation, no matter how lowly or perverse it might seem. Accordingly, they were among the first philosophers who thought and wrote passionately – like human beings, not logic-driven automata. Their writings went on to inspire both the Existentialists with their “Existence-before-Essence” mantra and Freud’s science of psychoanalysis with its tripartite Ego-Id-Superego distinction.
In fact, while describing egotism as the fundamental incentive in all life, Schopenhauer wrote in On the Basis of Morality:
“A man prefers the entire world’s destruction sooner than his own… and is capable of slaying another, merely to smear his boots with the victim’s fat.”
Hoping to discredit, if not entirely replace, Hegel’s abstruse theocentric system, Schopenhauer, after much difficulty, managed to publish his anthropocentric manifesto The World as Will and Representation (later carried into World War One in his knapsack by the young German soldier, Adolf Hitler). Though Schopenhauer was convinced the eight-hundred-page treatise would revolutionise two millennia of philosophical thought, he was criticised for parroting his grad school professors, Johann Gottlieb Fichte and Friedrich Schelling. Outraged, the outlaw insisted not only that his work was completely original but that those of his predecessors were nothing more than “great bloated soap bubbles,” as David Cartwright recalls in his classic biography of the philosopher. Perhaps it was comments like this that led another celebrated megalomaniac, Ludwig Wittgenstein (whose own students called him “God”), to say many years later: “Schopenhauer has quite a crude mind ... where real depth starts, his comes to an end.”
But then, as far as Schopenhauer was concerned, his works were pearls before swine. “Talent hits a target no one else can hit; Genius hits a target no one else can see,” he pompously pointed out in The World as Will and Representation. The result was that soon even his few supporters were calling him a “chained, barking dog.” Responding that he would, in fact, prefer to die if not for the companionship of dogs, Schopenhauer wandered restlessly from city to city with his beloved poodles, Atma and Atma. (Eccentrically, he called all his dogs by the same name… Atma, and nicknamed them all Butz – Atma being the Hindu word for the universal soul from which all individual souls arise.)
Fearing he might drown himself like his father, Frau Schopenhauer sent her son urgent letters, fretting about his poor health, “dark disposition” and isolation. Ignoring her meddling, and insisting that “to live alone is the fate of all great souls,” Arthur continued to travel solo – his only companions the dogs, a Buddha statue which he called his “crucifix surrogate” and a copy of the Upanishads, the Hindu scriptures that he called “the consolation of my life.” But when critics claimed his work owed much to the holy texts, he boasted that his writings were unprecedented and nothing less than the legendary Philosopher’s Stone itself: the absolute and divine truth as sought by alchemists for millennia. Even so, fearing continued rejection and obscurity, he started to think that the highest form of asceticism was voluntary starvation.
Suicide had been a hotly debated topic in the ancient world, as he well knew. Stoics such as Seneca, Epictetus, and Marcus Aurelius considered it an “honorable” option. Socrates had obviously agreed, preferring the hemlock to exile. Plato, however, whom Schopenhauer called “divine,” saw both sides of the matter and straddled the fence. Aristotle condemned suicide as ignoble – a cowardly act. Actually, Schopenhauer considered Kant “marvellous” until his hero seconded Aristotle, declaring that self-destruction violated one’s divinely mandated “duty to live.” But then, Schopenhauer reserved his utmost contempt for theist optimists, arguing that the perverse will-driven life “wasn’t worth living.” Nevertheless, thanks to late, hard-won recognition, he managed to soldier on.
Toward the end of his life, Schopenhauer, now regarding himself as a prophet, insisted his doctrine was inspired by the “spirit of truth and the Holy Ghost.” He had at last gained worshipful ‘apostles,’ eight by his count, with no traitorous Judas among them. Once, when they had to meet without him, he reminded them of the Gospel of Matthew (18:20): “Where two of you gather in my name, I am there with you.” Presumably, he was joking, having once declared, “A sense of humor is the only divine quality of a man.”
Like Buddha, Schopenhauer said compassion was the source of all virtue. He defined the emotion as feeling another’s pain in one’s own body. “Extreme egoists,” he wrote in Parerga and Paralipomena, “ignore the misery that their unchecked self-interest produces, and malicious persons delight in the wretchedness of others.” Earlier in his life, though, he hadn’t seemed to feel, vicariously, the pain of the seamstress neighbour who sued him for assaulting and permanently disabling her after he flew into a rage during an argument on the shared landing of their apartment block; instead he wrote of his relief when she died and he no longer had to make court-mandated payments to her. Likewise, when one of his first critics, F. E. Beneke, committed suicide, his gratification seemed to eclipse his compassion. As his biographer David Cartwright wrote, “His ability to hold a grudge was elephantine.”
As for his love life, the misogynist had for years satisfied his ‘animal’ urges with prostitutes, and likely contracted syphilis. By the time he reached his seventies, he suffered from rheumatism, deafness, and nervous disorders all of which combined to make handwriting almost impossible. Soon, palpitations, shortness of breath, and inflammation of the lungs took him to bed. “I have always hoped to die easily,” he wrote in his diary and so, apparently, he did.
Suffering a fear of being buried alive – a not uncommon phobia at the time – Schopenhauer had instructed that his body lie in state for days. And though the philosopher remained a vehement opponent of Christianity, in accordance with his dying wish a Lutheran minister presided over his funeral and delivered the eulogy.
After graduating with honors, Nietzsche changed his plans while attending the University of Bonn and perusing The World as Will and Representation. Later, as a young philology student at the University of Basel, he led a Schopenhauer chat group, which he compared to “the first Christians.” One of them described the “master” as having “a god-like brow that appears to rise to infinity, framed by beautiful white hair under white eyebrows like those of the Olympian Zeus.” Nietzsche himself wrote admiringly of Schopenhauer’s theory of ‘Will’ and his asceticism-to-salvation doctrine, and shared his contempt for Hegel’s “cheap optimism and brain-rotting obscurity.” Later, Martin Heidegger, who wrote three books about Hegel, confessed in Contributions to Philosophy, “Making itself intelligible is suicide for philosophy.”
As with Schopenhauer, Nietzsche’s first career move was to hitch his wagon to a star: in his case, that of Richard Wagner. But, while the young philosopher was courting the favor of Germany’s lionised composer, he fell in unrequited love with Wagner’s wife and muse, Cosima, the illegitimate daughter of Franz Liszt. His falling out with the Wagners was followed by an even more devastating split from a Russian femme fatale by the name of Lou Andreas-Salome. He had hoped the brilliant young beauty would become a worshipful disciple and wife, but, fiercely independent, Salome decided that the supposedly ‘lofty’ philosopher was in fact a “mean egotist... concealing filthy intentions of a wild marriage.” As Curtis Cate reveals in his book entitled simply Friedrich Nietzsche, after she revealed her feelings, Nietzsche – who had formerly praised her as “the most gifted and reflective” of all his acquaintances – described her to a confidante as:
“…a sterile, dirty, evil-smelling she-ape with the false breasts – a calamity!”
“I am too proud to believe that any human being could love me: that would pre-suppose he knew who I was. Equally little do I believe that I will ever love someone: that would presuppose that I found – wonder of wonders – a human being of my rank – don’t forget that… I find the founder of Christendom superficial in comparison to myself.”

On the rebound from the Salome calamity, he dashed out the first sections of Thus Spoke Zarathustra in a matter of weeks, describing the manuscript as a “bloodletting” inspired by her rejection, “turning my humiliation into gold.” Though he only printed forty copies and distributed a mere handful to associates, he called his masterpiece “the Fifth Gospel,” a replacement for “dead Christianity” and one of the two or three most important books in human history. Its critical recognition, though belated, “unleashed the floodgates of Nietzsche’s self-infatuation and megalomaniacal fantasies,” noted his biographer, Curtis Cate.
Meanwhile, having written God’s obituary in The Gay Science, Nietzsche declared that a replacement deity was necessary: the Übermensch or Superman. To escape nihilism, he argued that a new morality was also imperative. He laid this out in Zarathustra which he described as “a new holy book that challenges all existing religions.” In a later work, Ecce Homo (a reference to Pilate’s greeting to the defendant Christ), containing such chapters as “Why I am so Wise” and “Why I am so Clever,” he claimed that his gospel was written “by God Himself” and that the authors of the Bible and the Vedas were “unworthy of unlatching my shoes.” (Ironically, Zarathustra was a second-coming of the Iranian prophet Zoroaster, an evangelist for absolute good and evil, a doctrine Nietzsche ridiculed in his follow-up title, Beyond Good and Evil.)
The ‘Madman’ hero of his early manifesto, Joyous Wisdom, carried a Diogenes lantern in the morning sunshine, and cried: “I am looking for God! Where has God gone? We have killed him - you and I!”
Despite Nietzsche’s claims, like Schopenhauer’s, of complete originality, the Almighty’s decease had been announced a few years earlier in The Philosophy of Redemption (1875) by the poet-philosopher, Philipp Mainländer, who had written: “God has died and his death was the life of the world.” Mainländer didn’t look forward to resurrection, reincarnation, Heaven, Hell, or Purgatory, but to nothingness. For him “the supreme principle of morality” was that “non-being is better than being.” For him, Schopenhauer’s Will-to-Live was the Will-to-Die, making death “salvation.” At age thirty-four, Mainländer hanged himself – using a pile of unsold copies of Redemption as a platform. Disgusted that his predecessor hadn’t found the courage of the Greek martyr, Sisyphus, Nietzsche called Mainländer a “dilettante” and a “sickeningly sentimental apostle of virginity.”
Still, a dead God invited questions. Did Nietzsche or Mainländer mean dead to them, or objectively – to everybody? And who or what was dead: the Christian Trinity, the Abrahamic One-and-Only, or the idea of divinity generally? Finally, since the definitive characteristic of any god is immortality, how could one be dead, much less a murder victim?
Though Nietzsche provided no solid answers, he had no illusions about the implications of his conviction. If God were indeed dead, then weren’t morality, salvation, and immortality DOA too? Moreover, without divinity, death itself becomes the only inescapable Absolute, rendering life meaningless. Denouncing Kant, Hegel, and other “old maids with theologian’s blood” who perpetuated the “romantic hypochondria” of the Church, Nietzsche insisted that man should bravely press on alone and defiantly. “How shall we comfort ourselves?” he demanded. “Must we not become gods simply to appear worthy of it?”
Indeed. And through his own prophet, Zarathustra, successor to ‘The Madman,’ Nietzsche introduced the Übermensch. Going beyond good and evil, this Superman -- disdaining not only the Christian “master/slave morality” but “decadent” compassion too – would become a law unto himself. In so doing, he would avoid hopelessness and nihilism by immersing himself in the dynamic present world of his own life.
In the context of his Will to Power doctrine, the self-declared Superman identified two kinds of egotism: the ‘Good’ or ‘holy selfishness’ and the ‘Bad’ or ‘unholy’. Like Schopenhauer, Nietzsche considered his to be of the first kind. He, too, prided himself on his ascetic life: having no close friends, living anonymously in cheap hotels, and avoiding “loud, shiny” things. To describe the second kind of egotism, he turned Luke 18:14 on its head, replacing “All those who exalt themselves will be humbled” with “All those who humble themselves wish to be exalted” – apparently not considering that this might apply to him.
Both philosophers suffered from chronic depression, a curse which, long before, Aristotle had called the all-too-common fate of insatiable minds. Was the despondency of the duo the inevitable by-product of their unflinching pessimism about the human condition, the other way around, or two sides of the same coin? In any case, Nietzsche called his melancholy “the worst of all penalties that exists on earth.” He feared that his few friends might regard him as a “crazy person driven half mad by solitude” and told one “not to worry too much if I kill myself.” For years he had been plagued by migraines, nausea, and seizures. He was convinced that lightning and thunderstorms triggered the attacks, and that his sanity depended on clear weather. He self-medicated with opium, hashish, cocaine, and chloral hydrate (a potent sedative used in asylums). In a moment of rare levity, he wrote in Twilight of the Idols: “Man does not strive for happiness; only the Englishman does.” Though he may never have experienced such pedestrian felicity, he was fortunate enough to be bipolar: his depression was sometimes swept away by sudden, unprovoked ecstasies especially during his solitary mountain marches under cloudless skies. When the darkness returned, he tried to steel himself: “What does not kill me makes me stronger!”
But, by his own admission, the Superman had been fighting “monsters” – Minotaurs in his cerebral labyrinth -- all his life and, in the process, trying not to become one. As even the comparatively upbeat Kant said: “Only the descent into the hell of self-knowledge can pave the way to godliness.”
The problem, by Nietzsche’s own admission, was that “If you gaze long into the abyss, the abyss gazes back into you.” So, even before losing his mind, the abyss-gazer confessed: “The thought of suicide is a great consolation: by means of it one gets through many a dark night.” The story goes that one morning, after seeing a horse being brutally beaten in the street, he collapsed, weeping, and was soon carried to a sanatorium in Basel.
Before being committed, Nietzsche had tried to ballast his ship with his Pythagoras-inspired Amor Fati or ‘Eternal Return’ doctrine. Like the Buddhists, he regarded time as a timeless circle which always returned to itself. In the face of such inevitability, the challenge of “the great soul” was, he thought, to embrace the process, to love one’s fate and believe everything happens for the best. “As long as I remain attached to my ego, willing eternal return remains beyond my grasp,” he wrote. “Only by becoming myself the eternal joy of becoming, can I guarantee for myself eternal life… Happiness consists in dwelling in the realm that is beyond death.” Ironically, days before his collapse, his megalomania was peaking. He wrote to a friend that he was the reincarnation of, among others, Buddha, Dionysus, and Napoleon. “I have also been on the Cross,” he added, signing the letter the “Crucified One.” Having for years endured the slings and arrows of his critics, family and “friends,” had he seen himself in the unmercifully beaten horse?
Nietzsche spent the last ten years of his short life in asylums. At first, he introduced himself to visitors as the German emperor and Cosima Wagner’s husband. “There is no name that is treated with such reverence as mine!” he reminded everyone. In a letter signed ‘Nietzsche Caesar,’ he told the poet August Strindberg (struggling with his own mental crisis at the time) that he had ordered the Emperor’s execution, and would have the Pope thrown in jail for good measure. Later, he became Dionysus the “dying-and-rising” son of Zeus and Persephone, Queen of the Underworld. Like the god of wine, revelry, and madness, Nietzsche stripped himself naked and spent his days dancing, singing, and howling.
When he calmed down, as Curtis Cate recalls, he repeatedly wept: “I am dead because I am stupid, or I am stupid because I am dead.” Following many small strokes, the author of The Birth of Tragedy reverted to infantilism: according to another biographer, Julian Young, he played with dolls and toys, drank his own urine from his boot, and smeared his cell walls with his own faeces. After he had completely lost his mind, the Übermensch’s last words to his long-suffering caretaker were said to be: “Mutter, ich bin dumm” – “Mother, I am dumb.”
Early on, he had predicted his premature and tragic demise. Spoke Zarathustra:
“And if one day my cleverness abandons me – ah, how it loves to fly away! – may my pride go on flying with my folly.”
At his funeral, attended only by a handful of friends, his sister, Elisabeth, followed his strict instructions: “Promise me that when I die … no priest utter falsehoods at my graveside. Let me descend into my tomb as an honest pagan.”
After honoring her brother’s final wish, Elisabeth burst into tears, crying, “Zarathustra is dead!” And his eulogizer, Dr. Ernst Horneffer, beholding the philosopher in his final sleep, declared: “He looks like a dead god. Truly he does!”
Like Schopenhauer and Nietzsche, philosophers throughout history have not addressed themselves to the commoner, but to the intellectual aristocracy. Speaking for his colleagues dedicated like himself to the “Know Thyself” Delphic mandate, Aristotle said that the “great-souled” man was aware of his own superiority. The first Greek Superman, Pythagoras, had declared himself the son of Apollo, the god of wisdom; and Empedocles, also claiming divine birth, jumped into Mount Etna to prove his immortality.
Many later metaphysicians asserted, explicitly or implicitly, the importance of ego-loss for spiritual development, or at least for avoiding execution by the Church. Ironically, Buddhist sympathisers Schopenhauer and Nietzsche, evangelists for ego-loss, were both, as we have seen, self-deifiers. Before them, in pursuit of truth and self-knowledge, philosophers had tried to erase their reason-clouding passions. But the anger and alienation of the Übermensches was so irrepressible that their passions drove their thoughts. Nietzsche, who called himself “dynamite” was dedicated to “living dangerously” as a “warrior” rather than a “saint of knowledge.” So, he wrote in “blood” and, when first reading Schopenhauer, recognised a blood brother.
Indeed, Schopenhauer, who argued that “truth is best observed in the nude,” had mocked most philosophies as the emperor’s new clothes. He denounced the Church Scholastics through Leibniz and Hegel as “cloud castle” builders and cerebral automata, utterly out of touch with real men and the real world. It was a matter of pride, if not hubris, that both he and his successor sought solitude, “so as not to drink out of everybody’s cistern,” as Nietzsche put it. So, it was no wonder that they aroused their colleagues’ contempt and alarm. Said the original dynamiter, the father of Cynicism, Diogenes, who had plagued Plato, “Of what use is a philosopher who doesn't hurt anybody's feelings?”
Again, the primary target of the Übermensches’ polemics was Christianity. God’s A-team – Saints Augustine, Anselm, and Aquinas – usurped Greek philosophy and turned it into evangelist theology by cherry-picking Plato and Aristotle and postulating flatulent ontological proofs. One of the few theists to risk heresy by criticising the Aquinas Mensa group was Erasmus. “They smother me with their dogmas,” complained the Renaissance humanist. “They are surrounded with a bodyguard of definitions, conclusions, corollaries, propositions explicit and propositions implicit; they are looking in utter darkness for that which has no existence whatsoever!” Later, Carl Jung put it even more bluntly in his memoir, calling the Scholastics “more lifeless than a desert… who knew only by hearsay that elephants existed.”
The two human, all-too-human new humanists more than agreed. Since the pre-Socratics, most thinkers had struggled to identify the essence of life and humanity with lofty concepts while completely ignoring the inner experience and consciousness of the individual man independent from God.
It was no mistake that Schopenhauer called Christianity “a masterpiece of animal training” while Nietzsche, the self-described Anti-Christ, called it “dishonorable,” “cowardly” and “an incredibly cunning form of hypocrisy.” What maddened both most was the papist party line that “God is a necessary being,” parroted even by Kant, whom they otherwise admired. But, for them, this necessity was not based on truth, but only on an overriding human need for three necessities. First, for an absolute, eternal Creator, the uncaused cause of and explanation for everything. Second, for an all-knowing, all-good moral ruler. Third, and most importantly, for a deliverer from death and guarantor of immortality.
Ordinary meek mortals may have needed such things, but not the Übermensches! Even the supposedly ‘dogma free’ thinker, Descartes, whose philosophy was based on doubt, in the end confirmed the existence of an omniscient, benevolent God. In trying to identify man’s essence, wasn’t the Frenchman’s famous conclusion -- I think therefore I am—really just rationalist window-dressing for a fideist I believe therefore I am?
Unafraid of the Vatican and academic thought police, Schopenhauer and Nietzsche threw both propositions out the window, declaring I Will therefore I am. For most men, except the most extraordinary, insisted Schopenhauer, “Reason is the slave of [willful] passion,” not the other way around. Outside the ivory tower of metaphysicians, among real humanity, the argument is hard to challenge. As for thinking, if it is indeed man’s essence, why do so few do it? And, even for the few who do, why do so many wind up espousing self-serving absurdities?
But in asserting the primacy of Will, the Übermensches created a problem for themselves, since neither really believed in Free Will. Both were Determinists. Like most Greeks, Nietzsche believed in the Fates, thus his Amor Fati idea. Schopenhauer sophistically argued for Free Will/Determinist Compatibilism, saying: “Man can do what he wills but he cannot will what he wills.” But the assertion begs the question: If man doesn’t will what he wills, then who or what does?
The answer is obvious for the monotheist. This who is the all-powerful, all-knowing eternal God who had creation all worked out from the beginning. Thus, one would expect all Christians to be Determinists. But, dispensing with consistency in favor of necessity once again, most (save Lutherans and Calvinists) were devout Free-Willers. The reason is simple: in a Determinist creation, man – lacking intent and, hence, true selfhood – is relieved of responsibility for his actions and is no longer a moral agent. Prayer also becomes an exercise in futility. Furthermore, strict Determinism renders the very concepts of human good and evil absurd, effectively invalidating all moral-mandate religions. Worst of all, freedom denial can, in effect, become a slippery slope to fatalism, if not nihilism.
The Übermensches were on that very slope even while denying they were on it. Schopenhauer called salvation the denial of selfish Will-to-Power in favor of Will-to-Truth. But how can truth or self-knowledge be gained if it hasn’t already been predetermined? Nietzsche said happiness – salvation – would continue to elude him until he freely chose to love his fate, no matter how purgatorial. But how could he freely choose anything in a God or Dharma dominated life?
In lieu of addressing these issues, both philosophers reasserted the necessity of man filling the void of a dead or imaginary God of convenience. In their case, this led to self-deification, if not megalomaniacal self-idolatry. In the ancient world this was called fatal pride, or hubris, the flaw which became the gravest of the medieval Seven Deadly Sins. The first of the legendary proud was Prometheus, cursed by Zeus for bringing mankind knowledge. Similarly, God had cursed the snake in Eden for peddling the deadly fruit, then driven Adam and Eve from paradise lest they eat of the Tree of Life and become gods “like one of us” (Genesis 3:22).
In a real sense, the Übermensches – especially Nietzsche – were sons of Prometheus, chained to the mountain rock, their livers eaten daily by Zeus, in his eagle disguise. Their Christian predecessors had avoided the same fate only by surrendering to the Almighty, each crying to the heavens like Job, “Have mercy, forgive me, I am only a man and your servant!”
But the Supermen, waiting for no rescue by Hercules or a deus ex machina, instead clung defiantly to their Philosopher’s Stone. Steeling themselves, the classical scholars remembered the words of Prometheus which Hermes, the mediator between gods and men, called “the ravings of a Madman:”
“Let Zeus hurl his blazing bolts with thunder and with earthquake… None of all this will bend My Will!”
About the author
David Comfort’s essays appear in Free Inquiry, The Montreal Review, Pleiades, and Stanford Arts Review. He is the author of several books, including The Rock And Roll Book Of The Dead: The Fatal Journeys of Rock’s Seven Immortals – a study of the tempestuous lives and tragic ends of Elvis Presley, John Lennon, Jimi Hendrix, Jim Morrison, Janis Joplin, Jerry Garcia, and Kurt Cobain.
Address for correspondence: David Comfort <dbeco@comcast.net>