Sunday, 29 September 2024

REVIEW ARTICLE: Israel–Palestine, Federation or Apartheid?

  From The Philosopher CXII No. 2 Autumn 2024

A section of wall dividing Jewish and Palestinian areas in the West Bank

Deconstructing the Zionist Myth  


Review article by Paula Stanyer.

Shlomo Sand is clear-eyed about the current situation in Israel. And in Israel–Palestine: Federation or Apartheid? he seeks to challenge those who offer solutions based on two separate states, believing this to be both unrealistic and at odds with events in Israel since the 1967 war. Instead, he sees Israel as already a binational state, only one in which one people completely dominates the other. Whilst he says that the situation is not exactly identical to that of apartheid South Africa between 1948 and 1992, he thinks that it is, in principle, very similar, not least because it is based on the complete separation between two human groups that live side by side. 

For Sand, the day-to-day reality in the West Bank recalls, broadly, other colonial situations of the recent past. Specifically, the European colonisers almost always enjoyed the civil rights offered by the metropolis from which they and their ancestors came, even though their colonies were not legally annexed to it, and it was simply assumed that they would live like this for decades, alongside native peoples deprived of citizenship and fundamental human rights. 

Despite himself being a former advocate of a two-state solution to the Israeli–Palestinian problem, today Sand – Professor Emeritus in History at Tel Aviv University and author of other books deconstructing Israel, including The Invention of the Jewish People and The Invention of the Land of Israel – argues that the lives of Jews and Palestinians are so intertwined that it is today simply not possible for the former to decide to separate themselves from the latter. 

This calm yet compelling account surveys the ever-growing demographic and economic integration, including the building of settlements in the West Bank and the founding there of Ariel University. Sand looks coolly at what he sees as the indulgence granted by the West to Israel, due to European guilt over the Holocaust and to Islamophobic attitudes, which results in an inability to understand the true state of affairs, let alone to accord Palestinians the right to be treated as citizens with equal status. He is sharply critical of those who are blind to the situation of Palestinians and continue, in his view, to prattle on about ‘two states for two peoples’. On the contrary, Sand sees many obstacles to any agreement between the two sides, above all that so many Israelis insist that an egalitarian, democratic state would pose a threat to Jewish identity. However, in Israel–Palestine: Federation or Apartheid? Sand sets out his new conviction that the only solution to the problem is a binational confederation for Israel/Palestine. After all:
‘With the passing years, Israel has continued to consolidate its hold on the occupied territories. Thousands of Israelis have set up home close to indigenous villages and Palestinian towns. They have acquired a great deal of land at low prices and that has become the property of the new settlers, for whom a whole network of roads are exclusively reserved. Severe oppression and denial of the basic rights of the local population have engendered violent resistance which in turn has fuelled ever harsher repression.’
Sand starts by challenging the reader to question the central aim of Zionism to create a Jewish state with a Jewish majority. He disputes the idea that Jews have an historic right to the land of Palestine and instead argues that there already is a bi-national state, just not one based on equality and human rights. Quite the reverse: Israel is a state where one people have domination over another in a way that is reminiscent of the domination practised in colonies throughout the nineteenth century and first half of the twentieth century. 
‘…for fifty-six years, millions of Palestinians have been living under a military regime, being deprived of civil, legal and political rights. Worse still, Palestinians under occupation have to live side by side with the colonisers in what is becoming ever more obviously an apartheid system. They were forbidden to live in the settlements; they are allowed only to work in them. They are forbidden to marry Jews and cannot apply for Israeli citizenship. Many Palestinian workers cross their old borders every day, to come and work in poor conditions in the Israeli economy, and must return to their homes before nightfall.’
Shlomo Sand himself was born in Austria in 1946, the son of Polish survivors of the Holocaust who emigrated to Jaffa in 1948. In due course, Sand would even fight as a member of the Israeli army in the Six Day War. In the preface to the book he recounts his journey as a political activist from 1967, when his anti-colonial beliefs initially favoured a two-state solution – an Israeli state for all citizens, both Jewish and Arab, and an independent Palestinian republic alongside it.

At the heart of the book is an effort to explain and justify his move away from the two-state solution towards instead that of a bi-national federation that would be based on one person one vote, collective rights, equality and a respect for human rights. Even though such a vision seems unattainable at the moment, Sand argues that Israel today is de facto a binational state as both Jewish and Palestinian populations are intertwined physically and economically.

Much of the book is a careful examination of two visions of Zionism – one that regards Zionism as the creation of a spiritual homeland for Jews throughout the world and one that requires the creation of a nation state for Jews. The former view was articulated by Ahad Ha’am, an essayist who visited Palestine and foresaw that the early treatment of the local population by Jewish pioneers would lead to conflict. Ha’am argued for bi-national arrangements, as did Arthur Ruppin, a German Zionist proponent of pseudoscientific race theory, who saw the local population as deriving from ancient Hebrew peoples. Ruppin saw the need to reach some sort of modus vivendi with the Arabs, since the idea of the transfer of Arabs to some other place was unrealistic. 

And so, Sand tirelessly examines and rejects the Zionist myth of exile and the linked notion of the return of Jews to the land of Israel after 2000 years of wandering. Despite this myth often being portrayed as an historical truth based on a mix of Biblical and archaeological evidence, Sand argues that it is simply false. But even were it to be correct, that would not be sufficient justification to take over a land inhabited by others.

By contrast, the competing vision for Israel of Theodor Herzl, the founder of the Zionist idea, was political, based on the nation state – even though the very notion of ‘the land of Israel’ was an invention. As Donald Sassoon sums up Sand’s view, set out in his earlier book The Invention of the Jewish People, the ‘Land of Israel’ is barely mentioned in the Old Testament and, when it is mentioned, it does not include Jerusalem, Hebron, or Bethlehem. In fact, as far as the Bible goes, ‘Israel’ consists only of Samaria, the region that is today’s northern Israel. There never was a united kingdom including both ancient Judea and Samaria.

The second myth that is exposed is that at the time of early Jewish migration, the land was empty and uncultivated. It is this failure to fully acknowledge the existence of the local population that led some Zionists to give prophetic warnings that Jewish nationalism would inevitably lead to ‘a state armed to the teeth with enemies all around’. Likewise, the philosopher Hannah Arendt saw a Jewish state as a ghetto surrounded by a hostile environment – a Spartan state permanently at war with others.

Israel–Palestine: Federation or Apartheid? traces the binationalist and imperialist arguments through the 20th century to the present, revealing the complexity and plurality of Zionist thought and challenging the notion that there is a single strand of Zionist belief. In the process, Sand focusses on Arab thinking from 1891 to today, including attitudes to early Jewish migration, the foundation of the Palestine Liberation Organisation, and on to the Oslo Accords.

The book ends, however, by saying that if, today, there is still no answer to the question of the future of Israel, it is still up to all of us who are concerned for the future of the children and grandchildren of the region to continue pressing, even if in the dark and even if against hope, for respect for those so-much-denied, so-much-trampled principles of equality and human rights. This is an important and informative account that challenges readers to rethink their assumptions.


Israel–Palestine: Federation or Apartheid?
by Shlomo Sand

Polity Press 2024
ISBN 9781509564408 • 254 pages • Paperback • £15.99 

Translated by Robin Mackay



Monday, 2 September 2024

Fear and Loathing in the Land of the Free

Trump supporters stand on a U.S. Capitol Police armored vehicle as others take over the steps of the Capitol.
(with acknowledgements to Kent Nishimura/LATimes)

Jeffersonian Democracy versus Trumpian Tyranny 

By David Comfort

A ‘Flawed Democracy’. That is how the Economist Democracy Index now classifies the United States, taking into account, broadly, such things as the Electoral College distortion of the popular vote; the influence of dark money in elections; gerrymandering of electoral districts; the ascendancy of the executive branch at the expense of congressional power; and the blurring of the line between Church and State. 

All of which makes it a good time to step back and consider how, historically and philosophically speaking, America – for so long so proud to see itself as the world’s ‘greatest democracy’ and a beacon to other nations – got here.

Democracy is a rarity in human affairs. Ancient nations were ruled by god-kings; the sole exceptions were Democratic Athens and Republican Rome. After the fall of Rome in the fifth century, divine-right European monarchs and the Church enjoyed absolute power for more than a millennium. Theocrats argued that since God’s will is absolute, the only choice people could make was to either bow to their religiously sanctioned leaders and be saved, or oppose them and be damned.

Amongst the philosophers daring to oppose such theocratic autocracy was Jeremy Bentham. The father of the moral system known as utilitarianism was an English atheist who argued that morality should not be dictated by an imagined divine will; actions should instead be carefully weighed up – calculated – in order to achieve ‘the greatest happiness of the greatest number of people’. At the time, for many, this humanist philosophy seemed to be a revival of pagan Epicureanism. Indeed, to some evangelists, the idea of worldly happiness was subversive, if not plain ungodly.

Nonetheless, over in America, the founding fathers embraced the new ethics. Before the Revolution, Patrick Henry famously declared, ‘Give me liberty or give me death!’ while, in the Declaration of Independence, his young colleague, Thomas Jefferson, wrote that liberty and the pursuit of happiness were ‘inalienable’ human rights. To protect them, Jefferson called for a separation of Church and State, an idea first proposed by the ‘father of liberalism’, the British philosopher, John Locke, a steadfast champion of freedom of religion.

As for Benjamin Franklin, the patriot who, besides George Washington, perhaps deserves the most credit for American independence: he was, in his own words, ‘a thorough deist’. He was also the Ambassador to France and thus in a good position to enlist its invaluable support for shaking off the English shackles. The brightest of the founding fathers, the honorary doctor was a polymath, the founder (in 1743, in Philadelphia) of the American Philosophical Society, and a friend of Voltaire and David Hume. After the Revolution, the wife of the first mayor of Philadelphia, the Cradle of Freedom, asked him: ‘Well, Doctor, what have we got, a republic or a monarchy?’ Recalling the short-lived Athenian democracy and Roman republic, he famously replied: ‘A republic – if you can keep it’. 

Recalling Demosthenes, Cicero, and other historic martyrs of freedom too, Franklin warned Jefferson shortly after signing the Declaration of Independence: ‘We must, indeed, all hang together, or most assuredly we shall all hang separately’. Fittingly too, the American motto chosen then became E Pluribus Unum – in English, ‘Out of many, one’.

The authority of Judeo–Christianity was based on the idea that the Bible was the unimpeachable word of God; Jefferson and his compatriots challenged this idea. Jefferson, the third president, a deist suspected of being a closet atheist, excised the entire Old Testament – with its divinely mandated commandments, genocides, plagues, and enslavements – from his new Jefferson Bible, judging it, and much of the Gospels too, ‘defective and doubtful’. Rejecting the medieval dogma of Biblical Infallibility, he and other Enlightenment thinkers took the good book, especially the Pentateuch, for myth, not the literal word of the Lord to Moses. Moreover, as a Unitarian like many of his colleagues, Jefferson, an optimist about human nature, rejected the notion of original sin and man’s allegedly corrupt nature. After leaving the White House, the classical scholar wrote to his predecessor, John Adams, a fellow Unitarian, of his hope that ‘the human mind will someday get back to the freedom it enjoyed two thousand years ago’. Athenian democracy, the first in history, was a great inspiration to him and his colleagues (though, with women and slaves disenfranchised, only some twenty percent of the populace could participate).

The United States Declaration of Independence opens with Jefferson’s ringing phrase that ‘all men are created equal’. And yet, like six others among the twelve founding fathers, he owned slaves – in his case, no fewer than six hundred over his lifetime. At his death, he freed five (including his two surviving children by Sally Hemings, his domestic servant) while the remainder were sold off to pay the Monticello plantation’s debts. Even so, he called this oldest of global practices ‘abominable’ and, during his presidency, moved swiftly to outlaw the international slave trade. On the other hand, groups of American Indians who resisted his assimilating ‘civilisation program’ were forced into indentured labor without pay (while the 13th Amendment of 1865 emancipating Blacks did not apply to them for another fifty years). Nevertheless, Jefferson, the plantation owner, expressed sympathy for indigenous peoples. In this, he was likely further influenced by freethinkers such as Jean-Jacques Rousseau, who in their writings had celebrated the ‘noble savage’ as innately good and uncorrupted.

The ringing proclamation in the Declaration of Independence of rights to ‘life, liberty and the pursuit of happiness’, although penned by Jefferson, likely followed George Mason’s Virginia Declaration of Rights (adopted June 12, 1776), which referred to ‘the enjoyment of life and liberty, with the means of acquiring and possessing property, and pursuing and obtaining happiness and safety’. Beyond that, the phrases were those of the English philosophers, notably Jeremy Bentham, who was in turn echoing Thomas Hobbes. 

Ironically, although in his writings Hobbes affirmed a slave’s right to rebel, he supported the monarchy, arguing that without its constraints human life would revert to the natural state of being ‘nasty, brutish and short’. In France, a century later, Jean-Jacques Rousseau wrote in The Social Contract: ‘Man is born free, and everywhere he is in chains. Those who think themselves the masters of others are indeed greater slaves than they’. And as the legendary German philosopher–poet Goethe put it: ‘None are more hopelessly enslaved than those who falsely believe they are free’. Here was a revolutionary idea, shifting the focus from national freedom to personal freedom. While aristocrats may have considered themselves free in their power and privilege, many were yoked to their own insatiable and consumptive wants. 

The Founding Fathers – for the most part schooled in Greek, Latin and classical thought – believed that citizens must be well informed to participate in a true democracy. So, naturally, they stressed the importance of education. Toward this end, Jefferson and Franklin were among the first to open free libraries stocked with their own books. (The word ‘library’, incidentally, derives from the Latin liber, meaning both ‘book’ and ‘free’.) The signatories to the Declaration of Independence feared that a nation lacking books and education would degenerate into a ‘mobocracy’ motivated by, in the words of the Federalist Alexander Hamilton, the ‘grossest errors of misinformation and passion’. 

This was the same fear expressed long before by Plato. For this reason, in his utopian Republic, he replaces democracy with a sophocracy (a state ruled by ‘the wise’) led by a dispassionate Philosopher–King. Less pessimistic than Plato about the character and intelligence of the common man, though, the Founding Fathers declined such a coronation and instead gave power to the people. And yet, nearly two hundred and fifty years later, in 2021, American democracy barely survived a mob insurrection triggered by an unscrupulous populist who had once declared, ‘I love the poorly educated!’ 

It is perhaps more significant than is generally realised that, in his bid for re-election at a recent rally, Donald Trump, real estate developer and TV reality star turned Commander in Chief, threatened to shutter the Department of Education. At the same time, Trump pledged to expand funding for religious schools, some of which advocate teaching such things as Creationism and banning books such as 1984, Brave New World, and The Handmaid’s Tale. It is also revealing to find that one of the books most targeted by these schools is Thomas Paine’s The Age of Reason. Another is the Analects of Confucius. Here, to be sure, Trump taps into a deeper vein of American anti-intellectualism with pre-revolutionary roots in religion, as well as into the antipathy to reason voiced by Martin Luther when he called it the enemy of faith and ‘the devil’s whore’. Or, as the Massachusetts fire-and-brimstone Puritan, John Cotton, thundered: ‘The more learned and witty you be, the more fit to act for Satan will you be!’ 

It was despite such reservations about an unschooled citizenry that James Madison composed the American Bill of Rights. Again, it is slightly ironic that the American version followed an English precursor, in this case being inspired by the thirteenth-century English Magna Carta that had made all free men equal while preventing royal tyranny. (At least in theory.) Madison’s first two amendments were the most consequential: those for freedom of speech and a right to bear arms. The combination is significant because, although Edward Bulwer-Lytton would write that ‘The pen is mightier than the sword’, for millennia writers’ hands had been chopped off or their tongues cut out in accordance with Babylonian, Biblical, and Inquisition law. The Age of Reason was all about free expression and thought, even if the grim historic fact remained: as Otto von Bismarck put it, world affairs are never decided by ‘speeches’ but by ‘blood and iron’. 

Granted, words may trigger wars, but swords and bullets finish them. Patrick Henry’s battle cry – ‘Give me liberty or give me death’ – delivered in a speech to the Second Virginia Convention on March 23, 1775, at Saint John’s Church in Richmond, Virginia, was stirring, but it was the blood of British soldiers that birthed American freedom. Similarly, Inquisitors’ sermons about the wrath of God may have been disturbing, but it was the rack, the stake, and the chopping block that brought the message home to anyone tempted to think, much less express themselves, independently.

Yet, today, the first two amendments to the US Constitution are fundamentally at odds: freedom of speech rests on reason and persuasion; freedom to bear arms summons a violence, often deadly, that silences speech. Originally, in the wake of the Revolution, the second freedom was defensive and collective, approving a well-regulated militia for national security; under might-makes-right aggressors and intimidators, it has become offensive, giving individual vigilantism and terrorism free rein. Meanwhile, even as AR-15s, not muskets, have become the leading cause of death for American children, that champion of deregulated gun rights, the National Rifle Association, has for decades enjoyed a tax-free status otherwise reserved for religious and charitable causes and for organisations opposed to cruelty to children and animals.

Of course, totalitarians have always relied most heavily on weaponry while exploiting words for propaganda (in other words, ‘alternative facts’ and hence alternative reality). Stalin killed millions in the Great Purge; Mao famously declared that political power ‘grows out of the barrel of a gun’, and his ‘Great Leap Forward’ later claimed the lives of millions more.

Against this, Mahatma Gandhi, India’s apostle of nonviolence, once said: ‘No one can hurt me without me giving them permission’. Forty years later, Martin Luther King, who called the Indian freedom fighter his ‘guiding light’, broadened the context by saying: ‘No man is free unless every man is free’. However, as with Lincoln, Sadat, and the Kennedys, Second Amendment bullets can put an end to lofty First Amendment words. And it seems that today many citizens who feel their country and traditional liberties endangered by secularists, immigrants and/or an imagined New World Order find themselves in a militant might-is-right frame of mind. They call themselves pro-life even as they remain staunchly pro-gun, pro-war, and pro-capital punishment. In 2024, polls show that a third of the population of Jefferson’s land of the free now want Trump, their defeated commander in chief, to return not only as their ‘justice’ but also as their ‘revenge’. Some even call him a messiah who will cast their elitist enemies into Eternal Fire. 

As for Trump himself, it is alarming to read in a 1990 Vanity Fair feature that Ivana Trump once revealed that her former husband kept at his bedside a collection of Hitler’s speeches, entitled My New Order. Ever since then, Trump has been at pains to deny reading the book, although his speeches certainly echo Hitler’s toxic nationalism, even talking of migrants ‘poisoning the blood of our country!’ 

Hitler himself is known to have read and studied Niccolò Machiavelli’s The Prince (as did Stalin and, indeed, the Founding Fathers). Claiming that ‘it is safer to be feared than loved’, the Renaissance diplomat and historian, considering people ‘fickle, hypocritical and greedy’, advised leaders to ‘caress’ the submissive and ‘crush’ the troublesome, all the while ‘appearing to be religious’. Cynically adopting this policy, Hitler demanded, ‘Who says I am not under the special protection of God?’ 

Today, Trump claims to be the voice for the voiceless. Saying he has ‘the best words’ and boasting of his ‘high IQ’, he calls outsiders and critics animals, vermin and scum before urging his followers to ‘get wild’ and ‘knock the crap out of them’. Infuriated with ‘fake news’, the professional prevaricator launched Truth Social, a website which soon became a forum for death threats against his enemies – congressmen, critics, prosecutors, judges, and even his own former Vice-President! During the January 2021 insurrection, the Chosen One’s Christian soldiers wore ‘GOD, GUNS, TRUMP’ hats, AR-15 pins – and concealed weapons in black, leather Bible holsters. (Months earlier, Trump the firebrand had energised his followers by holding a Bible – upside-down – at Saint John’s church, as police used tear gas and violent riot-control techniques to disperse peaceful Black Lives Matter protesters, in what was widely condemned as the use of excessive force and an affront to First Amendment rights to freedom of assembly.)

Ironically, even as he enflamed the MAGA mob with his rhetoric and shaking fists, Trump himself nearly shared the fate of the Kennedys and Martin Luther King when a lone gunman attempted to shoot him. But, saved by a sudden turn of the head during his speech, and sporting an ear patch thereafter, he managed to turn the sniper’s wound into a stigmatic red badge of courage. Like his devout predecessor (and another champion of gun rights) Ronald Reagan, Trump claimed God himself had stepped in to protect him from the assassin’s bullet.

All of which goes to say that, today, America is more fractured and endangered than at any other time in its history, save the Civil War. As in that fraught time, the post-Independence fellowship among citizens has given way to fear – and even hatred. The world will soon discover whether enlightened Jeffersonian democracy will prevail and strengthen under the very different leadership of Kamala Harris – who would be America’s first woman President – or whether the work of the Founding Fathers will be undone by a cult leader who exploits the most irrational and tribal instincts of its people. 

Thursday, 15 August 2024

REVIEW ARTICLE: Humor and Its Philosophical Keystones

HUMOR AND ITS PHILOSOPHICAL KEYSTONES

By Keith Tidman

Funny philosophers, as envisaged by Zolumio for ‘The Ah-Ha Moment’

Humor is one of those rare things in the world that is universal, though what is funny is seen at least a little differently by everyone. Plato was particularly grumpy about the subject, proposing that the guardians of utopia, who rule the city, should never laugh. But for most of us, a sense of humor is always valued in a person, whether a spouse, a friend, a work colleague, an academic, or a public figure. Humor is a welcome palliative for all that ails us. It appears to be deep-seated in human nature.

The oldest surviving book of jokes, called Philogelos (which translates from the ancient Greek as “laughter lover”), dates back to roughly the 5th century CE — a compilation of some 265 probably preexisting jokes of uncertain attribution. Some of the jokes are wittier than others, and some admittedly aren’t at all funny by today’s standards, but their place in history gives them a respected cachet. That said, of course, humor, and jokes in particular, presumably extend much farther back into prehistory.

Humor can be subtle or brash, or found hovering anywhere in-between. Through its seemingly hard-wired ubiquity and variety, there’s plenty for all tastes — often effortlessly crossing cultural borders, on other occasions stopping hard in its tracks at the land’s edge to accommodate local norms, or sometimes even irreverently challenging them.

 

We might observe that humor and philosophy share similar aspirations, especially in the realm concerned with the exploratory disassembly and reassembly of reality, where insights are sovereign. Of course, different motives linger under the mantle, ready to surprise. And yet most philosophers, going back to the ancient world, gave humor only passing discussion, largely obscured as asides within other topics. Arguably the first major philosopher to treat humor as a central theme was the Frenchman Henri Bergson, in his 1900 work bearing the to-the-point title Laughter. This, in its time, was a bestseller and a key philosophical reference.

 

But back to humor’s omnipresence: where would the following joke fit, for example? 

The heir of a rich relative wished to arrange for an imposing funeral, but he lamented that he could not properly succeed because the more money he gave the mourners to look sad, the more cheerful they looked. 

Everywhere? Nowhere? At least multiple somewheres? That, by the way, is said to have been one of Immanuel Kant’s favorite jokes, which he used to exemplify the particular theory of humor to which he subscribed, which I’ll get to in a moment.

 

And so to an entertaining, soon-to-be-released book, The Ah-Ha Moment: Exploring Philosophical Ideas Through Jokes and Puzzles* by philosopher and author Martin Cohen. This invites us to renavigate — or perhaps, out of curiosity, to navigate for the first time — what’s funny, how, and why. At its core, the book’s aim seems to be to revisit what might tickle our own funny bones: humor that might on occasion ruffle feathers in annoyingly or amusingly provocative ways, while for other people causing a full-on, hard-to-stop belly laugh.

 

This might be a good place, before digging into theories of humor, to catalogue something that should be obvious, but might not be: the many benefits of humor, starting with joy, camaraderie, enhanced attention, stress reduction, lightened burdens, physical and mental health, alertness, anger release, stronger relationships, improved mood, the defusing of conflict, happiness, confidence, coping with loss, creativity, release of inhibitions, the defusing of defensiveness, acceptance of one’s own imperfections, confronting challenges, and more.

 

Yet, somehow, these prime benefits just listed, which are perhaps more the stuff of sociology and psychology, seem to have escaped the notice of many earlier philosophers, some of whom simply distrusted humor and laughter. 

 

Several theories of humor have been explored over the centuries; of all these, it’s fair to say that standing front and center among the pack is what’s called the “incongruity theory.”

 

Incongruity theory says that a joke acquires its funniness by first setting up an expectation and then winding up with a surprisingly contrary, even absurd, ending. A concise description of this theory came, perhaps unexpectedly, from the otherwise solemn Kant, who stated in his Critique of Judgment that “In everything that is to excite a lively laugh there must be something absurd.” No bone to pick with that.

 

Laughter is a response to that “absurdity.” It is the ah-ha incongruity between what’s expected and what’s actually delivered. Anything that conflicts with expectation according to, say, some familiar standard, can be funny. Often we don’t anticipate the imminent humor, which hangs back unassumingly at first, only to emerge from a cogent play on words or ideas. 

 

For example, as shared with us by Martin Cohen, let’s take a line, basted in dry humor, delivered by the charmingly self-effacing former president, Ronald Reagan. In response to a question, he pokes fun at his own age and reputation for laziness, saying, “I have left orders to be awakened at any time in case of a national emergency … even if I’m in a Cabinet meeting.” 

 

A buildup ends with a punchline in which are embedded elements of fun, self-effacement, harmless dissonance, pleasant silliness, and unanticipated surprise. Our expectation of what Reagan was about to reveal at first led us down a different path, the humor disguised in a cloak of faked seriousness.

 

In short, there are occasions, primarily when we are upset for other reasons, when life’s inconsistencies might simply distress us further. However, when our thoughts are darting around and in the mood to think playfully, those same inconsistencies may appear in a different, more palatable, even amusing light. When what we anticipate from the setting-up of a joke is infringed in some inoffensive manner, in a situation where one feels secure, the result is often a laugh. 

 

According to Arthur Schopenhauer, this distinguishing of safe incongruity is essential for humor. And today the incongruity theory overwhelmingly prevails in philosophy’s and psychology’s explanations of the what, how, and why of humor — which is why Cohen’s book devotes ample space to examples of how it works.

 

An alternative approach, somewhat less princely than that of incongruity, is the “superiority theory” of humor, favored by René Descartes. Thomas Hobbes was also an ardent proponent of this take on humor, saying, in Human Nature, that:

 “…the passion of laughter is nothing else but sudden glory arising from some sudden conception of some eminency in ourselves.”

Put simply, Hobbes is saying that when we tell a joke, we’re poking fun at someone or something, as in what Americans today might call a roast: not making someone the butt of mean-spirited jokes but, in a form of ironic adoration, exaggerating the funnier peccadilloes of the person celebrated. Possibly the most effective form of superiority theory is making fun of oneself; the glee embedded in self-deprecation (self-roasting, let’s say) nearly always works and makes for safe, inoffensive fodder, a modesty appreciated by an audience.

 

Let’s turn now to another form of humor, based in relief theory, which plays on how humor influences the overall mood of the listener, particularly an emotionally wound-up listener. John Dewey was a subscriber to this theory. His opinion was that laughter “marks the ending … of a period of suspense, or expectation.” It is a “sudden relaxation of strain.... The laugh is thus a phenomenon of the same general kind as the sigh of relief.”


Sigmund Freud offered a story, this time of a prisoner being escorted to his execution on a Monday morning. While on his way to the room of doom, the prisoner quips, “Well, this is a good beginning to the week.” Tension is teed up at first, as we squirm at least a little uneasily over the pending execution. However, the criminal’s unanticipated light-hearted reaction to his circumstance, and perhaps the gentle laughter that ensues, lessens that unease.

 

Another story concerns a construction worker hurled into the air when a planned detonation happened too soon. Because the stunned worker fell back to earth far from the construction site, his pay was cut for the half-day’s “absence from his place of employment.” Freud suggested that, by the end of the story, we might chuckle to release the tension of initially sympathizing with the worker’s misfortune. We therefore ultimately laugh at the incident rather than continue to fret over it.

 

Bertrand Russell once noted that “The point of philosophy is to start with something so simple as not to seem worth stating, and to end with something so paradoxical that no one will believe it.” Might some forms of humor similarly playfully make that paradoxical transition, too?

 

Among Russell’s many thought-provoking one-liners, which represent a particular delivery mode of humor he clearly seemed to relish, were these zingers, in modern parlance.

“There are two motives for reading a book; one, that you enjoy it; and the other that you can boast about it.” 

“Most people would sooner die than think; in fact, they do so.” 

“I would never die for my beliefs, because I might be wrong.” 

“There is no reason to worry about mere size.... Sir Isaac Newton was very much smaller than a hippopotamus, but we do not on that account value him any less.” 

Likely a combination of the theories of humor discussed in The Ah-Ha Moment, along with yet others, accounts for what makes remarks and stories comical. The explanations and examples are eclectic, steeped in history, culture, and personal experience. Clearly, the nature of humor — its role, efficacy, and appropriateness — has undergone many generations of twists and turns in the years since the stodgy Stoic philosopher Epictetus urged people to “let not your laughter be loud, frequent, or unrestrained.”

 

Thank goodness we’re a long way from that kind of stuffiness. I wonder how Epictetus would react if transported through time to sit in front of today’s standup comedians, like Sarah Silverman and Chris Rock, or to episodes of Fawlty Towers, Seinfeld, and The Office, or to movies like Bridesmaids and Beetlejuice. Or, switching media, perhaps to entirely different genres of comedy — including the world’s rich body of literary humor, of course, as well as the amazingly brazen political cartoons of Thomas Nast — that likewise reflect people’s and cultures’ assorted predilections.

 

The Ah-Ha Moment: Exploring Philosophical Ideas Through Jokes and Puzzles splendidly introduces us to much of that and more: the eclecticism, the subjectivity, and the what, how, and why of humor through the ages, all the way up to the contemporaneously comical — with examples of jokes abundantly inserted throughout the prose (for those “ah-ha!” moments). Oh, and there is a closing “how-to” account of being funny.

 

_______________________


Cover of The Ah-Ha Moment, by Martin Cohen


* Martin Cohen, The Ah-Ha Moment: Exploring Philosophical Ideas Through Jokes and Puzzles. Austin Macauley Publishers (September 2024), https://www.amazon.com/Ah-Ha-Moment-Exploring-Philosophical-through/dp/1685628982.

 

Friday, 1 March 2024

REVIEW ARTICLE: Another Side of “Reason and Argument”

 From The Philosopher CXII No. 1 Spring 2024


Gorgias, a Sophist who taught the art of rhetoric

Robin Reames takes a look at the Ancient Art of Thinking for Yourself


This is a timely and very readable look at the art of rhetoric – as seen from a philosophical perspective. That’s not surprising, as the word ‘rhetoric’ comes from the Greek meaning simply ‘speech’. However, as Robin Reames explains, rhetoric is a very special kind of speech. She even calls it a ‘metalanguage’ – one that describes and explains how words work.

Specifically, rhetoric is a metalanguage that describes the various moves that make words effective and persuasive (or not). Terms that we really only associate with English literature classes such as alliteration, onomatopoeia, allegory, metaphor, simile and so on, Reames points out, originated in the ancient works of rhetoric, where they were understood as identifying tools useful to persuade and influence. 

Take metaphors, for example. These devices are powerful. And they are everywhere. In fact, even when we try to dissect rhetoric and discuss argument, we often end up employing them, particularly the metaphor of war. We may speak of: 

defending a position 

shooting down an argument 

attacking an opponent 

taking a side

Not to forget, there are the constant ‘Wars’ on drugs, or on terror. Even against ‘climate change’!

More generally, though, rhetoricians understand that the human ear is naturally drawn to certain things. It is attracted to rhythm. It likes repetition, yet needs surprise and spontaneity. At times it can be tickled by pauses, at times it detests them. It can be attracted by vivid description; at other times it wants crisp logic. The art is knowing which style to use when.

In Ancient Greece, and for thousands of years after, the study of how to craft language and deliver persuasive speeches largely consisted of the examination of historical and literary texts and great works of oratory, along with learning the jargon of the different figures of speech. However, at the end of the nineteenth century, perhaps because writing had replaced speaking as the dominant form of education, rhetoric receded from view. 

Reames seems ambivalent about whether this is a good thing or not. A theme often returned to in the book is how rhetoric led to the fall of Athens, after fiery speeches by the likes of Callias and Alcibiades encouraged the Greeks to vote for a disastrous war on Sicily. The book dwells at length on examples of recent political rhetoric leading to bad outcomes. There is a long, indeed overlong, look at the ‘birther’ controversy (the one about where Barack Obama was born, a dispute potentially affecting his right to be elected President of the US).

At times, Reames reflects a general academic preference for logic, seen as the highest form of thinking and hence of arguing too. She says:

‘…all of it began with Aristotle. Aristotle conceived of logic as the study of how claims and conclusions of all kinds are proved or justified, and he developed logic and rhetoric side by side to highlight all the ways people produce persuasive arguments and proofs. As you might guess, he did this so that people could make better, more solid, and more reliable arguments.’

Rhetoric, by contrast, is seen as deceptive, sneaky, dishonest. Reames writes:

‘… a Sophist could easily supply Callias the necessary verbal weapons to convince himself and others that greed is good, that self-interest is in the common interest, that profligacy is parsimony, and so on.’

And Reames recalls Protagoras’s advice to Socrates: ‘Let’s face it, ordinary people never notice anything anyway; they just repeat whatever’s dictated to them by the powerful.’ More subtle is a point about rhetoric’s employment of emotions as opposed to facts. She writes:

‘Facts are by definition falsifiable, so if something is claimed as a matter of fact, at least in theory it can be disproven or shown not to have happened. This makes them highly vulnerable once they’re used in rhetoric… In contrast to facts, we think of values and emotions as… personal, as things we possess or own. Emotions ebb and flow involuntarily. They are chimeral. Values are context and culture specific. They vary from person to person, society to society. …Because of the relativity and subjectivity of values and emotions, we naturally assume that they are equally relative, subjective, and malleable when they are used in rhetoric. But, in fact, the reverse is true. Compared to facts, values and emotions have tremendous staying power once they are introduced in rhetoric.’

And yet:

‘High ideals like truth, goodness, and even choice have no particular content of their own; they don’t come attached to specific material realities. They can mean different things to different people in different contexts… The flexibility and pliability of values like freedom and choice is why so many of us agree that these are indispensably valuable, while nevertheless disagreeing passionately about specific issues and policies where those values are at stake.’

However, the Ancient rhetoricians recognised that it is hopeless to come to any agreement about what to do about a problem if we do not understand where our views diverge in the first place. And so they used questions to identify where two points of view diverge; to shift the focus from what should be done to what gives rise to disagreements in the first place… to clarify what, exactly, our disagreement is about. Cicero identified only four kinds of questions:

  • Question of fact: Does the problem exist? Has it occurred? Does the issue need to be considered? 
  • Question of definition: What kind of problem is it? How should the issue be defined? What category, genre, or discipline does it belong in? 
  • Question of quality: What is the qualitative value of the problem? How serious is the problem? How urgent? Does it need immediate attention, or can it be dealt with at a later date? 
  • Question of policy: What should be done? What action should be taken? 


As Reames puts it:

‘On the most contentious issues, our discussions woefully neglect fact questions and definition questions. If ancient rhetoricians are right, we’d have a better chance of reaching agreements about policy questions if we took the time to ask those questions.’

Reames credits one of philosophy’s understated female figures, Aspasia, with promoting the use of questions as a debating technique – the method that we conventionally link with Socrates.

‘Though she’s not commonly remembered in this way, it was actually Aspasia who taught rhetoric to the philosopher Socrates. It was from Aspasia that Socrates learned the dialectic method of question and answer for which he is famous. He favored Aspasia’s method precisely “because he wished to present no arguments himself, but preferred to get a result from the material which the interlocutor had given him—a result the interlocutor was bound to approve as following necessarily from what he had already granted”.’

The weakest part of the book comes, perhaps paradoxically, where it strays into rhetoric itself. There are sections designed to show the foolishness of the author’s father, who subscribed to views about Covid and Global Warming that the author seems to consider self-evidently ridiculous. Her own views are presented as rational arguments, yet are at root irrational – subjective rather than objective. For example, she mocks her father for buying a stock of incandescent lightbulbs rather than following the official requirements to move to LED bulbs to (presumably) save the planet.

‘He valued freedom and choice so highly that he was willing to spend hundreds of his meagre Social Security income [per month] on it.’

And:

‘Energy companies stood to make enormous profits by maximizing the belief that, in spending what little money they have on light bulbs and paying hundreds a month for electricity in their homes, people like my dad were, in fact, exercising their “freedom.” ’

I rather sympathise with her father here, though, as I find the light from LEDs cold and flickery. And I know that the cost case in LEDs’ favour is rather smaller than she implies. If her Dad had ten incandescent bulbs, running each of them for 100 hours a month, it would still only cost about six dollars. Not “hundreds a month”. Here’s the boring math:

A 60W incandescent bulb uses 60 kilowatt-hours (kWh) of electricity every 1,000 hours, so ten such bulbs running for 100 hours each also consume 60 kWh. At the rate of $0.11 per kWh, that comes to $6.60.

A 12W LED bulb uses 12 kWh of electricity every 1,000 hours, so ten LED bulbs running for 100 hours each consume 12 kWh. At the same rate, that comes to $1.32.
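
For anyone who wants to double-check, here is a minimal sketch of the same sums in Python (the helper monthly_cost is hypothetical; the figures – ten bulbs, 100 hours a month each, $0.11 per kWh – are simply the assumptions above, not anything from the book):

    # Rough check of the lightbulb running costs discussed above.
    def monthly_cost(watts_per_bulb, bulbs=10, hours=100, rate_per_kwh=0.11):
        # watts x bulbs x hours gives watt-hours; divide by 1,000 for kWh
        kwh = watts_per_bulb * bulbs * hours / 1000
        return kwh * rate_per_kwh  # dollars per month

    print(f"Ten 60W incandescent bulbs: ${monthly_cost(60):.2f}")  # $6.60
    print(f"Ten 12W LED bulbs: ${monthly_cost(12):.2f}")  # $1.32

Either way, the monthly totals run to single dollars, not hundreds.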

All of which underlines the real truth about rhetoric, which is that, yes, facts are boring, while stories, whether true or not, are compelling. That said, this book is a timely look at a vital but neglected aspect of social life.


Reviewed by Martin Cohen


 
 

The Ancient Art of Thinking for Yourself: The Power of Rhetoric in Polarized Times

By Robin Reames 

Basic Books, New York, 2024

ISBNs: 9781541603974 (hardcover), 9781541603981 (ebook)

Thursday, 30 November 2023

Self-Consciousness: the Battle between Science and Philosophy




By David Comfort

Thinkers have debated many questions about the nature of man, the most basic of which are closely related existential ones such as: What is the self? What is consciousness? And a myriad of further questions arises, such as: is it possible for some advanced kind of consciousness to survive the death of its mortal envelope – the self? 

Broadly, there are two opposing perspectives: the immaterial and the material. The first school of thought is dominated by metaphysical philosophers (from the Ancient idealists through to the more modern panpsychists); the second by empirical scientists. Metaphysicians regard consciousness as a transcendent faculty of the human mind, separable from the self; materialists regard it as a neural phenomenon of the physical brain, inseparable from the self. Metaphysicians assert that the body is an idea of the mind; materialists assert that the mind is a sensation of the body.

Step back a moment and consider the terminology. In Latin, con means ‘together’; scio, ‘to know’. Etymologically, then, consciousness in the self is an overriding sixth sense that analyses and organises phenomena for the purpose of knowing. Moreover, embodied awareness is only as integrating and potent as its five servant senses and its neurological horsepower. 

The subject of higher consciousness in animals has always been controversial. Some argue it is nonexistent, while others allow that it does exist, but only as a primitive survivalist awareness. ‘Wary’ creatures survive by being aware of danger. Self-preservation, then, might be said to be the mother of sentience and rudimentary consciousness. But humans seem to possess an advanced form: we are aware of being aware. So introspective individuals can, at least theoretically, study their own consciousness. But, in doing so, they may find themselves in a reflection-on-reflection-on-reflection rabbit hole leading to infinite regress.

In ancient Greece, the Oracle of Delphi established the goal of all thought with a simple but difficult mandate: Man, know thyself; then thou shalt know the Universe and God. This didn’t mean focusing on the impermanent outer layers of the multi-levelled self – the physical, social, emotional, or psychological – but penetrating to its core: the spirit that transcends mortality and individuality. So, Plato said: ‘All philosophy is training for death.’ 

Through logic, logos, and purpose-discerning intelligence, telos, Plato and Aristotle identified soul – what they called psyche or anima – as the eternal essence of the self. Had they known about the later concept of consciousness, they might have considered it no different from the higher self’s anima.

Perhaps the first Western philosopher to venture a tentative definition of consciousness was John Locke. In his Essay Concerning Human Understanding (1690), the British Empiricist identified it as ‘the perception of what passes in a man's own mind’. But since ‘perception’ is basically a synonym for consciousness, the definition is circular. Fifty years earlier, the body/mind dualist, Descartes, had declared: ‘I think therefore I am’. Locke might have rephrased this instead as: ‘I am conscious, therefore I exist’. 

Cracking the mystery of consciousness – what it is, how it works, and whether it can outlive its default object, the self – became a primary focus for philosophers at the turn of the 20th century, when the phenomenologists Husserl and Heidegger made a case for a ‘transcendental’ form, while the existentialist Sartre countered by arguing that ‘consciousness is self-consciousness’. At the same time, the psychological nature of the self was first systematically analysed by Sigmund Freud, who introduced the ego-id-superego trinity, and by his young colleague, Carl Jung, who began to plumb the unconscious as a repository of mythic and dream archetypes.

To understand the nature of self and consciousness, their complementary functions in the individual mind must be studied. To organise, understand and predict, the mind divides the world into objects, then analyses them according to their apparent causes. Dividing phenomena requires negation: X is X but not Z. So, equation and negation are the definitive abilities of the discriminating conscious mind. Most importantly, negation creates the two interconnected dimensions of human life and cognition, the building blocks of the self: 3D Space (I am here, not there), and 3D Time (my present is not my past, my past is not my future).

‘The only reason for time is so that everything doesn’t happen at once’, said Einstein, before proceeding to argue that time is not an absolute reality, but a mental construct affected by the motion of the observer relative to the observed. The kinetic present is nothing more than “everything happening at once”: it overwhelms the mind, preempting static objective thought. Without the idea of beginning and end, and without memory of the past and imagination of the future, the mind drowns in the disorder of the here and now.  

So, while the body lives in space, the arena of movement, the mind lives in time, the measure of movement. Since time measures space (in light years), physicists collapse the two dimensions into one: Space–Time. Telescopes are time machines: looking into spatial distance, they peer into the past. When consciousness is ‘heightened’ in mystical or psychedelic states, space distorts or even dissolves, while time slows or even stops.

Modern physics renders imaginary the idea of a centered, stationary object or subject and a fixed point of view. Indeed, the body itself becomes a hive of hyperactive nerve activity. Outside, in global space, it spins at 1,000 mph while riding the earth’s merry-go-round at 67,000 mph around the sun, which itself circles the center of the Milky Way at 450,000 mph. Nevertheless, the self remains body-centric, anchored in an illusory I-am-where-I-am spatial identity. Even so, while pondering consciousness, Locke concluded that the body is not so much ‘physical’ as the conscious sensation of the physical. Hence, even from his materialist point of view, he regarded the body’s assumed solidity and independent material reality as an unfounded conclusion. 

Some regard Heraclitus as one of the first materialists, at least in contrast to the aethereal Platonists. ‘No man ever steps into the same river twice’, he declared. From this, he concluded that all being is becoming, an idea that surely applies to the stream of consciousness. So, is everything, both physical and mental, indeed change – impermanence – and hence impossible to pin down and truly identify? In Einstein’s everything-happens-at-once pure present, the question is only valid in abstract time – when comparing a present river or consciousness to a past river or consciousness. In reality, the one-change ‘uni-verse’ is not all change but all motion, the manifestation of energy. To mentally break it into matter-in-motion is to replace a real kinetic with an imaginary static.

Heraclitus’ critic, Parmenides, argued that reality is indivisible as well as timeless, making change an illusion. Much later, Isaac Newton, working on his laws of motion, claimed that time and space are a priori (and thus ‘pre-time’) absolutes. Newton’s contemporary, Bishop Berkeley, challenged the premise in De Motu (On Motion), insisting that motion is the absolute, God-caused reality, while time and space are a posteriori human abstractions relative to it. And today, three centuries later, quantum physicists regard everything as an energy wave or vibration. The idea of the ‘particle’ – and even the seemingly contradictory massless particle, the light photon – helps them escape mental chaos. Yet, as the father of quantum mechanics, Niels Bohr, pointed out, ‘Everything we call real is made of things that cannot be regarded as real. A physicist is just an atom’s way of looking at itself’.

Viewing self-consciousness in this light, then, could the compound term represent something that is not real but, instead, just a conceptual aid or verbal convenience? If it is more than that, is consciousness the self’s way of looking at itself, or is the self consciousness’s way of looking at itself? Either way, which came first, and which causes which, becomes a chicken-or-egg question – or, if neither came first, a question of simultaneous birth. 

Which, though, is it? Imagine consciousness as the mind’s flash camera. Depending on the F-stop and shutter speed, a nano-second separates the click/light flash (present) from an awareness of a developed photo (now representing a past sensory or mental event): that processing and re-cognition delay, that freeze-frame, creates time. Since we experience by being conscious of experiencing, our consciousness Polaroid stores its experience photos in the self’s temporal lobe headquarters. This artificial, subjectivised reality is the basis of the time-bound ego-sphere that dies when an individual’s embodied time is up. After self-purging the mind, some Eastern and Western mystics claim to have returned to the original universe lifeboat and entered eternal, disembodied consciousness. 

Shortly before his death, Einstein wrote to a friend mourning the loss of his young son: ‘A human being is a spatially and temporally limited piece of the whole, what we call the “Universe”. He experiences himself and his feelings as separate from the rest, an optical illusion of his consciousness. The quest for liberation from this bondage [or illusion] is the only object of true religion.’ 

Individual consciousness expresses itself in symbolic language. The Word. This is the mind camera’s film, capturing sense or cerebral experience. So the hub of the five senses, the head, becomes a micro movie theatre of past and projected future images, complete with a running commentary voiced by its director: the ego. The ‘I’. The practical outcome of this abstraction is that, when viewing a present object, the mind also sees its composite idea of it based on past perceptions and understandings, an idea expressed by an identifying word – whether cow, cloud, cosmos, or whatever.

The first to believe that ‘The Word creates all things’ were the Egyptians, who referred to their priests’ writing as ‘the speech of the gods’. Early Christians adopted the idea: ‘In the beginning was the Word, and the Word was with God, and the Word was God’ (John 1:1). The first job God gave Adam was to name the Eden animals, and ‘…whatever the man called each living creature, that was its name’ (Genesis 2:19). Thus, language became the foundation of the self’s conceptual universe. Words are structured according to grammar, which itself reflects the mind’s own structure. In physics terms, nouns are substantial and static; verbs are waves and kinetic. Nouns come in cases that indicate their function: subjective, objective, possessive. Verbs come in tenses that indicate their time: present, past, future. 

Locke’s notion of consciousness as a seer presupposes a seen and thus creates subject/object dualism. Again, a person’s primary consciousness is self-consciousness. If consciousness is posited as the essence of a man, then he becomes schizophrenic: both the seeing subject – I – and the self-reflected seen object – Me. This divide leads to a daunting question: to comprehend what it is, can the seeing consciousness make itself into a seen object without becoming other than what it intrinsically is – the seer? In fact, self and consciousness seem in such close orbit that it is difficult to know which circles which, or if one reflects the other, or whether they are a two-way mirror.

A person is considered to be an individual, meaning ‘undivided’. Though the self may indeed seem unitary, to understand it, anyone trying to ‘Know Thyself’ becomes a spelunker of its layers, crust to core. The first level: consciousness of the body and its five senses. Second: of desire and emotion. Third: thought. Fourth: spirit, soul, or being. Materialists mostly live in the first and second levels; conceptualists in the third; mystics in the fourth. 

As Schopenhauer pointed out, the engine of the self, for most, is found in the second layer: will and desire. ‘My entire philosophy can be summarised in one expression: the world is the self-knowledge of the Will’, as he told a colleague. Predicated on the future, the ego’s Will creates time and turns life into a suspended animation awaiting future gratification. Will becomes both a captain of consciousness and its corrective lens, or rose-coloured glasses. It concentrates awareness on what it wants, while filtering out or airbrushing what it doesn’t. As time passes, the lens gets thicker, more opaque and more distorted, while the man behind it still insists he has 20/20 vision.

All creatures possess the will to survive and reproduce. Humans go a step further, striving for well-being in love, fortune, fame, and/or power. But desire is the itch that increases the itchiness: even the rare person who seems to have it all often wants more. Either way, whatever the self wants provides purpose and meaning to its life. In this sense, consciousness, being intentional, is governed by teleology. So the mystery becomes: after the inanimate-to-animate evolution of things, climaxing in mortal consciousness, where did self-will come from?

The question can’t be answered unless the age-old free-will versus determinism debate is resolved. Yet such a resolution seems unlikely, since philosophers on both sides of the issue often present as logical conclusions what are ill-concealed presumptions. For centuries, mystics have taught that to be truly free – to achieve transcendental consciousness – one must escape bondage to the time-bound, desire-driven ego with all its attachments and anxieties. The few contemplatives who succeed realise that this self is a causa sui I-llusion. To return to the original primal self born of cosmic force – whether it be called divinity by Westerners or dharma by Easterners – mystics have for ages practised self-reflection and self-mortification in many forms.

Given that the self (illusory or not) operates conceptually and wilfully in space and time (mental projections or not), it is definable psychologically and philosophically. Any attempt to define consciousness, however, entails formidable problems, since any definition is only as good as the consensus behind it. The more abstract and intangible the word-concept (God, Soul, Being, Truth, etc.), the greater the likelihood of a vague, subjective, and/or arbitrary definition. In the case of consciousness, the definition varies according to the disciplinary bias of the definer: the materialist scientist rejects subjectivity; the metaphysician embraces it. Thus their definitions will never agree – and each bias, materialist and immaterialist alike, is problematic in its own right.

A century ago, Einstein energised the material, mechanistic Newtonian universe with E = mc², showing that supposedly ‘solid’ matter is in fact pure energy, compressed by invisible forces (gravitational, electromagnetic, and/or nuclear). Indeed, theorists see the early cosmos as pure, unbound energy, with stable atoms forming only hundreds of thousands of years after the Big Bang.
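To get a feel for the scale of that compression (a back-of-the-envelope illustration added here for concreteness, not a figure from the essay), consider one gram of matter fully converted to energy. In LaTeX notation:

    \[ E = mc^2 \;\Rightarrow\; E_{1\,\mathrm{g}} = (10^{-3}\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^{2} = 9 \times 10^{13}\,\mathrm{J} \]

which is on the order of the energy released by the Hiroshima bomb – locked up in a single gram.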

Until the 20th century, scientists mostly studied matter macrocosmically. Then they turned their attention to the microscopic – to what the materialist Democritus first called the atom, Greek for ‘indivisible’. To their amazement, they discovered that it is divisible after all – into protons, neutrons, and electrons, the first two themselves composed of quarks bound together by gluons. To their alarm, they discovered that the atomic world seemed to operate according to completely different rules than the macro world, rendering micro reality seemingly random – governed by chance if not by science’s mortal enemy, chaos or entropy. Einstein protested, ‘God does not play dice!’

More disturbing to the father of relativity was Niels Bohr’s principle of quantum complementarity, along with the Observer Effect. The first holds that the position of protean matter can be measured in space, or its speed measured in time – but not both simultaneously with full precision. The second holds that the observer, through the very act of observation, changes the observed object. In short, what we perceive is never the object in and of itself, but our interaction with it. Thus the object has no independent reality, making scientific ‘objectivity’ an illusion, at least in the quantum realm.

Taking Bohr’s complementarity and the Observer Effect into account, Werner Heisenberg derived the Uncertainty Principle in 1927 – a major blow to the historic goal of science, certainty: researchers were now reduced to conjecture based on probability. Kant had argued long before, in The Metaphysical Foundations of Natural Science, that the science of the mind shouldn’t be based on introspection since ‘even the observation itself alters and distorts the state of the object observed’. Given that the synaptic brain is animated by the three quantum forces – electromagnetic, strong and weak nuclear – shouldn’t it be studied according to quantum principles? If so, then in the act of observing the self, consciousness cannot know what the self is independently, just as consciousness can’t know itself independently through reflection.
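Heisenberg’s principle is usually stated in its position–momentum form (given here for concreteness; the notation is the standard textbook one, not the essay’s):

    \[ \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2} \]

where \(\Delta x\) and \(\Delta p\) are the statistical spreads in position and momentum and \(\hbar\) is the reduced Planck constant: the more sharply the one is fixed, the blurrier the other becomes.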

Coincidentally, in the same year Heisenberg introduced uncertainty, Heidegger published Being and Time and confessed: ‘Philosophy constantly remains in the perilous neighborhood of supreme uncertainty’. Later, the philosopher, renowned for his obscurity, embraced mystery, writing that ‘Making itself intelligible is suicide for philosophy’. Cynics, sceptics, and fallibilists had said much the same thing centuries before due to the subjectivity of metaphysics and its ambiguous words and concepts. 

Striving for precision and clarity beyond words, scientists invented a new language: mathematics. Using geometry for positions in space, and calculus for movements in time, it seemed an intellectual panacea. ‘Number rules all!’ proclaimed Pythagoras. ‘Mathematics is the language with which God wrote the universe’, seconded Galileo, going so far as to recommend, ‘Measure what is measurable, and make measurable what is not so’. 

But in a universe of numerical quantities and intangible qualities, math can measure and organise by equation only the first, while ignoring the second. True, numbers can represent the degree of a quality – say, pain or spiciness on a 1 to 10 scale – but they can’t reveal the subjective experience of a specific kind of pain or taste, much less of self-consciousness. ‘Laws of numbers assume there are identical things, but in fact nothing is identical with anything else’, asserted Nietzsche, adding that logic itself is far too abstract, arbitrary, and simple to handle the complicated real world of quality or qualia – the unique nature of a thing, which philosophers call ‘quiddity’ and Buddhists ‘suchness’.

Charles Darwin, himself an encyclopaedist of the rich variety of nature with its countless evolving species, is said to have quipped: ‘A mathematician is a blind man in a dark room looking for a black cat which isn’t there’. By contrast, early in his career, Bertrand Russell called math the ‘chief source of the belief in exact truth’. But later, perhaps overwhelmed by surreal, irrational, and imaginary numbers, not to mention the gadflies of infinity and zero, he began to question this exactness. Math, he concluded, ‘may be defined as a subject in which we never know what we are talking about, nor whether what we are saying is true’. Especially where the self is concerned: for mathematicians, the symbol i represents the imaginary unit, the square root of minus one.

The contemporary philosopher Daniel Dennett calls consciousness ‘the last surviving mystery… confusing to most sophisticated thinkers’. In his book, ambitiously entitled Consciousness Explained (1991), he defines it as the sum of physical brain activities and calculations. He dismisses subjective qualia as ‘brain pranksters’ and concludes that humans are soulless computing machines, no different from ‘complex zombies’ or AI robots. Challenging this notion, the Australian philosopher David Chalmers has argued that materialists, in their quest for certainty, ignore the ontological elephant in the room – the ‘hard problem’ of qualia consciousness: ‘what it is like to be a human’, or more precisely, a unique self. But since studying the self leads to the slippery slope of subjectivity and solipsism, preempting objectivity, consciousness materialists avoid it.

In Toward a Neurobiological Theory of Consciousness (1990), another consciousness expert, Christof Koch – with his co-author, Francis Crick (the recipient of a Nobel prize, with James Watson, for discovering the structure of DNA) – argued that awareness can be reduced to ‘a pack of neurons’. Later, Koch, a lapsed Catholic, challenged Chalmers: ‘Why don’t you just say that when you have a brain the Holy Ghost comes down and makes you conscious?’ Then he bet his rival that, within 25 years, science would solve the mystery of consciousness by identifying all its neural correlates. In 2023, he graciously conceded defeat.

Since his collaboration with Crick (whom some called the ‘Mephistopheles of Materialism’), Koch has come to support the most synthetic of the four leading math-based neurological theories of consciousness: Integrated Information Theory, or IIT. Developed in 2004 by the Italian psychiatrist and sleep expert Giulio Tononi, it combines materialist and immaterialist elements to conclude that consciousness inheres not just in brains but in all matter in the universe. Today, neuroscientists are divided on IIT: supporters call the theory ‘promising’; detractors contemptuously dismiss it as pseudoscience due to its untestability. But as is well known, in the history of science, experiment has often lagged far behind theory, especially theory born exclusively of higher math.
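IIT’s central quantity, Φ (‘phi’), gauges how much information a system carries over and above what its parts carry separately. Computing Φ proper requires minimising over every possible partition of a system and is intractable for anything brain-sized; the little Python sketch below is emphatically not Tononi’s algorithm, just a toy (with invented numbers) showing the simpler, related idea of information present in a whole yet invisible in its parts, measured as mutual information.

    import math

    def entropy(probs):
        """Shannon entropy (in bits) of a discrete distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A toy two-node system whose binary states are strongly correlated.
    # (The probabilities are invented, purely for illustration.)
    joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

    # Marginal distribution of each node considered on its own.
    p_a = [sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)]
    p_b = [sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)]

    # Mutual information I(A;B) = H(A) + H(B) - H(A,B): information
    # carried by the whole that neither part reveals alone.
    mi = entropy(p_a) + entropy(p_b) - entropy(list(joint.values()))
    print(f"Information beyond the parts: {mi:.3f} bits")  # about 0.531

For two uncorrelated nodes the figure drops to zero – one crude sense in which ‘integration’ can be quantified, and a faint shadow of what IIT attempts at full scale.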

The root problem becomes clear: in viewing the human mind as no more than a computing physical brain, the strict materialist gets lost in the chips, circuitry, and motherboard, while missing the essential thing: the immaterial energy that animates and connects all the component parts in synergy. Materialists seem to regard this energy as a product of the magic meat in the skull, rather than the other way around. Thus, for physicalists, consciousness, brain, and self are inseparable, so death – unconsciousness – is an absolute, and any idea of a numinous afterlife, much less a cosmic consciousness independent of the physical self, becomes occult nonsense.

And so, today, in large part due to the Immaterialist/Materialist divide, philosophy and science are opposed fields. Until Newton, though, there was no such distinction – every philosopher was also a scientist, and the ambition of each was to figure out as much as possible from every perspective. Arguably the most ambitious, such as the omnivorous Aristotle, strove to understand everything material and immaterial, though they had no illusions about the difficulty. They hoped to arrive at what scientists today call The Theory of Everything – TOE for short. For physicists, a TOE would unify the four recognised forces – gravity, electromagnetism, and the weak and strong nuclear forces – proving each to be a different expression of one master force. But an all-encompassing TOE would have to connect every branch of knowledge and even solve the mystery of consciousness.

Einstein, who strove to ‘read the mind of God’, was among the first modern TOE aspirants, and he did not neglect the vehicle of human understanding itself: consciousness. ‘No problem can be solved from the same level of consciousness that created it’, he said – a truism still ignored by many. A student of philosophy, he went on: ‘Reality is merely an illusion, albeit a very persistent one’. As for the strictly mechanical approach to physics’ mysteries, the father of relativity, an accomplished violinist, added, ‘What’s the use of describing a Beethoven symphony in terms of air-pressure waves?’

Einstein claimed that the only thing ‘incomprehensible’ about the universe was that it was comprehensible, and he held ‘wonder’ in the highest regard, calling it the ‘most beautiful thing’ and the source of all true science. Since all systems of knowledge depend on open-mindedness to every possibility, inflexible certainty has always been the enemy of progress.

Again, as other physicists and metaphysicians have suggested, self-based ‘reality’ is not truly real but, for many reasons, illusory. Matter – the body – is not solid but compressed energy constantly changing form, while the space/time foundation of matter is a human invention for cognitive order in a chaotic, kinetic cosmos. Even with it, scientific objectivity is preempted by uncertainty and the observer’s altering effect on the observed. Moreover, abandoning imprecise words and concepts for the numeric language of math has not helped neurologists resolve these issues, since applying quantitative measurement and equation to a qualitative, subjective self-consciousness is like trying to force a square peg into a round hole. So the only solution to the problem is to regard the idea of matter as a product of consciousness, not the other way around. And if consciousness is regarded as a unifying energy then, by the law of the Conservation of Energy, it never dies but reifies itself in ever-changing forms.

As we have seen, self and consciousness are born together and act in concert: consciousness imparts to the self its idea of materiality in space/time, while its mortal envelope – the self – imparts to consciousness its focus, will, and purpose. And so, to bring our discussion to a conclusion, it seems that if a theory of everything is indeed possible, progress will only be made once materialist scientists and immaterialist philosophers set aside their biases and begin to collaborate in a marvellous symbiosis. What that will look like, we do not yet really know.



David Comfort’s essays appear in Pleiades, Montreal Review, Evergreen Review, Pennsylvania Literary Journal, Stanford Arts Review, Johns Hopkins' Dr. T.J. Eckleburg Review, Juked and Free Inquiry. He is also the author of The Rock & Roll Book of the Dead (Citadel/Kensington), The Insider’s Guide to Publishing (Writer’s Digest Books), and three other popular nonfiction titles from Simon & Schuster.  

David can be contacted at dbeco@comcast.net.

Monday, 21 August 2023

REVIEW ARTICLE: Quantum Mechanics and the Rigor of Angels

 From The Philosopher CXI No. 1 Spring 2023

A young, fresh-faced Werner Heisenberg

Borges, Heisenberg, Kant, and the Ultimate Nature of Reality


William Egginton’s quest to make sense of life, the universe and everything is ambitious but ultimately unsuccessful. Unsuccessful? Yes, I know that sounds harsh, but then Egginton seeks not only to make sense of the mysteries of quantum physics, something the physicists abjectly fail to do, but finally to pin down the essential secrets of reality – something the philosophers have likewise made a poor fist of over the centuries.

Part of the reason Egginton himself makes little progress, though, is that he doesn’t see either group as having failed. Rather, he sees his role as more that of a cultural critic, picking out the best bits from the more mediocre.

For Egginton, there is essentially one key issue: whether reality exists ‘out there’, fixed and eternal, or whether it is rather a shifting miasma, a theatre performance in which the actors (say, atoms) subtly respond to their audience (you and me and the scientist with a particle detector over there in the corner). Plato, we may summarise, largely emphasises the former view – although he certainly acknowledged the paradoxes it brought with it. Indeed, he suggests in some of his writing that reality is best approached through poetry and the imagination rather than through practical experiments. But Egginton is no great fan of Plato; instead he eulogises Immanuel Kant, whom he often prefaces with the adjective ‘great’.

Actually, many of the traditional pantheon of philosophers are introduced like this: there’s “the great John Stuart Mill”, “the great French thinker” René Descartes, and Hegel, “the greatest German thinker of the nineteenth century”. All of them, though, rank slightly beneath that “great font of German wisdom”, Immanuel Kant. Kant, you see, intuited that the world scientists observe is not entirely independent of their gaze. It is, instead, the product of the way they look at it, coloured by the perceptual spectacles they wear.

It is a good point, but one that could equally well be attributed to Plato, or Zeno – let alone the “gloomy Scot”, David Hume, author of “that great book, the Treatise of Human Nature”. The danger with this kind of praise for the philosophers is not so much that it is grating (ahem, “greating”), but that it is uncritical. You see, it is important to remember that Kant actually held many views, and clearly some of his theories were just plain daft. Famously, he thought that all the planets in the solar system had life on them, with the intelligence of their inhabitants related to their distance from the sun.

Indeed, in the Critique of Pure Reason – the “famed” Critique of Pure Reason – he occupies himself with the “inhabitants of the planets”, a happy speculation that is, of course, completely groundless. The point is that Kant’s writings should not be consumed uncritically – and while Egginton provides a rather fine overview of the philosopher’s oeuvre, it is flawed by the apparent assumption of the brilliance of all Kant’s words. And Kant is a big part of the book, as the subtitle plainly indicates.

The same issue, with bells on, concerns Jorge Luis Borges. Why should this writer, excellent fabulist as he certainly was, be taken as a guide to quantum mechanics? On the face of it, it’s implausible – especially as no one actually understands quantum physics. That’s not just me sniping: Egginton himself acknowledges the words of the physicist and Nobel laureate Richard Feynman, who once wrote, “I think I can safely say that nobody really understands quantum mechanics”. To read Borges as a guide to QM is a bit like reading Winnie the Pooh as a guide to Taoist philosophy, as Benjamin Hoff did in The Tao of Pooh. Only Hoff’s book was a joke!

Mind you, I was recently a speaker on a panel discussing “the nature of the universe”, alongside two quantum physicists, and they insisted that they did understand it. The problem was simply (they said) that average Joes lack an intuitive understanding of the beautiful and complex mathematics underlying the subject. You know, things like the extra dimensions quantum theory works with daily. How many dimensions are there, according to quantum physics, you might ask? Well, ten – a mere ten, we might say – dimensions are used to describe superstring theory, while eleven can describe supergravity and M-theory. But as Wikipedia helpfully explains, “the state-space of quantum mechanics is an infinite-dimensional function space”.

The theoretical physicist Roger Penrose has queried the logical coherence of such airy mathematical talk of multiple dimensions, yet, as I say, many “experts” insist that it all makes perfect sense, albeit being hard to explain without complex mathematics. At least Egginton doesn’t go down that rabbit hole. There is next to no maths in this book, even though his third major character, Werner Heisenberg, made his contributions in just this “toe-curling” area. As Egginton puts it: “The uncertainty principle, as it came to be known, showed with inescapable, mathematical precision that … full knowledge of the present moment wasn’t just hard to pin down; it was actually impossible.”

Which explains why, to paraphrase Borges, the rules that govern the world are the man-made, artificial ones of chess, not the heavenly ones of the angels. So let’s give the last word to Egginton, who has produced an account that is always highly original, often insightful, and only in places rather difficult and obscure.

“There is rigor there, indeed. But to see that we are the chess masters who made it, we must let the angels go. And that, it seems, is the hardest task of all.”


Reviewed by Martin Cohen



The Rigor of Angels: Borges, Heisenberg, Kant, and the Ultimate Nature of Reality

By William Egginton 

Pantheon, New York, 2023