Russia and the West: what is the Endgame?

Reading Time: 4 minutes

In light of the poisoning of the former Russian spy Sergei Skripal and his daughter Yulia, and the tit-for-tat expulsions of diplomats by NATO countries and Russia that followed, one might begin to wonder: what exactly is the endgame of this increasing confrontation between Russia and the West?

The number of Russian diplomats that have just been expelled from NATO countries

When Prime Minister Theresa May announced the initial expulsion of 23 Russian diplomats, identified by British authorities as “undeclared intelligence officers”, the mood in the House of Commons was one of apparently sincere moral outrage. How dare Russia carry out such an attack on our soil! How can we punish the Putin regime to the maximum extent? From what most of the MPs had to say, one might conclude that their concerns were solely around the human rights record of Putin’s Russia.

Few would deny that Vladimir Putin is an authoritarian leader with scant regard for human rights, but there is no shortage of such leaders today on the world stage. The United Kingdom happily sells arms to Saudi Arabia, an absolute monarchy carrying out a brutal war on the Houthi rebels in Yemen and inflicting a terrible cost on that country’s civilian population. Britain rolls out the red carpet for Xi Jinping, now likely to be China’s dictator-for-life.

Perhaps the problem is that Putin’s ruthlessness is not confined to Russia’s borders, and extends as far as Britain’s shores. This argument lacks geopolitical perspective, however. Even after assurances were given to the Russians that NATO would not expand any further into what was formerly Russia’s sphere of influence, that organisation, redundant after the fall of Soviet Communism, went on to expand right up to the Russian border with Estonia and Latvia. While both the West and Russia have interfered in Ukraine’s affairs, to the detriment of Ukrainians, it should be obvious that the latter has a much greater interest in what transpires in that country. Even in further-flung Syria, Russia has many military assets as well as a decades-old alliance with the Assad regime. The West, meanwhile, has effectively supported radical Islamist groups in an effort to overthrow Assad, having failed to learn from its mistakes in Iraq and Libya.

The West cannot accuse Russia of playing realpolitik when it does so itself, under the guise of humanitarian intervention or democracy promotion. Put another way, it is not military intervention per se that the Western establishment opposes; it is military intervention by an outside power, in service of its national interest rather than that of “the liberal international order”. Russia as a nation state stubbornly refuses to go quietly into the night and let the “end of history” dawn.

Even the poisoning of the Skripals looks more like a pretext for anti-Russia policies than their real cause. Skripal was a double agent, and spies put their lives on the line. More inexcusable is the careless way in which he was targeted, putting his daughter’s life as well as other innocent lives at risk. However, one has to be rather naïve to think that MI6 has never endangered or harmed any innocent people during its various activities abroad.

What really seems to lie behind the hostility towards Russia is the overblown idea–verging on a conspiracy theory–that without malicious Russian cyber-activity, Brexit, the Trump presidency, Catalan separatism, virtually any unwelcome political development in the West that one cares to name… would not have happened. This is not to deny that Russia tried to interfere in elections by spreading propaganda. The United States has long engaged in similar activity and continues to do so to this day, as former CIA director James Woolsey admitted in a recent television interview. The problem, however, is that Russia is blamed disproportionately or even entirely for certain political outcomes that might have come to pass anyway. Russia did not create the economic grievances, social divisions and culture wars that characterise the contemporary West; the most that can be said is that it–along with many other actors–moved to exploit them where possible. In other words, Russia is a scapegoat. The process of scapegoating is driven by sentiment rather than reason. Unfortunately, this means that the current policy of confrontation with Russia has not been rationally thought out. If it had been, it would have been seen for the madness that it is.

Firstly, it has not in fact been proven that the Salisbury attack was carried out by “the Russian state”. If indeed it was not, and the British government rushed to judgement, this would be highly irresponsible – especially given the stakes of a conflict with Russia. The fact that Jeremy Corbyn has pointed this out does not make it false.

Secondly, if tensions with Russia continue to escalate, it is difficult to see how they will be ratcheted down. One cannot simultaneously believe that Putin is an evil genius and that he will simply back down after the next round of sanctions. The more he is pushed into a corner, the more likely he is to take military action of some kind. And the truth is that the West no longer has much of a stomach for that. If this is true even of a military superpower like the United States, which now does much of its fighting with drones and avoids putting its own citizens in harm’s way, how much more so must it be true of the United Kingdom?

If Russia is as wicked an adversary as many in the Western elite appear to believe, why antagonise this nuclear power further? How can anyone imagine that embarking on such a course of action will end well?

3 April Update: Gary Aitkenhead, the head of Britain’s military research centre, the Defence Science and Technology Laboratory, was “unable yet to say whether the military-grade nerve agent that poisoned a Russian double-agent last month had been produced in Russia.”

Cryptocurrencies and the True Source of Value

Reading Time: 5 minutes

One of the arguments against Bitcoin and cryptocurrencies in general is that they do not represent true value. Behind the crypto-algorithms, according to this line of argument, is really nothing that could objectively be considered currency; indeed, nothing at all. Hence, cryptocurrencies are a bubble which is bound to burst. This is not merely a man-on-the-street opinion; it has been espoused by the billionaire investor Howard Marks, who predicted the “dotcom bubble” of the 1990s. “In my view, digital currencies are nothing but an unfounded fad (or perhaps even a pyramid scheme), based on a willingness to ascribe value to something that has little or none beyond what people will pay for it,” Marks said in 2017. Marks underscored the point with historical precedent, citing the notorious “tulip mania” that gripped the Netherlands in the 17th century. In 1637, at the height of the mania, a single tulip bulb could be worth up to ten times the annual income of a skilled craftsman.

Lydian coin. Inscription reads “I am the sign of Phanes”. Electrum (alloy of gold and silver), length: 2.3 cm. Late 7th century BCE, found at Ephesus. Israel Museum, Jerusalem.

One might wish to consider other historical precedents, however. Currencies per se are a surprisingly recent invention in human history. According to the archaeological record, the first coins were used in Lydia (present-day Turkey) in the 7th century BCE (see image above). This is long after the rise of cities and kingdoms, and indeed after the successful smelting of metals, including gold, silver and bronze; it is even some 500 years after the commencement of the Iron Age in the Middle East. We also know that this was not due to a lack of technical engraving ability, since many small metal seals with intricate designs have been found dating from many centuries prior to the 7th-century Lydian coins (see image below).

Seal of Tarkummuwa, King of Mera. Silver (diameter: 4.2 cm). c. 1400 BCE, found at Smyrna. Walters Art Gallery, Baltimore.

The anthropologist David Graeber has provided an interesting explanation of why coinage was eventually developed. Coins were not initially used by most ordinary people, he argues. The available archaeological evidence shows that the first coins were used by soldiers. This makes sense, Graeber argues, when we consider that ancient rulers had to find a reliable way of feeding armies at the frontiers of their empires. If the soldiers were stationed inland, he points out, it would be extremely difficult to move large amounts of grain or other foodstuffs with them. If, however, standardised coins could be minted and given to soldiers, the soldiers would be able to buy the necessary food from the ruler’s civilian subjects in these far-flung parts of the empire. By taxing his subjects, the ruler would then see these metallic tokens of value return to him. Coins thus began as a more efficient way of feeding armies, but once they acquired universally recognised value within the state, they could be applied to any economic transaction.

In order to be hard to forge, coins had to be minted out of rare metals by skilled craftsmen. But even gold, silver and copper, which were used for the earliest coins, have no intrinsic value, as Israeli historian Yuval Noah Harari points out – “you can’t eat it, or fashion tools or weapons out of it.” The lesson here is that no form of currency has value above and beyond what we ascribe to it, collectively, as human beings. Thus, it will not do to dismiss a cryptocurrency, as Marks does, because it has no value beyond what people will pay for it (this is not to say, of course, that other arguments against cryptocurrencies fail; only that this particular line of argument is unconvincing). One might well imagine an ancient Lydian exclaiming, “These bits of metal with their fancy designs and inscriptions have no real value. The whole fraud will surely collapse after the king dies.” And yet, as we now know, it did not turn out that way. The coins had value because enough people came to believe that they did and that was all that mattered.

We have since abandoned the gold standard, although only relatively recently, in 1971, making way for the US dollar as the world’s reserve currency. One could even argue, as some have, that the dollar is a less reliable store of value than either gold or Bitcoin, because the US Federal Reserve can simply print as many units as it sees fit – and indeed, in the rounds of quantitative easing since the 2008 crash, it has created an unprecedented number of them. The amount of gold in the world runs up against physical limitations, whereas the amount of Bitcoin runs up against mathematical ones. While it is true that other cryptocurrencies can avoid the same limitations that appear to be built into Bitcoin, matters such as the total number of units to be issued and the value of each unit relative to everything else still depend on the vital criterion of consensus by the community of users. Notice that, technological aspects aside, this criterion also applied to the very first currencies used by our species. While it is true that the first coins were issued by rulers in a top-down fashion, these rulers did not realise that they had brought into being a monetary system that would soon escape their control. As Graeber also notes, after appearing in Lydia, coinage soon emerged independently in different parts of the world. This meant that when different empires came into contact with each other, they had to arrive at a fair exchange rate. If the empires were of roughly equal power, this could not be determined by either of their rulers and was determined instead by market factors beyond any one individual’s control. Exchange rates between different official currencies have thus continued to fluctuate from ancient until modern times.
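
To make the “mathematical limitation” on Bitcoin’s supply concrete, the sketch below simply sums the coin’s published issuance schedule: an initial reward of 50 bitcoins per block, halved every 210,000 blocks. It is a deliberate simplification (the real protocol rounds rewards down to whole satoshis), but it shows why the total supply can never exceed roughly 21 million units, however high demand climbs.

```python
# Simplified sketch of Bitcoin's issuance schedule (initial reward of 50 BTC,
# halved every 210,000 blocks). Ignores the protocol's rounding to whole
# satoshis, so the resulting figure is approximate.

INITIAL_REWARD = 50          # BTC created per block at launch
HALVING_INTERVAL = 210_000   # blocks between reward halvings
SATOSHI = 1e-8               # the smallest unit of a bitcoin

def maximum_supply():
    """Add up the rewards of every halving era until the reward falls below one satoshi."""
    supply, reward = 0.0, float(INITIAL_REWARD)
    while reward >= SATOSHI:
        supply += reward * HALVING_INTERVAL
        reward /= 2
    return supply

print(f"Approximate maximum supply: {maximum_supply():,.0f} BTC")  # ~21,000,000
```

However high the price goes, no central authority can decide to “print” more units; the ceiling is baked into the rules that all participants have agreed to run.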

Bitcoin and other cryptocurrencies could indeed be seen as the next logical step: prior to their emergence, the only “non-physical” medium of exchange resembling a truly global currency was the IMF’s “Special Drawing Rights” or SDRs, although as their name suggests these have only been issued and used in exceptional circumstances. Better yet, unlike SDRs, cryptocurrencies are not controlled centrally in any way. Instead, they are designed to bypass both governments and banks. All they require is a public ledger, the blockchain, to keep track of all transactional information. Governments and banks understandably find this frustrating and will likely do all they can to bring cryptocurrencies under their control. In this respect, however, they may resemble a Lydian king who tries to fix the prices of various commodities, only to find his attempts frustrated by his subjects, who find roundabout ways to buy or sell commodities at market prices.
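
As a rough illustration of what such a public ledger involves (a toy example, emphatically not the code of Bitcoin or any other real cryptocurrency), the snippet below chains blocks of transactions together by hashing: each block records the hash of the one before it, so anyone holding a copy of the ledger can detect tampering with past entries without trusting a government or a bank.

```python
# Toy public ledger: each block commits to the hash of the previous block,
# so altering an old transaction breaks every later link in the chain.
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, transactions):
    """Append a block that references the hash of the previous block (or zeros for the first)."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

ledger = []
add_block(ledger, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(ledger, [{"from": "bob", "to": "carol", "amount": 2}])

# Any participant can verify the whole history by re-checking the links.
for i in range(1, len(ledger)):
    assert ledger[i]["prev_hash"] == block_hash(ledger[i - 1])
print(f"Ledger verified: {len(ledger)} blocks")
```

Real cryptocurrencies add a great deal on top of this (proof-of-work mining, a peer-to-peer network, and so on), but the basic trick of making the transaction history publicly checkable is the same.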

The fact of the matter is that we are now all living in a global economy, and cryptocurrencies have beaten the IMF to the finish line of establishing imaginary units of value that are created (or “mined”), recognised and used globally. One or even all of them may collapse eventually, but the point is that such an event cannot be brought about by governments or banks. The technology is now out there, as is the will to avoid the fiats of governments or banks. And if they do collapse irreversibly, that is not necessarily good news for fiat currencies. The need for an independent global currency will likely persist even in their absence, perhaps leading to a return to something like the gold standard. In any event, when we go back to the very root of currencies and what makes them valuable, we may well discover a counter-intuitive (at least, to some) truth: that both gold and cryptocurrencies are better placed as stores of value than fiat currencies, such as the pound, dollar or euro.

There is No Solution to the Problem of “Fake News”

Reading Time: 4 minutes

In the aftermath of the 2016 election, the term “fake news”, seldom heard previously, became ubiquitous. This was, of course, no coincidence: the unexpected victory of Donald Trump cried out for an explanation, and invoking the concept was one such attempt by the president’s many critics, who could not bring themselves to face the possibility that he won fairly. As one conservative commentator saw it, “just as progressive ideas were being rejected by voters across the western world, the media suddenly discovered a glitch which explained why. Fake news is the new false consciousness.” But the dissemination of disinformation and propaganda is as old as civilization itself. The internet is merely a new means of spreading these, and even then, not an especially new one. Consider, for instance, the anti-vaccination and “9/11 truth” movements of the preceding decades, and the role played by the internet in amplifying the noise of otherwise small groups of dedicated ideologues or charlatans. So we are still left wondering: why has the term “fake news” entered public discourse only in the last few years?

A possible answer is that the point has been reached at which traditional purveyors of news feel that they no longer have control over broader narratives. Their sounding of the alarm over “fake news” is thus a desperate rallying cry aimed at regaining this control. Some have drawn an analogy to the invention of the printing press in the 15th century, which also revolutionized the spread of information and led to the Protestant Reformation (and, of course, to disinformation, such as exaggerated accounts of the horrors of the Spanish Inquisition). From this perspective, it is futile to resist the changing ways in which information spreads. One must adapt or die. In many ways, Donald Trump, who began his presidency fighting off a cascade of “fake news” allegations, including about such petty matters as the size of his inauguration crowd, has done a better job of adapting to the new informational ecosystem. Twitter, with its 280-character limit (until recently, only 140), has turned out to be the perfect medium for a president with a reportedly short attention span. He also uses it to bypass the mainstream media and reach the public directly with his own message or narrative. And the president has masterfully turned the weapon of “fake news” around, aiming it right back at the media. At the end of 2017, his first year in office, he seemed to relish releasing “The Highly Anticipated Fake News Awards”, a list of misleading or false anti-Trump news stories, undermining the media’s insistence that it is impartial.

For all its faults, however, the mainstream media does have a legitimate point about the dangers of “fake news”. There must be an objective standard against which all purveyors of news are held and there does need to be a common set–or at least core–of facts upon which all rational parties in society can agree. But this is easier said than done, and it is far from obvious that there is a “quick fix” solution to this problem that does not merely favor one set of news purveyors over another, based on criteria other than factual accuracy. For example, many in the US fear that the Federal Communications Commission’s (FCC) proposed changes to “net neutrality” rules will give a few major companies the ability to speed up, slow down or even block access to certain web addresses or content. Comcast, for instance, is simultaneously the largest television broadcasting company, through its National Broadcasting Company (NBC) channel, and the largest internet service provider in the United States. Should the current FCC chairman’s plans to end “net neutrality” succeed, this will put Comcast in a powerful position to regulate–effectively–much of the online media landscape according to its own financial interests as a news organisation.

Social media companies such as Facebook have come under fire for spreading “fake news.” Although Mark Zuckerberg initially argued that Facebook is a tech platform and not a media company per se, he was eventually forced to concede that, whatever he had originally intended the company to be, an increasing number of people around the world did in fact get their news primarily from their Facebook newsfeed, and that Facebook therefore had “a responsibility to create an informed community and help build common understanding”. Behind this corporate newspeak must also lie a very real fear that government regulation of Facebook as a media company could end up crippling its business model. If Facebook could be held liable for the spread of false information, it would need to hire thousands of fact checkers to nip this in the bud whenever it occurs, but doing so would be far too costly for the organisation, to say nothing of the practical challenges involved. Thus, it has had to rely on very imperfect “fake news” detection algorithms and, more recently, a deliberate de-emphasis of news altogether, the idea being to return the platform to its original purpose of connecting friends and family.

But it is gradually dawning on many people that the war on “fake news” may be unwinnable. This is because there is no in-principle solution to the age-old philosophical problem of how to know what is true. If anything, this problem has become vastly more difficult now that there is an abundance of information to sort through, presented to us in a non-random–but not necessarily truth-tracking–way. We would all do well, however, to exercise greater skepticism in response to all truth claims, including official ones, such as the vague claim that Russia “hacked the election”. Skepticism does not come naturally to human beings, who are notoriously credulous. One should thus be taught to be skeptical from a young age, and to favor logical consistency and empirical evidence over other considerations when evaluating competing truth claims. This approach falls well short of a real solution, but it may help us individually and collectively to navigate the treacherous ocean of information in which we find ourselves. Hopefully, we will find ways of adjusting to our current information environment and a new equilibrium will emerge from the informational chaos. Cronycle is one platform that is ahead of the curve in this respect: it not only recognizes the problem of information overload, but provides its users with useful tools for finding the trustworthy, high quality content out there in the Wild, Wild Web.

Oprah: The Candidate of Democrat Desperation

Reading Time: 4 minutes

For the first time in a while, there was a feel-good atmosphere in Hollywood on Sunday evening as Oprah Winfrey delivered a Golden Globes speech that many described as “presidential”. She struck all the right notes about sexual harassment and rape, managing to present women–especially black women–as righteous victims and brave fighters simultaneously. She pointed the finger of blame squarely at “brutally powerful men”. And she expressed the very cognitive dissonance at the heart of liberal, Democratic America when she stressed the importance of “uncovering the absolute truth”, only to exhort all women a mere two sentences later to tell “your truth”.

After years of postmodernist preaching that has trickled all the way down from the Ivory Tower to talk shows like Oprah, the chickens finally came home to roost with the election of Donald Trump in 2016. Coincidentally, at that very moment, American liberals discovered the importance of objective truth, claiming to be on the side of the facts, against “fake news”. They mocked Kellyanne Conway’s phrase “alternative facts”, even as they continued to elevate the “lived experiences” of those belonging to minorities above other considerations.

Conor Friedersdorf, in a recent piece in The Atlantic, documents just a few of the quacks Oprah has promoted over the years: the self-help writer Louise Hay, who theorised that Holocaust victims could be paying for the sins of a previous life; the “psychic medium” John Edward, who claimed to be in communication with the dead relatives of Oprah’s audience; Suzanne Somers, who promoted oestrogen creams and vitamin supplements to prevent aging; and Jenny McCarthy, the Playboy model who swears that her child’s autism was caused by vaccination. As he notes, these people were all speaking “their truth”, which was put on an even footing–at best–with expert opinion.

The list of possible subjective “truths” does not end there, however. “Your truth”, Friedersdorf points out, could also include the notion that “immigrants are ruining America; or that the white race is inherently superior to all others; or that the rules set forth in Leviticus or the Koran are the only way to live; or that the latest Alex Jones conspiracy theory is correct; or that climate change is a hoax cooked up by liberals to gain control over all aspects of life in the United States”. In her Golden Globes speech, Oprah tells the story of Recy Taylor–a black woman gang-raped by six white men who were never brought to justice–and of the courage she showed in speaking “her truth” about this appalling crime. But would we really want to put the truth of what happened to Taylor on a par with all the other aforementioned “truths”, asks Friedersdorf.

We should not discount the value of individual subjective experiences, but at the same time we should be aware of the perils of calling these “truths” without measuring them against any objective factual standard. Of course there are many sensible Democrats who are well aware of such perils, even when these come from their own side. But one can already almost hear some of them saying, as some Republicans said of Trump: “Never mind all that. We just have to beat the other side. We just have to win!”

Ross Douthat of the New York Times has a somewhat more forgiving attitude toward Oprah, arguing that for all her faults, she is one of the few remaining unifying forces in the American cultural and spiritual landscape. At first sight, “with her Obama-endorsing, #MeToo politics and her tendency to mix spirituality with pseudoscience” she seems to belong firmly in the Democratic camp. However, he points out,

…the divide between blue-state spirituality and red-state spirituality is much more porous than other divisions in our balkanized society, and the appeal of the spiritual worldview cuts across partisan lines and racial divides. (Health-and-wealth theology is a rare pan-ethnic religious movement, as popular among blacks and Hispanics as among Americans with Joel Osteen’s skin tone, and when Oprah touts something like “The Secret,” the power-of-spiritual-thinking tract from the author Rhonda Byrne, she’s offering a theology that’s just Osteen without Jesus.) Indeed, it may be the strongest force holding our metaphysically divided country together, the soft, squishy, unifying center that keeps secularists and traditionalists from replaying the Spanish Civil War.

Is that not all the more reason for her to run in 2020, Democrats might ask. Douthat is somewhat doubtful, raising the possibility that the whole plan could backfire; that “Oprah would cease to be a figure of the spiritual center the instant she assumed a partisan mantle.” Indeed, this fear seems to be what lies behind Oprah’s own reluctance to run for any political office, let alone for president.

Yet this is a risk Democrats appear willing to take, convinced that the only way they can beat Trump is by fighting fire with fire. Moreover, the situation gets more desperate by the day, as even fervent “Never Trump” publications such as National Review and The Economist have begun to concede that the economy is booming under Trump. If the tax cuts really do end up offering financial relief for most Americans in the coming year, the Democrats may fail to retake the House of Representatives in the mid-term elections this November. 2020 is still quite far off, and much can happen in the meantime, but if all they have to offer by then is Oprah Winfrey and some Twitter-inspired slogans, the Democrats could well ensure that Trump wins a second term. This is especially likely if they double down on identity politics and political correctness, factors which drove many people to vote for Trump in 2016. Even in the best case scenario, these approaches will not help them win over any voters in the so-called “flyover states”. And given the nature of the American political system, even millions more votes in the coastal areas will not move the Democrats any closer to an Electoral College victory in 2020.

Does the Future of Religion Lie in Technology?

Reading Time: 7 minutes

Very little these days seems untouched by technology. Indeed, people’s lives are so saturated with it that they sometimes speak of “withdrawal symptoms” on those increasingly rare occasions when they find themselves without internet access. Some try to escape it at least some of the time, for instance, by “disconnecting” for a day or more when on holiday or on a retreat. Yet surely, one might think, religion is one area that remains largely untouched by technology. This is certainly true of the Amish or ultra-Orthodox Jews, who are outright suspicious of any new technology. But it is true of mainstream religion as well. The eternal “flame” in synagogues is now often electric, churches have replaced candles on their Christmas trees with electric lights, and the muezzin’s call to prayer is often amplified by a loudspeaker. These changes, however, are trivial. Ancient religions have shown themselves able to incorporate technology into their practices, without disappearing or changing beyond recognition. It seems, then, that technology does not directly threaten religion.

Nevertheless, throughout most of the Western world, the churches are empty. Declining church attendance certainly seems to be correlated with technological advancement, but is there a causal connection? Perhaps the further factor causing both is the triumph of the scientific worldview. This laid the ground for the discoveries of biological evolution and Biblical criticism–which pulled the rug out from under religion–as well as for rapid technological advancement. What makes Western societies different from non-Western ones is that the former experienced both of these processes simultaneously, whereas non-Western societies, by and large, experienced only the technological one. It is possible, however, that in the coming decades the whole world will secularise as all societies move toward the scientific worldview. The question then is whether religion dies out and mankind continues without it, or whether one or many new religions are born into the vacuum that will be left.

Already there are some indications of what these religions might look like. The Way of the Future (WotF) “church” was founded in 2015 by self-driving car engineer Anthony Levandowski, on “the realization, acceptance, and worship of a Godhead based on Artificial Intelligence (AI) developed through computer hardware and software.” Although this doctrine sounds somewhat comical and far-fetched, when one unpacks it a little, it begins to make sense. Through all the data and computing power to which it would have access, Levandowski’s AI would be, for all human intents and purposes, omniscient and omnipotent. We would believe in its ability to answer all our questions and solve all our problems, and “worship” it by doing whatever it required of us in order to perform these tasks better, including handing over all our personal data. We would also do well to show proper respect and reverence towards this AI-godhead, so that it might in turn have mercy upon us. “There are many ways people think of God, and thousands of flavors of Christianity, Judaism, Islam,” says Levandowski, “but they’re always looking at something that’s not measurable or you can’t really see or control. This time it’s different. This time you will be able to talk to God, literally, and know that it’s listening.”

The Israeli historian Yuval Noah Harari, in his book Homo Deus: A Brief History of Tomorrow, distinguishes between two main types of “techno-religions”: “techno-humanism” and “data religion” (or “dataism”). Although he uses the term “religion” more broadly than most other writers would (e.g. he considers capitalism a religion), his discussion is helpful here. WotF would fit into the former category, since it posits that humans would continue to exist and to benefit from AI. “Dataism”, however, as Harari puts it, holds that “humans have completed their cosmic task, and they should now pass the torch on to entirely new kinds of entities.” (Harari, p. 285) This is more in line with what has been called the singularity, the point at which humans either merge entirely with AI or are eliminated by it – perhaps due to their inefficiency. It is of course entirely possible that techno-humanism is only a stepping stone on the way to the singularity, in which case it is indistinguishable from data religion, but Harari leaves this possibility open. Levandowski, too, dislikes the term “singularity”, preferring to speak of a profound “transition”, the end point of which is not at all clear.

A central tenet of “dataism” is that data processing is the highest good, which means that it evaluates everything in terms of its contribution to data processing. As a further corollary, anything that impedes the flow of data is evil (Harari, p. 298). With this in mind, Harari proposes Aaron Swartz as the first martyr of “dataism”. Swartz made it his life’s mission to remove all barriers to the free flow of information. To this end, he downloaded hundreds of thousands of articles from JSTOR, intending to release these so that everyone could access them free of charge. He was consequently prosecuted and when he realised that he was facing imprisonment, committed suicide. Swartz’ “martyrdom”, however, moved his goal a step closer, when JSTOR, in response to petitions and threats from his fans and “co-religionists”, apologised for its role in his death and agreed to allow free access to much of its data (Harari, p. 310).

These ideas about future techno-religions are all interesting, but they seem to miss at least one key feature of religion, namely its continuity with the past. Much of religious belief and practice is concerned with events that occurred in the past and are re-enacted through common rituals. Nicholas Wade, in his book The Faith Instinct, argues that religion has evolved gradually throughout human history (and pre-history). According to his thesis, religion “evolved for a single reason: to further the survival of human societies. Those who administer religions should not assume they cannot be altered. To the contrary, religions are Durkheimian structures, eminently adjustable to a society’s needs.” (Wade, p. 226)

He observes that every major social or economic revolution has been accompanied by a religious one. When humans were in their most primitive state, that of hunter-gatherers, they were animists who believed that every physical thing had an associated spirit. Their rituals included dancing around campfires and taking hallucinogenic drugs in order to access the spirit world. With the agricultural revolution, humans developed calendars and religious feasts associated with the seasons. They came to worship a smaller number of “super-spirits”, or gods, often associated with agriculture, for example Demeter, the Greek goddess of the harvest. The next phase of this revolution was increasing urbanisation, which began in the Middle East. As cities gave rise to states, and states to empires, the nature of religion changed again. It needed to be organised in a unified and centralised manner, and as the Roman emperors eventually discovered, Christianity was more conducive to these requirements than paganism (in the Far East, Buddhism, and in the Near East, Islam, fulfilled much the same function). The Protestant Reformation happened at approximately the same time as the voyages of discovery and the expansion of European empires around the world. This new form of Christianity placed greater emphasis on the individual, and so ushered in capitalist free enterprise. The Industrial Revolution then followed, which was the last major revolution until the present Information Revolution, as it might be called. Yet in the approximately 200 years since then, no new (or rather, updated) religious system has emerged. As Wade suggests at the end of his book:

Maybe religion needs to undergo a second transformation, similar in scope to the transition from hunter gatherer religion to that of settled societies. In this new configuration, religion would retain all its old powers of binding people together for a common purpose, whether for morality or defense. It would touch all the senses and lift the mind. It would transcend self. And it would find a way to be equally true to emotion and to reason, to our need to belong to one another and to what has been learned of the human condition through rational inquiry. (Wade, p. 227)

One might wonder whether techno-religions would be up to the task. Notice, however, that all the previous religious transformations were gradual – so gradual, in fact, that many people who lived through them may not even have noticed them. We still see evidence of this in the many pagan traditions that were incorporated into Christianity, for example, the Christmas tree, which probably derives from the ancient Roman practice of decorating houses with evergreen plants during the winter solstice festival of Saturnalia. Ancient temples devoted to pagan gods became churches. See, for example, the Temple of Demeter pictured below, first built in 530 BCE and later rebuilt as a church in the 6th century CE. When Islam arrived on the scene in the 7th century, it did not claim to be a new religion. On the contrary, it held that the first man, Adam, was a Muslim and that everyone had subsequently strayed from the true religion, to which Muhammad would return them. It retrospectively retold all the Biblical stories in order to fit this narrative. Hagar, for instance, is a mere concubine of Abraham in the Bible, but according to Muslim tradition, she was his wife. This is important because she was the mother of Ishmael, who is considered the father of the Arabs and the ancestor of Muhammad.

The Temple of Demeter (rebuilt as a church), Naxos, Greece

The problem with techno-religions, as currently construed, is that instead of building on all this prior religious heritage, they propose to throw it out and start de novo. But human nature is not like that, at least not until we have succeeded in substantially altering it through gene editing or other technology! Human beings crave a connection with the past in order to give their lives meaning. Since religion is mainly in the business of giving meaning to human lives (setting aside the question of whether there is any objective basis to this perceived meaning), a techno-religion that tells us to forget our collective past and put our faith in data or AI is surely one of the least inspiring belief systems we have ever been offered. If, however, we could imagine techno-religions that built on our existing religious heritage, and found some way of preserving those human values and traditions that have proven timeless, perhaps by baking them into the AI or data flow in some way, these religions might be on a firmer footing.