Russia and the West: What Is the Endgame?

Reading Time: 4 minutes

In light of the poisoning of the former Russian spy Sergei Skripal and his daughter Yulia, and the tit-for-tat expulsions of diplomats by NATO countries and Russia that followed, one might begin to wonder: what exactly is the endgame of this increasing confrontation between Russia and the West?

[Chart: the number of Russian diplomats expelled from NATO countries]

When Prime Minister Theresa May announced the initial expulsion of 23 Russian diplomats, identified by British authorities as “undeclared intelligence officers”, the mood in the House of Commons was one of apparently sincere moral outrage. How dare Russia carry out such an attack on our soil! How can we punish the Putin regime to the maximum extent? From what most of the MPs had to say, one might conclude that their concerns centred solely on the human rights record of Putin’s Russia.

Few would deny that Vladimir Putin is an authoritarian leader with scant regard for human rights, but there is no shortage of such leaders today on the world stage. The United Kingdom happily sells arms to Saudi Arabia, an absolute monarchy carrying out a brutal war on the Houthi rebels in Yemen and inflicting a terrible cost on that country’s civilian population. Britain rolls out the red carpet for Xi Jinping, now likely to be China’s dictator-for-life.

Perhaps the problem is that Putin’s ruthlessness is not confined to Russia’s borders, and extends as far as Britain’s shores. This argument lacks geopolitical perspective, however. Even after assurances were given to the Russians that NATO would not expand any further into what was formerly Russia’s sphere of influence, that organisation, redundant after the fall of Soviet Communism, went on to expand right up to the Russian border with Estonia and Latvia. While both the West and Russia have interfered in Ukraine’s affairs, to the detriment of Ukrainians, it should be obvious that the latter has a much greater interest in what transpires in that country. Even in further-flung Syria, Russia has many military assets as well as a decades-old alliance with the Assad regime. The West, meanwhile, has effectively supported radical Islamist groups in an effort to overthrow Assad, having failed to learn from its mistakes in Iraq and Libya.

The West cannot accuse Russia of playing realpolitik when it does so itself, under the guise of humanitarian intervention or democracy promotion. Put another way, it is not military intervention per se that the Western establishment opposes; it is military intervention by an outside power, in service of its national interest rather than that of “the liberal international order”. Russia as a nation state stubbornly refuses to go quietly into the night and let the “end of history” dawn.

Even the poisoning of the Skripals looks more like a pretext for pursuing anti-Russia policies than their real cause. Skripal was a double agent, and spies put their lives on the line. More inexcusable is the careless way in which he was targeted, putting his daughter’s life as well as other innocent lives at risk. However, one has to be rather naïve to think that MI6 has never endangered or harmed any innocent people during its various activities abroad.

What really seems to lie behind the hostility towards Russia is the overblown idea–verging on a conspiracy theory–that without malicious Russian cyber-activity, Brexit, the Trump presidency, Catalan separatism, virtually any unwelcome political development in the West that one cares to name… would not have happened. This is not to deny that Russia tried to interfere in elections by spreading propaganda. The United States has long engaged in similar activity and continues to do so to this day, as former CIA director James Woolsey admitted in a recent television interview. The problem, however, is that Russia is blamed disproportionately or even entirely for certain political outcomes that might have come to pass anyway. Russia did not create the economic grievances, social divisions and culture wars that characterise the contemporary West; the most that can be said is that it–along with many other actors–moved to exploit them where possible. In other words, Russia is a scapegoat. Scapegoating is driven by sentiment rather than reason. Unfortunately, this means that the current policy of confrontation with Russia has not been rationally thought out. If it had been, it would have been seen for the madness that it is.

Firstly, it has not in fact been proven that the Salisbury attack was carried out by “the Russian state”. If indeed it was not, and the British government rushed to judgement, this would be highly irresponsible – especially given the stakes of a conflict with Russia. The fact that Jeremy Corbyn has pointed this out does not make it false.

Secondly, if tensions with Russia continue to escalate, it is difficult to see how they will be ratcheted down. One cannot simultaneously believe that Putin is an evil genius and that he will simply back down after the next round of sanctions. The more he is pushed into a corner, the more likely he is to take military action of some kind. And the truth is that the West no longer has much of a stomach for that. If this is true even of a military superpower like the United States, which now does much of its fighting with drones and avoids putting its own citizens in harm’s way, how much more so must it be true of the United Kingdom?

If Russia is as wicked an adversary as many in the Western elite appear to believe, why antagonise this nuclear power further? How can anyone imagine that embarking on such a course of action will end well?

3 April Update: Gary Aitkenhead, the head of Britain’s military research centre, the Defence Science and Technology Laboratory, was “unable yet to say whether the military-grade nerve agent that poisoned a Russian double-agent last month had been produced in Russia.”

#DeleteFacebook Isn’t About Data Security

Reading Time: 3 minutes

In the wake of the Cambridge Analytica scandal, Facebook might actually be in trouble. The #DeleteFacebook hashtag is trending, and it’s seen some unlikely contributors, like Blink-182 singer Mark Hoppus and Brian Acton, the co-founder of WhatsApp, which was itself sold to Facebook. Meanwhile, Facebook stock has dropped by 10% so far this week. The FTC has announced that it’s opening an investigation into Facebook’s business practices, to determine whether Facebook violated its user agreement, an infraction which would come with a hefty fine. Mark Zuckerberg hasn’t made a public statement about the matter yet, but he’s been summoned by the UK Parliament. The bad news keeps piling up.

The obvious question is whether Facebook will survive, after whatever punitive measures are dispensed. And, while it’s possible that it won’t, it’s difficult to imagine how its extinction would come about. Its users could always leave, but there’s very little individual incentive to do that, and, given that a third of the world uses Facebook, getting everybody to quit would represent a massive coordination problem. Therefore, unless Facebook is banned outright, or somehow sued into oblivion, it seems likely that it will persist, if in some sort of regulated or otherwise curtailed form.

The less obvious question is: why now? This is by no means the only data scandal that Facebook has been embroiled in. Any intelligent consumer of digital media knows very well that Facebook is harnessing their personal data, and that such data has been treated carelessly before, and used for somewhat nefarious ends. Probably the most striking example came in 2014, when PNAS published a study by researchers who quite literally played with the emotions of Facebook users to find experimental evidence of Internet-based emotional contagion. More recently, earlier in March, it was revealed that Facebook’s researchers had told advertisers that the company had figured out how to identify whether its teenage users were feeling desperate or depressed—and that this could be worthwhile marketing data. Given all of this, it’s clear that data security isn’t the primary force driving #DeleteFacebook.

It’s much more plausible that what’s behind the media conflagration isn’t data security itself, but rather the involvement of Donald Trump. Some have claimed that Cambridge Analytica was responsible for Trump’s election, having provided his campaign with personal data about voters that (maybe) offered unprecedented psychological leverage, revealing which precise people could be viably targeted by propaganda. If you’re anti-Trump, and you believe this, then your beloved social network has unwittingly engaged in a large-scale erosion of democracy, which is to say, a technologically driven coup by a candidate you don’t like.

This may not even be the case, by the way. The person who’s most loudly proclaimed that Cambridge Analytica was responsible for the election’s outcome is the now-suspended CEO of Cambridge Analytica, Alexander Nix. Ted Cruz’s campaign hired Cambridge Analytica, obviously didn’t win the election, and, as David A. Graham of The Atlantic reports, “found that CA’s products didn’t work very well, and complained that it was paying for a service that the company hadn’t yet built.” Corroborating this view is Kenneth Vogel, a New York Times reporter from its Washington bureau, who recently tweeted that Cambridge Analytica “…was (&is) an overpriced service that delivered little value to the TRUMP campaign.” He went on to claim that campaigns signed up only to secure access to the Mercer family, wealthy big-time Republican donors who are major investors in CA.

To sum up: Cambridge Analytica is only one of many organizations which have used personal Facebook data in a sinister manner, and its use of that data might have actually been inconsequential. If this is the case, #DeleteFacebook offers a clear lesson to tech companies, which is that it’s not actually important whether your product or service unscrupulously surveils its users. It’s more important to ensure that your company doesn’t give its data to anybody particularly unpopular, especially if they end up getting elected. If you sell your data to relatively unproblematic clients, you’ll probably be okay.

Does Social Media Really Polarize Our Politics?

Reading Time: 3 minutes

It’s often said that social media has a polarizing effect on our politics. And, on the surface, this narrative makes a lot of sense. The polarization of politics has continued as social media has taken over our brains. And what social media does, among other things, is make a game of earning the approval of your peers, thus solidifying your group identity. When you post something that pleases the sensibilities of your cohort—whether it’s a handsome selfie or a solemn plea for stricter gun control—you get the satisfaction of an immediate bombardment of friendly notifications. The reward structure of the social media experience doesn’t provide incentives for expressing minority views, or objecting to the prevailing narratives, or befriending those who disagree with you.

Moreover, Twitter and Facebook aren’t great places for dialogue. Political arguments are usually futile in real life, even with all of the felicitousness provided by face-to-face interaction. It’s much worse when ideological disagreements need to be reduced to 280 characters, or have to compete with cute pictures of somebody’s baby. In this setting, sensitivity and nuance don’t play well. What gets the most attention is pithiness and aggression. In short, social media enables the self-congratulation and self-separation of mutually hostile political factions. Sounds pretty polarizing, right?

Yes. However, there’s a big and obvious question here, which is whether this is actually any different from the pre-Twitter media landscape. Long before Facebook was ever a gleam in Mark Zuckerberg’s eye, the various political classes selected the media most congenial to their respective worldviews. To take America as an example, in previous decades, Christian conservatives tuned into right-wing talk radio to hear about the horrors of the gay agenda, whereas elite liberals picked up Harper’s to read about the horrors of capitalism. (This is still true today, in part.) Bubbles and echo chambers exist in the absence of Twitter. All that’s required to create ideological homogeneity is tribal self-selection or homophily—the tendency of people to hang out with people who are like them and agree with them, given freedom of association. That’s definitely a pre-iPhone tendency.

But, of course, it’s still possible that social media has enhanced tribal patterns of behaviour—that this is a difference not of kind, but of degree. So, if we check the data, what do we find? Well, it appears that social media does, in fact, have an effect on polarization. It’s just the opposite of the effect that critics might expect. According to a demographic study by Boxell et al., published by Stanford University, political polarization is actually less pronounced among demographics that use social media more often (young people, essentially). This shows that it’s unlikely that social media is a more powerful driver of polarization than old-fashioned media. (Or it shows that, even if social media does polarize, there’s some countervailing anti-polarizing force that’s much more powerful.)

And, like the just-so story about why social media polarizes, there’s an appealing readymade narrative about why the opposite might be true. While political disagreements on Twitter and Facebook tend to be shallow and nasty, they’re still genuine disagreements—something that doesn’t usually occur in traditional media. The New York Times doesn’t contain a second page declaring that all the articles on the front page are slanted. And while it’s true that debate programs are a staple of political television, such programs are usually staffed by a preexisting team who are paid to perform a predictable set of reactions to ongoing affairs. Meanwhile, on Twitter, it’s quite easy to run into novel objections to everything you believe in, which, even if they aren’t particularly convincing, might compel more considered private reflection.

Or maybe it’s even simpler than that. It’s possible that young people are less polarized because social media is so nasty and tribal. While a minority of social media influencers make a lot of provocative noise, it’s possible that the non-contributing majority is quietly alienated by the vitriol. While a controversial tweet with 1,200 retweets looks impressive, there’s no way to measure the number of users who have quietly rolled their eyes and moved on—or have simply quit Twitter altogether.

There’s a larger lesson here, which is that it’s unwise to infer narratives of societal change based simply on the most visible behaviour provoked by one app or another. (Another demonstration of this: millennials have way less sex than their parents, despite the existence of Tinder and all the moral panic surrounding it.) Ultimately, sensationalist narratives about the polarizing effects of social media are just the kind of thing that’s popular on social media.

There is No Solution to the Problem of “Fake News”

Reading Time: 4 minutes

In the aftermath of the 2016 election, the term “fake news”, seldom heard previously, became ubiquitous. This was, of course, no coincidence: the unexpected victory of Donald Trump cried out for an explanation, and invoking the concept was one such attempt by the president’s many critics, who could not bring themselves to face the possibility that he won fairly. As one conservative commentator saw it, “just as progressive ideas were being rejected by voters across the western world, the media suddenly discovered a glitch which explained why. Fake news is the new false consciousness.” But the dissemination of disinformation and propaganda is as old as civilization itself. The internet is merely a new means of spreading these, and even then, not an especially new one. Consider, for instance, the anti-vaccination and “9/11 truth” movements of the preceding decades, and the role played by the internet in amplifying the noises of otherwise small groups of dedicated ideologues or charlatans. So we are still left wondering: why only in the last few years has the term “fake news” entered public discourse?

A possible answer is that the point has been reached at which traditional purveyors of news feel that they no longer have control over broader narratives. Their sounding of the alarm over “fake news” is thus a desperate rallying cry in order to regain this control. Some have drawn an analogy to the invention of the printing press in the 15th century, which also revolutionized the spread of information and led to the Protestant Reformation (and of course, disinformation, such as exaggerated accounts of the horrors of the Spanish Inquisition). From this perspective, it is futile to resist the changing ways in which information spreads. One must adapt or die. In many ways, Donald Trump, who began his presidency fighting off a cascade of “fake news” allegations, including about such petty matters as the size of his inauguration crowd, has done a better job of adapting to the new informational ecosystem. Twitter, with its 280-character limit (until recently, only 140), has turned out to be the perfect medium for a president with a reportedly short attention span. He also uses it to bypass the mainstream media in order to reach the public directly with his own message or narrative. And the president has masterfully turned the weapon of “fake news” around, aiming it right back at the media. At the end of 2017, his first year in office, he seemed to relish releasing “The Highly Anticipated Fake News Awards”, a list of misleading or false anti-Trump news stories that undermines the media’s insistence on its own impartiality.

For all its faults, however, the mainstream media does have a legitimate point about the dangers of “fake news”. There must be an objective standard against which all purveyors of news are held and there does need to be a common set–or at least core–of facts upon which all rational parties in society can agree. But this is easier said than done, and it is far from obvious that there is a “quick fix” solution to this problem that does not merely favor one set of news purveyors over another, based on criteria other than factual accuracy. For example, many in the US fear that the Federal Communications Commission’s (FCC) proposed changes to “net neutrality” rules will give a few major companies the ability to speed up, slow down or even block access to certain web addresses or content. Comcast, for instance, is simultaneously the largest television broadcasting company, through its National Broadcasting Company (NBC) channel, and the largest internet service provider in the United States. Should the current FCC chairman’s plans to end “net neutrality” succeed, this will put Comcast in a powerful position to regulate–effectively–much of the online media landscape according to its own financial interests as a news organisation.

Social media companies such as Facebook have come under fire for spreading “fake news.” Although Mark Zuckerberg initially argued that Facebook is a tech platform and not a media company per se, he was eventually forced to concede that whatever he had originally intended the company to be, an increasing number of people around the world did in fact get their news primarily from their Facebook newsfeed and that Facebook therefore had “a responsibility to create an informed community and help build common understanding”. Behind this corporate newspeak must also lie a very real fear that government regulation of Facebook as a media company could end up crippling its business model. If Facebook could be held liable for the spread of false information, it would need to hire thousands of fact checkers to nip this in the bud whenever it occurs, but doing so would be far too costly for the organisation, to say nothing of the practical challenges involved. Thus, it has had to rely on very imperfect “fake news” detection algorithms, and more recently, a deliberate de-emphasis of news altogether, the idea behind this being to return the platform to its original purpose of connecting friends and family.

But it is gradually dawning on many people that the war on “fake news” may be unwinnable. This is because there is no in-principle solution to the age-old philosophical problem of how to know what is true. If anything, this problem has become vastly more difficult now that there is an abundance of information to sort through, presented to us in a non-random–but not necessarily truth-tracking–way. We would all do well, however, to exercise greater skepticism in response to all truth claims, including official ones, such as the vague claim that Russia “hacked the election”. Skepticism does not come naturally to human beings, who are notoriously credulous. People should thus be taught from a young age to be skeptical, and to favor logical consistency and empirical evidence over other considerations when evaluating competing truth claims. This approach falls well short of a real solution, but it may help us individually and collectively to navigate the treacherous ocean of information in which we find ourselves. Hopefully, we will find ways of adjusting to our current information environment and a new equilibrium will emerge from the informational chaos. Cronycle is one platform that is ahead of the curve in this respect: it not only recognizes the problem of information overload, but provides its users with useful tools for finding the trustworthy, high-quality content out there in the Wild, Wild Web.

Oprah: The Candidate of Democrat Desperation

Reading Time: 4 minutes

For the first time in a while, there was a feel-good atmosphere in Hollywood on Sunday evening as Oprah Winfrey delivered a Golden Globes speech that many described as “presidential”. She sounded all the right notes about sexual harassment and rape, managing to present women–especially black women–as righteous victims and brave fighters simultaneously. She pointed the finger of blame squarely at “brutally powerful men”. And she expressed the very cognitive dissonance at the heart of liberal, Democratic America when she stressed the importance of “uncovering the absolute truth”, only to exhort all women a mere two sentences later to tell “your truth”.

After years of postmodernist preaching that has trickled all the way down from the Ivory Tower to talk shows like Oprah, the chickens finally came home to roost with the election of Donald Trump in 2016. Coincidentally, at that very moment, American liberals discovered the importance of objective truth, claiming to be on the side of the facts, against “fake news”. They mocked Kellyanne Conway’s phrase “alternative facts”, even as they continued to elevate the “lived experiences” of those belonging to minorities above other considerations.

Conor Friedersdorf, in a recent piece in The Atlantic, documents just a few of the quacks Oprah has promoted over the years: the self-help writer Louise Hay, who theorised that Holocaust victims could be paying for the sins of a previous life; the “psychic medium” John Edward, who claimed to be in communication with the dead relatives of Oprah’s audience; Suzanne Somers, who promoted oestrogen creams and vitamin supplements to prevent aging; and Jenny McCarthy, the Playboy model who swears that her child’s autism was caused by vaccination. As he notes, these people were all speaking “their truth”, which was put on an even footing–at best–with expert opinion.

The list of possible subjective “truths” does not end there, however. “Your truth”, Friedersdorf points out, could also include the notion that “immigrants are ruining America; or that the white race is inherently superior to all others; or that the rules set forth in Leviticus or the Koran are the only way to live; or that the latest Alex Jones conspiracy theory is correct; or that climate change is a hoax cooked up by liberals to gain control over all aspects of life in the United States”. In her Golden Globes speech, Oprah tells the story of Recy Taylor–a black woman gang-raped by six white men who were never brought to justice–and of the courage she showed in speaking “her truth” about this appalling crime. But would we really want to put the truth of what happened to Taylor on a par with all the other aforementioned “truths”, asks Friedersdorf.

We should not discount the value of individual subjective experiences, but at the same time we should be aware of the perils of calling these “truths” without measuring them against any objective factual standard. Of course there are many sensible Democrats who are well aware of such perils, even when these come from their own side. But one can already almost hear some of them saying, as some Republicans said of Trump: “Never mind all that. We just have to beat the other side. We just have to win!”

Ross Douthat of the New York Times has a somewhat more forgiving attitude toward Oprah, arguing that for all her faults, she is one of the few remaining unifying forces in the American cultural and spiritual landscape. At first sight, “with her Obama-endorsing, #MeToo politics and her tendency to mix spirituality with pseudoscience” she seems to belong firmly in the Democratic camp. However, he points out,

…the divide between blue-state spirituality and red-state spirituality is much more porous than other divisions in our balkanized society, and the appeal of the spiritual worldview cuts across partisan lines and racial divides. (Health-and-wealth theology is a rare pan-ethnic religious movement, as popular among blacks and Hispanics as among Americans with Joel Osteen’s skin tone, and when Oprah touts something like “The Secret,” the power-of-spiritual-thinking tract from the author Rhonda Byrne, she’s offering a theology that’s just Osteen without Jesus.) Indeed, it may be the strongest force holding our metaphysically divided country together, the soft, squishy, unifying center that keeps secularists and traditionalists from replaying the Spanish Civil War.

Is that not all the more reason for her to run in 2020, Democrats might ask. Douthat is somewhat doubtful, raising the possibility that the whole plan could backfire; that “Oprah would cease to be a figure of the spiritual center the instant she assumed a partisan mantle.” Indeed, this fear seems to be what lies behind Oprah’s own reluctance to run for any political office, let alone for president.

Yet this is a risk Democrats appear willing to take, convinced that the only way they can beat Trump is by fighting fire with fire. Moreover, the situation gets more desperate by the day, as even fervent “Never Trump” publications such as National Review and The Economist have begun to concede that the economy is booming under Trump. If the tax cuts really do end up offering financial relief for most Americans in the coming year, the Democrats may fail to retake the House of Representatives in the mid-term elections this November. 2020 is still quite far off, and much can happen in the meantime, but if all they have to offer by then is Oprah Winfrey and some Twitter-inspired slogans, the Democrats could well ensure that Trump wins a second term. This is especially likely if they double down on identity politics and political correctness, factors which drove many people to vote for Trump in 2016. Even in the best case scenario, these approaches will not help them win over any voters in the so-called “flyover states”. And given the nature of the American political system, even millions more votes in the coastal areas will not move the Democrats any closer to an Electoral College victory in 2020.

Is there Hope Yet for the Rainbow Nation?

Reading Time: 9 minutes

South Africa might be seen as a microcosm of the “globalised” world itself. Home to people of African, European, Indian, Asian, and mixed descent, it nevertheless attempts to find some unity of purpose in its diversity. In 1994, with the end of white minority rule, it thus styled itself the “Rainbow Nation”: the idea that the country’s many racial and ethnic parts can make up a beautiful whole. But rainbows are also elusive, as anyone who has ever tried to find the proverbial pot of gold at the end of one will know.

Comparisons with South Africa’s northern neighbour, Zimbabwe, although imperfect, are instructive. In both cases, a numerically significant, though proportionally small, white minority held political power for a century or more. This white minority established an entire state apparatus and a modern economy, of which the black population formed the bulk of the working class. Black people, however, were disenfranchised and–especially the educated among them–began to demand political participation. This demand was increasingly difficult to resist as other African countries (with smaller settler populations) had become independent from British, French and Portuguese colonial rule by the mid-1970s. Added to the internal pressure for radical political reform was now a chorus of international condemnation, including from countries such as the United States, which itself had upheld legal segregation until only about two decades earlier. The white settler populations of Zimbabwe and South Africa withstood this pressure for as long as they could, convinced in the main that a sudden handover of power to the black majority, or rather, to its ostensible representatives, would bring about the ruin of their countries.

While this conviction was brought about by thinly veiled self-interest and old-fashioned racist attitudes, it was also grounded in a certain realism that foreign observers often lacked. This is a point that the white leaders of Zimbabwe (then Rhodesia) and South Africa tried to make repeatedly when interviewed by William F. Buckley in 1974. It is of course true that a small percentage of whites, motivated by the daily oppression and injustice they saw all around them, actively sought an end to the system of white minority rule. However, these were to be found mostly on the political left, and their proposed alternative was some form of Third World socialism or even outright Soviet-aligned Communism. It is also noteworthy that even the famous white South African Communist leader Joe Slovo once told the Afrikaner journalist Hermann Giliomee, “They [the African National Congress] are going to fuck things up, we know it,” and regarded the idea of simply handing over land to the average “Joe Tshabalala” and telling him to “start farming” as idiocy. He was prophetic, for this is almost exactly what happened in Zimbabwe after 2000, except that the beneficiary was not the average Zimbabwean “Joe Tshabalala” but a Robert Mugabe crony.

The white left put a lot of thought into how to destroy the regimes of hated figures such as Ian Smith and P.W. Botha, but very little into the admittedly difficult task of replacing white minority rule with something that would actually be better. An even smaller–or at least less influential–segment of white opinion, best represented by the late South African politician Helen Suzman, was liberal. Their view was that the state institutions and capitalist economy were not in themselves wicked, and would be legitimate if only barriers to the equal participation of the black majority were lifted and the playing field made more even for all.

This was indeed what seemed to happen in Zimbabwe at first, which may have been one reason why the international community turned a blind eye to Mugabe’s atrocities upon his coming to power. Zimbabwe differed from South Africa in that it had two rival “liberation” movements at the time of independence, Joshua Nkomo’s Zimbabwe African People’s Union (ZAPU) and Mugabe’s Zimbabwe African National Union (ZANU). The former comprised mostly Ndebele and was backed by the Soviet Union, while the latter was Shona-dominated and supported by China. After winning the first democratic election in 1980, ZANU proceeded to weaken its rival by whatever means possible, and between 1983 and 1987 the Zimbabwe National Army carried out a series of massacres of Ndebele civilians, known as the Gukurahundi, a Shona term meaning “the early rain which washes away the chaff before the spring rains.” Approximately 20,000 people were killed and many thousands tortured.

The white population was largely unaffected by this strife and scarcely a word of condemnation was heard from Western governments. A number of American universities even bestowed honorary degrees on Mugabe during the Gukurahundi. Zimbabwe continued to be called “the bread basket of Africa”, due to its being a net food exporter, and its excellent pre-independence education system remained intact.

However, due to the repressive nature of Mugabe’s regime and a gradually declining economy, political support for ZANU-PF (as the party was renamed when it merged with the remnants of ZAPU in 1987) continued to erode. Although Mugabe did all he could to rig elections and intimidate the opposition, ZANU-PF came dangerously close to losing majority support after the emergence of the Movement for Democratic Change (MDC) party in 1999. Mugabe then needed to cash in his political chips, so to speak. He needed a populist policy that would buy him enough support to win the next election in 2002. It was at this point that he settled upon the so-called “land redistribution” issue, which destroyed the country’s economy, leading to hyperinflation and the eventual adoption of the US dollar as the medium of exchange in the country.

In South Africa, the National Party government held onto power for longer, and there is some reason to believe it relinquished power only because of geopolitics. The collapse of the Soviet Union meant that apartheid South Africa could no longer make the case to the United States that it was a bulwark against the spread of Soviet influence in Africa. Nelson Mandela, meanwhile, persuaded the African National Congress (ANC) to drop the policy of nationalising key sectors of the economy in 1992. In fact, especially during the Mbeki administration (1999-2008), the opposite policy of privatisation was pursued. Several black billionaires were created through the government’s Black Economic Empowerment (BEE) policies, including Cyril Ramaphosa, who has just been elected leader of the ANC. However, government corruption, present from the beginning, increased enormously in the Zuma years (2009-present) as an entire “patronage network” emerged under the president. Large amounts of public money were looted, and the Gupta brothers, disreputable foreign businessmen, worked together with the Zuma faction of the ANC to “capture” the entire state for their mutual financial benefit.

[Photograph: Jacob Zuma and Cyril Ramaphosa just before the victory announcement, Nasrec, Johannesburg, 18 December 2017. Photo: Alon Skuy]

A joke that white South Africans have been known to tell goes as follows:

Question: What’s the difference between South Africa and Zimbabwe?

Answer: 10 years.

As it happens, more than 10 years have passed since that joke was first told, and South Africa has thus far managed to avoid Zimbabwe’s fate. There are a few reasons why this might be the case. Firstly, the ANC, for all its faults, is a very “broad church” consisting of trade unionists, Communists, black nationalists, African traditionalists, Christians, billionaire businessmen and everything in between. It has so far shown little sign of being beholden to any particular individual, the way ZANU-PF was to Mugabe for 37 years until he was forced to step down only a month ago. There is a term limit of two 5-year presidential terms, which Mbeki respected (although he resigned under pressure before the end of his second term) and Zuma shows every sign of respecting as well. RW Johnson has also noted that unlike many other African countries, South Africa has a long tradition of free speech that allows criticism of the country’s executive leadership. Secondly, South Africa is in a sense “too big to fail”. When Zimbabwe’s economy collapsed, millions of Zimbabweans were able to migrate–often illegally–to South Africa in order to find work. South Africa’s 55 million people, around four times Zimbabwe’s population, have no such option. Only the wealthiest will be able to go elsewhere. South Africa’s economy, although slow-growing, is also the third largest in Africa. Its GDP per capita is the seventh highest on the continent, although it is also one of the most unequal countries in the world, with a Gini coefficient of 0.65.

It is against this backdrop that the recent leadership contest within the ruling party took place. The difference between the two leading candidates was stark. On the one hand, Nkosazana Dlamini-Zuma, Jacob Zuma’s ex-wife, ran on promises of “Radical Economic Transformation”, which, she explained in a speech to business leaders in August, included “land restitution without compensation,” no longer allowing “the Reserve Bank [to] be policy independent from government”, an end to “White Monopoly Capital” (i.e. alleged white control of the private sector) and economic growth of 5%, to be achieved “through deliberate government intervention in a developmental state.” On the other was Cyril Ramaphosa, who spoke of a “New Deal” for South Africa, involving “the combined resources of all sectors of South Africa society: workers, business – small and big – social enterprises, informal sector and the vast resources of the state,” restoring investor confidence, “1 million new jobs” in agriculture by 2030, maintaining “fiscal discipline to ensure our resources are directed to where they have the greatest developmental impact and not diverted to servicing debt or populist projects” and “targeting 3 percent GDP growth in 2018… rising to 5 percent growth by 2023.” He exhorted South Africans to be “bold and innovative, without being reckless.”

Many South Africans were overjoyed when Ramaphosa won the leadership contest on 18 December. Judging by the performance of South Africa’s currency, the rand, after the news of his victory, so too were investors.

However, there are still some reasons to temper our optimism. Other figures recently elected to senior positions in the ANC are staunch Zuma allies, such as the party’s deputy president David Mabuza and secretary-general Ace Magashule. They can be expected to frustrate any attempts at cleaning house should these clash with their own political and financial interests. Then there is the problem that a number of Ramaphosa’s statements are not so different from those of Dlamini-Zuma. In March 2016, in his capacity as Deputy President of South Africa, he said, “For far too long, this economy has been managed by white people. That must come to an end… Those who don’t like this idea – tough for you. That is how we are proceeding.” Although he remains vague about how this goal might be achieved, and may not really believe his own rhetoric, in saying such things he seems to pander to the populist notion that white and black South Africans are competing in a zero-sum game. In a more recent speech in November, he reiterated the need to “accelerate the transfer of ownership and control of the economy to black South Africans. This is both a moral and economic imperative.” He also opined that, “Our land was stolen from our forebears leading to the destruction of the asset base of the African people resulting in the impoverishment of the black nation.” Land redistribution, he said, would be achieved through “the transfer of state farms on a commercial scale to black farmers and the accelerated transfer of agricultural land.”

This sort of rhetoric, albeit vague in certain respects, is not to be taken lightly. A taboo topic in South Africa is the increasing rate at which white farmers and their employees are being murdered. Given the particularly brutal nature of these murders, there is reason to think they are motivated by the outlook–to which Ramaphosa assents above–that the victims are living on “stolen” land that does not really belong to them. There has certainly been a lot of online hype about farm murders and disagreement about what the true murder rate is, but we do know that the number of farm murders has been rising since 1990, and that the cumulative total now stands at between 1,824 and 3,100. Given that many South African farmers–who number only in the tens of thousands and feed the entire country–fear for their lives, the least Ramaphosa could have done was condemn the violence against them and reassure them that they will be treated fairly. However, he did no such thing. Furthermore, it was both unnecessary and irresponsible of him to beat the land reform drum when, according to a survey by the South African Institute of Race Relations, only 0.4% of respondents saw “skewed land ownership” as one of “the two most serious problems unresolved since 1994” and “by the Government’s own admission, between 50% and 90% of all land reform projects have failed, the recipients of formerly successful farms failing to produce any marketable surplus.”

Another interesting account of Ramaphosa’s views in 1994 has surfaced, which should make white South Africans nervous. In his memoirs, the late Mario Oriani-Ambrosini, a South African MP from the Inkatha Freedom Party, recounts:

In his brutal honesty, Ramaphosa told me of the ANC’s 25-year strategy to deal with the whites: it would be like boiling a frog alive, which is done by raising the temperature very slowly. Being cold-blooded, the frog does not notice the slow temperature increase, but if the temperature is raised suddenly, the frog will jump out of the water. He meant that the black majority would pass laws transferring wealth, land, and economic power from white to black slowly and incrementally, until the whites lost all they had gained in South Africa, but without taking too much from them at any given time to cause them to rebel or fight.

We do not know whether Ramaphosa still cleaves to these views, or whether he has mellowed somewhat with experience–in both government and business–and age.

It cannot be denied, of course, that South Africa is in a serious situation, with such skewed wealth distribution, extreme income inequality and high unemployment. Ramaphosa’s task in addressing these problems is Herculean and unenviable. But BEE, which involves creating a handful of black billionaires like himself, has already been tried, and failed. Seizing white-owned property or wealth would be sure to fail too, on a much more catastrophic scale than in Zimbabwe (in an extraordinary turn of events, Zimbabwe’s new president, Emmerson Mnangagwa, now appears to be reversing that policy and is encouraging white farmers to return). This is not to suggest that South Africa’s white inhabitants are inherently superior to its black inhabitants; only to recognise the reality that those white South Africans who occupy senior, well-paying positions in the economy do so largely because they are the best qualified for those positions. One can certainly point out that they acquired their skills and knowledge advantage through unfair means over the centuries, but having competent people in senior positions is essential in a developing economy and ultimately benefits everyone. If Ramaphosa wants to create a new generation of skilled black South Africans, he should by all means be applauded. But he should also be willing to admit that this will take time, especially given the damage done to the education system by previous ANC administrations, and that in the meantime, black South Africans in general still have much to learn from white South Africans, who are not their enemies, but their compatriots.

Is the Indo-Pacific Here to Stay?

Reading Time: 3 minutes

Coverage of Trump’s recent tour of Asia has tended to focus on a few areas. There was his readiness to speak kindly of China, in spite of earlier campaign promises. Then there was the continuing war of words with North Korea, which has reached farcical levels. And finally, the obligatory snaps of Putin and Trump together, cast as pet and master.

Less attention was paid, at least in the West, to the US President’s use of ‘Indo-Pacific’. It could have been an error on his part, a confusion with the more common Asia-Pacific – except that reports say that General McMaster is also a fan of the term. In spite of Trump’s ‘economic nationalism’, inherited from Steve Bannon (which casts Indian workers as interlopers in America), and a spate of shootings involving the Indian diaspora in America, the president is continuing his predecessors’ work.

Indo-US relations have not always been the warmest. During the Cold War, presidents traditionally sided with Pakistan, whilst the Soviet Union supported nominally socialist India. In 1971, at the height of the Bangladeshi War of Independence, Richard Nixon sent an aircraft carrier into the Indian Ocean as a potential deterrent to further Indian action (thankfully for all parties, it was never used). The pro-Pakistani position continued up until George W. Bush and to a degree into the Obama Administration (although given the heavy use of drone strikes, it was a contentious relationship to say the least).

Nevertheless, the previous US administration realised that Afghanistan and Iraq had worn down US forces, and left them without the training needed for conventional warfare. As US officials realised the futility of attempting to achieve stability in the countries which they had invaded, they also saw that China’s power projection had only grown greater over time.

And that also meant that they needed partners in the Asia-Pacific region. There were Japan and South Korea, of course – in spite of Trump’s threats during North Korean missile testing, the alliance with them has held strong. Australia, too, could play a part, although its distance rendered it a less useful ally in curbing Chinese soft power. At stake was not merely territory, but hegemony over the ASEAN (Association of South East Asian Nations) countries: a loose grouping of small states including Thailand, Vietnam, the Philippines, and Myanmar. All have felt (in varying degrees) Chinese influence, and are wary of further pushes by the world’s most populous country.

As its neighbour, and in theory regional counterweight, India could be the final part of the equation for the US. The restarting of the ‘Quad’ talks between Japan, India, Australia, and the US, on the sidelines of Trump’s tour, hinted at an arrangement which could reasonably support small nations facing Chinese pressure.

It would also fit into India’s plans to build closer ties with ASEAN, under the so-called Act East policy. The world’s biggest democracy has long suffered from poor relationships with its direct neighbours, Sri Lanka and Pakistan (Bhutan is the exception to this rule, although an attempted Chinese incursion might have led to a regime change). China has capitalised on this, using investment as a tool to build a ‘string of pearls’ to limit India’s ability to project power.

There are obstacles to making the Indo-Pacific a reality: the group haven’t even held a naval exercise together yet (a previous attempt fell apart due to external pressure from Beijing). Trump could always change his mind and attempt to extract more concessions from India, although this seems unlikely – his relationship with Prime Minister Narendra Modi has proven smooth so far. A more concrete obstacle is India’s limited naval power: a series of mishaps aboard submarines have killed a number of sailors, and left its fleet looking even more decrepit in contrast to China’s arsenal.

But as China’s power continues to grow, it’s difficult not to see at least some form of cooperation between India, other regional powers, and the US. There’s too much at stake in the region: a fact which might bring some scant relief to the members of ASEAN, who are the pawns in this global game of chess.

Alt-Left vs. Alt-Right

Reading Time: 8 minutes

What has been called the “alt-right” is only the mirror image of certain elements on the left who call themselves “progressive”, but are regarded more pejoratively by others as “politically correct” (PC) “Social Justice Warriors” (SJWs). Although those in either camp would usually be loath to admit this symmetry, there are exceptions. Milo Yiannopoulos, for example, in an article written for Breitbart, characterises the alt-right as “mostly white, mostly male middle-American radicals, who are unapologetically embracing a new identity politics that prioritises the interests of their own demographic.” And according to a recent article published in Salon, it was “inevitable, from the beginning, that white nationalism would arise as a necessary outgrowth if liberals [in the American sense of the term] kept up with their identity politics obsession, and that is precisely where we find ourselves.”

In focusing obsessively on the identities and subjective experiences of oppressed or marginalised groups, the “progressive” left has taken the principle that the “underdog is always in the right” (or what Bertrand Russell called “the supposed virtue of the oppressed”) to its logical conclusion. This has created an intellectual environment (at least in academic and mainstream media circles, where “progressive” ideology reigns) in which the moral “trump card” in any argument is always held by the most oppressed person. It has also created the perception, not without merit, that white males are themselves an oppressed minority: oppressed by a nefarious “globalism” that is indifferent to, or even cheery about, their demographic decline; by affirmative action, which goes out of its way to avoid employing them in positions for which they may be qualified; and by contemporary feminism, which treats them like “rapists-in-waiting”. When the heterosexual and “cis-gender” qualifiers are added as well, the remaining subset is made to feel its minority status even more acutely. In the new hierarchy of virtue, these white males are, through no fault of their own, placed irredeemably at the bottom. From their point of view (with a few self-flagellating exceptions), this is plainly the height of unfairness. But from the point of view of the “progressive” left, the unfairness is that these white men occupy an unearned position at the top of Western society’s hierarchy of privilege. The inversion of that hierarchy is thus–whether consciously or not–felt to be necessary to right this wrong.

In other words, the progressive/PC/SJW left and the alt-right really are two sides of the same coin. At the extremes of each group, one finds small bands of fanatics willing to use violence to impose their vision on society, notably “Antifa” or Black Lives Matter on the left and neo-Nazis or white nationalists on the right. To the extent that President Trump was drawing a moral equivalence between these radical factions when he condemned the extremism and violence on both sides during the Charlottesville protests, he was spot on. The fact that the mainstream media and even some establishment Republicans found this rather obvious point to be scandalous is a testament to how far to the left the country’s elite has shifted on cultural matters. Another useful term introduced into the discussion by the president is that of the “alt-left”. This is perhaps better than the other available terms discussed above, which may mean different things to different people.

Nevertheless, both the alt-left and the alt-right remain difficult to define or to delineate neatly. Like the Supreme Court’s standard on pornography, one simply feels one knows an alt-righter or alt-lefter when one sees one. The diagram below is an attempt to make sense of the alt-left and alt-right as loose groupings still largely outside mainstream culture, but at the same time growing in number and influence, whilst also moving towards the extreme ends of the spectrum. Another crucial feature that should not be left out of the definition is internet presence or activity. Social media in particular has acted, albeit unwittingly, as a gigantic sorting machine, enabling like-minded people to meet in cyberspace and egg each other on to further extremes, without encountering any antiphonal voices. As people drift further apart online and in their heads, when they encounter each other in the “real world”, they often do not know how to deal with differences in opinion other than by shouting angry slogans or even committing violent acts. This is not to say that the broad categories of the alt-left and the alt-right do not contain some reasonable or well-intentioned people; quite the opposite in fact. The problem is that it has become harder than ever to find common ground when people in different ideological camps have come to view their opponents as fundamentally wicked or stupid rather than merely mistaken.

The alt-left and the alt-right have even developed very similar epistemologies. It is common for alt-lefters, on the one hand, to talk about being “woke”. This means that they have been awakened to the reality of the privilege enjoyed by white, male, heterosexual and cisgender people in society, whereas previously they were oblivious to this. If they do not belong to all of these categories themselves, they become aware of their own oppression and concomitant virtue. If they do, they can still achieve some measure of (but never total) redemption by “checking their privilege” and “staying woke”. The alt-right, on the other hand, speaks of red pills, or, in verbal form, of being “red-pilled”. This is a popular culture reference to the 1999 movie The Matrix, in which the protagonist is asked to choose between taking the red pill, which will show him the disturbing reality of the world, or the blue pill, which will return him to the comforting delusions of his former life. He boldly chooses the red pill. In both alt-left and alt-right circles, it is common to watch YouTube videos which purport to reveal to us the way things really are. One emerges intellectually “shell-shocked”, with a feeling of having attained profound knowledge after stumbling around for so long with the wool pulled over one’s eyes. However, one sometimes fails to consider the possibility that what one now believes is not necessarily the truth; rather, it is just that one has been disarmed by hearing a radical perspective that one had not considered before.

The effect of the “woke” or “red pill” video can wear off, and as one re-enters the real world, in all its complexity, one realises that this cannot be the whole truth. Perhaps a young black feminist from New York is inspired by a video of Linda Sarsour, a fellow woman of colour from an oppressed religious minority, bravely speaking out against the “white supremacist” President Trump. Sarsour and women like her are just what we need to smash the “patriarchy”, she comes to believe. She then finds out, however, that Sarsour’s views on women’s issues are more conservative than those of many fundamentalist Christian Republicans in Congress. Meanwhile, a young unemployed white man from the Midwest watches a video about race and IQ. Whites have a higher IQ on average than blacks, he learns. Suddenly, everything is clear to him. This is why so many blacks have to resort either to welfare or crime in order to get by. If only we cut welfare programs and end affirmative action, I will be able to take up my rightful place in society, he comes to believe. He then turns on the television and sees Neil deGrasse Tyson explaining black holes, and quickly realises that this black man is much smarter than he is, and that even if group differences in IQ do exist, this tells us nothing useful when comparing two individuals. He may also start to wonder what differences of a few IQ points really mean for society in the long run, given that artificial intelligence is poised to overtake even the smartest human beings.

However, YouTube algorithms being what they are, one can easily go back for another, perhaps even stronger dose. One can become addicted to this “political pornography”, reaching ever higher levels of “wokeness” or popping ever more red pills. One then ironically exits the real world and enters a pure ideological realm, in which no counter-evidence to one’s position can be entertained.

Although its power may now be waning, the alt-left is still a much stronger cultural force than the alt-right. There are several reasons for this.

It emerged much earlier and, almost without anybody noticing, spread throughout society’s elite (universities, the Supreme Court, news media, Hollywood, etc.) as the radical anti-establishment generation that came of age in the 1960s itself became the establishment. This process was largely complete by the 1990s, to the extent that conservatives by then had ceded ground on virtually all the culture war issues and were hardly fighting back. The fact that George W. Bush sought to promote “compassionate conservatism” was a tacit admission that “conservative” had become a dirty word. Theresa May’s recognition that the Tories in Britain were seen as the “nasty party” was a similar gesture of surrender, although admittedly Britain’s left-wing culture has much deeper roots than that of the US. The alt-right was a pushback against this tendency, an insistence on no longer playing by the left’s rules, only to keep losing to it. “Conservative” figures who tried to play by these rules became known as “cuckservatives”, or simply “cucks” for short, and the search for “true” conservatives began.

The alt-left also has the advantage of being seen as morally superior, due to the more general association of the left with the moral high ground. It is perhaps in part a testament to the left’s superior PR abilities that this association remains in many people’s minds despite the fact that in the 20th century, movements of the extreme left (Communism, anti-colonial “liberation” movements) caused far more death and suffering than movements of the extreme right (Nazism, Fascism and colonial regimes). This perceived moral superiority has also enabled those on the left to adopt a holier-than-thou attitude, which in the narcissistic age of Facebook and Twitter has morphed into what James Bartholomew has called “virtue signalling”, the mere advertisement of one’s virtue (usually unaccompanied by taking any real action) for the purposes of improving one’s social standing in left-wing circles.

The alt-right’s rejoinder to this is “vice signalling”, which involves alt-righters saying the most provocative things possible, with the intention of causing maximum offence to alt-lefters and thereby impressing their fellow travellers on the right. Milo Yiannopoulos and Blaire White are excellent examples of this phenomenon. An openly homosexual man and a “trans woman” respectively, they surely cannot, even taking into account the possibility of self-hatred, mean every derogatory comment they make about groups to which they themselves belong. Although the alt-right was initially happy to tolerate these figures, due to their pungent attacks on the cultural left, it has tended to shed them as it has grown in strength and even helped to elect Donald Trump to the presidency.

This “eating of one’s own” had hitherto been the preserve of the left; the old saying that “the right seeks converts; the left seeks traitors” may only have reflected the power differential between the left and the right prior to the rise of the alt-right. Although the alt-left can just about hold together as long as it has right-wing bogeymen to point towards, its internal divisions have become more severe in recent years. A telling article appeared in 2015 entitled “The Rock Paper Scissors of PC Victimology”. Although the author provides a largely compelling critique of the alt-left victimhood hierarchy or “oppression Olympics”, he is apparently also aggrieved for the less principled reason that even he, as a “gay, Jewish man”, is no longer near the top of the hierarchy, where he believes he should be. “Like gay men, Jews have been relegated to the bottom of the progressive victim pyramid, a low ranking that has held fast in spite of the rampant bigotry and violent attacks directed at them,” he seems to whine.

The solution to the ever-intensifying “oppression Olympics”, however, is obviously not to keep tinkering with the hierarchy until everyone is happy (which is impossible), or in order to squeeze the electorate for more votes (which seems to be the strategy of the Democratic Party in the United States). Nor is it to appeal mainly to aggrieved whites, who are still in the majority in that country, as President Trump has done. Trump is in any case not the champion of alt-right causes that many (on both sides) thought he would be. He seems to have few, if any, strong political convictions. Rather, he is just much better at reading the electorate than any pollster or talking head: a charismatic figure who got elected by saying what many were thinking but were too afraid to say out loud. The only real solution to this great divergence and separation into hermetically sealed ideological camps must be to bring the alt-left and the alt-right back into the mainstream, and to force them to talk to each other. In order to do this, however, Western societies will have to recover the respect for freedom of speech that they once had. All ideas must be up for discussion, no matter how loony they may seem to some, and there should be no policing of the boundaries of permissible speech or “hate speech”. Whatever one’s political convictions, when one really thinks about this for a moment, it is clear that there is no palatable alternative.

[Note: In this essay, I only discuss the cultural aspects of the left-right divide, and leave the economic aspects to others who have a far better grasp of these. Of course, culture and economics are not entirely separable, and I regret any blind spots that may result from this.]

Don’t Rely on the ‘Trump Bump’ – Journalism’s Future is Still Bleak

Reading Time: 3 minutes

With the election of Donald Trump, there was a mixture of glee sprinkled in with the horror in the world of reporters and opinion writers. The new president was an easy target, both for his outrageous statements and for the ever-growing stream of leaks which surrounded him, on everything from his alleged charity work to longstanding allegations of collusion with the Kremlin. It gave rise to the term ‘Trump Bump’, which, in journalistic circles, meant that the new Commander-in-Chief offered opportunities for big scoops, more openings in newsrooms, and a much-needed cash injection for flagging establishment media from non-profits. At a time when journalism had been written off, the POTUS seemed to be going out of his way (albeit perversely) to save it.

The closure of the Gothamist and DNAInfo should put a halt to any such celebrations, even as exclusives on the White House’s potential collusion have grabbed headlines. It’s not that New York City is a ‘news desert’, particularly compared to other states – the New York Daily News and Rupert Murdoch’s New York Post will continue to scrap it out there. But the decision to shut down two of the best examples of local reporting sets a worrying precedent. Whether the rationale behind their closure was truly business, or whether this was a media baron crushing dissent, it is difficult not to see their demise as emblematic of a wider problem for journalism.

Joe Ricketts, the billionaire owner of an online stock brokerage site, had founded DNAInfo (which also had offices in Chicago, Los Angeles, San Francisco, and Washington) in 2009, and bought the Gothamist in March this year. Their coverage of local issues, including crime and real estate, reflected an older form of shoe-leather reporting – a bulwark against the growing empires of ‘McNewspapers’ like USA Today, which repeat content across states to minimise costs.

Ricketts’ letter (to which all his former sites now redirect) lays the fault of his decision at the feet of the websites themselves. “DNAinfo is, at the end of the day, a business, and businesses need to be economically successful if they are to endure,” he wrote, reflecting on the fact that the enterprise never broke even. He concludes his letter: “I’m hopeful that in time, someone will crack the code on a business that can support exceptional neighborhood storytelling for I believe telling those stories remains essential.” It’s a rather trite ending – a man worth $2.1 billion bemoaning the costs of running newspapers.

If it sounds like something doesn’t quite add up, there’s a small additional matter: the decision came a week after staff at the New York offices of DNAInfo and the Gothamist voted to unionise. Ricketts’ letter assiduously avoided reflecting on that vote, but a widely linked blog post of his reveals no love for the idea of collective labour: it ends with the line “It is my observation that unions exert efforts that tend to destroy the Free Enterprise system.”

There is nothing illegal in Joe Ricketts’ actions – as CEO and owner, he had every right to pull the rug out from under the feet of over a hundred journalists. But they are spectacularly concerning. Even if we give him the considerable benefit of the doubt and chalk it all down to business, it seems inconceivable that papers started after the Great Recession and competing in a crowded digital environment could ever have offered considerable returns on investment. Even the New York Times, which has the advantage of a long-standing reputation and of being one of Trump’s favourite punching bags, has only just begun to make some inroads towards growth, after a period of painful layoffs and a massive pivot towards the digital. It’s also hard to believe that Ricketts – clearly a man versed in business – would somehow have imagined that the DNAInfo network would magically start printing money.

All of which suggests the other, even less palatable alternative: that the billionaire funders of journalism care less about editorial integrity than they do about control. Granted, this was not the voice of the purse dictating how tales should be told, but shutting down papers ostensibly because of unionisation is the next worst thing. At a time when good local reporting is neither lucrative nor readily in demand, this is particularly sad.

It’s not the first time we’ve seen billionaires using the sheer weight of their dollars to push papers out of business – think Peter Thiel and Gawker. And what precedent does it set for other news outlets, like the Washington Post, which are reliant upon a megadonor for their continued existence? The common assumption is that Jeff Bezos is unlikely to engage in similar chicanery; but given that not even the staff at DNAInfo seemed to anticipate Ricketts’ move, such an assumption doesn’t feel so safe any more.

Media barons used to buy papers as much because they made money as for the power of persuasion they offered. Today, the first motivation has evaporated, and is unlikely to return in the near future. That seems to mean that dissent is even less likely to be tolerated.

What does Siloed Social Media mean for Politics?

Reading Time: 3 minutes

The old adage for dealing with online abuse was ‘Don’t feed the trolls’ – a statement based on the premise that trolls could fundamentally be dealt with like offline bullies. By refusing to give them the emotional response and the attention which they crave, the argument went, they would get bored and move on (presumably to bother someone else).

But what does refusing to feed them actually look like on a platform like Twitter – a space in which it’s easy for celebrities and micro-celebrities to weaponise their fame, setting their followers on a target in far larger numbers and with far greater vehemence than in any offline setting? One answer is to block them, although given how easy it is to make a new account, and the sheer volume of the attacks, this can be impractical. Another is to set your account to private – or to go even further and quit the platform outright.

This was the understandable option taken by the targets of Gamergate, the organised campaign which ostensibly fought for ‘ethics in video game journalism’, but which always looked curiously like a reactionary pushback against criticisms of gaming’s often misogynist culture. Later, the actress Leslie Jones would be forced to leave Twitter while facing down a mob of a similar sort, which targeted her for her ethnicity.

Driving targets off the platform, more so than getting an emotional response, has been the goal of the leaders of these harassment campaigns, such as Milo Yiannopoulos, the former Breitbart employee who was finally permanently removed from Twitter following the campaign against Jones. He wouldn’t be the only ‘martyr’ in the eyes of self-proclaimed freedom fighters. In the wake of the Pizzagate conspiracy theory, rumours began circulating on Twitter, spread by a series of accounts (including the current alt-right celebrity Brittany Pettibone), about child pornography being hosted on the site. Having already removed a number of far-right accounts after Trump’s surprise victory, Twitter hastily swung into action, apparently partly to protect its own reputation.

It wouldn’t be the only platform to do so: even Reddit, famously defiant in the face of protests against the mixture of hate speech and borderline felonies on some of its threads, has banned a number of subreddits (including the once popular r/AltRight). And as would later happen with Twitter, users quickly discovered alternative platforms whose professed love of freedom went deeper. For Reddit there was Voat, which became central to the Pizzagate ‘investigation’, whilst Twitter’s exiles got Gab (which also offers users the opportunity to record videos for their audiences).

On the one hand, the decision of proponents of particularly loathsome ideologies to migrate from the mainstream space is welcome. A study of Reddit’s work shutting down some of its most controversial and repugnant subreddits suggested that rather than spreading the hate around other threads, most of those displaced tended to pipe down without their community support. Of course, it doesn’t take into account those who moved to platforms like Voat, which have tended to be less open to research from the mainstream establishment.

On the other hand, the practice of banning speech is a plaster for broader societal issues – and not a terribly sticky one in the long term. Although protecting users from campaigns of harassment is common decency (not to mention good business sense), pushing those already heading down dark paths to spaces like Voat seems likely to make their beliefs even more radical. A campaign based around punitive action also plays into their rhetoric of an establishment trying to attack them for violating free speech (gleefully ignoring those who have been forced to leave the arena of free speech out of fear).

The crisis of free speech, although so often imagined as a problem brought on by university safe spaces and ‘snowflake’ culture, is as much – if not more so – the result of a particular strain of conservatism mixed with what Adrienne Massanari dubbed a “toxic technoculture”. The result is a persecution complex which sees any debate as part of a broad attempt to stifle free speech, and a willingness to use whatever tactics necessary to attack opponents (see: fake antifa posters).

There is no easy solution to the problem which we face today – one which looks set to widen as the ‘culture wars’ continue. Forcing those with vile opinions onto alternative spaces no longer looks like the solution, as it simply intensifies their feelings of being stifled. Allowing them to engage in wanton acts of harassment isn’t one either: it’s time for tech to take a good look at itself and figure out the third way.

Explainer: The Iran Nuclear Deal

Reading Time: 2 minutes

Few things have attracted the ire of Donald Trump as much as the Iran nuclear deal completed by his predecessor. President Obama’s 2015 agreement was meant to reshape policy in the region, breaking the long-standing divide between the Shia powerhouse and the US, which dates back to the late 1970s and had only intensified under the Bush administration. In exchange for the lifting of the economic sanctions which had been designed to bring Iran into line (and which, as in Russia and North Korea, had largely failed to do so) and the adoption of a softer stance, Iran was to limit its nuclear energy programme – a key step on the path to nuclear weapons.

The deal covered a number of technical areas, designed to hamper a move from using nuclear power for civilian purposes to military ones. As the BBC reported, the measures are comprehensive: they include capping uranium enrichment at a low level (3.67%), sufficient for use in power generation but well below the 90% required for a nuclear weapon. The centrifuges used for the enrichment process have also been cut considerably in number, with higher-quality nuclear material being removed from the country. A heavy-water plant which could have been used to produce plutonium has similarly been reconfigured to reduce the possibility of this happening, whilst leaving the energy infrastructure intact. The removal of sanctions on Iranian oil exports (a key part of the state’s economy) offers a considerable cash bounty, even as sanctions on arms and missile parts were retained.

Donald Trump – who built his campaign in part upon anti-Obama rhetoric – has thrown a spanner in the works by accusing Iran of contravening parts of the deal. It’s an opinion his allies don’t share, with Boris Johnson, the British Foreign Secretary, the latest figure to state that the deal will go on in spite of the president’s bluster. And yet the Republican Senate seems to be moving towards a set of punitive measures against Iran: measures which other signatories (including Britain and the EU) warn look uncomfortably like an attempt by America to renege on earlier promises, and which could hurt trust in the US brand around the world.

As much as being an almost monomaniacal attempt to undermine his predecessor’s legacy, the decision to punish Iran seems to represent a proxy war against the country’s expansion into the conflict with ISIS. As the self-proclaimed caliphate has lost land and supporters, Shia militias under Iranian leadership have captured territory amid a broad collapse of any semblance of unity amongst the anti-ISIS forces. Whilst in Kirkuk the Kurdish forces were pushed out by the Iraqi military, wary of their nominal compatriots’ vote for independence, Iranian-aligned troops have secured land elsewhere. Trump’s decision not to recertify the nuclear deal thus smacks of an attempt to punish Iran for its growing regional influence – albeit through the wrong channels.

What does this mean in the long term? The answer is worryingly unclear. Iran seems unlikely to halt its advances in the Levant as a result of threats from the US (particularly so long as the rest of the signatories continue to point out that none of the terms of the deal appear to have been broken). Donald Trump’s almost legendary fickleness is perhaps the most important variable, and the least predictable: if the warhawks get his ear, his threats to break the nuclear deal altogether might just come to fruition.

What ISIS and its critics get equally wrong about early Islamic history

Reading Time: 5 minutes

As the star of ISIS (apparently) begins to fade, it is perhaps worth interrogating its vision of returning to Islam in its original, pure form. A Salafi Islamist movement, ISIS seeks to purge contemporary Islam of its heretical accretions. Its English-language magazine, Dabiq, explains that the

Khilafah [caliphate] could not be established except through a jama’ah [group] that gathered upon the Kitab [lit. “Book”, i.e. the Qur’an] and Sunnah [tradition of the Prophet Muhammad] with the understanding of the Salaf [ancestors], free from the extremities of the murji’ah and khawarij [Islamic sects considered heretical].[1]

Supporters as well as detractors of ISIS often claim that the group seeks to impose an austere version of 7th century Islam on all over whom it rules. Both camps are mistaken, however, in their historical assessment of 7th century Islam. Let us begin with a brief summary of the first century or so of Islam. In 632, according to Muslim tradition, the Prophet Muhammad died, giving rise to a succession dispute and consequent schism that still affects Islam to this day: that between its Sunni and Shi‘i branches. The first four successors, according to Sunni Muslims, are the rashidun (“rightly guided”) caliphs. According to Shi‘i Muslims, however, the first three of these were imposters, and the fourth, Muhammad’s first cousin and son-in-law ‘Ali, should have succeeded Muhammad directly. The Sunni faction emerged as the dominant one–as it remains today–and upon the death of ‘Ali in 660/661 CE, Mu‘awiyah founded the Umayyad Caliphate. This lasted until 750 CE, when the Abbasids (a rival Sunni faction) defeated the Umayyads. It is only during the Abbasid period that we start to see evidence of the familiar strictures of Sunni Islam, such as prohibitions on drinking wine or depicting human–and even animal–forms. ISIS takes these prohibitions very seriously, as seen from the harsh punishment it metes out to drinkers (80 lashes, if caught) and from its destruction of pre-Islamic art and sculptures.

Archaeological evidence, however, suggests that the Umayyad caliphs embraced many aspects of classical Hellenistic (or Greco-Roman) culture. The caliph Mu‘awiyah renovated the Roman baths at Hammat Gader (in northern Israel) in the first year of his reign, as attested by a unique inscription written in Greek script but containing both Greek and Arabic words. ‘Abd al-Malik, who became caliph in 685 CE, put a standing human figure, believed to represent himself or possibly even the Prophet Muhammad,[2] on the earliest phase of coins minted during his reign (Fig. 1). He built the Dome of the Rock in an octagonal shape shared by a number of Byzantine churches in the region. The adjacent Al-Aqsa Mosque originally contained wooden panels bearing Greek inscriptions that appear to have been taken from churches. These panels have been dated to the early Umayyad period, most likely to the reign of ‘Abd al-Malik’s son, Walid I.[3]

 

Fig. 1. ʿAbd al-Malik’s “standing caliph” coin (obverse & reverse, respectively)

Perhaps the most extraordinary find from the Umayyad period, however, is the palace at Khirbat al-Mafjar, near Jericho in the West Bank. Two pieces of marble from the site indicate that the caliph Hisham ibn ‘Abd al-Malik, who reigned c. 724–743 CE, once inhabited the palace. It may, however, have been built somewhat earlier. It is also associated with the caliph Walid II, who reigned briefly (743–744 CE) as one of the last Umayyad leaders and was notorious for his excessive consumption of wine. This association is due to a large and prominent wine press within the palace (Fig. 2). However, there is no specific evidence linking Walid II to the palace, and it is unlikely that the wine press was constructed and used only during his brief reign.

 

Fig. 2. Wine press at “Hisham’s Palace”

The palace also contained numerous nude and semi-nude sculptures of human beings, one of which resembles the standing caliph figure on ‘Abd al-Malik’s coins (Fig. 3). Others depict bare-breasted female figures, animals (Figs. 4 and 5) and even a nude male with an apparently intact (or uncircumcised) penis (Fig. 6). This is perhaps significant because the practice of male circumcision is thought to have been universal among Muslims since the time of Muhammad. Uncircumcised infidels, by contrast, are regarded as inferior or unclean, hence the late Sheikh Abu Muhammad al-Adnani’s (ISIS’ erstwhile spokesman) attempt to insult John Kerry by calling him an “uncircumcised geezer.” Adnani was apparently unaware that the practice of neonatal male circumcision is widespread in the United States. He would have been far more surprised, however, to learn that sculptures of “uncircumcised geezers” also adorned the early caliphs’ palaces.

 

Fig. 3. “Standing caliph” sculpture

Fig. 4. Bare-breasted female sculpture

Fig. 5. Arch with semi-nude female forms and animals

Fig. 6. Nude male figure

 

How should we interpret this archaeological evidence? To be sure, it does not tell us exactly what the early caliphs believed. It does, however, demonstrate that they drank wine and sculpted nude human forms. It also strongly suggests that whatever strictures we now associate with Sunni Islam emerged later. Take for instance the Qur’anic prohibition on wine:

O you who have believed, indeed, intoxicants, gambling, [sacrificing on] stone altars [to other than Allah], and divining arrows are but defilement from the work of Satan, so avoid it that you may be successful. (5:90)

Other parts of the Qur’an are more equivocal. Sura 2:219 acknowledges both sin and benefit in drinking, but holds that the former outweighs the latter. Another verse even seems to portray alcohol in a positive light: “And from the fruits of the palm trees and grapevines you take intoxicant and good provision. Indeed in that is a sign for a people who reason” (16:67). It might be objected that the classical Islamic literature has long regarded the Umayyads as “bad Muslims” who succumbed to various sensual infidel temptations. One must remember, however, that this portrayal of the Umayyads has its origins in Abbasid or post-Abbasid writings. The Abbasids, who had seized power from the Umayyads, needed to portray them as illegitimate rulers. In order to do so, they retrospectively applied their own stricter religious standards to the Umayyad period and found that the Umayyads had fallen short. History, as the saying goes, is written by the victors. Archaeological evidence, on the other hand, does not lie: the earliest Muslim caliphate was less like ISIS and more like ancient Greece or Rome.

 

References:

[1] Dabiq, Issue 1, p. 33.

[2] See R. Hoyland, “Writing the Biography of the Prophet Muḥammad: Problems and Solutions,” History Compass 5 (2007), 14.

[3] N. Liphschitz, G. Biger, G. Bonani and W. Wolfli, “Comparative Dating Methods: Botanical Identification and 14C Dating of Carved Panels and Beams from the Al-Aqsa Mosque in Jerusalem,” Journal of Archaeological Science 24 (1997), 1045–1050.

Explainer: The British Far-Right

Reading Time: 2 minutes

The murder of Jo Cox in the run-up to the Brexit referendum was shocking not merely for the fact that it was the first killing of a British MP in over 20 years. The words her killer, Thomas Mair, shouted – “Death to traitors, freedom for Britain” – marked the return of a British white supremacism which had last had its heyday in the 1990s.

Whilst more electorally focussed versions of white nationalism have persisted and gained support over the past decade – most notably in the brief surge of support the British National Party (BNP) received under Nick Griffin – the last decade of the 20th century marked a high point for the indigenous white nationalist movement (as opposed to groups imported from the US or Europe). Neo-Nazi groups centred in the UK included Combat 18 (the number standing for the letters ‘AH’, Hitler’s initials) and Blood & Honour, which hosted the then-flourishing white nationalist music scene. For groups like these, which had evolved from post-WWII fascists and disillusioned imperialists with ill-disguised antipathy towards immigrants from the former colonies, the high point of their publicity came in April 1999 – courtesy of a 22-year-old called David Copeland.

A former BNP member, Copeland had read ‘The Turner Diaries’, William Luther Pierce’s dystopian novel and handy manifesto for the budding fascist – a screed which calls for radical warfare against the state. In 1995 it had made headlines in America when Timothy McVeigh, the Oklahoma City bomber, was discovered with pages from the novel (which describes an attack on an FBI building). Copeland turned to explosives himself, but aimed at other typical fascist targets – non-white Britons, immigrants, and members of the LGBTQ community. His attacks killed three people and wounded another 140.

It’s probably unfair to call Copeland’s attack the sole catalyst for the decline of the British far-right scene – the deaths of key members of groups like Combat 18 through factional infighting also played a role. At any rate, the early 21st century saw the old street-fighting outfits apparently transition to electoral politics.

Mair’s actions shattered this illusion, especially as his words were rapidly co-opted by a previously little-known group, National Action. With its roots in Yorkshire (which has traditionally played host to the BNP and other far-right outfits), the group had only been founded in 2013 – but it is nevertheless the only far-right group proscribed in Britain. That status has conferred upon it a great deal of respect in white supremacist forums, as seeming validation of the state control which Pierce’s Turner Diaries ‘predicted’ – not bad going, considering that attempts in 2015 to organise a rally in Liverpool ended with National Action members hiding behind the shutters of a shop at Liverpool Lime Street Station.

In its styling, National Action offers a blend of the peculiarly British and the distinctly transnational – a technique borrowed from the broader alt-right. Where older iterations of its website from 2013 showed an approach which mimicked the National Front, focusing on immigrants, the group has increasingly opted for a broader symbolism. One of the most recent versions of its home page featured Anglo-Saxon imagery alongside the broader, pseudo-academic ideology which has been popularised by Richard Spencer and others in America – and which is increasingly developing in continental Europe and Britain.

How the Echo Chamber won Trump his presidency

Reading Time: 4 minutes


“It was the best of times, it was the worst of times…” pretty much sums up 2016 in a single sentence. Today’s result in the American presidential election came as a shock to many, partly because all the data pointed to the contrary right up until the day itself. It seems we are still a long way from understanding the contextual nature of mining big data for information, and this problem extends to how heavily reliant we have become on similar systems controlling how we consume information.

Nowadays we use Facebook, Twitter, and other social networks to gain information, ingest content and read the news. We often see what we like and like what we see, and the result is a biased social feed: an echo chamber. Most of us have realised how biased our news feeds are, which is down to how we use them, but recent political events such as Brexit and the US election have shown us just how much this is the case.


In many cases, users don’t even realise that they consume one-sided or similar information because of the social circles around them. This entire phenomenon can be chalked up to “You don’t know what you don’t know,” and users end up only reading regurgitated information. As a consequence, we unknowingly accept the echo chamber we’re placed in, because we are fed information we like and which agrees with our own opinions. It’s then reinforced because people in the same social sphere agree with us too. So the echo chamber’s cycle persists because it gives us a false sense of affirmation that we are right in our beliefs – also known as confirmation bias.

How likes divided a nation

The 2016 U.S. election coverage is a perfect example. The Wall Street Journal recently put together a graphic which depicts how feeds may differ for Facebook users based on their political views: http://graphics.wsj.com/blue-feed-red-feed/

In the graphic, you can see Liberal and Conservative Facebook feeds side by side and how much they differ. After all, your feed is designed to prioritise content based on what you’ve liked, clicked and shared in the past. This means that conservatives don’t see much content from liberal sources and vice versa.
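To make the mechanism concrete, here is a minimal sketch in Python of how purely engagement-based ranking produces this effect. It is a toy illustration of the general idea, not Facebook’s actual algorithm; all the names (Post, affinity, rank_feed, the outlet names) are invented for the example.

```python
# A toy model of engagement-based feed ranking (all names here are
# hypothetical; this illustrates the general idea, not any real system).
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    source: str  # the outlet or page that published the post

def affinity(history: dict, post: Post) -> float:
    """Score a post by how often this user has engaged with its source."""
    return history.get(post.source, 0)

def rank_feed(posts: list, history: dict) -> list:
    # Sources the user already engages with float to the top, so
    # opposing sources quietly fall off the bottom of the feed.
    return sorted(posts, key=lambda p: affinity(history, p), reverse=True)

# A user who mostly clicks one side sees mostly that side.
history = {"BlueOutlet": 42, "RedOutlet": 1}
feed = rank_feed([Post("a", "RedOutlet"), Post("b", "BlueOutlet")], history)
print([p.source for p in feed])  # ['BlueOutlet', 'RedOutlet']
```

Notice that nothing in this loop ever asks whether the user should see something different – relevance is defined entirely by past behaviour, which is exactly how the bubble seals itself.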

This particular presidential campaign was fought on an entirely different content battleground than previous ones, with a new army generating content at high speed – content of low value but extremely high impact, judging from today’s outcome. According to an article in Wired, one in every five election-related tweets from September 16 to October 21 was generated by a bot. These bots automatically generated content that served a political agenda. The deluge of tweets they produced triggered and shaped online discussions around the presidential race, including trending topics and how online activity surrounding the election debates was judged. The problem stems from a shift from an Information Economy to an Attention Economy, in which he or she who makes the most noise wins. Unfortunately, noise does not equal signal, but you can’t tell them apart when a bot or algorithm is involved.

The perfect example of this was the revelation that over 100 pro-Trump websites were, in fact, being run out of a small town in Macedonia. Posts weren’t being generated by bots but by a small group of teenagers making money from clickbait articles which were mostly false and misleading. The most successful post, according to BuzzFeed’s investigation of the issue, came from the fake news website ConservativeState.com, under the headline “Hillary Clinton In 2013: ‘I Would Like To See People Like Donald Trump Run For Office; They’re Honest And Can’t Be Bought.’” The post was a week old and had racked up an astounding 480,000 shares, reactions, and comments on Facebook – proof that attention means more than information.

Breaking the cycle

Bots and algorithms don’t seek out opposing views or surface them for readers, because they’re not built that way. They serve us what we want to hear. It’s the same when we’re served news written by human hands specifically for our tastes. We become trapped in a filter bubble wrapped around an echo chamber (or should that be an echo chamber wrapped in a filter bubble?!).


The way to break free from this is to start understanding how algorithms work, to recognise that content screaming for attention can no longer be trusted as relevant, and to surround ourselves with different viewpoints. The ultimate goal is balance: only this way can you find a new perspective, encounter different content, and learn what you don’t yet know.

We should be more selective in the content we consume. Instead of letting the algorithm do the filtering first, we should filter the news, media, and information ourselves, leaving algorithms to gently nudge us towards new information by suggesting opposing views that broaden our perspective.
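As a rough sketch of what such a nudge might look like in code – again a toy under stated assumptions, not any platform’s real system – a re-ranker could deliberately reserve a few top slots for sources the user has never engaged with. The function and field names below are hypothetical.

```python
# A toy re-ranker for the "gentle nudge" idea: rank by engagement first,
# then reserve a couple of top slots for sources the user never engages
# with. All names are hypothetical; no real platform works exactly this way.

def rerank_with_diversity(posts, history, nudge_slots=2):
    """posts: list of dicts like {"id": ..., "source": ...};
    history: mapping from source name to past engagement count."""
    ranked = sorted(posts, key=lambda p: history.get(p["source"], 0),
                    reverse=True)
    familiar = [p for p in ranked if history.get(p["source"], 0) > 0]
    unfamiliar = [p for p in ranked if history.get(p["source"], 0) == 0]
    # Guarantee a few unfamiliar sources near the top; keep the rest as-is.
    return unfamiliar[:nudge_slots] + familiar + unfamiliar[nudge_slots:]

history = {"BlueOutlet": 42}
posts = [{"id": 1, "source": "BlueOutlet"},
         {"id": 2, "source": "RedOutlet"},
         {"id": 3, "source": "GreenOutlet"}]
print([p["source"] for p in rerank_with_diversity(posts, history)])
# -> ['RedOutlet', 'GreenOutlet', 'BlueOutlet']
```

The design point is simply that diversity has to be an explicit objective: if the ranking function optimises only for past engagement, unfamiliar viewpoints will never surface on their own.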

The algorithm should be the one to challenge our point of view, not reinforce it.