There’s No 21st Century le Carré. Can There Be?

John le Carré, who died at the close of 2020, was a perfect spy novelist: a literary talent with an Oxford, MI6, and MI5 pedigree writing at the height of the Cold War. It is unsurprising the 21st Century has not seen an heir to his rare combination of skill and experience. More broadly, however, the post-Cold War Western security landscape has not produced the same calibre of fiction that the Cold War did. There was no le Carré of the War on Terror, nor was there an Ian Fleming or Tom Clancy.

Perhaps the medium of the blockbuster novel is itself antiquated, but neither have there been definitive mass-audience films or TV series about post-Cold War intelligence characters with anything like the relevance to a time, place, and conflict achieved by those following le Carré’s George Smiley, Clancy’s Jack Ryan, or even Fleming’s James Bond through the Cold War.

The Bourne films were cinematic successes but essentially ignored the War on Terror. 24 achieved some mass cultural salience and did a passable job of addressing terrorism from a public relations perspective, but it studiously contrived its terrorists to avoid too much verisimilitude with actual transnational organizations and their political and ideological pretensions. Homeland grasped the nettle with respect to America’s actual war adversaries but plot-wise saw more House of Cards-style shark jumping than Tinker, Tailor, Soldier, Spy fidelity or even Fauda-like grit. The Bond movies of the late 20th Century complacently rehashed the Cold War, focusing on Soviet and North Korean anachronisms, though GoldenEye (the best of the post-Cold War Bond movies) did grapple with the issue of post-Soviet Russian state formation. The best espionage art created during the post-Cold War decades was not (explicitly) about them: The Americans was full-throatedly a 1980s time trip.

As the 21st Century ground on, interesting explanations for this phenomenon were offered by Matt Gallagher in The Atlantic and Emma Mustich in Salon, both in 2011. These articles persuasively identified the diminished scale of post-Cold War conflict as the reason for nostalgic security fiction, respectively pointing to American society’s general detachment from the wars in Iraq and Afghanistan and the War on Terror’s lack of a true existential threat. More recently, Leonid Bershidsky picked up the same theme for Bloomberg and again found a worthy explanation in the need for a state-level actor as grist for good spy drama.

Gallagher, Mustich, and Bershidsky’s valid arguments boil down to an “End of History” theory of spy fiction: that high-level clashes between global powers are the key to compelling espionage art, and a unipolar world just didn’t provide good material. This reasoning has strong resonance with Ross Douthat’s theory of recent history in The Decadent Society. Descriptively, the story of relentless Cold War reboots is entirely of a piece with Douthat’s observation that the West has found itself culturally exhausted and trapped in a nostalgia loop probably since the 1969 moon landing, and thus certainly since the 1989 fall of the Berlin Wall.

Why were we unable to escape the clash of powers paradigm for cultural products? Furthermore, if we accept that “History” has reasserted itself (or perhaps just begun), why has our art been unable to track the latent causes of today’s History, which necessarily must have been lurking under the surface from 1989 to the present? One theory is that Western audiences were conditioned to find some subjects more memetically “interesting” than others during the 20th Century. That there are epigenetic-like settings governing our collective curiosity and these had been tuned to a Cold-War frequency by a sticky dial. Another theory is that some periods are intrinsically more interesting than others — humans find clashes of civilizations compelling all the way down and the interludes are inevitably dull. A third is that we let our optimism distract us from our curiosity. That GoldenEye’s (1995) Soviet-revanchist General Ourumov and MI6-turncoat Alec Trevelyan of the “Janus” international crime syndicate were actually dead ringers (as far as popcorn villains go) for future Vladimir Putins and transnational actors, but the heady hope for an End of History saw 007 and his CIA friends nip these chaos actors in the bud, allowing Western audiences (and future artists) not so much to recognize a placid reality as to take a vacation from turbulence for a few decades.

Similarly, perhaps such a willful blindness allowed Western audiences to miss the extant spy intrigue hiding in plain sight. As Tyler Cowen argued for Bloomberg in 2018: “Spies Are More Common, and Boring, Than You Think: If you keep looking for James Bond, you’ll miss the thousands of bureaucrats collecting bits of intel”:

John Negroponte, former director of national intelligence, admitted in 2006 that the U.S. was deploying about 100,000 spies around the world. Given that the U.S. is the world’s technology and military leader, and yet has a relatively small share of global population, is it so crazy to think the number of people spying on us is larger than that?

Cowen, Bloomberg Opinion; Marginal Revolution (July 31, 2018).

Ubiquitous espionage is actually the opposite of boring, which, of course, is likely why Cowen wrote about it. While the average spy accordingly would be less James Bond-like than Hollywood suggests, the average person would be more James Bond-like than Hollywood suggests. The problem is we had been misdefining “boring” after the end of the Cold War, paying attention to Bond the institutionalist, not Trevelyan the disruptor with an esoteric (Lienz Cossack?) agenda, not to mention ignoring the possibility that lesser versions of these charismatic characters might be more available in daily life than most audiences and artists were attuned to.

We still largely remain trapped in a Cold-War cultural template. Hope for a renaissance in spy fiction has been found in the reemergence of state-level conflict familiar to a 20th Century audience. As Bershidsky observed at the beginning of 2020, “state actors are back” and “of late, news reports have provided enough material for a silver age [of spy thrillers] to start — if authors take heed.” But why settle for silver?

Bershidsky’s simple “if” contains profound insights about the West and global media in the 21st Century. The question of whether today’s Hollywood is willing and able to address spy games with China as it did those with the Soviet Union in the 20th Century is self-answering in the negative. The concept of Hollywood itself, though, is also a relic of the 20th Century. It is already clichéd to say cultural communications are unbundled, distributed, decentralized, disintermediated, demonopolized, etc., etc.

Therefore, there cannot be a le Carré of the 21st Century if that means following a 21st Century George Smiley through the halls of 20th Century power (Whitehall bureaucracy, the “Oxford Circus”) via a 20th Century medium (the novel, the Hollywood blockbuster). A 21st Century le Carré would follow the diffuse power and hardly-hidden hands of state and non-state spies through the capillaries of 21st Century cultural authority: Twitter, Facebook, TikTok, Substack, blogs, LinkedIn (!), and podcasts, producing that content through those very same channels, and blurring the lines between news and fiction, fact and propaganda, truth and deception — deploying the same themes that defined the “golden age” of spy art, and may yet define our own.

Do Consumers Care About Privacy? If Not, Antitrust for Privacy is a Fool’s Errand

“Big tech” companies are criticized for their size and surveillance practices. Recent antitrust filings argue big tech size begets platform surveillance and big tech surveillance begets platform size.

The social networking antitrust case brought by state attorneys general alleges that monopolistic platforms harm consumers by degrading the privacy protections they offer over time. The argument is that a platform wins users with sector-leading privacy protections and proceeds to roll those back as the platform gains market power. Since consumers do not pay a fee for social networking services, a claim that consumers are harmed by an alleged monopolistic platform must focus on the quality of its services, and according to state attorneys general, privacy is a principal feature of social network service quality.

“Historically, Personal Social Networking providers have refrained from charging a monetary price for providing Personal Social Networking to users, relying instead on monetizing user data and engagement through advertising. Personal Social Networking providers compete for users based on a variety of factors, including quality of the user experience, functionality, and privacy protections, among other factors.”

Complaint at 13.

The argument that consumers prefer greater privacy protection than big tech provides is always tricky, as every example of an alleged big tech surveillance practice existing in the market is simultaneously an example of the market tolerating that very practice. The antitrust privacy argument seeks to resolve this tension with the explanation that market power enables companies to adopt practices despite consumers’ preferences: without viable alternatives in the market for “Personal Social Networking Services,” consumers must take what they can get, and what they can get is not what they actually want.

In a world where consumers’ revealed preferences on a mass scale suggest they accept the balance of privacy protection presently on offer, how can one discern that consumers demand greater privacy protection than the market actually supplies? The state attorneys general complaint looks to a market entrant platform’s early privacy protection practices and argues those features — along with other aspects of user experience — were a platform’s initial competitive advantage. The intellectual inspiration for the state antitrust suit — Dina Srinivasan’s law journal article — looks to consumer privacy preference polling and surveys to support the claim that consumers value privacy in social networking services. Query whether either of those empirical claims is compelling or dispositive.

As Skee-Lo’s timeless song goes, “I wish I was a little bit taller, I wish I was a baller….” With respect to consumer demand, there is little doubt that ceteris paribus consumers would accept, and plausibly prefer, greater privacy protections. However, where greater privacy protections are traded off against other competing values, who knows to what extent consumers in aggregate will hold them dear.

In theory, increasing competition in the market for personal social networking services would provide an interesting experiment to test the hypothesis that consumers demand greater privacy protections than those presently on offer. However, this raises the question of how little competition in social networking there truly is. The state antitrust suit narrowly defines the relevant market as that for “Personal Social Networking Services” and excludes YouTube from that space. The word “TikTok” does not appear in the complaint; the word “Snapchat” appears once as an oblique reference. In antitrust, market definition matters. It is possible that in a competitive market for social networking services, the relative value that consumers place on privacy protection as traded off against other features is less than what legal theorists and attorneys general assume it is or ought to be.

Market failures happen. Could it be that the aggregate of individual choices with respect to the relative value placed on privacy produces an inefficient distribution of privacy for society as a whole? Maybe. Perhaps digital privacy protection is a value that society demands but just not in the market, and instead must be made manifest through extra-market solutions, such as GDPR-like comprehensive digital privacy legislation. If true, antitrust is the wrong approach on privacy. To discipline the market with non-market values would likely require a “Blue Model”-like accommodation between big tech and big reg: companies with the size to withstand the compliance costs, which, in turn, will favor consolidation.

America Deserves a Second Opinion

Dr. Fauci’s latest admission of a noble lie to the New York Times — that he has “slowly but deliberately been moving the goal posts” of his public herd immunity estimates “partly based on new science, and partly on his gut feeling that the country is finally ready to hear what he really thinks” — drew apt comparisons to the disappointing elite described in Martin Gurri’s The Revolt of the Public.

Antonio García Martínez tweeted in reaction, “I see @mgurri and the unsustainability of elite authority everywhere I look now.”

Dr. Fauci is the perfect archetype of Gurri’s Olympian technocrat, an elite man of the “center” whose mastery of esoteric knowledge has granted him the privilege of distance from the public. His foil, President Trump, is the perfect archetype of Gurri’s tribune of the deplorables, a man of the “border” whose mastery of communicating sectarian rage won him the bully pulpit in order to take the Faucis of the world down a peg.

Dr. Fauci is a good man. He has devoted his life and estimable technical gifts to the public weal. His role in PEPFAR alone, not to mention Operation Warp Speed, should seal his place in the book of noble public servants. Nonetheless, he is a leader elevated in a bygone era whose virtues are wrong for the present age.

In a world of institutional monopoly on technical knowledge and societal narrative — the industrial information ecosystem of the 20th Century — the license to tell the noble lie was one of the highest badges of elite merit. Aspiring technocrats daydreamt in Ivy League libraries of the public crisis during which they might be called upon to lie to the naive face of Joe Public for his own good, all while feeling a warm glow of both intellectual superiority and moral rectitude.

Yet in a world where the flow of information is less like that of a unidirectional command through an industrial hierarchy and more like that of a leaking industrial solvent corroding the pipes that carry it, the noble lie (once revealed) is a grave threat to what shred of public trust in institutions remains.

Revealing the noble lie at this juncture, when public trust in authority is not an abstraction but an essential prerequisite to the widespread adoption of world-saving vaccines, is a devastatingly bad judgment call. It is the kind of call that can only come from someone as bright and well-intentioned as Dr. Fauci when he is loyal to an institutional value system that made the noble lie a key feature of his public-health ministerial portfolio.

If we want a “legitimate hierarchy” — the kind that might be able to achieve the requisite buy-in for a world-saving vaccine — we now require new virtues and values among our leaders.

As Gurri himself writes, “The qualities I would look for among elites to get politics off this treadmill are honesty and humility: old-school virtues, long accepted to be the living spirit behind the machinery of the democratic republic, though now almost lost from sight.”

I leave it to the reader to determine whether the clinical mien of Dr. Fauci evinces humility.

Needless to say, the noble lie is not honesty. Since Dr. Fauci is a good man loyal to public well-being, he should use his technocratic perch to communicate to our budding next generation of elites that the time has come for a new ethic of elite truth-telling that aligns the esoteric and exoteric narratives.

The development of safe and effective COVID-19 vaccines is a triumph of the elite — a technical problem solved. While the elite are competent at solving technical problems, they mistook this ability for the capacity to solve social problems. As Gurri puts it:

Modern government’s original sin is pride. It was elected on a boast — that it can solve any “problem,” even to fixing the human condition — and it endures on a sickly diet of utopian expectations. We now know better. Since the fall of the Soviet Union, we have understood that even the most brutal application of power cannot redeem the human lot.

Martin Gurri, The Revolt of the Public (Stripe Press 2018), 424.

It is revealing that Dr. Fauci’s noble lie centered on the concept of herd immunity. Given the development of multiple safe and effective COVID-19 vaccines, herd immunity becomes largely a function of public adoption of the inoculation, which is in turn largely a function of public trust in authority.
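For context, a minimal sketch of the arithmetic behind such estimates (this is textbook epidemiology, not something the essay derives): the herd immunity threshold H rises with the basic reproduction number R_0, and the vaccination coverage required rises further as vaccine efficacy ε falls, so revised assumptions about either number move the goal posts mechanically.

```latex
H = 1 - \frac{1}{R_0}, \qquad V_{\text{required}} = \frac{H}{\varepsilon}
\qquad \text{e.g. } R_0 = 3 \Rightarrow H \approx 67\%, \quad R_0 = 5 \Rightarrow H = 80\%
```

The point is that the estimate is sensitive to contested inputs, which is precisely what gives a communicator room to shade it in either direction.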

Fauci’s increasing confidence in the public’s willingness to get the jabs led him to reveal that herd immunity was further away than he had previously said. The logic of this communication strategy — to this observer at least — is not immediately obvious. Presumably Fauci’s thinking was that the public should not be demoralized by a goal so far away as to appear unattainable, as that demoralization would lead too many citizens to forgo the vaccine, effectively thinking “since we’ll never get to 90%, what’s the use in me bothering to get the vaccine at all” — call this the “demoralization theory.” This theory of human reasoning, however, is not obviously correct. One can imagine — and in fact has observed — public health communications that make the picture look worse to the public than it actually is in order to inspire action; this approach with respect to herd immunity would make one offer a higher estimate in order to scare the public into getting vaccinated in droves, on the reasoning that otherwise we have no hope at all of beating this thing — call this the “scared straight theory.”

Public health communicators have lied on both sides of the ledger — making the problem seem more manageable than it is and worse than it actually is in order to inspire desirable public action. This inconsistency in the strategy of the noble lie reveals a belief in the value of noble lying itself that is stronger than a belief in either the demoralization or scared straight theories of public action. Again, the elite are competent at solving technical problems (developing a safe and effective COVID-19 vaccine) but incompetent at solving social problems: nudging public behavior in the right direction.

In discussing herd immunity, Dr. Fauci is essentially discussing how much longer we have to live in fear and a state of suspended animation. He acted the part of the parent driver on the road trip who fibs “just a little farther” in response to “are we there yet” in order to forestall a tantrum (or mutiny).

Query, however, what the appropriate metric for the end of our lives in fear and suspended animation should be. When a safe and effective vaccine is in the offing, high levels of caution and risk aversion are eminently rational — we may actually slay this dragon. This observer endorses that approach based on the belief that the dragon will indeed be slain by the technical miracle of our COVID-19 vaccines. Yet even if the dragon were somehow unlikely to be slain, by this observer’s lights, life in the bunker would still need to end at some point. Reasonable precautions like masks and rapid at-home testing would remain sensible, even essential, in a world of an extant dragon and reemergence from the bunker, yet deferring all dreams would not: we can sacrifice one Thanksgiving and one Christmas with loved ones, but any more than that and what life, really, are we preserving in the bunker? Noble lies avoid this question. A healthy and competent society should not be in the business of avoiding difficult questions, nor should its leaders.

The Empire ‘Likes’ Back: Soft Power and Authority in the 21st Century

This file is licensed under the Creative Commons Attribution 2.0 Generic license. Attribution: Firebrace, Wikimedia Commons

The Brink of a 21st-Century Empire?

From the vantage point of 2020, the soft exit of the Duke and Duchess of Sussex, Prince Harry and Meghan (née Markle), from a starchy royal life for a Netflix-backed Royal Life™ is an innocuous tabloid fascination. From the vantage point of centuries of civilization, it is potentially an historic event. The transition of the Duke and Duchess from obedient servants of the Crown to freewheeling influencers of “the streaming platform that brought you The Crown” sits at the nexus of ancient and modern power — of primaeval royalty and digital reach. In this marriage of three estates — the Crown, Hollywood, and Silicon Valley — is the template for a 21st-Century empire.

Rule Britannia

Monarchy is one of the most powerful inventions of the replicators that brought you humanity. It allows a generation of a family to stamp its mark upon the future to a far greater degree than heredity alone does. Whereas the first monarch’s mark upon each generation is successively halved in terms of heredity, her influence is undiminished in terms of political power: Queen Elizabeth II is only 1/16th related to Queen Victoria, but she is every bit the Queen, perpetuating her family’s reign up to this minute.
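The 1/16th figure follows from simple halving, sketched here for concreteness: genetic relatedness after n generations is (1/2)^n, and four generations separate Victoria from Elizabeth II, while the political office passes undiminished.

```latex
r_n = \left(\tfrac{1}{2}\right)^n, \qquad
\text{Victoria} \to \text{Edward VII} \to \text{George V} \to \text{George VI} \to \text{Elizabeth II}
\;\Rightarrow\; n = 4, \; r_4 = \tfrac{1}{16}
```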

Monarchy has been one of humanity’s dominant answers to the questions of what ensures intergenerational cultural persistence; what allows the transfer of political power; and what confers legitimacy upon the seat of political power. It is the story of conquests, constitutions, fratricide, regicide, and revolutions. It is biblical. It is an agonist of the great wars of the 20th Century. It is a force whose legacy continues to shape most of our present world whether we realize it or not.

Monarchy exists in positive law — at the ends of the barrels and bayonets of Lee-Enfield rifles — but also in cultural legitimacy. It was only the revolutions of the 18th and 20th Centuries — American, French, and Russian — that dissociated monarchy from 10,000 years of the natural law itself — the unquestioned order of man, nature, and nature’s God. Based on that track record, monarchy is among the fittest memes ever to exist on Earth in the minds of men.

In two sentences, the memetic fitness of monarchy was dramatically scaled back:

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.–That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, –That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness.

The unanimous Declaration of the thirteen united States of America, July 4, 1776.

While “Rule Britannia” is no longer a regime imposed around the globe by the end of a rifle, its progeny remain the dominant forces defining human life the world over: the English language, empirical science, the United States of America, the industrial revolution, the Anglo-American common law, the transcontinental global market, the Turing machine (or “computer”), and the World Wide Web. In five British names — Newton, Bacon, Locke, Darwin, and Smith — you have the basis of modern science, philosophy, politics, law, and economics. Add in Bentham, Mill, and the King James Bible and you arguably have the basis of modern morality. The sun may have set on the British Empire, but it continues to shine on its memetic, cultural legacy.

The Fittest Meme

The first two decades of the 21st Century have seen, as Martin Gurri puts it, a crisis of authority: the Arab Spring, Brexit, the election and presidency of Donald Trump, the declining sway of legacy media and expert institutions, the balking of international institutions in the face of a global pandemic. Paradoxically, it is the very dominance of the legacy of Rule Britannia — a global anglophone civilization linked together by the technological descendants of British (or at least Anglo-American) inventions, the computer, the Internet, and the modern corporation — that has eroded the authority of the elite class that administers that legacy.

Fittingly, two of the most elite members of the most elite class, the Duke and Duchess of Sussex of the British Royal Family, have decamped from their formal station for greener pastures in the “influencer” game.

Being royal has historically carried the distinct advantage of avoiding market competition. One’s place atop the hierarchy was ensured without having to dirty one’s hands in the rough and tumble of the marketplace. Now, the Duke and Duchess would rather compete than merely sit.

The arbitrariness of a royal status insulated from the social signals of one’s fellow men is what the political Declaration justly struck a great blow against, yet that same insulation was an advantage of the royal system that could (though certainly did not always) redound to the benefit of society as a whole. Where democracy risks (and must provide republican institutions to counteract) tyranny of the majority and tyranny of faction, monarchical tyranny against the majority and against faction could protect minority rights and the general welfare. Not having to respond to the short-term, kinetic demands of the demos could free an enlightened statesman to make unpopular but beneficial decisions for the long-term, static wellbeing of society.

Social media has been maligned for destabilizing democracy. To the extent it does, it is because it overdoses democracies on democracy itself. Every voice — no matter how false, fatuous, or destructive — is heard and can be repeated endlessly. Institutions from universities to corporations to newspapers to governments tread carefully around and cater to the whims of the digital demos. In a world where the lowliest jester can take down the mightiest king, the Duke and Duchess would rather be off the throne and in the pit.

If handled adroitly, Prince Harry and Meghan can wield the power of the old and new worlds. They hold the most elite original brand of the world’s underlying civilization and are now at liberty to unleash it not above the marketplace but within it.

The original advantage of royalty — the ability to avoid competition — still obtains for the Sussexes to a great extent. They did not need to build an app, top a sport, or win a talent competition or an election to reach their current station. The Duchess did need to reach a certain rung of fame and fashion for entrée to her royal status, but its grace is now fully hers. The Sussexes’ followers are baked into the cake. Their royalty makes them inherently interesting regardless of whether they embrace it, renounce it, or do something in between — and regardless of whether the rest of us are aware of the ultimate source of our fascination. Not only are the Sussexes starting on third base in terms of followership, but they also don’t have to perform any justificatory cultural function to retain their renown, like winning a World Cup or making a platinum record.

This market exceptionalism allows the Sussexes to transcend the short-term incentives of the marketplace like the ancient royals before them. They can be famous without having to play to the crowd. They can be truly influential, since they are not captives of their own fans. That is, as long as they understand the original source of their followership and do not debase their power through fan service.

An Empire of Soft Power

It is apt that the Sussexes are taking the reins of a nascent cultural venture just as one of the last emblems of British hard power, Hong Kong, is fully dominated by the CCP. The Sussexes have the opportunity to build an empire of stateless soft power through their royalty and the tools of entertainment and social media.

The modern Windsors are gunpowder monarchs without a true command. Yet the power of the Victorians lay as much in their cultural as in their military force. Its royal imprimatur notwithstanding, Victorianism was the rise of, as Deirdre McCloskey puts it, bourgeois dignity — a universally adoptable manner that could confer dignity upon an individual from any rank of society through habit, not birth.

The Sussexes can trade on the dignity of their “birth” while modeling a universal culture for the 21st Century. With the tools of Hollywood and Silicon Valley coupled to their inherent followership they can wield vast cultural force on multiple continents.

Like the leaders of democracies, social media platforms face some perverse incentives. Since their authority and power are derived from the favor of and engagement by the demos, they risk serving the short-term, kinetic demands of their users instead of their long-term, static interests. This can lead them to trade on the human tendency toward immediate gratification and to get users hooked on informational junk food. Given weaknesses in the human capacity for inter-temporal optimization, a market participant can addict a user by supplying a subjective experience that is a mere iota more pleasurable at the marginal moment than the best foregone alternative — even if the aggregate pleasure it provides is substantially lower (or even negative) in the long run compared to alternatives that failed to appeal at the marginal moment.
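The inter-temporal trap described above can be sketched as a toy simulation. Everything here is an illustrative assumption (the payoff numbers, the `fatigue` mechanic, the function name `simulate`) rather than anything from the essay: a feed that is always a mere iota more pleasurable right now wins every myopic comparison, yet leaves the user with less aggregate pleasure than the steadily forgone alternative.

```python
def simulate(periods=100):
    """Compare a myopic user against the forgone steady alternative."""
    fatigue = 0.0       # accumulated cost of informational junk food
    greedy_total = 0.0  # aggregate pleasure of the moment-by-moment optimizer
    for _ in range(periods):
        junk_now = 1.01 - fatigue       # always an iota better at this moment
        wholesome_now = 1.00 - fatigue  # the best foregone alternative, also dulled
        if junk_now > wholesome_now:    # the myopic comparison: junk always wins
            greedy_total += junk_now
            fatigue += 0.005            # each hit dulls all future enjoyment
        else:
            greedy_total += wholesome_now
    # A user who forgoes the junk feed entirely never accrues fatigue.
    patient_total = periods * 1.00
    return greedy_total, patient_total
```

Running it, the myopic path totals roughly 76 units of pleasure against 100 for the forgone alternative, despite winning every single moment-by-moment comparison.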

Market participants can debase us in their quest to serve us. In that quest, both parties are debased — the pusher and the addict.

If the Sussexes can maintain their long-term resonance regardless of their ability to supply short-term gratification and feed short-term addiction, they can escape the debasement of the commoner influence peddler.

From Wellness to Wellbeing

The high-end influencer game vacuously employs the term “wellness,” a moniker with the connotations of premium-mediocre bourgeois consumption and vague, oatmeal-like aromas.

While the Sussexes may easily fall for the temptation of servicing premium-mediocre consumerism, the high road is premium bourgeois. Rembrandt and the Dutch Golden Age were products of the high-end bourgeoisie. The Sussexes, if they keep their aesthetic discernment about them, can be more Rembrandt than GOOP.

Wellness points to a less mediocre idea. At the heart of modern ethics lie John Stuart Mill’s meditations on “wellbeing,” the ultimate end of liberty. In wellbeing, Mill unites utilitarian and rights-based morality by defining individual wellbeing as a constitutive feature of the good of society as a whole.

A soft-power empire devoted to maximizing human wellbeing is the highest road the Sussex project might take. This road is available to the Sussexes to the extent they can avoid the doom loop of fan service and fan addiction. Escaping perverse incentives, they can focus on their followership’s wellbeing.

This would involve choosing effective altruism over affective altruism in their charitable work: evidence-based high-yield projects over preening photo ops. It would look to Esther Duflo, Abhijit Banerjee, and Raj Chetty for intellectual heft and inspiration.

It would also involve cultivating artistic projects that strive for transcendent excellence over fashionableness. It would look to Roger Scruton, Simon Schama, and Kanye West for inspiration.

Is any of this remotely likely? Who knows. The centrality of mental health to some of Harry’s projects is a promising sign. The tools for individuals to captain their own souls and seek static over kinetic pleasures are essential to wellbeing properly understood and overcoming the doom loop of an addiction-based market. Self-possessed individuals can make market signals themselves point to healthier demands and help raise society to the plane of a virtuous cycle.

If the Sussexes can uncouple high status from high price, untether fact from fashion, and make quality, tasteful design more widely discernible and attainable, they will have launched a project deserving of the word noble.

Currently they are a skunk works for the British Royal Family as a whole. If they succeed, the Royal Family can co-opt their success. We will know this has come to pass if the Empire ‘likes’ them back.

Coverups, Corruption, and inCapacity

China has some impressive institutions. One can decry the immorality of the CCP’s concentration-camp-building surveillance state and obfuscation of a burgeoning global pandemic while also acknowledging, for example, that China’s disease tracing program bore impressive results. (Query whether the U.S. or any Western state would or should (!) tolerate the involuntary commitment of babies and the publishing of an infected person’s entire post-infection whereabouts on a municipal WeChat account.)

Repeatedly, intellectuals in Western democracies embarrass themselves salivating over the perception of uber-competence in authoritarian competitor states. Nonetheless, it can be true, for example, that the Soviets both built Potemkin props to seduce Western useful idiots and also built a genuinely impressive roster of mathematicians (though query whether the Soviets or a deeper Russian civilization did the latter).

While there is something to admire in the speed with which China can stand up a new rail station or construct an airstrip on an artificial island, there is also reason to believe the edifice of CCP state capacity has cracks in its foundation. Literally.

An interesting analysis by Ian Storey in The Diplomat argues that the runways on China’s ostensibly fearsome forward operating bases in the Spratly Islands may be too poorly built (hastily laid and at risk of subsidence) to sustain a viable jet fighter presence and establish local dominance.

In addition to inherent engineering challenges with artificial islands, corruption may have hurt the strategic project. Storey writes:

Doubts about the structural integrity of the artificial islands are amplified when the issue of corruption is considered. Despite President Xi Jinping’s anti-graft campaign, corruption in China remains endemic, including in the military-industrial complex. For instance, in July 2019 Su Bo, who oversaw the construction of China’s first domestically produced aircraft carrier, the Liaoning, was convicted of corruption and jailed for 12 years. And in May 2020, Hu Wenming, the head of China’s aircraft carrier construction program, was arrested and charged with corruption and passing secrets to foreign powers. Corruption in the building industry leads to short cuts and shoddy construction.

Ian Storey, “Why Doesn’t China Deploy Fighter Jets to the Spratly Islands?,” The Diplomat (Aug. 14, 2020).

While there is much to admire (and perhaps fear) in the glittering hardware markets of Shenzhen, do not underestimate the extent to which corrupt institutions will bring down even the most technically competent civilizations. Even the most positive interpretations of China’s failure to warn the world about the COVID-19 outbreak pin the blame on the perverse incentives within the CCP regime. As the New York Times reports, “Local officials often withhold information from Beijing for fear of reprisal, current and former American officials say.” This is a manifestly unhealthy and unsustainable information-conveying mechanism. Much is made of the Chinese surveillance state’s panoptic omniscience — and yet at best the story of COVID-19’s escape is one of Beijing’s inability to discern that middling bureaucrats from the provinces were hiding the onset of one of the greatest human crises in a century.

There will always be dark truths within a society. In many (if not most, if not all) cases, acknowledging and handling that reality is the best path forward. A civilization with institutions too corrupt to detect, surface, and mitigate internal and global threats in a timely fashion is not one that anyone desiring a healthy future should wish to emulate.

My latest for Exponents: Green Tech and the Post-Decadent Society

I argue in Exponents:

Ultimately, it is alchemy – cultural and technological creativity – or as Douthat puts it, the capacity to “imagine and work toward renewal and renaissance” that counteracts decadence. So where to begin the post-decadent project? Well, green technology – to avoid the worst of climate change and, perhaps, transcend yet more spiritual disillusionment.

In the discussion of stalled growth, Douthat paraphrases Tyler Cowen for the proposition that green technologies are “defensive” in nature – helpful for sustainability but “not a world-altering innovation in the style of the steamship or the airplane or the gas-powered automobile that it aspires to replace.”

Douthat’s reservations here are overstated, as sustainability is underrated. In Cowen’s own case for the morality of economic growth, Stubborn Attachments, the watchword is “Wealth Plus,” with the “Plus” encompassing the essential feature of environmental and societal sustainability. To the extent the crisis in cultural confidence Douthat depicts is the result of deceleration and subtle backsliding, even modest steps forward are vital inflection points and potential origins for yet greater ambition. With further imagination, green tech is the beginning of our renaissance.

Imagine a grand bargain that sees a carbon tax enacted alongside wise deregulation – of the Saul Griffith, not C. Montgomery Burns, variety – to make nuclear and solar energy more affordable and devote a greater share of national GDP to research and development. Imagine, as Peter Thiel invokes in a review of The Decadent Society, compact nuclear reactors cheaply powering, with zero carbon emissions, the factories to produce and the electricity to fuel Teslas for mass consumption. Imagine those same reactors powering hyperloops to carry commuters from abundant, affordable, and aesthetic carbon-sink housing to prosperous urban cores. Imagine in the heart of the city (as well as the cloud) Stripe University – educating minds for creativity, not credentialism – where the Department of Progress Studies is pioneering institutional incentives to speed the replacement of compact fission with fusion reactors.

Imagine a reporter from the Srinivasan Post – her retirement fund and children’s college funds secured with assets from initial coin offerings that seeded transformational technologies – delving into her latest in a series of investigations of fusion technology, which help scientists, regulators, venture capitalists, and the public parse PR puffery from bona fide breakthroughs. Imagine students and researchers the world over inspired by those articles to make significant contributions to fusion and other digital science hub repositories – from the COVID-25 vaccine to the Martian Excursion Module repos – in return for their own coin stakes in that technological progress via the a16z “It’s Time to Build” exchange.

Is this future far-fetched? As Douthat might say, have some faith – and let’s get to work.

Check out the rest!

The Transience of Memory: Let’s Remember Where We Parked

In a win for innovation, Eric Schmidt apparently helped the U.S. Air Force apply software to mid-air refueling. The New York Times reports:

At an Air Force facility in Qatar in 2016, Mr. Schmidt visited officers who scheduled flight paths for the tankers that refueled planes. They used a white board and dry-erase markers to set the schedule, taking eight hours to complete the task.

Mr. Schmidt said he recalled thinking, “Really? This is how you run the air war?” Afterward, he and others at the Defense Department worked with the tech company Pivotal to ship software to the officers.

This sounds like a welcome efficiency gain. I just hope we remember how to use the white boards. I would bet there is accumulated knowledge and cognitive skill embodied in the expert flight-path whiteboarders that we cannot fully understand or replicate without either carefully recording the current state of knowledge or retracing, over time, the same slow path by which that knowledge evolved in the first place.

Don’t underestimate our capacity to forget how to perform complex and essential processes following a periodic lapse in use. There was a period when we lost the knack for manufacturing a key ingredient in nuclear warheads. At one point, there were fears we had lost the capacity to manufacture F-22s. While those fears proved unfounded when an audit uncovered the necessary tooling, the scare itself illustrates the difficulty of regaining embodied knowledge once it goes missing.

This is where Schumpeter meets Hayek. Creative destruction can birth a better way, but it also means that gradually-accumulated and embedded knowledge will be razed.

Even elephants can go missing from the historical record. In the jaws of the COVID-19 pandemic, historians and economists ponder how much civilization forgot about the 1918 flu, not to mention the 1957 flu, not to mention our 2006 preparations for pandemic flu.

Given the difficulty of holding onto low-frequency, high-amplitude knowledge, one might expect competent civilizations to produce and then hold sacred value-laden documents, such as constitutions, religious texts, and commandments.

My first for Exponents: Neoliberalism – The Cause of Surveillance Capitalism?

Friedrich von Hayek, Austrian economist and political philosopher, in Gothenburg, 1981.

I argue in Exponents:

Protecting sanctuary in the home against encroaching technology is valued by diverse thinkers. Hayek and Zuboff would seem to agree that the economic logic of the market can run amok when it breaches the threshold of the private home. Yet instead of dismantling the economic engine, we ought to let it operate in its appropriate domain. The role of law in confronting the loss of contextual integrity due to modern technology can be to preserve the historic separation between the intimate and extended orders. The home could be made safe from even more digital prying eyes. The power of Big Tech can be harnessed to optimize the globalized world, while at the same time it is kept out of the modern hearth.

Do read the whole thing.

Preference Cultivation and Overcoming Inertia

Alexis de Tocqueville

For many problems, local and preference-sensitive solutions are more efficient and effective than distant fiats (e.g., U.S. university-based network computing was far more advanced than Soviet computing). However, local preferences themselves can be inefficient (e.g., single-family homeowners opposing new construction). Between market mythologizing and top-down central planning lies the option of preference cultivation.

Local cruft and inertia often need to be fracked. But centrally chosen alternatives leave much to be desired. For example, well-run charter schools employing Common Core materials can outperform stagnant district schools, but the Common Core can stymie good teachers. Centrally imposed solutions might raise the floor but lower the ceiling.

What if we could raise the floor against the wishes of vested and complacent special interests opposing reform, but also raise the ceiling against arrogant technocrats demanding standardization? What if local preferences themselves could be reformed without the need for preference-denying guardrails? It would be like having a market that eschewed fast food instead of requiring a mayor to tax or ban it.

Incentives and self-interest matter. An individual teachers’ union member can understand where her bread is buttered and oppose non-unionized charters competing for students and per-pupil funding. That’s rational. But it may not be rational on a longer time horizon. One generation may benefit from an inefficient allocation, but the next will be left with underperforming schools and unaffordable public employees.

Civilization is premised on one generation caring about the welfare of the next. Somewhere among those with vested interests lies concern (even if overshadowed by other considerations) for the long-term well being of the community or at least one’s own progeny. (Note: civilization will not work without progeny.)

Individuals’ understanding of self-interest can be expanded to “self-interest properly understood,” à la Tocqueville. Horizons can be broadened, consequences ascertained, lessons learned, and tastes shaped. Rejecting those premises would mean rejecting any possibility of education, political persuasion, or cultural transformation. Both the left and the right would seemingly have to accept those premises; otherwise, why do people ever open their mouths?

If education is at all possible, preferences can be changed. Preparing market and political participants to better understand their own self-interest as enmeshed in long-term community values, and cultivating their preferences accordingly, should lead to better outcomes.
