Friday, October 7, 2011

Hacking the Warlord Complex



The Presidential campaign has settled into full quest mode this week, as the pool of candidates locks in. Christie and the Rogue are staying on the sidelines. The Republican voters who would dethrone a demonized Obama have lost the possibility of a better choice than the current stable. For my adult lifetime, the Republicans have been waiting for the second coming of Ronald Reagan, and unsurprisingly, he's really dead.

Across the aisle, Democratic voters wait for the President to more emotively channel their feelings, their aspirations for proto-utopian benediction, a rekindling of that feeling they had on the night he won. Unfortunately, HOPE is a little more complicated when it is expressed through MQ-9 Reaper drones.



Meanwhile, a somewhat eclectic collection of dissenters is busy trying to #Occupy the abstraction, declaring that it is not looking for leaders. While there are things about the Occupy movement that seem pretty old school (like their admonitions to bring back the Glass-Steagall Act of 1933, which mandated the separation of investment banking and deposit-taking until its repeal in 1999, as if that were a silver bullet to kill the vampire infestation of Capital), the idea of the leaderless opposition movement is very much in tune with the Zeitgeist. Networks are the dominant organizational model of the 21st century, and they don't have heads. Perhaps that's what took the media so long to pay attention—the lack of a figurehead really hacks the master narrative (the same way the media desperately tried to elect Mohamed ElBaradei the leader of the Egyptian movement when he parachuted in from his plush life in the West).



The Presidential election season really stands in profound contrast to the movements we have seen all over the world this year. We have come to take voting for granted, as a kind of old world civic duty, structured as a consumer choice between two similar products. Coke or Pepsi? "Politics practiced as a branch of advertising," as JG Ballard noted. The liberated vigor that voting represented when it was a new freedom achieved through revolutions against capricious monarchs has degenerated into an emphysemic wheezing, as we watch our mature republics struggle to navigate a radically morphing world. Is it too heretical to question whether 18th century political structures are really up to the task of managing the 21st century world?



To me, it is self-evident that contemporary human social networks mediated by computing technology are naturally evolving to provide a more complete and participatory means for our governance, one that is likely to radically change existing republican political systems in the same way the tech boom of the 1990s challenged monolithic corporate powers that had evolved in the 20th century. I think any development that lessens the concentration of power in any particular individual or group is a good development, one that will promote a healthier and freer society. But I also can't help but wonder: is there some inherent human need to elect chiefs that one is foolish to think can be changed? Can you really have a human society that is not structured as a pyramid with one dude at the top, expressing superior power to maintain order?



Consider the fact that, at its root, the Anglo-American legal system is based on the methods a family of nomadic warlords developed to administer the territory of England after they had conquered it under force of arms. Our property laws largely evolved from the means used to settle disputes between the warlord's senior minions over the respective lands they were charged with running to maintain dominion. Is it really surprising when you hear the Russian intelligentsia whine about how the people don't really want democracy—they just want Putin the tiger hunter to maintain order and the pride of the nation? The fact that basically every corporation is structured like a medieval military band with a single chief at the top, only periodically accountable to a board of elders or tribal stakeholders, says a lot about the natural order of things. Are we always waiting for the return of the King?



One can't help but wonder whether the great danger of atomizing the distribution of power through new constitutional codes of the network wouldn't just expose us more to a mob that can be manipulated by a strongman that knows how to push their buttons. Network-based movements give me great hope for the potential for a more authentic democracy. But they also make me wonder: what would Goebbels do with Facebook?

Sunday, October 2, 2011

To the App-Cave!

Going over my notes and recollections from FenCon last weekend, this one stands out even among a LOT of news, good advice and reading recommendations. Pyr editor Lou Anders showed a few of us this fantastic iPad/iPhone app for reading comic books digitally. It's called ComiXology, and I found it very nicely reviewed at a British tech review website.

DC and Marvel are digitally available this way and so are plenty of edgier, bolder independent comics. Oh-oh. I have never felt an itch to invest in an iPad until now. I've always loved looking at comic books, and this app gives you a full-color, flexible view that can sequence through the panels in order and then zoom to the full page for the layout to be appreciated. Between that and the enormous amount of talent doing comics these days - wow.

Friday, September 30, 2011

Robert's Rules of Emoticon Order



It is an interesting thing to watch the tired old Palestinian fighters, who have spent their whole lives competing with the Israelis for control of the same soil, present their petition for statehood to the United Nations. In part because the idea that such a process even exists is so science-fictionally cool: it represents the possibility of creating new states—maps do change, as we have all seen in our lives, and the possibilities for how much they could change are theoretically boundless. The criteria are pretty simple: you need to have some real estate over which you exercise internal and external sovereignty, a permanent population, a government, and the capacity to enter into relations with other sovereign states. Simple enough, but for the unspoken part about the other people who might think it's *their* sovereignty to exercise.



Imagine, if you will, a near-future Texas—say fifty years out—whose demographics have radically changed, such that the only Perry that could ever be elected to statewide office would be a Perez. It is not hard to imagine such a sub-state of the United States deciding it wanted to return to becoming a sovereign state of its own. And accomplishing such an act isn't really about the legalities under the flaccid regime of international law, the law of an imaginary sovereign with no real ability to enforce its edicts, but about the military ability to keep out occupying armies and the political ability to secure diplomatic recognition. Just ask the Confederates—the political theory underpinning our legal systems has never really articulated a coherent legal code defining when and how new states can be created within existing ones. Which doesn't stop a whole lot of free-thinking iconoclasts from trying.


[Pic: President Kevin Baugh of the Republic of Molossia, aka a piece of land outside Reno, Nevada.]

Of course Abbas wants to get after the issue now, in the fall of the Arab Spring, as the incipient leaders of post-revolution territories debate their visions for a 21st century Arab state. But to do so also seems very anachronistic, when we are in a historical moment that reveals the culture of the Jewish diaspora as a much more relevant model for the organization of peoples than a piece of land with a fence around it and some ruling fathers running the rancho. Isn't the real power of the modern Israeli state based on the pre-Westphalian power of the transnational, inter-state network of supporters, who want the state because it articulates the existence of the network in the only terms that were understood by the post-colonial, post-WWII rulers of the world atlas?

In this century, the network is a more compelling model for the polity than the nation state.



The signs are all there in the outstanding roundup by Nicholas Kulish at The New York Times of the post-democracy movements emerging around the world—As Scorn for Vote Grows, Protests Surge Around Globe.

Surprise: the generations raised in cyberculture don't take the truths of constitutional democracy as self-evident.

Increasingly, citizens of all ages, but particularly the young, are rejecting conventional structures like parties and trade unions in favor of a less hierarchical, more participatory system modeled in many ways on the culture of the Web.

In that sense, the protest movements in democracies are not altogether unlike those that have rocked authoritarian governments this year, toppling longtime leaders in Tunisia, Egypt and Libya. Protesters have created their own political space online that is chilly, sometimes openly hostile, toward traditional institutions of the elite.

The critical mass of wiki and mapping tools, video and social networking sites, the communal news wire of Twitter and the ease of donations afforded by sites like PayPal makes coalitions of like-minded individuals instantly viable.

“You’re looking at a generation of 20- and 30-year-olds who are used to self-organizing,” said Yochai Benkler, a director of the Berkman Center for Internet and Society at Harvard University. “They believe life can be more participatory, more decentralized, less dependent on the traditional models of organization, either in the state or the big company. Those were the dominant ways of doing things in the industrial economy, and they aren’t anymore.”

Yonatan Levi, 26, called the tent cities that sprang up in Israel “a beautiful anarchy.” There were leaderless discussion circles like Internet chat rooms, governed, he said, by “emoticon” hand gestures like crossed forearms to signal disagreement with the latest speaker, hands held up and wiggling in the air for agreement — the same hand signs used in public assemblies in Spain. There were free lessons and food, based on the Internet conviction that everything should be available without charge.

Someone had to step in, Mr. Levi said, because “the political system has abandoned its citizens.”

The rising disillusionment comes 20 years after what was celebrated as democratic capitalism’s final victory over communism and dictatorship.

In the wake of the Soviet Union’s collapse in 1991, a consensus emerged that liberal economics combined with democratic institutions represented the only path forward. That consensus, championed by scholars like Francis Fukuyama in his book “The End of History and the Last Man,” has been shaken if not broken by a seemingly endless succession of crises — the Asian financial collapse of 1997, the Internet bubble that burst in 2000, the subprime crisis of 2007-8 and the continuing European and American debt crisis — and the seeming inability of policy makers to deal with them or cushion their people from the shocks.

Frustrated voters are not agitating for a dictator to take over. But they say they do not know where to turn at a time when political choices of the cold war era seem hollow. “Even when capitalism fell into its worst crisis since the 1920s there was no viable alternative vision,” said the British left-wing author Owen Jones.




Will the law students of the future learn Robert's Rules of Emoticons? It seems very likely to me. As suggested in last month's post, In the Panopticon, no one can hear your reboot, it seems indisputable that contemporary networking technologies present more compelling tools for the construction of direct democracy than have ever existed. These under-40s all over the world who are the natives of the realm of those technologies are naturally forming their own political networks using those tools. And these imminent polities may violate all the geopolitical conventions of land, language, and ethnicity.

Geopolitics isn't going away, but it is going to have its work cut out for it dealing with the emerging 21st century cyberpolitics.

What will the United Nations Security Council do about sovereign polities that assert themselves in the ethereal space of the network, even controlling resources and behaviors through the systems of the network, without needing to wall in any segments of the physical world?

What will happen when a virtual world secedes from the jurisdiction of the governments of the physical world?

What happens when a virtual polity decides to assert dominion over the physical world?



This mode seems the first really viable alternative approach to political choice, and the idea of democratic representation, to emerge in a long time. The NYT piece tries to place it within the existing dualistic right/left paradigm, but that kind of watered down Hegelian dialectic doesn't really have any place in the network. A network parliament would be a polyphony. A network parliament, in all likelihood, wouldn't be a parliament—it would be the People.

In a time when modern Greece is crumbling as a sovereign republic, is it too utopian to imagine the planet as a virtual Athens, governed by a network-enabled direct democracy? It is certainly a scary idea for the power elites of the world, the rulers of all the contemporary republics quietly scornful of popular opinion while relentlessly pandering to it and manipulating it in their own political and financial interests. And the American Founders would have no trouble scaring us with the idea of how horrific it could be to live in a society ruled by an Internet mob.

Lots to worry about in how to construct effective operating systems for that sort of polity, but the truth that seems self-evident to me is that we need to start tackling those tasks in earnest, because it's already starting to happen.

Tuesday, September 27, 2011

Frankenstein's Moon

One of the cool things about my day job is that sometimes it intersects with my genre leanings in a big way. Take Frankenstein's Moon as a case in point. This is how I spent much of last week, distilling a full-blown Sky & Telescope article down to a media-release-sized writeup, balancing readability with accuracy. Not always an easy task, especially when there's a lot of research and technical nuance involved. Practice helps, though. In the past, we've worked on similar projects connecting Edvard Munch's painting The Scream with Krakatoa, settled a date conflict regarding Caesar's invasion of Britain, offered a convincing new date for the ancient battle of Marathon and solved Walt Whitman's meteor mystery, among many others. Fun stuff, that!

The current Frankenstein piece seems to be capturing popular attention as well. Already it has resulted in a nice article in The Guardian, which has been reprinted in quite a few British newspapers. Another article, written by Jim Forsythe at WOAI in San Antonio, has been picked up by Reuters and has shown up all over the world, including on MSNBC. So yeah, we've got lots of Frankenstein to enjoy here at the end of September.

One cool bit of conflation didn't make it into the media release, but is touched upon in the full article. Allow me to slip into Jess Nevins mode for a moment (although Jess would likely scoff that this is common knowledge) to explain. During the original "ghost story" challenge mentioned below, Mary Shelley was the only participant to actually finish a written piece begun at that time. Lord Byron began one, but soon lost interest and abandoned it. John Polidori, however, took up that fragment some time later and was inspired to write The Vampyre, published in 1819. The story was an immediate success, in part, no doubt, because the publisher credited it as written by Lord Byron (Polidori and Byron fought for some years to get the attribution corrected in subsequent printings). The Vampyre was the first fiction to cast the legendary bloodsuckers as an aristocratic menace, and spawned a popular trend of 19th century vampire fiction that culminated with Bram Stoker's enduring Dracula in 1897. Which means the two most famous horror icons of 20th century pop culture--Dracula and Frankenstein's monster--can both trace their lineage back to that 1816 gathering at Villa Diodati overlooking Lake Geneva.
Frankenstein’s moon: Astronomers vindicate account of masterwork

Victor Frankenstein’s infamous monster led a brief, tragic existence, blazing a trail of death and destruction that prompted mobs of angry villagers to take up torches and pitchforks against him on the silver screen. Never once during his rampage, however, did the monster question the honesty of his ultimate creator, author Mary Wollstonecraft Shelley.

That bit of horror was left to the scholars.

Now, a team of astronomers from Texas State University-San Marcos has applied its unique brand of celestial sleuthing to a long-simmering controversy surrounding the events that inspired Shelley to write her legendary novel Frankenstein. Their results shed new light on the question of whether or not Shelley’s account of the episode is merely a romantic fiction.

Percy Bysshe Shelley (played by Douglas Walton) and Lord Byron (played by Gavin Gordon) listen as Mary Wollstonecraft Shelley (played by Elsa Lanchester) tells her tale of horror. [Bride of Frankenstein]

Texas State physics faculty members Donald Olson and Russell Doescher, English professor Marilynn S. Olson and Honors Program students Ava G. Pope and Kelly D. Schnarr publish their findings in the November 2011 edition of Sky & Telescope magazine, on newsstands now.

“Shelley gave a very detailed account of that summer in the introduction to an early edition of Frankenstein, but was she telling the truth?” Olson said. “Was she honest when she told her story of that summer and how she came up with the idea, and the sequence of events?”

A Dark and Stormy Night

The story begins, literally, in June 1816 at Villa Diodati overlooking Switzerland’s Lake Geneva. Here, on a dark and stormy night, Shelley—merely 18 at the time—attended a gathering with her future husband, Percy Bysshe Shelley, her stepsister Claire Clairmont, Lord Byron and John Polidori. To pass the time, the group read a volume of ghost stories aloud, at which point Byron posed a challenge in which each member of the group would attempt to write such a tale.

Villa Diodati sits on a steep slope overlooking Lake Geneva. Relatively clear views prevail to the west, but the view of the eastern sky is partially blocked by the hill. A rainbow greeted the Texas State researchers upon their arrival at Lake Geneva. [Photo by Russell Doescher]

“The chronology that’s in most books says Byron suggested they come up with ghost stories on June 16, and by June 17 she’s writing a scary story,” Olson said. “But Shelley has a very definite memory of several days passing where she couldn’t come up with an idea. If this chronology is correct, then she embellished and maybe fabricated her account of how it all happened.

“There’s another, different version of the chronology in which Byron makes his suggestion on June 16, and Shelley didn’t come up with her idea until June 22, which gives a gap of five or six days for conceiving a story,” he said. “But our calculations show that can’t be right, because there wouldn’t be any moonlight on the night that she says the moon was shining.”

Moonlight is the key. In Shelley’s account, she was unable to come up with a suitable idea until another late-night conversation--a philosophical discussion of the nature of life--that continued past the witching hour (midnight). When she finally went to bed, she experienced a terrifying waking dream in which a man attempted to bring life to a cadaverous figure via the engines of science. Shelley awoke from the horrific vision to find moonlight streaming in through her window, and by the next day was hard at work on her story.

Doubting Shelley

Although the original gathering and ghost story challenge issued by Byron is well-documented, academic scholars and researchers have questioned the accuracy of Mary Shelley’s version of events to the extent of labeling them outright fabrications. The traditionally accepted date for the ghost story challenge is June 16, based on an entry from Polidori’s diary, which indicates the entire party had gathered at Villa Diodati that night. In Polidori’s entry for June 17, however, he reports “The ghost-stories are begun by all but me.”

Russell Doescher and Ava Pope take measurements in the garden of Villa Diodati. [Photo by Marilynn Olson]

Critics have used those diary entries to argue Shelley didn’t agonize over her story for days before beginning it, but rather started within a span of hours. Others have suggested Shelley fabricated a romanticized version for the preface of the 1831 edition of Frankenstein solely to sell more books. Key, however, is the fact that none of Polidori’s entries make reference to Byron’s ghost story proposal.

“There is no explicit mention of a date for the ghost story suggestion in any of the primary sources–the letters, the documents, the diaries, things like that,” Olson said. “Nobody knows that date, despite the assumption that it happened on the 16th.”

Frankenstein’s moon

Surviving letters and journals establish that Byron and Polidori arrived at Villa Diodati on June 10, narrowing the possible dates for the evening of Byron’s ghost story proposition to a June 10-16 window. Shelley’s reference to moonlight on the night of her inspirational dream provided a further astronomical clue. To determine on which nights in June 1816 bright moonlight could have shone through Shelley’s window after midnight, the team traveled to Switzerland in August 2010, where Villa Diodati still stands above Lake Geneva.

Ava Pope, Kelly Schnarr and Donald Olson on the steep slope just below Villa Diodati. [Photo by Roger Sinnott]

The research team made extensive topographic measurements of the terrain and Villa Diodati, then combed through weather records from June of 1816. The researchers then calculated that a bright, gibbous moon would have cleared the hillside to shine into Shelley’s bedroom window just before 2 a.m. on June 16, in agreement with Shelley’s witching hour reference. Furthermore, a Polidori diary entry backs up Shelley’s claim of a late-night philosophical “conversation about principles” of life taking place June 15.

Had there been no moonlight visible that night, the astronomical analysis would indicate fabrication on her part. Instead, evidence supports Byron’s ghost story suggestion taking place June 10-13 and Shelley’s waking dream occurring between 2 a.m. and 3 a.m. on June 16, 1816.

“Mary Shelley wrote about moonlight shining through her window, and for 15 years I wondered if we could recreate that night,” Olson said. “We did recreate it. We see no reason to doubt her account, based on what we see in the primary sources and using the astronomical clue.”
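The team's actual reconstruction relied on topographic measurements and detailed ephemeris calculations, but the lunar-phase arithmetic at the heart of it can be roughed out with nothing more than the mean synodic month. The sketch below is a crude stdlib-only approximation, not the researchers' method: it extrapolates backward from a commonly used reference new moon (January 6, 2000, 18:14 UT), so two centuries of extrapolation accumulates errors of several hours to a day, and the hillside-clearance geometry is omitted entirely.

```python
from datetime import datetime, timezone

SYNODIC_MONTH = 29.530588853  # mean length of a lunation, in days
# Reference new moon epoch (a widely used one): 2000-01-06 18:14 UT.
REF_NEW_MOON = datetime(2000, 1, 6, 18, 14, tzinfo=timezone.utc)

def moon_phase_fraction(when: datetime) -> float:
    """Fraction of the lunation elapsed at `when`: 0.0 = new, 0.5 = full.
    Mean-cycle approximation; real lunations vary, so treat the result
    as indicative to within roughly a day at this historical distance."""
    days = (when - REF_NEW_MOON).total_seconds() / 86400.0
    return days / SYNODIC_MONTH % 1.0  # Python's % keeps this in [0, 1)

def rough_phase_name(frac: float) -> str:
    """Coarse label for a phase fraction."""
    if frac < 0.03 or frac > 0.97:
        return "new"
    if abs(frac - 0.5) < 0.03:
        return "full"
    return "waxing" if frac < 0.5 else "waning"

# The night of Shelley's waking dream, per the Texas State team:
dream_night = datetime(1816, 6, 16, 2, 0, tzinfo=timezone.utc)
frac = moon_phase_fraction(dream_night)
print(f"{frac:.2f} of a lunation elapsed -> {rough_phase_name(frac)}")
```

Even this back-of-the-envelope version lands a few days past full on that date, i.e. a waning gibbous moon, which is exactly the kind of moon that rises late in the evening and is still bright after midnight, consistent with moonlight streaming through a window past the witching hour.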

For additional information, visit the Sky & Telescope web gallery at www.skyandtelescope.com/Frankenstein.

Friday, September 23, 2011

Hijacking Flight 117 to the Nostalgia Factory



Fridays are all about escape. All you need to do is flip through the weekend arts section of the newspaper, a menu of evasions of the real. This week it's another upbeat baseball movie from the ubiquitous Brad Pitt, three more variations on corporate life as apolitical action thriller (the commuter version in Drive starring Ryan Gosling, the retro remix version in the Jason Statham/Robert De Niro/Clive Owen reinvention of The Killer Elite (how I wish someone really could channel Peckinpah for our post-GWOT culture), and the teen wolf wet dream version in Abduction of Taylor Lautner (they're not your real parents!)), and best of all, the ever-grunting über-Spartan Gerard Butler in Machine Gun Preacher (aka What Would Jesus Shoot?).



On television, the escape is beyond an alternate present, into an alternate past. The success of Mad Men has shown Hollywood that, in a world where the present is apocalyptic and the future no longer exists, the past is the place we will go to happily watch commercials—indeed, the shows are all commercials for an imagined version of the past, when they're not anachronized versions of our favorite old commercials. Pan Am, The Playboy Club, Boardwalk Empire, even Game of Thrones...we like our product placement to occur in beautifully curated cathode ray nostalgia bubbles. As Alessandra Stanley says in her review of Pan Am, "When the present isn’t very promising, and the future seems tapered and uncertain, the past acquires an enviable luster."



Pan Am even imagines a time when there were cute revolutionaries in our midst imagining a better world: "Christina Ricci plays Maggie, a closet beatnik who wears the Pan Am uniform to see the world but at home listens to jazz and studies Marx and Hegel."



What a perfect semiotic response to the state of things in the world after 9/11, itself an evolved derivative of the Lockerbie Bombing, by imagining oneself eternally flying the airline that represented the dream of a shiny corporate everyday interplanetary 2001? Especially if you revisit the decade that just passed, in Mark Danner's amazing piece in this week's New York Review of Books—"After September 11: Our State of Exception." Danner conveys the catalyzing power of the historical change when wars between states became as relevant as a vintage game of Risk, and the duty of the sentinel was to protect the monolithic state from elusive and conceptually intangible networks:

[M]ake no mistake, the critical decisions laying the basis for the state of exception were made in a state of anxiety and fear. How could they not have been? After September 11, as Richard Clarke put it simply, “we panicked.” Terrorism, downgraded as a threat by the incoming Bush administration, now became the single all-consuming obsession of a government suddenly on a “war footing.”

Every day the President and other senior officials received the “threat matrix,” a document that could be dozens of pages long listing “every threat directed at the United States” that had been sucked up during the last twenty-four hours by the vast electronic and human vacuum cleaner of information that was US intelligence: warnings of catastrophic weapons, conventional attacks, planned attacks on allies, plots of every description and level of seriousness. “You simply could not sit where I did,” George Tenet later wrote of the threat matrix, “and be anything other than scared to death about what it portended.”

One official compared reading the matrix every day—in an example of the ironic “mirroring” one finds everywhere in this story—to “being stuck in a room listening to loud Led Zeppelin music,” which leads to “sensory overload” and makes one “paranoid.” He compared the task of defending the country to playing goalie in a game in which the goalie must stop every shot and in which all the opposing players, and the boundary lines, and the field, are invisible.

All this bespeaks not only an all-encompassing anxiety about information—about the lack of map rooms displaying the movements of armies, the maddening absence of visible, identifiable threats, the unremitting angst of making what could be life-and-death judgments based on the reading and interpreting of inscrutable signs—but also, I think, guilt over what had been allowed to happen, together with the deep-seated need to banish that guilt, to start again, cleansed and immaculate. Thus “the War on Terror”—a new policy for a new era, during which the guardians of the nation’s security could boast a perfect record: no attacks on American soil. The attacks of September 11 would be banished to a “before time” when the “legalistic” Clinton rules applied, before “the gloves came off.” The successful attack could thus be blamed on the mistaken beliefs of another time, another administration. The apocalyptic portal of September 11 made everything new, wiping out all guilt and blame.





No wonder my son, raised in the Cheney/Obama decade, gravitates toward the vinyl records of the 1970s. 9/11 succeeded in flipping the switch that turns our media culture into a giant fear-based psyop on ourselves. If I unleash my Jack Bauer action figure inside the screen of Pan Am, will he fall in love and settle down in the peaceful interregnum that never existed during the long war of the twentieth century? Maybe his lover will be Christina Ricci's New Left stoner, and she and Jack will partner up to foment a revolutionary atemporal mashup in the mediascape of the early 21st century. Better futures are there for those unafraid to leap into the Nietzschean uncertainty of tomorrow.



Isn't it better to punch through the exit door than assume crash positions? Don't believe what the flight attendants tell you: it is always your right to go into the cockpit.

Friday, September 16, 2011

Capitalist tools



I was struck the other day, upon reading this opinion piece within the peachy electronic pages of the Financial Times from UBS senior staffer George Magnus, by what an unusual clarion it was to find within the four corners of a business newspaper:

Financial bust bequeaths a crisis of capitalism
By George Magnus

Financial markets have had a torrid summer of breaking news about slowing global growth, fears over a new western economic contraction and the unresolved bond market and banking crisis in the eurozone.

But these sources of angst have triggered turbulence before, and will continue to do so. Our economic predicament is not a temporary or traditional condition.

Put simply, the economic model that drove the long boom from the 1980s to 2008, has broken down.

Considering the scale of the bust, and the system malfunctions in advanced economies that have been exposed, I would argue that the 2008/09 financial crisis has bequeathed a once-in-a-generation crisis of capitalism, the footprints of which can be found in widespread challenges to the political order, and not just in developed economies.

Markets may actually have twigged this, with equity indices volatile but unable to attain pre-crisis peaks, and bond markets turning very Japanese. But it is not fashionable to say so, not least in policy circles.


The basic theme is not that unusual—plenty of economic doom scenarios can be found throughout the mainstream media these days. What is so unusual is the way the diagnosis is expressed in Marxist terms—"crisis of capitalism" being part of the core lexicon of Capital, a term whose use reveals training in those texts as part of one's toolkit for understanding the contemporary world. To openly state that we are experiencing a "once-in-a-generation crisis of capitalism" is to summon all the Hegelian world historical eschatology of Marx—the idea that there are underlying forces in tension, that will ultimately lead to an endpoint of the current period, and some (hopefully better) period on the other side of the long apocalyptic night.



It's like the Kali Yuga of political economy.

I find the use of this lexicon very refreshing. Not just because of the gravity it invokes, but also because it reveals a healthy intellectual diversity that is largely unknown in the U.S. While anyone who studied economics or politics in the UK (at least before 1990) would have learned a healthy dose of Marx, you would be unlikely to easily find it in any American curriculum other than perhaps European intellectual history, or a dismissive sidebar in your introduction to classical microeconomics.

And you would never see a reference to a "crisis of capitalism" in an American newspaper (especially not a business newspaper), because you would so rarely even see the word "capitalism" used in an American newspaper.

We just talk about "business." Frequent use of the term "capitalism" to describe our political economic system would suggest, heretically, that there might be an alternative system.


[Pic courtesy of the amazing site of The Wanderer.]

Our culture is founded on religious freedom, as we see evidenced every day in the reality-confounding faith-based factual pronouncements of political leaders from both the Coke and Pepsi factions of our two-party system. But there really is no ideological freedom outside of the religious community. We have all the Utopias you can eat, and tolerate them with smiles—so long as they keep their fresh-baked modes of thinking and living confined to the congregation (which may be a secular variation, like the hippie communes of the 60s, or theologically science fictional but culturally successful religious movements like Mormonism). But we do not have an alternative political economy to capitalist Constitutionalism. The Federalist Papers are the real American Talmud, and there is no alternative to the seven Articles of the Founders.

American culture was extremely effective in the twentieth century in suppressing any authentic alternatives, violently excising any European-style revolutionary fervor beginning around the time of World War I. Who knew the first car bomb was a horse-drawn wagon detonated on Wall Street by an Italian-American anarchist who managed to leave a crater at the corner of Wall and Broad in protest of the imprisonment and deportation of other alienated immigrant anarchists? (See Buda's Wagon: A Brief History of the Car Bomb, by Mike Davis.) Socialist thought was appropriated by FDR to navigate the culture through the Depression, but mainly to empower the state to create the military-industrial war machine born in WWII that is the core engine of 21st century American capitalism. By the end of the Clinton administration, ideological difference was largely illusory, and the end of the End of History with 9/11 really did nothing to change that.



So the only place you are likely to find the word "capitalist" in the American mainstream is in a winking necktie from Forbes magazine.

This weekend, while Austin amps up the somatic consumerist revel of ACL Fest, in which our teens are taught to express the illusion of ideological diversity through their choice of bands, the folks from Adbusters are trying to occupy Wall Street. Of course, they do not have an actual ideological agenda—they want an American Tahrir movement, but they seem to barely know what they are protesting (beautifully presented but politically impotent grievances with the soul-crushing ennui of our advertising-based mental environment), let alone what the end goal is, as evidenced by their email to the multitude last night revealing that they can't even decide on their one demand. At least they're trying to break out of the haze.



What they may not perceive is the extent to which the network technologies incubated by the military-industrial complex are now bulldozing the monolithic institutions of the post-Westphalian world in a way no revolutionary cadre could ever imagine. The future is looking to be one of network-based polities and mass-customization, in which the advertising is derived from what's already in your head. These are some very real Hegelian world historical forces rolling over the political economic landscape, and we don't even yet have an "ism" to locate them in our cultural taxonomy.

Monday, September 12, 2011

9/12/11

Yesterday the Soaring Club of Houston couldn’t fly because of Temporary Flight Restrictions having to do with a wildfire to the east of the Field. The last time I remember it being clear, bright weekend weather when we couldn’t fly our sailplanes was the days after 9/11/01, when US aviation was grounded. It’s déjà vu with the perspective of ten intervening years.

Sunday’s Houston Chronicle Editorial pages include a column by Kathleen Parker in which she says, “We stumble at last upon a purpose for columnists – to say that which no one else dares.” This in a column in which she posits that 9/11 caused America to go temporarily insane; that today’s political dysfunction took root in the soil of Ground Zero. Well, in observing the American mindset today, I’ve had to conclude that you can’t understand it without invoking psychopathology, or religion, and in particular, religion and psychopathology intersecting like a Venn diagram of doom.

Earlier this week Thomas Friedman dared too. He said, “. . . rather than use 9/11 to summon us to nation-building at home, Bush used it as an excuse to party — to double down on a radical tax-cutting agenda for the rich that not only did not spur rising living standards for most Americans but has now left us with a huge ball and chain around our ankle. And later, rather than asking each of us to contribute something to the war, he outsourced it to one-half of one-percent of the American people. . . . We used the cold war to reach the moon and spawn new industries. We used 9/11 to create better body scanners and more T.S.A. agents. It will be remembered as one of the greatest lost opportunities of any presidency — ever.” The entire Friedman column is worthwhile reading.