Friday, October 28, 2011

Queen for a day



The first news coming over the network into my ears this morning was bizarrely atemporal for a day of otherwise apocalyptic headlines: the decision of the 16 Commonwealth realms to change a bunch of 17th- and 18th-century statutes and abolish the rule of male primogeniture. Meaning that, if Wills and Kate have themselves a little princess, and then a petit prince after that, the princess still gets daddy's job when he's done. I guess David Cameron has finally discovered the category of reform that he was born to drive.



While Cameron acts as midwife to England's thousand-year experiment in degenerative eugenics, childless physicist Chancellor Angela Merkel, product of an extinct state socialist meritocracy, shows she has more cojones than the big bankers of Europe, calling their bluff in after-midnight negotiations and getting them to write off half of their Greek debt to avert default and potential meltdown.



While she and the other fiduciaries of Europe's pensioners struggle to maintain the 1999 future in a deleveraging world, back in the New World the occupation of the abstractions seems to have actually scratched the nerve of general popular discontent with the American distribution of wealth. Whether the movement will produce any real reform in the absence of any coherent political theory for how things should be organized differently remains to be seen, but when you start to think about large quantities of highly educated and chronically unemployed people mixing with demobilized veterans of our endless wars against other abstractions and seasoned with the radical political contingent that has always been around but largely invisible to the media since 1989, the realm of plausible scenarios becomes a lot more interesting. To the Wall Street Station?



Digesting the smorgasbord of my anachronistic morning newspaper and all these disparate stories, I am struck by how much of it is united by a common thread: the pursuit of liberation from work. The escape from the grinding alienation of life in a capitalist society. For the 1%, by making enough money that you don't need to make any more. For the pensioners, by doing your time dutifully and graduating to an early and lengthy retirement of modest leisure. For the Occupy-ers, perhaps through a more self-expressive and communitarian existence in some alternate system they have been unable to actually articulate.

These all seem to me like variations on a Viagra commercial. The one where the grey but otherwise hot and healthy and implicitly prosperous couple is walking on the beach. Our R-rated, secularized, 21st century version of heaven—the happy variation of life when played by contemporary Capital's rules, the end of alienation we can supposedly obtain by enduring decades of it.



Consider the example of the railroad workers arrested in New York yesterday in a disability fraud scheme that would have extracted $1 billion of pension funds to finance the eternal days off of eleven people who "after claiming to be too disabled to stand, sit, walk or climb steps, retired to lives of regular golf, tennis, biking and aerobics." Is that really what we are all stealing to achieve? Simulations of country club leisure?

No wonder the same front page also reported changes in the rules of golf.

Is the real problem that, in a society that is dehabituating itself from the practice of financing today's consumption with the imaginary income of tomorrow, the idea of that world on the other side of the paycheck is no longer tenable? Once that narrative breaks, the whole thing unravels like a Ponzi scheme. After everyone stops complaining about it, what happens then? Maybe it takes a world without a future to teach people how to live an authentic now—maybe even one in which the golf courses are put to other uses.

Friday, October 21, 2011

How to win revolutions and influence people


[pic: Mohammed el Bibi brandishes Qaddafi's golden gun, courtesy of mirror.co.uk.]

So our mediasphere exploded with revolution porn yesterday, as the West celebrated the death of one of its odder symbiotes. (You knew Qaddafi was doomed weeks ago when you learned his homes were being fire-bombed by Scandinavian F-16s; the image of him being dragged from a culvert by a kid in a Yankees cap and a mullet who made off with his golden gun surely was scripted by some master of psyops.) All of our heads of state and talking heads lined up to celebrate his death as a moral victory of the Libyan people, subtextually reveling in the way the narratives are finally starting to play out in accordance with the American master mold, ever since the death of Osama. For our exhausted republic, sustained by our national myths of popular rebellion against unelected monarchs, actual revolutions on other continents are a very convenient way to provide the People with a vicarious experience of the thing they wish they could do, but don't actually know how to—being so effectively brainwashed with the idea that the current society is the product of revolution. We the People! Or is it Up with People!?



I see the Occupy Austin folks around town, holding up signs saying "Join the Revolution." One of their staging areas is across the street from my home in far East Austin, a neon plant where you can hear them arguing in the wee hours about their messages of the day. It's authentically refreshing to see indigenous protest movements calling out our latest plutocratic disequilibrium, but when I see signs like "Bring Back the American Dream," I have to wonder whether any of these infantilized suburban white boys with look-at-me dreadlocks would be able to chase down a dictator under artillery fire, drag his body out of a drainage ditch, and put a bullet through his head.


[Pic: Get Motivated! billboard, Airport Boulevard, Austin]

At the same time as the Occupy-ers have gotten their signs into our minds, a different set of signs has been springing up all over town, on gigantic billboards along every major thoroughfare—advertisements for the upcoming Get Motivated! seminar featuring an ultimate lineup of Bill Cosby, Colin Powell, Rudy Giuliani, Mary Lou Retton, General Stanley McChrystal and a coterie of platitude-spinning entrepreneurs and salesmen. Only $1.95 per person!



Granted, I have a semiotic soft spot for any seminar organizer with the lunatic genius to bring together America's peppiest little balance beam cheerleader and the Col. Kilgore of the GWOT (the man with his own custom nunchakus, whose command comprised "...a handpicked collection of killers, spies, geniuses, patriots, political operators and outright maniacs...a former head of British Special Forces, two Navy Seals, an Afghan Special Forces commando, a lawyer, two fighter pilots and at least two dozen combat veterans and counterinsurgency experts... [who] jokingly refer to themselves as Team America"). I imagine Mary Lou Retton enlisted by the "retired" General McChrystal as the leader of a next generation Gymkata squad, taking out the world's most flamboyant dictators with a combination of floor show acrobatics, pep squad aphorisms, and napalm. (And featuring Bill Cosby as Alexander Scott, their wacky post-Oscar Goldman handler.)


[Pic: the personal nunchakus of General Stanley McChrystal]

What I really wish is for a way to combine these disparate movements. Get Occupied!? All the more so when I read that the Get Motivated! seminars are a giant scam to sell people a smorgasbord of get-rich-quick schemes. I share the Occupy-ers' desire for radical change in our advertising-based mental environment, but I think an American political movement that attacks the idea of *success* is doomed to failure. Americans only object to wealth when, as with the 1% riding on top of the Great Recession, it becomes perceived as a ruling class that the average person no longer has the opportunity to join. But self-improvement and meritocratic achievement are the real American religions, and opting out of Alpha seems a juvenile political strategy. Think how powerful it would be if someone could appropriate the self-help business values and aspirations that run through American culture from Benjamin Franklin through Warren Buffett in service of an opposition to mega-Capital, Empire, and the class of plutocrats and technocrats they create.



I imagine a slightly shifted reality where radicalized versions of Stan McChrystal, Mary Lou Retton and Joel Osteen evangelize global change through self-help seminars and ubiquitous infomercials—revolution as a get-rich-quick scheme. The chaos of our apocalyptic globe repackaged as a leveling opportunity for political arbitrage, the geopolitical equivalent of a work-from-home program teaching you how to make big money off your neighbors' home foreclosures. Put a little MKULTRA in the Starbucks, and see what happens.

Either that, or wait to see how long before #Occupy gets successfully co-opted by Madison Avenue.

Friday, October 7, 2011

Hacking the Warlord Complex



The Presidential campaign has settled into full quest mode this week, as the pool of candidates locks in. Christie and the Rogue are staying on the sidelines. The Republican voters who would dethrone a demonized Obama have lost the possibility of a better choice than the current stable. For my adult lifetime, the Republicans have been waiting for the second coming of Ronald Reagan, and unsurprisingly, he's really dead.

Across the aisle, Democratic voters wait for the President to more emotively channel their feelings, their aspirations for proto-utopian benediction, a rekindling of that feeling they had on the night he won. Unfortunately, HOPE is a little more complicated when it is expressed through MQ-9 Reaper drones.



Meanwhile, a somewhat eclectic collection of dissenters is busy trying to #Occupy the abstraction, declaring that it is not looking for leaders. While there are things about the Occupy movement that seem pretty old school (like their admonitions to bring back the Glass-Steagall Act of 1933, which mandated the segregation of investment banking and deposit-taking until its repeal in 1999, as if that were a silver bullet to kill the vampire infestation of Capital), the idea of the leaderless opposition movement is very much in tune with the Zeitgeist. Networks are the dominant organizational model of the 21st century, and they don't have heads. Perhaps that's why it took the media so long to pay attention—the lack of a figurehead really hacks the master narrative (the same way the media desperately tried to elect Mohamed ElBaradei the leader of the Egyptian movement when he parachuted in from his plush life in the West).



The Presidential election season really stands in profound contrast to the movements we have seen all over the world this year. We have come to take voting for granted, as a kind of old world civic duty, structured as a consumer choice between two similar products. Coke or Pepsi? "Politics practiced as a branch of advertising," as JG Ballard noted. The liberated vigor that voting represented when it was a new freedom achieved through revolutions against capricious monarchs has degenerated into an emphysemic wheezing, as we watch our mature republics struggle to navigate a radically morphing world. Is it too heretical to question whether 18th century political structures are really up to the task of managing the 21st century world?



To me, it is self-evident that contemporary human social networks mediated by computing technology are naturally evolving to provide a more complete and participatory means for our governance, one that is likely to radically change existing republican political systems in the same way the tech boom of the 1990s challenged the monolithic corporate powers that had evolved in the 20th century. I think any development that lessens the concentration of power in any particular individual or group is a good development, one that will promote a healthier and freer society. But I also can't help but wonder: is there some inherent human need to elect chiefs that one is foolish to think can be changed? Can you really have a human society that is not structured as a pyramid with one dude at the top, expressing superior power to maintain order?



Consider the fact that, at its root, the Anglo-American legal system is based on the methods a family of nomadic warlords developed to administer the territory of England after they had conquered it by force of arms. Our property laws largely evolved from the means used to settle disputes between the warlord's senior minions over the respective lands they were charged with running to maintain dominion. Is it really surprising when you hear the Russian intelligentsia whine about how the people don't really want democracy—they just want Putin the tiger hunter to maintain order and the pride of the nation? The fact that basically every corporation is structured like a medieval military band with a single chief at the top, only periodically accountable to a board of elders or the tribal stakeholders, says a lot about the natural order of things. Are we always waiting for the return of the King?



One can't help but wonder whether the great danger of atomizing the distribution of power through new constitutional codes of the network wouldn't just expose us more to a mob that can be manipulated by a strongman that knows how to push their buttons. Network-based movements give me great hope for the potential for a more authentic democracy. But they also make me wonder: what would Goebbels do with Facebook?

Sunday, October 2, 2011

To the App-Cave!

Going over my notes and recollections from FenCon last weekend, this one stands out even among a LOT of news, good advice and reading recommendations. Pyr editor Lou Anders showed a few of us this fantastic iPad/iPhone app for reading comic books digitally. It's called ComiXology and I found it very nicely reviewed at a British tech review website.

DC and Marvel are digitally available this way and so are plenty of edgier, bolder independent comics. Uh-oh. I have never felt an itch to invest in an iPad until now. I've always loved looking at comic books, and this app gives you a full-color, flexible view that can sequence through the panels in order and then zoom out to the full page so the layout can be appreciated. Between that and the enormous amount of talent doing comics these days - wow.

Friday, September 30, 2011

Robert's Rules of Emoticon Order



It is an interesting thing to watch the tired old Palestinian fighters, who have spent their whole lives competing with the Israelis for control of the same soil, present their petition for statehood to the United Nations. In part because the idea that such a process even exists is so science-fictionally cool: it represents the possibility of creating new states—maps do change, as we have all seen in our lives, and the possibilities for how much they could change are theoretically boundless. The criteria are pretty simple: you need some real estate over which you exercise internal and external sovereignty, a permanent population, a government, and the capacity to enter into relations with other sovereign states. Simple enough, but for the unspoken part about the other people who might think it's *their* sovereignty to exercise.



Imagine, if you will, a near-future Texas—say fifty years out—whose demographics have radically changed, such that the only Perry that could ever be elected to statewide office would be a Perez. It is not hard to imagine such a sub-state of the United States deciding it wanted to return to becoming a sovereign state of its own. And that to accomplish such an act isn't really about the legalities under the flaccid regime of international law, the law of an imaginary sovereign with no real ability to enforce its edicts, but about the military ability to keep out occupying armies and the political ability to secure diplomatic recognition. Just ask the Confederates—the political theory underpinning our legal systems has never really articulated a coherent legal code defining when and how new states can be created within existing ones. Which doesn't stop a whole lot of free thinking iconoclasts from trying.


[Pic: President Kevin Baugh of the Republic of Molossia, aka a piece of land outside Reno, Nevada.]

Of course Abbas wants to get after the issue now, in the fall of the Arab Spring, as the incipient leaders of post-revolution territories debate their visions for a 21st century Arab state. But to do so also seems very anachronistic, when we are in a historical moment that reveals the culture of the Jewish diaspora as a much more relevant model for the organization of peoples than a piece of land with a fence around it and some ruling fathers running the rancho. Isn't the real power of the modern Israeli state based on the pre-Westphalian power of the transnational, inter-state network of supporters, who want the state because it articulates the existence of the network in the only terms that were understood by the post-colonial, post-WWII rulers of the world atlas?

In this century, the network is a more compelling model for the polity than the nation state.



The signs are all there in the outstanding roundup by Nicholas Kulish at The New York Times of the post-democracy movements emerging around the world—As Scorn for Vote Grows, Protests Surge Around Globe.

Surprise: the generations raised in cyberculture don't take the truths of constitutional democracy as self-evident.

Increasingly, citizens of all ages, but particularly the young, are rejecting conventional structures like parties and trade unions in favor of a less hierarchical, more participatory system modeled in many ways on the culture of the Web.

In that sense, the protest movements in democracies are not altogether unlike those that have rocked authoritarian governments this year, toppling longtime leaders in Tunisia, Egypt and Libya. Protesters have created their own political space online that is chilly, sometimes openly hostile, toward traditional institutions of the elite.

The critical mass of wiki and mapping tools, video and social networking sites, the communal news wire of Twitter and the ease of donations afforded by sites like PayPal makes coalitions of like-minded individuals instantly viable.

“You’re looking at a generation of 20- and 30-year-olds who are used to self-organizing,” said Yochai Benkler, a director of the Berkman Center for Internet and Society at Harvard University. “They believe life can be more participatory, more decentralized, less dependent on the traditional models of organization, either in the state or the big company. Those were the dominant ways of doing things in the industrial economy, and they aren’t anymore.”

Yonatan Levi, 26, called the tent cities that sprang up in Israel “a beautiful anarchy.” There were leaderless discussion circles like Internet chat rooms, governed, he said, by “emoticon” hand gestures like crossed forearms to signal disagreement with the latest speaker, hands held up and wiggling in the air for agreement — the same hand signs used in public assemblies in Spain. There were free lessons and food, based on the Internet conviction that everything should be available without charge.

Someone had to step in, Mr. Levi said, because “the political system has abandoned its citizens.”

The rising disillusionment comes 20 years after what was celebrated as democratic capitalism’s final victory over communism and dictatorship.

In the wake of the Soviet Union’s collapse in 1991, a consensus emerged that liberal economics combined with democratic institutions represented the only path forward. That consensus, championed by scholars like Francis Fukuyama in his book “The End of History and the Last Man,” has been shaken if not broken by a seemingly endless succession of crises — the Asian financial collapse of 1997, the Internet bubble that burst in 2000, the subprime crisis of 2007-8 and the continuing European and American debt crisis — and the seeming inability of policy makers to deal with them or cushion their people from the shocks.

Frustrated voters are not agitating for a dictator to take over. But they say they do not know where to turn at a time when political choices of the cold war era seem hollow. “Even when capitalism fell into its worst crisis since the 1920s there was no viable alternative vision,” said the British left-wing author Owen Jones.




Will the law students of the future learn Robert's Rules of Emoticons? It seems very likely to me. As suggested in last month's post, In the Panopticon, no one can hear your reboot, it seems indisputable that contemporary networking technologies present more compelling tools for the construction of direct democracy than have ever existed. These under-40s all over the world who are the natives of the realm of those technologies are naturally forming their own political networks using those tools. And these imminent polities may violate all the geopolitical conventions of land, language, and ethnicity.

Geopolitics isn't going away, but it is going to have its work cut out for it dealing with the emerging 21st century cyberpolitics.

What will the United Nations Security Council do about sovereign polities that assert themselves in the ethereal space of the network, even controlling resources and behaviors through the systems of the network, without needing to wall in any segments of the physical world?

What will happen when a virtual world secedes from the jurisdiction of the governments of the physical world?

What happens when a virtual polity decides to assert dominion over the physical world?



This mode seems like the first really viable alternative approach to political choice, and to the idea of democratic representation, to emerge in a long time. The NYT piece tries to place it within the existing dualistic right/left paradigm, but that kind of watered down Hegelian dialectic doesn't really have any place in the network. A network parliament would be a polyphony. A network parliament, in all likelihood, wouldn't be a parliament—it would be the People.

In a time when modern Greece is crumbling as a sovereign republic, is it too utopian to imagine the planet as a virtual Athens, governed by a network-enabled direct democracy? It is certainly a scary idea for the power elites of the world, the rulers of all the contemporary republics quietly scornful of popular opinion while relentlessly pandering to it and manipulating it in their own political and financial interests. And the American Founders would have no trouble scaring us with the idea of how horrific it could be to live in a society ruled by an Internet mob.

Lots to worry about in how to construct effective operating systems for that sort of polity, but the truth that seems self-evident to me is that we need to start tackling those tasks in earnest, because it's already starting to happen.

Tuesday, September 27, 2011

Frankenstein's Moon

One of the cool things about my day job is that sometimes it intersects with my genre leanings in a big way. Take Frankenstein's Moon as a case in point. This is how I spent much of last week, distilling a full-blown Sky & Telescope article down to a media-release-sized writeup, balancing readability with accuracy. Not always an easy task, especially when there's a lot of research and technical nuance involved. Practice helps, though. In the past, we've worked on similar projects connecting Edvard Munch's painting The Scream with Krakatoa, settled a date conflict regarding Caesar's invasion of Britain, offered a convincing new date for the ancient battle of Marathon and solved Walt Whitman's meteor mystery, among many others. Fun stuff, that!

The current Frankenstein piece seems to be capturing popular attention as well. Already it has resulted in a nice article in The Guardian, which has been reprinted in quite a few British newspapers. Another article, written by Jim Forsythe at WOAI in San Antonio, has been picked up by Reuters and has shown up all over the world, including MSNBC. So yeah, we've got lots of Frankenstein to enjoy here at the end of September.

One cool bit of conflation didn't make it into the media release, but is touched upon in the full article. Allow me to slip into Jess Nevins mode for a moment (although Jess would likely scoff that this is common knowledge) to explain. During the original "ghost story" challenge mentioned below, Mary Shelley was the only participant to actually finish a written piece begun at that time. Lord Byron began one, but soon lost interest and abandoned it. John Polidori, however, took up that fragment some time later and was inspired to write The Vampyre, published in 1819. The story was an immediate success, in part, no doubt, because the publisher credited it as written by Lord Byron (Polidori and Byron fought for some years to get the attribution corrected in subsequent printings). The Vampyre was the first fiction to cast the legendary bloodsuckers as an aristocratic menace, and it spawned a popular trend of 19th century vampire fiction which culminated with Bram Stoker's enduring Dracula in 1897. Which means the two most famous horror icons of 20th century pop culture--Dracula and Frankenstein's monster--can both trace their lineage back to that 1816 gathering at Villa Diodati overlooking Lake Geneva.
Frankenstein’s moon: Astronomers vindicate account of masterwork

Victor Frankenstein’s infamous monster led a brief, tragic existence, blazing a trail of death and destruction that prompted mobs of angry villagers to take up torches and pitchforks against him on the silver screen. Never once during his rampage, however, did the monster question the honesty of his ultimate creator, author Mary Wollstonecraft Shelley.

That bit of horror was left to the scholars.

Now, a team of astronomers from Texas State University-San Marcos has applied its unique brand of celestial sleuthing to a long-simmering controversy surrounding the events that inspired Shelley to write her legendary novel Frankenstein. Their results shed new light on the question of whether or not Shelley’s account of the episode is merely a romantic fiction.

Percy Bysshe Shelley (played by Douglas Walton) and Lord Byron (played by Gavin Gordon) listen as Mary Wollstonecraft Shelley (played by Elsa Lanchester) tells her tale of horror. [Bride of Frankenstein]

Texas State physics faculty members Donald Olson and Russell Doescher, English professor Marilynn S. Olson and Honors Program students Ava G. Pope and Kelly D. Schnarr publish their findings in the November 2011 edition of Sky & Telescope magazine, on newsstands now.

“Shelley gave a very detailed account of that summer in the introduction to an early edition of Frankenstein, but was she telling the truth?” Olson said. “Was she honest when she told her story of that summer and how she came up with the idea, and the sequence of events?”

A Dark and Stormy Night

The story begins, literally, in June 1816 at Villa Diodati overlooking Switzerland’s Lake Geneva. Here, on a dark and stormy night, Shelley—merely 18 at the time—attended a gathering with her future husband, Percy Bysshe Shelley, her stepsister Claire Clairmont, Lord Byron and John Polidori. To pass the time, the group read a volume of ghost stories aloud, at which point Byron posed a challenge in which each member of the group would attempt to write such a tale.

Villa Diodati sits on a steep slope overlooking Lake Geneva. Relatively clear views prevail to the west, but the view of the eastern sky is partially blocked by the hill. A rainbow greeted the Texas State researchers upon their arrival at Lake Geneva. [Photo by Russell Doescher]

“The chronology that’s in most books says Byron suggested they come up with ghost stories on June 16, and by June 17 she’s writing a scary story,” Olson said. “But Shelley has a very definite memory of several days passing where she couldn’t come up with an idea. If this chronology is correct, then she embellished and maybe fabricated her account of how it all happened.

“There’s another, different version of the chronology in which Byron makes his suggestion on June 16, and Shelley didn’t come up with her idea until June 22, which gives a gap of five or six days for conceiving a story,” he said. “But our calculations show that can’t be right, because there wouldn’t be any moonlight on the night that she says the moon was shining.”

Moonlight is the key. In Shelley’s account, she was unable to come up with a suitable idea until another late-night conversation--a philosophical discussion of the nature of life--that continued past the witching hour (midnight). When she finally went to bed, she experienced a terrifying waking dream in which a man attempted to bring life to a cadaverous figure via the engines of science. Shelley awoke from the horrific vision to find moonlight streaming in through her window, and by the next day was hard at work on her story.

Doubting Shelley

Although the original gathering and ghost story challenge issued by Byron is well-documented, academic scholars and researchers have questioned the accuracy of Mary Shelley’s version of events to the extent of labeling them outright fabrications. The traditionally accepted date for the ghost story challenge is June 16, based on an entry from Polidori’s diary, which indicates the entire party had gathered at Villa Diodati that night. In Polidori’s entry for June 17, however, he reports “The ghost-stories are begun by all but me.”

Russell Doescher and Ava Pope take measurements in the garden of Villa Diodati. [Photo by Marilynn Olson]

Critics have used those diary entries to argue Shelley didn’t agonize over her story for days before beginning it, but rather started within a span of hours. Others have suggested Shelley fabricated a romanticized version for the preface of the 1831 edition of Frankenstein solely to sell more books. Key, however, is the fact that none of Polidori’s entries make reference to Byron’s ghost story proposal.

“There is no explicit mention of a date for the ghost story suggestion in any of the primary sources–the letters, the documents, the diaries, things like that,” Olson said. “Nobody knows that date, despite the assumption that it happened on the 16th.”

Frankenstein’s moon

Surviving letters and journals establish that Byron and Polidori arrived at Villa Diodati on June 10, narrowing the possible dates for the evening of Byron's ghost story proposition to a June 10-16 window. To further refine the dates, Shelley's reference to moonlight on the night of her inspirational dream provided an astronomical clue. To determine on which nights in June 1816 bright moonlight could have shone through Shelley's window after midnight, the team traveled in August 2010 to Switzerland, where Villa Diodati still stands above Lake Geneva.

Ava Pope, Kelly Schnarr and Donald Olson on the steep slope just below Villa Diodati. [Photo by Roger Sinnott]

The research team made extensive topographic measurements of the terrain and Villa Diodati, then combed through weather records from June 1816. The researchers then calculated that a bright, gibbous moon would have cleared the hillside to shine into Shelley's bedroom window just before 2 a.m. on June 16. This calculated time agrees with Shelley's witching hour reference. Furthermore, a Polidori diary entry backs up Shelley's claim of a late-night philosophical "conversation about principles" of life taking place June 15.
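As a back-of-the-envelope check on the phase (not a substitute for the team's topographic analysis, which is what actually establishes when the moon cleared the hillside), you can estimate the moon's age on any date from the mean length of a lunation. This is a rough sketch: the reference new moon epoch, the coarse phase boundaries, and the treatment of 2 a.m. Geneva time as UTC are all simplifying assumptions, and a mean-lunation estimate can drift from the true phase by several hours.

```python
from datetime import datetime, timezone

SYNODIC_MONTH = 29.530588853  # mean length of a lunation, in days
# Reference new moon epoch: 2000-01-06 18:14 UTC (a commonly cited value)
REF_NEW_MOON = datetime(2000, 1, 6, 18, 14, tzinfo=timezone.utc)

def moon_age(when: datetime) -> float:
    """Approximate days elapsed since new moon at the given UTC datetime."""
    delta_days = (when - REF_NEW_MOON).total_seconds() / 86400.0
    # Python's % returns a non-negative result even for dates before the epoch
    return delta_days % SYNODIC_MONTH

def phase_name(age: float) -> str:
    """Coarse phase label from the moon's age in days."""
    if age < 1.0 or age > SYNODIC_MONTH - 1.0:
        return "new"
    if age < 6.4:
        return "waxing crescent"
    if age < 8.4:
        return "first quarter"
    if age < 13.8:
        return "waxing gibbous"
    if age < 15.8:
        return "full"
    if age < 21.1:
        return "waning gibbous"
    if age < 23.1:
        return "last quarter"
    return "waning crescent"

# Shelley's waking dream: about 2 a.m. on June 16, 1816
# (Geneva local mean time differed from UTC by under half an hour, negligible here)
dream = datetime(1816, 6, 16, 2, 0, tzinfo=timezone.utc)
age = moon_age(dream)
print(f"moon age: {age:.1f} days -> {phase_name(age)}")
```

Run as is, this dates the dream night's moon at roughly 20 days old, a waning gibbous a few days past full: the kind of bright moon that rises late in the evening and stands high after midnight, consistent with the bright, gibbous moon the researchers place over the hillside just before 2 a.m.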

Had there been no moonlight visible that night, the astronomical analysis would indicate fabrication on her part. Instead, evidence supports Byron’s ghost story suggestion taking place June 10-13 and Shelley’s waking dream occurring between 2 a.m. and 3 a.m. on June 16, 1816.

“Mary Shelley wrote about moonlight shining through her window, and for 15 years I wondered if we could recreate that night,” Olson said. “We did recreate it. We see no reason to doubt her account, based on what we see in the primary sources and using the astronomical clue.”

For additional information, visit the Sky & Telescope web gallery at www.skyandtelescope.com/Frankenstein.

Friday, September 23, 2011

Hijacking Flight 117 to the Nostalgia Factory



Fridays are all about escape. All you need to do is flip through the weekend arts section of the newspaper, a menu of evasions of the real. This week it's another upbeat baseball movie from the ubiquitous Brad Pitt; three more variations on corporate life as apolitical action thriller (the commuter version in Drive, starring Ryan Gosling; the retro remix version in the Jason Statham/Robert De Niro/Clive Owen reinvention of Killer Elite (how I wish someone really could channel Peckinpah for our post-GWOT culture); and the teen wolf wet dream version in Abduction, starring Taylor Lautner (they're not your real parents!)); and, best of all, the ever-grunting über-Spartan Gerard Butler in Machine Gun Preacher (aka What Would Jesus Shoot?).



On television, the escape is not into an alternate present but into an alternate past. The success of Mad Men has shown Hollywood that, in a world where the present is apocalyptic and the future no longer exists, the past is the place we will go to happily watch commercials—indeed, the shows are all commercials for an imagined version of the past, when they're not anachronized versions of our favorite old commercials. Pan Am, The Playboy Club, Boardwalk Empire, even Game of Thrones...we like our product placement to occur in beautifully curated cathode ray nostalgia bubbles. As Alessandra Stanley says in her review of Pan Am, "When the present isn’t very promising, and the future seems tapered and uncertain, the past acquires an enviable luster."



Pan Am even imagines a time when there were cute revolutionaries in our midst imagining a better world: "Christina Ricci plays Maggie, a closet beatnik who wears the Pan Am uniform to see the world but at home listens to jazz and studies Marx and Hegel."



What a perfect semiotic response to the state of things in the world after 9/11 (itself an evolved derivative of the Lockerbie bombing) to imagine oneself eternally flying the airline that represented the dream of a shiny corporate everyday interplanetary 2001. Especially if you revisit the decade that just passed in Mark Danner's amazing piece in this week's New York Review of Books, "After September 11: Our State of Exception." Danner conveys the catalyzing power of the historical change, when wars between states became as relevant as a vintage game of Risk, and the duty of the sentinel was to protect the monolithic state from elusive and conceptually intangible networks:

[M]ake no mistake, the critical decisions laying the basis for the state of exception were made in a state of anxiety and fear. How could they not have been? After September 11, as Richard Clarke put it simply, “we panicked.” Terrorism, downgraded as a threat by the incoming Bush administration, now became the single all-consuming obsession of a government suddenly on a “war footing.”

Every day the President and other senior officials received the “threat matrix,” a document that could be dozens of pages long listing “every threat directed at the United States” that had been sucked up during the last twenty-four hours by the vast electronic and human vacuum cleaner of information that was US intelligence: warnings of catastrophic weapons, conventional attacks, planned attacks on allies, plots of every description and level of seriousness. “You simply could not sit where I did,” George Tenet later wrote of the threat matrix, “and be anything other than scared to death about what it portended.”

One official compared reading the matrix every day—in an example of the ironic “mirroring” one finds everywhere in this story—to “being stuck in a room listening to loud Led Zeppelin music,” which leads to “sensory overload” and makes one “paranoid.” He compared the task of defending the country to playing goalie in a game in which the goalie must stop every shot and in which all the opposing players, and the boundary lines, and the field, are invisible.

All this bespeaks not only an all-encompassing anxiety about information—about the lack of map rooms displaying the movements of armies, the maddening absence of visible, identifiable threats, the unremitting angst of making what could be life-and-death judgments based on the reading and interpreting of inscrutable signs—but also, I think, guilt over what had been allowed to happen, together with the deep-seated need to banish that guilt, to start again, cleansed and immaculate. Thus “the War on Terror”—a new policy for a new era, during which the guardians of the nation’s security could boast a perfect record: no attacks on American soil. The attacks of September 11 would be banished to a “before time” when the “legalistic” Clinton rules applied, before “the gloves came off.” The successful attack could thus be blamed on the mistaken beliefs of another time, another administration. The apocalyptic portal of September 11 made everything new, wiping out all guilt and blame.





No wonder my son, raised in the Cheney/Obama decade, gravitates toward the vinyl records of the 1970s. 9/11 succeeded in flipping the switch that turns our media culture into a giant fear-based psyop on ourselves. If I unleash my Jack Bauer action figure inside the screen of Pan Am, will he fall in love and settle down in the peaceful interregnum that never existed during the long war of the twentieth century? Maybe his lover will be Christina Ricci's New Left stoner, and she and Jack will partner up to foment a revolutionary atemporal mashup in the mediascape of the early 21st century. Better futures are there for those unafraid to leap into the Nietzschean uncertainty of tomorrow.



Isn't it better to punch through the exit door than assume crash positions? Don't believe what the flight attendants tell you: it is always your right to go into the cockpit.