Friday, August 31, 2018

Battlezone's Virtual Reality

Asked to explain his inspiration for "cyberspace," the virtual-reality datasphere in which the characters of his early fiction spent much of their time, William Gibson explained that he'd gotten the idea from watching video gamers. In the arcades of the early 1980s, young people by the millions stood mesmerized by the new digital realities unfolding before them, immersed more fully in those worlds than in the real one. Some early videogames encouraged this immersive experience by offering a first-person perspective (as in early auto-racing games), or by using external focusing hardware, like periscopes, to block out more of the dull external world.

Some did both. One of the most successful early "virtual reality" games, Battlezone (1980), used a periscope, a first-person perspective, and a three-dimensional manifold to give players a more focused and immersive experience than any preceding title. Battlezone's "world" didn't offer much complexity or detail - just enough to maintain one's focus and interest. Players drove a laser-shooting tank through a flat, faintly unearthly landscape, with a moon hanging low in the sky above distant mountains. They dodged polyhedral boulders and destroyed enemy tanks, missiles, and the occasional flying saucer hovering briefly near the horizon. Controls were simple: just a firing button and two joysticks, one for each of the tank's treads. Pushing the sticks together drove the tank forward or back, while pushing them in opposite directions pivoted it in place, letting players survey and scoot about the landscape like an infantryman. The graphics seem primitive by modern standards, but the blocky, monochromatic, wire-frame images of Battlezone were state-of-the-art computer animation in 1980. Indeed, John Carpenter and the designers of Escape from New York assumed they would remain so until at least 1997.

Battlezone never acquired the cachet of more popular titles like Space Invaders and Pac-Man, but it did produce a spin-off simulator for the U.S. Army, as well as many later versions of the original game with updated graphics. I suspect, given William Gibson's original observation about videogames and cyberspace, that it provided some inspiration for cyberpunk stories like Gibson and Michael Swanwick's "Dogfight" and Walter Jon Williams's "Panzerboy." I am also fairly certain that fond memories of Battlezone inspired Stu Maschwitz to make "Tank," a short retro-SF film pitting fighter pilots against a seemingly indestructible supertank. Watching "Tank," I was reminded why early video and computer games left such a mark on American geeks of a certain age (middle-aged, now): their (necessarily) high level of abstraction forced gamers to fill in the sensory gaps with their own imaginations, which made it likelier for offerings like Battlezone or Zork to inspire us to create our own works.* Modern electronic games are better and smarter in many ways than their predecessors, but the enormous amount of expertise and money they require makes them less amenable to the kind of imaginative "tinkering" that Maschwitz does.


* My brother and our friend Andrew spent a good deal of time in the '80s crafting text-only computer games, some of them quite ambitious.

Monday, July 30, 2018

Watch It All Decay

I remember the 1970s as an odd, dispiriting time. (Better to have grown up during that decade in America, rather than in contemporary Cambodia or Uganda, but I digress.) The cultural zeitgeist was one of entropy, decadence, and decline. The president was a depressing nobody, tolerated mainly for the contrast he provided to the villainy of his predecessors. Public places all seemed dirty or sticky or too dangerous for children. Adults were angry or drunk all the time. Books were cheap and disposable, television mostly awful, especially children's shows. 

The aura of decay extended to science fiction, ostensibly a genre of hope, and to its real-life manifestation, the American space program. Consider: during my formative years as a reader and media consumer, roughly ages 5 through 10, not a single American traveled into space. The last Moon landing took place when I was a toddler, and I was also too young to remember Skylab or Apollo-Soyuz. By the late 1970s I had come to suspect that even the nascent Shuttle program was a sham. (My father or mother, or perhaps a teacher, had told me that the Enterprise was just a test model that had to be carried on a 747.* I extrapolated from this my belief that the Shuttle program was an expensive fake.) I was genuinely shocked when the Columbia first went into orbit in April 1981, having firmly believed that there would be no more manned American space flights in my lifetime. 


I suspect I wasn't the only person to think this, which perhaps is what makes late '70s films about the space program such significant artifacts. Manned space flight, they suggested, was as obsolete as 40-cent gasoline. Capricorn One (1977), which I saw on my eighth birthday, postulated a faked Mars landing, staged in order to save both NASA's budget and the lives of astronauts who otherwise would have died from botched engineering. Meteor (1979), a B-movie with a cast of has-beens, saw an American deep-space mission destroyed by the eponymous death rock. Facing a world-killing collision of the meteor with Earth, the Americans and Soviets then had to join forces to destroy the oncoming threat with their secret nuclear space platforms. The implication was that space was primarily a place of destruction, suitable mainly for military installations - a point perhaps not lost on future president Ronald Reagan. Salvage (1979), a TV movie that Christopher Mills's blog recently brought to my attention, saw a privately-built rocket travel to the Moon to recover, for sale back on Earth, the remains of one of the Apollo landers. The pride of the American nation had now become just another way for some plucky junkmen to make a buck.

Collectively, such films characterized the United States as a land of failure, shoddiness, ass-covering, greed, and nostalgia. This surely made them successful with adults and critics, but did nothing to inspire children. Give George Lucas this much credit: Star Wars very much cut against the grain of American popular culture and of science fiction during the decade of its release.


(Images above are from howstuffworks.com and Rotten Tomatoes, respectively.)


* Actually, I think I had a toy space shuttle mounted on the back of a 747 when I was 8 or 9. It reinforced my perception of fakery.

Saturday, June 30, 2018

The Future Is Not What You Think: H.M. Hoover

I am pleased to report the republication (in e-book format) of the young-adult sci-fi novels of Helen M. Hoover. Nearly all of her books had been out of print for 30 or 40 years, and if I am at all typical of my generation, they left a lasting impression on thousands of readers.

Hoover was by no means a perfect writer. Plot was not her strong suit, or more precisely she didn't seem that interested in it. Her stories moved forward at a rather languid pace, even when the characters were in the midst of a rescue operation or a war. This gave her time to describe settings, which she did quite well - misty marshes at dawn, dense forests, decaying mansions, smoky wooden lodges - along with the vividly imagined animals (creeping molluscs, wild pigs, bewildered alien birds) that inhabited them. Her characters also tended toward flatness and disposability, though she usually included a few exceptions, like the cunning and brutal Major in Children of Morrow and the world-weary grandfather in The Shepherd Moon. This particular flaw she shared with most SF writers. Like them, Hoover primarily wanted her books to explore ideas, ideas that would challenge readers' expectations and leave them unsettled.

Her earliest novels, Children of Morrow and Treasures of Morrow (1973/1976), featured a conflict between two post-apocalyptic societies, one ("the Base") superstitious and patriarchal, the other (the Morrow community, "Lifespan") high-tech, egalitarian, and psychically sensitive. Children presented readers with a straightforward rescue mission and a clear set of heroes and villains, but its sequel blurred the line between the two somewhat, as a follow-up Morrow expedition to the Base trashes the impoverished inhabitants' sacred shrine and threatens their survival. Hoover more clearly challenged the "technological/sensitive/good versus primitive/brutish/bad" dichotomy in her later novels. The Delikon (1977), which takes place after a successful alien invasion of Earth, describes the humans' subsequent overthrow of their alien masters from the standpoint of a Delikon, a genetically modified alien who is trying to teach human children the ethics of her people. Hoover makes clear that neither side has a monopoly on virtue or villainy: the Delikon value beauty and harmony but have imposed their will by force, while the humans are grubby philistines but are also seeking genuine freedom. Another Heaven, Another Earth (1981) inverts the central tropes of Hoover's early novels even more thoroughly. The inhabitants of the lost colony of Xilin are tall, graceful, fond of natural beauty and fulfilling work; their high-tech rescuers from Earth are overbearing and aesthetically crude, interested primarily in profit and glory.* The Shepherd Moon (1984) brings together two societies that are both fundamentally flawed, predicated on artificial scarcity, abuse of children, and indifference to suffering - and tells their story through an antagonist who likes the status quo and a protagonist who has no idea how to change it. Like the best young-adult writers, Hoover is unafraid to pose questions and conundrums that make even the most sanguine adults uncomfortable.


* The novel does make it clear, however, that the elf-like colonists' lives are unsustainable: the alien environment is slowly poisoning them, shortening their lives and curtailing their fertility.

Sunday, May 13, 2018

Games That Don't Suck: Clank!


While the archetypal fantasy plot is probably the quest, the archetypal fantasy gaming plot is almost certainly the dungeon crawl, a search for treasure and magic in an underground ruin, populated by monstrous beasts and humanoids. I’ve written elsewhere about the Cold War and sci-fi influence on this old and important stereotype. One of the earliest fantasy board games, Dungeon (which your humble narrator first played forty years ago), essentially sent players onto a dungeon map to defeat monsters and take their stuff. Dungeonquest, Talisman Dungeon, and One Deck Dungeon rang their individual changes on this theme, as did the byzantine and lavishly (over)produced Descent.

If I feel in the mood for some dungeon-crawling, as an old D&Der sometimes does, my go-to game is now likely to be Clank!, a new offering from Renegade. The 2016 title has a familiar premise: players descend into a warren of caves and chambers beneath an old castle, fight monsters, steal various forms of loot, and escape. The game board depicts treasure rooms, secret rooms (with various secret prizes), crystal caves (which stop forward movement), and an underground market (because why the hell not) where adventurers can buy keys and backpacks and other goodies. Connecting hallways turn the dungeon into more of a maze, with extra-long corridors, guarded corridors, locked corridors, and one-way corridors to navigate, unlock, or foil.

The game mechanic that makes this title elegant, strategic, and rather unusual is deck-building, the same playing feature at the core of Dominion. Players start Clank! with a weak deck representing their resources and actions: skill points to buy new cards, swords to fight guards and monsters, and boots to move from room to room. On each turn, a player draws five cards from their deck, plays them in any order, and then discards the played hand along with any newly-bought cards. Before the game begins, players lay out five cards from the game’s master deck, representing monsters they can fight (for gold, usually), new skills or equipment they can buy with skill points, and gems and other treasure that build their victory point total. Purchased or defeated cards are replenished after each player’s turn.
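For readers who haven’t met a deck-builder before, here is a minimal sketch of that turn cycle in Python – draw five, play, discard, and reshuffle the discard pile into the deck whenever it runs dry. The card names and counts below are illustrative stand-ins, not the game’s actual card list.

```python
import random

def draw_hand(deck, discard, hand_size=5):
    """Draw up to hand_size cards, reshuffling the discard pile
    back into the deck whenever the deck runs out."""
    hand = []
    while len(hand) < hand_size:
        if not deck:
            if not discard:
                break              # nothing left anywhere to draw
            deck.extend(discard)
            discard.clear()
            random.shuffle(deck)
        hand.append(deck.pop())
    return hand

# A hypothetical weak starting deck: mostly low-value cards.
deck = ["1 skill"] * 6 + ["1 boot"] * 2 + ["1 sword"] * 2
random.shuffle(deck)
discard = []

hand = draw_hand(deck, discard)
print("This turn's hand:", hand)
# After the hand is played (skill spent, monsters fought, rooms entered),
# the played cards and any newly-bought cards go to the discard pile,
# where they wait to be reshuffled into the deck on a later turn.
discard.extend(hand)
```

The design consequence is one Dominion players already know: anything you buy lands in your discard pile first, so a purchase only pays off on later turns, once it has been shuffled back into your deck.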

The game’s clumsy title refers to another aspect of play that makes it much more of a treasure hunt and survival match than a hack-and-slay festival: an invulnerable central enemy. Within Clank!’s dungeon resides a powerful dragon whom the players must evade, even as they steal her treasures and (sometimes) her dragon eggs. Whenever the players make noise – and there are many action cards that generate noise (or “Clank”) – they must put one or more cubes of their own color into a box on the game board. Certain cards contain a “dragon attacks” icon; when a player reveals one, everyone’s accumulated cubes go into an opaque bag, and a number of cubes equal to the current “threat level” is drawn out. Each drawn cube of a player’s color inflicts one wound on that player; ten wounds put them out of the game. Play starts with some black dummy cubes already in the bag, representing missed attacks. As the game progresses and the players collect treasures, the dummies are exhausted and the threat level (the number of cubes drawn) increases, making each dragon attack likelier to injure someone. In addition, once a player leaves the dungeon, automatic dragon attacks occur on each of that player’s subsequent turns until the other players escape or are knocked out. Clank! thus includes a press-your-luck feature, a nice mechanic to have handy if a player is behind on victory points and wants to take a risk on a knock-out victory.
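The dragon’s bag draw is simple enough to sketch in a few lines of Python as well. The cube counts and threat level below are invented for illustration rather than taken from the rulebook, but the press-your-luck arithmetic is the same: the more cubes of your color in the bag, the worse your odds of surviving each attack unscathed.

```python
import random
from collections import Counter

def dragon_attack(bag, threat_level):
    """Draw threat_level cubes from the bag without replacement.
    Black cubes are misses; any other color wounds that player."""
    random.shuffle(bag)
    drawn = [bag.pop() for _ in range(min(threat_level, len(bag)))]
    wounds = Counter(cube for cube in drawn if cube != "black")
    return drawn, wounds

# Illustrative bag: a dozen black dummy cubes plus the Clank
# two players have generated so far.
bag = ["black"] * 12 + ["red"] * 5 + ["blue"] * 3
drawn, wounds = dragon_attack(bag, threat_level=3)
print("Drawn:", drawn)
print("Wounds this attack:", dict(wounds))
```

Drawing without replacement is what makes the escalation work: as the black dummy cubes are used up, the cubes left in the bag are increasingly the players’ own, so late-game attacks hurt far more often than early ones.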

Clank! currently retails for $50-60, not including two supplements (which I haven't tried) and the over-elaborate (IMHO) sci-fi version Clank In Space. As with Dominion or Pandemic, the sticker price may seem a little high, but one will get more game play and far more entertainment out of it than out of a less expensive, more generic title.

Sunday, April 22, 2018

Where Were You, Childs?

The characters in John Carpenter's The Thing (1982), which my better half and I watched again recently, come across as a rather affectless lot. In the early part of the film they are wedded to their obscure work routine and their dreary entertainments - and is anything drearier than watching old game shows on VHS? None seem very interested in the alien remains that Macready and Copper discover at the Norwegian base; most just think they are gross. Few even show much curiosity when Macready and Norris find the alien space ship that the Norwegians blasted out of the ice. Stoner character Palmer even claims that UFOs don't surprise him in the least: "They're falling out of the sky like flies. Government knows all about it, right, Mac?...It's Chariots of the Gods."

After the dog Thing first manifests itself, however, the characters finally come alive, with a mixture of anger and terror. These feelings intensify as Blair and the others uncover the Thing's doppelganger powers: its ability to imitate Earth organisms (including humans) perfectly. The claustrophobic research station, shut in by an Antarctic storm and Blair's sabotage, turns into an emotional lens, intensifying the characters' disaffection and turning them against one another. Eventually paranoia makes everyone at the station into an adversary. The question "Who's the Thing?" (or "a Thing") becomes less meaningful after the de facto leader, Macready, makes it clear he'll kill everyone rather than let the alien intruder infect the world. With humans like these, who needs alien enemies? What's the point of being human if everyone in your "human" community is on the verge of killing one another?


In the end, the aliens and humans do end up killing one another, or at least dooming one another to die. In the final scene, one much discussed by sci-fi fans, Childs finds Macready outside the burning ruins of the station and gives him a half-credible story about having seen Blair and pursued him into the storm. Macready suggests that it no longer matters if one or the other of them has become an alien. He offers Childs a drink. I think this scene works best if we don't try to figure out which one of the survivors was "really" a Thing. Instead, it is, ironically, a moment of restored humanity. The Thing's characters become most vivid when they are least human: shredded and violated by alien possessors, or filled with terror and ready to kill anyone. Conversely, when the station crew members are preoccupied with boring tasks and pastimes, they are most human, most capable of peacefully coexisting with one another. In sharing a drink and waiting for the storm to freeze them, Childs and Macready are making themselves dull, blurring their characters' rough and angry edges, restoring their human-ness just before equalizing themselves in death. Maybe we are at our most human when we seem half-dead, when we are watching three-year-old episodes of The Price Is Right. God, this is a horror story, isn't it?


 

Wednesday, February 28, 2018

Deep Future

Science fiction, if we date its birth to Mary Shelley's work, emerged in the same era as Hutton, Agassiz, and Darwin's elucidation of "deep time": the extremity of Earth's age and the even greater longevity of the stars and galaxies. The discovery of a "deep past" helped scientists remove human beings from the conceptual center of the universe, much as Copernicus and Galileo had moved humanity from its astronomical center. If all of our planet's history, Mark Twain observed several decades after Shelley's death, comprised a tall tower, human history would constitute merely a single thin layer of paint at the very top of the tower. We should pity those poor souls who think the purpose of the tower is to hold up the paint.

The concept of a deep past, along with subsequent discoveries in Earth science, evolutionary biology, and astronomy, made possible an equally unsettling line of speculation: after our own brief lives will come a deep future, extending billions of years after the present. In so distant a future human beings will evolve out of recognition, or even (as Shelley herself speculated in The Last Man) die out completely; the mountains and rivers and continents will assume strange new shapes; the Sun will grow old and red and the billions of other stars evolve toward their own senescence. This kind of speculation can inspire awe, as one reflects on the chronological "size" of time and space, and a certain peace of mind, born of one's mental journey far beyond the greed, vanity, and tumult of our own day. It also breeds alienation, as one realizes how both the deep past and the deep future reduce the lives of human individuals, nations, even the whole species, to comparative nothingness.

This makes speculation about the deep future difficult for SF writers, even though one would assume it a natural subject for them. Fiction requires protagonists, and thus a human or sentient presence to grapple with hardships and challenges. Deep-future sci-fi scenarios usually employ some contrivance to preserve a narrator or point-of-view character with whom readers can sympathize. Isaac Asimov's "The Last Question" imagines disembodied humans and computers surviving as beings of pure thought. Poul Anderson's Tau Zero shepherds its human protagonists to the end of time with relativistic time dilation. Vernor Vinge's Marooned in Realtime uses stasis fields to catapult a human colony into a post-human - actually, post-Singularity - future, 50 million years hence. Charles Stross's "Palimpsest" features quasi-human time travelers colonizing future Earths with the survivors of various geological extinction events. 

Even non-fiction essays on the deep future find it necessary to posit some form of intelligent life in order to engage their readers' attention. For example, Lawrence Krauss, Robert Scherrer, and Avi Loeb, in their speculations on the far-future expansion of the universe, raise the question of how "future astronomers" (post-human or non-human) will be able to perceive galaxies beyond our own, after dark energy has propelled them so far away that their light can no longer be seen. They imagine that these scientists may have somehow preserved observational data from our own era, but will not be able to confirm it with their own observations. Loeb offers a glimmer of hope in the form of "hypervelocity stars" that break free of the gravity of their home galaxies, and which may pass near enough to the Milky Way to demonstrate to our poor future stargazers the reality of extra-galactic objects.

This is a reassuring possibility, but I must confess I am a bit less interested in what the universe will look like in 100 billion years than in the changes that will have occurred on our own planet over, say, the next 100 million years. One of the appealing features of both Vinge's Marooned in Realtime and Stross's "Palimpsest" is their presentation of a future Earth as analogous to another world, a habitable but alien planet located not many light-years but many chronological years from our own. Frank Landis makes a more direct observation to this effect in Hot Earth Dreams. Our own planet was, after all, a different world just 10,000 years ago, with different atmospheric chemistry (less carbon dioxide and methane), significantly different geography, and a greater variety of fauna. Ten thousand, or even five thousand, years from now, Earth will have undergone equally great biological and physical change. Our descendants will essentially colonize a new Earth. So will their descendants in later geological epochs. Landis argues that this is a more realistic vision of colonization than trying to find a human-habitable world, a biosphere haven if you like, in another star system. Over the course of tens of thousands of years I imagine humans will do both, but the huge expense of interstellar travel will oblige the vast majority of our future colonists to embark on a temporal rather than spatial expansion of human civilization.

Wednesday, January 31, 2018

Atwood's Dystopias for the Comfortable



Margaret Atwood, the much-lauded author of several dozen books, including several famous works of dystopian speculative fiction, got into a bit of hot water earlier this month over her defense of fellow writer Steven Galloway. Galloway stood accused of sexual misconduct at the University of British Columbia, whose administration fired him after an ineptly conducted investigation. Atwood and other professional colleagues signed a letter of protest to UBC in 2016. Last year, with the long-delayed rise of the #MeToo movement – and, I suspect, with growing doubts about Galloway’s innocence – many of the letter’s supporters withdrew their signatures. Atwood, however, not only defended Galloway but published a self-aggrandizing op-ed in favor of established institutions and against revolutionary “terror and virtue” – that is, in favor of the status quo. The essay was learned and intelligent, but the author deployed both of those virtues in defense of her own narrow privileges.

In The Root, Clarkisha Kent has some incisive and some usefully caustic things to say about Atwood. White privilege and cultural appropriation, Kent argues, find their way into a good deal of Atwood’s writing, both non-fiction and fiction. The Handmaid’s Tale, M.A.’s most famous work, stands as Exhibit A. Members of Atwood’s own privileged class (including myself, I must confess) see Handmaid’s Tale as a canonical, irreproachable work of dystopian and feminist fiction. Kent (following the lead of Ana Cottle) presents it instead as a “white feminist” nightmare, a scenario in which white American women are subjected to and ennobled by the same treatment actually meted out to African-American slaves: degradation, rape, forced pregnancy, elimination of human rights and identity. Atwood does this with barely a nod to persons of color. Indeed, she set Handmaid’s Tale at one of the citadels of modern white American privilege, a place where Atwood herself went to graduate school: Harvard University. M.A. horrifies the reader not by recalling the ghastly treatment of women in historic (and modern) slave societies, but by positing a threat to the well-being of upper-middle class whites. If one of the purposes of good fiction is to build empathy for others, Handmaid’s Tale fails the test: it serves instead to boost the self-regard of the comfortable.

Atwood’s tendency to appropriate the experiences and creative work of others extends to her treatment of genre fiction, or at least science fiction. I’ve written before of her dystopian novel Oryx and Crake and Atwood’s unattributed borrowing of at least one idea from Frederik Pohl and Cyril Kornbluth’s The Space Merchants. Much of O&C reminds me of more recent work by Paul Di Filippo and Nancy Kress, some of whose more famous stories came out while Atwood was writing her biological disaster novel. One may defend M.A. by noting that she probably has no idea these authors exist, or by arguing that no author, particularly an accomplished and respected one, has an obligation to cite their sources in a work of fiction. I am not sure the latter is true – writers of historical fiction, for example, usually mention the non-fiction works that informed their own. The former observation, if true, merely reinforces our view of Atwood as insular and privileged.

Unfortunately, M.A.’s work appeals to exactly the sort of people who craft high-school and college curricula and book-club reading lists, so I suspect Handmaid’s Tale and Oryx and Crake will remain widely read and discussed for years to come. Fortunately, we have other writers of speculative fiction with less insular views, a greater appreciation of racial and class oppression, and at least as much writing skill as Atwood, writers who are gradually gaining the regard of canon-makers. I look forward to a future in which students are far more familiar with the works of Octavia Butler and Ursula K. Le Guin than with Margaret Atwood’s well-written but intellectually confining future visions.


(Image of Margaret Atwood from Goodreads.com)