Saturday, June 30, 2018

The Future Is Not What You Think: H.M. Hoover

I am pleased to report the republication (in e-book format) of the young adult sci-fi novels of Helen M. Hoover. Nearly all of her books had been out of print for 30 or 40 years, and, if I am typical of my generation, they left a lasting impression on thousands of readers.

Hoover was by no means a perfect writer. Plot was not her strong suit, or, more precisely, she didn't seem that interested in it. Her stories moved forward at a rather languid pace, even when the characters were in the midst of a rescue operation or a war. This gave her time to describe settings, which she did quite well - misty marshes at dawn, dense forests, decaying mansions, smoky wooden lodges - along with the vividly imagined animals (creeping molluscs, wild pigs, bewildered alien birds) that inhabited them. Her characters also tended toward flatness and disposability, though she usually included a few exceptions, like the cunning and brutal Major in Children of Morrow and the world-weary grandfather in The Shepherd Moon. This particular flaw she shared with most SF writers. Like them, Hoover primarily wanted her books to explore ideas - ideas that would challenge readers' expectations and leave them unsettled.

Her earliest novels, Children of Morrow and Treasures of Morrow (1973/1976), featured a conflict between two post-apocalyptic societies, one ("the Base") superstitious and patriarchal, the other (the Morrow community, "Lifespan") high-tech, egalitarian, and psychically sensitive. Children presented readers with a straightforward rescue mission and a clear set of heroes and villains, but its sequel blurred the line between the two somewhat, as a follow-up Morrow expedition to the Base trashes the impoverished inhabitants' sacred shrine and threatens their survival. Hoover more clearly challenged the "technological/sensitive/good versus primitive/brutish/bad" dichotomy in her later novels. The Delikon (1977), which takes place after a successful alien invasion of Earth, describes the humans' subsequent overthrow of their alien masters from the standpoint of a Delikon, a genetically modified alien who is trying to teach human children the ethics of her people. Hoover makes clear that neither side has a monopoly on virtue or villainy; the Delikon value beauty and harmony but have imposed their will by force, while the humans are grubby philistines who are also seeking genuine freedom. Another Heaven, Another Earth (1981) inverts the central tropes of Hoover's early novels even more thoroughly. The inhabitants of the lost colony of Xilin are tall, graceful, and fond of natural beauty and fulfilling work; their high-tech rescuers from Earth are overbearing and aesthetically crude, interested primarily in profit and glory.* The Shepherd Moon (1984) brings together two societies that are both fundamentally flawed, predicated on artificial scarcity, abuse of children, and indifference to suffering - and tells their story through an antagonist who likes the status quo and a protagonist who has no idea how to change it. Like the best young-adult writers, Hoover is unafraid to pose questions and conundrums that make even the most sanguine adults uncomfortable.


*  The novel does make it clear, however, that the Elf-like colonists' lives are unsustainable: the alien environment is slowly poisoning them, shortening their lives and curtailing their fertility.

Sunday, May 13, 2018

Games That Don't Suck: Clank!


While the archetypal fantasy plot is probably the quest, the archetypal fantasy gaming plot is almost certainly the dungeon crawl, a search for treasure and magic in an underground ruin, populated by monstrous beasts and humanoids. I’ve written elsewhere about the Cold War and sci-fi influence on this old and important stereotype. One of the earliest fantasy board games, Dungeon (which your humble narrator first played forty years ago), essentially sent players onto a dungeon map to defeat monsters and take their stuff. Dungeonquest, Talisman Dungeon, and One Deck Dungeon rang their individual changes on this theme, as did the byzantine and lavishly (over)produced Descent.

If I feel in the mood for some dungeon-crawling, as an old D&Der sometimes does, my go-to game is now likely to be Clank!, a new offering from Renegade. The 2016 title has a familiar premise: players descend into a warren of caves and chambers beneath an old castle, fight monsters, steal various forms of loot, and escape. The game board depicts treasure rooms, secret rooms (with various secret prizes), crystal caves (which stop forward movement), and an underground market (because why the hell not) where adventurers can buy keys and backpacks and other goodies. Connecting hallways turn the dungeon into more of a maze, with extra-long corridors, guarded corridors, locked corridors, and one-way corridors to navigate, unlock, or foil.

The game mechanic that makes this title elegant, strategic, and rather unusual is deck-building, the same mechanism at the core of Dominion. Players start Clank! with a weak deck representing their resources and actions: skill points to buy new cards, swords to fight guards and monsters, and boots to move from room to room. On each turn, a player draws five cards from their deck, plays them in any order, and then discards the played hand along with any newly-bought cards. Before the game begins, players lay out five cards from the game’s master deck, representing monsters they can fight (for gold, usually), new skills or equipment they can buy with skill points, and gems and other treasure that build their victory point total. Purchased or defeated cards are replenished after each player’s turn.
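
For readers who parse rules more easily as code, here is a minimal sketch of the draw/play/discard cycle described above. It is written in Python, and every name in it (the class, the card labels) is my own invention for illustration, not the game’s official terminology.

import random

class PlayerDeck:
    """Toy model of a Clank!-style deck-building cycle (illustration only)."""

    def __init__(self, starting_cards):
        self.draw_pile = list(starting_cards)
        random.shuffle(self.draw_pile)
        self.discard_pile = []

    def draw_hand(self, n=5):
        """Draw n cards, reshuffling the discard pile into the deck when it runs dry."""
        hand = []
        for _ in range(n):
            if not self.draw_pile:
                self.draw_pile, self.discard_pile = self.discard_pile, []
                random.shuffle(self.draw_pile)
            if self.draw_pile:
                hand.append(self.draw_pile.pop())
        return hand

    def end_turn(self, played_hand, new_purchases):
        """Played cards and newly bought cards both go to the discard pile."""
        self.discard_pile.extend(played_hand + new_purchases)

# One illustrative turn: a weak starting deck slowly absorbs better purchases.
deck = PlayerDeck(["basic skill"] * 7 + ["basic sword"] * 2 + ["basic boot"])
hand = deck.draw_hand()
deck.end_turn(hand, new_purchases=["shiny new card"])

The point of the mechanism, as in Dominion, is that every purchase eventually cycles back into your hand, so your deck - and your turns - improve as the game goes on.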

The game’s clumsy title refers to another aspect of play that makes it much more of a treasure hunt and survival match than a hack-and-slay festival: an invulnerable central enemy. Within Clank!’s dungeon resides a powerful dragon whom the players must evade, even as they steal her treasures and (sometimes) her dragon eggs. Whenever the players make noise – and there are many action cards that generate noise (or “Clank”) – they must put one or more cubes of their own color into a box on the game board. Certain cards contain a “dragon attacks” icon; when a player reveals one, everyone places their cubes in an opaque bag and draws out a number of cubes equal to the current “threat level.” Each drawn cube of a player’s color inflicts one wound on that player; ten wounds put them out of the game. Play starts with some black dummy cubes already in the bag, representing missed attacks. As the game progresses and the players collect treasures, the dummies are exhausted and the threat level (number of cubes drawn) increases, making each dragon attack likelier to injure someone. In addition, once a player leaves the dungeon, automatic dragon attacks occur on each of that player’s subsequent turns until the other players escape or are knocked out. Clank! thus includes a press-your-luck feature, a nice mechanic to have handy if a player is behind on victory points and wants to take a risk on a knock-out victory.
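
To make the press-your-luck arithmetic concrete, here is a similar sketch of a single dragon attack – again in Python, again with invented names and made-up cube counts rather than the game’s actual numbers. Every cube drawn from the bag that matches a player’s color becomes a wound; the black dummy cubes are harmless misses.

import random
from collections import Counter

def dragon_attack(bag, threat_level):
    """Draw threat_level cubes from the bag; player-colored cubes are wounds.

    bag is a list of cube colors ("black" cubes are misses). Drawn cubes are
    removed, so the black dummies thin out as the game goes on.
    """
    drawn = random.sample(bag, min(threat_level, len(bag)))
    for cube in drawn:
        bag.remove(cube)
    return Counter(cube for cube in drawn if cube != "black")

# Example: red has been much noisier than blue, so red is likelier to get burned.
bag = ["black"] * 12 + ["red"] * 6 + ["blue"] * 2
wounds = dragon_attack(bag, threat_level=3)
print(wounds)   # e.g. Counter({'red': 1})

Run that a few thousand times and the incentive structure becomes obvious: quiet players tend to survive dragon attacks, and noisy ones accumulate wounds.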

Clank! currently retails for $50-60, not including two supplements (which I haven't tried) and the over-elaborate (IMHO) sci-fi version Clank! In Space. As with Dominion or Pandemic, the sticker price may seem a little high, but one will get more gameplay and far more entertainment out of it than out of a less expensive, more generic title.

Sunday, April 22, 2018

Where Were You, Childs?

The characters in John Carpenter's The Thing (1982), which my better half and I watched again recently, come across as a rather affectless lot. In the early part of the film they are wedded to their obscure work routine and their dreary entertainments - and is anything drearier than watching old game shows on VHS? None seem very interested in the alien remains that Macready and Copper discover at the Norwegian base; most just think they are gross. Few even show much curiosity when Macready and Norris find the alien space ship that the Norwegians blasted out of the ice. Stoner character Palmer even claims that UFOs don't surprise him in the least: "They're falling out of the sky like flies. Government knows all about it, right, Mac?...It's Chariots of the Gods."

After the dog Thing first manifests itself, however, the characters finally come alive, with a mixture of anger and terror. These intensify as Blair and the others uncover the Thing's doppelganger powers, its ability to imitate Earth organisms (including humans) perfectly. The claustrophobic research station, shut in by an Antarctic storm and Blair's sabotage, turns into an emotional lens, intensifying the characters' disaffection and turning them against one another. Eventually paranoia makes everyone at the station into an adversary. The question "Who's the Thing?" (or "a Thing") becomes less meaningful after the de facto leader, Macready, makes it clear he'll kill everyone rather than let the alien intruder infect the world. With humans like these, who needs alien enemies? What's the point of being human if everyone in your "human" community is on the verge of killing one another?


In the end, the aliens and humans do kill one another, or at least doom one another to die. In the final scene, one much discussed by sci-fi fans, Childs finds Macready outside the burning ruins of the station and gives him a half-credible story about seeing Blair and pursuing him into the storm. Macready suggests that it no longer matters whether one or the other of them has become an alien. He offers Childs a drink. I think this scene works best if we don't try to figure out "Which one of the survivors was really a Thing?" Instead, it is, ironically, a moment of restored humanity. The Thing's characters become most vivid when they are least human: shredded and violated by alien possessors, or filled with terror and ready to kill anyone. Conversely, when the station crew members are preoccupied with boring tasks and pastimes, they are most human, most capable of peacefully coexisting with one another. In sharing a drink and waiting for the storm to freeze them, Childs and Macready are making themselves dull, blurring their characters' rough and angry edges, restoring their human-ness just before equalizing themselves in death. Maybe we are at our most human when we seem half-dead, when we are watching three-year-old episodes of The Price Is Right. God, this is a horror story, isn't it?


 

Wednesday, February 28, 2018

Deep Future

Science fiction, if we date its birth to Mary Shelley's work, emerged in the same era as Hutton, Agassiz, and Darwin's elucidation of "deep time": the immensity of Earth's age and the (even greater) longevity of the stars and galaxies. The discovery of a "deep past" helped scientists remove human beings from the conceptual center of the universe, much as Copernicus and Galileo had displaced humanity from the astronomical center of the cosmos. If all of our planet's history, Mark Twain observed several decades after Shelley's death, were represented by a tall tower, human history would constitute merely a single thin layer of paint on the very top of the tower. We should, Twain implied, have nothing but pity for the poor souls who think the purpose of the tower is to hold up the paint.

The concept of a deep past, along with subsequent discoveries in Earth science, evolutionary biology, and astronomy, made possible an equally unsettling line of speculation: after our own brief lives will come a deep future, extending billions of years after the present. In so distant a future human beings will evolve out of recognition, or even (as Shelley herself speculated in The Last Man) go extinct; the mountains and rivers and continents will assume strange new shapes; the Sun will grow old and red and the billions of other stars evolve toward their own states of senescence. This kind of speculation can inspire awe, as one reflects on the chronological "size" of time and space, and a certain peace of mind, born of one's imagined mental journey far beyond the greed, vanity, and tumult of our own day. It also breeds alienation, as one realizes how both the deep past and the deep future reduce the lives of human individuals, nations, even the whole species, to comparative nothingness.

This makes speculation about the deep future difficult for SF writers, even though one would assume it a natural subject for them. Fiction requires protagonists, and thus a human or sentient presence to grapple with hardships and challenges. Deep-future sci-fi scenarios usually employ some contrivance to preserve a narrator or point-of-view character with whom readers can sympathize. Isaac Asimov's "The Last Question" imagines disembodied humans and computers surviving as beings of pure thought. Poul Anderson's Tau Zero shepherds its human protagonists to the end of time with relativistic time dilation. Vernor Vinge's Marooned in Realtime uses stasis fields to catapult a human colony into a post-human - actually, post-Singularity - future, 50 million years hence. Charles Stross's story "Palimpsest" features quasi-human time travelers colonizing future Earths with the few survivors of various geological extinction events. 

Even non-fiction essays on the deep future find it necessary to posit some form of intelligent life in order to engage their readers' attention. For example, Lawrence Krauss, Robert Scherrer, and Avi Loeb, in their speculations on the far-future expansion of the universe, raise the question of how "future astronomers" (post-human or non-human) will be able to perceive galaxies beyond our own, after dark energy has propelled them so far away that their light can no longer be seen. They imagine that these scientists may have somehow preserved observational data from our own era, but will not be able to confirm it with their own observations. Loeb offers a glimmer of hope in the form of "hypervelocity stars" that break free of the gravity of their home galaxies, and which may pass near enough to the Milky Way to demonstrate to our poor future stargazers the reality of extra-galactic objects.

This is a reassuring possibility, but I must confess I am a bit less interested in what the universe will look like in 100 billion years than in the changes that will have occurred on our own planet over, say, the next 100 million years. One of the appealing features of both Vinge's Marooned in Realtime and Stross's "Palimpsest" is their presentation of a future Earth as analogous to another world, a habitable but alien planet located not many light years but many chronological years from our own. Frank Landis makes a more direct observation to this effect in Hot Earth Dreams. Our own planet was, after all, a different world just 10,000 years ago, with different atmospheric chemistry (less carbon dioxide and methane), significantly different geography, and a greater variety of fauna. Ten or even five thousand years from now, Earth will have undergone equally great biological and physical change. Our descendants will adapt to and essentially colonize a new Earth. So will their descendants in later geological epochs. Landis argues that this is a more realistic vision of colonization than trying to find a human-inhabitable world, a biosphere haven if you like, in another star system. Over the course of tens of thousands of years I imagine humans will do both, but the huge expense of interstellar travel will mean that the vast majority of our future colonists will necessarily embark on a timewise rather than spatial expansion of our human civilization.

Wednesday, January 31, 2018

Atwood's Dystopias for the Comfortable



Margaret Atwood, the much-lauded author of several dozen books, including several famous works of dystopian speculative fiction, got into a bit of hot water earlier this month over her defense of fellow writer Steven Galloway. Galloway stood accused of sexual misconduct at the University of British Columbia, whose administration fired him after an ineptly-conducted investigation. Atwood and other professional colleagues signed a letter of protest to UBC in 2016. Last year, with the long-delayed rise of the #MeToo movement – and, I suspect, with growing doubts about Galloway’s innocence – many of the letter’s supporters withdrew their signatures. Atwood, however, not only defended Galloway but published a self-aggrandizing op-ed in favor of established institutions and against revolutionary “terror and virtue” – that is, in favor of the status quo. The essay was learned and intelligent, but the author deployed both of those virtues in defense of her own narrow privileges.

In The Root, Clarkisha Kent has some incisive and some usefully caustic things to say about Atwood. White privilege and cultural appropriation, Kent argues, find their way into a good deal of Atwood’s writing, both non-fiction and fiction. The Handmaid’s Tale, M.A.’s most famous work, stands as Exhibit A. Members of Atwood’s own privileged class (including myself, I must confess) see Handmaid’s Tale as a canonical, irreproachable work of dystopian and feminist fiction. Kent (following the lead of Ana Cottle) presents it instead as a “white feminist” nightmare, a scenario in which white American women are subjected to and ennobled by the same treatment actually meted out to African-American slaves: degradation, rape, forced pregnancy, elimination of human rights and identity. Atwood does this with barely a nod to persons of color. Indeed, she set Handmaid’s Tale at one of the citadels of modern white American privilege, a place where Atwood herself went to graduate school: Harvard University. M.A. horrifies the reader not by recalling the ghastly treatment of women in historic (and modern) slave societies, but by positing a threat to the well-being of upper-middle class whites. If one of the purposes of good fiction is to build empathy for others, Handmaid’s Tale fails the test: it serves instead to boost the self-regard of the comfortable.

Atwood’s tendency to appropriate the experiences and creative work of others extends to her treatment of genre fiction, or at least science fiction. I’ve written before of her dystopian novel Oryx and Crake and Atwood’s unattributed borrowing of at least one idea from Frederik Pohl and Cyril Kornbluth’s The Space Merchants. Much of O&C reminds me of more recent work by Paul Di Filippo and Nancy Kress, some of whose more famous stories came out when Atwood was writing her biological disaster novel. One may defend M.A. by noting that she probably has no idea these authors exist, or by arguing that no author, particularly an accomplished and respected one, has an obligation to cite their sources in a work of fiction. I am not sure the latter is true – writers of historical fiction, for example, usually mention the non-fiction works that informed their own. The former observation, if true, merely reinforces our view of Atwood as insular and privileged.

Unfortunately, M.A.’s work appeals to exactly the sort of people who craft high-school and college curricula and book-club reading lists, so I suspect Handmaid’s Tale and Oryx and Crake will remain widely read and discussed for years to come. Fortunately, we have other writers of speculative fiction with less insular views, a greater appreciation of racial and class oppression, and at least as much writing skill as Atwood, writers who are gradually gaining the regard of canon-makers. I look forward to a future in which students are far more familiar with the works of Octavia Butler and Ursula K. Le Guin than with Margaret Atwood’s well-written but intellectually confining future visions.


(Image of Margaret Atwood from Goodreads.com)

Wednesday, December 13, 2017

No Genre for Old Men

My consumption of science fiction, both in print and in other media, has been declining steadily for the past fifteen years, even as my overall consumption of books and television has increased. Sci-fi, like other genre fiction* (especially horror, fantasy, and supernatural romance), primarily appeals to a younger audience, and Your Aging Narrator no longer fits that demographic. Popular franchises like Doctor Who, Star Wars, Star Trek, and (a level or two below them but still well-known) the Hitchhiker's Guide to the Galaxy tend to feature young or childlike protagonists, quest-like central stories, conflicts with stupid and mulish authorities, and lots of action. Those of us in the "(wo)men of a certain age" category usually aren't much into action, unless we are paying someone else to do it, and are likelier to sympathize with the stupid authorities than with their antagonists. (You kids get off my lawn!)

Trying to think of exceptions to this rule, in the form of SF novels or shows specifically aimed at middle-aged audiences, I recalled a fragment of a movie line: "Before you really do grow old." Of course! The Star Trek movies, or the first six of them at least, aimed themselves at fans of the Original Series - most of whom were hitting or passing 30 when ST I appeared in 1979 - and at members of their age cohort. The first movie hit the mark too well in one respect: it had the sloooow pacing of a European art film, though without the charm. The other installments had peppier (or at least more suspenseful) narratives, while presenting themes that would appeal to more mature audiences: peacemaking (ST VI), fighting extinction (ST IV), looking for personal or divine meaning (ST V, alas), and coping with death (ST II and, to some extent, ST III). The corny dialogue, references to classic literature (Shakespeare and Melville, for example), and offbeat villains also denoted an older target audience. The later Star Trek films adopted an action-movie format** and, with the Abrams reboot, a much younger cast. I can't blame the producers for reaching out to a newer, younger, and more profitable market, but I have not been able to take either pleasure or comfort from any of the ST movies made after 1991.***

So hand me my copy of Julius Caesar, in the original Klingon, pass me my Romulan ale and my reading glasses, give me a star to sail by, and I'll be happy.


* Mysteries, westerns, and spy novels and movies are the exceptions here, but the western is dead and spy movies are moribund except as parodies.

** A friend referred to Star Trek: Generations as "three old white guys fighting on a rock."

*** I liked Star Trek: Nemesis, but that was just a tribute to Star Trek II and VI.

Friday, November 3, 2017

Did You Ever Take the Voight-Kampff Test Yourself?

When someone (one of my college TAs, as I recall) introduced me to the most popular fan theory about Blade Runner (1982), namely that Rick Deckard was a replicant, I got rather angrier than the remark warranted. The aforementioned theory rested on Rachael's pointed question to Rick about the Voight-Kampff test, a mechanism used to distinguish replicants from "real" humans: "Have you ever taken that test yourself?" I think many fans, and eventually director Ridley Scott himself, read the question as a hint* that the VK machine would reveal Deckard's android identity.

I read it instead as a deeper philosophical question, a rare one in movies and one a science-fiction film was well-positioned to ask: how does anyone know s/he is real, is human? Most of the replicants in Blade Runner were "Nexus 6" models who strongly resembled humans: they had the same physical abilities as humans, could reason, could feel love and fear and pain, could and did hope for a future. Rachael had even more human features. She had a childhood, or at least the memories of one; had mastered a creative art, piano-playing; had the capacity for complex human emotions. She had, in short, all the attributes of a human being. All that became irrelevant when she failed the VK: she was now a replicant and a slave, a lesser life, someone whom Rick could murder at will. The only things separating Rachael from Deckard or Bryant, however, were the Tyrell Corporation's (presumably secret) records of her manufacture and the results of a mechanical test. How many natural-born humans would have fared any better at the VK exam than she did? How arbitrary, then, is the difference between a full person and a fake person, between freeman and slave. This interrogation of mental and social reality is, I suggest, the philosophical heart of the original Blade Runner film. Simply saying "Oh, Deckard must be a replicant" robs it of its power.

To its credit, the sequel, Blade Runner 2049, refuses to answer the "human or replicant" question, except to imply (through the film's villain, Niander Wallace) that in Deckard's case it doesn't matter. All of the (other?) replicants in the new film are aware of their artificiality, have easily accessible serial numbers, and in some cases routinely take a "baseline" test determining their level of obedience. The main character, K., not only knows what he is but prefers being a replicant; indeed, he is upset by even the possibility that he might be partially human. The other replicants in this film don't want to be human, either. They merely want the few things that make human beings independent, including control over (to borrow a Marxist term) their own means of reproduction. "More human than human" was a subtle joke in the first Blade Runner movie; now it is something audiences are willing to accept at face value. BR 2049 has shed some of the poetic ambiguity of its predecessor, but it has made its social criticism more explicit. And it has put a spoke in Ridley Scott's rhetorical wheel, which these days is a plus.



* Another supposed hint was Captain Bryant's threat: "If you're not cop, you're little people." Bryant didn't explain what that meant. One could translate "little people" as "replicants who don't have a badge to protect them." Or it could mean "regular shmoes like everyone else in this dump."