All the things that were unusual about the class made it attractive to the students. It was about popular culture, for one thing, and taught by a charismatic and expert professor. Andrew Sarris was a film critic for the Village Voice, a position he’d held since long before he began lecturing at Columbia University 17 years earlier. Sarris was most commonly associated with his 1962 essay, ‘Notes on the Auteur Theory’, which advanced the critical position that the director was the primary author of a film. The essay had popularised the theory in America and earned Sarris a nemesis in The New Yorker’s Pauline Kael. Their rivalry had since dissolved, but it remained on the minds of a lecture hall of undergraduates.
In his lectures, Sarris would run reels of clips from various movies to illustrate broader points about cinematography or storytelling. Such compilations were themselves a technical novelty (Sarris may have needed professional help to put them together) and invited students to engage immediately with the course. Sarris discussed how different films were related thematically, and what implications the cinema had for politics and the arts at large. At the close of each lecture, the enthused undergraduates barraged Sarris with earnest questions, and one young student, impressed with the presentation and content of these lectures, decided he would have to tell his friend Michael Abbott all about them.
A graduate student in Columbia’s Theatre school, Michael Abbott, 23, had no official business attending an undergraduate film lecture, but upon his friend’s recommendation snuck into the hall with the enrolled students. Michael, too, was struck by Sarris’ methodology. He noted the professor’s use of immediately accessible cultural references that eased the students into more sophisticated and less accessible concepts; “challeng[ing] students with material that they didn’t even understand was challenging.” And he liked, more than anything, that this college professor believed popular culture was worthy of serious academic study.
He would have to come back here.
NEW YORK CITY, 1986-1987
Michael Abbott had arrived at Columbia a 22-year-old Theatre and English major from a small college in Indiana. He was adamant that he wanted to direct professionally, but being accepted by one of the most prestigious directing programs in the country was something of an inexplicable surprise. His interview had gone well, and he came equipped with a thorough resume, but he half-believed that the program staff simply liked the idea of having a “Midwestern, corn-fed boy in the mix.”
The staff were all active directors in the city and did not teach theatre exclusively. The majority of the assessment depended on a written thesis and the staging of one major production, and would lead to an apprenticeship under a director followed by a position in a theatre company. The program was limited to six students, a number that, it was made apparent at the outset, would eventually be reduced to four. The eliminations were for logistical reasons, as the college couldn’t support six directors simultaneously staging plays, but characterising the pursuit of a Master’s degree as an extended job interview did nothing to reduce the pressure.
Of the six, Michael was the only one straight from undergraduate school, or from as small a school as Indiana’s Wabash College, and being in his early 20s meant he was a good decade younger than the rest of the students. Nor was Michael well-travelled; if his peers were American then they had lived internationally, and the others were from cities like Beirut and Johannesburg.
Michael, furthermore, was as interested in film as he was in theatre, whereas the other students all seemed singularly focused on their area of study. But despite a deep appreciation of the cinema, he had not even heard of film theory until college, had never read Barthes, and his critical readings of film or theatre had mostly been limited to their immediate entertainment value, or to noting a standout performance. He admired the films of John Ford and the plays of Sam Shepard, two American artists who covered the very American subjects Michael was drawn to: explorations of the American mythos, the American west and the American identity. Michael worked with playwrights who wrote similar material, which earned him the sardonically-enunciated nickname “the American director” from the program head. Within the internationally cultured program, the westerns of John Wayne were decidedly Michael’s to keep.
In Michael's view, the six of them were a “pretty crazy group of people” to begin with, and the realities of the program stoked their ambitions. They were being trained to become professional directors, the staff stressed, not teachers, which was how many professional directors in the city did in fact support themselves. The program had no interest in academics, and having been forewarned that at least two of them “would not survive”, the students “got very, very single-minded about themselves.” Determined and career-driven, the other five rarely interacted with each other outside of the program and almost never with Michael. Their approaches were different, and because of his youth and relative inexperience, he felt, they didn’t respect him very much.
Even outside the directing program “pressure-cooker”, there was little release. “There was this demonic notion at Columbia,” says Michael, “that if you could survive this hell, you’d be better for it.” As he worked he was uncomfortably aware of Svengali-like presences looming over his shoulder, or telling him to his face: “that was shit. That was stupid. That was not worth my time.” Michael had had a “lousy” public school education, didn’t think much of teachers, and certainly no teacher had ever dealt with him like that or spoken to him in that way. They were focused on results, and their criticism cut deep. “I had, and still have,” says Michael, “a tendency to be kind of a worrier, to cling to outcomes and to need certain things to happen for me to be happy.” The program had put his self-worth on the table, and Michael was easily pushed to anxiety and doubt. “There were times,” he remarks with some bitterness, “when it was really no fun at all.” The sentiment likely extended beyond Michael; in a sobering episode, one of the women in the program killed herself.
Michael had little in common with the director students and admired how tightly-knit and low-drama the playwriting program seemed in comparison. Through chance encounters in a café on Amsterdam Avenue, he made friends with a group of students from the Journalism and Law schools with whom he shared a hobby that he never discussed with his theatre colleagues: video games. Whatever inspired their friendship (“These were people who weren’t dating on Saturday nights; we had time on our hands,” Michael notes wryly), he was thankful for this small but easy-going community founded around their niche pastime. The topic of theatre never entered the clique’s discussions, or the “gamer nights” they organised, and this break from academic pressures helped Michael to relax.
The years that Michael Abbott spent at Columbia coincided with the apex of the AIDS crisis. By 1986, more than 19,000 Americans had died from AIDS, a statistic that had received disproportionately minor attention from the news media and the federal government. Medical researchers marginalised the epidemic as an unglamorous minority’s disease, although it had also been discovered in heroin users, the female partners of infected men and babies given transfusions of infected blood. It was this, coupled with political and social discomfort in making the sexual habits of gay men the subject of national discourse, that had led to 19,000 preventable deaths, so argued a loose coalition of doctors, journalists and gay activists. The gay communities of New York City and San Francisco were the most visibly affected, and Michael, working in the theatre, had several gay friends who were by and large convinced that the Reagan administration was doing its best to ignore the crisis. Michael remembers the disparity between what the government “said America was and what we were living at the time in the city.”
It took the death of a film star, the 1960s romantic lead Rock Hudson, to propel AIDS into the national consciousness. Many Americans, including members of Michael’s family, were shocked not just by Hudson’s death but by the fact that a gay man could be like Rock Hudson. Ryan White, a 13-year-old haemophiliac who had contracted AIDS from a blood transfusion, had been barred from attending his school, and his cause was taken up by celebrities like Michael Jackson and Elton John. White was from Michael’s home state of Indiana, a place that Michael began to see bore greater resemblance to Ronald Reagan’s vision of America than did New York City. He had had gay friends at home; only he didn’t know they were gay at the time. The idea of coming out in Indiana was different than it was in New York City, where Michael had quickly befriended a flamboyant student on the College Board.
Michael had grown up in a conservative Christian church whose ideology he found problematic when he reached college. He questioned the idea of monotheism, and challenged why exactly the things he was told were “evil” were indeed wrong. “I think in my case, I was so overwhelmed as a boy by a powerfully strong father, that somehow the notion of 'don’t do that, don’t try that, don’t say that', whatever, I become oddly attracted to those things. As an act of liberation.” Michael’s character was more inquisitive than contrarian, and amidst widespread anger and frustration, he became troubled by his repudiation of Christianity and by whether he was replacing that spiritual vacancy with anything else.
New York City had politically activated Michael, an awakening he thought had already occurred at Wabash, but nothing had prepared him for the hopelessness and desperation embodied by his new friends. He would hear every week, if only from a friend of a friend, that someone new had died. He was witnessing politics and art motivated by anger for the first time. In the theatre, they struggled with how to respond to the disease and the crisis that threatened to define them. There was a debate between those who wanted to argue that if the public got to know homosexuals, then they would like them, and those who wanted to yell that while they would never be understood they would no longer be ignored. The prevailing rallying cry became “get on board or get out of our way.” Some chose to tune out the entire thing, less, Michael thinks, out of homophobia or prejudice than because of how overwhelming everything was at once. Regardless, “the voice of the artists in New York was changing,” says Michael, “and everything seemed up for grabs at that time in the American theater as it related to the AIDS epidemic.”
Michael was concerned, and angry too, but he didn’t have a response. Artistically, he didn’t have experience writing about the contemporary political and moral conscience, nor was he gay, nor did he have AIDS. His strenuous academic schedule, furthermore, wouldn’t allow him the time to volunteer as his friends had. Given a remarkable catalyst in American history, he wished he could have risen to the occasion with some artistic and personal statement; instead, he felt sorely disconnected.
In high school, Michael knew students whose parents wouldn’t let them play Dungeons & Dragons. The tabletop role-playing game had been stigmatised by religious conservatives as an object of the occult, powerfully and indivisibly affiliated with imagery of demons and witchcraft. It was not just the pastime of geeks, but of Satan. Legends persisted about D&D-addicted teenagers turning up dead in steam tunnels, and Patricia Pulling, whose gamer son committed suicide, would form the vocal watchdog group Bothered About Dungeons & Dragons (BADD).
The prejudice didn’t affect Michael much; he’d never been a big D&D player. Some of the video games he did play, however, bore that influence. The D&D and Lord of the Rings-derived Ultima games were typical of the role-playing genre at that time. Players would select a character who navigated a fantasy-fiction universe to defeat an all-powerful evil, fighting monsters and collecting equipment along the way. The three games in the series offered only slight variations on the theme, but had soared in popularity for the radical technological advances made with each instalment. Richard Garriott, 24, the Ultima designer and programmer, had self-published his first game, Akalabeth – effectively an Ultima prototype – in high school, and sold it on the shelves of the ComputerLand store where he worked. By Ultima III, his success was such that he dropped out of college to make games professionally. Between the popularity of the series and its allegedly demonic lineage, Garriott and his company, Origin Systems, were an attractive target for Patricia Pulling-esque letter-writing campaigns that tagged his games as corruptive influences and Garriott personally as “the Satanic perverter [sic] of America’s youth”. Garriott didn’t quite see where the letters were coming from but, out of interest, tried to.
Garriott realised that the ostensible villains in his stories never demonstrated their villainy; the game would merely instruct players that certain characters were evil. These scourges of the land were confined to the game’s final levels, waiting for the hero to kill them, while that same hero could loot, pillage and murder civilians free of conscience. If Ultima lacked moral nuance, it was largely the fault of technical limitation and video game convention, but it nonetheless gave the stories an ethically suspect cast.
The new Ultima, Garriott decided, would not be built around the hypothetical threat of a principal villain but around the hero’s capacity for altruism and inspiration. To finish the game, players would need to demonstrate proficiency in a series of virtues: compassion, humility, love. To ensure this theme was consistent with the game’s mechanics, Garriott designed a system of consequences. Ultima players would still be able to commit any crime, but would face appropriate repercussions. While the game itself would not object if a player stole another character’s property, the relevant character now could. Ultima players took for granted that the ends justified the means; now, Garriott would challenge them on that assumption. As he programmed the game, Garriott kept his plan a secret for fear of alienating prospective role-players with idealistic talk of morality and justice. Better, it seemed, to first ease them in with the familiar Ultima concepts. Nor was Garriott particularly convinced that his experiment would work, or that players would take kindly to admonishment.
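A minimal sketch, in modern Python rather than the code of the era, of the kind of consequence system described above; every class, virtue and value here is a hypothetical illustration of the idea, not Origin’s actual design:

# Hypothetical sketch of a virtue/consequence system in the spirit described above.
# The characters, virtues and scoring are invented for illustration only.
class Citizen:
    def __init__(self, name, gold):
        self.name = name
        self.gold = gold
        self.trusts_player = True  # the world, not the game, gets to object

class Avatar:
    def __init__(self):
        # a few example virtues tracked as running scores
        self.virtues = {"compassion": 0, "humility": 0, "honesty": 0}

    def steal_from(self, citizen, amount):
        # the game still permits the theft, but records its consequences
        taken = min(amount, citizen.gold)
        citizen.gold -= taken
        citizen.trusts_player = False      # the victim now objects
        self.virtues["honesty"] -= 1       # the act counts against a virtue
        return taken

    def give_alms(self, citizen, amount):
        citizen.gold += amount
        self.virtues["compassion"] += 1

    def ready_to_finish(self):
        # completing the quest requires proficiency in every virtue
        return all(score > 0 for score in self.virtues.values())

player, beggar = Avatar(), Citizen("beggar", gold=2)
player.steal_from(beggar, 5)
print(player.ready_to_finish())  # False: the ends no longer justify the means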
Ultima IV: Quest of the Avatar was published in September of 1985, and although Michael had lost interest in the Ultima series, over the subsequent months it overtook his gaming circle. All of his journalism and law friends were playing Ultima IV simultaneously and experiencing the same epiphany. Michael had never considered the video game much of an art form, and did not expect to be questioned on ethics, responsibility and relationships. “It wasn’t at all clear that a game could do that,” he says, “to put you in a relationship with these characters, characters that had a life inside of the game.”
It was a revelation, as they concluded in a series of impassioned and drunken conversations the likes of which they’d never had about games before. Michael recognised in Ultima IV the same tremendous dramatic potential already possessed by the theatre. A director, Michael associated role-playing with actors and stage plays rather than Dungeons & Dragons, but in this game he saw a possible future convergence. The concept of an avatar fascinated him to no end: it was a “vessel for really good writers to get hold of and do something amazing with,” like implicate the audience in an interactive morality play. “It was role-playing in my way of thinking and not the D&D way of thinking.”
To Michael Abbott, Ultima IV had opened the door for the video game industry, and he had every expectation that someone else would walk through it. Michael had a head for stories and drama, not programming, but he was convinced that in ten years, characters in video games would be so complex as to allow infinitely variable experiences and limitless interactions.
A German woman had been cut from the directing program for unclear reasons, which left Michael in the final line-up. He had been directing Sam Shepard one-act plays for assignment, and chose to have the male parts played by women. “These were absolutely fearless actresses,” he says, “who gave just unbelievable performances,” and it was the quality of his productions that was starting to win him the respect of his three remaining director peers.
Michael had a wealth of Sam Shepard material to choose from. Shepard, the 44-year-old playwright, was prolific, and could readily turn out a series of cogent one-act plays that touched on themes of national identity. Shepard’s writing had earned him a collection of off-Broadway awards, and his Buried Child, which articulated the disillusionment with American idealism through the ugly dissolution of a nuclear family, won him the Pulitzer Prize in 1979. The multitalented Shepard had picked up an Academy Award nomination for playing the pilot Chuck Yeager in The Right Stuff and had collaborated with Patti Smith and Bob Dylan. “He had a particular vision that no one else had at that time,” says Michael. “You could see him maturing as a writer: he becomes more ambitious and writes larger plays. His scope broadens, and suddenly he’s writing about the American conscience post-Vietnam. He carries all that on his shoulders.”
Shepard was born and raised in Illinois, less than two hours from Michael's home, and the two Midwesterners now lived and worked in New York City. His plays had a Midwestern quality and represented a lifestyle Michael intuitively understood. Shepard, Michael points out, demonstrated an artistic preoccupation with a father who abandoned his children. “It’s hard for me not to feel deeply connected to those things because they’re so troubled and I think have a resonance with my own experiences.”
Shepard wrote about the American mythos, a subject covered almost antithetically by the films of the late John Ford. Both men wanted to examine the American West for what it really meant, and began from a position of general skepticism. “The Western mythos is all about dramas and fears,” says Michael. “Ford was still very sentimental about those parts of American history, sometimes unquestioningly. Shepard, too, is deeply skeptical but poetically so. He's not an angry playwright.” Ford made westerns, genre films that didn’t share Shepard’s avant-garde credentials, but Michael treated them with equal seriousness. “When people say that a certain type of movie or a certain type of culture is not worth our time,” he says, “or is somehow less valuable, or low culture, or whatever, I find that I have this immediate feeling that – that’s wrong. I must defy that, somehow. Maybe they’re right, I don’t know.” Andrew Sarris’ talks on auteur theory had eliminated Michael’s lingering suspicion that he might be reading too much into Ford, or Shepard, or anyone.
Michael had taken from Sarris that “if you track something carefully enough and deeply enough you can get at those things that artists seem compulsively determined to deal with.... I didn’t make those kinds of connections before he helped me think about that stuff.” In connecting one Ford film to another, Ford to Shepard, and film to theatre, thematic patterns appeared. He was drawn to the recurring themes portrayed in Ford and Shepard: fathers and sons, masculinity and the solitary hero. Abstracting art to a personal level, Michael could ask himself why Shepard and Ford were inspired to write about fathers and sons, and why he had been compelled to detect those themes. How would his approach as a theatre student change, he wondered, if he incorporated ideas from film? If he studied art outside of a vacuum, and looked at film and theatre and life and politics and science and video games as one, connecting them all as though they were one conversation, then perhaps he would see something new. It made sense to the young Indiana director who had been determined to enter graduate school immediately because, leaving college with no particularly employable skills, he couldn’t face moving back home to an alcoholic stepfather. When the stepfather died of cancer shortly before the director left for New York, he felt relief that his mother did not have to deal with that situation alone.
At the Museum of Modern Art, Michael attended a screening of Straight Shooting, a silent 1917 John Ford western, with Grafton Nunes, a member of the theatre faculty who would become his thesis adviser. Ford’s seventh film (of an ultimate 140) surprised Michael with its early confidence. When it was over, Michael turned to Nunes and asked him: “Has anyone ever focused on this kind of thing for their thesis?” No one had, Nunes replied, but it wouldn’t be impossible.
The thesis connected three versions of the solitary hero from John Ford films, each portrayed by a different actor. Although it would be assessed as his stage directing thesis, it had no specific theatre content or specific film content; it was a character study whose themes, Michael hoped, were universal. He wrote The Solitary Hero in the Films of John Ford at Barnard College, where he bargained for a space to start his thesis production; in the Museum of Modern Art, whose Ford collection he exhausted; in Indiana, whose state university held the Ford papers; and in New York, where he first read the Tao Te Ching. He wrote it everywhere.
In October of 1999, the Visiting Artists Committee of Wabash College invited the film critic Molly Haskell to speak at the college. Haskell had written for numerous New York newspapers and magazines and most famously had authored From Reverence to Rape, the classic 1974 feminist film theory text on the portrayals of women in cinema. She was also the wife of Andrew Sarris, whose lectures Professor Michael Abbott had snuck into thirteen years prior. Michael admired Haskell’s writing, but nonetheless expected her to decline: Haskell, 60, didn’t accept many speaking engagements from what Michael could tell.
Haskell agreed. “She was very intrigued by our situation, I think,” says Michael, “as an all-male school, and with her work I think she was particularly eager to come to a place with an all-male culture and talk to them about gender.”
Michael had liked From Reverence to Rape and further appreciated that Haskell was not an ideologue: “a feminist that didn’t always buy the feminist line 100 per cent.” Haskell had written, sincerely, about the actor John Wayne, whose conservative politics and prototypically masculine screen persona were hardly dogmatically consistent with feminist ideology. “It was clear to me that she approaches ideas head on,” says Michael, “with whatever value they have to her at that moment.”
In welcoming Haskell to the Wabash campus, Michael turned the conversation to the subject of video games, something that the film critic knew nothing about. Michael told her about how games were becoming increasingly cinematic, how they were adopting editing and cinematography techniques from film, and how their plots and characters were ever more sophisticated. Games, in style and form, were approaching a version of the cinema, Michael told her, but with the distinguishing element of interactivity, and Haskell was interested in that. Michael wondered out loud where games would go in the future, and she wondered with him.
CRAWFORDSVILLE, 2004
The video game class that appeared in the course catalogue stood out as unlikely subject matter for both the college and the professor. The single-sex, 900-student Wabash College was a liberal arts school that had never before covered the medium, and Professor Michael Abbott, the 41-year-old chair of the Theatre department, had only an avocational interest in games and no experience teaching programming or game design.
The course, envisioned as a history of video games, had been in Abbott’s plans since at least 2001, and its belated institution came ten years after Abbott had interviewed for a faculty position at Wabash in 1994. Abbott, who held a Master of Fine Arts in Directing and had taught briefly at Marquette in Wisconsin, had found Wabash attractive after learning that the staff would allow him to bring in new courses. Though a theatre graduate, he deeply admired the cinema, and it was incredible to him that Wabash at that time had no dedicated film classes.
Abbott had signed on, in effect, to establish Wabash’s film department, and since then he had regularly taught courses on film, theatre, directing and dramaturgy, and had advanced to the position of Chair of Theatre, which, titular prestige aside, came with administrative and personnel duties he would otherwise have preferred not to handle. His stature had landed him on the Teacher Education Committee and the Visiting Artists Committee, and through the latter he had been fortunate enough to meet some admired artists and writers, like the feminist film critic Molly Haskell and the playwright Tony Kushner, whose Angels in America (“the greatest American play ever written”, he says) had articulated a Pulitzer-winning response to the late-80s AIDS crisis.
Abbott had been presented with Wabash’s McLain-McTurnan-Arnold Award for Excellence in Teaching in 2001. While he would admit Wabash was a small enough school that virtually all of its professors would win it eventually, he found it particularly meaningful that he had received the award at a comparatively young age. His career had endowed him with the self-confidence, and the credibility with his colleagues, to launch now what was effectively an academic pilot program for an untaught subject.
Unapologetically enthusiastic about video games, Abbott kept a PlayStation 2 console in his office and the door was open for students to stop by and discuss this shared hobby. He had considered video games a serious art form as early as his graduate school years, and ever since, it had seemed that the rest of the world was catching up to the level of sincere regard he held for the medium.
Games had broken through to the cultural mainstream on the crest of a media narrative that cast the industry as a multi-billion-dollar machine to rival Hollywood, in which high-production-value action titles from famous franchises – Grand Theft Auto: San Andreas and Halo 2 – were assigned leading roles. College life, certainly in contrast to Abbott’s own experience, had adjusted accordingly. Students commandeered the campus’ computer network to stage skirmishes in the tense multiplayer shooter Counter-Strike, the fraternities of Kappa Sigma and Tau Kappa Epsilon housed unusually serious gamers, and most of the other dorms could boast of multiple consoles and sport game fans. A loose confederacy of Dungeons & Dragons, role-playing and tabletop gamers successfully lobbied for student senate funding and designated themselves the ‘Dork Club’. “They actually have money to spend,” explains a faintly awed Michael Abbott, “to buy games and buy pizza and have fun… even ten years ago it would have been unheard of.”
From Abbott's perspective, Wabash was subject to a clear generational shift. “Very few of my colleagues are gamers, but as our faculty gets younger, that is clearly changing. I have a colleague in Chemistry who's very into Warhammer and Transformers and the survival horror genre. A visiting professor in my department whose office is right next to mine is a big fan of strategy games and sports management sims.”
Video games were fine to play, but to teach them was something else. That proposition, perhaps surprisingly, faced as much resistance from students as it did from the non-gaming faculty. “I don’t think we've successfully made the case for studying games even to gamers,” says Abbott. “They're not shocked because they think it's wrong or stupid; they just can't figure out what there is to learn. They're games. They're great, but so is bubble gum.” Abbott, on the other hand, had long seen in video games creative possibilities similar to those of theatre and film. Games, much like other forms of art, could be a lens through which to view the world, and the medium was made uniquely fascinating by the element of interactivity. His class, potentially, could host a discussion between like-minded individuals. “I just want to be part of that discussion any way I can.”
Even if Wabash students weren’t convinced that games could be intellectual and support close readings, they were still generally enthusiastic about the act of play, and Abbott believed he could turn that to his advantage. Video games were fun, they had never been more popular, and students might more readily engage with complicated subjects if games were the point of entry. If Abbott legitimised games by assigning them as his course texts, he would be speaking with the students in a language they already knew, and that few liberal arts professors understood or sanctioned. It was a teaching style demonstrated to Michael by his own film professors, who had helped students deconstruct complicated subjects through the accessible and relevant case studies of modern cinema.
The study of games was not intended to help the students better understand graphics, programming or animation. Games could teach the basics of research, critical writing, critical thinking, and an appreciation of complex systems: the skill set of a liberal arts student.
Every incoming Wabash freshman has to take a tutorial course. Each course is limited to 15 students and the classes are effectively intended as an academic induction, introducing the freshmen to the skills they will need to succeed at the college. Tutorial topics are broad and chosen by the respective professors, who are encouraged to pick something outside of their expertise to keep themselves interested. Despite the limitless subject matter, Wabash had never offered anything like ‘In The Future We Will Play: The Art and History of Video Games’, which appeared on course lists sent out to new students that summer.
The class caught the interest of Tom Elliott, although the student never considered himself anything more than a “casual console player”. Elliott, who had yet to declare a major, “thought it would be particularly fun to study [games] from an academic perspective” and pictured an “easy A” course in which he would do nothing but play games.
“The only class that actually sounded engaging”, remembers Nelson Barre, was ‘In The Future We Will Play’. “I found the discussion of video games as a cultural phenomenon intriguing.” Barre had attended high school in Colorado, five miles from the tragic school shooting at Columbine in 1999, a massacre whose perpetrators had played – in a connection conspicuous to the mainstream news media – copious amounts of the archetypically violent shooter Doom. “It’s easy to point the fingers [at video games] but when one is so accustomed to playing video games oneself, there is the tendency to defend them.”
The class’s 15 vacancies were filled very quickly, Michael Abbott was told.
***
“There’s no text for me, there’s no textbook, there’s no guidebook, there’s no existing literature.” The pipe dream now a reality, Abbott found his ambitions waylaid by practicalities. The course called for old arcade games, but making even modern video games available for 15 students to play on their own time was enough of a challenge.
The students would be there every Tuesday and Thursday, with the interim over the weekend assigned to game playing, and Tuesday and Wednesday nights set aside for “reading and reflection”. Abbott didn’t lecture in the conventional sense, nor did he use PowerPoint; instead, he prioritised discussion and conversation. The advantage of the tutorial format, he notes, was that “once you’ve chosen [your topic] and establish it, you can use it as a starting point to develop something more.”
By the time the 15 students first appeared together in the classroom, Abbott had written to them to gauge their interests. He knew that he had a lot of fans of sport and racing games; “a lot of guy games, I expected that. But I also had a core group of really serious RPG players, and some strategy guys and some sim guys.” Abbott had once co-taught an upper-level Shakespeare class for English majors, and his teaching partner had prescribed assignments based on the students’ areas of specialisation, whether creative writing, or history, or verse. The stratification of video games by genre would easily lend itself to the same tactic, and everyone in the class could spend part of the semester as the resident expert in their preferred kind of game.
In establishing the terms of the course, Abbott confirmed that it would involve a fair amount of game playing, but the students would be required to take comprehensive notes. The idea was not to enjoy themselves without consequence but to witness the evolution of the medium through the use of hands-on examples.
Abbott was conscious that a college class on video games might be perceived around the campus as a free ride. “I’d say there was a stigma around the class itself”, says Tom Elliott. “You’d tell someone you were studying video games, and the invariable response would be a laugh, or something to the effect of ‘that must be so hard.’” At the time, Elliott didn’t necessarily disagree. “Even being in the class I doubted the academic rigor of the subject…. my thought of video games as a serious subject would be on the same level as ebonics or dancing, you just can’t quite consider them serious.”
“I think it motivated me to make sure that no one could accuse that class of going easy on work,” says Abbott, who despite a freeform, conversational teaching style, set a rigorous assignment schedule. “I really socked it to them, probably too much, because I was so concerned about being seen as offering a lollipop course…. They wrote a lot. They read a lot. I gave them supplementary reading materials on virtually everything we did, and they had to constantly write response papers and analytical papers on stuff we played.”
Students like Nelson Barre remained invested in the class, heavy workload notwithstanding. Barre, an English and Classics major, admired the course’s latitude. Abbott would digress from the course objective to engage the students in discussions of whether games were art or why the public at large was not paying attention. “Our class covered everything from ethics to gender, politics to narrative”, says Barre. “I readily see all these things in video games, but when I first took the course, I was not quite expecting the amount of depth Michael covered in class.”
***
Abbott proposed a debate between two students over a game-related topic, and Tom Elliott took up the challenge. Elliott, who says that as a gamer he’d only ever “gotten on board” with the Super Nintendo console (which had been released thirteen years prior), would develop an interest in performing stand-up comedy, and Abbott was starting to see him as the H.L. Mencken of Wabash College. He belonged to a fraternity that in Elliott’s estimation could claim at least one game console in every room. Elliott wanted to argue the case of Dance Dance Revolution, the Japanese music game in which players stand on a plastic mat marked with coloured arrows and step in time with the beat of certain songs. “I said I would defend the idea that Dance Dance Revolution is the greatest game ever. Abbott suggested the topic might be a bit broad, but not wanting to back down, I said I would argue that DDR was the healthiest game ever.”
Elliott borrowed a heart monitor from a friend on the football team, and measured his heart rate during an evening game of Dance Dance Revolution. He compared the results to his experience playing a “non-interactive game”: something so passive it still escapes his recollection. With perhaps the slightest degree of flippancy, Elliott presented his findings to the class. “I made a compelling argument based on the fact that my heart rate was higher playing DDR than sitting on my ass”. The “debate” was downhill from there, and Abbott watched the speakers trade personal attacks with amusement. “I think he enjoyed our creativity,” says Elliott, “even if we weren’t using the most effective debating techniques.” As students outside the class dismissed Abbott's course as an effortless waste of time, Elliott began to think that they were actually jealous.
Nelson Barre likewise heard from his friends and family that he was “lucky to be taking such an easy course”. “That’s a class?” he would sometimes be asked. “I explained it was part of being a freshman, [and] taught writing, organization, study and discussion skills”, he says, “[and] it became a slightly more respectable class.” Barre, remembers Abbott, was the kind of student who “puts things together one piece at a time, and he’s very careful. He’s very serious, I think, it’s just his personality. He looks for the big picture. He tries to make connections to things. I think he became convinced in that class that his peers, who think that games are just a silly waste of time, need to be corrected. And so he took it upon himself to do some correcting.”
***
The historical texts of ‘In The Future We Will Play’ were the early ‘80s arcade games produced by Williams Electronics like Defender and Robotron: 2084. “I bought one of those ex-arcade dual joystick things that you can plug into your computer with a USB port,” says Abbott, explaining how he solved one of the course’s several logistical problems, “and I brought this behemoth into class and we would play all those really great Defender games.” The creators of these early texts were programmers like Eugene Jarvis and Larry DeMar, who, unusually for classic authors studied in an arts course, were still alive. “They’re still kicking, but nobody pays attention to them anymore. So you get these [students] a little charged up about these games and they get competitive with them, and suddenly they’re doing the kinds of research that you would normally wait until later in a student’s career to get to. They were doing primary sources research, they were contacting these designers to see if they could get interviews. They were doing a kind of journalism that you would not expect freshmen to do. Purely because their interest was being driven forward by these games, I think.”
The concept of a college course that facilitated a discussion between sincere gamers was expanding. The professor was all too happy to talk about the latest games outside of class hours. Elliott and Barre each expressed an interest in a theatre major, and Abbott would be assigned as their academic advisor. “I became close with those guys, and this doesn’t always happen. I think part of that was because we just gamed together, you know.” The idea had not necessarily been to enjoy themselves, but it happened regardless.
While the class at times could entertain, neither the students nor the professor lost focus on the aspects of video games they considered to be meaningful. “To us,” says Elliott, “a discussion of the impact of Super Mario Bros. on the Western world was a serious discussion and not a joke.”
***
“For a freshman,” says Abbott, speaking with slight regret, “a 20-page paper is a pretty big deal.” Their term paper was a research assignment on an influential game creator who had affected the game medium and industry. If Abbott had increased the difficulty of a freshman tutorial to combat outside misinterpretation, his students nevertheless met the challenge. “I was able to tap into the students’ eagerness to play these games. What that meant for me was that I was able to get their enthusiasm level up before diving in to a more rigorous assignment that I might not ordinarily have been able to assign.”
When Nelson Barre saw the subjects his fellow students were choosing, it reaffirmed his faith in the idealistic notions of the course. “[Everyone] looked at topics greater than simply video games for video games' sake.” The papers Abbott received interpreted the Metal Gear Solid designer Hideo Kojima as a deconstructionist, examined gender roles in the Final Fantasy series, and considered the silent hero as a masculine ideal.
Barre, Elliott and their thirteen peers were engaged and enthusiastic beyond the level anticipated of freshmen. Abbott attributes the result in part to the course’s unusual subject matter: “I met them at the place they lived,” he says, in reference to his choice of course theme. Whatever role video games played in the students’ motivation, however, Abbott would never disparage the hand he was unexpectedly dealt. “This was just an unbelievably bright group of guys.”
***
Abbott had tried to use video games to teach the fundamental liberal arts skills that other media were widely considered capable of imparting, and the students came to see these connections between media. “I added a Theater major,” says Tom Elliott, “because I enjoyed his teaching and the field in which he worked, which I see as interconnected.” Elliott still doesn’t believe games are “important enough” to be studied individually, but praises the tutorial. “The basic instruction in writing that I learned in the course is still valuable to me. I think my ideas on good vs. bad games also expanded during the course. I began to see the merits of some more online and multi-player games and also learned to appreciate a compelling story in a game as much as great graphics.”
“I always felt I was learning something from Michael,” says Nelson Barre. “He had a sense about him that only comes along in a professor with whom you really connect.” Barre would sign up for more of Abbott’s courses, though none of them were about video games, and he would become increasingly interested in Abbott’s original field, the theatre. He remembers ‘In The Future We Will Play’ fondly: “Michael's class taught me to be a better writer, speaker and thinker. I first developed my writing sense in that class, and Michael's ability to inquire and help in the writing process allowed me to flourish in the class and all the others I took in undergrad. I consider it, still, my favorite course in the four years I spent at Wabash.”
***
Michael Abbott was impressed by his students' dedication, and while he wondered if he had pushed them too hard, they had responded. “[What] I’m really teaching is critical thinking, analytical reading, collaborative discussion skills, clear, concise writing, research and public presentation. If I could make measurable progress in these areas with my students, I think my college would let me teach dog grooming.”
Regardless of the positive results of his experiment, and the personal plaudits he earned from his students, Abbott was not fully satisfied. ‘In The Future We Will Play’ was too broad, he thought, and they had not been afforded the time or the luxury to cover topics in-depth or linger on specific details. “We didn’t do enough reading of texts or enough close analysis to accomplish what I hoped.” He knew now that his students would have been ready for it.
“The problem, I think, with it was that it was mostly an overview course. That’s okay, but as a teacher, overview courses, survey courses, it’s hard to generate a lot of enthusiasm for that over and over.” He considered less general subjects; subjects that could sustain wider conversation and persuade the skeptical. “There’s a pretty strong resistance amongst students, even college-age students, to the idea of studying games. They just don’t quite buy it yet. At least that I see.”
Conversation was productive, especially for an art form as comparatively undeveloped as video games. “If you believe that video games, like any art, are a way of seeing, a way of seeing a world, or experiencing it…” Abbott never had a textbook because the rules were not written yet. “It’s partly why I think it’s good to have so many people thinking about them.”
The class could have been longer.
The video game class that appeared in the course catalogue stood out as unlikely subject matter for both the college and the professor. The single-sex, 900-student Wabash College was a liberal arts school that had never before covered the medium, and Professor Michael Abbott, the 41-year-old chair of the Theatre department, had only a vocational interest in games and no experience teaching programming or game design.
The course, envisioned as a history of video games, had been in Abbott’s plans since at least 2001, and its delayed institution was occurring ten years after Abbott had interviewed for a faculty position at Wabash in 1994. Abbott, who held a Master of Fine Arts in Directing and had taught briefly at Marquette in Wisconsin, had found Wabash attractive after learning that the staff would allow him to bring in new courses. Though a theatre graduate, he deeply admired the cinema and it was incredible to him that Wabash at that time had no dedicated film classes.
Abbott had signed on, almost, to establish Wabash’s film department, and since then he had regularly taught courses on film, theatre, directing and dramaturgy, and had advanced to the position of Chair of Theatre, which, titular prestige aside, is accompanied by administrative and personnel duties he’d have otherwise preferred not to handle. His stature had landed him on the Teacher Education Committee and the Visiting Artists Committee, and through the latter he had been fortunate enough to meet some admired artists and writers, like the feminist film critic Molly Haskell and the playwright Tony Kushner, whose Angels in America (“the greatest American play ever written”, he says) had articulated a Pulitzer-winning response to the late-80s AIDS crisis.
Abbott had been presented with Wabash’s McLain-McTurnan-Arnold Award for Excellence in Teaching in 2001. While he would admit Wabash was small enough of a school that virtually all of its professors would win it eventually, he found it particularly meaningful that he received the award at a comparatively young age. His career had endowed him with the self-confidence, and the credibility with his colleagues, to launch now what was effectively an academic pilot program for an untaught subject.
Unapologetically enthusiastic about video games, Abbott kept a PlayStation 2 console in his office and the door was open for students to stop by and discuss this shared hobby. He had considered video games a serious art form as early as his graduate school years, and ever since it seemed like the rest of the world was catching up to the level of sincere regard he held for the medium.
Games had broken through to the cultural mainstream on the crest of a media narrative that cast the industry as a multi-million dollar machine to rival Hollywood, in which high-production-value action titles from famous franchises – Grand Theft Auto: San Andreas and Halo 2 – were assigned leading roles. College life, certainly in contrast to Abbott’s own experience, had adjusted accordingly. Students commandeered the campus’ computer network to stage skirmishes in the tense multiplayer shooter Counter-Strike, the fraternities of Kappa Sigma and Tau Kappa Epsilon housed unusually serious gamers, and most of the other dorms could boast of multiple consoles and sport game fans. A loose confederacy of Dungeons & Dragons, role-playing and tabletop gamers successfully lobbied for student senate funding and designated themselves the ‘Dork Club’. “They actually have money to spend,” explains a faintly awed Michael Abbott, “to buy games and buy pizza and have fun… even ten years ago it would have been unheard of.”
From Abbott's perspective, Wabash was subject to a clear generational shift. “Very few of my colleagues are gamers, but as our faculty gets younger, that is clearly changing. I have a colleague in Chemistry who's very into Warhammer and Transformers and the survival horror genre. A visiting professor in my department whose office is right next to mine is a big fan of strategy games and sports management sims.”
Video games were fine to play, but to teach them was something else. That proposition, perhaps surprisingly, faced as much resistance from students as it did the non-gaming faculty. “I don’t think we've successfully made the case for studying games even to gamers,” says Abbott. “They're not shocked because they think it's wrong or stupid; they just can't figure out what there is to learn. They're games. They're great, but so is bubble gum.” Abbott, on the other hand, had long seen in video games similar creative possibilities to theatre and film. Games, much like other forms of art, could be a lens through which to view the world, and the medium was made uniquely fascinating through the element of interactivity. His class, potentially, could host a discussion between like-minded individuals. “I just want to be part of that discussion any way I can.”
Even if Wabash students weren’t convinced that games could be intellectual and support close readings, they were still generally enthusiastic about the act of play, and Abbott believed he could turn that to his advantage. Video games were fun, they had never been more popular, and students might more readily engage with complicated subjects if games were the point of entry. If Abbott legitimised games by assigning them as his course texts, he would be speaking with the students in a language they already knew, and that few liberal arts professors understood or sanctioned. It was a teaching style demonstrated to Michael by his own film professors, who had helped students deconstruct complicated subjects through the accessible and relevant case studies of modern cinema.
The study of games was not intended for the students to better understand graphics, programming or animation. Games could teach the basics of research, critical writing, critical thinking, and an appreciation of complex systems: the skill set of a liberal arts student.
Every incoming Wabash freshman has to take a tutorial course. Each course is limited to 15 students and the classes are effectively intended as an academic induction, introducing the freshmen to the skills they will need to succeed at the college. Tutorial topics are broad and chosen by the respective professors, who are encouraged to pick something outside of their expertise to keep themselves interested. Despite the limitless subject matter, Wabash had never offered anything like ‘In The Future We Will Play: The Art and History of Video Games’, which appeared on course lists sent out to new students that summer.
The class caught the interest of Tom Elliott, although the student never considered himself anything more than a “casual console player”. Elliott, who had yet to declare a major, “thought it would be particularly fun to study [games] from an academic perspective” and pictured an “easy A” course in which he would do nothing but play games.
“The only class that actually sounded engaging”, remembers Nelson Barre, was ‘In The Future We Will Play’. “I found the discussion of video games as a cultural phenomenon intriguing.” Barre had attended high school in Colorado, five miles from the tragic school shooting at Columbine in 1999, a massacre whose perpetrators had played – in a connection conspicuous to the mainstream news media – copious amounts of the archetypically violent shooter Doom. “It’s easy to point the fingers [at video games] but when one is so accustomed to playing video games oneself, there is the tendency to defend them.”
The class’s 15 vacancies were filled very quickly, Michael Abbott was told.
***
“There’s no text for me, there’s no textbook, there’s no guidebook, there’s not existing literature.” The pipe dream now a reality, Abbott found his ambitions waylaid by practicalities. The course called for old arcade games, but making even modern video games available for 15 students to play on their own time was enough of a challenge.
The students would be there every Tuesday and Thursday, with the interim over the weekend assigned to game playing, and Wednesday and Tuesday night set aside for “reading and reflection”. Abbott didn’t lecture in the conventional sense, nor did he use PowerPoint; instead, he prioritised discussion and conversation. The advantage of the tutorial format, he notes, was that “once you’ve chosen [your topic] and establish it, you can use it as a starting point to develop something more.”
By the time the 15 students first appeared together in the classroom, Abbott had written to them to gauge their interests. He knew that he had a lot of fans of sport and racing games; “a lot of guy games, I expected that. But I also had a core group of really serious RPG players, and some strategy guys and some sim guys.” Abbott had once co-taught an upper-level Shakespeare class for English majors, and his teaching partner had prescribed assignments based on the student’s areas of specialisation; whether creative writing, or history, or verse. The stratification of video games by genre would easily lend itself to the same tactic, and everyone in the class could spend part of the semester as the resident expert in their preferred kind of game.
In establishing the terms of the course, Abbott confirmed that it would involve a fair amount of game playing, but the students would be required to take comprehensive notes. The idea was not to enjoy themselves without consequence but to witness the evolution of the medium through the use of hands-on examples.
Abbott was conscious that a college class on video games might be perceived around the campus as a free ride. “I’d say there was a stigma around the class itself”, says Tom Elliott. “You’d tell someone you were studying video games, and the invariable response would be a laugh, or something to the effect of ‘that must be so hard.’” At the time, Elliott didn’t necessarily disagree. “Even being in the class I doubted the academic rigor of the subject…. my thought of video games as a serious subject would be on the same level as ebonics or dancing, you just can’t quite consider them serious.”
“I think it motivated me to make sure that no one could accuse that class of going easy on work,” says Abbott, who despite a freeform, conversational teaching style, set a rigorous assignment schedule. “I really socked it to them, probably too much, because I was so concerned about being seen as offering a lollipop course…. They wrote a lot. They read a lot. I gave them supplementary reading materials on virtually everything we did, and they had to constantly write response papers and analytical papers on stuff we played.”
Students like Nelson Barre remained invested in the class, heavy workload notwithstanding. Barre, an English and Classics major, admired the course’s latitude. Abbott would digress along the course objective to engage the students in discussions of whether games were art or why the public at large were not paying attention. “Our class covered everything from ethics to gender, politics to narrative”, says Barre. “I readily see all these things in video games, but when I first took the course, I was not quite expecting the amount of depth Michael covered in class.”
***
Abbott proposed a debate between two students over a game-related topic, and Tom Elliott took up the challenge. Elliott, who says as a gamer he’d only ever “gotten on board” with the Super Nintendo console (which had been released thirteen years prior) would develop an interest in performing stand-up comedy, and Abbott was starting to see him as the H.L. Mencken of Wabash College. He belonged to a fraternity that in Elliott’s estimation could claim at least one game console in every room. Elliott wanted to argue the case of Dance Dance Revolution, the Japanese music game in which players stand on a plastic mat marked with coloured arrows and step in time with the beat of certain songs. “I said I would defend the idea that Dance Dance Revolution is the greatest game ever. Abbott suggested the topic might be a bit broad, but not wanting to back down, I said I would argue that DDR was the healthiest game ever.”
Elliott borrowed a heart monitor from a friend on the football team, and measured his heart rate during an evening game of Dance Dance Revolution. He compared the results to his experience playing a “non-interactive game”: something so passive it still escapes his recollection. With perhaps the slightest degree of flippancy, Elliott presented his findings to the class. “I made a compelling argument based on the fact that my heart rate was higher playing DDR than sitting on my ass”. The “debate” was downhill from there, and Abbott watched the speakers trade personal attacks with amusement. “I think he enjoyed our creativity,” says Elliott, “even if we weren’t using the most effective debating techniques.” As students outside the class dismissed Abbott's course as an effortless waste of time, Elliott began to think that they were actually jealous.
Nelson Barre likewise heard from his friends and family that he was “lucky to be taking such an easy course”. “That’s a class?” he would sometimes be asked. “I explained it was part of being a freshman, [and] taught writing, organization, study and discussion skills”, he says, “[and] it became a slightly more respectable class.” Barre, remembers Abbott, was the kind of student who “puts things together one piece at a time, and he’s very careful. He’s very serious, I think, it’s just his personality. He looks for the big picture. He tries to make connections to things. I think he became convinced in that class that his peers, who think that games are just a silly waste of time, need to be corrected. And so he took it upon himself to do some correcting.”
***
The historical texts of ‘In The Future We Will Play’ were the early ‘80s arcade games produced by Williams Electronics like Defender and Robotron: 2084. “I bought one of those ex-arcade dual joystick things that you can plug into your computer with a USB port,” says Abbott, explaining how he solved one of the course’s several logistical problems, “and I brought this behemoth into class and we would play all those really great Defender games.” The creators of these early texts were programmers like Eugene Jarvis and Larry DeMar, who, unusually for classic authors studied in an arts course, were still alive. “They’re still kicking, but nobody pays attention to them anymore. So you get these [students] a little charged up about these games and they get competitive with them, and suddenly they’re doing the kinds of research that you would normally wait until later in a student’s career to get to. They were doing primary sources research, they were contacting these designers to see if they could get interviews. They were doing a kind of journalism that you would not expect freshmen to do. Purely because their interest was being driven forward by these games, I think.”
The concept of a college course that facilitated a discussion between sincere gamers was expanding. The professor was all too happy to talk about the latest games outside of class hours. Elliott and Barre each expressed an interest in a theatre major, and Abbott would be assigned as their academic advisor. “I became close with those guys, and this doesn’t always happen. I think part of that was because we just gamed together, you know.” The idea had not necessarily been to enjoy themselves, but it happened regardless.
While the class at times could entertain, neither the students nor the professor lost focus on the aspects of video games they considered to be meaningful. “To us,” says Elliott, “a discussion of the impact of Super Mario Bros. on the Western world was a serious discussion and not a joke.”
***
“For a freshman,” says Abbott, speaking with slight regret, “a 20-page paper is a pretty big deal.” Their term paper was a research assignment on an influential game creator who had affected the game medium and industry. If Abbott had increased the difficulty of a freshman tutorial to combat outside misinterpretation, his students nevertheless met the challenge. “I was able to tap into the students’ eagerness to play these games. What that meant for me was that I was able to get their enthusiasm level up before diving into a more rigorous assignment that I might not ordinarily have been able to assign.”
When Nelson Barre saw the subjects his fellow students were choosing, it reaffirmed his faith in the idealistic notions of the course. “[Everyone] looked at topics greater than simply video games for video games' sake.” The papers Abbott received interpreted the Metal Gear Solid designer Hideo Kojima as a deconstructionist, examined gender roles in the Final Fantasy series, and considered the silent hero as a masculine ideal.
Barre, Elliott and their thirteen peers were engaged and enthusiastic beyond the level anticipated of freshmen. Abbott attributes the result in part to the course’s unusual subject matter: “I met them at the place they lived,” he says, in reference to his choice of course theme. Whatever role video games played in the students’ motivation, however, Abbott would never disparage the hand he was unexpectedly dealt. “This was just an unbelievably bright group of guys.”
***
Abbott had tried to use video games to impart those fundamental liberal arts skills other mediums were widely considered capable of teaching, and the students came to see these connections between media. “I added a Theater major,” says Tom Elliott, “because I enjoyed his teaching and the field in which he worked, which I see as interconnected.” Elliott still doesn’t believe games are “important enough” to be studied individually, but praises the tutorial. “The basic instruction in writing that I learned in the course is still valuable to me. I think my ideas on good vs. bad games also expanded during the course. I began to see the merits of some more online and multi-player games and also learned to appreciate a compelling story in a game as much as great graphics.”
“I always felt I was learning something from Michael,” says Nelson Barre. “He had a sense about him that only comes along in a professor with whom you really connect.” Barre would sign up for more of Abbott’s courses, though none of them were about video games, and he would become increasingly interested in Abbott’s original field, the theatre. He remembers ‘In The Future We Will Play’ fondly: “Michael's class taught me to be a better writer, speaker and thinker. I first developed my writing sense in that class, and Michael's ability to inquire and help in the writing process allowed me to flourish in the class and all the others I took in undergrad. I consider it, still, my favorite course in the four years I spent at Wabash.”
***
Michael Abbott was impressed by his students' dedication, and while he wondered if he had pushed them too hard, they had responded. “[What] I’m really teaching is critical thinking, analytical reading, collaborative discussion skills, clear, concise writing, research and public presentation. If I could make measurable progress in these areas with my students, I think my college would let me teach dog grooming.”
Regardless of the positive results of his experiment, and the personal plaudits he earned from his students, Abbott was not fully satisfied. ‘In The Future We Will Play’ was too broad, he thought, and they had not been afforded the time or the luxury to cover topics in depth or linger on specific details. “We didn’t do enough reading of texts or enough close analysis to accomplish what I hoped.” He knew now that his students would have been ready for it.
“The problem, I think, with it was that it was mostly an overview course. That’s okay, but as a teacher, overview courses, survey courses, it’s hard to generate a lot of enthusiasm for that over and over.” He considered less general subjects; subjects that could sustain wider conversation and persuade the skeptical. “There’s a pretty strong resistance amongst students, even college-age students, to the idea of studying games. They just don’t quite buy it yet. At least that I see.”
Conversation was productive, especially for an art form as comparatively undeveloped as video games. “If you believe that video games, like any art, are a way of seeing, a way of seeing a world, or experiencing it…” Abbott never had a textbook because the rules were not written yet. “It’s partly why I think it’s good to have so many people thinking about them.”
The class could have been longer.
December 16, 2008
Elevator Music
If you'll allow a second trip back to the year 2007, I'd like to briefly revisit Mass Effect, a game which under the curious and shifting laws of game reviews is still eligible for the top ten lists and awards of 2008. Players will remember Mass Effect's scandalously long and unskippable elevator rides, which were used to disguise loading screens. Interestingly, during these odysseys, Mass Effect felt obligated to pipe in music, idle conversation and news bulletins. Mass Effect, a work of entertainment, provides supplementary amusements to relieve the player from the monotony it volunteered for. The game becomes an airline dutifully screening P.S. I Love You on a ten-hour flight, except there's no reason to go through the motions of the plane ride in the first place. It prompts the question of what exactly was ever so bad about loading screens.
It's an odd decision to proactively implement the inconveniences of reality when it means producing the verisimilitude of boredom. This kind of realism was never meant for gameplay or dramatic effect but to craft the most immersive, cinematic 3D experience ever devised. In doing so, developers have seemingly become averse to anything that resembles a video game. Loading screens don't cut it, then, and so Mass Effect instead prescribes a deathly dull "real life" solution and an accompanying mea culpa to excuse its dreariness.
Sony's HOME went live this week (though "live" may be a poor choice of words.) This virtual reality networking extravaganza champions the virtues of queuing and patience but without even Mass Effect's perfunctory distractions. It exists for those who'd rather walk an avatar across town to catch a glimpse of the Young Vampires in Love trailer than click a button. Games simulate inconvenience for the illusion of reality and, instead of reconsidering the whole concept, pile on extra entertainments so the player can endure it. What price artifice?
December 10, 2008
Never Break The Chain
I don't have a new Hit Self-Destruct post for you this week. To make up for it, I have unearthed a classic piece from the vault (there's only one thing in the vault.) This is an old article I wrote about Bioshock that was never published on this site. It dates back to either October or November of 2007, which means it's a fairly immediate emotional reaction to the game but not exactly timeless. Who even remembers Bioshock these days. You could substitute the word "Bioshock" with something else that has disappointed you recently, like Mirror's Edge, Little Big Planet, Far Cry 2, or a colleague, spouse or child. I dropped in some new pictures to subtly modernise it. This won't be a regular feature, incidentally; I'm uncomfortable reading any of my writing that's more than twenty minutes old. I accidentally looked at the top of this paragraph and shrieked.
Bioshock: the Game that Wasn't
Here's an impression of a preview writer circa June 2007: When I look at BioShock I see a great game. Ken Levine and his team are about to unleash a revolution that will change the way we think about video games. Is it too soon to call BioShock an unqualified masterpiece? We hope not. Game of the Year? Try Game of the Century. Prepare to be Shocked. We noticed some balancing and FOV issues but we're confident that these will be ironed out before release.
Uncanny. The thing is, I'm actually being a little bit serious. I really do look at BioShock and see a great game. Caveat: the game in question isn't actually there. I'm talking about the game that's buried somewhere in the design documents in Ken Levine's office, and that the version I played only hints at.
By "better game" I don't mean one where 2K "fixed" the vita-chambers or the hacking minigame or made it more like System Shock 2 in every way. I mean a BioShock that's as good as it deserved to be, if only it hadn't fell victim to a Xenian flame-out in the final third. Much like how A.I. could have been a good movie if only it had ended right at that moment. There's no similar consensus as to how BioShock should have ended, only that it shouldn't have been that.
Let's start at the beginning, by which I mean the ending. On a strictly narrative level, BioShock's ending is underwhelming at best. It inelegantly redefines what the story was all about: it was about an underwater dystopia, objectivism versus nihilism, rationality and free will, right? Well, the ending's not. Somewhere in that final third, this game about high-minded philosophy and critical metafiction becomes a game about wide-eyed little girls and uncomplicated megalomania. From Rapture to crapture.
Make no mistake, it's absolutely important that BioShock conclude the story of the Little Sisters and — to a lesser extent — the player character, Jack. It does both of these well (in the "good" ending, at least.) Neither of these things is as relevant as what BioShock omits — and I know they're relevant because the game spent over half its length convincing me of their worth. The haunting and arguably game-defining moment of meta-game commentary: ignored completely. Any kind of a synthesis or resolution to the objectivism/nihilism debate: sorely absent, and the dynamic is reduced to simple martyrs and villains.
This is unusually academic territory. For a video game, certainly, this is Foucault. Even though BioShock proved itself fairly capable of handling this material, we're admittedly dealing with the more esoteric (though still glaring) of the game's neglects. And yet BioShock disregards the completely literal element of Rapture. How can there not be any resolution on this magnificently ostentatious experiment which is both the premise and the entire setting of the game? The ending avoids the topic, not in the form of a cliffhanger, but rather it briefly confirms your personal altruism or sociopathy and throws you back to the main menu. Apparently, it wasn't that subject matter suddenly became too difficult, it's that the whole game inexplicably went off the rails.
I've glossed over a rather important point. The Andrew Ryan scene is one of the best and most worthwhile statements ever made in a video game about video games. Sure, there's hardly any competition for the title, and the statement in question is not particularly complex, but that's no excuse for not saying it. If anything, BioShock raised the bar for other developers in that respect. What are they going to do with the point BioShock just made? Interesting question, but here's a better one: what's BioShock going to do with the point BioShock just made? Well, hmm.
After that scene the whole dialectic about slaves and free will vanishes from the game. Fair enough, one could argue: the game already made its point reasonably well, and you're still playing a video game. Within the context of the story, though, Jack is a "slave" because he's being mind-controlled. The story's not about that, of course, that's just how it legitimises the abstract point about player agency: that in video games your choices are so limited you might as well be mind-controlled. There'd be less cause to complain if Jack remained under Fontaine's control the whole game, but he doesn't. He breaks free soon enough and the rest of the game plays out exactly the same. The player takes orders from Tenenbaum, not Atlas, and that's the extent of the differences. Again: could be fine, if BioShock hadn't abandoned its excuse for being so constricted, and thus hadn't become exactly what it was criticising. Every thought, every idea that BioShock compels with that Ryan scene is soon forgotten, and that wonderful moment is marginalised instead of assuming its rightful place as the cornerstone of a better game, and a better story: BioShock's tantalisingly close to fully realising its idea about being held hostage to narrative — less Truman Show and more Sophie's World. Essentially, BioShock just made a great, incendiary point about video games. Now, what's the game going to do with it?
It's not going to do anything with it.
BioShock tells you something incredibly exciting and then refuses to discuss it. What we've just seen, says BioShock, was a video game. That goes for gaming across the board. Artificial. Restricted. The implication is, now that we're fully cognisant of our limitations and have the means to remove them, we're about to see real life. It turns out reality is a lot like a video game.
It could have even been the illusion of choice; anything that wasn't the exact same thing you've been showing us and that you just damned. It's excruciating that a game this intelligent and talented drops the ball so badly. The most important choice that's actually in the game is even taken away from you: the choice between the "good" and "bad" endings. Instead, BioShock presumes to know the player's character, and this seems very much the wrong game to assume that. It could have made sense if the ending depended on the Little Sisters' actions, based on your behaviour towards them throughout, but that's not the case. That's not the BioShock we have.
Perhaps I'm being naĂŻve. After all, who hasn't, even once, bought into marketing over-hype and subsequently been disappointed with an above-average product? BioShock isn't Fable, though — it was over-hyped, sure, but my disappointment is in response to the quality of the game itself. That Ryan scene says "we're shooting for the stars," and I believe it. BioShock is better than this. BioShock is capable of more than this.
I believe that if any game was going to show us the video game version of "reality" (as opposed to the video game version of "video game" — I know this is confusing) it would have been BioShock. After the Ryan scene, that's when we should have had our revolution. That's when it should have changed the way we think about video games. You should have been showing us free will, self-determination, autonomy, as if it's all new to us. Maybe that's too high an expectation — but at the very least, that's when it should have become a better game.
The sad reality is what I'm proposing — and what BioShock itself proposed, frankly — is somewhat impractical. A lot of games flop in their final hours. Just as many stay consistent, and a select few save some of their best stuff for the endgame. I can't think of any game that completely reinvents itself at the two-thirds mark and eclipses everything that gaming has shown you to date. The end of a game is the area worst affected by crunch time, that's where the most debilitating and noticeable cuts are made. And nobody wants to hear that the game will really get good, honestly it will, once you've sunk ten hours into it. BioShock's introduction is probably the game at its most impressive, because once you've hooked the player it really doesn't matter if they see the ending. They've bought the game. BioShock painted itself into a corner — the only way to satisfy its promises is to deliver what in conventional game development simply cannot be delivered.
We can't blame it all on current marketplace realities. The fact of it is BioShock didn't even try and fail. Worse than that, it just didn't think it through. BioShock had a very clever idea but didn't know what to do with it. All dressed up and nowhere to go. Or rather, all dressed up and then goes home and won't answer your calls. Post-Ryan, we wonder what could possibly happen next. BioShock's wondering the same thing. Perhaps BioShock drops the subject so abruptly because it literally doesn't have an idea how to end that game; the game we saw in the Ryan scene and that we thought we were playing. Fortunately for BioShock, it does have an idea on hand for an ending to a far less compelling game. Levine's mentioned how late in the process the story came together. It's sadly self-evident.
BioShock could have done it, I really believe that. If only because of its single-minded determination in getting the player to that Andrew Ryan scene and then executing it so well. What we're left with is the BioShock that is and not the BioShock that isn't there. And that's the really special one. Try game of the century.
December 4, 2008
Debate Class
[Official transcript of the West Hope Middle School debate club meeting, 3/3/2006]
DYLAN: "We are going to argue today that 'video game', two words, is the correct spelling, and 'videogame', one word, is wrong. For our first argument, we note that the Oxford English Dictionary defines 'video game' as two words, and so the dictionary agrees that we are right. Because we use the dictionary to spell things correctly, we think this is definitely the right way to spell 'video game'. Thank you."
MADISON: "Good afternoon everyone! Thank you in advance for listening to us. As our opening statement, I would like to point out that many popular videogame writers spell it as one word. For example, Kieron Gillen spells it like that. Kieron Gillen, for those of you who don't know, invented New Games Journalism, which basically is a better way of writing about videogames. Also, it's 'new'. And because Mr. Gillen knows what is new and better, we trust him about the best way to spell 'videogame'. Our opposition can hold on to their dictionary, we want to know why they don't want to hold on to the future."
ETHAN: "Now I am going to rebut you. Ol' Kieron's so-called New Games Journalism manifesto tells game journalists that essentially they can write anything they want and it doesn't matter. They can say crazy things that have nothing to do with games, or use reviews as their personal diary or something stupid like that, and it's all okay according to Kieron Gillen! So we think people should keep on spelling 'video game' as two words, like it always was, to uphold order and stop anarchy."
ASHLEY: "Another thing we would like to point out is -- "
DYLAN: "Madison! Can't you see this is tearing our relationship apart -- "
[Transcript ends.]
DYLAN: "We are going to argue today that 'video game', two words, is the correct spelling, and 'videogame', one word, is wrong. For our first argument, we note that the Oxford English Dictionary defines 'video game' as two words, and so the dictionary agrees that we are right. Because we use the dictionary to spell things correctly, we think this is definitely the right way to spell 'video game'. Thank you."
MADISON: "Good afternoon everyone! Thank you in advance for listening to us. As our opening statement, I would like to point out that many popular videogame writers spell it as one word. For example, Kieron Gillen spells it like that. Kieron Gillen, for those of you who don't know, invented New Games Journalism, which basically is a better way of writing about videogames. Also, it's 'new'. And because Mr. Gillen knows what is new and better, we trust him about the best way to spell 'videogame'. Our opposition can hold on to their dictionary, we want to know why they don't want to hold on to the future."
ETHAN: "Now I am going to rebut you. Ol' Kieron's so-called New Games Journalism manifesto tells game journalists that essentially they can write anything they want and it doesn't matter. They can say crazy things that have nothing to do with games, or use reviews as their personal diary or something stupid like that, and it's all okay according to Kieron Gillen! So we think people should keep on spelling 'video game' as two words, like it always was, to uphold order and stop anarchy."
ASHLEY: "Another thing we would like to point out is -- "
DYLAN: "Madison! Can't you see this is tearing our relationship apart -- "
[Transcript ends.]
December 1, 2008
Friends Like These
Comparisons between Fallout 3 and Fable 2 come easily. Both games are high-profile, open world RPGs with heavy emphasis on character customization and moral dilemmas; also, their names begin with the letter 'F' and end with a number, and they came out at the same time. Michael Abbott will note that, in contrast with Fable 2's deliberately authored story, supported by memorably written and voiced characters, the NPCs of Fallout 3 appear lacking in interactive and emotional depth.
I think that the characters work well enough for the specific purposes of Fallout 3. They may never fully endear themselves to the player, and they might not be cool enough to cut it as someone's cellphone wallpaper, but they're sufficient to populate the world believably and in this game that's what matters. If there's any substantive dialogue or conversation going on in Fallout 3 it's not between the doctors, the raiders or the paladins; it's between the player and the world. Fallout 3 is a game in the tradition of Half-Life, BioShock and Myst rather than Knights of the Old Republic or Psychonauts. Your avatar is an excuse to explore a place, and you discern the history of this nuclear war through your own exploration of geography and architecture. The characters exist to show you what's happened to humanity; the world isn't there as a backdrop for their personal dramas. The main character in Fallout 3 is Washington, D.C.
Certainly, Bethesda could have extended their NPCs a couple extra layers of interaction deeper or infused their dialogue and facial animation with greater expression. As Steve Gaynor points out here, though, choosing to maintain the plausibility of environment over character requires some sacrifice of the latter. The NPCs can't exist on the higher plane of complex interaction to which Bethesda took the wasteland. If Bethesda thought they could do everything, if they'd had NPCs running all over the place trying to tell you about their personal stories at any conceivable location in this infinitely variable sandbox, then they'd have the emergent disasters of GTA IV: stealing cars and running over old people before the eyes of your oblivious date. The NPCs are limited, deliberately, because Bethesda know their limitations. While the Fallout 3 characters aren't frozen in place like they were in Morrowind, or like they are in Mass Effect, they're clearly restricted in their movements. They are confined within a radius of loci points like goats tethered to a post, to use a mathematical concept I learned when I was eleven and never thought about since until this paragraph. It's a design decision Bethesda adhere to, except when they don't. Because, sometimes, they won't.
Occasionally they'll violate their own rules and gift a certain subset of NPC -- the permanent companion -- with greater autonomy, which ends up compromising both the character and the world. Fallout 3's a personal experience, gradually revealing the world to the player, whose avatar never gets in the way. It's a compact between two parties which was never built to accommodate a third character.
Of all the potential sidekicks who can get into the passenger seat, none of them will respond to any of the emergent events, the horrifying discoveries or the plot turns which punctuate this game. Given how exciting Fallout 3 often is, it's deflating, with your adrenaline at such highs, to have your emotions tempered by the constant presence of this unresponsive cipher who isn't interested in what's going on. The game doesn't incorporate a third character into your conversation, it ignores them and so they feel false. Their presence is an immersion problem.
The companions are out of sync with the rest of the game. Aside from fights, they won't react to any external factors, and neither the story nor any other character will ever acknowledge their existence. It's as if you have an imaginary friend, except you're all too aware of the emotional dead weight you're carrying around. They're accessories, effective only as mobile turrets and dress-up dolls.
Any NPC is believable to a point, and as soon as their scripted routines are disrupted all the flaws become quickly apparent. Bethesda largely prevents that from ever occurring, except, inexplicably, in this case. It's a technical issue. The companions can't be programmed to exhibit a convincing array of responses to all the emergent possibilities conceivably generated in an open-world playground. Games aren't able to simulate human behaviour at the level which Fallout 3 requires to be consistently credible.
What level are they at? The dog. Fable 2 knew this as well. Behaviourally, Fallout 3's Dogmeat is as sophisticated as all the other companions, but nobody would ever think to involve a dog in the plot or ask him for advice. He runs around, bites monsters, barks, and you can tell him good boy and send him to fetch things. He acts like a dog where none of the other companions act sufficiently like people. What's more, nothing in Fallout 3 provokes a similar reaction to seeing a super mutant who's beating Dogmeat with a sledgehammer. No other character makes you drop everything and tear that mutant apart. It's at once charming and surprising how instinctively the words get away from my dog will come to you. That's the goal. On the immersion meter, we're at "dog". Getting to "human" is a process.
Until then there's something special, sadly, to be said for solitude; also, dogs.
November 28, 2008
Thirteen
It's difficult to remember a time when I ever thought I would like the Witcher. I might have hoped once that the game's Polish and allegedly literary origins would result in a new and lively take on a very specific, very conservative sub-genre of diminishing relevance: the PC fantasy RPG. Instead, you play as an amnesiac, part of an elite order of monster slayers who wage war in dungeons and taverns against mysterious and evil wizards trying to take over the world, and it turns out the creative forces behind the Witcher played D&D when they were kids like everyone else in the game industry.
If I'd played this game when I was a kid I can imagine liking it. Medieval fantasy was so much more palatable back then, and the frequent cameos by forbidden profanity and women in undress probably would have sealed the deal. But the real reason I'm so currently unimpressed with the Witcher is because I don't have the patience for this kind of thing anymore.
The Witcher is extensive in the worst way. It's a long and repetitive thing, with lots of sidequests, lore text, inventory management and vaguely-interactive scenes of two people standing and exchanging exposition. I have found as I get older that I have less time for such an uneconomical and encumbered method of experiencing content. Being able to afford other games means I usually can't afford to immerse myself so completely in something as unjustifiably long-winded as this. It doesn't do well in comparisons, either: whatever pathos the Witcher can wrest from the story of a family tragedy as recited by stiffly-animated characters in cutscenes and dialogue trees is easily topped by a one-room tableau of creatively arranged art assets and 15 seconds of audio in Fallout 3.
There's something very archaic about the Witcher, especially in contrast with Fallout, which rewards exploration by scattering its best moments way off the critical path. With the Witcher, you learn very soon what to expect. Quests, whether side or main, cover extremely similar ground and being told to go kill ten monsters will never be an inspiring objective regardless of the fictional stakes. Witcher players are forever compiling one amorphous to-do list rather than exploring interesting diversions from their urgent and vital mission. Unless the player is a compulsive fixer -- admittedly a core RPG demographic -- the only reason to endure all these small variations on the main quest is to improve their character's stats and make the progression through all the mandatory content slightly easier. (Who hasn't always wanted to role play that emotion.) Either that, or because they want to get a look at a playing-card-sized painting of a naked renaissance fair barmaid while hoping their wife doesn't walk in the room.
The Witcher is either so enamoured of its central gameplay -- or so unimaginative -- that faced with the design challenge of extending the player's experience beyond the main quest, it just gives them as much of the same thing as possible. The developers were too concerned, perhaps, with some ill-conceived minimum length requirement and padded the game out in the easiest way. It stands to bizarre reason that if the player enjoys the basic dungeon crawling and escort missions then they should enjoy doing those things over and over again with less reward. I would have been fine with this if I was younger and if the game was all I had, but even back then there were alternatives to designing side content.
There's a line that can be traced from Monkey Island through Anachronox to Yakuza 2, and their side quests which existed as accessible and surprising alternatives to the main game. You could always entertain yourself with absurd dialogue options and non-essential content, and, when you were stuck, the game would endear itself to you again. They required little investment and paid off almost immediately. From the melodramatic estrangement between a son and his father, the burgeoning career of a street rapper, and a litany of pick-up lines tested on unimpressed women, to the environmental vignettes of Fallout 3, they were slight distractions, quickly resolved and comparatively sophisticated in their brevity.
Whereas the Witcher, like every game of its type, casts you in the role of the hero unlucky enough to walk into town on the day that everyone needed their problems solved and it took exactly the same thing to solve each one. It's too familiar. If there's an NPC with a unique name, I know, then it follows that there's an heirloom he lost in the swamps and I'll need to clear an hour from my schedule to venture through some caves fighting off packs of wolves and then one big wolf. I'm not thirteen anymore, and I've done this before.
At thirteen I would have exhausted the Witcher. I can decide now, however, faster than I could then, if I'm going to like something. It's a snap judgment based on years of experience and learned design preferences that tells me I shouldn't waste time screwing around in the Witcher or anything like it. It's instinct which comes with age, although the downside, it seems, is that there's now less out there for me to like.
November 25, 2008
Extreme
The facts: if you register on Ubisoft's website, you can download a skin for the new Prince of Persia game that lets you play as the character from Assassin's Creed.
The spin: Ben Mattes, producer, via Eurogamer: "Prince of Persia is an incredible experience. We're thrilled to give our loyal fans another way to journey through it. This exclusive reward is a 'thank you' to our fans, who can easily unlock it when they link their Ubisoft.com accounts to their Xbox 360 or PlayStation 3 gamer identities."
Sometimes you'll read a bizarre regional news story, about a gazelle running for local office or something, and think, haha, only in that country! The quote above is an Only In The Games Industry.
We have become so casually effusive that when we talk about reliving the incredible Prince of Persia experience through an extreme and radical prism, what we're really saying is that you're basically going to be sent some cardboard 3D glasses from a children's colouring book. If you sign up for a mailing list.
What a depressingly disingenuous quote, more so for having made it as a headline on any website in the world. Either we are so self-absorbed and self-important that we really do obsess over these meaningless things, or Ubisoft are so cynical that announcing trivialities with righteous insincerity is their default setting. When did expectations get so high that they can't just say that this is a cool, if slight, bonus which players might enjoy? What can possibly be the point?
I picture a bored copywriter in a Montreal office forced to gush soullessly over an unlockable character skin and wondering if anything even matters anymore.
November 21, 2008
Video Dames
[Written by Duncan Fyfe and Alex Ashby.]
Late on a Friday night, a bleary-eyed 26-year-old woman named Bridget slumped over the couch in the living room of the apartment she shared with her law student friend. Lying on her chest, she arranged the empty beer bottles standing on the carpet into a tidy circle. The My Bloody Valentine album Loveless murmured out of the stereo; the CD was possibly on repeat, you couldn't really tell.
Michelle swayed out of the bathroom holding a wine glass and slouched against the door frame. "Oh my God, listen to this," she said, articulating wildly with her free hand. "I had the best idea. We should start a website. Where we talk about games. We'll write it about girls who play games. It can be written by girl gamers for girl gamers. You know what we'll call it? You know what? Video Dames. Video Dames dot com."
Bridget rolled onto her back. "That's a funny name," she said.
Smiling broadly, Michelle continued, "We would be the video dames. This is awesome. We should really do this."
With a fingernail, Bridget flicked a bottle to the floor. "What would we, like, write about though? What kind of articles? Did you come up with anything else besides the name?"
Michelle lingered in the doorway, tilting her head from side to side in vacillation.
"I don't want to write a website," Bridget decided, "it'd be really hard. Having to update it every day? Fuck that, I don't want to do that, it'll take up too much time."
Staring for a moment, Michelle whispered "okay" and stiffly sat down by the arm of the other couch. Bridget stumbled to her feet and, bent over the kitchen counter, cracked open another beer. "Do you want to go to that movie tomorrow?" she asked.
Michelle took a careful sip of the wine. "No."
November 20, 2008
The Neutral
Every week, the reporter Joe Klein writes a column for Time magazine. In October he opined, favourably, on Barack Obama's performance in the second presidential debate. A letter to the editor took exception to the editorial:
We read Joe Klein's "The Obama Surge" in my English class [Oct. 20]. We had heard about Klein's bias towards the Democrats, but this column took it too far. There was not a single complimentary remark about McCain or a single negative one about Obama. Klein also noted that McCain seems awkward because of his physical impairments. This was insulting and, I believe, irrelevant to voters. McCain has sacrificed far more for his country than Klein ever will.
I admire the author for finding the time to write this letter between complaining about video game reviews on the internet.
Imagine an opinion piece which discusses a recent game in terms that are uniformly positive. The article focuses on one abstract element of the game, setting or atmosphere or art design or something. Whatever, it's a theme that really resonated with the author, who then wants to explore it in detail. While liberal in its praise, the piece is not exhaustive, and very intentionally doesn't mention any of the game's well-known flaws -- things like crashes, framerate issues and AI problems. Taking these alarming omissions into account, is this opinion piece ethically suspicious or merely irresponsible?
Apparently, if you present your evaluative thoughts on a game in any format more sophisticated than "played 3 hours of mirror's edge last night... like the music... combat sucks... more soon, xoxo", some people are going to equate it to a review. They view the article through the conventions of a review and the bizarre standards of objectivity, impartiality and fairness upheld by a constituency of impotent watchdogs. Where the original article was never supposed to be definitive, now people read it as if it has pretensions towards being the final word on the matter. The expectation of a review is that it should cover all the good and bad points about a game, presumably in an objective, expedient and unpretentious style that educates the reader on whether or not to buy it. It should assess all the major areas: graphics, sound, story, fun, replayability. It should note all the bugs, loading times and sub-par animations. Even though those qualities are pretty irrelevant to your thesis if you want to write about the game from any perspective other than usability or hardware, the article described above still transforms from an account of what you liked about the game into you very conspicuously leaving out everything bad about it. If everyone knows the game has a big crashing problem and that's not mentioned in the "review"? The publisher must have sent a whole truck of cocaine and hookers to explain that travesty of justice.
What about the piece has really been invalidated, though? Its value as a consumer report? Maybe so, if the problem in question was particularly egregious, but we're not even talking about actual reviews. Is product detail the only thing gamers look for in any kind of critique? Are there really people who think the only possible purpose of criticism is to better inform the customer or the voter? I didn't think I wrote reviews anymore, but apparently I'm still writing someone else's shopping list. I don't understand exactly how appending some conventional wisdom about crashes to your personal take on a game is supposed to be helpful. Given the overwhelming tone of the piece, whose mind is that going to change anyway?
Maybe all these reviews and articles really should be written in the aggressively neutral, zero-sum tone of a Wikipedia page's "critical response" section, something onto which anyone can project their preferences. Ideally, though, you want to write about games in interesting ways that engage readers regardless of whether they like the game, have played it, or have any interest in it. The writing should transcend basic responses to specific and technical points.
To offer a dissenting opinion, I don't believe anything I just wrote.