As part of my work on the lineages of player practices, I’m beavering away on a five-part serial looking at game inventories. It was originally going to be just one post entitled The Joy of Sets, but it has predictably spun out of control and has turned into a bigger project. I will be taking pairs of games and looking at the lineage connections between them, which is not simply a matter of saying what influenced what directly. For instance, Notch never played Dungeon Master to my knowledge – but the design of Minecraft inherits almost the entirety of its inventory practices from it. This will be my big serial for this year, and I hope to kick it off soon. Stay tuned!
In some sense, ignorance might be an appropriate word for what I’m advocating: for creators to intentionally ignore with greater diligence the pressures to be similar, to follow fashion or money or power, pressures to use objective, scientific methods of art production. And similarly, I think part of what I’m advocating for could be called dogmatism: for creators to hold firm in their values and goals in order to create works that are more distinct, more filled with themselves, more honest and interesting and worth talking about.
Please rush over to his blog to read the entirety of The Ignorant Dogmatist right away!
The original firestarter makes one of its targets the kind of self-focussed indie game design method Kevin defends here. Yet I cannot do anything but respect Kevin’s commitment to exploring his own creative vision in games. For me, what Kevin is doing is making what I call artgames, and the moment you’re committed to art you are no longer practicing a commercial craft. You’ve gone down a marvellous rabbit hole, one where money may be tight but where worthwhile things get made. Almost everything I’ve thought worthwhile in games in the past five years has been an artgame… This is largely what I choose to play these days.
Why sell out artists in The Craft of Game Design Cannot Be Measured By Any Metric, then? When I look at Kevin’s output, which includes Eidolon and The Absence of Is, I see someone pursuing their vision for its own sake, which is the mark of an artist – a way of life I greatly respect, not least because it now feels closed to me. But when I look at the indie market, I see people pursuing a similar kind of self-focussed process and making yet more-of-the-same violent, repetitive ordinariness. Such indies are, I presume, trying to make a living – and they’re doing it badly. It was these indies I wanted to lambast.
If my piece in any way discourages someone from accepting the role of the starving artist, with all that entails, I apologise unreservedly. Art is one of the greatest ways to add value to life beyond money. But most indies aren’t making art. They’re masturbating into a codebase and thinking they’ll hit big doing so. Maybe I should respect that as a kind of art, but I just see it as bad commercial practice.
With my philosopher-hat on (I wear many, conflicting hats), I can only smile with an inner warmth at this line:
I think that often, the non-mechanical components of a game are more important than the mechanical ones, and so I tend to work on visuals and writing at least as early as mechanics.
I wrote Imaginary Games in part to defend this philosophy, and next week I’ll present to a hundred game academics about how games are more than their merely artefactual machinery. Kevin describes himself as willingly ‘ignorant’… his ignorance, though, is closer to the kind praised in Jacques Rancière’s The Ignorant Schoolmaster – it is a freedom from stultifying conformity. I could never oppose this, especially not when it is done in the pursuit of art. Everyone must discover who they are, sometimes over and over again… and never let someone like me tell you otherwise.
In a comment replying to The Craft of Game Design Cannot Be Measured By Any Metric, game designer and Chief Creative Officer at Spry Fox Dan Cook gave such a sterling, thorough rebuttal that I’ve reposted it here in full.
When I design I have a mental model of how I imagine my game will be played by players. This includes predictions about player emotions, learning, buying behaviors and a dozen other factors necessary to make a self-sustaining game in one of today’s various markets. I also make predictions about how markets will act. Platform desires, player desires, press desires.
Then we build the game, or at least we build an initial version of it.
Then we playtest the game to see if my predictions worked out. Most of the time they don’t. In the best cases I’m only off by a factor or two. In the worst cases I’m off by several orders of magnitude. However, I may also find that players behaved in a manner that was actually more interesting than I predicted.
So we build another iteration of the game. Somehow, we need to connect the empirical reality of what the playtest suggests with what we predict will happen. This usually involves updating our models, sometimes radically. Often incrementally.
For some designers, this process can be frustrating. The reality of player behavior imposes constraints on their mostly imaginary vision. But I tend to see constraints as necessary to the process of design. And constraints based off observing real people playing the game tend, more often than not, to yield opportunities to impact the real shared world of many people rather than the isolated imaginary world of a single person. We find new ways of playing that are more vibrant and interesting.
How are metrics useful when iterated on a game?
Game designers are information starved. With writing, we have an imperfect but competent mechanism for imagining how someone might feel reading a bit of text. In order to write, you must read. And thus you are forced to process a work in a somewhat similar fashion to how a potential reader might process it. Game developers do not have this luxury. We build systems multiple times removed from a player’s experience. Write some code. Do a dozen other steps. Build an executable that someone somewhere runs. Knowing how people will react to what we make is hard.
So we use crutches. We create complex models of how players think. We use ‘proven’ patterns. We watch players and try to imagine what they are feeling. Then we try to backtrack from all that far-removed information to whether or not a number in the bowels of a broken machine should be 2 or 4.
There are certainly classes of information we can extract more easily. Surface player emotions on individual playthroughs. Awesome. We can do that. But human behavior is broad. We see the need to sample behaviors across populations and discover central tendencies or outliers.
So metrics or analytics are that tool. They let us understand statistical patterns of behavior. Do they let us see inside the minds of our players? No. Nothing does yet. Do they replace in person playtests? No. Smart designers use multiple sources of insight.
But metrics do provide an amazing range of insight by allowing us to look at hard problems from a different direction. If players in an MMO are flooding forums with complaints about a change, how many people are impacted? How did playstyles change?
When balancing economies and progression systems, metrics are essential. You can’t do an in-person playtest of someone playing a game for 90 days. The old tools don’t work. And various forms of data collection do.
Maybe all this doesn’t need to be said. Maybe you are worried about something else entirely.
Are you worried about how metrics shines a light on bullshit design? Because a lot of design is unsubstantiated bullshit. We imagine people will play a game a certain way and then they don’t. Such an ego buster. Metrics beat us with bully numbers. They bluntly state our initial idea was flawed. Or even worse, the thing that people have been praising us for years doesn’t actually apply to anyone but some weird elite group of outliers that happens to give out chintzy feel good awards. Reality can be cruel when you live in a fantasy. But it also acts as a constraint that forces us to up our game and make something that works. Versus wandering blindly off a cliff in a feel good haze. Which I’ve done. (Lovely until you fall).
Are you worried that Bad Men use metrics in a reductive fashion to emphasize making money over art? Bad Men have been emphasizing making money over art for a very long time. For any golden era of games there were penny pinchers micromanaging creative decisions at a level that destroyed souls. Might I suggest that a new tool for getting data is not the actual problem. The team sets their goals. The tools just get them there.
Are you worried that we are using Dumb Metrics? That the dumb patterns dumbly followed by dumb practitioners result in dumb ideas and dumb games? Well it is true. And the solution is one that applies to all complex instruments used in the pursuit of art and beauty: Get Good.
I actually see metrics, competent design and building something positive that meets player needs as three complementary pursuits. I’ve asked “Well, what do players want and how does that align with business? And how does that align with art or craft?”
Here’s one answer. Many players want connection with meaning and community. They want mastery and agency. This leads to them enjoying an activity for a long period of time. That results in great retention metrics. And when deep needs are being met, people are willing to spend. Will I spend a buck on Pokemon lures to enhance a relaxing afternoon with my wife at the coffee shop? Yes. It makes for joyful light conversation. The game improves our relationship by creating a shared playful space.
Metrics track and tune all this. Is that evil? Just the opposite. I consider it doing great good for the world through competent design practices.
I have made minor edits to the text to make it read as a standalone post: the original comment is still available under the original post.
I dislike the absolutist nature of the argument, and prefer the more nuanced version. As a creative person, I still like things like food, a roof, and perhaps air conditioning when the temperature and humidity get high outside. But, I think it is important to realize that there is a decision to be made. One can choose to pour pure creative energy into creating experiences on one extreme, pander to tastes and maximize for profit on the other, and there is a lot of room between the two extremes. And, as much as we might lionize the indie iconoclasts, the reality is that sometimes it takes a lot of work and understanding what people actually want to survive as an indie.
The argument Brian refers to here is art vs. commerce. Personally, I don’t accept a significant divide between art and commerce here… the vast majority of art is commercial in the sense that this term is used today: music recordings and performances are sold, paintings are auctioned, theatre and cinemas charge an entry fee. Knowing that games are artworks doesn’t mean the people who make them don’t deserve to be fed. I absolutely agree with Brian that game developers are no different in this regard: part of my argument in The Craft of Game Design Cannot Be Measured By Any Metric is precisely that indies, in rejecting commercial design considerations, are gambling on their livelihood.
So I accept Brian’s point that metrics can be used responsibly, at least in principle. My argument is only that there is a tension between the craft of game design, and engineering systems for commercial exploitation. Developers who can use metrics to assist their game design practices ought to make clear how this can be achieved without it becoming exploitative. I welcome the discussion here – it is this discourse that I feel is substantially missing.
If game design is a craft, what becomes of it when game development is driven solely by financial metrics? Does any of the craft remain? Or are games reduced to mere commercial pit traps, luring in and monetising their unwitting victims?
A little over a decade ago, when my friend and colleague Richard Boon and I were writing 21st Century Game Design, I had predicted that this century in games was going to be characterised by a new focus upon understanding players, and that this would be attained by various models of player behaviour. I suggest (with the benefit of hindsight) that this general claim was correct, and that we have gone from an era where game design was dominated by dogmatic assumptions and self-satisfying design practices (although neither of these have gone away…) to one where understanding how players relate to games is an inescapable part of the videogame industry.
But we made one crucial error in that book. My assumption had been that modelling player behaviour entailed understanding how to satisfy play needs, which is to say, having a positive, inclusive, moral and practical relationship with players. But the dominant forms of player modelling right now have absolutely no need to understand how to satisfy players in any form, because the principal form of model we are using right now is analytic metrics – and these metrics are blind to any aspect of the mental states of the player whatsoever. If our image of game design in the 21st century was that the industry was going to be making money by creating games that deeply satisfied their players, what we are actually facing now is an industry that makes the majority of its money by simply analysing where the leaks are in their cashflow, and acting as digital predators to suck spare change out of players’ digital wallets.
It may be helpful to look at the key metrics in use today to verify what I’m claiming. Firstly, there are the measures of activity – Daily Active Users (DAUs), Sessions, Stickiness (DAU/MAU), Retention and its inverse, Churn. Then, the measures of monetisation – Conversion Rate (percentage of players making purchases), ARPDAU and ARPPU (Average Revenue Per Daily Active User, or Per Paying User). Also, game economy measures for Sources, Sinks, and the Flow Rate of in-game currencies, all geared towards engineering sufficient scarcity that players will be encouraged to pay money for advantages. And that’s what it’s all about: squeezing money out of players' impulses – although in analytics, there are no players, only users, just like the narcotics industry. As the company Game Analytics observe with the admirable unvarnished honesty that belongs to these thoroughly pragmatic commercial practices:
Successful free-to-play games create long-term relationships with users. Users that enjoy the experience enough are willing to pay for a competitive advantage. A game needs to have strong retention to have time to build this relationship. (Emphasis added.)
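To make these definitions concrete, here is a minimal sketch of how the activity and monetisation metrics listed above are typically computed from a session log. The data, field layout, and function name are all hypothetical illustrations, not any particular analytics product’s API:

```python
from datetime import date

# Hypothetical session log: (user_id, day, revenue_that_day) tuples.
sessions = [
    ("alice", date(2016, 3, 1), 0.0),
    ("alice", date(2016, 3, 2), 0.99),
    ("bob",   date(2016, 3, 1), 0.0),
    ("carol", date(2016, 3, 2), 4.99),
    ("dave",  date(2016, 3, 15), 0.0),
]

def metrics_for_day(sessions, day):
    todays = [s for s in sessions if s[1] == day]
    dau = len({u for u, _, _ in todays})
    # MAU: distinct users active at any point in the same calendar month.
    mau = len({u for u, d, _ in sessions
               if d.year == day.year and d.month == day.month})
    stickiness = dau / mau if mau else 0.0            # DAU/MAU
    payers = {u for u, _, r in todays if r > 0}
    revenue = sum(r for _, _, r in todays)
    conversion = len(payers) / dau if dau else 0.0    # share of actives who paid
    arpdau = revenue / dau if dau else 0.0            # revenue per daily active
    arppu = revenue / len(payers) if payers else 0.0  # revenue per paying user
    return dau, mau, stickiness, conversion, arpdau, arppu

dau, mau, stickiness, conversion, arpdau, arppu = (
    metrics_for_day(sessions, date(2016, 3, 2)))
```

Note that every one of these numbers can be computed without knowing anything about what the player felt or wanted – which is precisely the blindness at issue here. Retention and Churn follow the same pattern, comparing the set of users active in one period against an earlier one.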
One of the most coherent supporters of the free-to-play business model where such metrics dominate is Nicholas Lovell, author of The Curve and regular on the same speaking circuit as me. We first met at Develop Liverpool, many years back, and our paths still occasionally cross. He views the challenges of that side of the market as not so much about monetisation (he rankles at being called a ‘monetisation consultant’) as about retention, in accordance with the quote above. But I read very little from him about the craft of game design, and his recent talks have tended to be framed in terms of the keywords ‘Acquire, Retain, Monetise’, which sounds like a scaled down version of the Ferengi Rules of Acquisition. Nicholas continually insists our industry can regulate itself away from abusive practices – but I still don’t see any sign of this, nor indeed do I detect much interest in doing so.
The focus on metrics over game design has brought the videogame industry closer to its less reputable but more profitable cousin ‘gaming’ – what's commonly known as gambling – and with it, we have a host of ethical questions about what we are doing, none of which can be merely presupposed. We urgently need a debate on monetisation practices to establish what ethical metrics consist of, but the industry does not want to have this talk. I offered a dynamite panel to GDC this year on this topic, but it was knocked out of contention instantly. The industry is afraid to have the conversation, but until we are ready to address questions about what metrics mean for game design as a craft, we have a serious unaddressed problem that affects the integrity of the games industry. Of course, in purely capitalistic terms there is no integrity, there is only money. But money is just another of our imaginary games – it just happens to be one that we all take very, very seriously, since we have lost our ability to feed ourselves without it.
One game designer who has taken a stand on the ethics of monetisation is former Free Realms creative lead Laralyn McWilliams, who quit a job out of disgust over the issues I’m highlighting here. In an interview back in 2014 entitled “The problem with ‘best practices’ in free-to-play”, Laralyn reports how designing for ‘friction’, which is to say, monetising player frustration, finally became something she couldn’t endorse:
…a designer came to me and said there was a spot where it got really rough; there weren't enough quests, and the grind was really terrible. He wanted to add five or ten quests to make it feel better…. But when I looked at our numbers that was the spot where we had our best monetisation. The awful feeling of that grind was getting people to spend money, so I had to say no to something that would make players happy because it would cut our revenue. At that point I said, ‘Nope,’ and I got out of social games.
Against the ruthless focus on the bottom line is the possibility, if nothing else, that game design can fulfil its calling as a craft, and that informed practitioners of that craft can satisfy the play needs of many different kinds of players. This does happen, even in the battleground of metrics, and developers that are willing to commit to doing so can build a loyal fanbase that supports them, and helps other players to find them. It’s a harder path, to be sure, because it means making commercial artworks that are worthwhile instead of just cranking the sausage machine of rehashed ideas. Nothing good comes without effort. But if we want to walk this path, it entails more than simply resisting the purely metrics-driven concept of commercial games.
Sadly, indie developers who have avoided going down the predatory monetisation path have tended to simply default to making what they like to play and then gambling upon finding an audience for it, which I view as a hugely risky way to pursue a career in game design. I’ve seen dozens (perhaps now hundreds) of developers fail doing this... it’s simply not a good enough plan to trust that – by chance – your play needs will align with enough players to magically make ends meet. As Rami Ismail of Vlambeer suggested to me when I accused him of giving this exact advice:
...I've told developers to make what they want to make - [but] never in that vacuum. My entire existence as a public figure exists because I was one of the very few prolific 2010-generation indies that was yelling about taking business seriously, engaging with publishers and marketing, and doing the work to make your game visible.
21st Century Game Design will be going out-of-print soon; its multinational publisher has withdrawn from publishing books about making games entirely, which in itself says something. Our first book’s core vision – that there are methods for game design, but there is no single, perfect method for game design – remains as true today as it ever did. Our deployment of that vision through a fusion of horizons between psychology of play and the history of videogames remains, I believe, an extremely fruitful way of understanding the craft of game design. Alas, the games industry didn’t choose this path. It chose instead an unholy schism between dogmatic indie design on the one hand, and pragmatic monetisation design on the other. Personally, I feel that the artworks we call games deserve more than this, but I appear to be in the minority. In a games industry divided between a stubborn individuality unable to reliably feed itself, and investment-glutted money farms, there seems little room left for cultivating the craft of satisfying players.
Agree? Disagree? I’d love to hear your comments! Have a blog? Any and all replies at other blogs will be promoted here to keep the conversation going – just let me know the link in the comments or on Twitter.
While I primarily teach aspiring game designers in the UK for University of Bolton’s School of Creative Technologies, I also teach Game Narrative for the fantastic Art of Game Design MFA programme at Laguna College of Art and Design (LCAD) in the US. This inventive MFA programme offers benefits to industry professionals looking to buff up their career, academics with an interest in Game Studies, and recent bachelor’s graduates who want to stand out from the crowd. It is also a point of personal pride for me, having argued for many years for the status of games as artworks, to be teaching on a Master of Fine Art degree in Game Design.
Building upon an established BFA programme that is one of the Top 10 ranked in the United States, the Art of Game Design MFA is perfect for strategic career growth. LCAD’s BFA programme covers Game Art, 3D Character, and 3D Environment, and is supported by innovative trans-university partnerships including USC’s GamePipe Laboratory, as well as boasting a placement record in excess of 94%. On the Masters programme, candidates work closely with some of the top names in game design and game studies, including taking my own world-class module in Game Narrative (also available in a Bachelor’s version at University of Bolton), and hone practical skills and business acumen while developing a critical, theoretically-informed framework for understanding games.
The deadline for submission for the 2016 Fall semester is June 1st. If you have any questions, contact LCAD Art of Game Design MFA Founder and Chair, Sandy Appleöff Lyons, who will be happy to discuss your career goals and educational objectives.
Traditional game design is based upon the practices of tabletop game design, that is, writing rules (now generally called ‘game mechanics’) that are implemented into programmed systems. This method works. But it misrepresents the practical aspects of the process by obscuring the relationship between games and players. Games are never invented from nothing: they exist as variations of successful player practices.
Excluding young children, all players come to every game with their own pre-existing player practices already well-established. Defender (1981) was difficult for arcade players to learn because its control scheme was nothing like the other arcade games of the late 70s and early 80s. The computer strategy game Steel Panthers (1995) uses a hex map because thirty years earlier Avalon Hill’s second edition of Gettysburg (1961) established the benefits of these over square maps. DOOM (1993) and Quake (1996) used arrow keys rather than WASD because movement in most Western RPGs up to then had been controlled that way, with mouse-look simply creeping in as an optional alternative interface for games mounted on the Quake engine. Changes were incremental, not revolutionary, because utterly innovative practices become a barrier to play, creating negative word-of-mouth, high risk of bad reviews, and thus no eventual community.
Community is the big issue here. As I wrote to Dan Cook earlier this year, no-one plays alone. Commercially successful game developers (and indie game devs who can feed themselves) have in common that they either made a game for existing communities of players, or they founded a new community around their game. In all cases, the player practices are contiguous with earlier player practices – either in terms of interface, fictional world, or agency (which is to say, the intersection between the two). The three work together, and all three are important – although in different ways to different players, who may experience a variety of aesthetic flaws as a result of their preferences. Clashes between interface practices create perplexity; clashes between world and agency create ruptures; clashes between agency and interface generate inelegance. All discourage players from engaging in a new community, but not all are strictly game design problems (rupture in particular is often a narrative design issue).
Successful game design doesn’t have to minimise all these aesthetic flaws, because not all players are bothered by rupture, not everyone is sensitive to inelegance, and some players willingly persist in the face of perplexity. But it is the last of these flaws – perplexity – which is the greatest problem for games courting a community of players, because players can adopt a new game easily if its player practices are close to those they already know, and this applies to interface, world, and agency practices. If a game’s interface practices cause perplexity instead (by being different from player expectations, founded on prior experiences), there is a barrier erected around the game and only a minority of players will get through it. Indeed, contemporary games have developed new community practices to offset this exact problem – such as Wikis that provide detailed information about player practices expressed as game mechanics, and guides that introduce players to new practices gently. Even so, successful new games achieve their success by taking advantage of existing player practices, and only vary them to a relatively small degree, such that players can switch from an existing player community to that of the new game with minimal complications.
A few examples may be helpful. Blizzard’s all-conquering World of Warcraft (2004) did not create a new community but rather absorbed others that were already engaged in very similar player practices. Firstly, the DikuMUDs that had near-identical practices to WoW but used a text interface, followed by much of the MUD community in general (including the other early ‘graphical MUDs’ like EverQuest). Secondly, computer RPG players, since they had very similar practices in interface, world, and agency, but usually played in single-player worlds. Thirdly, tabletop role-players, from whose player practices all these other communities descended. World of Warcraft effectively monopolised the role-playing game lineages, and their communities, through high production values, careful community management, and a buffed-up version of the practices of Dungeons and Dragons (1974). It ultimately became such a huge player community that even the wellspring of its player practices, D&D, began to copy it, with its fourth edition rules clearly geared to appeal to the community WoW had stolen away from the table.
Similarly, Mojang’s monolithic mega-hit Minecraft (2009) was readily available to a hugely diverse community of players because it used a standard interface, one that descended from Quake’s mouse-look combined with inventory mechanics from the cRPG lineage (those largely added to the pool of player practices by 1987’s seminal Dungeon Master). Minecraft did not succeed by monopolising existing communities, however, but by being able to be played by a huge pool of players (thanks to its low-perplexity ‘standard’ interface, and a strong supply of wiki content to bridge the gap to its high perplexity crafting system). Once it was rolling, it then supported hugely diverse player communities thanks to the open configuration of its numerous regimes of play – from peaceful construction, to vicious permadeath that descends from early digital D&D variants such as Rogue (1980).
Significant growth in community was also fuelled by the ingenious early access business model, which Minecraft both invented and popularised. Unlike later early access schemes, Notch offered rising entry fees from a very low starting point – it was about $10 when I got it, I think it'd been half that when I first saw it, then later it was $20 and $30. Part of my buying decision was precisely the thought that I didn’t want to pay more later, and I’ll wager I'm not the only one who was drawn in this way. This is one of the two key reasons why Minecraft could not have come from a publisher, and could only have been an indie project. The other issue was its low-fi visual aesthetic, very much resembling my indie flop Play with Fire (2006) three years earlier, although there is no direct connection between the two games to my knowledge. (Indeed, the only person I’ve ever found who even saw Play with Fire is Miguel Sicart).
In Minecraft’s case, we can see how its success did not primarily come from its game design ingenuity, which merely provided the seed of appeal around which its communities gathered. Its success was rooted in continuity of player practices from the lineages of FPS (for interface) and RPG (for world and agency). Minecraft cross-bred and thus hybridised the two key videogame lineages, but it was its inventive business model that provided a means of growing a new community organically and thus had a far bigger part to play in its success than design innovation. This is in no way a criticism. I have enormous admiration for the variations to player practices that Minecraft introduced, which have still not settled into any stable configuration in the games community at large.
Equivalently, superior community maintenance was more important to World of Warcraft’s success than design innovation, of which it had very little – and not because Blizzard isn’t full of extremely capable designers. A gainful comparison here could be made to id software, the only company to get significant traction from the shareware business model. It innovated the ‘standard’ interface – but it built its community on pre-existing interface practices, from the Western cRPG lineages (as noted above), and then grew a community with a non-standard business model. Only when that community was established did id get a chance to spread the now-standard mouse-look FPS interface (which eventually gave us the twin-stick control scheme on console as well, via other developers’ variations).
Traditional game design works much of the time because game designers are already members of communities of practice and can therefore replicate and vary those player practices effectively. Those capable of abstracting these practices into ‘rules’ or ‘game mechanics’ inevitably end up in the role of game designer, because they can communicate play in the written form that helps hold big projects together. (Small teams can avoid documentation entirely in many cases, but larger games have no other reasonable option). Nonetheless, the work of games designers will succeed or fail according to how well it maintains and varies the established practices. When it fails, it is often because of unresolved conflicts over precisely which practices are being replicated or modified – especially in traditional publishing relationships. But successful game design has always been embedded within the already existing player communities, and new directions have worked far less often than variations on known themes, no matter what players say about what they think they want.
Traditional marketing is an even less reliable method than game design, in so much as the openly stated strategies (such as target demographics) utterly miss the point about why spending money can fuel the formation of communities. The players are largely already inside the communities for the various big game brands (Mario, Call of Duty, Mortal Kombat, GTA etc.) but can easily be enticed to play games with similar interface, world, or agency. Meanwhile, world-focussed media brands (Middle Earth, Disney, Lego, Star Wars, Harry Potter) provide further opportunities to bring existing player practices to their (largely zero-agency) communities, offering substantial commercial benefits – at a substantial price to developers. Indies can’t afford to do this, so they typically just rip them off – just like the big companies, actually! Tomb Raider comes from Indiana Jones, just as Halo comes from Aliens (with a Larry Niven twist), and Call of Duty comes from Medal of Honor, which comes from Saving Private Ryan (both being concurrent Spielberg-produced projects). Even the much-vaunted indie game Braid (2008) wholly depends upon the player practices of Mario it has borrowed.
So what should you do if you want to be a successful game designer? Well, the primary route to success is to be backed by big publishing money like Shigeru Miyamoto, Peter Molyneux, or Will Wright – but there’s no way onto the thrones these days without first getting into the trenches. Indeed, there never was. So if you’re aiming for success, you have to be planning to grow a community somehow. If you can’t get, or don’t want, a brand license to make acquiring that community easier, you have to modify the player practices of an existing set of communities. There is only one other option: set your living costs low enough that you get to set the criteria of ‘success’ below the rest of the industry. I have great respect for those that do. But even they are still engaged in variations on the existing player practices. That’s what game design was always about – talk of ‘game mechanics’ is only a medium for the exchange of ideas. We should not let it distract us from acknowledging our intimate familiarity with the player practices of successful games, because we are all a part of at least some of these communities, and always have been.
ihobo will return in the Gregorian New Year.
What makes something a role-playing game? The Essence of RPGs was a serial in three parts running here at ihobo.com that offered an answer to this question by tracing the essence of these games to two sets of player practices, rule-play and role-play. Each of the parts ends with a link to the next one, so to read the entire serial, simply click on the first link below, and then follow the "next" links to read on.
Here are the three parts of The Essence of RPGs, each of which begins with a link to the corresponding part of the source serial:
If you enjoyed this serial, please leave a comment. Thank you!
In 1998, I produced one of my final tabletop role-playing game designs with my friend and colleague Rob Briggs, who had also worked on my first published tabletop games, Avatar and Outlands. My work in this space was always motivated by the desire to try out something different that could build upon the successes of other games. Avatar tried to overcome the problem of player knowledge in RPG worlds (since they often had thousands of pages of details that needed to be read) by asking players to co-operate in a world-building game to co-create their setting. Outlands attempted to merge science fiction settings into a hodge-podge world in the same manner as Dungeons & Dragons had done with fantasy, and remains one of my favourite designs. Shifter made play from the absurdity of time travel when you can endlessly repeat and modify your previous actions. In Contract, Rob and I tried to boil down the essence of our role-playing into the simplest possible system, and what that meant was creating a character sheet that served as a contract between the games master and the player, establishing who the player was vouching to be in the fictional world of the game.
In the previous part, I described how the ontological complexity of Dungeons & Dragons led to engaging ‘rule-play’ that was rooted in the infinite variability of the character sheet. In this final part, all that remains is to show how the character sheet was also the locus of the other lineage descending from the tabletop role-playing game – namely role-play. The essential point here is the one that was central to Contract – that the character sheet serves as an agreement as to who the player is undertaking to be in the fictional world of the game. To commit to role-play was to follow that character wherever it led, even – or especially! – to their death. The choices you made as a role-player were not about agency as it is usually construed in videogames, but about being someone else and choosing what they would choose. It was about play-acting and empathy more than about power fantasies and free choice.
From the earliest days of the tabletop role-playing game, there were two main camps for how the story-play would operate, two different sets of player practices for role-play, neither of which was specified by the game itself. The first, and the one I was involved in right from the start, could be called dramatic role-play, a form that takes its influence from storytelling and mythology – the kind of psychological patterns identified by Joseph Campbell as the heroic monomyth (or ‘hero’s journey’). This branch of the RPG story leads from the fantasy novels of the mid-twentieth century (that inspired D&D) directly to the sci-fi and fantasy novels of the end of that century and the start of the next. Authors such as Cory Doctorow, China Miéville, Walter Jon Williams, and of course George R.R. Martin were all dedicated role-players who honed their narrative skills at the gaming table.
In dramatic role-play, the focus of interest is how characters relate to one another, and as a result those of us engaged in dramatic role-play very quickly realised that the dice were a liability more than they were an asset. We learned to fudge dice rolls for dramatic effect, and never regretted it. Characters in our games still died, but they died as a consequence of their actions, not as a result of mere random chance. In the 1990s, many tabletop systems began to emphasise these player practices explicitly in their designs, a lineage exemplified by Erick Wujcik's Amber Diceless Roleplaying (1991), which dispensed with dice entirely, and Jonathan Tweet’s Everway (1995), which replaced dice with a highly visual Fortune Deck (an example of which is pictured above).
The parallel set of player practices to dramatic role-play are what nowadays is often misleadingly termed ‘Old School role-play’. This name is an attempt to claim legitimacy from the sheer age of the practice, but this approach is not any older than dramatic role-play, having the same historical root – Dungeons & Dragons. That said, given that drama is as old as civilisation, the practices of dramatic role-play could be traced back to the ancient Greeks at the very least, especially if we take seriously Roger Caillois’ suggestion that theatre (as one form of his mimicry) should be considered a key example of human play.
What characterises Old School role-play isn’t drama but harshness, and I propose to term it brutal role-play. In these player practices, the dice are as sacred as in Caillois’ games of chance and fate (alea) and therefore players are honour-bound to accept their outcome, no matter how terrible. (This would give the ‘Old School’ players an unbeatable comeback to my previous comment about theatre, since no human game is older than dice!) Since all tabletop role-playing games are shockingly inadequate simulations of reality, playing ‘Old School’ generally means accepting a risk of death disproportionately higher than in everyday life. This also means that brutal role-play can be highly effective at simulating the psychological paranoia invoked by violent encounters. No Old School role-player enters into battle unprepared!
When these player practices crossed over into videogames, they developed in both predictable and unexpected ways. Dungeons & Dragons immediately spawned rule-play imitations such as dnd (1974-5) on the PLATO educational computer system. The popularity of these early dungeon crawl games was such that pioneers in computer role-playing games were not always tabletop role-players, since some of them simply picked up their practices from digital simplifications. Michael Toy, Glenn Wichman, and Ken Arnold don’t even mention role-playing games when talking about the origins of Rogue (1980), although they did name-check Adventure (1976-7), which is a direct descendant of D&D.
Rogue, like many of the early computer RPGs, inherits brutal role-play but without the role-play, thus creating a kind of brutal rule-play that today goes by the term permadeath. This concept, however, is a player practice originating with tabletop role-playing games, within which a dead character was dead forever. Before role-playing games, no fictional game world lasted long enough for permadeath to make any sense. Dungeons & Dragons, however, was effectively a persistent world – and one in which fatality had a very permanent meaning, at least until the later introduction of resurrection spells for characters who had reached a certain level within the game. Frankly, even after these rule additions, returning from the dead was exceedingly rare, requiring a level 16 Cleric in the first edition of Advanced Dungeons & Dragons (at the time, a very high level indeed – probably taking well over a year of playing every week to attain). In all my years of playing D&D, no character ever returned from the dead. Extra lives were an invention of the arcade.
The split into the Western RPG and Japanese RPG lineage, although initially connected and in both cases rooted in D&D, produced a division into the Western-style player practices that were essentially rule-play (brutal or otherwise) and a Japanese-style that was more narratively focussed. But despite the greater emphasis on character and story in the Japanese lineage, role-play was not a part of their player practices in any meaningful sense. The Japanese planners (i.e. game designers) were not tabletop role-players, and did not import player practices from face-to-face play in any example I have been able to locate. Curiously, however, the Japanese traditions did culminate in one of the most significant examples of videogame role-play, namely Yu Suzuki’s Shenmue (1999). Inspired by 1980s Japanese computer RPGs, Suzuki set out to create a rich fictional world in which the player was asked to play a specific character, Ryo Hazuki. Players’ enjoyment of Shenmue was largely down to their openness to the player practices of dramatic role-play, which Western games seldom if ever encouraged by design.
This does not mean that the Western RPG lineage did not foster role-play: on the contrary, it was widespread but primarily as a player practice. The focus on agency led to rule-play by design, and it supported role-play only when the player was willing to bring that element in through their own play. This took an interesting turn with the creation from 1978 of MUD1 by Roy Trubshaw and Richard Bartle. The player practices of tabletop role-playing games rapidly spread online in these text-based fictional worlds, which included a mix of both role-play and rule-play. The latter was epitomized by the LPMuds and DikuMUDs, from which EverQuest (1999) and World of Warcraft (2004) directly descended, adding little more than graphical and technical polish. Simultaneously, role-play was happening in just about all kinds of MUDs but only when specific players knew (and valued) the relevant player practices. Personally, I found it was more common with the various MUSEs in circulation in the 1990s. (I spent many years in TrekMUSE playing as a Romulan ambassador who later became an officiator of marriages, and never once fired a gun).
What can be seen clearly in these examples is the point I made at the beginning of this serial: an artefactual reading of a game is always an incomplete reading. When it comes to role-playing games, whether tabletop or computer, the options for both rule-play and role-play are very often supported by the very same game, with the possibilities being exploited by different players in unique ways according to the player practices that they have previously encountered and enjoyed. Players who learned to role-play at the tabletop often brought their practices into their digital play, continuing to focus on dramatic storytelling and characters. Those that did not were rarely if ever encouraged to role-play by any videogame, despite Bethesda’s mission statement to bring as much of the tabletop RPG experience into videogames as was technically possible. In this regard, the authentic experience of tabletop role-play has mostly appeared in videogames through artistic motivations unconnected with the tabletop: I have suggested in reference to one of Tale of Tales’ artgames that “there is more of the authentic experience of role-play in Bientôt L'été’s flaws than in all of Bethesda’s perfections.” So too their massively multiplayer screensaver The Endless Forest, which inspired thatgamecompany’s Journey – both examples of digital games that deliver role-play despite having only tangential lineage connections with tabletop RPGs.
The essence of role-playing games lies in their connectivity, via their player practices, to tabletop role-playing games, and thus to Dungeons & Dragons, the origin of the form, and the most influential game of all time. These player practices can be understood as forming two broad and diverse lineages – networks of related games and their interconnected ways of playing. Firstly, that of the rule-play of complex ontologies, epitomized by the agency-focus of the Western computer RPG lineage that branches from it. Secondly, that of the role-play of dramatic storytelling, epitomized by contemporary tabletop role-playing games like Ben Lehman's Polaris: Chivalric Tragedy at Utmost North (2005) or Thoughtful Games’ Montsegur 1244 (2009). These latter practices appear in videogames by design just occasionally, although they are always being manifested in a vast variety of games as a result of players who have learned the player practices of role-play from the tabletop. This indeed is what has always motivated me as a player: as much as I enjoy rule-play, and as much as I take pleasure in designing systems that support it, my own play is always far more influenced by the player practices I learned at the tabletop. Anyone whose life has been swept away in this wondrous inheritance, as mine has been, knows that we cannot pretend that videogames are somehow isolated from the tabletop that helped bring them to life: digital gaming is as much descended from the dice of the role-player as from the joystick of the arcade.
With thanks to David Calvo for suggestions as to the most interesting recent examples of role-play practices, and to everyone who ever played or designed a tabletop RPG with me over the years.
In the previous part, I staked my claim for a new understanding of role-playing games as nothing more nor less than the children of TSR’s seminal Dungeons & Dragons. This sounds trivial: but that’s only because it’s easy to underestimate the earth-shattering effect of D&D upon player practices, and the sheer breadth of its demonstrable influence. This was not just upon tabletop game designers but also videogame designers, whose player practices (and hence design practices) descend directly from the programmer-designers of the 70s and 80s, all of whom were conditioned by the ubiquitous presence of tabletop RPGs in the gaming culture of the time. Next week, I’ll explore the concept of a role to be played that was also essential to the role-playing game concept, but first I want to explain why the rules of D&D remade our understanding of games in their entirety, in ways that dominate commercial videogames even today.
Boardgames had been on the rise from the moment the industrial revolution made the means of mass production available. This wasn’t surprising, as humans had been enjoying games for millennia, but the games were necessarily either simple or narrowly distributed, because of the cost of hand-making all components. The late Victorian era saw an explosion of games with colourful titles like The Game of the District Messenger Boy (1886), although their components were all quite simple. In the twentieth century, games became more and more representational as the means of production allowed for more elaborate components, a trend exemplified by Cluedo or Clue (1949), which adds a map that is not just a track or a grid but clearly the layout of a fictional mansion. For the nerds of the late 50s and 60s, however, family boardgames such as these had become irrelevant, because Charles Roberts’ Avalon Hill was finally on the scene, making a dizzying array of historical wargames, with rules defining player practices that are still being used in turn-based strategy games today.
It was into a landscape of player practices defined by Avalon Hill (and, by the 1970s, its me-too competitors) that Gary Gygax and Jeff Perren released Chainmail (1971). It was revolutionary precisely because it wasn’t historical: its influences were fantasy novels of the kinds made popular by Robert E. Howard, Fritz Leiber, and Michael Moorcock, not to mention J.R.R. Tolkien. But because it wasn’t historical, it also shifted the focus of its player practices significantly as well. All Avalon Hill games were about being in command of an army of some kind or another. What Chainmail invented (or rather codified, since it was inspired by various fan rulesets in circulation), was a switch in focus from armies to heroes. With this, the rules naturally changed to express different parameters and to measure different concerns – hence the appearance of magic spells and monsters, and more importantly levels.
With Dungeons & Dragons, which was initially presented as a variant of Chainmail, Gygax and Dave Arneson refined the level concept to encompass another key player practice in gaming today, that of experience points. In D&D, players picked a character class (just Fighting-man, Magic-user, and Cleric in the first edition!) and then advanced their character through levels via the acquisition of XP, gaining in power as they did so. It is a player practice so ubiquitous in gaming today that it is difficult to truly appreciate that it originates almost entirely with D&D. Prior to D&D, gaming as a hobby was about tactical simulations of clashes between essentially anonymous armies, for which any concept of advancement was irrelevant. That ceased to be the case as soon as the template for the fictional world was not a historical battle but a quest narrative. The ultimate consequence of D&D’s very simple mechanics was a gradual intensification of advancement mechanics and their associated player practices (such as grinding), until – via the viral explosion of so-called ‘social’ games, and the more overtly D&D-descended World of Warcraft – these practices had become the foundation of billion dollar economic behemoths, worlds apart from the face-to-face tabletop play where they had spawned.
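The advancement loop that D&D codified – accumulate XP, cross a threshold, gain a level and new powers – can be sketched in a few lines of Python. The thresholds below are invented for illustration and are not taken from any edition of the game:

```python
# A minimal sketch of the XP/level advancement loop that D&D codified.
# The thresholds are invented for illustration, not drawn from any rulebook.

LEVEL_THRESHOLDS = [0, 2000, 4000, 8000, 16000, 32000]  # XP needed per level

def level_for_xp(xp: int) -> int:
    """Return the character level implied by a running XP total."""
    level = 1
    for threshold in LEVEL_THRESHOLDS[1:]:
        if xp >= threshold:
            level += 1
        else:
            break
    return level

# A character grinds through encounters, accumulating XP and gaining power:
xp = 0
for encounter_reward in [500, 1500, 2500, 4000]:
    xp += encounter_reward
    print(f"XP: {xp}, level: {level_for_xp(xp)}")
```

The same loop, scaled up and festooned with content, is the grinding treadmill at the heart of World of Warcraft and its descendants.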
‘Rule-play’ is in essence a focus upon character advancement. This might go to the extremes of ‘min-maxing’ (making decisions solely for the purpose of maximizing benefit) or it might be a more subtle focus upon the pleasures of gaining new powers and capabilities. Indeed, Dungeons & Dragons created a near-infinite array of things to acquire! It is this breadth of options that underpins rule-play, and that can make a game fit the descriptor ‘role-playing game’ even when the elements of role-play are slender. What distinguished tabletop role-playing games as systems from the games that existed prior to Dungeons & Dragons (if we ignore the player practices, and hence what the game actually consists of in play) was the presence of a complex ontology. This philosophical term ‘ontology’ refers to the study of being, but it has acquired a sense in information technology of cataloguing and classifying what exists. In role-playing games that serve rule-play well, what exists is a great many things!
In early arcade games, the variety of entities was low because of technical constraints – Pac-Man generates its engaging play from just seven entities, not counting the individual mazes. With the Avalon Hill wargames, the ontologies were basically lists of units and lists of terrain – many more things than possible in the (later) arcade, and also more than in family boardgames, because the players were capable of dealing with more complex systems and thus more intricate player practices. A fully delineated ontology for, say, Monopoly, might rack up many entities by treating each different card as a separate entity, but even this would not reach the degree of complexity that Dungeons & Dragons opened the door to. Firstly, there is the range of possible player characters that can be constructed from (fictional) ontological elements such as class and race, not to mention the variety of specific entities implied by different combinations of attributes (Strength, Dexterity, Constitution, Intelligence, Wisdom, and Charisma). These are further diversified by the varieties of equipment – both that which can be purchased in a shop, and that which can be acquired as treasure from ever-more-detailed tables. And this is all before the adventurers have left the tavern! Once out in the world, there are varieties of monster (all equally diversified by their attributes), of terrain, even of alternative dimensions. The fictional world of a typical role-playing game is always diversely populated.
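The combinatorial point can be made concrete: six attributes, each rolled on 3d6 and therefore taking one of sixteen values, already yield millions of distinct stat blocks before class, race, or equipment enter the picture. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope count of distinct D&D attribute combinations.
# Six attributes (Strength, Dexterity, Constitution, Intelligence,
# Wisdom, Charisma), each rolled on 3d6, so each takes a value from 3 to 18.

values_per_attribute = 18 - 3 + 1       # 16 possible values per attribute
num_attributes = 6
stat_blocks = values_per_attribute ** num_attributes

print(stat_blocks)  # 16777216 distinct attribute combinations
```

Over sixteen million characters from the dice alone – a scale of ontological variety no wargame unit list could approach.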
A sign of the relevance of ontological complexity to role-playing games was TSR’s decision in 1977 to pursue a two-pronged strategy, dividing the game between the comparative simplicity of ‘basic’ D&D and the extraordinary complexity of Advanced Dungeons & Dragons, with its large (and expensive!) hardback rulebooks. Again, rule-play is an appropriate name: players whose interest lay with narrative play did not need five hundred pages of lists and tables to fulfil their play needs! But rule-players did – they craved more details, more tables for simulating very specific situations (even if they never actually used them), bigger treasure tables, more monsters, more equipment, more, more, more! This is the essential quality that computer role-playing games inherited from the tabletop: with the limitations of a computer, the role-play dimension could only be crudely replicated, but the ontological complexity was perfectly suited to computerisation, albeit requiring the work of many artists to create the entities’ appearances (and even more now that games are routinely delivered in polygonal 3D). Ontological complexity begets player choice, and for many players that dimension of agency is irresistible.
What is worth noting about the design of Dungeons & Dragons is the way it created an opening for player practices that engaged with worlds of such enumerated detail. Prior to D&D, the entities were all formally specified by the rules with the only wiggle room coming from house rules (which were common, but generally did not add much to the ontological complexity). With D&D, the concept of a character sheet changed the landscape of play forever by creating a component of play that was filled in by hand, and thus that could be completed in any way the player imagined (provided the games master – or dungeon master, in D&D’s case – agreed). No need for a token or card to represent every piece of equipment, you just write what your character carries into the relevant box. Want to make your elven bard stand out from the crowd? Just write a unique description onto your character sheet! It is the character sheet that opened up player practices towards infinite imagination and away from prescriptive rules systems, and from this simple conceptual liberation the possibilities of both rule-play and role-play emerged.
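The contrast can be sketched as data structures: a wargame unit is exhaustively enumerated by the rules, while a character sheet mixes enumerated fields with open-ended ones the player fills in by hand. The class and field names below are invented for illustration:

```python
from dataclasses import dataclass, field

# A wargame-style unit: every property enumerated by the rulebook.
@dataclass
class WargameUnit:
    unit_type: str      # drawn from a fixed list in the rules
    attack: int
    defence: int
    movement: int

# A character sheet: enumerated attributes plus open-ended,
# hand-written fields with no prescribed contents.
@dataclass
class CharacterSheet:
    char_class: str                                  # Fighting-man, Magic-user, Cleric...
    attributes: dict                                 # Strength, Dexterity, and so on
    equipment: list = field(default_factory=list)    # anything the player writes in
    description: str = ""                            # free text: unbounded imagination

bard = CharacterSheet(
    char_class="Bard",
    attributes={"Strength": 9, "Charisma": 17},
    description="An elven bard with a silver-strung lute and a grudge",
)
bard.equipment.append("lute (silver-strung)")
```

The wargame unit is closed: nothing exists that the rules do not name. The character sheet's free-form fields are the conceptual opening through which both rule-play and role-play emerged.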
Next week, the final part: Role-play