The thing about needing a finite experience is absolutely me. Even if it's not a story (although having even the vaguest story helps so much), it should have guiding goals and a spot where it says 'okay, you're done with everything, playing past here is just for any personal goals you might have'. Similarly, if the story is interesting or the characters compelling, I can look past so many gameplay flaws. UI flaws are harder these days, so I'd definitely suggest an easy-to-use UI.
Speaking of The Baron in Witcher 3, or the various Persona 5 villain arcs --
There's a working structure for video game storytelling where the game can have a lot of good little self-contained stories all over the place, each with a beginning, middle, and end, even if the overall game itself meanders along forever. The Arabian Nights approach, if you will.
Bethesda and Obsidian are trying for this, even if neither of them is terribly consistent. The Like a Dragon / Yakuza games are masters at this, where the tonal whiplash between the soap opera melodrama of the main plot and the goofy low-stakes side stories somehow only makes all the side stories funnier by contrast.
How much story is “too much” story? … I guess it all depends on how good it is, which is to say, how much of a genius you are. Execution is everything, and if that could be boiled down to a useful algorithm, Concord wouldn’t have happened.
I think when it comes to particularly bad RPG stories, my go-to would be Neverwinter Nights 2. Relevant to this article: too many factions, particularly in the enemy department. And they may have been “distinct,” but the bloat was such that all nuance and detail was lost.
Another factor from that game I would highlight is losing the plot entirely. This is less spending time away from the story and more the story spending time away from itself. Going from point B to C had this horrible tendency to go off in directions that had so little to do with C, and went on for so, *so* long, that you forgot why C was even a thing. Combined with the “too many factions” problem, any ties back to C were lost in the scramble.
If the end goal is to climb Mt. Plotdevice, you don’t necessarily have to spend every minute marching towards it. But most of the time, it should at least be visible in the background. If the river in Distraction Canyon flows from its climactic peaks, you can still have your hole-in-the-ground fun and not have your audience forget what they came for.
Spot on, old chap! Or as we say in the realms of the Nether-Netherlands: the story starts to suck if the compression is too obvious. Happens in older age more than in younger age, I guess.
So, you hit on one thing here that I really disagree with. Quite a bit, actually. This comment is gonna be a long one, so I absolutely won't blame you if you don't read it at all (in fact, I might even advise it). Perhaps it would be better if I could demonstrate some self-restraint here; alas, that is not the world we presently find ourselves in.
"It doesn't matter how good your content is if nobody sees it."
I feel like this is solid conventional wisdom for old media, but less applicable to games. If someone can't watch a movie all the way to the end? Or read a novel through to the final chapter? Yeah, that's a problem. But games are a different beast altogether, demanding a far greater investment of time and attention. And CRPGs specifically? Even moreso.
I get that on the production side of things, it can feel like wasted effort: why bother writing a quest few people are ever going to trigger? Why bother writing dialog for a character few people will ever meet? Why design a dungeon few players will ever even notice, let alone explore? It can be disheartening. But for video games -- for RPGs in particular -- I think it's necessary.
We talk a lot about "player agency" with regard to games and RPGs, but what that ultimately comes down to is simply giving the players a choice. Do you go through Door A or Door B? If both doors lead to the same corridor, it's a false choice, no more than a poorly-maintained illusion of agency. What makes that decision *meaningful* is that by choosing Door A, a player will not know what lies beyond Door B. It's a decision with a consequence.
Perhaps the biggest example of "wasted effort" expended on this sort of thing to date is The Witcher 2's (in)famous second act, which took place in an entirely different region, featuring entirely different quests and characters, depending on a single choice the player was presented with at the end of the first act -- with nothing to indicate just how far-reaching the consequences of that choice would be.
Every single player who played The Witcher 2 to completion still missed out on an *enormous* chunk of the game. And simply *knowing* that their choices led to that outcome makes those choices more meaningful, even if, given the opportunity, they'd make the same choice again.
God knows I've played through The Witcher 2 a good three times or so, and I've never -- not once -- seen any Act 2 but the Scoia'tael's. Similarly, I've played the big classics, the KOTORs and Baldur's Gates and Planescape and Fallouts many times, but more often than not, I find myself making the same decisions in those crucial moments. Choices that only have meaning because I could have chosen otherwise.
So, consider turning your thinking around. You said:
"For a game this length, that is an AMAZINGLY high percentage. Yet, it's still really low for a game we're calling a classic."
But I would suggest that the real question here is: why can a game be a classic *despite* people generally not experiencing the whole thing?
And I would posit it's down to the medium. Certainly, I can think of many other classic games that most players are unlikely to see through to the end. How many people have beaten every level of a Mario game, I wonder? Or gone through Dark Souls all the way to the end credits? You also brought up The Last Of Us, a relatively short game featuring a linear narrative, but even in that case, current trophy stats indicate that only around 40% of players played through to the end credits. Yet, still, it is widely considered a classic. And, indeed, if we're going to use Achievements/Trophies as a measure here, we must also confront the odd fact that, for the vast majority of games, fewer than half of the players reach the "end," wherever that may be. (And only a fraction of a fraction manage to clear games "100%".)
I don't think it's unreasonable to assume that games saw similar statistics back in the years and decades before we had these systems to track player behavior.
....
This may be a bit of an unnecessary aside in a comment already overlong, but on the topic of Baldur's Gate 3, it's probably important to note just how beneficial this kind of "rare" or "unseen" content has been for marketing. So much of the chatter around the game -- which continues, unabated, to this day -- is oriented around rare interactions and consequences, the kinds of things that even the most dedicated of players are unlikely to see. Hell, in my current run of BG3 (a game, by the way, I've logged 500 hours into without yet finishing -- though I *do* intend to) I'm playing as a Drow Ranger, and took the path through the Underdark out of Act 1. And I *just* saw a TikTok highlighting a unique interaction that was *only* available to Drow PCs on that path through the game that I managed to miss entirely.
I could put another 500 hours into the game and still never see the scene on my own. It is *glorious.*
....
.... .... .... ....
But, also, I want to make it clear that your point is *still* very valid. Most every form of written media can and *must* be improved by ruthlessly cutting anything that isn't essential. The best writing advice I've ever had to give students is simply this: write your essay, write your short story, then sleep on it. The next day, highlight the first paragraph, or the entire first page -- and DELETE it. Then read through it again and ask yourself, "Did it lose anything worth having?" And most of the time the answer is going to be, "No."
I firmly believe that the best habit for any writer, the best skill to have, is simply the ability to ruthlessly cut their own work. You may doubt that from my post here, which is a bit much, I freely admit -- but I can at least promise you that if I were saying this in any kind of formal context, I'd be "gently" revising the word count down to at least half its current length.
(Second best skill, as you noted, is, of course, following the KISS rule. :D)
....
.... .... ....
A couple of other small comments I'd like to make, briefly, before I draw this ordeal to a close:
Re: Bloodborne's story -- I think you're maybe falling into a common but reductive misunderstanding of what "story" is. Basically, plot is only an aspect of a story, not the story itself. And while Bloodborne's plot can indeed be described as "completely incomprehensible," I'd argue that what makes the game so memorable is how well-executed all of the *other* narrative aspects are: specifically the setting and the characters. The bosses and NPCs are all unique and memorable, despite typically having only very limited amounts of dialog; and the setting itself, why... that IS the game. This is true of all modern FROMSoft games' narratives, I think: they prioritize setting and mood most of all, and plot least of all.
Meanwhile, with cases like Minecraft, there's still a narrative at play for players to engage with, it's just something wholly emergent and player-constructed. I don't think that invalidates the narrative element of the game, however. Harkening back to 3rd grade English class, one of the first things we're ever taught is that whole "there are three kinds of conflict," thing -- and what is a survival game but an implicit story of Man vs. Nature?
....
Re: Humor, you're absolutely right, but at the same time, the advice to "be funny" is just... well, let's just say it seldom works. The easiest way for a writer *not* to be funny, in my experience, is to *try* to be funny.
Rather, I'd frame it more as writers need to learn to not be quite so serious, all the time. Especially for video games. *Especially* especially for 40+ hour RPGs. If you want players to spend a lot of time inside your constructed world, that world needs to have texture to it -- which, with writing, means different tonalities. Make some content dramatic, make some content comedic; make some content profound, make other content absurd. Take advantage of the juxtaposition in tone to make the world feel more vibrant and interesting and -- perhaps most crucially of all -- unexpected.
There's nothing more delightful than laughing -- or crying -- when you least expected to. And that's the kind of feeling that game storytelling is uniquely positioned to deliver, I think.
Thank you for the long, thoughtful comments!
I can't tell players what they should prefer. If people play 15% of a game and are happy, I'm glad for them. But if a really good, epic, acclaimed game can boast only 20% of people finishing it, and that's the GOOD result, I think something has gone very wrong. Remember, I'm thinking about this in the context of a flailing industry and out-of-control development costs.
"I think you're maybe falling into a common but reductive misunderstanding of what "story" is."
Yes, everything that happens in a game can be folded into a looser definition of story, but I think we all know what I'm getting at here. When game designers get together to hash out the story, "Story" means the formal, fixed, written part.
When we ask, "What is the story for this mess?" we aren't thinking about when Joel shoots 87 zombies. That's the game part. In a few games, game=story, but that's very few.
Ahaha... I do have a bit of a problem with talking too much.
The context here is definitely something to keep in mind -- BG3 is a ridiculously over-ambitious game, and then you hear Swen talk about all these *other* big ideas he had to cut... it was definitely over-scoped. I don't know just how much of a problem that is for games in-general, but with RPGs specifically it seems to be a very common problem. I can't help but think here on the hubris of the Xenosaga series -- a spiritual sequel to the (in)famously-unfinished Xenogears -- which, for some ungodly reason, they thought they'd be able to make a *six-game series* out of. Or any of the many games that end in big cliffhangers, on the assumption of getting a sequel, only for that sequel to never materialize.
As for the comment on story, yes, I'm well aware of what you meant, and I wasn't trying to say you were wrong -- rather, it's that I don't think Dark Souls was the best example to use there, as I think those are games that (generally) do an excellent job with their stories, even with the plots being as threadbare and hard-to-decipher as they are. And just speaking personally, I think it's valuable to acknowledge and appreciate games that try to craft unconventional narratives. I wasn't really considering emergent narratives (an entirely other can of worms) but rather the work that goes into art direction and level design to make a play space feel compelling and engaging.
Anyway, I think part of the issue here might simply be one of energy allocation. Even with a big, expensive game like Baldur's Gate 3, you've only got a small number of writers penning the narrative, and they're only going to have so much energy to expend in so many directions. The trend in modern RPGs certainly seems to be focusing on expanding and developing companion characters more than anything else, so I'm not exactly surprised that the "main plot," as it were, wound up being so messy -- it's not really the focus of the writers' efforts, I think.
And to an extent, that might even be justified by the completion statistics: everyone who plays BG3 will encounter the companions and spend a lot of time interacting with them; but only a fraction of that number will see the main quest through to its conclusion.
And speaking of trophy stats, just for fun, I looked up the stats for BG2 -- they're even worse. Granted, we only have this data for the Enhanced Editions, but... they're quite bad. On Steam, a whopping 70% of players haven't earned *any* achievements. That means only 3 out of every 10 players have completed the *prologue.* And only 10% of players have the achievement for beating Irenicus and clearing the game.
Does that mean that something has "gone very wrong" with BG2, too?
I guess I'm just... unconvinced that this is really that much of a problem, rather than simply being a (neutral) pattern of behavior. Most players don't complete most games, and it makes sense to me that the larger a game is, the longer it takes to clear, the smaller that demographic is going to be. Is that a problem? I don't really know. I'm certain it's very disheartening for the people who are writing those stories, designing those dungeons and monsters and what have you, that their work is only reaching a fraction of their audience... but if that work hadn't been done, how much audience would these games have in the first place?
Like, I don't think the math is so simple here that we can say, "Well, if only 20% of players are reaching the end of a game, then if we cut the ending and don't waste resources on content only 20% of players will see, we'll only be losing 20% of our audience." Much like how I think decisions in RPGs can only matter if there's the potential for there to be different outcomes, I don't think many players will be willing to get invested in a story without the potential for there to be a gratifying conclusion at the end of their journey, even if they never reach it.
Thank god I'm not the only person who immediately restarted Witcher 2 to experience the stuff I missed and then.... immediately sided with the Scoia'tael again, just like my first time.
At least in my case I did end up doing a third run through in which I finally did side with the Blue Stripes, no regrets :)
So small counterpoint from me from somebody who loved Witcher 2 and played both paths. How many people who played Witcher 2 do you think actually understood how much they even missed? Insert plenty of other games here too.
In other games like Walking Dead people got the illusion of their choices mattering and were really happy about it, but some who looked too deeply were upset once they realized things didn't really change as much as they desired. But is that a problem with the game or a problem with how a chunk of us consume game media and fret over what we could have done or missed or whatever. We demand huge sprawling things and then often don't play them or complain about bloat.
We demand 'oh, you should allow people to be evil' in CRPGs, but statistically almost nobody does that. Why not cut all that out and instead allow better, more nuanced options on how to be a good person in RPGs? You still get choices that way, but I'd argue they're actually meaningful, as opposed to a meaningless choice between good and evil, because basically everyone will just choose 'good'.
Okay, you touched on a lot of different things that I think are really interesting to talk about... so please forgive me if I'm throwing too much text at you in response!
*ahem*
Re: The Witcher 2: I take your point, but to start with a small digression: how many TW2 players noticed that? Quite a few of them, I imagine -- this was back when the Witcher series was still relatively niche and aimed at RPG enthusiasts, who tended to be very online. At the very least I'm sure it was a significant fraction. But I recognize this digression is beside the point. I might similarly point to another RPG (and, personally, I have zero qualms calling it such) generally regarded as a classic: Majora's Mask -- a game that, arguably, disincentivized exploration with its strict time-management system. How many people who've played Majora's Mask to completion also saw all of the side quests, including the Anju/Kafei quest that is generally considered to be some of the best bits of storytelling in the series? Not many.
What we're really hitting on here, I think, is that what matters most of all isn't so much that choices have consequences, or that players have agency, but rather that players appreciate the *illusion* of consequence, of agency, even if they don't really have as much as they think they do. Part of the reason why the simple Be Good/Be Evil dichotomy is so common in games is because it's an easy way to create that illusion, if thinly, with minimal resources spent on optional content: conversations can either end with an NPC being killed by the player, or handing over a MacGuffin; all you really need is an Ending A and an Ending B. Like in Mass Effect (1), a game generally praised for being "reactive," nothing the player says or does really affects the outcome of the story, beyond the final few choices which only affect a few minutes (possibly only a few tens of seconds, I don't remember) of the ending sequence. Of course, RPGs need more than that -- as you suggested, it's a duality easily tossed in favor of giving the player different options on "how to be the hero." Which, I think, is precisely what most modern RPGs do -- your Mass Effects, your Dragon Ages, your Witchers all have a predefined idea of the player character as a generally heroic person, with the player's agency in the narrative demonstrated mostly in how they *respond* to the world rather than how their actions *affect* the world.
....
I'm sorry, I feel like this response is really scattered. There's just so much here to talk about!
On the subject of breadth and depth, for example, I don't know that audiences are really clamoring for the maximalist approach you're suggesting, certainly not in any great numbers. I seldom see people bemoaning, say, Skyrim for its total lack of reactivity -- they're too busy exploring the vast open world. Likewise, I've yet to see *anyone* suggest that Baldur's Gate 3 would've been better as an open-world game -- there, they're too busy exploring all of the many intricacies of its small, but deeply reactive, world.
.....
But what I really want to talk about here, speaking as a lonely writer myself, and mindful of our current context at the bottom of a Jeff Vogel blog-post, is... RESOURCE ALLOCATION. Which, I promise, is more interesting than it sounds.
Basically, all of the things people love about big RPGs are things that require a great deal of resources to deliver, which makes them complete nonstarters for many projects. And while I've no experience with creating video games, I do have some small experience with interactive storytelling (interactive nonfiction, if you'd believe it) and collaborative narrative projects -- so I'm very familiar with how quickly these endeavors can become bloated and unwieldy and fall apart under the weight of too many ambitions.
The golden rule, I think, is not that "player choice matters," as RPG fans often seem to think, but rather -- and I think I hinted at this earlier -- that "the *illusion* of player choice matters." You don't actually need an RPG to react to the player as much as they think it could or should.
Essentially, what matters is that whenever players are presented with a choice, you want them to think about what the *consequences* of that choice might be. THIS is what matters most -- the paramount experience of an RPG -- not the subsequent validation of seeing said consequence play out, or the invalidation of seeing it *not* play out, or the subversion of seeing it play out in an unexpected manner.
The problem is, you need to have just enough actual reactivity to maintain that illusion -- too little and it falls apart. If every negative dialog option elicits a "But Thou Must," in response, players are going to quickly assume there's no reactivity, and fail to recognize it when it happens.
One example I would point to would be Chrono Trigger, an otherwise fairly straightforward and linear JRPG, which does something rather clever in the tutorial: the player has a lot of different actions they can take or not take during a festival, all minor choices, none of which feel particularly consequential... but then PLOT happens, and you're placed on trial, and all of those choices are then interrogated. Suddenly, players realize that, all this time, they were being watched -- and judged -- by the game, and are having to face the consequences of their actions. And that little bit of reactivity casts a shadow over the entire rest of the game, lending even small choices a sense of weight they'd otherwise lack, even though there's little more reactivity in the entire game until the very end.
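(If it helps to picture the mechanism, here's a rough Python sketch of that kind of quiet, deferred judgment. The flag names and jury logic are entirely made up -- I have no idea how Chrono Trigger actually implements its trial -- it's just the general pattern of silently recording small choices and interrogating them later.)

```python
# Hypothetical sketch of "quiet" choice tracking with a deferred payoff.
# Flag names and verdict rules are invented for illustration.

festival_flags = {}

def record(flag: str, value: bool) -> None:
    """Silently note a minor player action; the player gets no feedback."""
    festival_flags[flag] = value

# During the festival, the game just watches:
record("returned_lost_pendant", True)   # did you hand the pendant back right away?
record("ate_old_mans_lunch", False)     # did you eat someone else's food?
record("helped_lost_girl", True)        # did you reunite the girl with her cat?

def trial_verdict(flags: dict) -> int:
    """Much later, the trial tallies those forgotten moments into a verdict."""
    votes_for_innocence = 0
    if flags.get("returned_lost_pendant"):
        votes_for_innocence += 1
    if not flags.get("ate_old_mans_lunch", False):
        votes_for_innocence += 1
    if flags.get("helped_lost_girl"):
        votes_for_innocence += 1
    return votes_for_innocence  # witnesses cite these acts, one by one

print(f"Jurors voting to acquit: {trial_verdict(festival_flags)} of 3")
```

The cost of that system is tiny -- a handful of flags and one scene that reads them back -- but the effect on how players treat every later choice is enormous.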
Basically, you want that tension in every scene: that things *might* matter more than players suspect.
It's the video game equivalent of a DM just randomly rolling some dice in the background to keep their players nervous, or smiling or laughing to themselves as though they know something the players don't (and maybe should have noticed) when in reality the only thought in their brain is how they should've ordered the pizzas a half-hour earlier.
(I think there's also an entire other conversation we could have on different kinds of reactivity, which I could broadly categorize as front-facing and back-facing, which are *dramatically* different to implement, but I'll save that for another time.)
My apologies if this comment is a bit hard to parse, or too scattershot. My health's been a bit poor of late, and some of the medications I'm on can affect my mental state. I hope it was all sufficiently coherent, at least? I just have a lot of fun talking about these things, so I really couldn't wait for my mind to be clearer. :D
What I was alluding to in my reply is that what gamers and purchasers say they want from games and what actually causes them to purchase and appreciate games tend to diverge from each other.
Statistically, people don't buy short games as much as long ones, even though I think also statistically people enjoy and finish short games more than long ones. This leaves game developers in an awkward place where they're developing a lot of content that goes entirely unseen because people demand that it be present so they can not interact with it at all. Because people will vocally say 'This game wasn't good value for my dollar so I didn't buy it'. Good versus evil content is one of those things because it'll be something like 89% of people will choose good and 11% choose evil. So evil paths are typically stunted and unsatisfying even for people who actually enjoy the play of being evil, but people will complain if they don't exist at all.
To bring it back to the BG3 discussion and how it's so big: apparently only like 4-5% of people did what I did and chose to play an Origin Character as their avatar and control them, and Larian put an absolutely nonsense amount of work into it. It's lovely that Larian was able to do that and I had a lot of fun with it, but wouldn't we have a lot more good games if companies weren't expected to do all this and could just deliver tight, directed experiences?
The illusion of choice was a great way to do it but it has broken some in the modern era with the internet. When you can go online and see that 'no you actually didn't change much of anything' people feel cheapened even though they were perfectly happy in the moment and in the illusion. Now that video games are a topic of discussion both in person and online people are going to share their experiences about it more.
As you point out, minor reactivity can often do the trick just fine. As long as the game is saying 'hey something you did mattered and I noticed it', and you don't try to pretend it was something super huge, it seems to have a really good effect on people. You don't need the Witcher 2 chapter split you just need 'you stole that fruit at the festival' or 'I decided to let this guy live but I thought that guy was a jerk and killed him for what he did'.
I think a lot of weird game design decisions and vestigial systems can be laid at the feet of the fact that gaming consumers and the market have certain expectations, even though the same people will also complain about those expectations.
I'm actually having a particularly bad day and just had to take some of my heaviest meds... so let's hope I can maintain that coherence!
---
Your first point is a very good one: consumers (of any media) are, in general, very bad at articulating just what it is, exactly, that they want. There's that handy old truism that "No one knows what they want until they get it."
For a very long time, I think video game players, especially, fell prey to conflating *quality* with *quantity*. Particularly in the RPG space. I'm tempted to say that this has gotten better in recent years, but then I start thinking about how incredibly bloated and overdesigned the average open-world game is, and I wonder....
I'm definitely old enough to remember when RPGs were described first and foremost by the number of hours one could invest in them. BG2 was never described to me as a great RPG because of the writing, or freedom, or exploration, or combat systems, or any of the other aspects that make it so enjoyable... but, rather, because it was "a 200 hour RPG."
With scope and scale increasing as much as it has, and more and more developers forced to rely on various shortcuts to make games as big as possible, I think players are at least a little more savvy to the fact that more content doesn't mean better content, and that *too much* content can, in fact, be a detriment to an experience. Or I hope so. Right now, for example, I'm thinking of Assassin's Creed -- Odyssey and Valhalla, especially. Games with hundreds of hours of things to do and people to talk to, set in some of the most expansive and beautifully-detailed open worlds to date... yet, in my experience at least, none of that content was especially memorable or compelling. Not once did I encounter anything that surprised me, or made me smile, or made me feel anything other than, "Gee, this sure is a decent way to keep my hands and eyes occupied while listening to this audiobook."
---
Back to the multi-path thing -- and this touches on the two different types of reactivity I mentioned before, which I guess makes this a good point to talk about them? Well, in a bit. Anyway, my argument would be that while it's true that "people will complain if they don't exist at all," even (and especially) when they're not even electing to experience those routes themselves, I don't think that invalidates the criticism. The thing is, if you've got a good route (defeat the Dragon Lord!) and an evil route (join forces with the Dragon Lord!), but both options lead to the same outcome, simply because the good route is the one most players will pick... then it's a meaningless choice. A choice only has value *because* there's a real alternative.
And, indeed, as we see in these sorts of games, those minority routes, those evil routes, are consistently executed with far less depth and care than the more conventional alternative(s). It took Larian a little over a year to patch in some more detailed "evil endings" to BG3, simply to bring those routes just a little bit closer to the more conventional endings, so I think it's pretty clear where their priorities lie.
Personally, the way I look at things, I think it's better to be ambitious and fail than to be unambitious and succeed. I think video games are a spectacular medium whose potential as a storytelling vehicle we've only barely begun to scratch the surface of, and I think there's more than enough room for both wildly ambitious, overscoped, reactive games like Baldur's Gate 3, and those smaller, "tight, directed experiences."
It all comes down to $$$. If you've got the resources to be ambitious, be ambitious; if you don't have the resources to be ambitious, be clever.
---
Okay, so about the two different kinds of reactivity... I'm honestly not sure where the best place would be to dig into this, so I'm just sort of randomly diving in here.
Basically, as I said before, I think we can broadly segregate reactivity into two categories: front and back-facing. By which I mean, there's reactivity (or perhaps 'player agency' is the better term) that is focused *before* a story-critical event, and reactivity that occurs *after.*
The latter is the more straightforward (and, arguably, also more expensive/difficult) and conventional type: there's some big plot event, and the player is given a choice to make, typically by selecting a line of dialog from among multiple options. Each dialog option, presumably, leads to a different possible permutation, branching outward from that moment.
This is the general format of a choose-your-own-adventure story, where player agency is used to determine how events play out. It is extremely difficult to do well, and very time-consuming. These "trees" can grow quite large and quite tangled, and making every possible path through as satisfying as any of the others is *extraordinarily* difficult. This is why most games built around this kind of reactivity rely so heavily on implication and illusion, I think -- they just give the player a ton of different dialog options, only a small number of which *actually* matter, but the player has no real way of knowing which options matter and which don't. I believe The Witcher 3, for example, gave the player hundreds of different dialog choices -- possibly thousands -- throughout the course of its main plot, but the actual decisions that affected the ultimate outcome of the story? As I recall, they numbered only four or five.
Front-facing reactivity is an entirely different beast, and I think it's the kind of reactivity that Larian RPGs are based on. Basically, those big plot moments are set in stone: the outcomes are fixed, but HOW you reach those outcomes is where players have the most agency. Here, the investment/resources need to go more into systems and mechanics than writing, which makes these stories easier to write, I think. To jump back to BG3, for example, the big choices the player makes in Act 1 with regard to Minthara largely happen *before* you ever meet her. How do you treat the Tiefling refugees when you meet them? How do you interact with the Goblin Camp in between you and her? You've got a ton of different options for how to approach the various problems in the game, but there are very few different outcomes.
Am I still making sense? I worry I'm not making sense. I've been wanting to write up my own essay on BG3, someday, and this is an aspect of the game I really want to explore, but I've never been quite able to articulate as well as I think I should.
Basically, it's the difference between players being given a lot of different, interesting choices *before* an encounter, versus only getting those choices *during* an encounter.
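(In case that's still too abstract, here's a loose Python sketch of the two shapes. The quest names, flags, and outcomes are all invented for illustration -- this isn't how any particular game structures its data -- but it shows the difference between a choice that forks the story afterward and a fixed outcome whose available approaches depend on what you did beforehand.)

```python
# Hypothetical sketch of the two reactivity shapes; all names are invented.

# Back-facing: the choice itself forks the story into different outcomes,
# and each branch needs its own content downstream.
def back_facing_choice(option: str) -> str:
    outcomes = {
        "spare_the_baron": "Act 2 opens in the occupied city.",
        "execute_the_baron": "Act 2 opens in the rebel camp.",
    }
    return outcomes[option]

# Front-facing: the plot-critical outcome is fixed, but the player's prior
# actions determine which approaches are even on the table when they arrive.
def front_facing_encounter(world_state: dict) -> list[str]:
    approaches = ["fight the warlord head-on"]  # always possible
    if world_state.get("befriended_goblins"):
        approaches.append("walk in unchallenged as an ally")
    if world_state.get("found_secret_tunnel"):
        approaches.append("sneak in through the cellar")
    if world_state.get("stole_war_plans"):
        approaches.append("blackmail the warlord into standing down")
    return approaches  # many ways in, one outcome

print(back_facing_choice("spare_the_baron"))
print(front_facing_encounter({"befriended_goblins": True, "stole_war_plans": True}))
```

The writing cost lives in very different places: the first shape multiplies scenes, the second multiplies systems and checks while the scene count stays flat.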
---
On the subject of people looking up in-game choices on the Internet later, and feeling as though the experience was diminished if they learn those were false or illusory choices... I agree that's an issue, but to what extent? What matters more, I wonder: what people feel about those choices in the moment, or afterward, upon reflection? My gut is to say that the former weighs more -- like I said earlier (I think), I'm really drawn to that immediate tension players get when presented with a choice, when they have to consider what the consequences might be. And, at least in my case, when I do look things up, and discover that the narrative structure of a game was deceptively simple, well... for my part, I'm more impressed than anything else. It's like magic, really: you can enjoy a trick for what it is, and even learning how it was done won't exactly invalidate the wonder you initially felt in the moment. Sometimes, even, seeing "how the sausage was made" can grant one a greater appreciation for it.
For example, if you haven't seen Penn and Teller's old SNL skit, I think that's a good example. Their trick is solid enough when you watch it, but becomes something altogether different once they reveal *how* they accomplished it: https://www.youtube.com/watch?v=mwkmgqbYXdE
(Post continued, apparently I'm talking too damned much here....)
---
Hm... I've kinda lost my train of thought. Anyway, in closing... I certainly agree with your last point about vestigial systems. Though I wouldn't necessarily place the blame *entirely* on consumers -- most game developers, themselves, are passionate video game-consumers, too. And they're going to want to create the same sorts of things they like. We're all both creating and maintaining a cycle of expectation, I think. This is how tropes are perpetuated.
And "vestigial" is definitely a good word to use -- certainly, there are a lot of systemic, mechanical and narrative tropes that seem to persist unchallenged, simply by virtue of their ubiquity. Like, one of my own personal pet peeves is the entire *concept* of "mana." Though I know some people will (quite enthusiastically) disagree, to me, it's always seemed like a redundant concept. Mana, effectively, is meant to be an asptraction of a person's life force or power -- by another word, chi, or ki, or chakra, anima, or what have you. Only we *already* have that concept abstracted in the form of HP.
Ultimately, I think it functions as a pretty boring resource-management problem, since there's seldom any real trade-off to consuming mana beyond simply having less mana to work with in the moment. Whereas if we combine these two different abstracted currencies into one, suddenly there's a more meaningful trade-off to consider, since the more of this combined resource (whatever we call it) is expended, the weaker the player will be. If you want to cast a giant spell with mana, all you have to lose is that mana, whereas if it's a combined mana/HP pool, then all of a sudden there's a risk to that choice: if your big attack doesn't work, you'll be greatly disadvantaged in the following moments. Etc., etc.
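(A tiny Python sketch of the trade-off I mean, with made-up numbers and spell costs -- not a real system from any game -- just to show how casting from the same pool that keeps you alive turns every big spell into a gamble.)

```python
# Hypothetical sketch: casting from a single shared life/power pool.
# Numbers and costs are invented for illustration.

class Caster:
    def __init__(self, vitality: int = 100):
        self.vitality = vitality  # one pool serves as both HP and "mana"

    def cast(self, spell_cost: int) -> bool:
        """Spend life force to cast; refuse if it would be suicide."""
        if spell_cost >= self.vitality:
            return False  # not enough left to survive the casting
        self.vitality -= spell_cost
        return True

hero = Caster()
hero.cast(40)               # big opening spell: down to 60 vitality
hero.vitality -= 35         # the enemy survives and hits back: 25 left
can_finish = hero.cast(30)  # the follow-up you planned is now too risky
print(hero.vitality, can_finish)  # 25 False -- the gamble didn't pay off
```

With separate pools, that opening spell costs nothing that matters defensively; with one pool, it's a bet you might not be able to cover.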
Likewise, it's interesting, I think, to note just how many action games were created in the years/decades following the widespread adoption of the DualShock-style gamepad before the breakout success of Dark Souls demonstrated so ably that using the left and right shoulder buttons for left- and right-handed weapons/tools made a lot more intuitive sense than the old method of tying face buttons to different attacks more-or-less at random. That was a design trope so deeply ingrained that it went largely unquestioned... right up until it wasn't.
But, uh... I guess I ended here on a bit of a digression. Oh well.
Again, don't worry, I can still follow what you're saying. And I did read all of it even if I'll pick and choose some to expand on here.
One thing I find interesting is your commentary of a 'meaningless choice'. I don't think there's any meaning in a binary choice if everyone is going to pick one side. I don't think the existence of another side gives any meaning to a choice. The meaning here is that we chose to interact with this content in the first place. There are so many games out there with so many premises, the fact that I am choosing to play an RPG where I am on a quest to defeat the Dark Lord and be a hero is itself intrinsically meaningful.
Similarly, if there's another game where I join forces with the Dark Lord (like say Tyranny perhaps) it's also meaningful that people play and interact with that. Rather than have a vestigial and mostly pointless evil mode in a different game why not make a fully fleshed game about being the bad guy?
To me there is no meaning in the good/evil choice because _I was never going to choose the evil path_. Or in the rare case where I say 'today I'm signing up to be evil,' I'm never choosing the good path. Meaning is only derived if I am actually tempted between options, if I have to reflect on the situation and think about whether I'd do it, and this nuance is getting a little more common these days, but not that much. I have seen choices get closer to meaning in a similar vein to BioShock, where it's 'save the girl and get X/2' vs 'kill the girl and get X,' and they try to lure you to the dark side via resources. Unfortunately, BioShock, like most games, sabotages itself by making the good path ultimately get unique and equivalent/better resources anyway. Even Frostpunk, another game that often asks you to make rough choices, simply adds more mechanical complexity that a skilled player can bypass. It means a little that I'll dump mechanical complexity on myself to help people out, but maybe not a lot.
This isn't to say you can't have meaningful choices. Take a look at Scarlet Hollow, a supremely complicated VN that is all about choice. They do 'balancing passes' on a VN, because you can choose 2 of 7 traits at the beginning, and those traits all do something interesting in the gameworld and each lets you bypass having to make a bad choice once in the game. For example, at one point you can choose whether person A or B dies, but if you picked the Strong trait you can save both. But all of its meaningful choices are driving towards the same end: you are a character who is trying to accomplish a thing. You're always going to accomplish the thing, but how you do it and your journey along the way change, and who you are is shaped by the choices you make, and you literally cannot solve everything. To me, this is meaningful.
Witcher itself has plenty of meaningful choices because they're all viewed through the lens of Geralt and there's rarely a 'happy ending / golden' choice. Sometimes there is, so you still have to stay sharp and guess, but much of the time you're choosing something to sacrifice. Sure, the ultimate story goes the same way, but the journey was a lot more than its finality. BG3 is about a lot of little side stories as well, and choices you didn't really realize you were making, people you just missed seeing (or did spot). Its main plot is again 'be the goodly hero' or 'be a real jerk'.
I'm glad you're impressed if the illusion works for you, I tend to be too, and I find I tend to enjoy games considerably more than many of my friends for it!
I think having a series of medium bads is a good way to get people hyped and talking about the story. These are villains who dominate a small area and are probably only personally hated by some of the cast (going down to one later on), who show enough villainy that you want them to go down hard, but not so much that they get Joker immunity and the fans keep insisting they be brought back long after all the plot juice has been squeezed out of them. I think you get a lot of flexibility as a writer with medium bads, as there are many types of villains who wouldn't work as the final boss of an 80-hour RPG, and you can have more of them.
For slightly more episodic stories, I really liked Dragon Quest VII: every town (which you had to do one at a time, since there was only one new town at a time) had a short story with a few new characters, a few short dungeons, and a local villain, and it stood on its own and left you wondering what the heck was going on in the next town. When we got to the end, and the dungeons got longer and everything was about another Dragon Quest demon lord, I wasn't having as much fun.
A lovely article as ever Jeff. If you haven't played it already I would recommend a play through of Darkside Detective/DD : A Fumble In The Dark. It both includes humour that works, and also really pares back what's necessary in an adventure game.
It also has a structure that I think could definitely work in RPGs - a set of distinct and not always related missions. I enjoyed it so much I rationed myself to a scenario per night. It'd be interesting to adapt that to an RPG, possibly with 'ranked missions' (tier 1/2/3... choices) to cope with the level gains inherent in most RPGs.
People will go to a particular pizzeria in town because it has the *best cheese ever*. The restaurant chose to spare no expense on that ingredient, and they reap the rewards for those that find that ingredient to be the most important in their pizza. But... others continue to go to Dominos because it's cheap and predictable.
Video games are a combination of ingredients. Extreme quality in one area can compensate for lackluster quality in another. Diablo has an "ok" little story, but that's not why (most) people play Diablo... it's for the high-production fireworks and loot-fest. Diablo doesn't try to be a story game; it tries to be the flagship loot game. But it's also a fantasy (A)RPG, so it's gotta have at least some story.
My point is... the story ingredient is one part of the pie. For some games, the story is the primary ingredient. For others, the gameplay is.
It's my opinion that the primary ingredient of your games is the world-building. It's not necessarily story itself, but the *clarity* of the world building and the stories that exist in it. I want to play more because the presentation is incredibly clear (through outstanding writing), and I truly desire to see what is around the next corner or what interesting new things are in that next location.
With the pizza analogy, of course every worthy pizza has to have some core ingredients for anyone to even give it a chance (no sauce? huh?). It's your craftsmanship in applying written word to the expected ingredients of CRPGs (monsters, character advancement, exploration) that makes your games stand apart. The combat engine works, the sound functions, the graphics get the point across (I think really well in a retro-way), but it's the world building that keeps me coming back.
So yes, if you do it (story) right, then you'll make more money. But you're speaking from the perspective of someone who is a master of RPG writing. It is primarily your skill at writing that makes your money. It is what makes your boutique company stand out, and you an artisan CRPG author.
Every video game needs at least a little bit of story. But for some, it's the main ingredient.
I also have a question for you, if you don't mind -- and perhaps you've explored this elsewhere (I am a relatively new Spiderweb fan): what are your thoughts on the unique challenges around creating memorable, compelling villains in CRPGs? A genre where it's typically very difficult to write a scene with the player character talking to the villain without needing to contrive some reason why they can't simply murder the Big Bad right then and there?
Baldur's Gate 2 got around this largely with dream sequences that kept Irenicus front-and-center despite being at a far physical remove from the player; Baldur's Gate 3, meanwhile, seldom lets you meet any of its villains without being able to end them right then and there. It seems to me that CRPGs are a uniquely difficult genre to write villains for, and I'm curious what your favorite (and least favorite) methods for getting around those problems might be.
I can't say I've yet played many of your games, but I am currently nearing the end of Queen's Wish (and greatly enjoying it) -- and I particularly like the conceit of the "Zoom Meetings" the player has with the royal family, a continent away. It's a great way to flesh out relationships, and the degree of social and political ambiguity in those relationships and interactions gives the story a fantastic degree of tension -- I'm never quite certain if I'm talking to a potential friend or a potential foe, and have to weigh every word with these far-off overseers against my more immediate actions and concerns. This one choice, I think, really elevates what would otherwise have been a fairly simplistic, episodic narrative structure.
And I'm deeply curious to see what other, similar tricks, you've used in your other games.
The hard thing about making villains in a game is that the whole thing is from the main player POV, so the villain only exists at a distance.
So you need to find a way to make the player get really familiar with the villain. A way to talk to the enemy, or see it acting, or encounter it in other ways. There are a lot of ways to do this, but you need it.
A big bad villain I really like and love to hate is the Governor from A House of Many Doors. The more you explore the world and the stories of the officers, the more you learn just how much they twisted the world into something even darker than it would have been. And when you reach the final revelation of the game (on one route), and see how smug they are when you realize just how much of a monster they are, it hits harder after they were so mysterious the whole game.
Ooh, I haven't played that one, I'll have to add it to my list.
It can certainly be very effective to characterize a villain through their impact on the world, that's for sure. That way, while you may not actually meet them until the end, you still get a really good sense of what kind of person they are. Final Fantasy VII might be the go-to example for this sort of thing, as Sephiroth is largely absent from the first act of the game, with players only really able to see his *aftermath.* Bloody corridors and impaled serpents -- all *very* evocative stuff that did a great job depicting him as a very, *very* dangerous person, without him needing to utter so much as a single line.
I absolutely loved reading this. Thank you for sharing.
The thing about needing a finite experience is absolutely me. Even if it's not a story (Although having even the vaguest story helps so much) it should have guiding goals and a spot where it says 'okay you're done with everything, playing past here is just for any personal goals you might have'. Similarly, if the story is interesting or characters compelling I can look past so many gameplay flaws. UI flaws are harder these days, so I'd definitely suggest an easy to use UI.
Speaking of The Baron in Witcher 3, or the various Persona 5 villain arcs --
There's a working structure for video game storytelling where the game can have a lot of good little self-contained stories all over the place, each with a beginning, middle, and end, even if the overall game itself meanders along forever. The Arabian Nights approach, if you will.
Bethesda and Obsidian are trying for this, even if neither of them is terribly consistent. The Like a Dragon / Yakuza games are masters at this, where the tonal whiplash between the soap opera melodrama of the main plot and the goofy low-stakes side stories somehow only makes all the side stories funnier by contrast.
How much story is “too much” story? … I guess it all depends on how good it is, which is to say, how much of a genius you are. Execution is everything, and if that could be boiled down to a useful algorithm, Concord wouldn’t have happened.
I think when it comes to particularly bad RPG stories, my goto would be Neverwinter Knights 2. Specifically to this article, too many factions, particularly in the enemy department. And they may have been “distinct” but the bloat was such that all nuance and detail was lost.
Another factor from that game I would highlight is losing the plot entirely. This is less spending time away from the story and more the story spending time away from itself. Going from point B to C had this horrible tendency to go off in directions that had so little to do with C and went on for so, 𝘴𝘰 long, you forget why C was even a thing. Combined with the “too many factions” problem and any ties back to C was lost in the scramble.
If the end goal is to climb Mt. Plotdevice, you don’t necessarily have to spend every minute marching towards it. But most of the time, it should at least be visible in the background. If the river in Distraction Canyon flows from its climactic peaks, you can still have your hole-in-the-ground fun and not have your audience forget what they came for.
Spot on - old chap! Or as we say in the realms of the Nether-Netherlands - The stoy starts to suck, if the compression is too obvious. Happens in older age more than in younger age, I guess.
So, you hit on one thing here that I really disagree with. Quite a bit, actually. This comment is gonna be a long one, so I absolutely won't blame you if you don't read it at all (in fact, I might even advise it). Perhaps it would be better if I could demonstrate some self-restraint here; alas, that is not the world we presently find ourselves in.
"It doesn't matter how good your content is if nobody sees it."
I feel like this is solid conventional wisdom for old media, but less applicable to games. If someone can't watch a movie all the way to the end? Or read a novel through to the final chapter? Yeah, that's a problem. But games are a different beast altogether, demanding a far greater investment of time and attention. And CRPGs specifically? Even moreso.
I get that on the production side of things, it can feel like wasted effort: why bother writing a quest few people are ever going to trigger? Why bother writing dialog for a character few people will ever meet? Why design a dungeon few players will ever even notice, let alone explore? It can be disheartening. But for video games -- for RPGs in particular -- I think it's necessary.
We talk a lot about "player agency" with regard to games and RPGs, but what that ultimately comes down to is simply giving the players a choice. Do you go through Door A or Door B? If both doors lead to the same corridor, it's a false choice, no more than a poorly-maintained illusion of agency. What makes that decision *meaningful* is that by choosing Door A, a player will not know what lay beyond Door B. It's a decision with a consequence.
Perhaps the biggest example of "wasted effort" expended on this sort of thing do date is The Witcher 2's (in)famous second act, which took place in an entirely different region, featuring entirely different quests and characters, depending on a single choice the player was presented with at the end of the first act -- with nothing to indicate just how far-reaching the consequences of that choice would be.
For those who played The Witcher 2 to completion, every single player missed out on an *enormous* chunk of the game. And simply *knowing* that their choices led to that outcome makes those choices more meaningful, even if, given the opportunity, they'd make the same choice again.
God knows I've played through The Witcher 2 a good three times or so, and I've never -- not once -- seen any Act 2 but the Scoia'Tel's. Similarly, I've played the big classics, the KOTORs and Baldurs' Gates and Planescape and Fallouts many times, but more often than not, I find myself making the same decisions in those crucial moments. Choices that only have meaning because I could have chosen otherwise.
So, consider turning your thinking around. You said:
"For a game this length, that is an AMAZINGLY high percentage. Yet, it's still really low for a game we're calling a classic."
But I would suggest that the real question here is why is it that a game can be a classic *despite* people generally not experiencing the thing in whole?
And I would posit it's down to the medium. Certainly, I can think of many other classic games that most players are unlikely to see through to the end. How many people have beaten every level of a Mario game, I wonder? Or gone through Dark Souls all the way to the end credits? You also brought up The Last Of Us, a relatively short game featuring a linear narrative, but even in that case, current trophy stats indicate that only around 40% of players played through to the end credits. Yet, still, it is widely-considered a classic. And, indeed, if we're going to use Achievements/Trophies as a measure here, we must also confront the odd fact that that, for the vast majority of games, few than half of the players reach the "end," wherever that may be. (And only a fraction of a fraction manage to clear games "100%".)
I don't think it's unreasonable to assume that games saw similar statistics back in the years and decades before we had these systems to track player behavior.
....
And while this may be a bit of an unnecessary aside in a comment already overlong, but on the topic of Baldur's Gate 3, it's probably important to note just how beneficial this kind of "rare" or "unseen" content has been for marketing. So much of the chatter around the game -- which continues, unabated, to this day -- is oriented around rare interactions and consequences, the kinds of things that even the most dedicated of players are unlikely to see. Hell, in my current run of BG3 (a game, by the way, I've logged 500 hours into without yet finishing -- though I *do* intend to) I'm playing as a Drow Ranger, and took the path through the Underdark out of Act 1. And I *just* saw a TikTok highlighting a unique interaction that was *only* available to Drow PCs on that path through the game that I managed to miss entirely.
I could put another 500 hours into the game and still never see the scene on my own. It is *glorious.*
....
.... .... .... ....
But, also, I want to make it clear that your point is *still* very valid. Most every form or written media can and *must* be improved by ruthlessly cutting anything that isn't essential. The best writing advice I've ever had to give students is simply this: write your essay, write your short story, then sleep on it. The next day, highlight the first paragraph, or the entire first page -- and DELETE it. Then read through it again and ask yourself, "Did it lose anything worth having?" And most of the time the answer is going to be, "No."
I firmly believe that the best habit for any writer, the best skill to have, is simply the ability to ruthlessly cut their own work. You may doubt that from my post here, which is a bit much, I freely admit -- but I can at least promise you that if I were saying this in any kind of formal context, I'd be "gently" revising the word count down to at least half its current length.
(Second best skill, as you noted, is, of course, following the KISS rule. :D)
....
.... .... ....
A couple of other small comments I'd like to make, briefly, before I draw this ordeal to a close:
Re: Bloodborne's story -- I think you're maybe falling into a common but reductive misunderstanding of what "story" is. Basically, plot is only an aspect of a story, not the story itself. And while Bloodborne's plot can indeed be described as "completely incomprehensible," I'd argue that what makes the game so memorable is how well-executed all of the *other* narrative aspects are: specifically the setting and the characters. The bosses and NPCs are all unique and memorable, despite typically having only very limited amounts of dialog; and the setting itself, why... that IS the game. This is true of all modern FROMSoft games' narratives, I think: they prioritize setting and mood most of all, and plot least of all.
Meanwhile, with cases like Minecraft, there's still a narrative at play for players to engage with, it's just something wholly emergent and player-constructed. I don't think that invalidates the narrative element of the game, however. Harkening back to 3rd grade English class, one of the first things we're ever taught is that whole "there are three kinds of conflict," thing -- and what is a survival game but an implicit story of Man vs. Nature?
....
Re: Humor, you're absolutely right, but at the same time, the advice to "be funny" is just... well, let's just say it seldom works. The easiest way for a writer *not* to be funny, in my experience, is to *try* to be funny.
Rather, I'd frame it more as writers need to learn to not be quite so serious, all the time. Especially for video games. *Especially* especially for 40+ hour RPGs. If you want players to spend a lot of time inside your constructed world, that world needs to have texture to it -- which, with writing, means different tonalities. Make some content dramatic, make some content comedic; make some content profound, make other content absurd. Take advantage of the juxtaposition in tone to make the world feel more vibrant and interesting and -- perhaps most crucially of all -- unexpected.
There's nothing more delightful than laughing -- or crying -- when you least expected to. And that's the kind of feeling that game storytelling is uniquely positioned to deliver, I think.
Thank you for the long, thoughtful comments!
I can't tell players what they should prefer. If people play 15% of a game and are happy, I'm glad for them. But if a really good, epic, acclaimed game can boast only 20% of people finishing it, and that's the GOOD result, I think something has gone very wrong. Remember, I'm thinking about this in the context of a flailing industry and out-of-control development costs.
"I think you're maybe falling into a common but reductive misunderstanding of what "story" is."
Yes, everything that happens in a game can be folded into a looser definition of story, but I think we all know what I'm getting at here. When game designers get together to hash out the story, "Story" means the formal, fixed, written part.
When we ask, "What is the story for this mess?" we aren't thinking about when Joel shoots 87 zombies. That's the game part. In a few games, game=story, but that's very few.
Ahaha... I do have a bit of a problem with talking too much.
The context here is definitely something to keep in mind -- BG3 is a ridiculously over-ambitious game, and then you hear Swen talk about all these *other* big ideas he had to cut... it was definitely over-scoped. I don't know just how much of a problem that is for games in general, but with RPGs specifically it seems to be a very common one. I can't help but think here of the hubris of the Xenosaga series -- a spiritual sequel to the (in)famously-unfinished Xenogears -- which, for some ungodly reason, they thought they'd be able to make a *six-game series* out of. Or any of the many games that end in big cliffhangers, on the assumption of getting a sequel, only for that sequel to never materialize.
As for the comment on story, yes, I'm well aware of what you meant, and I wasn't trying to say you were wrong -- rather, it's that I don't think Dark Souls was the best example to use there, as I think those are games that (generally) do an excellent job with their stories, even with the plots being as threadbare and hard-to-decipher as they are. And just speaking personally, I think it's valuable to acknowledge and appreciate games that try to craft unconventional narratives. I wasn't really considering emergent narratives (an entirely different can of worms) but rather the work that goes into art direction and level design to make a play space feel compelling and engaging.
Anyway, I think part of the issue here might simply be one of energy allocation. Even with a big, expensive game like Baldur's Gate 3, you've only got a small number of writers penning the narrative, and they're only going to have so much energy to expend in so many directions. The trend in modern RPGs certainly seems to be focusing on expanding and developing companion characters more than anything else, so I'm not exactly surprised that the "main plot," as it were, wound up being so messy -- it's not really the focus of the writers' efforts, I think.
And to an extent, that might even be justified by the completion statistics: everyone who plays BG3 will encounter the companions and spend a lot of time interacting with them; but only a fraction of that number will see the main quest through to its conclusion.
And speaking of trophy stats, just for fun, I looked up the stats for BG2 -- they're even worse. Granted, we only have this data for the Enhanced Editions, but... they're quite bad. On Steam, a whopping 70% of players haven't earned *any* achievements. That means only 3 out of every 10 players have completed the *prologue.* And only 10% of players have the achievement for beating Irenicus and clearing the game.
Does that mean that something has "gone very wrong" with BG2, too?
I guess I'm just... unconvinced that this is really that much of a problem, rather than simply being a (neutral) pattern of behavior. Most players don't complete most games, and it makes sense to me that the larger a game is, the longer it takes to clear, the smaller that demographic is going to be. Is that a problem? I don't really know. I'm certain it's very disheartening for the people who are writing those stories, designing those dungeons and monsters and what have you, that their work is only reaching a fraction of their audience... but if that work hadn't been done, how much audience would these games have in the first place?
Like I don't think the math is so simple here that we can say, "Well, if only 20% of players are reaching the end of a game, then if we cut the ending and don't waste resources on content only 20% of players will see, we'll only be losing 20% of our audience." Much like how I think decisions in RPGs can only matter if there's the potential for there to be different outcomes, I don't think many players will be willing to get invested in a story without the potential for there to be a gratifying conclusion at the end of their journey, even if they never reach it.
Thank god I'm not the only person who immediately restarted Witcher 2 to experience the stuff I missed and then.... immediately sided with the Scoia'tael again, just like my first time.
At least in my case I did end up doing a third run through in which I finally did side with the Blue Stripes, no regrets :)
So small counterpoint from me from somebody who loved Witcher 2 and played both paths. How many people who played Witcher 2 do you think actually understood how much they even missed? Insert plenty of other games here too.
In other games like Walking Dead people got the illusion of their choices mattering and were really happy about it, but some who looked too deeply were upset once they realized things didn't really change as much as they desired. But is that a problem with the game, or a problem with how a chunk of us consume game media and fret over what we could have done or missed or whatever? We demand huge sprawling things and then often don't play them or complain about bloat.
We demand 'oh you should allow people to be evil' in CRPGs but statistically almost nobody does that. Why not cut all that out and instead allow better nuanced options on how to be a good person in RPGs, you still get choices that way but I'd argue they're actually meaningful as opposed to a meaningless choice between good and evil because basically everyone will just choose 'good'.
Okay, you touched on a lot of different things that I think are really interesting to talk about... so please forgive me if I'm throwing too much text at you in response!
*ahem*
Re: The Witcher 2: I take your point, but to start with a small digression: how many TW2 players noticed that? Quite a few of them, I imagine -- this was back when the Witcher series was still relatively niche and aimed at RPG enthusiasts, who tended to be very online. At the very least I'm sure it was a significant fraction. But I recognize this digression is beside the point. I might similarly point to another RPG (and, personally, I have zero qualms calling it such) generally regarded as a classic: Majora's Mask -- a game that, arguably, disincentivized exploration with its strict time-management system. How many people who've played Majora's Mask to completion also saw all of its side-quests through, including the Anju/Kafei quest that is generally considered some of the best little bits of storytelling in the series? Not many.
What we're really hitting on here, I think, is that what matters most of all isn't so much that choices have consequences, or that players have agency, but rather that players appreciate the *illusion* of consequence, of agency, even if they don't really have as much as they think they do. Part of the reason why the simple Be Good/Be Evil dichotomy is so common in games is because it's an easy way to create that illusion, if thinly, with minimal resources spent on optional content: conversations can either end with an NPC being killed by the player, or handing over a MacGuffin; all you really need is an Ending A and an Ending B. Like in Mass Effect (1), a game generally praised for being "reactive," nothing the player says or does really affects the outcome of the story, beyond the final few choices which only affect a few minutes (possibly only a few tens-of-seconds, I don't remember) of the ending sequence. Of course, RPGs need more than that -- as you suggested, it's a duality easily tossed in favor of giving the player different options on "how to be the hero." Which, I think, is precisely what most modern RPGs do -- your Mass Effects, your Dragon Ages, your Witchers, all have a predefined idea of the player character as a generally heroic person, with the player's agency in the narrative demonstrated mostly in how they *respond* to the world rather than how their actions *affect* the world.
....
I'm sorry, I feel like this response is really scattered. There's just so much here to talk about!
On the subject of breadth and depth, for example, I don't know that audiences are really clamoring for the maximalist approach you're suggesting they are, certainly not in any great numbers. I seldom see people bemoaning, say, Skyrim for its total lack of reactivity -- they're too busy exploring the vast open world. Likewise, I've yet to see *anyone* suggest that Baldur's Gate 3 would've been better as an open-world game -- there, they're too busy exploring all of the many intricacies of its small, but deeply-reactive world.
.....
But what I really want to talk about here, speaking as a lonely writer myself, and mindful of our current context at the bottom of a Jeff Vogel blog-post, is... RESOURCE ALLOCATION. Which, I promise, is more interesting than it sounds.
Basically, all of the things people love about big RPGs are things that require a great deal of resources to deliver, which makes them complete nonstarters for many projects. And while I have no experience with creating video games, I do have some small experience with interactive storytelling (interactive nonfiction, if you'd believe it) and collaborative narrative projects -- so I'm very familiar with how quickly these endeavors can become bloated and unwieldy and fall apart under the weight of too many ambitions.
The golden rule, I think, is not that "player choice matters," as RPG fans often seem to think, but rather -- and I think I hinted at this earlier -- that "the *illusion* of player choice matters." You don't actually need an RPG to react to the player as much as they think it could or should.
Essentially, what matters is that whenever players are presented with a choice, you want them to think about what the *consequences* of that choice might be. THIS is what matters most -- the paramount experience of an RPG -- not the subsequent validation of seeing said consequence play out, or invalidation of seeing it *not* play out, or subversion of seeing it play out in an unexpected manner.
The problem is, you need to have just enough actual reactivity to maintain that illusion -- too little and it falls apart. If every negative dialog option elicits a "But Thou Must" in response, players are going to quickly assume there's no reactivity, and fail to recognize it when it happens.
One example I would point to would be Chrono Trigger, an otherwise fairly straightforward and linear JRPG, which does something rather clever in the tutorial: the player has a lot of different actions they can take or not take during a festival, all minor choices, none of which feel particularly consequential... but then PLOT happens, and you're placed on trial, and all of those choices are then interrogated. Suddenly, players realize that, all this time, they were being watched -- and judged -- by the game, and are having to face the consequences of their actions. And that little bit of reactivity casts a shadow over the entire rest of the game, lending even small choices a sense of weight they'd otherwise lack, even though there's little more reactivity in the rest of the game until the very end.
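Just to make that pattern concrete -- and this is purely a hypothetical sketch, with flag names I've invented, not anything resembling Chrono Trigger's actual code -- the trick is remarkably cheap to implement: small optional actions silently set flags, and one later scene reads them back as "testimony":

```python
# A minimal sketch (hypothetical names, my own invention) of the
# "silently record small choices, interrogate them later" pattern.

class WorldState:
    """Tracks minor player actions as flags, without ever telling the player they matter."""

    def __init__(self):
        self.flags: set[str] = set()

    def record(self, flag: str) -> None:
        self.flags.add(flag)

    def did(self, flag: str) -> bool:
        return flag in self.flags


def festival(world: WorldState) -> None:
    # During the festival, each optional action quietly sets a flag.
    world.record("returned_lost_pendant")    # the player chose to return the item
    # world.record("ate_old_mans_lunch")     # ...and skipped this bit of mischief


def trial(world: WorldState) -> list[str]:
    # Much later, the trial scene reads those flags back as testimony.
    testimony = []
    if world.did("returned_lost_pendant"):
        testimony.append("A witness saw the accused return a lost pendant.")
    if world.did("ate_old_mans_lunch"):
        testimony.append("The accused stole an old man's lunch!")
    return testimony


world = WorldState()
festival(world)
print(trial(world))   # -> ['A witness saw the accused return a lost pendant.']
```

A handful of booleans and one payoff scene -- and yet it buys you that shadow of consequence over the next thirty hours.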
Basically, you want that tension in every scene: that things *might* matter more than players suspect.
It's the video game equivalent of a DM just randomly rolling some dice in the background to keep their players nervous, or smiling or laughing to themselves as though they know something the players don't (and maybe should have noticed) when in reality the only thought in their brain is how they should've ordered the pizzas a half-hour earlier.
(I think there's also an entire other conversation we could have on different kinds of reactivity, which I could broadly categorize as front-facing and back-facing, which are *dramatically* different to implement, but I'll save that for another time.)
My apologies if this comment is a bit hard to parse, or too scattershot. My health's been a bit poor of late, and some of the medications I'm on can affect my mental state. I hope it was all sufficiently coherent, at least? I just have a lot of fun talking about these things, so I really couldn't wait for my mind to be clearer. :D
You were wordy but coherent, don't worry.
What I was alluding to in my reply is that what gamers and purchasers say they want from games and what actually causes them to purchase and appreciate games tend to diverge from each other.
Statistically, people don't buy short games as much as long ones, even though I think, also statistically, people enjoy and finish short games more than long ones. This leaves game developers in an awkward place where they're developing a lot of content that goes entirely unseen, because people demand that it be present so they can not interact with it at all. Because people will vocally say, 'This game wasn't good value for my dollar, so I didn't buy it.' Good-versus-evil content is one of those things, because something like 89% of people will choose good and 11% will choose evil. So evil paths are typically stunted and unsatisfying even for people who actually enjoy the play of being evil, but people will complain if they don't exist at all.
To bring it back to the BG3 discussion and how big it is: apparently only 4-5% of people did like me and chose to play an Origin Character as their Avatar and control them, and they put an absolutely nonsense amount of work into it. It's lovely that Larian was able to do that and I had a lot of fun with it, but wouldn't we have a lot more good games if companies weren't expected to do all this and could just deliver tight, directed experiences?
The illusion of choice was a great way to do it but it has broken some in the modern era with the internet. When you can go online and see that 'no you actually didn't change much of anything' people feel cheapened even though they were perfectly happy in the moment and in the illusion. Now that video games are a topic of discussion both in person and online people are going to share their experiences about it more.
As you point out, minor reactivity can often do the trick just fine. As long as the game is saying 'hey something you did mattered and I noticed it', and you don't try to pretend it was something super huge, it seems to have a really good effect on people. You don't need the Witcher 2 chapter split you just need 'you stole that fruit at the festival' or 'I decided to let this guy live but I thought that guy was a jerk and killed him for what he did'.
I think a lot of weird game design decisions and vestigial systems can be laid at the feet of the expectations that gaming consumers and the market hold, even though the same people will also complain about those expectations.
I'm actually having a particularly bad day and just had to take some of my heaviest meds... so let's hope I can maintain that coherence!
---
Your first point is a very good one: consumers (of any media) are, in general, very bad at articulating just what it is, exactly, that they want. There's that handy old truism that "No one knows what they want until they get it."
For a very long time, I think video game players, especially, fell prey to conflating *quality* with *quantity*. Particularly in the RPG space. I'm tempted to say that this has gotten better in recent years, but then I start thinking about how incredibly bloated and overdesigned the average open-world game is, and I wonder....
I'm definitely old enough to remember when RPGs were described first and foremost by the number of hours one could invest in them. BG2 was never described to me as a great RPG because of the writing, or freedom, or exploration, or combat systems, or any of the other aspects that make it so enjoyable... but, rather, because it was "a 200 hour RPG."
With scope and scale increasing as much as they have, and more and more developers forced to rely on various shortcuts to make games as big as possible, I think players are at least a little more savvy to the fact that more content doesn't mean better content, and that *too much* content can, in fact, be a detriment to an experience. Or I hope so. Right now, for example, I'm thinking of Assassin's Creed -- Odyssey and Valhalla, especially. Games with hundreds of hours of things to do and people to talk to, set in some of the most expansive and beautifully-detailed open worlds to date... yet, in my experience at least, none of that content was especially memorable or compelling. Not once did I encounter anything that surprised me, or made me smile, or made me feel anything other than, "Gee, this sure is a decent way to keep my hands and eyes occupied while listening to this audiobook."
---
Back to the multi-path thing -- and this touches on the two different types of reactivity I mentioned before, which I guess makes this a good point to talk about them? Well, in a bit. Anyway, my argument would be that while it's true that "people will complain if they don't exist at all," even (and especially) when they're not electing to experience those routes themselves, I don't think that invalidates the criticism. The thing is, if you've got a good route (defeat the Dragon Lord!) and an evil route (join forces with the Dragon Lord!), but both options lead to the same outcome simply because that's the option most players will pick... then it's a meaningless choice. A choice only has value *because* there's a real alternative.
And, indeed, as we see in these sorts of games, those minority routes, those evil routes, are consistently executed with far less depth and care than the more conventional alternative(s). It took Larian a little over a year to patch in some more detailed "evil endings" for BG3, simply to bring those routes just a little bit closer to the more conventional endings, so I think it's pretty clear where their priorities lie.
Personally, the way I look at things, I think it's better to be ambitious and fail than to be unambitious and succeed. I think video games are a spectacular medium whose potential as a storytelling vehicle we've only barely begun to scratch the surface of, and I think there's more than enough room for both wildly ambitious, overscoped, reactive games like Baldur's Gate 3, and those smaller, "tight, directed experiences."
It all comes down to $$$. If you've got the resources to be ambitious, be ambitious; if you don't have the resources to be ambitious, be clever.
---
Okay, so about the two different kinds of reactivity... I'm honestly not sure where the best place would be to dig into this, so I'm just sort of randomly diving in here.
Basically, as I said before, I think we can broadly segregate reactivity into two categories: front and back-facing. By which I mean, there's reactivity (or perhaps 'player agency' is the better term) that is focused *before* a story-critical event, and reactivity that occurs *after.*
The latter is the more straightforward (and, arguably, also more expensive/difficult) and conventional type: there's some big plot event, and the player is given a choice to make, typically by selecting a line of dialog from among multiple options. Each dialog option, presumably, leads to a different possible permutation, branching outward from that moment.
This is the general format of a choose-your-own-adventure story, where player agency is used to determine how events play out. It is extremely difficult to do well, and very time-consuming. These "trees" can grow quite large and quite tangled, and making every possible path through them as satisfying as any of the others is *extraordinarily* difficult. This is why most games built around this kind of reactivity rely so heavily on implication and illusion, I think -- they just give the player a ton of different dialog options, only a small number of which *actually* matter, but the player has no real way of knowing which options matter and which don't. I believe The Witcher 3, for example, gave the player hundreds of different dialog choices -- possibly thousands -- throughout the course of its main plot, but the actual decisions that affected the ultimate outcome of the story? As I recall, they numbered only four or five.
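If it helps to picture what I mean, here's a tiny hypothetical sketch (my own toy structure, not anything from an actual engine) of that kind of tree: plenty of options, most of them funneling into the same next node, with only the odd one quietly setting a variable the ending actually reads:

```python
# Hypothetical sketch of a "mostly illusory" dialog tree: many options, few real consequences.
from dataclasses import dataclass, field

@dataclass
class Choice:
    text: str
    next_node: str
    effects: dict = field(default_factory=dict)   # most choices carry no effects at all

@dataclass
class Node:
    line: str
    choices: list

DIALOG = {
    "confrontation": Node(
        line="The warlord sneers at you.",
        choices=[
            Choice("Threaten him.", "parley"),                     # flavor only
            Choice("Flatter him.", "parley"),                      # flavor only
            Choice("Say nothing.", "parley"),                      # flavor only
            Choice("Offer the amulet.", "parley",
                   effects={"warlord_owed_favor": True}),          # one of the rare ones that matters
        ],
    ),
}

def ending(state: dict) -> str:
    # The epilogue only checks the handful of variables a real decision ever touched.
    return "ally_ending" if state.get("warlord_owed_favor") else "war_ending"

print(ending({"warlord_owed_favor": True}))   # -> ally_ending
```

Four options, one variable -- and the player at the keyboard has no way of knowing which was which.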
Front-facing reactivity is an entirely different beast, and I think it's the kind of reactivity that Larian RPGs are based on. Basically, those big plot moments are set in stone: the outcomes are fixed, but HOW you reach those outcomes is where players have the most agency. Here, the investment/resources need to go more into systems and mechanics than writing, which makes these stories easier to write, I think. To jump back to BG3, for example, the big choices the player makes in Act 1 with regard to Minthara largely happen *before* you ever meet her. How do you treat the Tiefling refugees when you meet them? How do you interact with the Goblin Camp in-between you-and-her? You've got a ton of different options for how to approach the various problems in the game, but there are very few different outcomes.
Am I still making sense? I worry I'm not making sense. I've been wanting to write up my own essay on BG3 someday, and this is an aspect of the game I really want to explore, but I've never been quite able to articulate it as well as I think I should.
Basically, it's the difference between players being given a lot of different, interesting choices *before* an encounter, versus only getting those choices *during* an encounter.
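To sketch the contrast in the same toy terms (again, purely hypothetical -- this isn't how Larian actually structures anything): in the front-facing version, the encounter's outcomes are fixed and pre-authored, and what varies is which of them the player's accumulated world state has already earned by the time they arrive:

```python
# Hypothetical sketch of front-facing reactivity: a fixed set of outcomes,
# with the player's *prior* actions deciding which pre-authored path they get.

def raid_encounter(world: dict) -> str:
    # The event itself always happens here; only the approach differs.
    if world.get("raider_leaders_dead"):
        return "The raid never comes; you arrive to a quiet grove."   # earned beforehand
    if world.get("sided_with_raiders"):
        return "You walk in at the raiders' side."                    # also earned beforehand
    return "You defend the gate as the raid begins."                  # the default path

world = {"raider_leaders_dead": False, "sided_with_raiders": True}
print(raid_encounter(world))   # -> You walk in at the raiders' side.
```

Same destination either way; the agency all got spent on the road leading up to it.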
---
On the subject of people looking up in-game choices on the Internet, later, and feeling as though the experience was diminished when they learn those were false or illusory choices... I agree that's an issue, but to what extent? What matters more, I wonder: what people feel about those choices in the moment, or afterward, upon reflection? My gut is to say that the former weighs more -- like I said earlier (I think), I'm really drawn to that immediate tension players get when presented with a choice, when they have to consider what the consequences might be. And, at least in my case, when I do look things up and discover that the narrative structure of a game was deceptively simple, well... for my part, I'm more impressed than anything else. It's like magic, really: you can enjoy a trick for what it is, and even learning how it was done won't exactly invalidate the wonder you initially felt in the moment. Sometimes, even, seeing "how the sausage was made" can grant one a greater appreciation for it.
For example, if you haven't seen Penn and Teller's old SNL skit, I think that's a good example. Their trick is solid enough when you watch it, but becomes something altogether different once they reveal *how* they accomplished it: https://www.youtube.com/watch?v=mwkmgqbYXdE
(Post continued, apparently I'm talking too damned much here....)
---
Hm... I've kinda lost my train of thought. Anyway, in closing... I certainly agree with your last point about vestigial systems. Though I wouldn't necessarily place the blame *entirely* on consumers -- most game developers, themselves, are passionate video game-consumers, too. And they're going to want to create the same sorts of things they like. We're all both creating and maintaining a cycle of expectation, I think. This is how tropes are perpetuated.
And "vestigial" is definitely a good word to use -- certainly, there are a lot of systemic, mechanical and narrative tropes that seem to persist unchallenged, simply by virtue of their ubiquity. Like, one of my own personal pet peeves is the entire *concept* of "mana." Though I know some people will (quite enthusiastically) disagree, to me, it's always seemed like a redundant concept. Mana, effectively, is meant to be an asptraction of a person's life force or power -- by another word, chi, or ki, or chakra, anima, or what have you. Only we *already* have that concept abstracted in the form of HP.
Ultimately, I think it functions as a pretty boring resource-management problem, since there's seldom in real trade-off to consuming mana beyond simply having less mana to work with in-the-moment. Whereas if we combine these two different abstracted currencies into one, suddenly there's a more meaningful trade-off to consider, since the more of whatever this combined resource is call is expended, the weaker the player will be. If you want to cast a giant spell with mana, all you have to lose is that mana, whereas if it's combined mana/HP, then all of a sudden there's a risk to that choice: if your big attack doesn't work, you'll be greatly disadvantaged in the following moments. Etc., etc.
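To put that trade-off in concrete terms -- a deliberately toy sketch with numbers I've made up, not a claim about any particular game -- if spells draw from the same pool that keeps you alive, every big cast becomes a gamble rather than mere bookkeeping:

```python
# Toy sketch of a single combined life/power resource, so that casting carries real risk.

class Hero:
    def __init__(self, vitality: int = 100):
        self.vitality = vitality   # one pool doubles as both HP and "mana"

    def cast(self, cost: int) -> bool:
        """Spend vitality to cast; refuse only if it would drop the caster to zero."""
        if self.vitality - cost <= 0:
            return False
        self.vitality -= cost
        return True

hero = Hero()
if hero.cast(cost=60):        # the big spell leaves you at 40 vitality...
    print(hero.vitality)      # ...so if it misses, the next hit may well finish you
```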
Likewise, it's interesting, I think, to note just how many action games were created in the years/decades following the widespread adoption of the DualShock-style gamepad before the breakout success of Dark Souls demonstrated so ably that using the left and right shoulder buttons for left- and right-handed weapons/tools made a lot more intuitive sense than the old method of tying face buttons to different attacks more-or-less at random. That was a design trope so deeply ingrained that it went largely unquestioned... right up until it wasn't.
But, uh... I guess I ended here on a bit of a digression. Oh well.
Again, don't worry, I can still follow what you're saying. And I did read all of it even if I'll pick and choose some to expand on here.
One thing I find interesting is your commentary of a 'meaningless choice'. I don't think there's any meaning in a binary choice if everyone is going to pick one side. I don't think the existence of another side gives any meaning to a choice. The meaning here is that we chose to interact with this content in the first place. There are so many games out there with so many premises, the fact that I am choosing to play an RPG where I am on a quest to defeat the Dark Lord and be a hero is itself intrinsically meaningful.
Similarly, if there's another game where I join forces with the Dark Lord (like say Tyranny perhaps) it's also meaningful that people play and interact with that. Rather than have a vestigial and mostly pointless evil mode in a different game why not make a fully fleshed game about being the bad guy?
To me there is no meaning to the good/evil choice because _I was never going to choose the evil path_. Or in the rare case where I say 'today I'm signing up to be evil,' I'm never choosing the good path. Meaning is only derived if I am actually tempted between options, if I have to reflect on the situation and think about whether I'd do it, and this nuance is getting a little more common these days, but not that much. I have seen choices get closer to meaning in a vein similar to Bioshock, where it's 'save the girl and get X/2' vs 'kill the girl and get X,' and they try to lure you to the dark side via resources. Unfortunately, Bioshock, like most games, sabotages itself by making the good path ultimately yield unique and equivalent/better resources anyway. Even Frostpunk, another game that often asks you to make rough choices, simply adds more mechanical complexity that a skilled player can bypass. It means a little that I'll dump mechanical complexity on myself to help people out better, but maybe not a lot.
This isn't to say you can't have meaningful choices. Take a look at Scarlet Hollow, a supremely complicated VN that is all about choice. They have 'balancing passes' on a VN, because you can choose 2 of 7 traits at the beginning and those traits all do something interesting in the gameworld and let you bypass having to make a bad choice once in the game. For example at one point you can choose whether person A or B dies but if you picked the Strong trait you can save both. But all of its meaningful choices are driving towards the same end, you are a character that is trying to accomplish a thing. You're always going to accomplish the thing, but how you do it and your journey along the way changes, and who you are is shaped by the choices you make, and you literally cannot solve everything. To me, this is meaningful.
Witcher itself has plenty of meaningful choices because they're all viewed through the lens of Geralt and there's rarely a 'happy ending / golden' choice. Sometimes there is, so you still have to stay sharp and guess, but many times you're always choosing something to sacrifice. Sure the ultimate story goes the same way but the journey was a lot more than its finality. BG3 is about a lot of little side stories as well and choices you didn't really realize you were making, people you just missed seeing (or did spot). Its main plot is again 'be the goodly hero' or 'be a real jerk'.
I'm glad you're impressed if the illusion works for you, I tend to be too, and I find I tend to enjoy games considerably more than many of my friends for it!
I think having a series of medium Bads is a good way to get people hyped and talking about the story. These are villains who dominate a small area and are probably only personally hated by some of the cast (narrowing down to one later on), who show enough villainy that you want them to go down hard, but not enough that they get Joker immunity and the fans keep insisting they be brought back long after all the plot juice has been squeezed out of them. I think you get a lot of flexibility as a writer with medium Bads, as there are many types of villains who wouldn't work as the final boss of an 80-hour RPG, and you can have more of them.
For shorter, more episodic stories, I really liked Dragon Quest VII. Every town -- which you had to do one at a time, as there was only ever one new town available -- had a short story with a few new characters, a few short dungeons, and a local villain; each stood on its own and left you wondering what the heck was going on in the next town. When we got to the end, and the dungeons got longer and everything was about another Dragon Quest demon lord, I wasn't having as much fun.
A lovely article as ever Jeff. If you haven't played it already I would recommend a play through of Darkside Detective/DD : A Fumble In The Dark. It both includes humour that works, and also really pares back what's necessary in an adventure game.
It also has a structure that I think could definitely work in RPGs - a set of distinct and not always related missions - which I enjoyed so much that I rationed myself to a scenario per night. It'd be interesting to adapt that to an RPG, possibly with 'ranked missions' (tier 1/2/3.. choices) to cope with the level gains inherent in most RPGs.
People will go to a particular pizzeria in town because it has the *best cheese ever*. The restaurant chose to spare no expense on that ingredient, and they reap the rewards for those that find that ingredient to be the most important in their pizza. But... others continue to go to Dominos because it's cheap and predictable.
Video games are a combination of ingredients. Extreme quality in one area can compensate for lackluster quality in another. Diablo has an "ok" little story, but that's not why (most) people play Diablo... it's for the high-production fireworks and loot-fest. Diablo doesn't try to be a story game; it tries to be the flagship loot game. But it's also a fantasy (A)RPG, so it's gotta have at least some story.
My point is... the story ingredient is one part of the pie. For some games, the story is the primary ingredient. For others, the gameplay is the primary ingredient.
It's my opinion that the primary ingredient of your games is the world-building. It's not necessarily story itself, but the *clarity* of the world building and the stories that exist in it. I want to play more because the presentation is incredibly clear (through outstanding writing), and I truly desire to see what is around the next corner or what interesting new things are in that next location.
With the pizza analogy, of course every worthy pizza has to have some core ingredients for anyone to even give it a chance (no sauce? huh?). It's your craftsmanship in applying written word to the expected ingredients of CRPGs (monsters, character advancement, exploration) that makes your games stand apart. The combat engine works, the sound functions, the graphics get the point across (I think really well in a retro-way), but it's the world building that keeps me coming back.
So yes, if you do it (story) right, then you'll make more money. But you're speaking from the perspective of someone who is a master of RPG writing. It is primarily your skill at writing that makes your money. It is what makes your boutique company stand out, and you as an artisan CRPG author.
Every video game needs at least a little bit of story. But for some, it's the main ingredient.
I also have a question for you, if you don't mind -- and perhaps you've explored this elsewhere (I am a relatively new Spiderweb fan): what are your thoughts on the unique challenges around creating memorable, compelling villains in CRPGs? A genre where it's typically very difficult to write a scene with the player-character talking to the villain without needing to contrive some reason why they can't simply murder the Big Bad right then and there.
Baldur's Gate 2 got around this largely with dream sequences that kept Irenicus front-and-center despite being at a far physical remove from the player; Baldur's Gate 3, meanwhile, seldom lets you meet any of its villains without being able to end them right then and there. It seems to me that CRPGs are a uniquely difficult genre to write villains for, and I'm curious what your favorite (and least favorite) methods for getting around those problems might be.
I can't say I've yet played many of your games, but I am currently nearing the end of The Queen's Wish (and greatly enjoying it) -- and I particularly like the conceit of the "Zoom Meetings" the player has with the royal family, a continent away. It's a great way to flesh out relationships, and the degree of social and political ambiguity in those relationships and interactions gives the story a fantastic degree of tension -- I'm never quite certain if I'm talking to a potential friend or a potential foe, and I have to weigh every word with these far-off overseers against my more immediate actions and concerns. This one choice, I think, really elevates what would otherwise have been a fairly simplistic, episodic narrative structure.
And I'm deeply curious to see what other, similar tricks, you've used in your other games.
The hard thing about making villains in a game is that the whole thing is from the main player POV, so the villain only exists at a distance.
So you need to find a way to make the player get really familiar with the villain. A way to talk to the enemy, or see them acting, or encounter them in other ways. There's a lot of ways to do this, but you need it.
A big bad villain I really like and love to hate is the governor from A House of Many Doors. The more you explore the world and the stories of the officers, the more you learn just how much they twisted the world into something even darker than it would have been. And when you get the final revelation of the game (on one route), and see how smug they are once you realize just how much of a monster they are, it hits harder for how mysterious they were the whole game.
Ooh, I haven't played that one, I'll have to add it to my list.
It can certainly be very effective to characterize a villain through their impact on the world, that's for sure. That way, while you may not actually meet them until the end, you still get a really good sense of what kind of person they are. Final Fantasy VII might be the go-to example for this sort of thing, as Sephiroth is largely absent from the first act of the game, with players only really able to see his *aftermath.* Bloody corridors and impaled serpents -- all *very* evocative stuff that did a great job depicting him as a very, *very* dangerous person, without him needing to utter so much as a single line.