Just as cinematography and visual effects are used to tell a story in film and television, gameplay and graphics should be an important storytelling component in video games. So why do some developers refuse to acknowledge the importance of gameplay as a storytelling aspect?

So why do some video game developers still brush off the importance of gameplay, even though gameplay has proven to be the most important aspect of storytelling in video games?

This is a very big and very loaded question, I think. It assumes two things: that gameplay is the most important method of telling a story in a game, and that developers are ignoring it despite this being patently obvious.

Let's start with assumption #1. Frankly, I believe that since only games have gameplay, gameplay should be front and center in games in a way it's not in passive media. That's just what makes them special, so it should be emphasized. However, that's really not the only reasonable approach.

Imagine you wanted to make an animated movie, but one with branching paths the viewer could choose for the story. Unfortunately, no such movie is really possible; the technology simply isn't there. But with a game, you can. In fact, most story-driven games occupy exactly this space: barf forth some story for the player to passively absorb, task the player with some test of skill whose results decide the next bit of story from a few alternatives (or even no alternatives), and then vomit out more story for the player to passively absorb. You may recognize this as the cutscene-gameplay-cutscene model of storytelling in games. It feels janky and disconnected in theory, but in practice it works surprisingly well.

After all, pretty much all the recent third person AAA action games do this to a large degree (Tomb Raider, Uncharted, The Last of Us, Shadow of Mordor, and so on). Many first person shooters do, too.

Now, is this somehow less of a game? Not really. Maybe it uses the capabilities of games in a surprisingly limited way where storytelling is concerned, but we don't call Transformers an awful movie just because the whole thing is shot with that orbiting camera Michael Bay loves so much. Rather, it's a certain sort of film made for a certain sort of audience that appreciates a certain sort of experience, one that happens to use only certain tools in the film-maker's tool-kit, like orbiting camera shots that make things feel big and epic.

Similarly, it seems a lot of people really love "give me the story in passive chunks, separated by semi-associated mechanical challenges." The mechanical challenges don't even have to be any good: BioShock's shooting is notoriously awful, and it makes up most of the gameplay, yet the game holds a 96 critic score and an 88 user score on Metacritic for the 360, with not much different on any other platform. Obviously, a good story - even sans good mechanics - sells. This should be no surprise: we love books and movies, so a book or movie with a little extra player input is still going to be a hit if its story is good.

So, in the first place: while I agree that a truly great game should maximize what makes games unique as a medium, an awful lot of people are clearly satisfied with less extensive use of those unique qualities, and that's not really a problem. A good story is a good story, even if it isn't maximizing games' potential, and there's nothing wrong with that.

In the second place, devs are often either just like the people in their audience who praise stories regardless of mechanics, or else they recognize that there are a damn lot of those people out there. In neither case are they "brushing off" gameplay - they are making a conscious decision to favor one thing over another, and not necessarily an unfounded one.

To quote Plinkett from Red Letter Media when comparing Citizen Kane to the Star Wars Prequels:

RLM is a comedy venture. They're pretty mediocre as actual criticism, so I wouldn't rely on them to make your point for you. I don't think what they have to say is really relevant here anyway: the criticism you're leveling at games isn't that they use their stories to show off their gameplay, or their gameplay to show off their stories - it's that their gameplay is a nonexistent afterthought. That's more like saying "directors just use shots of talking heads to tell their story when they have so many more tools available," which was famously a criticism Alfred Hitchcock leveled at many directors.

And yet, many directors continue to do dialogue this way. It's a commonly accepted practice, found in tons of films, all the time. Was Hitchcock right that there are more ways to do it? Would those ways advance film more as a medium? Probably. But there's nothing necessarily wrong with talking-head shots, just like there's nothing necessarily wrong with "here's the story, with some buttons to push to keep you interested."

The ability to interact with people, items and creatures plays a huge role in the gamer's understanding of the plot.

In Half-Life 2? The game where literally the entire plot is fed to you through dialogue, generally from tag-along companions? It's hard to call "sit here and listen to this dialogue" storytelling through gameplay, honestly.

If you want to make a good story, shouldn't you try and make good gameplay?

I think a realm of games you may be more interested in is sandbox games: games whose mechanics are designed to create a story, rather than to tell one.

Honestly, mechanics that effectively tell a story are still a huge mess these days. Nobody really knows how to pull them off in a way unique to games. Recently we've seen the "walking cutscene" - you retain control and walk somewhere while getting talked at - take off. It's perhaps a step in the right direction, but it's still basically just a cutscene, which is to say, still a movie.

The reason you see this, I think, is that games are made on principles, not on a set of laws. Some people want to be immersed in a story. Interacting with the story lightly is more immersive than passively absorbing it, so these people prefer stuff like QTEs to a movie - but for them immersion is paramount, so dying drags them out of the story. Meanwhile, a lot of other people hold challenge to be paramount: they want to solve problems, that's gaming to them, and maybe the story just creates a context in which solving problems makes sense. They want to be able to lose - to them, that's not immersion-breaking, perhaps because they don't care about that sort of immersion. Sometimes you see these things combined: in Prince of Persia, when you die, the Prince recounting the story says "no, no, that's not how it happened, let me start again," and in Dark Souls, dying and respawning is woven into the story itself. But often they aren't combined - and perhaps can't be.

Different theories of what makes games good hold that there are a dozen or more characteristics people find fun in games, and these often work at cross purposes. When you see someone "ignoring" gameplay, nine times out of ten they simply have a different set of priorities than you - they're not somehow doing it wrong.

/r/Games Thread