Latest News

Tuesday, October 9, 2018

We need to stop defining "progress" in games as "better hardware"


Opinion by Matt S.

"Serious gamers need serious performance. Get ready for a new standard in gaming."

This ad on Twitter, by Intel, announcing some new hardware or other that it had developed, reminded me to get these long overdue thoughts down on the virtual paper: this idea that improvements and progress in gaming are tied to developing better hardware is utter nonsense. In fact, it's downright damaging to video games as an artistic medium.

Think about just about every other artistic medium; painters continue to use canvas as standard. Books have had their approximate format for a few centuries now. Even photography and film, which are themselves mediums closely related to technology, have long shifted the conversation away from the latest technological gimmicks and on to the substance of what's being shown.



But not games. Oh no. There's still this assumption that "better graphics" and "more processing power" inherently result in a better quality game. As though games produced on newer hardware render the older experiences obsolete. But that's as ridiculous as saying the Mona Lisa no longer has value because people can use Adobe Photoshop. It's an absurd argument, but it's also a very common one.

Unfortunately in games, we have, as a community, allowed technology companies to dictate the terms on which we talk about progress. There's a historical reason for this; from early PCs to consoles, the NES to the SNES, the PlayStation 1 to the PlayStation 2, the earlier hardware always functioned as an inhibitor to creativity. The SNES couldn't render 3D worlds, for example. The PlayStation 2 wasn't capable of creating a genuine sense of realism. For this reason, each new "generation" of hardware also brought with it an expanded set of tools and creative options for developers, and that automatically resulted in a wider range of games being available.

Indeed, the whole concept of "hardware wars" wasn't just driven by people who liked Nintendo, SEGA, Sony and Microsoft's respective logos. Back in the day, the games you could play were determined to an extreme extent by the hardware that the manufacturers provided.


But that's no longer the case. With both the PlayStation 3 "generation" and now the PlayStation 4, the hardware has not significantly limited the creative ambition of ambitious developers. The Last of Us on PS3 strove for - and achieved - "realism". Mass Effect provided epic space operas. Assassin's Creed on PS3 demonstrated the now-standard approach to open worlds. Yes, the PS4 has provided iterative improvements, so developers can do everything they did before, only better, but as far as creative enablement goes, I can't think of much that has been done on PS4 that wasn't done in some form or other on PS3. No Man's Sky, perhaps, with the scope of its procedural generation?

This isn't a criticism of the PlayStation 4, or what developers have achieved on it. Rather, to me it's a sign that we should be redefining what counts as "progress" in games. Much like what happened with novels, once authors figured out how to maximise their readability (I can assure you, really early novels are genuinely difficult to read, because the formatting and structure are so odd to us now), or film, once the basic rules of filmmaking were set in stone, we should be shifting our idea of "progress" in games towards the themes they express and the stories they tell (or encourage, in the case of emergent storytelling). The way they make people think and feel, rather than which one has a bigger world or "more realistic graphics".

The PlayStation 6, once it's announced, won't be progress. It'll be a slightly bigger canvas with more bells and whistles, but developers will be using it in the same way as they have before. Rather, for an example of progress, we should be looking at NieR: Automata, which was not only a remarkably popular and well-structured game, but also one that challenged players to think a little more about what they're playing, and play a little differently. NieR: Automata wouldn't have been better if it were on the PlayStation 6, or Intel's fancy new chip, because NieR: Automata's strengths as an experience are completely independent of the way it uses technology.


Of course, Intel is a technology company, with a vested interest in convincing people they have to buy its latest products. You can't blame the company for trying to link its products to the progress of the games industry. But that's just the point. Intel is not an arts supplier. It's not a publishing company. Intel's actual role in the arts is tangential, at best. And like every technology company I've ever encountered, those leading Intel and its product design are almost certainly not approaching their work from an artistic development point of view.

So, this is another challenge to the games industry and community. If we accept that video games are indeed an artistic medium, then we also need to shift the language about how we talk about progress in games. Because "progress" is never found in improving the canvas. Progress is found in deepening the conversations about the art itself.

- Matt S.
Editor-in-Chief
Find me on Twitter: @digitallydownld


Please help keep DDNet running: Running an online publication isn't cheap, and it's highly time consuming. Please help me keep the site running and providing interviews, reviews, and features like this by supporting me on Patreon. Even $1/month would be a hugely appreciated vote of confidence in the kind of work we're doing. Please click here to be taken to my Patreon, and thank you for reading and your support!