Back Then

How do we talk about old games?

The monkey’s paw has certainly curled for me a little bit re: whining about video games online. It seems that there are more people these days than just myself submitting their thoughts on games as art to the world, but not necessarily in a way I would deem…”constructive.” See, through developing my own process and studying many different media, I feel like I’ve only become a little less clear on what exactly, if anything, the primary goal of “creativity” is. I don’t believe there is any one answer or ideal. In my “Artists Only” writings, I think you can see a clear progression away from strict idealism. It’s true, I hold expression through art as a very valuable thing and a mindset I personally wish to pursue ostensibly forever, but what one way is there to do it? How sacrosanct is the idea of “art,” really, when you stop to consider how much of the “art world” has come to be only through bourgeois elitism or other abstractions?

These are big and almost meaningless questions for how incoherent they and their responses may be. I have come to find it much more reasonable and in fact helpful to my own brain to think less about picking an ideal and judging everything against it. Art, like anything, should be approached dialectically, much in the same way it was (knowingly or not) created. For games, many people (including me) struggle with idealism a bit. Intrusive ideas about marketing and journalism have introduced modern players to an environment devoid of criticism, analysis, or thought. Whether we see games “as art” is only the first step, because once you see them as art, then what? If they are meant to be expressive or instructive, what are they expressing or instructing? Surely there must be more than arguing about “good writing” all day.

In “Je Vous Salue, Midgar,” I posited that the development of atmosphere, and the attempt to holistically immerse a player in it, is an ideal, and perhaps the highest one at present. There is obviously a large part of me that believes that, and I see certain games taking certain approaches as more valuable–things that catalyze political discussion, teach us things about our world and our beliefs, provide unique perspectives and styles never before considered, etc. But even I must admit that there are a great many games that do not take this approach and still hold significant value to myself and to the form. I don’t rebuke my previous sentiments, and I recall saying that some games are certainly intended as straightforward toys or puzzles and are very adept at being just that, which is not wrong in any way. Many multiplayer games, like fighting games or deck-builders or competitive RTS, aren’t lacking in depth or value. In a less direct way, they can certainly teach us things about our own proclivities, thought processes, and communities, albeit in less novel ways than I believe the form may be capable of.

How do we begin to discuss things of this nature? If there are other ideals than intellectualism, than immersion, than instruction, which there certainly are, then what are they? This is where it starts to break down a bit for me, at least from what I have seen. Stepping away from my own biases and looking at less lofty, less “serious” games with a more open mind, trying to extract more value from them than simple “fun,” is a rewarding process indeed. I’m just not sure if everyone who claims to be on board with some of these games really is on board! Arcade games seem to be in the crosshairs regularly these days, across genres and design philosophies of many kinds. To me, this is an interesting case: I have previously made comments about such games from the good ol’ days as being designed with coinage and artificial difficulty in mind, but this is obviously too reductive. In fact, many people young and old (especially on Backloggd, which is sort of a satanic and evil website) have identified that this is a common mindset held against arcade games, which is unfair. The strawman shoggoth of “quarter munchers” seems to represent a modern audience’s dismissal of these games, a refusal to acknowledge that they may have contained depth well beyond what people who didn’t experience them could understand in retrospect.

My basic dialectical approach to this would be to consider how my own, perhaps shaky thesis (arcade games may be less valuable than more modern games) can be synthesized with a typical counterargument (arcade games are actually better than you think, and perhaps even better than modern games). Though arcade games were often made for the “quarter munching” environment, surely they must, as I have surmised in the past, have more to them than basic Pavlovian responses to high scores and “continue?” screens–if for no other reason than to simply set them apart within the market. Beyond that, they must have developed many different design principles as ways to show off the work that went into their presentations, or to stave off the inevitable boredom of players who had been at them for a long time. It was something of a Cambrian explosion, games evolving their design philosophies in many hundreds of thousands of ways in a very short time in order to actually escape the subconscious notion that they only exist to siphon your pocket change away. Donkey Kong was a pretty winning formula already, but we would eventually come to have our Golden Axe, our Dragon’s Lair, and even our Virtua Fighter well within the lifespan of the archetypal arcade. Surely this was more than the invisible hand of the free market.

Again, it would be reductive to truly assume all arcade games exist only to be quarter vacuums. Video games are not such simple things, and even if they were, they wouldn’t have stuck around and changed the way they have. Looking to older games as a way of understanding modern games is necessary and very beneficial; older games make up the bulk of what I personally play these days anyway. We still have our arguments over the best console generations, the best years, and the best formats for games historically–people still care, and for good reason. Many of the things we like and dislike about gaming now can be directly traced back to trends and design choices of arcade games, the environment where much of game design as we know it came to be. Modern fighting games feel worse because we no longer have the buffers of sitting out and coughing up quarters for turns; learning complex, skill-based games feels less rewarding because the resources to get better are less scarce; exporting games onto configurable home systems has left them more susceptible to cluttered, poorly designed visuals; things of this nature. These arguments sort of seem to favor older games on a cursory inspection, don’t they?

These arguments that I alluded to, however: do they really teach us anything? When we think about the positive components of arcade games and the accompanying experience, how much of what we take away really has to do with game design?

Consider one of the positives I listed a second ago, that learning complex games can feel less rewarding with fewer obstacles in the way of progress. I have seen this come up a lot, the general argument being that “difficult” arcade games are far less difficult than they’re remembered as being; they simply lack the tutorials and resources that would encourage a wider audience to complete them without experimentation. This makes sense, in a way–the commands listed on the cabinet are all you need, and everything else you learn organically. The learning is the point of the game, as it were. Something like Dark Souls has sold itself on principles like these for years, and has only become more beloved over time. But consider the difference in an arcade scenario, despite this basic design principle remaining the same across over 30 years. Ignoring the obvious “I can look shit up on the wiki” difference (we’re assuming that someone at this level of care for game criticism has a desire to be honest when approaching a new game), the two scenarios progress completely differently. If you’re playing Final Fight, you could master the game’s actual combat all the way up to the point with the stupid fucking fire floors, then simply have your learning and progress instantly deleted by some random new mechanic from left field–dependent on your quarter supply. Does this not seem sort of artificial? If a player has already gone through the tedium of paying in individual instances to keep trialing-and-erroring their way to a certain level of knowledge, what purpose does a late-game, unexpected, entirely unrelated setback, something which only comes up once or twice and can ONLY be surmounted by repeated payments, actually serve? Alternatively, you could restart the entire game up to that point (and then get back in line to try again, if the arcade’s jumpin’), which can’t be better for learning.

Why do you think we have come to call perceived unfairness in games “cheapness?” You gotta keep putting quarters in to figure out each little detail! Not every decision in the world of arcade games existed to foster learning and creativity, and I would in fact contend that most of them didn’t. Haven’t you ever played House of the Dead? In the example given, consider the Dark Souls model we have today. The player is still encouraged to learn through experimentation, but those experiments aren’t limited only to trial-and-error, nor is the player required to re-queue and produce a new payment between experiments conducive to learning. Engagement with the world outside the combat mechanics reinforces what you need to know about those mechanics, allowing you to intuit certain advantages or strategies from cryptic lore entries, NPC dialogue, or environment design. Death will revert you to a bonfire, but much of your progress is retained, and bonfires themselves act as checkpoints to (within reason) prevent you from needing to redo the entire game in a single, specified order just to get a handle on one facet of it.

Now, you could argue that this model isn’t really any less cheap than what I described earlier in Final Fight. If you find that Ornstein and Smough are an unfair fight because you simply can’t kill them fast enough to keep their gank advantage from bearing down on you, your only real options at that point would be to grind and farm creeps to get stronger, or keep throwing your head at the wall as is. The game cannot continue past a certain point until they are dead. It’s true, you don’t have to keep putting quarters in–you just gave Bandai Namco a big bag of quarters for the game up front, and are only obligated by your own desire to keep trying at the difficult, non-tutorialized parts. In actuality, both of these games seem to be designed with at least this one principle in mind–organic, experimental learning of the combat mechanics–but the ways they challenge you differ based on the expected environment in which you’re playing. An arcade game would halt your learning with random crap to shock you out of getting complacent, while an RPG would halt your learning by forcing you to engage with a separate system to maintain a learning curve you prefer.

Now, which of these models is necessarily “better?” Fewer literal obstacles to trying again certainly lets players try more things, but in the two provided examples, one is enormous compared to the other. The sprawling, grinding RPG never had any place in the arcade, and it might just sort of be folly to try to compare their systems at all, even when the frustrations of learning them may feel the same. Final Fight is a hard arcade game, Dark Souls is a hard console RPG. But if you think about it, the latter is essentially the only frame of reference we have now. What “arcade scene” really exists now like it did back then, even in other countries? Maybe if you’re a Korean Tekken player, but even now, Bandai Namco is doing its darndest to phase those out. The truth is: I’m not sure what modern games you could compare to arcade games and really come to any meaningful conclusion. Arcade games were designed for a totally different environment, and in that environment it seemed like players really liked having relatively simple but surprisingly obtuse skill challenges without bigger worlds to explore. How would you in an arcade, after all? These things are different, but is there any way to call one superior without just conceding to the dreaded “subjectivity?”

You could compare and contrast games from different eras all day and reach the same inconclusions. There is a reason (beyond laziness, I swear) that I haven’t brought up too many specific examples. It almost doesn’t matter! The synthesis you will have to reach between “old games are primitive and only exist to swindle your money” and “old games were better and modern games swindle your money anyway” is, unfortunately, that the differences are only relevant case-by-case. I suppose it is the lazy Hegelian answer to just say “fuck it, it contains multitudes,” but I don’t personally believe that arcade games, the enormous and nebulously-defined group of design philosophies that they are, really constitute anything “worse” or “better” than modern games. I can certainly concede that games from older eras are valuable and worth playing, but comparing their value to the new games we play today just isn’t so simple a question as good games and bad games. So do these arguments really teach us anything? No, I don’t think they do. I think that it’s completely fair to consider that many games built for arcades were, indeed, “quarter munchers,” and that for all the faults with our current sickly and dying games industry, not every new mutation of the Tetris evolutionary phylum is necessarily cynical, perverse, or inferior to what we once had. Arcade games may as well be considered their own medium. I don’t know if it’s worth arguing about how valuable they are (or were) compared to modern games.

Export this argument into other media–I mean really export it–and you should see where the folly lies. Compare a film made today to a film made at the dawn of cinema, and you should notice that they’re so radically different as to be nigh impossible to compare. What do we really learn from asking “which is better, The Avengers or Workers Leaving the Lumière Factory?” They may both consist of images captured with a camera, but they were made with such different motivations in such different circumstances. Take even films slightly closer together in time, to make the comparison seem less airless. Which film is really “better,” Kino Pravda or La Région Centrale? Two films made with a similar reverence for uncritical cameras, for naturalism, and for cinematic truth without acting or careful editing. 50 years apart but attempting a lot of the same things with similar techniques, only with very different structures for very different audiences. How do you say which one uses its techniques “better?” Is Snow’s film the superior design because of the smart choice to automate the camera’s movements, removing human bias from the capture of the image? Is Vertov’s film the superior design because the images it captures are more concrete and provide deeper insight into recognizable life? How would we possibly approach or even come up with this argument as film critics; who would even pose these questions?

This has been sort of a roundabout way to get to a much simpler truth, but what else is new for this blog. Why do you think I run a blog and not a Substack? I wouldn’t make people pay to read this stuff! Through all the tangents I have concocted, the conclusion I’m prepared to make is less about the specific, realistically small crowd of people who prefer arcade games to modern games. It’s really about the idea of conceptualizing games (or any art form) into distinct blocks and periods of time from which we can make generalized comparisons. The idealism I began this piece by scrutinizing is, in video games, the driving force behind those slapfight arguments I’ve referenced. Whether it’s arcade games, or whether it’s 2004, or whether it’s the “6th Console Generation,” or whatever–is there any actual, honest way to criticize game design in these terms? It ends up being idealism in practice: picking some arbitrary design principles from a time and place one likes for any superficial reason, then assuming that everything else is failing to meet the expectations of those principles. Whatever forum culture YOU specifically grew up in, that is obviously the best time and place for video gaming.

Sorting things into top 10s and finding the “best” games and postulating when gaming “peaked;” all of this is just so meaningless when you try to articulate a real argument in any direction. I have theorized myself that modern games for modern systems leave us in the best position to create meaningful games, because of how they allow us to explore worlds and atmospheres, or to enrich ourselves in other people’s worldviews. But if I stop to consider other kinds of games (the arcade game, for example), not everything is perfectly reducible. Those games are closer than you might think to what we have now. I have my own ideals and my own preferences, but would it actually mean anything to say one is “better” than the other? If you see someone arguing that old games are just better, that’s not necessarily an indefensible position. It’s sort of not a position at all: are they explicitly wrong? What do they even mean? Is it that they miss the less predatory monetization? Is it that they miss having a collection of discs and manuals? Are they just nostalgia-blind? As a question of design, these diatribes are actually useless. If someone claims that gaming “peaked” in the arcade days…why? Is all you want from your games twin-stick shooters and racing games to put high scores on? Conversely, if someone claims that gaming is at its peak today…why? Do you want every game to be a sandbox grinding game with a 90s TV show narrative to watch as a series of cutscenes?

I believe gaming is a holistic entity. All eras of gaming have certain things to teach us. The specific internet posts I saw that inspired me to think about this (pretend that I am inspired to read and write by talking to human beings, rather than observing them in digital zoos) were originally decrying modern games for railroading their players. I’ve seen people accuse games like Resident Evil or Call of Duty of preventing players from experimenting or controlling their own pace, punishing them with (virtual) death for refusing to retreat in combat, or for trying to find exploits in enemy AI and map geometry to give them uncontested advantages. From what perspective are these bad things, necessarily? Resident Evil, for one, has garnered a lot of praise for the way it does this, specifically because it was (and still is) subversive, which makes its horror more exciting. If the player is warned not to fight the stupid dogs at the beginning, and they choose to do it anyway, in what way would it be consistent with the game’s world or atmosphere to just easily succeed in doing so? How would you be afraid of something you know to be far less threatening than it purports to be? Notice that speedrunners are often unaffected by such subversive moments in their chosen games!

Of course, this is still some idealism on my part. Is it necessarily better that the game be consistent and true to its vision, or that the player be free to interact with the world contained within the vision? I have given some praise to Dark Souls, but that game most certainly allows the player freedom to try things out of their ideal order and find new paths forward. You can walk straight into the catacombs within an hour and just get stuck down there, and then you have to walk all the way back out. I love Dark Souls; is this bad design? Is like, Ridge Racer or something a “better” game because it doesn’t allow for these questions of dissonance? “Bad” and “good” are already sort of unhelpful, but “better” and “worse” happen to be even more useless. To make these judgments, you’ll have to accept that there are many different ideals, and simply be honest with yourself about which ones seem more important to you. Sometimes making a top 10 list is fun, but how could it really hold any value? How could you truly score something an “8.5/10” with a straight face?

I suppose my final takeaway is (as usual) that I don’t think people are saying what they mean, or talking about anything relevant, when they try to develop criticism. We see discussions of “larping” or “friction” or “tank controls,” but nobody really approaches games on their own terms. We see a lot of arguments about how stupid kids don’t understand old Armored Core, and a lot of arguments about how old guys need to shut up and just play the Persona remasters. Do we ever talk about what the games from these eras we all love to remember fondly actually do? Can we come up with things we like or dislike about them, and think about why we come up with those things? When adjudicating, say, arcade games, can we approach the idea that they were built for a different world and different people, but may have innovated some important individual ideas as a reaction to that world? Ideas we should study? I don’t think we need to cry about combo systems or soulslikes much anymore, and by now it might even be futile to push back against live service gambling crap. We need to ignore the superficial stuff and home in more on why games are persistent in all their current forms, and what we can learn from them.

To ask “where gaming peaked” is basically on the same level as those tweets about how gacha games actually have deep characters and meaningful stories. It’s fandom stuff, it’s thought exercises for amusement, and it’s basically all recycled. I personally would like to approach games old and new as if they are important things (they are). Take Fallout, for example, a game I’ve had a finished critique of written and unpublished for a while, because I want to make it part of a larger piece I haven’t finished. Fallout is a great game, I love it, and I would probably consider it one of the greatest games ever…for whatever that’s worth. Rather than continue making tierlists or ‘member whens, though, I would like to examine why it has stuck around in the American cultural memory and what we can take away from it today. That’s all I’m saying: find new discussions and new conclusions! Be more pedantic and scholarly online!

Not so pretty when I sound it out like that, but I just can’t help the feeling. You see guys who’ve played thousands of games over decades of time, and it’s all just “kids these days will never experience God Hand, will they?” You sound like an idiot posting this crap every day! And worse still, it doesn’t get much more intellectual if you look away from the internet: when petitioned for fundamental writing on game design and criticism, such people can only provide undergrad textbooks and old blog articles (my house is only mostly glass, I can throw a couple stones). There really is no more reliable authority, is there? If there were something worth reading about all this, I could’ve saved us all some trouble and just gone and done some homework! Games are young, and they are rapidly returning to the people as the industry continues to rot and game technologies become less esoteric. We, as players of games (“gamer” sounds too funny), ought to feel some responsibility for talking about them more seriously. It’s almost entirely on us.

While I’m not going to try to talk down to everyone for sounding dumb on forums, I do want to provide my two cents. We do not need to worship the artists or the visions, nor do we need to worship the player’s full autonomy in all cases. We do not need to worship certain times or certain scenes, or certain genres. We do not need any sacred cows, and there can be more than simple arguments about “good” and “bad.” We should treat games perhaps not so seriously, but at least maturely. It’s dialectics, or something!


