Quality Control, the Lack of It
Posted on Friday, February 14 2014 @ 12:29:11 PST
This member blog post was promoted to the GameRevolution homepage.
“Those who don’t learn from history are doomed to repeat it.” That quote may not be exactly word for word, but the meaning holds all the same, and the video game industry is no exception. Remember how Nintendo, back in the very early days of gaming, had draconian policies regarding how games could be released on their systems? Most people remember Nintendo’s infamous censorship policies (seriously, changing Holy to Pearl?), but what most people don’t remember is Nintendo’s iron grip on quality control.
For those of you who weren’t around for it, the Video Game Crash of 1983 was an event that signified the death of video games. The crash came at a time when the idea of quality control hadn’t been invented (for video games, at least), which meant anyone, and I mean anyone, could produce a video game and push it out to the market. You had over half a dozen companies all producing the exact same game while each tried to convince you to buy their version over their competitors’. You also had no guarantee of a game’s quality, so it was up to random chance whether the game you bought would have minimal bugs or even run at all. Between word of mouth, the media, and stores being unable to clear their inventory of unsold games and consoles, it wasn’t hard to imagine that video games were just a fad.
When Nintendo took control of the video game industry, they made it their duty to ensure that any game released on their console would have no game-breaking glitches and would be playable from the get-go. Nintendo’s vision came to fruition after the massive success the NES brought to the overseas market. With video games actually being fun to play and free of bugs that would crash them, the industry was revitalized. Of course, quality control came at a price; with Nintendo dictating what could and could not appear on their consoles, developers were at the mercy of a monopoly: abide by the rules or go home. Censorship and limits on how many games a studio could produce in a year were Nintendo’s main strategies in the gaming market. With censorship, Nintendo could offer games that didn’t offend any particular group or religion (thus maximizing profits), and with limits on output, there was less incentive to rush out garbage games; studios actually had to apply focus to guarantee that a game would sell.
Nintendo’s strict policies had consequences of their own. At first, game studios created smaller studios under different names to bypass Nintendo’s limit on how many games one studio could make. And since Nintendo had total control over how their consoles were built and who got what additions (some say a few studios got the shaft on the SNES by being denied a larger game cartridge), many developers grew frustrated at being stuck on a console that left them little leeway. Once Sony came along with the PlayStation, many studios ditched Nintendo, since Sony had less censorship and a console that was more open and developer-friendly compared to Nintendo’s restrictive N64. Naturally, allowing flexibility for developers is a good thing, but sometimes too much of a good thing can be a bad thing.
As time progressed, video games became more complex and time-consuming to develop. The probability of bugs steadily increased as well, but with budgets being what they are and time being friendly to no one, developers sometimes leave glitches in the game if they aren’t that big of an issue. Games with major bugs have become more and more common, to the point where a game gets pushed out the door and then patched with a hotfix later. In a strange twist of events, what was once seen as an abomination of gaming, shipping with game-breaking bugs, has become so commonplace that we expect every newly launched game to have at least one. With restrictions on video games growing looser and looser, games nowadays are pushed out the door with either horrible bugs or dubious quality.
On top of all this, more games today are being pushed out under the guise of early access/playable beta (or even alpha) with the promise that the game will get finished. Not only do most people expect big glitches in newly launched games, but now we have people willing to pay for beta access to a glitchy game (basically paying developers for the privilege of becoming bug testers) and others willing to go out of their way to remove glitches on their own time for free. Steam isn’t immune to the problem either, since it’s slowly being filled with buggy games, low-quality games that got a free pass from the Greenlight service, and games promising features that haven’t been delivered yet. The Wii had a similar issue when the console became known mostly for shovelware after Nintendo eased up on who could develop for it. The PlayStation 4 launched with bugs right off the bat, with Sony telling people weeks in advance that they would have to download a patch to address the bugs before they could even begin to use their console.
In what seems to be an ironic twist of events, easing restrictions on what can be developed and how has caused a drop in quality for video games in general. After all, if you’re free to make games however you see fit, why bother going the extra mile for quality assurance, knowing that your consumers will buy your product anyway, or knowing that you can start charging people for beta testing so that you don’t have to hire actual bug testers?
Now, some people would tell me that we don’t need publishers or big companies like Nintendo telling developers how to make their own games, but we now live in an age where release dates are the enemy and a game has to be pushed out the door, bugs be damned. Studios seem too afraid to delay a game over bug issues for the sake of hitting a hot release window (usually the Christmas season), and it’s hurting everyone in the end: games ship with glitches that could have been easily stomped out before release, consumers start investing less in video games because of the low quality (from a technical standpoint), and then game studios start laying people off or even closing down due to low sales. The industry should not be afraid of making people wait a little longer for a mostly bug-free game, for the wait will be well worth it. No one should be afraid to put their foot down and say, “This is unacceptable and you will go back and revise your work.”
A firm grip on quality and on what gets pushed out the door doesn’t restrict creativity. In fact, it would make developers work even harder to ensure that people will enjoy and buy their games. At the same time, we as consumers should not be okay with paying to test an unfinished product; we are not bug testers. We are players, and we are here to play the game, not debug it. The old video games of yore had their share of glitches too, but do you even remember them compared to the games of today?
The opinions expressed here do not necessarily reflect the views of Game Revolution, but we believe it's worthy of being featured on our site. This article, posted originally on February 6, 2014, has been lightly edited for grammar and image inclusion. You can find more Vox Pop articles here. ~Ed. Nick Tan