Being a video game enthusiast isn’t as glamorous as you’d expect. For every champagne-drenched supermodel lingerie party I’m invited to because I can beat Through the Fire and Flames, there are several lonely hours spent on forums typing in ALL CAPS because a developer didn’t take my own personal feelings into account when making an insignificant change to a series that has brought me more unparalleled joy than anything in my otherwise dull and unfulfilling existence.
That being said, even from an objective non-tragically-alone point of view, the industry is currently skidding balls-first down a slope at frankly irresponsible speeds into a gaping maw of complete bullshit, and here are a few reasons why:
Three-Week Correspondence Courses in Controller Safety and How to Push Buttons
I’m not going to bother prefacing this with, ‘if you’ve ever played a Wii game’ because you have, so you might have noticed that when you booted the game up it gave you anywhere between one and three inescapable screens of mind-numbingly obvious advice on all the correct ways to avoid being a fucking idiot and jamming your controller through the screen or your friend’s face.
Did you ever think it was strange how almost every story you read about people smashing shit up with a Wii remote started with ‘we were having a party and I’d had a couple of drinks’? Either that or they’ll make the pathetic excuse that the strap broke which shouldn’t matter, unless you’re playing some sort of cheerleading game that involves letting go of the controller and spinning it like a frantic madman around your wrist. If that is the case then I’d spend more time worrying about how you’re going to come out to your parents than how much of an idiot you are for forgetting to hold onto your damn controller.
It’s a sad fact of life that, the more popular something becomes, the greater the number of idiots drawn to it, and the more those in charge will feel the need to cater to the broadest possible audience. I’m not saying that everyone who plays the Wii on a regular basis is an idiot, but the fact that these warnings appear at the start of every game suggests that Nintendo certainly thinks you are. And now, because Sony and Microsoft are nothing if not eager to jump on a bandwagon three years late, all my games for their respective consoles feature these warnings as well! Marvellous.
Despite having played video games for most of my life, I’m not what I’d consider a ‘hardcore gamer’ or, as they’re better known, ‘scary virgin,’ so I don’t want you to get the impression that these tutorials are wasting precious time I could be spending writing Naruto slash-fics or showing people on Chatroulette my dick or whatever it is people with Level 70 Mages do in their spare time. Die, mostly, I think.
I finish most of the games I buy, but I’ll very rarely get 100% or attempt to beat the hardest difficulty settings; that’s just not what I play video games for. In spite of this, I am fairly competent and I can pick up the swing of a game within about half an hour of playing it. Case in point: after playing Battlefield 2 for a couple of rounds I managed to shoot a helicopter down with a rocket launcher, which isn’t exactly easy. Later, while trying to disable a tank, I stuck C4 to a rock and blew myself up by accident, which isn’t exactly not retarded.
I don’t know why developers now think that new players need to be taken by the hand, when they know exactly as much as I did when I started playing games, but the first level of every game is now a tutorial stage that will be excruciatingly drawn out and constantly interrupted with text boxes explaining the fundamentals of shooting things with a gun.
I have faith in humanity and I believe that, left to their own devices, most people could quickly work out what button is jump and what button is shoot; mainly because control-mapping has remained largely unchanged for the last fifteen years. ‘X’ will always be jump; the right shoulder button will always shoot; the right analog stick will move the camera. These things will never change, because they make sense from a design point-of-view, but rather than just letting you get on with it, you’ve still got fifteen minutes of learning how to take cover to sort out. Hug that fucking wall, private!
I’m open to the possibility that these tutorials could just represent a shift in the industry, one that has to take place as more and more casual gamers are brought into the hobby; people that only have a spare half hour or an afternoon to play a game and don’t have either the time or patience to pick it up at their leisure. But explain this to me: Super Mario Galaxy 2 came with a DVD telling you how to play the game.
Super Mario is fucking Super Mario, there are only two significant buttons in the whole game (after you strip away all that superfluous power-up bullshit that’s only there to justify the spastic waving of the Wiimote), and those are jump and attack. In fact, you can probably beat most levels by jumping and dodging, so even attacking isn’t strictly necessary.
The sheer condescension of including something like this in a game would suggest to me that it was meant for children, but even that makes no sense because children are better at video games than anyone else; partly because they have the most free time and partly because they don’t get frustrated as easily and give up. Whenever I play old Mega Drive games I used to be able to beat, I get about two levels in before losing all my lives and going back to porn.
These tutorials become an even more uncomfortable dick up the arse when it comes to a game like Dead Rising 2, which is specifically designed to be replayed several times (it is practically impossible to beat the game on your first playthrough) but will force you through the tutorial every time you start over. You can skip them if you want and just wait for the game to load instead. While you’re doing that you can build those shelves you’ve been putting off for weeks or maybe make a start on that novel you’ve been meaning to write, courtesy of…
Loading Times That Last Longer Than the Actual Games
According to the forums I spend time starting arguments on, this is a point of contention for a lot of gamers. Some don’t notice the problem, others are ready to move to Tibet and begin a possession-free existence because of it. I’m going to use Dead Rising 2 as an example again because I literally spent as much time watching this game load as I did playing it, and that isn’t an exaggeration. To put it another way: while trying to beat the 75,000 kills challenge, I took to playing games of Freecell in between loading screens.
The game is set in a Vegas-style resort split into different casinos, shopping centres and other things you’d find in a place like that. Every time you go to a different area on the map, you get to sit through a delightful loading screen that can last anywhere between thirty and fifty seconds. Start a mission, though, and the fun really begins, as almost every single one has you running to one end of the resort, rescuing someone and then running back to your hideout. Notice that I said you’ll be running, because there really isn’t anything to do between Point A and Point B except kill zombies, which is admittedly super fun by itself, but the missions are all timed, meaning you’ll want to get them out of the way as fast as possible.
This means that the actual time you spend in each area getting to the target will be anywhere between ten and fifteen seconds; roughly half of what it will take to load the next area. Then, when you get back to your safehouse, you can’t just send the survivor inside, you have to go with them, resulting in another loading screen. If you’re really super-duper lucky and this was a plot-advancing mission, you’ll then get to wait while a cutscene loads and, if this is one of the repeat playthroughs I mentioned above, when you decide to skip the cutscene you get to sit through another loading screen while the game puts you back in your safehouse, which you then have to leave via another loading screen to get on with the game.
So, in conclusion: the mission time totals approximately one minute of playtime against five minutes of loading. If that sounds like fun to you then congratulations: you have narcolepsy.
After researching it on forums, I’m led to believe that installing the game on the console reduces the length of the loads, but not their frequency. Ignoring the fact that I’d need a bigger HDD to install the game (and I don’t feel like paying eighty quid for the privilege of playing a forty quid game properly), this sort of brings me to the entire point: I own a console; not a fucking PC. The entire advantage of playing games on a console is supposed to be the ability to ‘plug-and-play,’ i.e. you put the disc or cartridge in and away you go.
Even after installation, though, some games go beyond the pale. Bayonetta has set the industry benchmark in absurdly long loading times, hitting you with a loading screen whenever absolutely anything occurs. Pause the game? Loading screen. Go to the options menu? Loading screen. Pick up an item in-game? Loading screen. Want to quit the game? Extra-long loading screen. It got to the point where I was glad I couldn’t import my old save game to my new PS3 because I had an excuse never to play it again.
Surely, though, these are just examples of particularly poor design, right? Well, no, actually. Bayonetta is like a punishment from God but is ultimately well-designed and Dead Rising 2 is an otherwise pretty solid game, and a lot of fun when you actually get the chance to play it, which reminds me that I was actually supposed to be making a point here about how the industry is doomed and not just whinging about games I’ve played.
My original point was supposed to be that, because companies feel continually pressured to create more and more powerful hardware that they both can’t afford to produce and don’t have the technology to actually make work, developers are being forced to either make games that sucker-punch the processors, resulting in minute-long load times, or make games that look incredible but last the same amount of time as a healthy burp.
Some indie developers have shown that you don’t need all your games to look like Final Fantasy to create an engaging experience – although it does help if they look like cartoons explaining the dangers of Western culture invading the Motherland – but generally speaking, most people won’t play a game unless everything has been rendered sixteen times and you can see the production stamp on each spent shell casing.
I’m not some change-fearing luddite, and I’ll be the first to admit that no 3D game between 1995 and 2000 has aged well (seriously, try playing GoldenEye now; you’ll wonder what genius turned on the ‘why does everything look like I’ve got cataracts’ cheat), but I’m saying there’s a point where enough is enough. The graphics of the last console generation were pretty solid (clearly, because half the games are now being re-released in HD on the PlayStation Store), so there is no reason to up the ante when it’s not currently necessary or even possible. Especially when you could instead focus on, oh I don’t know, maybe making games that don’t constantly load and/or break altogether.
To Hell with it, though, sometimes you just have to roll with the punches; shit or get off the pot; accept a shitty console generation for what it is or stop playing and finish university. So pop the disc in and let’s fire this bitch up and shoot some foreign nationals! Wait, the fuck is this? Why can’t I…What? Oh, we forgot to mention, before you play your game that you spunked a week’s wages on, please wait while the console starts…
Installing the Patch to Fix the Update for the Last Patch
I’m sympathetic to the industry, I really am. I understand that, with video games being more popular than ever before, companies are under a lot more pressure to get products to market as quickly as possible. For the majority of games, though, ‘as quickly as possible’ usually means ‘fuck it, let’s just ship and patch whatever we didn’t finish.’
I buy a lot of used games because, despite the several bleeding hearts online that say I’m killing the industry, I refuse to pay forty to fifty quid for a product in a market where every other piece of media retails for, at most, half that; even the PC version of any multi-platform title always goes for at least a tenner less than the console alternative.
There was a time when those who were console-loyal would have defended this price gulf by saying, ‘but with the PC, you have to spend hours tweaking and getting homemade bug fixes and packing ice around your tower just to get a game to run!’ whereas the console version was always delivered ready-to-play. Nowadays, the only difference between console and PC versions is that when the PC version doesn’t work, you can spend hours tweaking and getting homemade bug fixes and packing ice around your tower to get the game to run. If your console version is buggy and unplayable you get to suck a dick and wait for the developers to come up with a patch. When they feel like it. If they’re not busy. Like, this weekend is a long weekend. Next week’s not looking good either.
Alternatively, you can complain about it online and watch idiots drunk on gleeful superiority tell you you should have got a 360 instead. (Protip: For added irony, go on a 360 forum for the same game and watch PS3 users say the exact same thing.)
Anyway, back to what I was saying about second-hand games: I buy a lot of them. Usually this means that they’re at least a few months old when I get them, and I do this for two reasons: 1) It gives the developers time to fix whatever was broken and 2) It’s a hell of a lot cheaper. Mostly it’s number two. Recently, though, I had to buy a new PS3 console for reasons that will be explained shortly and, to get it at the best price, I had to buy a game with it. Being the discerning economist that I am, I went straight to the bargain bin and picked up LittleBigPlanet for a fiver.
Once I got home and set everything up I decided to give LBP a run just to kill some time, at which point I was ordered to wait while thirteen updates downloaded and installed, all of which took about forty minutes. Thirteen updates. A baker’s dozen. Granted, the game is about three years old now, but that means that there were either thirteen separate errors missed during the development stage or that, even more tragically, patches had to be released to fix errors created by previous patches. And the real kicker? When I loaded the game it had no sound. At all. Thirteen updates and the damn thing still didn’t work properly.
In days gone by, there was a name for this stage in a game’s life, and it was called ‘game testing’. If the name doesn’t help, this is the part where people are paid to try and break the game; forcing errors to manifest so that they can be documented and, hopefully, ironed out before a product ships. However, since gaming turned into a multi-million pound industry, some of the more discerning members of the board of executives took one look at the itemised budget and suggested, ‘is there any way we can get consumers to test the games for us; free of charge? Also, someone order me a hooker that doesn’t use all my coke; I’m not made of money.’
This isn’t completely new in gaming; a lot of developers offer opportunities for ‘beta testing’ new games online, usually MMORPGs or shooters like Modern Warfare. Beta testing is a win-win for developers and gamers alike because the devs get to test-drive the product before its release – making sure everything operates as it’s supposed to in its intended environment – and players get the privilege of being the first to try out a new game for free. Take special note of the free part. As in, it didn’t cost them anything other than an internet connection and whatever online subscriptions they already had.
With this innovative new setup, games are being released as a broken mess while developers wait for complaints to start appearing so they can isolate an issue and release a patch for it. Essentially, consumers now play the part of unpaid game testers, only they are paying forty to fifty quid for the honour; and that’s not even a guarantee that anything will change. As I previously mentioned, I recently had to buy a new PS3 after the lens in my old one burned out. This didn’t happen because I forgot to read the tutorial on not pouring soup into the disc slot or because I thought it would make a better couch than my actual couch, but because a game I’d got for Christmas, Fallout: New Vegas, was so badly designed that it literally broke my console.
I’ll actually come clean at this point and admit that there was a bit of user error on my part: I was unaware that killing the power to the unit could damage the lens (because the reboot screen only ever cautioned against a loss of data), and didn’t realise there was an option to do a hard reset. If you’re wondering why I had to perform a reset of any kind, it’s because the game crashed. A lot.
Every time I booted the game up, at least nine out of ten of my sessions would end abruptly when the game decided enough was enough, picked up its ball, and went home. This could happen anywhere between ten minutes and three hours into play, making it even more unpredictable than who’ll say the most stupid, borderline offensive bullshit on the One Show on any given day.
Even though I got the game when it was released, I’m middle-class enough that I received almost every new release over the Christmas period, so it was about six months before I actually got around to making a start on it. After sitting through half an hour of update screens, I started a fairly enjoyable if somewhat tedious adventure, but it was apparent that – despite it clearly being an issue with many players – nothing had or could be done about the game-breaking bug. Towards the end, crashes became steadily more frequent to the point that, even if it hadn’t broken my console, I’d probably have quit playing it anyway.
Given the current standards in the industry, this could almost be forgivable, if it wasn’t for the fact that Bethesda had already released Fallout 3 two years previously: a game with the exact same engine and the exact same problems. Ignoring the more traditional notion of ‘if it’s not broke, don’t fix it’ they seem to gravitate more towards, ‘if people still buy it when it’s broke, fuck them.’
Honestly, I can’t really say I blame them, but while we’re on the subject of things breaking…
I’ve Kept Teenage Runaways in my Basement that Lasted Longer Than This
The older I get, by which I mean the longer I play video games, the harder it is to shake the feeling that nobody working in the industry really gives a shit anymore. Gaming is the only medium where it’s alright to release a broken product. Film studios wouldn’t get away with replacing ten minutes of footage with their granddaughter’s cello recital as if they’d drunkenly put the wrong tape in a VCR, and you wouldn’t just chalk it up to fate if your red wine was actually just cat’s blood.
The main difference between video games and other media is that they aren’t a fixed experience. Everyone watching a film is seeing the same thing. They might interpret it differently, but they are all experiencing the same series of images and sounds. Everyone reading a book is seeing the exact same words. Some will finish it and put it up on the shelf and some will go on to murder a Beatle; but the words never change.
With the exception of choose-your-own-ending storybooks, though, video games are the only user-influenced medium. And when you have a community that, at large, could charitably be described as ‘socially retarded serial killers in waiting’ you have to take it as read that at least a small portion of the userbase is going to do everything in their power to break the shit out of your products. This will inevitably lead to good things, like people who’ve used Kinect to make it do something interesting instead of being a novelty remote control for the amusement of stupids; it will even lead to brilliant things like Red vs. Blue; but it will also lead to people playing through a game obsessively until they uncover a glitch that tears the code apart and turns them into Tron.
These things will happen, and it’s not always a developer’s fault. What is their fault is continuing to ship dodgy hardware they know to be broken on the offchance it will run, because it’s a lot cheaper than performing a product recall and re-manufacturing the products to not be a bag of horseshit.
This year, I’ve had to replace both my PS3, for reasons I’ve already explained, and my Xbox 360. The 360 actually broke three years ago (and within a year of me buying it) but, because I’m lazy and I resented having to pay the five-pound shipping fee, by the time I got round to getting it fixed I discovered my extended warranty didn’t extend until forever. This is complete bullshit.
I understand that my PS3 broke partly due to user error and partly due to shitty third-party software, but my 360 RROD’d; that shit is all on Microsoft. If you are unfamiliar, the Red Ring of Death is the anal-discharge-styled name for the situation wherein your console goes into cardiac arrest and refuses to work ever again. The beauty of this feature is that it could happen any time from the moment you first switch the console on to after having used it every day for years. It’s sort of like a lottery where every single prize is watching your parents get executed by a Colombian firing squad. And they’re drunk so they have trouble making a killshot.
If your console happens to break out of its warranty, you are in for a treat. You can either: pay to have it repaired by the company, which will earn you a bill almost half that of a new console; or you can take it to a shop and have it repaired for slightly less, voiding the warranty. This led to a fascinating conundrum recently when I planned to get my PS3 repaired then sell it, using the money to buy a newer model. I discovered that, while many shops were more than happy to repair it for me, they would not then buy the console off me because, ‘some idiot’s broken the warranty.’
I’m not a technician or even particularly bright, so I don’t know the exact reason why things break, but I do know it’s something to do with the fact that the console wasn’t properly put together during production, and that people are spastics. For the longest time I thought it was an innocent – albeit costly – mistake that went unnoticed by the company, until I read this.
It reminded me of the infamous Ford Pinto, wherein a memo was allegedly sent out during production saying it was cheaper to pay compensation to families on the off-chance rather than re-engineer all their cars so that they didn’t fucking explode. That’s a pretty extreme example though; the chances of you dying from a broken Xbox 360 are slim. Unless you set yourself on fire wrapping up the console in towels and forcing it to overheat because the warranty is expired and it’s the only way to fix the damn thing without a degree in electronics and some experience with a soldering iron.
So now my PS3 has joined the 360, three iPods and a copious number of mobile phones in the list of modern tech that broke within three years of me owning them. This wasn’t due to carelessness, by the way. I am meticulously careful with everything I own; to the point of obsession. I cried when I broke a pair of shitty plastic sunglasses I found in the Epcot Center in Florida. As a less embarrassing means of demonstrating this, I have a black and white Game Boy that still works, and I recently powered up the N64 at my parents’ house – mainly to learn humility when I preach about ‘graphics not mattering’ – and found it running as smoothly as the day I got it fifteen years ago.
I appreciate that it’s getting harder to make money in an industry where most mainstream games need budgets that dwarf Hollywood films to make any actual returns, so cutting costs in any way possible is not an unheard-of or even generally unwholesome practice. And, without sounding like the kind of dick who uses the M$ acronym to describe Microsoft, it’s a way of operating they’ve been comfortable with for years – each new operating system being steadily more broken than the last – but when what you’re releasing doesn’t even work it makes me wonder why you’d bother at all. Oh, because morons will still buy it, myself included. Yeah, that makes sense.
I’m not even entirely sure what the point of this whole article was. I’ve had a shitty year with video game related experiences but, like a battered wife, I keep coming back for more. Maybe it’s because, like a battered wife, I can remember the better times before my console used to drink and black out so much, and when it could still perform sexually. I’m rapidly losing my grip on this metaphor, but I think all I really want is to be able to play video games like I used to; having fun and not worrying about boiling discs or suspending my console from the ceiling via a complex series of ropes and pulleys to get the disc to sit at the optimum angle.
I mean, what’s the alternative? Start leaving the house on a regular basis?