For a while now, gamers have seen a shift in focus towards online and multiplayer gaming.

Back in "ye old days" (otherwise known as the nineties), most games focused on singleplayer. Multiplayer either wasn't on the cards or was tacked on, and very few games placed any emphasis on it.

Today, however, things are very different: we now have games that are multiplayer-only (Counter-Strike, World of Warcraft, Shadowrun and Warhawk, to name a few).

So the question a lot of people are asking is this - how important is multiplayer to making a good, complete and (from a developer's point of view) successful game?

Firstly, I think it is best to explore the different ways we can play games, both alone and with others. I won't be specific about genre here, but it goes without saying that some genres suit multiplayer better than others, although that can't be used as an excuse to omit it. After all, that is what innovative design is about - coming up with new ways to do things and expanding genres. A good example is the inclusion of co-op in Resident Evil 5, bringing it to the survival horror genre.

Singleplayer

The first type of gaming experience we have is the singleplayer concept. Even this has changed over the last few years; it wasn't that long ago that a good singleplayer game had to last around twenty hours at minimum, and anything less was considered too short. Whilst reviewing The Bourne Conspiracy I came to the conclusion that there is a new type of singleplayer game on the market, which can aptly be described as "The One Day Wonder". These are singleplayer-only games that take six to ten hours to complete. This would be fine, except that publishers are still charging us full price for the experience.

The problem here is that story-driven games offer very little in the way of replay value (even the fun ones). You may play through once, twice or even three times, but for a full-price game I personally expect more. I'm not saying these games should be made longer; sometimes a short running time is exactly what's needed to capture the experience without letting it grow old - Assassin's Creed, for example, could have benefited from being a bit shorter, making the experience more compact and less repetitive. My opinion is that it's the pricing model that should change.

An old argument for setting the price of a game was to compare it to seeing a movie. You pay $15 for a movie that lasts roughly two hours, or a bit more for a game that can last you 20 to 40 hours. That was a great bargain, and it made some sense.

But it no longer applies with these types of games. The movie works out to roughly $7.50 an hour, so by that same comparison a ten-hour game should cost $75; instead, we are paying $120 for most games regardless of their length. I understand development costs are getting higher and higher, and I appreciate the publishers' position, but I look at things as a consumer. And as a consumer, I'm being ripped off: I'm paying more money and getting a shorter experience than most games offered five years ago.

What Insomniac has done with Ratchet & Clank Future: Quest for Booty is fantastic - they know it's a shorter game, so they've charged a lower price, but you still get the same great experience you've come to expect from a Ratchet & Clank game. It's win-win!

Co-operative

Moving on, there are co-op games. Adding co-op to a game (both online and split-screen) is almost always a good idea; games are usually more fun with a friend. But it's important not to forget the split-screen option. A lot of developers lately are adding online co-op only, and this takes away that feeling of playing with your friends at the arcade - in other words, a fun social experience. Instead of playing together in the same room, you feel disconnected, and that's obviously not a good thing. It's all about choices, and the more choices we have as gamers, the better.

Multiplayer

Regular multiplayer is next. There really is something special about playing a great game against an equally matched opponent: the challenge, the bragging rights of victory, the fact that even when you lose you will almost always learn something from the experience. AI opponents generally follow patterns that you can figure out and exploit; humans are not so easy to predict. Best of all, if a game has good multiplayer you get almost unlimited hours of replayability from it.

You only need to ask anyone who used to play Counter-Strike or Diablo II (or indeed still does) about their multiplayer experience to get an emotive response, and StarCraft is still played as a national sport in Korea! Heck, look at World of Warcraft (man, Blizzard knows how to do multiplayer); fun multiplayer can give a game staying power for a very long time.

Weighing up the options

So what's important for a game to have? Some people say that in today's generation, if a game does not have multiplayer then there is just no point. That's not really fair. Look at a game like BioShock: it's a very impressive package and an incredible singleplayer experience. Would multiplayer have made it cooler? Possibly, but if it wasn't done right it could have hurt the overall package.

This is a delicate balancing act, and something that we in the gaming press need to think about when reviewing games. How should we grade them as a package? If we have a game that is a great singleplayer experience but has a lousy, tacked-on multiplayer, do we punish it for that? The Darkness is a good example: a very cool singleplayer game, but not only was its multiplayer simply not fun, it was also broken. There was lag, the servers would crash and it just didn't work. So should that affect the game's overall score? (Should reviews even be based on a score? Sorry, that's a whole other, and longer, rant.)

I believe for some games it should. Army of Two is an example that comes to mind. The game (including its competitive multiplayer) was built around co-op, and the servers for it just did not function. Playing through the singleplayer campaign in co-op was fine (and lots of fun), but the competitive multiplayer, which was trying to do something different and would have given the game much more replayability, was broken.

Then take a game like Condemned 2, which had multiplayer that just wasn't very good. It worked fine, but it wasn't much fun and felt tacked on. The singleplayer experience (which was definitely the core of the game), however, was very good. So should the game be punished for a bad multiplayer component? Would it have been better if they had just left it out?
