
The Next 20

What's in Store for Videogames in the Coming Two Decades

It's 1913. You're an avid viewer of moving picture shows. You're asked to predict what such "movies" will be like 20 years in the future. What do you say?

The screens will be bigger. The cameras will move. There will be sound! There will be colour!

It's easy to make predictions about technology: you just take what you have already and add what's missing or improve what's already there.

What else might you predict?

There will be more movies. They will be more entertaining. They will be longer! People will be able to make their own movies!

It's easy to make predictions about content: you just take what you have already and add what's missing or improve what's already there.

It's easy to make predictions about any medium. Your predictions might not come true within 20 years, but you can be reasonably sure that most of them will come true, so long as you keep it simple.

Yet would you, back in 1913, have predicted the evolution of the studio system? Your favourite actors will be contracted to big studios; they'll have little say over what movies they'll appear in.

Would you have predicted that a vocabulary of cinematic techniques would develop? Jump cuts, cross-cuts, fades, deep focus, split screens, whip pans — they'd all be symbols that viewers interpret as having narrative meaning.

Would you have predicted that a quarter of the movies made would be westerns? Or that there would be an appetite for musicals? Or that documentaries would not prove popular?

What would you have said about the price of a movie ticket? Would it go up, down or remain the same?

Extrapolative predictions are easy to make, but causal ones are much harder.


* * *

It's 2013. You're an avid player of videogames. You're asked to predict what such "games" will be like 20 years in the future. What do you say?

Let's start with technology. We'll have bigger screens, and smaller screens with higher resolution. This stuff is easy!

Except, "screen" isn't the right word: the right word is "display". Google Glass can project a display onto your retina. We'll have displays that can occupy your full field of vision; in fact, they'll be able to occupy more than that: turn your head and the display will track what you "see" to match it, all in seamless 3D. That kind of immersion-inducing environment has to be great for games!

Well no, it's not necessarily as great as you might think. How would you interact with such a display? You can't touch it — and if you could, the lactic acid would build up in your muscles as you continually manipulated virtual objects at eye level; you'd develop what early touchscreen researchers called "gorilla arm". Tablets avoid this by putting the screen at a book angle, but a virtual world in which looking down equated to looking ahead would actually work against immersion. If you wanted to manipulate a full-vision display using your fingers, you'd therefore use a linked tablet as an input device: hands on the tablet, eyes on the big display. A configurable console-style controller would be better for certain kinds of games, but less portable. Neither would be any better at dealing with 3D than today's technology, though; we'll have to hope for a major UI breakthrough of some kind if we want to be able to select semi-occluded objects reliably.

Other input methods will be possible (if not, as with voice, always advisable when you're travelling home on the bus). Perhaps the most interesting are brainwave readers: these have potential, but the kind of command fidelity demanded by even today's games is probably more than 20 years away. Besides, you always had a suspicion that anything capable of reading brainwaves might be capable of writing them...

This all assumes that virtual reality is A Good Thing for games in the first place. It might look that way for your kind of game, but people 20 years from now will still be playing Bejeweled and even a holodeck wouldn't make one iota of difference to them. They'll still outnumber you, too. Nevertheless, it's likely that there will be a VR boom, but that it probably won't deliver quite what players imagine. The thing is, for VR to come into its own it will have to escape the gimmick phase. 3D movies are struggling to do this at the moment: the early, obvious potential of the new medium has been explored, but its full implications have not been. Yes, swooping shots of luscious 3D scenery are impressive, but what do they imply beyond "that looks pretty"? How do you handle concepts such as focus and depth of field? Is there anything to 3D that adds as much to movies as colour and sound did, or is it a solution in search of a problem? At least directors have figured out that most people don't actually like objects flying out of the screen at them, which is a blessing... VR has all this and more heading its way.

Some technologies don't escape the gimmick phase. The Wii controller is an example of this: its obvious potential for physicality was explored, but not its potential as a simple pointing device. Four players can interact independently on the same screen at the same time: that suggests many more possibilities than jumping around for 20 minutes, but designers weren't allowed to experiment. The same could happen to VR: its success won't depend on what it does differently, but on what it does the same, only better. Is a VR game just a regular game with a VR interface, or does VR bring something genuinely new? Eventually, yes, it must realise its promise — but perhaps not in the next 20 years. Coolness only goes so far.

All this presupposes that technology will actually be a factor 20 years from now. It's entirely possible that games will be abstracted away from particular platforms — and even particular interfaces. Cloud-based systems will be accessible from multiple devices; you can play using whatever is convenient at the time. Sure, you won't get the same experience with Halo 11 on your watch as you will on your custom rig, but some things — trading, for example — might actually be easier that way.

What's unlikely to happen is that streaming technologies will allow real-time interaction over the open Internet. When streamed games were first proposed in the late 1990s, it was argued that increases in bandwidth would remove the showstopping problem of lag; unfortunately, it turned out that content providers of all colours always expand their offerings to fill available capacity. Unless there's an as-yet-unforeseen multiple-orders-of-magnitude increase in bandwidth on the cards, there will always be lag and therefore always a problem for live-streamed games.
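To put a rough number on the problem, here's a back-of-the-envelope sketch in Python. Every figure in it is an illustrative assumption rather than a measurement, but the shape of the arithmetic is the point: the delays that cause lag aren't the kind that extra bandwidth removes.

```python
# A back-of-the-envelope sketch of why live-streamed games stay laggy.
# All figures below are illustrative assumptions, not measurements.

FRAME_BUDGET_MS = 1000 / 60        # ~16.7 ms per frame at 60 fps

# Assumed per-input costs for a streamed game over the open Internet:
input_uplink_ms   = 15             # controller input travelling to the server
render_encode_ms  = 10             # server-side rendering plus video encode
video_downlink_ms = 15             # encoded frame travelling back to the player
decode_display_ms = 5              # client-side decode and display

total_lag_ms = (input_uplink_ms + render_encode_ms
                + video_downlink_ms + decode_display_ms)

print(f"Frame budget:  {FRAME_BUDGET_MS:.1f} ms")
print(f"Round-trip lag: {total_lag_ms} ms "
      f"(~{total_lag_ms / FRAME_BUDGET_MS:.1f} frames behind the player's input)")

# Doubling bandwidth barely touches these terms: propagation delay and
# congestion on shared links dominate, and — as argued above — any new
# capacity tends to be swallowed by richer content anyway.
```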

Cloud-based games enable more subtle possibilities, too. One area that would see enormous interest is customisation through open standards. Many games replicate what other games already have, which is both a duplication of effort and a limiting factor. Cosmetic-only features could easily be made compatible across games. If one game comes with a human-male-rub-chin-thoughtfully animation, why can't it be used in another game that uses the same character model? The potential for clothing (categorised, so you don't get bikinis in ancient Rome) is particularly interesting, because of its enormous commercial potential. It's unlikely that an industry standard will arise within the next 20 years, but an in-house style might. If a single developer churns out MMOs, for example, then it could prove quite lucrative to let people take their outfits from one into another.
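As a sketch of how such categorised cosmetics might be described — and everything here is hypothetical: no such standard exists, and the field names and categories are made up for illustration — a shared item could declare the character model it targets and the settings it suits:

```python
# A minimal, hypothetical sketch of a shared cosmetic-item descriptor.
from dataclasses import dataclass, field

@dataclass
class CosmeticItem:
    item_id: str                    # globally unique identifier for the asset
    kind: str                       # e.g. "clothing", "animation"
    character_model: str            # shared rig/model the asset is built for
    categories: set[str] = field(default_factory=set)  # e.g. {"modern", "swimwear"}

def is_allowed(item: CosmeticItem, game_setting_categories: set[str]) -> bool:
    """A game declares which categories fit its setting; anything outside that
    set is rejected, so no bikinis turn up in ancient Rome."""
    return item.categories <= game_setting_categories

bikini = CosmeticItem("acme.bikini.01", "clothing", "human-female-v2",
                      {"modern", "swimwear"})
ancient_rome_setting = {"antiquity", "roman"}
print(is_allowed(bikini, ancient_rome_setting))   # False — filtered out
```

Filtering on declared categories is what would let the same asset travel between games built on the same character model while still respecting each game's setting.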

Finally on the tech side, look out for voice fonts. Players like to be able to communicate with one another using speech, but it can annoy other folk in the same room and you don't always sound the way your character ought to sound (gender difference being the most glaring case). If you could type words that would be conveyed to other players as speech, you could participate in conversations among mic-using players. You could also speak into a mic and have speech-recognition software convert your words into text which is then read out in a different voice. A games application incorporating voice fonts will probably emerge during the next two decades, although it's hard to say exactly when.
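As a minimal sketch of the typed-text half of that idea — using the off-the-shelf pyttsx3 text-to-speech library as a stand-in for a true per-character voice font, with the voice name and speaking rate below chosen purely for illustration:

```python
# Sketch: convert a typed chat line into speech in a chosen voice.
# pyttsx3 uses whatever voices the player's operating system provides,
# so this is only a stand-in for a real per-character voice font.
import pyttsx3

def speak_as_character(text: str, preferred_voice_name: str = "english") -> None:
    engine = pyttsx3.init()
    # Pick an installed voice whose name matches the character's preferred
    # voice, falling back to the system default if none does.
    for voice in engine.getProperty("voices"):
        if preferred_voice_name.lower() in voice.name.lower():
            engine.setProperty("voice", voice.id)
            break
    engine.setProperty("rate", 160)   # words per minute, roughly conversational
    engine.say(text)
    engine.runAndWait()

# Typed by the player, heard by everyone else in the character's voice:
speak_as_character("Meet me at the north gate in five minutes.")
```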


* * *

The games industry itself is set to change in the next 20 years. An easy prediction to make is that there will be many more games available — millions of them — but that most will garner hardly any players. It used to be seen as advantageous that the Internet wasn't limited by shelf space: a developer could easily place a game in an online store, whereas they might even have to pay to get it on sale in a bricks-and-mortar store. Now, however, this feature is seen as disadvantageous: when a single marketplace dominates a platform (app stores, Steam), unlimited shelf space means unlimited numbers of games. To succeed, a product either has to strike it extremely lucky (as Angry Birds did) or it has to have a $500,000 marketing budget and not suck. Creating the world's best game is immaterial if no-one knows about it.

This is good news for companies that do have access to a sizeable marketing budget, which is to say most established developers. However, there are other icebergs on the horizon that they will have to steer clear of if they're to keep afloat. The two main ones are overspecialisation and overshooting.

When people play one type of game, after a while they grok it. They want to keep with the same genre, but go more hard core. We saw this in casual games: to non-expert players, today's hidden-object games bring new and painful meaning to "hunt the pixel". When such specialisation goes too far, though, the genre becomes inaccessible and disappears up its own backside. This happened with old-style adventure games: "oh, I see, I need to put the masking tape on the shed door then pet the cat so it runs into the shed and I can collect some of its fur off the masking tape then combine it with the syrup I found earlier to fashion a false moustache so I can disguise myself to look like the man whose passport I have stolen, if I use a marker pen to draw a moustache on his passport photograph because he doesn't actually have a moustache". Thank you, Gabriel Knight 3.

So which game genres are likely to overspecialise in the next 20 years? Judging by their trajectories in recent years, real-time strategy games and sports sims must be prime candidates. They may have millions of fans, but so did adventure games. Unless something happens to shake them up, they'll become ever-more sophisticated as they meet the ever-increasing demands of their ever-more veteran audience, shedding lesser players and failing to attract new ones as they do so. Other genres run less of a risk: thanks to user-created content, first-person shooters seem reasonably immune, for example. However, some are in danger of going too far in the opposite direction. While overspecialisation reduces the user base to a rump of dedicated players supported by small-scale developers (today's adventure games are actually pretty good, but you're not going to try one), overshooting expands it to a mass of dilettantes.

It used to be that sports cars were small, nimble vehicles boasting high performance. After the demand for them flattened, manufacturers started to add features that would appeal to a wider market. They softened the experience, adding better safety features, more space, less spirited throttle response and higher driver comfort. It worked, for a time: more sports cars were sold. However, the designs drifted away from what a sports car was. In attempting to sell more sports cars, manufacturers lost their core audience and wound up competing with regular cars for sales. When the Mazda MX5 came out, its back-to-basics approach was exactly what the forgotten core sports car enthusiast had been waiting for: as a result, it rapidly became the world's best-selling sports car.

Overshooting — incrementally diluting the essence of a product in order to attract more users — is something we see in the games industry. The genre currently most vulnerable to it is MMOs. Developers have dumbed down their offerings so much in an attempt to draw in casual players that they're now in competition not so much with each other as with the likes of Farmville 2. Some time in the coming few years, this genre is almost certain to undergo a reboot, as a Mazda MX5-type MMO appears that appeals to those millions of former MMO players who are currently treading water in single-player games or MOBAs such as Defense of the Ancients. This ebb and flow of overspecialising and overshooting will continue in the next two decades, but should result in a clearer partition between games for particular audiences. You don't need a million players to make money from an MMO: you can get by with 20,000 if you design for it. So long as the niches aren't sealed off from the rest of the gaming world (either by interface, subject matter or impenetrable gameplay), we should see a stable market develop in which anyone who wants to find a game that works for them will be able to do so.

One final point about the industry of the future concerns revenue models. At the moment, free-to-play rules for online games; given that the trend is for pretty well all games of any substance to have an online component, we can expect to see its dominance continue. Will it still be the main way to pay for games 20 years from now, though?

Free-to-play works for games with large user bases, but it doesn't scale down well for games that don't have them — especially if those games are not social in nature. Free-to-play will still be big, but there will be stronger dividing lines between what is and isn't acceptable to sell to those people who don't want to play for free. Whales will have learned that virtual goods are ephemeral and that virtual friendships come and go; they'll therefore be more discerning in deciding where to spend their money. Also, there will always be a market for games that eschew free-to-play, because of the negative impact the concept has on fairness (although "fairness" is a relative concept: one person's pay-to-cheat is another person's pay-to-explore).

It's certain that other revenue models will be tried. Economic conditions could mean older ideas such as per-hour charging will be given a second chance (and fail). It's unlikely that the classic pay-to-own paradigm will be superseded, though: development costs for high-end games may come down, but they won't come down so much that recouping them through the one-off selling of something (even if it's only access to a server) is no longer necessary.


* * *

What will the games themselves be like 20 years from now?

Well, that's easy: they'll be better!

Fair enough, but "better" in what way? Different people like different things: your idea of a better game might be someone else's idea of a worse one. Interestingly, though, we can probably say that, for any individual player, the play experience is on the whole likely to be an improvement on their current one, at least from that player's own point of view.

The reason we can say this is that although games have been played since before there were people (bears play games), only now, with videogames, are they being played by adults in large numbers. Most UK residents in their 30s either grew up playing games or grew up among others who played them. We've been schooled in playing games. In playing them, we've learned more about playing them, and our understanding of them as players has matured. You may have enjoyed The Very Hungry Caterpillar when you were learning to read, but your preferred reading material today isn't The Very Hungry Caterpillar of the Rings. You've moved on. So it will be with games: people's tastes will refine and they'll look for games with more depth and meaning.

A consequence of this will be that game designers will become better known. You won't have to go to MobyGames to find out who the lead designers for the first Medal of Honor were (Christopher Cross and Lynn Henson) any more than you'd need to go to a specialist site to find out who wrote Harry Potter and the Philosopher's Stone (one J. K. Rowling, apparently). At the moment, there are very few celebrity game designers, mainly because the developers like it that way: as in the early days of film, they'd prefer you to think of a Nintendo game or a Bioware game or a Popcap game. This will change when developers realise that some people really are better at designing games than others and start to poach talent. It's already happened to some degree, although not always with success (Richard Garriott's Tabula Rasa is no longer with us). We can expect to see it occur with increasing frequency, though, and with more reliable results.

When designers can call the shots, they will begin to do so. They will be allowed side projects to indulge their playfulness and self-expression. Some of those side projects will become monster hits. No-one will know which, least of all the designers themselves, but it'll happen. We could see whole new genres appear this way by 2033.

The increasing celebrity of designers will occur in tandem with the increasing acceptance of games as part of popular culture. Those who today worry about games turning us all into child-molesters addicted to murder will be well into their twilight years by then, if not dead. They won't be of any more concern than those today who 20 years ago were wailing about "video nasties". We will see leading figures from the games industry with knighthoods. We could conceivably see one or two in the House of Lords. That's not a joke.

There's another reason that we'll have better games. With the final acceptance of games as something other than low culture, academics will be free to study them for their own sake, rather than having to dress up their research as "serious games" (which is like having to pass off an interest in creative writing as journalism). This means that scientifically valid metrics for "better" will be developed, so ways to improve game designs will open up. Game criticism will extend beyond the "I liked the graphics, it didn't crash and I shot 143 zombies" level we see all-too-often today. Individuals will actually be paid to be full-time game critics.

Game-playing as a spectator sport is unlikely to take off. It can work when there are a limited number of games that are wildly popular (such as Starcraft in South Korea), but it's probable that there will simply be too many games, updating too often, for any single one of them to get the necessary hold.


* * *

Games will be ubiquitous. Games will be playable everywhere. There will always be a game that you want to play. They will be culturally embedded to the extent that the babies being born this month will grow up scarcely able to conceive of what the world would be like without them.

They'll also be nothing special. Some new media form will have come along to supersede them, just as games superseded videos superseded television superseded radio superseded movies superseded music hall superseded theatre. Times change. Games are at the cutting edge of entertainment now, but who knows what lies ahead?

It's 2013. You're an avid player of videogames. You're asked to predict what such "games" will be like 20 years in the future. What do you say?


Copyright © Richard A. Bartle (richard@mud.co.uk)
12th August