Games, historically, have been for the player or players.
The word game comes from Old English gamen, which carried meanings of “joy, fun, amusement.” Gamen is a common Germanic word; cognates include Old Frisian game “joy, glee,” Old High German gaman “sport, merriment,” Swedish gamman “merriment” and several others. One proposed etymology analyses these as compounds of the Proto-Germanic elements ga- (a collective prefix) and mann “person,” the latter being the ancestor of the English word man. On that reading, the word originally conveyed the sense of “people together.”
The primary goals of games are to pass time and to foster companionship. Games have always existed for our entertainment in some form, whether as displays of athleticism in sports, battles of wits over a chessboard or brief amusements that children invent for themselves.
Video games, a subset of electronic games, involve interaction with a user interface to generate visual feedback on a video device (as opposed to audio games, whose feedback is solely audible).
Some of the earliest video games were simply digital recreations of games and sports that had been around for a long time. Pong (1972) was nothing more than an electronically rendered game of table tennis. Pong and its kin had the same player-centric goals as their “real-life” counterparts; the only difference was that they could now be played on a screen.
Those who created, or developed, these primitive programs competed merely to provide the most functional and accessible product in an infant market. Anything beyond that was secondary.
With the hardware advances of the next couple of decades, however, came more complex and sophisticated video games. Games were no longer just a few moving pixels. Better technology meant superior graphics and more detailed game environments – and thus greater room for creativity.
Early computer role-playing games (RPGs) had long involved personal player narratives, but games in other genres were only beginning to incorporate set, unchanging plots for the player to experience through gameplay. Some game development studios were building video games with story in mind and started employing cinematics, or “cutscenes,” for the player to watch between levels.
By the early years of the new millennium, many games were fully voice-acted and required significant amounts of animation. Music had evolved from the beeping melodies of Super Mario Bros. to full orchestral scores with dynamic, layered sounds to accompany the player’s journey.
Within a few more years, it had become commonplace for major game development studios to utilise motion-capture (mo-cap) technology in their animation. Video games can now render characters in unprecedented detail and present more nuanced stories. Games that do this particularly well, such as Heavy Rain and The Last of Us, have been known to evoke sincere emotional reactions from players.
Today, the development of a major video game (a “triple-A,” or AAA, title) is not unlike that of a film. A video game is no longer exclusively about gameplay. The typical team behind a AAA project now numbers hundreds of employees. In addition to the requisite programmers, studios hire writers, directors, actors, sound designers, music composers and sketch and concept artists, among others, to bring a game to life. And they have ludicrously large budgets to accomplish this.
The amount of work involved in developing a well-made video game is staggering, to put it lightly. The process is a far cry from the early days, when a team was a group of computer geeks numbering no more than fifteen or twenty. Video game development has established itself as a multi-disciplinary artistic medium on par with other artistic media such as literature and cinema.
However, video games are unique among these media in that they require player input via a controller or a keyboard and mouse. What happens on-screen is directly influenced by the player, whereas a novel, a film or a painting is the same every time it is experienced.
This means that the player is still the key ingredient that makes a video game a video game (after all, a video game without the game part is really just images on a screen). And this, in turn, means that the game needs to be engaging and fun. If it is boring, it becomes more of a chore than entertainment, and people won’t play it.
The expectations of gamers have always leant toward immersion. They want an escape from the harshness of real life, a fictional world in which they can do things they could not or would not do in the real one. Immersion is achieved chiefly through stimulating gameplay and realistic visuals, though there are other methods. Some of the industry’s most popular franchises built on these pillars of immersion fall into the action, open-world and first-person shooter (FPS) genres. Grand Theft Auto, The Elder Scrolls and Call of Duty are all wildly successful commercially because they deliver what the average gamer expects of an immersive experience, and deliver it well.
Developers understand all this, but, at the same time, they have an artistic vision that they want to realise – a vision that may be stifled by the tried-and-true gameplay mechanics and tropes of yesteryear.
So, is the video game primarily a toy for the consumer’s enjoyment – or is it primarily a creative vehicle for the developers?
It may be true that the developers do not owe the consumer anything, and that nobody is forcing anybody to buy the developers’ product, but, at the end of the day, it’s the consumer’s dollar that keeps the developers afloat.
An independent game development studio must weigh this commercial reality when crafting its games, but developers owned by a publisher (a relationship akin to a musician signed to a record label) have no choice but to bend to the will of their financial overlords, who make broad creative decisions in accordance with what they know gamers like and will purchase (or will beg their parents to purchase for them). These decisions often hamper the creative potential of the developers and, in some cases, can undermine the essence of a beloved franchise.
There is a great deal of evidence that the story of Halo 5: Guardians was substantially weakened (if not ruined altogether) by interference from its publisher, Microsoft Studios. While developer 343 Industries’ debut Halo game, Halo 4, featured a strong, critically acclaimed story, many fans were displeased with it – mainly because its tone departed starkly from that of the original trilogy and because one of the series’ most treasured characters was killed off. It is believed that, following such poor fan reception, the script of Halo 5 underwent a number of significant rewrites relatively late in development at the urging of Microsoft – most notably the senseless revival of the dead character and her asinine relegation to villain status. The story was supposed to be the best and most riveting to date, but it fell flat because the developers were forced to abandon their original vision.
Conversely, the multiplayer side of Halo 5 flourished because of fan input. The general consensus is that it is superior in almost every respect to that of Halo 4, largely because of the whining—er, I mean constructive feedback—of the community that followed 4’s release.
But I digress. The point I am trying to establish is that, with a video game franchise like Halo, ownership can be ambiguous. It could be argued that the members of a fanbase with that much creative influence are the true “owners” of the game.
More conservative gamers might say that this is how it should be: the developers listening to the buyers and tailoring their games accordingly. Those more liberal in the matter might say that the developers should not be pushovers who let consumers dictate what goes into the studio’s product.
Me? Well, I suppose I have an obligation to take a stance here, as it is often considered bad creative etiquette to raise philosophical questions without even attempting to answer them. Interestingly, my doing so is, itself, an instance of yielding to consumer pressure to alter my content.
I’ll cop out and say that I fall somewhere in the middle. I feel that players absolutely do have a say in the video games they buy. But it cannot be denied that there has been a shift from the player-centric goals of the video game toward goals more artist-centric.