Avatars you can trust – A survey on the issue of trust and communication in MMORPGs

First published: September 10, 2003
Updated: October 24, 2006

[Graphs missing]

Designers of MMORPGs face hard technical challenges, but as the brief history of these games has made clear, they also need a clear sociological understanding of group behaviour. This article reports on a small survey of MMORPG players addressing the issues of trust and communication. Results are analysed through a theoretical perspective drawing on theories of cooperation and collective action, mainly sociological formulations of economic game theory.

Theoretical background
Human interaction, by any standard definition, requires communication. In order to express our needs and desires, to engage in trade, to ask for directions – not to mention cooperating on a nationwide level – we need the powers of communication. In many cases, however, communication itself is not enough. To coordinate the efforts of building a lighthouse (to take an economy textbook classic) we’ll need the precious resource of trust. If a person is to contribute to the common good, he or she needs to be convinced that other people are not just piggybacking on his or her efforts. If I am to contribute to the lighthouse, I’ll want some assurance that no substantial number of people are free-riding: enjoying the benefits without contributing themselves.

These have been core issues in political science for centuries. The problem – which is really the problem of how society is possible at all – is one of trust. How do agents who feel any kind of discrepancy between personal and collective interests (and are sometimes tempted to look after the former) manage to cooperate? Historically there have been two solutions, which we may refer to as the Neutral Third Party Approach and the Responsibility Through Positive Sum Approach.

The former has been famously phrased by the contract theorists Thomas Hobbes, John Locke, and Jean-Jacques Rousseau. People, in this perspective, understand that they would be better off if they cooperated, but have no way of trusting each other without a neutral guarantor. This guarantor, typically the state, may punish those who break contracts or act against the common good. Thus, even the purely selfish will find it sensible to cooperate.

The Responsibility Through Positive Sum Approach works without a neutral third party. In this view, social order (and general prosperity) may arise through the largely unregulated interaction of selfish agents by way of various mechanisms, most famously the surplus value generated by specialisation. Hence classical economist Adam Smith’s (Smith, 1776/1993) well-known claim that

“It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest. We address ourselves, not to their humanity but to their self-love, and never talk to them of our own necessities but of their advantages.”

Buy a loaf of bread from the baker instead of baking it yourself and you’ll both profit.

This view is echoed in some quarters of economic/social game theory and rhymes well with many observations from biology.

Now, both of these approaches have been known to work under certain conditions. MMORPG designers will often have to strike a balance between the two, letting the system itself assume the properties of the neutral third party while various mechanisms facilitate some degree of emergent social order. The details of how this can (and does) work will not be discussed here. Rather, it is important to understand that multiplayer games are subject to the exact same problems and concerns as any other human group – with one obvious exception: many games have strong competitive elements (indeed one could argue that the less competition we have, the less of a game we have) and such elements differ somewhat from many forms of “physical” human cooperation. This is true to the extent that the competition is what game theorists refer to as zero-sum: a game with a fixed amount of points, where one side wins as the other side loses. Examples of zero-sum games are (stand-alone games of) chess, tennis, and Tekken. MMORPGs are less competitive than Tekken and thus more obviously concerned with social interaction.

However, it is worth noting that many games that may seem (practically) zero-sum have dimensions that rely on trust. For instance, the real-time strategy game Age of Empires (Microsoft, 1999) matched players for multi-player battles through a web interface that required some amount of chatting and opened up a variety of trust issues: players would often lie about their skills in order to find willing opponents (whom they might even have the pleasure of giving a thorough and rating-reducing beating). Thus, whereas the actual battles with standard settings had no trust problems (they were zero-sum and you simply assumed that the opponent was out to get you), the matching interface was rife with such issues.
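In the standard game-theoretic formalisation (a textbook definition, not anything specific to the games above), a two-player game is zero-sum when the payoffs cancel for every possible outcome:

    % Zero-sum condition: one player's gain is exactly the other's loss.
    u_1(s) + u_2(s) = 0 \quad \text{for every strategy profile } s

so that u_2(s) = -u_1(s): my +1 is your -1, and there is no surplus for cooperation to create. MMORPG play, by contrast, is largely positive-sum: a successful party can leave every member better off than playing alone.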

In MMORPGs, of course, cooperation is a necessity, when forming parties or guilds and when facing the need for a variety of character-class-specific skills. Also, these games generally share an ambition of creating worlds, presumably including some sorts of communities. What we have, then, is that most basic of human phenomena: the need for cooperation and the emergence and violation of norms. It may well be possible to establish a template for the emergence of social issues in online role-playing games or indeed many types of online communities. Certainly, many of the more well-documented specimens seem to have followed a common path:

1. The establishment of the system. Users may be few and friendly towards the project. Social issues will not be dramatic.
2. The opening of the world to outsiders who do not share the cooperative pioneering spirit of the first users.
3. Social trouble arising from the abuse of privileges.
4. The implementation of system-level norms and rules and a system of sanctions (failing this, the system may well lose its value and fade away).

On many levels, this schema describes the evolution of systems such as the CommuniTree bulletin board (Stone, 1992), LambdaMOO (Dibbell, 1999; Curtis, 1992), the educational MUD called MicroMUSE (Smith, 1999), the early graphical MUD Habitat (Morningstar & Farmer, 1990:9) and Ultima Online, which was initially plagued by large-scale social trouble. Famously, Blizzard’s Diablo taught many game designers not to expect everyone to voluntarily refrain from cheating: a variety of “hacks” would seriously tip the balance in multiplayer games. It is interesting to note that an informal survey done in 1997 showed that 89% of those who had cheated would have preferred not to have been able to do so (Greenhill, 1997). The design lesson may be to help players stop cheating by leaving the system less open to exploitation – and even not to be afraid to protect players from themselves.

Game-world sabotage (or forms of play that run contrary to the enjoyment of other players) is collectively labelled ‘grief play’. This problem may be decreasing as game worlds are designed with less utopian assumptions about player behaviour, but it obviously still consumes many resources and to some extent dictates design decisions. The FAQ of Mythic Entertainment’s Dark Age of Camelot (2001+) states that

“An unfortunate situation has arisen in several currently-available online games where some game players go out of their way to ruin the gaming experience for other players by killing them repeatedly, “stealing” their monster kills, and generally making a nuisance of themselves. Camelot has several built-in methods for discouraging this behavior.”
(http://www.darkageofcamelot.com/faq/)

Expectations
Successful real-life communities usually fulfil a range of criteria (Ostrom, 1990). More generically, following Robert Axelrod’s seminal analysis (Axelrod, 1984), trust without central command may arise in positive-sum systems characterized by:
• Repeated interaction. The likelihood of future interaction must be sufficiently large.
• Knowledge of interaction history. Agents must be able to recall past interactions.
• Recognition capabilities. Agents must be able to recognize one another.

To this we may add that stable group boundaries and indeed small group sizes may support asynchronous niceness, known as reciprocal altruism (within biology) or generalized exchange (within sociology). In such situations, one agent will cooperate with another without the lure of immediate reward.
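To make these conditions concrete, here is a minimal simulation sketch in Python (the payoff values are illustrative choices of ours, though they follow the standard Prisoner’s Dilemma ordering Axelrod used): the loop provides repeated interaction, the history lists provide knowledge of past interactions, and handing each strategy the correct history presumes recognition.

    # Minimal sketch of Axelrod-style repeated interaction (illustrative payoffs).
    # "C" = cooperate, "D" = defect; payoffs follow the standard Prisoner's
    # Dilemma ordering: temptation 5 > reward 3 > punishment 1 > sucker 0.

    PAYOFFS = {  # (my move, their move) -> my payoff
        ("C", "C"): 3, ("C", "D"): 0,
        ("D", "C"): 5, ("D", "D"): 1,
    }

    def tit_for_tat(opponent_history):
        """Cooperate first, then mirror the opponent's previous move."""
        return "C" if not opponent_history else opponent_history[-1]

    def always_defect(opponent_history):
        return "D"

    def play(strategy_a, strategy_b, rounds=50):
        """Return cumulative payoffs after `rounds` repeated interactions."""
        seen_by_a, seen_by_b = [], []  # what each agent remembers of the other
        score_a = score_b = 0
        for _ in range(rounds):
            move_a = strategy_a(seen_by_a)
            move_b = strategy_b(seen_by_b)
            score_a += PAYOFFS[(move_a, move_b)]
            score_b += PAYOFFS[(move_b, move_a)]
            seen_by_a.append(move_b)  # A remembers what B did
            seen_by_b.append(move_a)  # B remembers what A did
        return score_a, score_b

    print(play(tit_for_tat, tit_for_tat))    # (150, 150): reciprocity pays
    print(play(tit_for_tat, always_defect))  # (49, 54): defection wins little

Two reciprocators prosper, while the defector gains only a marginal short-run advantage – the positive-sum logic that can make trust rational without a neutral third party.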
Now, if players recognize these conditions on any level, they may be expected to desire features that enable trusting in-game relationships to form, most notably: strong communication features, limited and stable group sizes, persistent user identities (to enable recognition), and memory support in the form of note-taking or the ability to attach labels to other players’ profiles. Of course, players might also care nothing for trust and simply enjoy the lawless anarchy of online gaming.

Methodology
Survey methodology – as indeed all methodology – is fraught with problems and pitfalls for the unwary. On a general level we should be sceptical about people’s self-perceptions. Asking someone about her media use, for instance, may yield highly non-factual answers. We are not completely conscious of our daily lives and habits, and we all present self-images clouded by wishful thinking – at times even to ourselves. Most obviously, people tend to downplay media use perceived as vulgar in favour of more socially respected pastimes (e.g. Lewis, 1991:53). Also, some types of knowledge cannot be put into words. While we can ask someone if he or she can ride a bicycle, we cannot ask someone how he or she rides a bicycle: riding a bike is not an entirely conscious process. Similarly, we cannot ask someone directly how he or she communicates with others or evaluates the personalities of others. Thus, the answers given to this type of question in the survey may not be accurate.

In this particular case the respondents were found on a limited number of websites and newsgroups. Or rather, they found the survey themselves, since they were perfectly able not to take it. Thus, the respondents who did choose to answer were self-selected. This introduces bias, since the sample is not representative. It may be that the opinions and habits of the hard-core gamers who answered are interesting to us (or to designers), but on the whole the results should be considered indicative rather than conclusive.

Practical approach
The survey was advertised, with an introductory text, at www.game-research.com between October 5, 2001 and January 8, 2002. In addition, respondents were recruited in a variety of USENET newsgroups. The questionnaire itself was web-based and, besides basic demographic questions, consisted mostly of closed questions in which respondents were asked to rate statements such as “Communication/chat with other players is an appealing part of online gaming.” Results were analysed for statistical significance within single questions (could the outcome be a coincidence?) and between questions (for instance, do respondents who value communication/chat also find that users should have persistent user names?). Significance, here, is measured at the level of p < 0.05.
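The exact statistical procedures are not detailed here, but as an illustration (with invented counts), a within-question analysis of the kind described can be run as a chi-square goodness-of-fit test against a uniform distribution, and a between-question analysis as a chi-square test of independence on a cross-tabulation:

    # Illustrative significance tests of the kind described above.
    # The counts are invented; only the method is meant to be shown.
    from scipy import stats

    # Within a single question: do the answers depart from a uniform spread
    # over "never" / "rarely" / "sometimes" / "often" / "all the time"?
    observed = [12, 35, 88, 95, 28]
    chi2, p = stats.chisquare(observed)
    print(f"within-question:  chi2={chi2:.1f}, p={p:.4f}, significant={p < 0.05}")

    # Between two questions: is valuing chat independent of wanting
    # persistent user names? (Hypothetical 2x2 agree/disagree table.)
    crosstab = [[120, 30],
                [40, 68]]
    chi2, p, dof, expected = stats.chi2_contingency(crosstab)
    print(f"between-question: chi2={chi2:.1f}, p={p:.4f}, significant={p < 0.05}")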

Survey results
The most significant results of the survey are presented below. Whereas this discussion focuses mostly on significant distributions within single questions, analyses other than the ones discussed here may well be performed on the data.

Respondent demographics
Respondents were, not surprisingly, overwhelmingly male (91.7%). 42.5% were in their twenties, while the mean age was 24.7. Whereas Americans constituted the largest group (42.4%), British respondents accounted for 17.9% of all responses.

Saboteurs are a problem
Online gamers, of course, are a motley crowd. Different game genres may present different problems of cooperation and different player types may have different concepts of fun. Furthermore, we may speculate that people who find online gaming worthwhile at all do not find the problems to be critical.
Graph 1, however, shows that respondents do think that saboteurs are a problem. Even if we consider the middle category “sometimes” as a statement of neutrality towards the issue (as is done throughout the following), a significant number (41.4%) reply that saboteurs are a problem “often” or “all the time”.

Graph 1: “To what degree do you find that online gaming is troubled by saboteurs (player killers, cheaters etc.)?”

Establishing trust
When players (or indeed avatars) meet, they will often want to gauge each other’s trustworthiness, whether to engage in trade or dragon slaying. Respondents were asked how they evaluate such trustworthiness by rating the following statements (among others):

• I judge by the seriousness of their user names
• I judge them by their writing skills and apparent level of education
• I judge them on the basis of dialogue (value statements etc.)

On the whole, user names were not taken by the respondents as valuable indicators of personality or intentions. One could speculate that silly or youthful names would signal low trustworthiness, but respondents claim that this is not the case (at any significant level).

On the other hand, writing skills and apparent level of education are considered an important indicator. It might well be that paying attention to grammar and wording in general comes across as a commitment to the interaction: a communicator who is willing to spend time and effort on an exchange is likely to be serious about future commitment. It also means, of course, that good communicators (people who are used to textual interaction) have clear advantages when self-representation consists only of text.
Whereas form is important, actual statements and choice of subject matter appear to be even more crucial. Disregarding those who answered “sometimes” (29.5% of all), 81.4% of the remaining group claim to judge others on the basis of dialogue “often” or “all the time”. This is hardly surprising. Value statements go to the heart of trust, and it would be strange not to take stock of extreme statements of egoism or altruism (although in some settings, one might be sceptical of the latter sort).

Graph: “I judge them by their writing skills and apparent level of education.”

Graph: “I judge them on the basis of dialogue (value statements etc.).”

Graph: “I judge them by their reputation (e.g. by asking others).”

Graph: “I judge by the seriousness of their user names.”

Design preferences
Respondents were also asked to evaluate a small series of design feature proposals and a few more general statements, all directly related to the issue of trust. We might expect the gamers to desire strong communication features and to want ways of handling saboteurs. In particular, if the respondents follow predictions derived from the theoretical perspective outlined above, they should want permanent identities and clear connections between gamers and their user names (i.e. user names should be more or less permanent).

The respondents, in fact, agreed to a high degree that it should be possible to hold others accountable by attaching labels to their user profiles (much as is done on eBay). The responses also stressed the importance of persistent identities. Not all respondents agree, of course, but on those two issues the respondents in favour of such measures outnumber those opposed.
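As a rough design sketch of those two features combined (all names and fields below are hypothetical, not taken from any existing game), a persistent user name is the hook that makes attachable reputation notes meaningful:

    # Hypothetical sketch: persistent user names plus attachable notes,
    # roughly in the spirit of eBay-style feedback. Not from any real game.
    from dataclasses import dataclass, field

    @dataclass
    class Note:
        author: str   # persistent user name of the note's author
        text: str
        rating: int   # -1 negative, 0 neutral, +1 positive

    @dataclass
    class PlayerProfile:
        user_name: str                          # permanent, hard to change
        notes: list[Note] = field(default_factory=list)

        def attach_note(self, author: str, text: str, rating: int) -> None:
            self.notes.append(Note(author, text, rating))

        def reputation(self) -> float:
            """Naive mean rating; 0.0 for a player nobody has rated yet."""
            if not self.notes:
                return 0.0
            return sum(n.rating for n in self.notes) / len(self.notes)

    # Allies sharing such notes let reputation travel ahead of first contact.
    suspect = PlayerProfile("DragonSlayerX")
    suspect.attach_note("Alyra", "stole my monster kill", -1)
    suspect.attach_note("Borin", "traded fairly", +1)
    print(suspect.reputation())  # 0.0: the two ratings cancel out

The point is simply that without the permanent user_name, the notes would have nothing stable to attach to – exactly the recognition condition from the theoretical section.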

Interestingly, though perhaps not surprisingly to most, the respondents value communication for its own sake (not just as a necessary evil). This should not be taken to mean that what they really come for is the company – if that were the case they could fulfil their needs in other (much cheaper) systems, such as bulletin boards or instant messengers. But communication does seem to be a major reason to choose online play over single-player fun.

Graph: “It should be possible to attach notes to other users about their reliability etc. and to make these notes available to friends/allies.”

Graph: “Communication/chat with other players is an appealing part of online gaming.”

Graph: “Online games should focus heavily on communication features enabling cooperation between players (pooling resources with allies, teaming up etc.).”

Graph: “Players should be clearly connected to user names (user names should be permanent/persistent and/or hard to get).”

Graph: “There should be strict limits as to how many players are let into the same game world (or game room etc.).”

Graph: “Management should try to let players work out their difficulties before stepping in.”

Graph: “Communication/chat with other players is a necessary but not appealing part of online gaming.”

Graph: “New players should have restricted powers within MUDs and roleplaying games until they’ve proven themselves in some way.”

Conclusions and perspectives
On the issue of actual in-game player behaviour one must not place too much stock in player perceptions. Player claims, however, may tell us what players look for in games and give us a general impression of what features they value and would like to see improved. Importantly, saboteurs or grief players trouble many online games, and even where they don’t, we may want to ask whether the game designers are avoiding unconstructive behaviour at the cost of restrictions on player freedom.
The results presented here indicate that gamers, consciously or not, are concerned with issues of trust and cooperation. They tend to prefer design features that facilitate constructive behaviour. Such features have been studied intensively by disciplines such as political science and sociology, and it seems likely that game designers would benefit from paying attention to these disciplines.
In the future it would be interesting to document which concrete design features lead to which types of behaviour. By systematically and empirically studying the sociology of MMORPGs we may even be able to generalize results and thus provide valuable knowledge that extends far outside the field of games. Just as game designers may benefit from the insights of sociologists, so the study of society and politics may look to virtual worlds for valuable data and ideas.

Literature
• Axelrod, Robert (1984). The Evolution of Co-operation. London: Penguin Books.
• Curtis, Pavel (1992). Mudding: Social Phenomena in Text-Based Virtual Realities. Proceedings of Directions and Implications of Advanced Computing, Berkeley, California.
• Dibbell, Julian (1999). My Tiny Life. London: Fourth Estate.
• Greenhill, Richard (1997). Diablo, and Online Multiplayer Game’s Future. Games Domain Review.
• Lewis, Justin (1991). The Ideological Octopus – An Exploration of Television and Its Audience. London: Routledge.
• Morningstar, Chip & Farmer, F. Randall (1990). The Lessons of Lucasfilm’s Habitat. In: Wardrip-Fruin, Noah & Montfort, Nick (eds.) (2003). The New Media Reader. London: The MIT Press.
• Ostrom, Elinor (1990). Governing the Commons – The Evolution of Institutions for Collective Action. Cambridge: Cambridge University Press.
• Smith, Adam (1776/1993). An Inquiry into the Nature and Causes of the Wealth of Nations. Oxford: Oxford University Press.
• Smith, Anna DuVal (1999). Problems of conflict management in virtual communities. In: Kollock, Peter & Smith, Marc (eds.). (1999). Communities in Cyberspace. New York: Routledge.
• Stone, Allucquère Rosanne (1992). Will the Real Body Please Stand Up? – Boundary Stories about Virtual Cultures. In: Benedikt, Michael (ed.). Cyberspace: First Steps. Cambridge: The MIT Press.
NOTE: This article replaces a briefer version previously published at this site. For further discussion of the results and a more detailed theoretical framework, please see my MA thesis, The Architectures of Trust – Supporting Cooperation in the Computer-Supported Community.