When people think about fairness, they often think about social norms and values, or about general moral principles such as equality between humans and impartiality when solving conflicts. Fairness is thus often associated with a genuine concern for other people's well-being. In a variety of scientific disciplines, fairness is even equated with altruism and contrasted with egoism (e.g., De Waal, 1996; Sober & Wilson, 1998).
But is it really appropriate to make such a sharp distinction between fairness and egoism, or to take such a benign perspective on fairness in the first place? Fairness is in the eye of the beholder, and hence, what people believe to be fair may very well depend on what they want to believe is fair. It is therefore likely that, when confronted with a social problem, people tend to evaluate as fair those solutions that happen to benefit themselves. As a thought experiment, imagine an important soccer match between two competing teams, Team A and Team B. Following a tackle by a Team B player, the referee awards a controversial penalty to Team A. In situations of this sort, players and supporters of Team A are likely to interpret the fairness of the situation differently than players and supporters of Team B. Team A members are more likely to believe that the tackle was a severe offense and that the subsequent penalty was fair. Team B members, however, are more likely to believe that the tackle was a relatively minor offense and that the subsequent penalty was unfair. This thought experiment illustrates how fairness judgments may be shaped by egocentrism: The team that benefited from the referee's decision believed that the decision was fair, but the team that was harmed by the referee's decision believed that the decision was unfair.
A plausible reason why fairness judgments often are egocentric judgments can be found in the way people perceive the social situations that they encounter. Research suggests that people have an almost natural egocentric perception of the surrounding social world. This is inevitable: We effortlessly see the world through our own eyes, but we can only imagine what the world looks like through the eyes of someone else. Thus, people experience their own perspective, but they need to infer the perspective of someone else (Caruso, Epley, & Bazerman, 2006). This inescapable egocentrism in human perception has notable consequences for people's fairness judgments. Social psychologists have argued and found that fairness judgments are largely based on how people feel about a situation: People intuitively feel good or bad about a situation, and based on this moral sentiment, they conclude whether the situation is fair or unfair (Haidt, 2001). When the insight that human perception is essentially egocentric is combined with the insight that fairness judgments are based on how one feels about a situation, it follows that people’s fairness judgments are based on the extent to which they experience a particular situation as good or bad for themselves (Epley & Caruso, 2004).
In the following, I illuminate the role of egocentrism in fairness judgments by focusing on two moral dilemmas that are very common in everyday life. The first dilemma is how to distribute valuable resources, such as money, goods, or services. Whenever such resources are distributed, people evaluate whether they received a fair share in the distribution. Such fairness evaluations of resource distributions are referred to as distributive fairness judgments. The second dilemma to be discussed here is what procedures authorities should use to make complex decisions that affect the lives of other people. During these decision-making processes, people often make fairness judgments by evaluating, for instance, the accuracy of the process, the extent to which decision-makers are objective, and the extent to which the people who will be affected by the decision are listened to. These fairness evaluations of decision-making procedures are referred to as procedural justice judgments.
When facing the question of how valuable resources should be distributed, people often feel that there should be a fair ratio between the resources that one receives and the inputs that one has provided. For instance, people generally feel that an employee who works 40 hours a week (the input) should receive a salary (the resource) that is twice as high as that of an employee who works 20 hours a week. Both relative underpayment and relative overpayment are generally considered to be unfair. But in everyday life, things are often not that simple. In many situations, allocation decisions are far more complex, for instance because various inputs are not easily comparable. To illustrate, employees often differ in task demands, levels of responsibility, work experience, education, and the like. People generally do not feel that someone who works 40 hours a week should receive twice as much salary as an employee who works 20 hours a week but carries far more responsibility. Research indicates that in these complex allocation situations, people tend to overestimate their own contributions and to underestimate the contributions of others. As a consequence, people have an egocentrically inflated perception of the resources that they believe they deserve (Messick & Sentis, 1979). Such egocentric interpretations of distributive fairness can have detrimental consequences in a variety of social situations. For instance, research indicates that egocentric interpretations of distributive fairness can impede negotiations: When the negotiating parties differ fundamentally in their perceptions of what possible resource distributions would be fair, it is very hard to reach an agreement that is satisfactory to all parties involved (Thompson & Loewenstein, 1992).
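As a minimal sketch of the ratio principle described at the start of the preceding paragraph, the simple case can be written as an equality of outcome-to-input ratios (the notation O for outcomes and I for inputs is mine, and the salary figures are purely hypothetical):

$$\frac{O_{\text{self}}}{I_{\text{self}}} = \frac{O_{\text{other}}}{I_{\text{other}}}, \qquad \text{for example} \qquad \frac{3000\ \text{Euros}}{40\ \text{hours}} = \frac{1500\ \text{Euros}}{20\ \text{hours}}.$$

Egocentrism enters when the inputs themselves are ambiguous: If each party inflates his or her own I relative to the other's, both parties can sincerely conclude that they deserve the larger O.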
Likewise, when faced with the question of how fair decision-making procedures are, people are more concerned about what the procedures imply for themselves than about what the procedures imply for others. A dramatic illustration of such egocentrism in procedural fairness judgments can be found in an experiment by Lind, Kray, and Thompson (1998). These authors asked in which kind of situation people would feel worse: after experiencing a minor unfairness that is targeted at themselves (e.g., not receiving the appropriate change after buying a bus ticket), or after perceiving a major unfairness that is targeted at others (e.g., someone else being robbed of all their belongings)? To investigate this question, Lind and colleagues designed an experiment that compared two possible situations, both involving sessions with three participants. In one situation, two participants received procedures that people generally consider to be fair on three occasions (i.e., they were allowed opportunities to express their opinions), but they had to witness how a third participant received procedures that people generally consider to be unfair on three occasions (i.e., this participant was consistently denied opportunities to express an opinion). In the other situation, all three participants received a fair procedure on two occasions and an unfair procedure on one occasion. These two situations thus draw a comparison between perceiving a major injustice happening to someone else (the other receives three unfair procedures) versus experiencing a minor injustice targeted at the self (receiving one unfair procedure in conjunction with two fair procedures). Results revealed that ‘privileged’ participants, who received three fair procedures but witnessed how another participant received three unfair procedures, considered the decision-maker's behavior to be fairer than did participants who received one unfair procedure themselves. These findings thus suggest that people consider a minor unfairness that happens to themselves to be worse than a major unfairness that happens to someone else.
Related to these findings is a recent study by Van Prooijen, Van den Bos, Lind, and Wilke (2006). These authors investigated how people respond to indications that a decision-maker may not be impartial, such as when a referee in a soccer match also happens to be a supporter of one of the teams. In particular, how do people respond when this partiality works against them (e.g., the referee is a supporter of the other team) as opposed to when it works in their favor (e.g., the referee is a supporter of one's own team)? In an experiment, a decision-maker (the experimenter) would divide lottery tickets between two participants. Four situations were created in which the experimenter seemed to like or dislike the two participants in different ways: The experimenter seemed to like both participants (positive and impartial), the experimenter seemed to dislike both participants (negative but impartial), the experimenter seemed to favor the participant over the other participant (favorably partial), or the experimenter seemed to favor the other participant over the participant (unfavorably partial). Results revealed that, when participants subsequently received a procedure that usually is considered to be unfair (i.e., they were denied opportunities to express an opinion about how to divide the lottery tickets), participants rated the procedure as most unfair when the experimenter was unfavorably partial. Of importance, participants responded exactly the same when the experimenter was favorably partial as when the experimenter was positively or negatively impartial. This suggests that people associate violations of impartiality with procedural unfairness only when the partiality works against them, and not when it works in their favor. These findings further underscore how egocentric fairness judgments can be.
A remaining question, then, is to what extent people are consciously aware that their fairness judgments are inflated by egocentrism. Based on the idea that people automatically and effortlessly form an egocentric perception of the surrounding social world (Caruso et al., 2006), it stands to reason that egocentrism in fairness judgments also occurs automatically. As such, it is likely that people often are unaware that their fairness evaluations are shaped by a concern to benefit themselves. Preliminary evidence for such unawareness of one's own egocentrism was found in research by Ham and Van den Bos (in press). In an experiment, participants read descriptions of fair or unfair events. These events referred either to the participants themselves (e.g., "You and your colleague do the same work. You make 1400 Euros a month and your colleague makes 4100 Euros a month") or to another person (e.g., "He and his colleague do the same work. He makes 1400 Euros a month and his colleague makes 4100 Euros a month"). After each event, reaction times were used to assess the extent to which justice knowledge was automatically activated. Results revealed that justice knowledge was more strongly activated following self-related descriptions than following other-related descriptions. These findings suggest that when people themselves are involved in a moral dilemma, evaluations of the fairness of the situation are more strongly determined by uncontrollable psychological processes than when only others are involved in the dilemma. Extrapolating these findings to the soccer example at the beginning of this contribution, it seems likely that Team A and Team B members do not always consciously realize to what extent their fairness judgments are influenced by whether the penalty was favorable or unfavorable to themselves.
To conclude, it does not seem appropriate to sharply distinguish fairness from egocentrism, as fairness judgments often are shaped by egocentrism. This is not to say, of course, that prosocial motives and ‘genuine’ morality play no role at all in fairness judgments. Most certainly, people often make a genuine effort to act fairly in moral dilemmas, and they honestly believe that they have others’ best interests in mind. The presently reviewed findings suggest, however, that such “genuine morality” is most likely to successfully shape fairness judgments in situations where people are truly independent evaluators who have no personal concerns at stake in the situation at hand. When people do have personal concerns in a moral dilemma, their fairness-based reasoning is likely to be influenced by egocentrism, presumably without their being very much aware of it. Egocentrism may thus be a very potent factor in explaining why people can differ so immensely in which solutions to social problems they consider to be fair or unfair.
References
Caruso, E. M., Epley, N., & Bazerman, M. H. (2006). The costs and benefits of undoing egocentric responsibility assessments in groups. Journal of Personality and Social Psychology, 91, 857-871.
De Waal, F. (1996). Good natured: The origins of right and wrong in humans and other animals. Cambridge and London: Harvard University Press.
Epley, N., & Caruso, E. M. (2004). Egocentric ethics. Social Justice Research, 17, 171-188.
Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108, 814-834.
Ham, J., & Van den Bos, K. (in press). Not fair for me! The influence of personal relevance on social justice inferences. Journal of Experimental Social Psychology.
Lind, E. A., Kray, L., & Thompson, L. (1998). The social construction of injustice: Fairness judgments in response to own and others’ unfair treatment by authorities. Organizational Behavior and Human Decision Processes, 75, 1-22.
Messick, D. M., & Sentis, K. P. (1979). Fairness and preference. Journal of Experimental Social Psychology, 15, 418-434.
Sober, E., & Wilson, D. S. (1998). Unto others: The evolution and psychology of unselfish behavior. Cambridge and London: Harvard University Press.
Thompson, L., & Loewenstein, G. (1992). Egocentric interpretations of fairness and interpersonal conflict. Organizational Behavior and Human Decision Processes, 51, 176-197.
Van Prooijen, J.-W., Van den Bos, K., Lind, E. A., & Wilke, H. A. M. (2006). How do people react to negative procedures? On the moderating role of authority’s biased attitudes. Journal of Experimental Social Psychology, 42, 632-645.