The last post explored what I would call a sympathetic, liberal understanding of the Doctrine of Total Depravity, set against traditional liberalism's affirmation of human perfectibility. I am not saying that the former is true and the latter incorrect; one can view human nature in either light. But I am saying it's worth exploring the hopeful pessimism of the dimmer view of human nature. After all, is it easier to do the right thing, or the most convenient thing? And how are we motivated to do the former in preference to the latter?
The second of our Unitarian Universalist Principles covenants us to affirm and promote justice, equity and compassion in human relations. One would like to hope that almost everyone would agree that we should treat people well and treat people fairly. And while the optimist can take cheer from the observation that, from day to day, people are often kind and fair enough to each other, there is plenty of room for pessimism. It is no surprise to survey not just history but the present, and observe how frequently people perpetrate cruelty and injustice. At the historical level, this can rise to the proportions of the Holocaust, while at smaller scales, for example, we easily excuse ourselves for buying products made in sweatshops.
Much psychological research has been inspired by attempts to understand the conditions that made the Holocaust possible. Why did (and do) people perpetrate such cruelty? Until the work of Henri Tajfel, psychologists had put this down either to individual personality (some people are more inclined toward evil than others) or to competition for resources. Tajfel's work in the 1970s suggests that group loyalty and social identity may have much to do with favoring the in-group, and that it takes remarkably little to create in people a sense of group identity to which they can be loyal.
Tajfel set out to investigate the minimal conditions under which people act on a sense of group identity. In his first experiments, he showed 14 to 15-year-old boys a series of abstract paintings by Wassily Kandinsky or Paul Klee and asked which of the two they preferred. He then told them they had been placed into one of two groups based on this preference, when, in fact, they had been randomly assigned to one of the groups. He then gave them a workbook through which they would distribute a small financial reward to other participants in the study. They could not reward themselves, and all they knew about the other participants was which group they had been placed into. Tajfel found that the schoolboys tended consistently to reward those in their own group more than those in the other group.
Further work has demonstrated that this result generalizes to adults, that we will show group loyalty even if our group assignment is based on a coin toss, and that we will discriminate against people who, we are told, have been transferred from the other group into our own. We will even prefer to assign less to our own group as long as we can assign even less to the others (for example, giving our own group 11 coins and the other group 1, rather than giving our own group 15 and the other group 10).
Tajfel found that in-group favoritism does not require us to have much in common with our group. We need have no future accountability toward its members, and no personal stake in the resources being distributed. How much more, then, might we act on prejudice when we are bound to others in our group by culture, language, blood and affection, and can expect reciprocal favoritism from them in the future?
Before Tajfel's work on group identity, Stanley Milgram had demonstrated that more than two thirds of his subjects were willing to remotely administer what they believed were lethal electric shocks, so long as they were ordered to do so by a white-coated scientist and believed it was "all for the good of science". It is heartening that almost one third of the subjects refused to continue, though they placed themselves at no risk by refusing. I wonder what the proportions would be if the disobedient could be summarily executed.
And before that, Solomon Asch demonstrated that subjects were willing to acquiesce in a group's obvious misjudgment. Briefly, the experimental subject was shown three lines of different lengths on one card, and on another card a single line matching one of the three. The stated task, performed as part of a group, was to match the single line to one of the three. Unknown to the subject, however, the other members of the group were part of the experiment, briefed beforehand on whether to unanimously give the correct answer or an incorrect one. The room was arranged so that the subject would be the last or next-to-last to respond.
Asch found that a surprisingly large proportion of subjects would conform to the group's obviously incorrect answer. About three-quarters of subjects went along with the group at least once, and overall participants went along with the group a little under one third of the time. Acquiescing in an incorrect assessment of line lengths is of little moral importance, and the subjects were under little pressure to conform. But in circumstances where going along with the group is immoral, there is often coercion to conform and danger in dissent; at the very least, there is the possibility of breaking existing relationships.
All of this research was conducted on "ordinary" people. They were not what we might think of as especially evil; indeed, they were good people like you and me. I imagine that, as social animals, we are deeply programmed for loyalty and trust. It is certainly not difficult to imagine evolutionary grounds for our tendency to trust authority and be loyal to our group: groups whose members willingly gave loyalty and trusted their leaders likely prevailed over groups whose members did not. That this behavior was once successful doesn't mean it is now good, or that it ever was.
None of the above settles whether we can climb out of this state of affairs by our own means, or whether we need prompting from something beyond us. Here's how I see it. The depth of our programming for group loyalty and trust makes it very difficult - impossible, maybe - to know whether our social motivations are good. And if we separate from groups and leaders, we are left on our own, and as individuals I don't believe we are any better off. How can we tell whether we are being selfish or not? Or whether we are rationalizing instead of reasoning?
Given my pessimistic description of human nature, a cynical response might be to go along with the survival of the fittest; and a despairing response might be to give up altogether on working for justice, equity and compassion.
Rather, I would choose a hopeful response. There are resources that can tip the balance, that help us to do the right thing as we discern it, rather than the merely convenient thing, even in the face of possible hardship. And this is what religions should provide. These resources don't make us perfect, and they don't make us better than other people. But perhaps they can transform us, helping us grow toward ways of being we could not previously have imagined. And while, no doubt, these resources exist elsewhere, I have found hope and Good News in Christianity among Unitarian Universalists. Not that we're all pessimists about human nature; some, maybe most, disagree with me. But God is the ground for our debate, not a weapon.
Next time: What happens when we miss the mark, and how we might be saved from it.