Evolution and politics
The notion of biological evolution as a steady movement from low to high extended itself to the movement of history—or ran in parallel with that notion, since Hegel and Marx both saw history as a movement toward some high(est) end. Social Darwinism took a similar position: we should get out of the way of survival of the fittest and let it take its objective, non-judgmental course toward steady improvement of the species.
Mendel, and later the analysis of DNA structure (along with an understanding of mutation), supplied a comprehensible evolutionary mechanism, with the necessary refinement that survival of the fittest refers to the propagation of genes into offspring that will live to sexual adulthood and propagate in turn. While the genetic understanding did not intrinsically require that evolution be viewed as progress from low to high, the judgment persisted that evolving from one-celled organisms to humankind represented just such a movement. No less was there a sense of social evolution: with every generation society was getting better and better—despite the lack of universal agreement about the exact meaning of “better” or of a “perfection” toward which social or biological evolution was supposedly moving. Within the Western “liberal” tradition, the Idea of Progress thus seemed a naturally inevitable concomitant of what became, to all but religious conservatives, the patent truth of some kind of evolution. Faith generally grew that science would provide elixirs to solve vexing, insistent problems of life: supplying energy, reducing onerous (as opposed to enjoyable) human labor, dazzling us with gimmickry that would entertain and serve us, providing greater leisure time for greater individual creativity or just gratification.
The exact way evolution works ran into controversy—e.g., steady, slow adaptation versus punctuated equilibrium. But these differences represented interesting debate over alternative mechanisms, not fundamental questioning of the general truth of biological evolution.
Sociobiology, an approach of recent decades, seemed to me a fruitful way of considering parallels in cultural evolution. The left mostly distrusted it, on the grounds (or fear) that it encouraged a determinism about social roles harking back to notions that we all belong where we are in life. But that never seemed to me a necessary consequence. On the contrary, I felt that if we could identify ways in which culture evolved (not in any inevitable sequence, but as a result of accident and experimentation), we could work to undermine cultural forces that seem unfair, better anticipate the consequences of social engineering, and construct useful alternatives. I still feel this, but the issue no longer feels so important, perhaps because such a “science” would in fact be so incapable of certainty.
In the West, somewhere in the 1970s, the idea of progress—the conviction of its inevitability, and indeed what exactly it meant anyway—ceased being a given. Causes likely included a failing economy; increased awareness of the ecological destruction our technology was producing; and a rising rate of cancer that must at least in part reflect the steadily increasing amounts and expanding kinds of chemicals to which our bodies had never had a chance to adapt, or to which, if somewhat adapted, they were now exposed at levels too high for that adaptation to help us. Genocide as a tool of civilization, which we wanted to believe had ended with the defeat and exposure of Nazism in particular and fascism in general, crept back into our vision—in Cambodia, Rwanda, the Balkans. Especially starting with the Reagan administration, liberal (not radical) agendas we had thought entrenched suffered steady reversals, and leftish political activity increasingly had to focus on holding the line on (and trying to reinstate) social and legal values we had once fought hard to institute.
What in the West during the 60s and early 70s had seemed to some of us (despite the election of Richard Nixon) like heady times presaging permanent radical change evolved into a cultural environment encouraging us to doubt the long-term effects of what we considered meaningful social change. The degree to which people voted their pocketbooks (or their conceptions of what affected their pocketbooks), without regard to well-being beyond themselves, became more and more dismaying. Politicians who were supposedly heirs to at least liberal views and often more progressive ones—Mitterrand, Blair, Schröder, Clinton—grabbed for the middle at best, developing “realistic” programs that sought to pre-empt rightward policies by instituting ones only a bit less right.
For all I know, such courses are inevitable: when people get into power they face, or come to believe they’re facing, political conditions that simply won’t permit the luxury of prettier ideological stances. We can certainly see the same phenomenon on the right, as with the “moderate” stances war criminal Ariel Sharon felt compelled to take, albeit kicking and screaming, to fend off formerly familiar allies.
The distinction would be between being in power and out of power. Al Gore, a political figure I never liked from when I first knew of him in the mid-80s—he seemed glib, narrow in his thinking, seeking to say only what he thought would be popular—came into his own only after “losing” the 2000 presidential election. I suspect that had he won he would have been Clintonian in general, though perhaps a stronger supporter of environmental causes. Although his being out of power allowed the Bush administration to run roughshod over the economy, civil liberties and international “diplomacy,” it did seem to give Gore a voice at last. Though it has some flaws, on the whole I find his film, An Inconvenient Truth, an enormous success in rendering the history of complex ecological and environmental issues understandable. I suspect that film has done more to legitimize environmentalism as a vital social concern than any other single activity or person. Just as the Soviet Union never looked so good as in the time since its demise, Gore has never looked so good as now, when he can have no direct involvement in the government.
Alongside the deterioration of the ability to believe in inevitable social progress were forces like postmodernism. Apparently I was a premature postmodernist: by the end of my undergraduate major in philosophy (1963), I had concluded what I have held to this day: that “objectivity” in any traditional meaning of the term is impossible, and that all ideas and observations, including those of science, are filtered through the individual brain, which no matter the effort cannot avoid some preconceptions that bias the results. Radical skepticism is unavoidable. This is a modernized version of Kant’s formulation of innate categories that keep us from seeing any things-in-themselves, or of Plato’s allegory of the cave.
As I used to say to my students, however, the principle that everyone has a right to his or her own opinion does not mean that all opinions are equally right; we can still build standards that reduce, even minimize, our inherent subjectivity, so that we can in fact share knowledge and its implications fairly well. My own bias is toward a naïve realism that says most of our empirical observations are close to accurate. “Most” (perhaps “many” would be apter) is crucial: we need always be on guard that a given moment is not one of those times when our senses are deluding us. I also suggest that personal honesty, the effort to be as aware as possible of our own subjectivity, is crucial to communicating more than solipsistic (or narcissistic) ideas. Inevitable as subjectivity is, we are nonetheless part of a world that consists of seemingly endless other subjectivities, and somehow we do find ways to ameliorate the theoretical alienation that should result.
Viewed in one way (a way I’d like to encourage), acceptance of the impossibility of “objectivity” should produce some humility (though it can also produce arrogance) to soften the edges of human beings’ separateness from one another. It can make us more receptive to other points of view, more patient in working out our own ideas and understandings, more collaborative with spousal figures, children, friends, organizations, even enemies. As it can run afoul of arrogance, so recognition of subjectivity can make us too malleable, too accepting of conditions (I think especially of injustice) on the grounds that no one can say for sure what is right and wrong.
I can offer no hard-and-fast rule for not falling victim to the perils of misunderstanding the limitations on our judgments. Like any other principle we wish to affirm in life, it is ultimately insusceptible of rational demonstration or control. All reasoning starts with a priori principles; only if others accept a particular set do we have much chance of communicating. An obvious example: debating religion gets nowhere unless the debaters share some basic religious attitude—usually, belief (or disbelief) in a deity. Those of us who have supported civil disobedience have certainly attempted to persuade others of our rightness; but there is really nothing obvious or inevitable about our reasoning, any more than there is about the reasoning of those who would sit in at abortion centers (or gun down abortionists on the grounds of doing god’s work).
Of course, people try to get around this problem by finding some common assumption on which to build an (often tenuous) argument. In the worst cases, this happens by appeal to people’s worst instincts or reflexes, as commonly occurs when national leaders seek support for war-making or other destructive policies. After 9/11, at least for a while, there was no holding back the upsurge of militarism in supposed defense of personal security against the new satanic force, terrorism. Warning voices (which proved accurate) beseeching people to consider the longer-term effects on Muslims around the world (notice that this appeal rests on the same assumption as the war-mongers’—that we all want to feel secure) were condemned as sell-outs, cowards, bleeding hearts, traitors (fill in your own favorite ad hominem label).
What I am saying here is largely not new to me. What is new is the attempt to explain specifics, at least in the US, about the discouraging nature of the last few decades for ongoing, meaningful social change in the service of social justice.