Matt Ridley referred recently to some new research on a population of free-living monkeys (macaques) which claims to show that individuals carrying a specific version of two genes tend to be less social.
But, as Ridley points out, there is nothing very surprising here, as studies of human twins have previously found evidence of genetic heritability with respect to sociability and related traits similar to that reported in the monkeys.
'The mutations making monkeys less social,' Ridley explains, 'have been tied to anxiety and a tendency to avoid risks. This may explain why they persist. Although well-connected monkeys generally have more offspring, anxious monkeys may be more vigilant to threats.'
It may not be drawing too long a bow to see here an embryonic simian parallel to the conservative/liberal divide, as traits like anxiety and risk aversion are normally associated with a conservative tendency.
The weight of research evidence has gradually muted the protests that this sort of research (linking genes and behavior) used to inflame.
Ridley alludes, for example, to the controversies sparked by the pioneering work of Seymour Benzer, who set out in the 1960s to find mutations in fruit flies that affected behavior. [On a slightly different tack, Benzer's mother was quoted as asking skeptically: 'From this, you can make a living?']
Benzer 'was soon able to identify mutations related to hyperexcitability, learning, homosexuality and unusual circadian rhythms, like his own: Benzer was almost wholly nocturnal.'
Ridley is not a genetic determinist, however. 'A discovery that genes affect behavior is no more or less deterministic than a discovery that family or education does so. Whether you are anti-social because your mother was unemotional (a fashionable theory in the 1960s) or because of a mutation tells you nothing about whether your condition can be remedied by some intervention.'
This is the crucial point: the extent to which these tendencies are amenable to change. Individual variation complicates the picture of course: some of us, no doubt, are more nocturnal or whatever than others.
But I daresay future research will lead, not only to detailed knowledge of particular gene-behavior linkages and potential intervention techniques, but also to a more satisfactory general understanding of the potential for reconfiguring our brains than we have at present.
Which is not to say that we should be too ready to intervene, even where it is possible; but, clearly, where it proves not to be possible or not to be practicable it makes sense to embrace our natural tendencies and integrate them as best we can into a broad social vision and way of life.
Quick and off the cuff. You know I have said before, "I am unique in ways science cannot fathom." (And so are you.) I am nocturnal, for instance. So perhaps I have some genetic markers that could enable a scientist to predict that I am far more alert, active and creative after dark. But even if every other genetic "trait" in my body were similarly identifiable, it would be the combination that makes me unique. I am "intellectual," for instance (it's all in my head!), so I think at night, where other people might be more physically oriented and tend to jog at night. Obviously this affects my interests, my activities, etc. And so on, through countless "traits" that combine to make me what I am. After enumerating all of these, the best a scientist can actually say is, "Gee. He certainly is different from the rest of his family." (LOL) There might even be a few people (out of seven billion) just like me, but they interact with different societies, different friends, etc., thus have different experiences and milieus. So there are no two people totally alike anywhere on earth, and science just takes us so far.
Nonetheless it's really cool to find out what can be "marked" or traced, even if they'll never make a machine like me.
I don't think there's any question about the uniqueness of the individual; but I suspect you are really more concerned here with freedom (which is a thornier problem).
So you know me too well, then. =D
You're right, I'm always concerned with freedom.
But my individuality (or uniqueness) has nothing to do with my freedom. I am just free, period -- unique in myself or not.
Even if all seven billion of us were completely alike genetically, all spoke the same language, and all were raised in the same Chinese culture -- ie, even if no one were in any meaningful sense "unique" from all others -- still we would be free to think as we please and act as we please (even if none of us did so), provided only this: that thinking occurs. If I can think, I can be free even if every other "trait" of mine is the same as everyone else's. (The question is, taking a cue from Heidegger: does thinking ever happen?) Even if it never does happen, nature would not rain down brimstone upon anyone for thinking differently from everyone else; there are no laws of nature that force us to think anything in particular at all; therefore we are by nature free. Our challenge is not to "become free" at any point; it is, instead, to think for ourselves. And therein is the freedom.
I get your point, I think. It chimes with certain elements of the Western philosophical tradition. You mention Heidegger, who was influenced by late-medieval writers as well as the pre-Socratics. Though I respect and am intrigued by many of Heidegger's ideas, and the mystical-tending traditions he generally draws on, I find myself looking in skeptically from outside rather than 'being there'.
Also, as I said in the course of a previous discussion, I see traces of what appears to be Cartesian dualism in what you say. On the one hand you have genes, bodies, environment (the 'mechanistic' elements); on the other, thought or thinking, which is not mechanistic.
It would be dualism to say that thought occurs (or thoughts exist) independently of the body -- ie, that thought exists in a separate realm from the physical. Plato to the core.
I don't believe that. Thinking is a physical process. It occurs in the brain. When I refer to thinking, I always mean conscious thought -- the thought one conducts in language -- and I consider that to be a physical process. The thoughts produced are physically stored; every thought has some physical analog in tissue (becoming memories, associations etc). So our ideas always sit in a physical substrate. No dualism there.
But the substrate -- brain -- is plastic. It is structured to be a substrate -- tabula rase -- upon which anything can be written. If tissue dictated ideas, all my thoughts would be inherited physically from my ancestors.
Thinking is physical, and "what we think about" admittedly is influenced by emotions, motives, experiences and other factors not entirely accessible to conscious thought -- tending to steer "what we're interested in" (such as conservative politics) -- but the crucial and fascinating object to study is the end product of all those neurons firing: the ideas, once they are made. The making of ideas is a physical process, but it's the content produced that ultimately matters -- and I am absolutely free to produce any content I am capable of thinking up, as long as I am thinking.
I am, therefore I think.
I hate typos. Tabula rasa.
P.S. I think for the most part Heidegger was a crock, idealism in general is a crock, there is informatively nothing in "beingness," Forms do not exist (we make them up), Existentialism (broad brush) is 99% mysticism and "the soul" is just an idea. (Useful but mythical.) Nonetheless people (including you sometimes) keep saying they detect dualism and existentialism in my expressions. I do refer to those things (obliquely) but the intent is not to promote them, but to strip the veneer and get down to the grain of reality beneath the idea we call "the idea."
A plausible explanation for people seeing dualism, existentialism, etc. in what you say is that there are in fact elements of those traditions there. Nothing wrong with that: our thinking is not just enriched but enabled by its intellectual context.
That bit about stripping off the veneer and getting to the reality behind the idea we call the "idea" sounds pretty Platonistic to me!
I would be wary of the tabula rasa metaphor, by the way, which is generally associated with a naive view of the brain. But in the context of human language (and other symbolic systems) conscious memory could be seen as being analogous to a medium like a slate or a sheet of paper upon which anything (up to a certain level of complexity, at any rate) may be written.