Thursday, June 10, 2010

Oryx and Crake

It has been two months. I will make up for my long absence by posting a novel (2,300 words, seriously). Tl;dr version: things are going well. I have lots of posts in the works. I was troubled by the novel Oryx and Crake. Scientists are good people, generally speaking. People are good people, generally speaking.

I have, according to Blogger, five posts begun or drafted and not posted. I have, rather obviously, been having some trouble figuring out how I want to use this blog and what I want to post here. I have stayed away from any specifics about my research for a couple of reasons, which I could go into at length but which are most likely self-evident. On the other hand, my 'this is what I have been doing' posts seem vapid and meaningless. Most of the drafts are on scientific topics that have come up either in conversation with various non-scientists of my acquaintance or in my random wanderings through PubMed. I'll try to polish those up into readable form. I'd like to turn these ramblings into thoughts on science, and science policy, and the interactions between science and laypeople, because that's something I've been thinking quite a bit about lately, and I don't imagine that will stop as I become more invested in science and scientific research.

I've written next to no fiction. I have a few ideas sitting around but I haven't found the right words for them, haven't found the right tone and characters or setting. They're amorphous and, well, embryonic -- the seeds of a story but not a story itself. And when I sit down to write them they slip away from me, much like the mouse embryos in a dish slipping past my forceps. (I've been doing a lot of dissections lately, can you tell?)

I have, however, been reading fiction. Which brings me to the title, at least. I just finished Oryx and Crake (Margaret Atwood's second-latest, I think?) and it puzzles me. In short, I found the world she created to be, frankly, unbelievable in the extreme. I liked the characters, I loved the interactions between them, and there were aspects of the plot that I really, really liked. But the speculative aspects of the 'speculative fiction' category just left me frustrated, slightly offended on behalf of scientists everywhere, and vaguely worried that such pop-culture and science fiction depictions of science really were indicative of a broader social attitude towards science and scientists.

To be brief: in her latest post-apocalyptic imagining, Atwood kills everyone off using genetic engineering. (As opposed to, say, The Handmaid's Tale, where she kills everyone off using nuclear power. And then she doesn't quite kill everyone off. Genetic engineering, it turns out, is much more efficient than a nuclear bomb.) But before she does that, she creates a dystopic, genetic-engineering-ridden world run by big pharma and biotech (without drawing any line between the two). Literally. The lucky intellectual elites live in cookie-cutter company-town suburbia, while the plebeian masses are shunted off to polluted, out-of-the-way "pleeblands" where they are exploited for what little money they have and experimented on without their knowledge. This tenuous equilibrium is broken (at the very last minute) by the genius of the novel, who kills everyone -- for a while we are left to believe every last human except one; at the end of the novel we discover three additional survivors -- with a genetically engineered Ebola-like virus (quick death by hemorrhagic fever).

What rankled, basically, was the thesis, implicit in the novel, that scientists -- or possibly engineers, but there is never any line drawn between the two -- fit into one of two categories. Either they are incapable of seeing the bigger picture or of understanding how their actions upset the balance of nature, or they are psychopathic zealots willing to kill off the entire human race with a smile. The latter, according to the novel, are the well-meaning ones. There's a tacit assumption permeating the book that if "word people" instead of "numbers people" -- by which I assume Atwood means poets, artists, dramatists, philosophers, and, to put a point on it, novelists -- were more appreciated in their (our) society, the terrible future would never have happened. Because poets, artists, philosophers, and novelists can read the writing on the wall, whereas scientists and engineers are cursed with the tunnel vision of autism or Asperger's and therefore incapable. Or because scientists and engineers think they're gods. Or, well, something.

There are several problems I have with that idea. The first is that it posits a chimeric dichotomy between 'numbers people' and 'word people' that doesn't actually exist. The second is that it's an inaccurate and vicious depiction of scientists and engineers. The third, and perhaps most important, is that it overlooks what I see as the real danger and the real motivating cause of the potential dystopia she creates: corporate greed, not science.

"Elizabeth," I hear you saying, "You're just bitter because an author cast your field of research as the cause of the apocalypse without creating a character you felt you could identify with." To which I respond: well, that's exactly the point. I'm a scientist, which, I guess, classes me as a 'numbers person' according to the rubric used in Oryx and Crake. I'm also very much a 'word person'. While I admit that people segregate themselves into camps, science versus humanities, I have never thought that divide is all-encompassing. I'm a scientist who loves literature and philosophy and weird old words, and I was in no way out of the norm in college. In fact, I was surrounded by scientists who loved the humanities and philosophers who loved science. It was stranger to find someone who didn't have broad interests.

Maybe part of that is the kind of person who goes to the University of Chicago instead of, say, MIT or Caltech -- although I have also met people from MIT and Caltech with a more-than-passing interest in literature, philosophy, or history. But I think the dichotomy Atwood created is more rightly a multi-dimensional continuum: different kinds of 'intelligence' for different people, in different quantities and combinations. Not just mathematical or linguistic intelligence, but tactile and visual and even olfactory. What do you say to the genius choreographer who can turn emotion into movement in a way that speaks to millions, but who cannot multiply seven by thirty-one or solve a simple crossword puzzle? Is he not a genius?

The point is that human genius takes on as many forms as does human culture, and that science and language are just two of them. "Numbers" and "Words" is an oversimplification and a false dichotomy.

But even granting a world of "numbers people" and "words people", why are the "numbers people" in Atwood's universe severely stunted socially and without a sense of ethics? Sure, autism spectrum disorders give people a way of looking at the world that can be incredibly useful and amenable to science and engineering. And I will freely admit that one of the things I love as a scientist is that I can wear t-shirts and worn-out jeans to work every day, that I don't need to wear makeup, and that the tomboyish aspects of my personality are accepted and smiled upon. But I do not think these are the defining characteristics of a scientific career or of the typical scientist's personality. That, I would say, is a sense of wonder and curiosity -- a sense which was entirely absent from the 'science' in Oryx and Crake.

Moreover, even if the Asperger's type were the defining norm in scientific academia, I would not say that this means we lack vision of the bigger picture -- which, in this case, was cast as the responsibility not to utterly trash the planet. To be perfectly honest, I don't know whether scientists or people in technical fields are more or less likely to, say, recycle, or drive a hybrid car, or bike to work. But the argument itself felt a bit specious. How do you define 'utterly trash the planet'? The fact of the matter is that even without a car or an air conditioner, even if I recycle and compost and grow vegetables in a small garden in my backyard, even if I keep my lights off, etcetera etcetera and so forth, my modest existence is likely not sustainable. And that's not even getting into the unsustainability of my workplace, full of disposable plastic pipettes, dishes, and rubber gloves, toxic reagents that have to be disposed of somehow, and so on. Stanford does a decent job trying to greenwash its campus, but facts are facts: without serious restructuring and quite a bit of alternative energy, the middle-class existence is simply not something the planet can sustain for some six-billion-odd people.

But that's not, as I see it, a failing of science. If anything, science and engineering provide our way out of this mess -- the route and roadmap towards alternative sources that could help ameliorate it. I won't say science and engineering are totally innocent; they provided the tools that got us here. But I'm very hesitant to use that fact to denigrate them, when the real culprit seems to be corporate greed and the exploitation and misuse of natural resources through science and engineering.

My problem with that argument is that it hearkens back to "Guns don't kill people, people kill people." Which is a terrible argument. But there are a couple of caveats that differentiate my desire to exculpate science and engineering from such an argument against gun control.

First of all, I wouldn't say that science and engineering should be pursued without any oversight or guidance. Ethics committees and top-level oversight are absolutely necessary. At times government intervention, either to create regulations that limit certain kinds of research or to support other kinds, is absolutely necessary. These are fuzzy territories, and each individual draws the line between 'good science' and 'bad science' in a different place, but it is an essential conversation to have, an essential topic to wrestle with, and to a certain extent we are having that conversation and wrestling with that topic right now. More would be better, of course, but these talks are happening. So in a way, I'm pro-science-control in the same way I'm pro-gun-control: I want scientists to be trained in ethics and safety, taught how to prevent disasters, and held to high standards, just as I want the same things of gun owners. (Really, I'd like that for everyone. Wouldn't that be nice?)

Secondly, when considering any argument like "Guns don't kill people, people kill people", I guess what you need is an alternative: what else do guns do? A shovel is a neutral tool. There are uses that are perfectly legitimate: gardening. There are uses that are illegitimate: bashing someone's head in and burying the body. Admittedly, there are a lot more gardeners than there are shovel-murderers, which is very much a good thing. A gun is, in some ways, less neutral. I think the majority of gun owners are sportsmen or sportswomen, target shooters and hunters. Some have guns for self-defense or security. And a few have guns in order to threaten or commit violence. But the nature of a gun is de facto violent, which makes it harder for people to accept than a shovel.

The tools of biological engineering have many legitimate uses. At a gross level, they are responsible for agriculture and domesticated animals. Modern medicine. You get the point. They are also capable of being used for ethically dubious ends, like Monsanto's creation of sterile corn crops (which is what allows the company to stay in business from year to year, fund new and better products, and so forth, but which also creates a monopoly and lets it artificially limit supply and thereby raise prices). And for unspeakable atrocities. It's the shovel analogy all over again: we have to weigh the good with the bad and come to some compromise. It's not sufficient to say "Science is evil because eugenics was used to justify Nazism" or "Engineering is evil because the atom bomb killed millions." And I felt like Oryx and Crake was, in many ways, trying to say just that -- or rather, "Science is evil -- look at the misuses of genetic engineering! Scientists want to play God, and they don't see that what they are doing is wrong!"

I tried, hard, to see the mitigating characters. Jimmy's mom, a biochemist who pleads with her husband to do "something basic" instead of creating a new 'cure for aging' or some such, and then runs away and turns traitor to the state/company/economy. And, of course, Crake, who sees the horror and chooses to end it all. But when the scientific heroes on offer are one ineffectual researcher and the mastermind who eradicates the human race, I was, admittedly, dismayed.

There were many parts of the book I loved. I adored Jimmy's interactions with Oryx -- the fact that he was willing to create a past for her, whole cloth, and she was willing to accept and develop the fiction was compelling and moving. The way Snowman deals with the Crakers, the stories he tells them and the ways he finds of explaining the world, the way he doesn't understand them and knows he never will but loves them anyway, is touching to say the least. But throughout the book there was a nagging thorn in my side, a thought in the back of my mind. It said two things. The first was how self-pitying it was for a novelist to write a novel in which the world ends because we don't care about literature enough. The second was, I hope people don't really think this is an accurate depiction of the scientific establishment. Because if so, they must hate us something fierce.

5 comments:

Ayn said...

I read Oryx and Crake back when it first came out, in 2003, so it's been a while. I remember having qualms about it (though I don't specifically remember what bothered me).

What I do remember from all of the Margaret Atwood books I've read is that they always present the worst in people.

Elizabeth said...

I dunno. I loved the characterization in "The Blind Assassin" -- on one hand I can see why you say 'they always present the worst in people' because no one was kind or trustworthy. But it felt authentic, and the emotion felt so real. And I liked that part of Oryx and Crake as well.

I guess part of my problem isn't just that individual people were tragically flawed (I think most people are tragically flawed), but that the society amplified rather than ameliorated those flaws, and the structures they were interacting with seemed evil and far-afield from our social structures.

On the other hand, it's possible that I'm (1) a Pollyanna desperate to think that everything is always for the best, (2) in denial about our society, (3) biased towards a favorable view of science, (4) otherwise blind to the realism of Atwood's apocalypse, or (5) all of the above. So take it with a grain of salt I guess?

Ryo Chijiiwa said...
This comment has been removed by the author.
Ryo Chijiiwa said...

(Sorry, wanted to make clarifying changes to one sentence after posting, but there was no "edit" so deleted and reposting.)

In trying to argue that shovels are more neutral than guns by attempting to weigh "legitimate" vs "illegitimate" uses, I think you're falling into the same trap as your detractors.

Trying to put any given piece of technology on a good-evil scale is almost always a futile exercise, because it simply cannot be done objectively. The benefits of gun ownership can't be quantified (even though the "cost" more easily can be). Similarly, the "cost" of bioengineering can't be quantified, because nobody knows what adverse consequences may lie ahead, and it's impossible to quantitatively (or even qualitatively) compare them to the potential benefits, which are also unknown. In the end, all such arguments devolve into questions of aesthetics, subjective taste, faith, and personal risk tolerance, all of which are unresolvable (and therefore unproductive).

Rather than trying to place technologies on a single-dimensional spectrum (shovels: neutral, guns: less-than-neutral), I think it is more productive to accept that all technologies are both good AND evil, simultaneously. The task isn't to argue whether something is inherently good or evil, but rather to be cognizant of the inherent multidimensionality of everything that exists, to identify what aspects are good and what aspects are evil, and to do our best to eliminate the evil without also sacrificing the good. The good-vs-evil arguments polarize us and merely serve to distract us from the matters that are actually important, like preventing gun violence, shovel deaths, and mutant viruses that eat your brains for dessert.

Elizabeth said...

Ryo: You have a very good point, and I certainly agree with most of it. Mine was a rather inchoate argument, in any case. First of all, the shovel metaphor was meant to highlight exactly that all tools are good and bad, and that you can't say "such and such a tool is bad" or "such and such a tool is good". All you can say in any case is "such and such a use is bad" or "such and such a use is good".

And at that point it stops being about a tool and starts being about a use. So, up until that point I totally agree with you. I guess from there it becomes a practical problem. You say the task is to "do our best to eliminate the evil without also sacrificing the good", and I agree with you, but I don't know how to do that without issuing some sort of value judgment on a tool.

As an example, postulate that bad uses come about from bad users. Killer viruses will be created by crazy people (this does not seem to be that unreasonable a hypothesis to me). So, to prevent killer viruses, all we need to do is to prevent crazy people from getting their hands on the tools of genetic engineering; sane people are totally A-ok.

But we simply can't regulate that. Apart from the obvious reasons why controlling 'crazy' people based on predictions of what they will do rather than realities of what they have done is possibly more dangerous than letting said 'crazy' people run amok, there's no (or next to no) reliable way to tell who will snap and when they will snap. And what's your definition of crazy? It varies from person to person, time to time, place to place.

Since we can't, and don't want to, regulate people, and since we can't 100% weed out the abusers, we need some regulation at the level of tools: there have to be some things that no one has access to, or that no one has access to without a really good reason (although what those reasons would be and who adjudicates them is up in the air).

But regulating tools is a touchy subject. You can't regulate all tools equally (I see that as either totalitarianism or anarchy), and you have to draw the line somewhere -- different tools have different hoops you need to jump through to get them. Which requires asking, of someone who wants to use a given tool: are they more likely to want to use it appropriately or inappropriately? What are the risks of the inappropriate use compared to the benefits of the appropriate use? Which does become a value judgment on the tool, no matter how you phrase it. And that becomes an individual judgment that varies from person to person and situation to situation -- there's no good moral calculus for it, either. (There's just no good moral calculus.)

But I still think it's better to talk about it than not, I guess?