
What's up, pre-doc?

Wow, it's been eons since I've posted a real blog entry. And I don't have any excuses; I just didn't have much to say. And even now I can feel a filler-update coming over me. So I may as well acquiesce.

September Update: The Work Side of Things

What have I been doing with myself these last four months since the previous update? --God, has it only been four months? Feels a hell of a lot longer than four...

For one thing, I finally began my PhD program in Anatomy Science & Neurobiology. At the moment, I'm just taking one lecture class (Biochemistry) and am filling the rest of my time with labwork, reading, and writing. As far as the labwork goes, I'm receiving training in transmission electron microscopy (TEM), immunocytochemistry, and specimen preparation, and just generally getting used to the lab environment. At the moment I'm trying to see whether I'd be able to work with animals; the lab I'm hoping to join works largely with brain tissue from tree shrews, and of course that means it's necessary to be able to do surgeries on these animals and then be capable of perfusing them. (For anyone not familiar with the term "perfusion" in an animal research sense, I recommend not looking it up.) I just don't know if I can do it. I love animals; I hate to see them in pain. And while ample anaesthetic is used on the tree shrew before the perfusion even begins, so that it doesn't actually feel any of it, I know I would be taking a life.

The surgical nature of the act, the blood, the organs-- those don't affect me so much. It's the moment of incision, the moments of death, that I've found difficult to watch. The slow loss of consciousness, the shallow breaths, the last breath... The invasive nature of the act, the death of that animal, doesn't change whether I refuse to do it or perform the task myself. The creature will die regardless. I just don't know if I'm capable of doing it, whether I could ever be so desensitized.

Anyways, life and death aside, I've also been doing some writing. As I mentioned before, I wrote a book review for the Journal of Autism and Developmental Disorders. It should be available in hardcopy print fairly soon. I've also coauthored a concept article with my bf, "Autism and dyslexia: A spectrum of cognitive styles as defined by minicolumnar morphometry," published in the journal Medical Hypotheses. We're also beginning to gather materials for another concept paper, which we're planning to submit for publication within the next couple of months. The timing of our submission will depend on a particular grant submission, though, so we may end up sitting on the paper for a little while. And after that one, I have another paper in mind that I want to start working on. So lots of writing so far.

September Update: The Personal Side of Things

JNSQ and I are still together and will be celebrating our one-year anniversary next month. In honor of it, I'm hoping we can take a weekend trip to my hometown, St. Louis. But with our busy schedules and his primary custody of his youngest, that might not happen. But hope springs eternal.

We've certainly had our ups and our downs, JNSQ and I. I won't go into details, but our situation isn't exactly a simple one. So there have always been a lot of stressors for us, which test any relationship. But in spite of those complications, we're still together and hope to be for a long time.

And that's all I have to say about that.


Eye Contact & Evolution

Okay, so I'm kind of stealing this post from my Case Study on Gestalt. I may consider suing myself at some point for copyright violation. But for now, I'll simply push blindly and brazenly ahead in spite of any of my protests.

A recent post on Gestalt made me think about eye contact in autism:

In an evolutionary sense, the eyes hold a considerable amount of information. They are vital communicative tools and have the potential to produce a great amount of anxiety in the gaze-receiver. In many mammals, including humans, direct gaze can be an aggressive gesture (although with the intricate subtleties of our nonverbal language, direct gaze can mean far more than just aggression). And even some of the least social animals are cognitively capable of identifying the eyes of an unrelated species. Ever noticed how your pets know precisely where and what your eyes are? They know when you're watching them and when you're not. And when you think about it, that's an incredibly complex neural template to generate. It means that, for the most part, for any unfamiliar animal you-- or your pet-- come across, you have a schema for "eyes" even though you've never seen that creature before. In predator/prey relationships, which usually involve hunting, chase, and capture, the eyes are vital to survival. The predator needs to know when the prey isn't looking at it, so that it can better take the animal unaware. The prey also needs to know when the predator is looking at it, when it is likely to charge, where the predator's attention is, and so on. In social species, the eyes are even more important, providing an additional tool for communication and cementing social bonds.

So, eyes for many animals are fraught with potential anxiety. This is true for humans (notice how in some cultures, eye contact tends to be avoided more often with strangers than with familiars). Therefore, it should also be true for autistics. And since we autistics tend to take anxiety to an extreme, it makes some sense that we might take anxiety produced by eye contact to a greater extreme than our nonautistic counterparts.

Even within the autistic spectrum there is a spectrum of eye-anxiety. Some autistics have such extreme anxiety provoked by eye-to-eye contact that they avoid looking at the face altogether. Others have anxiety to a lesser extent and may be able to look somewhere on the face or make brief glances at the eyes. Still others may have overcompensated for a lack of eye contact, taking their correction to the extreme and making too much eye contact. And others, I'm sure, make fairly average amounts of eye contact. Like I said, we're a spectrum in which eye-anxiety may be more likely but is not a prerequisite for diagnosis.

For those individuals (myself included) who are capable of making general "face contact" but still find the eyes a bit too overwhelming or distracting, "mouth contact" may be an alternate gaze point. In my experience, humans are extremely sensitive to variations in horizontal (side-to-side) gaze but not in vertical (up-and-down) gaze. So if one looks at the mouth rather than the eyes, which lie in a direct vertical line with one another, most people can't tell that true eye contact isn't being made. Only when the vertical gaze variation becomes more extreme can someone tell-- for instance, if I'm talking with someone and they're just a few inches from my face.

Autistics do seem, on average, to exhibit more "mouth contact" than nonautistics, as shown by eye-tracking technology. One study by Ami Klin et al. (2002) reported:

"Consistent with our predictions, individuals with autism focused 2 times more on the mouth region, 2 times less on the eye region, 2 times more on the body region, and 2 times more on the object region relative to age- and verbal IQ-matched controls. Effect size was greatest for fixation on the eye region, making it the best predictor of group membership. [...]

"We next explored the association between fixation time measures and measures of social competence. Contrary to our expectation, fixation time on the eye region was not associated with either social adaptation (VABS-E socialization scores) or social disability (ADOS social scores) [...] In contrast, fixation times on the mouth region and on the object region were strong predictors of social competence, albeit in different directions. Fixation time on the mouth region was associated with greater social adaptation (ie, more socially able) and lower autistic social impairment (ie, less socially disabled). Going in the opposite direction, fixation time on the object region was associated with lower social adaptation and greater autistic social impairment" (pp. 812-813).

One possibility is that those autistics who showed little "mouth contact" simply found the eye region too overwhelming altogether and were therefore reduced to looking more at objects, while those who looked at the mouths had less eye anxiety and so were able to focus in on the mouth more. The level of eye anxiety could be directly related to severity of the autistic phenotype: the more extreme the expression of autism, the more severe the anxiety provoked by eye contact. And also usually the more severe the expression, the less socially able an autistic is.

There's a lot of information in the eyes. It's hard to tell just how much a lack of eye contact detracts from social fluidity. Surely it plays some role, although how large a role is hard to say. That is, how much is lack of eye contact a direct cause of social difficulties, and how much of it is simply correlated with the level of phenotypic severity?

In any case, having been on these online forums since 2004, it's obvious to me that many autistics have a difficult time with anxiety produced by eye contact, so much that we frequently avoid it to varying degrees. And, interestingly, the neural substrates that underlie this tendency are probably not new to autism, but can be found throughout primate evolution and resonate as far back as the early predator-prey terrestrial hunting relationships.

I don't usually refer to myself as "a writer," but I suppose within the loose confines of this blog I can make the presumption. Because if I'm not "a writer," then I couldn't possibly have had "writer's block" these past several months, and this blog post has ended before it's even begun. (I'm sure there's some fallacy enmeshed in there somewhere; I just haven't the strength to fight both writer's block and my own mid-morning witlessness, so I'll leave that digression into Logic and Debate for a later date. If I have the time. And wit.)

Despite that it's far too late to make a long story short: yes, that was my introduction to what's been happening to me since February-- writer's block. I suppose if I had been utterly determined, I could've sat down and forced myself to write something, anything, so my archives for February, March, and April didn't resound with the echoing CLUNK of a rock thrown into a deep, empty well...

Oh well.

Here I am, at last. Writing dribble. Some of the most random monologic prose since Dostoevsky's personal journal-- if he had ever written one. I'm not precisely sure where I'm going, even less sure where this came from. But I'm 100% certain that hardly anyone will read it, and that the one or two who do are surely avoiding some task inconceivably more mundane than reading this blog. In which case, I pity you and hope your procrastination goes as well as mine has these last three months.


Perhaps I could liven this post up with a little This Is My Life update. To date, the score stands thus:

1) I've finished off my Bachelor of Arts in Psychology, although that hasn't changed since my last post. Only now, my graduation ceremony is in a week's time. So I will be going back to St. Louis and getting all dressed up in the cap and gown I got to pay a ridiculous $65 for-- just so I could receive an all-black garb with a weeeeeee line of blue and yellow on the cuffs to indicate my alma mater. The colors are there to ensure that a) I don't forget where I graduated from, and b) I have to purchase an otherwise all-black cap and gown through the university bookstore, giving my money to the school one last time. I'm sure I'll remember my time there even more fondly for it. So in a week, I'll be walking in my black cap and gown with the color on the cuffs, be given Departmental Honors, and will finally receive my Bachelors certificate.

2) I've finally moved to Louisville and have been here for almost a month. The move went smoothly, although it has been a bit up and down getting used to such huge change. But my partner, JNSQ, has been a godsend and made the transition bearable and frequently enjoyable.

3) Aside from the move, JNSQ and I have been working on "us". Just enjoying spending time together, figuring out a little more day by day where we're going.

4) I'm waiting for my Ph.D. program to begin. It won't start until August, but in the interim I'm hoping to get a volunteer position at the university and work with JNSQ, keeping active in research rather than becoming a couch potato over the summer.

5) In the meantime, I've finished writing a book review of an autism-related biography which will be published in the Journal of Autism and Developmental Disorders, sometime in the future. And I've continued to work on an ongoing project which I started at Washington University involving the ADI-R (Autism Diagnostic Interview- Revised).

Anyways. This is my life now. That's all.


Amor Ad Absurdum

Well, no, my life isn't really absurd at the moment. Perhaps a tiny bit boring, but certainly not absurd. --Come to think of it, it's not especially boring either. True, I seem to have quite a bit of time on my hands since I'm somewhere in this finite limbo between my Bachelors program and Ph.D. But even in my most drawn out moments of nothing-to-do-ness, there's inevitably plenty to do. I just don't always get it done.

Most of my time, though, has been filled with two things: 1) my newfound relationship, and 2) research for a paper I hope to begin writing in the near future. As with most types of human relations, the relationship is enjoyably fascinating, yet confusing. The research I'm doing-- while it deals with complicated topics such as encephalization, sexual selection, and the evolution of complex social hierarchies-- is far easier to comprehend (and predict) than a single human relationship. And so, much of my time has been dedicated to pondering the complexities of human relations, both on a universal level and a personal one.

You know, it's an interesting hybridization of thought when one splices together an awareness of evolutionary theory about reproduction and one's personal life. To look at your partner and simultaneously think, "I'm in love with you, I'd even die for you!" and have that followed up by flashbacks from Dawkins' 1976 book, The Selfish Gene, and whisper to yourself, "...but I'm actually just using you for your DNA." --Takes some of the romance out of the moment, I can tell you. And then you have to take a step back, shake away the logic and the realization that you're actually just a selfish creature out there to make as many babies as possible (a fate I logically hope to avoid), and then try to let your amygdalae suck you back into the mood once more.

Is it any wonder we write poetry, make music and art, not just to express this romance but to continue on that high, to keep getting our fix? Ah, yes, I think most of us know it: that brilliant dopaminergic rush of feeling good and wanting to fuck like rabbits. --Or I should say "like humans". I mean, my God, I used to think I had a brain, had a life, loved thought and rationalization and learning and knowledge. But show me a picture of my lovey and I melt like goo in the sultry hand of Cupid. My animalistic DNA has completely taken over and any illusion of self-control over my desires in life is shot clean out the twelfth story window.

And then some small part of my brain steps back, shaking its head, wondering where it all went wrong. "I used to have such incredible discipline... And now look at this pitiful creature. Moaning and moping so much, sometimes she even forgets to eat!" All the while my hypothalamus and amygdalae are giving my left prefrontal cortex the middle finger.

But that's the way it's supposed to be. If I weren't acting like an obsessive lovestruck idiot, why, I doubt my ancestors would've made it this far. They've all acted like complete amorous imbeciles before me-- and should my DNA manage to once again triumph over my inhibitions and I do end up having children, I'm sure they'll act just as ridiculous as I am.

Who knew billions of years ago, when the first amino acids came into being in an oceanic muck of water, carbon dioxide, methane, and ammonia, that several billion years later I'd be sitting here at this computer, obsessively perplexed and fascinated over another DNA-toting "robot"-- my lovey, my soulmate, my je-ne-sais-quoi!-- writing such romantic scientific drivel as this?...



As a person who is in the beginnings of her research career, I wish to say that though my understanding of Statistics and Research Methodology is not yet at its zenith, even at this early stage in my learning I am familiar enough with data analysis to state very affirmatively that a layperson's view of "science" and that of the scientist him- or herself are frequently light years apart. Ideally, the scientist is trained to view data with a skeptical eye, aware that, while numbers don't lie, sheer probability, human error, and interpretation are all-too-frequent companions to design and analysis. In addition to being skeptical and critical of the results of research, scientists are also trained never to base a conclusion on the results of a single study, because of the ever-present chance of error in all forms of research. In science, theories are based upon the replication of results, not upon a single study. Most laypeople do not seem to grasp this concept because we are so used to being bombarded by media reports of single studies and anecdotal evidence. A layperson's view of science is one which has been mutated and shaped by the media.

You might ask why it is that scientists are made to be so distrustful of data. Aside from the risk of a poor research design falsifying a study's results, part of the reason is also found in the statistical analyses frequently used to analyze the data collected in a quantitative study. There is a group of statistical analyses called "Analysis of Variance"; what these do is analyze trends within sets of data to discern the probability of a relationship between variables. Most scientists don't do these calculations by hand but set up data tables in statistical programs, like SPSS, which run the analyses for the investigator and spit out the results. One of the results given with an Analysis of Variance is something called a p-value, which tells the investigator whether the results of the analysis are deemed "significant" or not. The p-value is judged against the "alpha level," a value the researcher sets at the beginning of the test; the alpha level is the percentage risk the researcher is willing to take that the results were achieved by chance and chance alone, as opposed to reflecting a true relationship between the variables. So by setting an alpha level of 0.05, you are stating that you are willing to take a 5% risk that your results are due to chance alone. 0.05 tends to be the upper limit that most scientists will accept, although this varies by the type of research being done. In the Behavioral Sciences, an alpha of 0.05 is common, while 0.01 (or a 1% chance) is often seen in something like the Biological Sciences.
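
To make that workflow concrete, here is a minimal sketch of running a one-way Analysis of Variance and comparing the resulting p-value to a preset alpha level. It's written in Python with SciPy rather than SPSS, and the three groups of numbers are entirely made up for illustration:

```python
# A toy one-way ANOVA, assuming SciPy is installed; the three groups
# and their values are invented purely for illustration.
from scipy import stats

group_a = [4.1, 3.8, 5.0, 4.6, 4.2]
group_b = [5.2, 5.6, 4.9, 5.8, 5.1]
group_c = [4.0, 4.3, 3.9, 4.4, 4.1]

alpha = 0.05  # chosen before looking at the data

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)

print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < alpha:
    print("Result deemed 'significant' at the chosen alpha level.")
else:
    print("Result not deemed significant at the chosen alpha level.")
```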

This is one reason-- mathematics-- that scientists distrust the reliability of data. Think of what this is saying: if I set my alpha level at 0.05, I am accepting a 5% risk of getting a false positive-- of declaring a relationship "significant" when there is really no relationship at all between the variables I'm studying. What the alpha level does is set how extreme the observed effect must be, relative to the variation in the sample, before the relationship is deemed "significant." An alpha level of 0.05 sets a less stringent threshold than an alpha level of 0.01. By setting my alpha at 5%, I am accepting that, according to probability, if there were truly no relationship and I replicated this exact study 20 times, about 1 out of those 20 studies would hand me a false positive anyway. --And that's assuming the design of my study is flawless and without human error.
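
That "1 in 20" figure can be checked directly by simulation. Below is a small sketch (Python again, with arbitrary sample sizes and repetition counts) that repeatedly "runs a study" on two groups drawn from the exact same population-- so any significant result is by definition a false positive-- and counts how often p falls below 0.05:

```python
# Simulating the false-positive rate at alpha = 0.05 when the null
# hypothesis is true; sample sizes and repetition count are arbitrary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha = 0.05
n_studies = 10_000
false_positives = 0

for _ in range(n_studies):
    # Both "groups" come from the same distribution: no real effect exists.
    group_1 = rng.normal(loc=0.0, scale=1.0, size=30)
    group_2 = rng.normal(loc=0.0, scale=1.0, size=30)
    _, p = stats.ttest_ind(group_1, group_2)
    if p < alpha:
        false_positives += 1

# Expect roughly 5% of the studies to come back "significant" by chance alone.
print(f"False-positive rate: {false_positives / n_studies:.3f}")
```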

Now you may wonder why in the world I've begun giving a mini-lecture on data analysis when my title implies I'm writing some sort of editorial in response to Katie Wright's post on the Age of Autism blog. I've written the above to illustrate in greater detail the differences between a layperson's grasp of Science and that of the scientist. In this day and age, not only has every Tom, Dick, Harry, and Harriet become an armchair psychologist, but they have become armchair scientists as well. There aren't too many careers out there where you can be treated as a qualified professional by attending either Harvard OR the University of Google. As someone who has been both on the receiving end of poor health care and someone who is going into the "treating" end of the field, I have to say how presumptuous it is for every Tom Cruise out there to profess his expertise in interpreting research without the knowledge and training to do so!

By saying that, I am also not implying that every professional out there sincerely knows what they're doing, either. Any person who's been on the receiving end of ridiculously ignorant patient care knows full well that a degree hanging on a doctor's wall simply signifies he's gone through medical school. But if I read up on car mechanics without having the training and experience to work in that field, would you want me fixing YOUR car??? Well, no offense, but I don't want Jenny McCarthy interpreting MY data either.

Perhaps you'll tell me a car is not the same thing as a child and that parents of autistic children would be offended by such an analogy. Well, I hope you're right: if you destroy the engine of my car, I can just go and buy a new car. A child, however, is precious and irreplaceable. And it's for this reason that it is vitally important not to leap to any single conclusion concerning the results of research, because of the potentially dire effects it might have.

Example: I have a son (I don't actually; this is for the sake of argument). My son has been diagnosed as autistic. I don't know too much about Jenny McCarthy but, through word of mouth, I hear that she's a huge proponent of the gluten-free/casein-free diet and feels it can cure autism. I'm a cautious parent, but since this doesn't involve medications, just a change in diet, I figure it's worth the risk to cure my son. Rather than taking my son in to his doctor to run tests on his antibody titers or to have a tissue sample taken to verify the presence of gluten or casein antibodies (since I'm not a doctor and have little idea what antibodies are, other than that people seem to mention them when talking about the immune system), I use the if-it-works-then-it's-a-diagnosis method. Unfortunately for my son, his food sensory issues are so extreme that all he will eat are variations of sandwiches. Now that he's gluten-free, however, this portion of nutrition has been removed from his diet. After several weeks of this new diet regimen, my son is doing more poorly even though I try to supplement his diet with vitamins. He still has stool problems, his behavior is worse, and, while he was thin before, he's lost even more weight. I finally take him in to the doctor, some tests are run, and I am told that, rather than being intolerant of gluten or casein, my son is intolerant of the proteins in milk.

Why did I just give this hypothetical story? Solely to illustrate two things: Generalization and Assumption. Jenny McCarthy made the mistake of generalizing that autism can be cured by a GF/CF diet, without regard to thorough research. And I have made the assumption that she's right and have harmed my son's health in the meantime. Did I do this purposefully? Vindictively? No, not at all. I did what I felt was best for my son. And not being a doctor or a researcher, I wasn't fully aware of the potential dangers inherent in making these assumptions.

Katie Wright is a mother, not a scientist. I can't for one moment imagine her crusade is one of vindictiveness or for lack of caring for her son. In fact, her fervor implies the opposite. As I said though, she is acting as a mother, not a scientist, and she is making the same mistakes that I've illustrated above: she has assumed that aspects of reported research are absolutely true regardless of research design or replication, and she has generalized her assumptions to indicate that all of autism is caused by vaccinations.

I don't know enough about the underbelly of politics going on within the Autism Speaks organization. Frankly, I'm not so sure I'd want to know. And the same thing goes for the CDC, the FDA, and the pharmaceutical companies. I think Katie Wright is right in saying this whole situation isn't just about helping autistic people, their families, and doing the research. It's just as much politics and business as anything else.

What is my take on the whole Vaccine Theory of Autism? I have a fairly precise idea, which is neither here nor there. Given the amount of research coming out nowadays on the immune system in autism (and I'm not talking about research put out by the Autism Research Institute; I'm talking about WELL-DESIGNED research), I think it's fairly clear that the immune system is somehow involved in a portion of autism cases. Now, to what extent, and whether the immune system plays some role in the level of severity, I could only speculate. What role do vaccinations potentially play in the severity of autistic expression? Again, unknown. And that is largely because politics (the CDC) has barely been willing to touch that question with a 40-foot pole, and has focused solely on black-and-white, all-or-nothing research designs: do vaccinations cause autism? Not whether vaccinations might play some role in the severity of autistic expression.

To some extent, I can understand the government's and the medical community's hesitation in doing the research. Vaccinations have been lifesaving for many people. What do you think would happen if a set of research studies came out with strong evidence that vaccinations increase the severity of the autistic phenotype? There'd be mass panic, parents would refuse to get their children inoculated, and certain childhood diseases would likely begin to reappear in force and create larger problems than autism. Already, with the mild panic that has been induced, some childhood diseases are making a slight comeback.

At the same time, however, the parent groups seem to run wild with anecdote. As the quote often attributed to Stalin goes, "The death of one man is a tragedy, the death of millions is a statistic." And how right it is. Listen to a single person's hard-luck story and we're in tears; hear about the innumerable holocausts in our human history, and it barely catches our attention.

Why do you think, in statistics, GROUPS of participants are used more often than single case studies? Because, while statistics aren't so moving, numbers don't lie-- and the more numbers you have, the more representative they can potentially be.
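
As a toy illustration of why group size matters, here is a short Python sketch (with an invented "true" population value and arbitrary sample sizes) comparing how far a single case and larger samples tend to stray from the truth:

```python
# A toy illustration of sample size and representativeness; the "true"
# population mean, its spread, and the sample sizes are invented.
import numpy as np

rng = np.random.default_rng(42)
true_mean = 100.0        # the value we are trying to estimate
population_sd = 15.0
n_repeats = 5_000

for sample_size in (1, 30, 300):
    # For each sample size, repeatedly draw a sample and record how far
    # its mean lands from the true value.
    errors = [
        abs(rng.normal(true_mean, population_sd, sample_size).mean() - true_mean)
        for _ in range(n_repeats)
    ]
    # Larger samples land much closer to the true value, on average.
    print(f"n = {sample_size:>3}: typical error ~ {np.mean(errors):.2f}")
```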

But emotion is infectious: a single person can move us far beyond any number, even if that number is based upon hundreds, or thousands, or even millions. It's because we're human: we're designed to be socially-influenced creatures. We haven't evolved to be natural statisticians.

But the only way for laypeople to interpret the results that are disseminated to the public, without coming to inaccurate conclusions, is in fact by being good skeptical statisticians. Unfortunately, we must constantly caution ourselves that emotion can run away with us, leading us down false paths. It can cause us to hurt the ones we are trying to help.

Where've I been? --I don't know...

Yes, I'm posting; after what seems like several millennia, I am posting. I don't think I have anything earth-shatteringly brilliant to say (damn, foiled again!), but I suppose an update on my life will do as filler for now.

... Good Lord... where in the world do I start? Well, I suppose my life is the same and yet not the same. I'm just about to finish my bachelors degree (finally! thank Bob, finally, FINALLY!!!!!!!!); I just have one more cumulative final this Tuesday and I shall be rid of the cursed thing forever, take my sheepskin, and move on to graduate school and playing with dead people's brains. Of course, my university, being as cheap as it is, only holds a single graduation ceremony per year-- and that's in May. So we December graduates just get to wait. --In my case, I get to wait AND come back into town to attend my own graduation.

That's another thing: I'll be moving in a few months. I've just finished sending in my application to graduate school; I've still got one more transcript I've gotta get sent in (hoping to do that tomorrow), and my supervisor should be mailing in my final Letter of Recommendation towards the end of this week. And done!

--You may be asking, "But, Emily, does that mean you're planning on moving even without knowing the outcome of your application?" Ah, how keenly perceptive you are. Why yes, yes I am. I think I have an excellent chance of being accepted into this program; however, even if I don't get accepted for this coming term, heaven forbid, I will still move to the area, enroll in the local community college, and spend that year taking more Biology courses (an area my Bachelors coursework was unfortunately somewhat lacking in, and my main weakness as a PhD candidate for this program). And of course I would reapply for Fall 2010.

After moving, I'll have a few months to spare before the program starts. I may get one of the lab rotations out of the way in the meantime. I also want to work on some computer simulation studies I've been designing (just the meat and bones, mind you; I'm no computer science major).

In the meantime, while I'm still in town, since I'll be finishing my BA this coming week, I'll continue to work at the WashU lab and will also be starting a collaborative paper with two other researchers in the area of Cognitive Neuroscience and Religious Studies (a strange hybrid of a field that has arisen from viewing Religious Studies topics from a Neuroscience perspective). Ideally, I'll be bringing the neurobiology and (I hope) evolutionary theory to the team. I'm really looking forward to working with those two scientists and have been reviewing literature, making notes, jotting down ideas, etc. Now if only I could guide them into FINALLY picking a date and time to meet... ::sigh::  --Like herding cats, folks...

Anyways, that's about it as far as my "professional" (I use the term loosely) endeavors go. On the personal side, I have started dating someone-- a wonderful guy-- and I'm extremely happy. It's a very new feeling for me.

K, that's my update. Who knows: maybe some time in the future I may actually have something more than just filler, heh.

First off, when speaking of Darwinian Evolution, too much emphasis is placed upon Natural Selection: an organism’s capability of survival. Granted, survival is imperative; however, survival means nothing to a sexual being that does not reproduce. Therefore, the true evolutionary agonist is Sexual Selection, while Natural Selection is a mere means to an end.

So when speaking of autism as a set of advantageous or disadvantageous traits, one must discern how such traits improve or impede the organism’s chances of reproducing. For autism, given the greater difficulty in social bonding, reproductive rates across the entire Spectrum are likely impeded to a certain extent. One then asks why autism has not completely disappeared from the human phenotype. It’s a minority, yet it appears to be a strong and constant one.

Autism is a continuum of behavior, a continuum which can be seen throughout the entire human race, incorporating obsessive-compulsiveness, cognitive specialization, and general intelligence in its neural repertoire. The existence of this continuum implies a range of phenotypic variation overlain upon a more primitive foundation—a foundation which all humans undoubtedly share: the genetic building blocks of the human cortex.

If autism were due to a set of aberrant genes which the remainder of the species did not share to some extent or in some combination, I would predict two things: 1) that gene mapping of the Autistic Spectrum would be much further along, and 2) that autism would have been culled from the gene pool long before now. Its sheer continued existence despite its detriment to reproduction is proof positive that the genetics linked to autism are somehow reproductively beneficial, even though they may impede reproduction in those organisms who have more severe expressions.

Therefore, if diagnosed autistic individuals are less likely to reproduce than the general population-- so that it is not they who are solely responsible for continuing to disseminate these autism-related genes to future generations-- then the answer must lie in the general population: the “non-autistics”. But if these genes are so widely distributed through our species, they must provide some benefit to reproduction. A gene with a negative effect will soon be culled; a neutral gene cannot be selected for, and sheer probability will likely cull it from a given gene pool.
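
That claim about negative and neutral genes can be made concrete with a toy simulation. The sketch below (Python, with an invented population size and selection coefficient-- not a model of any real autism-related gene) follows a rare allele in a small Wright-Fisher-style population: under even mild negative selection it tends to vanish quickly, and with no selection at all it still usually drifts out by chance.

```python
# A toy Wright-Fisher-style simulation: the fate of a rare allele under
# mild negative selection versus pure drift. All parameters are invented.
import numpy as np

rng = np.random.default_rng(7)
pop_size = 500        # number of gene copies in the population
start_freq = 0.05     # the allele starts out rare
generations = 1_000
n_runs = 300

def fraction_lost(selection_coeff: float) -> float:
    """Fraction of runs in which the allele disappears within the window."""
    lost = 0
    for _ in range(n_runs):
        freq = start_freq
        for _ in range(generations):
            # Relative fitness of the allele is (1 - s); s = 0 means neutral.
            weighted = freq * (1 - selection_coeff)
            expected = weighted / (weighted + (1 - freq))
            freq = rng.binomial(pop_size, expected) / pop_size
            if freq == 0.0:   # allele culled from the gene pool
                lost += 1
                break
            if freq == 1.0:   # allele fixed; no further change possible
                break
    return lost / n_runs

print("lost under mild negative selection (s=0.05):", fraction_lost(0.05))
print("lost under neutrality (s=0.00):             ", fraction_lost(0.00))
```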

Richard Alexander (1974) stated in the introduction of his seminal paper, The Evolution of Social Behavior, that “because of their peculiarly direct relationship to the forces of selection, behavior and life history phenomena . . . may be among the most predictable of all phenotypic attributes” (p. 325). To put it another way: because behavior is a key factor in attracting or repelling a mate, it is behavior, and not other physical attributes, which is most easily selected for via Sexual Selection. Considering that premise, one asks, what is it about the behavior in the broader phenotype of autism that is a boon to sexual reproduction?

I can’t answer that, except with a few speculations. In his book, The Red Queen, Matt Ridley (1993) states that the defining hallmark of our species, our expansive neocortex, is a metabolically expensive attribute to maintain. He speculates, therefore, that because of its expense, its advantageous link to reproduction must be even more direct. According to Ridley, our big brains are not just for making tools, outwitting predators, hunting prey, and the like; instead, our brains are designed to attract and keep a mate. Because humans are a monogamous yet adulterous species, it is imperative that we are adept not only at attracting a mate but at keeping him or her present and relatively faithful, reducing the risk of cuckoldry and of our chosen mate’s philandering.

It is easy to grind an axe; even birds use tools and can build intricate nests. They don’t need a large neocortex to do so. But socialization requires vast stores of cognitive capability, especially in a social, yet monogamous, yet adulterous species such as ourselves. Therefore, the expansion of the neocortex, particularly the isocortex, has been selected for by Sexual Selection to improve reproductive success via our abilities to win and keep a mate. All socialization is an outcropping of this goal.

In the case of autism, savant abilities aside, autism-related genes likely reach their peak of reproductive benefit to the human organism when expressed in moderation, e.g., in the broader phenotype, simply because these genes in more severe phenotypes reduce the likelihood of reproduction. This, however, does not negate the worth of an autistic person simply because he or she may be less likely to reproduce. Because humans are an intelligent social species, our intelligence has made life about more than just sex, even though sex is our mode of genetic replication. Therefore, it is not necessary for someone to be a savant in order for his or her worth as a human being to be appreciated.

However, within the paradigm of evolutionary theory, lower rates of reproduction will most likely keep the Autistic Spectrum, as it is currently defined, a minority. And yet the unidentified but evidently real advantage that lesser phenotypes of these same genes impart implies a substantial benefit to human reproductive success. Therefore, in studying the makeup of autism, in observing savant abilities and traits such as deficits in language and socialization, we may instead be studying the extreme form of the human cortical continuum.

Somehow, Sexual Selection has selected for autism. The wiser question to begin wondering is: WHY?


Alexander, R.D. (1974). The evolution of social behavior. Annual Review of Ecology and Systematics, 5, 325.

Ridley, M. (1993). The Red Queen: Sex and the Evolution of Human Nature. Harper Perennial.

**This essay was first posted on the forums at the AWARES Conference 2008.

In a radio interview given earlier this month, Australian Prime Minister Kevin Rudd was asked by the host what his "biggest argument in favour [sic] of God" was. A bit taken aback, Rudd calmly answered,

For me, it's ultimately the order of the cosmos or what I describe as creation. You can't simply have, in my own judgment, creation simply being a random event because it is so inherently ordered, and the fact that the natural environment is being ordered where it can properly coexist over time. If you were simply reducing that to mathematically [sic] probabilities I've got to say it probably wouldn't have happened. So I think there is an intelligent mind at work.

According to Dawkins' The God Delusion (2008), Rudd has proffered a classic example of the Teleological Argument: the universe appears ordered, therefore somebody must have ordered it. There is also a bit of Occam's Razor intertwined in this hypothesis because, while the universe does appear to follow a predictable set of laws (physics, chemistry, biology), it is one of the simpler solutions to propose that an overarching deity has designed it that way-- as opposed to proposing that, say, Porky the Pig's evil twin sneezed one day and thus the universe and all its laws were created.

It may come as a shock to most people except statisticians or neuroscientists, but humans tend to be poor predictors of probability. Both the Clustering Illusion and the Representativeness Heuristic hit on this point. The Clustering Illusion states that humans tend to see clusters of "relevance" in random data or information; this point is well illustrated in an experiment described by Thomas Gilovich (1991). In this experiment, Gilovich showed participants a random binary string, OXXXOXXXOXXOOOXOOXXOO, and asked them whether they thought the string of symbols appeared random or nonrandom. Unsurprisingly, most people felt that the data looked nonrandom, when in fact the string presents several characteristics of statistical randomness; most notably, the two symbols occur in roughly equal numbers, and the combinations of adjacent outcomes also occur roughly equally often. (In true probability, when there are two equally likely outcomes, each outcome is just as likely to occur and the previous outcome does not affect any succeeding outcome.) The Representativeness Heuristic similarly states that objects or situations which are alike in appearance or other characteristics are often assumed to be related. In addition to these theories, there are human tendencies such as pareidolia, the tendency to attribute significance to insignificant stimuli, especially images or sounds, and apophenia, the tendency to see patterns in random data, such as faces in clouds or woodgrain. These also illustrate the fact that humans tend to under-predict the probable and over-predict the less probable with certain types of data or in certain situations.
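
Those two characteristics are easy to verify directly. Here is a short Python sketch that tallies the individual symbols and the adjacent pairs in the string quoted above:

```python
# Tally the symbols and adjacent pairs in the binary string Gilovich
# showed his participants, to check the two "randomness" characteristics
# mentioned above.
from collections import Counter

sequence = "OXXXOXXXOXXOOOXOOXXOO"

symbol_counts = Counter(sequence)
pair_counts = Counter(a + b for a, b in zip(sequence, sequence[1:]))

print(symbol_counts)  # roughly equal: 10 O's and 11 X's
print(pair_counts)    # each of the four adjacent pairs occurs 4-6 times
```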

Why are these theories about human nature relevant? As Dawkins (2008) would say, while our judgment of probability is less than accurate, it has certainly served a purpose in human relations and in our ability to adapt to our environment. It is far more beneficial to assume animation in inanimacy than vice versa, because animate organisms have always posed a far greater threat to us than inanimate objects. Therefore, those organisms who were not only capable of distinguishing living from nonliving creatures but also saw "life" where there was none were less likely to become food for stealthy predators than those organisms who did not err in that direction. It is a similar concept to Pascal's Wager: given the probabilities, you are far wiser to assume there is an overarching deity than not, because there are many benefits if you are correct and few consequences if you are wrong. In an evolutionary sense, you are covering your bases if you over-assume animacy, because you are more likely to outlive those who do not-- even if some of the consequences happen to be worshiping trees and seeing your dead Aunt Margarette's face in a passing cloud.
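
The asymmetry in that argument can be put into numbers. Below is a toy expected-cost comparison (Python, with entirely made-up costs and probabilities) contrasting a "jumpy" strategy that always assumes a rustle in the grass is animate with a "skeptical" one that never does:

```python
# A toy error-management calculation with invented numbers: the cost of
# mistakenly fleeing from nothing is tiny compared with the cost of
# failing to flee from a real predator.
p_predator = 0.01              # chance a given rustle really is a predator
cost_false_alarm = 1.0         # wasted energy from fleeing nothing
cost_missed_predator = 1000.0  # getting eaten

# "Jumpy" strategy: treat every rustle as animate, paying the false-alarm
# cost whenever there is no predator.
jumpy_cost = (1 - p_predator) * cost_false_alarm

# "Skeptical" strategy: never assume animacy, paying the full cost whenever
# a predator is actually there.
skeptical_cost = p_predator * cost_missed_predator

print(f"expected cost per rustle, jumpy:     {jumpy_cost:.2f}")
print(f"expected cost per rustle, skeptical: {skeptical_cost:.2f}")
```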

Having stated all this, I won't go into whether I feel P.M. Kevin Rudd's belief in God is false, since not only would that take up more print but it would also be a waste of valuable time, given that the bedrock of scientific analysis, both quantitative and qualitative, is falsifiability, and I've yet to meet the individual who's capable of proving or disproving any incorporeal body. However, whether Rudd is ultimately correct in his conclusions or not, his argument is a false one. If I am asked to add 2 + 2 and my answer is 4, but I calculated that answer not by counting up 1 + 1 + 1 + 1 but instead by dividing 100 by 25, then my method is inherently flawed even though my answer is correct. (And, reader, please note with my little mathematical analogy that I have just utilized the representativeness heuristic to my favor in an attempt to convince you, despite the fact that Arithmetic is in no way the same thing as Theology!) But regardless of Rudd's belief in God, his proposal-- that the occurrence of life without the design and guidance of a Designer is too improbable-- is a fallacy whether or not his conclusion happens to be accurate. A fallacy which, as Dawkins (2008) would say, is just one characteristic outcome of genetic inheritance. And according to Dawkins, who are the designers of this fallacy? Natural and Sexual Selection.


Dawkins, R. (2008). The God Delusion. Boston: Houghton Mifflin Co.

Gilovich, T. (1991). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York: The Free Press, p. 16.

Sydney Morning Herald, The. (2008). Cosmos order proves God exists: Rudd. Retrieved on 09/28/2008 from http://news.smh.com.au/national/cosmos-order-proves-god-existsrudd-20080829-45b6.html