Label: Transhumanist

This one starts off with a chain of definitions from Wiktionary: transhumanism

  1. a philosophy favouring the use of science and technology, especially neurotechnology, biotechnology, and nanotechnology, to overcome human limitations and improve the human condition

transhuman

  1. A human who recognizes and embraces the coming posthuman condition

posthuman

  1. Succeeding human beings as presently defined; more than, or beyond, what is human.

I understand that there's a lot of (sometimes contradictory) cultural baggage that comes along with the title "Transhumanist," and that some more committed transhumanists may disagree with my usage.  But I want to take a minute to talk about what transhumanism means to me.

Eliezer Yudkowsky makes the case for "Transhumanism as Simplified Humanism," and I agree with it.

I don't necessarily believe* that we're going to develop nanotechnology that will push our brains into super-overdrive.  I don't necessarily believe that we're going to fully unpick the workings of the human mind within my lifetime.  (I do think that prosthetics are quickly reaching the point where, given specific intentions, specialized prosthetics may be better options for some individuals than their natural-born limbs, and I think that range will expand rapidly, even if it never fully overtakes natural-born human biology.)

I do think that penicillin, laser eye surgery, cell phones and cars constitute exactly the same sort of augmentation to humanity that transhumanists propose.  I think language is a clever manipulation of our innate sound-making capacity, and written language is an augmentation on top of that.  Fire is a false environment.  Clothes are augmented skin.

When I say I'm a transhumanist, I mean to affirm my confidence in, and acceptance of, humans' relationship to technology.  I believe, as put beautifully by Dresden Codak author Aaron Diaz, [EDIT: in the words of his character, Kaito Kusanagi]

No, never say "us" and "them."  You separate a man from his tools -- Take his clothes, his history and his language away... he becomes an animal.  The machines... They are the hands and we are the head.  Only together do we make humanity.

(Emphasis mine.)

I call myself a transhumanist, specifically, to separate myself from people who wouldn't affirm that claim.  I call myself a transhumanist to separate myself from a particular kind of hypocrisy: the false nostalgia for some sort of platonic humanity that convinces people they're not intimately, existentially dependent upon technology -- not just to enjoy the comforts of their life as it exists, but to be human in the way they understand it.  Humanity, as a philosophical concept, is largely taken to mean something other than an arbitrary sequence of genetic code.  That philosophical concept falls apart in the absence of technology, and it's hypocritical to affirm all of the technology that existed up until around the time you turned 15, after which point everything is an unnatural aberration.

I also want to make it clear that when I call myself a transhumanist, I don't mean singularitarian.  I don't believe that we're on our way to a technological singularity.  I don't believe we aren't, either -- I hold a position of strong agnosticism on the year 2030** and the rapture of the geeks.  That is to say, I don't know.  There's a huge body of analysis and argument both for and against the singularity, and I'm simply not qualified to pick through it.  I find arguments on both sides compelling, which speaks mostly to the skill of the writers.  I don't know how to tell true from false in that context, so I will remain singularity-agnostic until the singularity hits, until it is comprehensively and empirically falsified, or until my death.

 

*In this article, I want to emphasize as much as possible that when I say "I don't believe," I do not mean "I believe the opposite of."  When I say I don't believe something, unless otherwise specified, I mean that I'm unconvinced -- not that I'm convinced against.

**Or 2045, if you ask Ray Kurzweil.