Tuesday, 16 October 2007

Diametrically Opposed


You know, when it comes to diametrically opposed views, it’s not Atheists and Pagans who are glaring at each other across the boardroom table, or rather, not all Atheists and Pagans.
It’s rather that tiny subset of Atheists, the Transhumanists, with whom most Pagans find very, very little common ground, and vice versa of course.

I wandered across an article in New Scientist today, which covers the ninth annual meeting of the World Transhumanist Association.

My first reaction on bumping my nose against the Transhumanist paradigm is utter fear and revulsion, which is quite revealing, really, about me. I mean, who are these jokers?

I remember a poll over at my Learned Blogfriend Oaksong’s. The results were about what one would expect – more males than females in favour of ‘eternal life’, with enthusiasm for it dropping away, in both sexes, from roughly middle age.

Some good points were raised in the course of that discussion, though, not least because a couple of the participants were members of my favourite band, Gaia Consort.

Well, back to the question of who are these jokers who think it’s a good idea to upload human consciousness into computer hardware – apparently, members of the patriarchal backbone: white, middle-aged, male technogeeks, some of whom think they are scientists. Now, I was born into and raised up in a scientific family. I have lived most of my life among scientists, studied and worked with them – I am one, in fact. Wonderful people who are or were also scientists include Carl Sagan and my third year astrophysics lecturer David Bennewith. But these jerks only think, or want to believe, that they are members of one of the most accomplished, intelligent and humane subsets of humanity on earth today – they most distinctly are not.

Here’s Marvin Minsky, “Father” of AI (and that nomenclature gives you an idea of where he’s coming from):
"Ordinary citizens wouldn't know what to do with eternal life," says Minsky. "The masses don't have any clear-cut goals or purpose."

Excuse me, Mr Minsky? A bit of an elitist mindset we’ve got there, not so?

Do you really believe that you’re so much better than me? I invite you to any kind of intelligence test you care to name – you’ll be up against me, and I guarantee you, you won’t walk away the clear winner.

As far as humans go, Mr. Minsky, my gamma-male dog is a better person than you are.

This kind of thinking is the direct cause of our finding ourselves where we are today: in a society which believes that violence and threats of violence solve everything, on a planet we have whipped into submission to within an inch of Her life, taking us all down with Her.

Way to go, you overinflated buffoons.

Had enough of the Libertarian who thinks he’s the only type fit to live forever? Here’s some more:
"Scientists shouldn't have ethical responsibility for their inventions, they should be able to do what they want," he says. "You shouldn't ask them to have the same values as other people."


But the most telling remark – the quote which reminds us that these folk are operating from a basepoint of fear of not knowing everything – comes from AI fundi Eliezer Yudkowsky:
"Saying AI will save the world or cure cancer sounds better than saying 'I don't know what's going to happen'."


Oh really? That at least is the bare truth.

One day, you will assimilate the fact that none of us knows diddly squat about the Universe, although we try. And then you will, perhaps, be free.






Pic courtesy Accelerating Future