Category Archives: Morality

Failed Utopia #4-2

Followup to: Interpersonal Entanglement

    Shock after shock after shock –
    First, the awakening adrenaline jolt, the thought that he was falling.  His body tried to sit up in automatic adjustment, and his hands hit the floor to steady himself.  It launched him into the air, and he fell back to the floor too slowly.
    Second shock.  His body had changed.  Fat had melted away in places, old scars had faded; the tip of his left ring finger, long ago lost to a knife accident, had now suddenly returned.
    And the third shock –
    "I had nothing to do with it!" she cried desperately, the woman huddled in on herself in one corner of the windowless stone cell.  Tears streaked her delicate face, fell like slow raindrops into the décolletage of her dress.  "Nothing!  Oh, you must believe me!"
    With perceptual instantaneity – the speed of surprise – his mind had already labeled her as the most beautiful woman he'd ever met, including his wife.

Continue reading "Failed Utopia #4-2" »


Interpersonal Entanglement

Previously in series: Sympathetic Minds

Today I shall criticize yet another Utopia.  This Utopia isn't famous in the literature.  But it's considerably superior to many better-known Utopias – more fun than the Christian Heaven, or Greg Egan's upload societies, for example.  And so the main flaw is well worth pointing out.

This Utopia consists of a one-line remark on an IRC channel:

<reedspacer> living in your volcano lair with catgirls is probably a vast increase in standard of living for most of humanity

I've come to think of this as Reedspacer's Lower Bound.

Sure, it sounds silly.  But if your grand vision of the future isn't at least as much fun as a volcano lair with catpersons of the appropriate gender, you should just go with that instead.  This rules out a surprising number of proposals.

But today I am here to criticize Reedspacer's Lower Bound – the problem being the catgirls.

I've joked about the subject, now and then – "Donate now, and get a free catgirl or catboy after the Singularity!" – but I think it would actually be a terrible idea.  In fact, today's post could have been entitled "Why Fun Theorists Don't Believe In Catgirls."

Continue reading "Interpersonal Entanglement" »


Sympathetic Minds

Previously in series: In Praise of Boredom
Followup to: Humans in Funny Suits

"Mirror neurons" are neurons that are active both when performing an action and observing the same action – for example, a neuron that fires when you hold up a finger or see someone else holding up a finger.  Such neurons have been directly recorded in primates, and consistent neuroimaging evidence has been found for humans.

You may recall from my previous writing on "empathic inference" the idea that brains are so complex that the only way to simulate them is by forcing a similar brain to behave similarly.  A brain is so complex that if a human tried to understand brains the way that we understand e.g. gravity or a car – observing the whole, observing the parts, building up a theory from scratch – then we would be unable to invent good hypotheses in our mere mortal lifetimes.  The only possible way you can hit on an "Aha!" that describes a system as incredibly complex as an Other Mind, is if you happen to run across something amazingly similar to the Other Mind – namely your own brain – which you can actually force to behave similarly and use as a hypothesis, yielding predictions.

So that is what I would call "empathy".

And then "sympathy" is something else on top of this – to smile when you see someone else smile, to hurt when you see someone else hurt.  It goes beyond the realm of prediction into the realm of reinforcement.

And you ask, "Why would callous natural selection do anything that nice?"

Continue reading "Sympathetic Minds" »


In Praise of Boredom

Previously in series: Seduced by Imagination

If I were to make a short list of the most important human qualities –

– and yes, this is a fool's errand, because human nature is immensely complicated, and we don't even notice all the tiny tweaks that fine-tune our moral categories, and who knows how our attractors would change shape if we eliminated a single human emotion –

– but even so, if I had to point to just a few things and say, "If you lose just one of these things, you lose most of the expected value of the Future; but conversely if an alien species independently evolved just these few things, we might even want to be friends" –

– then the top three items on the list would be sympathy, boredom and consciousness.

Boredom is a subtle-splendored thing.  You wouldn't want to get bored with breathing, for example – even though it's the same motions over and over and over and over again for minutes and hours and years and decades.

Now I know some of you out there are thinking, "Actually, I'm quite bored with breathing and I wish I didn't have to," but then you wouldn't want to get bored with switching transistors.

According to the human value of boredom, some things are allowed to be highly repetitive without being boring – like obeying the same laws of physics every day.

Conversely, other repetitions are supposed to be boring, like playing the same level of Super Mario Brothers over and over and over again until the end of time.  And let us note that if the pixels in the game level have a slightly different color each time, that is not sufficient to prevent it from being "the same damn thing, over and over and over again".

Once you take a closer look, it turns out that boredom is quite interesting.

Continue reading "In Praise of Boredom" »


Justified Expectation of Pleasant Surprises

Previously in series: Eutopia is Scary

I recently tried playing a computer game that made a major fun-theoretic error.  (At least I strongly suspect it's an error, though they are game designers and I am not.)

The game showed me – right from the start of play – what abilities I could purchase as I increased in level.  Worse, there were many different choices; still worse, you had to pay a cost in fungible points to acquire them, making you feel like you were losing a resource…  But today, I'd just like to focus on the problem of telling me, right at the start of the game, about all the nice things that might happen to me later.

I can't think of a good experimental result that backs this up; but I'd expect that a pleasant surprise would have a greater hedonic impact than being told about the same gift in advance.  Sure, the moment you were first told about the gift would be good news, a pleasure in itself.  But you wouldn't have the gift in hand at that moment, which limits the pleasure.  And then you have to wait.  And then when you finally get the gift – it's pleasant to go from not having it to having it, if you didn't wait too long; but a surprise would have a larger momentary impact, I would think.

This particular game had a status screen that showed all my future class abilities at the start of the game – inactive and dark but with full information still displayed.  From a hedonic standpoint this seems like miserable fun theory.  All the "good news" is lumped into a gigantic package; the items of news would have much greater impact if encountered separately.  And then I have to wait a long time to actually acquire the abilities, so I get an extended period of comparing my current weak game-self to all the wonderful abilities I could have but don't.

Imagine living in two possible worlds.  Both worlds are otherwise rich in challenge, novelty, and other aspects of Fun.  In both worlds, you get smarter with age and acquire more abilities over time, so that your life is always getting better.

But in one world, the abilities that come with seniority are openly discussed, hence widely known; you know what you have to look forward to.

In the other world, anyone older than you will refuse to talk about certain aspects of growing up; you'll just have to wait and find out.

Continue reading "Justified Expectation of Pleasant Surprises" »


Eutopia is Scary

Previously in series: Continuous Improvement
Followup to: Why is the Future So Absurd?

    "The big thing to remember about far-future cyberpunk is that it will be truly ultra-tech.  The mind and body changes available to a 23rd-century Solid Citizen would probably amaze, disgust and frighten that 2050 netrunner!"
        — GURPS Cyberpunk

Pick up someone from the 18th century – a smart someone.  Ben Franklin, say.  Drop them into the early 21st century.

We, in our time, think our life has improved in the last two or three hundred years.  Ben Franklin is probably smart and forward-looking enough to agree that life has improved.  But if you don't think Ben Franklin would be amazed, disgusted, and frightened, then I think you far overestimate the "normality" of your own time.  You can think of reasons why Ben should find our world compatible, but Ben himself might not see it that way.

Movies that were made in, say, the 40s or 50s seem much more alien – to me – than modern movies allegedly set hundreds of years in the future, or in different universes.  Watch a movie from 1950 and you may see a man slapping a woman.  Doesn't happen a lot in Lord of the Rings, does it?  Drop back to the 16th century and one popular entertainment was setting a cat on fire.  Ever see that in any moving picture, no matter how "lowbrow"?

("But," you say, "that's showing how discomforting the Past's culture was, not how scary the Future is."  Of which I wrote, "When we look over history, we see changes away from absurd conditions such as everyone being a peasant farmer and women not having the vote, toward normal conditions like a majority middle class and equal rights…")

Something about the Future would shock us 21st-century folk, if we were dropped in without slow adaptation.  This is not because the Future is cold and gloomy – I am speaking of a positive, successful Future; the negative outcomes are probably just blank.  Nor am I speaking of the idea that every Utopia has some dark hidden flaw.  I am saying that the Future would discomfort us because it is better.

Continue reading "Eutopia is Scary" »


Continuous Improvement

Previously in series: Serious Stories

When is it adaptive for an organism to be satisfied with what it has?  When does an organism have enough children and enough food?  The answer to the second question, at least, is obviously "never" from an evolutionary standpoint.  The answer to the first might be "now" if the reproductive risks of all available options exceed their reproductive benefits.  In general, though, it is a rare organism in a rare environment whose reproductively optimal strategy is to rest with a smile on its face, feeling happy.

To a first approximation, we might say something like "The evolutionary purpose of emotion is to direct the cognitive processing of the organism toward achievable, reproductively relevant goals".  Achievable goals are usually located in the Future, since you can't affect the Past.  Memory is a useful trick, but learning the lesson of a success or failure is a different goal from the one pursued in the original event – and usually the emotions associated with the memory are less intense than those of the original event.

So, given the way organisms and brains are built right now, "true happiness" might be a chimera, a carrot dangled in front of us to make us take the next step, and then yanked out of our reach as soon as we achieve our goals.

This hypothesis is known as the hedonic treadmill.

The famous pilot studies in this domain demonstrated, for example, that past lottery winners' stated subjective well-being was not significantly greater than that of an average person after a few years or even months.  Conversely, six months after the accident, victims with severed spinal cords were not as happy as they had been before – around 0.75 standard deviations below control groups – but they had still adjusted much more than they expected to.

This being the transhumanist form of Fun Theory, you might perhaps say:  "Let's get rid of this effect.  Just delete the treadmill, at least for positive events."

Continue reading "Continuous Improvement" »


Serious Stories

Previously in series: Emotional Involvement

Every Utopia ever constructed – in philosophy, fiction, or religion – has been, to one degree or another, a place where you wouldn't actually want to live.  I am not alone in this important observation:  George Orwell said much the same thing in "Why Socialists Don't Believe In Fun", and I expect that many others said it earlier.

If you read books on How To Write – and there are a lot of books out there on How To Write, because amazingly a lot of book-writers think they know something about writing – these books will tell you that stories must contain "conflict".

That is, the more lukewarm sort of instructional book will tell you that stories contain "conflict".  But some authors speak more plainly.

"Stories are about people's pain."  Orson Scott Card.

"Every scene must end in disaster."  Jack Bickham.

In the age of my youthful folly, I took for granted that authors were excused from the search for true Eutopia, because if you constructed a Utopia that wasn't flawed… what stories could you write, set there?  "Once upon a time they lived happily ever after."  What use would it be for a science-fiction author to try to depict a positive Singularity, when a positive Singularity would be…

…the end of all stories?

It seemed like a reasonable framework with which to examine the literary problem of Utopia, but something about that final conclusion produced a quiet, nagging doubt.

Continue reading "Serious Stories" »


Emotional Involvement

Previously in series: Changing Emotions
Followup to: Evolutionary Psychology, Thou Art Godshatter, Existential Angst Factory

Can your emotions get involved in a video game?  Yes, but not much.  Whatever sympathetic echo of triumph you experience on destroying the Evil Empire in a video game, it's probably not remotely close to the feeling of triumph you'd get from saving the world in real life.  I've played video games powerful enough to bring tears to my eyes, but they still aren't as powerful as the feeling of significantly helping just one single real human being.

Because when the video game is finished, and you put it away, the events within the game have no long-term consequences.

Maybe if you had a major epiphany while playing…  But even then, only your thoughts would matter; the mere fact that you saved the world, inside the game, wouldn't count toward anything in the continuing story of your life.

Thus fails the Utopia of playing lots of really cool video games forever.  Even if the games are difficult, novel, and sensual, this is still the idiom of life chopped up into a series of disconnected episodes with no lasting consequences.  A life in which equality of consequences is forcefully ensured, or in which little is at stake because all desires are instantly fulfilled without individual work – these likewise will appear as flawed Utopias of dispassion and angst.  "Rich people with nothing to do" syndrome.  A life of disconnected episodes and unimportant consequences is a life of weak passions, of emotional uninvolvement.

Our emotions, for all the obvious evolutionary reasons, tend to associate to events that had major reproductive consequences in the ancestral environment, and to invoke the strongest passions for events with the biggest consequences:

Falling in love… birthing a child… finding food when you're starving… getting wounded… being chased by a tiger… your child being chased by a tiger… finally killing a hated enemy…

Continue reading "Emotional Involvement" »


Changing Emotions

Previously in series:  Growing Up is Hard

    Lest anyone reading this journal of a primitive man should think we spend our time mired in abstractions, let me also say that I am discovering the richness available to those who are willing to alter their major characteristics.  The variety of emotions available to a reconfigured human mind, thinking thoughts impossible to its ancestors…
    The emotion of -*-, describable only as something between sexual love and the joy of intellection – making love to a thought?  Or &&, the true reverse of pain, not "pleasure" but a "warning" of healing, growth and change. Or (^+^), the most complex emotion yet discovered, felt by those who consciously endure the change between mind configurations, and experience the broad spectrum of possibilities inherent in thinking and being.

        — Greg Bear, Eon

So… I'm basically on board with that sort of thing as a fine and desirable future.  But I think that the difficulty and danger of fiddling with emotions is oft-underestimated.  Not necessarily underestimated by Greg Bear, per se; the above journal entry is from a character who was receiving superintelligent help.

But I still remember one time on the Extropians mailing list when someone talked about creating a female yet "otherwise identical" copy of himself.  Something about that just fell on my camel's back as the last straw.  I'm sorry, but there are some things that are much more complicated to actually do than to rattle off as short English phrases, and "changing sex" has to rank very high on that list.  Even if you're omnipotent so far as raw ability goes, it's not like people have a binary attribute reading "M" or "F" that can be flipped as a primitive action.

Changing sex makes a good, vivid example of the sort of difficulties you might run into when messing with emotional architecture, so I'll use it as my archetype:

Continue reading "Changing Emotions" »
