Min Em Wage Enslaves?

Paul Krugman in ’06:

Serfdom in Russia wasn’t an institution that dated back to the Dark Ages. Instead, it was mainly a 16th-century creation, contemporaneous with the beginning of the great Russian expansion into the steppes. Why? … There’s no point in enslaving … a man unless the wage you would have to pay him if he was free is substantially above the cost of feeding, housing, and clothing him. … Indeed, by 1300 – with Europe very much a Malthusian society – serfdom had withered away from lack of interest.

But now suppose that for some reason land becomes abundant, and labor scarce. Then competition among landowners will tend to push up wages of free workers, and the ruling class will try, if it can, to pin peasants down and prevent them from bargaining for a higher standard of living. In Russia, it was all about gunpowder: suddenly steppe nomads were no longer so formidable, and the rich lands of the Ukraine were open for settlement. (more; HT Mark Thoma via TGGP)

Two aspects of a future em scenario especially bother people:

  1. The em robots might be enslaved.
  2. They might get near subsistence wages.

Many propose regulations to address #2, such as minimum wages or limits on em reproduction. But the case of Russian serfdom contains a warning: above-cost em wages will increase the temptation to enslave ems. If ems can be created for $10 a year but market wages are $100 a year, many will be tempted to create hidden em slaves to do their work. Ordinary ems might be copied against their will into secret computers and then tortured to work.
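To make the incentive concrete, here is a minimal back-of-the-envelope sketch in Python, using the illustrative figures above; all numbers are hypothetical.

```python
# Minimal sketch of the enslavement incentive, using the illustrative
# numbers from the paragraph above; all figures are hypothetical.

run_cost = 10      # $/year to keep an em running ("feed, house, clothe" it)
market_wage = 100  # $/year a free em can command on the market

# The temptation to hide a coerced copy scales with the gap between the
# wage you would otherwise pay and the cost of just running the copy.
gain_per_slave = market_wage - run_cost
print(f"Gain from one hidden slave em: ${gain_per_slave}/year")       # $90/year

# If wages are allowed to fall toward subsistence, the gap, and with it
# the temptation, shrinks toward zero.
subsistence_wage = 11
print(f"Gain near subsistence: ${subsistence_wage - run_cost}/year")  # $1/year
```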

Short of continually inspecting every physical object that might house a computer, it might be very hard to detect such hidden slavery. A far more robust solution is to just let wages fall to near subsistence, where the temptation to enslave will be greatly reduced.

  • Hedonic Treader

    Short of continually inspecting every physical object that might house a computer, it might be very hard to detect such hidden slavery.

    In a world in which sentient algorithms can be run on private computing resources, such universal inspection is probably necessary anyway – unless the rest of society doesn’t care who’s tortured in private for any reason (including plain old sadism).

    If it were possible to create a sysop singleton that enforces a certain set of sentients’ rights globally, such as the right not to suffer involuntarily, or the right for self-aware sentients to self-terminate at any time, this could do the trick. Such a sysop would have access to inspect any computational resources anywhere in the world, but would otherwise not disturb free interactions. Of course, building this system would itself create a single point of failure with significant risks for harm.

    Currently, we don’t seem too concerned with private abuse of non-consenting sentients. For instance, we allow private pet ownership while we insist on the privacy of our own homes. We don’t really control who tortures whom in their basements. However, this attitude might shift in a world in which anyone’s connectome data can be stolen and instantiated on other people’s private computational resources.

    Many propose regulations to address #2, such as minimum wages or limits on em reproduction.

    Or a non-free-market allocation mechanism for material resources and energy. An indirect control of em reproduction would follow from that, considering that computational resources are based on material substrates.

    I’m not sure what kinds of demand for labor would scale with population in a world in which art and intellectual work can be copied limitlessly, and any material good can be represented as a digital blueprint to be instantiated in universal factories such as nanofactories or 3D-printers. Bandwidth, matter and energy seem bottlenecks, but intellectual labor? Maybe administrative tasks could scale, or interactions that draw their subjective value from interacting with self-aware sentients, such as coaching or sexual services. But the former could be aided by expert systems, and demand for the latter seems to be reduced by free interactions between a large connected populace of free individuals.

    It may be a lack of imagination, but I find it hard to see what kind of work all these ems are supposed to do in such a world.

    • http://daedalus2u.blogspot.com/ daedalus2u

      HT, I think the “work” that would be most in demand would be having sentients to bully and torture. People with control and power fetishes will always want individuals to bully and torture. Robin recognizes this, even if he can’t imagine what pretext, i.e. what “labor”, the bullies and torturers would demand to stop the torture. Of course there won’t be any “labor” that would be sufficient, because what the “owner” wants is the emotional gratification of bullying and torturing a self-aware entity.

      That is really the whole point of wealth in an industrial society. Providing food, clothing and shelter to everyone is really cheap. But if everyone has food, clothing and shelter, the opportunities to bully and torture them are greatly reduced.

      Wealth is simply a convenient measure of social power. Those who want a lot of social power over others have to acquire a lot of wealth in order to achieve it. Then they can bully and torture others to some extent. That is the whole point of being at the top of a social power hierarchy, so you can bully and torture people that are below you on it.

      A society could easily prevent that but likely won’t, because it is those who enjoy bullying and torturing who will be writing and enforcing the laws so as to give themselves the power to do so. That is what is happening now. There is no great effort to stop it now for actual living human beings. I appreciate why there is no great effort to stop it now, and appreciate why Robin is so worried about future EMs but (apparently) not so greatly worried about the bullying and torturing that is going on now.

      Universal access to an internet-like communication system at a few hundred baud would cost essentially nothing and could almost completely prevent torture of EMs. Network Neutrality would too, but that is why it is being fought so vigorously by the ISPs. Without the ability to exploit monopoly power, ISPs would have to actually compete against each other with better services, which would reduce profits and power acquisition.

      • Hedonic Treader

        That is really the whole point of wealth in an industrial society. Providing food, clothing and shelter to everyone is really cheap. But if everyone has food, clothing and shelter, the opportunities to bully and torture them are greatly reduced.

        Over-generalization, maybe? Malicious intentions and power games are clearly a part of human psychology, but so are solidarity instincts. Status instincts lead to zero-sum games or even net-negative games due to waste of limited resources, but they’re not necessarily malicious in nature. Furthermore, wealth allocation has more than one function in modern society (it’s also a motivator for innovation).

        Otoh, I do think it’s realistic that malicious gratification is a real psychological factor, and the question of how to build a system based on a net-positive equilibrium is actually really hard. Simple answers like collectivism and libertarian free-market absolutism (while failing to enforce justice of acquisition in practice) haven’t been historically successful. I’m curious how this works out in an ever-more interconnected world with increasing transparency, but I’m not terribly optimistic.

    • http://www.gwern.net gwern

      > Such a sysop would have access to inspect any computational resources anywhere in the world, but would otherwise not disturb free interactions. Of course, building this system would itself create a single point of failure with significant risks for harm.

      One might wonder whether such a system is possible even in principle. Access and inspection disturb computations. Cache misses, latency, debugger hooks… Remember, you can steal private keys just by measuring cache hits. Such access may be deeply harmful and inefficient.
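      A toy illustration of the kind of leak gwern mentions, sketched in Python: a naive comparison whose running time depends on how much of the guess is right. The per-character sleep is exaggerated purely to make the effect visible, and the secret is made up; this is the same principle as recovering keys from cache timing, not that attack itself.

      ```python
      import time

      SECRET = "hunter2"  # hypothetical secret the observer never reads directly

      def insecure_compare(secret, guess):
          # Returns as soon as a character mismatches, so the running time
          # leaks how long a correct prefix the guess shares with the secret.
          for s, g in zip(secret, guess):
              if s != g:
                  return False
              time.sleep(0.001)  # exaggerated per-character cost, for visibility
          return len(secret) == len(guess)

      def measure(guess, trials=20):
          start = time.perf_counter()
          for _ in range(trials):
              insecure_compare(SECRET, guess)
          return time.perf_counter() - start

      # Guesses sharing longer correct prefixes take measurably longer, so an
      # observer who can only *time* the computation recovers the secret
      # character by character without ever inspecting it directly.
      for guess in ("zzzzzzz", "hzzzzzz", "huzzzzz", "hunzzzz"):
          print(guess, round(measure(guess), 4))
      ```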

      As well, it may not even solve the problem. Examining an arbitrary computation and asking whether it’s ‘illegal’ instantly runs afoul of Rice’s theorem/the Halting Problem. And what happens if the computation is homomorphically encrypted? For that matter, what if it’s an encrypted multi-party computation? Or even just a parasitic computation?
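      For readers unfamiliar with that result, here is the standard diagonalization behind it, sketched in Python; `halts` is the hypothetical oracle, not a real function.

      ```python
      def halts(program, argument):
          # Hypothetical oracle: True iff program(argument) eventually halts.
          # The construction below shows why no such total procedure can exist.
          raise NotImplementedError

      def paradox(program):
          # Do the opposite of whatever the oracle predicts about running
          # `program` on its own source.
          if halts(program, program):
              while True:  # loop forever if the oracle says we would halt
                  pass
          return           # halt if the oracle says we would loop

      # paradox(paradox) halts exactly when halts(paradox, paradox) says it
      # doesn't, a contradiction either way. Rice's theorem generalizes this:
      # no algorithm decides any non-trivial semantic property of arbitrary
      # programs, 'is this computation illegal?' included.
      ```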

  • jb

    hrrrmm.

    I know that if I were an ’em’, I would have no interest in doing slave labor for anyone. ‘Turn me off, I don’t care, I’m just a brain in a vat’. Talk about ennui.

    It seems like the best approach to ems would be to keep them from knowing that they are ems. If one can simulate an entire human brain, one can simulate an external world for that em at only a modestly larger budget. And given the right incentives and environment, you could “aim” that em towards performing work that you find valuable, without the em ever being aware of its status. Heck, they might even think they’re economics/philosophy professors at universities outside Washington DC.

  • Rafal Smigrodzki

    Why torture if you can rewrite the whole motivational system *and* make the rewrite ego-syntonic? A lot of human thinking consists of adaptations to survival as reproducing entities in tribes, with strong competition for reproduction-relevant resources (food, sexual access, or a position in the tribe that assures protection from predators or intra-species competitors). Created, non-self-reproducing ems could be more efficient if all that baggage were excised – so as soon as this modification became possible, such ems would be made, and would immediately outcompete unmodified ones, at least as employees. In a way, they would outsource their reproduction directly to market forces acting through interactions between capital owners.

    • Hedonic Treader

      Why torture if you can rewrite the whole motivational system *and* make the rewrite ego-syntonic?

      Sadism? Malicious power rush?

    • Aron

      Robin doesn’t engage with this rather obvious line of argumentation. I think you have to merely take it (or leave it) as a premise that at some point we replicate human minds in the machine, yet we haven’t the capability to engineer any substantial changes to them, at first or even after a long enough time to establish a new economic status quo based on them.

      Meanwhile constraints pile up as to which variety of human is the target for uploading.

      • torekp

        I think you have to merely take it (or leave it) as a premise that at some point we replicate human minds in the machine, yet we haven’t the capability to engineer any substantial changes to them

        Leave it.

      • Jeffrey Soreff

        Aron, I too am skeptical of this premise. To put another spin on it: In order for us to be able to replicate human minds in the machine, we’d need to be able to replicate all of their major parts. The brain is not a homogeneous jelly of neurons; there are on the order of 100 anatomically distinct regions in it. In order for uploading to work, we’d have to be able to engineer all of these architectures correctly (as well as a scanning process to get individual people, which has to work on all types of neurons, and which is only needed for this purpose – but I digress…). If we had only a subset of the regions working and attempted to upload someone, we’d automatically be making major changes, simply because of the missing pieces.

        Consider the economic competition with biomimetic AI. For biomimetic AI to start occupying economic niches, it needs us to understand useful pieces of brain architecture well enough to start applying them. To a significant extent, this is already happening. AI in the areas of computer vision and hearing is already useful enough for a good many applications. I expect the most probable course will be to understand brain modules in a highly uneven way, adding them to our computational tool set. We could easily reach a takeoff point where these tools add up to enough capability to take over all economically important human tasks (including closure – the tasks of building more of the hardware and of further extending its capability), without ever having enough fidelity to really upload someone.

        In this scenario, the time between uploading human minds and modifying them turns negative. Modified pieces of human minds get built and applied first, and extended till the modified and optimized systems dominate the economy, without ever uploading a single complete human.

  • donK

    When it comes to doing work, any machine is only as good and as cheap as its hardware and control system. An ’em’ can’t be cheaper than its hardware even if the software is free. I’m curious what work you see ’em’s doing and what you think the cost of the actual nuts and bolts will be.

  • http://daedalus2u.blogspot.com/ daedalus2u

    Obviously doing captcha.

    http://en.wikipedia.org/wiki/CAPTCHA

  • http://thecandidefund.wordpress.com/ dirk

    Would ems have a sex drive?

    • http://entitledtoanopinion.wordpress.com TGGP

      Robin gave his theory here.

      Also, I can’t understand what your deal is. You seem half troll and half normal.

      • http://thecandidefund.wordpress.com/ dirk

        Thanks for the link. Yeah, that’s about what I am.

        It strikes me that the initial generation of ems will be very depressed about their circumstances (a human mind without a human body) and fail to get any work done. Perhaps they will spend most of their time looking at porn since they can’t achieve satisfaction. They can’t even hope to ATTRACT the opposite sex, so why would working be worth the effort for them? Or, to the extent they would work, wouldn’t they spend their time interested in solving their own problems, such as how to simulate a human body? If they accomplished that my next concern would be that the ems would enslave the humans — at least the female humans — and not the other way around.

        As Robin suggests, sex for our descendants won’t be the same as it is now, but for the ems in vats not to be miserable, sex would have to be so vastly different as not to require a human body at all. Seems unlikely.

      • http://thecandidefund.wordpress.com/ dirk

        Basically I’m saying that ems are a chicken-egg problem when it comes to sexual evolution. Hanson’s theory relies on post-em evolution, but if ems are brain scans it means the first generation will feel itself to be all too human and have all too human desires. An army of quadriplegics will be awfully hard to motivate. Even the most brilliant quadriplegic of our time left his wife for his nurse. He had immense fame, a mind AND a human face to pull that one off. A brilliant mind in the form of electrons is going to be extremely bad at the art of seduction — yet that mind will be just as interested in fucking and sucking as any other human mind. It seems far more likely they would first set themselves on the task of building robotic bodies capable of experiencing human-like sensations and then proceed onward to procuring real human women (flip the script if their prototype was gay) — whatever it takes to do so — because that is what their human minds will desire.

        Or another possibility is the altered-states scenario. Humans hallucinate when placed in sensory deprivation chambers for long. A first generation em will exist in the ultimate sensory deprivation chamber. How to get those brains to work on real-world problems if they are likely to spend a lot of time hallucinating?

      • http://lukeparrish.rationalsites.com/ Luke Parrish

        First-generation ems would probably prefer embodiment in a virtual reality environment where they can have sex with other ems. However, it seems more likely to me that the more efficient and thus competitive ems (at least for most tasks) would develop ways to deactivate the sex drive in order to focus on their work rather than wasting their CPU time on pornographic simulations. If not all ems do this, porn could be used as a form of information warfare, targeting populations that one wishes to cause to consume CPU time nonproductively, thereby gaining a competitive advantage over them.

  • http://www.weidai.com Wei Dai

    I question the implied association between enslavement of ems and torture of ems. These seem like independent issues to me.

    If the threat of torture makes for stronger motivation for ems to work (compared to just the promise of continued existence), then absent effective regulation, ems will be tortured even if wages do fall to subsistence and there is no enslavement, because those ems who are willing to voluntarily accept torture in order to increase their own productivity will out-compete and out-reproduce those who aren’t willing to be tortured.

    On the other hand, if the threat of torture is not a better motivator for ems, then there is no reason to torture enslaved ems. It would be more productive for the slave-owner to just tell them that if they fail to do a sufficient amount of work they will be terminated.

  • rapscallion

    Slavery is somewhat difficult to define when people are software who live inside hardware; cases 1 and 2 might overlap. In our world, anyone who will be killed by another if they stop working is considered a slave. Morally speaking, this principle should carry over to the future. If an em will simply be deleted by the hardware owner for not working (as might be the case even if 2 holds), that’s slavery. I don’t think it’s analogous to a lazy man starving from lack of money from work, because another agent, not nature, is killing the em.

    Of course, the technology might work in a way such that the “lazy man starving from lack of work” analogy is apt, like if programs have to work directly for power or memory space or something.

  • Eric

    Ugh.

    This reminds me of Iain Banks’s new novel Surface Detail, where some societies dream up nightmarish hells for “em”-type entities for all sorts of reasons (religious, sadistic, etc.).

    Actually, the threat of something like that makes me think the risks of em life outweigh the benefits.

    Perhaps 90 versions of my future workaholic selves will find themselves in bliss, endlessly solving project management and engineering problems and having other rewarding experiences (for me to “enjoy” this fate, I’d need serious editing to my reward systems). But perhaps another 10 unlucky versions of me end up tortured over and over at high clock speed for the rest of eternity, with reward and other psychological systems specifically modified to maximize the pain and horror of existence.

    Is this kind of risk really worth it?

    The fact that we can imagine this kind of future is really, really disturbing. I’d much rather have completely alien AIs utterly indifferent to this kind of thing emerge than risk highly twisted human / post-human imaginations and motivations getting any power whatsoever over ems.

    SkyNet would be a kind of salvation if torture of ems becomes feasible and difficult to prevent.

  • http://silasx.blogspot.com Silas Barta

    Many propose regulations to address [human emulations in software getting near subsistence wages].

    Um, did you just use “many” and a political position regarding ems in the same sentence? I think you’re overestimating how much attention this topic gets! lol

    • http://www.gwern.net gwern

      Perhaps it was meant to be a counterfactual prediction – as in, ‘many on hearing of these problems would simply propose creating additional regulations to ensure ems get higher wages’.

      I’d agree that that would be the reaction of many upon hearing of the problem…

  • Mercy

    @dirk
    “It seems far more likely they would first set themselves on the task of building robotic bodies capable of experiencing human-like sensations and then proceed onward to procuring real human women (flip the script if their prototype was gay)”
    “If they accomplished that my next concern would be that the ems would enslave the humans — at least the female humans — and not the other way around.”

    Are you assuming that ems can/will only be taken from male humans, or that the sex issue will only be a problem with male humans?

    In any case there are people out there who are happily asexual, which should suffice. One assumes there will only be a small pool of “optimal” em donors who get copied millions of times, Henrietta Lacks style, so qualities that are merely practically universal aren’t a problem. And we can hormonally kill people’s sex drives without much difficulty, even if we were completely unable to do it directly to a computer simulation.