This is our monthly place to discuss relevant topics that have not appeared in recent posts.
Since Robin has an interest in paternalism and better access to info, I'd like him to help get the word out about the FDA's attempt to get in the way of people knowing their own genetic information.
Citation? I've seen studies that give suicide rates that high for borderline personality disorder, but never for DSM-IV Major Depressive Disorder. The correlation is embarrassingly weak for depression and suicide.
I do not understand. Are you saying depression is the recuperation after such a fight-or-flight response, or that it is part of the response itself?
Mayo Clinic studies find that those who suffer from recurring depression have a 9% (1 in 11) chance of death from suicide. This makes it one of the most common causes of death, just above firearms and just below motor vehicle accidents.
It seems you can buy a service in two ways:
1. jobs I can do, but don't want to do
2. jobs I can't do
In the former are many menial jobs and in the latter are many of the finer things in life. If you tell a cook how you want your food, your dining experience is the former. You'd never tell your heart surgeon how you want your incision.
These New York chefs are defending the status of their product.
Interesting book! I may have to pick that up. I have gone around with Robin on the issue and I think he is wrong. Research from anthropologists and archeologists (such as Keeley and LeBlanc) makes it pretty clear that foragers are warlike.
REALIST: Research on primitive societies shows that they are very warlike.

UTOPIAN: Those societies are not primitive enough. They are all at least somewhat "post-forager." Some are pastoralists, some have limited agriculture, etc.

REALIST: If you look strictly at foragers like the Inuit, Kung San, and Aborigines, you see that their history is very warlike.

UTOPIAN: Those are only a small number of data points. We don't have enough true foragers to examine.

REALIST: If you look at the archeological record, you find people before Neolithic times were very warlike. There is ample evidence of arrows and axes designed just for warfare, defensive formations, bones with weapons still in them, etc.

UTOPIAN: The archeological record is also spotty.

REALIST: Now consider our model. Records of people on small islands show that they do not live "in balance" with the environment. They tend to overuse their local resources slowly. E.g., piles of discarded abalone shells get smaller from generation to generation, indicating that they are slowly overusing their resource until they disappear from the island. Why doesn't this happen as often on the mainland? Because people who slowly deplete their local resources make war on their neighbors.

UTOPIAN: That is just a theory. You haven't proven that.

REALIST: I think any good Bayesian would have concluded that foragers are warlike unless they had a subjective prior heavily skewed toward the Noble Savage. And even then, the available evidence strongly confirms the warlike savage.
You've written a lot about foragers vs farmers, including farming bringing war. Azar Gat's "War in Human Civilization" is excellent (so far at least) and places a lot of attention on the distinction between true hunter-gatherers and mere primitive agriculturalists, and how we can get uncontaminated evidence of what our ancestors were like. He concludes that they were pretty similar to primitive farmers with respect to their proclivity towards war.
I think there is nothing irrational about depression. I see depression in physiological terms: it is the necessary aversive mental state between "at rest" and the euphoric state of near-death metabolic stress when you are running from a bear, where to be caught is certain death. Physiology induces a state of euphoria so that one can run until one has escaped from the bear or dropped dead from exhaustion. Being caught and dropping dead from exhaustion are *the same* as far as evolution is concerned. That state has to be euphoric if one is going to ignore the pain signals and continue running. If organisms could enter a euphoric state easily, they would, and would risk death with no benefit.
Evolution has configured physiology to minimize the sum of deaths from being caught by bears, from dropping dead from exhaustion and from suicide.
I would just like to point out that I think there is an implementation of hierarchical society that might be friendlier than the traditional one.
Tribalism seems to be crucial in this. I can easily imagine several parallel societies that don't care much about what the others think of them, since they assign status according to different value systems.
If people are allowed to leave their society (but societies are allowed to deny access to anyone), this would provide a small trickle of people who, due to their innate talents, find it easier to advance on the status track in another society than in their own, and this might offset the cost of adjusting to a foreign (or should I say alien) culture.
But modern Westerners are remarkably demanding of ideological conformity, not to mention universalist, so this plan might not be workable with a predominantly Western-derived transhuman society. China doesn't seem any better for now, though I struggle to find a "real" metric to compare the situation in, say, Korea or Japan to our own, so maybe they are an alternative.
There is nothing pathological about a fear of death when it has a fairly overwhelming likelihood of happening in the time frame one cares about.
I don't know about others, but I'm definitely a "far" guy.
To what extent can rationality aid those with abnormal psychology - those who are already predisposed to heavy biases? What specific techniques, models, or questions best address such skewed perspectives?
For example, if you had a friend who was clinically depressed, what advice on rationality might you offer? After the basics of: seek medication and counseling, exercise, and get enough sunlight?
No doubt, but my point concerned a decision today in anticipation of far-future uploading.
My answer to that followed: But many people have already performed this in imagination as a philosophical thought experiment or intuition pump, and this has pumped their intuitions. It’s a popular thought experiment.
In particular, there's probably a significant overlap between upload enthusiasts and science fiction genre fans. Anyone who has read a significant amount of science fiction has almost certainly gone through this or equivalent thought experiments more times than he can remember.
But for a person concerned *today* about being uploaded, there’s no rational basis for prejudice against the exact duplicate compared to the upload.
There is a rational basis for not caring about that which you have no control over anyway, such as perfect copies of you vastly far away. There is a rational basis for reserving your concern for that which you have some control over. You have finite resources, and it is rational to spend them where they will do some good.
This, of course, assumes that you are only one of the copies. On another interpretation, where we take qualitative identity to entail numerical identity, you are all of the copies. You do care about all of them because you are all of them. Since you care about yourself, you care about all of them.
What I think is going on is that we only have room for one “me,” and we reject continuators coming in multiples.
Certainly. This intuition however can be cured by merging. You allow yourself to be duplicated, split into two people. Each person goes out and has a day or a week or a year of experiences. Then at the end of the year, they are re-integrated into a single person. He will have the memories of both. He will not be able to place the memories of either one in order either before or after the other. But this is not unprecedented in reality. I have often had two experiences, and then later on when remembering, was unsure which came first and which came second. He will, however, remember having been split into two, and he will remember having then been each of the two, though seemingly (so it seems to him) at different times.
Do this enough times, and he will expect to become both, not simultaneously, but (as he sees it) at different times. If he is split into two, then each of the duplicates will think something like, "I am this duplicate this time, though at another time I will be - or maybe was (and temporarily forgot) - the other duplicate, though eventually, at re-integration, I will remember everything." And, as he is about to be split into two, he will think, "I am about to become first one of the duplicates, and then the other, though I don't know which one will come first, nor, at the end, will I remember which one came first."
This intuition could, conceivably, be so strong that he feels that he will become both (but "at different times") even if he knows there will be no re-integration at the end. And each duplicate may even face death with equanimity, with the gut feeling that all that his death amounts to is the loss of some memories - as long as the other duplicate survives.
With the intuition properly prepared, then, a person can anticipate becoming both of the duplicates when a person is split in two. For example, if he is uploaded but kept alive so that there are now two of him. Or if he is copied a thousand times.
"Once the process of repeatedly uploading and downloading a personality gets going and is repeated many times, the resulting personalities will be well-stocked with memories of having been uploaded and downloaded multiple times and having survived the transfer, and as a result they will anticipate without any trepidation the next upload."
No doubt, but my point concerned a decision today in anticipation of far-future uploading. (I'm not sure I was clear regarding that.)
"Applying janos’s point, they fail to care because they don’t remember having been those other people, so they don’t anticipate being those people, i.e. their intuition hasn’t been properly pumped to get them to anticipate that."
Yes, I agree with that [except possibly your application of "intuition pump," but let's leave that aside]. But for a person concerned *today* about being uploaded, there's no rational basis for prejudice against the exact duplicate compared to the upload. I can make the distinction by introducing causal continuity, but what justifies this when your duplicate has been influenced by exactly the same events and has responded with exactly the same thoughts? Why should I care about causal continuity when it makes absolutely no difference to my experience? If "I" am constituted by the contents of my thoughts, then anyone who has exactly those thoughts is me, to the same degree as the upload is me.
What I think is going on is that we only have room for one "me," and we reject continuators coming in multiples. But surely someone has asked this question: what if I'm uploaded while I'm alive? Given a choice favoring the welfare of the upload over the meat version or the reverse, would you be equally loyal to each prior to the uploading (you must decide)? What if a thousand versions of you are created? Would that be a good thing (because "you" multiply "your" experiences), or a bad thing (because you are forced to divide your loyalties prior to uploading between a thousand versions)?
I'm sure the conditioning process you and Janos envision would succeed, but it's entirely an acquired taste. When a person becomes concerned *today* about being uploaded tomorrow, his intuition pumps should be primed by these thought experiments. If the person nevertheless decides to remain concerned, he's in the grip of a philosophical position; it is *not* the way our intuitions spontaneously turn. Taking *those* intuitions seriously leads to absurdities, because our concept of identity is limited to one being; but a being created after I die is no different from a being created before I die, and it's hard to see how anyone's intuitions would claim a relevant distinction. Therefore, it seems that any method of continuation that's capable of repeated application fails to create a sense of identity.
I truly wonder whether those who want to be uploaded might prefer being repeatedly uploaded.
Yes, thank you. Hedge funds get "transmissible spongiform encephalopathies" too. However, they have had the Federal Reserve Bank of New York and private banks "cure" their illness. Like cannibals, they should suffer the same fate. However...
Hedge Funds move value. Under any circumstance, they fail to create value.
A delusion is a belief held in the face of overwhelming evidence that the belief is wrong. I don't have a delusion about the persistence of identity because I don't think that identity does persist. I don't have a false belief about it; my belief follows the evidence. Since there is no evidence that self-identity persists, I don't believe that self-identity persists.
There is the illusion of continuity of self-identity, but that is an illusion, just like an optical illusion. It only becomes a delusion when belief in the non-real persists in the face of overwhelming evidence to the contrary. I think the belief in continuity of self-identity is closer to delusional than the belief that there is no such thing. There is massive evidence that there is no continuity of self-identity. Why would anyone think there is such a thing, if not for some pretty strong illusions?
I do not accept grounding objections in philosophy. I think counterfactual logic is perfectly acceptable. But we don't need to worry about it. If there are persistent identities then we can falsify uploading. If there are not persistent identities then we do not need to.
Suppose that X99 walks into an upload booth, has his brain scanned, and then X100 is created. A few seconds later, X99 is disintegrated. What happens phenomenologically to X99?
Skeptics: X99 walks into a booth and then dies.
Persistent Identity Uploaders: X99 walks into a booth, becomes aware of both X99 and X100 at the same time, then is only aware of X100.
Non-Persistent Identity Uploaders: X99 walks into a booth and dies. But so what? The X99 at 3:00 PM is no more like the X99 at 3:01 PM than he is like X100. To the extent that X99 at 3:01 is still the "real" X99, then so is X100.
We can falsify persistent-identity uploading because the first person to upload can report that he never became aware of both X99 and X100 at the same time: his memories of being X99 ended with the brain scan.
We don't need to falsify non-persistent-identity uploading because everyone agrees about how it works. The only question is whether or not doing it is rational. I don't think that it is rational.
I made the reference to The Prestige because at one point, Hugh Jackman's character talks about how he got his hands dirty because he never knew if he'd end up dying in the tank, or as the prestige who got to take the bow. Same reasoning might apply to physically integrating with a computer and then "cutting the cord".
The molecules that make up the meat you are, for the most part, not the molecules that made you up a decade ago. Most of the cells have been replaced with new cells. If sheer physicality is the measure of the residence of consciousness, then none of us are the people we were born as. Thus, consciousness must instead be an algorithm, independent of a physical basis. And if consciousness is an algorithm, the computer you is no less you than the meat you.
Either both the computer and meat you are you, or neither is.
Do you think that "meat you" will have a telepathic bond with "computer you"? That they will know each others' thoughts?
I think a more reasonable alternative is that consciousness is an emergent property that supervenes on the physical. If so, then there would be two "you"s, both of which can remember being the physical you. But I think Constant is correct below: once you really start to think about identity, you must recognize that if naturalism is true, a persistent identity of any type is an illusion. In that sense, I would go with the option of "neither".