Category Archives: Personal

Singularity Summit 2008

FYI all:  The Singularity Summit 2008 is coming up, 9am-5pm October 25th, 2008 in San Jose, CA.  This is run by my host organization, the Singularity Institute.  Speakers this year include Vernor Vinge, Marvin Minsky, the CTO of Intel, and the chair of the X Prize Foundation.

Before anyone posts any angry comments: yes, the registration costs actual money this year.  The Singularity Institute has run free events before, and will run free events in the future.  But while past Singularity Summits have been media successes, they haven’t been fundraising successes up to this point.  So Tyler Emerson et al. are trying it a little differently.  TANSTAAFL.

Lots of speakers talking for short periods this year.  I’m intrigued by that format.  We’ll see how it goes.

Brief Break

I’ve been feeling burned on Overcoming Bias lately, meaning that I take too long to write my posts, which decreases the amount of recovery time, making me feel more burned, etc.

So I’m taking at most a one-week break.  I’ll post small units of rationality quotes each day, so as to not quite abandon you.  I may even post some actual writing, if I feel spontaneous, but definitely not for the next two days; I have to enforce this break upon myself.

When I get back, my schedule calls for me to finish up the Anthropomorphism sequence, and then talk about Marcus Hutter’s AIXI, which I think is the last brain-malfunction-causing subject I need to discuss.  My posts should then hopefully go back to being shorter and easier.

Hey, at least I got through over a solid year of posts without taking a vacation.

Doctor, There are Two Kinds of “No Evidence”

I have a relative who has cancer and has been taking a particular chemotherapy drug.  It has been very successful; all of the tests and scans have been coming back negative for some time.  Recently I went along to an appointment with a fancy consulting oncologist to get his opinion about how much longer to continue with the drug.  Going into the appointment, I had the idea (based on nothing but what seemed to me like common sense) that there was a tradeoff: more chemo means a higher chance that the cancer won’t reappear, but also means a higher chance of serious side effects, and that we were going there to get his opinion on whether in this case the pros outweighed the cons or vice versa.

What he said instead was that there was "no evidence" that additional chemo, after there are no signs of disease, did *any* additional good at all, and that the treatments therefore should have been stopped a long time ago and should certainly stop now.  I asked him what was incorrect about the (seemingly) common-sense notion that additional chemo might get rid of the last little bits of cancer that are too small to show up on scans, and he said, more or less, that it’s not my idea of common sense that matters, it’s the evidence, and there is no evidence that things work that way.

So then I asked him whether by "no evidence" he meant that there have been lots of studies directly on this point which came back with the result that more chemo doesn’t help, or whether he meant that there was no evidence because there were few or no relevant studies.  If the former were true, then it’d be pretty much game over: the case for discontinuing the chemo would be overwhelming.  But if the latter were true, then things would be much hazier: in the absence of conclusive evidence one way or the other, one would have to operate in the realm of interpreting imperfect evidence; one would have to make judgments based on anecdotal evidence, on theoretical knowledge of how the body works and how cancer works, or whatever.

And good people, maybe I’m being unfair and underestimating this guy, but I swear to you that this fancy oncologist in this very prestigious institution didn’t seem to understand the difference between these two types of "no evidence."  So while he had a very strong and very (generally) laudable instinct that one ought to base one’s medical opinions on evidence rather than instinct, he seemed to be unable to avoid what strikes me as a pretty fundamental mistake.*  I’d love to hear thoughts about this, particularly from doctors who either have something to say about whether this is a common mistake among doctors or who have something to say about the chemotherapy question itself.

*The most generous possible interpretation of what went on, but which would require me to attribute to him a thought process that he did not express at all, is that he understands the difference between the two types of "no evidence" but has come to believe that doctors’ interpretations of imperfect evidence will systematically lead them to over-treat and so has adopted a rule of "do nothing unless there is strong evidence that you should do something" as a second-best optimum.
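
To put rough numbers on the distinction, here is a minimal Bayesian sketch; the probabilities are invented purely for illustration.  A study that looked for a benefit and found none pushes belief in the benefit down sharply, whereas the absence of any study leaves the prior untouched and the decision back in the realm of judgment.

```python
# A minimal Bayesian sketch of the two kinds of "no evidence".
# All numbers are invented for illustration.

def update(prior, p_data_if_true, p_data_if_false):
    """Posterior P(hypothesis | data) via Bayes' rule."""
    num = prior * p_data_if_true
    return num / (num + (1 - prior) * p_data_if_false)

prior_extra_chemo_helps = 0.5  # the "common sense" starting point

# Kind 1: studies directly on this point came back showing no benefit.
# A null result is far more probable if extra chemo really doesn't help,
# so the posterior drops sharply -- pretty much game over.
after_null_trials = update(prior_extra_chemo_helps,
                           p_data_if_true=0.1,   # P(null result | chemo helps)
                           p_data_if_false=0.9)  # P(null result | chemo doesn't help)

# Kind 2: no relevant studies exist, so there is nothing to condition on.
# The prior is untouched and the decision falls back on imperfect evidence.
with_no_trials = prior_extra_chemo_helps

print(after_null_trials)  # 0.1
print(with_no_trials)     # 0.5
```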

Touching the Old

I’m in Oxford right now, for the Global Catastrophic Risks conference.

There’s a psychological impact in walking down a street where any given building might be older than your whole country.

Toby Ord and Anders Sandberg pointed out to me an old church tower in Oxford that is a thousand years old.

At the risk conference I heard a talk from someone about what the universe will look like in 10^100 years (barring intelligent modification thereof, which he didn’t consider).

The psychological impact of seeing that old church tower was greater.  I’m not defending this reaction, only admitting it.

I haven’t traveled as much as I would travel if I were free to follow my whims; I’ve never seen the Pyramids.  I don’t think I’ve ever touched anything that has endured in the world for longer than that church tower.

A thousand years…  I’ve lived less than half of 70, and sometimes it seems like a long time to me.  What would it be like, to be as old as that tower?  To have lasted through that much of the world, that much history and that much change?

Helsinki Meetup

Apparently there are blog readers who’d like to meet in Helsinki while I’m visiting.  So we will meet Thursday July 10 at Loiste at 20:00 local time.  (It’s on the 10th floor.) 

Added 11 July: Thanks to all who came for a great evening!  Clearly Helsinki has an OB weight far out of proportion to its world population fraction. 🙂

Eliezer’s Post Dependencies; Book Notification; Graphic Designer Wanted

I’m going to try to produce summaries of the quantum physics series today or tomorrow.

Andrew Hay has produced a neat graph of (explicit) dependencies among my Overcoming Bias posts – an automatically generated map of the "Followup to" structure:

Eliezer’s Post Dependencies (includes only posts with dependencies)
All of my posts (including posts without dependencies)
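
For the curious, a map like this can be generated mechanically by scanning each post for its "Followup to" links and emitting the edges.  The sketch below is a hypothetical reconstruction with invented post data and a made-up regex, not Andrew Hay’s actual script.

```python
# Hypothetical sketch of generating such a map: scan each post for
# "Followup to" links and emit the edges as Graphviz DOT.
# The post data and regex here are invented stand-ins, not the real script.

import re

posts = {
    "Belief in Belief":
        'Followup to: <a href="/making-beliefs-pay-rent">Making Beliefs Pay Rent</a>',
    "Making Beliefs Pay Rent":
        "No explicit dependencies in this one.",
}

FOLLOWUP = re.compile(r'Followup to:.*?<a href="[^"]*">([^<]+)</a>', re.IGNORECASE)

edges = []
for title, body in posts.items():
    for target in FOLLOWUP.findall(body):
        edges.append((target, title))  # edge: prerequisite -> follow-up post

print("digraph posts {")
for src, dst in edges:
    print(f'  "{src}" -> "{dst}";')
print("}")
```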

Subscribe here to be notified when the popular book comes out (which may be a year or two from now), and/or when I start producing e-books:

Notifications for the rationality book, or for any other stuff I produce

(Thanks to Christian Rovner for setting up PHPList.)

Sometime in the next two weeks, I need to get at least one PowerPoint presentation of mine re-produced to professional standards of graphic design.  Ideally, in a form that will let me make small modifications myself.  This is likely to lead into other graphic design work on producing the ebooks, redesigning my personal website, creating Bayesian Conspiracy T-shirts, etc.

I am not looking for an unpaid volunteer.  I am looking for a professional graphic designer who can do sporadic small units of work quickly.

Bloggingheads: Yudkowsky and Horgan

I appear today on Bloggingheads.tv, in "Science Saturday: Singularity Edition", speaking with John Horgan about the Singularity.  I talked too much.  This episode needed to be around two hours longer.

One question I fumbled at 62:30 was "What’s the strongest opposition you’ve seen to Singularity ideas?"  The basic problem is that nearly everyone who attacks the Singularity is either completely unacquainted with the existing thinking or attacks Kurzweil, and in any case it’s more a collection of disconnected broadsides (often mostly ad hominem) than a coherent criticism.  There’s no equivalent in Singularity studies of Richard Jones’s critique of nanotechnology – which I don’t agree with, but at least Jones has read Drexler.  People who don’t buy the Singularity don’t put in the time and hard work to criticize it properly.

What I should have done, though, was to interpret the question more charitably as "What’s the strongest opposition to strong AI or transhumanism?" in which case there are Sir Roger Penrose, Jaron Lanier, Leon Kass, and many others.  None of these are good arguments – or I would have to accept them! – but at least they are painstakingly crafted arguments, and something like organized opposition.

I’m in Boston

Tyler Cowen tells me blog readers like to hear personal details.  It feels a bit odd, but hey, let’s try it sometimes.  Last weekend I held a party – some reviews here.  Today at 4:30 I’m on a Harvard panel:

This will be a debate on Bryan Caplan’s very controversial, but well-argued, book "The Myth of the Rational Voter." Panelists include Caplan and Robin Hanson (both economists from George Mason University), David Estlund (Chair of the Philosophy department at Brown, arguing the pro-democracy side), and economist Jeffrey Miron of Harvard.

Tomorrow noon I talk at MIT’s Center for Collective Intelligence:

Nobel winner Robert Aumann (Econ. ’05) showed in ’76 that Bayesians with a common prior could not "agree to disagree," i.e., have common knowledge of exact yet differing opinions.  Aumann made strong assumptions, but similar results follow from much weaker assumptions: Bayesian wannabes who believe in symmetric prior origins cannot have common belief of one of them foreseeing how another will later disagree.  I review this literature, illustrate with concrete examples, and discuss the disturbing implications for the honesty and rationality of familiar human disagreement.
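
Here is a minimal sketch of the flavor of result the talk describes; it is a toy model with invented numbers, not the construction from the literature.  Two Bayesians share a common prior and each sees one conditionally independent noisy signal; once their posteriors are announced, each can back out the other’s signal, and the disagreement disappears.

```python
# Toy version of "can't agree to disagree" (invented numbers, not the
# construction from the talk): two Bayesians share a common prior over a
# binary state and each sees one conditionally independent noisy signal.

def posterior(prior, signals, accuracy=0.8):
    """P(state = 1 | signals), where each signal matches the state with prob `accuracy`."""
    p = prior
    for s in signals:
        like_if_1 = accuracy if s == 1 else 1 - accuracy  # P(signal | state = 1)
        like_if_0 = 1 - accuracy if s == 1 else accuracy  # P(signal | state = 0)
        p = p * like_if_1 / (p * like_if_1 + (1 - p) * like_if_0)
    return p

prior = 0.5
alice_signal, bob_signal = 1, 0

# Private opinions differ:
print(posterior(prior, [alice_signal]))  # 0.8
print(posterior(prior, [bob_signal]))    # 0.2

# Once each posterior is openly announced, each agent can back out the other's
# signal (0.8 reveals a "1", 0.2 reveals a "0") and update on it.
# Both then hold the same opinion -- they cannot agree to disagree.
print(posterior(prior, [alice_signal, bob_signal]))  # 0.5 for both
```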

I’m very much the absent-minded professor – on my last trip I lost my favorite shirt and my glasses.  I’m seriously considering LASIK, to avoid the hundreds of dollars a year I spend replacing lost glasses.
