Followup to: Life’s Story Continues, Surprised by Brains, Cascades, Cycles, Insight, Recursion, Magic, Engelbart: Insufficiently Recursive, Total Nano Domination
I think that at some point in the development of Artificial Intelligence, we are likely to see a fast, local increase in capability – "AI go FOOM". Just to be clear on the claim, "fast" means on a timescale of weeks or hours rather than years or decades; and "FOOM" means way the hell smarter than anything else around, capable of delivering in short time periods technological advancements that would take humans decades, probably including full-scale molecular nanotechnology (that it gets by e.g. ordering custom proteins over the Internet with 72-hour turnaround time). Not, "ooh, it’s a little Einstein but it doesn’t have any robot hands, how cute".
Most people who object to this scenario object to the "fast" part. Robin Hanson objected to the "local" part. I'll try to handle both, though not all in one shot today.
We are setting forth to analyze the developmental velocity of an Artificial Intelligence. We’ll break down this velocity into optimization slope, optimization resources, and optimization efficiency. We’ll need to understand cascades, cycles, insight and recursion; and we’ll stratify our recursive levels into the metacognitive, cognitive, metaknowledge, knowledge, and object level.
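As a minimal sketch of why the recursive level matters, here is a toy model (the numbers and the 0.1 coupling constant are arbitrary illustrative choices, not anything derived in this post): capability gains per step are taken to be proportional to optimization efficiency times optimization resources, and we compare an optimizer that spends all its output at the object level against one that reinvests half of its output into its own efficiency.

```python
# Toy model: object-level-only optimization vs. reinvesting output into
# the optimizer's own efficiency. All units are arbitrary.

def run(steps, reinvest_fraction, resources=1.0):
    capability = 1.0
    efficiency = 1.0
    for _ in range(steps):
        output = efficiency * resources                   # optimization power this step
        capability += (1 - reinvest_fraction) * output    # object-level gains
        efficiency += reinvest_fraction * output * 0.1    # recursive gains (assumed 0.1 coupling)
    return capability

flat = run(steps=60, reinvest_fraction=0.0)   # no recursion: capability grows linearly
foom = run(steps=60, reinvest_fraction=0.5)   # recursion: efficiency compounds, growth accelerates

print(f"no reinvestment : capability ~ {flat:.1f}")
print(f"50% reinvestment: capability ~ {foom:.1f}")
```

With no reinvestment the curve is a straight line; with reinvestment, efficiency compounds each step and capability growth accelerates. The point of the sketch is only the shape of the curves, not the particular constants.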
Quick review:
Continue reading "Recursive Self-Improvement" »