If futarchy ends up as successful as I hope, the world will have to choose explicit ex-post-measurable goals to assign to their governments, and political talk and activism will turn to focus more on fighting over such goals.
This is really interesting.
It would be helpful if you explained more precisely what you polled - it's not clear how you converted a "ranking" (if that's what you polled) into numbers from 0 to 100. And while you're clear about the timing for "self right now", "lifetime", and "all spacetime", it's not clear what time period was ranked for the other scopes.
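For concreteness, here is a minimal sketch of one plausible conversion, assuming an ordinal ranking was collected; the linear mapping and the example value names are my guesses, not necessarily what the post actually did:

```python
def rank_to_score(rank: int, k: int) -> float:
    """Map rank 1 (top) .. k (bottom) linearly onto 100 .. 0."""
    return 100.0 * (k - rank) / (k - 1)

# Hypothetical respondent ranking of four values, best first.
ranking = ["peace", "liberty", "equality", "tradition"]
scores = {v: rank_to_score(i + 1, len(ranking)) for i, v in enumerate(ranking)}
# -> {'peace': 100.0, 'liberty': ~66.7, 'equality': ~33.3, 'tradition': 0.0}
```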
But this offers a glimpse into possible political controversy and disagreement under futarchy. People may argue over time scale and scope. (Some will think high property values are *bad* as they make housing more expensive.) Exactly why 'multiverse' elicits different priorities than 'world' seems worth studying.
Of course the values your respondents ranked are subject to differing interpretations - did you define 'liberty' in any way? I see personal liberty as important but 'national liberty' (not being ruled by outsiders) as irrelevant except as it influences the other values. Maybe I'm an outlier. 'Econ efficiency' seems a means to an end, not an end. You could say the same about some of the others.
See the addition to the post.
I like the ambition of this work. But two observations. 1. As Henry Ford may have said, "if I asked people what they wanted, they would have wanted a faster horse." I've spent a lot of my working life asking people what they want/need, and I agree with Henry that people are very short-sighted. You need to coach them a lot before they can imagine a better future and answer the question intelligently. 2. Even as an Australian, which is culturally pretty close to North American, I get a strong US-centric vibe from the list. US public discussion prioritizes liberty like no other public discourse I can think of. People who don't have peace and safety will usually put those at number 1, well ahead of liberty. Equality is also a big deal in non-US discourse, as is respect for tradition and holiness in some places.
A few notes:
There are 112 values in the table.
The median value in the table falls between 24.3 and 24.4; the overall mean is 34.9; and the most common value is 100. 26 values are >50; 18 of those are in the upper left corner. 12 are <10; 7 of those are in the lower right corner.
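For anyone who wants to check such stats against the table, a minimal sketch of computing them, assuming the table were loaded as a NumPy array; the `values` array here is a placeholder, not the actual data:

```python
import numpy as np

# Placeholder standing in for the 112-cell table of 0..100 scores;
# an 8 x 14 grid gives the same cell count as the actual table.
values = np.random.default_rng(0).uniform(0, 100, size=(8, 14))

flat = values.ravel()
print("count:", flat.size)                 # 112
print("median:", np.median(flat))          # mean of the 56th and 57th sorted values
print("mean:", round(flat.mean(), 1))
print("values > 50:", int((flat > 50).sum()))
print("values < 10:", int((flat < 10).sum()))
```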
Our desires in the moment are uninformed. If we learned more about facts and consequences, and gained more experience with how we feel after different outcomes, then we would have better-informed desires. Our true desires are what we would want if we were ideally informed about facts, consequences, and how we feel after different outcomes.
And that's what values in futarchy should be tied to: not our desires in the moment, but our posterior desires, with the benefit of knowledge and hindsight.
But decisions now will have to be based on best estimates now of those future, better-known desires. There should be ways to give traders good incentives to estimate those.
It would be better if the markets could pay off based on fulfillment of the true, better-informed desires we may have in the future, rather than fulfillment of our current estimates of what those true desires might be.
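As a toy sketch of what that settlement rule might look like (every name and number here is hypothetical, not a worked-out mechanism):

```python
def settle_welfare_share(trade_price: float, repoll_value: float) -> float:
    """Profit per share when the market settles on a later, better-informed
    re-poll of the value, rather than on traders' estimates at decision time."""
    # Buying below the eventual informed value profits; this rewards
    # forecasting posterior desires instead of desires in the moment.
    return repoll_value - trade_price

# Hypothetical example: traders price fulfillment of 'liberty' at 60 now;
# years later, an informed re-poll scores the realized outcome at 72.
print(settle_welfare_share(60.0, 72.0))  # 12.0 profit per share bought at 60
```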
Yes, but how can we do that?