Imagine: your relative is a celebrity, currently the focus of a media scandal. You and other relatives are talking together, strategizing. Reporters are asking for interviews, and digging into your world, probably via both legal and illegal means. What should you do?
Some of you say to fight, fight, fight. Threaten legal action, hire a security team, and threaten any relative who doesn’t hold a hard line with expulsion from the family. This is about moral absolutes, after all, and “extremism in defense of liberty is no vice.” Give ’em an inch and they’ll take a mile. So never ever compromise.
Others of you suggest compromise. Hold a press conference and directly tell reporters many of the things they want to know, enough that they'll likely be satisfied and let up on their digging. It wouldn't be an explicit deal, but they have limited resources and plenty of other things to do. And if they dig more, they may find new issues to focus on.
This second group is usually right. Peace is usually better than war. Maybe in some cases reporters couldn’t be placated, and would still push into all your private niches. But usually not. This is the idea behind the proposal of my last post. Biometrics is getting cheaper and more reliable, our imperfectly-secure smart-phones know a great deal about us, and the people and organizations around us have strong reasons to want to know a few key things about us.
So let’s compromise, I say. Find a way to set up a system that gives those people and orgs around us the few key bits that they most want to know, and that we don’t so much mind them knowing. Then maybe they won’t work so hard to extract this key info from all the cues that we naturally leak. That is key info they could likely get anyway if they worked hard enough and coordinated, and as a side effect they might obtain and use other info that we would rather keep private.
When conflicts gain a moral color, it can look bad to advocate compromise. But in general, war is bad, peace is good, and compromise can bring peace.
If we follow your strategy, people in the future will settle around a new normal where of course our biometric data is public, and the new question will be how to deal with the companies who want to know everything we eat and drink. Then there'll be more people who argue similarly that we should compromise with the companies and provide them this new trivial information so they'll be placated and not look for more.
The cycle will repeat a few more times (as it has many times before) until all our information is public and the concept of private information doesn't even exist.
We live in a post-Snowden world. There is no reasoning with the data-gatherers; they are ruthless and will steal any of your data that isn't tied down.
But then there's the issue of relativity. Wherever you draw the line, the value of the data on the off-limits side of that line is apt to increase. Exclusive data tends to be valuable data. Also, big tech is very, very good at finding a use for just about every piece of info they can pry off of you, especially now that they finally have at their disposal fairly decent quasi-AI that can detect meaningful patterns with superhuman proficiency. We are approaching the point where, to them, there is virtually no such thing as noise.