
Signaling Gains From Transparency

I said back in February:

For millennia, we humans have shown off our intelligence via complicated arguments and large vocabularies, health via sport achievement, heavy drink, and long hours, and wealth via expensive clothes, houses, trips, etc. Today we appear to have more efficient signaling substitutes, such as IQ tests, medical health tests, and bank statements. Yet we continue to show off in the old ways, and rarely substitute these new ways. Why?

One explanation is inertia. Signaling equilibria require complex coordination, and those who try to change them via deviation can seem non-conformist and socially clueless. Another explanation is hypocrisy. As we discuss in our new book, The Elephant in the Brain, ancient and continuing norms against bragging push us to find plausible deniability for our brags. We can pretend that big vocabularies help us convey info, that sports are just fun, and that expensive clothes, etc. are prettier or more comfortable. It is much harder to find excuses to wave around your IQ test or bank statement for others to see.

It recently occurred to me that a sufficient lack of privacy would be an obvious fix for this problem. Imagine that it were easy to use face recognition to find someone’s official records, and from there to find out their net worth, IQ scores, and health test scores. In that case, observers could more cheaply acquire the same info that we now try to show off in deniable ways.

Yes, we say we want to keep such info private, but the big efforts most of us go through to show off our smarts, health, and wealth suggest that we doth protest too much. And as usual, it is less that we don’t know what policies would make us better off, and more that we don’t much care about that when we choose our political efforts.

Added 7a: Of course there may also be big disadvantages to losing privacy, and our evolved preferences may be tied more to particular surface behaviors and cues than to their general underlying signaling functions.


Security Has Costs

Technical systems are often insecure, in that they allow unauthorized access and control. While strong security is usually feasible if designed in carefully from the start, such systems are usually built fast and on the cheap. So they usually ignore security at first, and then address it later as an afterthought, which then becomes a crude ongoing struggle to patch holes as fast as holes are made or discovered.

The more complex a system is, the more other systems it is adapted to, the more organizations that share it, and the more it is pushed to the edge of technical or financial feasibility, the more likely its security is full of holes.

A dramatic example of this is cell phone security: as today’s Post reports, almost anyone in the world can find out where your cell phone is, and hence where you are. And there’s not much anyone is going to do about this anytime soon.


Fair Date Reporting Act

A week ago I heard an NPR radio interview with an FTC representative on web and phone privacy. She said the FTC protects your privacy by making sure that firms which collect info on your activities can only use it to sell you stuff, not to decide on hiring, renting, lending, or insuring. I thought: why is this where we draw our line on “privacy”?

Looking up a recent FTC report (quotes below), I see it goes back to the 1970 Fair Credit Reporting Act (FCRA), which requires firms that rate you, or collect info on you, for hiring, renting, lending, or insuring to show you everything your rating is based on, and to let you challenge any part of it. Given how completely infeasible it would be to show you all the internet info collected about you, or to let you challenge any of it, this law basically says that hiring, renting, lending, and insuring decisions must not benefit from the vast increase in info that web/phone tech now creates.

Adverse selection, where the people you least want are the most likely to apply, can plague hiring, renting, lending, and insuring. This is a big problem, and many regulations are said to be designed to deal with it. Yet the FCRA clearly makes this hidden-info problem worse, by greatly limiting the info on which such decisions can be based.

To see how far this can go wrong, imagine a Fair Date Reporting Act, requiring all dating decisions to be made on the basis of documented information that potential dates can inspect and challenge. You couldn’t reject someone for a date unless you could clearly document that they are unsuitable. You’d end up relying heavily on some very crude indicators, like weight, education, income, and hair color, and enjoy your dates a lot less. And then they’d probably pass laws prohibiting some indicators as unfair, such as weight.

So why are we more willing to mess up decisions about hiring, renting, lending, or insuring than decisions about dating? Because we see those deciders as dominating us, since they choose to accept or reject us, and because we see big firms as evil. Why don’t we similarly restrict the info firms can use to try to sell us stuff? Because there we see ourselves as doing more of the choosing, making us the dominant party.

Added 11p: Imagine that you were required by law to score all job offers on objective verifiable criteria, such as salary and location, and had to take the job that scored highest. How close would that be to slavery?

Those promised FTC report quotes:
