
Security Has Costs

Technical systems are often insecure, in that they allow unauthorized access and control. While strong security is usually feasible if it is carefully designed in from the start, most systems are instead built fast and on the cheap. So they tend to ignore security at first and address it only later as an afterthought, which then becomes a crude ongoing struggle to patch holes as fast as they are made or discovered.

The more complex a system is, the more different other systems it must adapt to, the more different organizations share it, and the more it is pushed to the edge of technical or financial feasibility, the more likely its security is to be full of holes.

A dramatic example of this is cell phone security. Most anyone in the world can use your cell phone to find out where it is, and hence where you are. And there’s not much anyone is going to do about this anytime soon. From today’s Post:


Fair Date Reporting Act

A week ago I heard an NPR radio interview with an FTC representative on web and phone privacy. She said the FTC protects our privacy by making sure firms that collect info on our activities can only use it to sell us stuff, not to decide on hiring, renting, lending, or insuring. I thought: why is this where we draw our line of “privacy”?

Looking up a recent FTC report (quotes below), I see it goes back to the 1970 Fair Credit Reporting Act (FCRA), which required firms that rate you or collect info on you for hiring, renting, lending, or insuring to show you everything your rating is based on, and to let you challenge any part of it. And given how completely infeasible it would be to show you all the internet info collected about you, or to let you challenge any of it, this law basically says that hiring, renting, lending, and insuring decisions must not benefit from the vast increase in info that web/phone tech now creates.

Adverse selection, where the people you least want are the most likely to apply, can plague hiring, renting, lending, and insuring. This is a big problem, and many regulations are said to be designed to deal with it. Yet the FCRA clearly makes this hidden info problem worse, by greatly limiting the info on which such decisions can be based.

To see how far this can go wrong, imagine a Fair Date Reporting Act, requiring all dating decisions to be made on the basis of documented information that potential dates can inspect and challenge. You couldn’t reject someone for a date unless you could clearly document that they were unsuitable. You’d end up relying heavily on some very crude indicators, like weight, education, income, and hair color, and would enjoy your dates a lot less. And then they’d probably pass laws prohibiting some indicators as unfair, such as weight.

So why are we more willing to mess up decisions about hiring, renting, lending, or insuring, relative to dating? Because we see those deciders as dominating us, since they choose to accept or reject us, and because we see big firms as evil. Why don’t we similarly restrict the info firms can use to try to sell us stuff? Because we see ourselves as doing more of the choosing there, making us the dominant party.

Added 11p: Imagine that you were required by law to score all job offers on objective verifiable criteria, such as salary and location, and had to take the job that scored highest. How close would that be to slavery?
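To make that thought experiment concrete, here is a minimal toy sketch (not from the post) of such a mandated rule in Python; the criteria, weights, and offers are hypothetical illustrations, standing in for whatever “objective verifiable criteria” the law would allow:

```python
# Hypothetical offers, scored only on documented, verifiable criteria.
offers = [
    {"employer": "A", "salary": 90_000, "miles_from_home": 30},
    {"employer": "B", "salary": 86_000, "miles_from_home": 5},
    {"employer": "C", "salary": 95_000, "miles_from_home": 60},
]

def score(offer):
    # Only criteria you can document may enter the score; everything you
    # actually care about but can't document is excluded by assumption.
    return offer["salary"] - 200 * offer["miles_from_home"]

# The imagined law: you must take the top-scoring offer, whatever your
# undocumented preferences say.
chosen = max(offers, key=score)
print(chosen["employer"])
```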

Those promised FTC report quotes:
