Years ago I was surprised to learn that patients usually can’t pick docs based on track records of previous patient outcomes. The reason given is that this would invade privacy and create bad incentives for docs when picking patients. People suggest instead relying on personal impressions, wait times, “bedside” manner, and the prestige of a doc’s med school or hospital. (Yeah, those couldn’t possibly create bad incentives.) Few ever study whether such cues correlate with patient outcomes, and we actively prevent the collection of patient satisfaction track records.
For lawyers, most trials are in the public record, so privacy shouldn’t be an obstacle to getting track records. So people pick lawyers based on track records, right? Actually, no. People who ask are repeatedly told that, practically speaking, you can’t get lawyer track records, so you should just pick lawyers based on personal impressions or the prestige of their law firm or school. (Few study whether those correlate with client outcomes.)
A new firm, Premonition, has been trying to change that:
Despite being public record, court data is surprisingly inaccessible in bulk, nor is there a unified system to access it, outside of the Federal Courts. Clerks of courts refused Premonition requests for case data. Resolved to go about it the hard way, Unwin … wrote a web crawler to mine courthouse web sites for the data, read it, then analyze it in a database. …
Many publications run “Top Lawyer” lists, people who are recognized by their peers as being “the best”. Premonition analyzed the win rates of these attorneys, it turned out most were average. The only way that they stood out was a disproportionate number of appealed and re-opened cases, i.e. they were good at dragging out litigation. They discovered that even the law firms themselves were poor at picking litigators. In a study of the United Kingdom Court of Appeals, it found a slight negative correlation of -0.1 between win rates and re-hiring rates, i.e. a barrister 20% better than their peers was actually 2% less likely to be re-hired! … Premonition was formed in March 2014 and expected to find a fertile market for their services amongst the big law firms. They found little appetite and much opposition. …
The system found an attorney with 22 straight wins before the judge – the next person down was 7. A bit of checking revealed the lawyer was actually a criminal defense specialist who operated out of a strip mall. … The firm claims such outliers are far from rare. Their web site … shows an example of an attorney with 32 straight wins before a judge in Orange County, Florida. (more)
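To make the kind of tabulation described in that quote concrete, here is a minimal sketch, assuming scraped case records with hypothetical attorney, judge, date, and outcome fields (not Premonition’s actual schema), of how one might compute per-attorney win rates and win streaks before a given judge:

```python
# Minimal sketch: given scraped case records, compute each attorney's win rate
# and longest win streak before a given judge. Field names are hypothetical.
from collections import defaultdict

def attorney_stats(cases):
    """cases: iterable of dicts like
    {"attorney": str, "judge": str, "date": "YYYY-MM-DD", "won": bool}"""
    by_pair = defaultdict(list)
    for c in cases:
        by_pair[(c["attorney"], c["judge"])].append(c)

    stats = {}
    for (attorney, judge), recs in by_pair.items():
        recs.sort(key=lambda c: c["date"])   # chronological order
        wins = sum(c["won"] for c in recs)
        streak = best = 0
        for c in recs:                       # longest run of consecutive wins
            streak = streak + 1 if c["won"] else 0
            best = max(best, streak)
        stats[(attorney, judge)] = {
            "cases": len(recs),
            "win_rate": wins / len(recs),
            "longest_win_streak": best,
        }
    return stats
```

Sorting such attorney–judge pairs by longest streak or win rate is how outliers like the strip-mall defense specialist with 22 straight wins would surface.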
As a society we supposedly coordinate in many ways to make medicine and law more effective, such as via funding med research, licensing professionals, and publishing legal precedents. Yet we don’t bother to coordinate to create track records for docs or lawyers, and in fact our public representatives tend to actively block such things. And strikingly: customers don’t much care. A politician who proposed to dump professional licensing would face outrage, and lose. A politician who proposed to post public track records would instead lose by being too boring.
On reflection, these examples are part of a larger pattern. For example, I’ve mentioned before that a media firm had a project to collect track records of media pundits, but then abandoned the project once it realized that this would reduce reader demand for pundits. Readers are instead told to pick pundits based on their wit, fame, and publication prestige. If readers really wanted pundit track records, some publication would offer them, but readers don’t much care.
Attempts to publish track records of school teachers based on student outcomes have produced mostly opposition. Parents are instead encouraged to rely on personal impressions and the prestige of where the person teaches or went to school. No one even considers doing this for college teachers; at most we survey student satisfaction right after a class ends (and don’t even do that right).
Regarding student evaluations, we coordinate greatly to make standard widely accessible tests for deciding who to admit to schools. But we have almost no such measures of students when they leave school for work. Instead of showing employers a standard measure of what students have learned, we tell employers to rely on personal impressions and the prestige of the school from which the student came. Some have suggested making standard what-I-learned tests, but few are interested, including employers.
For researchers like myself, publications and job position are measures of endorsements by prestigious authorities. Citations are a better measure of the long-term impact of research on intellectual progress, but citations get much less attention in evaluations of researchers. Academics don’t put their citation count on their vita (= resume), and when a reporter decides which researcher to call, or a department decides who to hire, they don’t look much at citations. (Yes, I look better by citations than by publications or jobs, and my prestige is based more on the latter.)
Related is the phenomenon of people being more interested in others said to have the potential to achieve X, than in people who have actually achieved X. Related also is the phenomenon of firms being reluctant to use formulaic measures of employee performance that aren’t mediated mostly by subjective boss evaluations.
It seems to me that there are striking common patterns here, and I have in mind a common explanation for them. But I’ll wait to explain that in my next post. Till then, how do you explain these patterns? And what other data do we have on how we treat track records elsewhere?
Added 22Mar: Real estate sales are also technically in the public record, and yet it is hard for customers to collect comparable sales track records for real estate agents, and few seem to care enough to ask for them.