Yeah, I'm obviously talking about *math*, not mere math. Perfect for PhD prep and signalling that you're a powerful human being, but overkill for nearly all practical applications.
But, hey, if it's a status-mongering world then play it as it lies.
And preferring to work on useful stuff is also a way to signal status.
And you know what's a really great way to signal status? To talk about how everything boils down to signalling status.
And an even better way to signal status? Talk about how status theory predicts that even if status theory is true, people will not believe in status theory because it's true, but only because it's a good way of signalling status.
I'm king of the world!
Yup. My research uses geospatial and oceanographic data. My entire research project could be replaced by a few sampling programs over 5 years, instead of having me do 2 years' worth of analysis on existing data. The difference is that my time costs maybe $60,000 (incl. overheads) over those 2 years (I'm a student), while the sampling program would cost O($100,000,000).
Obviously it's rational for me (and the O(5) others) to milk this data for all it's worth.
This is probably a niche thing that no-one outside of academia would do, but there is a practical need for some more complicated (what might get accused of "masturbatory") approaches. But it's my understanding that this is what university research departments are *for* -- solving the non-standard problems. Academia is where you go to do work that, while important, no-one else would bother with.
Typo above: I have one too many zeros; the sampling program cost should be $10,000,000.
Preferring to work on intrinsically interesting stuff instead of useful stuff is a way to signal status.
From an evolutionary perspective, low-status people can't waste their precious time on things that are merely intrinsically interesting. High-status people, who don't have to spend their time on practical stuff, can afford to spend it on intrinsically interesting things.
No, you need the math (or at least some kinds of math) even if you want to do applied work. Real life seldom throws you problems that you can solve simply by pressing a button in Stata; you have to know what lies behind the buttons both to decide which buttons to press and to write up your own models and algorithms when no button would work.
(Nothing in the above should be construed as an endorsement of the typical curriculum of undergraduate and graduate economics education; that would require a separate lengthy discussion. But my personal belief is that the total weight of math in the ideal curriculum would be no lower than it is now.)
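To make "no button would work" concrete, here is a minimal sketch -- in Python rather than Stata, purely as an illustration of my own, with simulated data and invented names -- of estimating a model by writing down its log-likelihood and maximizing it numerically, then checking the hand-rolled answer against a packaged routine:

```python
# A sketch of "writing your own model when no button works": hand-coded maximum
# likelihood for a simple count model, checked against a packaged routine.
# Everything here (data, names) is simulated and illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])          # design matrix with an intercept
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))        # Poisson counts with a log link

def neg_loglik(beta):
    """Negative Poisson log-likelihood with lambda_i = exp(x_i' beta)."""
    eta = X @ beta
    return -np.sum(y * eta - np.exp(eta) - gammaln(y + 1))

# "Your own algorithm": minimize the negative log-likelihood numerically.
res = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print("hand-rolled MLE:", res.x)

# "The button": the packaged GLM recovers essentially the same coefficients.
print("packaged GLM:  ", sm.GLM(y, X, family=sm.families.Poisson()).fit().params)
```

The point is not the Poisson model itself (which does have a button) but that the same recipe -- write down the likelihood, hand it to an optimizer -- keeps working when the model is nonstandard and no button exists.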
I agree with Peter Gardes. I would push his argument even further: fads may also be generated by intrinsic interestingness independently of any status seeking (though I am not denying the importance of the latter here). If you hear a lot of colleagues talking about a given class of problems, you may simply get more curious about that class; and, when you start digging, almost any area of research will present interesting tidbits. I guess it is a kind of rational herding.
Might more complex methods be appropriate where data are more expensive to collect, time horizons are longer, testing is more difficult, etc.?
Right, you need the math if you want to play with the big boys. And the fun train never stops if you signed up because you wanted to study math and statistics. Economics is what you learn on your own time.
I'm happy, but my brain was never poisoned by the thought that I might learn something useful.
Don't neglect the issue of intrinsic interestingness. Things which require novel, thoughtful solutions can provide pleasure that more mundane work does not. This, frankly, is what drives a great deal of mathematics, though fads do play a part.
Econ graduate school is certainly math heavy, but this background knowledge is necessary for understanding and adding to the frontier of economics research. For example, everyone knows how to open Stata and run an OLS regression with robust standard errors. However, to develop a new statistical test or to create a good estimator of some new phenomenon, you need to understand the nuts and bolts of what is going on in OLS, like different kinds of probabilistic convergence and a whole slew of theorems relating to them. This takes a fair bit of real analysis and probability theory.
Having a loose understanding of some important bits of economic theory (say the Lucas critique, or the equity premium puzzle, or whatever) is good up to a point, but if you want to add something useful to the discussion, you have to understand the nitty-gritty mathematical details.
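As a concrete illustration of what sits behind that one button -- a minimal sketch in Python rather than Stata, with simulated data and names of my own choosing -- here is the Eicker-Huber-White sandwich estimator for heteroskedasticity-robust (HC1) standard errors, computed by hand and checked against statsmodels:

```python
# What the "robust standard errors" button computes: the Eicker-Huber-White
# sandwich estimator (HC1 flavor), by hand and via statsmodels.
# Simulated data; all names are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
u = rng.normal(scale=1.0 + np.abs(x))          # heteroskedastic errors
y = 1.0 + 2.0 * x + u

X = sm.add_constant(x)                          # n x k design matrix
k = X.shape[1]

# OLS coefficients via the normal equations: beta_hat = (X'X)^{-1} X'y
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat

# Sandwich: (X'X)^{-1} [X' diag(u_hat_i^2) X] (X'X)^{-1}, with HC1 scaling n/(n-k)
meat = X.T @ (resid[:, None] ** 2 * X)
cov_hc1 = (n / (n - k)) * XtX_inv @ meat @ XtX_inv
print("by hand:", np.sqrt(np.diag(cov_hc1)))

# The one-button version; should agree up to floating-point noise.
print("button: ", sm.OLS(y, X).fit(cov_type="HC1").bse)
```

Knowing why that diag(u_hat^2) "meat" gives valid inference in large samples is exactly where the convergence theorems and the real analysis come in.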
I read Andrew's point as this: the difficulty of doing useful work is a utility cost that often outweighs the utility gains of the status associated with such work. Differences in the marginal status returns to work may not be predictive of behavior, because those marginal returns and the marginal costs of doing the work are so highly correlated that net motivation is only weakly correlated with status outcomes.

He suggests that academics like hot fields -- those where the marginal return to marginal cost is high -- which attract many investigators, as one would expect in an efficient market that has just discovered an intellectual arbitrage to exploit before it disappears. This does not contradict the claim that high-status/low-status is not a predictive variable. It can support the claim that status per unit of work is the predictive variable, if those assumptions about what makes fields hot hold. Aggregation behavior is as indicative of a feeding frenzy as of self-reinforcing status games.

Finally, graph-only papers seem like an absurd edge case and not especially useful in proving your point. First, you have not made the case for why all-graph papers should be written more than they are, especially given Andrew's description of how much harder it is to make graphs than to write text. Second, I suspect that good all-graph papers can usually be improved by adding other analysis, and I would expect reviewers to recommend revision for papers that could easily be improved by adding more analysis even if they thought the paper were already publishable.

Without refinement, the status theory does not make predictions as clearly as you suggest in this post.
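A rough way to formalize that reading (my notation, not Andrew's or the original post's):

```latex
% Illustrative notation only: f indexes fields, S(f) is the status payoff of
% working in f, C(f) is the cost of doing the work, and net motivation is
\[
  M(f) \;=\; S(f) - C(f).
\]
% Since
\[
  \operatorname{Var}(M) \;=\; \operatorname{Var}(S) + \operatorname{Var}(C) - 2\operatorname{Cov}(S, C),
\]
% M varies little across fields when S and C are highly correlated and of
% comparable scale, so status S on its own predicts behavior weakly.
% "Hot" fields are then those where the ratio S(f)/C(f) is unusually high.
```

On that reading, piling into hot fields looks like arbitrage on a temporarily favorable S/C ratio rather than evidence that status by itself drives behavior.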
My guess is that the "impressiveness" and "status" effects are real, but they are unintended artifacts of the journal-publishing process.
Editors of top stats journals would like to publish articles that will be heavily cited and boost their journal's impact factor. Using impressive, esoteric techniques is a credible signal that the paper is more likely to be highly cited, because: (1) some esoteric techniques turn out to be actually useful, even though most are not, and (2) esoteric techniques are related to effort, and expending lots of effort on a single paper signals that the author's private information about that particular paper is very good.
Note that the most celebrated papers tend to be cited across disciplines and/or reported in the science news. It would be hard to argue that the authors of these papers are simply trying to impress their fellow academics.
I'm not sure about the graphs vs. math issue, but since information visualization is a well-established field, I find Seth's claims about the "low" status of graphics to be unconvincing.
The goal of an academic journal is to publish papers that incrementally advance knowledge, and the goal of peer review is to make sure that the paper actually includes an incremental advance -- it isn't enough for the author to claim so.
This naturally leads to an 'auditability bias': papers that cannot definitively demonstrate their incremental advance, as interesting as they may be, generally fall to lower-tier journals.
I'm not sure whether this is a bug or a feature, but I find it hard to believe that any journal editor or reviewer has ever said "this is an interesting paper, but the topic is just too darn useful to publish." Instead, it turns out that useful papers are often not easily audited.
As one of my mentors said long ago, "if someone hasn't written a paper on an obviously important issue yet, either you are the first to think of it or it is really hard to do well." A key part of an academic researcher's growth is to realize, humbly, that there are probably a lot of smart people who tried to address obviously important issues before, and failed.
A secondary observation: an incremental contribution to knowledge is typically not all that useful to present day problems -- if it were, someone in the private sector would be doing the work and profiting from it. Why would that be the best use of researchers employed by non-profit organizations and governments?
Yep. Math, math, stats, and math with a bit of economics sprinkled on top.
I don't know anything, but I had heard that graduate schools of economics are basically applied mathematics departments. They don't actually study economics... instead they just try to work out the most 'impressive' mathematical formulas for supply and demand calculations that they can handle.