
Comments

At NYU in 1980 1st year macro was rather old-style hydraulic Keynesianism, which I found impossible to understand. Micro was also pretty clunky, with Varian as the text. But I also had game theory and learned about the chain-store paradox when it was still a working paper. Andy Schotter told me to read Karl Menger’s book “Morality, Decision and Social Organization,” which I did to great profit. Econometrics contained no time series to speak of, if any at all. Along with Dick Langlois, Sandy Ikeda, George Selgin, and several others, I caught Machlup’s last methodology class, which helped me to learn my Schutz. I also had his class in international finance, during which he called himself an Austrian economist and advocated “introspection.” Lachmann taught a wonderful seminar whose catalog description promised a class in differential topology! Kirzner’s history of thought class got me started on Walras. Joe Ostroy taught us about “no-surplus equilibria.” I never used Joe’s work, but it made me smarter. And, of course, the colloquium taught us about the Lachmann-Kirzner debates. Indeed, we were all participants in the Lachmann-Kirzner debates. In those days, Don Lavoie was a strong voice of opposition to Lachmann’s perspective! Richard Ebeling was into Schutz and probably closer to Lachmann than Kirzner. But I think it was only later that Richard grew more “hermeneutical.”

At Auburn I finally learned some macro, first from Roger Garrison who went over Keynes’s General Theory in detail, and then from Leland Yeager who focused on Patinkin. From Yeager we learned the three keys to macroeconomic theory: 1) real cash balances, 2) real cash balances, and 3) real cash balances. The micro sequence, by Bob Ekelund, Richard Saba, and Richard Ault, taught us the sort of applied microeconomic reasoning that you really do in your professional work as a researcher. Randy Holcombe and Jim Long tried to teach me some public finance, but I was not a good student.

I know this is "off point," but we were pointed in the direction of reading a graduate student's experience in economics graduate school, which in its second posting links you to an interview with Peter Temin.

Both the grad student and Temin argue that the development of a general equilibrium "macro" approach has been heavily influenced by a "bias" for an economic model that "shows" the desirability of government doing nothing; and that some who develop these models are funded by those in society who want government to do nothing in the economy.

Furthermore, Temin ridicules those who think of a world with limited or little government intervention -- the cruelty of a world without minimum wages, union power, or redistribution of wealth. And that, he insists, shows the shallowness, the emptiness, and the harm of models constructed on the idea that government does not intervene, regulate, or redistribute.

This is dangerous stuff. And I say this as an economist, not as a classical liberal. Why? Let me suggest that such an approach can easily slide into a form of the ("old") institutionalism and German Historicism, which rejects the idea of the usefulness or the relevance of a universal theory of human decision-making and market order, but which insists on thinking in terms of historically "given" institutional realities.

And which argues that to ask (a) how an economy "works" without various government interventions, and (b) what the consequences or impacts of introducing a type of government intervention are, is to "prejudice" the case in favor of thinking about an economy from a non-interventionist perspective.

Furthermore, Temin ridicules the very notion of viewing the market order on its own, and its outcomes, as "natural." Here we see an attitude that could easily become a view that even to speak about the "spontaneous order" of human association is value-laden, and therefore suspect. (He throws out a nasty barb at Hayek as merely an economic "blogger" before the internet.)

Here we find the echo of the German Historicists who insisted that "justice" and "fairness" could not be separated from economic analysis. That to start the analysis of labor markets without minimum wages or union-power determined wages "shows" the political motivation of such an economist.

What, of course, gives Temin's argument plausibility is his (and the graduate student's) correct portrayal of much of mainstream macroeconomics as being constructed on highly unrealistic assumptions and (by "positivist"-type standards) possessing little "empirical" or predictive content. (The graduate student majored in physics as an undergraduate, and presumes that a "real" economics should be quantitative and open to empirical predictive content, just like physics.)

And here, again, we see an argument and/or debate joined without the "Austrian" alternative. That is, the alternatives appear to be either a macro general equilibrium theory with highly unrealistic assumptions for mathematical tractability or an institutionally-given historicism that at the same time wants quantitative predictive power.

The triumph of either one, or a debate between only these two alternatives involves a deep and serious loss for economics as a science.

Richard Ebeling

Roger wrote:

"From Yeager we learned the three keys to macroeconomic theory: 1) real cash balances, 2) real cash balances, and 3) real cash balances."

Can you teach that to Pete? :)

Teach which to Pete? It's best to take them one at a time.

I see the grad student hasn't learned humility. I love hearing students scorching great thinkers. Sorry he wasted his time reading Hayek.

I just finished my first year in a PhD program and can sympathize with Noah Smith's comments. We pretty much learned the standard stuff, so I'll just make a few comments on the topics I found most interesting.

First, in second-semester macro we started with a brief history of thought and motivated the standard RBC model after learning about the Lucas Critique. Progressively we built more bells and whistles into the model, i.e., adjustment costs, variable utilization, etc. Then we talked about New Keynesian models and optimal monetary policy. We concluded with a huge medium-scale quantitative model that was a Frankenstein-like mutation of the original parsimonious RBC model. Ironically, we ended where we began: a big mess of equations with which one cannot make causal inferences.

Although my perception is that the "saltwater/freshwater" divide is overblown, most Minnesota types say the large-scale models are bogus: attempts to really understand the black box are a waste of time, and making predictions with these types of models is dangerous since the researcher doesn't really understand them. The Cambridge people retort that the Minnesotites have models that are too simple. Although both strands are quite different from what one would read in a standard Austrian macro book (such as Professor Garrison's or Professor Horwitz's), a certain Minnesota economist said (to paraphrase) that modern macro starts with the premise that humans act purposefully and goes from there. Sounds like something I read in chapter one of Rothbard.

Also, I learned that the macroeconomic debate in the academy isn't hydraulic Keynesianism versus more Classical types. The academic macroeconomists I have met are humble about their models' ability to closely approximate the real world and are skeptical of government intervention and even monetary policy in practice. The caricature of modern macro as strictly representative agent, rational expectations, etc., and a math orgy is a misunderstanding of what is actually being discussed.

In econometrics we talked extensively about quasi-random experiments fleshed out through instrumental variables, difference-in-differences, and a few other methods. The bottom line I came away with is that counterfactuals are necessary to talk about causality, and the econometrician never observes the counterfactual. Therefore, we should try to approximate the counterfactual as well as possible using some microeconometric approach. One implication is that the scope of inference is pretty narrow and that regressions that attempt to answer the "big questions" (i.e., most of the early empirical literature on economic growth) can only identify correlations.
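The counterfactual logic behind difference-in-differences can be sketched in a few lines. This is a hypothetical illustration on simulated data (the effect sizes, sample size, and setup are all made up for the example): differencing the treated group's change against the control group's change strips out both the fixed group difference and the common time trend, leaving an estimate of the treatment effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

true_effect = 2.0    # the policy effect we hope to recover
group_effect = 5.0   # treated group differs in levels even absent treatment
time_effect = 1.5    # everyone trends upward over time

treated_pre  = 10 + group_effect + rng.normal(0, 1, n)
treated_post = 10 + group_effect + time_effect + true_effect + rng.normal(0, 1, n)
control_pre  = 10 + rng.normal(0, 1, n)
control_post = 10 + time_effect + rng.normal(0, 1, n)

# DiD: (change in treated) minus (change in control).
# The control group's change stands in for the unobserved counterfactual
# change the treated group would have experienced without the policy.
did = (treated_post.mean() - treated_pre.mean()) \
    - (control_post.mean() - control_pre.mean())
print(round(did, 2))
```

Note that a naive before/after comparison of the treated group alone would confound the policy with the time trend; the control group is what approximates the unobservable counterfactual.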

Micro was pretty much the same as it is everywhere. Comps are in a month and I am looking forward to some Austrian seminars this summer, or as the cynic in me calls them, "deprogramming sessions".

Rob,

You said, "The caricature of modern macro as strictly representative agent, rational expectations, etc., and a math orgy is a misunderstanding of what is actually being discussed." Could you elaborate, please?

Well I'll elaborate a little on each.

1) Math orgy: the math modern macroeconomists use is, in large part, no more sophisticated than it was thirty years ago. Prior to Kydland and Prescott and the rise of the "computational experiment," most of the modeling was done in continuous time, which requires continuous-time optimization and stochastic calculus if you do stuff with uncertainty. Most macro now is done in discrete time with dynamic programming or some kind of Lagrangian formulation. The toughest part, at least in my limited experience, is writing the correct computer code to accurately represent the model. This is not meant to imply that macro theorists are less sophisticated than they were several decades ago; quite the contrary, progress in the field requires novel insights, which usually, but not always, require a little mathematical wizardry. However, now less mathematically astute people like me can understand the basic workings of the baseline models with basic experience in calculus.
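A minimal sketch of the discrete-time dynamic programming Rob mentions: value function iteration on a toy optimal-growth model. Everything here is illustrative (the parameter values, grid, and tolerances are arbitrary choices, not from any actual course); the model is deliberately chosen so the answer is known in closed form: with log utility, Cobb-Douglas production, and full depreciation, the optimal savings rate is alpha*beta.

```python
import numpy as np

alpha, beta = 0.3, 0.95                  # illustrative parameter values
grid = np.linspace(0.05, 0.5, 200)       # discretized capital grid

# Consumption for every (k, k') pair; rule out infeasible choices
# with a large negative utility so argmax never picks them.
c = grid[:, None] ** alpha - grid[None, :]
u = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -1e10)

# Iterate the Bellman operator V(k) = max_{k'} u(k, k') + beta * V(k')
# until it reaches (approximately) its fixed point.
V = np.zeros(len(grid))
for _ in range(1000):
    V_new = np.max(u + beta * V[None, :], axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

# Recover the policy function and compare to the closed-form savings rate.
policy = grid[np.argmax(u + beta * V[None, :], axis=1)]
savings_rate = (policy / grid ** alpha).mean()   # theory says alpha * beta
print(round(savings_rate, 3))
```

The point of the exercise is the one Rob makes: the math is just a contraction mapping on a grid, and the real work is getting the code to faithfully represent the model.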

2) Representative agent: It’s true that assuming a representative agent makes life easier, and there is some theoretical justification for it. For example, in an Arrow-Debreu GE framework people trade away all idiosyncratic risk, so people are effectively identical. Of course this framework breaks down when there are informational asymmetries, transaction costs, or any other problem that the real world forces us to confront. In the first year we primarily focused on RBC-type models, which use a representative agent. The shocking thing is that a simple RBC model explains a large part of cyclical volatility in the data. I guess the cutting edge in research would include things like: labor search with heterogeneous agents and firms, adverse selection and moral hazard problems in financial markets, and New Keynesian models focusing on market imperfections that give rise to the non-neutrality of money. I'm looking at the research frontier from a distance, but this Steve Williamson syllabus should give everyone a pretty good idea of what modern researchers focus on: http://artsci.wustl.edu/~swilliam/papers/syl10.pdf

3) Rational expectations: Maybe I shouldn't have mentioned this one. I think most work still relies on RE. Two questions we have to ask, though, are what it means for expectations to be nonrational and what an alternative modeling paradigm would be. To answer the former, at a basic level RE just means that people don't make persistent mistakes in one direction. That seems pretty reasonable. On the second, I think the common belief among macro guys is that once you throw out RE the modeling process loses its discipline and becomes ad hoc. One of the last issues of the JEP had a symposium on macroeconomics after the financial crisis that talked about nonrational expectations: http://www.aeaweb.org/issue.php?journal=JEP&volume=24&issue=4
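The "no persistent one-sided mistakes" reading of RE can be made concrete with a toy simulation (the drifting series and both forecast rules are hypothetical, just to illustrate the distinction): a naive "expect last period's value" rule systematically underpredicts a drifting variable, while a forecast that uses the true drift produces errors that average out to zero.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 5000
t = np.arange(T)
x = 0.1 * t + rng.normal(0, 1, T)   # variable with a known upward drift

naive_forecast = x[:-1]             # "expect last period's value"
re_forecast = 0.1 * t[1:]           # forecast built on the true drift

# The naive rule errs in the same direction every period (~ +0.1),
# while the drift-aware forecast's errors average out near zero.
naive_err = (x[1:] - naive_forecast).mean()
re_err = (x[1:] - re_forecast).mean()
print(round(naive_err, 2), round(re_err, 2))
```

The persistent positive bias of the naive rule is exactly the kind of systematic, one-directional mistake that rational expectations rules out.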

Once again I issue the disclaimer that this is macroeconomics as I understand it. I'm still a novice with a lot left to learn.

One final thing: I'd like to thank Professor Boettke for his FEE lectures, which I listened to on my iPod. Your insights on economics and encouragement to pursue interesting questions and not do "chalkboard economics" really keep my motor going when I'm working on some irrelevant social choice or game theory problem.

I dunno, Rob, it could be that the edge of macro is no more advanced than you say, but I think ratex is a zombie. And I didn't detect anything on network topology in your answer. In the JEP symposium you refer to, Caballero points to complexity theory in general and network topology in particular as a way out of the DSGE cul-de-sac. Things are still shaking out, but I just think macro is gonna be all about network topology for a while at least. It'll be about network topology for as long as the Great Recession is the big traumatic event we're dealing with. Generals and economists are always fighting the last war.

Oh! I forgot to thank you, Rob, for that careful response to my query. Thanks!

I learned in the 1970s at Chicago how to think in a rigorous way. This was more important than the precise theories and approaches taught. I consider this the true Knight-Stigler-Friedman legacy.

I think Roger is right. Of course, complexity theory and network topology are the mathematical versions of spontaneous order theory. The math is starting to catch up with Austrian economics. Of course, it required powerful computers to do so. Those who understand Austrian economics are in a unique position to truly understand how to use agent-based modeling, strange attractors, chaos theory, bios theory, self-organization (including self-organized criticality), emergence/catastrophe theory, and network theory.


