From an analytical point of view, economic crises offer plenty of learning opportunities. Where others see chaos and sheer downfall, economists observe a complex system adapting to radical shifts in equilibrium values (Austrians would say recalculations), creating large-scale arbitrage opportunities, and thereby motivating a major restructuring of the (global) economy. Adam Smith called this dynamic process the 'invisible hand'. Just think of Neo in The Matrix (the first film). At the moment of greatest danger, he finally manages to see through the matrix, to get to the codes and routines. Understanding as an act of liberation. How wonderful! That, too, is the economist's bliss: to make the invisible hand analytically visible, to 'see through'. Unfortunately, we are not there yet. Partly because manpower is invested elsewhere, partly because of the limitations of our tool sets, we still do not know why the economic machinery is convergent, given all the badly behaved real-world properties (non-convexities, non-linearities, frictions and transaction costs, computational and cognitive limitations, sunspots, etc.).
However, real-world economies are in general remarkably stable. Gross substitutability of aggregate excess demand functions is still an ad hoc assumption (see the Sonnenschein-Mantel-Debreu results), but in real economies the price system and the institutional setting do seem to temper cumulative forces by allowing negative feedback effects to dominate. This indicates that the scarcity nexus is much more complex than our Walrasian models suggest (indispensable as they are) and that the trading process matters far more than we usually presume. In the long run, I am optimistic that economics will produce interesting new results. The visualization of the invisible hand is, however, still work in progress.
Anyway, most economists at least implicitly assume that the price system does decentralize the global allocation problem. In this light, it is somewhat surprising to see them convinced that the Great Recession was caused by (financial) overinvestment and its feedback on the banking system. Overinvestment means that economies grow too fast relative to primitives. Malinvestment implies that they also grow in the wrong directions. In general, so the profession seems to understand the crisis, financial markets did a very, very poor job in allocating funds to productive and consumptive activities. Ex post, the realized investment plans seem inconsistent with real data, and massively so. Did the price system quit its job? The widespread belief that financial operations lead to clusters of error is surprising for at least three reasons:
1. Up to the crisis, the majority of economists believed that markets do not accumulate resources against or in conflict with preferences on a scale that could explain large recessions or even depressions. The Austrian recalculation story of deep crises had been discredited ever since the Austrians themselves acknowledged 'secondary deflation'. It turns out that our limited understanding of disequilibrium dynamics leaves us open to whims and passions. It is unfortunate to see how close the profession's interpretation of the event is to the public perception. This is a bad sign for a counterintuitive science, for members of the Dismal Science - formerly proud to be so. I do not deny the obvious: financial markets went down; wealth was burned. As always when such crises occur, the public blames 'markets' in general and speculators in particular. Greed, stupidity, ideology: these are believed to have shaken the economy. In contrast, economics has developed an understanding of anonymous processes. The cognitive and informational requirements for individuals are relatively low. Somehow, the price system seems able to decentralize allocation. Whatever the individual's specific forecast and interpretative framework, the market process systematically drives out bad decision makers by eliminating their control over resources. For the recalculation story to be true, something must have impaired the signaling function of prices and the anonymous learning routines. What is it? Credit default swaps? Off-balance-sheet vehicles? If so, why? Because they induce too much leverage? But high relative to what? May I remind you that to approximate answers to such questions you would need God's information set: all preference relations, the global book of blueprints, all endowments in every detail? For all times? For all states? And what happened to the staple of tests, over periods with and without tails, showing that markets are informationally efficient?
That is, markets may be wrong but still provide the most informed forecast. Can regulators or economists ever know better? By what standards, then, do we rule out short selling, financial products, or organisational solutions? By macroprudential surveys? Can we hope for more than to inform and thereby improve the market forecast? Do we even know Pareto moves that reduce systemic risk? Consider that systemic risk also depends on regulatory settings and their higher-order time derivatives (regime uncertainty).
2. The second surprise is closely related. Since the beginning of neoclassical theory, it has been understood that explanations relating crises to the 'psychology of financial markets' are recipes for intellectual escape. 'Why the crisis?' 'Markets went crazy.' Aha. True, standard theory excludes social interaction other than through prices. We are indeed blind to many facets of frictions that may or may not lead to systematic clusters of error. Alan Kirman is strong on this (and in general). Yet these interactions are not external to the scarcity nexus but part of it. Any such analysis has to take into account that trading processes (including, of course, financial ones) are potentially convergent as long as arbitrage opportunities are open and reliably signalled by money price relations and calculation methods. The equilibrating force comes from the heart of neoclassical theory: the axiomatic expression of relative scarcities and of choice in terms of opportunity costs. In post-Marshallian theory, supply curves are inverted demand curves and, given convex preferences, increasing due to the 'strength of excluded demand'. Locally increasing sectoral costs indicate increases in the relative scarcity of alternative consumption (alternative, since the necessary resources are limited (persistent in supply) and have various uses). Thus, even if agents receive new information indicating large-scale malinvestments, by the same information set they become aware of massive arbitrage opportunities. Keynes was sceptical that such financial adjustments keep the real economy on track. This is what his saving paradox is about. The saving paradox, however, is inconsistent with our Walrasian perspective. New Keynesian models went beyond old IS-LM. Thus, if financial markets are the villains of our plot, we have to find defects in the anonymous operation of the updating routines indicated above.
We have to show that informational efficiency may fail (as Grossman and Stiglitz indeed did show, but for Arrow-Debreu-like economies, which lack the institutional structure that may produce sufficient informational efficiency; see the Coase theorem). Or we have to show how informationally efficient markets fail to converge to allocatively efficient limits. This happens when prices reveal all available information, but the information set is not fine enough. Yet I don't see a surge in mechanism design. If we apply 'in dubio pro reo', regulation as well as bail-outs should be very limited.
3. Finally, there is a surprising neglect of what economists since classical orthodoxy have identified as the major source of deep macroeconomic downturns: nominal aggregate demand failures given nominal rigidities. Already in the debates surrounding Say's Law it was understood that, in contrast to general gluts (the case where markets suddenly realize that all goods are in excess supply), a case regarded as logically incoherent, partial crises can indeed end in falling aggregate supply (and thus real demand) because of wage and price rigidities. Positive feedback effects then dominate negative feedback effects. J. S. Mill clearly saw the problems with decreasing velocity. In classical theory, furthermore, the quantity equation was used to determine the price level by the relation between nominal aggregate demand and real aggregate output! Since Friedman we use it to estimate velocity. In classical theory the quantity theory is a short-run theory of the price level (the long-run value was given by the commodity value of gold). Only in neoclassical theory did the quantity equation become a tool to interpret long-run trends. Yet in Wicksell's and Fisher's outlines as well, the price level is determined by the relation between aggregate nominal demand and real supply. They simply understood that nominal AD is not determined by the monetary base. Money substitutes came into play. Fisher wrote MV + M'V' = PY, the second term on the left-hand side indicating the components of effective nominal demand not expressed in cash transactions. A general practice of the time was to treat changes in velocity as an approximation of changes in aggregate demand not related to changes in base money (HT Sumner). Wicksell eliminated money altogether to show that commercial banks add indeterminacy to the price level by leaving the level of aggregate nominal demand open to possibly radical shifts in expectations.
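Fisher's point can be made concrete with a toy calculation. The sketch below (all numbers are hypothetical, chosen only for illustration) treats the price level as the residual of MV + M'V' = PY and shows how a collapse in the velocity of money substitutes (V') depresses nominal demand and the short-run price level even when the monetary base is untouched:

```python
# Illustrative sketch (hypothetical numbers): Fisher's equation of exchange
# with money substitutes, M*V + M'*V' = P*Y. Nominal aggregate demand is
# cash spending plus deposit/credit spending; given real output Y, the
# short-run price level follows residually.

def price_level(M, V, M_prime, V_prime, Y):
    """Short-run price level implied by MV + M'V' = PY."""
    nominal_demand = M * V + M_prime * V_prime
    return nominal_demand / Y

# Baseline: base money M=100 at velocity 5, deposits M'=400 at velocity 2,
# real output Y=700 -> nominal demand 1300.
p0 = price_level(100, 5, 400, 2, 700)

# A crisis of confidence halves deposit velocity (V' falls from 2 to 1)
# while the monetary base is unchanged: nominal demand drops to 900,
# and with Y fixed the price level must fall.
p1 = price_level(100, 5, 400, 1, 700)

print(round(p0, 3), round(p1, 3))
```

Nothing here depends on the particular magnitudes; the point is only that the M'V' term lets nominal demand drift without any change in base money, which is exactly the indeterminacy Wicksell stressed.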
Today, we know that nominal AD is a forward-looking concept and we know that central banks can control it by controlling market forecasts. Except in the short run, monetary authorities control nominal values only. If the level of nominal demand drifts, the entire blame is on the authority. Markets cannot determine nominal values. Note that in New Keynesian models short-run real demand is determined by central banks setting the real rate (given nominal rigidities described by the forward-looking Phillips curve). An old Keynesian storyline imposed on the New Keynesian model suggests that demand management stabilizes the economy. But in modern theory, the management of real demand is just part of the story. In rational expectations models the near present loses dominance. The central problem is to determine nominal expectations over time by defining a reliable target. The target is always feasible because central banks rule over the nominal world. The determination of nominal demand is expressed by the central bank's reaction function. Inflation targeting is the chosen strategy. Given neutral real growth, future nominal interest rates and a more-than-proportional reaction to shocks are assumed to determine inflation locally. And here is the surprise: even though we have known since classical theory that - given nominal rigidities - level drifts in nominal demand are root causes of severe crises, and even though we have known since the rational expectations revolution that such level drifts can be bounded by an appropriate target, we chose an inappropriate one: we target the rate of nominal growth (inflation targeting) instead of the level path (level targeting). Our economies are thus left vulnerable to whatever 'market psychology' (to sunspots). As Sumner points out, the Real Problem was Nominal!
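The difference between rate targeting and level targeting can be illustrated with a minimal simulation. The sketch below is a deliberately stylized toy, not a policy model: the target grows 2% per period, a one-time 5% shortfall hits nominal demand, and the "level-targeting" bank is assumed to restore the announced path instantly in the next period, while the "rate-targeting" bank restores only the growth rate:

```python
# Hedged sketch of the rate-vs-level point: a one-time 5% shortfall in
# nominal demand under (i) growth-rate targeting, which forgives level
# drift, and (ii) level-path targeting, which makes it up. The numbers
# and the instant-catch-up assumption are illustrative only.

GROWTH = 0.02  # targeted per-period nominal growth rate

def simulate(periods, shock_at, shock, level_targeting):
    target = 100.0   # the announced level path
    actual = 100.0   # realized nominal demand
    path = []
    for t in range(periods):
        target *= 1 + GROWTH
        if level_targeting:
            actual = target           # bank restores the announced path
        else:
            actual *= 1 + GROWTH      # bank restores only the growth rate
        if t == shock_at:
            actual *= 1 + shock       # demand shock hits the realized level
        path.append(actual / target)  # ratio of actual to target path
    return path

rate_path = simulate(10, shock_at=3, shock=-0.05, level_targeting=False)
level_path = simulate(10, shock_at=3, shock=-0.05, level_targeting=True)

# Under rate targeting the 5% gap never closes (bygones are bygones);
# under level targeting it lasts one period and is then made up.
print(round(rate_path[-1], 3), round(level_path[-1], 3))
```

The mechanism, not the magnitudes, is the point: a growth-rate target treats bygones as bygones, so every level shock becomes permanent, whereas a level-path target anchors expectations of where nominal demand will be, bounding exactly the drift the paragraph above identifies as the root cause of severe crises.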