
Capitalism: with friends like these, you don’t need enemies (part 1)

Steve Keen

This is a chapter from Prof Steve Keen’s book Rebuilding Economics from the Top Down, scheduled to be published in 2024 by the Budapest Centre for Long-Term Sustainability.

Though I have been interested in ecological economics ever since I read The Limits to Growth (Meadows, Randers, and Meadows 1972), E.F. Schumacher (Schumacher 1973, 1979) and Herman Daly (Daly 1974) in the early 1970s, and I have been a critic of neoclassical economics for just as long, I did not start critiquing the neoclassical approach to climate change until 2019. Though I expected that approach to be bad, I felt that I could not critique it until I had made a positive contribution to ecological economics myself.

That contribution came while I was working with Bob Ayres, the brilliant pioneer of energy economics: the aphorism “labour without energy is a corpse, while capital without energy is sculpture” occurred to me, and it enabled me to work out how to bring energy into mathematical models of production in a fundamental way, using the concepts explained in the last chapter.

“Milton Friedman, proud father of global misery” by Steve Rhodes is licensed under CC BY-NC-ND 2.0 DEED

In June 2019, after our paper “A Note on the Role of Energy in Production” (Keen, Ayres and Standish 2019, p41) had been published — and after William Nordhaus had been awarded the “Nobel” Prize in Economics in 2018 for his work on the economics of climate change — I sat down to read the neoclassical literature, commencing with Richard Tol’s overview paper “The Economic Effects of Climate Change” (Tol 2009).

Minutes later, I read the sentences quoted below, and I was both horrified and very regretful of my decision to delay taking this area on:

An alternative approach, exemplified in Mendelsohn’s work, can be called the statistical approach. It is based on direct estimates of the welfare impacts, using observed variations (across space within a single country) in prices and expenditures to discern the effect of climate. Mendelsohn assumes that the observed variation of economic activity with climate over space holds over time as well; and uses climate models to estimate the future effect of climate change. (Tol 2009, p32, emphasis added)

This assumption was patently absurd, as I explain in the next chapter, and yet it had been published in a “top five” economics journal (Bornmann, Butz and Wohlrabe 2018; Mixon and Upadhyaya 2022). Worse, as I read the literature in detail, I discovered that, although many other aspects of the neoclassical economics of climate change had been criticised by other economists (Kaufmann 1997, 1998; Darwin 1999; Quiggin and Horowitz 1999; DeCanio 2003; Schlenker, Hanemann and Fisher 2005; Ackerman and Stanton 2008; Stanton, Ackerman and Kartha 2009; Ackerman, Stanton and Bueno 2010; Weitzman 2011b, 2011a; Ackerman and Munitz 2012; Pindyck 2013, 2017), no-one had criticised it for what I saw as its most obvious flaw: the simply ludicrous “data” to which neoclassical models of climate change had been fitted.

The empirical assumptions that economists specialising in climate change have made are, to be frank, stupid: so much so that, even if their mathematical models perfectly captured the actual structure of the global economy (which of course they do not), their forecasts of economic damages from climate change would still be ludicrously low. They are also so obviously wrong that the mystery is why these assumptions were ever published. Therefore, before I discuss their work on climate change, I have to take a diversion into the topics of scientific and economic methodology.

“Simplifying Assumptions”, Milton Friedman, and the “F-twist”
As I noted in Chapter 5, every survey that has been done of the cost structure of real-world firms has returned a result that contradicts neoclassical economic theory. Rather than firms facing rising marginal cost because of diminishing marginal productivity, the vast majority of real-world firms operate with substantial excess capacity, and therefore experience constant or even rising marginal productivity as production increases. This means that marginal cost either remains constant or falls with output, rather than rising (which is what mainstream economic theory assumes).

In the 1930s, 1940s and early 1950s, a large number of papers were published reporting on these results in the leading journals of the discipline, such as Oxford Economic Papers (Hall and Hitch 1939; Tucker 1940; Andrews 1941, 1949, 1950; Andrews and Brunner 1950), the Quarterly Journal of Economics (Eiteman 1945), and the American Economic Review (Means 1936; Tucker 1937; Garver et al 1938; Tucker 1938; Lester 1946; Oliver 1947; Eiteman 1947; Lester 1947; Eiteman 1948; Eiteman and Guthrie 1952; Eiteman 1953). These papers made the point that, since marginal cost is either constant or falling, the mainstream profit-maximisation rule, of equating marginal revenue to marginal cost, must be wrong.

Writing in the American Economic Review in 1947, Eiteman put it this way: an engineer designs a factory
so as to cause the variable factor to be used most efficiently when the plant is operated close to capacity. Under such conditions an average variable cost curve declines steadily until the point of capacity output is reached. A marginal curve derived from such an average cost curve lies below the average curve at all scales of operation short of peak production, a fact that makes it physically impossible for an enterprise to determine a scale of operations by equating the marginal cost and marginal revenues unless demand is extremely inelastic. (Eiteman 1947, p913; emphasis added)
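Eiteman’s geometric point can be checked numerically. The sketch below is purely illustrative (the linear cost schedule and its parameters are my assumptions, not drawn from the surveys he cites): if average variable cost declines steadily up to capacity, the implied marginal cost lies below it at every scale of operation short of capacity, just as he argues.

```python
# Illustrative only: a linearly declining average variable cost (AVC)
# schedule, AVC(q) = a - b*q up to capacity, standing in for the cost
# structure Eiteman describes qualitatively.
a, b, capacity = 10.0, 0.02, 400  # hypothetical parameters

def avc(q):
    """Average variable cost at output q (declines toward capacity)."""
    return a - b * q

def tvc(q):
    """Total variable cost = q * AVC(q)."""
    return q * avc(q)

def mc(q, dq=1e-6):
    """Marginal cost: numerical derivative of total variable cost."""
    return (tvc(q + dq) - tvc(q)) / dq

# With declining AVC, marginal cost sits below average variable cost at
# every positive output short of capacity, so equating marginal cost to
# marginal revenue cannot pick out an interior optimum unless demand is
# extremely inelastic.
for q in range(1, capacity):
    assert mc(q) < avc(q)
```

Analytically the same point is immediate: with AVC(q) = a − bq, total variable cost is aq − bq², so marginal cost is a − 2bq, which is below a − bq for all q > 0.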

One might have expected that economists would have reacted to this empirical discovery by realising that economic theory had to change. But instead, in “The Methodology of Positive Economics” (Friedman 1953), Milton Friedman argued that economists should ignore these papers, and criticism of economics for being unrealistic in general, on the basis that the more significant a theory was, the more “unrealistic” its assumptions would be – an argument that Samuelson dubbed “The F-twist” (Archibald, Simon, and Samuelson 1963; Wong 1973):

In so far as a theory can be said to have “assumptions” at all, and in so far as their “realism” can be judged independently of the validity of predictions, the relation between the significance of a theory and the “realism” of its “assumptions” is almost the opposite of that suggested by the view under criticism. Truly important and significant hypotheses will be found to have “assumptions” that are wildly inaccurate descriptive representations of reality, and, in general, the more significant the theory, the more unrealistic the assumptions (in this sense). The reason is simple. A hypothesis is important if it “explains” much by little, that is, if it abstracts the common and crucial elements from the mass of complex and detailed circumstances surrounding the phenomena to be explained and permits valid predictions on the basis of them alone. To be important, therefore, a hypothesis must be descriptively false in its assumptions… (Friedman 1953, p14; emphasis added)

He followed up with an attack on the significance of the papers which pointed out that marginal cost does not rise with output (as well as an attack on the model of imperfect competition):

The theory of monopolistic and imperfect competition is one example of the neglect in economic theory of these propositions. The development of this analysis was explicitly motivated, and its wide acceptance and approval largely explained, by the belief that the assumptions of “perfect competition” or “perfect monopoly” said to underlie neoclassical economic theory are a false image of reality. And this belief was itself based almost entirely on the directly perceived descriptive inaccuracy of the assumptions rather than on any recognized contradiction of predictions derived from neoclassical economic theory. The lengthy discussion on marginal analysis in the American Economic Review some years ago is an even clearer, though much less important, example. The articles on both sides of the controversy largely neglect what seems to me clearly the main issue – the conformity to experience of the implications of the marginal analysis – and concentrate on the largely irrelevant question whether businessmen do or do not in fact reach their decisions by consulting schedules, or curves, or multivariable functions showing marginal cost and marginal revenue. (Friedman 1953, p15; emphasis added)

Friedman ridiculed the survey methods behind this research:
The abstract methodological issues we have been discussing have a direct bearing on the perennial criticism of “orthodox” economic theory as being “unrealistic”… A particularly clear example is furnished by the recent criticisms of the maximization-of-returns hypothesis on the grounds that businessmen do not – indeed cannot – behave as the neoclassical theory “assumes” they do. The evidence cited in support of this assertion is generally taken either from the answers given by businessmen to questions about the factors affecting their decisions – a procedure for testing economic theories that is about on a par with testing theories of longevity by asking octogenarians how they would account for their long life – or from descriptive studies of the decision-making activities of individual firms. Little if any evidence is ever cited on the conformity of businessmen’s actual market behaviour – what they do rather than what they say they do – with the implications of the hypothesis being criticized, on the one hand, and of an alternative hypothesis, on the other. (Friedman 1953, pp. 30-31; emphasis added)

And he also ridiculed the search for more realism in general:
A theory or its “assumptions” cannot possibly be thoroughly “realistic” in the immediate descriptive sense so often assigned to this term. A completely “realistic” theory of the wheat market would have to include not only the conditions directly underlying the supply and demand for wheat but also the kind of coins or credit instruments used to make exchanges; the personal characteristics of wheat-traders such as the colour of each trader’s hair and eyes, his antecedents and education, the number of members of his family, their characteristics, antecedents, and education, etc.; the soil type on which the wheat was grown, its physical and chemical characteristics, the weather prevailing during the growing season; the personal characteristics of the farmers growing the wheat and of the consumers who will ultimately use it; and so on indefinitely. Any attempt to move very far in achieving this kind of “realism” is certain to render a theory utterly useless. (Friedman 1953, p32)

Friedman’s paper merely codified the standard retort that economists have always made when their assumptions are challenged, but since its appearance, it has been cited as the authority whenever such challenges arise. His paper also had a definite, if perverse, effect on the development of neoclassical theory: though he cautioned in a footnote that “The converse of the proposition does not of course hold: assumptions that are unrealistic (in this sense) do not guarantee a significant theory”, his claim led to something like an arms race amongst economists to make the most unrealistic assumptions possible.

Domain Assumptions, Paradigms, and Scientific Revolutions
In “‘Unreal Assumptions’ in Economic Theory: The F‐Twist Untwisted” (Musgrave 1981), the philosopher Alan Musgrave explained that Friedman’s dictum was true of “simplifying assumptions”, but utterly false when applied to what he called “domain assumptions”.

A simplifying assumption is a decision to omit some aspect of the real world which, if included, would make your model vastly more complicated while changing your results only very slightly. The items Friedman lists in his example of a “completely realistic theory of the wheat market” are deliberately absurd instances of this: an economic model including “the colour of each trader’s hair and eyes” would be vastly more complicated, yet hair colour obviously has no effect on the model’s predictive power, so why would anyone bother creating such a model?

A more realistic example of a simplifying assumption is Galileo’s apocryphal demonstration that objects of different weight fall at the same speed, by dropping lead balls from the Leaning Tower of Pisa. Such an experiment “assumes” that the balls are being dropped in a vacuum. Taking air resistance into account would make the analysis vastly more complicated, but the result would be much the same: given the height of the Leaning Tower of Pisa and the density of lead, the simplifying assumption that air resistance can be ignored is reasonable.
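That claim about Galileo’s simplifying assumption can be checked with a rough back-of-the-envelope simulation. Every number below (tower height, ball size, drag coefficient) is my own illustrative assumption, not something from the text; the point is only that, for a dense lead ball over roughly 55 metres, adding air resistance changes the fall time by well under one percent.

```python
import math

# Illustrative assumptions: a lead ball dropped from roughly the height
# of the Leaning Tower of Pisa, with standard sea-level air drag.
g = 9.81          # gravitational acceleration, m/s^2
h = 55.0          # approximate drop height, m (assumed)
r = 0.05          # ball radius, m (a 10 cm diameter ball, assumed)
rho_lead = 11340  # density of lead, kg/m^3
rho_air = 1.225   # density of air at sea level, kg/m^3
c_d = 0.47        # drag coefficient of a sphere (assumed)

mass = (4 / 3) * math.pi * r**3 * rho_lead
area = math.pi * r**2

def fall_time(with_drag):
    """Integrate the fall with a simple Euler step; return time to fall h."""
    v, y, t, dt = 0.0, 0.0, 0.0, 1e-4
    while y < h:
        drag = 0.5 * rho_air * c_d * area * v**2 if with_drag else 0.0
        v += (g - drag / mass) * dt
        y += v * dt
        t += dt
    return t

t_vacuum = fall_time(False)
t_air = fall_time(True)
# The relative difference between the two fall times is well under 1%:
# for dense lead balls at this height, the "vacuum" simplifying
# assumption barely changes the result.
```

Redo the same calculation with a feather-weight object of the same size, and the drag term dominates: the assumption would then be a domain assumption being violated, not a harmless simplification.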

But a domain assumption is completely different: it determines whether your model applies at all. If the domain assumption holds, the theory applies and may be valid; if it does not hold, the theory simply does not apply. Domain assumptions must therefore be realistic, otherwise the resulting theory will be false.

This is why the target of Friedman’s ire is so important: he was defending not a simplifying assumption, but a domain assumption which is false.

Reference:
Keen, S. (2024), Rebuilding Economics from the Top Down; Budapest Hungary (Budapest Centre for Long-Term Sustainability, together with Pallas Athene Publishing House).
The article is republished with the permission of the author. If you like Steve’s economic analysis then you are invited to support his work by signing up to either of the following networks:
(a) Patreon (https://www.patreon.com/ProfSteveKeen – minimum $10 per year);
(b) Substack (https://profstevekeen.substack.com/ — minimum $5 per month).
The continuation of this article will appear as part 2 in the next issue of ERA Review.

References
Ackerman, Frank, and Charles Munitz. 2012. ‘Climate damages in the FUND model: A disaggregated analysis’, Ecological Economics, 77: 219-24.
Ackerman, Frank, and Elizabeth A. Stanton. 2008. ‘A comment on “Economy-wide estimates of the implications of climate change: Human health”‘, Ecological Economics, 66: 8-13.
Ackerman, Frank, Elizabeth A. Stanton, and Ramón Bueno. 2010. ‘Fat tails, exponents, extreme uncertainty: Simulating catastrophe in DICE’, Ecological Economics, 69: 1657-65.
Andrews, P. W. S. 1941. ‘A survey of industrial development in Great Britain planned since the commencement of the war’, Oxford Economic Papers, 5: 55-71.
Andrews, P.W.S. 1949. ‘A reconsideration of the theory of the individual business: costs in the individual business; the determination of prices’, Oxford Economic Papers: 54-89.
Andrews, P.W.S. 1950. ‘Some aspects of competition in retail trade’, Oxford Economic Papers, 2: 137-75.
Andrews, P. W. S., and Elizabeth Brunner. 1950. ‘Productivity and the business man’, Oxford Economic Papers, 2: 197-225.
Archibald, G. C., Herbert A. Simon, and Paul A. Samuelson. 1963. ‘Problems of Methodology Discussion’, American Economic Review, 53: 227-36.
Bornmann, Lutz, Alexander Butz, and Klaus Wohlrabe. 2018. ‘What are the top five journals in economics? A new meta-ranking’, Applied Economics, 50: 659-75.
Daly, Herman E. 1974. ‘Steady-State Economics versus Growthmania: A Critique of the Orthodox Conceptions of Growth, Wants, Scarcity, and Efficiency’, Policy Sciences, 5: 149-67.
Vol 16 No 3 ERA Review 17
Darwin, Roy. 1999. ‘The Impact of Global Warming on Agriculture: A Ricardian Analysis: Comment’, American Economic Review, 89: 1049-52.
DeCanio, Stephen J. 2003. Economic models of climate change: a critique (Palgrave Macmillan: New York).
Eiteman, Wilford J. 1945. ‘The Equilibrium of the Firm in Multi-Process Industries’, Quarterly Journal of Economics, 59: 280-86.
Eiteman, Wilford J. 1947. ‘Factors Determining the Location of the Least Cost Point’, American Economic Review, 37: 910-18.
Eiteman, Wilford J. 1948. ‘The Least Cost Point, Capacity, and Marginal Analysis: A Rejoinder’, American Economic Review, 38: 899-904.
Eiteman, Wilford J. 1953. ‘The Shape of the Average Cost Curve: Rejoinder’, American Economic Review, 43: 628-30.
Eiteman, Wilford J., and Glenn E. Guthrie. 1952. ‘The Shape of the Average Cost Curve’, American Economic Review, 42: 832-38.
Friedman, Milton. 1953. ‘The Methodology of Positive Economics.’ in, Essays in positive economics (University of Chicago Press: Chicago).
Garver, Frederick B., Gustav Seidler, L. G. Reynolds, Francis M. Boddy, and Rufus S. Tucker. 1938. ‘Corporate Price Policies’, American Economic Review, 28: 86-89.
Hall, R. L., and C. J. Hitch. 1939. ‘Price Theory and Business Behaviour’, Oxford Economic Papers: 12-45.
Kaufmann, Robert K. 1997. ‘Assessing The Dice Model: Uncertainty Associated With The Emission And Retention Of Greenhouse Gases’, Climatic Change, 35: 435-48.
Kaufmann, Robert K. 1998. ‘The impact of climate change on US agriculture: a response to Mendelssohn et al. (1994)’, Ecological Economics, 26: 113-19.
Keen, Steve, Robert U. Ayres, and Russell Standish. 2019. ‘A Note on the Role of Energy in Production’, Ecological Economics, 157: 40-46.
Lester, Richard A. 1946. ‘Shortcomings of Marginal Analysis for Wage-Employment Problems’, American Economic Review, 36: 63-82.
Lester, Richard A. 1947. ‘Marginalism, Minimum Wages, and Labor Markets’, American Economic Review, 37: 135-48.
Meadows, Donella H., Jorgen Randers, and Dennis Meadows. 1972. The limits to growth (Signet: New York).
Means, Gardiner C. 1936. ‘Notes on Inflexible Prices’, American Economic Review, 26: 23-35.
Mixon, Franklin G., and Kamal P. Upadhyaya. 2022. ‘Top to bottom: an expanded ranking of economics journals’, Applied Economics Letters, 29: 226-37.
Musgrave, Alan. 1981. ‘“Unreal Assumptions” in Economic Theory: The F‐Twist Untwisted’, Kyklos (Basel), 34: 377-87.
Oliver, Henry M. 1947. ‘Marginal Theory and Business Behavior’, American Economic Review, 37: 375-83.
Pindyck, Robert S. 2013. ‘Climate Change Policy: What Do the Models Tell Us?’, Journal of Economic Literature, 51: 860-72.
Pindyck, Robert S. 2017. ‘The Use and Misuse of Models for Climate Policy’, Review of Environmental Economics and Policy, 11: 100-14.
Quiggin, John, and John Horowitz. 1999. ‘The impact of global warming on agriculture: A Ricardian analysis: Comment’, American Economic Review, 89: 1044-45.
Schlenker, Wolfram, W. Michael Hanemann, and Anthony C. Fisher. 2005. ‘Will U.S. Agriculture Really Benefit from Global Warming? Accounting for Irrigation in the Hedonic Approach’, American Economic Review, 95: 395-406.
Schumacher, E. F. 1973. Small is beautiful: a study of economics as if people mattered (Blond and Briggs: London).
Schumacher, E. F. 1979. ‘On Population and Energy Use’, Population and Development Review, 5: 535-41.
Stanton, Elizabeth A., Frank Ackerman, and Sivan Kartha. 2009. ‘Inside the integrated assessment models: Four issues in climate economics’, Climate and Development, 1: 166-84.
Tol, Richard S. J. 2009. ‘The Economic Effects of Climate Change’, Journal of Economic Perspectives, 23: 29–51.
Tucker, Rufus S. 1937. ‘Is There a Tendency for Profits to Equalize?’, American Economic Review, 27: 519-24.
Tucker, Rufus S. 1938. ‘The Reasons for Price Rigidity’, American Economic Review, 28: 41-54.
Tucker, Rufus S. 1940. ‘The Degree of Monopoly’, Quarterly Journal of Economics, 55: 167-69.
Weitzman, Martin L. 2011a. ‘Fat-Tailed Uncertainty in the Economics of Catastrophic Climate Change’, Review of Environmental Economics and Policy, 5: 275-92.
Weitzman, Martin L. 2011b. ‘Revisiting Fat-Tailed Uncertainty in the Economics of Climate Change’, REEP Symposium on Fat Tails, 5: 275–92.
Wong, Stanley. 1973. ‘The “F-Twist” and the Methodology of Paul Samuelson’, American Economic Review, 63: 312-25.
