Understanding capitalist dynamics - Weekly Worker

Location: weeklyworker.co.uk


Ian Wright reviews 'How labor powers the global economy' by Emmanuel Farjoun, Moshé Machover and David Zachariah (Springer Publishing 2022, pp166, £90)

This book is a powerful contribution to the Marxist critique of capitalist political economy and essential reading for anyone interested in understanding the dynamics of contemporary capitalism.

How labour powers the global economy is a sequel to Farjoun and Machover’s seminal Laws of chaos, first published in 1983, which introduced the methodological approach of interpreting key Marxist categories as probabilistic, rather than deterministic, quantities.1 This was a bold and novel theoretical move that required readers to have some familiarity with linear algebra, probability theory and statistics. Hence the readership of Laws of chaos was unavoidably limited, even amongst Marxists. Nonetheless, due to its remarkable analytic insights, interest in that book steadily increased, culminating in its republication in 2020.

The book continues and deepens the ‘probabilistic turn’ in Marxist theory. The authors split their book into three parts. The first introduces their main categories that are direct probabilistic analogues of Marx’s basic value-theoretic categories. The second part confronts these theoretical foundations with the empirical phenomenon of contemporary global capitalism. The authors deduce a collection of highly explanatory, yet relatively simple, probabilistic laws that constrain how capitalism evolves over time. The third part explains why such laws inhibit human flourishing, and points to our need to organise global production for social good rather than private profit. The authors recover, and in many respects surpass, the scientific content of Marx’s original formulations. What we get from this exercise is a reinvigorated version of key aspects of Marx’s theory of capitalism that directly relates to modern empirical data.

A good starting point to understand the probabilistic turn is to contrast how Marx and the authors relate labour time to prices. Recall that Marx defined the value of a commodity as “socially necessary” labour in the following sense: “The labour time socially necessary is that required to produce an article under the normal conditions of production, and with the average degree of skill and intensity prevalent at the time”.2 In consequence, the value of an individual commodity, such as a rollerball pen, is not the actual working time supplied when making it (for otherwise inefficient production techniques, which use up relatively more labour time, would produce pens of higher value). Rather, every individual rollerball pen has the same value, determined by the conditions of production involved in the production of all rollerball pens. Ex post we can deduce the current value of an individual pen by dividing the total labour time supplied to produce all rollerball pens by the total quantity produced (eg, if society produces 1,000 pens with 100 hours of labour time, then the value of one pen is 0.1 hours).
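The averaging step can be sketched in a few lines of Python, using the review’s own illustrative figures for rollerball pens (not real data):

```python
# Illustrative arithmetic only: 'value' as a social average over all
# producers of the same commodity type, per the pens example above.
total_labour_hours = 100.0   # labour time supplied across all producers
total_pens = 1000            # total quantity of pens produced

# Every individual pen has the same value, however it was made.
value_per_pen = total_labour_hours / total_pens
print(value_per_pen)  # 0.1 hours per pen
```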

Marx’s values are therefore a property of the social conditions of production, not individual labour processes, and Marx’s modifier, “socially necessary”, controls the heterogeneity in the conditions of production of the same commodity type by considering each individual commodity as an “average sample of its class”.3

Marx’s explicit use of “average sample” when defining ‘value’ in the opening chapters of Capital should be noted, because it indicates his desire to capture the irreducible variability of economic reality: there is not a single production process for pens (or any other commodity, for that matter), but multiple processes, and they necessarily differ: some are highly efficient, others less so. Hence ‘the’ value of a commodity is necessarily an average. Social systems, as the authors note, have a huge number of degrees of freedom and therefore any lawful regularities necessarily have a probabilistic character. In fact, as Julian Wells has carefully documented,4 Marx repeatedly employs informal statistical and probabilistic argumentation throughout Capital, influenced by his reading of the statistician, Adolphe Quetelet (pioneer of statistics and creator of the concept, ‘average man’), in order to grasp the reality of capitalism in thought. This may seem, at first, a surprising assertion, but, once you notice Marx’s informal use of probabilistic reasoning, it becomes impossible to unsee it.


The authors, unlike Marx, have all the tools of linear algebra, probability theory and statistical mechanics at their disposal, plus access to modern macroeconomic data. They apply these tools to gain even better purchase on concrete reality. Farjoun, Machover and Zachariah carefully walk the reader through their core theoretical propositions and probabilistic formulae, and relegate the majority of their mathematical deductions to appendices. This book has been written to be as accessible as possible, given its irreducibly technical nature. The argumentation can be followed by any reader comfortable with relatively straightforward mathematical equations.

The authors introduce the concept ‘labour content’ (or L-content) as a formal development of Marx’s concept of value. L-content remains an average property of the conditions of production, but explicitly deals with the interconnectedness of economic production (vertical integration) and the existence of heterogeneous production units that produce multiple outputs (so-called ‘joint production’). They show how L-content can be approximated from national accounting data. Like Marx they aim to explain how the underlying labour processes in the “hidden abode of production”, quantitatively captured by L-content, constrain and shape the more readily apparent surface phenomena (and fetishes) of the capitalist economy, such as prices, profits, growth - and the class struggle over the distribution of the surplus.

The ‘law of one price’ - a staple of mainstream economics - states that identical goods sell at identical prices (in the absence of ‘market imperfections’). That ‘law’ is empirically false, because market prices are subject to all kinds of accidental determinations and therefore the price of the same commodity type typically varies (eg, consider the different discounts on rollerball pens across thousands of different retailers). The authors embrace this variability. They define the “specific price” of a commodity, in a concrete act of exchange, as its selling price divided by its L-content (eg, if a particular pen sells for $1 then its specific price, in this specific transaction, is $1 divided by 0.1 hours, which is $10 per hour). At the microeconomic level the same rollerball pen sells for many different specific prices. Hence, ‘specific price’ is an example of a random, rather than a deterministic, variable that takes on multiple values, where each value empirically manifests with a different probability defined by an underlying probability distribution. The authors argue that the distribution of specific price has roughly a log-normal form, where most specific prices cluster around a central mode, but there remains a low probability of very low and very high specific prices.
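A minimal simulation (my own sketch, not the authors’ code, with hypothetical parameters) shows what a roughly log-normal specific-price distribution looks like for a single commodity type:

```python
# A minimal sketch: 'specific price' as a random variable with a
# roughly log-normal distribution. All parameters are hypothetical.
import random
import math

random.seed(0)
l_content = 0.1  # hours per pen, from the averaging example above

# Assume most transactions cluster near $10 per hour of L-content.
specific_prices = [math.exp(random.gauss(math.log(10.0), 0.3))
                   for _ in range(100_000)]

# Each sale of the same pen therefore realises a different money price...
selling_prices = [sp * l_content for sp in specific_prices]

# ...but the disorder is governed by one orderly distribution:
# a central mode, with thin tails of very low and very high prices.
median_sp = sorted(specific_prices)[len(specific_prices) // 2]
print(median_sp)  # clusters near the assumed $10 per hour
```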

It is worth reflecting on the power and parsimony of this theoretical move. The authors do not ‘heroically’ assume that instances of the same commodity type sell at the same price. Instead they consider the totality of all market transactions in a given period as probabilistically relating quantities of money, as expressed by the selling price, to quantities of labour time, as expressed by the L-content. The microeconomic disorder of the exchange of millions of different commodity types, produced under diverse conditions of production, and which sell at prices that vary so much as to be (almost) accidental, can nonetheless be captured by a single, macroeconomic, random variable - the ‘specific price’ - which reduces this bewildering disorder to a single, orderly probability distribution. The higher level of abstraction - the probabilistic turn - views any market transaction, involving any commodity whatsoever, as simply an exchange of a sum of money for a sum of expended labour time, governed by a macroeconomic probability distribution.

The authors, having defined L-content and the random variable ‘specific price’, then use probabilistic reasoning to deduce that the money price of any sufficiently large basket of commodities is approximately proportional to its L-content. In other words, market prices ultimately reflect, and are constrained by, underlying real costs of production, measured by labour time.
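The aggregation claim can be illustrated with a toy simulation (mine, with hypothetical parameters): individual specific prices vary widely, yet the money price of a large basket, divided by its total L-content, settles down to a stable ratio:

```python
# A toy illustration of the aggregation claim: specific prices vary
# wildly per item, but a large basket's price is approximately
# proportional to its L-content. Parameters are hypothetical.
import random
import math

random.seed(1)
MU, SIGMA = math.log(10.0), 0.5  # assumed log-normal specific prices

def basket_ratio(n_items):
    """Money price of a basket divided by its total L-content."""
    total_price = total_l = 0.0
    for _ in range(n_items):
        l = random.uniform(0.01, 1.0)             # L-content of one item
        sp = math.exp(random.gauss(MU, SIGMA))    # its specific price
        total_price += sp * l
        total_l += l
    return total_price / total_l

print(basket_ratio(10))       # small basket: a noisy ratio
print(basket_ratio(100_000))  # large basket: concentrates near the mean
```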

This basic relationship was, of course, conceptually central to all formulations of the classical labour theory of value. But it was implicitly formulated in deterministic, not probabilistic, terms. In the simple labour theory of value, proposed by Adam Smith and applicable to an economy that lacks profit on money-capital invested, ‘the’ price of a commodity is proportional to ‘the’ value. But Marx, following both Adam Smith and David Ricardo, rejected this basic relationship, even when expressed deterministically, because profits, which form a component of prices, have a non-obvious relationship to labour time and, at the very least, distort market prices away from simple proportionality to values.

Farjoun, Machover and Zachariah, continuing a theme from Laws of chaos, correctly point out that the distortion is maximal when profit rates across different sectors of the economy are entirely uniform - which obtains in a hypothesised state of steady, deterministic equilibrium. The authors, consistent with their probabilistic turn, reject deterministic equilibrium as inappropriate for the analysis of economic systems. Instead, if an equilibrium prevails, it will be a statistical one, where microeconomic variance persists in the context of macroeconomic steady-state probability distributions. Hence there is no single industrial profit rate, and it is never uniform; instead, the profit rate is a random variable. And, in consequence, the proportional relationship between market prices and underlying values, far from being a naive simplification applicable to special cases, in fact holds generally in capitalism.

Monetary phenomena

The authors explain that the probabilistic relationship between money prices and L-content predicts that sectoral prices (ie, the prices of baskets of commodities of a similar type) will be highly positively correlated with measures of L-content, and that sectoral profit rates will be positively correlated with labour intensity. They point to the growing body of econometric studies - based on modern input-output data and widely replicated within Marxian and heterodox economics - which find precisely such correlations. The probabilistic approach therefore yields a very clear, and important, fundamental economic proposition of central importance - the probabilistic relation between prices and values - that is consistent with the observed data.

Marx’s proposal that values constrain prices and that the origin of profit is ultimately labour, not capital, is fully vindicated by the probabilistic turn - but not quite in the way Marx expected. In volume 3 of Capital he attempts to explain how competitive prices that correspond to a deterministic equilibrium of uniform profit-rates, although not proportional to values, are nonetheless conservative transformations of them, and therefore the average profit rate is ultimately determined by labour time.

This theory, in the subsequent 150 years or more, has generated huge controversy, because it is a mathematical theorem that Marx’s transformation cannot be conservative, and therefore profits, contra Marx, seem unconnected to the contributions of labour. The authors, to a large extent, successfully cut this Gordian knot by pointing out that no such deterministic equilibrium exists. There is an average profit rate, of course - but there is not ‘the’ profit rate. The authors derive a probabilistic relationship between the average profit rate and (i) the total labour supplied in the economy as a whole, (ii) the L-content of all capital goods in production, and (iii) the wage share (ie, the proportion of the surplus product that workers, as a whole, can purchase). In other words, the average profit rate is indeed constrained by labour time - specifically the size of the workforce and the labour time necessary to produce the capital goods in operation. If workers supply more labour, then, all other things being equal, the average profit rate will be higher; if production is more capital-intensive, then, all other things being equal, the average profit rate will be lower, etc. This result re-establishes the sought-for link between profit rates and labour time, but Marx’s theory of the transformation, now theoretically vestigial, is no longer needed - and hence the controversy is neatly sidestepped.
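The qualitative claims can be illustrated with a deliberately simplified, hypothetical functional form - mine, not the authors’ actual derivation - in which the average profit rate rises with labour supplied and falls with the L-content of the capital stock:

```python
# A deliberately simplified, hypothetical functional form (not the
# authors' derivation), illustrating only the qualitative claims:
# r = (1 - wage share) * labour supplied / L-content of capital goods.
def avg_profit_rate(labour_hours, capital_l_content, wage_share):
    """All quantities measured in labour-time units; wage_share in [0, 1]."""
    return (1.0 - wage_share) * labour_hours / capital_l_content

base = avg_profit_rate(100.0, 500.0, 0.7)          # baseline economy
more_labour = avg_profit_rate(120.0, 500.0, 0.7)   # workforce supplies more
more_capital = avg_profit_rate(100.0, 800.0, 0.7)  # more capital-intensive

print(more_labour > base)   # True: more labour, higher average rate
print(more_capital < base)  # True: more capital-intensive, lower rate
```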

I have focused on the probabilistic relation between prices and L-content, and drawn contrasts with Marx’s original approach, in order to communicate some of the specific flavour of the book’s contributions. However, it tackles a much wider variety of topics in capitalist political economy. The novel and unorthodox approach of recasting Marx’s theory in terms of random variables generates largely orthodox conclusions entirely consistent with the intent, and largely consistent with the content, of Marx’s scientific contributions. As the book develops, a clear methodological pattern emerges: Farjoun, Machover and Zachariah consider a fundamental economic phenomenon, and apply their probabilistic approach to deduce a relatively simple, yet insightful, aggregate probabilistic relationship that reveals how that phenomenon is ultimately constrained and governed by properties of the labour process. In virtue of the power of the probabilistic approach, they deduce new insights with relatively short derivations and back-of-the-envelope approximations that nonetheless consistently fit the empirical data.

For example, the authors observe that capitalism, driven by the competitive scramble for profit, tends to revolutionise the conditions of production, and that, empirically, the L-content of all the many diverse commodity types tends to decrease over time (equivalently, the productivity of labour tends to increase). They call this persistent pattern “the law of decreasing labour content” (LDLC). They demonstrate, again probabilistically, that monetary cost-cutting by individual firms, which consists of strategies to cut mixtures of wages and non-labour costs and also input replacement strategies, nonetheless generates, as an unintended consequence, a high probability that the L-content of any sample basket of commodities will decrease over time. They derive quantitative bounds for this rate of decrease and then demonstrate that their predictions are consistent with the relevant empirical data.
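A toy simulation (my own construction, not the book’s, with hypothetical parameters) makes the unintended-consequence mechanism vivid: individually motivated cost-cutting generates a high probability of falling average L-content:

```python
# A toy simulation of the law of decreasing labour content (LDLC):
# each year some firms happen upon cost-cutting techniques that also
# lower the labour embodied in output. All parameters are hypothetical.
import random

random.seed(2)
l_contents = [random.uniform(0.05, 0.2) for _ in range(200)]  # per firm

history = []
for year in range(30):
    for i in range(len(l_contents)):
        # With some probability a firm adopts a labour-saving technique.
        if random.random() < 0.3:
            l_contents[i] *= random.uniform(0.9, 1.0)
    history.append(sum(l_contents) / len(l_contents))

# No firm aims at this outcome, yet the social average drifts down.
print(history[-1] < history[0])  # True: average L-content has fallen
```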

Another example: the authors derive a formula that approximates the rate of economic growth as a simple sum of the growth rate of the workforce and the rate of increase in the productivity of labour. Armed with this formula, they predict that the global growth rate must stay below about 4% a year and demonstrate, once again, that their prediction is consistent with empirical data.
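The back-of-the-envelope character of this prediction is easy to reproduce; the figures below are illustrative round numbers of my own, not the book’s exact estimates:

```python
# Growth rate approximated as workforce growth plus productivity
# growth, per the formula described above. Figures are illustrative.
workforce_growth = 0.01      # ~1% a year, illustrative
productivity_growth = 0.025  # ~2.5% a year, illustrative

growth_rate = workforce_growth + productivity_growth
print(growth_rate)           # 0.035, ie 3.5% a year
print(growth_rate < 0.04)    # True: below the ~4% global ceiling
```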

This methodological pattern is repeated, with great conceptual and quantitative success, throughout the book: Farjoun, Machover and Zachariah explain why a capitalist economy cannot grow by more than 2%-3% a year above the growth of the working population; why wage incomes, as a share of total output, are necessarily confined within a narrow range; why overall productivity cannot grow more than 3%-4% per year; why money-capitalists must impose an interest rate of at least 2%-3% above expected inflation to preserve their share of total output; and why total automation is entirely incompatible with capitalist social relations, and so on. The book, in many respects, is a theoretical tour de force.


Some issues, however, despite the authors’ efforts, are not fully and satisfactorily resolved, and therefore point to opportunities to further develop the probabilistic approach to political economy. I will briefly mention two: the ‘objective’ nature of Marx’s theory of value; and the status of the transformation problem.

Ricardo had merely wanted to define a standard of economic value - something objective, outside of the market - that could explain the equilibrium prices of reproducible goods. Any real cost basis (be it labour time, quantities of gold or corn, or other basic commodities) could potentially perform this role. Marx, in contrast, claimed that the dynamics of capitalist competition instantiate objective laws that bind the form of value - such as pounds, dollars or euros - to a specific content, which is labour time. Quantities of money, and therefore prices, in fact represent labour time, and not anything else - not because an economic theorist decides that labour time is a convenient measure to understand the economy, but in virtue of our own social activity. In this sense, prices and labour time are objectively and uniquely related - regardless of whether anyone subjectively thinks they are or not.

The authors, in contrast, justify their choice of using L-content as a key explanatory variable because “we are, of course, mostly interested in labour and labour time as a socio-political factor of production” (p44) and then survey various properties of labour that entail its unique role in powering the development of capitalism. However, their justification for adopting L-content ultimately reduces to the subjective convenience of the theorist. They do not attempt to develop Marx’s theory of how social activity can instantiate objective semantic relations between a representation (eg, money) and a referent (eg, labour time). In this sense their value-theoretic ambitions fall short of Marx’s.

As mentioned, Farjoun, Machover and Zachariah sidestep the transformation problem and view it as a rather large red herring. The probabilistic approach, although wildly successful in re-establishing clear quantitative relationships between monetary phenomena and underlying labour values, does not really sidestep the problem, but hides it underneath a rather beautiful tapestry of random variables and ‘good enough’ approximations. Ricardo in 1823, employing his own back-of-the-envelope calculations, already understood that “the great cause of the variation of [the price of] commodities is the greater or less quantity of labour that may be necessary to produce them”, but there is another “less powerful cause of their variation”, which is profit.5 In other words, the majority of the variance in competitive prices is indeed approximately explained by labour time.

The authors reproduce this (true) proposition in more sophisticated probabilistic terms. However, motivated critics will reproduce a “probabilistic transformation problem” in the authors’ framework that demonstrates that changes in the specific-price distribution cannot be fully explained by changes in L-content. Further, the accuracy of the proportional relation between money prices and L-content will wax and wane depending on the increase and decrease in the variance of profit rates: if the variance is suitably wide then the approximation will hold, but if that variance reduces then the approximation will begin to break down until hitting the limit of a degenerate distribution: ie, the classical case of deterministic equilibrium and uniform profits. Such an extreme degenerate case, of course, does not empirically manifest. But the variance of profit-rates does empirically change over time.
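The worry can be illustrated with a toy construction of my own (not the authors’): under dispersed profit rates, price-value deviations look like unsystematic noise, but as the profit-rate distribution collapses toward uniformity the deviations become perfectly systematic, tracking capital intensity:

```python
# A toy illustration (my construction, not the authors') of how the
# dispersion of profit rates affects price-value deviations: uniform
# profits make deviations track capital intensity systematically;
# dispersed, independent profit rates weaken that systematic link.
import random

random.seed(3)
N = 1000
values = [random.uniform(1.0, 10.0) for _ in range(N)]    # L-content
capital = [v * random.uniform(0.5, 5.0) for v in values]  # per sector

def deviation_correlation(profit_rates):
    """Correlation between price/value deviations and capital intensity."""
    dev = [(values[i] + profit_rates[i] * capital[i]) / values[i]
           for i in range(N)]
    kint = [capital[i] / values[i] for i in range(N)]
    m_d, m_k = sum(dev) / N, sum(kint) / N
    cov = sum((d - m_d) * (k - m_k) for d, k in zip(dev, kint)) / N
    sd = (sum((d - m_d) ** 2 for d in dev) / N) ** 0.5
    sk = (sum((k - m_k) ** 2 for k in kint) / N) ** 0.5
    return cov / (sd * sk)

uniform = [0.1] * N                                  # degenerate case
dispersed = [random.uniform(0.0, 0.2) for _ in range(N)]

print(deviation_correlation(uniform))    # ~1.0: perfectly systematic
print(abs(deviation_correlation(dispersed)) <
      deviation_correlation(uniform))    # True: weaker systematic link
```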

Can the ‘correctness’ of any version of the labour theory of value depend on a varying empirical ‘strength’ of aggregate correlations between prices and values? For example, at what level of correlation should we decide to accept the authors’ value-theoretic claims? Above 95%, 90% - or is 85% or even 75% OK? In years where the variance of profit rates is more narrow, is their theory of value less true? Perhaps any residual lack of correlation, however ‘small’, is precisely a reward to money-capitalists for their own abstemious contributions to production?

Regardless of these unresolved difficulties, the authors’ contributions help clarify the nature and impact of the transformation problem.

Theory of capitalism

Textual interpretations of Marx, narrowly concerned with hermeneutic analysis of the three volumes of Capital, are ten a penny. And such works, whatever their merits in promoting Marx’s ideas, tend to view Marxism as merely a type of ‘critical theory’. Yet Marx, a self-avowed scientist, was eager to marshal the latest theoretical tools - such as Hegelian philosophy, the counterfactual reasoning of classical political economy, simultaneous and differential equations, nascent probabilistic reasoning, etc - and the latest available empirical data (such as government reports on the length of the working day, debates on the English Factory Acts, the history of productivity and employment revolutions in the cotton trade, national differences in wages, interest rate data, etc) in order to understand the objective laws of motion of capitalist society, and therefore the real possibilities for effective political intervention to transcend it.

Most Marxology rarely develops Marx’s critique of political economy in this scientific sense and therefore constitutes what Imre Lakatos would call a “degenerative research programme”. How labor powers the global economy, in contrast, is a completely different kind of book and squarely within the tradition of classical, and therefore scientific, Marxism. The authors, throughout, directly confront Marx’s theoretical framework - enlivened by the mathematical theory of random variables - with the phenomenon of contemporary global capitalism, in order to explain key aspects of its change over time.

The causal power of labour - albeit ruthlessly controlled and organised by the rule of capital and its manic scramble for profit - is the key driver, and central explanatory variable, of the modern world. In consequence, and as the authors’ subtitle states, the theory of capitalism must be “a labor theory of capitalism”, in which labour is properly recognised as the underlying creative substance that powers the global economy. This concise yet rich book successfully demonstrates that Marx’s theory of value, suitably reinvigorated, is the key to understanding contemporary capitalism.

In my view, both Laws of chaos and this sequel are essential reading for those who wish to deepen their understanding of Marx’s value theory and capitalist dynamics. The probabilistic turn in Marxist theory cannot be ignored and deserves to be understood much more widely.

Ian Wright

  1. E Farjoun, M Machover Laws of chaos: a probabilistic approach to political economy London 1989.↩︎

  2. K Marx Capital Vol 1, chapter 1, section 1: www.marxists.org/archive/marx/works/1867-c1/ch01.htm.↩︎

  3. Ibid.↩︎

  4. J Wells, ‘Marx reads Quetelet: a preliminary report’ (2017): mpra.ub.uni-muenchen.de/98255/1/mpra_paper_98255.pdf.↩︎

  5. D Ricardo, ‘Absolute value and exchangeable value’, in P Sraffa, MH Dobb (eds) The works and correspondence of David Ricardo Vol 4: Pamphlets and papers 1815-1823 Carmel, Indianapolis, 2004.↩︎