Materialist Economics, Part 3: Accountable Planning

  1. Constitutional Human Rights and the Rule of Law

Without protected human rights, it is impossible to know the will of the people. Any system of democratic production requires, as a precondition, constitutional protections for speech, assembly, and political participation. The rule of law must be upheld. Legal challenges must meet reasonable evidential standards, and accusations of wrongdoing must be resolved through due process.

This requirement is not ornamental. The entire proposal depends on the premise that production should be directed by popular will. But popular will can only be ascertained under conditions where people are free to express it. If a government can imprison critics, shut down independent media, or rig polling mechanisms, then “production by popular vote” becomes production by decree.

Human rights must be constitutionally entrenched. Not granted by the government of the day but prior to it, binding on all officeholders, enforceable by independent courts. Without this, the government described below has no foundation. We cannot risk the temporary suppression of opposition becoming a permanent one.

  2. Democratic Job Creation

To enforce accountability, the government polls the people on what goods and services they want. It then creates jobs in the relevant industries to produce those goods. Subsidizing production by popular vote directly replaces the purchasing-power filter with a democratic one. What counts as demand is no longer determined by who has the most capital. It is determined by what people actually need.

The polling mechanism works as follows. Every citizen receives an equal budget of vote credits. No one can vote for their own products. Citizens allocate their credits across goods categories. The cost of casting additional votes on a single category rises quadratically: casting n votes on one category costs n-squared credits. This is quadratic voting, a mechanism proposed by E. Glen Weyl and proved by Lalley and Weyl (2018) to yield approximately Pareto-efficient outcomes under standard assumptions. (An outcome is Pareto-efficient if no one can be made better off without making someone else worse off; the quadratic cost structure approximates this condition because it forces voters to internalize the social cost of their influence over each category.) The quadratic cost structure has two critical properties. First, it captures the intensity of preferences, not just their direction: a citizen who desperately needs housing can concentrate credits there, while someone with many moderate needs spreads credits across categories. Second, it protects minority preferences. Because concentrating votes is costly, no bloc can cheaply dominate all categories. An intense minority exerts disproportionate influence in its area of greatest need.
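The cost structure described above can be made concrete in a few lines. This is an illustrative sketch of the credit accounting only; the credit budget of 100 and the category names are assumptions for the example, not figures from the proposal.

```python
# Sketch of quadratic voting cost accounting. The budget (100) and
# category names are hypothetical, chosen to make the arithmetic visible.

def vote_cost(votes: int) -> int:
    """Casting n votes on a single category costs n**2 credits."""
    return votes ** 2

def total_cost(allocation: dict) -> int:
    """Total credits a citizen spends across all categories."""
    return sum(vote_cost(v) for v in allocation.values())

BUDGET = 100  # every citizen receives the same credit budget

# A citizen with one intense need concentrates credits there:
focused = {"housing": 10}
assert total_cost(focused) == BUDGET  # 10 votes cost 10**2 = 100

# A citizen with several moderate needs spreads them instead:
spread = {"food": 5, "transit": 5, "health": 5, "parks": 5}
assert total_cost(spread) == BUDGET   # 4 * 5**2 = 100

# Doubling influence in one category quadruples its cost, which is
# what prevents any bloc from cheaply dominating every category:
assert vote_cost(10) == 4 * vote_cost(5)
```

The last assertion is the minority-protection property in miniature: influence scales with the square root of credits spent, so concentrated intensity is possible but expensive.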

Arrow’s impossibility theorem, which establishes that no ranked voting system can simultaneously satisfy a small number of reasonable fairness conditions, does not apply to this mechanism. Arrow’s theorem governs ordinal systems in which voters can only rank alternatives. As Arrow himself later acknowledged, systems based on cardinal utilities, where voters express the strength of their preferences, are not subject to his impossibility result. Quadratic voting is a cardinal system. Buterin, Hitzig, and Weyl (2019) extended this mechanism into “quadratic funding” and proved that it yields optimal provision of public goods without requiring a centralized legislature to determine the optimal level (Management Science, Vol. 65, No. 11, pp. 5171–5187).

This has been tried in practice. The Colorado House Democratic Caucus used quadratic voting in 2019 to prioritize 107 bills. The city council of Gramado, Brazil used it to prioritize its legislative agenda, in a project organized by ITS Rio in partnership with RadicalxChange. Taiwan’s Presidential Hackathon has used quadratic voting since 2019, with citizens allocating 99 voice credits across competing proposals to select projects for the national policy agenda. Participatory budgeting more broadly, pioneered in Porto Alegre, Brazil in 1989, has been implemented in over 11,500 municipalities worldwide as of 2024. Porto Alegre’s system used tiered assemblies where citizens prioritized investment categories and elected delegates to a participatory budget council. Participation grew from fewer than 1,000 per year in 1990 to approximately 40,000 in 1999 (World Bank, Bhatnagar et al.), governing the allocation of roughly 20 percent of the municipal budget. Sewer and water connections rose from 75 percent to 98 percent of households between 1988 and 1997, and a World Bank study of 253 Brazilian municipalities found that those using participatory budgeting collected 39 percent more in local taxes.

Counting these votes across a decentralized system requires reliable infrastructure, and the digital systems needed to transmit and store the relevant information already exist. The field of algorithmic mechanism design studies the construction of algorithms that resist manipulation by strategic agents. A centralized ledger with distributed storage would be one method; courts would settle disputes over distribution. The quadratic voting mechanism is itself resistant to strategic manipulation: Weyl (2017) showed that the mechanism resists certain forms of collusion, fraud, and aggregate uncertainty, because the quadratic cost structure makes it expensive for any individual or group to dominate the outcome. This planning system would be part of the constitution; amending it would require a supermajority in a referendum.

If the government lacks the productive resources to meet these demands, and the people have voted for sufficient essential goods to justify it, the necessary means of production are expropriated from their current owners and assigned to workers. This is necessary collectivization, not total collectivization.

The purpose is to free workers from dependence on capitalists in specific, concrete ways. Capitalist employers can stop creating jobs when returns decline, stop selling essential goods when it is unprofitable, and move capital abroad to escape unfavorable conditions. Under this proposal, none of these exits are available. The economy keeps producing what the people vote for regardless of profitability.

  3. Accountable Planning

A centrally planned economy in the Soviet mold requires the central authority to determine not only what is produced but how it is produced and by whom. The Soviet Union assigned workers to jobs far from their families, and citizens who wished to change professions or relocate faced bureaucratic obstacles that many experienced as a form of coercion no less real than market compulsion.

Jane Jacobs documented in The Economy of Cities (1969) how a mining company transitioned into making sandpaper from its abrasive materials, and from there into adhesive tape. This happened through organic experimentation at the point of production. It’s the kind of transition that is difficult to anticipate from the center unless the planning agency’s reach is very small.

Under the current proposal, the plan is set from the bottom by popular vote, not from the top by a party bureau. The aggregated vote results become what economists call the final demand vector: a list specifying how much of each finished good the population wants produced for consumption, as opposed to intermediate goods (like steel or electricity) that are used up in making other products.

How production is organized locally (what techniques to use, how to divide tasks, what intermediate innovations to pursue) is left to workers and communities. The government’s role is the allocation of the means of production to industries by the will of the masses:

Are the demanded goods being produced?

Are workers being treated fairly?

Are resources being used responsibly?

This is accountable planning. The plan originates with the people, and the planners are accountable to them.

The empirical basis for this planning is the labor theory of value. Studies across multiple countries and decades have found strong correlations between labor values and market prices (see Part 1, “Labor Values Predict Market Prices,” for the full survey of evidence): Shaikh (1998) for the United States, Petrovic (1987) for Yugoslavia, Ochoa (1989) for the United States using different methods, Cockshott and Cottrell (1997) for the United Kingdom, and Zachariah (2006) for eighteen OECD countries. The reported R-squared values are consistently high, typically above 0.90, though the exact figures vary with the level of sectoral aggregation and the regression specification employed. (As with any cross-sectoral regression, higher levels of aggregation tend to produce higher R-squared values; the correlation remains strong but the specific magnitude depends on methodology.) The MELT (the Monetary Expression of Labour-time, explained in Part 1, “Why the Regularity Holds: Statistical Mechanics”) provides the bridge between labor-time accounting and monetary prices. The ratio is empirically stable and calculable from national input-output data. There is no need for the planners to guess at the relationship between labor and price. It can be measured.

Hayek argued that an economy involves so much distributed information that no central authority could possibly aggregate it, and that the price system achieves this aggregation with remarkable economy. As Cockshott, Cottrell, Michaelson, Wright, and Yakovenko demonstrate in Classical Econophysics (Routledge, 2009, Chapter 15), this argument is quantitatively wrong.

The argument proceeds in two stages:

a) A comparison of the information that must flow through a market economy versus a planned economy.

b) A demonstration that the planning computation itself is fast enough to perform routinely.

To understand the information-flow comparison, consider what happens in a market economy when it adjusts toward equilibrium. Every firm must communicate with every supplier, requesting price quotes, reading them, calculating costs, placing orders, and processing deliveries. If an economy has n firms, each with on average m suppliers, and each message requires b bits to encode a price or quantity, then every iteration of this adjustment process requires approximately 4nm(b + 2) bits of communication — four types of message (price request, price quote, order, delivery note) multiplied across all firm-supplier pairs.

In a planned system, the planning bureau collects the technology matrix, a table recording, for each product in the economy, what inputs are needed to make one unit of it (so many tons of coal per ton of iron, so many hours of labor per ton of corn, and so on). This matrix is transmitted once, in the first iteration, at a communications cost comparable to one round of market price exchange. After that, the bureau already knows the production structure. Subsequent iterations require only updates on current stocks and delivery instructions, cutting the per-iteration communications load to roughly half the market’s.
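The comparison can be put in numbers. The sketch below applies the 4nm(b + 2) formula from the text; the values chosen for n, m, and b are illustrative assumptions, not figures from the source.

```python
# Back-of-envelope comparison of per-iteration communication loads.
# The parameter values below are hypothetical.

def market_bits(n: int, m: int, b: int) -> int:
    """Bits per adjustment iteration in a market economy: four message
    types (request, quote, order, delivery note) across n*m pairs."""
    return 4 * n * m * (b + 2)

def planned_bits(n: int, m: int, b: int, first_iteration: bool) -> int:
    """Planned economy: the technology matrix is sent once, at a cost
    comparable to one market round; later rounds carry only stock
    updates and delivery instructions, roughly half the market load."""
    full = market_bits(n, m, b)
    return full if first_iteration else full // 2

n, m, b = 1_000_000, 10, 32  # firms, avg suppliers, bits per message

# First iteration: comparable cost. Every later iteration: half.
assert planned_bits(n, m, b, first_iteration=True) == market_bits(n, m, b)
assert planned_bits(n, m, b, first_iteration=False) == market_bits(n, m, b) // 2
```

The advantage is therefore not in the first round but in every round after it, which is where an economy adjusting continuously spends nearly all of its communication budget.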

The mathematically optimal trajectory from the economy’s current output structure to its target (Dorfman, Samuelson, and Solow, 1958) is called a turnpike path. Because the planning bureau can compute this path directly, it converges on the target faster than a market system, in which each adjustment step requires the physical movement of goods and the slow propagation of price signals through the real economy.

The term “turnpike” comes from the theorem’s central insight. Just as the fastest route between two nearby towns may involve getting on the highway (the turnpike) even though the towns are close, the fastest path between two economic configurations typically involves moving toward the maximum balanced-growth path and then departing from it near the end to hit the exact target.

Fundamentally, the computation at the heart of planning, deriving labor values and balanced production targets from an input-output table, is what mathematicians call a contractive affine transform (Barnsley, 1988). A contractive transform is one that, when applied repeatedly, brings its output closer and closer to a fixed point. The planning computation works this way because any viable economy produces a net surplus. When the bureau makes an initial estimate of how much of each input is needed and feeds that estimate back through the production table, any error in the estimate is spread over a larger quantity of output, so the percentage error shrinks with each pass. After enough passes, the estimates converge on a consistent set of production targets.

Cockshott and Cottrell demonstrate this with a worked example: an economy producing iron, coal, corn, and bread, in which the planners start with a final demand of 20,000 tons of coal and 1,000 tons of bread. In the first pass they estimate the gross output needed; in each subsequent pass the estimates improve. By approximately the twentieth pass they converge on a consistent gross output vector (3,708 tons of iron, 34,896 tons of coal, 1,667 tons of corn, and 1,000 tons of bread). The computation is the same one used to calculate labor values from input-output tables (see Part 1, “Labor Values Predict Market Prices,” for the methodology using the Leontief inverse).
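The iteration itself fits in a few lines of Python. The three-good technology matrix and final demand below are hypothetical, chosen only so the contraction is visible; they are not the Cockshott-Cottrell figures.

```python
# Hypothetical 3-good economy. A[i][j] is the amount of good j consumed
# per unit of good i produced; d is the final demand voted for.

A = [
    [0.0, 0.2, 0.1],  # 1 unit of good 0 uses 0.2 of good 1, 0.1 of good 2
    [0.3, 0.0, 0.2],
    [0.1, 0.1, 0.0],
]
d = [100.0, 50.0, 80.0]

def one_pass(x):
    """Feed the current gross-output estimate back through the
    production table: new estimate = final demand + inputs used up."""
    n = len(d)
    return [d[i] + sum(A[j][i] * x[j] for j in range(n)) for i in range(n)]

x = d[:]  # initial estimate: final demand itself
for _ in range(100):
    x = one_pass(x)

# The converged vector is a fixed point: another pass no longer moves it,
# and gross output exceeds final demand by exactly the intermediate use.
assert all(abs(x[i] - one_pass(x)[i]) < 1e-6 for i in range(3))
assert all(x[i] > d[i] for i in range(3))
```

Because this toy economy produces a net surplus (each column of input coefficients sums to well under 1), each pass shrinks the remaining error by a constant factor, which is the contractive property the text describes.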

The algorithm was tested on model economies ranging from one thousand to one million products on a commodity personal computer circa 2004; even a continental-scale economy of ten million products, which Nove (1983) considered too large to plan, would require approximately an hour on a single 64-bit processor of 2006 vintage and could be parallelized on a cluster of networked PCs to under ten minutes. This is trivial by contemporary standards.

In accountable planning, the planning bureau takes the final demand vector set by the popular vote, feeds it into the input-output computation, and produces the balanced production targets that the government sector then executes.

What about the neoclassical objection that economic equilibrium is too complex to compute? Here two very different concepts of equilibrium must be distinguished. Neoclassical general equilibrium, the kind formalized by Arrow and Debreu, is a mechanical equilibrium, a unique point in the space of all possible economic configurations where every market clears simultaneously and no agent can improve their position. Two independent lines of research have shown that finding such an equilibrium is computationally intractable.

Deng and Huang (2006) proved that finding a market equilibrium that maximizes social welfare in a Leontief exchange economy is NP-hard (Information Processing Letters, Vol. 97, No. 1, pp. 4–11). A Leontief exchange economy is one in which each agent’s preferences are of a fixed-proportions type: the agent wants goods in specific ratios (one unit of bread requires exactly two units of butter, say) and derives no benefit from extra units of one good without corresponding units of the others. This is the simplest realistic model of production inputs, where components must be combined in fixed proportions, and it is the natural setting for studying the complexity of equilibrium computation. NP-hard is a term from computational complexity theory designating problems at least as hard as every problem in the class NP; a polynomial-time algorithm for any NP-hard problem would imply P = NP, one of the deepest open problems in mathematics. The computational cost of NP-hard problems is believed to grow exponentially with the size of the input: a problem with n variables may require on the order of 2^n operations to solve, meaning that even modest increases in problem size cause the computation to explode beyond any practical capacity. What Deng and Huang showed is that even for economies guaranteed to have an equilibrium, selecting the best one (the one that maximizes total welfare) is a problem in this intractable class. Cockshott, Cottrell, Michaelson, Wright, and Yakovenko discuss this result in Classical Econophysics (Routledge, 2009, Chapter 15).
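The practical force of 2^n growth is easy to see with arithmetic. The machine speed assumed below (10^9 operations per second) is an illustrative assumption, not an empirical claim about any particular hardware.

```python
# Why exponential cost is fatal: at 2**n operations, a millionfold
# speedup buys only about 20 extra variables. Figures are arithmetic.

def ops(n: int) -> int:
    """Order-of-magnitude operation count for a 2**n algorithm."""
    return 2 ** n

# 20 additional variables multiply the work by 2**20, about a million:
assert ops(70) == (2 ** 20) * ops(50)

# At a hypothetical 1e9 operations per second, n = 50 already takes
# over a million seconds (roughly two weeks):
seconds = ops(50) / 1e9
assert seconds > 1_000_000
```

This is why linear growth in computational resources (more people, more processors) cannot keep pace with exponential growth in the cost of the problem.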

The broader problem of computing any market equilibrium at all, without the welfare-maximization requirement, falls into a different complexity class, PPAD (Polynomial Parity Arguments on Directed graphs). PPAD-completeness was established for computing Nash equilibria by Daskalakis, Goldberg, and Papadimitriou (2006), who proved it for games with three or more players, and extended to two-player games by Chen and Deng (2006). The connection to market equilibria in Leontief economies runs through the equivalence between Leontief exchange and two-player games demonstrated by Codenotti, Saberi, Varadarajan, and Ye (2006). PPAD-complete problems are believed to be intractable (no polynomial-time algorithm is known, and finding one would collapse the complexity class), but they differ from NP-hard problems in a specific way: PPAD guarantees that a solution exists (by a parity argument on a directed graph), so the difficulty lies not in determining whether a solution exists but in finding it. The practical implication is the same: even when an equilibrium is guaranteed to exist, no known algorithm can find it in time that grows polynomially with the size of the economy.

Both results support the same conclusion. However, this knife cuts both ways. If the problem is intractable, then no collection of millions of individuals interacting via the market can solve it either. For the NP-hard welfare-maximization problem, the cost grows exponentially with the number of economic actors. Computational resources grow linearly with n, but the cost of the computation grows as 2^n. For the PPAD-complete problem of finding any equilibrium at all, no polynomial-time algorithm is known, and the market has no structural advantage over a computer in searching for a solution whose existence is guaranteed but whose location is computationally elusive. Either way, the market cannot find its own neoclassical equilibrium any more than a planning computer can.

The neoclassical equilibrium is a mirage. No real economy, whether planned or market-driven, has ever operated at a point of Arrow-Debreu general equilibrium. No economy could, because the computation required to find such a point exceeds the computational capacity of the economy itself.

The economic equilibrium under discussion is statistical equilibrium. It’s not a single point where the economy sits motionless. Rather, a stable probability distribution over possible states, analogous to the way a gas in a container has a stable distribution of molecular velocities even though individual molecules are constantly bouncing around (see Part 1, “Why the Regularity Holds: Statistical Mechanics,” for the derivation from conservation laws). Individual firms and workers change their positions constantly, but the overall distribution, the shape of the income distribution, the dispersion of profit rates, the relationship between labor values and prices, remains stable. This equilibrium is computationally tractable for both markets and planners. Economic planning does not have to solve the impossible problem of neoclassical equilibrium. It has to apply the law of value more efficiently than the market does.

Hayek was also wrong about the sufficiency of prices as information carriers. Prices are a lossy compression of the economy’s structure. The full input-output table of an economy with n products contains n-squared entries. Prices compress this to a vector of n numbers. In information-theoretic terms, if H_I denotes the entropy of interconnection encoded in the input-output table and H_P the entropy of the price vector, then H_P grows only as the square root of H_I (Cockshott et al., 2009, Section 15.1.1).

Entropy here is used in the information-theoretic sense established by Shannon, a measure of the amount of information contained in a data structure. Prices contain far less information than the production structure they are supposed to regulate. See Part 1, “Information Theory and Thermodynamics,” for the relationship between Shannon entropy and physical entropy.

The authors note that this treatment somewhat overestimates the true entropy of interconnection, but the fundamental point stands: the reduction in information is very substantial. Prices in themselves provide adequate knowledge for rational calculation only if they are at their long-run equilibrium levels, and by Hayek’s own account they never are. A firm that sees the price of tin rise cannot tell from the price alone whether the increase is temporary (a strike) or permanent (exhaustion of reserves), yet the rational response differs in the two cases. Hayek’s own trade cycle theory admits that disequilibrium prices cause malinvestment. Nor are prices the only information channel in an economy. Actual orders for commodities, specified in quantities, carry information that prices do not, and a firm that only watches prices and ignores order volumes will not survive long. A planning system that tracks both quantities and labor costs operates with more information, not less, than a system that relies on prices alone.
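The square-root scaling can be illustrated with a crude entry count. Assuming (as an illustration only, not a result from the source) a fixed b bits per entry, the table holds n²·b bits and the price vector n·b bits, so H_P = √(H_I · b): square-root growth up to a constant.

```python
import math

# Crude information count, assuming a hypothetical b bits per entry.

def table_bits(n: int, b: int) -> int:
    """Entropy of interconnection: n**2 entries in the I/O table."""
    return n * n * b

def price_bits(n: int, b: int) -> int:
    """Entropy of the price vector: n entries."""
    return n * b

n, b = 10_000, 32  # hypothetical: 10,000 products, 32 bits per entry

H_I = table_bits(n, b)
H_P = price_bits(n, b)

# Under this counting, H_P equals sqrt(H_I * b): square-root growth.
assert H_P == math.sqrt(H_I * b)

# At this scale, prices discard all but one ten-thousandth of the
# structural information in the table:
assert H_I // H_P == n
```

The constant factor depends on how finely each entry is encoded, but the n-versus-n² gap, and hence the square-root relationship, does not.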

Ian Wright’s agent-based simulations (Review of Political Economy, 2008; developed further in Classical Econophysics, Chapter 9) provide theoretical support for the division of labor between center and periphery that this proposal adopts. In Wright’s models (described in Part 1, “Why the Regularity Holds: Statistical Mechanics”), when many agents trade subject to budget and production constraints, prices gravitate toward labor values without any agent calculating them. Deviations from labor values function as error signals that reallocate social labor across sectors. Accountable planning can exploit the same statistical regularities, setting the broad parameters (what to produce and in what quantities) while allowing local agents to find efficient production methods within those parameters.

  4. Preventing Corruption: A Divided Government

Over time, vanguard parties tend to transform into a labor aristocracy. As Milovan Djilas argued in The New Class (1957), party members stepped into the role of a ruling class, exercising collective political control over the means of production as a new form of monopoly ownership. Mikhail Voslensky documented the system in Nomenklatura: The Soviet Ruling Class (1984), estimating the core nomenklatura at roughly 750,000 top officials, and with their families at approximately three million people, less than 1.5% of a population of over two hundred million. This is structural. A party that holds a monopoly on political power has no mechanism by which the working class can hold it accountable.

The solution is not to install a trustworthy party. The solution is to structure the government so that no faction within it can coordinate against the people. The government is run by secretaries it has hired. The secretaries are required by law to poll the people on their demands and then enact their will. They are not members of one party. This fragmentation prevents opportunists from coordinating to subvert the government’s mandate. If the secretaries fail to follow the law, they are fired.

This government is divided within itself, divided in its interests, its personnel, its institutional loyalties, so that it cannot conspire against the people. The state must be weak so that the masses can be powerful.

  5. Labor Vouchers and the Price Mechanism

Government workers are paid in labor vouchers. The critical distinction between labor vouchers and money is that labor vouchers do not circulate. As Marx argued in the Critique of the Gotha Programme (1875), a worker receives from society a certificate of the labor contributed, and with that certificate withdraws from the social stock of consumer goods a quantity whose production cost the same amount of labor.

The voucher records an individual’s contribution to a social process and is cancelled upon use. It cannot be lent, invested, or accumulated. This eliminates the possibility of capital accumulation through the voucher system.

Under capitalism, money circulates. It passes from buyer to seller and the seller spends it again, enabling the accumulation circuit M–C–M’, described in Part 1. Labor vouchers break this circuit by design: they are issued for work, exchanged once for goods, and then destroyed.

The non-circulation principle applies to consumer labor vouchers, the tokens issued to individual workers and redeemed at public stores. The inter-enterprise accounting described in Section 6 below, in which labor-time costs are debited and credited between production units, is a different mechanism: entries in a common ledger that track the flow of embodied labor through the production process, not the circulation of tokens between holders. The distinction is between labor-time as a unit of account (used in enterprise-level planning) and labor vouchers as a medium of individual remuneration (issued to workers and cancelled upon redemption). The former is an accounting identity; the latter is a non-transferable certificate. The two operate in different registers.

The real purchasing power of labor vouchers depends on the quantity of goods available for sale, not the denomination of the vouchers themselves. This proposal decrees neither fixed salaries nor fixed prices. It decrees increased production and sale of the goods that voters demand. If supply increases, prices naturally fall.

A sustained, economy-wide decline in prices is called deflation. At first glance, falling prices sound like a good thing: the same money buys more. But under capitalism, deflation is catastrophic, and understanding why is essential to understanding how this proposal differs.

In a capitalist economy, nearly all production is financed by borrowing. A firm takes out a loan to buy equipment, hires workers, produces goods, sells them, and repays the loan with interest out of the revenue. The loan is denominated in nominal terms. If a firm borrows $100,000, it owes $100,000 plus interest regardless of what happens to prices. When prices fall, the firm’s revenue declines (it sells the same goods for less money), but the debt stays the same. The real burden of debt, the amount of actual goods and labor required to repay it, increases.
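The mechanics of that paragraph reduce to arithmetic. The loan size, output, and the 10 percent price decline below are illustrative assumptions.

```python
# Debt-deflation arithmetic with hypothetical figures.

debt = 100_000.0        # nominal loan, fixed regardless of prices
units_sold = 1_000      # the firm's physical output is unchanged
price_before = 120.0
price_after = 108.0     # a 10 percent economy-wide price decline

revenue_before = units_sold * price_before   # 120,000
revenue_after = units_sold * price_after     # 108,000

# Real debt burden: how many units of output it takes to repay the loan.
real_burden_before = debt / price_before     # ~833 units
real_burden_after = debt / price_after       # ~926 units

# Revenue falls with prices, but the debt weighs more in real terms:
assert revenue_after < revenue_before
assert real_burden_after > real_burden_before
```

Selling the same physical output, the firm now owes roughly 93 more units’ worth of production than before prices fell, which is the pressure that drives the distress selling Fisher described.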

The economist Irving Fisher formalized this in 1933 as the debt-deflation theory. When indebted firms and households find their revenues falling while their debts remain fixed, they are forced to sell assets to meet their obligations. But mass selling drives asset prices down further, which increases the real debt burden further, which forces more selling. The result is a self-reinforcing spiral. Falling prices cause insolvency, insolvency causes distress selling, distress selling causes further price declines. Fisher developed the theory to explain the Great Depression, during which this spiral drove mass bankruptcy, unemployment, and a contraction in output that persisted until government intervention broke the cycle.

There is a second mechanism. When consumers expect prices to keep falling, they delay purchases. Why buy today what will be cheaper tomorrow? This individually rational behavior collectively reduces demand, which pushes prices down further, which reinforces the expectation of future declines. Businesses, facing declining sales and rising real debt burdens, cut wages, lay off workers, and halt investment. Production shuts down precisely when people most need it to continue. Japan’s experience from the 1990s onward illustrates the pattern. After an asset bubble burst in 1991, the economy entered a prolonged deflationary period in which nominal GDP in 2001 was approximately what it had been in 1995, real wages fell, and businesses hoarded cash rather than investing, preferring the guaranteed real return of holding money whose purchasing power was rising. The stagnation lasted, by some measures, three decades.

Inflation is the mirror-image. When the money supply expands faster than the real output of the economy, or when production costs rise across sectors simultaneously, prices rise. The mechanisms are multiple and reinforcing. Cost-push inflation occurs when the price of a key input (energy, raw materials, labor) rises, and producers pass the increase through to final prices, which in turn raises the cost of living, which generates pressure for wage increases, which raises costs further. Demand-pull inflation occurs when aggregate spending exceeds the economy’s productive capacity. Too much money chasing too few goods.

Monetary inflation occurs when the money supply expands through credit creation or central bank intervention without a corresponding expansion in real output, diluting the purchasing power of existing money (see Part 1, “Money Creation as Redistribution,” for the conservation-law analysis of this process). In practice, all three mechanisms operate simultaneously and interact. The stagflation of the 1970s, in which high inflation coincided with stagnant output and rising unemployment, demonstrated that cost-push and monetary factors can produce sustained price increases even in the absence of excess demand, confounding the simple Phillips curve trade-off between inflation and unemployment that had guided postwar policy.

The effect on workers is a real pay cut. Nominal wages may rise, but if they rise more slowly than prices, the worker commands less real output per hour worked. The effect on debtors is the opposite of deflation. The real burden of fixed-rate debt falls, because the debt is repaid in money whose purchasing power has declined. This is why moderate inflation benefits borrowers (including the government, which is typically the largest debtor) at the expense of savers and creditors, and why deflation benefits creditors at the expense of borrowers. The asymmetry between who benefits and who loses under each scenario is a distributional question, hence the political character of monetary policy.

Capitalist economies have developed an extensive institutional apparatus to prevent deflation. Since the 1990s, most major central banks have adopted explicit inflation targets, typically around 2 percent per year. The target is set above zero precisely to maintain a buffer against deflation. If inflation is already at 2 percent and the economy enters a downturn, the central bank can cut interest rates to stimulate borrowing and spending, tolerating a temporary decline toward zero inflation without entering deflationary territory. The 2 percent figure was first adopted by New Zealand’s Reserve Bank in 1990 and subsequently became the standard for the Bank of England, the European Central Bank, and (officially, from 2012) the US Federal Reserve.

In normal times, the central bank lowers short-term interest rates to make borrowing cheaper, encouraging firms to invest and consumers to spend rather than save. When interest rates hit zero and deflation still threatens (the situation Fisher feared and Japan experienced) central banks resort to unconventional measures. Quantitative easing (large-scale purchases of government bonds and other assets to inject money into the financial system), forward guidance (public commitments to keep rates low for extended periods), and in some cases negative interest rates on bank reserves.

The entire apparatus exists because deflation, under capitalism, means the M–C–M’ circuit breaks down. A capitalist invests money (M) to produce a commodity (C) and sell it for more money (M’), but prices fall so that M’ is less than M, the circuit yields a loss. The rational response is to stop investing. Hold money instead, since its purchasing power is rising. When enough capitalists make this individually rational choice, production halts, workers are laid off, and output collapses.

The standard policy responses are inflation targeting, interest rate manipulation, quantitative easing, etc. These are attempts to prevent prices from falling in the first place, so that the M–C–M’ circuit remains profitable and production continues. But this means that under capitalism, continued production depends on the maintenance of a positive inflation rate. Goods must keep getting slightly more expensive, year after year, so that the returns on investment remain positive and firms keep producing. The system requires that the purchasing power of money be deliberately eroded as a condition of its own stability. (The redistributive consequences of these monetary interventions — the fact that quantitative easing inflates asset prices and benefits asset-holders at the expense of wage-earners — are discussed in Part 1, “Money Creation as Redistribution.”)

This creates a political contradiction that capitalist democracies have never resolved. Voters hate inflation. They experience it not as an abstract macroeconomic variable but as a concrete decline in their standard of living. The same paycheck buys less food, less housing, less medicine. Polling consistently shows that rising prices rank among voters’ top concerns, often above unemployment, and incumbents who preside over inflation are punished at the ballot box regardless of whether the inflation was caused by their policies.

For workers whose wages do not keep pace with prices, inflation is a real pay cut delivered invisibly, without any employer announcing it. But the system cannot function without it. Central banks target 2 percent inflation not because voters want prices to rise 2 percent per year but because the alternative, stable or falling prices, would break the profit-seeking circuit that keeps capitalist production going. The result is that the institutional apparatus of capitalism is structurally committed to making voters’ money worth less every year, and voters are structurally committed to resenting it.

When inflation spikes above the gentle 2 percent background rate, as it did in the 1970s, and again in the early 2020s, the resentment intensifies. The psychology of what happens next is well studied.

Terror Management Theory, developed by Greenberg, Pyszczynski, and Solomon (1986) from the work of cultural anthropologist Ernest Becker (The Denial of Death, 1973), proposes that human beings manage the existential terror of mortality by investing in cultural worldviews that give life meaning and in self-esteem that makes them feel like valued participants in that meaningful world. When these psychological structures are threatened, people respond by clinging more intensely to whatever promises to restore a sense of order, significance, and protection. The experimental evidence is extensive. In the standard paradigm, participants who are asked to think about their own death (a manipulation called “mortality salience”) subsequently show intensified defense of their cultural worldview: harsher judgments of those who violate cultural norms, more positive evaluations of those who uphold them, increased prejudice against outgroups, and increased aggression toward those perceived as threatening. A meta-analysis of 277 experiments across 164 articles found moderate and consistent effects (Burke, Martens, and Faucher, 2010).

Critically for politics, Cohen, Solomon, Maxfield, Pyszczynski, and Greenberg (2004) found that mortality salience specifically increased support for charismatic leaders, those who articulate a bold, emotionally compelling vision of collective greatness, over both task-oriented leaders (who emphasize pragmatic planning) and relationship-oriented leaders (who emphasize compassion and trust). The same research group found that mortality salience shifted support toward George W. Bush and away from John Kerry in the 2004 presidential election (Landau, Solomon, Greenberg, et al., 2004). A subsequent study found that mortality salience increased Americans’ support for Donald Trump (Cohen, Solomon, et al., 2017). A meta-analysis specifically examining mortality salience effects on political attitudes (Burke, Kosloff, and Landau, 2013) found support for both a “worldview defense” effect (people cling harder to their existing ideology) and a “conservative shift” effect (people move toward authoritarian and system-justifying positions regardless of their prior ideology), with the worldview-defense effect somewhat stronger overall.

Economic insecurity functions as an existential threat in the same register as mortality. When a worker’s purchasing power declines and they cannot provide for their family at the standard they expected, the cultural worldview that gave their life meaning is destabilized. The self-esteem derived from being a competent provider is undermined. The resulting anxiety activates the same defensive responses that mortality salience produces in the laboratory: intensified in-group loyalty, hostility toward perceived outsiders, and above all a hunger for charismatic leaders who promise to restore the threatened worldview by force of will. Demagogues do not need to understand TMT to exploit it. They need only name the threat (immigrants, foreign competition, elite corruption), promise decisive action, and project the kind of bold, unyielding confidence that mortality-salient individuals find irresistible. The pattern is visible in the stagflation of the 1970s, which powered the rise of Thatcher and Reagan; in the post-2008 austerity period, which fueled nationalist movements across Europe; and in the inflation spike of the early 2020s, which provided political fuel for authoritarian populism in multiple democracies simultaneously.

If capitalism structurally requires inflation, and inflation structurally produces a specific kind of political vulnerability, then the political instability of capitalist democracies is not a contingent failure of leadership. It’s a predictable consequence of the system’s own operating requirements. The inflation that the system needs to survive is the same inflation that erodes public trust in the system’s legitimacy. Capitalism does not just produce economic instability. It produces the precise form of existential anxiety that makes democratic populations vulnerable to the leaders least likely to govern in their interest.

There is no M–C–M’ circuit under accountable planning. Production is not financed by debt and does not need to generate a monetary return. It is driven by popular demand and funded by the direct allocation of labor and resources. When prices fall because goods are abundant, no debt spiral is triggered, because the government sector does not borrow to produce. No firm faces insolvency from declining revenue, because there is no revenue target to meet. Workers are not laid off when prices fall, because their employment depends on the democratic mandate. Deflation does not trigger deferred consumption, because the goods are already being produced in the quantities the population voted for. The purchasing power of labor vouchers rises as goods become abundant, and the standard of living rises with it.
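The rising purchasing power of vouchers falls directly out of labor-content pricing. A toy illustration with hypothetical numbers: when productivity cuts the hours embodied in a good, the same voucher buys more of it.

```python
def units_per_voucher(voucher_hours: float, labor_content: float) -> float:
    """Purchasing power of a voucher when goods are priced at labor content."""
    return voucher_hours / labor_content

# An 8-hour voucher against a good embodying 2.0 hours per unit buys 4 units.
before = units_per_voucher(8, 2.0)

# A productivity improvement cuts the labor content to 1.6 hours per unit;
# the identical voucher now buys 5 units. No monetary deflation spiral is
# possible, because the "price" is just the recorded labor content.
after = units_per_voucher(8, 1.6)
```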

Because workers’ purchasing power rises rather than erodes, the existential anxiety that drives populations toward authoritarian leaders is defused.

  1. Coordination of Intermediate Production

Multiple projects that require the same intermediate products should pool resources. For example, all construction projects requiring steel bars in a region could collectively support a dedicated facility that produces high-quality steel and delivers it to its supporters. A housing development, a bridge repair, and a hospital expansion all need structural steel. Rather than each project independently negotiating with suppliers or attempting to produce its own steel, they contribute a portion of their labor budgets to a shared steelworks. The steelworks specializes, achieves economies of scale, and delivers to all its supporters at cost denominated in labor time.
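The pooling arrangement can be sketched in a few lines. This is a hypothetical illustration, assuming contributions and deliveries are both denominated in labor hours; the project names and figures are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Steelworks:
    labor_per_tonne: float                       # hours embodied per tonne of steel
    budgets: dict = field(default_factory=dict)  # supporter -> contributed hours

    def contribute(self, project: str, hours: float) -> None:
        """A supporting project pools part of its labor budget here."""
        self.budgets[project] = self.budgets.get(project, 0.0) + hours

    def deliver(self, project: str, tonnes: float) -> float:
        """Debit the project's pooled budget by the labor content of the steel."""
        cost = tonnes * self.labor_per_tonne
        if cost > self.budgets.get(project, 0.0):
            raise ValueError("insufficient pooled labor budget")
        self.budgets[project] -= cost
        return cost

works = Steelworks(labor_per_tonne=5.0)
works.contribute("housing", 400.0)   # housing development pools 400 hours
works.contribute("bridge", 150.0)    # bridge repair pools 150 hours
works.deliver("housing", 60.0)       # 60 tonnes debits 300 hours at cost
```

A project that finds a cheaper supplier simply stops contributing, which is the exit mechanism described below.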

This is voluntary coordination among producers. If a project finds a better supplier or a more efficient method, it is free to redirect its support. The steelworks that loses supporters must improve or close.

The accounting mechanism is labor-time cost. When the steelworks delivers steel to a construction project, the labor content of that steel (direct labor at the steelworks plus the indirect labor embodied in the raw materials it consumed) is debited from the construction project’s labor budget and credited to the steelworks. There is no profit margin. The cost of intermediate goods reflects only the labor required to produce them. This is the same quantity that Cockshott and Cottrell’s methodology computes when calculating the Leontief inverse (see Part 1, “Labor Values Predict Market Prices,” for the methodology), and the same quantity that empirically predicts market prices. As noted in Section 5, these debits and credits are a mechanism for tracking embodied labor through the production chain, not the circulation of labor vouchers between holders.
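The direct-plus-indirect labor content is the standard Leontief calculation, v = l(I − A)⁻¹, where A is the input-output matrix and l the vector of direct labor coefficients. A sketch with a toy two-sector table (the numbers are illustrative, not real data):

```python
import numpy as np

# Toy input-output table: A[i, j] = units of good i consumed per unit of good j.
# Sector 0 is steel, sector 1 is construction.
A = np.array([[0.2, 0.3],    # steel used per unit of steel, of construction
              [0.0, 0.1]])   # construction used per unit of each
l = np.array([4.0, 6.0])     # direct labor hours per unit of output

# Total (direct + indirect) labor content: v = l (I - A)^-1.
v = l @ np.linalg.inv(np.eye(2) - A)

# v[0] is the full labor cost of one tonne of steel, including the steel
# consumed in its own production; this is the quantity debited when the
# steelworks delivers to a construction project.
```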

This structure also provides a natural mechanism for identifying inefficiency. If one facility’s labor costs per unit are higher than those of a comparable facility, the projects it serves have a direct incentive to switch. The consumers of intermediate goods perform this function automatically, because they are working within fixed labor budgets.

  1. Assignment of the Means of Production

Let’s say a group of applicants want farming jobs. They are assigned land by the government. Their work is subsidized as long as the people vote for agricultural production. If they mismanage the land, it is reassigned to other applicants.

If there is persistent failure to produce, damage to the land, or failure to deliver agreed-upon outputs, the occupants can be accused of mismanagement. Accusations are resolved through legal challenges with reasonable evidential standards. General legal guidelines define what counts as mismanagement.

This is a legal process, not an administrative one. The aim is to prevent both the arbitrary seizure of productive land and the indefinite occupation of land by those who fail to use it.

This system provides a permanent alternative to wage labor. Anyone willing to farm can apply for land and receive government support. This proposal removes the means of production from the commodity circuit.

  1. The Capitalist Remainder

This proposal does not seek to ban competition or market exchange. The problem is not competition or individual initiative. It’s the profit motive and its associated ills: the instability caused by the business cycle, the falling rate of profit, the distortion of demand by purchasing power, and the concentration of ownership that these forces produce. Just as a minimal market existed before capitalism, a minimal market will exist after capitalism.

Every economy that has attempted to abolish market exchange entirely has generated an informal sector. Gregory Grossman coined the term “second economy” for the Soviet Union’s informal sector. In a 1977 article in Problems of Communism, he defined it as all economic activity conducted for direct private gain or in knowing contravention of existing law. The Berkeley-Duke emigre household budget survey estimated that approximately 28 to 33 percent of urban household income in the late 1970s derived from second-economy activity (Grossman, 1987, using data from the survey administered to 1,061 emigre households).

Society is too divided for total collectivization. It is better strategy to target the principal actors than to criminalize every person who participates in exchange. Collectivization should be carried out, but only as necessary. It should not try to encompass the whole economy.

The government does not concern itself with private producers unless they obstruct the fulfillment of popular demand. Private businesses may exist. The labor voucher system is the sole unit of account in the government sector and operates independently of whatever medium of exchange the remaining private sector adopts. Within the government sector there is no “currency” in the conventional sense, because labor vouchers do not circulate. They are issued for work performed and cancelled upon redemption. The private sector may use whatever medium of exchange it develops, but this has no standing within the government system.

Taking means of production from private owners is triggered by one of two conditions:

a) The people have voted for goods whose production requires means currently in private hands and those goods cannot otherwise be supplied.

b) A private owner has accumulated productive assets beyond the threshold at which continued accumulation threatens democratic governance.

The first trigger (unmet popular demand) ensures that private ownership cannot block essential production. The second (an asset-accumulation threshold) ensures that no private actor amasses enough economic power to mount a serious political challenge. Under this proposal, businesses are explicitly told that if their growth becomes too large, the means of production will be seized. Stripped of the hope of potentially infinite growth, the businesses that survive under these conditions remain marginal to the economy.

When either condition is triggered, the confiscation is an administrative and legal process analogous to eminent domain or a regulatory fine, not a criminal sanction. It is carried out by authorized agents of the government acting under legal authority.

“Cannot otherwise be supplied” has an operational definition along these lines: the people have voted for a good; the government has issued production contracts or subsidized production in the relevant sector; a statutory period has elapsed; and the quantity produced still falls below the target by a specified margin. The determination is made by a court on the basis of publicly auditable production data.
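That determination reduces to a mechanical check. A hypothetical sketch, assuming the margin and statutory period are set by statute (the threshold values below are invented for illustration):

```python
def supply_trigger(produced: float, target: float, shortfall_margin: float,
                   days_elapsed: int, statutory_days: int) -> bool:
    """True if the statutory period has elapsed and production still falls
    below the voted target by more than the allowed margin."""
    if days_elapsed < statutory_days:
        return False
    return produced < target * (1.0 - shortfall_margin)

# Target of 10,000 units, 20% allowed shortfall, 365-day statutory period.
assert supply_trigger(7000, 10000, 0.20, 400, 365)       # 30% short: triggered
assert not supply_trigger(8500, 10000, 0.20, 400, 365)   # within margin
assert not supply_trigger(7000, 10000, 0.20, 200, 365)   # period not yet elapsed
```

Because the inputs are publicly auditable production figures, the court's finding can be independently recomputed by anyone.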

Private-sector capacity can be assessed through the same input-output data the proposal already relies on for labor-value calculations. If Cockshott and Cottrell’s methods can compute labor values from national input-output tables, the same tables reveal sectoral output and capacity. The measurement problem is real but not different in kind from the measurement problems the proposal already accepts as tractable.

Since collectivization is not total, some M–C–M’ remains in the economy. Even if marginal, speculation does extract positive surplus value. If private producers are operating for profit using wage labor, then value relations persist in that sector, and the labor provided there has the character of wages; it is indirectly social.

“Indirectly social” means that the labor is validated as socially useful only after the fact, through the sale of the commodity on the market. If the product does not sell, the labor was wasted. It never becomes part of the social total. By contrast, labor in the government sector is “directly social”. It is recognized as socially useful at the point of production, as the production plan itself was set by popular vote. The distinction matters because Marx argued that labor vouchers presuppose directly social labor (see Section 2 of Part 4).

The capitalist remainder is a genuine remnant of the old mode of production, operating with money, wages, profits, and indirectly social labor. However, it is under total control of a democratic authority that can prune it as necessary.

The factor that keeps workers trapped in exploitative jobs under capitalism is the coercive structure of the labor market. Workers must sell their labor-power to whoever will buy it because the alternative is destitution. The government sector provides a permanent exit from that dependency. Capitalists who remain in operation must offer conditions attractive enough to compete with the government sector, or they will have no workers. This breaks the power of private employers and inverts the power relation that prevails under capitalism.
