These days it’s hard not to despair when we look to the future of our food supply.
Headlines tell of soaring temperatures and supersized storms, of endless droughts and aquifers that sink by the day, of seas saturated with plastic and fisheries at or near collapse.
And yet most of us retain our faith in our power, as a society, to master the challenge of feeding the people of the world. Whatever comes our way, we believe, someone somewhere will think up a solution that allows us to survive, even thrive.
Nick Thompson captured this delicate balance recently in an otherwise doleful piece in The New Yorker. Musing on the “terrible news” that the concentration of carbon dioxide in the atmosphere recently passed 400 parts per million, a level that augurs further rapid climate change, Thompson concluded that humans will ultimately have to “invent our way out” of the crisis.
Given humanity’s proven ability to master new technologies, such faith is not unreasonable. Over the last two centuries, we have harnessed electricity and nuclear power, revolutionized the speed at which we travel, entirely rethought how we share and manipulate information, and learned to engineer the basic materials of life.
The question we should be asking, however, is not whether we have the technical smarts to ensure we all have enough to eat. It’s whether we have the political smarts to protect those among us—the scientists and engineers and entrepreneurs—who will think up the better ideas and better ways and put them to use.
The fundamental danger we face is simple. In recent years a few private companies have captured almost complete control over many vital technological realms, and the managers of many of these firms increasingly have an incentive to manipulate technological advancement in a way that serves their private interests only. As a result we find our society directed down fewer and fewer technological pathways, hence toward fewer and fewer potential futures.
In the late 19th century, when Americans first came face to face with immense private monopoly, it was easy to see how giants could block alternative pathways to a market. When independent oil drillers in Pennsylvania tried to build a pipeline to the coast, for instance, Standard Oil bought up long strips of land—often called “dead lines”—to cut them off.
Nowadays most such battles take place in more virtual realms. In the pharmaceutical industry, GlaxoSmithKline and Pfizer routinely pay rivals not to manufacture generic versions of profitable drugs. In semiconductors, trustbusters in Europe, Japan, and South Korea all concluded in recent years that Intel was using secret arrangements to prevent its customers from buying from competitors. In software, the fashion today is simply to buy up one’s rivals; Oracle has purchased more than 70 competitors just since 2005.
Nowhere do we see more “dead lines” cutting across our future than in agriculture and food.
By now most of us know the chemical company Monsanto dominates immense swaths of the seed business, with its genetic traits in some 90 percent of our soybean crop and 80 percent of our corn. Less well-known is that Monsanto buttresses this awesome control through cross-licensing arrangements that, as a Food & Water Watch report details, intimately interweave its interests with its “rivals,” as it did with DuPont last March. The practical result? Where many thousands of farmers, small seed companies, and university scientists once worked to develop stronger seeds, we see but a handful of giants. And because these firms now share the profits of most innovations, they have less real incentive on any given day to risk investing in what is new.
Much the same is true among the giant corporations that butcher and package our cows, hogs, and chickens. In the past Smithfield and other giant processors merely enjoyed power over the farmer and consumer. Now they also rule over the genetic material of the animals themselves, which they claim is a form of “intellectual property,” as they noted in congressional hearings earlier this month. Their power is so complete that the companies that trade in animals bred through traditional, open methods—like Niman Ranch—find it ever harder to survive.
The swelling power of the trading companies that collect the grains of the farmer and distribute food to the citizen further speeds such technological pruning and intellectual simplification. When giant grain-trading companies buy up traditional transport and storage facilities, as ADM is now doing with GrainCorp, higher transport and storage fees can threaten smaller farmers, along with their accumulated knowledge of seed, soils, and climates. When Wal-Mart cuts what it pays its suppliers, those suppliers often respond, as Charles Fishman detailed in The Wal-Mart Effect, by cutting quality, variety, and what they invest in new and better products. They also often respond by merging with one another, which means both fewer pathways to the market and less competitive pressure to introduce and test new ideas.
The effects of such gigantism on technological advance can last a long time. The Austrian economist Joseph Schumpeter in 1942 famously wrote of how the fear of “potential competition” can keep even a complete monopolist on its toes. But in the real world, such “potential” is realized only rarely. Instead, we see processes and products that remain the same over very long periods of time, as the companies that control these systems choose to invest instead in buying up and blocking off what is new, even if better.
In the 20th century, both Democrats and Republicans routinely used antitrust law to drive competition in some of our most important technological realms—including electronics and communications. The result was a series of fantastic technological advances. Yet the trustbusters did not always succeed. In 1911 they lost a case against U.S. Steel. Seventy-five years later, that company still relied on the same old furnaces even as foreign companies in more competitive markets developed vastly superior technologies. In 1926 the trustbusters failed to force GE to open up the light bulb market. As a result, we continued to use notoriously wasteful 19th-century tungsten filament technology well into the 21st century.
Before we can ever effectively address concentration and its effects on technological advance—and the chance to create a better, perhaps sustainable world—we must first overcome the widespread misapprehension that competition is wasteful, hence that in regulating our political economy we must aim foremost at efficiency.
The idea is not a new one. In America we can trace it to men like John D. Rockefeller and J.P. Morgan, who justified their predations in large part by claiming they were making business more “efficient.” Such thinking was soon adopted—and formalized—by many economists and other “experts.” A century ago in America, many promoters of “scientific management” of production truly believed there was a “one best way” to accomplish every task.
But the idea did not win wide acceptance. Classic liberals like Woodrow Wilson, Franklin Roosevelt, and Dwight Eisenhower believed strongly in the need to promote competition through the distribution of power—for both political and economic reasons. One practical result was very rapid technological advance in those sectors where antitrust was enforced. For instance, the great industrial historian Alfred Chandler concluded that it was the trustbusters who “set the stage” for the Silicon Valley scientists who invented the “electronic century.”
In the 1960s, however, the idea of efficiency surged back into fashion on both wings of the political spectrum, especially among followers of leftist economist John Kenneth Galbraith and the right-wing “Chicago school” of economics. The argument was essentially the same one put forward by Rockefeller a century earlier: The bigger a firm, the greater its ability to drive down prices and serve the “consumer.” But this time, with the backing of the “consumer movement” forged in the 1960s, the advocates of efficiency at last achieved the revolutionary change they sought. The key to their political victories? The argument that enforcement of antitrust law should be oriented around a new concept—that of “consumer welfare.”
In the three decades since the “consumer welfare” test was built into our anti-monopoly laws, Americans have witnessed perhaps the greatest roll-up of power ever, one that has remade almost every sector of the U.S. economy. And the basic thinking remains largely unchanged, even after the Wall Street crash of 2008 revealed some of the structural dangers posed by concentration and even after the Tea Party and Occupy movements proved that Americans still very much fear monopoly. The idea that government should help private actors to impose “efficiency” in business and banking—supposedly to help the “consumer”—still shapes decision-making by the Obama administration, Congress, and the Federal Reserve.
Strangely, one of the strongest bastions of support for this “consumer welfare” argument is the environmental movement. Here the origins of the idea that competition is “wasteful” trace largely to Theodore Roosevelt-era thinking on “conservation.” But the practical result is the same, as we can see when groups like the Environmental Defense Fund and Natural Resources Defense Council embrace Wal-Mart and other goliaths precisely because they believe they can “create environmental progress by leveraging corporate purchasing power.”
In 1798 the British economist Thomas Malthus published “An Essay on the Principle of Population.” His argument was simple. The limited amount of land in the world meant a limited amount of food, hence a limit to the total number of people. If the population grew beyond this limit, starvation would soon bring the numbers back into balance. For decades, right through human catastrophes like the Irish Potato Famine, the rulers of Europe used Malthus’ zero-sum argument as an excuse to do nothing.
Yet across the Atlantic, the American people were already proving Malthus wrong. Having broken the power of the lords and clerics who for so long ruled over our land and industry and minds, Americans now took advantage of our new freedom to think up better ways to reap and sow and improve our seeds. And so, long before gasoline-powered tractors, petroleum-based fertilizers, and genetic manipulation of plant matter, American farmers adapted plants like wheat and corn to entirely new ranges and greatly increased per-acre yields.
We stand today, as a society, armed with innumerable better ideas. They include, as Michael Pollan told Slate recently, farming techniques that empower us simultaneously to address “climate change and soil quality and food security.” They also include, as Frederick Kaufman wrote, “open source” approaches to improving foods that empower many scientists to participate in projects where one or a few corporate teams now rule. It is impossible, today, to know which ideas will best enable us to adapt our food system to a world of rapid, even chaotic environmental change. What we do know is that the best way to sort through these ideas is within transparent and competitive marketplaces.
The American food movement has achieved much success in developing new models of farming, and in creating markets and regional networks that connect growers to eaters. But the movement can never win the big fight—to apply these principles to the systems that feed all the people of the world—until it stands up squarely to the command-and-control corporate systems that increasingly dominate our political economy, and to the arguments used to justify that power.
If we find ourselves living in a new era of food shortages, it will not be due only to our failure to control carbon. It will be due even more to our failure to protect the open-market systems that empower us not merely to exchange, but to think and adapt.