by Chris Otter
It is one of the most striking paradoxes of our time. Today, more people around the world go hungry than ever before in human history. At the same time, even more people are now classified as obese, part of what observers are calling an overweight 'epidemic' and health crisis. This month, historian Chris Otter explores the history of how we have chosen to produce, distribute, consume, and think about food to explain how we have arrived at these extremes of feast and famine.
In June 2009, the United Nations' Food and Agriculture Organization (FAO) reported that, worldwide, the number of hungry people had reached one billion. Today, more people are hungry than at any point in human history. They are concentrated in the developing world, and their hunger has been exacerbated by the global financial crisis.
In 2008, world wheat prices reached a nineteen-year high, and over thirty countries experienced food riots. "Hunger seasons" have become the norm in many parts of the global south, and women bear the brunt of this food shortage.
According to ActionAid International, women produce up to seventy percent of food in developing countries. However, women also make up seventy percent of the world's hungry, and they own only one percent of the world's land. They might prepare most of the world's food, but they do not eat their fair share of it.
In the west, however, what strikes us is not hunger, but its opposite: obesity. According to a recent World Health Organization (WHO) study, more than 1.6 billion people globally are overweight or obese—that is 60% more than go hungry. As early as 1987, the American media began murmuring about an "obesity epidemic," and in 2001 the WHO began to speak of "globesity."
This epidemic is not limited to America and Western Europe: it is visible in East Asia, Central and South America, and even in Africa. In South Africa, 30.5% of black women are obese. In China, the prevalence of childhood obesity rose from 1.5% in 1989 to 12.6% in 1997.
Escalating global hunger and obesity levels might seem like a gigantic paradox. It is not. It is part of a single global food crisis, with economic, geopolitical, and environmental dimensions. It is perhaps the starkest, most basic way in which global inequality is manifest.
It has many tangled causes, one being simple competition for basic cereals. The growth of nonwestern economies like China almost invariably generates a shift to a more "western" style diet, which involves rising meat consumption, which in turn necessitates diverting vast quantities of cereals from humans to cattle. This is a high-status but inefficient way to consume protein and calories.
This competition has recently been magnified by the expansion of the biofuel industry, which diverts cereals from humans to cars. Southern Africa, for example, has been promoted as the new "Middle East of biofuels," to grow crops not to feed Africans, but to power automobiles.
Spiraling grain prices, increasing meat consumption, and the question of biofuels are merely three facets of a multidimensional global phenomenon that is affecting how we produce, distribute, and consume food.
Other aspects include climate change, which is making tropical seasons hotter and drier; speculation and collusion on commodity markets; dwindling grain reserves; and export restrictions imposed by panicking nations keen to protect domestic consumers.
More "medium-term" causes of the contemporary global food crisis include market distortions produced by large-scale government subsidies to European and American farming and World Bank programs of structural adjustment, which have systematically dismantled national systems of subsidized farming elsewhere in the world.
In other words, the world food crisis is a particularly instructive, if unsettling, event that can illustrate certain aspects of "globalization." It demonstrates how the basic act of eating a piece of bread or meat binds consumers seamlessly with distant farmers, large corporations, energy systems, economic forces, and international politics.
A History of Food Systems
The historical origins of today's global linkages between food, capital, energy, environment, and technology lie well before the mid-twentieth century.
For most of history, humans hunted or grew food for their own consumption, and food travelled only short distances from source to stomach. Yet orchestrated, long-distance exchanges of food go back millennia: the spice trade dates back to ancient times, for example. Islamic farmers brought sugar to the Mediterranean around 600 AD, and the Spanish, along with other European powers, brought it to the New World, where they established the huge plantation complexes that formed a recognizably long-distance food system.
From the early modern period, European historians can identify a series of relatively distinctive "food systems" or food regimes, which can help us locate the origins of today's global food crisis in deeper historical time.
The period 1500-1750 saw a "mercantile" food system. Most basic foodstuffs (grains, milk, and meat) were produced within Europe, but "exotics" were drawn from the colonies, with protective tariffs ensuring that such colonies could only trade with their mother nations.
During the nineteenth century, this nakedly extractive system was largely dismantled and replaced with a "settler-colonial regime" (c.1850-1930). White settler colonies (America, Canada, Argentina, Australasia) increasingly supplied Europe with luxury and basic foodstuffs (particularly meat and wheat), the profits from which were used to purchase European manufactured goods.
After 1945, following the compound shocks of two World Wars and the 1930s financial crisis, a new "productivist" food regime emerged. This new food system was typified by the re-emergence of European and American agricultural protectionism, and the growing power of the food industries (such as Kellogg's and Del Monte).
There were important institutional dimensions to this post-World War II shift. With the foundation of the UN and the FAO (1945), the idea that the entire world could collectively suffer a “food crisis” (of maldistribution, hunger, and famine) can be said to have been born, as can the idea that a world free from hunger was both feasible and politically expedient.
In 1951, the Rockefeller Foundation's Mexican Agricultural Program produced a paper on the "World Food Problem," which explicitly conceived the issue of global food scarcity in terms of geopolitical security: hunger, caused by overpopulation, was viewed as a primary cause of political instability. The result was the "Green Revolution": with the aid of high-yielding crop varieties, fertilizers, and pesticides, agricultural underproduction in the developing world might be overcome.
This "productivist" regime, however, did not survive the early 1970s, when a convergence of economic and climatic events produced perhaps the first recognizable "world food crisis". This crisis, again, was caused by a confluence of interrelated factors: the El Niño weather pattern, the oil crisis, the collapse of Bretton Woods, and tensions surrounding globalization.
Rising meat consumption also played its role. A Soviet Union determined to outpace American levels of meat-eating imported enormous quantities of grain to feed its livestock, which drove up world prices and produced shortages. Famine spread across the southern hemisphere, from West Africa to Bangladesh.
Following these years of turbulence, another food regime developed after 1980. We might term this a "neoliberal" food regime, typified by growing multinational corporate and institutional power. This new system promotes a "global diet" (high in sugars and fats) increasingly at odds with older national or cultural culinary traditions. Countries as diverse as Japan, Iran and the Democratic Republic of Congo consume vastly more wheat products than they did in 1945, for example.
This food regime has also witnessed controversies over biotechnology and an "organic backlash" in the wealthier parts of the western world. At present, we might be witnessing the unraveling, reconfiguration, or intensification of this regime.
Agro-Food Systems 1850-1900
From a historical perspective, the second half of the nineteenth century is particularly interesting: this is when a novel form of world food system came into being, which was centered on Europe, and Britain in particular.
In the mid-nineteenth century, Britain pursued an aggressive policy of trade liberalization, dismantling protective tariffs on most foodstuffs, most famously grain in 1846. The rest of Europe followed, but quite rapidly reverted to protection when the economic climate changed after 1870.
This made Britain the most important single global market for food in the second half of the nineteenth century: in 1860, 49% of the total food exports of all of Africa, Asia and Latin America went to Britain. This food system allowed speculation: the Liverpool corn exchange authorized futures trading in 1883.
This food system also displaced onto much of the rest of the world the burden of feeding the exploding populations of industrializing Europe, who generally ate better than ever before. The later nineteenth century saw up to fifty million famine deaths in India, China, Korea, Brazil, Russia, Ethiopia and Sudan. The global causes of these famines—which connect the economic, the political and the climatic—disturbingly prefigure today's food crisis.
To illustrate the development of this food system, let's look at two key foodstuffs: wheat and meat. European wheat consumption rose dramatically in the nineteenth century, but much of this rise was composed of imports. Britain produced 80% of her wheat in 1841; by 1900, this figure was 25%. With the advent of the railway and steamship, it became cheaper to grow wheat in Montana or the Canadian prairies and ship it to Liverpool than to grow it in, say, Lincolnshire.
With functional telegraphy conveying price signals, this produced the world's first truly integrated market. The international grain trade became increasingly controlled by a small group of companies (Cargill, Bunge, Dreyfus, Continental, André). By the early twentieth century, a one-cent price fluctuation might produce a 50,000 acre annual difference in land planted in Argentina.
Such reliance on imports raised grave questions of food security, as W.J. Gordon noted in How London Lives (1897): "if this country were to lose the command of the seas the people would starve." Britain relied on her navy rather than her farmers for food.
When Fritz Haber achieved the synthesis of ammonia, the basis of nitrogen fertilizer, in 1908, he thought this would allow Germany to achieve "food independence": domestic production would rise sufficiently for dependence on imports to cease. Such geopolitical concerns were borne out by World War One, which was, in historian Avner Offer's words, a "war of bread and potatoes" as well as one of steel and gold.
The interwar period saw European nations struggle to resurrect some form of nutritional self-sufficiency: Italy, for example, launched its Battle for Grain in 1925. Even the British began the retreat from a century of laissez-faire policies with the 1932 Wheat Act, which reintroduced protectionism.
Wheat production, in other words, already displayed the kind of truly transnational links, and the capacity for destabilization and volatility, that it does today.
Rising meat consumption, meanwhile, is perhaps the most important phenomenon in modern western dietary history. Average annual per capita meat consumption in Germany, for example, was below 20 kilograms before 1820; by 1900, it was almost 50 kilograms. Meat-eating was symbolically and viscerally linked to every kind of power: masculine, racial, imperial, and national (vegetarianism, reactively, was born in its modern guise in the nineteenth century).
Yet, ironically, this avalanche of meat could not be produced domestically: by 1914, Britain imported 42% of its meat. World trade in meat rose seventeenfold between 1854 and 1913.
Importing meat, of course, required more than territory, railways and steamships. Because of the distances involved, it also needed refrigeration: hence the production of a cold chain linking abattoirs in Argentina and Australia with European cold stores.
By 1891, artificial refrigeration apparatus had replaced natural ice on the steamships bringing frozen meat from America to Britain. Such controlled environments, which arrested organic decay, temporarily cheated time and generated the modern western experience of being able to eat practically anything, year-round.
From the perspective of today's food crisis, two points are particularly germane here. First, this particular food system, in which production was concentrated in particular parts of the world, and in which fossil-fuel inputs (for transportation, refrigeration, and fertilizer) escalated, was more globally connected, and more energy-intensive, than anything previously seen.
Today, around seventeen percent of American energy consumption goes towards food distribution. Food and energy systems, in short, were inseparable well before the 1970s oil crisis and the current biofuel controversy.
Second, this system, like those following it, created and sustained a calorific rift dividing western Europe and North America from much of the rest of the world.
This combination of technologically embedded, energy-intensive agriculture and distribution, together with a globalized asymmetry in consumption patterns, made food crises on a truly global scale possible.
The Western Surplus: Obesity
In 1800, a diet of 2,000 calories per day was normal in many European countries. From around this point onwards, a steady calorific rise is discernible, with most European nations breaking the 3,000 calorie threshold by the early twentieth century.
Great nutritional disparities existed within the West—as social investigators demonstrated and hungry slum-dwellers protested—yet a significant calorific chasm had emerged between the west and much of the rest of the world.
Hunger persisted, and indeed rose. Yet, production—buoyed by synthetic fertilizers, pesticides, and developments in plant genetics—easily kept pace with world population growth.
The FAO Second World Food Survey (1952) noted that 59.5% of the world's population lived in countries where daily food supplies were below 2,200 calories, a figure falling to ten percent by the mid-1980s. West German and British gross food output, moreover, doubled from 1950 to the 1970s, as agricultural self-sufficiency returned to post-industrial Europe.
The western dietary complex—sugar, wheat, beef (and increasingly chicken), dairy products, plus caffeinated and alcoholic beverages—has increasingly become a diet to which developing nations aspire even as health-conscious westerners try desperately to emulate the unprocessed diets of pre-modern peasants.
Obesity, however, is neither a wholly modern phenomenon nor a wholly modern concern. Aristotle thought too much fat harmful, for example. Writing on obesity flourished in the late seventeenth and early eighteenth centuries, when it was generally seen through the lens of "the humors" which shaped much medical thinking at the time.
Being overweight, however, was usually the preserve of the wealthy, and it remained a sign of fashion, status, or even virtue well into the nineteenth century. An 1865 article in the English periodical Blackwood's Magazine noted that the corpulent were incapable of deceit or violence: "stout people are not revengeful; nor, as a general rule, are they agitated by gusts of passion. Few murderers weigh more than ten stone [140 pounds]."
The "war" on the waist-line, however, was beginning. In an interesting reversal of today, nineteenth-century American observers were sometimes struck by the level of obesity they saw on English streets. The British surgeon William Wadd noted in 1829 that it was becoming increasingly difficult for the corpulent to secure space on stagecoaches, while other British doctors began noting the effect that sedentary lifestyles and dietary abundance were having on middle-class girths.
By 1900, fat was ceasing to be either fashionable or a status symbol: it displayed a failure to control one's appetite and, writ large, a failure to control one's self.
A pivotal figure in this shift was a plump English undertaker, William Banting, the designer of the Duke of Wellington's coffin. In his best-selling "Letter on Corpulence" (1863), Banting recounted his long struggle against being overweight.
He tried numerous remedies, he noted, including exercise, sea air, riding, and Turkish baths, "yet the evil still increased, and, like the parasite of barnacles on a ship, if it did not destroy the structure, it obstructed its fair, comfortable progress in the path of life." He eventually found success with a diet low in sugar, starch, and fat, which left him thirty-five pounds lighter.
Banting's text was among the first to view the remedy for obesity not in terms of the volume but of the type of food consumed. Although it was not explicitly couched in terms of "low carbohydrate," the idea of monitoring food groups was beginning to take shape, as was that of weighing oneself regularly.
Banting recommended frequent weighing, something made feasible by the proliferation of public and private scales. These, along with the later emergence of standard clothes sizes and more regular medical examination, made it increasingly difficult for any westerner to escape knowledge of how much they weighed, and whether or not this weight was normal.
By the early twentieth century, commentators were connecting Banting's recommendations about the individual to more collective concerns about the effect of modern life on health and the body. In 1901, the economist J.A. Hobson suggested that British "command over commodities," particularly "carboniferous foods," was producing excess consumption.
F.A. Hornibrook, in The Culture of the Abdomen (1924), depicted obesity as the paradigmatic disease of civilization, and recommended a series of ergonomic exercises (abdominal retraction, deep breathing) for those wishing to escape its starchy clutches. Other innovators flooded the market with a variety of remedies for flab, from abdominal lipectomy to massage machines.
Modern technology, for some critics, was as culpable as overabundant food. From automobiles and washing machines to, later, remote controls and online shopping, technology was subtly reducing the number of calories unconsciously burned over the course of a day.
In The Road to Wigan Pier (1937), George Orwell grumbled how technological advance was threatening to produce "a paradise of little fat men." The focus on men is important here, since obesity was often equated with diminished male virility. This was particularly the case for those already suffering from potential emasculation, like clerks (whose tedious, sedentary lifestyles prevented the sorts of physical feats of strength and courage that defined masculinity at the time).
Others placed obesity in an anthropological context: the further one travelled (geographically, culturally, chronologically) from western cities, with their surfeit of cars, office machinery, and sugars, the thinner people appeared to get. Conversely, as the western dietary and technological complex encroached, the more obesity, along with heart disease, diabetes, constipation, food poisoning, and bowel cancer, replaced hunger and famine as the primary food-related threats human beings faced.
For all the angst about overweight male clerks, however, the stigma of obesity was increasingly felt by women. In The Art of Feminine Beauty (1930), Helena Rubinstein sternly warned that "an abundance of fat is something repulsive and not in accord with the principles that rule our conception of the beautiful."
Fifty-six years earlier, around the time that Banting was formulating his rather cordial version of carbophobia, William Gull (a royal doctor) coined the term anorexia nervosa to refer to a condition he and other Anglo-American doctors were seeing with greater frequency. Middle-class girls were refusing to eat, without any sign of being insane or otherwise ill.
Anorexia nervosa, as many studies have shown, appeared in the late nineteenth and early twentieth centuries for numerous reasons: changing female bodily norms, the rise of feminism, shifting fashions, alterations in bourgeois family dynamics, and the medicalization of everyday life. But it also appeared at precisely the time when the west began to plug itself into a seemingly inexhaustible stream of food drawn from around the world.
The World Food Crisis
In 2008, Eva Clayton, the former special adviser to the Director-General of the FAO, spoke before the U.S. House of Representatives. "The situation is dire," she stated. "Our response must be decisive and forward thinking. The failure to strengthen our global food system would ultimately lead to political and economic upheaval all over the world."
The food crisis is indeed dire. It is also systemic and global: it unites the world, but its pathologies are geographically distinct. On the "developed" side of the calorific rift, fat is accumulating at a startling rate. On the "developing" side, huge populations are increasingly vulnerable to hunger and famine.
The bifurcation of the world into fat and hungry zones is the most visceral way in which global inequality is lived, felt, and seen. Although this process has accelerated in recent years, the origins of such corporeal polarity and stratification lie deep in historical time.
As Europeans colonized the world and built food systems that underpinned their industrialization and development, they embedded dietary inequality within these systems. The global food crisis is a product of these past practices.
One of the greatest challenges of the twenty-first century, then, is to find a way of overcoming this history and producing a more equitable global food system, one in which the obese will lose some of their weight while the starving will gain some.
All images and content are the property of eHistory at The Ohio State University unless otherwise stated.
Copyright © 2013 OSU Department of History. All rights reserved.