Twenty-five years of food expenditure trends

One of the interesting data tidbits in the recent report from the Government Accountability Office (GAO) is annual American food expenditures. Using data from the USDA and the consumer price index (to correct for inflation), the GAO calculated per capita food expenditures and the share of disposable income spent on food for six different years between 1982 and 2007.

The first chart, made from their tabular data, shows annual per capita spending. Total spending (the black bars) is divided into two categories: spending on food eaten at home (blue bars) and on food eaten away from home (red bars). (Note that all figures are adjusted to 2008 dollars.) “Away from home” food is defined by the USDA as food that was prepared outside the home, regardless of where it was eaten. The category therefore covers restaurant meals, takeout, and delivered prepared food.
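The inflation adjustment behind those 2008-dollar figures can be sketched as a simple CPI rescaling. This is only an illustration: the index values below are approximate annual CPI-U averages (1982–84 = 100), not necessarily the exact deflator the GAO used.

```python
# Convert nominal dollars to 2008 dollars using the CPI.
# Index values are approximate CPI-U annual averages (1982-84 = 100);
# the GAO's exact deflator may differ.
CPI = {1982: 96.5, 2008: 215.3}

def to_2008_dollars(amount, year, cpi=CPI):
    """Rescale a nominal dollar amount from `year` into 2008 dollars."""
    return amount * cpi[2008] / cpi[year]

# $1,000 of 1982 food spending, expressed in 2008 dollars:
print(round(to_2008_dollars(1000, 1982), 2))  # roughly $2,231
```

The same one-line rescaling, applied year by year, is all that's needed to make the bars in the chart comparable across a quarter century.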

Between 1982 and 2007, total spending on food increased by about 16%. “At home” spending was more or less flat, so the increase came solely from rising “away from home” spending.


But incomes have clearly also been rising, as the data in the next chart indicates. Even as we’ve been spending more dollars, then, the share of our disposable income that goes to food decreased by about 20% between 1982 and 2007. In other words, we are spending a smaller percentage of our disposable income on food today than we were twenty-five years ago. Given the shift away from cooking at home, it’s not surprising that almost all of that decrease happened in the “at home” category, while the share of disposable income that we spend on food away from home stayed more or less constant.


I don’t know the exact cause of the drop, but I’m sure it’s somewhere in the mountains of data at the USDA, the Bureau of Labor Statistics, or somewhere else, and the explanation is probably simple: wages rose faster than food prices.

Today, we are spending a greater share of our food dollar away from home than we did twenty-five years ago. Restaurant food has never been known for its slimming properties, but has the share of calories we get from away-from-home food risen more than we might expect over that period? The answer is most likely yes. Although the USDA has masses of data on what we eat, the most recent data on the caloric split between at-home and away-from-home food covers the periods 1977–78 and 1994–96.* Over that span, the share of calories provided by away-from-home food increased from 18% to 32% of total calories. Using the data in the GAO report, I calculate that in 1982 Americans spent 40% of their food dollar on “away from home” food; in 1997 it was 47%. Since the calorie share rose much faster than the dollar share, the away-from-home dollar of 1997 bought considerably more calories than it did in earlier years.
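That comparison can be made concrete with a small calculation. The shares below are the ones cited above; note that the calorie and dollar periods only roughly align (1977–78 vs. 1982, and 1994–96 vs. 1997).

```python
# Away-from-home share of calories vs. share of food dollars.
# Figures are the ones cited in the post; periods are only roughly aligned.
calorie_share = {"early": 0.18, "late": 0.32}  # 1977-78, 1994-96
dollar_share = {"early": 0.40, "late": 0.47}   # 1982, 1997

# A simple "calories per dollar" index: calorie-share points per
# dollar-share point. A rising value means the away-from-home dollar
# bought relatively more calories over time.
index = {p: calorie_share[p] / dollar_share[p] for p in ("early", "late")}
print(index)  # early = 0.45, late is about 0.68
```

The index rising from 0.45 to roughly 0.68 is another way of saying that away-from-home eating got calorically cheaper, dollar for dollar.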

Continuing to look at calories, I went back to the USDA’s Food Availability database that I wrote about last year to see how calorie availability** changed over the 1982–2007 period. Not surprisingly, availability has increased significantly, from 2,200 calories per person per day in 1982 to 2,679 in 2007. So, if we assume that the trend indicated by availability is valid, Americans have been spending a smaller share of their income on food each year while receiving more calories. The next figure shows this graphically. The black bars are the same total-share data from the second chart; the gray bars are the share of disposable income required to purchase 2,000 calories per day. Note how the gray bar drops much more quickly. That means we are receiving more for less. But this bounty of calories — which more often comes in the form of soybean oil and high fructose corn syrup than of celery and honeydew melons — has brought with it a bounty of trouble: obesity, diabetes, heart disease, and environmental degradation, to name a few. Not much of a bargain, if you ask me.



*Much more food expenditure data can be found at the USDA’s Economic Research Service and at the Bureau of Labor Statistics.

** The USDA’s Economic Research Service provides the following disclaimer about its availability data: the data are useful as an indicator of trends, but, as the USDA’s “Limitations of the Data” section says, “…the loss-adjusted data series does not measure actual consumption or the quantities ingested because neither series is based on direct observations of individual intake.”

Top photo from iStockphoto, charts created by the author.

3 Responses to “Twenty-five years of food expenditure trends”

  1. I always find it interesting that ready-to-cook/heat foods such as “chicken” fingers, frozen pizzas, TV dinners, and such, are included in the at home category when they seem to better fit the away from home definition, given the “regardless of where it’s eaten” clause. Is it possible to cut the data to include those types of foods as away from home and re-run your analysis?

  2. Kiera says:

    Great post, Marc. I wrote it up on Mother Jones’ Blue Marble blog here:

    Also added some points about how junk-food prices are less subject to inflation than “healthy” food prices.

  3. North says:

    I think one critical thing to look into is how this data breaks out by income – the same period that’s been marked by a strong increase in income inequality.  It’s possible that the increase in ‘away from home’ spending comes from increased spending by the wealthy on expensive restaurant meals.  Also possible that it’s not, but that’s a question for the data.