Whether you parcel them out for lunch the next day or squirrel them away with the best intentions until they’ve gone bad, leftovers are a mostly unremarkable reality of modern life. But leftovers have a story to tell, and their curious history reveals changes in technology and in attitudes toward both affluence and dinner.
Until the icebox (a proto-refrigerator) became standard in many homes at the turn of the 20th century, “leftovers” didn’t exist. Because there was no way to keep food in the form a freshly prepared meal took at the table, preserving what remained was as much a part of the culinary process as preparation. Cookbooks would often follow directions for a meal with instructions for pickling, curing, or salting the remains to prolong the life of every ingredient.
These weren’t leftovers as we think of them today, but the basis of another meal or food item entirely. The ability to reliably keep things cool changed all that: people could hang onto last night’s dinner without worrying about immediate spoilage. And so the notion of the “leftover,” the remains of a meal that could be kept and consumed later in a recognizably similar form, was born.
The most interesting thing about leftovers, however, is not their invention but the shifting attitudes toward them. The luxury of an icebox didn’t mean abundance was taken for granted. In fact, during World War I, eating one’s leftovers was positioned as so patriotic that some people celebrated killing house pets rather than recklessly wasting human food on them (in those days, pets ate scraps from human meals). From the wartime years through the intense poverty of the Depression, resourcefulness with this new category of “leftover” became an even stronger proof of virtuous frugality. A 1917 U.S. Food Administration poster reminded citizens to “serve just enough/use what is left,” while a Good Housekeeping headline from 1930 admonished, “Leftovers Shouldn’t Be Left Over.”