Calories on food packets are wrong – it’s time to change that


A CALORIE is a calorie, so they say. It shouldn’t matter whether it comes from steak, a carrot or a doughnut. Except that it does. And those calorie counts on food packets? Well, they aren’t much to be trusted either.

A food calorie is defined as the amount of energy needed to raise the temperature of 1 litre of water by 1°C at standard atmospheric pressure. Somewhat confusingly, this is 1000 times larger than a heat calorie, which raises just 1 millilitre of water by 1°C, so it is technically written as a Calorie, with a big “C”, to make the distinction. In other words, a Calorie is a kilocalorie, or kcal for short.

Much of what we know about food calories comes from work in the late 1880s by Wilbur Atwater at Wesleyan University in Connecticut, who spent his career trying to figure out what proportion of different foods humans could digest. To measure the calories in food, Atwater set up an experiment using a “bomb calorimeter” – a highly pressurised, sealed container filled with pure oxygen, in which food is burned to a crisp. The heat given off is used to calculate the food’s calorie content, also known as its heat of combustion.


Humans, however, aren’t bomb calorimeters. The acidic cauldron of the stomach aside, digestion is a time-consuming but relatively benign series of chemical reactions, so we can only extract a proportion of the calories in any given food.

In his experiments, Atwater fed various foods to human volunteers and measured the heat of combustion of the resulting faeces (reflect on this the next time you want to complain about your job). From the difference in heat of combustion between the food and the faeces, he approximated the calories his volunteers had absorbed.

In 1900, after a whole lot of burnt poop, Atwater presented his calculations to the world: we absorb 9 kcal per gram of fat, 4 kcal per gram of carbohydrates and 4 kcal per gram of protein. More than 120 years on, these “Atwater factors” are still the basis for how calorie counts on all food packaging are derived.
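As a back-of-the-envelope illustration (mine, not the article’s), the Atwater factors simply convert a label’s macronutrient grams into the calorie count printed on the packet. The function name and example values here are hypothetical:

```python
# Atwater factors: kcal absorbed per gram of each macronutrient
ATWATER = {"fat": 9.0, "carbohydrate": 4.0, "protein": 4.0}

def label_calories(fat_g, carb_g, protein_g):
    """Calorie count as it would appear on a food packet."""
    return (fat_g * ATWATER["fat"]
            + carb_g * ATWATER["carbohydrate"]
            + protein_g * ATWATER["protein"])

# e.g. a snack with 10 g fat, 30 g carbs and 5 g protein:
# 90 + 120 + 20 = 230 kcal on the label
print(label_calories(10, 30, 5))
```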

Yet, they are wrong. By the 1970s, it was clear they weren’t adding up. While Atwater took into account the fibre in food, which we can’t digest (hello sweetcorn), as well as the nitrogen extracted from protein and excreted as urea in our urine, he didn’t take into account the heat given off during metabolism. This is known as diet-induced thermogenesis and is the significant energy cost of converting protein, fat and carbs into the amino acids, fatty acids and glucose that our body needs.

Protein has a caloric availability of 70 per cent, meaning that for every 100 kcal of protein that makes it into the bloodstream, we can only use 70 kcal; the other 30 kcal is given off as heat through diet-induced thermogenesis.

By comparison, fat has a caloric availability of 98 per cent, which is why it is such an efficient long-term fuel store. As for carbs, it depends on whether we are talking about the complex (90 per cent availability) or refined (95 per cent) variety. This, in part, is why a calorie of protein makes you feel fuller than a calorie of fat or carbs.
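Applying the availability figures above (70 per cent for protein, 98 per cent for fat, 90 to 95 per cent for carbs) shows how far usable energy can drift from the absorbed calories. A minimal sketch, with names of my own invention:

```python
# Caloric availability: the fraction of absorbed kcal the body can use;
# the remainder is lost as heat (diet-induced thermogenesis)
AVAILABILITY = {
    "protein": 0.70,
    "fat": 0.98,
    "complex_carb": 0.90,
    "refined_carb": 0.95,
}

def usable_kcal(absorbed_kcal, nutrient):
    """Energy actually available to the body from absorbed calories."""
    return absorbed_kcal * AVAILABILITY[nutrient]

# 100 kcal of absorbed protein yields ~70 kcal of usable energy,
# while 100 kcal of absorbed fat yields ~98 kcal
print(usable_kcal(100, "protein"))
print(usable_kcal(100, "fat"))
```

On these numbers, two foods with identical calorie counts on the packet can deliver noticeably different amounts of usable energy, which is the article’s point.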

In 2001, the consultant Geoffrey Livesey coined the term “net metabolisable energy” for this concept of caloric availability and proposed using it in place of the Atwater factors on food labels. The proposal gained no traction with the food industry.

In a world where much of the burden of non-communicable disease is diet-related, we need a better understanding of the quality of our food, and that begins with the labels on packets. So, 20 years on, this is me picking up the baton from Livesey and trying to push caloric availability into the conversation. It does indeed matter whether a calorie comes from steak, a carrot or a doughnut. We just need the right information to be able to judge.
