Yes, yes, you can tell me all about why I definitely shouldn’t be eating those mashed potatoes in a moment. But right now, I’ve got some news for you (and some potatoes to eat).
A new study in September’s Quarterly Review of Biology attempts some forensic reconstruction of ancient human diets using an interesting method: looking at physiological changes and the nutrition that would have been needed to support them. Of particular interest to the researchers was a pair of salivary amylase genes whose copy numbers began to rise a million years or so ago, alongside the rise of cooking.
While extra salivary amylase wouldn’t have been much help in digesting raw starches, the researchers say that cooking would have turned that around entirely, and may even have been responsible for fueling some of those physiological changes itself:
The rapid growth in hominin brain size during the Middle Pleistocene will have required an increased supply of preformed glucose. Such increased demands can be met through a range of biologically and culturally driven dietary adaptations. Noting that there is considerable overlap in date estimates for the origins of controlled fire use and the origins of AMY1 CNV, we hypothesize a gene-culture coadaptation scenario whereby cooking starch-rich plant foods coevolved with increased salivary amylase activity in the human lineage. Without cooking, the consumption of starch-rich plant foods is unlikely to have met the high demands for preformed glucose noted in modern humans.
You can read the whole thing (perhaps over a nice loaf of bread) at Quarterly Review of Biology.