They harvested mescal in spring and reactivated irrigation ditches in May.

The historical record there and elsewhere thus suggests that during summer and early fall starches were consumed in far greater quantities than during the winter, when protein and fat consumption increased proportionally. By 1600, as another example, Western Apache communities had developed a “seasonal cycle” that veered between food gathering, horticulture, and winter hunts. In July they harvested saguaro fruit in the Gila Valley of modern-day Arizona; in the same month they also began a month-long harvest of acorns. Fall and winter were dominated by hunting for animal meat. Similarly, on the Texas plains prior to European contact, Apache communities spent spring and summer in agricultural villages, moving towards hunting-dominated nutrition in the winter. Surveying indigenous communities that had avoided the level of European contact suffered by their southern neighbors, Weston A. Price’s Nutrition and Physical Degeneration suggested that similar practices were maintained during the early twentieth century among communities living inside the Rocky Mountain Range in far northern Canada.

Moving further towards speculation, or at least a working hypothesis, it is worth considering whether health declined as Native Americans were forced to consume maize and other starches during the winter months, having lost access to ancestral winter hunting grounds. After all, seasonal consumption of carbohydrates could have improved health by limiting the overall production of insulin in the bloodstream. In assessing such a hypothesis, students would be able to consult much recent research on the evolutionary dimension of obesity and metabolic syndrome. Firstly, it has been suggested that the move towards year-round abundance of food in many populations may have caused over-consumption at times when the human metabolism might otherwise have benefited from calorie reduction and/or greater insulin sensitivity.

The latter has been linked to reduced carbohydrate and protein access in comparison to fat, often during winter. Secondly, a number of studies have shown a correlation between blood vitamin D levels and insulin sensitivity. Thus, from an evolutionary perspective, it may be postulated that some human populations are adapted to consume more starch during those months when insulin sensitivity is higher owing to raised blood vitamin D levels from available sunshine.

What, then, might students make of the statements of the Spanish colonizer Cabeza de Vaca, who lived with Coahuiltecan communities in what is now part of Texas for eight years from 1528? During winter they hunted buffalo, deer, and javelina, while during the summer and fall they were sustained by fish, plants, and starches such as the mesquite bean. In what has been described as a “feast or famine economy”, animal-based diets in winter would thus give way to gorging thanks to “the ripening of fruit, or tuna, of the prickly pears [which] typically meant days of feasting until the fruit ran out.” During the winter period of hunting for animal proteins and fats, Cabeza de Vaca noted the ability of Coahuiltecans to maintain aerobic activity for long periods of time: “The men could run after a deer for an entire day without resting and without apparent fatigue. . . one man near seven feet in stature. . . runs down a buffalo on foot and slays it with his knife or lance, as he runs by its side.”

Students might even consider the above, and other similar original source testimonies, in light of modern scientific research on fat-adapted aerobic activity, including the use of ketones as an energy source. The metabolic state of ketosis is defined as the elevation of the ketone bodies D-beta-hydroxybutyrate and acetoacetate in the body in response to the consumption of a diet low in glucose and high in fats, or following long periods of fasting.

A number of recent studies have suggested that the metabolic use of ketones may benefit long periods of medium-intensity movement by reducing the need for regular consumption of carbohydrates. They have focused on the ability of endurance athletes to maintain or perhaps even increase performance whilst consuming a high-fat diet. Students and researchers might consider the possibility that some indigenous communities prior to European contact would at least have cycled between periods of keto-adaptation and periods of glucose-burning, depending on the season. It has certainly been suggested that communities in the northernmost parts of Canada and Alaska have historically utilized ketogenic or keto-adapted diets, burning fat obtained from meat and fish rather than glucose as a primary fuel for many months of the year.

Though much more research is needed to determine the nature of fat burning during physical exercise, a useful hypothesis can be gleaned by noting that many Native Americans undertook long hunts over several hours at just the point in the season when they may have benefited from increased endurance due to their fat-adapted metabolic states and high fat consumption. Any European disruption to winter hunts, according to such a proposition, would also have disrupted ancestral metabolic patterns that incorporated seasonal fat-adaptation, or even seasonal ketosis. Fat-adapted metabolic diets have been found to be therapeutic for a number of medical disorders, particularly but not exclusively neuropathies. Evidence of their potential health-promoting benefits, at least in periodic cycles, might also support the hypothesis that disruption to ancestral metabolic cycles foreclosed other benefits that are as yet unknown, pending further scientific research – thus potentially exacerbating Native American susceptibility to infectious diseases following European contact.

Of course, students would be encouraged to examine confounding evidence for such a hypothesis. In this case, for example, they might consider the historical record for Tarahumara communities in Northwestern Mexico, who ran great distances through the colonial era while eating comparatively few animal meat products. A similar association could be found among historic Apaches and Hopis, both of whom ran long distances on a diet dominated by maize, squash, and beans. To be sure, it is ambiguous whether, historically, Tarahumaras relied solely on plant carbohydrates and proteins from beans, or whether such an account represents a teleological extrapolation from their diets as examined during the second half of the twentieth century. But students would certainly be able to note the association between their high-starch diet and their long-distance aerobic activities.

Yet even here, other analytical indicators might become relevant. Several studies have shown low plasma cholesterol levels for modern Tarahumaras, at least in regard to HDL. But other studies have also shown relatively high levels of cardio-respiratory problems in rural populations who have maintained a high-starch diet while decreasing their physical movement. That is to say, the problematic effects of high blood glucose might have been mitigated by intense exercise, which burned the substance for fuel rather than raising insulin to a sub-optimal level.

The Tarahumara case study might even lead students to ask a further set of questions, which speak to recent research by O’Keefe and others on burning glucose during periods of aerobic endurance activity. Compared to long-distance aerobic activity in a fat-adapted state, students might ask, what are the long-term consequences of burning glucose for fuel in a high state of oxidative stress, such as long-distance running and hunting?
Oxidative stress can be defined as the increased production of oxidizing species, or a significant decrease in the effectiveness of antioxidant defenses such as glutathione, in association with intense consumption of oxygen during periods of activity. Even if exercise may have prevented glucose from being stored as fat among Tarahumaras, and lowered inflammatory markers, oxidative damage due to their metabolic state during exercise may have portended problematic cardiovascular health outcomes.

Ongoing scientific discussions about optimal nutritional health – as defined in peer-reviewed studies – should help to inform our historical understanding of the contact period between Europeans and Native Americans, from the sixteenth century through to the nineteenth century. Students should be in a position to question the perception of nomadic hunting as the only indigenous Native American nutritional and ecological practice in the three centuries following European contact. The regular or seasonal consumption of hunted wild animal products was not antithetical to horticultural cultivation. The two were often complementary during the pre-contact era, providing micro-nutrients and macro-nutrients that varied according to the season. Distrust of European agriculture – including its perceived association with the spread of diseases – did not require Native Americans to conceive of themselves as a people who eschewed ecological cultivation altogether. If any generalizations are to be made, rather, students would be better off examining the ways in which Native American populations perceived colonial farming as a threat to their own land management of crops as well as to their continued hunter-gatherer activities outside indigenous cultivated settlements. We ought to examine the existence of both ecological systems in order to consider their distinction from those that followed the European encounter. During the era of contact, horticulture declined.
But although hunting and gathering increased in response to colonial pressure on cultivated plants, their character and regional context changed to accommodate new European technologies, often deployed in new lands. Those communities who migrated to the Great Plains – whether from the East, the West, or the Northern Great Lakes – eventually suffered from diminishing access to hunted animals, without the potential to mitigate that loss through renewed horticultural sustenance.

The same phenomenon also often occurred among Native American communities who remained at the site of first European contact. As is evident in the tragic history of Native American health and ecology in the three centuries after first contact, then, external interventions in the indigenous food system may well have contributed to heightened susceptibility to infectious diseases and near demographic collapse. The modern scientific literature on nutrition and health should inform our understanding of the negative health outcomes that were associated with diminished access to indigenously cultivated crops and/or hunted and gathered animals and plants. In evaluating declining health after European contact, students and researchers should assess the effects of curtailed hunting and gathering, the threat to pre-contact forms of horticulture, and the colonial misunderstanding of their symbiosis. Such an assessment could be informed by our understanding of the nutritional and ecological changes that accompanied the introduction of infectious diseases: a greater threat from zoonotically spread pathogens; diminished access to fats, fat-soluble vitamins, proteins, and essential minerals from animal and plant sources; an increasing inability to gather potentially beneficial indigenous starches and resistant starch sources; a growing threat to seasonal oscillations between higher-fat winter diets and summer starches; and even a decline in cyclical ketosis in some regions. In light of current research on the relationship between nutritional density, immunity, and metabolic syndromes, diminished access to indigenous food sources can be related to greater vulnerability to infectious diseases, above and beyond the differing historical immunities that distinguished Native Americans from Europeans.
Questioning or modifying the Biological Exchange thesis should help students and researchers evaluate an important and ongoing question in public policy: why, since European colonization efforts began, have top-down political interventions in nutrition so often accompanied a decline in ancestral health principles, health, and fertility? Such a correlation is suggested by events following the three centuries of European colonization, from the mid-nineteenth century to the present day, which might offer important insights as a concluding section or even a postscript to the proposed course, connecting the colonial period to the contemporary era. In 1867, for example, the Treaty of Medicine Lodge required Southern Plains Native Americans to give up land in return for government annuities. The federal government then began supplying them with food handouts, using the industrial and transportation systems developed to supply troops with grains during the Civil War. The containment of Native Americans on reservations severely limited the physical activity to which their communities had grown accustomed over previous centuries. It has been suggested that their decreased energy expenditure would have made them less likely to burn the increasing volume of glucose in their diet, contributing even further to the development of diabetes and other metabolic disorders through the twentieth century. Students and researchers would do well to examine the correlation between the Treaty of Medicine Lodge, other similar documents, and declining nutritional health among Native Americans during the post-colonial era.
Aleš Hrdlička, a physician and anthropologist, reported in his Physiological and Medical Observations among the Indians of Southwestern United States and Northern Mexico that obesity and associated “grave disease[s] of the liver” were “exclusively” found among Pima Indians on new reservations, rather than among those who relied on a more traditional system of hunting meats and gathering or cultivating fibrous seeds, chenopods, plants, and starchy tubers.