
What was the main diet of pre-agricultural Asians?


The modern Asian diet is based largely on rice. Was rice a major part of the paleolithic Asian diet? Did people know how to process and eat rice before agriculture?

Aside from meats, what were other major parts of their diet? What kind of fruits were common?

I'm looking mainly at East Asian (Chinese/Korean/Japanese) diets, but information on other rice-eating regions (such as Southeast Asia or India) would also be helpful.


This paper in Nature is fascinating - unfortunately, the chemical studies described were not performed on ancient East Asians, but it lines up with archaeological and anthropological evidence worldwide.

There have only been two studies of Palaeolithic modern humans, Homo sapiens sapiens. A study of the isotope values of humans from the late Upper Palaeolithic (ca 13,000 years old) sites of Gough's Cave and Sun Hole Cave in Southern England (Richards et al., 2000a) indicated, again by the δ15N values, that the main source of dietary protein was animal-based, and most likely herbivore flesh. The second study (Richards et al., 2001) was a survey of isotope values of humans from Gravettian and later (approximately 30,000-20,000 years old) Eurasian sites. The δ13C and δ15N values here indicated high animal protein diets, but the type of animal protein was more varied than that of the Neanderthals, with aquatic foods incorporated in their diets. As this study was a survey, and associated faunal δ13C and δ15N values were not measured, it is not possible to further pinpoint the sources of dietary protein at all of these sites. Interestingly, this adaptation to aquatic resources becomes more extreme in the much later (ca 10,000-5000 BP, depending on area) Mesolithic periods in parts of Europe. For example, isotope studies of Mesolithic humans from the Danube Gorges in Southeastern Europe indicate that the majority of protein was from freshwater fish, which is supported by the archaeological evidence of fishing equipment and large numbers of fish bones (Bonsall et al., 1997).
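For intuition about how such δ15N readings are interpreted: consumers are enriched by roughly 3-4‰ per trophic step, so a trophic level can be back-calculated from a plant or herbivore baseline. Below is a minimal sketch of that arithmetic; the 3.4‰ enrichment factor and the sample values are generic textbook assumptions, not figures from the studies cited.

```python
# Sketch of trophic-level arithmetic from nitrogen isotope ratios.
# The ~3.4 per-mil enrichment per step and the sample values are
# illustrative assumptions, not data from the papers quoted above.

def estimate_trophic_level(d15n_consumer, d15n_baseline,
                           enrichment=3.4, baseline_level=1.0):
    """Estimate trophic level from delta-15N values (in per mil)."""
    return baseline_level + (d15n_consumer - d15n_baseline) / enrichment

# A human collagen value of ~12 per mil over a plant baseline of ~3
# per mil sits well above the herbivore level (~2), i.e. a diet
# dominated by animal protein:
print(estimate_trophic_level(12.0, 3.0))  # ~3.6
```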

More recent archaeological chemical analyses, such as the one performed on remains from Tianyuan Cave, also find extensive freshwater-fish consumption, suggesting the picture was similar in East Asia. The evidence is that the Paleolithic diet of modern humans worldwide was primarily animal flesh, supplemented by easily gathered plant material.

Paleolithic tools used in the gathering or preparation of plant foods are either absent or unrecognizable as such. In light of this absence, and given the evidence that the diet was primarily meat-based, it must be inferred that plant foods requiring processing or extensive effort to gather were not a large part of the diet.

This includes wild rice and other grains, most of which required extensive domestication efforts. The earliest evidence of rice consumption dates only to the early Neolithic, 11-12 kybp, and wild barley goes back only as far as 23 kybp, but not as a staple, and not in East Asia. One 2009 study concludes that other wild grains were harvested as early as 90 kybp, and claims to have found stone tools to prove it, but this is not yet corroborated and may not reflect a widespread practice. The chemical analyses show that animal protein was the dominant dietary staple.


Regarding early food in the Indian subcontinent, a few classical Tamil literary works such as the "Purananuru" and "Madurai Kanchi", along with the religious Tamil literature "Devaram" and "Tiruvasakam", give more information about ancient diets.

These works, some of which date back to 600 BCE, suggest that people hunted for their food and also domesticated animals such as chickens and goats for eating. Rice cultivation was present at that time, but because rain, then one of the few sources of water, was unpredictable, people also grew alternative dry crops such as millets, which were boiled, steamed, or eaten raw.

From the documented evidence, we may conclude that the peoples of the Indian subcontinent used not only rice as a staple food but also various millets.


Cuisine of the Thirteen Colonies

The cuisine of the Thirteen Colonies includes the foods, bread, eating habits, and cooking methods of the Colonial United States.

In the period leading up to 1776, a number of events led to a drastic change in the diet of the American colonists. As they could no longer rely on British and West Indian imports, colonial agricultural practices began to focus on complete self-sufficiency. [1]


The origins of agriculture:

What might head a list of the defining characteristics of the human species? While our view of ourselves could hardly avoid highlighting our accomplishments in engineering, art, medicine, space travel and the like, in a more dispassionate assessment agriculture would probably displace all other contenders for top billing. Most of the other achievements of humankind have followed from this one. Almost without exception, all people on earth today are sustained by agriculture. With a minute number of exceptions, no other species is a farmer. Essentially all of the arable land in the world is under cultivation. Yet agriculture began just a few thousand years ago, long after the appearance of anatomically modern humans.

Given the rate and the scope of this revolution in human biology, it is quite extraordinary that there is no generally accepted model accounting for the origin of agriculture. Indeed, an increasing array of arguments over recent years has suggested that agriculture, far from being a natural and upward step, in fact led commonly to a lower quality of life. Hunter-gatherers typically do less work for the same amount of food, are healthier, and are less prone to famine than primitive farmers (Lee & DeVore 1968, Cohen 1977, 1989). A biological assessment of what has been called the puzzle of agriculture might phrase it in simple ethological terms: why was this behaviour (agriculture) reinforced (and hence selected for) if it was not offering adaptive rewards surpassing those accruing to hunter-gathering or foraging economies?

This paradox is responsible for a profusion of models of the origin of agriculture. 'Few topics in prehistory', noted Hayden (1990), 'have engendered as much discussion and resulted in so few satisfying answers as the attempt to explain why hunter/gatherers began to cultivate plants and raise animals. Climatic change, population pressure, sedentism, resource concentration from desertification, girls' hormones, land ownership, geniuses, rituals, scheduling conflicts, random genetic kicks, natural selection, broad spectrum adaptation and multicausal retreats from explanation have all been proffered to explain domestication. All have major flaws… the data do not accord well with any one of these models.'

Recent discoveries of potentially psychoactive substances in certain agricultural products -- cereals and milk -- suggest an additional perspective on the adoption of agriculture and the behavioural changes ('civilisation') that followed it. In this paper we review the evidence for the drug-like properties of these foods, and then show how they can help to solve the biological puzzle just described.

The emergence of agriculture and civilisation in the Neolithic

The transition to agriculture

From about 10,000 years ago, groups of people in several areas around the world began to abandon the foraging lifestyle that had been successful, universal and largely unchanged for millennia (Lee & DeVore 1968). They began to gather, then cultivate and settle around, patches of cereal grasses and to domesticate animals for meat, labour, skins and other materials, and milk.

Farming, based predominantly on wheat and barley, first appeared in the Middle East, and spread quickly to western Asia, Egypt and Europe. The earliest civilisations all relied primarily on cereal agriculture. Cultivation of fruit trees began three thousand years later, again in the Middle East, and vegetables and other crops followed (Zohari 1986). Cultivation of rice began in Asia about 7000 years ago (Stark 1986).

To this day, for most people, two-thirds of protein and calorie intake is cereal-derived. (In the west, in the twentieth century, cereal consumption has decreased slightly in favour of meat, sugar, fats and so on.) The respective contributions of each cereal to current total world production are: wheat (28 per cent), corn/maize (27 per cent), rice (25 per cent), barley (10 per cent), others (10 per cent) (Pedersen et al. 1989).

The change in the diet due to agriculture

The modern human diet is very different from that of closely related primates and, almost certainly, early hominids (Gordon 1987). Though there is controversy over what humans ate before the development of agriculture, the diet certainly did not include cereals and milk in appreciable quantities. The storage pits and processing tools necessary for significant consumption of cereals did not appear until the Neolithic (Washburn & Lancaster 1968). Dairy products were not available in quantity before the domestication of animals.

The early hominid diet (from about four million years ago), evolving as it did from that of primate ancestors, consisted primarily of fruits, nuts and other vegetable matter, and some meat -- items that could be foraged for and eaten with little or no processing. Comparisons of primate and fossil-hominid anatomy, and of the types and distribution of plants eaten raw by modern chimpanzees, baboons and humans (Peters & O'Brien 1981, Kay 1985), as well as microscope analysis of wear patterns on fossil teeth (Walker 1981, Peuch et al. 1983), suggest that australopithecines were 'mainly frugivorous omnivores with a dietary pattern similar to that of modern chimpanzees' (Susman 1987:171).

The diet of pre-agricultural but anatomically modern humans (from 30,000 years ago) diversified somewhat, but still consisted of meat, fruits, nuts, legumes, edible roots and tubers, with consumption of cereal seeds only increasing towards the end of the Pleistocene (e.g. Constantini 1989 and subsequent chapters in Harris and Hillman 1989).

The rise of civilisation

Within a few thousand years of the adoption of cereal agriculture, the old hunter-gatherer style of social organisation began to decline. Large, hierarchically organised societies appeared, centred around villages and then cities. With the rise of civilisation and the state came socioeconomic classes, job specialisation, governments and armies.

The size of populations living as coordinated units rose dramatically above pre-agricultural norms. While hunter-gatherers lived in egalitarian, autonomous bands of about 20 closely related persons, with at most a tribal level of organisation above that, early agricultural villages had 50 to 200 inhabitants, and early cities 10,000 or more. People 'had to learn to curb deep-rooted forces which worked for increasing conflict and violence in large groups' (Pfeiffer 1977:438).

Agriculture and civilisation meant the end of foraging -- a subsistence method with short-term goals and rewards -- and the beginning (for most) of regular arduous work, oriented to future payoffs and the demands of superiors. 'With the coming of large communities, families no longer cultivated the land for themselves and their immediate needs alone, but for strangers and for the future. They worked all day instead of a few hours a day, as hunter-gatherers had done. There were schedules, quotas, overseers, and punishments for slacking off' (Pfeiffer 1977:21).

Explaining the origins of agriculture and civilisation

The phenomena of human agriculture and civilisation are ethologically interesting, because (1) virtually no other species lives this way, and (2) humans did not live this way until relatively recently. Why was this way of life adopted, and why has it become dominant in the human species?

Problems explaining agriculture

Until recent decades, the transition to farming was seen as an inherently progressive one: people learnt that planting seeds caused crops to grow, and this new improved food source led to larger populations, sedentary farm and town life, more leisure time and so to specialisation, writing, technological advances and civilisation. It is now clear that agriculture was adopted despite certain disadvantages of that lifestyle (e.g. Flannery 1973, Henry 1989). There is a substantial literature (e.g. Reed 1977), not only on how agriculture began, but why. Palaeopathological and comparative studies show that health deteriorated in populations that adopted cereal agriculture, returning to pre-agricultural levels only in modern times. This is in part attributable to the spread of infection in crowded cities, but is largely due to a decline in dietary quality that accompanied intensive cereal farming (Cohen 1989). People in many parts of the world remained hunter-gatherers until quite recently; though they were quite aware of the existence and methods of agriculture, they declined to undertake it (Lee & DeVore 1968, Harris 1977). Cohen (1977:141) summarised the problem by asking: 'If agriculture provides neither better diet, nor greater dietary reliability, nor greater ease, but conversely appears to provide a poorer diet, less reliably, with greater labor costs, why does anyone become a farmer?'

Many explanations have been offered, usually centred around a particular factor that forced the adoption of agriculture, such as environmental or population pressure (for reviews see Rindos 1984, Pryor 1986, Redding 1988, Blumler & Byrne 1991). Each of these models has been criticised extensively, and there is at this time no generally accepted explanation of the origin of agriculture.

Problems explaining civilisation

A similar problem is posed by the post-agricultural appearance, all over the world, of cities and states, and again there is a large literature devoted to explaining it (e.g. Claessen & Skalnik 1978). The major behavioural changes made in adopting the civilised lifestyle beg explanation. Bledsoe (1987:136) summarised the situation thus:

'There has never been and there is not now agreement on the nature and significance of the rise of civilisation. The questions posed by the problem are simple, yet fundamental. How did civilisation come about? What animus impelled man to forego the independence, intimacies, and invariability of tribal existence for the much larger and more impersonal political complexity we call the state? What forces fused to initiate the mutation that slowly transformed nomadic societies into populous cities with ethnic mixtures, stratified societies, diversified economies and unique cultural forms? Was the advent of civilisation the inevitable result of social evolution and natural laws of progress or was man the designer of his own destiny? Have technological innovations been the motivating force or was it some intangible factor such as religion or intellectual advancement?'

To a very good approximation, every civilisation that came into being had cereal agriculture as its subsistence base, and wherever cereals were cultivated, civilisation appeared. Some hypotheses have linked the two. For example, Wittfogel's (1957) 'hydraulic theory' postulated that irrigation was needed for agriculture, and the state was in turn needed to organise irrigation. But not all civilisations used irrigation, and other possible factors (e.g. river valley placement, warfare, trade, technology, religion, and ecological and population pressure) have not led to a universally accepted model.

Pharmacological properties of cereals and milk

Recent research into the pharmacology of food presents a new perspective on these problems.

Exorphins: opioid substances in food

Prompted by a possible link between diet and mental illness, several researchers in the late 1970s began investigating the occurrence of drug-like substances in some common foodstuffs.

Dohan (1966, 1984) and Dohan et al. (1973, 1983) found that symptoms of schizophrenia were relieved somewhat when patients were fed a diet free of cereals and milk. He also found that people with coeliac disease -- those who are unable to eat wheat gluten because of higher than normal permeability of the gut -- were statistically likely to suffer also from schizophrenia. Research in some Pacific communities showed that schizophrenia became prevalent in these populations only after they became 'partially westernised and consumed wheat, barley beer, and rice' (Dohan 1984).

Groups led by Zioudrou (1979) and Brantl (1979) found opioid activity in wheat, maize and barley (exorphins), and bovine and human milk (casomorphin), as well as stimulatory activity in these proteins, and in oats, rye and soy. Cereal exorphin is much stronger than bovine casomorphin, which in turn is stronger than human casomorphin. Mycroft et al. (1982, 1987) found an analogue of MIF-1, a naturally occurring dopaminergic peptide, in wheat and milk. It occurs in no other exogenous protein. (In subsequent sections we use the term exorphin to cover exorphins, casomorphin, and the MIF-1 analogue. Though opioid and dopaminergic substances work in different ways, they are both 'rewarding', and thus more or less equivalent for our purposes.)

Since then, researchers have measured the potency of exorphins, showing them to be comparable to morphine and enkephalin (Heubner et al. 1984), determined their amino acid sequences (Fukudome & Yoshikawa 1992), and shown that they are absorbed from the intestine (Svedburg et al. 1985) and can produce effects such as analgesia and reduction of anxiety which are usually associated with poppy-derived opioids (Greksch et al. 1981, Panksepp et al. 1984). Mycroft et al. estimated that 150 mg of the MIF-1 analogue could be produced by normal daily intake of cereals and milk, noting that such quantities are orally active, and half this amount 'has induced mood alterations in clinically depressed subjects' (Mycroft et al. 1982:895). (For detailed reviews see Gardner 1985 and Paroli 1988.)

Most common drugs of addiction are either opioid (e.g. heroin and morphine) or dopaminergic (e.g. cocaine and amphetamine), and work by activating reward centres in the brain. Hence we may ask, do these findings mean that cereals and milk are chemically rewarding? Are humans somehow 'addicted' to these foods?

Problems in interpreting these findings

Discussion of the possible behavioural effects of exorphins, in normal dietary amounts, has been cautious. Interpretations of their significance have been of two types:

(1) where a pathological effect is proposed (usually by cereal researchers, and related to Dohan's findings, though see also Ramabadran & Bansinath 1988), and

(2) where a natural function is proposed (by milk researchers, who suggest that casomorphin may help in mother-infant bonding or otherwise regulate infant development).

We believe that there can be no natural function for ingestion of exorphins by adult humans. It may be that a desire to find a natural function has impeded interpretation (as well as causing attention to focus on milk, where a natural function is more plausible). It is unlikely that humans are adapted to a large intake of cereal exorphin, because the modern dominance of cereals in the diet is simply too new. If exorphin is found in cow's milk, then it may have a natural function for cows; similarly, exorphins in human milk may have a function for infants. But whether this is so or not, adult humans do not naturally drink milk of any kind, so any natural function could not apply to them.

Our sympathies therefore lie with the pathological interpretation of exorphins, whereby substances found in cereals and milk are seen as modern dietary abnormalities which may cause schizophrenia, coeliac disease or whatever. But these are serious diseases found in a minority. Can exorphins be having an effect on humankind at large?

Other evidence for 'drug-like' effects of these foods

Research into food allergy has shown that normal quantities of some foods can have pharmacological, including behavioural, effects. Many people develop intolerances to particular foods. Various foods are implicated, and a variety of symptoms is produced. (The term 'intolerance' rather than 'allergy' is often used, as in many cases the immune system may not be involved (Egger 1988:159).) Some intolerance symptoms, such as anxiety, depression, epilepsy, hyperactivity, and schizophrenic episodes, involve brain function (Egger 1988, Scadding & Brostoff 1988).

Radcliffe (1982, quoted in 1987:808) listed the foods at fault, in descending order of frequency, in a trial involving 50 people: wheat (more than 70 per cent of subjects reacted in some way to it), milk (60 per cent), egg (35 per cent), corn, cheese, potato, coffee, rice, yeast, chocolate, tea, citrus, oats, pork, plaice, cane, and beef (10 per cent). This is virtually a list of foods that have become common in the diet following the adoption of agriculture, in order of prevalence. The symptoms most commonly alleviated by treatment were mood change (>50 per cent) followed by headache, musculoskeletal and respiratory ailments.

One of the most striking phenomena in these studies is that patients often exhibit cravings, addiction and withdrawal symptoms with regard to these foods (Egger 1988:170, citing Randolph 1978; see also Radcliffe 1987:808-10, 814, Kroker 1987:856, 864, Sprague & Milam 1987:949, 953, Wraith 1987:489, 491). Brostoff and Gamlin (1989:103) estimated that 50 per cent of intolerance patients crave the foods that cause them problems, and experience withdrawal symptoms when excluding those foods from their diet. Withdrawal symptoms are similar to those associated with drug addictions (Radcliffe 1987:808). The possibility that exorphins are involved has been noted (Bell 1987:715), and Brostoff and Gamlin conclude (1989:230):

'… the results so far suggest that they might influence our mood. There is certainly no question of anyone getting 'high' on a glass of milk or a slice of bread - the amounts involved are too small for that - but these foods might induce a sense of comfort and wellbeing, as food-intolerant patients often say they do. There are also other hormone-like peptides in partial digests of food, which might have other effects on the body.'

There is no possibility that craving these foods has anything to do with the popular notion of the body telling the brain what it needs for nutritional purposes. These foods were not significant in the human diet before agriculture, and large quantities of them cannot be necessary for nutrition. In fact, the standard way to treat food intolerance is to remove the offending items from the patient's diet.

A suggested interpretation of exorphin research

But what are the effects of these foods on normal people? Though exorphins cannot have a naturally selected physiological function in humans, this does not mean that they have no effect. Food intolerance research suggests that cereals and milk, in normal dietary quantities, are capable of affecting behaviour in many people. And if severe behavioural effects in schizophrenics and coeliacs can be caused by higher than normal absorption of peptides, then more subtle effects, which may not even be regarded as abnormal, could be produced in people generally.

The evidence presented so far suggests the following interpretation.

The ingestion of cereals and milk, in normal modern dietary amounts by normal humans, activates reward centres in the brain. Foods that were common in the diet before agriculture (fruits and so on) do not have this pharmacological property. The effects of exorphins are qualitatively the same as those produced by other opioid and/or dopaminergic drugs, that is, reward, motivation, reduction of anxiety, a sense of wellbeing, and perhaps even addiction. Though the effects of a typical meal are quantitatively less than those of doses of those drugs, most modern humans experience them several times a day, every day of their adult lives.

Hypothesis: exorphins and the origin of agriculture and civilisation

When this scenario of human dietary practices is viewed in the light of the problem of the origin of agriculture described earlier, it suggests an hypothesis that combines the results of these lines of enquiry.

Exorphin researchers, perhaps lacking a long-term historical perspective, have generally not investigated the possibility that these foods really are drug-like, and have instead searched without success for exorphin's natural function. The adoption of cereal agriculture and the subsequent rise of civilisation have not been satisfactorily explained, because the behavioural changes underlying them have no obvious adaptive basis.

These unsolved and until-now unrelated problems may in fact solve each other. The answer, we suggest, is this: cereals and dairy foods are not natural human foods, but rather are preferred because they contain exorphins. This chemical reward was the incentive for the adoption of cereal agriculture in the Neolithic. Regular self-administration of these substances facilitated the behavioural changes that led to the subsequent appearance of civilisation.

This is the sequence of events that we envisage.

Climatic change at the end of the last glacial period led to an increase in the size and concentration of patches of wild cereals in certain areas (Wright 1977). The large quantities of cereals newly available provided an incentive to try to make a meal of them. People who succeeded in eating sizeable amounts of cereal seeds discovered the rewarding properties of the exorphins contained in them. Processing methods such as grinding and cooking were developed to make cereals more edible. The more palatable they could be made, the more they were consumed, and the more important the exorphin reward became for more people.

At first, patches of wild cereals were protected and harvested. Later, land was cleared and seeds were planted and tended, to increase quantity and reliability of supply. Exorphins attracted people to settle around cereal patches, abandoning their nomadic lifestyle, and allowed them to display tolerance instead of aggression as population densities rose in these new conditions.

Though it was, we suggest, the presence of exorphins that caused cereals (and not an alternative already prevalent in the diet) to be the major early cultigens, this does not mean that cereals are 'just drugs'. They have been staples for thousands of years, and clearly have nutritional value. However, treating cereals as 'just food' leads to difficulties in explaining why anyone bothered to cultivate them. The fact that overall health declined when they were incorporated into the diet suggests that their rapid, almost total replacement of other foods was due more to chemical reward than to nutritional reasons.

It is noteworthy that the extent to which early groups became civilised correlates with the type of agriculture they practised. That is, major civilisations (in south-west Asia, Europe, India, and east and parts of South-East Asia; central and parts of north and south America; Egypt, Ethiopia and parts of tropical and west Africa) stemmed from groups which practised cereal, particularly wheat, agriculture (Bender 1975:12, Adams 1987:201, Thatcher 1987:212). (The rarer nomadic civilisations were based on dairy farming.)

Groups which practised vegeculture (of fruits, tubers etc.), or no agriculture (in tropical and south Africa, north and central Asia, Australia, New Guinea and the Pacific, and much of north and south America) did not become civilised to the same extent.

Thus major civilisations have in common that their populations were frequent ingesters of exorphins. We propose that large, hierarchical states were a natural consequence among such populations. Civilisation arose because reliable, on-demand availability of dietary opioids to individuals changed their behaviour, reducing aggression, and allowed them to become tolerant of sedentary life in crowded groups, to perform regular work, and to be more easily subjugated by rulers. Two socioeconomic classes emerged where before there had been only one (Johnson & Earle 1987:270), thus establishing a pattern which has been prevalent since that time.

Discussion

The natural diet and genetic change

Some nutritionists deny the notion of a pre-agricultural natural human diet on the basis that humans are omnivorous, or have adapted to agricultural foods (e.g. Garn & Leonard 1989; for the contrary view see, for example, Eaton & Konner 1985). An omnivore, however, is simply an animal that eats both meat and plants: it can still be quite specialised in its preferences (chimpanzees are an appropriate example). A degree of omnivory in early humans might have preadapted them to some of the nutrients contained in cereals, but not to exorphins, which are unique to cereals.

The differential rates of lactase deficiency, coeliac disease and favism (the inability to metabolise fava beans) among modern racial groups are usually explained as the result of varying genetic adaptation to post-agricultural diets (Simopoulos 1990:27-9), and this could be thought of as implying some adaptation to exorphins as well. We argue that little or no such adaptation has occurred, for two reasons: first, allergy research indicates that these foods still cause abnormal reactions in many people, and that susceptibility is variable within as well as between populations, indicating that differential adaptation is not the only factor involved. Second, the function of the adaptations mentioned is to enable humans to digest those foods, and if they are adaptations, they arose because they conferred a survival advantage. But would susceptibility to the rewarding effects of exorphins lead to lower, or higher, reproductive success? One would expect in general that an animal with a supply of drugs would behave less adaptively and so lower its chances of survival. But our model shows how the widespread exorphin ingestion in humans has led to increased population. And once civilisation was the norm, non-susceptibility to exorphins would have meant not fitting in with society. Thus, though there may be adaptation to the nutritional content of cereals, there will be little or none to exorphins. In any case, while contemporary humans may enjoy the benefits of some adaptation to agricultural diets, those who actually made the change ten thousand years ago did not.

Other 'non-nutritional' origins of agriculture models

We are not the first to suggest a non-nutritional motive for early agriculture. Hayden (1990) argued that early cultigens and trade items had more prestige value than utility, and suggested that agriculture began because the powerful used its products for competitive feasting and accrual of wealth. Braidwood et al. (1953) and later Katz and Voigt (1986) suggested that the incentive for cereal cultivation was the production of alcoholic beer:

'Under what conditions would the consumption of a wild plant resource be sufficiently important to lead to a change in behaviour (experiments with cultivation) in order to ensure an adequate supply of this resource? If wild cereals were in fact a minor part of the diet, any argument based on caloric need is weakened. It is our contention that the desire for alcohol would constitute a perceived psychological and social need that might easily prompt changes in subsistence behaviour' (Katz & Voigt 1986:33).

This view is clearly compatible with ours. However there may be problems with an alcohol hypothesis: beer may have appeared after bread and other cereal products, and been consumed less widely or less frequently (Braidwood et al. 1953). Unlike alcohol, exorphins are present in all these products. This makes the case for chemical reward as the motive for agriculture much stronger. Opium poppies, too, were an early cultigen (Zohari 1986). Exorphin, alcohol, and opium are primarily rewarding (as opposed to the typically hallucinogenic drugs used by some hunter-gatherers) and it is the artificial reward which is necessary, we claim, for civilisation. Perhaps all three were instrumental in causing civilised behaviour to emerge.

Cereals have important qualities that differentiate them from most other drugs. They are a food source as well as a drug, and can be stored and transported easily. They are ingested in frequent small doses (not occasional large ones), and do not impede work performance in most people. A desire for the drug, even cravings or withdrawal, can be confused with hunger. These features make cereals the ideal facilitator of civilisation (and may also have contributed to the long delay in recognising their pharmacological properties).

Compatibility, limitations, more data needed

Our hypothesis is not a refutation of existing accounts of the origins of agriculture, but rather fits alongside them, explaining why cereal agriculture was adopted despite its apparent disadvantages and how it led to civilisation.

Gaps in our knowledge of exorphins limit the generality and strength of our claims. We do not know whether rice, millet and sorghum, or the grass species harvested by African and Australian hunter-gatherers, contain exorphins. We need to be sure that pre-agricultural staples do not contain exorphins in amounts similar to those in cereals. We do not know whether domestication has affected exorphin content or potency. A test of our hypothesis by correlation of diet and degree of civilisation in different populations will require quantitative knowledge of the behavioural effects of all these foods.

We do not comment on the origin of noncereal agriculture, nor why some groups used a combination of foraging and farming, reverted from farming to foraging, or did not farm at all. Cereal agriculture and civilisation have, during the past ten thousand years, become virtually universal. The question, then, is not why they happened here and not there, but why they took longer to become established in some places than in others. At all times and places, chemical reward and the influence of civilisations already using cereals weighed in favour of adopting this lifestyle, the disadvantages of agriculture weighed against it, and factors such as climate, geography, soil quality, and availability of cultigens influenced the outcome. There is a recent trend to multi-causal models of the origins of agriculture (e.g. Redding 1988, Henry 1989), and exorphins can be thought of as simply another factor in the list. Analysis of the relative importance of all the factors involved, at all times and places, is beyond the scope of this paper.

Conclusion

'An animal is a survival machine for the genes that built it. We too are animals, and we too are survival machines for our genes. That is the theory. In practice it makes a lot of sense when we look at wild animals. It is very different when we look at ourselves. We appear to be a serious exception to the Darwinian law. It obviously just isn't true that most of us spend our time working energetically for the preservation of our genes' (Dawkins 1989:138).

Many ethologists have acknowledged difficulties in explaining civilised human behaviour on evolutionary grounds, in some cases suggesting that modern humans do not always behave adaptively. Yet since agriculture began, the human population has risen by a factor of 1000: Irons (1990) notes that 'population growth is not the expected effect of maladaptive behaviour'.

We have reviewed evidence from several areas of research which shows that cereals and dairy foods have drug-like properties, and shown how these properties may have been the incentive for the initial adoption of agriculture. We suggested further that constant exorphin intake facilitated the behavioural changes and subsequent population growth of civilisation, by increasing people's tolerance of (a) living in crowded sedentary conditions, (b) devoting effort to the benefit of non-kin, and (c) playing a subservient role in a vast hierarchical social structure.

Cereals are still staples, and methods of artificial reward have diversified since that time, including today a wide range of pharmacological and non-pharmacological cultural artifacts whose function, ethologically speaking, is to provide reward without adaptive benefit. It seems reasonable then to suggest that civilisation not only arose out of self-administration of artificial reward, but is maintained in this way among contemporary humans. Hence a step towards resolution of the problem of explaining civilised human behaviour may be to incorporate into ethological models this widespread distortion of behaviour by artificial reward.

References

Adams, W. M., 1987, Cereals before cities except after Jacobs, in M. Melko & L. R. Scott, eds, The boundaries of civilizations in space and time, University Press of America, Lanham.

Bell, I. R., 1987, Effects of food allergy on the central nervous system, in J. Brostoff and S. J. Challacombe, eds, Food allergy and intolerance, Bailliere Tindall, London.

Bender, B., 1975, Farming in prehistory: from hunter-gatherer to food producer, John Baker, London.

Bledsoe, W., 1987, Theories of the origins of civilization, in M. Melko and L. R. Scott, eds, The boundaries of civilizations in space and time, University Press of America, Lanham.

Blumler, M., & Byrne, R., 1991, The ecological genetics of domestication and the origins of agriculture, Current Anthropology 32: 2-35.

Braidwood, R. J., Sauer, J. D., Helbaek, H., Mangelsdorf, P. C., Cutler, H. C., Coon, C. S., Linton, R., Steward, J. & Oppenheim, A. L., 1953, Symposium: did man once live by beer alone? American Anthropologist 55:515-26.

Brantl, V., Teschemacher, H., Henschen, A. & Lottspeich, F., 1979, Novel opioid peptides derived from casein (beta-casomorphins), Hoppe-Seyler's Zeitschrift fur Physiologische Chemie 360:1211-6.

Brostoff, J., & Gamlin, L., 1989, The complete guide to food allergy and intolerance, Bloomsbury, London.

Chang, T. T., 1989, Domestication and the spread of the cultivated rices, in D.R. Harris and G.C. Hillman, eds, Foraging and farming: the evolution of plant exploitation, Unwin Hyman, London.

Claessen, H. J. M. & Skalnik P., eds, 1978, The early state, Mouton, The Hague.

Cohen, M. N., 1977, Population pressure and the origins of agriculture: an archaeological example from the coast of Peru, in Reed, C.A., ed., The origins of agriculture, Mouton, The Hague.

Cohen, M. N., 1989, Health and the rise of civilization, Yale University Press, New Haven.

Constantini, L., 1989, Plant exploitation at Grotta dell'Uzzo, Sicily: new evidence for the transition from Mesolithic to Neolithic subsistence in southern Europe, in Harris, D. R. & Hillman, G. C., eds, Foraging and farming: the evolution of plant exploitation, Unwin Hyman, London.

Dawkins, R., 1989, Darwinism and human purpose, in Durant, J. R., ed., Human origins, Clarendon Press, Oxford.

Dohan, F., 1966, Cereals and schizophrenia: data and hypothesis, Acta Psychiatrica Scandinavica 42:125-52.

Dohan, F., 1983, More on coeliac disease as a model of schizophrenia, Biological Psychiatry 18:561-4.

Dohan, F. & Grasberger, J., 1973, Relapsed schizophrenics: earlier discharge from the hospital after cereal-free, milk-free diet, American Journal of Psychiatry 130:685-8.

Dohan, F., Harper, E., Clark, M., Ratigue, R., & Zigos, V., 1984, Is schizophrenia rare if grain is rare? Biological Psychiatry 19: 385-99.

Eaton, S. B. & Konner, M., 1985, Paleolithic nutrition - a consideration of its nature and current implications, New England Journal of Medicine 312: 283-90.

Egger, J., 1988, Food allergy and the central nervous system, in Reinhardt, D. & Schmidt E., eds, Food allergy, Raven, New York.

Flannery, K. V., 1973, The origins of agriculture, Annual Review of Anthropology 2:271-310.

Fukudome, S., & Yoshikawa, M., 1992, Opioid peptides derived from wheat gluten: their isolation and characterization, FEBS Letters 296:107-11.

Gardner, M. L. G., 1985, Production of pharmacologically active peptides from foods in the gut, in Hunter, J. & Alun-Jones, V., eds, Food and the gut, Bailliere Tindall, London.

Garn, S. M. & Leonard, W. R., 1989, What did our ancestors eat? Nutrition Reviews 47:337-45.

Gordon, K. D., 1987, Evolutionary perspectives on human diet, in Johnston, F., ed, Nutritional Anthropology, Alan R. Liss, New York.

Greksch, G., Schweiger C., Matthies, H., 1981, Evidence for analgesic activity of beta-casomorphin in rats, Neuroscience Letters 27:325

Harlan, J. R., 1986, Plant domestication: diffuse origins and diffusion, in Barigozzi, G., ed., The origin and domestication of cultivated plants, Elsevier, Amsterdam.

Harris, D. R., 1977, Alternative pathways towards agriculture, in Reed, C. A., ed., The origins of agriculture, Mouton, The Hague.

Harris, D. R. & Hillman, G. C., eds, 1989, Foraging and farming: the evolution of plant exploitation, Unwin Hyman, London.

Hayden, B., 1990, Nimrods, piscators, pluckers, and planters: the emergence of food production, Journal of Anthropological Archaeology 9:31-69.

Henry, D. O., 1989, From foraging to agriculture: the Levant at the end of the ice age, University of Pennsylvania Press, Philadelphia.

Heubner, F., Liebeman, K., Rubino, R. & Wall, J., 1984, Demonstration of high opioid-like activity in isolated peptides from wheat gluten hydrolysates, Peptides 5:1139-47.

Irons, W., 1990, Let's make our perspective broader rather than narrower, Ethology and Sociobiology 11: 361-74

Johnson, A. W. & Earle, T., 1987, The evolution of human societies: from foraging group to agrarian state, Stanford University Press, Stanford.

Katz, S. H. & Voigt, M. M., 1986, Bread and beer: the early use of cereals in the human diet, Expedition 28:23-34.

Kay, R. F., 1985, Dental evidence for the diet of Australopithecus, Annual Review of Anthropology 14:315-41.

Kroker, G. F., 1987, Chronic candiosis and allergy, in Brostoff, J. & Challacombe, S.J., eds, Food allergy and intolerance, Bailliere Tindall, London.

Lee, R. B. & DeVore, I., 1968, Problems in the study of hunters and gatherers, in Lee, R.B. & DeVore, I., eds, Man the hunter, Aldine, Chicago.

Mycroft, F. J., Wei, E. T., Bernardin, J. E. & Kasarda, D. D., 1982, MIF-like sequences in milk and wheat proteins, New England Journal of Medicine 301:895.

Mycroft, F. J., Bhargava, H. N. & Wei, E. T., 1987, Pharmacological activities of the MIF-1 analogues Pro-Leu-Gly, Tyr-Pro-Leu-Gly and pareptide, Peptides 8:1051-5.

Panksepp, J., Normansell, L., Siviy, S., Rossi, J. & Zolovick, A., 1984, Casomorphins reduce separation distress in chicks, Peptides 5:829-83.

Paroli, E., 1988, Opioid peptides from food (the exorphins), World review of nutrition and dietetics 55:58-97.

Pedersen, B., Knudsen, K. E. B. & Eggum, B. O., 1989, Nutritive value of cereal products with emphasis on the effect of milling, World review of nutrition and dietetics 60:1-91.

Peters, C. R. & O'Brien, E. M., 1981, The early hominid plant-food niche: insights from an analysis of plant exploitation by Homo, Pan, and Papio in eastern and southern Africa, Current Anthropology 22:127-40.

Peuch, P., Albertini, H. & Serratrice, C., 1983, Tooth microwear and dietary patterns in early hominids from Laetoli, Hadar, and Olduvai, Journal of Human Evolution 12:721-9.

Pfeiffer, J. E., 1977, The emergence of society: a prehistory of the establishment, McGraw Hill, New York.

Pryor, F. L., 1986, The adoption of agriculture: some theoretical and empirical evidence, American Anthropologist 88:879-97.

Radcliffe, M. J., 1987, Diagnostic use of dietary regimes, in Brostoff, J. & Challacombe, S. J., eds, Food allergy and intolerance, Bailliere Tindall, London.

Ramabadran, K. & Bansinath, M., 1988, Opioid peptides from milk as a possible cause of Sudden Infant Death Syndrome, Medical Hypotheses 27:181-7.

Randolph, T. G., 1978, Specific adaptation, Annals of Allergy 40:333-45.

Redding, R., 1988, A general explanation of subsistence change from hunting and gathering to food production, Journal of Anthropological Archaeology 7:56-97.

Reed, C. A., ed., 1977, The origins of agriculture, Mouton, The Hague.

Rindos, D., 1984, The origins of agriculture: an evolutionary perspective, Academic Press, Orlando.

Scadding, G. K. & Brostoff, J., 1988, The dietetic treatment of food allergy, in Reinhardt, D. & Schmidt, E., eds, Food allergy, Raven, New York.

Simopoulos, A. P., 1990, Genetics and nutrition: or what your genes can tell you about nutrition, World review of nutrition and dietetics 63:25-34.

Sprague, D. E. & Milam, M. J., 1987, Concept of an environmental unit, in Brostoff, J. & Challacombe, S. J., eds, Food allergy and intolerance, Bailliere Tindall, London.

Stark, B. L., 1986, Origins of food production in the New World, in Meltzer, D. J., Fowler, D. D. & Sabloff, J. A., eds, American archaeology past and future, Smithsonian Institution Press, Washington.

Susman, R. L., 1987, Pygmy chimpanzees and common chimpanzees: models for the behavioural ecology of the earliest hominids, in Kinzey, W. G., ed., The evolution of human behaviour: primate models, State University of New York Press, Albany.

Svedburg, J., De Haas, J., Leimenstoll, G., Paul, F. & Teschemacher, H., 1985, Demonstration of beta-casomorphin immunoreactive materials in in-vitro digests of bovine milk and in small intestine contents after bovine milk ingestion in adult humans, Peptides 6:825-30.

Thatcher, J. P., 1987, The economic base for civilization in the New World, in Melko, M. & Scott, L. R., eds, The boundaries of civilizations in space and time, University Press of America, Lanham.

Walker, A., 1981, Dietary hypotheses and human evolution, Philosophical Transactions of the Royal Society of London B292:57-64.

Washburn, S. L. & Lancaster, C. S., 1968, The evolution of hunting, in Lee, R. B. & DeVore, I., eds, Man the hunter, Aldine, Chicago.

Wittfogel, K., 1957, Oriental Despotism, Yale University Press, New Haven.

Wraith, D. G., 1987, Asthma, in Brostoff, J. & Challacombe, S. J., eds, Food allergy and intolerance, Bailliere Tindall, London.

Wright, H. E., 1977, Environmental changes and the origin of agriculture in the Near East, in Reed, C. A., ed, The origins of agriculture, Mouton, The Hague.

Zioudrou, C., Streaty, R. & Klee, W., 1979, Opioid peptides derived from food proteins: the exorphins, Journal of Biological Chemistry 254:2446-9.

Zohari, D., 1986, The origin and early spread of agriculture in the Old World, in Barigozzi, G., ed., The origin and domestication of cultivated plants, Elsevier, Amsterdam.


Ancient Veggies Were Small, Unpalatable

Most of what the prototypical Fred and Wilma consumed simply isn't available today. Modern chickens, cows, sheep, and goats are plumper, more placid, and genetically different from their feral ancestors. Paleolithic fruit, though often smaller and tarter than modern varieties, was recognizably fruit. Apples, grapes, figs, plums, and pears have been tempting mammals for tens, if not hundreds, of thousands of years. But Paleolithic vegetables are another story. In fact, the Paleolithic veggie might easily be the subject of the Woody Allen joke about the two elderly women at a Catskill Mountain resort, who complain that not only is the food bad, the portions are so small, too.

Ancient tomatoes were the size of berries; potatoes were no bigger than peanuts. Corn was a wild grass, its tooth-cracking kernels borne in clusters as small as pencil erasers. Cucumbers were spiny as sea urchins; lettuce was bitter and prickly. Peas were so starchy and unpalatable that, before eating, they had to be roasted like chestnuts and peeled. The sole available cabbage—the great-great-granddaddy of today's kale, kohlrabi, broccoli, Brussels sprouts, and cauliflower—was sea kale, a tough and tongue-curling leafy weed that grew along the temperate sea coasts. Carrots were scrawny. Beans were naturally laced with cyanide.

The vegetables that grace every salad bar today are latecomers. Vegetables didn't really get off, or out of, the ground until the Neolithic Period, the civilized tail end of the Stone Age, generally said to have begun about 10,000 years ago. The Neolithic is when we gave up the careless, footloose lifestyle of the hunter-gatherer and began to settle down on farms and in villages. Pottery was invented; animals were domesticated. We began to worry about drought, weeds, and grasshoppers, and somewhere in there, almost certainly, we coined the prehistoric words for "backache," "blister," and "chore."

Through painstaking selection and cultivation, Neolithic farmers, the world's first and most patient genetic engineers, produced over the next centuries fat, lush, and yummy vegetable varieties, the descendants of which are still on our plates today. Human beings, collectively, have done a lot of great stuff. We've invented the printing press, built the Great Wall of China, discovered penicillin, gone to the Moon. But perhaps the greatest and earliest of our accomplishments were those of a scattering of Freds and Wilmas armed with stone hoes and digging sticks.

Because of them, nobody has to eat a paleo diet anymore.



The history of food, diet and nutrition

Our ancestors collected food from nature in order to survive, and humans have more than two million years of established dietary habits. It is believed that the preparation of meals began more than 500,000 years ago. The oldest descriptions of food and meals, and of their effects on health, come from the ancient Egyptians and date to 3200 BC.

The connection between food and health has been recognized throughout human history. All the so-called "non-scientific" knowledge accumulated from ancient times to the 18th century created the basis for the modern science of nutrition.

The discovery of fire, writing and the science of nutrition are very young indeed compared to the age of our species. The modern science of nutrition is approximately 200 years old, dating from the pioneering work of the French chemist Lavoisier.

The history and development of food and nutrition can be roughly divided into three important periods: the pre-agricultural age; the age of agriculture, which began around 10,000 BC; and the agro-industrial age, which began some 150 years ago. If this time span were squeezed into a single year, with humans appearing on January 1st, the agricultural age would start in the second half of December, and the agro-industrial age on the evening of December 31st.
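The scaling behind that calendar analogy is plain proportion; here is a short sketch, taking as the whole span the three million years given below for the pre-agricultural era (the spans mapped are the ones named in the text).

```python
# Proportional mapping of "years before present" onto one calendar year,
# assuming the whole human span is the 3,000,000 years cited below.

TOTAL_YEARS = 3_000_000
DAYS_PER_YEAR = 365

def days_before_new_year(years_ago):
    """Days before midnight, Dec 31 at which an event lands on this scale."""
    return years_ago / TOTAL_YEARS * DAYS_PER_YEAR

print(days_before_new_year(12_000))         # ~1.5 days: late December
print(days_before_new_year(150) * 24 * 60)  # ~26 minutes: the evening of Dec 31
```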

The pre-agricultural era began three million years ago and is characterized by the collection of food, hunting and fishing, and the development of tools. In the beginning food was eaten raw, but after the discovery of fire humans also ate cooked food. The search for and collection of food played a major role in the bio-cultural development of man: hunting, preparing food and gathering around the fire contributed to socialization, and food and nutrition became an integral part of the community. Towards the end of this period food was eaten raw, cooked or fermented, and various items were used for collecting, handling, storing, preparing and eating it: shells, turtle shells, tree bark, and later clay pots.

The era of agriculture is based on the cultivation of crops and the domestication of animals, which then became a major part of the human food supply. During this period agriculture gradually developed on fertile soils, almost simultaneously in several places in the world (the Mediterranean, the Middle East and the Far East), giving rise to settlements, nations and empires. On the European continent we see the domestication of wheat, oats, peas, lentils and flax, and of animals such as dogs, pigs, goats, sheep and cattle, while people everywhere introduced various tools for working the land. With the discovery of the New World and the development of trade, humans distributed a variety of plants and animals to every corner of the world. Since that time Europe has taken up the cultivation of corn, potatoes, tomatoes, beans, peppers, sunflowers and tobacco.

With the development of capitalism in the early 16th century, a new way of thinking evolved, based on the discoveries of the Renaissance and the Reformation: the "new agriculture" spread its wings, its main goals being to increase grain production and to diversify food consumption. The need to grow more crops was closely tied to the production of more fertilizer, more animals and more animal feed, and the interdependence of cultivated plants and animals increased manifold.

The agro-industrial era began some 150 years ago. Rigorous experimentation and new discoveries in chemistry, biology, microbiology and mechanics during the 19th century shaped the development of agriculture as a science, along with its main branches: genetics, nutrition (in the broad sense, covering the entire process of assimilation and energy in a living organism, not just the human diet) and hygiene (the protection of plants from disease and insects).

The agro-industrial era is characterized by a combination of agricultural and industrial activity: machines were introduced into agriculture, the production of food and raw materials increased, the building of roads and railways expanded the transport of goods, and the food industry developed rapidly, particularly owing to the creation of refrigeration chains, food preservation techniques, and new household appliances (e.g. the refrigerator).

Under the pressure of industrialization, basic agricultural products were converted into agro-industrial products: new technologies such as canning, concentration and extraction became common in food production.

In 1804 Nicolas Appert discovered a new method to extend the shelf life of food – sterilization – and the first industrial plant was built in France in 1860. The scientific basis for the sterilization process was provided by Pasteur, whose pasteurization method is now used in many fields of human activity, not only in food preparation.

In the late 19th century Nestlé produced the first condensed milk, and J. Liebig made the first meat extract and concentrated soups. In 1869 Mège-Mouriès manufactured the first margarine. Little by little, agro-industrial products replaced agricultural products (e.g. industrial butter replaced butter from domestic production). Fast food is the most recent agro-industrial product: semi-finished and finished products that drastically reduce the work of preparing food in the household.

The development of the nutrition science

Nutritional science began with modern chemistry and its founder Antoine Lavoisier in the late 18th century. The basis for the new science of human nutrition was laid through knowledge of general chemistry (the identification of elements and compounds), the development of chemical analysis, biochemistry and physiology, and the scientific, quantitative testing of old and new theories and ideas. The development of nutrition science depended largely on the development of analytical chemistry and general physiology.

Before Lavoisier: naturalistic era – from Hippocrates to Lavoisier

The Greek physician Hippocrates (460-377 BC) knew that the same food and drink cannot be given to healthy and sick people alike. In the 1st century, Cornelius Celsus considered the treatment of patients with diet the hardest, but the best, part of medicine.

The teachings of Galen (131-201) dominated European medicine for more than a thousand years. He was known for prescribing fasting in the treatment of many diseases. Anthimus (511-534) described a hundred foods in the book "Epistula de Observatione Ciborum".

Sigmund Albich, a Czech physician, wrote one of the first books on dietetics, "Dietetics for Old Men". The Italian physiologist Sanctorius (1561-1636) weighed all the food he consumed, as well as his body fluids, for over thirty years. He also wrote a treatise on metabolism.

John Mayow (1641-1679) found that muscular work depends on the combustion of certain chemical compounds. The English physician William Stark tested harmful and harmless foods on himself.

Many folk remedies, and some foods, were used to treat diseases. By about the year 1550 it was already known that citrus fruits prevent and cure scurvy. The traditional folk remedy for vision problems was cooked liver (from domestic and wild animals), while dried seaweed, dried sea sponges, or their ashes obtained by incineration were old folk remedies for goiter.

1747 – James Lind, a Scottish naval physician, performed the first modern controlled clinical study, using different potential antiscorbutics. Lind divided twelve scorbutic sailors into pairs, and each pair received a different therapy. The sailors who got lemons and oranges were nearly healed after 6 days, while those treated with dilute sulfuric acid or vinegar showed no improvement even after two weeks. At the time it was believed that scurvy could be treated with the acid in citrus fruits; since citrus fruits spoiled on longer voyages, stronger and more stable acids such as dilute sulfuric acid and vinegar (acetic acid) were used instead. It was also thought that scurvy was exclusively a sailors' disease that did not appear in other people.

1750 – Scurvy was first treated with lime juice.

1768-1771 – James Cook (1728-1779) required his sailors to eat sauerkraut and citrus fruit to prevent scurvy, though no one at the time knew how these foods worked.

Modern Nutrition – Chemical analytical era

1777 – The most important experiments of Antoine Lavoisier (1743-1794) are directly linked to the development of nutrition. Lavoisier proved that combustion involves the combination of various chemical substances with oxygen, and that plant and animal respiration is a slow combustion of organic matter using oxygen from the atmosphere. With Pierre-Simon Laplace he demonstrated the connection between the heat and the CO2 produced by animals. Lavoisier measured oxygen consumption and CO2 release in man, and observed that both increase after eating and during physical exertion. These experiments led him to the conclusion "La vie est donc une combustion" – "Life is thus a combustion." His life was ended by the guillotine of the French Revolution.

1812 – After the discovery of the chemical element iodine, a French chemist suggested using iodine to treat goiter. The idea soon fell into oblivion, since elemental iodine showed no effect.

1816 – Francois Magendie concluded after animal experiments that “the diversity of food is especially important in hygiene, and this diversity is achieved by our instincts.”

1823-1827 – The English chemist and physician William Prout (1785-1850) isolated hydrochloric acid from the human stomach. He proposed that food is composed of three basic components – proteins, fats and carbohydrates – and recognized that these substances should be consumed daily.

1830-1850 – Rickets is treated with fish oil and butter.

1833 – The American William Beaumont showed that hydrochloric acid, already known to chemists, is secreted in the stomach after a meal.

1838 – The Swedish chemist Jöns Jacob Berzelius (1779-1848) coined the term "protein". He is considered, together with Lavoisier, a father of modern chemistry.

1839 – The French chemist Jean Baptiste Boussingault conducted the first nitrogen balance study. Balance studies with various substances are carried out even today, e.g., studies of calcium retention in the body at high dietary intakes or when using supplements.

1839 – The Dutchman Gerrit Mulder developed a protein theory: he believed that "animal products" (the proteins albumin, fibrin and casein) originate from the same "protein" radical, differing only in their share of phosphorus, sulfur, or both elements.

1842 – Justus Liebig (1803-1873), a German chemist, experienced expert in organic chemistry and influential scholar, worked on food chemistry and connected it with physiology. Observing that muscle contains no carbohydrate or fat, he concluded that the energy for muscle contraction must come from the decomposition of protein. He believed that proteins were the only true nutrients – the only ingredient able to build and replace active tissue and provide the body with energy. This theory was later challenged by many chemists.

1842 – Budd treats night blindness with fish liver oil.

1850 – Claude Bernard described pancreatic secretions and the emulsifying ability of bile, showing that they play an important role in the digestion and absorption of fat. He concluded that the central role in digestion cannot be attributed to the stomach alone, as his contemporaries believed.

1850 – After nearly 100 years of uncertainty in the treatment of scurvy, A. Bryson concludes that citric acid has no anti-scorbutic activity.

1866 – The Englishman Edward Frankland developed a technique for directly measuring the energy of combustion of foods and of urea. He determined experimentally that 1 gram of protein yields 4.37 kcal. Reviewing his colleagues' experiments, Frankland concluded that most of the energy for muscular work must come from fats and/or carbohydrates, thus challenging Liebig's protein hypothesis.

1880-1900 – With the discovery of many microbes, hygiene and sanitation gained in importance. At the time it was believed that the most common nutritional diseases were caused by microorganisms or their toxins. By 1885 the modern science of nutrition dealt mainly with proteins and energy metabolism, but over the next 60 years the gradually discovered food factors (vitamins) were finally connected with the development of various diseases. Scientists focused their research on anemia, beriberi (polyneuritis), rickets, night blindness, goiter and others.

1887 – The American Wilbur Olin Atwater (1844-1907), inspired by the German school of Carl Voit, set the American standard for protein at 125 g/day. The standard for German workers was 118 g/day, and Voit held that vegetarians, despite remaining in nitrogen balance, exhibited "inconveniences". Many experts considered an essential daily protein intake of more than 100 g/day necessary. After a few years, however, Atwater concluded that these figures were excessive and recommended controlled studies to determine how nutrients affect metabolism and muscular work. Reviewing his own research, Atwater grew concerned over findings that the U.S. population consumed too much food, especially fats and sweets, and did not exercise enough.

1890 – Ralph Stockman treated anemia with subcutaneous injections of iron citrate and with iron sulphide capsules, obtaining very good results.

1894 – The German physiologist and hygienist Max Rubner (1854-1932) quantitatively determined the calorific value of protein, fat and carbohydrates, whether metabolized in a living organism or simply burned in a calorimeter. He proved experimentally that the heat of warm-blooded animals comes from the energy of food nutrients.

1896 – Atwater and E.B. Rosa determined the calorific value of many foods and thus created the first caloric food tables. Atwater conducted food analyses, quantified food ingredients, and determined the energy output of physical activity and of food digestion. In 1896 Atwater and colleagues published a compilation of 2,600 chemical analyses of foods; in 1899 the publication was augmented by another 5,000 analyses. The second edition of the chemical analyses of foods, published in 1906, included the maximum, minimum and average values for moisture, protein, fat, total carbohydrate, ash and energy. Atwater's main objective in preparing these tables was to teach the poor how to achieve an appropriate level of protein in the diet.
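
Atwater's analyses eventually yielded the general energy-conversion factors still used on food labels today (roughly 4 kcal per gram of protein, 9 per gram of fat and 4 per gram of carbohydrate). As a minimal illustrative sketch in Python – using these modern rounded factors rather than the food-specific values of Atwater's original tables, and an approximate composition for cooked rice – the energy of a food portion can be estimated like this:

    # Rounded "general Atwater factors" (kcal per gram). Atwater's own tables
    # listed food-specific values, so these rounded numbers are a simplification.
    ATWATER_KCAL_PER_G = {"protein": 4.0, "fat": 9.0, "carbohydrate": 4.0}

    def food_energy_kcal(protein_g, fat_g, carbohydrate_g):
        """Estimate the energy of a food portion in kilocalories."""
        return (protein_g * ATWATER_KCAL_PER_G["protein"]
                + fat_g * ATWATER_KCAL_PER_G["fat"]
                + carbohydrate_g * ATWATER_KCAL_PER_G["carbohydrate"])

    # Approximate composition of 100 g of cooked white rice:
    # ~2.7 g protein, ~0.3 g fat, ~28 g carbohydrate -> about 126 kcal
    print(round(food_energy_kcal(2.7, 0.3, 28.0)))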

1899 – Atwater and E.B. Rosa built the most accurate respiration calorimeter of their day for the study of human metabolism. We recommend the papers of Atwater and Rosa (1899), Atwater and Benedict (1905), and Benedict and Carpenter (1910) to anyone engaged in research on nutrition and physical activity; they remain relevant today, with detailed technical data and experimental procedures that can still be used for measuring energy expenditure. Atwater definitively confirmed that the first law of thermodynamics applies to the human body just as it does to the matter around us. Atwater's comment from 1895 could have been spoken today:

“Food is a material that, when put into the body, is used to form the tissue or to create energy, or both. This definition includes all the usual materials of food, since they build tissues and produce energy. It includes sugars and starches, because they produce energy and form adipose tissue. It includes alcohol, because it gives energy, though it does not form tissues. Food does not include creatine, creatinine, and other nitrogen extracts from meat, as well as theine and caffeine, because they do not serve in the formation of tissues, or to obtain energy, although they may in some cases be a good asset to the diet.”

Biological era – from the beginning of the 20th century onwards

Thanks to advances in biochemistry and physiology, 20th-century nutrition science studied the role of macronutrients and micronutrients (vitamins and minerals). Scientists used various combinations of purified nutrients (proteins, carbohydrates and fats) to induce nutrient deficiencies in animals in order to identify the missing nutrient. In the first half of the 20th century nutritionists discovered the amino acids and the essential fatty acids. In the second half of the century the emphasis shifted to exploring the role of essential nutrients and discovering how vitamins and minerals act on enzymes and hormones. Large epidemiological studies of the 1960s and 1970s showed the effect of carbohydrates, fiber and fat on the development of the diseases of civilization, such as diabetes, constipation and atherosclerosis.

1902 – The German chemist Emil Fischer (1852-1919) was one of the greatest chemists of modern times, and many of his achievements matter for nutrition: he identified the active ingredients of tea, coffee and cocoa. From 1882 until 1906 he worked out the structures of the 16 aldohexose stereoisomers, the most important being glucose. He synthesized glucose, fructose and mannose, and discovered adenine, xanthine and guanine, which belong to the purine family. He also contributed significantly to the understanding of proteins and the isolation of amino acids: Fischer synthesized peptides, polypeptides and proteins, and discovered the peptide bond. In 1902 Fischer was awarded the Nobel Prize in Chemistry for his work on sugar and purine syntheses. Together with Frederick Hopkins he established the primary importance of amino acids as the basic components of proteins.

The English biochemist and physiologist Frederick Hopkins (1861-1947) was a pioneer in the study of vitamins and the first scientist to isolate tryptophan and glutathione. In 1929 Hopkins received the Nobel Prize for his research on nutritional deficiency diseases.

1904 – On completing his own studies, Russell Chittenden (1856-1943) rejected the high protein standards set by the U.S. and German schools. His statement is notable:

“People did not become rich because they eat more protein; they eat more protein and more expensive high-protein foods because they can afford them.”

Chittenden maintained nitrogen balance at 60 g of protein per day, less than half the standards then recommended. Interestingly, researchers throughout the 19th century believed that the human body absorbs proteins intact and then transforms them into the required form. Although it was known from the beginning of the 20th century that pancreatic juice contains a protein-dissolving substance, trypsin, the degradation products held little interest for the nutritionists of the time. Amino acids were considered to represent an excess of protein that breaks down, becomes unusable and is therefore excreted from the body.

1905-1950 – The intensive search for “food factors” (vitamins) and other food ingredients and their effects on human health.

1912 – The thirty-year quest for thiamine (vitamin B1) finally concluded. Beriberi, the disease of thiamine deficiency, occurred in most parts of Asia, all the way to Japan. While the Japanese doctor Kanehiro Takaki believed the cause of the disease was insufficient protein intake, the Dutchman Christiaan Eijkman (1858-1930), working in Jakarta, looked for a microbial cause. Persistent and methodical, Eijkman very quickly ruled out microbes. Analyzing the inconsistent results of experiments on chickens, he found that servants had sometimes been feeding the chickens cooked rice. He concluded that the problem lay in the cooked rice and began to study this food. Eijkman soon learned that the military used polished rice, because brown rice spoiled quickly (turning rancid) in tropical conditions. He finally discovered that the real cause of beriberi was the lack of a substance located in the rice bran. This later led to the concept and discovery of vitamins, and Eijkman was awarded the Nobel Prize in Physiology or Medicine in 1929. Other researchers continued Eijkman's work, most significantly Gerrit Grijns, who stated in 1901:

“The absence of natural food substances leads to serious damage to the peripheral nervous system. These substances are distributed differently in food, and are very difficult to isolate because they are unstable. These substances cannot be replaced by other simple molecules.”

In 1905 Dutch researchers in Indonesia showed that beriberi was caused by the consumption of polished rice lacking a thermolabile component. Many tried to isolate this component from rice; the first to succeed was the Pole Casimir Funk (1884-1967). He discovered the substance whose lack caused nutritional polyneuritis, and he introduced the term "vitamine" (later shortened to "vitamin"). Funk drew attention with his work on vitamin deficiency diseases, and he later postulated the existence of three more vitamins – B2, C and D – which he claimed were necessary for normal health and disease prevention. Funk believed that the small amounts of vitamins naturally present in a variety of foods can prevent poor growth and some diseases.

1912 – The Norwegians Axel Holst and Theodor Frølich discovered vitamin C.

1912-1914 – Elmer McCollum and Marguerite Davis discovered vitamin A. In 1913 scientists at Yale University discovered the same compound in butter.

1918 – The concept of “protective foods” is developed: milk, fruit and vegetables become the first protective foods.

1919 – Francis Gano Benedict (1870-1957), assisted by Atwater, conducted more than 500 experiments over a 12-year period in the Atwater-Rosa respiration calorimeter – studies of energy expenditure at rest, during physical activity and after eating. Benedict also published studies on the physiological effects of alcohol and of muscular work, and on the effect of mental effort on energy metabolism; he studied the metabolism of infants, children and adolescents, the metabolism of starvation, and the metabolism of athletes and vegetarians.

1919 – Benedict and Harris published their "metabolic standards" – tables based on sex, age, height and weight, used to compare healthy and ill people.
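
The Harris-Benedict standards are usually summarized as regression equations for basal metabolic rate (BMR). Here is a minimal sketch, assuming the commonly cited original 1919 coefficients (revised coefficients were published later):

    # Harris-Benedict basal metabolic rate in kcal/day. The coefficients are
    # the commonly cited original 1919 values, taken here as an assumption;
    # later revisions of the equations differ slightly.
    def bmr_harris_benedict(sex, weight_kg, height_cm, age_years):
        if sex == "male":
            return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_years
        if sex == "female":
            return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_years
        raise ValueError("sex must be 'male' or 'female'")

    # Example: a 30-year-old man, 70 kg and 175 cm -> roughly 1700 kcal/day
    print(round(bmr_harris_benedict("male", 70, 175, 30)))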

Ivan Petrovich Pavlov (1849-1936), a Russian physiologist, studied the physiology of digestion and discovered the conditioned reflex. Another important scientist in the physiology of digestion was Claude Bernard (1813-1878), who explored the function of the pancreas in digestion and showed that plasma glucose levels can vary in healthy individuals; many of his findings later proved useful in studies of diabetes and of liver function.

1922 – The Canadian physician Frederick Grant Banting (1891-1941) and the Canadian physiologist Charles Herbert Best (1899-1978), working under J.J.R. Macleod, discovered the pancreatic hormone insulin – considered one of the most important medical advances of its time, and recognized with a Nobel Prize in 1923. Until then, the millions of people around the world suffering from diabetes could not be treated, and their prognosis was very poor.

1922 – Edward Mellanby discovered vitamin D, and the Americans Herbert Evans and Katherine Bishop discovered vitamin E.

1923 – Fortification of table salt with iodine to prevent goiter was introduced for the first time in Switzerland. England and the U.S. enriched milk with vitamin D to prevent rickets.

1926 – George Minot (1885-1950) and William Murphy (1892-1979) treat people suffering from pernicious anemia using “liver food.”

1926 – D. T. Smith and E. G. Hendrick discovered vitamin B2 (riboflavin), and the same vitamin was first synthesized in 1935.

1929 – Discovery of essential fatty acids.

1933 – Lucy Wills discovered folic acid.

1934 – Paul Gyorgy discovered vitamin B6 (pyridoxine).

1935 – Vitamin C was the first vitamin synthesized in the laboratory (ascorbic acid).

1937 – American Conrad Elvehjem discovered the vitamin niacin.

1941 – The first edition of the U.S. Recommended Dietary Allowances (RDA).

1947 – Synthesis of vitamin A.

1948 – Discovery of vitamin B12 (cobalamin).

1950s – Hygiene and food technology developed, food labeling appeared, and nutritional needs were quantified by age group for men and women. With the vitamins discovered, the science of nutrition continued to investigate the biochemical effects of food on health, and the biochemical exploration of other aspects of human nutrition began.

1950-1970 – Large studies were conducted that revealed connections between illness and food consumption. Agricultural research aimed at increasing the production of meat and milk; pigs and poultry were intensively farmed. Food became cheaper and more accessible in developed countries. The food industry and multinational companies boomed, and supermarkets offered a never-before-seen variety of food products. Confectionery, sweets, cakes, biscuits, butter and milk became available to everyone, and eating habits changed dramatically. The growing need to simplify meal preparation in households increased the demand for kitchenware and appliances, especially the refrigerator. Purchasing power rose, as did the demand for quickly prepared food. It became possible to obtain seasonal produce year-round, and the cold chain literally allowed food to be transported halfway around the world. Food was packed in new packaging – cans, plastic containers, vacuum and modified-atmosphere packaging – extending its shelf life. A variety of foods that previously took hours or days to prepare became available every day.

1963 – The FAO and WHO created the Codex Alimentarius Commission, whose mission was to develop food standards and safeguard consumer health. The Commission was made up of food technologists and toxicologists and, among other things, set international regulations for analytical methods, food labeling and the toxicological aspects of food.

1980s and 1990s – Developed countries, including the U.S. and European countries, produced surplus food, while hunger grew in the developing world. Developing countries also adopted Western culture and the Western diet.

1988 – Quetelet's index, or body mass index (BMI), named after the Belgian mathematician Adolphe Quetelet (1796-1874), is first used for the definition and diagnosis of malnutrition.
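
BMI itself is simple arithmetic – body weight in kilograms divided by the square of height in meters. A one-function sketch, with the standard WHO adult cutoffs noted for context:

    def bmi(weight_kg, height_m):
        """Quetelet's index: weight (kg) divided by height (m) squared."""
        return weight_kg / height_m ** 2

    # Example: 70 kg at 1.75 m -> about 22.9, inside the WHO "normal" range
    # (underweight < 18.5, normal 18.5-24.9, overweight 25-29.9, obese >= 30)
    print(round(bmi(70, 1.75), 1))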

1992 – The U.S. Department of Agriculture (USDA) officially announced the Food Guide Pyramid, intended to help Americans make food choices, achieve good health and reduce the risk of chronic disease.

Research of course continues today, but while classical nutrition in the middle of the last century was preoccupied primarily with hunger, wartime food rationing and the prevention of nutritional disorders, the science of nutrition today seeks to determine the significance of individual food components (fiber, cholesterol, vitamins, minerals, phytochemicals) and of diet for health and disease.


Hunting and Gathering Society

Studies of modern-day hunter-gatherers offer a glimpse into the lifestyle of small, nomadic tribes dating back almost 2 million years.

With limited resources, these groups were egalitarian by nature, scraping up enough food to survive and fashioning basic shelter for all. Division of labor by gender became more pronounced with the advancement of hunting techniques, particularly for larger game.

Along with cooking, controlled use of fire fostered societal growth through communal time around the hearth. Physiological evolution also led to changes, with the bigger brains of more recent ancestors leading to longer periods of childhood and adolescence.

By the time of the Neanderthals, hunter-gatherers were displaying such “human” characteristics as burying their dead and creating ornamental objects. Homo sapiens continued fostering more complex societies. By 130,000 years ago, they were interacting with other groups based nearly 200 miles away.


Japanese Recipes

Kinpira (Burdock and Carrot)

Kinpira is one of the classic Japanese home-cooked dishes, featuring two great root vegetables, burdock and carrots. In this sauteed dish the burdock combines beautifully with the sweet carrots, red peppers and roasted sesame seeds. Crunchy, soft, sweet and hot, no wonder this Japanese recipe is a popular winter dish in Japan.

Burdock, or gobo, is a fiber-rich Japanese root vegetable with a delectable earthiness. Look for burdock at Japanese markets or gourmet supermarkets.

1 medium (8 ounce) burdock root

1 tablespoon canola oil or rice bran oil

2 dried Japanese (or Thai chili, Santaka or Szechuan) red peppers

1 cup carrot, cut into matchstick-sized slivers

1 tablespoon sake (rice wine)

1 tablespoon reduced-sodium soy sauce

2 teaspoons mirin (a cooking wine made from glutinous rice)

1 teaspoon granulated sugar

1 teaspoon toasted and ground sesame seeds

1. Scrub the exterior of the burdock root with a vegetable brush to remove excess dirt and the skin. Cut the burdock root into 2½ to 3-inch-long matchsticks, and rinse quickly under cold water. You will have approximately 2 cups of burdock root matchsticks.

2. Heat the oil in a medium skillet over medium-high heat. Add the red peppers and saute for 30 seconds. Add the burdock root and saute until tender, about 3 minutes; it will appear translucent on the surface. Stir in the carrot and saute for 2 minutes.

3. Reduce the heat to low and add the sake, soy, mirin, and sugar. Stir the vegetables for 1 minute more to allow them to absorb the sauce. Remove and discard the red peppers, arrange the vegetables in a mound in the center of a serving bowl, and garnish with the sesame seeds.

Excerpted from Japanese Women Don't Get Old or Fat by Naomi Moriyama and William Doyle. Copyright © 2005 by Naomi Moriyama and William Doyle. Excerpted by permission of Delta, a division of Random House, Inc. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Here's a perfect example of how Japanese home cooks create a delicious and filling beef dish -- with very small portions of beef. An abbreviated version of sukiyaki (a combination of thinly sliced beef and vegetables in a sweet soy broth), this is spooned over hot cooked rice in a bowl.

Thinly sliced beef is available in the freezer section of most Japanese markets. It's convenient to use, extremely tender and perfect for this healthy cold-weather dish. If you choose to purchase the beef in a regular market, freeze the meat before you cut it. This will enable you to carve it (with an extremely sharp knife) into paper-thin slices.

I often think that the best part of this beef bowl isn't the beef, but the hot nutty rice saturated with the sweet beef juices.

2 cups dashi (a fish-and-sea-vegetable stock, available online or in Asian grocery stores)

1 medium yellow onion, peeled, halved and cut into thin crescents

1 Tokyo negi (or 1 small leek), with roots and rough portion of the top cut off, cleaned, rinsed and cut diagonally into thin slices

3 tablespoons reduced-sodium soy sauce

1 tablespoon granulated sugar

1 teaspoon fine-ground sea salt

1 teaspoon mirin (a cooking wine made from glutinous rice)

½ pound very thinly sliced beef fillet (about 1/8 inch thick), or, if you prefer, ground beef

6 cups hot cooked brown or white rice

1 scallion, roots and top portion cut off, and thinly sliced

1. Place the dashi and sake in a medium saucepan over high heat. Add the onion and Tokyo negi (or leek) and bring the mixture to a boil. Reduce the heat to medium and simmer until the vegetables are tender, about 5 minutes. Stir in the soy, sugar, salt, and mirin. Add the beef and simmer until it is just cooked through, about 40 seconds (it will cook rapidly if cut into paper-thin slices).

2. Lay out 4 bowls. Fill each one with 1½ cups of hot cooked rice and ladle even portions of the beef mixture over the top. Garnish each serving with a sprinkling of scallion.


Causes Of The Neolithic Revolution

There was no single factor that led humans to begin farming roughly 12,000 years ago. The causes of the Neolithic Revolution may have varied from region to region.

The Earth entered a warming trend around 14,000 years ago at the end of the last Ice Age. Some scientists theorize that climate changes drove the Agricultural Revolution.

In the Fertile Crescent, bounded on the west by the Mediterranean Sea and on the east by the Persian Gulf, wild wheat and barley began to grow as it got warmer. Pre-Neolithic people called Natufians started building permanent houses in the region.

Other scientists suggest that intellectual advances in the human brain may have caused people to settle down. Religious artifacts and artistic imagery—progenitors of human civilization—have been uncovered at the earliest Neolithic settlements.

The Neolithic Era began when some groups of humans gave up the nomadic, hunter-gatherer lifestyle completely to begin farming. It may have taken humans hundreds or even thousands of years to transition fully from a lifestyle of subsisting on wild plants to keeping small gardens and later tending large crop fields.


4 FOOD FOR RELIGIOUS AND HOLIDAY CELEBRATIONS

As a result of Peru's heavy Spanish influence, most Peruvians (90 percent) are devout Catholics. Christian holidays such as Easter, Christmas, and All Saints' Day are joyously celebrated throughout the country, often with fireworks, bullfights, dancing, and roast pig. The remainder of the population adheres to indigenous beliefs, worshipping the gods and spirits the Incas did hundreds of years ago. Many Christian holidays coincide with existing traditional festivals, allowing most Peruvians, regardless of differences in belief, to celebrate together.

Christmas brings great joy to the Christians of Peru, especially children awaiting the arrival of Santa Claus. Families use the holiday to travel to the homes of family and close friends. With so many people rushing through Peru's streets, vendors hurry to sell holiday foods and other goods to passersby. Sweet mango juice, bakery rolls, and homemade doughnuts coated with sugar and syrup are Christmas favorites. Flan, a caramel custard enjoyed throughout Central and South America (as well as in Spain, the Philippines, and the United States), is another dessert favorite, and a recipe for it follows.

Flan (Caramel Custard)

Ingredients

  • ¼ cup sugar, plus ¾ cup sugar
  • 4 drops lemon juice
  • 2 cups milk
  • 1 teaspoon vanilla
  • 4 eggs

Procedure

  1. Preheat oven to 350°F.
  2. In a small saucepan, heat ¼ cup sugar and drops of lemon juice over low heat until mixture is dark brown, like caramel syrup. (Don't worry if syrup burns a little.)
  3. Pour into a flan mold (oven-proof straight-sided souffle dish or individual molds work nicely), covering all sides and bottom with the sugar syrup.
  4. Place in the refrigerator while preparing flan.
  5. Bring milk and vanilla to a boil in a small pot over low heat.
  6. In a separate mixing bowl, combine the eggs and ¾ cup sugar, beating well.
  7. Slowly add the egg and sugar mixture to the boiled milk.
  8. Pour into refrigerated mold. Place flan mold into a larger baking dish. Add water to a depth of about one inch, and carefully place in the oven.
  9. Bake 35 to 40 minutes. Flan is done when knife inserted in the center comes out clean.
  10. Cool and remove from mold. Serve chilled.

Carnavales (kar-nah-VAH-lays; Carnival) is an elaborately celebrated national holiday that takes place in the few days before Lent. It is the last opportunity for people to drink and dance before the fasting period of Lent begins, when such activities are not allowed. During these few days, some practice the native tradition of rounding up wild game to present to a priest or mayor, who provides chicha and coca leaves in return. The offering of animals dates back several hundred years to the Incas, who gave offerings of food to the gods in hopes of a good harvest. Papas a la huancaína (potatoes with cheese) is a popular meal during Carnival.

Papas a la Huancaína (Potatoes with Cheese)

Ingredients

  • ¼ cup lemon juice
  • ⅛ teaspoon ground red pepper, or to taste
  • Salt, to taste
  • 1 onion, thinly sliced
  • 2 Tablespoons vegetable oil
  • 3 cups Monterey Jack or Swiss cheese, shredded
  • ½ teaspoon turmeric
  • 1½ cups heavy cream
  • 6 potatoes, drained, peeled, and quartered
  • 1 to 2 hard-boiled eggs, for garnish

Procedure

  1. Scrub the potatoes, place them in a saucepan, cover with water, and boil until tender (about 20 minutes). Drain, allow the potatoes to cool. Peel them, cut them into quarters, and set aside.
  2. In a small mixing bowl, combine the lemon juice, red pepper, and salt. Add onion slices and coat them with the mixture. Stir well and set aside.
  3. Heat oil in a large skillet over low heat.
  4. Add cheese, turmeric, and heavy cream. Stirring constantly, continue cooking over low heat until cheese melts and mixture is smooth.
  5. Add the cooked potatoes and gently stir to heat through, about 5 minutes. Do not allow mixture to boil, or it will curdle.
  6. Transfer to a serving bowl and garnish with hard-boiled eggs.
  7. Sprinkle onion mixture over the potatoes. Serve immediately while potatoes are hot.

The Healthiest People In The World Eat A Lot Of Carbs

Japanese people are, as a whole, very healthy: Japan has the second-highest life expectancy in the world (the U.S. comes in at number 43) and an obesity rate of just 3.5 percent – one-tenth of America's 35 percent.

The reason for Japan's superior health? Their grain-heavy, high-carb diet.

According to a new study by researchers at the National Center for Global Health and Medicine in Tokyo, people who strongly adhere to Japan’s recommended dietary guidelines are 15 percent less likely to die of any cause -- such as cardiovascular disease and stroke -- compared to those who don't adhere well.

Japan’s nutrition guidelines reflect the country's traditional diet, which is high in grains, fish and soybean products, but low in fat. In the U.S., where the tide appears to be turning against grains and toward larger intakes of fat, Japan's contrasting food guidelines are a good reminder that there’s no “correct” way to eat nutritious food -- just different styles that suit different people and cultures best.

Why Japanese people can eat so many grains (and not get fat)

For the study, 80,000 participants answered detailed lifestyle and food questionnaires that determined how well they followed the guidelines, and then researchers tracked their health for 15 years. The top quarter of people who followed the guidelines best had a decreased risk of death from any cause. The researchers controlled for factors like age, sex, BMI, smoking status, total physical activity and history of hypertension, diabetes and dyslipidemia. People with a history of cancer, stroke, heart disease or chronic liver disease were also excluded.

James DiNicolantonio, a cardiovascular research scientist at St. Luke’s Mid America Heart Institute, is a passionate defender of the theory that sugar and carbs are the true cause of obesity and metabolic disease. He also encourages people who want to lose weight to eat more high-fat, high-calorie foods to make them feel fuller.

“We can learn a lot about how to be healthy from the Japanese, and it really comes down to ‘eat real food’ and ‘exercise.’”

But even he acknowledges that the high-carb Japanese diet works, he explained to HuffPost – because of the quality of the food the Japanese eat, how little fat they eat, and their activity levels. DiNicolantonio, who was not involved in the study, says it is a uniquely Japanese combination of macronutrients that may be saving them from obesity and metabolic disease.

"Combining a high intake of carbohydrates and fat is the perfect storm for obesity," he said. "The Japanese tend to eat high carb (both rice and vegetables) but a low intake of fat."

DiNicolantonio also noted that Japanese people tend to eat lots of seafood, which is rich in healthy omega-3 fatty acids, and they don’t eat as many processed foods.

What’s more, the average Japanese person walks over 7,000 steps a day, while Americans average about 5,000 steps per day. Also of note: the health trend of walking 10,000 steps per day actually started in Japan.

Given their diet of whole, unprocessed foods, as well as their active lifestyle, it's no wonder that Japanese people can tolerate more grains than the average American, said DiNicolantonio.

"I think the best takeaway for Americans, when looking at the Japanese, is that if we restrict our intake of refined sugar, industrial seed oils, and increase [our] intake of marine omega-3s, then we might be able to tolerate eating more rice,” he said. “We can learn a lot about how to be healthy from the Japanese, and it really comes down to 'eat real food' and 'exercise.'"

Japan's nutrition guidelines are easy to follow

Japan's 2005 food guidelines reflect this culinary history. While Americans were given a pyramid before being presented with a plate, Japan's guidelines are illustrated as a spinning top. Kayo Kuratani, a researcher at the National Center for Global Health and Medicine and one of the study's authors, notes that the graphic is easy to understand and follow. The spinning top is "dish-based," while the U.S. guidelines talk mostly about raw ingredients.

"The dish-based method is not only easily understood by those who prepare meals but also by those who eat them," Kuratani told HuffPost. "It is expressed in terms of actual dishes eaten at the table rather than the foods selected or used in meal preparation. This makes it readily understandable even for those who rarely cook."

A figure running around the top represents the need for physical activity. The top's handle is made of a glass of water and tea, and no serving size is recommended for snacks, candy and other beverages (meaning, sugary ones).

The largest section of the top is made up of grain dishes like rice, bread, noodles, and rice cakes, recommended for five to seven servings a day. That’s followed by five to six servings of vegetable dishes, then the spinning top narrows further to three to five servings of protein including meat, fish, egg and soy bean dishes.

The final section is split in two: two servings per day each of fruit and milk or dairy products.

What Americans can learn from Japan

Dr. Lydia Bazzano, a nutrition and diabetes researcher at Tulane University, points out that the spinning-top guide may be deceptive for Americans. She notes that the accompanying written guidelines explain that the top varies according to age, sex and activity level. Highly active young men, for example, can eat more grains than a sedentary elderly woman.

“Among people who are very physically active, low fat diets with higher grain intake do not necessarily contribute to poor health outcomes and conditions like obesity,” Bazzano said. “However, among persons who are less physically active, higher grain intake, especially refined grain intakes, may contribute to poorer health outcomes and/or obesity.”

Japan’s Ministry of Health, Labour and Welfare did make one major update in the most recent guidelines: Because Japanese people mostly eat white rice as their main grain, and white rice is linked to an increased risk of chronic disease, the 2010 guidelines recommend that carbohydrates make up only 50 to 65 percent of a person’s diet, and that people begin to explore whole grains like brown rice, explained Kuratani.

Still, the ideal Japanese diet is a powerful reminder that there’s no single way to achieve a healthy weight and avoid chronic disease. So the next time anyone gives you flak about (gasp!) eating grains for lunch, just let them know that you’re on the Japanese spinning-top plan.