Typically, when people think of being on a diet and what that entails, visions of boring salads and dry, flavorless chicken breast run through their heads. The word “diet” itself has become synonymous with deprivation and hardship. This is a far cry from its real meaning, which is simply the food that a person or animal eats on a regular basis.

Enter the Paleo diet—a meal plan that’s actually grounded in the true definition of a diet. This isn’t a two-week gimmick encouraging you to gulp down gag-inducing shakes in order to lose a few pounds. The Paleo diet is the original human diet, based on foods that are evolutionarily appropriate for people to eat and geared toward optimizing health, not just weight loss. Indeed, the theory of evolution by natural selection is central to this concept.

The main premise of the Paleo diet centers on the idea that the average human body hasn’t yet sufficiently adapted to eating foods that became available with the invention of agriculture roughly 10,000 years ago. This lack of adaptation is even more pronounced with regard to industrial food products developed within only the last century. Eating foods outside our ancestral dietary menu has left a growing number of people dealing with chronic “diseases of civilization” like heart disease, diabetes, and obesity.

At its most basic, the Paleo diet consists of the types of foods available to hunter-gatherers during the paleolithic period of history. This would include foods like fish, game and other lean meats, vegetables, fruit, nuts, and tubers. Essentially, eat plenty of animal protein and vegetables. The Paleo diet doesn’t include bread, grains, legumes, refined sugars, seed oils, and industrially processed junk food.

A common rebuttal to this concept is: cavemen lived short, brutish lives, so why would anyone want to emulate their diet? This argument always rears its head at some point, and it’s a shame, because it rests on a glaring misconception. The oft-cited statistic that ancient humans lived an average of only 30-40 years is misleading.

The high rate of infant mortality and adolescent death due to accidents or predation (which is common for most animals living in the wild) radically skews the average lifespan to a younger age.1 In reality, most humans could expect to live a decently long and robust life if they managed to make it out of adolescence.

There is no evidence suggesting that hunter-gatherers suffered from debilitating diseases like diabetes or cancer. In fact, European explorers making first contact with American Great Plains tribes were astonished by their amazing physicality and health.2 Meanwhile, extant populations of hunter-gatherers and horticulturists like the Kitavans, Maasai, and !Kung still demonstrate remarkable levels of vitality.3 What do all of these cultures have in common? A reliance on traditional food gathering and preparation methods.

While bread and grains are often referred to as the “staff of life,” this raises the question: how did humans fare before the invention of bread? Quite well, it seems, based on the fossil and anthropological evidence. So why are grains so problematic, and why did humans eat them despite their negative qualities?

From an evolutionary standpoint, agriculture and its products (like bread) were just another tool. Growing food crops was a clever way to survive periods when more traditional foods were in short supply. However, this practice was not without drawbacks. Unlike fruiting plants, which offer up their seeds in a tasty package so that animals will help disperse them, grains (like corn and wheat) display no traits suggesting they want their reproductive parts to be eaten.

On the contrary, they have evolved a multitude of mechanisms to discourage or prevent the digestion of their seeds. Plants don’t have teeth or claws like animals; instead, they have defensive proteins called lectins. These proteins are remarkably resistant to breakdown by heat and by animal digestive enzymes. Wheat germ agglutinin, a lectin found in wheat, is primarily a defense against insects, but it is suspected of causing significant harm to humans as well.

Gluten is a far better-known protein, one that has been definitively implicated in celiac disease. In celiacs, the adverse reaction to gluten damages the lining of the small intestine and prevents the absorption of nutrients that are important for maintaining health. Another facet of this intestinal damage is a condition called “leaky gut,” in which the disruption of the tight junctions between intestinal epithelial cells leads to hyperpermeability.

This condition is theorized to be related to the development of various autoimmune and inflammatory diseases like type 1 diabetes, asthma, and inflammatory bowel disease. There is also growing evidence that leaky gut can be induced in non-celiac individuals due to some form of gluten intolerance.

Grains, legumes, and nuts also contain another compound called phytate. Phytate is indigestible by humans; it chelates important minerals like zinc, iron, calcium, and magnesium, hindering their absorption. Traditional preparation methods such as soaking and cooking reduce the amount of phytate in these foods, though they aren’t 100% effective. Therefore, if your diet relies heavily on these types of foods, you’ll likely have trouble obtaining enough of those essential minerals.

A more recent and unexpected factor in the modern assault on our bodies is the risk that seed oils pose to human health. After years of government vilification of saturated fats and animal products (vilification that has proven to be largely unfounded), polyunsaturated fatty acids like those found in seed or vegetable oils have become pervasive in our diet. Extolled as supremely beneficial to cardiovascular health, canola and soybean oils have largely replaced traditional fats like butter and lard.

The unintended side effect of this switch is that the standard diet has become inundated with copious amounts of omega-6 fatty acids like linoleic acid. The ancestral norm is thought to be closer to a 1:1 or 2:1 ratio of omega-6 to omega-3; our modern diet features a completely lopsided ratio as high as 20:1. This is proving extremely detrimental. Since both fatty acid groups share common enzymatic pathways, the pro-inflammatory omega-6s come to dominate the body’s tissues and drive runaway levels of systemic inflammation.

On the sweeter side of things we have refined sugar, the sweet tooth’s kryptonite. Ever since humans first tasted fruit, we’ve lusted after the sugar it contains. Whereas in paleolithic times fruit was available for only a limited season each year, now it can be found year-round in your local supermarket, and in much larger and sweeter varieties. Additionally, the constituent sugars of fruit and cane plants have been isolated and concentrated into industrial refined sweeteners like high-fructose corn syrup.

Just a couple hundred years ago, annual sugar consumption in the US was only about 2 pounds per person. Today, it exceeds 150 pounds per person each year. The result of this barrage of sugar in the human diet is extraordinary levels of obesity and diabetes. Indeed, “adult-onset diabetes” may need to be renamed, since so many children are now being diagnosed with the disease at younger and younger ages. One of the more obvious culprits is the soft drinks that have gradually replaced water as the beverage of choice for children.

Plant-based foods aren’t the only things scrutinized in the Paleo diet. Dairy has also been singled out as a problematic food group, causing gastrointestinal distress for many people. While retaining the production of lactase into adulthood in order to digest milk was a useful adaptation in migrating paleolithic populations, not all groups developed this trait. An estimated 70% of adults worldwide show some decline in lactase activity during adulthood. Fermented dairy and products such as butter will likely be better tolerated due to their far lower levels of lactose, but if symptoms persist, dairy should probably be limited in your diet.

These guidelines are the concepts most Paleo practitioners generally agree upon. However, there is no official canon to draw from, so there is still a healthy amount of debate in the community on matters like potatoes or saturated fat consumption. The distinction between Paleo and primal is commonly used to mark these points of contention. It is also vital to consider your own health status, genetic makeup, tolerances, and health goals when determining your ideal diet within the Paleo template.

The more orthodox, strict Paleo model encourages restricting saturated fat consumption, like that from butter, coconut oil, or fatty meats. The primal model, on the other hand, endorses these foods; both restrict things like canola oil and artificial sweeteners. Over time, many people have mixed the attributes of both models as they seek out the methodology that works best for them. Ultimately, food choices are a very personal matter, and the resulting health implications depend on individual genetics and medical history.

The end goal of the Paleo diet is not merely to lose a few pounds but to optimize your health. Hopefully, armed with a better understanding of how the modern human body handles the food we have available, you’ll be able to make healthier dietary choices for you and your family.

References

1 Wachter KW, Finch CE (eds). Between Zeus and the Salmon: The Biodemography of Longevity. National Academy Press, 1997, pp. 176-179.

2 Steckel RH. “Tallest in the World: Native Americans of the Great Plains in the Nineteenth Century.” The American Economic Review, Vol. 91, No. 1, Mar. 2001. http://www.jstor.org/stable/2677910

3 Lindeberg S. “Age relations of cardiovascular risk factors in a traditional Melanesian society: the Kitava Study.” American Journal of Clinical Nutrition, Vol. 66, pp. 845-852. http://www.ajcn.org/content/66/4/845.short