Insects: A New Protein Source

Fried Grasshoppers

According to the Food and Agriculture Organization of the United Nations (FAO), “insects supplement the diets of approximately 2 billion people.” Moreover, roughly 80% of the world’s nations incorporate insects into their diets in some capacity. In the media, insect protein has been dubbed the future of food, partly because the world’s population is estimated to reach nine billion by the year 2050! And while we may not be ready to see insect delicacies featured on our local restaurant menus, we need to ask ourselves: how are farmers and food processing companies supposed to feed all these people healthy food?

Companies like Exo, Chapul, and Entomo Farms are helping the U.S., Canada, and Europe successfully incorporate insects into their diets without the ‘ick factor.’ Through insect-based protein powders and bars, these companies are helping redefine what it means to eat bugs. Even General Mills is hopping on the bandwagon and investigating new ways to “use crickets as a sustainable source of protein.”

“If a family of 4 ate just 1 meal a week using insect protein for a year they would save the Earth 650,000 liters of water.”
(Entomo Farms)
That equates to 2,749,500 8oz glasses of water per year!
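Entomo’s figure can be sanity-checked with a quick conversion, assuming US fluid ounces; the small gap from the article’s number comes down to rounding in the ounces-per-liter factor:

```python
# Convert Entomo Farms' claimed water savings into 8 oz glasses.
# Assumes US fluid ounces (33.814 fl oz per liter).
LITERS_SAVED = 650_000
OZ_PER_LITER = 33.814

glasses = LITERS_SAVED * OZ_PER_LITER / 8
print(f"{glasses:,.0f} eight-ounce glasses")  # roughly 2.75 million
```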

Preserving our farmland and water resources is extremely important if we hope to feed future generations. Insect protein is one of the most sustainable ways to provide nutrient-dense food to a growing population— without using excess water, land, feed, or energy. Today, one in nine people do not have enough food to lead a nutritionally healthy life. Raising and harvesting insects for food is a step in the right direction in the fight against world hunger.  Surprisingly, however, sustainability is actually just a bonus of insect farming. The real benefit of insect farming is the healthy, lean protein they provide.

How are insects farmed?

Farmed insects are not simply caught in the wild, cooked, and served. Like many farm-raised animals, insects are bred and harvested. Insects can be wild-harvested (as is often seen throughout parts of Southeast Asia), but wild-harvesting can actually compromise your health: the process is not regulated, and wild-harvested insects are not typically intended for human consumption. If you choose to consume insects, experts recommend sticking with products that have been farmed. To better understand the insect farming process, we spoke with Entomo Farms co-founder Dr. Jarrod Goldin, who explained the Entomo approach.

Their primary concern is producing safe, clean insects. For its cricket products, Entomo Farms uses retrofitted chicken farms to properly cultivate the insects. Aptly nicknamed “condos,” the retrofitted barns are divided into six habitats that maximize surface area for the crickets. The insects’ food is kept at the top of the condo, alongside a trough of running water. While some companies choose to use water bowls, Entomo believes stagnant water is never as clean as running water. The crickets are fed organic grain and are harvested at six weeks, when they are immediately flash-frozen with dry ice. Because crickets are cold-blooded animals, this process is also extremely humane. After they are frozen, the crickets are transported to the processing facility, where they are washed thoroughly before being roasted.


Cricket Colony – barns and housing – Entomo Farms

Entomo Farms sent its crickets to a government-certified lab to determine the level of bacteria present in its cricket products. An Aerobic Plate Count (APC) is used as an indicator of the bacterial population in a sample. According to the FDA, a suitable range for frozen, chilled, precooked, or prepared food is 25-250 colonies per plate. The reported aerobic plate count for Entomo Farms Cricket Powder was roughly 10 colonies per plate. So, next time you are looking for a minimally processed protein source, you might want to keep Entomo’s insect products in mind!

Health and nutrition profile of insects

Forbes Magazine dubbed insects “the next new miracle superfood” because of their dense protein content. Some insect species are roughly 80% protein by weight, and a majority exceed 50%. Additionally, some insect species, like crickets, contain all nine essential amino acids. According to the FAO, crickets are also very high in micronutrients, such as magnesium, iron, and zinc. Insect species are also known to be high in calcium, vitamins B12 and A, and are reported to have a nearly perfect ratio of omega-3 to omega-6 fatty acids.

Source: Precision Nutrition

“When you eat insects, you’re not just eating muscle, you’re also eating bones and organs, which deliver calcium, iron, vitamin B12, and zinc. It’s like if somebody ground up a whole cow and ate it!” (Daniella Martin, author of Edible)

The nutritional profile above demonstrates how 100g of cricket protein measures up to a traditional meal of steak and broccoli. It is important to note, however, that a typical serving of cricket powder is roughly 2 tablespoons (17 grams). Therefore, it would take approximately six servings of cricket powder to equal a 100 gram (3.5 oz) serving of steak.
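The serving math works out as a simple ratio of the serving sizes quoted above:

```python
# How many cricket-powder servings match a 100 g serving of steak?
POWDER_SERVING_G = 17    # ~2 tablespoons of cricket powder
STEAK_SERVING_G = 100    # a 3.5 oz serving of steak

servings = STEAK_SERVING_G / POWDER_SERVING_G
print(f"{servings:.1f} servings of cricket powder")  # ~5.9
```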

For more information on the nutritional value of insects for human consumption, we recommend the following chapter from the FAO Forestry Paper, “Edible Insects: Future Prospects for Food and Feed Security.”

According to Dr. Goldin, an additional benefit of insect nutrition is the gut microbiota. As you may recall, D2D recently reviewed the importance of gut health and its effect on your brain in our article, “Your Second Brain: Gut Microbiota.” Probiotics help facilitate the growth of native gut microbes, but in order for probiotics to be successful at their job, they need fuel— this is where prebiotics come into the picture. Prebiotics feed probiotics, and insects are considered rich in prebiotics because of the fiber in their exoskeletons.

It is also important to note that insects can share common food allergens with crustaceans, as both are arthropods. Unfortunately, there is very little research on insect-related food allergens, as the industry is just starting to expand. Because of this, the European Food Safety Authority warns anyone allergic to shellfish or dust mites to avoid eating insects.

Food Safety and Regulation

In the United States, insect farming is still in its infancy. In fact, 2016 marked the first year a conference dedicated entirely to edible insects was held: the North American Edible Insects Coalition met in Detroit in May 2016 to discuss the future of harvesting insects for food.

One major effort being led by the coalition is increased federal regulation, as “best practices” within the edible insect space are still being established by the FDA. Lobbyists for edible insects have launched a campaign to urge the FDA to “add mealworms, cricket protein powder, and other insect products to the agency’s database of Generally Recognized as Safe ingredients (GRAS)” (Bloomberg News).

In order for the insect-for-food industry to become more socially accepted, there needs to be an appropriate level of regulation for these products. Although insect products made by companies like Exo, Chapul, and Entomo Farms are considered food in the eyes of the FDA, they are not clearly regulated. One way to start successfully integrating insects into a traditional Western diet would be for the FDA to deem edible insects as GRAS.

As it stands now, the FDA allows the sale of bugs if they are raised for human consumption. Insect parts or additives can be found at specialty shops but technically aren’t classified as food-safe ingredients because of their exclusion from the GRAS list. (Bloomberg News)

And while we certainly do not suggest or expect you to replace all of your chicken or beef meals with insect protein— we recommend giving edible insects a chance!

You can add the ultra-fine cricket powder to just about anything. Sprinkle it on top of your oatmeal, add it to a peanut butter sandwich, even mix it in with the stir-fry you are cooking. The powder can help make healthy or marginally healthy food even healthier without much effort.

Cricket flour cookies. image: pixabay

We see a day where people have sugar, salt, pepper, and cricket powder on their countertop…and you add it throughout your cooking, as you would those condiments. It would be a great step for their health and wellness and for sustainability.
– Entomo Farms

Grass or Grain, Beef is Beef


Some nutritionists argue that grass-fed beef contains more omega-3 fatty acids, less saturated fat, and fewer calories than grain-fed beef. Environmentalists argue that grass-fed cattle are better for the environment and do not have any microbial diseases. But how much of this is based on research and how much is based on speculation? While we want to think of cattle as happily roaming the range, we need to look at the facts.

What is a grass-fed cow?


Grass-fed cattle on a Wyoming ranch

Almost all cattle spend the first weeks of their lives in the pasture drinking their mother’s milk. After about eight to nine weeks, the calves are developed enough to forage for grass with the herd. Once a calf weighs approximately 700 pounds, 99% are sold to feedlots to fatten up to about 1,450 pounds, gaining about three pounds a day before they are generally harvested around 18 months. The other 1% are fed grass their entire lives. Grass-fed cattle tend to live about eight months longer, to roughly 26 months, because they gain only about one and a half to two pounds per day on a grass diet. They also have the opportunity to walk around more, so they have less fat, more muscle, and burn off more of their feed.

All cattle are grass-fed to some degree. The difference lies in whether they are grass finished.

Only about 1% of beef sales today are “grass finished”. However, the grass-fed market is growing by roughly 20% a year.

Is there a nutritional advantage to eating grass-fed beef?

The primary nutritional difference between grass-fed and grain-fed beef lies in the saturated and unsaturated fat content. You may remember from our previous post, Fat: Our New Friend, that we should get approximately 27% of our daily calories from fat. Fat protects our brain, maintains our cell membranes, and helps us absorb vitamins.

Our bodies are able to synthesize (or create) fatty acids from the fatty acids we consume. Two healthy fatty acids are an exception to this rule: omega-3 (alpha-linolenic acid) and omega-6 (linoleic acid). Grass-fed beef has 3-5x more omega-3 fatty acids than grain-fed beef.

Why? Because grass has high levels of alpha-linolenic acid and corn has very little.

Omega-3 fatty acids may help lower your risk of heart disease, depression, dementia, and arthritis.  But, let’s put everything into perspective. Does this mean you should use beef as your source of omega-3s? 

Well, you would certainly have to eat a lot of beef! Comparatively, salmon has 35x more omega-3s than grass-fed beef. Other fatty fish, such as anchovies, herring, mackerel, trout, and tuna, are also a great way to provide your body with a high dose of omega-3. Even a tablespoon of canola oil, say in your salad dressing, would meet the omega-3 daily requirement of 1.1 grams for women and reach 87% of the 1.6 grams for men.
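As a rough check of the canola-oil figure: a tablespoon of canola oil carries on the order of 1.4 g of ALA (an assumed round value back-solved from the percentages above; actual content varies by oil), which lines up with the requirements cited:

```python
# Share of the daily omega-3 (ALA) requirement met by 1 tbsp of canola oil.
# ALA_PER_TBSP_G is an assumed figure; real oils vary batch to batch.
ALA_PER_TBSP_G = 1.39
NEED_WOMEN_G = 1.1   # daily ALA requirement for women
NEED_MEN_G = 1.6     # daily ALA requirement for men

print(f"Women: {ALA_PER_TBSP_G / NEED_WOMEN_G:.0%}")  # over 100%: requirement met
print(f"Men:   {ALA_PER_TBSP_G / NEED_MEN_G:.0%}")    # ~87%
```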

As far as the other nutritional comparisons go, Texas A&M and Texas Tech universities completed independent studies comparing omega-3, oleic acid, and total saturated fat from grass-fed and conventionally grain-fed cattle.  Their analysis concluded that “there is no scientific evidence to support the claims that ground beef from grass-fed cattle is a healthier alternative to ground beef from conventionally raised, grain-fed cattle.” In addition, the basic nutritional components of amino acids, B vitamins, zinc, iron, and phosphorus are all the same in both meat options.

 


Source: Texas A&M University, Department of Animal Science

If you, like us at Dirt-to-Dinner, love a good steak or hamburger, you can get some of your important saturated fats, polyunsaturated fat, and monounsaturated fat from any kind of beef.

If you prefer grass-fed beef, the best cattle (in our opinion) are those that eat grass in the high country: because the growing season is so short, the grass grows with higher amounts of alpha-linolenic acid. As a result, there are plenty of omega-3s in the cattle’s beef. Their cardiovascular systems also get the benefit of exercise at high altitude, so they are leaner than most.

While there is not much nutritional difference, grass-fed and grain-fed beef can vary in flavor. Depending on your taste preferences, you may find you do not enjoy grass-fed beef as much as grain-fed. Some people like the soft, marbled texture of grain-fed beef, while others prefer the leaner taste of grass-fed. One Wyoming rancher told us that grass-fed beef tastes “wild” and digests as quickly as broccoli! She felt that you didn’t feel as satiated after eating her grass-fed beef.

Fun Facts:

NFL footballs are made of cowhide.  About 3,000 cowhides are required to make footballs for one season.

Beef Tongue is a Japanese delicacy.  About 50% of US cattle tongues are shipped to Japan every year.  Try one – thinly sliced and grilled!

Disneyland sells over 4 million hamburgers each year, and McDonald’s sells approximately 75 hamburgers a second – 225 million burgers worldwide every year.

Where are grass-fed cows raised? 

The one billion cattle raised globally give us approximately 59 million tons of meat. That is enough to give the world’s 7.4 billion people 18 pounds of beef a year. The major beef-producing countries are the United States (18%), Brazil (12%), China (8%), and Argentina (4%) (FAOSTAT).
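The per-person figure follows directly from the production and population numbers above, assuming metric tonnes:

```python
# Global beef production spread across the world's population.
TONNES_BEEF = 59e6        # metric tonnes of beef per year
POPULATION = 7.4e9        # world population
LB_PER_TONNE = 2204.62    # pounds per metric tonne

lb_per_person = TONNES_BEEF * LB_PER_TONNE / POPULATION
print(f"{lb_per_person:.0f} pounds of beef per person per year")  # ~18
```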


Typical Feedlot

The United States is awash in corn, so it is easy to feed and grow our cattle in feedlots. States like Wyoming, Montana, Kansas, and western Nebraska have thousands of grassy acres to support cattle in the summer, but of course not in the winter. In the fall, those cattle either head to the feedlots or have to be given feed rations to keep gaining weight through the winter.

Thus, grass-fed beef is harder to grow in the U.S.  Australia and Uruguay, on the other hand, have acres of land which can support grass-fed cattle throughout the year making their grass-fed farming more cost effective.

Do grass-fed cattle have a happier life?

According to Dr. Temple Grandin, the renowned expert on cattle welfare,

“It doesn’t matter whether a cow is in a feedlot or on the ‘range’. What is important is whether the animal has shelter, proper drainage for the rain, consistent food, and is not put in stressful situations.”

Sure, it is nice to think of a cow having access to a beautiful grassy field, but keep in mind, not all pastures are grassy! Some are dry, some have no water, and some are terribly arid. Some farmers claim that their cows are fed only grass – but the cows are contained in a feedlot and fed grass pellets! Not all feedlot owners are the same, either. Some pay attention to every single cow and some do not. Neither what the cattle are fed nor their ability to roam is the determining factor in good animal welfare. What really matters is the quality of care and attention given by the farmer, and each farmer is different.

Are grass fed cattle better for the environment?

One can say that cattle are the perfect “crop” for those grassy areas that don’t have great soil for grains and oilseeds.  Their hooves aerate and their manure fertilizes the soil which enables the grass to grow better than it would otherwise.  For example, parts of western Nebraska have 50,000-acre ranches which are perfect for the grass-fed cattle.

However, when most people think of the environment with respect to cattle, they think of methane emissions. And, in fact, cattle are often blamed for global warming! Yes, the media and Hollywood have convinced people that cows produce more pollution than cars or trucks – check out Cowspiracy. This is based on the UN Food and Agriculture Organization’s 2006 report, Livestock’s Long Shadow.

While there is a difference in cow methane production in the developed world versus the developing world, Dr. Frank Mitloehner, Associate Professor and air quality extension specialist at the University of California, Davis, disputes the FAO report and explains that the difference is in the animal’s nutrition.  In the developed world, we have very good veterinary care, excellent cow nutrition, and strong genetics. This combination plus a well-managed ranch reduces the parasites that compete for nutrients in the cows’ digestive system. The better the digestion – which you have when the cattle eat a good diet full of nutrients – the less the greenhouse gas production.  In fact, because grass-fed cows live eight months longer – combined with their grassy diet – their emissions are higher.

According to the EPA, in the United States, agriculture as a whole contributes 9% to greenhouse gas emissions compared to electricity which weighs in at 30%. Animal agriculture, which has increased its meat production by almost 50% since 1990, has remained constant at about 3% of U.S. greenhouse gas emissions. The fact that emissions from U.S. animal agriculture have remained relatively constant while protein production has increased dramatically reflects improved feed efficiencies, better manure management strategies, and efficient use of cropland.

Air quality is just one piece of the environmental discussion concerning cattle. It is important to consider water quality, land usage, composting, birds, and wildlife diversity. Sustainable farming is a multi-faceted approach to all aspects of the environment, not just one. It is not whether cattle are grass-fed or grain-fed that gives us sustainability – it is the overall environmental responsibility of each individual farmer or country. The North American Meat Institute provides informative fact sheets on meat production.

What about E. coli and mad cow disease?

Some grass-fed marketing efforts try to tell consumers that there is no risk of mad cow disease or E. coli O157:H7. Let’s separate these issues for a moment. E. coli lives in the cow’s digestive system and is excreted in its manure. Cattle have manure on their hides before they go to the processing plant – thus there is a risk of E. coli on the hide. This is why best practice for beef processing plants is to wash and sterilize the hide before the cattle are processed. They basically go through a car wash for cows. There are approximately 6,200 processing plants in the United States, with about 8,000 federal inspectors on-site making sure our meat is safe.

Bovine spongiform encephalopathy (BSE), more commonly known as “mad cow disease,” on the other hand, is an illness that results in brain degeneration. The primary cause is feeding cattle feed that contains other mammalian protein – a practice that is now against the law. (The disease is believed to have originated with byproducts from scrapie-infected sheep being fed to cattle.) When the spinal cord or brain of an infected animal is eaten, there is a chance the disease can spread to humans.

Today, all cattle are carefully processed without any brain or spinal tissue. In addition, they are all harvested well before 36 months, the incubation period for the disease.

What are the certifications for grass-fed beef?

The U.S. Department of Agriculture requires that beef labeled grass-fed come from cattle that ate only grass after being weaned from their mothers. Animal Welfare Approved (AWA), the American Grassfed Association, and Food Alliance offer certifications you can find on your beef that ensure the cattle were grass-fed their entire lives.

Produce Variety Helps Diet Variety!


Our choices and varieties of fruits and vegetables have expanded.

Early in the 20th century, what people ate in the U.S. primarily depended on their heritage and traditions, where they lived, what they could grow, and how much money they had. Fruits such as oranges and bananas were a special treat compared to the role of “lunchbox staple” that they play in our diets today.

The average American diet is no longer restricted by local or seasonal produce. Because of our expanded choices, the fresh produce Americans eat today is not the same as it was 100 years ago. There has been a considerable change in the commodities we enjoy year-round. Prior to the turn of the century, many produce items were primarily available only in season – i.e., blueberries, kiwi, papaya, persimmons, pineapples, raspberries, and miscellaneous tropical fruits. Other commodities such as mizuna and kohlrabi, although common outside the U.S., were virtually unheard of until recent years!

We still enjoy the same fruits and vegetables as we did in 1970!

While we have integrated new produce into our diet regimen, it is safe to say, old habits die hard. In 1970, three vegetables – lettuce, tomatoes, and potatoes – were the most consumed fresh vegetables in the US.

Per capita fresh vegetable consumption, 1970 and 2013 (USDA Food Availability Data)

The latest USDA statistics, for 2013, show that these same three commodities are still the leading fresh vegetables consumed in the U.S. However, we have expanded the diversity of these three popular veggies. Between 1970 and 2013, there were changes in the varieties of potatoes and lettuce available, as well as an increased variety of other vegetables incorporated into the average American diet.

For example, after a peak in the late 80s/early 90s, by 2013 head lettuce consumption declined by 51% while romaine and leaf lettuce consumption increased by 69%. U.S. consumers also ate more broccoli, cucumbers, onions, and peppers during this same time frame. Still, even with our preference for new lettuce types and increased consumption of other vegetables, our preference for lettuce, tomatoes, and potatoes stayed relatively consistent.

We have retained a strong preference for certain fruits.

In 2013, Americans’ fruits of choice were bananas, melons, apples, and oranges. Our fruit preferences were the same in 1970. Over that 43-year span, consumption of avocados, bananas, cantaloupes, grapes, pineapples, and strawberries increased, while consumption of apples, cranberries, peaches, and plums declined. In recent years, robust demand for avocados, blueberries, cherries, lemons, limes, mangoes, papayas, and pineapples has been driving growth in fresh fruit. USDA analysts attribute this growth to the use of fruit in traditional dishes by a more ethnically diverse population, as well as heightened interest in a healthy diet.

Various interactive graphs illustrate the changing American diet from 1970 to 2012/2013. See FlowingData.com, as well as articles in Scientific American and Time magazine.

Not only have there been changes in the diversity of what Americans eat, but there has also been an even greater change in when we eat fresh produce. Prior to the turn of the century, the majority of the U.S. population was eating strawberries for one, two, or, if you were lucky, maybe three months of the year. Now eating fresh strawberries year-round is commonplace, as noted by a 320% increase in per capita consumption from 1970 to 2010. And it is not just strawberries – the same is true for blueberries, raspberries, blackberries, pineapples, cantaloupe, and a litany of other fruits and vegetables.

So why the increasing diversity in our produce?

To help meet the growing demand for fruits and vegetables, plant breeding has produced new varieties of popular produce items with increased yields, extended growing seasons, improved fruit quality, and enhanced shelf-life. Tomatoes and strawberries are two prime examples of fruits whose year-round availability is a direct result of breeding new varieties.

Suppliers have also improved shelf-life and product quality during transportation by modifying harvesting methods. A good example of these improvements can be seen with the banana which bruises easily when it is ripe. Bananas used to be harvested after ripening until growers discovered they could harvest unripe, green bananas and ship them all over the world without damaging the still firm unripe fruit.

The 5:2 Fasting Diet


Alternate-day fasting diets, like the 5:2 diet, have become a popular way to lose weight quickly. The 5:2 diet made its way into the spotlight in 2013 when the BBC aired a documentary entitled Eat, Fast & Live Longer. In this program, journalist Michael Mosley investigated the health benefits of fasting. Before attempting several fasting methods himself, Mosley met with a series of doctors and industry professionals who assessed his current health. Mosley wanted to understand how best to protect himself against the negative effects of aging. From his reporting on alternate-day fasting, Mosley derived the 5:2 diet, which subsequently took the UK by storm.

The belief that fasting can improve your health shares similarities with the Paleo diet. Like Paleo dieters, Mosley looked to our ancestors for help when investigating fasting. When hunters and gatherers had a successful kill, they gorged themselves on the meat. This feast might last a few days and certainly was not restricted— however, if the hunters went days without a kill, they would starve, surviving on minimal food and nutrients. Thus, our bodies are capable of functioning when we are underfed. But bear in mind, our hunting and gathering ancestors put themselves in great peril, even wrestling mammoths to provide a feast. That is a lot of physical activity that we do not necessarily get today.

Throughout his investigation, Mosley interviewed a handful of researchers and specialists, among them Mark Mattson, an expert on the aging brain. Mattson, Chief of the Laboratory of Neurosciences at the National Institute on Aging and a professor at Johns Hopkins University, discussed his laboratory studies of food restriction. Based on his tests in mice, Mattson identified positive aspects of fasting. In one study, mice given an unhealthy diet high in saturated fat and sugar saw their health decline much more rapidly, roughly 3-4 months sooner. On the other hand, mice given a lower-fat diet and subjected to intermittent fasting lived roughly 6 months longer. In short, the leaner, calorie-restricted mice lived longer.

5:2 dieters argue that our bodies are not made to handle the modernization of food and that giving the digestive system frequent “breaks” helps to mend any issues with digestion.

Additionally, in his meeting with Mark Mattson, Michael Mosley learned that sporadic bouts of hunger help stimulate the growth of new neurons in our brains. Mattson also looked to our mammoth-hunting ancestors to answer the question of cell growth. From a survival standpoint, hunger confers an advantage: it makes you more focused. Fasting’s effect on the brain has actually been compared to exercise’s effect on your muscles…well, for mice anyway. To truly prove that these findings hold for humans, human trials must be performed.

So how did this research and studies like it lead to Mosley’s famed 5:2 diet?

As Mosley attempted intermittent fasting, he realized how difficult the task was. Anyone can attest that we need food, and regularly! To accommodate this need, Mosley met with Dr. Krista Varady, author of The Every Other Day Diet and an advocate of alternate-day fasting. Like the 5:2 program, the “Every Other Day Diet” instructs participants to limit their caloric intake to 500 calories on fasting days. Although the two are very similar in practice, on the “EODD” you fast slightly more than on the 5:2 diet: one week you fast 3 days, the next you fast 4, then the following week you are back to 3 days of fasting, and so on.

During Dr. Varady’s clinical studies of alternate-day fasting, researchers found participants decreased their levels of LDL cholesterol (bad cholesterol), triglycerides (fat), and blood pressure. Surprisingly, these scientists found it actually didn’t matter if you were eating a high-fat diet versus a low-fat diet on the given feast days—the LDL cholesterol and blood pressure were relatively the same for all participants.

Because participants consumed 25% of their energy needs on fasting days, Dr. Varady predicted that most would consume 175% of their energy needs on a “feed” day. But throughout her study, participants consumed only 110% of their energy needs on feed days – a 65-percentage-point shortfall from the prediction. This tells us that fasting a few days a week and then feasting on cookies, pasta, pizza, and cheeseburgers will probably help you lose weight, because you are reducing your overall caloric intake. However, your body will be missing proper nutrients. Additionally, if you exercise regularly, your energy levels may be negatively affected by the significant decrease in calories on the fast days.
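To see what Dr. Varady’s percentages mean for total intake, here is a sketch assuming a 2,000-calorie baseline (our assumption, not a figure from the study) and a strict alternate-day pattern of half fast days, half feed days:

```python
# Weekly intake under alternate-day fasting vs. a normal week.
# BASELINE_KCAL is an assumed reference intake, not from the study.
BASELINE_KCAL = 2000
FAST_FRACTION = 0.25     # 25% of energy needs on fasting days
FEED_FRACTION = 1.10     # observed feed-day intake
PREDICTED_FEED = 1.75    # Dr. Varady's predicted feed-day intake

weekly = 3.5 * FAST_FRACTION * BASELINE_KCAL + 3.5 * FEED_FRACTION * BASELINE_KCAL
normal = 7 * BASELINE_KCAL
shortfall_pp = (PREDICTED_FEED - FEED_FRACTION) * 100

print(f"Weekly intake: {weekly:.0f} kcal vs. {normal} kcal normally")
print(f"Overall reduction: {1 - weekly / normal:.1%}")  # roughly a third
print(f"Feed-day shortfall vs. prediction: {shortfall_pp:.0f} percentage points")
```

The overall calorie reduction, not any metabolic trick, is the most direct explanation for the weight loss the study observed.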

While the 5:2 diet and similar programs are not sensible dieting practices, the science behind fasting is worth a second look.

While we dismiss the 5:2 diet and similar programs, like the “Every Other Day Diet”, as viable dieting practices, we acknowledge that the science behind fasting and Mark Mattson’s research is worth a second look. Scientists have found that restricting caloric intake can help to regulate your body’s blood sugar levels. Research in mice has discovered that by reducing daily caloric intake, the body lowers its production of the hormone IGF-1. A drop in the creation of this hormone is known to help your body go into repair mode—meaning, the body begins to protect itself against carcinogens, heart disease, diabetes, and other health issues.

From his studies on mice, Mattson has also determined that “intermittent energy restriction” may help prevent the onset of Alzheimer’s disease. The mice Mattson studied are destined to develop the disease, and by controlling their food regimen, he was able to delay its onset and keep the mice healthier for a longer period of time. In his TED talk, Mattson explained that intermittent fasting helps to stimulate the growth of cells in your brain. Why? “Fasting is a challenge to your brain and your brain responds to that challenge of not having food by activating adaptive stress response pathways that help your brain cope with stress and resist disease.” (Mattson, 2014). By forcing your brain to handle stress and fight disease, Mattson believes you are increasing your brain’s productivity and potentially slowing the natural progression of aging in your brain.

In agreement with Mark Mattson, Valter Longo, a cell biologist at the University of Southern California, has also pioneered studies on the health benefits of fasting. Dr. Longo put the hormone IGF-1 under the microscope and was another influential resource in Michael Mosley’s special for the BBC. Longo, however, does not recommend the 5:2 diet. In fact, he doesn’t recommend any fad diets. He believes in “time-restricted feeding”, which means you eat two meals a day within 3 to 12 hours of each other. This, he argues, will keep the effects of aging at bay. How? Through the reduction of IGF-1. According to Dr. Longo, “the reduction of IGF-1 is really key in the anti-aging effects of some of the interventions. Both the dietary ones and the genetic ones. We’ve been putting a lot of work into mutations of the growth hormone receptor that are well established now to release IGF-1 and also cause a record lifespan extension in mice” (Jones, 2014).

With no balanced diet, intermittent fasting will not help to encourage healthy eating habits.

Nutritionists argue, however, that intermittent fasting does not encourage healthy eating habits. Because of the structure of the 5:2 diet, or any diet where you are encouraged to eat more freely on your “food days”, the importance of balanced, healthy eating is not emphasized.

With all of the focus on calorie restriction, we are missing the importance of healthy eating. Don’t forget: your body needs food. A balanced diet consists of roughly 2,000 calories a day, made up of 2 servings of fruit, 3 servings of vegetables, roughly 1 gram of protein per pound of body weight, and 3 to 5 servings of whole grains. By fasting and feasting, you are not “tricking the system”.
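As a back-of-the-envelope sketch of that guideline (the 150-pound body weight below is a hypothetical example, not a figure from the text):

```python
# Back-of-the-envelope check of the balanced-diet guideline above:
# roughly 1 gram of protein per pound of body weight within a ~2,000-calorie day.

def daily_protein_grams(body_weight_lb: float) -> float:
    """Protein target at roughly 1 g per pound of body weight."""
    return body_weight_lb * 1.0

weight_lb = 150  # hypothetical adult
protein_g = daily_protein_grams(weight_lb)
protein_calories = protein_g * 4  # protein supplies ~4 calories per gram

print(protein_g)                      # 150.0
print(protein_calories)               # 600.0
print(protein_calories / 2000 * 100)  # 30.0 (% of a 2,000-calorie day)
```

Even at this generous protein target, protein accounts for less than a third of the day’s calories, which leaves plenty of room for the fruit, vegetable, and whole-grain servings above.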

Understanding the Paleo Diet


These days, it feels like new food trends are constantly coming to market. From juice cleansing to going gluten-free, dieters and healthy eaters alike are left wondering, “What are the smart choices for my diet and my body?” In an effort to offer some clarity and take it back to simpler times, we have chosen to examine the Paleolithic diet.

Inspired by the foods of our ancestors.

A Paleolithic, or “Paleo,” diet is a diet inspired by the foods of our ancestors. Often called “the caveman diet,” this regimen focuses on a simple call to action: clean, primal eating. The Paleo diet emphasizes the importance of the foods that our ancestors had access to, which include grass-produced meats, nuts, fresh fruits and vegetables, fish, healthy oils (those of olive, avocado, coconut, etc.), and animal products, such as eggs.

Proponents of the Paleo method believe that human metabolism was not built to digest today’s highly processed foods, and that we should return to the whole foods our ancestors ate.

Proponents of the Paleo diet believe the human digestive system never properly adapted to “toxic” foods, such as grains, legumes, and dairy. Foods such as these were not available to our ancestors; thus, our bodies are not designed to consume them. Lean meats, seafood, and seasonal fruits and vegetables, however, were the basis of our Paleolithic ancestors’ diet, and our digestive system is equipped to break these foods down.

For example, while we agree that protein is a very important part of your diet, the way our ancestors consumed protein is not similar to modern practices. When our ancestors hunted and killed an animal for its meat, they gorged themselves on the food for days and could go months without another successful hunt.

There are health benefits from eating whole grains.

While it is healthy to consume protein and whole fruits and veggies as the diet prescribes, there are also health benefits from eating whole grains. Whole grains are high in fiber, which is good for your digestive system; they are digested slowly, so they keep you full for longer; and they can help reduce the risk of heart disease. As for the argument that eating whole grains causes inflammation, this is certainly true for those suffering from celiac disease; however, it is untrue if you have no wheat sensitivity. In fact, going gluten-free can often lead to a diet higher in sugar and saturated fats. For these reasons, we disagree with the Paleo diet’s requirement to cut grain completely from your diet.

There are health benefits from eating legumes.

In addition to eliminating grain, the diet recommends eliminating legumes, like lentils, beans, or peas, from your diet. The Paleo diet argues that lectins, sugar-binding proteins found in legumes, eliminate their nutritional value. But this is not true! A 2013 study suggests the nutritional content of legumes outweighs the issue with lectins. The Huffington Post also reported that cooking legumes can eliminate the anti-nutrient qualities of lectins. Legumes pack a powerful punch: they are high in dietary fiber, protein, antioxidants, vitamins, and minerals while being low in fat.

There are health benefits from drinking and eating dairy products.

Now that we’ve covered grains and legumes, let’s put dairy under the microscope. Dairy is where it gets a little trickier. Did you know that humans are the only species that consumes dairy in adulthood? This is one of the primary reasons why dairy is a strict “no” in the Paleo diet. Paleo dieters believe that by eating the food that our ancestors ate, we are eating the most natural, “untouched” foods. The milk we know today is harvested from animals that have been bred for milk production, and when we drink cow’s milk, we are ingesting the hormones that have been fed to the cow, which the Paleo diet does not condone. But when you think about it, of course humans are able to consume dairy into adulthood: as mammals, we produce milk ourselves.

When consumed in moderation, dairy is a good source of potassium, protein, and fat, and is important for your bone health. Many non-Paleo physicians argue that adults have no nutritional requirements for dairy. Our opinion? You do not need to eliminate the food group entirely, but you do not need to consume more than two servings of dairy per day to maintain a balanced diet. There are additional ways to get potassium, protein, and healthy fats.

In order to properly follow the Paleo diet, you must eliminate potatoes, dairy, cereal grains, salt, refined vegetable oils, and refined sugar from your diet. Eating at a restaurant is not easy!

The Modern Paleo: 85:15

Legumes, whole grains, and dairy can be consumed as part of a healthy, well-rounded diet. There are certainly some benefits to the Paleo approach, specifically a diet high in fruits and vegetables, lean meats and fish, nuts, and healthy fats. But it is unnecessary to eliminate entire groups of food from your diet unless prescribed by a doctor. A modern version of the Paleo diet is the 85:15 rule: 85% of the time you eat strictly Paleo, and 15% of the time you are allowed to consume non-Paleo foods. That way you are not completely eliminating certain beneficial food groups from your diet.

Investigating the “Natural” Label


If you are unclear on what the word “natural” on your food label means, you are not alone. We are not sure anyone knows the true meaning of “natural”. There is a renewed consumer interest in eating only foods like those from our hunting and gathering days. Is that realistic? Forget for a moment that the average life span of our Paleo cousins was about 33 years. Is your food really better if it is not made in a lab or a food ingredient facility? Is cane sugar more natural than high fructose corn syrup? Is “natural” food better for you? Consumers, and even some food companies, are left to their imagination. The U.S. Food and Drug Administration (FDA) agrees that most food in the grocery aisle is not exactly like it was when it left the farm. Its definition vaguely informs us of the following:

From a food science perspective, it is difficult to define a food product that is ‘natural’ because the food has probably been processed and is no longer the product of the earth. That said, FDA has not developed a definition for use of the term natural or its derivatives. However, the agency has not objected to the use of the term if the food does not contain added color, artificial flavors, or synthetic substances. **

FDA

**Note: The FDA is currently in the process of reviewing the “natural” label, and has extended the comment period until May 2016. Learn more here.

There seems to be a lot of room for interpretation. For example, the U.S. Department of Agriculture (USDA) defines all-natural meat as “minimally processed”. The Food Safety and Inspection Service (FSIS) agrees with the FDA and the USDA, saying that any product labeled “natural” cannot contain artificial ingredients or added color and can only be “minimally processed”, meaning “not fundamentally altered”.


Image Source: Nolan Ryan Beef. http://nolanryanbeef.com/

Consumers want food without chemicals, synthetics, or ingredients considered bad for you, and “factory fear” is growing. As a result, companies label their products “natural” to distinguish them as healthy. According to Mintel Marketing Research, the natural-label market in the U.S. today is significant: 11 percent of all food sold in the grocery store.

In fact, because the word natural is so ambiguous, there have been teams of lawyers reviewing the products in your local grocery store, looking to see what is truly “natural”. Kraft was sued for false advertising over its “natural cheese” claim as the cheese had artificial coloring. General Mills, Trader Joe’s, PepsiCo, and Kashi have all settled liability suits and removed the 100% natural claim from their packaging. These class action lawsuits are trying to prove that companies are deceiving the consumer—when they might be just as confused.

So what is happening in response? Companies are now advertising what is NOT in the box as protection against lawsuits. Packaging labels such as “gluten-free”, “no High Fructose Corn Syrup”, and “GMO-free” imply that the products are healthier. But these claims can be deceptive: there is nothing scientifically or medically wrong with GMOs or high fructose corn syrup, and you only need to avoid gluten if you have celiac disease.

Just because some of your food is created in a lab doesn’t mean it is filled with unhealthy ingredients. Take synthesized vitamins, for instance. Numerous studies have been done on each synthesized vitamin to confirm that the chemically created version functions the same as the natural one. For example, when we eat meat, we ingest vitamin B12, which comes from the stomach bacteria of the animal. When B12 is created in a lab, the same bacterial fermentation is simulated to create an identical B12 vitamin; no chemicals or dyes are added. Vitamin C is made the same way, through biosynthesis, whether it comes from a lab or a fruit. Plants pull calcium, phosphate, and nitrogen from the soil to make their vitamins; the lab simply combines the same minerals to create a synthetic vitamin. While they may not be considered “natural”, synthetic vitamins are not harmful to your body.

Natural flavorings are often looked to as an alternative, but some of them are not all that appealing. For instance, the all-natural alternative to red color number 40 is a coloring agent made from crushed insects. Need lemon flavoring? It can come from grass. Or how about this one: the natural smell of raspberries could come from an unmentionable part of a beaver.

Confused about labels? Here’s what you should do.

The real questions to ask ourselves are: Is my food healthy? Does it have lots of sugar? Where does the fat come from? Is this a one-time snack or an everyday snack? How many calories am I eating? Greek yogurt sold with fruit is delicious, but watch out for the added 13 grams of sugar, half of your daily allowance for added sugar. Pasteurized milk is not “natural”, but pasteurization makes your milk safe to drink.

For example, let’s take a look at Jennie’s All Natural Coconut Macaroon cookies. Because it is a cookie, our instincts tell us that it isn’t healthy. But, when you look at the label, you find that these cookies are also Non-GMO, Wheat Free, Gluten Free, Dairy Free, Yeast Free, Sulfite Free, Soy Free, Lactose-Free and Trans Fat-Free. But does that make them GOOD for you? Not really. While they may be the lesser of all cookie evils, for just two cookies, they still have 32% of your daily saturated fat recommendation and 63% of your recommended added sugar for the day. Are those two cookies worth an additional 130 calories? Even though they are considered “natural”, these cookies are certainly not a healthy snack.

The best thing you can do for yourself is to be mindful of the nutritional label versus the marketing labels.

There is no solution for inflammation comparable to maintaining a healthy lifestyle. And we are certainly not proponents of the “quick fix”, particularly if there is an underlying issue that is not being addressed. However, if you are typically fairly active and a healthy eater who has indulged and is looking to get back on track, there are some remedies that may help fight inflammation. Cryotherapy and baby aspirin are believed to reduce swelling. “Cryotherapy takes advantage of the body’s natural tendency to vasoconstrict (vessels tighten) when exposed to cold. This is why we apply ice to a trauma, like a swollen ankle, after hurting it. When we apply cold, the vessels tighten, which limits swelling. This is a good counter to the body’s natural tendency to swell and heat up an area of injury.” (Dr. Bongiourno) Additionally, baby aspirin is often prescribed to help reduce pain and swelling.

Should We Eat Wheat?


Wheat has come under fire recently. The rise in gluten-free dieting has left many questioning its nutritional value. One-third of American consumers are trying to eliminate gluten, and subsequently wheat, in the hopes of losing weight.

But the U.S. Department of Agriculture advises adults to eat between 3 and 5 servings of whole grains a day, and 6 to 11 servings for children.

Is wheat unhealthy?

It is hard to talk about wheat without mentioning its relationship to gluten. Walk into your local grocery store and the popularity of gluten-free products is astounding. Even foods that would never contain gluten are being stamped with the famous “GF” mark. We recently discussed “the gluten myth” on D2D and can confirm: gluten is not the enemy. Many non-celiac people who choose to maintain a GF diet do find they experience sudden weight loss; however, this comes from the elimination of an entire food group and the sudden change in eating habits, not from gluten weighing you down. And whole grains are an important part of a balanced diet.

Modern wheat production

Some researchers have taken issue with modern wheat because it has changed from its original form. In order to keep up with a rapidly growing population, wheat farming has adapted. As such, mass-farming has manipulated the wheat we consume today relative to the wheat that our ancestors consumed.

The creator of modern wheat, Norman Borlaug, an agronomist from Iowa, won both the Nobel Peace Prize and the World Food Prize for his positive contribution to farming.

Norman Ernest Borlaug, photographed in Mexico for LIFE Magazine in November 1970 (Flickr)

Borlaug was able to roughly double wheat production per acre. Instead of long grain stalks, wheat farmers now produce higher-yielding crops that are smaller in size: 18 inches in height compared to the traditional 4-foot-tall wheat plant. The plants are shorter because of the weight of the extra grain each stalk now carries; if they maintained their original height, the stalks would not be able to support themselves. While these crops produce more wheat to feed the growing population, it is argued that they are less nutritious.

What is Wheat Belly?

One anti-wheat proponent, who advocates that everyone eliminate wheat from their diet, is Dr. William Davis, MD, author of Wheat Belly. According to Dr. Davis, we are victims of “Frankenwheat”, which he considers addictive and toxic. Davis asserts that today’s wheat contains a protein called gliadin that, he argues, “has the potential to bind to the opiate receptors of the human brain—like heroin or morphine—except it has a different effect of course. Wheat doesn’t provide relief from pain, it doesn’t provide a euphoria, it only stimulates appetite, so that people who consume modern wheat are triggered to consume 440 calories more per day.” (Davis, Wheat Belly)

Davis believes that consuming gliadin tells your body it wants more carbohydrates, and as a response, you end up overeating. These excess carbs are eventually stored as fat. Dr. Davis believes that if you eliminate modern wheat from your diet you will see a noticeable change in your hunger levels, lose weight, and benefit from positive health changes like decreased blood pressure, lower blood sugar levels, and less joint pain.

Gliadins are not the cause of overeating

How much of this argument should we hold true? According to the article “Does Wheat Make Us Sick and Fat?” published in the Journal of Cereal Science, Davis’ understanding of gliadins is misleading, as gliadins are present in all forms of wheat, including ancient grains. In some cases, “modern wheat” actually contains less gliadin than the grain of our ancestors. The article reports, “there is no evidence that selective breeding has resulted in detrimental effects on the nutritional properties or health benefits of the wheat grain” (Shewry et al., 2011).

Gliadins are not the cause of addictive eating behaviors

As for Davis’ theory regarding wheat opioids and their effect on the human brain, the Journal of Cereal Science also discredits this claim. According to a 2008 study, although gliadin is known to release a peptide called gliadorphin, which can induce an opiate-like effect, the compound’s seven amino acids cannot be absorbed through the intestine. Because of this, gliadorphin is not present in its original form in the circulatory system, and its opiate-like effects therefore never reach the central nervous system. The evidence of this study undermines the Wheat Belly argument concerning gliadin, and Davis’ claims cannot be substantiated given today’s scientific understanding of wheat.

A pro-wheat organization that has examined the science behind grains is the American Association of Cereal Chemists (AACC). Its journal, Cereal Foods World, is responsible for bringing current industry information on grain science and technology to light. Cereal Foods World does not believe that modern wheat is a so-called “super carbohydrate.” In a report written by CFW researchers, the process of crop cultivation and modernization is examined. To quote their findings:

Modern cultivated food plants are the product of thousands of years of plant breeding, and wheat is no exception. Breeding programs have enabled a number of positive outcomes in terms of plant yield, food quality, and nutritional value. It is interesting to note that wheat varieties carried to the New World by colonists did very poorly because the varieties were not suited to the new climatic conditions…Despite the implication in the book, these varieties were produced using traditional plant breeding techniques. Currently, there are no commercially available, genetically modified wheat varieties sold. (Brouns, 2013)

So why do “wheat-eliminators” lose weight and subsequently feel better?

Again, the answer is the drastic change in diet. When you eliminate an entire food group from your diet, especially one that you consumed frequently, your system is shocked and responds rapidly. This is especially true if the wheat you were consuming before making this switch was an indulgence, like pasta, bagels, or even pretzels. You aren’t eliminating gliadin; you are eliminating junk food!

Wheat and brain diseases?

Another anti-wheat assertion is that wheat consumption is a contributing factor in long-term brain diseases, such as dementia and Alzheimer’s. David Perlmutter, MD is the author of the national bestseller, Grain Brain: The Surprising Truth about Wheat, Carbs, and Sugar. Perlmutter argues that the modern human diet has veered off course, relying heavily on carbs, whereas our ancestors’ diet was mostly fat- and protein-based.

In Grain Brain, Perlmutter argues that this dietary shift is the reason for increased inflammation in the body. “Researchers have known for some time now that the cornerstone of all degenerative conditions, including brain disorders, is inflammation. But what they haven’t documented until now are the instigators of that inflammation— the first missteps that prompt this deadly reaction. And what they are finding is that gluten, and a high-carbohydrate diet for that matter, are among the most prominent stimulators of inflammatory pathways that reach the brain.” (Perlmutter, Grain Brain)

One of the biggest issues Dr. Perlmutter and Dr. Davis have with processed grain is its ability to spike your blood sugar levels. When your blood sugar increases, your body produces more insulin, and while insulin helps keep the glucose level of your cells healthy, too much insulin will desensitize your cells. Davis and Perlmutter believe that this leads to inflammation and may ultimately contribute to Alzheimer’s. However, the data show only a very weak link between blood glucose levels and the risk of developing Alzheimer’s, so this conclusion is a far stretch given current evidence.

In order to consider the harmful effects of inflammation, we must examine inflammation and its relationship with your body’s insulin and blood sugar levels.

Both Grain Brain and Wheat Belly discuss the effect whole grain can have on your blood sugar by highlighting the fact that two pieces of whole wheat bread actually raise blood sugar levels more than a Snickers candy bar. So, why is this?

Processed grains, like whole grain or white bread, cause blood sugar levels to rise, but you should not be scared away from wheat by its glycemic value and its comparison to a chocolate bar. The nutrients of the two foods are very different, and you cannot conclude that one is unhealthy because of its similarity to the other.

Do you really believe that a Snickers bar is healthier than a serving of whole wheat bread? As delightful as it tastes, a Snickers has 250 calories, 12 grams of fat, and 27 grams of sugar, with little nutritional value. Two slices of whole wheat bread also have 250 calories but only 5 grams of fat, and they provide protein and fiber. You also need to consider how much whole wheat is in the bread in question. A serving of bread with a glycemic index of 71 is processed whole wheat or white bread, but these are not your only options. 100% stone-ground wheat is a low glycemic index food; for example, Ezekiel 4:9 bread has a GI value of 35. Additionally, you are typically eating a serving of bread with a protein, such as turkey or peanut butter, which can also slow the spike in your blood sugar.

When discussing the glycemic index, you must also consider a food’s glycemic load. The glycemic load accounts for the relative amount of carbohydrate the food contains in an average serving. By taking each gram of carbohydrate into account, you are better able to estimate how the food will affect your glucose levels. Yes, the glycemic index helps interpret how quickly glucose levels rise, but the glycemic load helps interpret how long glucose levels will stay elevated, i.e., how much the sugar is affecting you. Read what our research says about Glycemic Index vs. Glycemic Load.
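The relationship between the two measures follows the standard formula: glycemic load = (glycemic index × grams of carbohydrate per serving) ÷ 100. A minimal sketch, noting that the per-slice carbohydrate figures below are illustrative assumptions, not values from the text:

```python
def glycemic_load(glycemic_index: float, carbs_per_serving_g: float) -> float:
    """Standard conversion: GL = GI * grams of carbohydrate per serving / 100."""
    return glycemic_index * carbs_per_serving_g / 100

# Illustrative servings (carb grams are assumed for this sketch):
# processed whole wheat bread, GI 71, ~14 g carbs per slice
# Ezekiel 4:9 sprouted bread, GI 35, ~15 g carbs per slice
print(glycemic_load(71, 14))  # 9.94
print(glycemic_load(35, 15))  # 5.25
```

Even though the two slices carry similar amounts of carbohydrate, the lower-GI bread delivers roughly half the glycemic load, which is why both numbers matter.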

So while two pieces of whole wheat bread can raise your insulin levels, consuming whole wheat will not lead to rapid weight gain. In his book, Dr. Davis argues that our ancestors avoided diabetes because of their diet, which mainly consisted of wild boar, salmon, and berries. But there is no scientific data on the possible diabetic condition of hunters and gatherers! Not to mention, their diets relied entirely on what they were able to hunt or collect, and their lifespans were much shorter than the average human’s today. As such, Dr. Davis is drawing hard conclusions from limited evidence.

Complex carbs, such as whole oats, sprouted bread, or even pasta, do not have the same effect on blood sugar levels as the average piece of white or whole wheat bread. These grains actually help keep blood sugar levels low, as they are high in dietary fiber and take longer to metabolize. The more refined the grain, the higher your blood sugar will spike.

Is Red Meat Carcinogenic?


If I eat steak or bacon, will I get cancer?

NO!

On October 26th, 2015, the International Agency for Research on Cancer (IARC), the cancer agency of the World Health Organization, issued a press release evaluating the consumption of processed and red meat and its link to cancer. The evaluation focused on colorectal cancer, with associations also observed for stomach, prostate, and pancreatic cancers. While the IARC classified red meat as “probably carcinogenic to humans” and processed meat as “carcinogenic to humans”, it is important to note that the evidence supporting these claims is very limited.

The research reviewed over 800 individual studies and was run by twenty-two experts from ten different countries, and yet the findings released were not conclusive.

According to the American Cancer Society, in 2018, the chance of getting colorectal cancer for an average 50-year-old male or female is 4.49% or 4.15%, respectively. The World Health Organization reported an 18% increase in risk from eating processed meat daily. It is misleading to say that one will have an 18% chance of getting cancer when it is really an 18% relative increase over a base of a little over 4%. This brings us to roughly a 4.90% (for women) and a 5.30% (for men) chance of getting colorectal cancer if we eat 50 grams of processed meat per day.
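The relative-versus-absolute distinction is simple arithmetic; a quick sketch using the baseline figures above:

```python
def absolute_risk(baseline_pct: float, relative_increase: float) -> float:
    """Apply a relative risk increase to an absolute baseline risk (in percent)."""
    return baseline_pct * (1 + relative_increase)

male_baseline = 4.49    # lifetime colorectal cancer risk, average 50-year-old male (%)
female_baseline = 4.15  # same, for an average 50-year-old female (%)
increase = 0.18         # WHO's reported 18% relative increase

print(round(absolute_risk(male_baseline, increase), 2))    # 5.3
print(round(absolute_risk(female_baseline, increase), 2))  # 4.9
```

An 18-percentage-point jump in risk would be alarming; an 18% relative bump over a roughly 4% baseline is far more modest.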

The cancer risk related to the consumption of red meat is more difficult to estimate because the evidence that red meat causes cancer is not as strong. However, if the association of red meat and colorectal cancer were proven to be causal, data from the same studies suggest that the risk of colorectal cancer could increase by 17% for every 100-gram portion of red meat eaten daily.
– World Health Organization

Consuming large amounts of processed meat is worth monitoring, and it is not something to incorporate every day. So while you might not want to have two servings of bacon every day, you can enjoy it a few times a week without fear.

The American Cancer Society also weighed in on the issue. Its managing director of nutrition and physical activity says, “we should be limiting red and processed meat to help reduce colon cancer risk, and possibly, the risk of other cancers. The occasional hot dog or hamburger is okay.” So, when consumed in moderation, red or processed meat does not pose a big health threat.

When considering the IARC’s classification of carcinogenic foods, you have to be aware of the serving size.

The degree to which your red or processed meat consumption will affect your health has a lot to do with the other lifestyle choices you make. Do you have a well-balanced diet, exercise regularly, and drink enough water? All of these factors influence your overall health. The protein and iron that your body receives from red meat support your cells, tissues, organs, bones, and overall immune system.

Based on the study’s findings, the World Health Organization labeled red meat as Group 2A, stating that the classification was made on “limited evidence.” The IARC clarifies, “limited evidence means that a positive association has been observed between exposure to the agent and cancer but that other explanations for the observations (technically termed chance, bias, or confounding) could not be ruled out.”

The WHO also, inappropriately, labeled processed meat as Group 1, the same group that contains asbestos, arsenic, and tobacco, some of the most dangerous carcinogens to humans. Is it fair and reasonable to say that your chance of getting cancer from smoking is equal to your chance of getting cancer from eating meat? Of course not. The WHO then undercut its own classification by stating the following:

Processed meat has been classified in the same category as causes of cancer such as tobacco smoking and asbestos (IARC Group 1, carcinogenic to humans), but this does NOT mean that they are all equally dangerous. The IARC classifications describe the strength of the scientific evidence about an agent being a cause of cancer, rather than assessing the level of risk.