The Four Pillars: The Foundation of World Cuisine
1. Pillar Number 1: Meat On The Bone
a. Cooking Meat, Rule Number One: Don’t Overcook It
b. Cooking Meat, Rule Number Two: Use Moisture, Time, and Parts
c. Cooking Meat, Rule Number Three: Use the Fat
i. More Than Flavor: Fat’s Synergistic Effects
d. Cooking Meat, Rule Number Four: Make Bone Stock
2. Pillar Number 2: Organ Meat, Offal Good For You
i. Why You Should Eat That Liver Paté
3. Pillar Number 3: Better Than Fresh, Fermentation and Sprouting
a. Fermentation, Part I: Single Cell Vitamin Factories
b. Fermentation, Part II—Boost Your Immune System With Probiotics
i. Seeds of Change: Why Sprouted Grain Bread is Better than Whole Wheat
4. Pillar Number 4: Fresh, the Benefits of Raw
a. Fresh Greens: Potency that Can’t be Bottled
b. Fresh Dairy: Why Mess With Udder Perfection?
i. Lactose Intolerance
ii. Why Most Milk is Pasteurized Today
iii. The Difference Between Fresh and Processed
c. Fresh Meat
5. How The Four Pillars Will Make You Healthier
6. Two Steps to Perfect Health
7. Two Ingredients to Avoid
Foods that Program Your Body for Beauty, Brains, and Health
The Four Pillars: The Foundation of World Cuisine
One way you could reproduce a healthy diet would be to simply pick a single region’s traditional cuisine and copy it precisely. The problem is, we don’t do that. When you get books on, say, the Mediterranean or Okinawan diet and use those recipes, rarely are you creating the same dishes as the people actually living in those regions. Why not? Typically, the recipes are inaccurate. The authors reinterpret them, replacing difficult-to-obtain or unfamiliar ingredients with substitutes you can find at any Costco. Traditional fats, like lard, are replaced by government-recommended vegetable oils. Variety cuts, unfamiliar and often unavailable, are replaced with boneless,
skinless, low-fat alternatives. Any meal that takes more than an hour to prepare is deleted from the list of possibilities. And if the recipe originally required homemade components—like bone stock, fresh pasta, or fermented vegetables—the recipe is rewritten in the name of convenience, and you wind up with instructions for making foods stripped of the very things that made them tasty, authentic, and healthy in the first place. You get American food with exotic spices.

I’m going to show you what all those cookbooks have been missing. The components of traditional cuisine removed from the typical diet or cookbook are the very components that every successful traditional diet has in common. I call these components the Four Pillars. These fundamental foods give healthy people all around the world a consistent stream of the nutrition that, no matter the regional culinary peculiarities, our bodies have been programmed to require. Though each local interpretation appears unique, as far as your body’s cells are concerned, healthy diets are all essentially the same, resting on the same Four Pillars: meat on the bone; fermented and sprouted foods; organs and other “nasty bits”; and fresh, unadulterated plant and animal products.

To our palates, the spectrum of regional cuisines is as diverse as the ecology of our planet. In Hawaii before Captain Cook’s arrival, the staple food was poi, a paste made of roasted and dried taro (a tuberous root vegetable) that could be stored for months, rehydrated on demand, and then, as a final step, fermented. This staple was supplemented most often with fish, coconut, and banana. (Interestingly, the alii, or royal class, ate more fish and other high-nutrition foods than poi and were also taller.
I suspect that, as with any society, the cause-and-effect relationship between height and access to the choicest foods went in both directions: Better foods made some people relatively tall; being taller offered access to better foods.) Until around 1940, the Netsilik Eskimo traditionally ate seal, fish, lichen, and not much else. In the
Mongolian desert today, nomadic bands of camel breeders eat mainly dairy products, some grains, lots of tea, root vegetables, and meat. In the rain forest of Papua New Guinea, one of the last surviving hunter-gatherer groups, the Kombai, dine on fat grubs of giant flies, lizards, birds, pounded sago palm hearts, and—for special occasions—fattened pig. In West Africa, farmers known as the Mofu grow millet, beans, and peanuts, forage for insects, and raise goats and chickens, just as they have for thousands of years.

While each of these seemingly diverse diets contains foods that may strike you as bizarre, the nutritional content they represent is as familiar to your body, and to your epigenome, as salt or water. As far as your body’s cells are concerned, vegetable oil and massive doses of sugar—now that’s strange. If you’ve been eating a standard, food pyramid-compliant American diet, any authentic regional diet, no matter how exotic, along with the abandonment of vegetable oil and sugar, would bring your body, your cells, and your genes a welcome and long-awaited relief. But you don’t have to move to get the benefits of these traditions. Simply include foods from each of the Four Pillars in your diet. Start by eating something fresh once every day, and work your way up to using foods from two or more categories daily.
Although no region has cornered the market on health, French cooking is special. Against the backdrop of international food, French cuisine stands out for its variety, depth, and indulgent sensuality. The French literally wrote the book on culinary arts, as every chef trained in the Western tradition owes his or her skills to Escoffier and the culinary pioneers who preceded him. Some would argue that China deserves equal billing with France as a culinary epicenter, as it is the original source of so many foods we now take for granted. But unlike Chinese or Italian or Mexican food, French food served in the US and around the world is often prepared using age-old techniques, allowing it to retain unparalleled flavor
profiles and healthful character. You could say that French cuisine stands firmly on all of the Four Pillars. Of all the cuisines in all the restaurants in all the world, why would French food enter the 21st century looking very much the same as it did in Napoleon’s court? In a word, snobbery. This famously French attribute definitely has its good side because, without it, the universally celebrated gift of authentic epicurean expression would never have come to exist. The early 19th-century middle classes wanted to prove that they had been elevated beyond “the mere physical needs of nourishment.” The result was a new brand of cooking which the upwardly mobile, who could now afford to hire chefs, would come to call grande cuisine.

Grande cuisine was, and is, a style of cooking offered by high-class restaurants. Chefs would seek out the best regional ingredients in season and perfect the techniques used to prepare them, not so much to maximize nutrition as to maximize flavor. “The grande cuisine attained its status because it emphasized the pleasure of eating rather than its purely nutritional status.” In spite of this new emphasis, grande cuisine originated at a time when real ingredients—as opposed to things like MSG and sugar—were the only edible materials available. So as these chefs concentrated real, quality ingredients to intensify flavor, they couldn’t help but concentrate their nutrients at the same time. The codification of grande cuisine in professional texts has preserved in amber centuries-old techniques for extracting flavor and nutrients from foods grown throughout Europe and Asia. By no coincidence, foods representing each of the Four Pillars appear again and again in classical French cooking.

I told you about the “Latina Paradox,” the fact that relatively less affluent, recently immigrated Hispanic women, eating traditional Hispanic foods, somehow still manage to have healthier children than the average American woman.
As you know, the French have their own health “paradox”—relatively low rates of heart disease, despite a notoriously rich diet. Now that you understand why these traditional diets are actually far healthier than the typical American diet, you can see that there really never was any mystery at all. The answer is in healthy fats, very little sugar, and plenty of foods from each of the Four Pillars, starting with meat on the bone.
Pillar Number 1: Meat On The Bone

It’s easy to enjoy well-prepared meat, but we’re not born with the knowledge of how to make it taste good. That part, we have to learn. Though the art of making meat taste great can be as simple as it is rewarding, if you’ve never seen a person do it, you’d never know the trick. The secret? Leave it on the bone. Thanksgiving dinner is, for many, the most memorable meal of the year, and it happens to be centered on a large bird, slow-cooked whole. When cooking meat, the more everything stays together—fat, bone, marrow, skin, and other connective tissue—the better. This section will introduce you to the simple techniques that primitive and haute cuisines use to make meat taste succulent, juicy, and complex. The better the material you start with, the better it tastes, and the better it is for you. For that reason, and more, animals raised humanely and pastured on mineral-rich soil are best. I’ll show you the four rules you need to know to preserve and enhance the taste and nutrition of all our precious animal-derived items. And I’ll show you the science that explains why mastering the art of cooking meat is the first step toward capturing the true power of food.
Cooking Meat, Rule Number One: Don’t Overcook It
There are two kinds of people: those who like their steak rare and those who don’t. Not sure which side you fall on? Answer this question: What would upset you more, if the steak you just ordered came to your table undercooked or overcooked?

When I started eating meat again after experimenting with vegetarianism in graduate school, Luke’s opinion that well-done meat is wasted meat was unconvincing. But after studying the chemistry of well-done versus rare, I recognized that, once again, Luke’s primal instinct was spot on. I can still recall the effort required to swallow my first bloody, glumpy, chewy bite when I crossed over to the other side of the culinary divide. Luke’s delicious brown stock gravy helped my first time go much easier. Now, five years later and much the wiser, I find meat cooked as much as medium to be stringy, chewy, coarse, and devoid of the savory flavor of juicy red blood. I’ll never go back.

When it comes to steak, it’s not the size that matters; it’s the consistency and texture. Overcooked meat is tough because its fat, protein, and sugar molecules have gotten tangled and fused together during a wild, heat-crazed chemical orgy. The result is a kind of tissue polymer that requires more work to cut with a knife and more chewing, as well as more time to digest. The worst part is that so many of the nutrients we need are ruined. Ruined nutrients don’t just politely disappear. Once ingested, your body won’t be able to simply flush them down some metabolic drainpipe. When heat kills nutrients, it does so by causing reactions between nutrients, forming new chemical compounds including known carcinogens (such as polycyclic aromatic hydrocarbons and heterocyclic amines), as well as other molecular fusions that damage your kidneys and blood vessels.
When meat is cooked properly, fewer harmful reactions occur. The nutrients and flavor compounds survive, and can now be gently released into the meat’s juices, where they are more bioavailable, and more readily tasted and absorbed.
So how much heat is too much heat? If, when you slice it, there’s not even a trickle of juice, it’s way overdone. Steak should be juicy and red. I recommend you work your way down to medium rare, and once you get used to that, go for rare. One last thought: If you’re an Anthony Bourdain fan, you already know that restaurant patrons who order their steak well done get the oldest, least choice cuts. It’s not that the chefs have it in for people who order their steaks brown. They have to save the freshest product for those palates that can taste the difference.
Cooking Meat, Rule Number Two: Use Moisture, Time, and Parts
Not long ago, at a party, I met a dark-eyed Peruvian woman with a sultry accent who had just discovered her slow cooker. She’d owned it for two years before a visiting friend released it from confinement in the back of the kitchen cabinet. That whole week they ate nothing but stews. After years of indifference toward it, my new friend had fallen in love with her slow cooker because “it giff so mush flavor!” When I told her that good, complex flavor means good nutrition, and that she should use it as often as she wants, she fell in love with me.

It is a little-known fact that when a chef talks about flavor, he’s also talking about nutrients. When he says some flavors take time to develop, he’s saying sometimes you have to wait for certain nutrients to be released. Cooking meat slowly is the best way to turn an ordinary meal into something extraordinary—in terms of taste and nutrition. The potential flavor of meat, or any food, derives from its complexity. Depending on the cut, “meat” may include muscle, tendon, bone, fat, skin, blood, and glands—each a world of chemical diversity. When that diversity is released on your tongue you can taste it, and the rich, savory flavor means a world of nutrients are on their way.
You don’t actually need a slow cooker to cook meat slowly and enjoy all the same benefits. All you need is moisture, time, and parts (as many different tissue types as possible: ligament, bone, fat, skin, etc.). Making soup, stewing, keeping a lid on the pot to trap the steam, basting often when cooking in the oven—all these techniques keep the moisture inside the meat, enabling water molecules to make magic happen. Here’s how.

The transformation of, say, a cold and flavorless chicken leg into something delicious begins when heated moisture trapped in the meat creates the perfect conditions for hydrolytic cleavage. At gentle heating temperatures, water molecules act like miniature hacksaws, neatly chopping the long, tough strands of protein apart, gently tenderizing even the toughest tissue. And because water also prevents nearby strands from fusing together, keeping meat moist prevents the formation of the protein tangles that make overcooked meat so tough.

How does hydrolytic cleavage translate into taste? It’s simple. Taste buds are small. The receptor site where chemicals bind to them is tiny. So things that impart taste (called flavor ligands) must be tiny, too. If you were to take a bite of a cold, raw leg of chicken, you wouldn’t get much flavor from it. Cooking releases trapped flavor because, during the process of hydrolytic cleavage, some proteins are chopped into very small segments, creating short strings of amino acids called peptides. Peptides are tiny enough to fit into receptors in our taste buds. When they do, we get the sensation of savoriness food manufacturers call the “fifth flavor,” or umami. (Sour, bitter, salty, and sweet are the other four major flavors.)

How does having additional parts (skin, ligaments, etc.) create additional nutrition? Water molecules tug apart the connective tissue in skin, ligaments, cartilage, and even bone, releasing a special family of molecules called glycosaminoglycans.
You will find the three most famous members of this family in nutritional supplements for joints: glucosamine, chondroitin sulfate, and hyaluronic acid. But these processed supplements don’t hold a candle to gelatinous stews, rich with the entire extended family of joint-building molecules. What is more, cartilage and other connective tissues are nearly flavorless before slow-cooking because (just as with muscle protein) the huge glycosaminoglycan molecules are too big to fit into taste bud receptors. After slow-cooking, many amino acids and sugars are cleaved away from the parent molecule. Once released, we can taste them.

Slow-cooked meat and parts are more nutritious than their mistreated cousins for still another reason: minerals. Mineral salts are released from bone and cartilage during stewing, as well as from the meat itself. These tissues are mineral warehouses, rich in calcium, potassium, iron, sulfate, phosphate and, of course, sodium and chloride. It turns out our taste buds can detect more of these ions than previously suspected, including calcium, magnesium, potassium, and possibly iron and sulfate, in addition to the sodium and chloride ions that make up table salt. Overcooking traps these flavorful materials in an indigestible matrix of polymerized flesh that forms when meat begins to dry out. You can only taste, and your body can only make use of, minerals that remain free and available.

A word about flavor complexity. Although we’ve been told that some taste buds taste only salty, others sour, others bitter, and others sweet, studies have revealed that, though taste buds may taste one kind of flavor predominately, one bud can in fact detect different flavor ligands simultaneously. It turns out, the more different kinds of flavors there are, the more we taste each one. When peptides and salt ions bind at the same taste bud, the result is not a doubling of flavor, but a powerful thousand-fold magnification in the signal going to your brain.
In this way, our taste buds are engineered to help us identify and enjoy (nutritional) complexity. (This is why hot dogs, for instance—or better yet, actual sausage—taste better with sauerkraut and bittersweet mustard.)

Now, some of you might still pine for your Arby’s or your Big Mac. But keep in mind, the MSG and free amino acids in fast foods are tricking your tongue. The artificial flavoring MSG (a free amino acid, called glutamate) binds taste receptors just as peptides in slow-cooked meat would. MSG and other hydrolyzed proteins are manufactured by taking hydrolytic cleavage to its completion, fully breaking down plant or animal protein products into free amino acids while refining them away from other cellular components. Health food stores sell these taste-enhancers in the form of Bragg’s Aminos, which is no better for you than hydrolyzed soy sauces. (Brewed soy sauces derive flavor from peptides, which are safe.) The problem with these products comes from the fact that certain free amino acids have neurostimulatory effects that can lead to nerve damage (the amino acids glutamate and aspartate are the most potent). When consumed in small amounts as part of a meal containing a diversity of nutrients, free amino acids are actually good for us. But when consumed in large quantity without their normal complement of nutrients (most notably, without calcium or magnesium), these amino acids can cause temporary memory loss, migraines, dizziness, and more. This is why the concept of whole foods must be applied to animal products as well as plants. Simply refining the protein away from its source turns normal, healthy amino acids into potentially harmful compounds.
Cooking Meat, Rule Number Three: Use the Fat
We need to eat animal fat, just as we always have. Many people believe that the animals we eat today are unusually fat, but that’s not true. While grain-fed animals do contain unhealthy fat (see Why Organic, Pasture-Raised Meat Is Worth the Price), and lots of it where it’s bad for the animal (like within the muscle), the animals humans historically ate were relatively chunky too because, whenever possible, people picked them at the peak of plumpness. Free-range deer, for instance, are as much as fifteen percent fat (by weight) in summer. But by the time hunting season rolls around they’ve stuffed themselves for winter fasting and tip the scales at thirty to forty percent body fat. According to early American explorers like Samuel Hearne and Cabeza de Vaca, North American Natives preferred the fattest animals, and valued their fattiest parts most of all. When hunting was especially good, they’d leave the lean muscle meat behind for the wolves.

What are the nutritional benefits of our appetite for fat? For one thing, fat is a source of energy, like sugar. Unlike sugar, however, fat is a major building material for our cells, comprising 30 to 80 percent (dry weight) of our cell membranes. And unlike sugar, fat doesn’t trigger the release of insulin, which promotes weight gain. Furthermore, a high-sugar meal damages our tissues, but a high (natural) fat meal doesn’t. And this is something I was tested on in med school but forgot right after the test: We need fat to be able to absorb most fat-soluble nutrients, including vitamins A, D, E, and K. The fact that the presence of fat in meat also helps protect it during cooking—let’s just call that a happy coincidence. To be honest, though, it’s not always just a coincidence. Since, to keep meat moist, fat must be located on the outside of a cut of meat, good butchers strive to produce cuts encased in a neat layer of rich, tasty fat.
In smaller, leaner animals like birds, most of the fat sits right under the skin, naturally in the perfect location to keep meat moist during cooking. If you want a flavorful, juicy bird, for goodness’ sake don’t peel off the skin!

One of the latest trends in the food world falls squarely in the category of everything-old-is-new-again: grass-fed beef. Pasture-raised beef has all kinds of advantages, both for you and for the animals. You may have heard that grass-fed is good for you because of its higher omega-3 content. That’s true. But to get that omega-3, you have to get large cuts of meat with an exterior layer of fat (or the liver, or the bone marrow, or other “nasty bits”—see below). Compared to most grocery store beef, which comes from grain-fed cows and is heavily marbled with heat-resistant saturated fat, the muscle in pastured cows is relatively lean. So when you buy a grass-fed steak, it’s practically fat-free and will dry out more quickly during cooking than the typical grocery store steaks that you might be used to.
More Than Flavor: Fat’s Synergistic Effects
Have you ever wondered why fat tastes so good? We have five well-known flavor receptors.
1) Sweet, which detects carbohydrate.
2) Sour, which detects acid (acid plays a role in making nutrients more available).
3) Bitter, which detects antioxidants, some of which are also poisons.
4) Salty, which detects sodium and other minerals. And
5) Umami, the amino-acid detector described above.
If we have no receptor for fat, why do we like it so much? It’s not just your imagination that fat-free cookies don’t taste as good as the real thing. Fat was long thought to impart flavor by way of the nose. But in 2005, French researchers who blocked off study subjects’ ability to smell using—you guessed it—clothespins on their noses found evidence of a receptor in the mouth that does detect fat, called CD36. The subjects proved they could detect a variety of long-chain fatty acids, from saturated, to monounsaturated, to polyunsaturated, as well as potentially harmful oxidized fat. They could even discriminate between fatty acid types. Just as Ayurvedic culinary masters indicated thousands of years ago, there may be six major flavor groups our tongues can detect.

Not only can we detect fat; just as with other flavor ligands, there is a synergistic effect. When fatty acids bind to their receptors, it affects other taste buds such that their ability to detect sour, salty, and bitter flavors is enhanced. This makes sense because many of the compounds that taste sour and bitter are fat-soluble, and fat would be expected to enhance their absorption into our bodies as well. So it appears our tongues are wired to guide us toward nutritionally complex foods. Unless a food has been “doped” with MSG, other artificial flavoring agents, or sugar, or our senses have been dulled by chronic sugar ingestion, if something tastes delicious, it is almost guaranteed to be good for you.

Why Organic, Pasture-Raised Meat is Worth the Price

If you have a limited budget and you want to go organic, skip the low-fat fruits and vegetables and head over to the butcher aisle. Organic animal products give you more bang for your buck because they benefit from bioconcentration. Concentration refers to the percentage of a substance present in something. Bioconcentration is a process that results in a living organism having a higher concentration of a substance than its surrounding medium.
Bioconcentration is usually used in reference to pollutants. When you spray plants with herbicides and pesticides, some get taken up into their tissues. When animals eat these plants, they also eat the pesticides and herbicides. The majority of these chemicals are fat-soluble and will accumulate in fat. Since vegetables are naturally low in fat, when you buy organic vegetables, you are only avoiding a little bit of poison. When you buy organic meat, especially the fatty cuts, you’re avoiding a lot.

Bioconcentration has a good side, too. After all, it’s what eating is all about: getting lots of good information from what you eat. Plants bioconcentrate nutrients from the soil, so that a pound of grass, for instance, has more potassium than a pound of the dirt in which it grows. Animals carry this process one step further. Their tissues bioconcentrate the minerals grasses have taken from the soil and the vitamins that grasses manufacture. Research has shown that caribou can sense which blades of grass are the most nutrient-rich and preferentially graze on those. Presumably, other herbivores have the same ability. This suggests that organically raised animals kept in confinement will not be as healthy as those raised on large pastures. And a creature living freely in the wild should be healthiest of all. So if you hunt, or if you know a hunter who has extra, don’t let this amazing resource go to waste: Eat as much of the animal as you know how!

There’s one more factor making organic meat worth the price. Organically grown animals cannot (yet) legally be given antibiotics or other drugs except in case of illness. This means the farmer has to keep them healthier, which means they’re healthier to eat. Nor can organically grown animals legally (at this point) be injected with growth hormones. Growth hormones have been proven capable of surviving the cooking and digestion processes.
And some believe growth hormones in animal products are adding to the problems of obesity and cancer. Unfortunately, as the mega-industries grow stronger, they are changing the rules to make it easier to put the word organic on the label. The best bet is to get friendly with your county farmers.
Cooking Meat, Rule Number Four: Make Bone Stock
More than anything else, the health of your joints depends upon the health of the collagen in your ligaments, tendons, and the ends of your bones. Collagens are a large family of biomolecules, which include the glycosaminoglycans, very special molecules that help keep our joints healthy. People used to eat soup and stock made from bones all the time, and doing so supplied their bodies with the whole family of glycosaminoglycans, which protected their joints. Now that few people make bone stock anymore, many of us are limping into doctors’ offices for prescriptions, surgeries and, lately, recommendations to buy over-the-counter joint supplements containing glucosamine.

And what is glucosamine? One of the members of the glycosaminoglycan family of joint-building molecules. Veterinarians have been using glucosamine supplements to treat arthritic pets for decades. But physicians dismissed the practice as a waste of time, assuming that the digestive system would break glucosamine down before it could do the joints any good. Nobody can explain how, but studies have shown that glucosamine is somehow able to resist digestion and pass through the intestinal wall intact. Once it gets into your bloodstream, “…glucosamine has a special tropism for cartilage.” (That’s techno-speak for “somehow, it knows just where to go.”) Even more amazing, glucosamine can actually stimulate the growth of new, healthy collagen and help repair damaged joints. And collagen isn’t just in your joints; it’s in bone, and skin, and arteries, and hair, and just about everywhere in between. This means that glucosamine-rich broth is a kind of youth serum, capable of rejuvenating your body, no matter what your age.
After decades of skepticism, orthopedists and rheumatologists are now embracing its use in people with arthritis, recommending it to “overcome or possibly reverse some of the degradation that occurs with injuries or disease.” Given these facts, it hardly seems far-fetched to suggest that eating this stuff in soups and sauces from childhood makes joints stronger in the first place.

One of Luke’s golfing buddies, Kauai born and bred, didn’t need convincing. As a child of a Filipino household, he ate lots of meat on the bone growing up. One day, chopping a goat leg to stir into stew, he asked his mother about the white, shiny stuff on the ends of the bones. She told him that he had the very same kind of material in his own joints. Instantly, he decided that eating that shiny cartilage would be good for his shiny cartilage. He has eaten meat on the bone ever since, making sure to chew on the ends. Now his friends are on arthritis meds, while he’s surfing and golfing twice a week.

Not only do bone broths build healthy joints, the calcium and other minerals they contain help to grow your bones. One of my patients is a charming young boy whose father is a chef. The chef is 5 foot 10 and his wife is 5 foot 5. Both parents are lactose intolerant, and so, for years, the chef made bone stocks and used them as a base for rice, mashed potatoes, soups, and reduction sauce gravies. He did this so that he and his wife would get plenty of dietary calcium. Aside from calcium, bone broth also contains glycosaminoglycans, as well as magnesium and other bone-building minerals—basically a total bone and joint building package—most of which the chef didn’t know about. However, his son’s DNA did. This child of average-height parents started life at normal size, but his growth chart illustrates that, over the years, he’s gotten progressively taller than average. Now, at ten, his height and muscle mass are already off the chart.
By the way, his teeth are straight, he doesn’t need glasses, and he is the number one swimmer on his team. Coincidence? Misleading anecdotal data? I don’t think so. We all know that vitamin D and calcium are good for a child’s growing bones. But it takes a whole array of vitamins and minerals to build a healthy skeleton. Cooking meat on the bone extracts all those well-known vitamins and minerals, plus the glycosaminoglycan growth factors.

To have tall, strong, well-proportioned children, we’re often told to get them to drink milk. And if we’re talking about organic whole milk—especially raw!—I’m all for it. But if it were my kids, I’d also make sure they were getting regular helpings of homemade soups and sauces, and anything else I could think of to get them to eat more stock.

The benefits of broth consumption far outweigh the benefits of taking a pill for a couple of reasons: First, the low heat used to slowly simmer the nutrient material from bone and joint is far gentler than the destructive heat and pressure involved in the production of glucosamine tablets. Second, instead of extracting only one or two factors, broth gives you the entire complex of cartilage components—some of which have yet to be identified in the lab—plus minerals and vitamins. Broth’s nutritional complexity makes it a nearly perfect bone-building, joint-supporting package. And it’s no coincidence that it tastes great. Rich, satisfying flavors convinced the father of modern French culinary science, Auguste Escoffier, that stock was an absolute kitchen essential: “Without it, nothing can be done.”

Our ancestors probably discovered the magic in bones a very long time ago. In the Pacific Northwest, archaeological digs have uncovered evidence that, centuries before Escoffier, early Native Americans supplemented their winter diet of dried fish by deliberately fracturing the bones of herbivorous animals prior to stewing them.
Not only did this release bone nutrients, it released the marrow fat and vitamins into the simmering soup. And anthropologists studying hunter-gatherers from Canada to the Kalahari find that this practice of exploiting bone and marrow nutrients was, and is, “almost ubiquitous.” While visiting a farm in New Zealand, I met a spry and engaging 80-something woman who told me about the Scottish tradition of “passing the bone.” In the little village where she grew up, nothing went to waste. Cartilaginous knee joints and bony shanks were especially prized, and passed from house to house. Each family would put the bones into a pot over the stove to simmer for a night before passing them on to their neighbor until the bone was “spent.” As she hiked with us over the rolling green hills of her estate, she explained that the bones were shared because she and her neighbors were convinced that “something in them was sustaining.” Indeed there is. So skip the pharmacy aisle and head straight to your local butcher for bones to make your own homemade stock. For thousands of years, people all over the world made full use of the animals they consumed, every last bit right down to the marrow and joints. You might suppose that, over all that time and all those generations, our bodies, including our joints, might grow so accustomed to those nutrients that they wouldn’t grow, repair, and function normally without them. You’d be right. And what is true of bones is true of other animal parts. Over time, our genes have been programmed with the need and expectation of a steady input of familiar nutrients, some of which can only be derived from the variety meats, which include bones, joints, and organs.
Pillar Number 2: Organ Meat, Offal Good For You
Long ago, when a deer was killed then lifted on a hook to be dismembered, the hunter began by inserting a knife just below the xiphoid process at the lower end of the sternum and briskly drawing it down to the pubic bone. When properly done, the guts spilled out of the belly and naturally fell to the ground—off fall. In modern usage, the term offal encompasses every part of an animal except ordinary muscle meat. If you’ve ever seen one of those travel shows hosted by a snarky gourmand eating strange foods in exotic locales, you might recall watching scenes of street vendors in Calcutta frying brains on a skillet, or sweetbreads served in a dusty open-air eatery in Uzbekistan, and thinking, How can they eat that? It’s all a matter of what you’ve grown up with. Had you been born elsewhere, you might drool at the sight of lungs on a stick just as you might now go ga-ga over a greasy corn dog. In fact, until recently, those offal meats were a big part of American dining, integrated into our diets through a wide range of dishes. Turn the cookbook pages back just a few generations and you’ll find Halloweenish recipes calling for organ meats and other variety cuts alongside familiar casseroles and crumb cakes. My 1953 version of Joy of Cooking lists Calf Brain Fritters and ten other brainy recipes, as well as instructions for making meals from liver, kidney, tongue, heart, head, and thymus. If you dig further back to cookbooks printed before the Industrial Revolution, you’ll find ghastly instructions requiring a witch’s arsenal of implements, large cauldrons and bone-splitting hatchets. 
From The Ladies New Book of Cookery, published in 1852, listed under preparation of beef, we learn the private housewife was to “take a green tongue, stick it with cloves and boil it gently for three hours.” Also included are practical tips on how to estimate internal temperature without a meat thermometer: “When the eyes drop out, the pig is half done.” Plus pointers on mannerly kitchen protocols: “It is better to leave the wind-pipe on, for if it hangs out of the pot while the head is cooking, all the froth will escape through it.” Our founding fathers’ wives followed recipes that made extensive use of offal meats, especially in the fall when many animals would be killed to conserve precious grass and hay for the best breeders that could repopulate the pastures again in spring. Since offal goes bad quickly, it needed to be consumed or preserved as soon as possible. The prudent housewife of the 17th, 18th, and early 19th centuries would want to make use of every last scrap and, nutritionally speaking, nothing would better prepare her family for the long winter ahead. Offal meats are rich in vitamins, especially fat-soluble vitamins, which can be stored in our own fat reserves for months. As winter wore on, and root cellars emptied, those larders of nutrients built up internally by feasting in the fall sometimes made the difference between life and death, or a successful pregnancy and one fraught with complications.
Why You Should Eat That Liver Paté
One of offal meat’s most famous proponents was Adelle Davis, a biochemist who pioneered the fledgling field of nutrition in the mid-20th century. A patient of mine, who was taken to her in the 1940s on the advice of his pediatrician for help with his disabling asthma, was not simply treated. He was cured. Back then, there were no handheld inhalers. Every time he developed a cold or the weather changed, his mother would have to rush him to the hospital for shots of adrenaline. Davis advised his mother to send him off to school with a thermos of pureed raw cow’s liver every day, which he managed to drink primarily because he wanted to avoid the emergency room. The raw cow’s liver provided a spectrum of missing nutrients to calm the inflammation that triggered his asthma attacks. But it may also have done much more, ensuring his entire nervous system was wired correctly. Today, in his seventies, his reflexes are still so fast that he can trounce Luke on the tennis court. I don’t recommend you eat raw liver unless you are familiar with the source and have taken proper measures to prevent parasites. But a quick glance at the nutrition tables for liver and other variety cuts reveals why nutrition-oriented physicians might use these parts as cure-alls like Davis did; they’re the real vitamin supplements. As she explains in her book Let’s Cook It Right, “The liver is the storage place or the ‘savings bank’ of the body. If there is an excess of protein, sugar, vitamins, and any mineral except calcium and phosphorus, part of the excess is stored in the liver until it is needed…. Liver is, therefore, nutritionally the most outstanding meat which can be purchased.” Of course, if the cow is sickly, or raised on depleted soil, the savings bank of the liver is likely depleted as well. The following are just a few examples of the benefits of eating different variety meats. The Latin name for the central spot of the retina is macula lutea. (Lutea is Latin for yellow.) 
This thick, membranous yellow layer of the eyeball is a rich source of the nutrient lutein, a member of the carotenoid family of plant pigments. Lutein supplements are now promoted as being good for prostate health and for preventing macular degeneration. The fat behind the eyeball is a rich source of vitamin A and lutein. (If you think you’d rather swallow a supplement than pop an eyeball after breakfast, remember that vitamins are heat-, light-, and oxygen-sensitive and unlikely to survive processing.) And while you’re digesting the idea of eating eyeball fat, consider that the gooey juice in the eye is primarily hyaluronic acid, rich in glycosaminoglycans. You can get hyaluronic acid injected into your lips (to fill them out), your knee (as a treatment for osteoarthritis), and even your own eye (to treat certain ocular diseases) for $200 a dose (twenty one-thousandths of a gram—just 20 milligrams). It’s called Restylane. But you can get this useful nutrient into your body just by eating the eyes you find in your fish head soup, and the glycosaminoglycans will find their way to the parts of the body that need them most. Brain and nervous tissues are fantastic sources of omega-3 and other brain-building fatty acids and phospholipids, and with more than 1.2 grams per 100-gram portion, they are a richer source of this vital nutrient than almost anything else. Even windpipe contains stuff we don’t get enough of these days—those glycosaminoglycans again. Many of my patients spend upwards of a hundred dollars a month buying supplemental nutrients that are far less potent than what our ancestors enjoyed daily, simply by including variety meats in their diet. You may have noticed a pattern here: eating eyes is good for your eyes. Eating joints is good for your joints. The idea that the consumption of a part of an animal’s body is good for the same part of your own is an interpretation of homeopathy—meaning like cures like. 
Unfortunately, today most of these powerful “supplements” are going to waste as today’s meat producers wash these rich sources of nutrition down drains in the slaughterhouse floor, or pass them off to rendering plants where heaps of rotting tissue are reprocessed into animal feeds, yellow fat, and something called “recycled meat.” The good news is, since our society values them so little, if your butcher can save them for you, he’ll likely sell them to you cheap. The bad news is, once we’ve got them, making them taste good isn’t especially easy to do; it takes a little time and know-how. For adults, the reward is a powerful resistance to disease. For children, the awakening of their genetic (growth) potential brings rewards that are indescribably greater.
Pillar Number 3: Better Than Fresh, Fermentation and Sprouting
Egyptians set aside their dough until it decayed, and observed with pleasure the process that took place. —Herodotus, 5th century BC.
On a recent trip to the Bay Area where I was giving a talk on nutrition, a good friend took us out for lunch. “You’re into healthy food,” she said. “There’s a hip new vegan restaurant we’ve got to try.” Opening the menu felt like cracking open a history book to do your assigned reading; nothing looked appetizing. Though the menu was peppered with pop-nutrition vernacular—“living,” “dynamic,” “enzyme”—the selections were simply awkward interpretations of familiar foods: the raw pizza, the cold burrito. Luke ordered the burrito, a compressed disc of rancid seeds laureled with a splash of fresh greens. I ordered the pizza, an identical compressed disc with a different kind of dressing on the greens. The greens were good. The disc was not. Truly living food is more dynamic than salad leaves, and more potent than a plate of compressed seeds; it’s food that’s been awakened by the process of fermentation, sprouting, or both. Vegetarians in particular will benefit from these two potent methodologies for enhancing nutrition. Fermentation and sprouting are crucial for one simple reason: Plants didn’t evolve with the idea that they should be good to eat. In fact, plants spend a great deal of energy thwarting overzealous grazers and other creatures that would gladly eat them into oblivion. Not as helpless as they may seem, plants protect their foliage, stems, seeds, roots, and to a lesser degree even their fruits, with natural insecticides and bitter toxins that make some plants unsafe for human consumption. 
Unless your species has evolved the physiologic means to neutralize them, a plant’s various hemagglutinins, enzyme inhibitors, cyanogens, anti-vitamins, carcinogens, neurotoxins, and allergens say, “Eat at your own risk.” Although I disagree, some investigators have gone so far as to suggest that “nearly all the carcinogens in the diet are of natural rather than—as widely perceived—industrial origin.” Sprouting and fermenting effectively deactivate many of these irritants, which explains why sprouted grains and lacto-fermented vegetables are known to be easier to digest. Many of today’s best foods were originally fermented, sprouted, or both. Take away fermentation and there’s no such thing as wine. Or beer. You can forget bread, yoghurt, and cheese. Chocolate’s out, since cacao nibs must sit in the sun for a week or so to let the fruit ferment around the nibs and develop the full symphony of flavor. And the same goes for coffee berries. The list of fermented foods grows surprisingly long when you throw in things like sauerkraut, pickles, ketchup and other condiments that—though now industrially mass-produced by steeping in vinegar and salt—traditionally generated their own acid preservatives during fermentation. In The Story of Wine, writer Hugh Johnson celebrates fermentation as a central driving force of civilization. The oldest recipe known to exist, written in cuneiform, is for a kind of beer bread. If we’d never allowed cereal grains to sprout, we would never have invented bread nourishing enough to sustain a population; for the first ten thousand years of wheat and grain cultivation, the technology to crush open the kernels did not exist. And so, for the majority of human history, life-giving bread was made not with flour, but with partially germinated seeds. Unfortunately, even in places like France, people often fail to appreciate their own wild, indigenous microbes. And so many foods (cheeses, breads, wines, etc.) 
have had their flavors tamed by way of pasteurization, by the use of faster-acting cultures that are easier to work with, or both. In the next two sections, we’ll take a look at the battle of wills between human and vegetable, and see why traditional, low-tech methods for neutralizing plant toxins and maximizing nutrition are far more effective at producing healthy products than contemporary methods.
Fermentation, Part I: Single Cell Vitamin Factories
The human digestive system is a chimera. It’s one part us, one trillion parts them. We supply the long, hollow tube that begins at our mouth and coils for a dozen meters or so inside our abdominal cavity until it ends at the rear. The microbial world populates the tube with enough bacteria and fungi to outnumber our own cells ten to one. The average human colon contains over 800 species of microbiota and at least 7,000 different strains; 60 percent of the fecal matter you produce consists of microbial bodies. Are all these microbes just freeloaders, or do we somehow benefit from their presence? To answer that, we need to understand something about a process called fermentation. My Webster’s dictionary describes fermentation as an “enzymatically controlled transformation of an organic product.” The key term is transformation. Bacteria are capable of transforming indigestible, bland, and even toxic compounds into nourishing and delicious foods. Without them, multi-celled organisms, from flies to frogs to mammals, would be unable to digest their food. With an arsenal of enzymes, microbes can break down toxins that might otherwise sicken or kill us outright, turn simple sugars into complex nutrients, make vitamins our diets might otherwise lack (such as K2 and B12), and wage chemical warfare on would-be pathogens. All we do for them is provide a warm place to work and plenty of water. From their perspective, we are the freeloaders living off their hard labor. The obliging microbe isn’t especially particular about where it lives. Requiring little more than consistent temperature, water, and a few organic materials, bacteria and fungi are equally happy whether inside our digestive tract, in a warm clay pot in the sun, an oak cask in a cave, a leather sac, or even an egg buried underground. Thousands of years ago, people learned to harness the power of these invisible “factories,” which developed predictably under a certain set of conditions. 
That skill opened up a world of possibility, enabling us to preserve our food and create a whole new set of flavors. Fermentation would ultimately be put to use by people around the globe, and form one of the foundational pillars of all traditional cuisine. Though today we tend to think of bacteria and fungi in our food as unwanted enemies, usually calling them “germs,” civilization owes much to these contaminants. Without yeast naturally present in the air, we never would have been able to leaven our bread, and in the 1960s, doctors discovered a dramatic example of the value of leavening. Poor Turkish families were having children with a type of dwarfism initially thought to be due to a genetic mutation. When no defective gene could be identified, researchers looked to nutritional problems. It turned out that the mothers of affected children, as well as the children themselves, had low levels of zinc and other minerals. Further investigation revealed the cause of the mineral deficiency to be unleavened bread consumption. Wheat, like all seeds, contains mineral-binding compounds called phytates, which hold minerals in stasis until conditions are right for germination. Yeast and other microbes (such as those in sourdough) contain enzymes (called phytases) that break down phytates in the seed, freeing the zinc, calcium, magnesium and other minerals from their chemical cages. The parents of dwarfed children were buying cheaper, unleavened bread and were also unable to afford much meat, a good source of zinc and magnesium. The unleavened bread was the last straw. Bound to phytates, the zinc and magnesium in the bread passed through undigested, leading to mineral deficiencies that prevented proper expression of the children’s bone-building genes. This is just one example of what happens when people buy food based on price rather than on its nutritional value. 
Because few people appreciate the difference between authentic food that costs more, and similar substitutes that cost less, manufacturers skip the labor-intensive fermentation steps whenever they can. Which is why I want to tell you the truth about soy. Some of my patients speak so proudly about how they’ve started eating tofu and drinking soymilk, obviously presuming that I think these things are healthy, that I can hardly bear to burst their bubble. Soybeans contain chemicals called goitrogens and phytoestrogens, which disrupt thyroid and sex hormone function. The Chinese and Japanese who traditionally ate soy would soak, rinse, and then ferment the beans for extended periods, neutralizing the harmful compounds and using the fat- and protein-rich beans as a substrate for microbial action. Traditional tofu, natto, miso, and other cultured soy products are incredibly nutritious. Commercially made soymilk, tofu, and soy-based infant formulas, on the other hand, are not. Loaded with goitrogens and phytoestrogens, these foods are known to cause hypo- and hyperthyroidism, thyroid cancer, and—particularly when consumed during infancy or pregnancy—male and female reproductive disorders. I have helped several patients with abnormal thyroid hormone levels and menstrual irregularities return their lab results and their bodies back to normal simply by advising them to stop eating soy. Pound for pound, fermented material will have more nutrition packed into it than the raw material it came from because, aside from acting like miniature detoxification machines, microbes add heaps of nutrients to whatever it is they’re growing in. Using enzyme power, single-celled bacteria and fungi manufacture all the vitamins, amino acids, nucleic acids, fatty acids (and so on) they need from simple starting materials like sugar, starch, and cellulose. They can thrive on foods that would leave us horribly malnourished. But we are bigger than they are. 
When we eat yoghurt, real pickles, real sauerkraut—or any food containing living cultures—our digestive juices attack and destroy many of the little critters, exploding their fragile bodies. Many survive (and protect us, see below), but those that are digested donate all their nutritious parts to us. Though, once the fermentation process is finished, foods like wine and cheese no longer contain living organisms, they have been enriched by the life-forms they once housed: wine has more antioxidants than grape juice, and cheese more protein than milk. The little critters can actually make all the vitamins we need except D, and all the essential amino acids. And they have one more trick up their sleeve. As if it’s not enough that they can free up minerals, preserve our food, manufacture vitamins, and clean up the nasty plant chemicals that our bodies can’t handle, once inside your body, they will literally fight for your life.
Fermentation, Part II—Boost Your Immune System With Probiotics
In 1993, E. coli hamburgers from Jack in the Box restaurants sickened hundreds of children, killing several. Around the same time, E. coli outbreaks in the apple industry led to the requirement that apple juice be pasteurized. In 2006, spinach laced with manure made more people ill. In 2008, salmonella-tainted tomatoes were blamed for another outbreak—until they decided it was actually jalapeño peppers. It seems as though there’s always something yucky in our food ready to make us sick. No doubt, there are nasty microbial agents in the general food supply all the time. The question is, Why do they make some people deathly ill while leaving the rest of us alone? Turns out, it has to do with our social lives. I’m not talking about the people we go to parties with, but our bacterial bosom buddies. Microbiologist Dr. Bonnie Bassler discovered that microbes have social lives too. Far from behaving like mindless pre-programmed specks, they form gangs, coordinate efforts, and even scheme against other groups of bacteria. In fact, the turbulent world of micro-organisms shares all the violence and drama of a Spaghetti Western. And the microbial world operates under the same binary rubric. As far as your body’s concerned, when it comes to bacteria and fungi, there really are just two kinds: good and bad. The first group, often referred to with the umbrella term probiotics, comprises the same beneficial bacteria that preserve, detoxify, and enrich our food. These microbes are friendly and very well behaved. After all, we feed and house them, so it is in their best interest to keep us healthy. To that end, they secrete hormones that help coordinate the muscular contractions of intestinal peristalsis, while keeping a sharp lookout for bad guys: the pathogens. Probiotics work with our immune system. If pathogens hope to gain a foothold, they have to get past the phalanx of probiotics first. 
While you’re watching Survivor or Top Chef, microbes in your gut are making alliances and scheming against each other for control of your internal real estate. Not only does the outcome of their battles determine whether or not a deadly strain of E. coli in your manure-tainted spinach kills you, studies have shown that live-cultured foods containing probiotics help to prevent a whole range of allergic, autoimmune, and inflammatory diseases. The people who originally mastered the art of fermenting fruits, vegetables, meats, and so on were probably seeking ways to preserve their food. Crops tend to ripen all at once. Fish swim in schools. Many game animals travel in large herds. These periodic abundances necessitated the development of effective food-preservation methods. The microbial world is so obliging that a little salt, a container, and some know-how are all you—I should say the microbes—need. Today we have simpler options for preserving our food, including canning, refrigeration, freezing, pickling (steeping in vinegar) and drying. But in terms of nutrient conservation, each pales in comparison to fermentation, which often adds new nutrients. Even your refrigerator can’t keep fresh fruits and vegetables from declining in nutrient content. Vitamin C, for instance, declines so drastically in storage that refrigerated green beans lose 77% after only seven days off the vine. If you’ve never fermented anything, you should. With a little instruction and practice, you can make yourself the best sauerkraut you’ve ever tasted. And it’s ridiculously easy: Shred a cabbage in the food processor. Mix with a full teaspoon of salt and a little liquid from a jar of Bubbies brand pickles (or other fermented vegetable product) and pack into a lightproof container with something heavy, like a jar full of water, sitting on top to keep the cabbage under the liquid. Cover with a towel to keep the bugs off. Wait a week or so, and eat. Not simple enough? 
Okay, here’s something even easier. With sprouting, you just let nature take its course.
Seeds of Change: Why Sprouted Grain Bread is Better than Whole Wheat
A lot of my patients tell me that they feel better when they cut wheat from their diet, and more kids than ever are developing celiac disease and allergies to wheat and products made from wheat. After 10,000 years of cultivation, why the sudden change? There are plenty of potential causes, from GMOs to pesticides to the fact that flour is often heavily contaminated with mold toxins and allergenic proteins (insect parts and rat feces). Even when organically grown, manufacturers treat wheat flour like a construction material, extruding it into geometric shapes and puffing it into crunchy cereal cushions, rendering the proteins allergenic. Whether you suffer from wheat allergies or you just want to buy the healthiest bread available, bread made from sprouted wheat (or other grains) is your best bet. Wheat seeds are called wheat berries. Like all seeds, wheat berries can be sprouted. These days, the only exposure most of us get to sprouts is at the salad bar. People used to eat sprouted stuff all the time, only they didn’t let the sprouts develop as fully as those in a salad bar. Our ancestors who didn’t have mills were able to acquire more nutrition from their harvests of grain than we do today with all our technological advancements simply by waiting until the germination process begins. Why does germinating a seed first make it more nutritious? Seeds are designed to greedily hang on to their stored proteins, fats, and minerals over extended periods of time. To that end, the plant sheaths them in a hard, nearly impenetrable carapace and locks down nutrients with chemical binders that digestive enzymes can’t loosen. Moistening the seeds for a few days activates the plant’s own enzymes—including phytase, which digests phytates—to soften the seed, free up bound nutrients, and even create new ones by converting stored starch and fatty acids into proteins and vitamins. Today’s bread is nothing like the bread described in the Bible. 
The crust of a Domino’s pizza and bread made by indigenous people around the world are, nutritionally speaking, as alike as a packet of chicken-flavored powder and wild grouse. Modern bread is made of flour, while ancient breads were made of ground, germinated seeds. Although some of the stone artifacts found in places like Peru, the Nile Delta, or North America may look like something you could use to grind wheat berries into dry flour, I suspect the berries were partly germinated first. Wheat berries are as hard as ball bearings. It’s far easier to use seeds softened by germination. I know because I’ve conducted a study. In grade school, a friend of mine returned from a visit to a Native American reservation with a set of milling stones that we just had to build an afternoon’s drama around. We both plaited our hair in what we understood to be proper squaw fashion and walked out into her backyard to figure out how to make “genuine” Indian bread. It was 1973, when every East Coast mother walked in step with hippy trends, so naturally my friend’s kitchen had plenty of wheat berries with which to experiment. Enthusiastic as we were, those tiny brown pebbles tested our patience to the breaking point, shooting laterally off the grinding stone and onto the ground until we were convinced that this methodology would fail to generate oven-ready dough by the time my mom was to pick me up. We decided to take a short cut. Back in the kitchen, her mother had a jar of lentils soaking in water, softened but not yet fully sprouted. They were smushy enough to hold still under the rolling stone. In no time, we had ourselves a small pile of greenish-yellow lentil “dough.” (More of a paste, really, since lentils have no gluten). Ever since, I’ve been skeptical of anthropologists’ claims that similar stones were used to grind wheat or other hard seeds into flour. More likely, seeds used for making bread were pre-softened by letting nature take its course. 
You can soak any kind of seed you want, from kidney beans to wheat berries and more. Simply put some into a jar, cover with water, then cover with a bug-proof cloth and, in anywhere from one to four days, the seeds will start to germinate (you’ll need to change the water once a day). You can tell once they’ve awakened because you’ll see a tiny white rootlet begin to take form. That’s the point at which it’s ready to be used as a vitamin-rich version of an ordinary kidney bean or wheat berry. Or even easier than doing it yourself, you can buy breads made with sprouted grain in health food stores. Usually, you have to look in the freezer section because, without artificial preservatives, these breads mold quickly. If you can’t find sprouted grain breads, the next best thing is whole wheat. But when shopping for bread, be aware of a savvy marketing trick. The label on brown bread can say wheat flour even though the bread is made with white flour because, yes, even white flour originally came from a wheat field. The addition of caramel coloring turns the dough dark, completing the illusion that you’ve bought healthier, whole wheat bread. Until the food producers’ lobbyists strip them of all meaning, you can feel fairly confident when the ingredients include the words whole-wheat flour. Or better yet, sprout some wheat berries and use them to bake your own.
Pillar Number 4: Fresh, the Benefits of Raw
Every time I give a talk about nutrition, someone in the audience will raise a hand to ask my opinion of the latest antioxidant miracle said to have otherworldly curative properties. Maybe it’s bearberry, or bee pollen, or goji, or ginseng. It could be a liquid extract, or a powder, or a pill—it doesn’t really matter. The idea behind all antioxidant supplements on the market is the same: to give the consumer a blend of electron-trapping chemicals that help prevent the two most common causes of tissue inflammation and degenerative disease: lipid oxidation and advanced glycation end-product formation. And every time, my answer goes like this: “If you want antioxidants, skip the latest fad products and use that money to buy fresh food.”
Fresh Greens: Potency that Can’t be Bottled
There are so many antioxidant miracles on the market now that, if you were so inclined, you could spend an entire paycheck and barely scratch the surface. But it would be a waste of good money. What the nutraceutical industry doesn’t want you to know is that there’s nothing unique about any of their “unique” formulations; all fresh fruits and vegetables contain antioxidants, flavonoids, and other categories of chemicals used as selling points on nutraceutical packages. In fact, as they will tell you, they make their products from fresh fruits and vegetables. It’s just that they use fruits and vegetables with more exotic-sounding names. The truth is, you’ll get a better blend of antioxidants simply by eating a variety of familiar greens, along with fresh herbs and spices: Sprinkle your marinara with basil and thyme, or make your own salad dressing with garlic and dill. Because supplements have been processed, and because certain chemicals may become concentrated, they can have side effects. Fresh, whole foods (including raw meat and fish) universally contain a safe, balanced blend of antioxidants because all living organisms—plant and animal—use them to prevent oxygen damage. Plants are capable of manufacturing so many different kinds of antioxidants that we’ll probably never catalogue even a tenth of them. Family names for some of the more common antioxidants include flavonoids, terpenes, phenolics, coumarins, and retinoids (vitamin A precursors). Since antioxidants must work as a team to be effective, where you find one, you find a lot—but only when they’re fresh. If you want a power pack of antioxidants, you can get them cheap if you follow writer Michael Pollan’s advice and grow a tray of fresh herbs out on the balcony. They tend to taste a whole lot better than a capsule of sterile dust. Why is freshness so important when it comes to antioxidants? As much as antioxidants retard oxidation, oxygen spoils antioxidants. 
Antioxidants protect our tissues against oxygen damage by acting like selfless chemical heroes, throwing themselves into the line of fire to protect other chemicals from free radical and oxygen damage. Not only do antioxidants gradually lose this ability over time, as oxidation inevitably occurs during storage, but their potency can also be neutralized by the drying and/or heating of processing. This is why a lot of foods deliver the most antioxidant punch when eaten raw. You can taste how much nutritional power a given plant is packing: More intense flavor means more intense nutrition. Both nutrient density and flavor intensity result from a bioconcentration of vitamins, minerals, and other nutrient systems. Pungent vegetables like celery, peppers, broccoli, arugula, and garlic contain more antioxidants, vitamins, and minerals per bite than tuberous vegetables like potatoes and turnips. Remember, cooking burns up antioxidants and damages many vitamins. So the more you eat cooked foods, the more you need to balance your diet by eating fresh, uncooked, pungent-tasting herbs and vegetables. Be aware that raw isn’t always better, thanks to cellulose, the material that gives plants their stiffness and their crispy crunch. Locked within cellulose-rich cell walls, the vitamins and minerals in high-cellulose plant products pass right through our omnivore’s digestive system. Without heat or caustic chemicals, cellulose can only be broken down by specialized bacteria and extended gut fermentation—something humans lack the intestinal yardage to accomplish (though we can replicate it; see the section on fermentation, above). Studies show that a mere one percent of the retinoids (vitamin A precursors) in raw carrots, for instance, get absorbed. But cooking (which hydrolyzes cellulose in much the same way it hydrolyzes proteins) increases that figure to thirty percent.
Only a short list of plant parts is low enough in cellulose for our digestive enzymes to break them down without cooking or fermenting them first; these include fresh herbs and spices, nuts and fruits, and young, tender lettuce leaves and other leafy greens. However we eat our veggies, raw or gently cooked, freshness is paramount. As Mrs. A. P. Hill wrote in her 1867 cookbook, “It cannot be questioned that articles originally good and wholesome derive a poisonous character from changes taking place in their own composition.” Therefore, “[a] few only can be kept twelve hours without detriment.” This was before refrigeration, of course. But even so, the precipitous drop in nutrition and flavor after picking—and the fact that most grocery store vegetables are grown in poor soil, picked before they’re ripe, and then travel the world in cold storage, reducing nutrition and flavor further still—helps explain why so many kids won’t eat their veggies. While gaining access to many of the nutrients in plants often requires (judicious) use of heat, many animal products are so abundant in nutrients that adding thermal energy risks fusing them together. This is why we need to cook our meat so gently, and why raw meat and seafood dishes comprise a valuable part of many international diets, from sashimi in Japan to ceviche in Spain and South America to steak tartare, popular around the world. But there’s one animal product we think of as fresh even though the vast majority of what we find in most grocery stores is, in reality, anything but: milk.
Fresh Dairy: Why Mess With Udder Perfection?
Milk may be the single most historically important food to human health. Not just any milk, mind you, but raw milk from healthy, free-to-roam, grass-fed cows. The difference between the milk you buy in the store and the milk your great-great-grandparents enjoyed is, unfortunately, enormous. If we lived in a country where raw milk from healthy, pastured cows were still a legal product and available as readily as, say, soda or a handgun, we’d all be taller and healthier, and I’d see fewer elderly patients with hunched backs and broken hips. If you’re lucky enough to live in a state where raw milk is available in stores and you don’t buy it, you are passing up a huge opportunity to improve your health immediately. If you have kids, raw milk will not only help them grow, but will also boost their immune systems so they get sick less often. And, since the cream in raw milk is an important source of brain-building fats, whole milk and other raw dairy products will also help them to learn. It’s a common misperception that milk drinking is a relatively new practice, one limited to Europeans. The reality is that our cultural—and now, our epigenetic—dependence on milk most likely originated somewhere in Africa. It is highly likely that milk consumption gave those who practiced animal husbandry such an advantage that it rapidly spread across the continent and then into Europe and Asia. With such widespread use, it’s likely that many of our genes now require milk for optimal expression. In those countries where people’s stature most benefited from the consumption of raw milk, when raw milk is replaced with a processed alternative, their bones take the hardest hit. It’s a case of the bigger they are, the harder they fall. In places like Norway, Sweden, and Denmark, people now suffer from particularly high rates of osteoporosis and degenerative arthritis. Our genes have been infused with real dairy products for tens of thousands of years.
Recent geologic and climatologic research reveals that between 100,000 and 10,000 years ago, the Sahara was a lush paradise of grassland. During that window of abundance, the human population exploded. To deal with the consequent depletion of wild resources, people began experiments in “proto-farming,” a term coined by biologist and historian Colin Tudge to describe humanity’s slow-motion leap from living in harmony with the land as hunter-gatherers to adopting the now-familiar program of altering the ecology to suit our interests. Author Thom Hartmann explains in his book The Last Hours of Ancient Sunlight:
'Something important happened around 40,000 years ago: humans figured out a way to change the patterns of nature so we could get more sunlight/food than other species did. The human food supply was determined by how many deer or rabbits the local forest could support […]. But in areas where the soil was too poor for farming or forest, supporting only scrub brush and grasses, humans discovered that ruminants (grazing animals like goats, sheep, and cows) could eat those plants that we couldn’t, and could convert the daily sunlight captured by the scrub and wild plants on that “useless” land into animal flesh, which we could eat'.
Or drink, as the case may be. For millennia, much of the world’s population has depended largely on milk for nutritional sustenance. However, the medical world has been ignorant of milk’s nearly ubiquitous use, confused by the issue of lactose intolerance. Because Europeans have lower rates of lactose intolerance, most Western physicians presume that only European populations have historically practiced dairying. But this confusion arises in part because most Western physicians don’t know very much about fermentation.
Lactose Intolerance
Lactose is the major type of sugar in milk. Nearly everyone can digest it while we’re babies and dependent on our mother’s milk, but many people lose the lactase enzyme in the lining of the intestine, growing lactose intolerant as they get older. Fermentation breaks down lactose, and so you don’t need that enzyme as long as you only eat fermented dairy products, such as yoghurt and cheese. The reason people living in warmer climates tend to be lactose intolerant more often than Europeans stems from the fact that fermentation progresses rapidly in warmer climates. Once fermented, the potentially irritating lactose sugars are gone. A child living in a warmer climate would, after weaning, have such infrequent need for the lactase enzyme that the epigenetic librarian would simply switch the gene off. In cooler European climates, fresh milk stays fresh for hours or days, and was presumably consumed that way often enough to keep the lactase enzyme epigenetically activated throughout a person’s life. If you have true lactose intolerance, as opposed to a protein allergy, you should be able to tolerate yoghurt, cheese and cream (dairy fat contains little to no lactose—and minimal protein).
Why Most Milk is Pasteurized Today
Most of us also have heard that milk needs to be pasteurized to be safe. But we haven’t heard the whole story. For perhaps thousands of years, people who gave their animals the basic, humane care they deserved survived and thrived drinking completely raw, fresh milk. The need for pasteurization became a reality when in-city dairies housed diseased cows whose hindquarters ran with rivulets of manure. Tainting milk’s reputation even further, around the same time, dairymen were often infected with diphtheria, spreading the deadly bacteria through the medium of warm, protein-rich milk. But no epidemics have ever been traced to raw milk consumption when the cows were healthy and the humans milking them were disease free. If the animal is sickly—as animals invariably are when raised in crowded, nightmarish conditions—its milk should probably not be consumed at all. When that’s your only choice, then, yes, it ought to be cooked first to reduce the risk of potentially lethal infections, including undulant fever, hemolytic uremia, sepsis, and more. But it’s not your only choice. If you erase any ethical entanglement, impulse of social responsibility, nagging moral prohibition, and investment in human health, you could call milk pasteurization a good thing. In terms of volume of product output per production unit, pasteurization plays a crucial role in converting small family farms into perfectly efficient milk producers for the national brands: cheaper feed (silage and grain instead of fresh grass and hay), more cows per square foot, more “milk” per cow. That explains why big agribusiness roots for pasteurization. But how did the rest of us get convinced? Our fear of fresh milk can be traced to the energetic campaigning of a man named Charles North, who patented the first batch-processing pasteurization machine in 1907.
A skilled orator and savvy businessman, he traveled to small towns throughout the country, creating publicity and interest in his machines by claiming to have come directly from another small town, just like theirs, where people were dying from drinking unpasteurized milk. Of course, his claims were total fiction, and doctors were staunchly opposed to pasteurization. The facts were on their side. Unfortunately, North had something better—fear. And he milked that fear right into a small fortune. The pasteurization industry mushroomed from nonexistence into a major political presence. Today, at the University of Pennsylvania, where medical professors once protested that pasteurization “should never be had recourse to,” medical students are given lessons on the many health benefits of pasteurization. Whenever I have a patient who was raised on a farm, one who looks tough and boasts about how rarely they get sick, I ask them if they drank raw milk as a child. Nine times out of ten, they say yes. Every family dairyman I’ve talked to keeps raw milk around for his own family and happily testifies to its health benefits. Unlike meat or fruit or really any other food, milk is unique in that its one and only purpose is to nourish something else. Not only is it loaded with nutrients, it is engineered with an intricate microarchitecture that is key to enhancing digestive function while preventing the nourishing compounds from reacting with one another. Processing fundamentally alters this microarchitecture and diminishes nutritive value significantly. How much of a difference does this make? Enough that, based on their health and bone structure, I can guess with a high degree of accuracy which of my patients had access to raw milk as a child and which did not. Since 1948, when states began passing mandatory pasteurization laws, raw milk fans have waged a bitter battle against government intervention.
During hearings in which laws requiring pasteurization have been challenged, pasteurization proponents deny any nutritional difference between pasteurized, homogenized milk and raw. But as dairy scientists point out, heat denatures proteins, and homogenization explodes the fat droplets in milk. This is significant. Even to the naked eye, there’s a difference: Unlike cooked milk, the fresh product has a layer of cream floating at the top. But to fully understand how these two products differ, we need to bust out the microscope.
The Difference Between Fresh and Processed
If we put a drop of fresh milk on a slide, we see thousands of lipid droplets of varying size streaming under the cover slip, and maybe a living lactobacillus or two wiggling from edge to edge. These come from the cow’s udders, which, when well cared for, are colonized with beneficial bacteria, as is human skin. We want good bacteria in our milk. These probiotics protect both the milk and the milk consumer from pathogens. Good bacteria accomplish this by using the same bacterial communication techniques we read about in the section on fermentation. Using the powerful electron microscope, we can magnify milk 10,000,000 times. Now we can see casein micelles, which are amazingly complex. Imagine a mound of spaghetti and meatballs formed into a big round ball. The strands of spaghetti are made of protein (casein), and the meatballs are made of the most digestible form of calcium phosphate, called colloidal calcium phosphate, which holds the spaghetti strands together in a clump with its tiny electric charge. This clumping prevents sugar from reacting with and destroying milk’s essential amino acids. Each tiny globe of fat in the milk is enclosed inside a phospholipid membrane very similar to the membrane surrounding every cell in your body. The mammary gland cell that produced the fat droplet donated some of its membrane when the droplet exited the cell. This coating performs several tasks, starting in the milk duct, where it prevents fat droplets from coalescing and clogging up mom’s mammary passageways. The milk fat globule’s lipid bilayer is studded with a variety of specialized proteins, just like the living cells in your body. Some proteins protect the globule from bacterial infection, while others are tagged with short chains of sugars that may function as a signal to the intestinal cell that the contents are to be accepted without immune inspection, streamlining digestion.
Still others may act as intestinal cell growth factors, encouraging and directing intestinal cell growth and function. As long as the coating surrounds the milk fat globule, the fat is easily digested, the gallbladder doesn’t have to squeeze out any bile for the fat to be absorbed, the fatty acids inside the blob are isolated from the calcium in the casein micelles, and everything goes smoothly. But if calcium and fats come into contact with one another, as we’ll see in a moment, milk loses much of its capacity to deliver nutrients into your body. Let’s go back to the light microscope to take a look at pasteurized, homogenized milk and identify what distinguishes it from raw. One striking difference will be the homogeneity of fat globule sizes and the absence of living bacteria. But the real damage is hiding behind all this homogeneity and is only revealed under the electron microscope. Now we see that these fat blobs lack the sophisticated bilayer wrapping and are instead caked with minerals and tangled remnants of casein micelles. Why does it look like this? The heat of pasteurization forces the sugar to react with amino acids, denaturing the proteins and knocking the fragile colloidal calcium phosphate out of the spaghetti-and-meatballs matrix, while the denatured spaghetti strands tangle into a tight, hard knot. Homogenization squeezes the milk through tiny holes under intense pressure, destroying the architecture of the fat globules. Once these two processing steps have destroyed the natural architecture of milk, valuable nutrients react with each other, with health-damaging consequences. Processing can render milk highly irritating to the intestinal tract, and such a wide variety of chemical changes may occur that processed milk can lead to diarrhea or constipation. During processing, the nice, soft meatball of colloidal calcium phosphate fuses with the fatty acids to form a kind of milkfat soap.
This reaction, called saponification, irritates many people’s GI tracts and makes the calcium and phosphate much less bioavailable and more difficult to absorb. How difficult? Food conglomerates have a lot of influence on the direction of research funding, and the dairy industry is big business. Little wonder that no studies have been funded to compare the nutritional value of raw, whole cow’s milk to pasteurized head-to-head. But studies have been done on skim milk and human breast milk comparing fresh versus pasteurized, and the difference is dramatic: Processed milks contained anywhere from one half to one sixth the bioavailable minerals of the fresh products. When fresh, the milk fat globule carries signal molecules on its surface, which help your body recognize milk as a helpful substance as opposed to, say, an invasive bacterium. Processing demolishes those handy signals, and so, instead of getting a free pass into the intestinal cell, the curiously distorted signals slow the process of digestion down so much that it can lead to constipation. Heat destroys amino acids, especially the fragile essential amino acids, and so pasteurized milk contains less protein than fresh. But the damaged amino acids don’t just disappear; they have been glycated, oxidized, and transformed into compounds like N-carboxymethyl-lysine, malondialdehyde, and 4-hydroxynonenal—potential allergens and proinflammatory irritants. And there’s more. Many of the active enzymes in fresh milk designed to help streamline the digestive process have also been destroyed. Other enzymes, such as xanthine oxidase, which ordinarily protect the milk (but cause damage inside our arteries), can play stowaway within the artificially formed fat blobs and be absorbed. Normally our digestive system would chop up this enzyme and digest it. But hidden inside fat, it can be ingested whole and may retain some of its original activity.
Once in the body, xanthine oxidase can generate free radicals and lead to atherosclerosis and asthma. One more thing that makes raw milk special is the surface molecules on milk fat globule membranes, called gangliosides. Gangliosides inhibit harmful bacteria in the intestine. Once digested, they’ve been shown to stimulate neural development. Homogenization strips these benefits away. What does all this scientific data mean to you? It means that the processed milk you buy in the store is not milk, not really. If you can’t find a good source of fresh, unprocessed milk, what can you do? Get the next best thing: yoghurt made from organic, whole milk. The fermentation process rejuvenates damaged proteins and makes minerals more bioavailable. A breakfast of yoghurt, fresh fruit slices, and nuts is nutritionally far superior to cold cereal and processed milk. But if you aren’t ready to give up milk for breakfast, then get organic whole milk (not low fat), preferably from cows raised on pasture—not grain! Non-organic dairy may seem cheaper, but in reality you get far less nutrition for the dollar than you do with organic, because at least organically raised cows produce milk. The stuff that comes out of malnourished cows living in cement milk-factories hardly qualifies as such. Whatever you do, avoid soymilk. The primary difference between Yoo-hoo, a junk-food beverage snack sold in your local 7–11, and the soymilk sold in health food stores is that Yoo-hoo is flavored with chocolate.
Fresh Meat
Here in the US, white-gloved health department officials encourage us to cook our meat to death. Not because overcooked meat is tastier or more nutritious, but because our meat has generally been slaughtered days or weeks earlier in filthy conditions that enable pathogenic bacteria to proliferate all over its surface. Those, we must destroy with plenty of heat in order to be “safe.” If you are lucky enough to travel to Asia, Africa, or India, you might want to stop at one of those restaurants that keep chickens out back. Why do they do this? Because fresh meat is part of every world cuisine, and fresh meat can, when the animals are known to be healthy, safely be cooked rare. Juicy pinkness indicates the presence of far more nutrients than you can get when meat is overcooked. In the 1930s and ’40s, Dr. Francis Marion Pottenger conducted a ten-year experiment that gives us valuable insights into the potential long-term consequences of overcooking. Pottenger fed one group of cats raw meat and milk, and another group cooked meat and pasteurized milk. The all-raw cats produced ten generations of healthy and well-adjusted kittens. Not so the cats on the cooked diet. By the end of the first generation, they started to develop degenerative diseases and became “quite lazy.” The second generation developed degenerative diseases earlier in life and started losing their coordination. By the third generation, the cats had developed degenerative disease very early in life, and some were born blind and weak and died prematurely. There was an abundance of parasites and vermin in this group, and skin disease and allergies increased from an incidence of five percent in normal cats to over ninety percent in the third generation. Males became docile and females aggressive. By the fourth generation, litters were stillborn or so sickly they didn’t live to reach adulthood. This research prompted pet food manufacturers to add back some of the vitamins lost during heating.
Still, dried and canned pet food is nothing like the diets cats thrive on. Pottenger’s research highlights the importance of eating vitamin-rich, fresh meat. But if you don’t have access to the quality of meat that can safely be cooked rare, then it’s all the more important for you to make sure to get the freshest greens you can and eat them raw or gently cooked.
How The Four Pillars Will Make You Healthier
Whatever your age, whatever illnesses run in your family, whatever your “risk factors,” however many times you’ve tried to lose weight, build muscle, etc., eating the foods I’ve described in this chapter will transform your body. And if you are planning a baby, eating Four-Pillar foods before, during, and after conception, and then feeding them to your child as he or she grows up, will allow the genes in his or her body to express in ways yours may not have. Meat on the bone will bring enough of the glycosaminoglycan growth factors and bone-building minerals to make a child’s joints strong and their bones tough, enabling them to grow tall and excel in sports. In adulthood, these same factors will keep your joints well-lubricated and prevent aging bones from crumbling. No combination of supplements has the right balance of bioavailable minerals and collagen-derived growth factors to fortify your body as effectively as meat on the bone. Organ meats bring the vitamins and brain-building fats that can ensure children will have mental stability and an aptitude for learning, and continued consumption of these foods is the best way to guarantee that your brain cells and nerves stay healthy for the rest of your life. Because these nutrients deteriorate so rapidly, no pills can effectively encapsulate them. Fermented foods, full of probiotics, protect the intestinal tract from invading pathogens. Since a healthier intestine is more able to take in nutrients, probiotics may prevent infections and allergic disorders from developing elsewhere in the body, reducing the need for repeated doses of antibiotics.
Probiotics living in our intestine also produce all sorts of vitamins, which help to round out a diet that might otherwise be deficient. Sprouted foods enable you to enjoy your breads and breakfast porridges without consuming the empty calories that cause obesity and diabetes. And finally, fresh foods are naturally loaded with more antioxidants than can possibly survive the processes of drying, overcooking, or being stuffed into a capsule and bottled. This is just a brief look at the benefits imparted by the Four Pillars. People who aren’t connected to any culinary tradition don’t consume any of the Four Pillars as often as they should. If you build your diet on the foundation of the Four Pillars, and get regular exercise and plenty of sleep, you will immediately notice vast improvements in how you feel. Those differences will compound over the years to keep you looking young.
Two Steps to Perfect Health
The first half of this book provided information that has, I hope, convinced you that the source of incredible health and vitality is no mystery. Rather than leaving your fate in the hands of, well, fate, you can take control of your genetic destiny by feeding your body the same nutrients your ancestors depended on. There are only two steps to doing that. First, find the best ingredients grown on the richest soil in the most wholesome, sustainable manner. Second, ensure that your body can use those nutrients most efficiently by preparing the raw materials according to the Four Pillars of World Cuisine. When I say genetic destiny, I’m talking about your future and your children’s as well. As you remember from previous chapters, the building of a whole body from a single fertilized cell requires an optimum nutritional environment. Every event during the 9 1/2 months in-utero is a minor miracle requiring a wholesome, rich environment. No physiologic event is as dramatic as the transcription of epigenetic data from gametes to zygote. And therefore none is as dependent on good nutrients, or more vulnerable to the interference of toxins.
Two Ingredients to Avoid
Most people are aware of the harmful effects of the chemical residues left over from industrial farming, and of the preservatives and other agents with harmful physiologic effects. And those of us who care about our health do what we can to avoid them. These two ingredients are different. Not only does each one seem perfectly engineered to prevent our cells from functioning the way they should, they often appear as a tag-team duo, showing up in the same foods together. I’m talking about vegetable oils and sugar. I’m not saying that all the pollutants and toxins so often talked about aren’t hurting our health. They are. But vegetable oil and sugar are so nasty, and their use in processed foods so ubiquitous, that they have replaced nutrient-rich ingredients we would otherwise eat. That is why I place vegetable oil and sugar before all others, at the very top of my don’t-eat list. When traditional peoples wanted to send the message that certain foods were dangerous (or, in some cases, too special for non-royal persons), they’d place them on a do-not-eat list. In Hawaii, these foods were kapu, or forbidden. If they noticed that a food led to deleterious effects in newborns, then it would be kapu for expectant moms. Every indigenous society honored such a list; to ignore it could spell disaster for mother or child. Coming up, we’ll see why vegetable oil and sugar are the real culprits behind diseases most doctors blame on chance, or—even more absurdly—on the consumption of animal products that you need to eat to be healthy. Once you learn what they do inside your body, I hope you’ll put them both at the top of your family’s kapu list.
From Deep Nutrition: Why Your Genes Need Traditional Food, by Catherine Shanahan, MD, and Luke Shanahan, MFA (Big Box Books, USA, 2008), Chapter 7, pp. 121–165. Adapted and illustrated to be posted by Leopoldo Costa.
About the Book
Deep Nutrition illustrates how our ancestors used nourishment to sculpt their anatomy, engineering bodies of extraordinary health and beauty. The length of our limbs, the shape of our eyes, and the proper function of our organs are all gifts of our ancestors’ collective culinary wisdom. Citing the foods of traditional cultures from the Ancient Egyptians and the Maasai to the Japanese and the French, the Shanahans identify four food categories all the world’s healthiest diets have in common, the ‘Four Pillars of World Cuisine’. Using the latest research in physiology and genetics, Dr. Shanahan explains why your family’s health depends on eating these foods. In a world of competing nutritional ideologies, Deep Nutrition gives us the full picture, empowering us to take control of our destiny in ways we might never have imagined.