Monday, January 29, 2007

"The Greatest Health Secret"

A brief history

The beneficial properties of coral calcium were discovered in 1979, when researchers investigated an island off the coast of Japan whose inhabitants were reported to have unusual longevity. The water the inhabitants drank was highly mineralized by the live coral reefs surrounding the island, and the ionized minerals gave it a highly alkaline pH (8.0 to 8.6). Further studies showed that the water was very similar in chemical composition to bodily fluids and that it had healing and cleansing properties.

Ionizing calcium from foods and supplements

Calcium can be found in dairy products, green vegetables, and mineral deposits mined from the earth, such as lime (calcium carbonate) and fossilized coral. However, before these forms of calcium (or any other similarly derived minerals) can be used by any system of the body, they must first be broken down into ions. The hydrochloric acid in the stomach breaks these minerals down into a usable ionic state.

Fossilized coral vs. live coral

Unlike lime and fossilized coral, which are inert, live coral produces calcium in its ionized form. This calcium is able to bypass the digestive process, so calcium and other trace minerals from live coral are immediately available to the systems of the human body.

The key role of ionized calcium in the body

A key role of calcium is to neutralize acidic compounds (usually toxins) anywhere in the body before damage can take place. Once these acidic molecules are neutralized, the body can easily excrete them; otherwise, they begin to accumulate in the body, first in the connective tissue and later even in the organs and blood. Having a sufficient amount of ionized calcium on hand is critical to body cleansing.

Possible cause of poor calcium absorption

Studies have shown that the digestive acid produced by the stomach declines as one ages. The consequence is that essential minerals pass through the body without being used or, worse, form deposits that are impossible for the body to use. This is very likely related to the statistic that over 50% of Americans are calcium deficient by age 40, and over 90% by age 90.

Calcium deposits and stones

Medical research reveals that calcium deposits (bone spurs) in the joints are actually the result of abnormal bone-calcium metabolism stemming from a lack of ionized calcium in the diet. A deficiency of ionized minerals in the diet will cause the body’s pH to swing into the acid range. To buffer the overly acidic blood, an “emergency response” goes into effect and minerals are withdrawn from the body’s reserves, particularly the bones. This is a survival mechanism that “kicks in” when a person is starving or when undue stress is placed on a system; contributing calcium to the body’s fluids is not a normal function of the bones. Although calcium drawn from the bones succeeds in buffering (neutralizing) the acidic bloodstream, it also encourages stones to form in the liver, gallbladder and kidneys. Ionized coral calcium supplementation addresses the deficiency within the body without forming deposits.

Fighting Cancer

Researchers in the 1940s and 1950s noticed that people with cancer, arthritis, and other degenerative diseases were all suffering from calcium deficiency. One doctor found that if he gave his patients highly soluble (ionized) forms of calcium, their bodies would produce mono-ortho-calcium phosphate. This preferred form of calcium would succeed in raising the pH back into the proper alkaline range. Once the body’s tissue acidity was reduced, he found the cancer could not survive.

Every cancer researcher knows that cancer cannot survive in an alkaline environment. The human body, notwithstanding stomach acid and waste matter, is alkaline by design. Even an increase of a tenth of a point in pH can exponentially increase the oxygen capacity of the body’s tissues.

The Greatest Health Secret

The human body has tremendous healing power when it is properly nourished and kept in an alkaline pH range. If the body’s pH becomes acidic, it becomes harder and harder to take in and use nutrients such as minerals, vitamins and herbs; this hostile internal environment, like a raging storm, leaves the body struggling to get the nutrients it needs for optimal health.

The more acidic you become, the worse you feel.

The first step is to re-alkalize the body and keep it in the optimal pH range by regularly taking ionized coral minerals. As the body’s pH comes into better balance (with a pH above 7.0), it is like a sunny spring day with flowers blooming; your body can easily assimilate minerals and other nutrients that were very difficult to absorb at a lower, more acidic pH, and the road to healing is paved!

© 2001-2006 Baumeister LLC

Unhappy Meals

The New York Times
January 28, 2007

Eat food. Not too much. Mostly plants.

That, more or less, is the short answer to the supposedly incredibly complicated and confusing question of what we humans should eat in order to be maximally healthy. I hate to give away the game right here at the beginning of a long essay, and I confess that I’m tempted to complicate matters in the interest of keeping things going for a few thousand more words. I’ll try to resist but will go ahead and add a couple more details to flesh out the advice. Like: A little meat won’t kill you, though it’s better approached as a side dish than as a main. And you’re much better off eating whole fresh foods than processed food products. That’s what I mean by the recommendation to eat “food.” Once, food was all you could eat, but today there are lots of other edible foodlike substances in the supermarket. These novel products of food science often come in packages festooned with health claims, which brings me to a related rule of thumb: if you’re concerned about your health, you should probably avoid food products that make health claims. Why? Because a health claim on a food product is a good indication that it’s not really food, and food is what you want to eat.

Uh-oh. Things are suddenly sounding a little more complicated, aren’t they? Sorry. But that’s how it goes as soon as you try to get to the bottom of the whole vexing question of food and health. Before long, a dense cloud bank of confusion moves in. Sooner or later, everything solid you thought you knew about the links between diet and health gets blown away in the gust of the latest study.

Last winter came the news that a low-fat diet, long believed to protect against breast cancer, may do no such thing — this from the monumental, federally financed Women’s Health Initiative, which has also found no link between a low-fat diet and rates of coronary disease. The year before, we learned that dietary fiber might not, as we had been confidently told, help prevent colon cancer. Just last fall, two prestigious studies on omega-3 fats, published at the same time, presented us with strikingly different conclusions. While the Institute of Medicine stated that “it is uncertain how much these omega-3s contribute to improving health” (and they might do the opposite if you get them from mercury-contaminated fish), a Harvard study declared that simply by eating a couple of servings of fish each week (or by downing enough fish oil), you could cut your risk of dying from a heart attack by more than a third — a stunningly hopeful piece of news. It’s no wonder that omega-3 fatty acids are poised to become the oat bran of 2007, as food scientists micro-encapsulate fish oil and algae oil and blast them into such formerly all-terrestrial foods as bread and tortillas, milk and yogurt and cheese, all of which will soon, you can be sure, sprout fishy new health claims. (Remember the rule?)

By now you’re probably registering the cognitive dissonance of the supermarket shopper or science-section reader, as well as some nostalgia for the simplicity and solidity of the first few sentences of this essay. Which I’m still prepared to defend against the shifting winds of nutritional science and food-industry marketing. But before I do that, it might be useful to figure out how we arrived at our present state of nutritional confusion and anxiety.

The story of how the most basic questions about what to eat ever got so complicated reveals a great deal about the institutional imperatives of the food industry, nutritional science and — ahem — journalism, three parties that stand to gain much from widespread confusion surrounding what is, after all, the most elemental question an omnivore confronts. Humans deciding what to eat without expert help — something they have been doing with notable success since coming down out of the trees — is seriously unprofitable if you’re a food company, distinctly risky if you’re a nutritionist and just plain boring if you’re a newspaper editor or journalist. (Or, for that matter, an eater. Who wants to hear, yet again, “Eat more fruits and vegetables”?) And so, like a large gray fog, a great Conspiracy of Confusion has gathered around the simplest questions of nutrition — much to the advantage of everybody involved. Except perhaps the ostensible beneficiary of all this nutritional expertise and advice: us, and our health and happiness as eaters.

FROM FOODS TO NUTRIENTS

It was in the 1980s that food began disappearing from the American supermarket, gradually to be replaced by “nutrients,” which are not the same thing. Where once the familiar names of recognizable comestibles — things like eggs or breakfast cereal or cookies — claimed pride of place on the brightly colored packages crowding the aisles, now new terms like “fiber” and “cholesterol” and “saturated fat” rose to large-type prominence. More important than mere foods, the presence or absence of these invisible substances was now generally believed to confer health benefits on their eaters. Foods by comparison were coarse, old-fashioned and decidedly unscientific things — who could say what was in them, really? But nutrients — those chemical compounds and minerals in foods that nutritionists have deemed important to health — gleamed with the promise of scientific certainty; eat more of the right ones, fewer of the wrong, and you would live longer and avoid chronic diseases.

Nutrients themselves had been around, as a concept, since the early 19th century, when the English doctor and chemist William Prout identified what came to be called the “macronutrients”: protein, fat and carbohydrates. It was thought that that was pretty much all there was going on in food, until doctors noticed that an adequate supply of the big three did not necessarily keep people nourished. At the end of the 19th century, British doctors were puzzled by the fact that Chinese laborers in the Malay states were dying of a disease called beriberi, which didn’t seem to afflict Tamils or native Malays. The mystery was solved when someone pointed out that the Chinese ate “polished,” or white, rice, while the others ate rice that hadn’t been mechanically milled. A few years later, Casimir Funk, a Polish chemist, discovered the “essential nutrient” in rice husks that protected against beriberi and called it a “vitamine,” the first micronutrient. Vitamins brought a kind of glamour to the science of nutrition, and though certain sectors of the population began to eat by its expert lights, it really wasn’t until late in the 20th century that nutrients managed to push food aside in the popular imagination of what it means to eat.

No single event marked the shift from eating food to eating nutrients, though in retrospect a little-noticed political dust-up in Washington in 1977 seems to have helped propel American food culture down this dimly lighted path. Responding to an alarming increase in chronic diseases linked to diet — including heart disease, cancer and diabetes — a Senate Select Committee on Nutrition, headed by George McGovern, held hearings on the problem and prepared what by all rights should have been an uncontroversial document called “Dietary Goals for the United States.” The committee learned that while rates of coronary heart disease had soared in America since World War II, other cultures that consumed traditional diets based largely on plants had strikingly low rates of chronic disease. Epidemiologists also had observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease temporarily plummeted.

Naïvely putting two and two together, the committee drafted a straightforward set of dietary guidelines calling on Americans to cut down on red meat and dairy products. Within weeks a firestorm, emanating from the red-meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat. The committee’s recommendations were hastily rewritten. Plain talk about food — the committee had advised Americans to actually “reduce consumption of meat” — was replaced by artful compromise: “Choose meats, poultry and fish that will reduce saturated-fat intake.”

A subtle change in emphasis, you might say, but a world of difference just the same. First, the stark message to “eat less” of a particular food has been deep-sixed; don’t look for it ever again in any official U.S. dietary pronouncement. Second, notice how distinctions between entities as different as fish and beef and chicken have collapsed; those three venerable foods, each representing an entirely different taxonomic class, are now lumped together as delivery systems for a single nutrient. Notice too how the new language exonerates the foods themselves; now the culprit is an obscure, invisible, tasteless — and politically unconnected — substance that may or may not lurk in them called “saturated fat.”

The linguistic capitulation did nothing to rescue McGovern from his blunder; the very next election, in 1980, the beef lobby helped rusticate the three-term senator, sending an unmistakable warning to anyone who would challenge the American diet, and in particular the big chunk of animal protein sitting in the middle of its plate. Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, and would instead arrive clothed in scientific euphemism and speaking of nutrients, entities that few Americans really understood but that lack powerful lobbies in Washington. This was precisely the tack taken by the National Academy of Sciences when it issued its landmark report on diet and cancer in 1982. Organized nutrient by nutrient in a way guaranteed to offend no food group, it codified the official new dietary language. Industry and media followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids and carotenes soon colonized much of the cultural space previously occupied by the tangible substance formerly known as food. The Age of Nutritionism had arrived.

THE RISE OF NUTRITIONISM

The first thing to understand about nutritionism — I first encountered the term in the work of an Australian sociologist of science named Gyorgy Scrinis — is that it is not quite the same as nutrition. As the “ism” suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it’s exerting its hold on your culture. A reigning ideology is a little like the weather, all pervasive and virtually inescapable. Still, we can try.

In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. From this basic premise flow several others. Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists speak) to explain the hidden reality of foods to us. To enter a world in which you dine on unseen nutrients, you need lots of expert help.

But expert help to do what, exactly? This brings us to another unexamined assumption: that the whole point of eating is to maintain and promote bodily health. Hippocrates’s famous injunction to “let food be thy medicine” is ritually invoked to support this notion. I’ll leave the premise alone for now, except to point out that it is not shared by all cultures and that the experience of these other cultures suggests that, paradoxically, viewing food as being about things other than bodily health — like pleasure, say, or socializing — makes people no less healthy; indeed, there’s some reason to believe that it may make them more healthy. This is what we usually have in mind when we speak of the “French paradox” — the fact that a population that eats all sorts of unhealthful nutrients is in many ways healthier than we Americans are. So there is at least a question as to whether nutritionism is actually any good for you.

Another potentially serious weakness of nutritionist ideology is that it has trouble discerning qualitative distinctions between foods. So fish, beef and chicken through the nutritionists’ lens become mere delivery systems for varying quantities of fats and proteins and whatever other nutrients are on their scope. Similarly, any qualitative distinctions between processed foods and whole foods disappear when your focus is on quantifying the nutrients they contain (or, more precisely, the known nutrients).

This is a great boon for manufacturers of processed food, and it helps explain why they have been so happy to get with the nutritionism program. In the years following McGovern’s capitulation and the 1982 National Academy report, the food industry set about re-engineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and less of the bad, and by the late ’80s a golden era of food science was upon us. The Year of Eating Oat Bran — also known as 1988 — served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America. Oat bran’s moment on the dietary stage didn’t last long, but the pattern had been established, and every few years since then a new oat bran has taken its turn under the marketing lights. (Here comes omega-3!)

By comparison, the typical real food has more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can’t easily change its nutritional stripes (though rest assured the genetic engineers are hard at work on the problem). So far, at least, you can’t put oat bran in a banana. So depending on the reigning nutritional orthodoxy, the avocado might be either a high-fat food to be avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate of each whole food rises and falls with every change in the nutritional weather, while the processed foods are simply reformulated. That’s why when the Atkins mania hit the food industry, bread and pasta were given a quick redesign (dialing back the carbs; boosting the protein), while the poor unreconstructed potatoes and carrots were left out in the cold.

Of course it’s also a lot easier to slap a health claim on a box of sugary cereal than on a potato or carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over, the Cocoa Puffs and Lucky Charms are screaming about their newfound whole-grain goodness.

EAT RIGHT, GET FATTER

So nutritionism is good for business. But is it good for us? You might think that a national fixation on nutrients would lead to measurable improvements in the public health. But for that to happen, the underlying nutritional science, as well as the policy recommendations (and the journalism) based on that science, would have to be sound. This has seldom been the case.

Consider what happened immediately after the 1977 “Dietary Goals” — McGovern’s masterpiece of politico-nutritionist compromise. In the wake of the panel’s recommendation that we cut down on saturated fat, a recommendation seconded by the 1982 National Academy report on cancer, Americans did indeed change their diets, endeavoring for a quarter-century to do what they had been told. Well, kind of. The industrial food supply was promptly reformulated to reflect the official advice, giving us low-fat pork, low-fat Snackwell’s and all the low-fat pasta and high-fructose (yet low-fat!) corn syrup we could consume. Which turned out to be quite a lot. Oddly, America got really fat on its new low-fat diet — indeed, many date the current obesity and diabetes epidemic to the late 1970s, when Americans began binging on carbohydrates, ostensibly as a way to avoid the evils of fat.

This story has been told before, notably in these pages (“What if It’s All Been a Big Fat Lie?” by Gary Taubes, July 7, 2002), but it’s a little more complicated than the official version suggests. In that version, which inspired the most recent Atkins craze, we were told that America got fat when, responding to bad scientific advice, it shifted its diet from fats to carbs, suggesting that a re-evaluation of the two nutrients is in order: fat doesn’t make you fat; carbs do. (Why this should have come as news is a mystery: as long as people have been raising animals for food, they have fattened them on carbs.)

But there are a couple of problems with this revisionist picture. First, while it is true that Americans post-1977 did begin binging on carbs, and that fat as a percentage of total calories in the American diet declined, we never did in fact cut down on our consumption of fat. Meat consumption actually climbed. We just heaped a bunch more carbs onto our plates, obscuring perhaps, but not replacing, the expanding chunk of animal protein squatting in the center.

How did that happen? I would submit that the ideology of nutritionism deserves as much of the blame as the carbohydrates themselves do — that and human nature. By framing dietary advice in terms of good and bad nutrients, and by burying the recommendation that we should eat less of any particular food, it was easy for the take-home message of the 1977 and 1982 dietary guidelines to be simplified as follows: Eat more low-fat foods. And that is what we did. We’re always happy to receive a dispensation to eat more of something (with the possible exception of oat bran), and one of the things nutritionism reliably gives us is some such dispensation: low-fat cookies then, low-carb beer now. It’s hard to imagine the low-fat craze taking off as it did if McGovern’s original food-based recommendations had stood: eat fewer meat and dairy products. For how do you get from that stark counsel to the idea that another case of Snackwell’s is just what the doctor ordered?

BAD SCIENCE

But if nutritionism leads to a kind of false consciousness in the mind of the eater, the ideology can just as easily mislead the scientist. Most nutritional science involves studying one nutrient at a time, an approach that even nutritionists who do it will tell you is deeply flawed. “The problem with nutrient-by-nutrient nutrition science,” points out Marion Nestle, the New York University nutritionist, “is that it takes the nutrient out of the context of food, the food out of the context of diet and the diet out of the context of lifestyle.”

If nutritional scientists know this, why do they do it anyway? Because a nutrient bias is built into the way science is done: scientists need individual variables they can isolate. Yet even the simplest food is a hopelessly complex thing to study, a virtual wilderness of chemical compounds, many of which exist in complex and dynamic relation to one another, and all of which together are in the process of changing from one state to another. So if you’re a nutritional scientist, you do the only thing you can do, given the tools at your disposal: break the thing down into its component parts and study those one by one, even if that means ignoring complex interactions and contexts, as well as the fact that the whole may be more than, or just different from, the sum of its parts. This is what we mean by reductionist science.

Scientific reductionism is an undeniably powerful tool, but it can mislead us too, especially when applied to something as complex as, on the one side, a food, and on the other, a human eater. It encourages us to take a mechanistic view of that transaction: put in this nutrient; get out that physiological result. Yet people differ in important ways. Some populations can metabolize sugars better than others; depending on your evolutionary heritage, you may or may not be able to digest the lactose in milk. The specific ecology of your intestines helps determine how efficiently you digest what you eat, so that the same input of 100 calories may yield more or less energy depending on the proportion of Firmicutes and Bacteroidetes living in your gut. There is nothing very machinelike about the human eater, and so to think of food as simply fuel is wrong.

Also, people don’t eat nutrients, they eat foods, and foods can behave very differently than the nutrients they contain. Researchers have long believed, based on epidemiological comparisons of different populations, that a diet high in fruits and vegetables confers some protection against cancer. So naturally they ask, What nutrients in those plant foods are responsible for that effect? One hypothesis is that the antioxidants in fresh produce — compounds like beta carotene, lycopene, vitamin E, etc. — are the X factor. It makes good sense: these molecules (which plants produce to protect themselves from the highly reactive oxygen atoms produced in photosynthesis) vanquish the free radicals in our bodies, which can damage DNA and initiate cancers. At least that’s how it seems to work in the test tube. Yet as soon as you remove these useful molecules from the context of the whole foods they’re found in, as we’ve done in creating antioxidant supplements, they don’t work at all. Indeed, in the case of beta carotene ingested as a supplement, scientists have discovered that it actually increases the risk of certain cancers. Big oops.

What’s going on here? We don’t know. It could be the vagaries of human digestion. Maybe the fiber (or some other component) in a carrot protects the antioxidant molecules from destruction by stomach acids early in the digestive process. Or it could be that we isolated the wrong antioxidant. Beta is just one of a whole slew of carotenes found in common vegetables; maybe we focused on the wrong one. Or maybe beta carotene works as an antioxidant only in concert with some other plant chemical or process; under other circumstances, it may behave as a pro-oxidant.

Indeed, to look at the chemical composition of any common food plant is to realize just how much complexity lurks within it. Here’s a list of just the antioxidants that have been identified in garden-variety thyme:

4-Terpineol, alanine, anethole, apigenin, ascorbic acid, beta carotene, caffeic acid, camphene, carvacrol, chlorogenic acid, chrysoeriol, eriodictyol, eugenol, ferulic acid, gallic acid, gamma-terpinene, isochlorogenic acid, isoeugenol, isothymonin, kaempferol, labiatic acid, lauric acid, linalyl acetate, luteolin, methionine, myrcene, myristic acid, naringenin, oleanolic acid, p-coumaric acid, p-hydroxy-benzoic acid, palmitic acid, rosmarinic acid, selenium, tannin, thymol, tryptophan, ursolic acid, vanillic acid.

This is what you’re ingesting when you eat food flavored with thyme. Some of these chemicals are broken down by your digestion, but others are going on to do undetermined things to your body: turning some gene’s expression on or off, perhaps, or heading off a free radical before it disturbs a strand of DNA deep in some cell. It would be great to know how this all works, but in the meantime we can enjoy thyme in the knowledge that it probably doesn’t do any harm (since people have been eating it forever) and that it may actually do some good (since people have been eating it forever) and that even if it does nothing, we like the way it tastes.

It’s also important to remind ourselves that what reductive science can manage to perceive well enough to isolate and study is subject to change, and that we have a tendency to assume that what we can see is all there is to see. When William Prout isolated the big three macronutrients, scientists figured they now understood food and what the body needs from it; when the vitamins were isolated a few decades later, scientists thought, O.K., now we really understand food and what the body needs to be healthy; today it’s the polyphenols and carotenoids that seem all-important. But who knows what the hell else is going on deep in the soul of a carrot?

The good news is that, to the carrot eater, it doesn’t matter. That’s the great thing about eating food as compared with nutrients: you don’t need to fathom a carrot’s complexity to reap its benefits.

The case of the antioxidants points up the dangers in taking a nutrient out of the context of food; as Nestle suggests, scientists make a second, related error when they study the food out of the context of the diet. We don’t eat just one thing, and when we are eating any one thing, we’re not eating another. We also eat foods in combinations and in orders that can affect how they’re absorbed. Drink coffee with your steak, and your body won’t be able to fully absorb the iron in the meat. The trace of limestone in the corn tortilla unlocks essential amino acids in the corn that would otherwise remain unavailable. Some of those compounds in that sprig of thyme may well affect my digestion of the dish I add it to, helping to break down one compound or possibly stimulate production of an enzyme to detoxify another. We have barely begun to understand the relationships among foods in a cuisine.

But we do understand some of the simplest relationships, like the zero-sum relationship: that if you eat a lot of meat you’re probably not eating a lot of vegetables. This simple fact may explain why populations that eat diets high in meat have higher rates of coronary heart disease and cancer than those that don’t. Yet nutritionism encourages us to look elsewhere for the explanation: deep within the meat itself, to the culpable nutrient, which scientists have long assumed to be the saturated fat. So they are baffled when large-population studies, like the Women’s Health Initiative, fail to find that reducing fat intake significantly reduces the incidence of heart disease or cancer.

Of course thanks to the low-fat fad (inspired by the very same reductionist fat hypothesis), it is entirely possible to reduce your intake of saturated fat without significantly reducing your consumption of animal protein: just drink the low-fat milk and order the skinless chicken breast or the turkey bacon. So maybe the culprit nutrient in meat and dairy is the animal protein itself, as some researchers now hypothesize. (The Cornell nutritionist T. Colin Campbell argues as much in his recent book, “The China Study.”) Or, as the Harvard epidemiologist Walter C. Willett suggests, it could be the steroid hormones typically present in the milk and meat; these hormones (which occur naturally in meat and milk but are often augmented in industrial production) are known to promote certain cancers.

But people worried about their health needn’t wait for scientists to settle this question before deciding that it might be wise to eat more plants and less meat. This is of course precisely what the McGovern committee was trying to tell us.

Nestle also cautions against taking the diet out of the context of the lifestyle. The Mediterranean diet is widely believed to be one of the most healthful ways to eat, yet much of what we know about it is based on studies of people living on the island of Crete in the 1950s, who in many respects lived lives very different from our own. Yes, they ate lots of olive oil and little meat. But they also did more physical labor. They fasted regularly. They ate a lot of wild greens — weeds. And, perhaps most important, they consumed far fewer total calories than we do. Similarly, much of what we know about the health benefits of a vegetarian diet is based on studies of Seventh Day Adventists, who muddy the nutritional picture by drinking absolutely no alcohol and never smoking. These extraneous but unavoidable factors are called, aptly, “confounders.” One last example: People who take supplements are healthier than the population at large, but their health probably has nothing whatsoever to do with the supplements they take — which recent studies have suggested are worthless. Supplement-takers are better-educated, more-affluent people who, almost by definition, take a greater-than-normal interest in personal health — confounding factors that probably account for their superior health.

But if confounding factors of lifestyle bedevil comparative studies of different populations, the supposedly more rigorous “prospective” studies of large American populations suffer from their own arguably even more disabling flaws. In these studies — of which the Women’s Health Initiative is the best known — a large population is divided into two groups. The intervention group changes its diet in some prescribed manner, while the control group does not. The two groups are then tracked over many years to learn whether the intervention affects relative rates of chronic disease.

When it comes to studying nutrition, this sort of extensive, long-term clinical trial is supposed to be the gold standard. It certainly sounds sound. In the case of the Women’s Health Initiative, sponsored by the National Institutes of Health, the eating habits and health outcomes of nearly 49,000 women (ages 50 to 79 at the beginning of the study) were tracked for eight years. One group of the women was told to reduce their consumption of fat to 20 percent of total calories. The results were announced early last year, producing front-page headlines of which the one in this newspaper was typical: “Low-Fat Diet Does Not Cut Health Risks, Study Finds.” And the cloud of nutritional confusion over the country darkened.

But even a cursory analysis of the study’s methods makes you wonder why anyone would take such a finding seriously, let alone order a Quarter Pounder With Cheese to celebrate it, as many newspaper readers no doubt promptly went out and did. Even the beginner student of nutritionism will immediately spot several flaws: the focus was on “fat,” rather than on any particular food, like meat or dairy. So women could comply simply by switching to lower-fat animal products. Also, no distinctions were made between types of fat: women getting their allowable portion of fat from olive oil or fish were lumped together with women getting their fat from low-fat cheese or chicken breasts or margarine. Why? Because when the study was designed 16 years ago, the whole notion of “good fats” was not yet on the scientific scope. Scientists study what scientists can see.

But perhaps the biggest flaw in this study, and other studies like it, is that we have no idea what these women were really eating because, like most people when asked about their diet, they lied about it. How do we know this? Deduction. Consider: When the study began, the average participant weighed in at 170 pounds and claimed to be eating 1,800 calories a day. It would take an unusual metabolism to maintain that weight on so little food. And it would take an even freakier metabolism to drop only one or two pounds after getting down to a diet of 1,400 to 1,500 calories a day — as the women on the “low-fat” regimen claimed to have done. Sorry, ladies, but I just don’t buy it.

In fact, nobody buys it. Even the scientists who conduct this sort of research conduct it in the knowledge that people lie about their food intake all the time. They even have scientific figures for the magnitude of the lie. Dietary trials like the Women’s Health Initiative rely on “food-frequency questionnaires,” and studies suggest that people on average eat between a fifth and a third more than they claim to on the questionnaires. How do the researchers know that? By comparing what people report on questionnaires with interviews about their dietary intake over the previous 24 hours, thought to be somewhat more reliable. In fact, the magnitude of the lie could be much greater, judging by the huge disparity between the total number of food calories produced every day for each American (3,900 calories) and the average number of those calories Americans own up to chomping: 2,000. (Waste accounts for some of the disparity, but nowhere near all of it.) All we really know about how much people actually eat is that the real number lies somewhere between those two figures.
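
As a rough illustration of the arithmetic in the preceding paragraph, here is a minimal back-of-the-envelope sketch (in Python, purely for illustration) that uses only the figures cited above: a self-reported average of about 2,000 calories, underreporting of roughly a fifth to a third, and a per-capita food supply of 3,900 calories a day. The bounds it prints are only as good as those figures.

```python
# Back-of-the-envelope bounds on actual daily calorie intake, using only
# the figures cited in the passage above (illustrative arithmetic, not new data).

reported = 2000             # average self-reported intake, kcal/day
underreport = (0.20, 0.33)  # people eat roughly a fifth to a third more than they claim
supply = 3900               # food calories produced per American per day, some of it wasted

low = reported * (1 + underreport[0])   # roughly 2,400 kcal/day
high = reported * (1 + underreport[1])  # roughly 2,660 kcal/day

print(f"Survey-corrected intake: about {low:.0f} to {high:.0f} kcal/day")
print(f"Ceiling set by the food supply (before waste): {supply} kcal/day")
```

Either way, actual intake lands well above what the questionnaires record and well below the 3,900-calorie supply figure — somewhere between the two, which is all we really know.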

To try to fill out the food-frequency questionnaire used by the Women’s Health Initiative, as I recently did, is to realize just how shaky the data on which such trials rely really are. The survey, which took about 45 minutes to complete, started off with some relatively easy questions: “Did you eat chicken or turkey during the last three months?” Having answered yes, I was then asked, “When you ate chicken or turkey, how often did you eat the skin?” But the survey soon became harder, as when it asked me to think back over the past three months to recall whether when I ate okra, squash or yams, they were fried, and if so, were they fried in stick margarine, tub margarine, butter, “shortening” (in which category they inexplicably lump together hydrogenated vegetable oil and lard), olive or canola oil or nonstick spray? I honestly didn’t remember, and in the case of any okra eaten in a restaurant, even a hypnotist could not get out of me what sort of fat it was fried in. In the meat section, the portion sizes specified haven’t been seen in America since the Hoover administration. If a four-ounce portion of steak is considered “medium,” was I really going to admit that the steak I enjoyed on an unrecallable number of occasions during the past three months was probably the equivalent of two or three (or, in the case of a steakhouse steak, no less than four) of these portions? I think not. In fact, most of the “medium serving sizes” to which I was asked to compare my own consumption made me feel piggish enough to want to shave a few ounces here, a few there. (I mean, I wasn’t under oath or anything, was I?)

This is the sort of data on which the largest questions of diet and health are being decided in America today.

THE ELEPHANT IN THE ROOM

In the end, the biggest, most ambitious and widely reported studies of diet and health leave more or less undisturbed the main features of the Western diet: lots of meat and processed foods, lots of added fat and sugar, lots of everything — except fruits, vegetables and whole grains. In keeping with the nutritionism paradigm and the limits of reductionist science, the researchers fiddle with single nutrients as best they can, but the populations they recruit and study are typical American eaters doing what typical American eaters do: trying to eat a little less of this nutrient, a little more of that, depending on the latest thinking. (One problem with the control groups in these studies is that they too are exposed to nutritional fads in the culture, so over time their eating habits come to more closely resemble the habits of the intervention group.) It should not surprise us that the findings of such research would be so equivocal and confusing.

But what about the elephant in the room — the Western diet? It might be useful, in the midst of our deepening confusion about nutrition, to review what we do know about diet and health. What we know is that people who eat the way we do in America today suffer much higher rates of cancer, heart disease, diabetes and obesity than people eating more traditional diets. (Four of the 10 leading killers in America are linked to diet.) Further, we know that simply by moving to America, people from nations with low rates of these “diseases of affluence” will quickly acquire them. Nutritionism by and large takes the Western diet as a given, seeking to moderate its most deleterious effects by isolating the bad nutrients in it — things like fat, sugar, salt — and encouraging the public and the food industry to limit them. But after several decades of nutrient-based health advice, rates of cancer and heart disease in the U.S. have declined only slightly (mortality from heart disease is down since the ’50s, but this is mainly because of improved treatment), and rates of obesity and diabetes have soared.

No one likes to admit that his or her best efforts at understanding and solving a problem have actually made the problem worse, but that’s exactly what has happened in the case of nutritionism. Scientists operating with the best of intentions, using the best tools at their disposal, have taught us to look at food in a way that has diminished our pleasure in eating it while doing little or nothing to improve our health. Perhaps what we need now is a broader, less reductive view of what food is, one that is at once more ecological and cultural. What would happen, for example, if we were to start thinking about food as less of a thing and more of a relationship?

In nature, that is of course precisely what eating has always been: relationships among species in what we call food chains, or webs, that reach all the way down to the soil. Species co-evolve with the other species they eat, and very often a relationship of interdependence develops: I’ll feed you if you spread around my genes. A gradual process of mutual adaptation transforms something like an apple or a squash into a nutritious and tasty food for a hungry animal. Over time and through trial and error, the plant becomes tastier (and often more conspicuous) in order to gratify the animal’s needs and desires, while the animal gradually acquires whatever digestive tools (enzymes, etc.) are needed to make optimal use of the plant. Similarly, cow’s milk did not start out as a nutritious food for humans; in fact, it made them sick until humans who lived around cows evolved the ability to digest lactose as adults. This development proved much to the advantage of both the milk drinkers and the cows.

“Health” is, among other things, the byproduct of being involved in these sorts of relationships in a food chain — involved in a great many of them, in the case of an omnivorous creature like us. Further, when the health of one link of the food chain is disturbed, it can affect all the creatures in it. When the soil is sick or in some way deficient, so will be the grasses that grow in that soil and the cattle that eat the grasses and the people who drink the milk. Or, as the English agronomist Sir Albert Howard put it in 1945 in “The Soil and Health” (a founding text of organic agriculture), we would do well to regard “the whole problem of health in soil, plant, animal and man as one great subject.” Our personal health is inextricably bound up with the health of the entire food web.

In many cases, long familiarity between foods and their eaters leads to elaborate systems of communications up and down the food chain, so that a creature’s senses come to recognize foods as suitable by taste and smell and color, and our bodies learn what to do with these foods after they pass the test of the senses, producing in anticipation the chemicals necessary to break them down. Health depends on knowing how to read these biological signals: this smells spoiled; this looks ripe; that’s one good-looking cow. This is easier to do when a creature has long experience of a food, and much harder when a food has been designed expressly to deceive its senses — with artificial flavors, say, or synthetic sweeteners.

Note that these ecological relationships are between eaters and whole foods, not nutrients. Even though the foods in question eventually get broken down in our bodies into simple nutrients, as corn is reduced to simple sugars, the qualities of the whole food are not unimportant — they govern such things as the speed at which the sugars will be released and absorbed, which we’re coming to see as critical to insulin metabolism. Put another way, our bodies have a longstanding and sustainable relationship to corn that we do not have to high-fructose corn syrup. Such a relationship with corn syrup might develop someday (as people evolve superhuman insulin systems to cope with regular floods of fructose and glucose), but for now the relationship leads to ill health because our bodies don’t know how to handle these biological novelties. In much the same way, human bodies that can cope with chewing coca leaves — a longstanding relationship between native people and the coca plant in South America — cannot cope with cocaine or crack, even though the same “active ingredients” are present in all three. Reductionism as a way of understanding food or drugs may be harmless, even necessary, but reductionism in practice can lead to problems.

Looking at eating through this ecological lens opens a whole new perspective on exactly what the Western diet is: a radical and rapid change not just in our foodstuffs over the course of the 20th century but also in our food relationships, all the way from the soil to the meal. The ideology of nutritionism is itself part of that change. To get a firmer grip on the nature of those changes is to begin to know how we might make our relationships to food healthier. These changes have been numerous and far-reaching, but consider as a start these four large-scale ones:

From Whole Foods to Refined. The case of corn points up one of the key features of the modern diet: a shift toward increasingly refined foods, especially carbohydrates. Call it applied reductionism. Humans have been refining grains since at least the Industrial Revolution, favoring white flour (and white rice) even at the price of lost nutrients. Refining grains extends their shelf life (precisely because it renders them less nutritious to pests) and makes them easier to digest, by removing the fiber that ordinarily slows the release of their sugars. Much industrial food production involves an extension and intensification of this practice, as food processors find ways to deliver glucose — the brain’s preferred fuel — ever more swiftly and efficiently. Sometimes this is precisely the point, as when corn is refined into corn syrup; other times it is an unfortunate byproduct of food processing, as when freezing food destroys the fiber that would slow sugar absorption.

So fast food is fast in this other sense too: it is to a considerable extent predigested, in effect, and therefore more readily absorbed by the body. But while the widespread acceleration of the Western diet offers us the instant gratification of sugar, in many people (and especially those newly exposed to it) the “speediness” of this food overwhelms the insulin response and leads to Type II diabetes. As one nutrition expert put it to me, we’re in the middle of “a national experiment in mainlining glucose.” To encounter such a diet for the first time, as when people accustomed to a more traditional diet come to America, or when fast food comes to their countries, delivers a shock to the system. Public-health experts call it “the nutrition transition,” and it can be deadly.

From Complexity to Simplicity. If there is one word that covers nearly all the changes industrialization has made to the food chain, it would be simplification. Chemical fertilizers simplify the chemistry of the soil, which in turn appears to simplify the chemistry of the food grown in that soil. Since the widespread adoption of synthetic nitrogen fertilizers in the 1950s, the nutritional quality of produce in America has, according to U.S.D.A. figures, declined significantly. Some researchers blame the quality of the soil for the decline; others cite the tendency of modern plant breeding to select for industrial qualities like yield rather than nutritional quality. Whichever it is, the trend toward simplification of our food continues on up the chain. Processing foods depletes them of many nutrients, a few of which are then added back in through “fortification”: folic acid in refined flour, vitamins and minerals in breakfast cereal. But food scientists can add back only the nutrients food scientists recognize as important. What are they overlooking?

Simplification has occurred at the level of species diversity, too. The astounding variety of foods on offer in the modern supermarket obscures the fact that the actual number of species in the modern diet is shrinking. For reasons of economics, the food industry prefers to tease its myriad processed offerings from a tiny group of plant species, corn and soybeans chief among them. Today, a mere four crops account for two-thirds of the calories humans eat. When you consider that humankind has historically consumed some 80,000 edible species, and that 3,000 of these have been in widespread use, this represents a radical simplification of the food web. Why should this matter? Because humans are omnivores, requiring somewhere between 50 and 100 different chemical compounds and elements to be healthy. It’s hard to believe that we can get everything we need from a diet consisting largely of processed corn, soybeans, wheat and rice.

From Leaves to Seeds. It’s no coincidence that most of the plants we have come to rely on are grains; these crops are exceptionally efficient at transforming sunlight into macronutrients — carbs, fats and proteins. These macronutrients in turn can be profitably transformed into animal protein (by feeding them to animals) and processed foods of every description. Also, the fact that grains are durable seeds that can be stored for long periods means they can function as commodities as well as food, making these plants particularly well suited to the needs of industrial capitalism.

The needs of the human eater are another matter. An oversupply of macronutrients, as we now have, itself represents a serious threat to our health, as evidenced by soaring rates of obesity and diabetes. But the undersupply of micronutrients may constitute a threat just as serious. Put in the simplest terms, we’re eating a lot more seeds and a lot fewer leaves, a tectonic dietary shift the full implications of which we are just beginning to glimpse. If I may borrow the nutritionist’s reductionist vocabulary for a moment, there are a host of critical micronutrients that are harder to get from a diet of refined seeds than from a diet of leaves. There are the antioxidants and all the other newly discovered phytochemicals (remember that sprig of thyme?); there is the fiber, and then there are the healthy omega-3 fats found in leafy green plants, which may turn out to be the most important benefit of all.

Most people associate omega-3 fatty acids with fish, but fish get them from green plants (specifically algae), which is where they all originate. Plant leaves produce these essential fatty acids (“essential” because our bodies can’t produce them on their own) as part of photosynthesis. Seeds contain more of another essential fatty acid: omega-6. Without delving too deeply into the biochemistry, the two fats perform very different functions, in the plant as well as the plant eater. Omega-3s appear to play an important role in neurological development and processing, the permeability of cell walls, the metabolism of glucose and the calming of inflammation. Omega-6s are involved in fat storage (which is what they do for the plant), the rigidity of cell walls, clotting and the inflammation response. (Think of omega-3s as fleet and flexible, omega-6s as sturdy and slow.) Since the two lipids compete with each other for the attention of important enzymes, the ratio between omega-3s and omega-6s may matter more than the absolute quantity of either fat. Thus too much omega-6 may be just as much a problem as too little omega-3.

And that might well be a problem for people eating a Western diet. As we’ve shifted from leaves to seeds, the ratio of omega-6s to omega-3s in our bodies has shifted, too. At the same time, modern food-production practices have further diminished the omega-3s in our diet. Omega-3s, being less stable than omega-6s, spoil more readily, so we have selected for plants that produce fewer of them; further, when we partly hydrogenate oils to render them more stable, omega-3s are eliminated. Industrial meat, raised on seeds rather than leaves, has fewer omega-3s and more omega-6s than preindustrial meat used to have. And official dietary advice since the 1970s has promoted the consumption of polyunsaturated vegetable oils, most of which are high in omega-6s (corn and soy, especially). Thus, without realizing what we were doing, we significantly altered the ratio of these two essential fats in our diets and bodies, with the result that the ratio of omega-6 to omega-3 in the typical American today stands at more than 10 to 1; before the widespread introduction of seed oils at the turn of the last century, it was closer to 1 to 1.

The role of these lipids is not completely understood, but many researchers say that these historically low levels of omega-3 (or, conversely, high levels of omega-6) bear responsibility for many of the chronic diseases associated with the Western diet, especially heart disease and diabetes. (Some researchers implicate omega-3 deficiency in rising rates of depression and learning disabilities as well.) To remedy this deficiency, nutritionism classically argues for taking omega-3 supplements or fortifying food products, but because of the complex, competitive relationship between omega-3 and omega-6, adding more omega-3s to the diet may not do much good unless you also reduce your intake of omega-6.

From Food Culture to Food Science. The last important change wrought by the Western diet is not, strictly speaking, ecological. But the industrialization of our food that we call the Western diet is systematically destroying traditional food cultures. Before the modern food era — and before nutritionism — people relied for guidance about what to eat on their national or ethnic or regional cultures. We think of culture as a set of beliefs and practices to help mediate our relationship to other people, but of course culture (at least before the rise of science) has also played a critical role in helping mediate people’s relationship to nature. Eating being a big part of that relationship, cultures have had a great deal to say about what and how and why and when and how much we should eat. Of course when it comes to food, culture is really just a fancy word for Mom, the figure who typically passes on the food ways of the group — food ways that, although they were never “designed” to optimize health (we have many reasons to eat the way we do), would not have endured if they did not keep eaters alive and well.

The sheer novelty and glamour of the Western diet, with its 17,000 new food products introduced every year, and the marketing muscle used to sell these products, has overwhelmed the force of tradition and left us where we now find ourselves: relying on science and journalism and marketing to help us decide questions about what to eat. Nutritionism, which arose to help us better deal with the problems of the Western diet, has largely been co-opted by it, used by the industry to sell more food and to undermine the authority of traditional ways of eating. You would not have read this far into this article if your food culture were intact and healthy; you would simply eat the way your parents and grandparents and great-grandparents taught you to eat. The question is, Are we better off with these new authorities than we were with the traditional authorities they supplanted? The answer by now should be clear.

It might be argued that, at this point in history, we should simply accept that fast food is our food culture. Over time, people will get used to eating this way and our health will improve. But for natural selection to help populations adapt to the Western diet, we’d have to be prepared to let those whom it sickens die. That’s not what we’re doing. Rather, we’re turning to the health-care industry to help us “adapt.” Medicine is learning how to keep alive the people whom the Western diet is making sick. It’s gotten good at extending the lives of people with heart disease, and now it’s working on obesity and diabetes. Capitalism is itself marvelously adaptive, able to turn the problems it creates into lucrative business opportunities: diet pills, heart-bypass operations, insulin pumps, bariatric surgery. But while fast food may be good business for the health-care industry, surely the cost to society — estimated at more than $200 billion a year in diet-related health-care costs — is unsustainable.

BEYOND NUTRITIONISM

To medicalize the diet problem is of course perfectly consistent with nutritionism. So what might a more ecological or cultural approach to the problem recommend? How might we plot our escape from nutritionism and, in turn, from the deleterious effects of the modern diet? In theory nothing could be simpler — stop thinking and eating that way — but this is somewhat harder to do in practice, given the food environment we now inhabit and the loss of sharp cultural tools to guide us through it. Still, I do think escape is possible, to which end I can now revisit — and elaborate on, but just a little — the simple principles of healthy eating I proposed at the beginning of this essay, several thousand words ago. So try these few (flagrantly unscientific) rules of thumb, collected in the course of my nutritional odyssey, and see if they don’t at least point us in the right direction.

1. Eat food. Though in our current state of confusion, this is much easier said than done. So try this: Don’t eat anything your great-great-grandmother wouldn’t recognize as food. (Sorry, but at this point Moms are as confused as the rest of us, which is why we have to go back a couple of generations, to a time before the advent of modern food products.) There are a great many foodlike items in the supermarket your ancestors wouldn’t recognize as food (Go-Gurt? Breakfast-cereal bars? Nondairy creamer?); stay away from these.

2. Avoid even those food products that come bearing health claims. They’re apt to be heavily processed, and the claims are often dubious at best. Don’t forget that margarine, one of the first industrial foods to claim that it was more healthful than the traditional food it replaced, turned out to give people heart attacks. When Kellogg’s can boast about its Healthy Heart Strawberry Vanilla cereal bars, health claims have become hopelessly compromised. (The American Heart Association charges food makers for their endorsement.) Don’t take the silence of the yams as a sign that they have nothing valuable to say about health.

3. Especially avoid food products containing ingredients that are a) unfamiliar, b) unpronounceable, c) more than five in number — or that contain high-fructose corn syrup. None of these characteristics are necessarily harmful in and of themselves, but all of them are reliable markers for foods that have been highly processed.

4. Get out of the supermarket whenever possible. You won’t find any high-fructose corn syrup at the farmer’s market; you also won’t find food harvested long ago and far away. What you will find are fresh whole foods picked at the peak of nutritional quality. Precisely the kind of food your great-great-grandmother would have recognized as food.

5. Pay more, eat less. The American food system has for a century devoted its energies and policies to increasing quantity and reducing price, not to improving quality. There’s no escaping the fact that better food — measured by taste or nutritional quality (which often correspond) — costs more, because it has been grown or raised less intensively and with more care. Not everyone can afford to eat well in America, which is shameful, but most of us can: Americans spend, on average, less than 10 percent of their income on food, down from 24 percent in 1947, and less than the citizens of any other nation. And those of us who can afford to eat well should. Paying more for food well grown in good soils — whether certified organic or not — will contribute not only to your health (by reducing exposure to pesticides) but also to the health of others who might not themselves be able to afford that sort of food: the people who grow it and the people who live downstream, and downwind, of the farms where it is grown.

“Eat less” is the most unwelcome advice of all, but in fact the scientific case for eating a lot less than we currently do is compelling. “Calorie restriction” has repeatedly been shown to slow aging in animals, and many researchers (including Walter Willett, the Harvard epidemiologist) believe it offers the single strongest link between diet and cancer prevention. Food abundance is a problem, but culture has helped here, too, by promoting the idea of moderation. Once one of the longest-lived people on earth, the Okinawans practiced a principle they called “Hara Hachi Bu”: eat until you are 80 percent full. To make the “eat less” message a bit more palatable, consider that quality may have a bearing on quantity: I don’t know about you, but the better the quality of the food I eat, the less of it I need to feel satisfied. All tomatoes are not created equal.

6. Eat mostly plants, especially leaves. Scientists may disagree on what’s so good about plants — the antioxidants? Fiber? Omega-3s? — but they do agree that they’re probably really good for you and certainly can’t hurt. Also, by eating a plant-based diet, you’ll be consuming far fewer calories, since plant foods (except seeds) are typically less “energy dense” than the other things you might eat. Vegetarians are healthier than carnivores, but near vegetarians (“flexitarians”) are as healthy as vegetarians. Thomas Jefferson was on to something when he advised treating meat more as a flavoring than a food.

7. Eat more like the French. Or the Japanese. Or the Italians. Or the Greeks. Confounding factors aside, people who eat according to the rules of a traditional food culture are generally healthier than we are. Any traditional diet will do: if it weren’t a healthy diet, the people who follow it wouldn’t still be around. True, food cultures are embedded in societies and economies and ecologies, and some of them travel better than others: Inuit not so well as Italian. In borrowing from a food culture, pay attention to how a culture eats, as well as to what it eats. In the case of the French paradox, it may not be the dietary nutrients that keep the French healthy (lots of saturated fat and alcohol?!) so much as the dietary habits: small portions, no seconds or snacking, communal meals — and the serious pleasure taken in eating. (Worrying about diet can’t possibly be good for you.) Let culture be your guide, not science.

8. Cook. And if you can, plant a garden. To take part in the intricate and endlessly interesting processes of providing for our sustenance is the surest way to escape the culture of fast food and the values implicit in it: that food should be cheap and easy; that food is fuel and not communion. The culture of the kitchen, as embodied in those enduring traditions we call cuisines, contains more wisdom about diet and health than you are apt to find in any nutrition journal or journalism. Plus, the food you grow yourself contributes to your health long before you sit down to eat it. So you might want to think about putting down this article now and picking up a spatula or hoe.

9. Eat like an omnivore. Try to add new species, not just new foods, to your diet. The greater the diversity of species you eat, the more likely you are to cover all your nutritional bases. That of course is an argument from nutritionism, but there is a better one, one that takes a broader view of “health.” Biodiversity in the diet means less monoculture in the fields. What does that have to do with your health? Everything. The vast monocultures that now feed us require tremendous amounts of chemical fertilizers and pesticides to keep from collapsing. Diversifying those fields will mean fewer chemicals, healthier soils, healthier plants and animals and, in turn, healthier people. It’s all connected, which is another way of saying that your health isn’t bordered by your body and that what’s good for the soil is probably good for you, too.

Michael Pollan, a contributing writer, is the Knight professor of journalism at the University of California, Berkeley. His most recent book, “The Omnivore’s Dilemma,” was chosen by the editors of The New York Times Book Review as one of the 10 best books of 2006.

What if It's All Been a Big Fat Lie?

July 7, 2002

If the members of the American medical establishment were to have a collective find-yourself-standing-naked-in-Times-Square-type nightmare, this might be it. They spend 30 years ridiculing Robert Atkins, author of the phenomenally-best-selling ''Dr. Atkins' Diet Revolution'' and ''Dr. Atkins' New Diet Revolution,'' accusing the Manhattan doctor of quackery and fraud, only to discover that the unrepentant Atkins was right all along. Or maybe it's this: they find that their very own dietary recommendations -- eat less fat and more carbohydrates -- are the cause of the rampaging epidemic of obesity in America. Or, just possibly this: they find out both of the above are true.

When Atkins first published his ''Diet Revolution'' in 1972, Americans were just coming to terms with the proposition that fat -- particularly the saturated fat of meat and dairy products -- was the primary nutritional evil in the American diet. Atkins managed to sell millions of copies of a book promising that we would lose weight eating steak, eggs and butter to our heart's desire, because it was the carbohydrates, the pasta, rice, bagels and sugar, that caused obesity and even heart disease. Fat, he said, was harmless.

Atkins allowed his readers to eat ''truly luxurious foods without limit,'' as he put it, ''lobster with butter sauce, steak with béarnaise sauce . . . bacon cheeseburgers,'' but allowed no starches or refined carbohydrates, which means no sugars or anything made from flour. Atkins banned even fruit juices, and permitted only a modicum of vegetables, although the latter were negotiable as the diet progressed.

Atkins was by no means the first to get rich pushing a high-fat diet that restricted carbohydrates, but he popularized it to an extent that the American Medical Association considered it a potential threat to our health. The A.M.A. attacked Atkins's diet as a ''bizarre regimen'' that advocated ''an unlimited intake of saturated fats and cholesterol-rich foods,'' and Atkins even had to defend his diet in Congressional hearings.

Thirty years later, America has become weirdly polarized on the subject of weight. On the one hand, we've been told with almost religious certainty by everyone from the surgeon general on down, and we have come to believe with almost religious certainty, that obesity is caused by the excessive consumption of fat, and that if we eat less fat we will lose weight and live longer. On the other, we have the ever-resilient message of Atkins and decades' worth of best-selling diet books, including ''The Zone,'' ''Sugar Busters'' and ''Protein Power'' to name a few. All push some variation of what scientists would call the alternative hypothesis: it's not the fat that makes us fat, but the carbohydrates, and if we eat less carbohydrates we will lose weight and live longer.

The perversity of this alternative hypothesis is that it identifies the cause of obesity as precisely those refined carbohydrates at the base of the famous Food Guide Pyramid -- the pasta, rice and bread -- that we are told should be the staple of our healthy low-fat diet, and then the sugar or corn syrup in the soft drinks, fruit juices and sports drinks that we have taken to consuming in quantity if for no other reason than that they are fat free and so appear intrinsically healthy. While the low-fat-is-good-health dogma represents reality as we have come to know it, and the government has spent hundreds of millions of dollars in research trying to prove its worth, the low-carbohydrate message has been relegated to the realm of unscientific fantasy.

Over the past five years, however, there has been a subtle shift in the scientific consensus. It used to be that even considering the possibility of the alternative hypothesis, let alone researching it, was tantamount to quackery by association. Now a small but growing minority of establishment researchers have come to take seriously what the low-carb-diet doctors have been saying all along. Walter Willett, chairman of the department of nutrition at the Harvard School of Public Health, may be the most visible proponent of testing this heretic hypothesis. Willett is the de facto spokesman of the longest-running, most comprehensive diet and health studies ever performed, which have already cost upward of $100 million and include data on nearly 300,000 individuals. Those data, says Willett, clearly contradict the low-fat-is-good-health message ''and the idea that all fat is bad for you; the exclusive focus on adverse effects of fat may have contributed to the obesity epidemic.''

These researchers point out that there are plenty of reasons to suggest that the low-fat-is-good-health hypothesis has now effectively failed the test of time. In particular, that we are in the midst of an obesity epidemic that started around the early 1980's, and that this was coincident with the rise of the low-fat dogma. (Type 2 diabetes, the most common form of the disease, also rose significantly through this period.) They say that low-fat weight-loss diets have proved in clinical trials and real life to be dismal failures, and that on top of it all, the percentage of fat in the American diet has been decreasing for two decades. Our cholesterol levels have been declining, and we have been smoking less, and yet the incidence of heart disease has not declined as would be expected. ''That is very disconcerting,'' Willett says. ''It suggests that something else bad is happening.''

The science behind the alternative hypothesis can be called Endocrinology 101, which is how it's referred to by David Ludwig, a researcher at Harvard Medical School who runs the pediatric obesity clinic at Children's Hospital Boston, and who prescribes his own version of a carbohydrate-restricted diet to his patients. Endocrinology 101 requires an understanding of how carbohydrates affect insulin and blood sugar and in turn fat metabolism and appetite. This is basic endocrinology, Ludwig says, which is the study of hormones, and it is still considered radical because the low-fat dietary wisdom emerged in the 1960's from researchers almost exclusively concerned with the effect of fat on cholesterol and heart disease. At the time, Endocrinology 101 was still underdeveloped, and so it was ignored. Now that this science is becoming clear, it has to fight a quarter century of anti-fat prejudice.

The alternative hypothesis also comes with an implication that is worth considering for a moment, because it's a whopper, and it may indeed be an obstacle to its acceptance. If the alternative hypothesis is right -- still a big ''if'' -- then it strongly suggests that the ongoing epidemic of obesity in America and elsewhere is not, as we are constantly told, due simply to a collective lack of will power and a failure to exercise. Rather it occurred, as Atkins has been saying (along with Barry Sears, author of ''The Zone''), because the public health authorities told us unwittingly, but with the best of intentions, to eat precisely those foods that would make us fat, and we did. We ate more fat-free carbohydrates, which, in turn, made us hungrier and then heavier. Put simply, if the alternative hypothesis is right, then a low-fat diet is not by definition a healthy diet. In practice, such a diet cannot help being high in carbohydrates, and that can lead to obesity, and perhaps even heart disease. ''For a large percentage of the population, perhaps 30 to 40 percent, low-fat diets are counterproductive,'' says Eleftheria Maratos-Flier, director of obesity research at Harvard's prestigious Joslin Diabetes Center. ''They have the paradoxical effect of making people gain weight.''

Scientists are still arguing about fat, despite a century of research, because the regulation of appetite and weight in the human body happens to be almost inconceivably complex, and the experimental tools we have to study it are still remarkably inadequate. This combination leaves researchers in an awkward position. To study the entire physiological system involves feeding real food to real human subjects for months or years on end, which is prohibitively expensive, ethically questionable (if you're trying to measure the effects of foods that might cause heart disease) and virtually impossible to do in any kind of rigorously controlled scientific manner. But if researchers seek to study something less costly and more controllable, they end up studying experimental situations so oversimplified that their results may have nothing to do with reality. This then leads to a research literature so vast that it's possible to find at least some published research to support virtually any theory. The result is a balkanized community -- ''splintered, very opinionated and in many instances, intransigent,'' says Kurt Isselbacher, a former chairman of the Food and Nutrition Board of the National Academy of Sciences -- in which researchers seem easily convinced that their preconceived notions are correct and thoroughly uninterested in testing any other hypotheses but their own.

What's more, the number of misconceptions propagated about the most basic research can be staggering. Researchers will be suitably scientific describing the limitations of their own experiments, and then will cite something as gospel truth because they read it in a magazine. The classic example is the statement heard repeatedly that 95 percent of all dieters never lose weight, and 95 percent of those who do will not keep it off. This will be correctly attributed to the University of Pennsylvania psychiatrist Albert Stunkard, but it will go unmentioned that this statement is based on 100 patients who passed through Stunkard's obesity clinic during the Eisenhower administration.

With these caveats, one of the few reasonably reliable facts about the obesity epidemic is that it started around the early 1980's. According to Katherine Flegal, an epidemiologist at the National Center for Health Statistics, the percentage of obese Americans stayed relatively constant through the 1960's and 1970's at 13 percent to 14 percent and then shot up by 8 percentage points in the 1980's. By the end of that decade, nearly one in four Americans was obese. That steep rise, which is consistent through all segments of American society and which continued unabated through the 1990's, is the singular feature of the epidemic. Any theory that tries to explain obesity in America has to account for that. Meanwhile, overweight children nearly tripled in number. And for the first time, physicians began diagnosing Type 2 diabetes in adolescents. Type 2 diabetes often accompanies obesity. It used to be called adult-onset diabetes and now, for the obvious reason, is not.

So how did this happen? The orthodox and ubiquitous explanation is that we live in what Kelly Brownell, a Yale psychologist, has called a ''toxic food environment'' of cheap fatty food, large portions, pervasive food advertising and sedentary lives. By this theory, we are at the Pavlovian mercy of the food industry, which spends nearly $10 billion a year advertising unwholesome junk food and fast food. And because these foods, especially fast food, are so filled with fat, they are both irresistible and uniquely fattening. On top of this, so the theory goes, our modern society has successfully eliminated physical activity from our daily lives. We no longer exercise or walk up stairs, nor do our children bike to school or play outside, because they would prefer to play video games and watch television. And because some of us are obviously predisposed to gain weight while others are not, this explanation also has a genetic component -- the thrifty gene. It suggests that storing extra calories as fat was an evolutionary advantage to our Paleolithic ancestors, who had to survive frequent famine. We then inherited these ''thrifty'' genes, despite their liability in today's toxic environment.

This theory makes perfect sense and plays to our puritanical prejudice that fat, fast food and television are innately damaging to our humanity. But there are two catches. First, to buy this logic is to accept that the copious negative reinforcement that accompanies obesity -- both socially and physically -- is easily overcome by the constant bombardment of food advertising and the lure of a supersize bargain meal. And second, as Flegal points out, little data exist to support any of this. Certainly none of it explains what changed so significantly to start the epidemic. Fast-food consumption, for example, continued to grow steadily through the 70's and 80's, but it did not take a sudden leap, as obesity did.

As far as exercise and physical activity go, there are no reliable data before the mid-80's, according to William Dietz, who runs the division of nutrition and physical activity at the Centers for Disease Control; the 1990's data show obesity rates continuing to climb, while exercise activity remained unchanged. This suggests the two have little in common. Dietz also acknowledged that a culture of physical exercise began in the United States in the 70's -- the ''leisure exercise mania,'' as Robert Levy, director of the National Heart, Lung and Blood Institute, described it in 1981 -- and has continued through the present day.

As for the thrifty gene, it provides the kind of evolutionary rationale for human behavior that scientists find comforting but that simply cannot be tested. In other words, if we were living through an anorexia epidemic, the experts would be discussing the equally untestable ''spendthrift gene'' theory, touting evolutionary advantages of losing weight effortlessly. An overweight Homo erectus, they'd say, would have been easy prey for predators.

It is also undeniable, note students of Endocrinology 101, that mankind never evolved to eat a diet high in starches or sugars. ''Grain products and concentrated sugars were essentially absent from human nutrition until the invention of agriculture,'' Ludwig says, ''which was only 10,000 years ago.'' This is discussed frequently in the anthropology texts but is mostly absent from the obesity literature, with the prominent exception of the low-carbohydrate-diet books.

What's forgotten in the current controversy is that the low-fat dogma itself is only about 25 years old. Until the late 70's, the accepted wisdom was that fat and protein protected against overeating by making you sated, and that carbohydrates made you fat. In ''The Physiology of Taste,'' for instance, an 1825 discourse considered among the most famous books ever written about food, the French gastronome Jean Anthelme Brillat-Savarin says that he could easily identify the causes of obesity after 30 years of listening to one ''stout party'' after another proclaiming the joys of bread, rice and (from a ''particularly stout party'') potatoes. Brillat-Savarin described the roots of obesity as a natural predisposition conjuncted with the ''floury and feculent substances which man makes the prime ingredients of his daily nourishment.'' He added that the effects of this fecula -- i.e., ''potatoes, grain or any kind of flour'' -- were seen sooner when sugar was added to the diet.

This is what my mother taught me 40 years ago, backed up by the vague observation that Italians tended toward corpulence because they ate so much pasta. This observation was actually documented by Ancel Keys, a University of Minnesota physician who noted that fats ''have good staying power,'' by which he meant they are slow to be digested and so lead to satiation, and that Italians were among the heaviest populations he had studied. According to Keys, the Neapolitans, for instance, ate only a little lean meat once or twice a week, but ate bread and pasta every day for lunch and dinner. ''There was no evidence of nutritional deficiency,'' he wrote, ''but the working-class women were fat.''

By the 70's, you could still find articles in the journals describing high rates of obesity in Africa and the Caribbean where diets contained almost exclusively carbohydrates. The common thinking, wrote a former director of the Nutrition Division of the United Nations, was that the ideal diet, one that prevented obesity, snacking and excessive sugar consumption, was a diet ''with plenty of eggs, beef, mutton, chicken, butter and well-cooked vegetables.'' This was the identical prescription Brillat-Savarin put forth in 1825.

It was Ancel Keys, paradoxically, who introduced the low-fat-is-good-health dogma in the 50's with his theory that dietary fat raises cholesterol levels and gives you heart disease. Over the next two decades, however, the scientific evidence supporting this theory remained stubbornly ambiguous. The case was eventually settled not by new science but by politics. It began in January 1977, when a Senate committee led by George McGovern published its ''Dietary Goals for the United States,'' advising that Americans significantly curb their fat intake to abate an epidemic of ''killer diseases'' supposedly sweeping the country. It peaked in late 1984, when the National Institutes of Health officially recommended that all Americans over the age of 2 eat less fat. By that time, fat had become ''this greasy killer'' in the memorable words of the Center for Science in the Public Interest, and the model American breakfast of eggs and bacon was well on its way to becoming a bowl of Special K with low-fat milk, a glass of orange juice and toast, hold the butter -- a dubious feast of refined carbohydrates.

In the intervening years, the N.I.H. spent several hundred million dollars trying to demonstrate a connection between eating fat and getting heart disease and, despite what we might think, it failed. Five major studies revealed no such link. A sixth, however, costing well over $100 million alone, concluded that reducing cholesterol by drug therapy could prevent heart disease. The N.I.H. administrators then made a leap of faith. Basil Rifkind, who oversaw the relevant trials for the N.I.H., described their logic this way: they had failed to demonstrate at great expense that eating less fat had any health benefits. But if a cholesterol-lowering drug could prevent heart attacks, then a low-fat, cholesterol-lowering diet should do the same. ''It's an imperfect world,'' Rifkind told me. ''The data that would be definitive is ungettable, so you do your best with what is available.''

Some of the best scientists disagreed with this low-fat logic, suggesting that good science was incompatible with such leaps of faith, but they were effectively ignored. Pete Ahrens, whose Rockefeller University laboratory had done the seminal research on cholesterol metabolism, testified to McGovern's committee that everyone responds differently to low-fat diets. It was not a scientific matter who might benefit and who might be harmed, he said, but ''a betting matter.'' Phil Handler, then president of the National Academy of Sciences, testified in Congress to the same effect in 1980. ''What right,'' Handler asked, ''has the federal government to propose that the American people conduct a vast nutritional experiment, with themselves as subjects, on the strength of so very little evidence that it will do them any good?''

Nonetheless, once the N.I.H. signed off on the low-fat doctrine, societal forces took over. The food industry quickly began producing thousands of reduced-fat food products to meet the new recommendations. Fat was removed from foods like cookies, chips and yogurt. The problem was, it had to be replaced with something as tasty and pleasurable to the palate, which meant some form of sugar, often high-fructose corn syrup. Meanwhile, an entire industry emerged to create fat substitutes, of which Procter & Gamble's olestra was first. And because these reduced-fat meats, cheeses, snacks and cookies had to compete with a few hundred thousand other food products marketed in America, the industry dedicated considerable advertising effort to reinforcing the less-fat-is-good-health message. Helping the cause was what Walter Willett calls the ''huge forces'' of dietitians, health organizations, consumer groups, health reporters and even cookbook writers, all well-intended missionaries of healthful eating.

Few experts now deny that the low-fat message is radically oversimplified. If nothing else, it effectively ignores the fact that unsaturated fats, like olive oil, are relatively good for you: they tend to elevate your good cholesterol, high-density lipoprotein (H.D.L.), and lower your bad cholesterol, low-density lipoprotein (L.D.L.), at least in comparison to the effect of carbohydrates. While higher L.D.L. raises your heart-disease risk, higher H.D.L. reduces it.

What this means is that even saturated fats -- a.k.a. the bad fats -- are not nearly as deleterious as you would think. True, they will elevate your bad cholesterol, but they will also elevate your good cholesterol. In other words, it's a virtual wash. As Willett explained to me, you will gain little to no health benefit by giving up milk, butter and cheese and eating bagels instead.

But it gets even weirder than that. Foods considered more or less deadly under the low-fat dogma turn out to be comparatively benign if you actually look at their fat content. More than two-thirds of the fat in a porterhouse steak, for instance, will definitively improve your cholesterol profile (at least in comparison with the baked potato next to it); it's true that the remainder will raise your L.D.L., the bad stuff, but it will also boost your H.D.L. The same is true for lard. If you work out the numbers, you come to the surreal conclusion that you can eat lard straight from the can and conceivably reduce your risk of heart disease.
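
To see what ''working out the numbers'' amounts to, here is a minimal arithmetic sketch in Python. The fatty-acid fractions, and the decision to count stearic acid as roughly cholesterol-neutral, are commonly cited approximations rather than figures taken from this article; only the bookkeeping matters.

# Illustrative arithmetic only: the fatty-acid fractions below are rough,
# commonly cited approximations, not figures taken from this article.
def cholesterol_profile(fat_breakdown):
    """Split a fat into LDL-raising vs. neutral-or-beneficial shares."""
    beneficial = (fat_breakdown["monounsaturated"]
                  + fat_breakdown["polyunsaturated"]
                  + fat_breakdown["stearic"])      # stearic acid treated as neutral
    ldl_raising = fat_breakdown["other_saturated"]
    return beneficial, ldl_raising

# Assumed composition for the fat in a porterhouse steak (illustrative only):
porterhouse_fat = {
    "monounsaturated": 0.49,   # mostly oleic acid, the olive-oil fat
    "polyunsaturated": 0.04,
    "stearic": 0.15,           # saturated, but roughly cholesterol-neutral
    "other_saturated": 0.32,   # palmitic and other LDL-raising saturates
}

beneficial, ldl_raising = cholesterol_profile(porterhouse_fat)
print(f"neutral or beneficial: {beneficial:.0%}, LDL-raising: {ldl_raising:.0%}")
# With these assumed numbers, roughly two-thirds of the fat comes out neutral
# or beneficial, which is the kind of tally the paragraph above gestures at.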

The crucial example of how the low-fat recommendations were oversimplified is shown by the impact -- potentially lethal, in fact -- of low-fat diets on triglycerides, which are the component molecules of fat. By the late 60's, researchers had shown that high triglyceride levels were at least as common in heart-disease patients as high L.D.L. cholesterol, and that eating a low-fat, high-carbohydrate diet would, for many people, raise their triglyceride levels, lower their H.D.L. levels and accentuate what Gerry Reaven, an endocrinologist at Stanford University, called Syndrome X. This is a cluster of conditions that can lead to heart disease and Type 2 diabetes.

It took Reaven a decade to convince his peers that Syndrome X was a legitimate health concern, in part because to accept its reality is to accept that low-fat diets will increase the risk of heart disease in a third of the population. ''Sometimes we wish it would go away because nobody knows how to deal with it,'' said Robert Silverman, an N.I.H. researcher, at a 1987 N.I.H. conference. ''High protein levels can be bad for the kidneys. High fat is bad for your heart. Now Reaven is saying not to eat high carbohydrates. We have to eat something.''

Surely, everyone involved in drafting the various dietary guidelines wanted Americans simply to eat less junk food, however you define it, and eat more the way they do in Berkeley, Calif. But we didn't go along. Instead we ate more starches and refined carbohydrates, because calorie for calorie, these are the cheapest nutrients for the food industry to produce, and they can be sold at the highest profit. It's also what we like to eat. Rare is the person under the age of 50 who doesn't prefer a cookie or heavily sweetened yogurt to a head of broccoli.

''All reformers would do well to be conscious of the law of unintended consequences,'' says Alan Stone, who was staff director for McGovern's Senate committee. Stone told me he had an inkling about how the food industry would respond to the new dietary goals back when the hearings were first held. An economist pulled him aside, he said, and gave him a lesson on market disincentives to healthy eating: ''He said if you create a new market with a brand-new manufactured food, give it a brand-new fancy name, put a big advertising budget behind it, you can have a market all to yourself and force your competitors to catch up. You can't do that with fruits and vegetables. It's harder to differentiate an apple from an apple.''

Nutrition researchers also played a role by trying to feed science into the idea that carbohydrates are the ideal nutrient. It had been known, for almost a century, and considered mostly irrelevant to the etiology of obesity, that fat has nine calories per gram compared with four for carbohydrates and protein. Now it became the fail-safe position of the low-fat recommendations: reduce the densest source of calories in the diet and you will lose weight. Then in 1982, J.P. Flatt, a University of Massachusetts biochemist, published his research demonstrating that, in any normal diet, it is extremely rare for the human body to convert carbohydrates into body fat. This was then misinterpreted by the media and quite a few scientists to mean that eating carbohydrates, even to excess, could not make you fat -- which is not the case, Flatt says. But the misinterpretation developed a vigorous life of its own because it resonated with the notion that fat makes you fat and carbohydrates are harmless.

As a result, the major trends in American diets since the late 70's, according to the U.S.D.A. agricultural economist Judith Putnam, have been a decrease in the percentage of fat calories and a ''greatly increased consumption of carbohydrates.'' To be precise, annual grain consumption has increased almost 60 pounds per person, and caloric sweeteners (primarily high-fructose corn syrup) by 30 pounds. At the same time, we suddenly began consuming more total calories: now up to 400 more each day since the government started recommending low-fat diets.

If these trends are correct, then the obesity epidemic can certainly be explained by Americans' eating more calories than ever -- excess calories, after all, are what causes us to gain weight -- and, specifically, more carbohydrates. The question is why?
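
For a sense of scale, and nothing more, here is a back-of-the-envelope sketch. The 3,500-calories-per-pound conversion is a crude textbook rule of thumb, not a claim made in this article, and nobody gains weight this linearly; the point is only that an extra 400 calories a day is not a small number.

# Back-of-the-envelope only: real weight change is messier, since metabolism
# and activity adjust. The 3,500 kcal-per-pound figure is a rough rule of thumb.
EXTRA_KCAL_PER_DAY = 400      # the increase in daily calories cited above
KCAL_PER_POUND_FAT = 3500     # crude conversion, assumed here for illustration

def naive_weight_gain(days, surplus=EXTRA_KCAL_PER_DAY):
    """Pounds of fat a fixed daily surplus would add if nothing else changed."""
    return days * surplus / KCAL_PER_POUND_FAT

print(f"one month: {naive_weight_gain(30):.1f} lb on paper")   # about 3.4 lb
print(f"one year:  {naive_weight_gain(365):.0f} lb on paper")  # about 42 lb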

The answer provided by Endocrinology 101 is that we are simply hungrier than we were in the 70's, and the reason is physiological more than psychological. In this case, the salient factor -- ignored in the pursuit of fat and its effect on cholesterol -- is how carbohydrates affect blood sugar and insulin. In fact, these were obvious culprits all along, which is why Atkins and the low-carb-diet doctors pounced on them early.

The primary role of insulin is to regulate blood-sugar levels. After you eat carbohydrates, they will be broken down into their component sugar molecules and transported into the bloodstream. Your pancreas then secretes insulin, which shunts the blood sugar into muscles and the liver as fuel for the next few hours. This is why carbohydrates have a significant impact on insulin and fat does not. And because juvenile diabetes is caused by a lack of insulin, physicians had believed since the 20's that the only evil with insulin was not having enough.

But insulin also regulates fat metabolism. We cannot store body fat without it. Think of insulin as a switch. When it's on, in the few hours after eating, you burn carbohydrates for energy and store excess calories as fat. When it's off, after the insulin has been depleted, you burn fat as fuel. So when insulin levels are low, you will burn your own fat, but not when they're high.
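
The switch metaphor can be put even more bluntly. The toy function below is a reading aid for the paragraph above, not physiology; its two states are the article's simplification, and everything else insulin does is left out.

# A deliberately cartoonish rendering of the ''insulin as a switch'' picture.
def fuel_source(insulin_high):
    """Which fuel the body favors in the simplified two-state picture above."""
    if insulin_high:
        # The hours after a meal: burn incoming carbohydrate, store the excess as fat.
        return "burn carbohydrate, store excess calories as fat"
    # Once insulin has fallen: stored fat becomes the fuel.
    return "burn stored body fat"

print(fuel_source(insulin_high=True))
print(fuel_source(insulin_high=False))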

This is where it gets unavoidably complicated. The fatter you are, the more insulin your pancreas will pump out per meal, and the more likely you'll develop what's called ''insulin resistance,'' which is the underlying cause of Syndrome X. In effect, your cells become insensitive to the action of insulin, and so you need ever greater amounts to keep your blood sugar in check. So as you gain weight, insulin makes it easier to store fat and harder to lose it. But the insulin resistance in turn may make it harder to store fat -- your weight is being kept in check, as it should be. But now the insulin resistance might prompt your pancreas to produce even more insulin, potentially starting a vicious cycle. Which comes first -- the obesity, the elevated insulin, known as hyperinsulinemia, or the insulin resistance -- is a chicken-and-egg problem that hasn't been resolved. One endocrinologist described this to me as ''the Nobel-prize winning question.''

Insulin also profoundly affects hunger, although to what end is another point of controversy. On the one hand, insulin can indirectly cause hunger by lowering your blood sugar, but how low does blood sugar have to drop before hunger kicks in? That's unresolved. Meanwhile, insulin works in the brain to suppress hunger. The theory, as explained to me by Michael Schwartz, an endocrinologist at the University of Washington, is that insulin's ability to inhibit appetite would normally counteract its propensity to generate body fat. In other words, as you gained weight, your body would generate more insulin after every meal, and that in turn would suppress your appetite; you'd eat less and lose the weight.

Schwartz, however, can imagine a simple mechanism that would throw this ''homeostatic'' system off balance: if your brain were to lose its sensitivity to insulin, just as your fat and muscles do when they are flooded with it. Now the higher insulin production that comes with getting fatter would no longer compensate by suppressing your appetite, because your brain would no longer register the rise in insulin. The end result would be a physiologic state in which obesity is almost preordained, and one in which the carbohydrate-insulin connection could play a major role. Schwartz says he believes this could indeed be happening, but research hasn't progressed far enough to prove it. ''It is just a hypothesis,'' he says. ''It still needs to be sorted out.''

David Ludwig, the Harvard endocrinologist, says that it's the direct effect of insulin on blood sugar that does the trick. He notes that when diabetics get too much insulin, their blood sugar drops and they get ravenously hungry. They gain weight because they eat more, and the insulin promotes fat deposition. The same happens with lab animals. This, he says, is effectively what happens when we eat carbohydrates -- in particular sugar and starches like potatoes and rice, or anything made from flour, like a slice of white bread. These are known in the jargon as high-glycemic-index carbohydrates, which means they are absorbed quickly into the blood. As a result, they cause a spike of blood sugar and a surge of insulin within minutes. The resulting rush of insulin stores the blood sugar away and a few hours later, your blood sugar is lower than it was before you ate. As Ludwig explains, your body effectively thinks it has run out of fuel, but the insulin is still high enough to prevent you from burning your own fat. The result is hunger and a craving for more carbohydrates. It's another vicious circle, and another situation ripe for obesity.

The glycemic-index concept and the idea that starches can be absorbed into the blood even faster than sugar emerged in the late 70's, but again had no influence on public health recommendations, because of the attendant controversies. To wit: if you bought the glycemic-index concept, then you had to accept that the starches we were supposed to be eating 6 to 11 times a day were, once swallowed, physiologically indistinguishable from sugars. This made them seem considerably less than wholesome. Rather than accept this possibility, the policy makers simply allowed sugar and corn syrup to elude the vilification that befell dietary fat. After all, they are fat-free.

Sugar and corn syrup from soft drinks, juices and the copious teas and sports drinks now supply more than 10 percent of our total calories; the 80's saw the introduction of Big Gulps and 32-ounce cups of Coca-Cola, blasted through with sugar, but 100 percent fat free. When it comes to insulin and blood sugar, these soft drinks and fruit juices -- what the scientists call ''wet carbohydrates'' -- might indeed be worst of all. (Diet soda accounts for less than a quarter of the soda market.)

The gist of the glycemic-index idea is that the longer it takes the carbohydrates to be digested, the lesser the impact on blood sugar and insulin and the healthier the food. Those foods with the highest rating on the glycemic index are some simple sugars, starches and anything made from flour. Green vegetables, beans and whole grains cause a much slower rise in blood sugar because they have fiber, a nondigestible carbohydrate, which slows down digestion and lowers the glycemic index. Protein and fat serve the same purpose, which implies that eating fat can be beneficial, a notion that is still unacceptable. And the glycemic-index concept implies that a primary cause of Syndrome X, heart disease, Type 2 diabetes and obesity is the long-term damage caused by the repeated surges of insulin that come from eating starches and refined carbohydrates. This suggests a kind of unified field theory for these chronic diseases, but not one that coexists easily with the low-fat doctrine.
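
For concreteness, here is a small sketch of how this idea usually gets turned into a number. ''Glycemic load,'' the index scaled by how much carbohydrate a serving actually contains, is a standard follow-on measure rather than a term used in this article, and the index values and carbohydrate counts below are commonly cited approximations, not data from the piece.

# Illustrative values only: approximate glycemic index (glucose = 100) and
# grams of available carbohydrate per serving. Treat the numbers as placeholders.
FOODS = {
    "white bread (2 slices)": (70, 28),
    "baked potato":           (85, 30),
    "lentils (1 cup)":        (30, 30),
    "broccoli (1 cup)":       (15, 4),
}

def glycemic_load(gi, carbs_g):
    """Scale a serving's carbohydrate by how quickly it reaches the blood."""
    return gi * carbs_g / 100

for food, (gi, carbs) in FOODS.items():
    print(f"{food:24s} GI={gi:3d}  load={glycemic_load(gi, carbs):5.1f}")
# Under these assumed numbers, the starches carry several times the load of the
# green vegetables and legumes, which is the contrast the paragraph draws.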

At Ludwig's pediatric obesity clinic, he has been prescribing low-glycemic-index diets to children and adolescents for five years now. He does not recommend the Atkins diet because he says he believes such a very low carbohydrate approach is unnecessarily restrictive; instead, he tells his patients to effectively replace refined carbohydrates and starches with vegetables, legumes and fruit. This makes a low-glycemic-index diet consistent with dietary common sense, albeit in a higher-fat kind of way. His clinic now has a nine-month waiting list. Only recently has Ludwig managed to convince the N.I.H. that such diets are worthy of study. His first three grant proposals were summarily rejected, which may explain why much of the relevant research has been done in Canada and in Australia. In April, however, Ludwig received $1.2 million from the N.I.H. to test his low-glycemic-index diet against a traditional low-fat-low-calorie regime. That might help resolve some of the controversy over the role of insulin in obesity, although the redoubtable Robert Atkins might get there first.

The 71-year-old Atkins, a graduate of Cornell medical school, says he first tried a very low carbohydrate diet in 1963 after reading about one in the Journal of the American Medical Association. He lost weight effortlessly, had his epiphany and turned a fledgling Manhattan cardiology practice into a thriving obesity clinic. He then alienated the entire medical community by telling his readers to eat as much fat and protein as they wanted, as long as they ate little to no carbohydrates. They would lose weight, he said, because they would keep their insulin down; they wouldn't be hungry; and they would have less resistance to burning their own fat. Atkins also noted that starches and sugar were harmful in any event because they raised triglyceride levels and that this was a greater risk factor for heart disease than cholesterol.

Atkins's diet is both the ultimate manifestation of the alternative hypothesis as well as the battleground on which the fat-versus-carbohydrates controversy is likely to be fought scientifically over the next few years. After insisting Atkins was a quack for three decades, obesity experts are now finding it difficult to ignore the copious anecdotal evidence that his diet does just what he has claimed. Take Albert Stunkard, for instance. Stunkard has been trying to treat obesity for half a century, but he told me he had his epiphany about Atkins and maybe about obesity as well just recently when he discovered that the chief of radiology in his hospital had lost 60 pounds on Atkins's diet. ''Well, apparently all the young guys in the hospital are doing it,'' he said. ''So we decided to do a study.'' When I asked Stunkard if he or any of his colleagues considered testing Atkins's diet 30 years ago, he said they hadn't because they thought Atkins was ''a jerk'' who was just out to make money: this ''turned people off, and so nobody took him seriously enough to do what we're finally doing.''

In fact, when the American Medical Association released its scathing critique of Atkins's diet in March 1973, it acknowledged that the diet probably worked, but expressed little interest in why. Through the 60's, this had been a subject of considerable research, with the conclusion that Atkins-like diets were low-calorie diets in disguise; that when you cut out pasta, bread and potatoes, you'll have a hard time eating enough meat, vegetables and cheese to replace the calories.

That, however, raised the question of why such a low-calorie regimen would also suppress hunger, which Atkins insisted was the signature characteristic of the diet. One possibility was Endocrinology 101: that fat and protein make you sated and, lacking carbohydrates and the ensuing swings of blood sugar and insulin, you stay sated. The other possibility arose from the fact that Atkins's diet is ''ketogenic.'' This means that insulin falls so low that you enter a state called ketosis, which is what happens during fasting and starvation. Your muscles and tissues burn body fat for energy, as does your brain in the form of fat molecules produced by the liver called ketones. Atkins saw ketosis as the obvious way to kick-start weight loss. He also liked to say that ketosis was so energizing that it was better than sex, which set him up for some ridicule. An inevitable criticism of Atkins's diet has been that ketosis is dangerous and to be avoided at all costs.

When I interviewed ketosis experts, however, they universally sided with Atkins, and suggested that maybe the medical community and the media confuse ketosis with ketoacidosis, a variant of ketosis that occurs in untreated diabetics and can be fatal. ''Doctors are scared of ketosis,'' says Richard Veech, an N.I.H. researcher who studied medicine at Harvard and then got his doctorate at Oxford University with the Nobel Laureate Hans Krebs. ''They're always worried about diabetic ketoacidosis. But ketosis is a normal physiologic state. I would argue it is the normal state of man. It's not normal to have McDonald's and a delicatessen around every corner. It's normal to starve.''

Simply put, ketosis is evolution's answer to the thrifty gene. We may have evolved to efficiently store fat for times of famine, says Veech, but we also evolved ketosis to efficiently live off that fat when necessary. Rather than being poison, which is how the press often refers to ketones, they make the body run more efficiently and provide a backup fuel source for the brain. Veech calls ketones ''magic'' and has shown that both the heart and brain run 25 percent more efficiently on ketones than on blood sugar.

The bottom line is that for the better part of 30 years Atkins insisted his diet worked and was safe, Americans apparently tried it by the tens of millions, while nutritionists, physicians, public-health authorities and anyone concerned with heart disease insisted it could kill them, and expressed little or no desire to find out who was right. During that period, only two groups of U.S. researchers tested the diet, or at least published their results. In the early 70's, J.P. Flatt and Harvard's George Blackburn pioneered the ''protein-sparing modified fast'' to treat postsurgical patients, and they tested it on obese volunteers. Blackburn, who later became president of the American Society of Clinical Nutrition, describes his regime as ''an Atkins diet without excess fat'' and says he had to give it a fancy name or nobody would take him seriously. The diet was ''lean meat, fish and fowl'' supplemented by vitamins and minerals. ''People loved it,'' Blackburn recalls. ''Great weight loss. We couldn't run them off with a baseball bat.'' Blackburn successfully treated hundreds of obese patients over the next decade and published a series of papers that were ignored. When obese New Englanders turned to appetite-control drugs in the mid-80's, he says, he let it drop. He then applied to the N.I.H. for a grant to do a clinical trial of popular diets but was rejected.

The second trial, published in September 1980, was done at the George Washington University Medical Center. Two dozen obese volunteers agreed to follow Atkins's diet for eight weeks and lost an average of 17 pounds each, with no apparent ill effects, although their L.D.L. cholesterol did go up. The researchers, led by John LaRosa, now president of the State University of New York Downstate Medical Center in Brooklyn, concluded that the 17-pound weight loss in eight weeks would likely have happened with any diet under ''the novelty of trying something under experimental conditions'' and never pursued it further.

Now researchers have finally decided that Atkins's diet and other low-carb diets have to be tested, and are doing so against traditional low-calorie-low-fat diets as recommended by the American Heart Association. To explain their motivation, they inevitably tell one of two stories: some, like Stunkard, told me that someone they knew -- a patient, a friend, a fellow physician -- lost considerable weight on Atkins's diet and, despite all their preconceptions to the contrary, kept it off. Others say they were frustrated with their inability to help their obese patients, looked into the low-carb diets and decided that Endocrinology 101 was compelling. ''As a trained physician, I was trained to mock anything like the Atkins diet,'' says Linda Stern, an internist at the Philadelphia Veterans Administration Hospital, ''but I put myself on the diet. I did great. And I thought maybe this is something I can offer my patients.''

None of these studies have been financed by the N.I.H., and none have yet been published. But the results have been reported at conferences -- by researchers at Schneider Children's Hospital on Long Island, Duke University and the University of Cincinnati, and by Stern's group at the Philadelphia V.A. Hospital. And then there's the study Stunkard had mentioned, led by Gary Foster at the University of Pennsylvania, Sam Klein, director of the Center for Human Nutrition at Washington University in St. Louis, and Jim Hill, who runs the University of Colorado Center for Human Nutrition in Denver. The results of all five of these studies are remarkably consistent. Subjects on some form of the Atkins diet -- whether overweight adolescents on the diet for 12 weeks as at Schneider, or obese adults averaging 295 pounds on the diet for six months, as at the Philadelphia V.A. -- lost twice as much weight as the subjects on the low-fat, low-calorie diets.

In all five studies, cholesterol levels improved similarly with both diets, but triglyceride levels were considerably lower with the Atkins diet. Though researchers are hesitant to agree with this, it does suggest that heart-disease risk could actually be reduced when fat is added back into the diet and starches and refined carbohydrates are removed. ''I think when this stuff gets to be recognized,'' Stunkard says, ''it's going to really shake up a lot of thinking about obesity and metabolism.''

All of this could be settled sooner rather than later, and with it, perhaps, we might have some long-awaited answers as to why we grow fat and whether it is indeed preordained by societal forces or by our choice of foods. For the first time, the N.I.H. is now actually financing comparative studies of popular diets. Foster, Klein and Hill, for instance, have now received more than $2.5 million from N.I.H. to do a five-year trial of the Atkins diet with 360 obese individuals. At Harvard, Willett, Blackburn and Penelope Greene have money, albeit from Atkins's nonprofit foundation, to do a comparative trial as well.

Should these clinical trials also find for Atkins and his high-fat, low-carbohydrate diet, then the public-health authorities may indeed have a problem on their hands. Once they took their leap of faith and settled on the low-fat dietary dogma 25 years ago, they left little room for contradictory evidence or a change of opinion, should such a change be necessary to keep up with the science. In this light Sam Klein's experience is noteworthy. Klein is president-elect of the North American Association for the Study of Obesity, which suggests that he is a highly respected member of his community. And yet, he described his recent experience discussing the Atkins diet at medical conferences as a learning experience. ''I have been impressed,'' he said, ''with the anger of academicians in the audience. Their response is 'How dare you even present data on the Atkins diet!' ''

This hostility stems primarily from their anxiety that Americans, given a glimmer of hope about their weight, will rush off en masse to try a diet that simply seems intuitively dangerous and on which there is still no long-term data on whether it works and whether it is safe. It's a justifiable fear. In the course of my research, I have spent my mornings at my local diner, staring down at a plate of scrambled eggs and sausage, convinced that somehow, some way, they must be working to clog my arteries and do me in.

After 20 years steeped in a low-fat paradigm, I find it hard to see the nutritional world any other way. I have learned that low-fat diets fail in clinical trials and in real life, and they certainly have failed in my life. I have read the papers suggesting that 20 years of low-fat recommendations have not managed to lower the incidence of heart disease in this country, and may have led instead to the steep increase in obesity and Type 2 diabetes. I have interviewed researchers whose computer models have calculated that cutting back on the saturated fats in my diet to the levels recommended by the American Heart Association would not add more than a few months to my life, if that. I have even lost considerable weight with relative ease by giving up carbohydrates on my test diet, and yet I can look down at my eggs and sausage and still imagine the imminent onset of heart disease and obesity, the latter assuredly to be caused by some bizarre rebound phenomena the likes of which science has not yet begun to describe. The fact that Atkins himself has had heart trouble recently does not ease my anxiety, despite his assurance that it is not diet-related.

This is the state of mind I imagine that mainstream nutritionists, researchers and physicians must inevitably take to the fat-versus-carbohydrate controversy. They may come around, but the evidence will have to be exceptionally compelling. Although this kind of conversion may be happening at the moment to John Farquhar, who is a professor of health research and policy at Stanford University and has worked in this field for more than 40 years. When I interviewed Farquhar in April, he explained why low-fat diets might lead to weight gain and low-carbohydrate diets might lead to weight loss, but he made me promise not to say he believed they did. He attributed the cause of the obesity epidemic to the ''force-feeding of a nation.'' Three weeks later, after reading an article on Endocrinology 101 by David Ludwig in the Journal of the American Medical Association, he sent me an e-mail message asking the not-entirely-rhetorical question, ''Can we get the low-fat proponents to apologize?''

Gary Taubes is a correspondent for the journal Science and author of ''Bad Science: The Short Life and Weird Times of Cold Fusion.''

New York Times 7 July 2002