Don’t Be Confused by Big Salt

High blood pressure is not the only harmful effect of too much salt—it’s also been tied to stomach cancer, kidney stones, bone loss, obesity, and direct damage to our kidneys, arteries, and heart. But, as I reviewed previously in my video, The Evidence That Salt Raises Blood Pressure, the dispute over whether dietary sodium plays a significant role in raising people’s blood pressure has now finally been resolved: there is a consensus that it does.

In Sodium Skeptics Try to Shake Up the Salt Debate, I discuss the unequivocal evidence that increased sodium intake is associated with increased blood pressure, which we know leads to increased risk of vascular diseases like strokes, aneurysms, and atherosclerosis. To quote the long-time editor-in-chief of the American Journal of Cardiology, “We all must decrease our salt intake!”—a sentiment echoed by many other authorities. So, how is the food industry going to keep the salt controversy alive? If salt leads to high blood pressure and high blood pressure leads to disease, doesn’t it follow that salt should lead to disease? I mean, if A leads to B, and B leads to C, then A should lead to C, right? The logic seems sound. Blood pressure is one of the best validated surrogate markers for cardiovascular disease, and, when countries have tried cutting down on salt, it seems to have worked.

Campaigns in England successfully brought down salt consumption. Blood pressures dropped, as did rates of heart disease and stroke. They also successfully brought down cholesterol levels and smoking prevalence, though, and improved fruit and vegetable consumption. In Japan, however, people dropped their salt intake while eating a worse diet and smoking more, yet still saw a large reduction in stroke mortality. Based on what they were able to achieve in Finland, one daily teaspoon of salt may mean 25 to 50 percent more deaths from heart attacks and strokes.

Are there randomized controlled trials to show that? Researchers never randomized people into two groups—one low-sodium and one not—and followed them for 20 years to see if the differences in blood pressure translated into the expected consequences. But, for that matter, such a study has never been done on smoking either. Imagine randomizing a group of smokers to quit smoking or stay smoking for ten years to see who gets lung cancer. First, it’s hard to get people to quit, just like it’s hard to keep people on a low-salt diet. Second, would it be ethical to force people to smoke for a decade knowing from the totality of evidence that it’s likely to hurt them? That’s like the Tuskegee experiment. We can’t let the perfect be the enemy of the good.

We may never get a decade-long randomized trial, but, in 2007, we got something close. There have been randomized trials of sodium reduction, but they didn’t last long enough to provide enough data on clinical outcomes. For example, the famous TOHP trials randomized thousands into at least 18 months of salt reduction. What if you followed up with them 10 to 15 years after the study was over, figuring maybe some in the low-salt group stuck with it? Indeed, they found that when people cut sodium intake by 25 to 35 percent, they may end up with 25 percent lower risk of heart attacks, strokes, and other cardiovascular events.

This was considered the final nail in the coffin for salt, addressing the one remaining objection to universal salt reduction. It was the first study to show not only a reduction in blood pressure, but a reduction in hard end points—morbidity and mortality—by reducing dietary sodium intake. Case closed, 2007.

But, when billions of dollars are at stake, the case is never closed. One can just follow the press releases of the Salt Institute. For example, what about the Institute of Medicine report saying that salt reduction may cause harm in certain patients with decompensated congestive heart failure? An analysis of those studies has since been retracted out of concern that the data may have been falsified. It is certainly possible that those with serious heart failure, already severely salt-depleted by high-dose salt-wasting drugs, may not benefit from further sodium restriction. However, for the great majority of the population, the message remains unchanged.

What about the new study published in the American Journal of Hypertension that found the amount of salt we are eating is just fine, suggesting a kind of U-shaped curve where too much sodium is bad, but too little could be bad, too?

Those biased less towards Big Salt and more towards Big Heart have noted that these studies have been widely misinterpreted, stirring unnecessary controversy and confusion. It basically comes down to three issues: measurement error, confounding, and reverse causality. All these data came from studies that were not designed to assess this relationship, and they tended to use invalid sodium estimates simply because it’s hard to do the multiple, 24-hour urine collections necessary to get a good measurement. And, in the United States, many of those eating less salt are simply eating less food—maybe because they’re so sick—so it’s no wonder they’d have higher mortality rates. So, compiling these studies together is viewed as kind of like garbage in, garbage out. But why would they do that? They claim to have no conflicts of interest. When confronted with evidence showing at least one of the co-authors received thousands of dollars from the Salt Institute, they replied they didn’t get more than $5,000 from them in the last 12 months, so, no conflict of interest!

If you instead look only at the trials in which they did the gold-standard, 24-hour urine collections in healthy people to avoid the reverse causation and controlled for confounders, the curve instead has a continuous decrease of cardiovascular disease (CVD) events like heart attacks and strokes as sodium levels get lower and lower. There was a 17 percent increase in risk of CVD for every gram of sodium a day. And, this is for people without high blood pressure. We’d expect the benefit to be even greater for the 78 million Americans with hypertension. Unfortunately, the media has widely misreported the findings and a false sense of controversy has been broadcast, confusing the public. But it’s not just the media. When editorials are published on the subject in some of the most prestigious medical journals in the world, you don’t expect them to be written by someone who got paid personal fees by Big Salt. Before she accepted money from the Salt Institute, the author was accepting money from the Tobacco Institute and was a frequent expert witness in defense of Philip Morris and other tobacco companies. So, if that’s who the New England Journal of Medicine chooses to editorialize about salt, you can see the extent of industry influence. The editor-in-chief of the American Journal of Hypertension himself worked for many years as a consultant to the Salt Institute.

This video is part of my extended, in-depth series on sodium, which includes:

Salt restriction is also important for kidney stones, as I discussed in How to Treat Kidney Stones with Diet, but aren’t low-salt diets tasteless? Only for a little while. See Changing Our Taste Buds.

For more on how industry influence can distort nutritional science, see:

In health,
Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live, year-in-review presentations:

How to Counteract the Effects of Alcohol

More than a million new cases of skin cancer are diagnosed every year, affecting about one in three Americans in their lifetimes. As I discuss in my video Preventing Skin Cancer from the Inside Out, although the chief risk factor is UV exposure from the sun, alcohol consumption may also play a role. Most of the cancers associated with alcohol use are in the digestive tract, from mouth cancer, throat cancer, and stomach cancer down to cancers of the liver and colon. These involve tissues with which alcohol comes in more direct contact. But why skin cancer?

A study of 300,000 Americans found that excessive drinking was associated with higher rates of sunburn. It “may be that heavy and binge drinking are markers for an underlying willingness to disregard health risks” and pass out on the beach, but it also may be because breakdown products of alcohol in the body generate such massive numbers of free radicals that they eat up the antioxidants that protect our skin from the sun. Plants produce “their own built-in protection against the oxidative damage of the sun,” and we can expropriate these built-in protectors by eating those plants to function as cell protectors within our own bodies. One might say fruit and vegetables provide the best polypharmacy—the best drug store—against the development of cancer.

The ingestion of plant foods increases the antioxidant potential of our bloodstream, and those antioxidants can then be deposited in our tissues to protect us against the damaging effects of the sun’s rays, but only recently was this put to the test.

Researchers studied 20 women and burned their buttock skin with a UV lamp before and after half of them ate three tablespoons of tomato paste a day for three months. There was significantly less DNA damage in the derrieres of those who had been eating the tomatoes. So, three months or even just ten weeks before swimsuit season, if we eat lots of an antioxidant-rich food, such as tomato sauce, we may reduce the redness of a sunburn by 40 percent. It’s like we have built-in sunscreen in our skin. Now, this isn’t as good as a high SPF sunblock, but “[m]uch of the UV exposure over a life time occurs when the skin is not protected; thus, the use of dietary factors with sun-protecting properties might have a substantial beneficial effect.”

It works both ways, though. Alcohol consumption decreases the protection within our skin. If you have people drink about three shots of vodka, within eight minutes—not after ten weeks, but within just eight minutes—the level of carotenoid antioxidants in their skin drops dramatically. If, however, you drink the same amount of vodka in orange juice, there is still a drop in skin antioxidants compared with the initial value, but drinking a screwdriver cocktail is not as bad as drinking vodka neat. Is the difference enough to make a difference out in the sun, though?

After the drinks, researchers exposed volunteers to a UV lamp and waited to see how long it would take them to burn. The time until they started turning red was significantly shorter after alcohol consumption than when either no alcohol was consumed or the alcohol was combined with orange juice. It came out to be about an extra half hour out in the sun based solely on what you put in your mouth before heading to the beach. And, oranges are pretty wimpy, antioxidant-wise: not as bad as bananas, but berries have the highest cellular antioxidant activity.

The researchers concluded that “[p]eople should be aware of the fact that the consumption of alcohol in combination with UV light [from sun exposure or a tanning booth] increases their risk of sunburn and therefore their risk of developing premature skin aging and even skin cancer.” If you are going to drink alcohol and be out in the sun, you should make sure you are using sunblock or, at the very least, drinking a strawberry daiquiri or something else to reduce oxidative damage.

Isn’t that wild? Antioxidant dynamics in the body change minute to minute, so be sure to keep yourself topped off. See:

What else can tomatoes do? Check out Inhibiting Platelet Activation with Tomato Seeds.

Other videos on skin health include:

Alcohol doesn’t just raise the risk of skin cancer. See Breast Cancer and Alcohol: How Much Is Safe?. But, like the orange juice in a screwdriver cocktail, grape skin components may help mediate wine’s adverse effects. See Breast Cancer Risk: Red Wine vs. White Wine.

In health,
Michael Greger, M.D.


Uncovering the Early Silent Stages of Alzheimer’s Disease

In 1985, a Swiss pathologist noted Alzheimer’s disease-like changes—plaques and tangles—in the brains of about three-quarters of a small group of men and women in their 50s and 60s who had died from other causes. Most brains from people under age 30, however, did not have these features. But these studies just involved a few dozen people.

As I discuss in my video, Alzheimer’s May Start Decades Before Diagnosis, based on thousands of autopsies, we have seen what appear to be the first silent stages of Alzheimer’s starting as early as our 20s in about 10 percent of the population and increasing to about 50 percent by age 50. “Just as the first malignant cells in cancer…fail to produce any clinically detectable symptoms but represent a larger and potentially life-threatening disease process, the presence of… [these tangles in the brain] may constitute a true threat.”

The high prevalence of the first stage of the disease and its extraordinarily long duration—most people don’t get diagnosed until their 70s—had not been fully appreciated until now. We now understand that neurodegenerative brain changes begin by middle age, and so does cognitive decline; we start losing brain function in our 40s. 

Before people are diagnosed with Alzheimer’s, they are diagnosed with mild cognitive impairment, or MCI. That’s when cognitive decline becomes clinically apparent. A few years later, Alzheimer’s may be diagnosed, which then eventually results in death. We never knew what was happening before mild cognitive impairment was diagnosed… until now. There appears to be a slow decline in brain function and the buildup of plaques and tangles in the brain for decades before Alzheimer’s is diagnosed. 

This finding potentially has profound implications for the prevention of dementia: We have to start early, before marked brain loss has occurred. The good news is that brain disease is not inevitable, even after age 100. The oldest woman in the world, aged 115, retained the brainpower of those practically half her age. Had she not died from stomach cancer, she could have kept on thriving.

It turns out, there’s no such thing as dying of old age. In 42,000 consecutive autopsies, centenarians—those living past 100—succumbed to diseases in 100 percent of the cases examined, though most were perceived to have been healthy just prior to death, even by their physicians. In actuality, not one died of “old age.” Until recently, advanced age was itself considered a disease, but people don’t die as a consequence of old age, as commonly assumed; they die from diseases, most commonly heart attacks.

One of the most intriguing findings from the 115-year-old woman mentioned above was that her body showed no significant atherosclerosis and the arteries in her brain were clear as well. That may have been one of the secrets to her mental clarity. There is emerging consensus that “what is good for our hearts is also good for our heads,” which I cover in my video, Alzheimer’s and Atherosclerosis of the Brain.

I have an extended video series on this dreaded disease. Learn more about Alzheimer’s in the following videos:

See more on cognitive decline in general in these videos:

More information on healthy aging can be found in

In health,
Michael Greger, M.D.
