Saturday, July 17, 2010

A Closer Look at Tartrazine

Tartrazine, also known as Yellow #5, is a coal-tar-derived azo dye found in a lot of processed food, including Kraft Macaroni and Cheese, Doritos, Mountain Dew, Peeps, and many soups, custards, mustards, baked goods, cotton candy, ice creams, and tons and tons more. It's also in a million and one other products we may use on a daily basis - lotions, face soaps (including the one I used this morning), body soaps (including the one I used this morning!), cosmetics, shampoos, hand stamps - you name it! Despite this multimodal ingestion and cutaneous exposure, each product contains only a very small amount of the dye, so total exposure might be around 1 teaspoon in a year (from Wikipedia, so take that number with a grain of salt; the CSPI site says around 12.75 mg a day on average based on usage data - probably more for kids and those who live on Mountain Dew). There's no reason, though, to freak out and empty out the pantry and medicine cabinet. Dose is important. But, as we well know, some of us can bathe in toxic substances and come out smelling like a rose, and others are sensitive to very small amounts.

The reason I'm looking at it more closely today is that if you hunt around the internet searching for possible creepy things about industrial food dyes, tartrazine has the worst reputation. And, indeed, it was one of the several dyes used in the Southampton Study I discussed earlier this week - and the whole study cocktail of dyes and preservative did lead to increased hyperactivity in kids. Its use as a food additive has been subject to a voluntary phase-out in the UK and outright bans in some other European countries, like Norway. In fact, the Center for Science in the Public Interest in the US called for the FDA to ban Yellow Number 5 on June 30, 2010. (Here's a cute PDF from CSPI - Food Dyes, a Rainbow of Risk.)

But how might tartrazine cause problems? Well, some people (around 1 in 10,000, more or less) are definitely allergic to it. Hives, purpura, anaphylaxis, the real deal. This is why the FDA requires it to be listed on food labels if it is used - for people with hypersensitivity - and you can run afoul of the FDA if you include tartrazine in your product and don't label it (1). Not unusual - lots of natural and manmade chemicals cause allergic reactions in some people. It would be an unfortunate allergy to have, as yellow number 5 is in all sorts of things you wouldn't expect. Also, there seems to be a cross-reactivity in some people with asthma between tartrazine and aspirin - people with aspirin-induced asthma also seem to have asthma triggered by tartrazine (2), and in one case study, desensitization (gradually increasing exposure) used to reduce aspirin sensitivity protected against the effects of tartrazine too (3). Hmmm.

What about other actions of tartrazine? I couldn't find much. One study showed that application of small amounts of tartrazine caused contraction of intestinal muscle cells in guinea pigs, and the effect was blocked by atropine. That clues us in that tartrazine seems to be able to mimic parasympathetic activity, either directly or indirectly, via the muscarinic receptor (4). Now that is quite interesting, as the central nervous system has lots of muscarinic receptors of all known types (M1-M5). Various agents that activate these receptors can cause things such as confusion, convulsions, restlessness, and psychosis - in high enough doses. At lower doses they can sometimes do the opposite, and cholinesterase inhibitors (which increase the CNS activity of acetylcholine, a muscarinic activator) are used to treat dementia. We've taken several big leaps at this point, but it is theoretically possible that, if tartrazine gets into the central nervous system, this muscarinic receptor activation might be a mechanism for some sort of psychiatric reaction (like increased hyperactivity).

The most intriguing information comes from this study from 1990 about how tartrazine influences the zinc status of hyperactive children. Now I'm still trying to get my hands on the full text - the institutional access website has been coy the last few days, and it seems this journal is only available online from 1995 on anyway. But the abstract is telling. Twenty hyperactive boys and 20 age-matched controls were tested for zinc levels in their saliva, urine, 24-hour urine, scalp hair, fingernails, and blood. The hyperactive kids had decreased zinc everywhere but the saliva. Then 10 control kids and 10 hyperactive kids were given a tartrazine-containing drink. In the hyperactive kids, blood levels of zinc went down, urine levels of zinc went up, and their behavior got worse, suggesting that tartrazine caused them to pee out some much-needed zinc. It's a bit hard to tell from the abstract, but the way I read it, the control kids' zinc levels didn't change, and neither did their behavior. So that might be the mechanism by which yellow number 5 influences hyperactivity in certain kids. Ironically enough, Concerta, a long-acting formulation of the same drug as Ritalin, has yellow number 5 as a colorant in the capsule!

The only other dirt I found on yellow number 5 is that it has also been implicated in reducing the absorption or metabolism of vitamin B6, leading to carpal tunnel syndrome, of all things (at least according to Dr. Murray). (Here's the link between B6 deficiency and carpal tunnel, anyway.) The rumor that the yellow number 5 in Mountain Dew causes your testicles to shrink? Well, that's not true.

My stance is - for most kids there is no need to make a big scene at a birthday party by grabbing the bag of rainbow candy out of your kid's hand. But on a day to day basis, processed food should be avoided in favor of whole, real food anyway. Doing that will reduce exposure to the rainbow soup of chemicals in processed food. Not to mention the mountains of fructose, trans fats, and genetically modified ingredients. Real food is a win/win. Weirded out by yellow number 5 in shampoos and soap? Check the labels if it bothers you. Or use baking soda and apple cider vinegar as cheap alternatives to shampoo and conditioner.

Thursday, July 15, 2010

Hyperactivity and Diet

Here's the title to a June 2009 article in the Harvard Mental Health Letter: "Diet and Attention Deficit Hyperactivity Disorder - Can some food additives or nutrients affect symptoms? The jury is still out."

Hmmm. That sounds very noncommittal. Let's start instead with the 2007 Southampton Study. Published in the Lancet, Britain's premier medical journal, this was a well-designed nutritional study! We're talking a randomized, placebo-controlled, crossover study. That's enough to make any biological scientist's heart go pitter-patter. So what did the researchers do? They took 153 three-year-olds and 144 eight/nine-year-olds recruited from the community, and got a baseline measure of hyperactivity via a questionnaire filled out by teachers. All the kids were put on an additive- and preservative-free diet. At weeks 2, 4, and 6, half the kids were given - randomly, in crossover fashion - a study juice drink containing either Mix A (tartrazine (yellow #5), sodium benzoate, sunset yellow, carmoisine, and ponceau 4R) or Mix B (sodium benzoate, sunset yellow, carmoisine, quinoline yellow, and allura red AC). Mix A was meant to be equal to the additives and preservatives found in 2 bags of candy and was similar to a mix used in previous studies; Mix B corresponded to 4 bags of candy - meant to be a representation of the average amount of additives a kid on a normal diet might receive. The other half of the kids got a placebo drink that was the same color and flavor as the test drinks, but had no artificial colors or preservatives (I'm sure they managed it somehow). On weeks 3 and 5, everyone got a placebo (these were "washout" weeks).

Then the researchers asked parents and teachers to assess the children's behavior using standard clinical instruments, and also had independent reviewers observe the kids at school. The investigators found a significant increase in hyperactivity during the weeks the kids (both the 3-year-olds and the 8/9-year-olds) consumed the drinks containing the artificial colors. Some kids appeared to be more vulnerable to the effect than others, but the overall effect was a 10% increase in hyperactivity. This was similar to the results of an earlier meta-analysis, and, in summary, the effect of removing additives from the kids' diet is about 1/3 the effect of giving kids Ritalin to calm them down.

The analysis of the study came down to this - some kids are very sensitive to additives, and their behavior will be significantly impacted. With other kids, it won't matter much. The results of the study, however, were impressive enough that the UK and several other European countries moved to ban or phase out the studied food additives, so Skittles sold in the UK have no sunset yellow or tartrazine, though as far as I know, Skittles in the US have the same old recipe of hyperactive family fun. Sodium benzoate was not banned.

So why is the Harvard Mental Health Letter so noncommittal? Well, they cite a well-known study of 35 mother-son pairs in which the mothers believed the boys, ages 5 to 7, were sugar sensitive. The researchers told the moms that the boys would be randomly assigned to a large dose of sugar or a dose of aspartame. In reality, all the boys received aspartame. The mothers who thought their sons got sugar reported their child's behavior to be significantly more hyperactive afterward. "The researchers concluded that parental expectation may color perception when it comes to food related behaviors." Really? Well, doesn't a double-blind, prospective, crossover design with independent examiners CONTROL for that sort of thing in the Southampton Study? I would think so.

The Mental Health Letter goes on to talk about omega-6 to omega-3 fatty acid ratios (they recommended kids consume 12 ounces of low-mercury shellfish or fish a week) and micronutrients (deficiencies in zinc, iron, magnesium, and vitamin B6 have been documented in children with ADHD, but there is no evidence that supplementation is helpful, and megadoses, which can be toxic, should be avoided). Eventually, they recommend "a healthful diet" for kids and minimizing processed and fast food. Well, I can get behind that. And, frankly, I don't recommend a steady diet of Skittles to anyone, though it would be nice to have the option (as the European moms do) of having candy without the crappy additives. Since hyperactivity affects 10% of children (1), what they eat can have a large effect, overall, on kids and parents alike.

Monday, July 12, 2010

D-D-Depression

I was deficient in vitamin D. Of course. I paid attention to the official word about sunshine - it's bad for you. Ultraviolet radiation chops up your skin cell DNA, and with enough scrambled DNA and a bit of bad luck, you will eventually get cancer. There are several major types of skin cancer, but melanoma is the scariest. Also, sun gives you wrinkles and age spots and... so I've been putting on sunblock and avoiding the beach, except for a few days a year, for at least 10 years.

At the same time the dermatologists and women's magazines were scaring us away from the sun, our own fat phobia and a cultural trend of eating less organ meat scared us away from the best dietary sources of some key fat soluble vitamins (A, D, E, and K). We don't want to be low in these vitamins, as they tend to help orchestrate a lot of functions in the body. Vitamin D (which is found in animal fats, but we tend to get about 90% from the sun) in particular seems to be involved in about 10% of the biochemical soupy stuff our body does every day. It has a lot to do with membrane signaling and scavenging up any screwy cells that are starting to go awry (i.e. cancer), and being low in vitamin D seems to put us at hugely increased risk of cancer, including melanoma. And prostate cancer. And breast cancer. And colon cancer. In fact, women diagnosed with breast cancer in the summer and fall have the best prognosis. There are reports of chemotherapy not working as well in the winter. (1)

There are also links to mental health - depression, bipolar disorder, and psychotic disorders (2) have all increased in populations once most people stopped working outside and went to work indoors. The elderly with low vitamin D also have much higher rates of depression (3). In this study of bone mineral density and depression, the elderly with poorer bone status were also more depressed (though vitamin D was not explicitly proposed as the possible linking factor for both illnesses).

How would vitamin D affect the brain? Vitamin D is involved in the synthesis of the catecholamines (which are highly involved in neurotransmission). Summer sunlight increases brain serotonin levels twice as much as winter sunlight (4). Neurons and glial cells in all kinds of areas of the brain have vitamin D receptors on them, indicating a brain that is hungry to use vitamin D. Some effects in the nervous system include the synthesis of neurotrophic factors (what I call "brain fertilizer"), inhibition of the creation of an enzyme that chews up nitric oxide, and increasing glutathione levels. (See my previous posts for a molecular description of how some of these brain chemicals are thought to be involved in depression). As vitamin D in the periphery is associated with scavenging and cleaning cancer cells, vitamin D in the central nervous system seems to be involved in detoxification and anti-inflammatory pathways (5)(6).

Does supplementation help depression? Well, the first several studies were disappointing. Harris and Dawson-Hughes tried treating seasonal affective disorder with 400 IU of vitamin D2 daily. Didn't do squat. Of course, D2 is the plant form of vitamin D (the animal form is D3), and 400 IU is a tiny dose anyway. Lansdowne and buddies gave 400 IU and 800 IU of vitamin D3 to healthy subjects in late winter, and found a lightened mood in those receiving the supplements. Hollis gave people with seasonal affective disorder a single 100,000 IU dose of D3 and found it to be more effective than light therapy, with the improvement statistically correlated with the improvement in serum 25(OH) vitamin D levels. In this intriguing study, young adults were given access to tanning beds on Mondays and Wednesdays. One bed had UV light; an identical bed didn't. On Fridays, the participants were allowed to choose which bed they wanted. 95% of the time they chose the UV bed, and participants also reported being more relaxed after a UV tan than after the sham bed.

Unfortunately, there is no large, well-designed study of D3 supplementation for depression that I'm aware of. However, there is enough interesting evidence for such a trial to be done, especially in populations that are more likely to be vitamin D deficient, such as the elderly. Like fish oil, vitamin D3 is cheap (about $10 for a three month supply) and readily available. And given the links to other diseases also (heart disease, stroke, osteoporosis, kidney damage, hypertension, you name it (1)), it would seem prudent (and money-saving from a public health standpoint if a lot of cancer is really prevented by adequate supplementation) to test for and treat deficiency in people with psychiatric disorders.

Another issue is that the RDA for vitamin D is woefully small - about 400 IU daily. This is an amount that will keep you from getting rickets, but it's certainly not an optimal amount for humans. I've heard murmurings that the official RDA is going to be increased to 1000 IU daily, and most decent multivitamins have 1000 IU of vitamin D already (that's why your multi says "250%" of the RDA of cholecalciferol (vitamin D3), in case you were wondering). The amount in fortified milk is also small, so one would need to drink a truckload for it to matter much.

So how much vitamin D do we need? And hey, isn't vitamin D a fat-soluble vitamin, which means we can store it for a long time - couldn't we get toxic from high amounts? The answer is: we probably need many times the current RDA for vitamin D to get reasonable serum levels of the stuff, and yes, we can get toxic, but for most people that is not a realistic worry.

According to the Vitamin D Council, a serum level of 50 ng/ml or higher of 25(OH) vitamin D3 is optimal. This level is not without controversy, and 35 is accepted by most as the minimal acceptable level. One probably doesn't want to go above 100, though toxicity has only been reported at serum levels higher than 150 (6). You can't get too much vitamin D from the sun - our skin actually destroys excess vitamin D made there after you have enough for the day. A cool regulatory mechanism if ever I heard one. You *could* theoretically get toxicity from combining high amounts of supplementation *and* lots of sunshine. There's a description on the Vitamin D Council website of one guy who actually did get toxic from supplements - it turned out an industrial accident made his particular brand of vitamins (Prolongevity) contain up to 430 times the amount on the label. This guy was taking between 50,000 IU and 2.6 million IU daily for about two years. He recovered (uneventfully) with some medicine and sunscreen.
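Just to make those cutoffs concrete, here's a minimal sketch in Python of the ranges quoted above - the function name and the exact boundary handling are my own illustration, not anything published by the Vitamin D Council:

```python
def classify_25ohd(ng_ml: float) -> str:
    """Bucket a serum 25(OH) vitamin D3 level (ng/ml) per the ranges above."""
    if ng_ml >= 150:
        return "range where toxicity has been reported"
    if ng_ml > 100:
        return "higher than you probably want"
    if ng_ml >= 50:
        return "optimal, per the Vitamin D Council"
    if ng_ml >= 35:
        return "minimally acceptable"
    return "low - consider sun and/or supplementation"

print(classify_25ohd(31))  # -> "low - consider sun and/or supplementation"
```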

So how do you know if you have enough vitamin D? Well, if you are a lifeguard in Miami, you're probably fine. If you have very dark skin, unless you are a lifeguard on the equator, you probably need some supplementation. It can take someone with very dark skin about 5-6 times longer in the sun to make adequate vitamin D compared to someone with very pale skin. If you live north of 40 degrees latitude (above New York City), you have only a few weeks in the summer to expose that skin and get the full amount of vitamin D you need to last you for the year, and you may have to supplement (again, there is controversy about this, especially as very pale people of Northern European ancestry managed to live far north of 40 degrees with only a few days a year when they could possibly get adequate vitamin D from the sun). Anyway, to really know your blood levels of vitamin D, you need to get a blood test. The key level you need to know is 25(OH) vitamin D3. If your doctor orders 1,25(OH) or just "total vitamin D" you might not get the right number, so make sure you look at the lab slip. If you don't want to go to the doctor, you can go to this website and pay $65 or so for a home testing kit. Unless you live in New York state, where home testing via mailed bloodspot cards is apparently illegal.

So let's say you ordered a home test kit and stabbed your finger and shipped your spot of blood back to the lab, and your level comes out to be 31 ng/ml. There's a general rule of thumb that 1000 IU of supplementation daily will increase blood levels by 10 ng/ml. (Use geltabs in oil suspension rather than tablets, unless you are always going to be taking the supplement with some oil/fat.) So let's say we are aiming for 50 - then one could take 2000 IU of D3 daily in the morning. If you were already supplementing at 1000 IU (in your multivitamin, for example), you could take an additional 2000 IU daily, and you could skip the additional supplementation on days you spent time in the sun (without sunscreen - sunscreen blocks the UVB rays that synthesize vitamin D in the skin). Exposing arms and legs for 20 minutes midday in the summertime in Boston about 3-4 times a week would get you a goodly amount (probably around 10,000-12,000 IU with each exposure) if you have pale skin. That kind of exposure is not such a big deal for skin cancer risk, as long as you avoid burning. The farther south you are (until you get to the equator, then reverse!) and the paler you are, the less time you need.
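To make the rule-of-thumb arithmetic explicit, here's a minimal sketch assuming (as above) roughly 10 ng/ml of serum 25(OH)D per 1000 IU/day, rounding to the nearest 1000 IU the way the example does. The function name is hypothetical, and this is back-of-the-envelope illustration, not dosing advice:

```python
def estimated_extra_d3_iu(current_ng_ml: float, target_ng_ml: float = 50.0) -> int:
    """Estimate additional daily D3 (IU) to raise serum 25(OH)D to the target,
    using the ~10 ng/ml per 1000 IU/day rule of thumb quoted above."""
    gap_ng_ml = max(target_ng_ml - current_ng_ml, 0.0)
    extra_iu = gap_ng_ml / 10.0 * 1000.0          # 1000 IU/day per 10 ng/ml
    return int(round(extra_iu / 1000.0)) * 1000   # round to the nearest 1000 IU

# The example from the post: a level of 31 ng/ml, aiming for 50
print(estimated_extra_d3_iu(31.0))  # -> 2000
```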

It is standard practice for physicians to treat vitamin D deficiency with 50,000 IU pills once a week for 8-12 weeks, then recheck. Unfortunately, a recent JAMA study of similar treatment in elderly women (admittedly it was 50,000 IU of D3 daily for 10 days) resulted in a large increase in the number of fractures. The editorial accompanying the study suggested 4000 IU daily as a safer, more physiological amount to treat deficiency, along with adequate calcium. However, if you supplement with calcium and vitamin D3, your absorption of calcium can increase quite a bit as your vitamin D levels become adequate (see slides 18-36). Therefore, you may not need as much calcium if you take vitamin D. The recommendations are not set in stone, though. (Our current RDA for calcium may be high simply because we don't get enough vitamin D!) Also, most of the prescription vitamin D doses are D2, not D3, and D2, the plant form, is probably not nearly as effective as the animal-derived form, D3.

Here's yet another thing to watch out for with higher-dose vitamin D3 supplementation: occasionally, you will unmask some hyperparathyroidism. If someone's parathyroid is working in overdrive, he or she will start to have serum calcium levels that are way too high, potentiated by the higher doses of vitamin D3. This can be dangerous if it goes undetected, though fortunately high calcium levels tend to be very uncomfortable, with symptoms of muscle twitching, cramping, fatigue, insomnia, depression, thinning hair, high blood pressure, bone pain, kidney stones, headaches, and heart palpitations. Since bone pain, fatigue, depression, and insomnia can be symptoms of low vitamin D as well, it is important to realize that if your symptoms get worse with supplementation, you should see your doctor and get calcium and parathyroid hormone levels checked. While I personally don't check calcium levels with the initial vitamin D level, I do check them at follow-up (I tend to recheck after three months or so). While home testing is a neat option for the initial level, seeing your doctor about follow-up and his or her suggestions for supplementation is a good idea if your level is found to be low.

And what about those other fat-soluble vitamins: A, E, and K? It is important to have enough of each of them, or things can get a bit screwy. For example, in order to create bone, you need adequate vitamin D (at least a level of 20-30), adequate calcium, AND vitamin K2. The best sources are animal fats, particularly the fats from animals that eat their natural diet - grass for cows, or grubs and grains and whatnot for chickens. So pastured chicken egg yolks, and butter and liver from pastured cows. Conventionally raised eggs can have about 1/20th the vitamins of pastured eggs, and butter from grain-fed cows may have as little as 1/200th as much K2 as pastured butter, so it really does matter what the animals you eat ate. Vitamin A is also found in multivitamins, and it is important not to have too much vitamin D3 and too little A, so I've recommended a multivitamin in addition to vitamin D3 for people who are deficient in serum 25(OH) vitamin D (and aren't big liver eaters :)).

Strict vegetarians - here's another place you need to be super careful about what you eat, and you might need to choke down some fermented soy products (natto) to get enough vitamin K2. K2 isn't found in a standard multivitamin (though we can make K1 into K2 if our intestinal flora is happy, which it might not be on a standard American diet - no idea about the flora on a vegan diet. Interesting question.) and is vital to bone formation and to keeping our arteries resilient. Vitamin K is what warfarin blocks, so don't take K2 if you are on Coumadin for blood clots. (Though why are you at risk for blood clots in the first place? Maybe too much omega-6 compared to omega-3?)

So, a key part of good, lasting health is either to get plenty of (safe - no burns!) sun as our ancestors did, or to use today's science to get your blood levels of vitamin D where they need to be. Chat with your doctor about it - and check out the Vitamin D Council website for more information.

Friday, July 9, 2010

"Epidemiology is Bogus"

The paleoblogosphere is humming with excitement the past few days over the glorious work done by raw food blogger Denise Minger in her personal examination of the China Study, a large epidemiological data set used by researcher T. Colin Campbell to formulate his book The China Study: The Most Comprehensive Study of Nutrition Ever Conducted and the Startling Implications for Diet, Weight Loss and Long-term Health. In it, Campbell comes to the conclusion that avoiding animal protein is the best way to avoid all sorts of diseases of civilization.

Denise's post uses the same China data set to implicate wheat, not animal protein, as a major factor associated with heart disease and cancer. Kurt Harris of PaNu makes a point in his analysis of Denise's work that comes closest to my own struggles combing the literature - association studies are interesting, but troublesome. Conclusions from such studies should be viewed with a furrowed brow. "Hmmm, that's intriguing. I wonder why red-tailed baboons who eat less algae live longer than the red-tailed baboons who eat more algae? Why don't we do a prospective randomized controlled trial of red-tailed baboons and algae eating to sort that one out?" Because associations always come with confounding factors - turns out red-tailed baboons who eat algae love race car driving, and have you ever seen a baboon wearing a seat belt? Sophisticated statisticians will try to account for all these confounding factors, but such a task can be simply impossible when examining a complex system such as a society, or the biochemistry of the human body.

The psychiatric literature is loaded with brief, often useless short-term randomized controlled studies of varying quality, plus association studies. The nutrition literature, it seems, is even worse. Large, long prospective trials of good quality are horrendously expensive and may take decades to do properly.

This is why I feel the healthiest and most sensible way to eat is based on an evolutionary paradigm, and that evolutionary-based lifestyle measures are likely to be most effective for our modern, unsettled minds. (I haven't blogged too specifically about these measures yet, as the nutrition aspect is my primary personal interest, but they would include regular exercise, meditation (as a proxy for the intense in-the-now concentration we once used for hunting and gathering), proper sleep, working with the hands, and various other social/fun stress-reducing activities.) It is not that there is a huge amount of scientific data backing this up - it is that, until we have exhaustively proven otherwise (which so far, in my mind, we haven't), scientists can at least concede that our exceedingly complex human design is based on adaptations to our ancestors' lives.

For public health and sanity, I believe in an evolutionary viewpoint. And animal protein :)

Tuesday, July 6, 2010

Autoimmune Disease

The first thing you think of when the word "psychiatry" comes up is probably not autoimmune disease. But I see a lot of it, as autoimmune diseases of all kinds (lupus, MS, hypothyroidism, rheumatoid arthritis, asthma, seasonal allergies, etc.) bring with them their tough-to-treat anxieties and depressions. And of course, as a psychiatrist, I'll also get referrals for people with vague symptom clusters of autoimmune-like illness who have been worked up for all sorts of dire things; when nothing shows up on lab tests, they are shipped off to me. A few years ago, I actually helped write a book for these kinds of symptoms - Feeling Better: A 6-Week Mind-Body Program to Ease Your Chronic Symptoms - so I might get more of these kinds of referrals than the average psychiatrist. (Full disclosure - please ignore the nutrition chapter of the book, if you happen to pick it up! It's more Body for Life than Primal Blueprint!)

For today's post, I thought I might summarize Staffan Lindeberg's work from Food and Western Disease: Health and nutrition from an evolutionary perspective.

From the beginnings of the immune system, our ancestors have used it to fight off attacks by strange-looking proteins. Autoimmune problems occur when our immune systems are somehow triggered to attack our own bodies.

How could this happen? Lindeberg suggests the following: start with an increased amount of potential antigens in the intestine from problematic foods such as grains, milk, and beans. Cereals and beans also contain enzyme inhibitors that keep our intestines from fully breaking them down. Then plant lectins in beans and grains open the tight junctions between the intestinal epithelial cells, allowing the partially digested, undesirable molecules to pass through. Gliadin, a wheat protein, also activates zonulin signaling, which makes the intestine even more permeable. (Yummy.) Heck, "even the permeability of the blood-brain barrier has been shown to increase with the consumption of wheat lectin" (page 211).

Speaking of wheat - gluten intolerance is an autoimmune disease. In celiac disease, the consumption of gluten (found in wheat, rye, and barley) causes the intestinal lining to be destroyed. It is genetic, but is generally resolved by eliminating gluten grains from the diet. Dermatitis herpetiformis is an autoimmune skin disease that is also caused by exposure to gluten, and there is a form of ataxia (difficulty moving due to problems with muscle coordination) that improves dramatically with the removal of gluten from the diet. Other studies have found people whose headaches went away with a gluten-free diet. I've already reviewed the data in schizophrenia.

In rheumatoid arthritis, many patients have anti-milk or anti-wheat antibodies, or both. Fasting has been shown to be helpful, and so have a Mediterranean-like diet and a gluten-free vegan diet with a low omega-6 to omega-3 ratio.

Type 1 diabetes (caused by autoimmune destruction of certain cells in the pancreas) has a geographical distribution that is strongly related to the consumption of cow's milk, both globally and regionally. The more milk consumed, the more type 1 diabetes is found in a particular area. Immigrants who move to a milk-consuming area start to get more type 1 diabetes than their relatives back in relatively milk-free areas (such as Japan). Beta-casein A1 is probably the most likely milk protein candidate, as Iceland has cows that produce smaller amounts of beta-casein A1, and its happy milk-drinking people tend to have less type 1 diabetes than you would expect. Wheat lectins have also been suspected of causing issues with diabetes - in one study, 19 of 23 children with new-onset type 1 diabetes had antibodies to a wheat gluten protein fragment. Perhaps the wheat lectins open the gut, and then the beta-casein A1 gets into the bloodstream and triggers the autoimmune attack.

Multiple sclerosis has a nearly identical geographic distribution to type 1 diabetes, and many people with MS are found to have immune reactions to milk proteins. Wheat gluten antibodies are also increased among MS patients.

Lindeberg suggests, in summary, that there is plenty of evidence that problematic proteins in our food cause autoimmune reactions. Therefore, a trial of a Paleolithic diet for preventing and treating these illnesses is certainly reasonable. If your autoimmune disease is active, he says, reduce the amount of circulating proteins contributing to the problem. Don't eat cereals (including wheat), milk products, or beans. Make sure your gut is digesting properly by avoiding protease inhibitors such as those found in cereals, beans, and soy. And keep the tight junctions in your gut happy and tight by eliminating the powerful lectins found in grains, soy, and other legumes.

Sound familiar?

Saturday, July 3, 2010

Low Cholesterol and Suicide

Low serum cholesterol has been linked in numerous scientific papers to suicide, accidents, and violence (1)(2)(3)(4)(5)(6)(7)... there are a bunch more, but I'm a bit weary of linking! This is why I write a blog, and not a peer-reviewed journal. Anyway, no one knows to this day whether low cholesterol is a metabolic byproduct of depression, violence, and suicide risk, or whether having low cholesterol itself predisposes you to suicide (here's a rather snarky editorial pointing out that fact (8)). Some trials of statins (with the resultant crackerjack drop in cholesterol) show no effect on suicide (9). A statin skeptic's favorite study, the J-LIT trial, showed deaths by accidents/suicides increased threefold in the group with total cholesterol less than 160 (yes, the p was .09, which just misses conventional statistical significance - meaning roughly a 9% probability of a difference that large arising by chance alone if there were no real effect (10)).

Now, why would serum cholesterol have anything to do with the brain and depression? Good question - and the first question to ask in any theory of the brain is whether the peripheral levels of something have anything at all to do with the central nervous system amounts of the same thing. So do serum cholesterol levels track the relative amounts of cholesterol in the brain? They do (11). And cholesterol is important in the brain. Synapses, where brain function goes live, have to have cholesterol to form. Brain signaling is all about membranes, and cell membranes are constructed from fat. Cholesterol and the omega-3 and omega-6 fatty acids are the most important molecules in the synapse. If your brain fat is significantly different from so-called "normal" fat (going back to the hunter-gatherer paradigm, I'll say a hunter-gatherer's brain has the approximate fat constituents we evolved for), the signaling in your brain could be very different too (12). Scientific papers will call this "alterations of membrane fluidity" (13).

So we know that low serum cholesterol is associated with suicide, violence, and accidents. (Another wrench in the works - low serum cholesterol is also associated with low CSF serotonin, which is of course associated with increased violence and suicide! These association studies are enough to make anyone give up and go boar hunting.) But does dietary fat intake have anything to do with depression and suicide? (Remember, serum cholesterol is often a chancy thing to connect to diet, after all.) Well, among some 3,400-odd people in Finland, the omega-3-rich fish consumers (14) had significantly less depression than abstainers, though the finding was more robust among women (no one knows why). In this round-up of 408 suicide attempters and an equal number of controls, there was no difference in saturated fat intake between attempters and controls, but the attempters did report lower fiber and polyunsaturated fat intake.

And, finally, do statins cause depression? I've seen statins cause or exacerbate depression several times in my clinical practice, especially in women. (I've also seen them cause paranoid psychosis a few times - twice in women and once, a long time ago, in a man. The psychosis remitted with withdrawal of the statin.) Very striking! But anecdotes aren't clinical trials. This brand-new study shows no link, and statins actually seemed to decrease depression in elderly women. This study also shows no link. This study shows that chronic cholesterol depletion via statin use decreases the functioning of serotonin 1A receptors in humans, by decreasing the ability of the receptor to bind to its friendly neighborhood G proteins and other binding proteins. (The serotonin 1A receptor is more highly associated with anxiety-type symptoms than with depression.)

Clear as mud! But stepping back to whole health, I never like the idea of "the lower the better, no matter what," which seems to be the prevailing wind of cholesterol treatment right now. Usually, chemicals in the body are important for something, or else they wouldn't be there, and typically a U-shaped curve emerges, where too little or too much (cholesterol, vitamin D, omega-3s, immunoglobulins, you name it!) is bad for human health. Here's an example of the U-shaped curve from the J-LIT trial (via Hyperlipid), showing increased cardiac death at both low and high serum cholesterol levels.

Statins may have their role, but please don't put them in the water. In my opinion, adopt a whole-foods, paleo-style diet. Keep yourself in the middle of the U-shaped curve our human systems evolved for. It may help your mood, too, especially if you are a woman who eats fish!

(Follow-up posts:  Low Cholesterol and Suicide 2 and Your Brain Needs Cholesterol, Don't Go Too Low)

Friday, July 2, 2010

Semistarvation Experiments in WWII

This morning I had a chance to read this fascinating article about Ancel Keys' semistarvation experiments on conscientious objectors toward the end of World War II.

For the study, 36 healthy young men who had been excused from armed service for ethical objections to killing agreed to a year-long diet of sorts: 3 months of preparation, 6 months of semistarvation designed to make the men lose 25% of their body weight, and then 3 months of refeeding. The purpose of the study was to determine how people would react under such conditions, and also to learn how to safely and successfully refeed starving populations. The men were highly motivated; their purpose was to help their country and the young men fighting overseas who might face starving conditions themselves.

The young men lived in a dorm at the University of Minnesota, and in addition to their restricted diet, they were required to walk 22 miles a week. All their food was prepared in a dormitory kitchen, and once the starvation began, each man's calories were adjusted every Friday to meet a weight loss goal of 2.5 lbs (1.1 kg) per week. Their average intake during the semistarvation period was about 1600 calories a day (they ate approximately 3200 calories daily before the study). I find the number 1600 especially compelling, as a standard weight loss diet recommended for a woman is about 1200 calories daily. Their food consisted of what might have been available in war-torn Europe at the time - potatoes, turnips, rutabagas, dark bread, macaroni, small glasses of milk, chicken, toast with a small smear of jam, those kinds of things.

What was it like for them? Well, horrible. They described lethargy, irritability, and anxiety that mounted each time they were to learn how much they would be allowed to eat the following week. A buddy system had to be instituted so that none of the men left the dormitory alone, after one man went off the diet and had to be excused from the study. They had dizziness, cold intolerance (requesting heavy blankets even in the middle of summer), muscle soreness, hair loss, reduced coordination, edema, and ringing in the ears. Some had to withdraw from their university classes because they could not concentrate. Their sex drive disappeared. They became obsessed with food, eating with elaborate rituals (which eating disorder patients also do) and adding water to their plates to make the food last longer. Many collected cookbooks and recipes. One man, tempted by the odor from a bakery, bought a dozen doughnuts and gave them to children in the street just to watch them eat. Originally, the participants were allowed to chew gum, but when many of the men got up to chewing about 40 sticks a day, it was decided that gum would affect the experiment and it was disallowed.

Only 32 of the original 36 completed the semistarvation period. One man who broke the diet admitted to stealing scrapings from the garbage cans, stealing and eating raw rutabagas, and stopping at shops to eat sundaes. Two of the men suffered severe psychological stress - one became suicidal, and another began self-mutilating - and both had to be taken to a psychiatric hospital. (The details of the mutilation and suicidality are not mentioned in the article I cited at the top of the post, but are described in Good Calories, Bad Calories by Gary Taubes. The book Depression-Free, Naturally: 7 Weeks to Eliminating Anxiety, Despair, Fatigue, and Anger from Your Life describes one man cutting off three of his fingers.)

The 3-month refeeding period involved trying several different combinations of protein, vitamins, and calorie levels. Dizziness, apathy, and lethargy improved first, but persistent hunger, weakness, and loss of sex drive persisted for several months. The men described "a year-long cavity" that needed to be filled. The day after they were finally released from the study, one of the men was hospitalized to have his stomach pumped after binging. In the aftermath of the study, "many, like Roscoe Hinkle, put on substantial weight: 'Boy did I add weight. Well, that was flab. You don't have muscle yet. And get[ting] the muscle back again, boy that's no fun.'" None who were interviewed in their 80s felt there was any lasting medical harm, once they'd recovered.

If you have a moment, the article is definitely worth a read, and it's only 6 pages long. Much easier to digest than Keys' 1,385-page textbook based on the research, The Biology of Human Starvation (which I must admit, I have not read, and given the height of the stack of books on my nightstand, I will not be reading any time soon).

Again, what strikes me the most about this study is how close it is to the standard recommendations for weight loss today (a 500-1000 calorie daily deficit for a goal of 1-2 pounds lost a week, plus moderate exercise). The difference is one of degree (roughly a 1600 calorie daily deficit for a goal of 2.5 pounds lost a week), and the fact that the men were normal weight when they began the study. But this strict diet sent 6% of the participants to the psychiatric hospital - and these were highly motivated, healthy young men! There is also a marked contrast between the psychological states in this long-term semistarvation and reports of shorter-term water fasts. And what about bariatric surgery patients? The voluntary surgery leads to forced, sustained semistarvation, after all. This study shows an improved quality of life and far greater weight loss compared to obese controls, and this study shows improved mental health at 6 and 12 months post surgery (though they used a questionnaire called the SF-36, in which 36 questions somehow covered eight dimensions of "physical functioning, role limitation due to physical health problem, bodily pain, social functioning, general mental health, role limitations due to emotional problems, vitality/energy/fatigue, and general health perceptions"). Finally, this study shows sustained improvement in depression after bariatric surgery with subsequent weight loss, even after 4 years. The difference in the amount of adipose tissue available for fuel may matter (though I sincerely doubt it, or else it would be easy to lose weight if one were obese), and most post-bariatric-surgery diets, it seems to me, must be higher in protein than the Ancel Keys diet. Also, there is suspected to be an immediate hormonal change in the body after a gastric bypass, and this may affect satiety and the perception of starvation as the body fat set point is suddenly adjusted much lower.
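To put rough numbers on that comparison, here's a minimal sketch using the common (and admittedly crude) rule of thumb of about 3,500 kcal per pound of body weight - my own assumption for illustration, not a figure from the Keys study. Note that a steady 1600-calorie deficit predicts a bit more than the 2.5 lb/week target, consistent with the calories being re-adjusted every Friday as the men's requirements fell:

```python
KCAL_PER_LB = 3500  # rough rule of thumb for a pound of body weight

def weekly_loss_lbs(daily_deficit_kcal: float) -> float:
    """Predicted weekly weight loss for a steady daily calorie deficit."""
    return daily_deficit_kcal * 7 / KCAL_PER_LB

# Standard advice: a 500-1000 kcal/day deficit -> about 1-2 lbs/week
print(weekly_loss_lbs(500), weekly_loss_lbs(1000))  # -> 1.0 2.0

# Minnesota experiment: ~3200 kcal baseline cut to ~1600 kcal/day
print(weekly_loss_lbs(3200 - 1600))                 # -> 3.2
```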

All told, prolonged semistarvation on turnips and dark bread is not something I would recommend for anyone, if you can avoid it. Perhaps Mrs. Ancel Keys said it best when she described the effects of the experiment on her husband, as one participant recalled: "Mrs. Keys said that Dr. Keys went through terrible times during the experiment as we lost weight and became gaunt and so on. And he would come home and say, 'What am I doing to these young men? I had no idea it was going to be this hard.'"