For years, I've wondered why celiac experts remain an underperforming segment of the medical industry. They just never seem to be able to get all their ducks in a row. I think I've figured out why - they often don't know what they're talking about, but that doesn't keep them from talking about it anyway.
One of the things that really bugs me about celiac "experts" is that they're notorious for refusing to admit when they're wrong, (of course, they're wrong most of the time, so just owning up to their mistakes would probably keep them pretty busy).
Also, instead of bothering to collect accurate facts on which to base their working knowledge, they continue to promote incorrect assumptions, half-truths, and omissions as its source. Even after their own research proves those claims to be wrong, they sometimes continue making the same incorrect assertions, as if it doesn't really matter what they say. It's impossible for anyone to advance the cause of science when their work is based on initial "facts" that are not actually true. I have no way of knowing whether those little gems of knowledge originated as incorrect assumptions, or were handed down as urban rumors that the celiac community simply adopted as truths, but any way you look at it, they are bad science, and they should have no place in mainstream medicine.
Consider, for example, the following quote from a recent article.
http://consumer.healthday.com/Article.asp?AID=651247
Other potential explanations for the rise in celiac disease rates, according to Fasano, include:
An increase in the amount of gluten found in grains. "We eat grains that are much more rich in glutens than they were 70 or 80 years ago," he said.
Do we actually "eat grains that are much more rich in glutens than they were 70 or 80 years ago"? I see this mistaken claim repeated so often that even though I've disputed it previously, I feel obligated to once again expose it as a "mistaken assumption", (that sounds so much better than calling it a bald-faced lie).
Kansas is considered to be the breadbasket of the U.S., since that state produces more wheat than any other state in the union. So it's no coincidence that the Kansas State Agricultural Statistics Service has the most complete records of wheat production in the United States. Their records go all the way back to 1866, but unfortunately, they only started collecting and reporting data on the protein content of their wheat crop sometime in the late 1930's, so protein data were not available until the 1940's. Consider these representative average protein reports from the Kansas wheat crop over the years:
1948 wheat crop - average protein content - 12.4%
1958 wheat crop - average protein content - 11.8%
1968 wheat crop - average protein content - 11.7%
1978 wheat crop - average protein content - 12.0%
1988 wheat crop - average protein content - 12.5%
1998 wheat crop - average protein content - 11.5%
http://www.wheatmania.com/grainsofhisto ... etches.htm
That report ended with the new millennium, so we'll need to look at a more recent report to get more recent figures.
http://www.ksre.ksu.edu/library/crpsl2/mf2773.pdf
The 2010 Kansas wheat crop produced 369 million bushels on 8.2 million harvested acres. It featured an average test weight of 61.4 pounds per bushel and protein of 12.0 percent, both of which exceed the 2009 averages of 61.0 pounds and 11.6 percent, respectively. The 10-year test-weight average is 60.3 pounds per bushel, and protein is 12.3 percent.
Does it appear that the protein content of Kansas wheat has changed significantly over the years? It sure doesn't appear that way to me - it appears that the protein content has been remarkably stable for many decades, (at least as long as wheat protein records have been maintained). We may eat more of those grains than we did 70 or 80 years ago, but the grain itself has changed little, if any. Of course, those who disagree with me will claim that the protein content of wheat is irrelevant - that it's the gluten fraction of the protein that has increased so dramatically over the decades. But if that's true, where's the evidence? I don't see any evidence for that claim, except as oft-repeated hearsay.
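For anyone who wants to check my arithmetic, here's a quick sketch, (in Python - the script is my own addition, but the figures are just the ones quoted above), that averages the decade samples plus the 2010 figure and shows how little they vary:

```python
# Sanity check on the Kansas wheat protein figures quoted above.
# Values are the decade samples from the Kansas Agricultural
# Statistics records, plus the 2010 crop report figure.
from statistics import mean, pstdev

protein_pct = {
    1948: 12.4, 1958: 11.8, 1968: 11.7,
    1978: 12.0, 1988: 12.5, 1998: 11.5, 2010: 12.0,
}

avg = mean(protein_pct.values())       # long-run average protein
spread = pstdev(protein_pct.values())  # how much it wobbles year to year
low, high = min(protein_pct.values()), max(protein_pct.values())

print(f"mean = {avg:.2f}%, std dev = {spread:.2f}%")
print(f"range = {low}% to {high}%")
```

The average works out to just under 12%, with every sample falling inside a one-percentage-point band - which is exactly the "remarkably stable" picture described above.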
Also consider this quote from the article at the first link above, (from healthday):
"We're just too clean a society, so our immune systems aren't as developed as they should be," she said.
OK, I'm not saying that such a claim may not have some merit, but I've lived on a farm all my life, and I played in the dirt, (a heck of a lot), when I was a kid, and our chickens, and ducks and geese, and dogs and cats played, (and presumably, defecated), in that same dirt. Every time I see that claim, I wonder if dirty, unsanitary kids actually grow up to be as healthy as the "experts" seem to think that they should. All I know is, it didn't work for me.
Another version of the hypothesis holds that the cleanliness of industrialized society has caused a fundamental change in the composition of the digestive bacteria contained within the gut, Fasano said.
Nearly every celiac-related article contains this obligatory comment:
However, people who suspect they have celiac disease should not go gluten-free before being tested. Doing that can interfere with the accuracy of the screening.
"It's very important that you don't change your diet before you are screened for celiac disease," Shilson said.
My response to that is, "Why? Are we supposed to do that so that our doctors can misdiagnose us, convince us to continue eating gluten and suffering from the symptoms of gluten-sensitivity for the rest of our lives, and let us slowly destroy what's left of our health by continuing to eat a toxic diet?" Statistically, we have roughly a 1% chance of being properly diagnosed, meaning that we have a 99% chance of being misdiagnosed. That doesn't sound like good advice to me, but then, I'm not a doctor - I'm just an ignorant old farm boy.
But, there is at least one redeeming value in that article, that's definitely worthy of our consideration, and it's found in this quote:
The disease interferes with proper digestion and, in children, prompts symptoms that include bloating, vomiting, diarrhea or constipation. Adults with celiac disease are less likely to show digestive symptoms but will develop problems such as anemia, fatigue, osteoporosis or arthritis as the disorder robs their bodies of vital nutrients.
The red emphasis is mine, of course, but I feel that this constitutes a very elegant observation, which offers some insight into why the disease is so often misdiagnosed in adults. Doctors have such a bad habit of trying to break issues down into individual problems and treating the symptoms, (rather than taking a holistic approach), that they tend to fail to see the forest for the trees. They treat anemia and fatigue, and they treat osteoporosis and arthritis, but they never bother to figure out what is causing those symptoms - they're just happy as larks to be writing prescriptions to treat all those "neat" symptoms.
Of course, a lot of adults present with the same obvious digestive system issues as celiac kids, and their doctors still misdiagnose them, so it's no wonder that they can't properly diagnose adults who present with less-obvious symptoms.
So the bottom line is, my biggest beef is that celiac doctors try to blame the unacceptably poor diagnostic statistics on all sorts of irrelevant satellite issues that may or may not matter, when the dominant problem is their own incompetence in diagnosing the disease. I suppose that's part of their CYA plan.
Thus endeth my rant for the day.
Tex

Visit the Microscopic Colitis Foundation Website