The Fall of Bloodletting and The Rise of Iron

I’m truly at a point where I love doing this blog more than I ever have. No longer a lone voice that’s hit & miss, I have a group of wonderful collaborators who somehow manage to put up with me—perhaps, because I truly strive to promote the fruits of their collaborative efforts as best as I can, and without the over-the-top marketing hype that’s become so common everywhere.

Here’s the entire bank of my knowledge on bloodletting, up until a couple of months ago: a primitive, superstitious practice that killed George Washington, the first president of these United States (which may not even be true). I always love being wrong because then, minimally, I’m less wrong than before.

I love slaying dragons or, in this context, questioning icons, bromides, and slogans of “truth.”

I once again give you The Duck Dodgers.

In our previous article, Iron, Food Enrichment and The Theory of Everything, we hypothesized the link between the rise in modern chronic disease and the rise in iron intakes during the 20th century, through both food fortification and increased meat consumption.

Careful readers are well aware that chronic diseases of civilization began to rise well before iron fortification entered the food supply. When we investigated this further, we found to our surprise that our not-so-distant ancestors were bloodletting far, far more than we ever imagined.

From the time of antiquity to the late 19th century—for at least 2,000 years—bloodletting was extremely common. In fact, it became so common during the 19th century that its abuse ultimately led to its downfall—along with it being discredited by prominent physicians.

Bloodletting was popular with the major religions (seasonal/ritual bloodlettings, relief from afflictions). Bloodletting was common in Ancient Egypt, China, Greece and Rome. And it expanded to prophylactic bloodletting during ancient times and into pre-modern times.

In medieval times, the major religious institutions practiced a great deal of periodic bloodletting, on scheduled and regulated days. People could also choose to fast or bloodlet, and the pious did a lot of both—bloodletting as much as five times a year if they were healthy. If you chose to bloodlet, you were given a three-day break from services and perhaps some meat as you recovered.

By the 19th century, you were bled for virtually every conceivable condition. If you had a headache, a cough, a fever, were pregnant, or had anxiety—or anything—you were bled until you fainted…and you did it standing up, to speed up the process. If you needed surgery, you were bled before the operation as a form of anesthesia. The practice did become abused (but that raises a question: why?). It was the most common medical practice and the de facto treatment for virtually every ailment. Biannual prophylactic bloodlettings were very common.

The British Homoeopathic Review, Volume 40 (1896)

How common was the spring and fall practice of bloodletting during the first half of this century, and how disastrous were its effects, are illustrated by a statement of Dr. Wilks. He said that he had often asked the late Mr. Monson Hills, who for many years was cupper and surgery attendant, and for all practical purposes house-surgeon, at Guy’s Hospital, as to his experience of the time when persons came to the hospital, especially at the “spring and fall,” to be bled by the dozen or twenty in the morning. After I had supposed that they would walk in and as quietly walk out after the operation, he would answer, “No such thing; they commonly fainted, and they might be seen lying in rows on the surgery floor like so many slaughtered sheep.” Dr. Markham quoted the late Dr. Stokes, of Dublin, as saying that “when I was a student of the Meath Hospital hardly a morning passed when some twenty or thirty unfortunate creatures were not phlebotomised. The floor was running with blood to such an extent that it was difficult to cross the prescribing hall for fear of slipping. Patients were seen wallowing in their own blood.”

Not only was bloodletting constantly practiced by every physician, but your local “barber surgeon” could give you a shave, a haircut and a bloodletting whenever you felt like it. They were in very high demand.

The Professor of Secrets: Mystery, Medicine, and Alchemy in Renaissance Italy, by William Eamon

In theory, barber-surgeons intervened only under a physician’s order, as part of a prescribed cure. In reality, matters were quite different. Seasonal bloodlettings were commonly self-prescribed as part of everyday health management. In addition to performing phlebotomies, barber-surgeons were authorized to set broken bones, treat wounds, and medicate abscesses and skin diseases. They were, in other words, authorized to treat the outer body. (The physiological domain of physicians, by contrast, was the body beneath the skin.) Because their cures were more accessible and less expensive than those of physicians, surgeons routinely treated a much broader range of illnesses than they were officially empowered to.

The traditional barber pole that hangs outside of many barber shops is a symbol that represents a barber’s bloodletting services.

It’s commonly believed that it was the doctors and barbers who preyed on unsuspecting patients and convinced them that they needed unnecessary bloodlettings. In reality, bloodlettings were said to have had a noticeable effect that patients demanded—much like patients demanding antibiotics today.

An essay on the remittent and intermittent diseases, by John Macculloch (1830)

“It is said, and indeed it is matter of daily experience, that in all such cases, immediate relief is procured by blood-letting in either of these forms; and as the same relief is similarly produced in cases of decided inflammation, as in others in which it is an acknowledged remedy, the analogy seems sufficiently perfect to form a justifiable argument. Unfortunately, still more unfortunately, patients themselves become so convinced; so conscious in fact of this relief, that they are always ready to demand it, and, still more, to resort to it without advice, or against that, on their own notions and opinions.”

Although Hippocratic therapies, including bloodletting, survived into the 1920s, by 1875 bloodletting had fallen by the wayside, and many doctors lamented its passing:

Bad Medicine: Doctors Doing Harm Since Hippocrates

In 1879 an American doctor, T. H. Buckler, acknowledged that ‘the lancet, by the common consent of the profession at large, had been sheathed never to be drawn again’. Yet he was writing ‘A Plea for the Lancet’. In 1875 an English doctor, W. Mitchell Clarke, wrote ‘we are most decidedly living in one of the periods when the lancet is carried idly in its silver case; no one bleeds; and yet from the way in which my friends retain their lancets, and keep them from rusting, I cannot help thinking they look forward to a time when they will employ them again’. Bloodletting had largely been abandoned because statistical studies had shown that it did not work, and recent developments in physiology had been able to show that it resulted in reduced haemoglobin concentrations, which hardly seemed likely to be beneficial. But doctors clearly regretted sheathing their lancets. The lancet was a symbol of their profession and of their status as doctors—the leading English medical journal is still called The Lancet.

Worst of all though, the abandonment of the lancet was not compensated for by the introduction of any new therapy that could replace it in general practice. A gap was left, and something was needed to fill the gap. By 1892 the leading American physician of his day, William Osler, was writing ‘During the first five decades of this century we have certainly bled too little.’ And he proceeded to advocate bloodletting for pneumonia: done early it could ‘save life’. Similarly in 1903 Robert Reyburn, an American, was asking ‘Have we not lost something of value to our science in our entire abandonment of the practice of venesection?’ The Lancet of 1911 contained an article entitled ‘Cases illustrating the use of venesection’—the cases included high blood pressure and cerebral haemorrhage. Bloodletting was also recommended for various types of poisoning, from carbon monoxide to mustard gas. In the trenches in 1916, venesection was the approved method of treating the victims of gas attacks. Heinrich Stern, publishing The Theory and Practice of Bloodletting in New York in 1915, declared that ‘like a phoenix, the fabulous bird, bloodletting has outlasted the centuries and has risen, rejuvenated, and with new vigor, from the ashes of fire which threatened its destruction’—he thought bloodletting a useful treatment for drunkenness and homosexuality. Others recommended it for typhoid, influenza, jaundice, arthritis, eczema, and epilepsy.

While many factors are obviously involved, doctors were said to have observed a considerable rise in arteriosclerosis after 1880. Diets had changed too. By that point in time, the nation was also experiencing widespread dyspepsia—a sort of national stomach ache—that some blamed on a lack of fiber, as people replaced their traditional whole wheat with white flour.

By the late 19th century, some physicians were publicly lamenting the complete abandonment of bloodletting for therapeutic purposes.

New York Medical Journal: The Therapeutical Value of Bloodletting (1887)

“But a few years ago it was customary to bleed too frequently, and almost every morbid condition was thought to demand bloodletting. Practically, we never resort to the measure now, perhaps because we do not consider to their full extent the advantages to be derived from it. From one excess we have fallen into the other. The disciples of the lancet bled according to a system; it was a formula. Their adversaries abstained by convention, not always by conviction; that, too, was a formula. There was error on either side. Therapeutical truth does not lie in a mere formula; it is to be found in facts proven clinically and experimentally, not in mere systems.”

What’s interesting is not whether bloodletting actually was therapeutic. What’s interesting is that virtually all of humanity went from a civilization where bloodletting was common and iron-rich meats were a luxury, to a society that rarely bled, where food was enriched with iron and iron-rich meats were eaten with regularity. And such a dramatic change happened over the course of about 100 years. It’s a complete 180° that has rarely been considered in the context of modern chronic diseases.

The Great Depression and WWII may have played an unsuspected role in swinging this pendulum. As the nation continued to consume nutrition-less white flour, while meat was rationed or scarce, key micronutrients became a challenge for many to obtain in the years leading up to WWII. Metabolic issues were seen, which may now be linked to imbalances in the micronutrients—such as manganese and copper—needed to metabolize carbohydrates and manage iron efflux. Instead of promoting whole wheat flour, fortification of white flour was used to crudely solve rampant anemia and pellagra—convincing much of the nation that people can never have too much iron.

Before long, Popeye was promoting iron-rich foods, and iron-fortified Geritol™ became the success of Madison Avenue. And even though it was argued to be unnecessary, the FDA significantly increased fortification levels in 1983, which coincided with skyrocketing rates of many metabolic health issues in countries that fortify their foods. To this day, most Americans erroneously believe that you can never have too much iron.

As more and more research continues to implicate excess iron in a wide range of chronic diseases, hopefully people will begin to notice that skyrocketing iron intakes and the cessation of traditional bloodletting may in fact be related to how we got here.


  1. Dave on August 14, 2015 at 11:50

    Talking about marketing hype…

    I must have subscribed to dozens of blog newsletters over the last couple of years, only to unsubscribe one by one. But in the case of this blog, I never get a marketing email yet find myself checking in daily to see if there’s a new post. So much fucking substance here.

    Anyway, great fucking post. Again.

  2. kxmoore on August 14, 2015 at 20:45

These guys are out to fortify the food of the world and who is going to stop them? Methinks that legislated grain fortification gives some of their members an advantage over small mills/distributors. Someone please tell me I’m being overly cynical.

    • GTR on August 16, 2015 at 10:27

@kxmoore – you can ask Russia to ban imports of fortified flour/products – they are eager to ban anything Western now.

  3. CoolBeans on August 14, 2015 at 10:03

    Keep this stuff coming! Loving the history lessons!

  4. golooraam on August 14, 2015 at 10:30

    that reminds me to floss!
    a little blood letting

  5. marthe on August 14, 2015 at 10:39

    How often do you “let” your blood?

    • cremes on August 14, 2015 at 11:32

@marthe, I usually donate blood at my local donation center. According to their website, they require 56 days to elapse between whole blood donations. That seems relatively safe.

    • John on August 16, 2015 at 15:28

      When I was actively trying to lower a very high ferritin number, I was doing it once a month. Now that it’s near deficiency, it’s probably just over two months or so. I probably donate 5 or 6 times a year.

  6. Rob on August 14, 2015 at 10:54

    I gave a “double red cells” donation the other day. They filter out your red blood cells and pump the rest back in (plus some saline and anticoagulant). Apparently twice the iron loss in one go.

    I had my iron tested first:

    Iron Level: 154 (45-182)
    % Iron Saturation: 54.0 (20-55)
    Total Iron Binding Capacity: 285 (230-409)
    Transferrin: 204 (180-329)

    No serum ferritin unfortunately. The above test was after a whole blood donation about 8-12 weeks prior.
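(For anyone wondering where the saturation figure comes from: it isn’t measured directly, it’s just serum iron divided by TIBC. A quick sketch of the arithmetic, using the numbers from the panel above:)

```python
# Transferrin saturation is a derived value: serum iron / TIBC.
# The input values are from the lab panel above; the formula is standard.
serum_iron = 154  # mcg/dL
tibc = 285        # mcg/dL (total iron binding capacity)

saturation_pct = serum_iron / tibc * 100
print(f"Transferrin saturation: {saturation_pct:.1f}%")  # 54.0%, matching the reported figure
```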

  7. John on August 14, 2015 at 14:22

    Ha! Just happened to donate blood this morning, and that’s the day this blog post goes up. Nice.

    By the way, I’ve noticed that Life Extension’s Two Per Day formula is excellent at boosting hemoglobin levels, even better than a simple B complex (and yes, it is free of iron). When I donated in early June, hbg was 12.7, while ferritin was at 28. I went back to using Two Per Day, and hbg was 15.5 when they measured today, which is the same number it was when I first donated with ferritin at 440.

  8. John on August 15, 2015 at 12:31

    When I read “An Epidemic of Absence,” I thought the hygiene or old friends hypothesis had a lot of merit, but kept on wondering about the role of iron, since I knew that hookworm would constantly lower iron levels by feeding on the host’s blood. Then, the previous article really got me thinking that iron fortification and iron supplements were the true disasters to the gut biome (much more than say, meat or spinach). Now, looking at bloodletting as well, I realize that iron levels in humans underwent a massive shift in less than a century, and are continuing to spiral out of control.

    I wonder if prebiotics, probiotics, and even antibiotics only work on the fringe when iron is out of whack (maybe why they seem to work powerfully in some, and not at all in others).

    I currently believe that high iron levels are the true cause of antibiotic resistance, and that antibiotic overuse is a secondary factor, and maybe not really a factor at all.

    I also find it interesting that antibiotics paired with iron chelation drugs are far more effective, and that the tetracycline class of antibiotics are strong iron chelators themselves-

    I wonder if the potato hack, raw milk diet, and William Brown’s skim milk and sugar diet were all working by lowering iron levels (or if you look at both iron and PUFA, lowering oxidative stress).

    Sorry if this comment may have seemed a bit long and rambling. Anyway, Duck and Richard, keep this series up. I think it’s going to some really interesting places.

    • Natasha on August 16, 2015 at 22:54

      Very interesting connection!

      I too wonder what connection parasites have.

      One of my other interests is dogs. Want a dog, never had one before. So I research breeds, breeding, training, health, etc. I visit shelters and watch websites where breeders and back-yard breeders post for sale/adoption. No dog yet but I will likely adopt….

But one of the things that worries me is how frequently puppies are de-wormed. I see “registered” breeders and backyard breeders advertising that their puppies have been de-wormed 3 or 4 times before they are even 8 weeks old! One of the keys to human health, according to Dr. Art Ayers, includes a dog, which adds to the family’s microbial diversity. But if the dog isn’t healthy either…

      I see dogs that have all kinds of autoimmune diseases, skin diseases, allergies, food allergies, anxiety ridden. Dogs that need insulin. Dogs on antidepressants. Dogs with cancers and tumors.

      Did happy dogs have worms and eat meat?

    • Bret on August 17, 2015 at 06:19

      “Did happy dogs have worms and eat meat?”

My layman’s answer is an unequivocal yes, Natasha. Genetically and historically, dogs are pack animals that roamed the region and scavenged for food, in addition to an occasional fresh kill. You can bet top dollar that 99.9999% of healthy and “happy” (meaning instinctually settled/unconflicted) canine specimens throughout the species’ history had a robust immune system that included experience with all sorts of parasites, including worms.

      I think you’ve brought up a brilliant example of the collision between nature and modern western human culture’s hygienic obsession with sterility. Then again, a dog that is going “all naturale” might ought to be primarily an outdoor dog (also in line with natural history). :-)

      On a separate note, have you ever heard of Cesar Millan (the Dog Whisperer)? That guy is a freaking dog behavior genius. I’d highly recommend consuming his first book, Cesar’s Way, before getting a dog.

    • Natasha on August 17, 2015 at 22:37

      I did a little more research, which I should have done before posting. The recommended de-worming regime is every two weeks! So, reputable people de-worm pups. But, it still seems off to me.

Thanks for letting me comment… I try to limit the amount of autoimmune, microbiome, gut-bug theorizing I unload onto my very patient husband.

      But all of this is so interesting!

    • gabkad on August 22, 2015 at 14:05

      Natasha, dogs didn’t used to live in the house with their owners. Not usually. They lived outside so they didn’t sleep on the sofa or the bed, lick their owner’s children in the face or stick their butts in human’s faces. I think this is a big difference.

      These days heartworm is a big deal.

      Today people keep dogs as companion animals and not as guard or working dogs. They don’t want their dogs scooching wormy butts on their Persian carpets. Among other things.

  9. SteveRN on August 15, 2015 at 03:58

    Human nature never changes, I guess! People overused blood letting, demanded it. How many other things have we done that with? More is better is not just American or modern, it seems. Sounds like Vit. D, fish oil, running, etc, etc, etc. Dose makes the poison, as you are fond of saying.

  10. edster on August 15, 2015 at 04:09

    Does anyone in Australia know how a Pommy Bastard** can have some blood-letting performed (without having to insult a Collingwood supporter)?

    ** In case you’re curious, anyone who has lived in the UK between 1980 and 1996 isn’t allowed to donate blood in Oz.

    • John on August 15, 2015 at 11:15

That’s a common deferral; it’s the same in the US. You could attempt to get a prescription for a therapeutic phlebotomy; that would work. And while this may be less practical, you could look into donating blood in other countries. There are also alternative iron-lowering strategies (IP6, lactoferrin, low-iron diets, etc.). Aside from that, do you have any friends who are nurses or possess venipuncture skills?

    • James on August 15, 2015 at 17:34

I’m not sure about the laws there, but you could try to find a practitioner of Traditional Chinese Medicine to do cupping and bleeding. They will commonly (if legally permitted and trained to do so) use a small mallet with 5–7 needles that extend approximately a centimeter out of its face to tap areas of injury or stagnation, then place small glass globes on with suction to draw the blood out. This moves old, stagnant blood from the area and allows fresh blood to come in. It works wonders for frozen shoulders and other injuries. It doesn’t hurt much and you will feel better afterward. If nothing else, it’s something to experience.

    • John on August 22, 2015 at 14:02

Also, if you’re brave enough, you could do it yourself. Here’s a video on youtube of a guy giving himself a therapeutic phlebotomy-

      I wonder if cutters are basically doing (or are attempting to do) the same thing.

    • Kate on September 15, 2015 at 11:32

I would love to hear anyone’s experiences of giving themselves a phlebotomy. I’ve looked at the youtube you posted here but can’t really figure out what he is doing w/o any narration. Loved this whole iron series, and have re-read each several times. Like Edster above, I’m subject to the same blood donation deferral, which I doubt will ever be lifted. Back in 2011, after reading Jaminet’s book, I had my ferritin tested (111, newly post-menopausal). After reading the first in this series, had it tested again (215). Re diet, I haven’t eaten any iron-fortified foods in years. Past five years I’ve basically eaten a PHD-style diet, eggs every day and about .3 lbs of meat or fish a day, usually at dinner. I eat very little dairy, so I’ve started adding some cheese to beef and lamb meals. Also experimenting with IP6, curcumin, and green tea extract. I’ll ask the doc for a scrip for a phlebotomy next time I’m in, but I’d rate the chances of getting one slim to none. Ha, ha, wish I could just go around to the nearest barber and get cut.

    • Duck Dodgers on September 16, 2015 at 11:37

      I shudder to think of the idea of self phlebotomy. I envision one passing out as the blood continues to drain out. :/

      This is in no way medical advice, but rising ferritin may not always be bad news. Imagine one is sick and inflamed due to an infection or some kind of chronic illness, the body would withhold iron in tissues to avoid keeping iron out of the blood.

      Now, say you fix the underlying health issues and through various therapies, your body begins removing this iron from its tissues. Your iron levels and ferritin could go up when your health is actually improving. Phlebotomy might be a way to remove that excess iron that appeared out of nowhere. If you lost considerable amounts of weight while your ferritin increased, that would be very interesting, considering that adipose tissue appears to sequester iron stores when people are inflamed. Lose the weight (and reduce inflammation) and the iron stored in that adipose tissue would likely migrate safely into the bloodstream. Phlebotomy would be one way to dispose of that higher serum iron.

      Another example of how rising iron stores may be a sign of progress… some studies show that Earthing (physically touching the Earth) is said to reduce inflammation in joints and muscles and has been shown to improve heart rhythms (HRV)—nobody really knows how this happens. People claim that skin looks smoother and more elastic after Earthing. Meditation is also said to improve heart rhythms (demonstrated by increased HRV).

      One study shows that ferritin temporarily increases from Earthing while sleeping. (Although, some studies suggest that our ferritin levels are rising and falling throughout the day anyhow). At first this seems confounding. How can increasing ferritin seem to provide benefits? Well, it’s entirely possible that removing free iron from tissues and migrating it to the blood can be beneficial. It may be a step in the right direction if we are seeing other benefits at the time.

Another study shows that in thalassemia patients, iron depositions in the heart are related to disrupted heart rhythms. This might explain why Earthing improves heart rhythms. Wild speculation: perhaps even meditation may help us remove free iron from our tissues.

      Ferritin is used as an estimate of total body iron stores. When it’s high, we assume that iron in tissues is elevated. So, we might just consider it a helpful signal that it’s time for us to consider phlebotomy. But, it may also mean that our organs and tissues are being cleansed and detoxed of excess iron—and now we have the option to dispose of it if we please.

      * None of this is intended to be medical advice—it is pure speculation.

    • Duck Dodgers on September 16, 2015 at 13:21

      Correction. I wrote, “the body would withhold iron in tissues to avoid keeping iron out of the blood”

      That should have read… “the body would withhold iron in tissues to keep excess iron out of the blood”.

    • Kate on September 16, 2015 at 14:42

Yes, I’m mindful of the speculative nature of these iron issues. Our bodies are complex organisms, and I doubt modern lab tests are all that useful in many cases. I’ve also read that ferritin measurements can jump around quite a bit. Still, I’m quite taken with the whole iron hypothesis. After your last piece on bloodletting, I was thinking how smug we have been to condemn this practice. It stands to reason that it would not have persisted for so many eons in independent cultures if it didn’t produce positive results. Of course, there may be other reasons for benefits besides taking some iron out of circulation. Anyway, it annoys me that I can’t do a safe experiment–donating blood on a regular basis and monitoring my iron levels–just because I was stationed in Germany during the 80s. No signs of mad cow disease yet :) I notice my iron measurements were pretty consistent between the 2011 panel and the recent one. Low TIBC, normal serum iron on the high end of the range, and now high ferritin. I’ve always been lean and don’t have any obvious metabolic issues, but I’ve had chronic headaches and migraines for the last two decades. Much better now, perhaps thanks to PHD, resistant starch and fiber, but still an issue. So pure speculation on my part. If iron accumulates in the brain as we age, is there a connection? Those old-time bloodletters did treat headache with bloodletting.

    • Duck Dodgers on September 17, 2015 at 14:16

      Yes. I think having someone else on hand (with a brain) seems to be the key step. That’s probably part of the reason why people traditionally went to their barbers for bloodlettings, rather than just slicing their arms open at home. :)

      Also, people were expected to faint, as that was believed to be a key part of the therapy. So, you needed someone else on hand.

Btw, I’m surprised no one has yet mentioned the 1978 Steve Martin SNL sketch, Theodoric, Barber of York. You can tell the writers actually did a fair amount of research on the subject.

    • Duck Dodgers on September 17, 2015 at 06:58

      Well said, Kate. That was a good article, btw. One quote stands out…

      “In addition to its direct chelation of iron, curcumin induces increased genetic expression of the body’s natural iron-binding and transport protein, ferritin, further sequestering iron away from vulnerable tissues. These multiple capabilities lead directly to reduction in iron levels in iron-overloaded organs.”

      Covers what I was trying to say. It seems as though sometimes rising ferritin can be a good thing (a sign of progress), at least until there’s an opportunity to dispose of that excess ferritin.

    • Duck Dodgers on September 17, 2015 at 07:05

      Btw, the last time I went in to donate blood, I was amazed by how many donation restrictions there were on US veterans.

Up until recently, civilians were restricted if they had been to the Cancun area within the past year—for malaria concerns—but they had to remove the restriction because it happens to be one of the most popular destinations in the Americas and the risk of malaria was low.

    • Richard Nikoley on September 17, 2015 at 11:31

      “I shudder to think of the idea of self phlebotomy.”

Well, people are going to do it anyway, just like DIY fecal transplants. There are a fraction of people now who know more than their doctors about themselves. Thanks to Google, self-experimentation and, above all, thinking for one’s self.

To my mind, here are some common-sense things I would definitely do were I to do DIY blood loss.

1) Have a fixed-volume bag of 1 pint.

      2) Two-man control, just like in nuclear weapons. Have someone with a brain on hand.

      3) Follow the sensible blood letting (donation) protocol of a pint every two months at most. You don’t build up the iron overnight. No need to fret about getting rid of it overnight.

      With any luck, we’ll soon have a number of good YouTube DIY bloodletting videos to watch.
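(A back-of-envelope sketch of why the pint-every-two-months pace is unhurried. The pint size, hemoglobin level, and iron content of hemoglobin below are typical textbook values, not figures from this thread:)

```python
# Rough estimate of iron removed by one pint of whole blood.
# Assumed typical values: 473 mL per US pint, hemoglobin ~15 g/dL,
# and ~3.4 mg of iron per gram of hemoglobin.
pint_ml = 473
hgb_g_per_dl = 15.0
iron_mg_per_g_hgb = 3.4

hgb_g = pint_ml / 100 * hgb_g_per_dl   # ~71 g of hemoglobin in a pint
iron_mg = hgb_g * iron_mg_per_g_hgb    # ~240 mg of iron
print(f"~{iron_mg:.0f} mg iron removed per pint")
```

Under those assumptions, six pints a year works out to roughly 1.4 g of iron removed annually, which is why a bimonthly donation cadence is plenty for gradually drawing stores down.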

    • Woodwose on September 18, 2015 at 11:28

I think an easy way to do bloodletting is so-called wet cupping. This is very popular in China and Arab countries (hijama). I tried it myself with a hijama set and it was quite painless.

    • Kate on September 18, 2015 at 12:35

      Woodwose, I’m curious how much blood you were able to extract.

    • woodwose on September 18, 2015 at 22:52

Didn’t measure the first or second time; it was just to get a clue how to do it. But I don’t think 4.5 deciliters would be that hard to extract in one go. Just let the cup fill up and then reapply it after it’s empty.

Using a needle was too slow for me, so I used a surgical blade. I held the blade with a plier to get a reasonably short stabbing depth. My blood congeals very fast, so I had to apply water and wash away congealed blood before reapplying the cups. If you use a needle instead of a blade, the scars will go away very fast, but the extraction will be slower.

  11. Natasha on August 15, 2015 at 09:35

    Richard! Excellent thinking and research.

Nice to see a quote from the homeopathic journal. A number of years ago, I worked for researchers in Belgium and Sweden, reading and coding homeopathic journals for a searchable database. Lots of credible history and knowledge in homeopathy.

    So interesting this change of habits and medical expertise! I think you will find this interesting… New article out this week.

    People rarely question “accepted” practices. But good science requires it!

    Enjoyable posting. Thanks!

  12. VW on August 15, 2015 at 10:53

    I like that you fuckers are thinking outside the box.

  13. gabkad on August 15, 2015 at 10:58

    ‘From one excess we have fallen into the other.’

    That’s the medical profession for you. No nuance whatsoever. It’s like a cult.

    • John on August 15, 2015 at 11:58

      I think this is true of people in general, not just the medical profession.

  14. Steven on August 15, 2015 at 13:14

As I’ve mentioned before, my mom told me stories from her childhood: when she was sick, they let blood, and she got better almost instantly.

    She will gladly talk about her experience and it’s very interesting.

  15. GTR on August 16, 2015 at 10:35

One of the iron absorption enhancers—the practice of adding ascorbic acid to recipes, e.g. sprinkling meat or fish with lemon juice—has the positive effect of blocking carcinogenic nitrosamine formation.

  16. ZM on August 16, 2015 at 06:01

    This looks like an interesting article about iron overload, and it points to a book called “The Iron Elephant”.

  17. Vizeet Srivastava on August 17, 2015 at 05:51

    I read about “helminthic therapy,” in which worm larvae are used as immunotherapy. The following paper also seems to suggest that iron availability in blood controls hookworms.

    It appears to me that in prehistoric times, worms controlled iron levels in humans in spite of a red-meat-rich diet. As a consequence of the lack of worms in our bodies, we have had to rely on bloodletting to manage iron levels.

  18. jonw on August 16, 2015 at 18:25

    I can't be the only one wondering about practical details. What's a reasonably safe procedure to let blood for yourself or a friend? How much, how often, etc.? Or have all you bold self-experimenters killed yourselves already?

  19. Anand Srivastava on August 17, 2015 at 03:45

    It seems Ayurveda also has a bloodletting therapy, called Raktamoksha.

  20. RMcSack on August 17, 2015 at 09:32

    Any idea if regular heavy lifting is a good alternative to bleeding/blood donation? I thought I read that weightlifters have higher iron needs.

    Also, is it possible to have higher iron levels but still show up as iron deficient (low ferritin)? I'm not familiar yet with the specifics of how they detect it. Are the conventional tests the best way to know, or is it like cholesterol, where they have to calculate counts that can be inaccurate?

  21. robm on August 17, 2015 at 10:37

    Black tea reduces iron absorption. One of the first questions a (good) doctor will ask those with low iron levels is: how much tea do you drink, and what kind? It's also easier than leeches or long-fanged humans.

    • Duck Dodgers on August 17, 2015 at 12:14

      …And green tea chelates iron, even from the brain. While it hasn't been studied as closely, white tea may be as powerful or even more so, as it is a younger green tea without the high levels of caffeine.

    • John on August 17, 2015 at 13:33

      Interesting. I never much cared for green tea, but will look into white tea a bit more. I think I had it once and enjoyed it.

    • Duck Dodgers on August 18, 2015 at 21:03

      We briefly mentioned this in our Iron Theory of Everything post, but curcumin chelates iron too. I'm sure it's related to why it's known for reducing inflammation.

      Curcumin, a cancer chemopreventive and chemotherapeutic agent, is a biologically active iron chelator (2008)

      Curcumin is a natural product currently in human clinical trials for a variety of neoplastic, preneoplastic, and inflammatory conditions. We previously observed that, in cultured cells, curcumin exhibits properties of an iron chelator. To test whether the chelator activity of curcumin is sufficient to induce iron deficiency in vivo, mice were placed on diets containing graded concentrations of both iron and curcumin for 26 weeks. Mice receiving the lowest level of dietary iron exhibited borderline iron deficiency, with reductions in spleen and liver iron, but little effect on hemoglobin, hematocrit, transferrin saturation, or plasma iron. Against this backdrop of subclinical iron deficiency, curcumin exerted profound effects on systemic iron, inducing a dose-dependent decline in hematocrit, hemoglobin, serum iron, and transferrin saturation, the appearance of microcytic anisocytotic red blood cells, and decreases in spleen and liver iron content. Curcumin repressed synthesis of hepcidin, a peptide that plays a central role in regulation of systemic iron balance. These results demonstrate that curcumin has the potential to affect systemic iron metabolism, particularly in a setting of subclinical iron deficiency. This may affect the use of curcumin in patients with marginal iron stores or those exhibiting the anemia of cancer and chronic disease.

    • Timothy on August 19, 2015 at 03:47

      Duck Dodgers,
      First, thank you for your research and provocative hypothesis. I have been churning the iron-overload idea through my head (and readings) all summer. One possible weakness I see pertains to the increasing health and obesity problems of people under 40. (My apologies if you addressed this somewhere and I missed it.) Among the under-40 group, shouldn't females be less susceptible to iron overload, and thus less susceptible to obesity and Western diseases, than males of the same age, due to the “advantage” of menses?

    • Duck Dodgers on August 19, 2015 at 20:29


      I’m sure the hypothesis is not perfect. Two things…

      1) People under 40 are more likely to have been born, or to have gone through crucial stages of development, after the FDA's significant 1983 increase in fortification levels, which we discussed in our original article. In other words, anyone born after 1983 developed with iron intakes never before experienced by children. And that doesn't even consider the extreme amounts of iron in baby formula. So I suspect the poorer health of those under 40 is easily explained by the 1983 increase in iron levels.

      2) As far as sex differences go, I'm not sure. Menstruating women get pregnant, and pregnancy is often associated with stubborn weight. Sometimes pregnancy can disrupt hepcidin expression and therefore iron homeostasis. So a history of pregnancy could be a confounding factor. I don't really know.

      And we don't know for sure whether adiposity is actually a form of iron overload. What we do know is that adiposity involves a kind of iron dysregulation, and that it appears to happen more often in iron-fortified developed populations, worsening after increases in fortification levels.

      We are also finding that combinations of obesity-implicated foods do in fact seem to result in increased absorption of that iron. For instance…

      Transport of Fe2+ across lipid bilayers: possible role of free fatty acids (1987)

      “Fatty acids can form lipid-soluble complexes with Fe2+. Incorporation of fatty acids into phosphatidylcholine/cholesterol liposomes renders them permeable to Fe2+. Of several fatty acids tested, the most effective Fe2+ carriers were linoleic and oleic acids followed, in decreasing order of efficacy, by linolenic, myristic, arachidonic and palmitic acids. ….. It is suggested that free fatty acids may act as mediators of Fe2+ transport across biological membranes, particularly isolated intestinal brush-border membrane.”

      Refined oils might just make you absorb lots of fortification iron. This may explain why developed, fortified nations are the hardest hit, as people in developing nations would be less likely to consume refined oils.

      Now the interesting part…

      Linoleic acid in soy strongly linked to the obesity epidemic

      Have a look at the graph. Linoleic acid skyrocketed ten years before the obesity epidemic, and iron explains the mechanism of the association: LA intake skyrockets, and then iron fortification is increased in 1983, which promotes obesity.

      This may be the missing piece of the puzzle. I think this mechanism makes far more sense than just blaming the obesity epidemic on laziness and enormous appetites, as if people became hungrier and lazier after 1980 for no apparent reason. No, there seems to be something wrong with the food supply in fortified countries, and I think we may have figured it out. The research continues…

    • Duck Dodgers on August 19, 2015 at 20:58

      Also, the idea that women in general are less susceptible to chronic diseases isn't my theory. Jerome Sullivan came up with the original “iron hypothesis” about 35 years ago when he pointed this link out. Though his original hypothesis was mainly about linking iron to CHD, and he wasn't specific as to what that link actually was.

      Sullivan's iron hypothesis has never been proven, and the results of studies have been conflicting, perhaps because serum iron is often used to prove or disprove his hypothesis, and serum iron does not always reflect total body iron stores.

      But sometimes, perhaps when chronic disease is not apparent, serum iron can correlate pretty well with some diseases…

      Body Iron Stores in Relation to Risk of Type 2 Diabetes in Apparently Healthy Women (2004)

      “Higher iron stores (reflected by an elevated ferritin concentration and a lower ratio of transferrin receptors to ferritin) are associated with an increased risk of type 2 diabetes in healthy women independent of known diabetes risk factors”

      It's a complicated subject. But I've also seen studies where iron deposition and/or iron dysregulation in key organs (heart, liver, etc.) is associated with inflammation in those organs. Fascinating subject for sure.

  23. Jane Karlsson on August 18, 2015 at 03:15

    This post is a work of very serious scholarship. No scientist I know could do better.

  24. tw on August 18, 2015 at 08:07


    I managed to get some tiger nuts locally (finally) and noticed that they have a pretty good iron content.

    Taking this rather interesting and thought-provoking series into account, have you altered your consumption of these tasty morsels?

  25. Colombo on August 24, 2015 at 14:22

    Oh my God, my head fuckin hurts!

    It is mind-bending to watch how much people can twist things.

    I guess if I made a living by selling iron supplements or fortifying food with stupid iron filings, then I should recommend that my clients avoid all vitamin C and all B vitamins, as they may increase iron absorption… That would be morally right, I think.

    So no acidic fruit while eating my stupid fortified cereals.

  26. SL on August 31, 2015 at 08:57

    I have always had normal iron and lowish ferritin. When I last had my ferritin checked, it was 42, up from 14 through supplementation. Would bloodletting still be a good idea for someone who struggles with low ferritin? I could only find one site that speculated that normal iron with low ferritin is due to heavy metals competing with iron. Could bloodletting be advantageous for moving the heavy metals out and iron into body storage, raising my ferritin levels?

    • Richard Nikoley on August 31, 2015 at 15:21

      Our purpose is to highlight potential problems with wide-sweeping state mandates to fortify staple foods. I suppose it could be taken as advice to avoid them.

      However, we really can't engage in medical advice for individuals. I wouldn't do it even if I cared about laws, which I don't, not one.

      You would need to consult with someone who can assess your individual case.

  27. FrenchFry on October 27, 2015 at 03:56

    An amazing side effect of the current economic crisis/recession hitting the poor in the US:

    To pay, say, a gas bill, some people sell their blood! If this trend grows, it could be interesting to observe whether a correlation with some health markers develops.

  28. Robert Ross on November 17, 2015 at 10:03

    It's often said that men and post-menopausal women don't have a mechanism to dispose of excess iron and that bloodletting is one way to achieve this. Have you considered the role parasites might play here? I recently read about hookworm and whipworm, which have been extinguished in some countries, and how they cause blood loss through the gut even at non-pathological levels that aren't considered clinically dangerous.

    It's always seemed a bit strange to me that we don't have a way to get rid of iron unless you're a premenopausal woman. Are the old-friend parasites we've gotten rid of perhaps a missing piece of the puzzle?

    Think of internal blood loss as an ecosystem service provided by these little guys, not to mention the more appreciated immune-modulating services they provide.

    • Duck Dodgers on November 17, 2015 at 20:41

      FWIW, we looked into blood loss from parasites and found that the science is unclear. I'll try to summarize here what the group found. Hookworm infestations are ubiquitous in third-world countries, and old friends can offer benefits. Conventional wisdom states that these old friends promote anemia. There are a few problems, however…

      The association makes sense: helminths eat blood, blood contains iron, iron in the blood is reduced, therefore helminths promote anemia. However, the helminth load may be determined by the iron available, or the body may withhold iron in tissues to lower serum iron in response to helminths. The result may be the same as the chronic anemia triggered by bacterial infection.

      In other words, it could just be that helminths cause anemia of chronic disease. Apparently it takes heavy infestation to cause significant blood loss, and heavy infestation means damage to the gut and poor nutrient absorption, which means the anemia may be just as likely due to micronutrient deficiencies as to blood loss.

      The US government was sufficiently convinced of a helminth-anemia link that it carried out a huge program to eradicate helminths.

      Dietary Iron Content Mediates Hookworm Pathogenesis In Vivo (2006)

      “… it has also been suggested that hookworm infection modulates iron metabolism in the host, resulting in enhanced reabsorption from the gut as a means of compensating for hookworm-associated blood loss (10, 11). Such a compensatory mechanism was originally put forth to explain the fact that death from overwhelming hookworm anemia is rare, despite calculations that show that iron losses from hookworm infection are likely to far exceed dietary intake in many communities of high endemicity (11, 24).

      The data presented here offer another potential explanation for this apparent contradiction, namely, that severe iron deficiency may directly influence the ability of hookworms to establish and/or maintain infection. Our findings are supportive of the recent observations that Caenorhabditis elegans and related helminths, including hookworms, lack the ability to synthesize heme and thus depend on exogenously acquired heme as an iron source (26). It is plausible that animals fed a low-iron diet reveal diminished hookworm pathogenicity due to inadequate heme availability for hookworm growth and development.”

      So helminths metabolize the human-sourced iron and also encourage re-absorption of iron in the human gut, so that enough remains to feed them. Makes sense. It appears to be an efficient, low-waste process.

      Helminthic anemia impacts women most, especially pregnant women, but again, this could be anemia of chronic disease. Even light infection was found to reduce serum iron in pregnant women. Anecdotally, helminths appear to be more of a net benefit, on average, than a problem for males. It is indeed complex.

  29. Sabrinah jones on March 1, 2016 at 00:38

    Hi people of the internet, very refreshing and needed post. Thanks for the info.

    It seems that there is a great difference between bloodletting (taking healthy blood out of the body) and wet cupping (which originated in Chinese medicine and was then promoted by various cultures). I actually started wet cupping regularly due to major hormone issues and toxin buildup that were really affecting my health. After 5 sessions my body is healthier and my symptoms have reduced drastically.

    Wet cupping, or hijama (as some refer to it), actually involves dry cupping, then taking the cups off and making incisions in those areas, then putting the cups back on and suctioning again. The second suction draws only the toxic, surface blood, which is full of toxins. This has helped with reducing cholesterol, diabetes and many other illnesses.

    Please research it and try it yourself. You will not be disappointed – wet cupping/hijama x

    Look forward to an article on this in future maybe? (“,)

  30. Catherine on May 17, 2019 at 07:32

    Yes, they are poisoning us with iron fortification, making us think it is a “supplement” when in reality it causes a whole list of degenerative diseases for which pharma makes millions. Free iron, like that in supplements, causes oxidative stress (the Fenton reaction), and oxidative stress causes degenerative disease. Not only this, but doctors aren't trained properly and don't understand the difference between bound and unbound iron (which ends up in the tissues). And the ferritin test is an outdated test that is really more of a marker for inflammation than for iron status. Taking a bottle of iron supplements nearly made my thyroid go hypothyroid, caused leaky gut (so I would get tendonitis when I ate chocolate), turned on a sulfur gene I didn't know I had (CBS C699T) and made me extremely sensitive to sulfur (oxidative stress turns on these bad genes and cancer genes). And iron feeds parasites (candida, anyone?).

    • Jane Karlsson on May 23, 2019 at 05:40

      Yes they are poisoning us with iron fortification. I went to a seminar on Tuesday by an eminent vascular rheumatologist called Justin Mason, whose group found a few years ago that a single iron pill can damage the blood vessels.

      Low Dose Iron Treatments Induce a DNA Damage Response in Human Endothelial Cells within Minutes (2016)

      His current work shows that protection of blood vessels depends largely on induction of the manganese enzyme MnSOD. I asked at the end about iron overload, and whether the patients had it, because it had recently been shown that excess iron stops manganese from getting into mitochondria, so MnSOD has no Mn and doesn’t work. I wondered whether his patients were getting enough manganese, since the western diet is very low in Mn and very high in iron. His response astonished me because it was so positive. Almost as if he’d been waiting his whole life for someone to ask that question.

      • Richard Nikoley on May 23, 2019 at 05:57

        Thanks Jane.

        Haven’t “seen” you in a while. Hope you’re doing well.

  31. Benjamin David Steele on June 4, 2020 at 06:26

    What stands out to me is that all of the societies that did bloodletting were major agricultural civilizations. Bloodletting apparently is far less common among hunter-gatherers. At least, I don’t know of any hunter-gatherer examples, offhand. Can anyone think of some exceptions, specifically natives who have remained mostly isolated?

    That is telling. What is it about the agricultural diet and/or lifestyle that causes iron overload? Could it be the grains or other plant foods? Is it how animals are raised? It’s probably not the latter, as most animals were raised on grasslands/pasture until rather recently. Even now, most cattle in the world aren’t factory farmed. So, what is going on?

    This is, in many ways, the most interesting post on this site. It points to some greater significance of the agricultural diet, far beyond the industrial iron-fortification of processed foods. Also, it corroborates the views of Dr. Paul Saladino and Dr. Shawn Baker, in that they don’t observe iron overload and related health issues among their meat-heavy and low-carb patients, including those on the carnivore diet.

    This calls for a revision of the iron hypothesis proposed by Nikoley and associates. It very well might be true that iron is a central cause of the chronic diseases that arose with civilization. But it might not be a problem of access to iron. People can consume large amounts of animal foods high in iron and yet not have iron overload. It might be something else that is added when an agricultural diet is introduced.
