The Celtic Curse: Haemochromatosis & Agriculture
Irish Population Genetics & The Biological Impact Of The Famine
Haemochromatosis, or ‘iron overload’, is one of a cluster of iron-processing disorders with roots in the Agricultural Revolution. Prior to the over-reliance on cereal grains, humans ate a more meat-based diet, which provided enough heme iron to keep the body healthy. Something changed with the Neolithic adoption of wheat and other plants, and one result was the spread of iron-metabolism problems through the general population of Europe and the rest of the world. For the Irish, haemochromatosis is known as the ‘Celtic Curse’, disproportionately affecting populations of Celtic descent. This is the story of how and why this metabolic problem was selected for, but also the story of how agriculture and its associated problems led to persistent and still-destructive issues with processing dietary iron today.
What Is Haemochromatosis?
Haemochromatosis is the condition in which the body absorbs and stores excess iron; the results can be fatal and include heart attack, liver disease, arthritis, loss of libido, diabetes and skin problems. Excess iron can come about in a number of ways, such as blood transfusions, iron supplementation or a hereditary genetic disorder known as ‘Hereditary Haemochromatosis Type-1’. For the purposes of this article, we will focus on this hereditary form of the condition. Humans lack any ability to naturally excrete excess iron, except through menstruation or blood loss, which makes the persistence of this disorder intriguing from a population genetics standpoint - especially given the very high frequencies in Celtic, English and Scandinavian peoples, Celtic in particular.
The most common genetic mutation for haemochromatosis is the C282Y variant of the ‘human homeostatic iron regulator protein’, or HFE protein, encoded by the HFE gene - located on the short arm of chromosome 6 at position 6p22.2. The Irish population carries the C282Y allele at a frequency of around 10%–11%, the highest known in the modern global population. Although there are over 100 known mutations that lead to iron overload, the C282Y variant is by far the most common. The allele is recessive: the conditions mentioned above arise when an individual inherits two copies. Iron toxicity has also been implicated in neurological damage, Alzheimer’s disease and other brain-related disorders.
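What a 10–11% allele frequency means for actual disease prevalence can be sketched with simple Hardy–Weinberg arithmetic, assuming random mating. The figures below are illustrative, derived only from the frequencies cited above:

```python
# Hardy-Weinberg sketch: expected genotype frequencies for a recessive allele
# like C282Y, given the ~10-11% Irish allele frequency cited in the text.
def genotype_frequencies(q):
    """q = frequency of the recessive allele.
    Returns (non-carrier, carrier, homozygote) proportions."""
    p = 1.0 - q
    return p * p, 2.0 * p * q, q * q

for q in (0.10, 0.11):
    _, carriers, affected = genotype_frequencies(q)
    print(f"allele freq {q:.2f}: carriers ~{carriers:.1%}, "
          f"homozygous (at risk) ~{affected:.2%}")
```

At a 10% allele frequency, roughly 18% of the population would be carriers and about 1% would carry two copies and be at risk of iron overload, which is why a seemingly modest allele frequency translates into a substantial public-health burden.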
The origins of C282Y are murky, but a 2004 paper suggests the following:
“The mutation responsible for most cases of genetic haemochromatosis in Europe (HFE C282Y) appears to have originated as a unique event on a chromosome carrying HLA-A3 and -B7. It is often described as a “Celtic mutation”—originating in a Celtic population in central Europe and spreading west and north by population movement. It has also been suggested that Viking migrations were largely responsible for the distribution of this mutation. Two initial estimates of the age of the mutation are compatible with either of these suggestions… We conclude that the HFE C282Y mutation occurred in mainland Europe before 4,000 BC.”
While this paper is useful, the field of prehistoric European population genetics has raced along with little pause to consolidate the data. Certainly more studies looking at how specific mutations have arisen and moved with populations are sorely needed.
The Biology Of Iron Metabolism
To really understand how this mutation works and why it might have been selected for and preserved in certain populations, we need to turn to the topic of how iron is processed and metabolised by the body.
Iron is a crucial element in human health, primarily due to its ability to accept or donate electrons. This makes it important in redox (reduction and oxidation) reactions, where electrons are transferred - in particular in red blood cells, as heme in haemoglobin, and in cellular respiration. Crucially, iron acts as an electron donor in its ferrous state (Fe2+) and as an electron acceptor in its ferric state (Fe3+). The difference between these two forms has profound consequences.
In general the metabolism of iron is a strictly controlled and well-regulated system. An adult male stores around 4,000mg of iron, held in haemoglobin, iron-storage proteins and macrophage immune cells. Roughly 1-2mg of iron a day is lost through cellular sloughing in the intestinal wall, while menstruation can account for a further 3mg a day. Given that the average European diet nets around 15mg of iron per day, one might wonder how we avoid accumulating iron very quickly. The answer is that only about 10% of consumed iron is actually absorbed into the body and put to use; the remainder simply passes through. Why? The body can only absorb iron in its ferrous Fe2+ state or as an iron-protein such as heme. Ferric iron has to be converted to ferrous, a process significantly helped by consuming ascorbic acid (vitamin C) at the same time.
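The balance described above can be sketched in a few lines. The numbers are the article's own illustrative averages, not clinical values:

```python
# Back-of-envelope daily iron balance, using the figures from the text above.
dietary_intake_mg = 15.0   # average European daily iron intake (article's figure)
absorption_rate = 0.10     # only ~10% of dietary iron is actually absorbed
daily_loss_mg = 1.5        # midpoint of the 1-2 mg/day lost via intestinal sloughing

absorbed_mg = dietary_intake_mg * absorption_rate   # iron actually entering the body
net_balance_mg = absorbed_mg - daily_loss_mg        # intake and loss roughly cancel

print(f"absorbed: {absorbed_mg:.1f} mg/day")
print(f"net balance: {net_balance_mg:+.1f} mg/day")
```

The absorbed ~1.5 mg/day almost exactly offsets the 1-2 mg/day of losses, which is why a healthy person neither accumulates nor depletes iron - and why a mutation that raises the absorption rate, as C282Y does, tips the system into steady accumulation.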
Thus iron absorption and use are highly dependent on the type of foods being eaten. A number of compounds such as oxalates, phytates, tannins and fiber hinder the absorption of non-heme iron, a problem which will bring us on to agriculture shortly. So whilst high-meat diets, such as those of hunter-gatherers, have no problem providing enough heme iron, diets which rely heavily on ferrous and ferric iron run the risk of iron deficiency.
Anaemia and Agriculture
Despite iron overload being associated with the onset of the Neolithic, the opposite problem is in fact the more significant result of the adoption of agriculture - a deficiency of iron causing a decrease in red blood cells. The effects of anaemia are devastating and global. In a paper entitled A systematic analysis of global anemia burden from 1990 to 2010, the authors conclude that “Global anaemia prevalence in 2010 was 32.9%” and that anaemia accounted for “8.8% of the total disability from all conditions in 2010”.
To quote from another paper looking at the evolutionary logic behind anaemia:
“In the developing countries, prevalence of IDA [iron-deficient anaemia] is estimated to be 56% among pregnant and 41% among non-pregnant women. In South Asia, more pregnant women have IDA (62%) than a normal hemoglobin level; this is also true for women and children in some African countries. Although many reports have not excluded other causes of microcytic anemia that could lead to overestimation of the prevalence of IDA, in many parts of the world the prevalence of IDA is sufficiently high to be considered a “statistical normality”.”
The reasons for this are many, but an agricultural diet is amongst the top contributors. Agriculture does two things simultaneously: it decreases the amount of heme-iron-rich foods in the diet - in particular red meat - and it increases the amounts of compounds which interfere with and block the absorption of what little iron can be found in domesticated plants - phytates in cereals, lactoferrin and calcium in milk, ovotransferrin in eggs and tannins in nuts, tea and other plants. The switch from a forager to an agricultural diet was therefore a vast biological selection process, one which utterly transformed metabolic patterns laid down over at least the previous two hundred thousand years. On top of this, the Neolithic Revolution also increased the birth rate, the frequency of conflict and warfare, and the burden of parasites and infectious diseases in human populations - all extra risk factors for people suffering from chronic iron shortages.
Interestingly, both low and excess iron have adaptive properties. We’ll discuss the benefits of iron overloading later, but anaemia does also seem to protect the individual against chronic parasitic and bacterial infections. All microorganisms require iron for their own metabolic processes, so when a person suffers from an infection, one response is to increase the production of an intestinal protein called apoferritin. This sequesters free iron and binds it in a form unavailable to any infectious bacteria or parasites. Many bacteria have their own mechanisms for securing iron from their hosts: Helicobacter pylori has surface molecules for acquiring human iron-proteins; the plague-causing Yersinia pestis steals iron using human binding proteins; and Streptococcus actively destroys iron-storing cells to acquire free iron. Lower free-iron levels in anaemia are therefore adaptive against chronic and persistent infections, essentially hiding iron from pathogens and potentially decreasing mortality.
Parasites such as hookworm and malaria also play an important role in evolutionary history. Bizarrely, hookworm and malaria infections often coincide, but where a person suffers a high level of hookworm infestation, their risk of death from malaria decreases. Although a quarter of the world’s population being infected with hookworms sounds atrocious, this has been postulated to be an adaptive and beneficial mutualism, since hookworms decrease free iron and help blunt the lethality of the malarial parasite:
“A protective effect of iron deficiency against malaria has been supported by studies of animals and humans. Iron-supplementation treatment of anemia increases the risk of P. vivax malaria.
In contrast, anemia-inducing parasites, including particular helminths and nematodes, appear to offer a benefit against malaria to humans who are infected with these organisms and who live in regions in which malaria is endemic. Bacteria that induce iron-deficiency anemia in humans also might confer resistance to malaria”
Bloodletting & Tea Drinking
Tea has been drunk for at least 5,000 years. The plant, Camellia sinensis, originated in southwestern China and northern Burma, and may have its roots in the Holocene ‘Broad Spectrum Revolution’ of increasing plant use as temperatures across the planet rose. Tea contains tannins, a type of polyphenolic compound remarkably good at binding to proteins. This property is especially acute when it comes to iron absorption, and excessive tea drinking is a major risk factor for anaemia due to the ability of tannins to sequester iron: one study found that eliminating tea drinking produced a three-fold increase in non-heme iron uptake. But as with anaemia in general, it is possible that widespread tea consumption may be adaptive against the persistent threat of plague and malaria, decreasing free iron and improving survival rates at the cost of general overall health. Some researchers have linked the explosion of tea drinking in Europe with its beneficial effects against the ‘White Plague’ of 17th century tuberculosis. Similarly, although I haven’t seen any research into this topic, the boom in nut consumption in places like Japan, California and Central Europe during the early Holocene may have had a similar effect against the increase in malaria and the infections of sedentary lifestyles, due to the high tannin levels of certain nuts.
On the opposite end of the iron-spectrum, those suffering from haemochromatosis, particularly men, have no way to reduce their excess iron except through bleeding. There are hints in the literature that this may have been a selective mechanism for aggression and warfare amongst northern European populations, but certainly the ‘quack’ medical intervention of bloodletting may also have been an effective remedy in the same regions of the world. As Burke & Duffy note:
“In medicine, Armand Trousseau presented the first case description in 1865, and Friedrich Daniel von Recklinghausen applied the name “hemochromatosis” in 1889 (Geller & de Campos, 2015). Screening and early diagnosis of hereditary hemochromatosis can offset the potentially damaging effects of iron accumulation, and regular phlebotomy provides “a simple, cheap and efficient therapeutic modality” to purge excessive iron from the body (Girouard et al., 2002:185). Phlebotomy was first introduced clinically in 1950, an effective treatment because the blood loss stimulates erythropoiesis, helping to draw iron out of storage in peripheral tissues (Hollerer et al., 2017:812).”
Other review papers looking at bloodletting, or phlebotomy, come to the same conclusion. This suggests that our scornful view of medieval and early modern medicine needs some revising, since male populations in Britain, Ireland and Scandinavia at least would have gained some benefit from purging their blood on a semi-regular basis.
Ireland & Iron-Overload
Having set up the discussion so far, we can see that agriculture had a widespread effect globally on iron-metabolism and introduced both anaemia and some of the pathogenic conditions under which anaemia might be beneficial. Crowded and disease-ridden cities, booming tropical populations and the growth in malaria and other parasites may have been offset or at least managed by lower free iron in the body, despite the huge toll it takes on the individual’s general health. Vegetarian diets low in iron and cultural phenomena like tea drinking may have helped people survive plagues and infections. So why then would an excess of iron be of any benefit to anyone?
An excess of iron in the context of an agricultural diet would in general promote greater health and well-being. An over-reliance on cereals, dairy and eggs, and a reduction in meat consumption, causes lower iron absorption - therefore haemochromatosis offers an advantage to a farming-pastoralist population. But this doesn’t explain why the C282Y mutation appears more frequently in Celtic and northern European peoples. One possibility relates to cold: anaemia causes a decrease in thyroid output and lowers overall body thermogenesis. A paper from 2016 suggests that C282Y is a climate-driven adaptation for Neolithic groups moving into a cold and damp climate, helping to maintain a higher body temperature:
“The C282Y allele is the major cause of hemochromatosis as a result of excessive iron absorption. The mutation arose in continental Europe no earlier than 6,000 years ago, coinciding with the arrival of the Neolithic agricultural revolution. Here we hypothesize that this new Neolithic diet, which originated in the sunny warm and dry climates of the Middle East, was carried by migrating farmers into the chilly and damp environments of Europe where iron is a critical micronutrient for effective thermoregulation. We argue that the C282Y allele was an adaptation to this novel environment.”
However, this has been challenged on the grounds that similar iron-overload mutations have appeared and spread in other parts of the world where temperature is not an issue, most notably during the Bantu Expansions in Sub-Saharan Africa, where the Q248H mutation performs similarly. Even today the condition known as Bantu Siderosis still affects many individuals, especially those who drink homemade beer brewed in traditional iron pots.
Ireland, though, offers a particularly interesting case of the confluence of culture and biology when looking at haemochromatosis. The traditional pre-Famine diet of Ireland was based around oat porridge, buttermilk and dairy, some beef and fish and then, infamously, potatoes. Potatoes are a nutritional powerhouse. As anyone interested in historical diets knows, a person can live perfectly healthily on potatoes, dairy and the occasional bowl of porridge for trace elements. The introduction of the potato to Ireland was a biological revelation. Boiling them, as per the traditional Irish method, preserves the majority of the vitamin C, a factor absolutely crucial to non-heme iron uptake. On average, adult Irish males ate 12 lbs, adult females 10 lbs, and children under 11 years of age 4 lbs of potatoes per day, which, combined with some dairy, made for a monotonous but healthy diet.
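The scale of the vitamin C supply implied by those rations can be roughly estimated. The per-100g vitamin C figure below is an assumed, illustrative value for boiled potatoes - it is not from the article, and real values vary with variety, storage and cooking:

```python
# Rough vitamin C yield of the pre-Famine potato rations described above.
# ASSUMPTION: ~13 mg vitamin C per 100 g of boiled potato (illustrative only;
# raw potatoes are higher, and old stored potatoes lower).
LB_TO_G = 453.6
VIT_C_MG_PER_100G = 13.0

def daily_vitamin_c_mg(potatoes_lb):
    """Estimated vitamin C (mg/day) from a daily potato ration in pounds."""
    grams = potatoes_lb * LB_TO_G
    return grams * VIT_C_MG_PER_100G / 100.0

for person, lbs in (("adult male", 12), ("adult female", 10), ("child under 11", 4)):
    print(f"{person} ({lbs} lb/day): ~{daily_vitamin_c_mg(lbs):.0f} mg vitamin C/day")
```

Even under this conservative assumption, a 12 lb ration supplies on the order of 700 mg of vitamin C a day - many times a modern recommended intake - which helps explain both the strong non-heme iron uptake on the traditional diet and the scurvy that followed when the potato disappeared.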
Unexpectedly, the Irish population is amongst the most gluten intolerant in Europe. Roughly 1% of Irish people and their descendants suffer from celiac disease, likely due to a combination of genetic and dietary factors. Wheat and other cereal grains fare poorly in the damp boggy conditions of the northwest Atlantic, but oats grow much better. Oats are significantly lower in gluten and were prepared as a fermented porridge with dairy, increasing their digestibility.
Combining these two facts together with the bottleneck of the Famine, we can piece together perhaps a major selection pressure for the persistence of C282Y in Ireland compared to elsewhere.
The Great Famine & Its Consequences
The potato blight which spread rapidly across Europe hit Ireland hardest of all, due to its over-reliance on this single crop. Potatoes, as we’ve seen, are exceptionally good crops for a poor farmer, plus they can be harvested twice, sometimes even three times a year. When Phytophthora infestans struck, it created an enormous and sweeping selection mechanism across the population.
The major substitute famine food introduced by the British was maize, a crop which was poorly understood. The phenomenon of niacin deficiency in those on a predominantly maize-based diet was not known to Europeans at the time, and it took major famines and pellagra outbreaks in Italy and in the southern United States to prompt extensive research into the topic. For Irish tenant farmers in the 1840s, the knowledge that maize had to be heat-treated with an alkaline material to be fully nutritious was a world away. Metabolically, the absence of niacin can be remedied through the conversion of the amino acid tryptophan. Tryptophan was available from the buttermilk and dairy in the traditional diet, and coincidentally, the more iron the body has available, the easier the conversion process. Anaemic people cannot convert tryptophan so readily; thus the C282Y-‘enriched’ Irish were at an advantage during the early years of the Famine.
Famines also cause disease, often the greater killer than starvation. Stressed and malnourished people become easy prey for any number of infections, and in Ireland this was combined with widespread scurvy, since the potato was the major source of vitamin C in the diet. Curiously though, the C282Y mutation may have offered enough protection against many of these diseases to act as a selective mechanism for survival:
“It is reasonable to infer that epidemic typhus bacteria would be disadvantaged by the iron withholding associated with C282Y, since work on Rickettsia rickettsia, a related species, confirms that limiting the bacterium's access to iron inhibits growth (Ellison et al., 2009). In this instance, amidst the malnutrition, scurvy, and anemia apparent in emigrants boarding ships, high mortality caused by epidemic typhus would have offered strong selection for C282Y, and promoted the allele's distribution, via gene flow, into territories such as Canada, the United States, and Australia that received waves of emigrants escaping the Famine.”
The final major stressor here was the replacement food that Ireland came to rely on during and after the Famine, namely wheat bread and black tea. As mentioned, celiac disease is far more common in Ireland than elsewhere and the phytate levels in wheat are significantly higher than found in oats. Together the diarrhoea and loss of available non-heme iron would have favoured those who already possessed excess iron levels.
The introduction of black tea was equally devastating for the general health of the population. As we’ve seen, tannins are exceptionally effective scavengers of non-heme iron, which, combined with phytates, a lack of meat, widespread alcohol consumption and high infection rates, compounded the selective advantage of those with extra iron stored in their system. Tea was prepared through day-long stewing on a stove, a method which simply increased the tannins in the final drink. The effects of tea became obviously deleterious, as stated in this quote from Miller (2013):
“Concerns about the impact of tea on mental health reached a crescendo when tea drinking became implicated in an apparently dramatic increase in insanity in Ireland, discussion of which reveals the extent to which medical opinion on the matter had begun to penetrate even official circles. Like other countries, Ireland had suffered during the agricultural depression of the late nineteenth century. Contemporaneously, Irish asylums reported dramatic rises in admissions, which were blamed in a special inquiry undertaken on the issue in 1894 upon a lack of nutritious food, increased vexation and worry and the gradual derangement of physical and mental functions. Tea was targeted repeatedly throughout this report. It was observed that Indian tea of inferior quality was commonly consumed by Ireland’s poorer classes – stewed, rather than infused – and that this caused a peculiar form of dyspepsia which in turn debilitated nervous systems and generated psychological problems. Inspectors observed that a general dietary change from oatmeal, porridge, potatoes and milk to bread and tea had occurred throughout the country. This, combined with severe mental strain, had resulted in epileptic seizures and consequent mania, noted to rapidly pass away following a period of rest and nutritious food supplied in the asylum. One inspector noted that in districts including Ballinasloe, County Galway, alcohol usage had declined dramatically, meaning that what was termed ‘the insanity of malnutrition’ seemed to have been a prime explanatory factor for rising incarceration levels. Special prominence was also given to the excessive consumption of stewed tea by factory workers in the industrialized region of Belfast.”
Doctors repeatedly noted that families traded away eggs and other nutritious foods for tea and that, on average, women were drinking 12 cups of strong black tea every day. The resulting ill-health was horrific, particularly for children, who became pale, anaemic and disease-ridden, raised on a diet of black tea and bread. Again from Miller:
“Over-indulgence in the substance, claimed the newspaper, was causing numerous housewives to seek solace in the outpatient departments of hospitals, where washerwomen, kitchen girls and mothers would arrive daily with symptoms including headaches, nausea, loss of appetite, physical distress after eating, and dizziness. The Belfast Newsletter depicted a cycle of events whereby the housewife would gradually lose her appetite due to excessive tea consumption, slowly coming to loathe food. She would then find solace in the tea-cup, although this ultimately intensified her condition. Methods of tea preparation which entailed obtaining as much tannin (or tannic acid) from the tea as possible would then be fostered to quell her physical cravings”
The decrease in potatoes and replacement with maize, bread and tea, plus the rise in infectious disease, combined to create a new cultural niche where C282Y carriers were more likely to survive childhood and reproduce. Although this scenario is exceptional in many ways, not least the global trade in foodstuffs, it does raise the question of whether haemochromatosis was generally adaptive during earlier periods of starvation, famine and widespread disease.
The story of how agriculture came to alter the human body is long and extensive, but iron surely ranks high on the list of problems it has brought. Almost every population on earth now suffers from iron-deficiency anaemia in one form or another and, despite any benefits it might bring with regards to disease, it clearly creates a form of sickly life, hardly brimming with health and vitality. Haemochromatosis, on the other hand, is a strange disorder which on the face of it should be more widespread, given the advantages it offers agricultural peoples. And yet it remains confined to small pockets of the world, with no overarching explanation as to why. Complicated histories of disease, diet, famine and perhaps warfare seem to have mixed together to promote the C282Y variant in northern Europe, and for Ireland specifically the Famine was a major selective pressure. The biological story of the Famine is a reminder of the power of basic metabolic facts like iron absorption and niacin availability. These facts underpin how a population tackles a catastrophe like a famine, or in our times, a global pandemic. We should always be cognizant of the hidden factors of ethnicity and genetics in such scenarios and do our best to understand how populations form, where they come from and what pressures they have been through.