The Road to Metallurgy - Fire & Rocks
The Long Story Of How Humans Mastered Fire, Pigments and Minerals
How, when and why humans began to master metals is one of those great questions of archaeology and world history: the long, slow story of controlling fire and experimenting with different rocks, pigments and ores. Too often, though, it is presented as an ex nihilo breakthrough, a revolution without precedent. In fact, humans had been tinkering with both fire and inorganic chemistry for tens of millennia - cooking food, creating paints, and manufacturing glue, ceramics, weapons and artwork. In this brief piece I want to bring that story alive and lead you through these earliest glimpses of chemical experimentation, starting in the deep Palaeolithic and finishing at the dawn of metallurgy. We’ll see by the end that smelting ores was not a breakthrough from scratch, but the product of a perfect combination of accumulated experience, insight and location.
Fire & Rock - Humble Beginnings
There are surely many ways to tackle the problem of metalworking, but I want to make it simple. There are two key technologies which need to be controlled before metallurgy can begin - the ability to make, direct and manipulate fire and heat, and the ability to locate, identify, manipulate and exploit inorganic materials. By inorganic I mean materials as diverse as clay, ores, minerals, different rocks, crystals and geological features. To smelt metals you must have a good idea of how to create the right temperature and conditions and how to select the right ore and heat it in the right way.
These two technologies have exceptionally archaic roots, starting with the invention and use of fire and the discovery of colourful stones, powders and pigments.
Locating the first use of fire has proved to be a classically impossible problem for archaeology. Broadly speaking we have two lines of evidence, which unhelpfully don’t quite line up. The first is identifying features on sites and artefacts that conclusively prove the controlled use of fire; the second is the physiological adaptations of humans to cooked food. The earliest traces of fire appear around 1.5-1 million years ago, which points to Homo erectus as the first masters of the flame. Two sites, one at Koobi Fora, Kenya, and the other at Wonderwerk Cave, Northern Cape province, South Africa, have fairly solid evidence of heating from micro changes to the surrounding sediments. Other similar sites - Chesowanja (GnJi1/6E), Kenya; Gadeb, Ethiopia; and Swartkrans, South Africa - show the limited use of fire through this time period. What is strange, though, is that Homo erectus didn’t seem to use fire at ‘base camps’ at this point. The surrounding Kuruman Hills near Wonderwerk Cave have turned up tens of millions of stone tools, dating to the contemporary Acheulean, but show no evidence of fire.
Despite this, plenty of archaeologists are unconvinced by the trace evidence and instead prefer much later periods for more reliable signatures of controlled fire, but here we run into a dilemma. Both Homo erectus and Homo sapiens show intense physiological adaptations towards cooked food. In fact, it goes further than this: we are obliged to eat cooked food. As Richard Wrangham neatly summarises:
Key evidence comes from research on raw foodists, that is, people who live for long periods on all-raw diets. Raw-foodist groups typically live in industrial societies on store-bought foods. Even though raw foodists take little exercise compared with hunter-gatherers and have fewer disease challenges, on average they experience chronic energy shortage leading to low body mass index (BMI). In the only study of reproductive performance, incompetent or absent ovulation left more than 50% of women on an all-raw diet unable to reproduce (Koebnick et al. 1999).
These physiological detriments are striking because the diet eaten by raw foodists is extremely high quality compared with any known hunter-gatherer diet (if their diet were eaten raw). Most of the raw foodists’ diet is rich in digestible energy and low in structural fiber because it comes from domesticated species. Furthermore, raw foodists typically process the food extensively by nonthermal means (such as by blending) and (in spite of their supposed adherence to raw) often use heat to lightly cook (up to around 114°F). In addition, raw foodists experience no important seasonal energy shortages (because they buy from globally connected markets; Wrangham 2009).
This obligation, a product of our shortened digestive tracts, smaller teeth, larger brains and lowered tolerance for meat-borne pathogens, points to a very early use of cooked foods in the Homo diet, freeing up calories to feed the hungry brain. At least, this is the ‘Cooked Food Hypothesis’. It posits that sometime during the Lower Palaeolithic, different hominins like Homo erectus and their ancestors began regularly using fire to chemically alter their food. How they made these fires and how they fitted into their social lives is uncertain - perhaps far more sporadically and more remotely than we like to imagine.
The second revolution in human technology that concerns us during the Palaeolithic is the appreciation of colour and the deliberate sourcing and manufacture of inorganic pigments, primarily from an iron-rich clay called red ochre. If any material can claim to have been our constant, common companion, it would be either flint or red ochre. Found in artwork and particularly in burials the world over, red ochre is the colour of cave paintings, of early textiles and jewellery, of graves and death and likely of skin painting or coverings. Ochre also has many other interesting uses, aside from its colour, including being worn as an insect repellent and sunscreen, as a binder in early adhesives, and as an ingredient in leather preservation.
This wonderful little snippet from Alexander Marshack’s 1981 paper on ochre gives a sense of the time depth involved here:
At the Homo erectus shelter of Bečov, Czechoslovakia, J. Fridrich excavated a piece of red ochre that was striated on two faces with the marks of abrasion, and a flat rubbing stone with a granular crystallised surface that had been abraded in the centre, clearly in the preparation of ochre powder. On the floor of the shelter, at the side where the piece of ochre was found, there was a wide area of ochre powder. Seating himself on a rock against the wall of the shelter to study the ochre, Fridrich found that his feet accidentally fitted the only two areas without ochre powder. Homo erectus had sat on this stone, away from other activities in the site, while he made his red powder.
The evolutionary development of symbolic thought and behaviour is believed to have occurred around the threshold of the emergence of Homo sapiens as a species. Ochre pieces from Blombos Cave in South Africa, dated to 350,000 years ago, and the Olorgesailie basin in Kenya, dated to 307,000 years ago, appear to bear this out. Olorgesailie also shows long-distance procurement of obsidian and even manganese dioxide, again for grinding to make pigments.
Neanderthals - Fire, Glue and Art
Speeding into the Middle Palaeolithic and the Neanderthals in particular, we come to another internal academic crossroads. The question at this point is not whether Neanderthals used fire - they absolutely did - but whether they were capable of creating fire on demand. If this sounds ridiculous, it’s because it is, but I will be fair and give the sceptics their due.
The proposal that Neanderthals were unable to make fire when and where they liked is probably best defended by the archaeologist Dennis M. Sandgathe. Across two papers (here and here) he outlines his case based on the frequency and timing of Neanderthal hearths and burnt artefacts (flint, bone etc) at a number of well-excavated sites, particularly Pech de l'Azé IV and Roc de Marsal (Dordogne, France). In these papers he shows that fire use is strongly associated with the warmest climates and seasons, and appears infrequently or is absent during the coldest. This is something of a paradox, which Sandgathe explains as Neanderthals exploiting lightning strikes and wildfires, rather than creating fires themselves. This model has been thoroughly demolished, in my mind at least, by several detailed and complex papers which draw together palaeoclimatology, archaeology and geology to make the opposite case. Readers are welcome to follow the paper trails of arguments in the literature themselves, but I think with the following evidence we can be confident that Neanderthals had a superb grasp on fire production and control.
Neanderthal handaxes, those large pear-shaped flint tools characteristic of their species, have been analysed closely under microscopes, and many display a pattern of scratches and striations which look very similar to those left by flint tools striking against iron pyrites (‘fool’s gold’). This combination has been well documented ethnographically, which is not a surprise, since pyrite striking on flint produces a visible spark. Other evidence for Neanderthal fire skills comes from well-preserved wooden spears, where the points have been fire-hardened, as well as from their careful selection of firewood, as attested from Abric Romaní in Spain. But probably the best display of fire technology comes from the Neanderthal production of glues - specifically tar created from heating birch-bark. This procedure can only be done at a certain temperature and oxygen has to be excluded from the process, indicating that Neanderthals had a fine-tuned mastery of heat.
Finally, the existence of Neanderthal artwork, long dismissed as a possibility, has been amply demonstrated over the last ten years or so. Red ochre, manganese oxide and dioxide, shell beads and feathers have all been discovered in Neanderthal contexts. Taken together, we should be satisfied that Neanderthals not only had superb control over fire production and use, but also that they understood the visible and working properties of different minerals - pyrite, flint, manganese ores and red ochre. All crucial steps towards metalworking.
The Upper Palaeolithic - Ceramics, Cave Art & Fossil Fuels
If I’m focusing heavily on the European archaeological record, it’s because that is where the best archaeological work has been done. The cold climate of the Palaeolithic, combined with the birth of archaeological science in Europe, has meant preservation and focus here is unparalleled, but I will expand to other parts of the world as we go on.
The Upper Palaeolithic in Europe, roughly 50,000-12,000 years ago, saw a tremendous explosion in human creativity and inventiveness. Here we see the first ceramics appear in the human record, during the Pavlovian/Gravettian period (33,000-21,000 years ago). The site of Dolní Věstonice in the Czech Republic is astonishing for many reasons - earliest evidence for textiles, incredible figurine artwork, rich burials and so on - but the invention of ceramics counts amongst the world’s breakthrough technologies. However, unlike the utilitarian ceramic containers of later times, the Gravettian taste was far more cultic and religiously oriented. Thousands of fragments of animal figurines are known from Dolní Věstonice, all displaying signs of thermal shock. Combined with the discovery of limestone kilns in strange structures some way from the camp, we have a fascinating glimpse into the Upper Palaeolithic mind.
Careful analysis of the chemical composition of both the kilns and the figurines suggests that the people of Dolní Věstonice were gathering a kind of clay-like material called ‘loess’, which was wetted and pressed or sculpted, before being placed into a hot fire. The deliberate overloading of water and rapid temperature gain meant the figurines exploded, with glowing glassy fragments launching themselves from the hearths. The temperature inside the kiln was enough for the limestone to form lime (calcium oxide), a material we’ll see again soon. This intentional manipulation of chemistry, on a dark night by a glowing fire, paints a picture of people leaving their camp, gathered around a special kiln, to observe animal models glowing and exploding with great noise and visual effects. A magical performance.
Alongside such pyrotechnics came a profusion of cave art, mostly during the later Magdalenian period (17,000-12,000 years ago). This was the time of the most famous cave paintings, of Lascaux and Altamira - herds of animals, strange shamanic hybrid men, hand-prints and the careful exploitation of the cave surface to make bulls and horses appear like they were emerging from the very rocks themselves. The symbolism and meanings of these paintings will be debated forever, but what is often overlooked is the technical production. Alongside scaffolding to reach high areas of the cave, we see crayons, feather quills, blowpipes, brushes and special applicators. The pigment palette of red ochre and charcoal was expanded with crushed calcite, kaolin, umber, sienna and manganese. Binders, such as animal fats, marrow, albumen, saliva, urine, blood, vegetable juice and calcium carbonate-rich cave water were used. Extenders and preservatives such as crushed bone, biotite, feldspar and ground quartz were also added to the pigments, displaying a stunning level of ingenuity and experimentation on the part of these artists.
Finally, to complete the growing story of fire and fuel management, the Upper Palaeolithic and beyond also saw the first exploitation of fossil fuels, as well as superb control over heating stones. Lignite fragments appear in certain hearths, such as at Les Canalettes and Les Usclades at Causse du Larzac (France). Adding flint into the fire to alter its working properties really comes into its own during the Solutrean period (22,000-17,000 years ago). Solutrean tools are famously outstanding amongst the technological advances of the Palaeolithic, displaying skills and techniques not seen before. Remarkably, heating chert and other tool-stones to make them easier to work requires controlling the temperature in a narrow band between 250 °C and 300 °C. This is an astonishing level of control for an outdoor campfire, and suggestions for how this was achieved have included a ‘sandbath’ made under a fire, or some kind of earth oven or kiln. Work still continues to understand how the Solutreans were able to do this, but it should tell us that sophisticated pyrotechnology was feasible during the Palaeolithic.
End of the Ice Age - Global Revolutions
The end of the ice age saw the world fragment and coalesce around a number of different economic strategies - complex hunter-fisher societies, early agriculture in the Near East, animal domestication, the introduction of forager and farmer ceramics, intensive proto-farming of different plants and stone/brick architecture. Within these revolutions there was an expansion of technologies involving heating different materials.
Plaster quickly became an invaluable material amongst Near Eastern Neolithic societies - as I’ve outlined in a previous article, many cultic buildings and mortuary treatments of skulls required the production of gypsum and lime plasters. To quote from The Beginnings of Pyrotechnology, Part II: Production and Use of Lime and Gypsum Plaster in the Pre-Pottery Neolithic Near East:
Gypsum plaster is made by heating alabaster or gypsum rock (CaSO₄·2H₂O) at a temperature of 150-400°C to form the hemihydrate which, when mixed with water, reacts to reform the dihydrate …. The mix tends to set quickly and the resulting product is relatively soft and susceptible to chipping; it absorbs water, and can only be used for exterior purposes in dry climates. Lime technology is a good deal more complicated. Lime plaster is made by heating limestone (CaCO₃) for an extended period at bright heat, 800-900°C, to form quicklime (CaO), which must be soaked in water to form slaked lime (Ca(OH)₂), a process in which considerable heat is generated. The slaked lime paste can be stored for some time before use, but after drying and standing in air, the product reacts with the atmosphere to form the carbonate, CaCO₃.
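The lime chemistry in that quotation can be condensed into three textbook reactions - a standard summary of the lime cycle, in my notation rather than the paper's:

```latex
% The lime cycle: limestone -> quicklime -> slaked lime -> back to carbonate
\begin{align*}
\mathrm{CaCO_3} &\;\xrightarrow{\;800\text{--}900\,^{\circ}\mathrm{C}\;}\; \mathrm{CaO} + \mathrm{CO_2}
  && \text{(calcination in the kiln)}\\
\mathrm{CaO} + \mathrm{H_2O} &\;\longrightarrow\; \mathrm{Ca(OH)_2} + \text{heat}
  && \text{(slaking)}\\
\mathrm{Ca(OH)_2} + \mathrm{CO_2} &\;\longrightarrow\; \mathrm{CaCO_3} + \mathrm{H_2O}
  && \text{(setting in air)}
\end{align*}
```

The last step is why a lime plaster floor slowly hardens back into, in effect, artificial limestone.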
What is often forgotten about these plasters is how energy- and resource-intensive they would have been to make. A limestone kiln in 1850s Britain required 1.8 tonnes of limestone and two tonnes of wood for every tonne of quicklime. Neolithic production kilns were likely less efficient, sucking up huge quantities of wood, charcoal, stone and manpower. On top of this is the danger involved in quicklime production - unlike many small-scale technologies we’ve seen so far, kilns of burning limestone have the potential to seriously injure or kill their workers, anticipating the increased risk of metallurgy from heat and toxic fumes.
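To get a feel for the scale of the resource demand, here is a minimal Python sketch of the arithmetic, using the 1850s British figures above. The efficiency factor for a Neolithic kiln is a purely illustrative guess, not a measured value:

```python
# Input estimate for quicklime production, based on the ~1850s British kiln
# figures: 1.8 t limestone and 2 t wood per tonne of quicklime produced.
LIMESTONE_PER_TONNE = 1.8
WOOD_PER_TONNE = 2.0

def kiln_inputs(quicklime_tonnes, efficiency=1.0):
    """Return (limestone, wood) in tonnes for a given quicklime output.

    efficiency < 1.0 models a less efficient kiln (e.g. a hypothetical
    Neolithic one), inflating both inputs proportionally.
    """
    limestone = LIMESTONE_PER_TONNE * quicklime_tonnes / efficiency
    wood = WOOD_PER_TONNE * quicklime_tonnes / efficiency
    return limestone, wood

# One tonne of quicklime in a 19th-century kiln:
print(kiln_inputs(1.0))        # (1.8, 2.0)
# The same tonne from a kiln at half that efficiency:
print(kiln_inputs(1.0, 0.5))   # (3.6, 4.0)
```

Even at 19th-century efficiency, plastering a single large building would have consumed tonnes of felled and hauled timber - a serious communal investment.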
In a similar vein we see Neolithic salt production beginning to emerge across Europe, using wooden wells, charcoal burners and ceramic vessels to drive off the water and condense the salt. One of the best examples with preserved oak trunk wells comes from Fontaines Salées in Saint-Père-sous-Vézelay (Yonne, France). The production of charcoal itself, long a topic of speculation amongst Mesolithic archaeologists, should also be considered here. To my knowledge no early charcoal production sites have been located for Holocene Europe, but this is likely due to the charcoal being removed and the presumed earth oven ‘clamp’ being unidentifiable in the record.
Neolithic mining expanded from its proto-form and became a serious and organised activity in multiple regions. The Gavà Neolithic Mining Complex (GNMC) near Barcelona possessed five levels of activity, galleries and chambers, with the aim of acquiring variscite, a green phosphate mineral similar to turquoise that can be easily cut and polished to make ornaments such as necklaces or bracelets. Red cinnabar (mercury sulphide, HgS) began to be sought after for its vibrant pigmentation - in places like Almadén (Ciudad Real, Spain) and Vinča, on the right bank of the River Danube. Widespread cinnabar use led to mercury poisoning, as documented in the bones of Neolithic people from Perdigões, Portugal. Geological and mineralogical knowledge began to be accumulated across Europe, the Near East and wider Eurasia. As the researchers studying the Gavà Complex conclude:
The present study suggests that the Neolithic miners recognized simple three-dimensional geological structures (tabular bodies, as fractures and the stratigraphic discordance between the Silurian gray shales and the Quaternary caliches, clay deposits and debris), and used these observations to plan new mines and/or to search for new resources. There is sufficient evidence that the Neolithic miners formed their own ideas about which rocks were favorable to find variscite bearing veinlets (gray shales) and which were not (brown and multicoloured shales). We have found no evidence that the miners dug indiscriminately into the rocks.
Cold Forging - The Dawn of Metals
As we draw towards the end of the story, we need to finish on the last rung of the technological ladder - cold forging, or cold working. This refers to manipulating metals without the use of heat, or at least using very low temperatures. Cold forging appears all over the world, from the high Arctic to Anatolia, and even without the later sophisticated metalworking methods, it provides useful cutting edges, points and durable tools.
Typically cold forging refers to copper, although meteoric iron can be cold worked, as the Inuit of Greenland discovered. Native copper, the fortunate occurrence of a metal unbound to other minerals, can be easily worked wherever it is found. Around the North American Great Lakes, the Old Copper Complex cultures made use of native copper possibly as early as 9,500 years ago, which makes them amongst the earliest metalworking peoples. The area around the Levant and Anatolia saw an obvious transition from Epipalaeolithic peoples using malachite for green pigments, to early Neolithic worked copper (Aşıklı Höyük, Nevali Çori), followed by annealed copper beads and pieces around Çayönü Tepesi and Can Hasan (6000 BC), and then a smelted and cast copper awl at Tel Tsaf, in the Jordan Valley, Israel, around 5000 BC. Copper extraction and working developments in the Balkans follow a similar pattern.
Without heading into metallurgy proper, this seems the right place to stop in a potted history of pre-metallurgical fire and mineral technologies. We’ve seen how the first developments of fire and pigmentation grew from their simple and sporadic roots, through increasingly complex and surprisingly early methods of heat control and mineral use, to the first stirrings of copper working across different parts of the world. I’ve necessarily left out many places and examples to make the article manageable, but hopefully you’ll agree that this is a fascinating story of human creativity and innovation, and that metalworking was the end result of millennia of tinkering and experimentation, both for utilitarian tools and more esoteric and artistic endeavours.