
Something large moved 2 billion years ago

More than 50 years ago a group of schoolchildren discovered a fronded fossil (Charnia) in the Precambrian rocks of Charnwood Forest in the English Midlands. Since then it has been clear that multicellular life originated before the Cambrian Period, in which tangible life had previously been thought to have first emerged. Discovery of the rich Ediacaran fauna of quilted, bag-like and disc-like animals in Neoproterozoic sediments of South Australia, and of many other occurrences since, established the start of the ‘carnival of animals’ in the Ediacaran Period (635 to 541 Ma). It happened to follow the climatic and environmental turmoil of at least two Snowball Earth episodes during the preceding Cryogenian Period (850 to 635 Ma), which has led to a flurry of suggestions for the transition from protozoan to metazoan life. Yet applying a ‘molecular-clock’ approach to the genetic differences between living metazoan organisms seems to suggest a considerably earlier evolutionary event that started ‘life as we know it’. That may have been confirmed by a discovery in much older sediments in Gabon, West Africa.

A sequence of shallow-marine sediments in the Francevillian Series in Gabon was laid down at a time of fluctuating sea level around 2100 Ma ago, when the upper oceans had become oxygenated. In them are black shales that preserve an abundance of intricate sedimentary features. Among them are curious stringy structures rich in crystalline pyrite (FeS2). They are infilled wiggly tubes that lie in the shale bedding. CT scans reveal that the bedding was flattened around the tubules as it became lithified, so the tubes formed while the sediment was wet and soft (El Albani, A. and 22 others 2019. Organism motility in an oxygenated shallow-marine environment 2.1 billion years ago. Proceedings of the National Academy of Sciences, online preprint; DOI: 10.1073/pnas.1815721116). They look very like burrows. Up to 5 mm across, they can be considered large by comparison with almost all organisms known from that time. The exception comes from the same stratigraphic Series in Gabon. In 2010, El Albani and colleagues published an account of fossils preserved by pyrite that look like fried eggs, 1 to 2 cm across, with scalloped edges. Internal structures revealed by CT scanning include radial slits in the ‘whites’ and folding within the central ‘yolk’. That paper reported the geochemical presence in the host shales of steranes, which are breakdown products of steroids that are unique to eukaryotes. Could these organisms and the wiggly tube-like trace fossils indicate the presence of the earliest metazoans in the Francevillian Series?


Palaeoproterozoic fossils from the Francevillian Series in Gabon. Top: greyscale photographs of burrow-like trace fossils (Credit: El Albani et al. 2019; Fig. 1). Bottom: colour photograph and three CT scans of a discoidal fossil (Credit: El Albani et al. 2010; Fig. 4).

Until the discoveries in Gabon, the oldest organic structure that had been suggested to be a metazoan was the rare Grypania, a spiral, strap-like fossil found in a variety of strata ranging in age from 1870 to 650 Ma. Being made of a structureless ribbon of graphite, Grypania seems most likely to have been made by colonial bacteria. The two Gabon life forms cannot be disposed of quite so easily. The discoids have organised structures rivalling those in Ediacaran animals, while the wiggly tubes clearly seem to indicate something capable of movement. In both cases preservation is by iron sulfide, which suggests the presence at some stage of chemo-autotrophic bacteria that reduce sulfate ions to sulfide. Could these not have formed mats taking the form of irregular discs and plates? The burrows may have been formed by unicellular eukaryotes, one type of which – the slime moulds – is capable of aggregating to form multi-celled reproductive structures as well as living freely as single amoebae. Some form slug-like masses that are capable of movement; not metazoans, but perhaps their precursors.

A stratigraphic timeline for the Denisova Cave

Denisova Cave was named to commemorate an 18th century hermit called Denis, who used it as his refuge. The culmination of more than four decades of excavation, which followed the discovery there of Mousterian and Levallois tools, has been the explosion onto the palaeoanthropological scene of Denisovan genomics, beginning in 2010 with sequenced DNA from a child’s finger bone. The same layer yielded Neanderthal DNA from a toe bone in 2013. Another layer yielded similar evidence in 2018 of an individual who had a Neanderthal father and a Denisovan mother. Application of the new technique of peptide mass fingerprinting, or zooarchaeology by mass spectrometry (ZooMS), to small, unidentifiable bone fragments from the cave sediments revealed further signs of Denisovan occupation and the first trace of anatomically modern humans (AMH). So far the tally is four Denisovans (two female children and two adult males), a Neanderthal woman and the astonishing hybrid. Analyses of the sediments themselves showed traces of both Neanderthal and Denisovan mtDNA from deeper in the stratigraphy than the levels in which human fossils had been found, but which contained artefacts. The discovery of the first Denisovan DNA revealed that AMH migrants from Africa who reached the West Pacific islands about 65 ka ago carried fragments of that genome. As well as hybridising with Neanderthals, some of the people who left Africa had interbred with Denisovans sufficiently often for genetic traces to have survived. Yet, until now, the ages of the analysed samples from the cave remained unknown.

That is no surprise for two reasons: cave sediments are complex, having been reworked over millennia to scramble their true stratigraphy; and most of the organic remains defied 14C dating, being older than its maximum limit of determination. However, using alternative approaches has resulted in two papers in the latest issue of Nature. The first reports results from two methods that rely on the luminescence of grains of quartz and feldspar when stimulated, which measures the time since they were last exposed to light (Jacobs, Z. and 10 others 2019. Timing of archaic hominin occupation of Denisova Cave in southern Siberia. Nature, v. 565, p. 594-599; DOI: 10.1038/s41586-018-0843-2). Over 280 thousand grains in 103 sediment samples from different depths and various parts of the cave system have yielded a range of ages from 300 to 20 ka that spans three glacial-interglacial cycles except for a few gaps, giving rough estimates of the timing of hominin occupation shown by fossils and soil layers that contain DNA. The youngest evidence for Denisovans is roughly 50 ka old, a time when AMH was present elsewhere in Siberia and roughly halfway between the 130 ka interglacial and the last glacial maximum. Two Neanderthals, a Denisovan and the hybrid occupied the site during the 130 ka interglacial. Soils from the previous warm episode from 250 to 200 ka contain both Neanderthal and Denisovan DNA traces. The oldest occupancy, marked by the presence of a Denisovan bone sample, was 300 ka ago, once again midway between an interglacial and a glacial maximum.
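The age equation behind both luminescence methods is straightforward: the burial age is the radiation dose a grain has absorbed since its ‘clock’ was last reset by daylight (the equivalent dose), divided by the dose it receives each year from its surroundings. A minimal sketch of that calculation, with illustrative numbers rather than values from the Jacobs et al. study:

```python
# Minimal sketch of the age equation behind optically stimulated luminescence
# (OSL) dating: burial age = equivalent dose / environmental dose rate.
# The numbers below are illustrative only, not values from Jacobs et al. (2019).

def osl_age_ka(equivalent_dose_gy: float, dose_rate_gy_per_ka: float) -> float:
    """Return the burial age in thousands of years (ka)."""
    return equivalent_dose_gy / dose_rate_gy_per_ka

# A quartz grain that has absorbed 300 Gy in sediment delivering 3 Gy per ka
# would have been buried for roughly 100 ka.
print(osl_age_ka(300.0, 3.0))  # -> 100.0
```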


All the hominin remains found in Denisova Cave: Note the common scale. (Credit: Douka et al. 2019; extended data Figure 1)

The second paper (Douka, K. and 21 others 2019. Age estimates for hominin fossils and the onset of the Upper Palaeolithic at Denisova Cave. Nature, v. 565, p. 640–644; DOI: 10.1038/s41586-018-0870-z) focused on direct dating of the hominin fossils themselves – and thus their DNA content, important in trying to piece together the timings of genetic mixing. Radiocarbon dates could not be obtained from most of the bones themselves because they are older than 50 ka; only the youngest specimen returned a 14C age, right at that limit. The team therefore resorted to a hybrid technique that models each fossil’s age from the differences in mtDNA between it and the youngest hominin, which, luckily, was dateable by radiocarbon means. Weighted by dating of the actual sediments that contain them, the differences should become greater for successively older fossils because of random mutations: a variant of the ‘molecular clock’ approach. It’s complicated and depends on assuming that the mitochondrial mutation rate was the same as that in modern humans. Unsurprisingly the results are imprecise, but they are sufficient to match the hominin fossil occurrences with different environmental conditions.
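At its simplest, the underlying ‘molecular clock’ relation says that two mtDNA sequences accumulate differences in proportion to the time back to their common ancestor, scaled by an assumed mutation rate. The toy calculation below illustrates only that basic relation; it is not the Bayesian, sediment-weighted tip-dating model of Douka et al., and the rate and difference count are invented for illustration:

```python
# Toy illustration of a strict 'molecular clock': under a constant mutation
# rate, two mtDNA sequences accumulate differences in proportion to the time
# back to their common ancestor.  This is NOT the model used by Douka et al.
# (2019); the rate and difference count below are assumed values.

MUTATION_RATE = 2.5e-8   # substitutions per site per year (assumed)
SEQ_LENGTH = 16_500      # approximate length of a mitochondrial genome, bp

def time_to_common_ancestor(num_differences: int) -> float:
    """Years back to the common ancestor of two contemporaneous sequences.

    Expected differences d = 2 * rate * length * t, so t = d / (2 * rate * length).
    """
    return num_differences / (2 * MUTATION_RATE * SEQ_LENGTH)

# e.g. 80 observed differences would imply a common ancestor ~97,000 years ago
print(round(time_to_common_ancestor(80)))
```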

Pollen grains and vertebrate fossils from various levels in the cave system demonstrate the wide range of climatic and ecological conditions in which the various hominins lived. The warmest episodes supported broad-leafed forest, offering maximum resources for hominin survival. Those between interglacial and full glacial conditions were much less benign, with alternating dry and wet cold conditions that supported open steppe ecosystems. The oldest Denisovan occupation came at the close of a period of moderately warm and humid conditions that supported mixed conifer and broad-leafed woodland, which later gave way to reduced tree cover.

As well as the presence of stone tools sporadically through the sedimentary sequence, in the youngest levels there are bone rings and pendants made from deer teeth; clearly ornamental items. Did the late Denisovans make them or do they signify anatomically modern human activity? Radiocarbon ages do not give a concrete answer: one of the pendants is about 45 ka old, with an error that puts it just within the range of age variation of the youngest Denisovan fossil. No AMH remains have been found in Denisova Cave, but remains of a modern human male have been found at Ust’-Ishim, in NW Siberia. At 45 ka, he represents the earliest arrival of AMH in northern Asia. So it may have been members of this new population that left ornaments in Denisova Cave, but, for the moment, artistic Denisovans remain a possibility.

Further deployment of rapid screening for hominin bone fragments using the ZooMS method and analyses for traces of DNA in soils is likely to expand the geographic and time ranges of Denisovans and other close human relatives. Denisova Cave formed in Silurian limestones of the Altai Range, and there are other caves in those hills …

Related article: Dennell, R. 2019. Dating of hominin discoveries at Denisova. Nature, v. 565, p. 571-572; DOI: 10.1038/d41586-019-00264-0

MOOCs: wheels come off the bandwagon

Massive open online courses (MOOCs for short), first mooted in 2006, surfaced with something of a pop in 2012. Intended to be open to all with Internet access, they promised a renaissance of higher education with the ’best’ professors, educational technologies and materials, flexibility, innovative assessment and accreditation (if chosen), no entry requirements, and very low cost at a time of relentlessly rising fees for conventional study. And they did not require attendance, although certificates of successful completion may be a currency for acceptance in conventional HE. They could be about literally anything, at a variety of levels and involving a range of study times. By the end of 2016 MOOC programs had been set up by more than 700 universities worldwide, and around 58 million students had signed up to one or more courses. The general business model is described as ‘freemium’; i.e. a pricing strategy whereby a product or service is provided free of charge, with a premium charged for certification. There are innumerable variants of this model. The top providers are mainly consortia linking several universities and other academic and cultural entities. FutureLearn, although wholly owned by the formerly world-leading distance-learning distributor the British Open University, has 157 partners in Britain and globally. The university’s venture into the field involved investing several tens of millions of UK pounds at start-up, which some believe was the source of its current financial difficulties.

The 11 January issue of Science published a brief account of the fortunes of a range of MOOC providers (Reich, J. & Ruipérez, J.A. 2019. The MOOC pivot. Science, v. 363, p. 130-131; DOI: 10.1126/science.aav7958) using data from edX, which links Harvard University and MIT. The vast majority of learners who choose MOOCs never return after their first year. Growth in the market is concentrated almost entirely in affluent countries, whereas the model might seem tailor-made, and indeed vital, for less fortunate parts of the world. Completion rates are very low indeed, largely as a result of poor retention: since 2012 drop-out rates in the first year have been greater than 80%. In the data used in the study, both enrollments and certifications from 2012 to last year rose to peaks in the first three years (to 1.7 million and 50 thousand respectively) then fell sharply in the last two years (to <1 million and <20 thousand, respectively). Whatever the ‘mission’ of the providers – was it altruistic or seeking a revenue stream? – the MOOC experience seems to be falling by the wayside. Perhaps many students took MOOCs for self-enlightenment rather than for a credential, as their defenders maintain. Well, the figures suggest that few saw fit to continue the experience. Surely, if knowledge was passed on at a level commensurate with participants’ requirements in a manner that enthused them, a great many would have signed up for ‘more of the same’: clearly that didn’t happen.

The authors conclude with, ‘Dramatic expansion of educational opportunities to underserved populations will require political movements that change the focus, funding, and purpose of higher education; they will not be achieved through new technologies alone.’

A unifying idea for the origin of life

The nickel in stainless steel, the platinum in catalytic converters and the gold in jewellery, electronic circuits and Fort Knox should all be much harder to find in the Earth’s crust. Had the early Earth formed only by accretion, followed by the massive chemical resetting of the collision that produced the Moon, all three would lie far beyond reach. Both formation events would have led to an extremely hot young Earth; indeed the second is believed to have left the outer Earth and Moon completely molten. All three are siderophile metals and have such a strong affinity for metallic iron that they would mostly have been dragged down to each body’s core as it formed in the first few hundred million years of the Earth-Moon system, leaving very much less in the mantle than rock analyses show. This emerged as a central theme at the Origin of Life Conference held in Atlanta GA, USA in October 2018. The idea stemmed from two papers published in 2015 that reported excess amounts in basaltic material from both Earth and Moon of an isotope of tungsten (182W), another strongly siderophile metal, which forms when a radioactive isotope of hafnium (182Hf) decays. Tungsten too must have been strongly depleted in the outer parts of both bodies when their cores formed. The excesses are explained by substantial accretion of iron-rich material to their outer layers shortly after Moon formation, some of it in large metallic asteroids able to penetrate to depths of hundreds of kilometres. Hot iron is capable of removing oxygen from water vapour and other oxygen-bearing gases, thereby being oxidised. The counterpart would have been the release of massive amounts of hydrogen, carbon and other elements that form gases when combined with oxygen. The Earth’s atmosphere would have become highly reducing.

Had the atmosphere started out as an oxidising environment, as thought for many decades, it would have posed considerable difficulties for the generation at the surface of the hydrocarbon compounds that are the sine qua non for the origin of life. That is why theories about abiogenesis (life formed from inorganic matter) have hitherto focussed on highly reducing environments such as deep-sea hydrothermal vents, where hydrogen is produced by alteration of mantle minerals. The new idea revitalises Darwin’s original notion of life having originated in ‘a warm little pond’. How it has changed the game as regards the first step in life, the so-called ‘RNA World’, can be found in a detailed summary of the seemingly almost frenzied Origin of Life Conference (Service, R.F. 2019. Seeing the dawn. Science, v. 363, p. 116-119; DOI: 10.1126/science.363.6423.116).

Isotope geochemistry has also entered the mix in other regards, particularly that gleaned from tiny grains of the mineral zircon that survived intact from as little as 70 Ma after the Moon-forming and late-accretion events to end up, 3 billion years ago, in the now famous Mount Narryer Quartzite of Western Australia. The oldest of these zircons (4.4 Ga) suggest that granitic rocks had formed the earliest vestiges of continental crust far back in the Hadean Eon: only silica-rich magmas contain enough zirconium for zircon (ZrSiO4) to crystallise. Oxygen isotope studies of them suggest that at that very early date they had come into contact with liquid water, presumably at the Earth’s surface. That suggests that perhaps there were isolated islands of early continental material, now vanished from the geological record. A 4.1 Ga zircon population revealed something more surprising: graphite flakes with carbon isotopes enriched in 12C, which suggests the zircons may have incorporated carbon from living organisms.


A possible timeline for the origin of life during the Hadean Eon (Credit: Service, R.F. 2019, Science)

Such a suite of evidence has given organic chemists more environmental leeway to suggest a wealth of complex reactions at the Hadean surface that may have generated the early organic compounds needed as building blocks for RNA, such as aldehydes and sugars (specifically ribose, which is part of both RNA and DNA), and the nucleobases forming the A-C-G-U ‘letters’ of RNA, some reactions being catalysed by the now abundant siderophile metal nickel. One author seems gleefully to have resurrected Darwin’s ‘warm little pond’ by suggesting periodic exposure above sea level of abiogenic precursors to volcanic sulfur dioxide that could hasten some key reactions and create large masses of such precursors, which rain would have channelled into ‘puddles and lakes’. The upshot is that the RNA World precursor to the self-replication conferred on subsequent life by DNA is speculated to have arisen around 4.35 Ga, 50 Ma after the Earth had cooled sufficiently to have surface water dotted with specks of continental material.

There are caveats in Robert Service’s summary, but the Atlanta conference seems set to form a turning point in experimental palaeobiology studies.

Impacts increased at the end of the Palaeozoic

Because it is so geologically active, the Earth progressively erases signs of asteroid and comet impacts by erosion, burial or, in the case of the oceanic record, even subduction. As a result, the number of known craters decreases with age. To judge the influence of violent extraterrestrial events in the past, geologists therefore rely on secondary outcomes of such collisions, such as the occasional presence in the sedimentary record of shocked quartz grains, glassy spherules and geochemical anomalies of rare elements. The Moon, on the other hand, is so geologically sluggish that its surface preserves many of the large-magnitude impacts during its history, except for those wiped out by later such events. For instance, a sizeable proportion of the lunar surface comprises its dark maria, flood basalts that filled gigantic impact basins excavated around 4 billion years ago. Older impacts can only be detected in its rugged, pale highland terrains, and they have been partially wiped out by later impact craters. The Moon’s surface therefore preserves the most complete record of the flux and sizes of objects that have crossed the orbit it shares with the Earth.

The Earth presents a target about thirteen times bigger in cross-sectional area than the Moon, so it must have received roughly 13 times more impacts in their joint history. Being about 81 times as massive as the Moon, its stronger gravitational pull will have attracted yet more, and all of them would have struck at higher speeds. The lunar samples returned by the Apollo Missions have yielded varying ages for impact-glass spherules, so that crater counts combined with evidence for their relative ages have been calibrated to some extent to give an idea of the bombardment history of the Earth-Moon system. Until recently this was supposed to have tailed off exponentially since the Late Heavy Bombardment between 4.0 and 3.8 billion years ago. But the dating of the lunar record using radiometric ages of the small number of returned samples is inevitably extremely fuzzy. A team of planetary scientists from Canada, the US and Britain has developed a new approach to dating individual craters using image data from NASA’s Lunar Reconnaissance Orbiter (LRO), launched in 2009 (Mazrouei, S. et al. 2019. Earth and Moon impact flux increased at the end of the Paleozoic. Science, v. 363, p. 253-257; DOI: 10.1126/science.aar4058).
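Those figures follow from simple geometry plus gravitational focusing: the ratio of cross-sectional areas is the square of the ratio of the radii, and each body’s escape velocity inflates its effective target size by a factor of 1 + (v_esc/v)^2. A back-of-envelope check using standard planetary values and an assumed typical encounter speed (none of these numbers come from the paper):

```python
# Back-of-envelope check on the Earth/Moon impact ratio quoted above.
# Standard planetary values; the encounter speed is an assumed typical value
# for Earth-crossing asteroids, not a figure from Mazrouei et al. (2019).

R_EARTH_KM, R_MOON_KM = 6371.0, 1737.4        # mean radii, km
V_ESC_EARTH, V_ESC_MOON = 11.2, 2.4           # escape velocities, km/s
V_ENCOUNTER = 18.0                            # assumed typical approach speed, km/s

# Geometric target size: ratio of cross-sectional areas = (radius ratio)^2
area_ratio = (R_EARTH_KM / R_MOON_KM) ** 2     # ~13.4

# Gravitational focusing enlarges the effective target by 1 + (v_esc / v)^2
focus_earth = 1 + (V_ESC_EARTH / V_ENCOUNTER) ** 2
focus_moon = 1 + (V_ESC_MOON / V_ENCOUNTER) ** 2

print(f"area ratio    ~ {area_ratio:.1f}")
print(f"with focusing ~ {area_ratio * focus_earth / focus_moon:.1f}")
```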

The method that they devised is, curiously, based on thermal imagery from the LRO’s Diviner instrument, which records the Moon’s surface temperature. Comparison of day- and night-time temperatures produces a measure of surface materials’ ability to retain heat, known as thermal inertia. A material with high thermal inertia stays warmer for longer at night. When a crater forms, it partly fills with rock fragments excavated by the impact. When fresh, this debris is full of large blocks of rock that were too massive to be blasted away. But the blocks are exposed to bombardment by lesser projectiles for the lifetime of the crater, which steadily reduces them to smaller fragments and eventually dust. Blocks of solid rock retain significantly more solar heat than do gravelly to dust-sized materials: thermal inertia of the crater floor therefore decreases steadily with age.
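Turning that physical trend into ages can be thought of as a two-step calibration: fit how the rockiness (or thermal inertia) of crater floors decays with time using the few craters whose ages are independently known, then invert the fitted curve for the rest. The sketch below is schematic only; the power-law form and the calibration points are invented for illustration and are not the regolith-breakdown calibration the authors actually used:

```python
# Schematic of the dating step: calibrate how a rock-abundance proxy on crater
# floors decays with time using craters of independently modelled age, then
# invert that curve for undated craters.  All numbers here are invented; this
# is not the published calibration of Mazrouei et al. (2019).
import math

# (rock-abundance proxy, age in Ma) for hypothetical calibration craters
calibration = [(0.30, 100.0), (0.12, 400.0), (0.07, 800.0)]

# Fit log(rockiness) = log(a) + b * log(age) by simple least squares
xs = [math.log(age) for _, age in calibration]
ys = [math.log(rock) for rock, _ in calibration]
n = len(xs)
b = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / \
    (n * sum(x * x for x in xs) - sum(xs) ** 2)
log_a = (sum(ys) - b * sum(xs)) / n

def age_from_rockiness(rock: float) -> float:
    """Invert the fitted power law: age = exp((log(rock) - log(a)) / b)."""
    return math.exp((math.log(rock) - log_a) / b)

# A moderately blocky (i.e. relatively young) crater floor:
print(f"{age_from_rockiness(0.15):.0f} Ma")
```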


Blocky surface of a relatively young lunar crater (Credit: NASA)

As well as the day- and night-time thermal data provided by the Diviner instrument, from which thermal inertia values are calculated, the LRO deploys two cameras that capture black-and-white images of the surface in the visible range, with a resolution of about a metre. They enable the blockiness of crater floors to be estimated. Sara Mazrouei and colleagues measured blockiness and thermal inertia of the floors of 111 craters more than 10 km across, ages of nine of which had been modelled independently using counts of the smaller craters that subsequently accumulated on their floors, shown by even finer resolution images from the Japanese Kaguya orbiter. Their results are surprising. About 290 Ma ago the rate of large impacts on the Moon increased by a factor of 2.6. This might explain why the Neoproterozoic and Palaeozoic Eras are deficient in terrestrial craters. Another inference from the results is that the number of objects in Earth-crossing orbits suddenly increased at the end of the Carboniferous. Maybe that resulted from an episode of collisions and break-up of large bodies in the Asteroid Belt or, perhaps, some kind of gravitational perturbation by Jupiter. The age-distribution of large craters on Earth is no help because of their ephemeral nature. Moreover, apart from Chicxulub, which is bang on the K-Pg boundary, there is little evidence of an increase in impact-driven mass extinctions in the Mesozoic and Cenozoic. Nor for that matter did igneous activity or sediment deposition undergo any sudden changes. There are sediments that seem to have formed as a result of tsunami devastation, but none greater in magnitude than could have been caused by major earthquakes. Or … maybe geologists should have another look at the stratigraphic record.

Early stone tools spread more widely

The rift systems of Ethiopia, Kenya and Tanzania, and the limestone caverns near Johannesburg, South Africa, have a long history of intensive archaeological study, rewarded by many finds of hominin skeletal remains and artifacts over the last century. Each region lays claim to be the birthplace of humans, that in South Africa being grandiloquently dubbed ‘The Cradle of Humankind’. Of course, the realistic chances of making discoveries and careers draw scientists and funds back to these regions again and again: a kind of self-fulfilling prophecy fuelled by the old miners’ adage, ‘to find elephants you must go to elephant country’. The key site for the earliest stone tools was for a long time Tanzania’s Olduvai Gorge, thanks to the Leakeys’ finds of deliberately shaped choppers, hammer stones and sharp edges from about 2 Ma ago in close association with remains of Homo habilis. Termed ‘Oldowan’, signs of this industry emerged from 2.6 Ma sediments in the Afar Depression of Ethiopia in 2010, but with no sign of who had made them. By 2015 the cachet of ‘first tools’ moved to Lomekwi on the shore of Lake Turkana in Kenya, dated to 3.3 Ma but again with no evidence for a maker. In fact the oldest evidence for the use of tools emerged with the controversial discovery in 2010, at Dikika in Afar, of 3.4 Ma old bones that carry cut marks, but with no sign of tools or of whoever had used them. However, remains of Australopithecus afarensis occur only a few kilometers away.

Excavations outside the East African Rift System and South Africa are still few and far between, especially for times before 1 Ma. The High Plateaus of eastern Algeria include one ancient site, near Ain Hanech, which yielded 1.8 Ma Oldowan stone artifacts as long ago as 1992. A nearby site at Ain Boucherit takes the North African record back to 2.4 Ma with both Oldowan tools and cut-marked bones of horse and antelope (Sahnouni, M. and 12 others 2018. 1.9-million- and 2.4-million-year-old artifacts and stone tool–cutmarked bones from Ain Boucherit, Algeria. Science, v. 362, p. 1297-1301; DOI: 10.1126/science.aau0008). Tool makers had clearly diffused across what is now the Sahara Desert by that time. Given the distance between the Lomekwi and Dikika sites in East Africa, that is hardly a surprise, provided climatic conditions were favourable. Michel Brunet’s discovery in 3.3 Ma old sediments of an australopithecine (Au. bahrelghazali) in central Chad demonstrates that early hominins were quite capable of spreading across the African continent. Yet, to wean palaeoanthropologists and their sponsors from hitherto fruitful ‘elephant’ areas to a more ‘blue skies’ approach is likely to be difficult. There are plenty of sedimentary basins in Africa that preserve Miocene to Recent sediments that may yet turn up fossils and artifacts that take the science of human origins and peregrinations further, and possibly in unexpected taxonomic directions.

Related article: Gibbons, A. 2018. Strongest evidence of early humans butchering animals discovered in North Africa. Science News online; DOI: 10.1126/science.aaw2245.

Pterosaurs had feathers and fur

Pterosaurs, which include the pterodactyls and pteranodons, were the first vertebrates to achieve proper, flapping flight. In the popular imagination they are regarded as ‘flying dinosaurs’, whereas the anatomy of the two groups is significantly different. The first of them appeared in the Upper Triassic around 235 Ma ago, at roughly the same time as the earliest known dinosaurs. The anatomical differences make it difficult to decide on a common ancestry for the two. But detailed analysis of pterosaur anatomy suggests that they share enough features with dinosaurs, crocodiles and birds for all four groups to have descended from ancestral archosaurs that were living in the early Triassic, and they survived the mass extinction at the end of that Period. Birds, on the other hand, first appear in the fossil record during the Upper Jurassic, 70 Ma later than pterosaurs. They are now widely regarded as descendants of early theropod dinosaurs, which are known commonly to have had fur and feathers.

Pterosaurs leapt into the public imagination in the final chapter of Sir Arthur Conan Doyle’s The Lost World with a clatter of ‘dry, leathery wings’ as Professor George Challenger’s captive pterodactyl from northern Brazil’s isolated Roraima tepui plateau made its successful bid for escape from a Zoological Institute meeting in Queen’s Hall. Yet, far from being leathery, pterosaurs turned out, in the late 1990s, to have carried filamentous pycnofibres akin to mammalian hair. Widespread reports in the world press during the week before Christmas in 2018 hailed a further development that may have rescued pterosaurs from Conan Doyle’s 1912 description before it sprang from its perch:

It was malicious, horrible, with two small red eyes as bright as points of burning coal. Its long, savage mouth, which it held half-open, was full of a double row of shark-like teeth. Its shoulders were humped, and round them was draped what appeared to be a faded grey shawl. It was the devil of our childhood in person.

Two specimens from the Middle to Upper Jurassic Yanliao lagerstätte in China show far more (Yang, Z. and 8 others 2018. Pterosaur integumentary structures with complex feather-like branching. Nature Ecology & Evolution, v. 3, p. 24-30; DOI: 10.1038/s41559-018-0728-7). Their pycnofibres show branching tufts, similar to those found in some theropod dinosaurs, including tyrannosaurs. They also resemble mammalian underfur fibres, whose air-trapping properties provide efficient thermal insulation. Both body and wings of these pterosaurs are furry, which the authors suggest may also have helped reduce drag during flight, while the filaments around the mouth may have had a sensory function similar to those carried by some living birds. Moreover, some of the filaments contain black and red pigments.


Artist’s impression of a Jurassic anurognathid pterosaur from China (Credit: Yang et al 2018; Fig. 4)

Pterosaurs may have developed fur and feathers independently: a case of parallel evolution in response to similar evolutionary pressures facing dinosaurs, birds and mammals. Alternatively, these coverings may have had a deep evolutionary origin in the common ancestors of all these animal groups, as far back as the Upper Carboniferous and Lower Permian.

Related articles: Nature Editorial 2018. Fur and fossils. Nature, v. 564, p. 301-302; DOI: 10.1038/d41586-018-07800-4; King, A. 2018. Pterosaurs sported feathers, claim scientists (The Scientist); Conniff, R. 2018. Pterosaurs just keep getting weirder (Scientific American); New discovery pushes origin of feathers back by 70 million years (Science Daily)

Calibrating 14C dating

Radiocarbon dating is the most popular tool for assessing the ages of archaeological remains and producing climatic time series, as in lake- and sea-floor cores, provided that organic material can be recovered. Its precision has steadily improved, especially with the development of accelerator mass spectrometry, although it is still limited to the last 50 thousand years or so because of the short half-life of 14C (about 5,730 years). The problem with dating based on radioactive 14C is its accuracy, i.e. whether it always gives a true date. This stems from the way in which 14C is produced – by cosmic rays interacting with nitrogen in the atmosphere. Cosmic irradiation varies with time and, consequently, so does the proportion of 14C in the atmosphere. It is the isotope’s proportion in atmospheric CO2 at any one time in the past, converted by photosynthesis to dateable organic materials, that determines the proportion remaining in a sample after decay through the time since the organism died and became fossilised. Various approaches have been used to allow for variations in 14C production, such as calibration against the calendar ages preserved by tree rings in ancient timber, which can be independently radiocarbon dated. But that depends on timber from many different species of tree from different climatic zones, and is affected by fractionation between the various isotopes of carbon in CO2, which varies between species of plant. But there is a better means of calibration.
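The arithmetic of the method itself is simple exponential decay, and it also shows why the starting atmospheric 14C level matters so much for accuracy: a sample that formed under a 14C-richer atmosphere than assumed returns an apparent age that is too young. A minimal sketch with illustrative values:

```python
import math

HALF_LIFE_14C = 5730.0                      # years
DECAY_CONST = math.log(2) / HALF_LIFE_14C   # per year

def radiocarbon_age(fraction_remaining: float) -> float:
    """Age in years from the measured 14C level relative to the assumed
    initial atmospheric level: N/N0 = exp(-lambda * t)."""
    return -math.log(fraction_remaining) / DECAY_CONST

# A sample retaining 25% of its initial 14C is two half-lives old:
print(radiocarbon_age(0.25))            # ~11460 years

# Why calibration matters: if the atmosphere actually held 5% more 14C when
# the organism was alive than the standard assumption, the uncalibrated age
# comes out ~400 years too young.
bias = radiocarbon_age(0.25) - radiocarbon_age(0.25 / 1.05)
print(round(bias))                      # ~ -403, i.e. apparent age too young
```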

The carbonate speleothem that forms stalactites and stalagmites by steady precipitation from rainwater, sometimes producing visible layering, not only locks in 14C dissolved from the atmosphere by rainwater but also environmental radioactive isotopes of uranium and thorium. So layers in speleothem may be dated by both methods for the period of time over which a stalagmite, for instance, has grown. This seems an ideal means of calibration, although there are snags; one being that the carbon in the carbonate is dominated by carbon from ancient limestone dissolved by slightly acid rainwater, which dilutes the amount of 14C in samples with so-called ‘dead carbon’. Stalagmites in the Hulu Cave near Nanjing in China have particularly low dead-carbon fractions and have been used for the best calibrations so far, going back to the current limit for radiocarbon dating of 54 ka (Cheng, H. and 14 others 2018. Atmospheric 14C/12C during the last glacial period from Hulu Cave. Science, v. 362, p. 1293-1297; DOI: 10.1126/science.aau0747). Precision steadily falls off with age because of the progressive reduction to very low amounts of 14C in the samples. Nevertheless, this study resolves fine detail not only of cosmic-ray variation, but also of pulses of carbon dioxide release from the oceans, which would also affect the availability of 14C for incorporation in organic materials because deep ocean water contains ‘old’ CO2.
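The effect of dead carbon can be expressed as a fixed age offset that has to be removed before a speleothem layer’s 14C age is compared with its U-Th age: diluting the living carbon by a dead-carbon fraction lowers the measured 14C activity and so inflates the raw age. A small sketch of that correction (the 5% fraction below is illustrative, not the Hulu Cave value):

```python
import math

MEAN_LIFE_14C = 5730.0 / math.log(2)   # ~8267 years

def dead_carbon_offset(dcf: float) -> float:
    """Extra apparent 14C age (years) caused by a dead-carbon fraction `dcf`:
    the measured activity is lowered by (1 - dcf), so the offset is
    -ln(1 - dcf) * mean life."""
    return -math.log(1.0 - dcf) * MEAN_LIFE_14C

# An illustrative 5% dead-carbon fraction inflates the raw 14C age by ~420
# years, which must be subtracted before comparing with the U-Th age.
print(round(dead_carbon_offset(0.05)))   # ~424
```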

The earliest humans in Tibet

Modern Tibetans thrive in the rarefied air at altitudes above 4 km partly because they benefit from a mutation of the gene EPAS1, which regulates haemoglobin production. Surprisingly, the segment of Tibetans’ DNA that contains the mutation matches one present in the genome of an undated Denisovan girl’s finger bone found in the eponymous Siberian cave. The geneticists who made this discovery were able to estimate that Tibetans inherited the entire segment sometime in the last 40 thousand years through interbreeding with Denisovans, who probably were able to live at high altitude too. Wherever and whenever this took place, the inheritance was retained because it clearly helped those who carried it to thrive in Tibet. The same segment is present in a few percent of living Han Chinese people, which suggests that their ancestors and those of the Tibetans were members of the same group some 40 ka ago, most of the Han having lost the mutation subsequently.

That inheritance would have remained somewhat mysterious while the existing evidence suggested that the Tibetan Plateau was colonised sometime in the Holocene, possibly by migrating early farmers. A single archaeological site at 4600 m on the Plateau has changed all that (Zhang, X.L. and 15 others 2018. The earliest human occupation of the high-altitude Tibetan Plateau 40 thousand to 30 thousand years ago. Science, v. 362, p. 1049-1051; DOI: 10.1126/science.aat8824). The dig at Nwya Devu, which lies 250 km NW of Lhasa, has yielded a sequence of sediments (dated by optically stimulated luminescence to between 45 and 18 thousand years) that contains abundant stone tools made from locally occurring slate. The oldest coincides roughly with the age of the earliest anatomically modern human migrants into northern China, so the earliest Tibetans may well have been a branch of that same group of people, as suggested by the DNA of modern Tibetan and Han people. However, skeletal remains of both humans and their prey animals are yet to emerge from Nwya Devu, which leaves open the question of who they were: anatomically modern humans or archaic humans, such as Denisovans?

The tools do not help to identify their likely makers. Slate is easy to work and typically yields flat blades with sharp, albeit not especially durable, edges; they are disposable, which perhaps explains why so many were found at Nwya Devu. None show signs of the pressure flaking that typifies tools made from harder, more isotropic rock, such as flint. Yet they include a variety of use-types: scrapers, awls, burins and choppers, as well as blades. The lack of associated remains of prey or hearths is suggested by the authors to signify that the site was a workshop; perhaps that will change with further excavation in the area. The age range suggests regular, if not permanent, occupancy for more than 20 ka.

Related articles: Gibbons, A. 2014. Tibetans inherited high-altitude gene from ancient human. Science News, 2 July 2014; Zhang, J-F. & Dennell, R. 2018. The last of Asia conquered by Homo sapiens. Science, v. 362, p. 992-993; DOI: 10.1126/science.aav6863.

Volcanism and the Justinian Plague

Between 541 and 543 CE, during the reign of the Roman Emperor Justinian, bubonic plague spread through countries bordering the Mediterranean Sea. This was a decade after Justinian’s forces had begun to restore the Roman Empire’s lost territory in North Africa, Spain, Italy and the present-day Balkans by expeditions out of Byzantium (the Eastern Empire). At its height the Plague of Justinian was killing 5000 people each day in Constantinople, eventually to consume 20 to 40% of its population and between 25 and 50 million people across the empire. Like the European Black Death of the mid-14th century, it was caused by the bacterium Yersinia pestis, which originated in Central Asia and is carried in the gut of fleas that live on rats. The ‘traditional’ explanation of both plagues was that the disease spread westwards along the Silk Road and then with black rats that infested ship-borne grain cargoes. Plausible as that might seem, Yersinia pestis, fleas and rats have always existed and remain present to this day. Trade along the same routes continued unbroken for more than two millennia. Although plagues with the same agents recurred regularly, only the Plague of Justinian and the Black Death resulted in tens of millions of deaths over short periods. Some other factor seems likely to have boosted fatalities to such levels.


Monk administering the last rites to victims of the Plague of Justinian

Five years before plague struck, the Byzantine historian Procopius recorded a long period of fog and haze that continually reduced sunlight; typical features of volcanic aerosol veils. Following this was the coldest decade in the past 2300 years, as recorded by tree-ring studies. It coincides with documentary evidence of famine in China, Ireland, the Middle East and Scandinavia. A 72 m long ice core extracted from the Colle Gnifetti glacier in the Swiss Alps in 2013 records the last two millennia of local climatic change and global atmospheric dust levels. Sampled by laser slicing, the core has yielded a time series of data at a resolution of months or better. In 536 an Icelandic volcano emitted ash and probably sulfur dioxide over 18 months, during which summer temperatures fell by about 2°C. A second eruption followed in 540 to 541. ‘Volcanic winter’ conditions lasted from 536 to 545, amplifying the evidence from tree-ring data from the 1990s.

The Plague of Justinian coincided with the second ‘volcanic winter’, after several years of regional famine. This scenario is paralleled by the better-documented Great Famine of 1315-17, which ended the economic prosperity of the 11th to 13th centuries. The period was marked by extreme levels of crime, disease, mass death, and even cannibalism and infanticide. In a population weakened through malnutrition to an extent that we can barely imagine in modern Europe, any pandemic disease would have resulted in the most affected dying in millions. Another parallel with the Plague of Justinian is that it followed the ending of four centuries of the Medieval Warm Period, during which vast areas of land were successfully brought under the plough and the European population had tripled. That ended with a succession of major, sulfur-rich volcanic eruptions in Indonesia at the end of the 13th century that heralded the Little Ice Age. Although geologists generally concern themselves with the social and economic consequences of a volcano’s lava and ash in its immediate vicinity – the ‘Pompeii view’ – its potential for global catastrophe is far greater in the case of really large (and often remote) events.

Chemical data from the same ice core reveals the broad economic consequences of the mid-sixth century plague. Lead concentrations in the ice, deposited as airborne pollution from smelting of lead sulfide ore to obtain silver bullion, fell and remained at low levels for a century. The recovery of silver production for coinage is marked by a spike in glacial lead concentration in 640; another parallel with the Black Death, which was followed by a collapse in silver production, albeit only for 4 to 5 years.

Related article: Gibbons, A. 2018. Why 536 was ‘the worst year to be alive’. Science, v. 362, p. 733-734; DOI: 10.1126/science.aaw0632.