MOOCs: wheels come off the bandwagon

Massive open online courses (MOOCs for short), first mooted in 2006, surfaced with something of a pop in 2012. Intended to be open to all with Internet access, they promised a renaissance of higher education with the 'best' professors, educational technologies and materials, flexibility, innovative assessment and accreditation (if chosen), no entry requirements, and very low cost at a time of relentlessly rising fees for conventional study. They did not require attendance, although certificates of successful completion might serve as a currency for acceptance in conventional HE. They could be about literally anything, at a variety of levels and involving a range of study times. By the end of 2016 MOOC programmes had been set up by more than 700 universities worldwide, and around 58 million students had signed up to one or more courses. The general business model is described as 'freemium', i.e. a pricing strategy whereby a product or service is provided free of charge, with a premium charged for certification; there are innumerable variants of this model. The top providers are mainly consortia linking several universities and other academic and cultural entities. FutureLearn, although wholly owned by the formerly world-leading distance-learning provider the British Open University, has 157 partners in Britain and globally. The Open University's venture into the field involved investing several tens of millions of UK pounds at start-up, which some believe to be the source of its current financial difficulties.

The 11 January issue of Science published a brief account of the fortunes of a range of MOOC providers (Reich, J. & Ruipérez, J.A. 2019. The MOOC pivot. Science, v. 363, p. 130-131; DOI: 10.1126/science.aav7958), using data from edX, which links Harvard University and MIT. The vast majority of learners who choose MOOCs never return after their first year. Growth in the market is concentrated almost entirely in affluent countries, whereas the model might seem tailor-made, and indeed vital, for less fortunate parts of the world. Completion rates are very low indeed, largely as a result of poor retention: since 2012 drop-out rates in the first year have been greater than 80%. In the data used in the study, both enrolments and certifications rose to peaks in the first three years after 2012 (to 1.7 million and 50 thousand respectively), then fell sharply in the last two years (to <1 million and <20 thousand respectively). Whatever the 'mission' of the providers – was it altruistic or the pursuit of a revenue stream? – the MOOC experience seems to be falling by the wayside. Perhaps many students took MOOCs for self-enlightenment rather than for a credential, as their defenders maintain. Yet the figures suggest that few saw fit to continue the experience. Surely, if knowledge was passed on at a level commensurate with participants' requirements and in a manner that enthused them, a great many would have signed up for 'more of the same': clearly that didn't happen.

The authors conclude with, ‘Dramatic expansion of educational opportunities to underserved populations will require political movements that change the focus, funding, and purpose of higher education; they will not be achieved through new technologies alone.’

A unifying idea for the origin of life

The nickel in stainless steel, the platinum in catalytic converters and the gold in jewellery, electronic circuits and Fort Knox should all be much harder to find in the Earth's crust. Had the early Earth formed only by accretion, followed by the massive chemical resetting of the collision that produced the Moon, all three would lie far beyond reach. Both events would have left a young Earth that was extremely hot; indeed the second is believed to have left the outer Earth and Moon completely molten. All three are siderophile metals with such a strong affinity for metallic iron that they would mostly have been dragged down to each body's core as it formed during the first few hundred million years of the Earth-Moon system, leaving very much less in the mantle than rock analyses show. This emerged as a central theme at the Origin of Life Conference held in Atlanta GA, USA in October 2018. The idea stemmed from two papers published in 2015 that reported excess amounts of a tungsten isotope (182W) in basaltic material from both the Earth and the Moon. 182W forms when a radioactive isotope of hafnium (182Hf) decays; unlike siderophile tungsten, hafnium is lithophile and so remained in the outer parts of both bodies when their cores formed, allowing the excess to build up there. The excesses are explained by substantial accretion of iron-rich material to the outer layers of both bodies shortly after Moon formation, some of it arriving as large metallic asteroids able to penetrate to depths of hundreds of kilometres. Hot iron is capable of stripping oxygen from water vapour and other oxygen-bearing gases, itself becoming oxidised in the process. The counterpart would have been the release of massive amounts of hydrogen, together with carbon and other elements in their reduced, gaseous forms. The Earth's atmosphere would have become highly reducing.
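As a shorthand for the two processes invoked here – the extinct-radionuclide clock and the stripping of oxygen from volatiles by hot metallic iron – the following schematic reactions may help; they are simplified forms (the decay in fact proceeds via short-lived 182Ta, with an overall half-life of about 8.9 million years):

```latex
% Simplified reactions behind the argument: extinct-radionuclide decay that
% generates the tungsten excess, and reduction of water vapour by hot iron.
\begin{align}
{}^{182}\mathrm{Hf} &\rightarrow {}^{182}\mathrm{W} + 2\beta^{-} + 2\bar{\nu}_e \qquad (t_{1/2} \approx 8.9\ \mathrm{Myr})\\
\mathrm{Fe} + \mathrm{H_2O} &\rightarrow \mathrm{FeO} + \mathrm{H_2}
\end{align}
```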

Had the atmosphere started out as an oxidising environment, as was thought for many decades, it would have posed considerable difficulties for the generation at the surface of the hydrocarbon compounds that are the sine qua non for the origin of life. That is why theories of abiogenesis (life forming from inorganic matter) have hitherto focussed on highly reducing environments such as deep-sea hydrothermal vents, where hydrogen is produced by alteration of mantle minerals. The new idea revitalises Darwin's original notion of life having originated in 'a warm little pond'. How it has changed the game as regards the first step towards life, the so-called 'RNA World', can be found in a detailed summary of the seemingly almost frenzied Origin of Life Conference (Service, R.F. 2019. Seeing the dawn. Science, v. 363, p. 116-119; DOI: 10.1126/science.363.6423.116).

Isotope geochemistry has also entered the mix in other regards, particularly through tiny grains of the mineral zircon that survived intact from as little as 70 Ma after the Moon-forming and late-accretion events to end up, 3 billion years ago, in the now famous Mount Narryer Quartzite of Western Australia. The oldest of these zircons (4.4 Ga) suggest that granitic rocks had formed the earliest vestiges of continental crust far back in the Hadean Eon: only silica-rich magmas contain enough zirconium for zircon (ZrSiO4) to crystallise. Oxygen isotope studies suggest that at that very early date the zircons had come into contact with liquid water, presumably at the Earth's surface. Perhaps there were isolated islands of early continental material, now vanished from the geological record. A 4.1 Ga zircon population revealed something more surprising: graphite flakes with carbon isotopes enriched in 12C, which suggests the zircons may have incorporated carbon from living organisms.
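For readers unfamiliar with the notation, 'enriched in 12C' is conventionally reported as a negative δ13C value, defined relative to a standard and expressed in parts per thousand; biologically processed carbon typically gives markedly negative values:

```latex
% delta-13C: deviation of a sample's 13C/12C ratio from that of a standard,
% expressed in parts per thousand (per mil).
\delta^{13}\mathrm{C} \;=\; \left( \frac{\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\text{sample}}}{\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\text{standard}}} - 1 \right) \times 1000
```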


A possible timeline for the origin of life during the Hadean Eon (Credit: Service, R.F. 2019, Science)

Such a suite of evidence has given organic chemists more environmental leeway to suggest a wealth of complex reactions at the Hadean surface that may have generated the early organic compounds needed as building blocks for RNA, such as aldehydes and sugars (specifically ribose, which is part of both RNA and DNA) and the nucleobases forming the A-C-G-U 'letters' of RNA, some of the reactions being catalysed by the now abundant siderophile metal nickel. One author seems gleefully to have resurrected Darwin's 'warm little pond' by suggesting periodic exposure of abiogenic precursors above sea level to volcanic sulfur dioxide, which could hasten some key reactions and create large masses of such precursors that rain would have channelled into 'puddles and lakes'. The upshot is that the RNA World, the precursor to the self-replication conferred on subsequent life by DNA, is speculated to have arisen around 4.35 Ga, 50 Ma after the Earth had cooled sufficiently to have surface water dotted with specks of continental material.

There are caveats in Robert Service's summary, but the Atlanta conference seems set to mark a turning point in experimental palaeobiology.

Impacts increased at the end of the Palaeozoic

Because it is so geologically active, the Earth progressively erases signs of asteroid and comet impacts by erosion, burial or, in the case of the oceanic record, even subduction. As a result, the number of known craters decreases with age. To judge the influence of violent extraterrestrial events in the past, geologists therefore rely on secondary outcomes of such collisions, such as the occasional presence in the sedimentary record of shocked quartz grains, glassy spherules and geochemical anomalies of rare elements. The Moon, on the other hand, is so geologically sluggish that its surface preserves most of the large-magnitude impacts in its history, except for those wiped out by later such events. For instance, a sizeable proportion of the lunar surface comprises its dark maria, flood basalts that filled gigantic impact basins excavated around 4 billion years ago. Older impacts can only be detected in its rugged, pale highland terrains, and they have been partially wiped out by later impact craters. The Moon's surface therefore preserves the most complete record of the flux and sizes of objects that have crossed the orbit it shares with the Earth.

The Earth presents a target about thirteen times bigger than the cross-sectional area of the Moon, so it must have received roughly 13 times more impacts over their joint history. Being about 81 times as massive as the Moon, the Earth's stronger gravitational pull will have attracted yet more, and all of them would have struck at higher speeds. The lunar samples returned by the Apollo missions have yielded varying ages for impact-glass spherules, so crater counts combined with evidence for relative ages have been calibrated to some extent to give an idea of the bombardment history of the Earth-Moon system. Until recently this was supposed to have tailed off exponentially since the Late Heavy Bombardment between 4.0 and 3.8 billion years ago. But the dating of the lunar record using radiometric ages of the small number of returned samples is inevitably extremely fuzzy. A team of planetary scientists from Canada, the US and Britain has developed a new approach to dating individual craters using image data from NASA's Lunar Reconnaissance Orbiter (LRO), launched in 2009 (Mazrouei, S. et al. 2019. Earth and Moon impact flux increased at the end of the Paleozoic. Science, v. 363, p. 253-257; DOI: 10.1126/science.aar4058).
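The factor of roughly 13 follows directly from the two bodies' radii; a quick back-of-envelope check (ignoring the extra gravitational focusing that, as noted above, would raise the Earth's share further):

```python
R_EARTH_KM = 6371.0   # mean radius of the Earth
R_MOON_KM = 1737.4    # mean radius of the Moon

# Ratio of cross-sectional (target) areas: pi*R^2 / pi*r^2 = (R/r)^2
area_ratio = (R_EARTH_KM / R_MOON_KM) ** 2
print(f"Earth/Moon target-area ratio: {area_ratio:.1f}")  # ~13.4
```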

The method that they devised is, curiously, based on thermal imagery from the LRO's Diviner instrument, which records the Moon's surface temperature. Comparison of day- and night-time temperatures produces a measure of surface materials' ability to retain heat, known as thermal inertia: a material with high thermal inertia stays warmer for longer at night. When a crater forms it partly fills with rock fragments excavated by the impact. When fresh, these deposits are full of large blocks of rock that were too massive to be blasted away. But the blocks are exposed to bombardment by lesser projectiles for the lifetime of the crater, which steadily reduces them to smaller fragments and eventually dust. Blocks of solid rock retain significantly more solar heat than do gravelly to dust-sized materials: the thermal inertia of a crater floor therefore decreases steadily with age.
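The dating principle can be sketched as follows. This is only an illustrative toy model, not the published calibration: it assumes crater-floor rock abundance decays with age as a power law whose constants would, in practice, be fitted to independently dated craters.

```python
# Toy sketch of rock-abundance dating: blockiness decays with crater age as a
# power law, so a calibrated curve can be inverted to give a model age.
# The constants a and b below are placeholders, not the study's fitted values.

def age_from_rockiness(rock_abundance: float, a: float = 0.3, b: float = 0.5) -> float:
    """Model age (Myr) from crater-floor rock abundance, assuming
    rock_abundance = a * age**(-b)."""
    return (rock_abundance / a) ** (-1.0 / b)

# A rockier (higher thermal inertia) floor implies a younger crater.
for ra in (0.10, 0.05, 0.02):
    print(f"rock abundance {ra:.2f} -> model age ~{age_from_rockiness(ra):.0f} Myr")
```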


Blocky surface of a relatively young lunar crater (Credit: NASA)

As well as the day- and night-time thermal data provided by the Diviner instrument, from which thermal inertia values are calculated, the LRO deploys two cameras that capture black-and-white images of the surface in the visible range, with a resolution of about a metre. They enable the blockiness of crater floors to be estimated. Sara Mazrouei and colleagues measured blockiness and thermal inertia of the floors of 111 craters more than 10 km across, nine of which had ages modelled independently using counts of the smaller craters that subsequently accumulated on their floors, shown by even finer-resolution images from the Japanese Kaguya orbiter. Their results are surprising. About 290 Ma ago the rate of large impacts on the Moon increased by a factor of 2.6. This might explain why the Neoproterozoic and Palaeozoic Eras are deficient in terrestrial craters. Another inference from the results is that the number of objects in Earth-crossing orbits suddenly increased at the end of the Carboniferous. Maybe that resulted from an episode of collisions and break-up of large bodies in the Asteroid Belt or, perhaps, some kind of gravitational perturbation by Jupiter. The age distribution of large craters on Earth is no help because of their ephemeral nature. Moreover, apart from Chicxulub, which is bang on the K-Pg boundary, there is little evidence of an increase in impact-driven mass extinctions in the Mesozoic and Cenozoic. Nor, for that matter, did igneous activity or sediment deposition undergo any sudden changes. There are sediments that seem to have formed as a result of tsunami devastation, but none greater in magnitude than could have been caused by major earthquakes. Or … maybe geologists should have another look at the stratigraphic record.

Early stone tools spread more widely

The rift systems of Ethiopia, Kenya and Tanzania, and the limestone caverns near Johannesburg, South Africa, have a long history of intensive archaeological study, rewarded by many finds of hominin skeletal remains and artifacts over the last century. Each region lays claim to be the birthplace of humans, that in South Africa being grandiloquently dubbed 'The Cradle of Humankind'. Of course, the realistic chances of making discoveries and careers draw scientists and funds back to these regions again and again: a kind of self-fulfilling prophecy fuelled by the old miners' adage, 'to find elephants you must go to elephant country'. The key site for the earliest stone tools was for a long time Tanzania's Olduvai Gorge, thanks to the Leakeys' finds of deliberately shaped choppers, hammer stones and sharp edges from about 2 Ma ago in close association with remains of Homo habilis. Signs of this industry, termed 'Oldowan', emerged from 2.6 Ma sediments in the Afar Depression of Ethiopia in 2010, but with no sign of who had made them. By 2015 the cachet of 'first tools' had moved to Lomekwi on the shore of Lake Turkana in Kenya, dated to 3.3 Ma but again with no evidence for a maker. In fact, the oldest evidence for the use of tools emerged with the controversial discovery in 2010 at Dikika in Afar of 3.4 Ma-old bones that carry cut marks, but with no sign of tools nor of whoever had used them. However, remains of Australopithecus afarensis occur only a few kilometres away.

Excavations outside the East African Rift System and South Africa are still few and far between, especially for times before 1 Ma. The High Plateaus of eastern Algeria include one ancient site, near Ain Hanech, which yielded 1.8 Ma Oldowan stone artifacts as long ago as 1992. A nearby site at Ain Boucherit takes the North African record back to 2.4 Ma, with both Oldowan tools and cut-marked bones of horse and antelope (Sahnouni, M. and 12 others 2018. 1.9-million- and 2.4-million-year-old artifacts and stone tool-cutmarked bones from Ain Boucherit, Algeria. Science, v. 362, p. 1297-1301; DOI: 10.1126/science.aau0008). Tool makers had clearly diffused across what is now the Sahara Desert by that time. Given the distance between the Lomekwi and Dikika sites in East Africa, that is hardly a surprise, provided climatic conditions were favourable. Michel Brunet's discovery of an australopithecine (Au. bahrelghazali) in 3.3 Ma-old sediments in central Chad demonstrates that early hominins were quite capable of spreading across the African continent. Yet to wean palaeoanthropologists and their sponsors from hitherto fruitful 'elephant' areas to a more 'blue skies' approach is likely to be difficult. There are plenty of sedimentary basins in Africa that preserve Miocene to Recent sediments and may yet turn up fossils and artifacts that take the science of human origins and peregrinations further, possibly in unexpected taxonomic directions.

Related article: Gibbons, A. 2018. Strongest evidence of early humans butchering animals discovered in North Africa. Science News online; DOI: 10.1126/science.aaw2245.

Pterosaurs had feathers and fur

Pterosaurs, which include the pterodactyls and pteranodons, were the first vertebrates to achieve proper, flapping flight. In the popular imagination they are regarded as 'flying dinosaurs', although the anatomy of the two groups is significantly different. The first pterosaurs appeared in the Upper Triassic, around 235 Ma ago, at roughly the same time as the earliest known dinosaurs. The anatomical differences make it difficult to decide on a common ancestry for the two. But detailed analysis of pterosaur anatomy suggests that they share enough features with dinosaurs, crocodiles and birds for all four groups to have descended from ancestral archosaurs that were living in the early Triassic and survived the mass extinction at the end of that period. Birds, on the other hand, first appear in the fossil record during the Upper Jurassic, some 70 Ma later than pterosaurs. They are now widely regarded as descendants of early theropod dinosaurs, which are commonly known to have had feathers.

Pterosaurs leapt into the public imagination in the final chapter of Sir Arthur Conan Doyle's The Lost World, with a clatter of 'dry, leathery wings', as Professor George Challenger's captive pterodactyl from northern Brazil's isolated Roraima tepui plateau made its successful bid for escape from a Zoological Institute meeting in the Queen's Hall. Yet, far from being leathery, pterosaurs turned out, in the late 1990s, to have carried filamentous pycnofibres akin to mammalian hair. Widespread reports in the world press during the week before Christmas 2018 hailed a further development that may have rescued pterosaurs from Conan Doyle's 1912 description of the beast before it sprang from its perch:

It was malicious, horrible, with two small red eyes as bright as points of burning coal. Its long, savage mouth, which it held half-open, was full of a double row of shark-like teeth. Its shoulders were humped, and round them was draped what appeared to be a faded grey shawl. It was the devil of our childhood in person.

Two specimens from the Middle to Upper Jurassic Yanliao lagerstätte in China show far more (Yang, Z. and 8 others 2018. Pterosaur integumentary structures with complex feather-like branching. Nature Ecology & Evolution, v. 3, p. 24-30; DOI: 10.1038/s41559-018-0728-7). Their pycnofibres show branching tufts, similar to those found in some theropod dinosaurs, including tyrannosaurs. They also resemble mammalian underfur fibres, whose air-trapping properties provide efficient thermal insulation. Both the body and wings of these pterosaurs are furry, which the authors suggest may also have helped reduce drag during flight, while the filaments around the mouth may have had a sensory function similar to those carried by some living birds. Moreover, some of the filaments contain black and red pigments.


Artist's impression of a Jurassic anurognathid pterosaur from China (Credit: Yang et al. 2018; Fig. 4)

Pterosaurs may have developed fur and feathers independently, a case of parallel evolution in response to similar evolutionary pressures facing dinosaurs, birds and mammals. Alternatively, these structures may have had a deep evolutionary origin in the common ancestors of all these animal groups, as far back as the Upper Carboniferous and Lower Permian.

Related articles: Nature Editorial 2018. Fur and fossils. Nature, v. 564, p. 301-302; DOI: 10.1038/d41586-018-07800-4; King, A. 2018. Pterosaurs sported feathers, claim scientists (The Scientist); Conniff, R. 2018. Pterosaurs just keep getting weirder (Scientific American); New discovery pushes origin of feathers back by 70 million years (Science Daily)

Calibrating 14C dating

Radiocarbon dating is the most popular tool for assessing the ages of archaeological remains and for producing climatic time series, as in lake- and sea-floor cores, provided that organic material can be recovered. Its precision has steadily improved, especially with the development of accelerator mass spectrometry, although it is still limited to the last 50 thousand years or so because of the short half-life of 14C (about 5,730 years). The problem with dating based on radioactive 14C is its accuracy; i.e. does it always give a true date? This stems from the way in which 14C is produced – by cosmic rays interacting with nitrogen in the atmosphere. Cosmic irradiation varies with time and, consequently, so does the proportion of 14C in the atmosphere. It is the isotope's proportion in atmospheric CO2 at any one time in the past, converted by photosynthesis into dateable organic materials, that determines the proportion remaining in a sample after decay through the time since the organism died and became fossilised. Various approaches have been used to allow for variations in 14C production, such as calibration against tree rings in ancient timber, which provide a calendar age for wood that can be independently radiocarbon dated. But that depends on timber from many different species of tree from different climatic zones, and it is affected by fractionation between the various isotopes of carbon in CO2, which varies between plant species. There is, however, a better means of calibration.
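Leaving the calibration problem aside, the underlying age equation is simple exponential decay; a minimal sketch:

```python
import math

HALF_LIFE_14C_YR = 5730.0  # conventional 14C half-life, in years

def radiocarbon_age(fraction_remaining: float) -> float:
    """Age in years from the fraction of the original 14C still present,
    assuming a constant atmospheric 14C level - the very assumption that
    calibration curves exist to correct."""
    return -HALF_LIFE_14C_YR / math.log(2) * math.log(fraction_remaining)

print(round(radiocarbon_age(0.5)))   # one half-life: ~5730 years
print(round(radiocarbon_age(0.01)))  # ~1% left: ~38000 years, nearing the method's limit
```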

The carbonate speleothem that forms stalactites and stalagmites by steady precipitation from rainwater, sometimes producing visible layering, locks in not only 14C dissolved from the atmosphere by rainwater but also environmental radioactive isotopes of uranium and thorium. So layers in speleothem may be dated by both methods over the period during which a stalagmite, for instance, has grown. This seems an ideal means of calibration, although there are snags; one is that part of the carbon in the carbonate comes from ancient limestone dissolved by slightly acid rainwater, which dilutes the 14C in samples with so-called 'dead carbon'. Stalagmites in the Hulu Cave near Nanjing in China have particularly low dead-carbon fractions and have been used for the best calibrations so far, going back to the current limit for radiocarbon dating of 54 ka (Cheng, H. and 14 others 2018. Atmospheric 14C/12C changes during the last glacial period from Hulu Cave. Science, v. 362, p. 1293-1297; DOI: 10.1126/science.aau0747). Precision steadily falls off with age because of the progressive reduction of 14C in the samples to very low amounts. Nevertheless, this study resolves fine detail not only of cosmic-ray variation but also of pulses of carbon dioxide release from the oceans, which would also affect the availability of 14C for incorporation in organic materials because deep ocean water contains 'old' CO2.
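A minimal sketch of the dead-carbon problem, using invented numbers purely for illustration: because part of the carbonate carbon is 14C-free limestone carbon, the measured 14C fraction understates the atmospheric value, and the raw age comes out too old unless corrected.

```python
import math

HALF_LIFE_14C_YR = 5730.0

def atmospheric_fraction(measured_f14c: float, dead_carbon_fraction: float) -> float:
    """Correct a speleothem 14C measurement for dilution by 'dead' limestone
    carbon: only (1 - dcf) of the carbonate carbon carried atmospheric 14C."""
    return measured_f14c / (1.0 - dead_carbon_fraction)

# Illustrative numbers only: a 5% dead-carbon fraction shifts the apparent
# radiocarbon age by roughly 400 years.
measured, dcf = 0.50, 0.05
corrected = atmospheric_fraction(measured, dcf)
shift = -HALF_LIFE_14C_YR / math.log(2) * math.log(measured / corrected)
print(f"corrected fraction {corrected:.3f}, apparent-age shift ~{shift:.0f} yr")
```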

The earliest humans in Tibet

Modern Tibetans thrive in the rarefied air at altitudes above 4 km partly because they benefit from a mutation of the gene EPAS1, which regulates haemoglobin production. Surprisingly, the segment of Tibetans' DNA that contains the mutation matches one present in the genome of an undated Denisovan girl's finger bone found in the eponymous Siberian cave. The geneticists who made this discovery were able to estimate that Tibetans inherited the entire segment sometime in the last 40 thousand years through interbreeding with Denisovans, who probably were able to live at high altitude too. Wherever and whenever this took place, the inheritance was retained because it clearly helped those who carried it to thrive in Tibet. The same segment is present in a few percent of living Han Chinese people, which suggests their ancestors and those of the Tibetans were members of the same group some 40 ka ago, most of the Han having lost the mutation subsequently.

That inheritance remained somewhat mysterious while the existing evidence suggested that the Tibetan Plateau was first colonised sometime in the Holocene, possibly by migrating early farmers. A single archaeological site at 4600 m on the Plateau has changed all that (Zhang, X.L. and 15 others 2018. The earliest human occupation of the high-altitude Tibetan Plateau 40 thousand to 30 thousand years ago. Science, v. 362, p. 1049-1051; DOI: 10.1126/science.aat8824). The dig at Nwya Devu, which lies 250 km NW of Lhasa, has yielded a sequence of sediments, dated by optically stimulated luminescence to between 45 and 18 thousand years, that contains abundant stone tools made from locally occurring slate. The oldest coincides roughly with the age of the earliest anatomically modern human migrants into northern China, so the earliest Tibetans may well have been a branch of that same group of people, as suggested by the DNA of modern Tibetan and Han people. However, skeletal remains of both humans and their prey animals are yet to emerge from Nwya Devu, which leaves open the question of who they were: anatomically modern humans or archaic humans, such as Denisovans?

The tools do not help to identify their likely makers. Slate is easy to work and typically yields flat blades with sharp, albeit not especially durable, edges; they are disposable, which perhaps explains why so many were found at Nwya Devu. None show signs of the pressure flaking that typifies tools made from harder, more isotropic rock, such as flint. Yet they include a variety of use-types: scrapers, awls, burins and choppers, as well as blades. The lack of associated remains of prey or of hearths is suggested by the authors to signify that the site was a workshop; perhaps that will change with further excavation in the area. The age range suggests regular, if not permanent, occupancy for more than 20 ka.

Related articles: Gibbons, A. 2014. Tibetans inherited high-altitude gene from ancient human. Science News, 2 July 2014; Zhang, J-F. & Dennell, R. 2018. The last of Asia conquered by Homo sapiens. Science, v. 362, p. 992-993; DOI: 10.1126/science.aav6863.

Volcanism and the Justinian Plague

Between 541 and 543 CE, during the reign of the Roman Emperor Justinian, bubonic plague spread through countries bordering the Mediterranean Sea. This was a decade after Justinian's forces had begun to restore the Roman Empire's lost territory in North Africa, Spain, Italy and the present-day Balkans by expeditions out of Byzantium (the Eastern Empire). At its height the Plague of Justinian was killing 5000 people each day in Constantinople, eventually consuming 20 to 40% of the city's population and between 25 and 50 million people across the empire. Like the European Black Death of the mid-14th century, it was caused by the bacterium Yersinia pestis, which originated in Central Asia and is carried in the gut of fleas that live on rats. The 'traditional' explanation of both plagues was that the disease spread westwards along the Silk Road and then with the black rats that infested ship-borne grain cargoes. Plausible as that might seem, Yersinia pestis, fleas and rats have always existed and remain present to this day, and trade along the same routes continued unbroken for more than two millennia. Although plagues with the same agents recurred regularly, only the Plague of Justinian and the Black Death resulted in tens of millions of deaths over short periods. Some other factor seems likely to have boosted fatalities to such levels.


Monk administering the last rites to victims of the Plague of Justinian

Five years before plague struck, the Byzantine historian Procopius recorded a long period of fog and haze that continually reduced sunlight; typical features of volcanic aerosol veils. There followed the coldest decade in the past 2300 years, as recorded by tree-ring studies, coinciding with documentary evidence of famine in China, Ireland, the Middle East and Scandinavia. A 72 m long ice core extracted from the Colle Gnifetti glacier in the Swiss Alps in 2013 records the last two millennia of local climatic change and global atmospheric dust levels. Sampled by laser slicing, the core has yielded a time series of data at a resolution of months or better. In 536 an Icelandic volcano emitted ash and probably sulfur dioxide over 18 months, during which summer temperatures fell by about 2°C. A second eruption followed in 540 to 541. 'Volcanic winter' conditions lasted from 536 to 545, amplifying the evidence from tree-ring data obtained in the 1990s.

The Plague of Justinian coincided with the second 'volcanic winter', after several years of regional famine. This scenario is paralleled by the better-documented Great Famine of 1315-17, which ended the economic prosperity of the 11th to 13th centuries. The famine years were marked by extreme levels of crime, disease, mass death, and even cannibalism and infanticide. In a population weakened through malnutrition to an extent that we can barely imagine in modern Europe, any pandemic disease would have left the most affected dying in their millions. Another parallel with the Plague of Justinian is that it followed the end of the four-century Medieval Warm Period, during which vast areas of land had been successfully brought under the plough and the European population had tripled. That warmth ended with a succession of major, sulfur-rich volcanic eruptions in Indonesia at the end of the 13th century that heralded the Little Ice Age. Although geologists generally concern themselves with the social and economic consequences of a volcano's lava and ash in its immediate vicinity – the 'Pompeii view' – its potential for global catastrophe is far greater in the case of really large (and often remote) events.

Chemical data from the same ice core reveal the broad economic consequences of the mid-sixth-century plague. Lead concentrations in the ice, deposited as airborne pollution from the smelting of lead sulfide ore to obtain silver bullion, fell and remained at low levels for a century. The recovery of silver production for coinage is marked by a spike in glacial lead concentrations in 640; another parallel with the Black Death, which was followed by a collapse in silver production, albeit only for 4 to 5 years.

Related article: Gibbons, A. 2018. Why 536 was 'the worst year to be alive'. Science, v. 362, p. 733-734; DOI: 10.1126/science.aaw0632.

Subglacial impact structure: trigger for Younger Dryas?

Radar microwaves are able to penetrate easily through several kilometres of ice. Using the arrival times of radar pulses reflected by the bedrock at the glacier floor allows ice depth to be computed. When deployed along a network of flight lines during aerial surveys, the radar returns over large areas can be converted to a grid of cells, thereby producing an image of depth: the inverse of a digital elevation model. This is the only means of precisely mapping the thickness variations of an icecap, such as those that blanket Antarctica and Greenland. The topography of the subglacial surface gives an idea of how ice moves, the paths taken by liquid water at its base, and whether or not global warming may result in ice surges in parts of the icecap. The data can also reveal topographic and geological features hidden by the ice (see The Grand Greenland Canyon, September 2013).
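The depth calculation itself is straightforward; a minimal sketch, assuming a vertical ray path, a uniform radar velocity in ice and ignoring the lower-density firn layer near the surface:

```python
SPEED_OF_LIGHT_M_PER_US = 299.8    # metres per microsecond in vacuum
RELATIVE_PERMITTIVITY_ICE = 3.17   # a commonly assumed value for cold glacier ice

def ice_thickness_m(two_way_travel_time_us: float) -> float:
    """Ice thickness from the two-way travel time of a bed echo, using a
    radar velocity in ice of roughly 168 m per microsecond."""
    v_ice = SPEED_OF_LIGHT_M_PER_US / RELATIVE_PERMITTIVITY_ICE ** 0.5
    return v_ice * two_way_travel_time_us / 2.0

print(f"{ice_thickness_m(12.0):.0f} m")  # a 12 microsecond echo delay -> roughly 1 km of ice
```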


Colour-coded subglacial topography from radar sounding over the Hiawatha Glacier of NW Greenland (Credit: Kjaer et al. 2018; Fig. 1D)

Such a survey over the Hiawatha Glacier of NW Greenland has shown up something most peculiar (Kjaer, K.H. and 21 others 2018. A large impact crater beneath Hiawatha Glacier in northwest Greenland. Science Advances, v. 4, eaar8173; DOI: 10.1126/sciadv.aar8173). Part of the ice margin is an arc, which suggests the local bed topography takes the form of a 31 km-wide, circular depression. The exposed geology, complex metamorphic basement of Palaeoproterozoic age, shows no sign of a structural control for such a basin. Measurements of ice-flow speeds are also anomalous, with an array of higher speeds suggesting accelerated flow across the depression. The radar image data confirm the presence of a subglacial basin, but one with an elevated rim and a central series of small peaks. These are characteristic of an impact structure that has been eroded only slightly; i.e. a fairly recent one, and one of the twenty-five largest impact craters on Earth. Detailed analysis of raw radar data in the form of profiles through the ice reveals that the upper part is finely layered and undisturbed. The layering continues into the ice surrounding the basin and is probably of Holocene age (<11.7 ka), based on dating of ice in cores through the surrounding icecap. The lower third is structurally complex and shows evidence of rocky debris. Sediments deposited by subglacial streams where they emerge along the arcuate rim contain grains of shocked quartz and glass, as well as the minerals expected from the crystalline basement rocks. Some of the shocked material contains unusually high concentrations of transition-group metals, platinum-group elements and gold; further evidence for the impact of extraterrestrial material – probably an iron asteroid that was originally more than 1 km in diameter. The famous Cape York iron meteorite, which weighs 31 t – worked by local Inuit to forge harpoon blades – fell in NW Greenland about 200 km away.

The central issue is not that Hiawatha Glacier conceals a large impact crater, but its age. It certainly predates the start of the Holocene and is no older than the start of Greenland glaciation about 2.6 Ma ago. That only Holocene ice layers are preserved above the disrupted ice that rests immediately on top of the crater raises once again the much-disputed possibility of an asteroid impact having triggered the Younger Dryas cooling event and associated extinctions of large mammals in North America at about 12.9 ka (see Impact cause for Younger Dryas draws flak May 2008). Only radiometric dating of the glassy material found in the glaciofluvial sediments will be able to resolve that particular controversy.

Oceanic hydrothermal vents and the origin of life

A range of indirect evidence has been used to suggest that life originated deep in the oceans around hydrothermal vents, such as signs of early organic matter in association with Archaean pillow lavas. One particularly persuasive observation is that a number of proteins and other cell chemicals are constructed around metal sulfide groups. Such sulfides are common around hydrothermal 'smokers' associated with oceanic rift systems. Moreover, Fischer-Tropsch reactions between carbon monoxide and hydrogen produce quite complex hydrocarbon molecules under laboratory conditions. Such hydrogenation of a carbon-bearing gas requires a catalyst, a commonly used one being chromium oxide (see Abiotic formation of hydrocarbons by oceanic hydrothermal circulation, May 2004). It also turns out that fluids emitted by sea-floor hydrothermal systems are sometimes rich in free hydrogen, formed by the breakdown of olivine in ultramafic rocks to form hydroxylated minerals such as serpentine and talc. The fact that chromium is abundant in ultramafic rocks, in the form of the oxide mineral chromite, elevates the possibility that Fischer-Tropsch reactions may have been a crucial part of the life-forming process on the early Earth. What is needed is evidence that such reactions do occur in natural settings.
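Written schematically, the two reactions at the heart of this argument are hydrogen generation by serpentinisation and catalysed Fischer-Tropsch synthesis. The forms below are simplified, generic versions rather than the specific reactions of any one study:

```latex
% Hydrogen release during serpentinisation of olivine's iron-rich (fayalite)
% component, and a generic Fischer-Tropsch synthesis of an alkane from CO and H2.
\begin{align}
3\,\mathrm{Fe_2SiO_4} + 2\,\mathrm{H_2O} &\rightarrow 2\,\mathrm{Fe_3O_4} + 3\,\mathrm{SiO_2} + 2\,\mathrm{H_2}\\
n\,\mathrm{CO} + (2n+1)\,\mathrm{H_2} &\rightarrow \mathrm{C}_n\mathrm{H}_{2n+2} + n\,\mathrm{H_2O}
\end{align}
```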


A white carbonate mound forming at the Lost City hydrothermal vent field on the Mid-Atlantic Ridge (Credit: Baross 2018)

One site on the Mid-Atlantic Ridge spreading centre, the Lost City vent field, operates because of serpentinisation of peridotites exposed on the ocean floor, forming carbonate-rich plumes and rocky towers; 'white smokers'. So that is an obvious place to test the abiotic theory for the origin of life. Past analyses of the vents have yielded a whole range of organic molecules, including alkanes, formates, acetates and pyruvates, that are possible precursors for such a natural process. Revisiting Lost City with advanced analytical techniques has taken the quest a major step forward (Ménez, B. et al. 2018. Abiotic synthesis of amino acids in the recesses of the oceanic lithosphere. Nature, advance online publication; DOI: 10.1038/s41586-018-0684-z). The researchers from France and Kazakhstan focused on rock drilled from 170 m below the vent system, probably beyond the influence of surface contamination from living organisms. Using several methods they detected the nitrogen-containing amino acid tryptophan, and that alone. Had they detected other amino acids, their exciting result would have been severely tempered by the possibility of surface organic contamination. The presence of tryptophan implies that its abiotic formation had to involve the reduction of elemental nitrogen (N2) to ammonia (NH3). Bénédicte Ménez and colleagues suggest that the iron-rich clay saponite, a common product of serpentine alteration at low temperatures, may have catalysed both that reduction and amino-acid synthesis through Friedel-Crafts reactions. Fascinating as this discovery may be, it is just a step towards confirming life's abiogenesis. It also permits speculation that similar evidence may be found elsewhere in the Solar System on rocky bodies, such as the moons Enceladus and Europa that orbit Saturn and Jupiter respectively; that is, if the rock base of the hydrothermal systems thought to occur there can be reached.
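In its simplest overall form, the nitrogen reduction that the authors infer (here stripped of the mineral catalysis they propose) is:

```latex
% Overall nitrogen-reduction step implied by abiotic tryptophan synthesis.
\mathrm{N_2} + 3\,\mathrm{H_2} \rightarrow 2\,\mathrm{NH_3}
```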

Related article: Baross, J.A. 2018. The rocky road to biomolecules. Nature, v. 564, p. 42-43; DOI: 10.1038/d41586-018-07262-8.