The evolution of the word 'tea' (or 'cha')

If you look around the world, you might notice that there are two ways to designate 'tea'. One consists of variations of the form tea, such as thee in Dutch. The other is some variation of 'cha', like chay in Hindi.

Both versions come from China. The words that sound like 'cha' spread across land, along the ancient Silk Road. The 'tea'-like forms spread over water, carried by Dutch traders of the VOC, the very first to bring tea to Europe.
The term cha (茶) is 'Sinitic', meaning it is common to many varieties of Chinese. It began in China and made its way through central Asia, eventually becoming chay (چای) in Persian. That is no doubt due to the trade routes of the Silk Road, along which, according to a recent discovery, tea was already being traded over 2,000 years ago. The form spread beyond Persia, becoming chay in Urdu, shay in Arabic and chay in Russian. It even made its way to sub-Saharan Africa, where it became chai in Swahili. The Japanese and Korean terms for tea are also based on the Chinese cha, though those languages likely adopted the word even before its westward spread into Persian.

The Chinese character for tea, 茶, is pronounced differently by different varieties of Chinese, though it is written the same in them all. In today’s Mandarin, it is chá. But in the Min Nan variety of Chinese, spoken in the coastal province of Fujian, the character is pronounced te.

The te form, used in the coastal varieties of Chinese, spread to Europe via the Dutch, who became the primary traders of tea between Europe and Asia in the 17th century. The main Dutch ports in east Asia were in Fujian and Taiwan, both places where people used the te pronunciation. The Dutch East India Company's expansive tea importation into Europe gave us the Dutch thee, the French thé, the German Tee and the English tea.
Yet the Dutch were not the first Europeans in Asia. That honour belongs to the Portuguese. The Portuguese, however, traded not through Fujian but through Macao, where chá is used. That's why, on the map above, Portugal is a pink anomaly in a sea of blue.

A few languages have their own way of talking about tea. These languages are generally in places where tea grows naturally, which led locals to develop their own way to refer to it. In Burmese, for example, tea leaves are lakphak.

The map demonstrates two different eras of globalization in action: the millennia-old overland spread of goods and ideas westward from ancient China, and the 400-year-old influence of Asian culture on the seafaring Europeans of the age of exploration.

Whisky kills bacteria in ice

Italian researchers studied 60 samples of ice from domestic, restaurant and industrial producers. Across those 60 samples they found 52 different strains of bacteria, including Pseudomonas, Staphylococcus, Bacillus and Acinetobacter, some of which were 'agents of human infection', indicating environmental contamination[1].
The researchers then took samples of contaminated ice and, to simulate a bar environment, used this ice to serve a range of drinks, including vodka, whisky, peach tea, tonic water and cola.

In the case of each drink, they found that the bacterial population in the sample was reduced, and they cited the alcohol level, the drink's pH and the amount of carbon dioxide in each serving as reasons for the reduction.

However, their results also showed that the ice served with whisky saw the greatest reduction in bacteria: none of the bacterial strains on the ice cubes survived after they were added to the whisky. The researchers noted that this was likely because whisky is somewhat more acidic than vodka, and they speculated that the more acidic a drink is, the less likely bacteria are to survive in it.

The question remains, however, why in the world you would add ice to your whisky (or any other alcoholic drink) in the first place.

[1] Settanni et al: Presence of pathogenic bacteria in ice cubes and evaluation of their survival in different systems in Annals of Microbiology – 2017

Why Painkillers are Killing America

Here, we wrote about the epidemic of opioid-related deaths in the USA. But just stating the problem does not explain it. So why do Americans flock to painkillers in the first place?

For several decades now, the American Midwest has suffered unprecedented economic decay, courtesy of the persistent outsourcing of manufacturing jobs in the automotive and steel industries, among others.
Yes, the stock markets are reaching new highs every day, while industrial production lags. Normally one would expect both numbers to 'travel' in the same direction. Not now.

This means that the American working middle class is experiencing the slow destruction of a way of life. Wages have stagnated for the last 50 years, and for those without a university degree, median wages have actually been falling. So the model whereby American capitalism delivered even for people who were not particularly well educated seems to be broken.
The decline in mortality rates that had gone on since the beginning of the 20th century has stopped, and rates are starting to rise. For mortality rates to rise instead of fall is extremely rare; it typically takes a war or an epidemic to make death rates jump.
It is not far-fetched to link these deaths to 'deaths of despair' from alcohol, suicide and opioids. Because opioids are often taken together with other pain medication, one can expect accidental overdoses to be rife.

The domestication of chickens

Chickens and humans have conquered the world together. Where humans went, chickens went too. The chicken (Gallus gallus domesticus) was first domesticated some 8,000 years ago from a hybrid of the wild red junglefowl (Gallus gallus) and the grey junglefowl (Gallus sonneratii).
[Red Junglefowl]
Domesticated chickens are less active, have fewer social interactions, are less aggressive towards would-be predators and are less likely to seek out unfamiliar food sources than their wild ancestors. Chickens now have a higher adult body weight and simplified plumage, while their egg production starts earlier, is more frequent and yields larger eggs.
[Grey junglefowl]
Research suggests there may have been multiple origins of domestication in distinct areas of South and Southeast Asia. The earliest archaeological evidence to date comes from China, at about 5400 BC, though some studies support even earlier domestication of chickens in northern and central China[1]. Researchers think chickens were a rare occurrence in northern and central China, and thus probably an import from southern China or Southeast Asia, where the evidence for domestication is stronger.

The red junglefowl and the grey junglefowl also live in India. Domesticated chickens appear in the Indus Valley around 2000 BC[2]. From there the chicken spread into Europe and Africa. Chickens arrived in the Middle East starting with Iran around 3900 BC, followed by Turkey and Syria (2400-2000 BC) and Jordan by 1200 BC.

The earliest firm evidence for chickens in east Africa consists of illustrations from several sites in Egypt's New Kingdom. Chickens were introduced into western Africa multiple times, arriving at Iron Age sites in Mali, Burkina Faso and Ghana by the mid-first millennium AD. Chickens arrived in the southern Levant about 2500 BC and reached Iberia circa 2000 BC.

Chickens were brought to the Polynesian islands from Southeast Asia by Pacific Ocean sailors about 3,300 years ago. While it was previously assumed that chickens were brought to the Americas by the Spanish conquistadors, pre-Columbian chickens have been identified at several sites throughout the Americas, most notably in Chile, dated to about 1350 AD.

But there's a problem: some archaeologists argue that the presence of haplogroup E in chickens from Rapa Nui (Easter Island) and coastal Chile means these chickens must have crossed the Pacific with the Polynesians[3]. Others claim that the haplogroup E found in chickens from Rapa Nui is the result of contamination[4]. If the latter is true, then chickens must have crossed the Atlantic with Columbus.

[1] Xiang et al: Early Holocene chicken domestication in northern China in PNAS – 2014
[2] Kanginakudru et al: Genetic evidence from Indian red jungle fowl corroborates multiple domestication of modern day chicken in BMC Evolutionary Biology – 2008
[3] Storey et al: Polynesian chickens in the New World: a detailed application of a commensal approach in Archaeology in Oceania – 2013
[4] Thomson et al: Using ancient DNA to study the origins and dispersal of ancestral Polynesian chickens across the Pacific in PNAS – 2014

Ancient solar eclipse dates reign of pharaoh

The Bible speaks of a peculiar natural event. In the book of Joshua (10:13) it says: 'And the sun stood still, and the moon stayed, until the people had avenged themselves upon their enemies. ... So the sun stood still in the midst of heaven, and hasted not to go down about a whole day'.

The visionary Immanuel Velikovsky (1895-1979) thought this was proof that the proto-planet Mars came very close to the earth[1]. Now scientists have found another explanation for that mythical story[2].

Going back to the original Hebrew text, they determined that an alternative meaning could be that the sun and moon simply stopped doing what they normally do: they stopped shining. In this context, the Hebrew words could be referring to a solar eclipse, when the moon passes between the earth and the sun and the sun appears to stop shining. This interpretation is supported by the fact that the Hebrew word translated as 'stand still' has the same root as a Babylonian word used in ancient astronomical texts to describe eclipses.
An ancient Egyptian text, dated to 1205 BC and chiselled on the Merneptah Stele, recounts the military conquests of the pharaoh Merneptah, son of the fabled Ramesses the Great. The inscription mentions a people called 'Israel' that is said to have been wiped out by the conquering pharaoh. The stele mentions the same event as the text in the Bible.

Earlier historians used these two texts to try to date the possible eclipse, but were unsuccessful because they were only looking at total eclipses. What they failed to realise is that in the ancient world the same word was used for both total and annular eclipses.
[Path of Solar Eclipse in 1207 BC]
The researchers developed a computer code that takes into account variations in the Earth's rotation over time. From their calculations, they determined that the only annular eclipse visible from Canaan between 1500 and 1050 BC occurred on 30 October 1207 BC, in the afternoon. This enabled them to date the reigns of Ramesses the Great and his son Merneptah to within a year: Merneptah's reign began in 1210 or 1209 BC. Since it is known from Egyptian texts how long he and his father reigned, this would mean that Ramesses the Great reigned from 1276 to 1210 BC, with a precision of plus or minus one year, the most accurate dates available. The precise dates of the pharaohs have long been subject to some uncertainty among Egyptologists, but this new calculation, if accepted, could lead to an adjustment of the dates of several reigns and enable us to date them precisely.

I suppose you're now wondering why the column mentions 1207 BC while the image says -1206. The image was generated using astronomy software, and the convention in astronomy is that there is a year zero between 1 BC and AD 1, so 1207 BC corresponds to the astronomical year -1206. In the calendar used by historians, however, there is no year zero, hence in that calendar the date is 30 October 1207 BC. Both dates are correct[3].
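The conversion between the historians' count (which has no year zero) and astronomical year numbering (which inserts a year 0 for 1 BC) is simple arithmetic. A minimal sketch (the function names are my own):

```python
def bc_to_astronomical(year_bc):
    """Convert a BC year (historians' count, no year zero) to
    astronomical year numbering, in which 1 BC is year 0."""
    return 1 - year_bc

def astronomical_to_bc(year_astro):
    """Inverse conversion: an astronomical year <= 0 back to a BC year."""
    return 1 - year_astro

print(bc_to_astronomical(1207))   # -1206, matching the astronomy software
print(astronomical_to_bc(-1206))  # 1207, the historians' date
```

The off-by-one matters: naively negating the BC year would give -1207 and put the eclipse in the wrong year.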

[1] Velikovsky: Worlds in Collision – 1950 
[2] Humphreys, Waddington: Solar eclipse of 1207 BC helps to date pharaohs in Astronomy & Geophysics – 2017. See here.
[3] Personal communication with Colin Humphreys (9 November 2017)

Smoking and Stunting

Stunting, or being too short for one’s age, is defined as a height that is more than two standard deviations below the World Health Organization (WHO) Child Growth Standards median. Factors that contribute to stunted growth and development include – but are not limited to – poor maternal health and nutrition, inadequate infant and young child feeding practices, and infection. Stunting should be made a development indicator.
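The WHO definition above boils down to a z-score test on height-for-age. A minimal sketch, using illustrative reference values (the median and standard deviation below are made up for the example, not official WHO figures, which vary by age and sex):

```python
def height_for_age_zscore(height_cm, median_cm, sd_cm):
    """Height-for-age z-score relative to the growth-standard
    median and standard deviation for the child's age and sex."""
    return (height_cm - median_cm) / sd_cm

def is_stunted(height_cm, median_cm, sd_cm):
    """WHO definition: stunted if height is more than two standard
    deviations below the growth-standard median (z < -2)."""
    return height_for_age_zscore(height_cm, median_cm, sd_cm) < -2

# Hypothetical child: 100 cm tall, against an assumed reference
# median of 110 cm and SD of 4.5 cm -> z ~ -2.22, i.e. stunted.
print(is_stunted(100, 110, 4.5))  # True
```

In practice the reference values are looked up in the WHO Child Growth Standards tables for the child's exact age in months and sex.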

Here we explained that stunting can also be the result of exploitation, and here we found that voluntary restriction of food intake, as in anorexia, might also result in stunted growth.
So, are there any other, less obvious factors that can result in stunted growth? There is one.

If you start smoking at a very young age, as happens so often in developing countries, you might experience stunting. In other words, you might not reach your maximum height. Stunted growth is, of course, only an issue for those still growing.
A scientific study of 451 boys and 478 girls showed that a boy who smokes ten or more cigarettes a day from age 12 to 17 will end up about 2.5 centimeters shorter than a boy who does not smoke at all[1].

Strangely, in girls cigarette use was not associated with any reduction in height or weight; it appears to decrease height and body mass index only in boys. Young girls might be less likely to take up smoking if they understood that cigarette use may not be associated with reduced weight in adolescent females.

Part 1 'Stunting: Malnutrition or Exploitation?' can be read here.
Part 2 'Stunting and Anorexia' can be read here.

[1] O'Loughlin et al: Does cigarette use influence adiposity or height in adolescence? in Annals of Epidemiology – 2008

The Evolution of Melons

Cucurbitaceae are a plant family consisting of about 965 species. Well-known genera are Cucurbita (squash, pumpkin, zucchini, some gourds), Lagenaria (calabash), Citrullus (watermelon), Cucumis (cucumber, various melons) and Luffa (luffa). This great diversity of related species wouldn't have been possible if it weren't for an ancient event in plant evolution.
About 90 to 102 million years ago, the genome of a single melon-like fruit copied itself. Over time, this one ancestor became a whole family of plants with different colors, shapes, sizes, defenses and flavors, such as pumpkins, squash, watermelons and cucumbers, according to a recently published paper[1].

The researchers compared the genomes and evolutionary trees of a number of plants, including cucumbers, melons and gourds. Millions of years of environmental change allowed the fruits to lose genes over time and tailor their own genetic codes to become what we know them as today.

After each major divergent event, genes were deleted, chromosomes were rearranged and new genetic patterns were created. Knowing more about which genes survived to do different things in each plant means scientists can now get closer to creating even more variations of these fruits.

[1] Wang et al: An overlooked paleo-tetraploidization in Cucurbitaceae in Molecular Biology and Evolution - 2017

[Review] 'Chaos' by Patricia Cornwell

I've read a few bad books in my life, some even pretty bad, but 'Chaos' by Patricia Cornwell must certainly rank as the worst thriller I've read in a decade. Yes, Patricia Cornwell can write words that constitute a sentence, and she produces many sentences. Far too many, in fact, and I wonder how she bribed her editor, because 100 pages of drivel could easily have been cut from 'Chaos'.

Patricia Cornwell seems to enjoy the wealth she has accumulated, but she does so as a nouveau riche, someone who has recently become rich and needs to show the world just how knowledgeable she is about expensive food, wines and cars. The book drags on and on about it (Kay Scarpetta wonders whether husband Bryce will arrive in his Porsche Cayenne Turbo S or his Audi RS 7).

I wondered if Patricia Cornwell simply started writing this thriller without any clue of a plot and then, halfway in, ran into difficulties. I will not refrain from warning the reader about *SPOILERS* and just mention that she uses a drone to kill people: a drone with electric wires that whizz down to electrocute the victim. Then, perhaps realising that the electric current could not possibly be powerful enough to kill the intended victim (Ohm's law), she 'invents' the idea that panguite, a rare mineral found only in minute traces in meteorites, can supply that power. She even claims the mineral involves nanotechnology. It doesn't: the amounts of panguite in some meteorites are so small that they are measured in nanometers (nm), which means you would need tons of meteorites to get even a little panguite. Sloppy writing at its best, sloppy research at its worst.

What we have, then, is a drone targeting people who apparently never notice the sound of a strange apparatus above their heads, nor the wires coming down. You would think the intended victims would take evasive action, but no. So we have victims seemingly electrocuted by lightning without any thunder.

Like I said: 'Chaos' is easily one of the worst books I have ever read. Do not – I repeat, do NOT – buy this book. To be honest, it's the first time I have ever reviewed a book with such a sad result.

Earth's second sun

Earth already has a second moon, but a second sun is impossible. Right? Not quite.

In the constellation of Orion, Betelgeuse forms the warrior's left-hand shoulder as we see it (note the sword dangling from his belt). It is a red giant, a semi-regular variable star in the latter stages of its life, whose apparent magnitude varies between 0.0 and 1.3. Which is a lot.
As Betelgeuse uses up the last of its fuel, it will become increasingly unstable and will eventually collapse under its own gravity. Betelgeuse will then become a supernova. Supernovae can outshine the whole galaxy they live in. They have a 'rising time' of about a week, during which the star increases in brightness; the star stays at peak brightness for several days and then slowly declines into obscurity over a couple of weeks. At its point of maximum brightness it could rival the full moon (magnitude -11). And because Betelgeuse is a star, it would effectively be a second sun. Our second sun.
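For a sense of scale: the magnitude system is logarithmic, with a difference of five magnitudes corresponding to exactly a factor of 100 in brightness. A small sketch (the -11 supernova peak comes from the text; the current magnitude of roughly 0.5 for Betelgeuse is my assumption, the midpoint of its 0.0-1.3 range):

```python
def brightness_ratio(delta_m):
    """Brightness (flux) ratio for a magnitude difference:
    five magnitudes = exactly a factor of 100, so 100**(dm/5)."""
    return 100 ** (delta_m / 5)

# Betelgeuse's ordinary variation between magnitude 0.0 and 1.3:
print(round(brightness_ratio(1.3), 2))  # 3.31 - a factor of ~3.3

# Supernova peak (-11) versus an assumed current magnitude of +0.5,
# a magnitude difference of 11.5:
print(round(brightness_ratio(11.5)))    # 39811 - tens of thousands of times brighter
```

So the star's everyday variation already triples its brightness, which is why it is visible to the naked eye; the supernova would be another four orders of magnitude on top of that.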

Will we ever live to see such a spectacle in the heavens? Scientists estimate that Betelgeuse could collapse and explode anywhere between now and a million years from now. And as Betelgeuse lies a mere 640 light years from earth, it might already have gone supernova; we simply wouldn't know it for another 640 years.

So, keep watching the southern sky (if you live in the northern hemisphere).

The White Horse of the Sun

Carved into the chalk of a hillside in southern England, the Uffington White Horse stretches 110 meters from head to tail. It is the only prehistoric geoglyph – a large-scale design created using elements of the natural landscape – known in Europe. Its closest parallel is the Nazca lines of Peru.
Excavations in the 1990s yielded dates that showed it was created during the Late Bronze Age or the Iron Age, sometime between 1380 and 550 BC.

Archaeologist Joshua Pollard usually works on sites dating to the Neolithic, a period when people erected large monuments, such as Stonehenge, that were for the most part aligned with astronomical events. That experience led him to wonder whether the Uffington Horse could have been designed along similar lines, and he investigated how the geoglyph is positioned relative to celestial bodies[1]. He found that, observed from a hill opposite in midwinter, the sun rises behind the horse and, as the day progresses, seems to gain on the horse and finally pass it. From the same vantage point, at all times of the year, the horse appears to be galloping along the ridge in a westerly direction, toward the sunset.

Both the form and the setting of the site led Pollard to conclude that the White Horse was originally created as a depiction of a 'solar horse', a creature found in the mythology of many ancient Indo-European cultures. These people believed that the sun either rode a horse or was drawn by one in a chariot across the sky. Depictions of horses drawing this so-called solar chariot have been unearthed in Scandinavia and Celtic coins often show horses associated with the sun.
[Scandinavian Sun Horses]
"The White Horse is depicted as a horse in motion, and the people who created it must have thought that it was responsible for the sun’s movement across the sky," says Pollard. He posits that the geoglyph was not a static symbol, but an animated creature on the landscape, one that connected ancient Britons with the sun.

Over time, though its original purpose was lost, local people maintained a connection with the White Horse that ensured its continued existence. Were it not maintained, the White Horse would become overgrown and disappear within 20 years. Each summer, a few hundred local volunteers weed the White Horse and crush fresh chalk on top of it, so that it keeps the same brilliant white appearance it has had for 3,000 years. The site, as it must have done throughout the millennia, continues to be meaningful to the people around it.

[1] Pollard: The Uffington White Horse geoglyph as sun-horse in Antiquity - 2016

[Review] 'Classical Traditions in Science Fiction'

'Classical Traditions in Science Fiction' (edited by Brett M. Rogers and Benjamin Eldon Stevens) is a book that contains 14 essays by scholars of the classics, Greek, English, and philosophy. The essays explore connections between Jules Verne and the Greek satirist Lucian; Dune and the Iliad; Alien Resurrection and the Odyssey; antiquity and Western identity in Battlestar Galactica; the Iliad and Dan Simmons’ Ilium; The Hunger Games and the Roman Empire; and the graphic novel Pax Romana, which explores the transition from antiquity to a Christian world.

The term 'science fiction' is inherently vague, and finding an all-encompassing definition proves surprisingly elusive. There is Adam Roberts' dictum that science fiction is 'premised on a material, instrumental version of the cosmos', in contrast to its close ally, fantasy, which concerns 'magic, the supernatural, the spiritual'. Alternatively, Susan Sontag summed up the whole genre as consisting of the 'imagination of disaster', a fascination with and dread of irresistible destruction.

At first, science fiction concerned itself with 'novel ideas' about a possible future, in line with Roberts' definition. The next wave of SF consisted of visions of a drab and depressing future, closer to Sontag's summary. During the Victorian era the world was changing fast, for some too fast. Extrapolated, the rapid industrialisation, with its smog and crumbling institutions, could herald an apocalypse in the future.

To be literature, one school of thought goes, a science fiction novel must be depressing, giving an account of hubris and failure, such as George Orwell's 1984. Some consider Mary Shelley's Frankenstein the first science fiction novel: the optimism that drives scientific advance is thwarted by that unreliable factor, the human element.

Jesse Weiner’s essay “Lucretius, Lucian, and Mary Shelley’s Frankenstein” gives a thorough account of the book’s debate with the ancients, its later influence, and Shelley’s ambivalence about scientific progress.

But Frankenstein is subtitled The Modern Prometheus. Shelley drew upon the myth of Prometheus, who steals fire from the gods and is condemned to eternal punishment. Dr. Frankenstein seeks higher human knowledge, the secret of the spark of life, and pays dearly for it.

'Classical Traditions in Science Fiction' is a fascinating collection of essays that gives readers a new understanding of the place of science fiction within the Western literary tradition. Science fiction certainly traces its history back to classical Greek literature. Well worth your time.

Diet Soda Linked to Weight Gain, Not Weight Loss?

Olive Oil Times, formerly a site with a good reputation, ran an article under the heading 'Diet Soda Linked to Weight Gain, Not Weight Loss'.
The article used data from recent Canadian research which claimed that 'Evidence from RCTs does not clearly support the intended benefits of nonnutritive sweeteners for weight management, and observational data suggest that routine intake of nonnutritive sweeteners may be associated with increased BMI and cardiometabolic risk'[1].

Well, that was a strange outcome, because a previous study that used much the same data reached a different conclusion: 'Overall, the balance of evidence indicates that use of LES (low-energy sweeteners) in place of sugar, in children and adults, leads to reduced EI (energy intake) and BW (body weight), and possibly also when compared with water'[2].

My take is that the effects of all interventions to combat obesity are limited and inconsistent. There are many variables, and no study will ever be able to control for all of them. People might drink diet soda but still eat too much fast food, nullifying the effect of the soda's zero calories.

So, if you are trying to lose weight, replacing sugary drinks with low-calorie drinks can be a helpful part of your overall strategy. It will not be a panacea, nor will it make weight loss easy. See here.

The Olive Oil Times made things even worse by asking a naturopath (quack alert!) for her opinion. 'Carolyn Dean, medical doctor and naturopath, didn’t mince words in giving her opinion about the research. “This study, which exposes the false claims of synthetic sweeteners, should have the industry quaking in its boots”'.

As Wikipedia rightly warns: 'Naturopathy or naturopathic medicine is a form of pseudoscientific, alternative medicine.' Poor Olive Oil Times. I hope they didn't pay the writer of that article, because it did more harm than good.

[1] Azad et al: Nonnutritive sweeteners and cardiometabolic health: a systematic review and meta-analysis of randomized controlled trials and prospective cohort studies in Canadian Medical Association Journal – 2017
[2] Rogers et al: Does low-energy sweetener consumption affect energy intake and body weight? A systematic review, including meta-analyses, of the evidence from human and animal studies in International Journal of Obesity – 2015

Human Brain Is Still Evolving

Two genes involved in determining the size of the human brain have undergone substantial evolution in the last 60,000 years, suggesting that the brain is still undergoing rapid evolution[1].
New versions of the genes - or alleles - appear to have spread because they enhanced the brain's size and function in some way. Yet that would not necessarily mean that the populations in which these alleles are common have any brain-related advantage over those in which they are rare. Different populations often take advantage of different alleles, which arise at random, to respond to the same evolutionary pressure, as happened with the genetic defences against malaria, which differ somewhat between Mediterranean and African populations.

The researchers studied two genes, Microcephalin (MCPH1) and ASPM (Abnormal Spindle-like Microcephaly Associated), which came to light because they are disabled in microcephaly ('small brain'), a condition now better known because the Zika virus can cause it[2].

Lahn and his colleagues studied the worldwide distribution of the alleles by decoding the DNA of the two genes in many different populations. They report that for Microcephalin, a new allele arose about 37,000 years ago (between 60,000 and 14,000 years ago)[3]. Some 70 percent or more of people in most European and East Asian populations carry this allele, as do 100 percent of those in three South American Indian populations, but the allele is much rarer in most sub-Saharan Africans.

For the other gene, ASPM, a new allele emerged about 5,800 years ago (between 14,100 and 500 years ago). The allele has attained a frequency of about 50 percent in populations of the Middle East and Europe, is less common in East Asia, and is found at low frequency in some sub-Saharan African peoples. The researchers note that the ASPM allele emerged at about the same time as the spread of agriculture in the Middle East 10,000 years ago and the emergence of the civilizations of the Middle East some 5,000 years ago, but say any connection is not yet clear.
The Microcephalin and ASPM genes are known to be involved in determining brain size and so far have no other known function. They are known to have been under strong selective pressure as brain size increased from monkeys to man, and the chances seem 'pretty good' that the new alleles are a continuation of that process, Dr. Lahn said.

[1] Mekel-Bobrov et al: Ongoing adaptive evolution of ASPM, a brain size determinant in Homo sapiens in Science – 2005
[2] Evans et al: Microcephalin, a gene regulating brain size, continues to evolve adaptively in humans in Science – 2005
[3] Evans et al: Evidence that the adaptive allele of the brain size gene microcephalin introgressed into Homo sapiens from an archaic Homo lineage in PNAS – 2006

Stunting and Anorexia

Most experts now probably agree that stunting is a development disorder[1]. Stunting, or being too short for one’s age, is defined as a height that is more than two standard deviations below the World Health Organization (WHO) Child Growth Standards median[2]. It is a largely irreversible outcome of inadequate nutrition and repeated bouts of infection during the first 1000 days of a child’s life.

In my previous paper, 'Stunting: Malnutrition or Exploitation?'[3], I claimed that stunting is not only the result of malnutrition, but also of child exploitation. Both are indicative of poverty.

I also linked stunting to rigorous training by athletes. These athletes eat meals that contain more than enough nutrients to grow, but their bodies use these nutrients to fuel short-term performance goals to the detriment of long-term growth. My conclusion was that, while stunting is usually monitored in children under five years of age, it should also be monitored in older children.
But what if malnutrition is the result of an ill-advised choice? What if anorexia also leads to stunting? The Diagnostic and Statistical Manual of Mental Disorders (DSM-5) classifies Anorexia Nervosa as an eating disorder. Criteria include (1) restriction of energy intake relative to requirements, leading to a significantly low body weight in the context of age, sex, developmental trajectory, and physical health, (2) intense fear of gaining weight or becoming fat, even though underweight, and (3) disturbance in the way in which one's body weight or shape is experienced, undue influence of body weight or shape on self-evaluation, or denial of the seriousness of the current low body weight[4].

How will a voluntary restriction of energy intake relative to requirements, that leads to a significantly lower body weight in the context of age, sex, developmental trajectory and physical health, influence your growth?
One study revealed that 'Male children of women with a history of Anorexia Nervosa [...], and female children of women with Anorexia Nervosa, were shorter throughout childhood'[5]. Another study found that 'linear growth retardation was a prominent feature of Anorexia Nervosa in our sample of male adolescent patients, preceding, in some cases, the reported detection of the eating disorder. Weight restoration, particularly when target weight is based on the premorbid height percentile, may be associated with significant catch-up growth, but complete catch-up growth may not be achieved'[6].

Therefore, anorexia is a type of malnutrition and can lead to stunting.

Part 1 'Stunting: Malnutrition or Exploitation?' can be read here.
Part 3 'Smoking and Stunting' can be read here.

[1] Kraemer: Making Stunting a Development Indicator in Sight and Life – 2016
[2] WHO Global Nutrition Targets 2025: Stunting Policy Brief. See here
[3] De Vries: Stunting: Malnutrition or Exploitation? in Sight and Life – 2016
[4] American Psychiatric Association: Diagnostic and Statistical Manual of Mental Disorders 5 – 2013
[5] Easter et al: Growth trajectories in the children of mothers with eating disorders: a longitudinal study in BMJ Open – 2014
[6] Modan-Moses et al: Stunting of growth as a major feature of anorexia nervosa in male adolescents in Pediatrics – 2003

[Review] 'The Evidence of Ghosts' by AK Benedict

Maria King, blind from birth and now blind by choice, sits by the Thames mudlarking, sifting through the history of London. Having been blind all her life, she can't get used to being gifted with sight after surgery, so she wears a blindfold that gives her a feeling of security. Then, one day, while mudlarking, she finds a ring, still on a finger, in a box with 'Marry me Maria' on the lid in braille.

DI Jonathan Dark is assigned to the case. The finger and the ring belonged to the last woman who received a similar proposal and was murdered. Jonathan Dark was unable to prevent that murder and is determined not to let Maria become the stalker's next victim.

Jonathan Dark is a detective with a disintegrating private life. His personal problems constantly interfere with his professional life, but the real question in 'The Evidence of Ghosts' is: who's stalking Maria King and why?

The other question that may be on our lips is: if I were being stalked by a murderer, would I want to keep wearing a blindfold? I know that seems an odd question, but when you consider that Maria wears one by choice all the time, it makes sense to ask. While most reviewers think this doesn't reflect true life, I can assure readers that one can never fully understand the psychology of the human mind.

Alexandra Benedict weaves a fascinating supernatural (or supranatural) world where the dead are always with us, sometimes helping, sometimes obstructing and sometimes urging us to kill.

What do I think of 'The Evidence of Ghosts'? I got the distinct feeling that Alexandra Benedict was trying to weave too many storylines into this book and not quite succeeding. Yet it is still a remarkable, albeit unusual, amalgamation of a crime novel and a Gothic novel. A.K. Benedict has a rich imagination and a dark sense of humour that brightens nearly every page.

Death has no sequel. So ends the book. But I'm certain that AK Benedict's fertile imagination has already conjured up other adventures for our troubled detective Jonathan Dark. Highly Recommended.


[Review] 'Classical Traditions in Modern Fantasy'

'Classical Traditions in Modern Fantasy' is the second in a series, the first being 'Classical Traditions in Science Fiction'.

'Classical Traditions in Modern Fantasy' is a collection of essays focusing on how fantasy draws deeply on ancient Greek and Roman mythology and literature.

Edited by Brett M. Rogers and Benjamin Eldon Stevens, the book contains fifteen essays intended for scholars and readers of fantasy alike. This volume explores many of the most significant examples of the modern genre, including H. P. Lovecraft's dark stories, J. R. R. Tolkien's 'The Hobbit', C. S. Lewis's 'Chronicles of Narnia', J. K. Rowling's 'Harry Potter' and George R. R. Martin's 'A Song of Ice and Fire' (aka 'Game of Thrones'), in relation to ancient classical texts such as Aeschylus' Oresteia, Aristotle's Poetics, Virgil's Aeneid and Apuleius' Metamorphoses (aka 'The Golden Ass').

So, the writers of the essays try to find links and similarities between modern fantasy and classical texts. It's a comparatively easy task, because both hark back to universal stories that lie buried deep within us. All writers, ancient and recent, tell stories with the same issues at their heart: a quest for freedom, a rebellion against repression or the urge to discover unknown lands.

What most of the essays fail to mention is the education the modern fantasy writers have had. We know that Tolkien was a philologist and university professor, though he said his main inspiration for 'The Hobbit' was the Old English epic 'Beowulf'. I agree with Benjamin Eldon Stevens, writer of the essay on Tolkien, that Bilbo's travels into the tunnels and his encounters with Gollum/Sméagol echo the underworlds of Dante and Virgil. We also know that Rowling studied classics at the University of Exeter, so her classical 'roots' are not in doubt either. But what of George R. R. Martin, who 'only' studied journalism? Did he write his sprawling fantasy series with the classics in mind? Or did he simply write a story that has so many similarities with classical stories that one is easily tempted to deduce that Martin was influenced by them? H.P. Lovecraft never finished high school, but was interested in chemistry and astronomy. His dark writing was fueled by his nightmares, the result of parasomnia or 'night terrors'.

In the end, 'Classical Traditions in Modern Fantasy', is certainly a book that you should read, because it gives you a reason to ask yourself a lot of questions. And that's the very best one might expect from a book.

Viking: an alternative etymology

Everybody knows about Vikings, the fearless warriors from the cold and barren north. People who have studied history (but not etymology) will tell you that Viking is an Old Norse word meaning 'pirate' or 'raider'. It's not.
Actually, the English word 'Viking' went extinct in Middle English; it was revived in the 19th century, borrowed from the Scandinavian languages of that time.

The etymology of víkingr and víking is hotly debated by scholars. A víkingr was someone who went on expeditions, usually abroad, usually by sea, and usually in a group with other víkingar (the plural).

Both words are thought to be connected with Old Norse vík, meaning 'fjord', 'small bay', 'inlet' or 'cove'. Towns such as Reykjavik and Lerwick may trace their origins back to the Vikings. But it would be a step too far to conclude that a viking was simply named after a 'fjord'. Vikings were a diverse group and originated from the entire Scandinavian peninsula. We need another explanation.

The Swedes will tell you that vig means 'battle' and therefore a viking would be a 'warrior'. Not so.

Both wic in Old English and wick in Old Frisian meant 'camp' or 'a temporary living space'. So it's quite possible that, if wic means 'camp', then víkingr could well mean '(one) going on a camping trip'.

As camps grew into more permanent settlements, the word wic also came to denote something different. We can discover the word in Old English wīc ('dwelling place', 'abode') and Middle English wik, wich ('village', 'hamlet', 'town'). Modern Dutch wijk and modern Frisian wyk still mean 'part of a city'. This solution also ties in very neatly with the Old Norse word vestrvíking. It is usually translated as 'raiding in the west', in the context of 'the British Isles'. Now we can give it its original meaning: 'camping in the west'.

[Review] 'Sleeper' by J.D. Fennell

'Sleeper', the debut by J. D. Fennell, is marketed as a young adult thriller. Yes, it is that, and much more. The book is also a masterful melange of fantasy and war-time chaos. The protagonist, Will Starling, is a sixteen-year-old who must keep a mysterious notebook out of the hands of VIPER, a murderous bunch of villains. After being shot and falling into the icy water near Dover, he is rescued, only to discover his memory is gone. You might think that this is some sort of homage to Jason Bourne, but 'Sleeper' is different. Very different.

Slowly but surely Starling's memory returns and he understands that he's no ordinary lad. He's been trained to kill and to maim. As could be expected, the story takes place against the backdrop of air raids on London, which adds another layer of fear and chaos to 'Sleeper'.

Just a few pages into the tale, I was certain that this was no ordinary thriller. This was something new. If I were a native English speaker, I would be able to say that it is a ripping yarn told at breakneck speed. What can it be compared to, I wondered aloud. In the end I decided 'Sleeper' might well be a start to a wonderful series that emulates the movies about 'Indiana Jones' with elements of Young Bond (by Steve Cole) and Alex Rider (by Anthony Horowitz) thrown in for good measure.

This is a thriller I would certainly highly recommend to young adults, but older readers might find 'Sleeper' very entertaining too. If I were pressed to mention a minor negative, I missed a bit of British tongue-in-cheek humour to lighten the narrative at opportune moments. But I would only mention that after a fair bit of torture.

The fear of cats in Victorian times

It was in the late nineteenth century that medicine turned its attention to irrational fears. The German physician Carl Westphal (1833-1890) made the initial diagnosis of a phobia, agoraphobia, the fear of open spaces, in 1871[1]. He studied the behaviour of three otherwise sane and rational men who were terrified of crossing an open city space. Following this diagnosis, the notion that individuals could be overtaken by various forms of inexplicable fear was quickly taken up by medical practitioners around the world.

The American psychologist Granville Stanley Hall (1846-1924) soon identified 138 different forms of pathological fear[2]. These included not only recognised phobias, such as agoraphobia and claustrophobia, but also some fears that were particular to the Victorian era: amakophobia (fear of carriages), pteronophobia (fear of feathers) and hypegiaphobia (fear of responsibility).
However, it was the fear of cats (ailurophobia) that attracted the most attention from Victorian researchers. Hall, with his colleague Silas Weir Mitchell, even conducted experiments, such as placing sufferers in a room with a hidden cat, to see if they picked up the animal's presence. He became convinced that many of his patients could indeed always sense them. Trying to explain the phobia, he ruled out asthma and evolutionarily inherited fears (people who were terrified of cats could look at lions and tigers without problems).

Eventually Hall suggested that emanations from the cat 'may affect the nervous system through the nasal membrane, although not recognised as odours'. He remained baffled over why cats seemed to have an urge to get as close as possible to individuals who were scared of them.

Research now suggests that the Victorian urge to classify almost everything was the result of a rapidly changing, industrialising society, where new scientific theories were starting to challenge long-held religious beliefs, explanations and dogma.

[1] Westphal: Die Agoraphobie, eine neuropathische Erscheinung in Archiv für Psychiatrie und Nervenkrankheiten - 1871
[2] Stanley Hall: Synthetic Genetic Study of Fear in American Journal of Psychology - 1914

Astronomy and watches in Friesland

Astronomical devices have been made for thousands of years. A famous example is the Antikythera mechanism, an artifact recovered off the Greek island of Antikythera. It is an ancient planetarium (or orrery) used to predict astronomical positions and eclipses for calendrical and astrological purposes. Even the Olympiads, the cycles of the ancient Olympic Games, could be calculated. The ancient device is a complex clockwork mechanism composed of at least 30 meshing bronze gears and is dated at around 205 BC.
[Model of the Antikythera mechanism]
Just a few kilometers west of my hometown of Harlingen lies the city of Franeker. In that Frisian town once lived Eise Eisinga (1744-1828), an amateur astronomer who built a planetarium in his own house. The planetarium still exists and is the oldest functioning planetarium in the world. Eisinga never went to school, but he did publish a book about the principles of astronomy when he was only 17 years old.
[Eise Eisinga's planetarium]
And today the Frisians are still world famous for their, sometimes, astronomical watches with, yes, astronomical price tags. Christiaan van der Klaauw, based in Heerenveen, creates astronomical watches such as the Planetarium, which contains the smallest mechanical planetarium in the world, showing in real time the orbits of Mercury, Venus, Earth, Mars, Jupiter and Saturn around the Sun. Don't worry, it also tells you the time and the mechanism is extremely accurate.
[Van der Klaauw Planetarium CKPT3304]
It isn't quite known why Frisians are so fascinated with planetary movements. It might have something to do with their healthy dairy products or their perfect night skies, but my bet is on Beerenburg, a traditional alcoholic drink that contains a host of medicinal herbs. It is almost exclusively consumed by Frisians.

Painkillers are killing America

You might remember House MD self-medicating on Vicodin to keep the pain in his leg at bay, allowing him to function as a brilliant doctor. Vicodin is a painkiller that consists of a combination of two ingredients: hydrocodone and acetaminophen. Hydrocodone is an opioid, while acetaminophen (paracetamol) is a non-opioid analgesic. It is indicated for relief of moderate to severe pain.
Fentanyl is another potent, synthetic opioid pain medication with a rapid onset and short duration of action. It is approved for treating severe pain, typically advanced cancer pain. Fentanyl is more than 50 times more potent than morphine, thus increasing the risks for users. Fentanyl has emerged as the drug of choice in many parts of the United States and its legal and illegal use is now termed an 'epidemic' by scientists.
Opioid use has exploded in the US, after decades of doctors over-prescribing painkillers in the 1990s and 2000s. Authorities believe illicit fentanyl is now pouring into the US, mostly directly from China through the mail, sometimes via Mexico.

A recent report by the Centers for Disease Control and Prevention shows that drug overdose deaths nearly tripled during 1999–2014[1]. Among the 47,055 drug overdose deaths that occurred in 2014 in the United States, nearly 30,000 involved an opioid. More people are now killed by opioids than by bullets[2]. The numbers (read: deaths) keep rising: in 2015 there were 52,404 drug overdose deaths, of which 33,091 involved the use of opioids[3]. Drug overdose deaths in 2016 are expected to exceed 64,000, representing a rate of 175 deaths a day.
Another factor is that opioids are often taken with other painkillers and alcohol, which also acts as a sedative.

[Update March 16, 2017] The Commission on Narcotic Drugs, part of the UN, has decided to help the US by adding two chemicals, used to make the drug Fentanyl, to an international list of controlled substances. It is hoped that it will help fight a wave of deaths by overdose in America. The substances are two precursors of Fentanyl: 4-anilino-N-phenethylpiperidine (ANPP) and N-phenethyl-4-piperidone (NPP). It also added a fentanyl analog called butyrfentanyl, a drug similar to fentanyl.

[1] Rose et al: Increases in Drug and Opioid-Involved Overdose Deaths — United States, 2010–2014 in Morbidity and Mortality Weekly Report (MMWR) – 2016
[2] Washington Post: Heroin deaths surpass gun homicides for the first time, CDC data shows – 2016. See here
[3] Rose et al: Increases in Drug and Opioid-Involved Overdose Deaths — United States, 2010–2015 in Morbidity and Mortality Weekly Report (MMWR) – 2016

Abigail Thaw on 'Morse' and 'Endeavour'

[Guest post by Damian Michael Barcroft, previously published here]

2017 comes around and I had no inkling it was 30 years since Morse first crossed our TV screens. Perhaps that’s a credit to the Endeavour series that we’ve become so immersed in our characters and our own program. Suddenly I am in the thick of the “30 years” thing and I can’t believe it was so long ago that it all started.
[Abigail Thaw as Dorothea Frazil]
But I remember thinking, while waiting to shoot my first scene of series 4 in 2016, that being in Oxford is a pertinent reminder of my father for me. It brings me back to him with a jolt; the colleges, the streets, the Randolph Hotel, the Ashmolean. Strange, because I lived there as a child long after my parents divorced so I’ve rarely been there with him. But the character of Morse is so ingrained in that golden stone and the legacy (although I hate that cliched word) is quite sobering. Staring round at this wonderful, talented crew and actors, there to tell the stories of Inspector Morse’s crime solving… I mean, how extraordinary is that!

Thank you Colin Dexter and thank you Dad for giving 'Morse' a corporal existence and everyone for continuing to make it happen: Damien, Russell, Kevin who drives you to the set happy and rested, Shaun with all that weight on his slender shoulders that he carries effortlessly… The list is very long. And then I stop thinking about it because if I didn’t I’d be overwhelmed and wouldn’t be able to do my job!

Having James Laurenson in the first episode was a treat and it was lovely to hear his stories of that very first Morse; the uncertainty of whether it “had legs”. But for the rest of the time I don’t think about “Morse” or “Dad”. I look across at my fellow actors and I think, Hello Endeavour or Hello Thursday, and when the camera’s not rolling I’m having a jolly good laugh; or putting the world to right over a custard cream and a tepid cup of tea; or trying to remember my lines and not bump into the furniture. Or trying to look as though I drive a 1960 Triumph with exceptionally stiff gears every day of my life…

And I love Dorothea. I fall for her more with each series. Russell thinks up all sorts for her, some make it to the final cut and many don’t but I know they’re there and they help me fill her out. Russell graciously allows me to feel I have some input into her development as I email him with the odd thought but I have to admit, he’s the puppet master. And I love the glimpses we get of her private life. Her friendship with Endeavour is touching and particularly comes to fruition in this series. Not to give anything away! She’s a lonely soul much like her Morse compatriot. But she’s got such gumption and life force. She can be utterly charmless when she wants to be which is rare in playing or being a woman. Something men take for granted. I wish I was more like her in many ways. But not at the witching hour after a scotch too many. Or those dark hours before dawn. I doubt she’s a stranger to the Dark Night of the Soul.

Whatever other job I do during the year, there is nothing like the thrill of a fresh new Endeavour script arriving, the comfort of all those familiar faces working for the same thing, making it as brilliant and enjoyable as possible. Putting on Dorothea’s rather uncomfortable clothes and pointy bra and drowning in a sea of Irene’s (Napier) hairspray, I’m plunged back into “Ah yes, I know this. Hello, girl. Cheers.”

BTW: The name Dorothea Frazil is a clever find. 'Frazil' means 'Ice crystals formed in turbulent water, as in swift streams or rough seas'. D. Frazil can thus be read as 'De-ice' or 'Thaw'.

The (Short) Evolution of Smallpox

New research suggests that smallpox, a viral disease that caused millions of deaths worldwide, may not be an ancient disease[1]. The findings raise new questions about when the Variola virus first emerged and later evolved, possibly in response to inoculation and vaccination.
Smallpox, one of the most devastating viral diseases, had long been thought to have appeared in human populations thousands of years ago in ancient Egypt, India and/or China. Some historical accounts suggest that pharaoh Ramses V, who died circa 1145 BC, suffered from smallpox, based on lesions found on his face.

To better understand its evolutionary history, scientists extracted DNA from the partial mummified remains of a Lithuanian child interred in the crypt of a church in Vilnius. The child is believed to have died between 1643 and 1665, a period in which several smallpox outbreaks, with increasing levels of mortality, were documented throughout Europe. Researchers compared the 17th-century strain to those from a databank of samples dating from 1940 up to the disease's eradication in 1977. Surprisingly, the results show that the evolution of the smallpox virus occurred far more recently than previously thought, with all available strains of the virus sharing an ancestor no older than 1580 AD.

The poxvirus strain that represents the true reservoir for human smallpox remains unknown to this day. Camelpox is very closely related, but is not regarded as the likely ancestor of smallpox, suggesting that the real reservoir remains at large or has gone extinct[2].
The researchers also discovered that the smallpox virus evolved into two circulating strains, Variola major and Variola minor, after English physician Edward Jenner developed a vaccine in 1796.

One form, Variola major, was highly virulent and deadly; the other, Variola minor, more benign. However, the two forms experienced a ‘major population bottleneck’ with the rise of immunization efforts. The date of the ancestor of the minor strain corresponds well with the Atlantic slave trade, which was likely responsible for its partial worldwide dissemination.

This raises important questions about how a pathogen diversifies in the face of vaccination. While smallpox is now eradicated in humans, we should remain vigilant about its possible reemergence until we fully understand its origins.

[1] Duggan et al: 17th Century Variola Virus Reveals the Recent History of Smallpox in Current Biology – 2016. See here
[2] Smithson et al: Prediction of steps in the evolution of variola virus host range in PLoS One - 2014 


Ether: from anaesthetic to recreational drug

While ether was already synthesized around 1540, when the German botanist and chemist Valerius Cordus created a revolutionary formula that involved adding sulfuric acid to ethyl alcohol, its use as an anaesthetic on humans was only 'discovered' in 1842.
Crawford Williamson Long (1815-1878), an American surgeon and pharmacist, became the first pioneer to use ether as a general anesthetic when he removed a tumor from a patient’s neck. Unfortunately, Long didn’t publish the results of his experiments until 1848. By that time, Boston dentist William Morton (1819-1868) had won fame by using it while extracting a tooth from a patient in 1846. An account of this successful painless procedure was published in a newspaper, prompting surgeon John Collins Warren (1778-1856) to ask Morton to assist him in an operation removing a large tumor from a patient’s lower jaw.

But ether had a more disturbing and sinister use. During the second half of the 19th century, ether was widely used as a recreational drug in some European countries[1]. It became especially popular in Ireland, where temperance campaigners thought it an acceptable alternative to alcohol. By 1890, when it was finally classified as a poison, more than 17,000 gallons of ether were being consumed in Ireland, mostly as a beverage. The anti-alcohol brigade was partly right: consuming ether does cause dependence, but hardly any withdrawal symptoms.

Ether parties sprang up all over the world. Thomas Lint, a medical student at St. Bartholomew’s Hospital in London, confessed: “We sit round a table and suck [on an inhaling apparatus], like many nabobs with their hookahs. It’s glorious, as you will see from this analysis of a quarter of an hour’s jolly good suck.” He then went on to describe several “ethereal” experiences he and his fellow classmates had while under the influence of the newly discovered substance.

Ether wasn’t just inhaled. It was also drunk, like alcohol. In Ireland, the substance replaced whiskey for a while, due to its low cost (a penny a draught). After drinking a glass of water, “ethermaniacs” would take a drop of the drug on their tongues while pinching their noses and chasing it with another glass of water. Taken this way, ether hit the user hard and fast. Dr. Ernest Hart wrote that “the immediate effects of drinking ether are similar to those produced by alcohol, but everything takes place more rapidly.”

Recovery was just as swift. Those taken into custody for drunken disorderliness were often completely sober by the time they reached the police station, with the bonus that they also suffered no hangover. In this way, 19th-century revelers could take draughts of ether several times a day, with little consequence[2].
Even in Raymond Chandler's 1939 novel 'The Big Sleep' and the subsequent 1946 movie with Humphrey Bogart and Lauren Bacall, detective Philip Marlowe, played by Bogart, drinks a mixture of ether and laudanum.

[1] Zandberg: “Villages … Reek of Ether Vapours”: Ether Drinking in Silesia before 1939 in Medical History – 2010. See here.
[2] Haynes: Ethermaniacs in BC Medical Journal – 2014

[Review] 'The Beauty of Murder' by AK Benedict

Usually reviews are constructed the same: a reviewer tells you a bit about the story, followed by his own thoughts and views. He then ends with a recommendation: to buy or not to bother.

I want to start this review of Alexandra Benedict's 'The Beauty of Murder' with my recommendation: if you're reading a book, just put it aside, order 'The Beauty of Murder' and prepare yourself for a treat. This book is not a usual mystery, but a guided voyage through your imagination. What sort of book is it, you might ask. Reviewers are not at all in agreement, but I would say this is a mystery that perfectly blends the supernatural and metaphysical. It reminds me somewhat of the splendidly written mysteries by Irish novelist John Connolly.

Jackamore Grass is a serial killer who is able to break the boundaries of time. Then Cambridge lecturer Stephen Killigan finds the body of a beauty queen who has been missing for a year, only to discover that she has disappeared again without any trace of her ever being there. The police start questioning his sanity. Unknowingly, he is being drawn into the dark and twisted world of Jackamore Grass. Darkness, once gazed upon, can never be lost.

A.K. Benedict writes with supreme confidence and is able to grip the reader's attention with perfect and elegant prose. So, by now you must have ordered your copy of 'The Beauty of Murder', because if you haven't, you've lost valuable time. Remember: time, once lost, cannot be regained. Unless, of course, your name is Jackamore Grass.

I'm already eagerly awaiting the publication of part two of the series, provisionally entitled 'The Cabinet of Shadows'.

Nightshade: an alternative etymology

Deadly nightshade (Atropa belladonna) is a highly toxic hallucinogen. Its cousin, the black nightshade (Solanum nigrum) is partly edible.
The deadly nightshade is native to temperate southern and central Europe, but has been cultivated and introduced outside its native range. Its most northern frontier reaches Skåne in Sweden, where it was grown in apothecary gardens.

Yes, the deadly nightshade is one of the most toxic plants found in the Eastern Hemisphere (though it has been introduced into the Western Hemisphere). On the other hand, the ripe berries and cooked leaves of the black nightshade are used as food in some locales, and selected plant parts are used as a traditional medicine.

Right. The botanicals are now sorted. So where does the name 'nightshade' derive from?

The Etymology Dictionary predictably claims that Old English nihtscada literally means 'shade of night'. Both Dutch and German use the same word for these plants: nachtschade (Dutch) and Nachtschatten (German). The Dictionary suggests that the name is perhaps an allusion to the poisonous black berries. A similar Swedish word, nattskata, meant 'bat'. Bats were (and still are) shadows in the night.
So, is it a night shadow or isn't it? In modern Dutch, 'schade' means 'damage'. Modern Frisian 'skea' has exactly the same meaning. That points to an alternative explanation of the word 'nightshade': the plant bears berries that are as black as the night, and these cause damage.

The Oracle of Delphi

From about 1400 BC to 400 AD, the Oracle of Delphi was considered one of the most sacred sites in all of ancient Greece. It is located on Mount Parnassus in Phocis, some 200 kilometers northwest of Athens.

People from all walks of life made pilgrimages there to seek advice from the God Apollo, which was relayed to them by Pythia (Πῡθίᾱ), the High Priestess. Her often cryptic ramblings were highly regarded and affected everything from the outcome of wars to when farmers should plant their crops. No kingdom, city or private person could afford to make critical decisions without consulting the Pythia. Thanks to her prestige, Delphi also became the richest Hellenic sanctuary. The Greeks called it the omphalos, or 'navel of the world'.

One of the most famous examples of her predictions or revelations was that of King Croesus of Lydia. Croesus asked at Delphi whether he should wage war against the Persians. He was told that, if he did, he would destroy a great empire. Taking the response as a prediction of victory, he launched a military assault on Cyrus, the king of Persia. The oracle was right: Croesus did end up destroying an empire – his own.

The ancient sources describe two distinct types of prophetic trance experienced by the Pythia. First, and more normally, she would lapse into benign semi-consciousness, during which she remained seated on the tripod, responding to questions—though in a strangely altered voice. According to Plutarch, once the Pythia recovered from this trance, she was in a composed and relaxed state, like a runner after a race. A second kind of trance involved a frenzied delirium characterized by wild movements of the limbs, harsh groaning and inarticulate cries. When the Pythia experienced this delirium, Plutarch reports, she died after only a few days—and a new Pythia took her place.

The Pythia entered her trance by inhaling sweet-smelling noxious fumes coming from deep fissures underneath the temple, according to the ancient historian Plutarch.

At first, a lack of evidence led modern archaeologists to dismiss Plutarch’s observations, but it appears the ancients were right after all. Tests showed that the waters of a nearby spring showed the presence of methane and ethane, which can be intoxicating, as well as ethylene[1].
Ethylene was later widely used as an anesthetic in the first half of the 20th century[2]. In small doses, ethylene stimulates the central nervous system, causes hallucinations and has a sweet odor. However, it was not particularly successful as an anesthetic, because high concentrations were needed to achieve unconsciousness and it was dangerously explosive.

[1] De Boer et al: New evidence of the geological origins of the ancient Delphic oracle (Greece) in Geology – 2001
[2] Spiller et al: The Delphic oracle: a multidisciplinary defense of the gaseous vent theory in Journal of Toxicology – 2002

Dämmerschlaf or Twilight Sleep

Most of us are familiar with the German term Götterdämmerung, which translates as 'Twilight of the Gods' or more correctly as 'Gods' Twilight'. Another concept is Dämmerschlaf or 'Twilight Sleep', which became popular in the beginning of the twentieth century[1].

The treatment of choice for childbirth pains during the latter half of the 1800s was chloroform. The anaesthetic qualities of chloroform were first described in 1842. On November 4th, 1847, the Scottish doctor James Young Simpson first used the anesthetic qualities of chloroform on a pair of friends at a dinner party. This was done purely as entertainment rather than being a medical procedure.
Between about 1865 and 1920, chloroform was used in about 90% of all narcoses performed in the UK, but complications were many. The problem was that chloroform causes depression of the central nervous system (CNS), ultimately producing deep coma, respiratory center depression and death.

The search was on for a safer means of sedation.

Twilight sleep was developed in Germany around 1900. It is an amnesic condition characterized by insensitivity to pain without loss of consciousness, induced by an injection of morphine (from opium) and scopolamine (from the deadly nightshade) in order to relieve the pain of childbirth. This combination, which mimics the Greek nepenthe, induces a semi-narcotic state which produces the experience of childbirth without pain. However, some scientists state that women do feel the – sometimes - excruciating pain, but the drug removes all memory of that pain.

Pain can lead to all sorts of long-term traumatic effects, such as postpartum depression. In the end, it may not really matter whether a woman does not feel the pain or simply does not remember the pain she experienced.

The combination of morphine and scopolamine entered mainstream medical use around 1907, but it also had its drawbacks. In the end the drug was discontinued because it had depressive effects on the central nervous system of the infant. This resulted in a drowsy newborn with poor breathing capacity.

[1] Marx: Historische Entwicklung der Geburtsanästhesie in Anaesthesist - 1987