Staple though it is today, the lowly potato had a hard time reaching its preeminent status in Western cuisine. Perhaps its lengthy purgatory has something to do with the tale that when Sir Walter Raleigh gave some potatoes to Queen Elizabeth, her cooks tossed aside the roots and served up the boiled greens instead, causing a court-wide case of indigestion. Whether that’s the case or not—and there’s no evidence that Raleigh ever so much as set eyes on a potato—for decades Europeans would have nothing to do with the tuber. At best, it was found useful to feed the cattle. At worst, it was considered a leprosy-inducing invention of the devil.
This belief was particularly pernicious in the fair fields of France, a country then home to a quarter of Europe’s inhabitants despite its periodic decimation by epidemic and famine. By the beginning of the 17th century France’s population had reached twenty million and continued to rise. Clearly, a cheap, plentiful, and resilient crop was just what the nutritionist ordered, yet even in the face of the brutal demographic crises that popped up every ten to fifteen years over the next two centuries, each time lopping two or three million inhabitants off the non-existent voting rolls, the potato remained unpondered, unprized, and unplanted.
Clearly, the potato needed a champion. What it got was a pharmacist.
Near the heart of Scotland lies a large morass known as Dullatur Bog. Water seeps from these moistened acres and coalesces into the headwaters of a river which meanders through the countryside for nearly 22 miles until its terminus in Glasgow. In the late 19th century this river adorned the landscape just outside of the laboratory of Sir William Thomson, renowned scientist and president of the Royal Society. The river must have made an impression on Thomson—when Queen Victoria granted him the title of Baron in 1892, he opted to adopt the river’s name as his own. Sir William Thomson was thenceforth known as Lord Kelvin.
Kelvin’s contributions to science were vast, but he is perhaps best known today for the temperature scale that bears his name. It is so named in honor of his discovery of the coldest possible temperature in our universe. Thomson had played a major role in developing the Laws of Thermodynamics, and in 1848 he used them to extrapolate that the coldest temperature any matter can reach, regardless of the substance, is -273.15°C (-459.67°F). We now know this boundary as zero Kelvin.
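For the numerically inclined, the relationship between the scales is plain arithmetic; the short sketch below is illustrative only, and not part of Thomson's own work, showing how the Celsius figure above maps onto kelvin and Fahrenheit:

```python
# Illustrative sketch of the temperature-scale arithmetic described above.
# The Kelvin scale is simply Celsius shifted so that absolute zero sits at 0 K.

ABSOLUTE_ZERO_C = -273.15  # degrees Celsius

def celsius_to_kelvin(temp_c: float) -> float:
    """Shift the Celsius value so that -273.15 °C becomes 0 K."""
    return temp_c + 273.15

def celsius_to_fahrenheit(temp_c: float) -> float:
    """Standard Celsius-to-Fahrenheit conversion."""
    return temp_c * 9.0 / 5.0 + 32.0

print(celsius_to_kelvin(ABSOLUTE_ZERO_C))      # 0.0  (zero kelvin)
print(celsius_to_fahrenheit(ABSOLUTE_ZERO_C))  # -459.67
```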
Once this absolute zero temperature was decisively identified, prominent Victorian scientists commenced multiple independent efforts to build machines to explore this physical frontier. Their equipment was primitive, and the trappings were treacherous, but they pressed on nonetheless, dangers be damned. There was science to be done.
Sometime in the 1940s, an improbable encounter occurred at a mental institution in Maryland. Two women, each of whom was institutionalized for believing she was the Virgin Mary, chanced upon one another and engaged in conversation. They had been chatting for several minutes when the older woman introduced herself as “Mary, Mother of God.”
“Why, you can’t be, my dear,” the other patient replied, unable to conceive of such a notion. “You must be crazy. I am the Mother of God.”
“I’m afraid it’s you who are mixed up,” the first asserted. “I am Mary.”
A hospital staff member eavesdropped as the two Virgin Marys debated their identities. After a while the women paused to quietly regard one another. Finally, the older patient seemed to arrive at a realization. “If you’re Mary,” she said, “I must be Anne, your mother.” That seemed to settle it, and the reconciled patients embraced. In the following weeks the woman who had conceded her delusion was reported to be much more receptive to treatment, and she was soon considered well enough to be discharged from the hospital.
This clinical anecdote was retold in a 1955 issue of Harper’s Magazine, and a highly regarded social psychologist named Dr. Milton Rokeach read it with great interest. What might happen, he wondered, if a psychologist were to deliberately pair up patients who held directly conflicting identity delusions? Perhaps such psychological leverage could be used to pry at the cracks of an irrational psyche to let in the light of reason. Dr. Rokeach sought and secured a research grant to test his hypothesis, and he began canvassing sanitariums for delusional doppelgängers. Soon he found several suitable subjects: three patients, all in state care, each of whom believed himself to be Jesus Christ. And he saw that it was good.
On the 11th of July 1897, the world breathlessly awaited word from the small Norwegian island of Danskøya in the Arctic Sea. Three gallant Swedish scientists stationed there were about to embark on an enterprise of history-making proportions, and newspapers around the globe had allotted considerable ink to the anticipated adventure. The undertaking was led by renowned engineer Salomon August Andrée, and he was accompanied by his research companions Nils Strindberg and Knut Fraenkel.
In the shadow of a 67-foot-wide spherical hydrogen balloon—one of the largest to have been built at that time—toasts were drunk, telegrams to the Swedish king were dictated, hands were shaken, and notes to loved ones were pressed into palms. “Strindberg and Fraenkel!” Andrée cried, “Are you ready to get into the car?” They were, and they dutifully ducked into the four-and-a-half-foot-tall, six-foot-wide carriage suspended from the balloon. The whole flying apparatus had been christened the “Örnen,” the Swedish word for “Eagle.”
“Cut away everywhere!” Andrée commanded after clambering into the Eagle himself, and the ground crew slashed at the lines binding the balloon to the Earth. Hurrahs were offered as the immense, primitive airship pulled away from the wood-plank hangar and bobbed ponderously into the atmosphere. Their mission was to be the first humans to reach the North Pole, taking aerial photographs and scientific measurements along the way for future explorers. If all went according to plan they would then touch down in Siberia or Alaska after a few weeks’ flight, laden with information about the top of the world.
Onlookers watched for about an hour as the voluminous sphere shrank into the distance and disappeared into northerly mists. Andrée, Strindberg, and Fraenkel would not arrive on the other side of the planet as planned. But their journey was far from over.
The naked mole rat, Heterocephalus glaber, is fleshy, furless, buck-toothed and brazenly ugly. Yet what these small East African rodents lack in terms of good looks, they make up for with an impressive array of biological quirks. These misnamed mammals are neither moles nor rats, and in terms of their social behaviour are actually closer to bees, wasps, ants, and termites than to other backboned animals.
They live in underground cooperative colonies of up to 300 individuals with a dominant breeding “queen” and celibate soldier and worker castes. Biologists have identified only one other vertebrate—the closely related Damaraland mole rat—that uses this rigid reproductive and social structure. Until the late 1970s scientists believed that this trait, known as eusociality, was confined to insects.
Naked mole rats boast several impressive feats of physiology, including an apparent imperviousness to pain, a casual disregard for low-oxygen environments, and resistance to cancer. Indeed, these unsightly creatures both baffle and buttress Darwin’s Theory of Evolution in multiple remarkable and apparently self-contradictory ways.
It was the summer of 1936 when Ernest Lawrence, the inventor of the atom-smashing cyclotron, received a visit from Emilio Segrè, a scientific colleague from Italy. Segrè explained that he had come all the way to America to ask a very small favor: He wondered whether Lawrence would part with a few strips of thin metal from an old cyclotron unit. Dr Lawrence was happy to oblige; as far as he was concerned the stuff Segrè sought was mere radioactive trash. He sealed some scraps of the foil in an envelope and mailed it to Segrè’s lab in Sicily. Unbeknownst to Lawrence, Segrè was on a surreptitious scientific errand.
At that time the majority of chemical elements had been isolated and added to the periodic table, yet there was an unsightly hole where an element with 43 protons ought to be. Elements with 42 and 44 protons—molybdenum and ruthenium respectively—had been isolated decades earlier, but element 43 was yet to be seen. Considerable accolades awaited whichever scientist could isolate the elusive element, so chemists worldwide were scanning through tons of ores with their spectroscopes, watching for the anticipated pattern.
Upon receiving Dr Lawrence’s radioactive mail back in Italy, Segrè and his colleague Carlo Perrier subjected the strips of molybdenum foil to a carefully choreographed succession of bunsen burners, salts, chemicals, and acids. The resulting precipitate confirmed their hypothesis: element 43 was the answer. The radiation in Lawrence’s cyclotron had converted a few molybdenum atoms into element 43, and one ten-billionth of a gram of the stuff now sat in the bottom of their beaker. They dubbed their plundered discovery “technetium” for the Greek word technetos, meaning “artificial.” It was considered to be the first element made by man rather than nature, and its “short” half-life—anywhere from a few nanoseconds to a few million years depending on the isotope—is the reason there’s negligible naturally-occurring technetium left on modern Earth.
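To get a rough sense of why that half-life matters, consider the back-of-the-envelope sketch below; the half-life figure it uses (roughly that of technetium-98, the longest-lived isotope) is an assumption for illustration, not a number from Segrè and Perrier's experiment:

```python
# Back-of-the-envelope decay estimate: N(t) = N0 * (1/2) ** (t / half_life).
# The half-life below is an assumed figure (roughly technetium-98, the
# longest-lived isotope) used only to illustrate the scale of the effect.

HALF_LIFE_YEARS = 4.2e6      # ~4.2 million years
AGE_OF_EARTH_YEARS = 4.5e9   # ~4.5 billion years

halvings = AGE_OF_EARTH_YEARS / HALF_LIFE_YEARS
fraction_remaining = 0.5 ** halvings

print(f"half-lives elapsed: {halvings:,.0f}")           # ~1,071
print(f"fraction remaining: {fraction_remaining:.1e}")  # effectively zero
```

In other words, any technetium present when the Earth formed would have halved itself out of existence more than a thousand times over.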
In the years since this discovery scientists have employed increasingly sophisticated apparatuses to bang particles together to create and isolate increasingly heavy never-before-seen elements, an effort which continues even today. Most of the obese nuclei beyond uranium (element 92) are too unstable to stay assembled for more than a moment, to the extent that it makes one wonder why researchers expend such time, effort, and expense to fabricate these fickle fragments of matter. But according to our current understanding of quantum mechanics, if we can pack enough protons and neutrons into these husky nuclei we may encounter something astonishing.
It’s a testament to the strength and versatility of the human brain that anyone with at least half of one tends to assume that their senses give them direct access to objective reality. The truth is less straightforward and much more likely to induce existential crises: the senses do not actually provide the brain with a multifaceted description of the outside world. All that the brain has to work with are imperfect incoming electrical impulses announcing that things are happening. It is then the job of neurons to rapidly interpret these signals as well as they can, and suggest how to react.
This neurological system has done a pretty good job of modelling the world such that the ancestors of modern human beings avoided getting eaten by sabre-toothed tigers before procreating, but the human brain remains relatively easy to fool. Optical illusions, dreams, hallucinations, altered states of consciousness, and the placebo effect are just a handful of familiar cases where what the brain perceives does not correspond to whatever is actually occurring. The formation of a coherent model of the world often relies on imagined components. As it turns out, this pseudo-reality in one’s imagination can be so convincing that it can have unexpected effects on the physical body.
On 10 January 1709, pioneering weather observer William Derham recorded an historic event outside his home near London. He examined his thermometer in the frigid morning air and jotted an entry into his meticulous meteorological log. The prior weeks had been typical for an English winter, but overnight an oppressive cold had lodged itself over the Kingdom. As far as Derham was aware, the mercury in London had never sunk so low as it did that morning: -12°C.
The remarkable cold lingered in Europe for weeks. Lakes, rivers, and the sea froze over, and the soil solidified a meter deep. The cold cracked open trees, crushed the life out of livestock huddling in stables, and made travel a treacherous undertaking. It was the coldest winter of the past 500 years, and one of the coldest moments in a larger global phenomenon known as the Little Ice Age. Likely causes include volcanic activity, oceanic currents, and/or reforestation due to Black-Death-induced population decline. It is nearly certain, however, that it had something to do with the unusually low number of sunspots that appeared at that time, a phenomenon referred to as the Maunder Minimum.
We now know that such solar minima correlate quite closely with colder-than-normal temperatures on Earth, but science has yet to ascertain exactly why. Solar maxima, on the other hand, have historically had little noteworthy impact on the Earth apart from extra-splendid auroral displays. But thanks to our modern, electrified, interconnected society, these previously innocuous events could cause catastrophic economic and social damage in the coming decades.