Marisa Brook is from the west coast of Canada.
Only one fictional character has ever been honoured with a front-page obituary in the New York Times: Hercule Poirot, one of Agatha Christie’s two recurring detectives. On 06 August 1975, the headline read, “Hercule Poirot Is Dead; Famed Belgian Detective”. Two months later, the last Poirot mystery – Curtain – was released to the public. Christie, whose life was drawing to a close, had written Curtain in the 1940s as Poirot’s last case and locked it away until she realized that she could not write any more of his mysteries. Christie had long been personally burned out on her famous fictional detective; however, due to his popularity, she had refrained from discarding him altogether.
Sir Arthur Conan Doyle – famed creator of Sherlock Holmes – also tired of his own popular protagonist, once writing, “I have had such an overdose of him that I feel towards him as I do towards pâté de foie gras, of which I once ate too much, so that the name of it gives me a sickly feeling to this day.” When Conan Doyle gave in to the temptation to murder the beloved character in 1893, public outrage was so vehement and sustained that the author eventually resumed writing Holmes stories under the pretense that the eccentric detective had merely faked his own death.
A lengthy study of ‘crack babies’ born to cocaine-addicted mothers in Philadelphia in the 1980s and 1990s ended in 2013 with an unexpected result. The average IQ amongst the ‘crack babies’, by then in their early 20s, was 79.0. However, the control group, who were socioeconomically similar but not born to crack addicts, had an average IQ of only 81.9. The cocaine exposure appeared to have had only a small, non-significant detrimental effect on the average cognitive functioning of the children of addicts, yet both groups scored below the average range for the United States (roughly 90 to 110). Further study led the team to a surprising conclusion: both groups had been unable to reach their full intellectual potential due to chronic poverty.
Of the original ‘crack babies’, 110 were followed through to the end of the study. 108 of them are still alive, but only six have graduated from college, with another six working towards doing so. In the meantime, 60 children have been born to them. Whether the next generation will grow up in conditions any better than the ones that held their parents’ cognitive functioning back remains to be seen.
Ancient Israel was renowned for its date palms, which grew in thick forests and reportedly bore delicious fruit. The dates were a staple food for dwellers of the Judaean Desert. Sometime around the year 1300, however, a confluence of catastrophes – agricultural, economic, and climatological – killed many of the trees, and over time the palms became so uncommon that a French explorer in the 16th century doubted that the ancient date trade could ever have been particularly noteworthy. Within a few more centuries, that particular variety of date palm had gone extinct, becoming entirely the stuff of legend.
In the mid-1960s, archaeologists at the clifftop palace of Herod the Great at Masada, Israel, uncovered a 2,000-year-old jar containing seeds. These turned out to be seeds of the long-extinct date palm. In 2005, researchers treated three of these seeds with fertilizer solutions and planted them in pots to see whether they were still capable of germinating. One of the seeds did indeed sprout, yielding a large, healthy date palm that the researchers nicknamed ‘Methuselah’ (not to be confused with the famous bristlecone pine of the same name). Within a decade, Methuselah was almost ten feet tall and producing pollen.
Date palms are either male or female, and only the females can produce fruit. Methuselah, unfortunately, is male. However, scientists speculate that Methuselah could be used to pollinate a female plant of a closely related Egyptian date palm, yielding fruit as soon as the early 2020s – and offering humanity a legendary flavour that has not been tasted in centuries.
The height of Mount Everest was not calculated by George Everest, but by a brilliant mathematician who has since been all but forgotten. Everest himself (who pronounced his name ‘ee-vrist’) had become the Surveyor-General of India in 1830, and by the next year was eagerly seeking a mathematician and topographer for his Great Trigonometric Survey of the area. A local college math teacher sent him the then-19-year-old Radhanath Sikdar. Sikdar was from Bengal and had become semi-notorious as part of the Young Bengal movement of free-spirited nonconformists (expected to enter into an arranged marriage with a young girl, Sikdar had said no and walked away). However, he was also becoming known for his mathematical talents. Under Everest’s direction, Sikdar distinguished himself almost immediately with his technical skill and intellectual creativity, and he would go on to invent a number of new forms of measurement, some of which far outlived him.
Sikdar ended up working for the Survey for more than two decades. Unfortunately, he was often treated unfairly. One edition of a surveyors’ manual left his name off his contributions. On another occasion, when Sikdar spoke up about the Survey taking advantage of some of its employees, he was fined 200 rupees for what the organisation saw as impudence. And because his contributions were so valuable, when Sikdar attempted to change jobs, Everest denied the request on less-than-truthful grounds.
This was not even the final insult for Sikdar. When Everest retired, Sikdar continued his mapping and calculations under successor Andrew Waugh. Sikdar was able to show through his calculations that ‘Peak XV’ was the tallest peak in the world as measured from sea level, and Waugh eventually agreed. Although the Survey had been labelling peaks according to what the local people called them, in this case Waugh decided to break with tradition and name the illustrious peak after…Everest. At least one scholarly society of the time took full note of Sikdar’s accomplishments, but in spite of his brilliant contributions, he has mostly been forgotten.
Many people have experienced the odd psychological sensation that results from repeating a word until it no longer seems to have any meaning. This is a recognized phenomenon in psycholinguistics known as ‘semantic satiation’ or ‘semantic saturation’. When it occurs, the neurons that handle the connection between the pronunciation (or spelling) of the given word and its meaning become so overwhelmed by repeated, emphasized activation that they begin producing inhibitory signals in protest – a response sometimes called ‘reactive inhibition’ – briefly disabling the listener’s ability to connect the word’s form to its meaning.
In the late 1430s and early 1440s, a certain Korean scholar embarked on a massively ambitious project, working almost single-handedly and spurred on largely by personal interest. Although the Korean language had existed for almost 1,500 years, it had never had its own dedicated writing system. Korean writers had long tended to rely on Chinese writing, which was logographic – that is, a system of symbols standing for whole words or concepts. Adapting the Chinese characters to Korean meant borrowing some Chinese symbols for the way they were pronounced, and others for the concepts they conveyed.
This approach had centuries of tradition behind it, but it was not ideal. In particular, Korean had more prefixes, suffixes, and short grammatical words (e.g., prepositions) than Chinese did, and Chinese logographs were not well-suited to capturing these. More practically, learning the thousands of Chinese characters required a good deal of study, which meant that only the most well-educated Koreans could read and write. The Korean scholar in question was determined to bring literacy to the masses. His insight was that they needed an alphabet—that is, a writing system based entirely on pronunciation, and one that required far fewer characters than the logographs.
“What do you know of language and linguistics?” the bold scholar asked of several high-ranking officials who objected to his idea. “This project is for the people, and if I don’t do it, who will?” The scholar was none other than Sejong, the king of Korea, who had held the throne since 1418. His profoundly democratic conviction that literacy ought to be accessible to everyone was revolutionary in every sense. When King Sejong unveiled Hangul—his new alphabet for the Korean language—it was met with vehement opposition from Sejong’s advisors, from the literary elite, and from subsequent monarchs. For these objectors, Hangul was barbaric, it was primitive, it was unnecessary, it was an insult, and it needed to be eliminated.
In the late 1960s, an Italian engineer named Giorgio Rosa oversaw the construction of an artificial platform in the Adriatic Sea about 4,500 square feet in area. On it, he helped establish a restaurant, a nightclub, a bar, and a post office. On 24 June 1968, Rosa declared his platform the Republic of Rose Island (a play on his surname) and proclaimed himself its president. The Italian government was not at all amused; officials saw Rosa as trying to make a profit while evading Italian tax laws. Deciding to shut down Rosa’s enterprise, the government sent soldiers to the platform to seize control of Rose Island. Rosa and his micronational government protested this ‘occupation’, but no one paid them any attention. Once the population of Rose Island had been relocated, the Italian government warned everyone to stand back and then blew the platform to bits. Rosa went on to declare himself president ‘in exile’; however, with his micronation’s homeland obliterated, things aren’t looking good for his cause.
Market researcher James Vicary became well-known for a 1957 study attesting to the efficacy of subliminal advertising. According to his account of the experiment, movie-theatre customers were shown very brief (0.03-second) advertisements for popcorn and soft drinks, and subsequently purchased substantially more of both than attendees who had not been shown the advertisements.
The only problem was that Vicary’s results proved hard to replicate, and Vicary himself claimed that too many of the details of his experiment were confidential to be shared with other researchers. Suspicion grew, and in 1962 Vicary admitted on television that the study had been a “gimmick” backed by only a very small amount of data. A 1992 investigation by another researcher went further, concluding that Vicary had not performed an experiment at all.
In spite of this, the idea persists that advertising below the level of consciousness is powerfully persuasive.
Inés Ramírez Pérez of Rio Talea, Mexico, is one of the only people confirmed to have successfully performed a Caesarean section on herself. She went into labour with her ninth child at the age of 40 in March of 2000 while alone in her cabin; her husband was out drinking, and the nearest midwife was 50 miles away over poor-quality roads. Rio Talea itself had 500 people and a single telephone, but the telephone was too far away for Ramírez to reach. Ramírez was no stranger to childbirth – her eldest child was by then 25 – but her eighth child had died during labour because there had been no way to perform a Caesarean section, and Ramírez was determined not to let the same thing happen again. Although she had no medical training, she decided to deliver her own child by Caesarean.

After twelve hours of excruciating labour pain, Ramírez drank some nearly-100-proof liquor, found a large knife, and stabbed herself in the abdomen. It took her three tries to get an incision started, and it was night, with only a small bulb for light. Nevertheless, she managed to cut a 17-centimetre gash running vertically downwards from the right side of her navel. Blood poured out immediately, and it took Ramírez an hour to reach the uterus, but she stayed alert and delivered a baby boy. With a pair of scissors she cut the umbilical cord; after a brief period of unconsciousness, she bandaged the wound with clothing, then sent one of her older children to get help.
A village health-assistant arrived within a few hours and temporarily sewed the incision shut. Ramírez was transferred to a clinic several kilometres away, then to the closest hospital. She underwent two surgeries in the next week: one to repair damage to her intestines, and another to close the incision site.
It was fortunate for Ramírez that her position during the self-directed surgery had brought her uterus close to the incision site. She was also extremely lucky not to have passed out from pain or shock partway through the operation. Furthermore, despite the enormous open wound and the decidedly non-sterile environment, she improbably avoided infection.
Ramírez and her baby son, Orlando, both survived the surgery. Ramírez was released from hospital after only about a week, and has since made a full recovery; the surgery left a large scar but no problematic side-effects. Her case attracted attention not only from the media but also from the medical community, and was written up in the International Journal of Gynecology and Obstetrics in 2004. As of that year, the knife was still in Ramírez’s kitchen, where she used it to cut fruits and vegetables.