In 1968, a 3M chemist named Dr. Spencer Silver was attempting to create a super-strong adhesive when instead he accidentally invented a super-weak adhesive, one that could only stick things together temporarily. The seemingly limited application of Silver’s product meant that it sat unused at 3M (then technically known as Minnesota Mining & Manufacturing) for another five years, until, in 1973, a colleague named Art Fry attended one of Silver’s seminars and struck upon the idea that the impermanent glue could be used to stick bookmarks into the pages of his hymnbook. It took another few years for 3M to be convinced both of Fry and Silver’s idea and of the salability of their product, but eventually they came up with a simple design that worked perfectly: a thin film of Silver’s adhesive applied along just one edge of a piece of paper. After a failed test-market push in 1977 as Press ’N Peel, the product went national as the Post-it note in 1980.
Plastic, “the material of a thousand uses,” the blessing and the curse of our modern world, all the conveniences you could wish for with oodles of Monkey’s Paw-style consequences. Substances that could be classified as “plastics” have been in use for thousands of years, from rubber balls made from tree sap in Central America to the protective coating shellac, which is made from the secretions of the lac beetle. They did their jobs, but not, ya know, great. The raw materials were usually difficult to obtain, which meant they were expensive, so the products made from them were rare and accessible only to a select few. They also tended to have short useful lives or were susceptible to temperature changes and humidity. The industrial revolution – you may have heard of it; it was in all the papers – created huge demand for new materials, both natural and synthetic, like celluloid, made of plant cellulose and camphor, and galalith, made of milk protein and formaldehyde.
Then along came Belgian chemist Leo Baekeland. In 1907, with a successful product already under his belt, the first fiber-based photographic paper, he was on a mission to create a replacement for shellac. Making shellac requires going to the trees where swarms of little red lac beetles are laying eggs and sucking up sap until they burst. I’m barely being hyperbolic there; indigenous people call it the feast of death. As the beetles excrete the leavings of the sap, it forms a coating over the entire swarm, which is scraped up, bugs and all, and taken for refining. In addition to obvious things like woodworking, shellac is also used in candy-making, so my vegan, kosher, and halal friends beware.
In his attempts to improve upon nature, Baekeland heated a mixture of formaldehyde and phenol, a waste product of coal processing that will reappear later in the script. Rather than a lacquer, he inadvertently created a polymer that didn’t melt under heat and stress. In 1907 he applied for a patent for his new compound, polyoxybenzylmethylenglycolanhydride, which he humbly named Bakelite, after himself. And yes, I got that in one take. Order your voiceovers today at moxielabouche.com. This new thermosetting plastic went like a house on fire. It could be, and was, used for everything from phones to dinner plates to toys to jewelry. It was also a boon to the emerging automotive and electrical industries, because it was an effective insulator. Apart from being the first proper modern plastic, it was also the first synthetic material to lean into being a synthetic material, rather than trying to look or act like a natural product. It was lightweight, durable, and could be molded into nearly any shape you could think of, in nearly any color or pattern you could think of. It also looked sleek and modern, the Apple aesthetic of its day. Bakelite introduced plastics to the fashion world, to be followed by nylon, polyester, spandex, and more. These plastics have inspired fashion designers to do more with less: more fabric choices, more creativity, and more durability, coupled often with less material, less weight, fewer wrinkles, and less expense. All this to avoid using bug secretions.
Bonus fact: The phrase “Better Living Through Chemistry” is a variant of a DuPont advertising slogan, “Better Things for Better Living… Through Chemistry.” DuPont adopted it in 1935, and it remained their slogan until 1982, when the “Through Chemistry” part was dropped.
If you break your Bakelite bangle (you know, the one you bought at the vintage shop because it was so tacky it was kinda cute), what are you going to stick it back together with? Whether you think it’s crazy or super, you’re going to need glue, a glue with both an accidental discovery and a famous accidentally-discovered use case. Cyanoacrylate glue, or CA glue if you’re pressed for time, or California glue if you’ve been watching the Shop Time woodturning channel on YouTube, was not supposed to be an adhesive of any kind.
Cyanoacrylate was discovered in 1942 by Dr. Harry Coover. Like a lot of people in the early 1940s, Coover was working for the war effort. Coover, a prolific inventor who would eventually hold over 460 patents, was trying to make clear plastic lenses for gun sights. While working with various chemicals, Coover and his team discovered one formulation was extremely sticky. This was not ideal. The chemicals would also polymerize in the presence of moisture, covering anything made with cyanoacrylate in a film, and anything that touched that surface stuck to it immediately, which was even less good for use in the field. Failing to be what it was needed to be, cyanoacrylate was put on a shelf and the team moved on to other options. But CA wasn’t completely forgotten.
Nine years later, now working at Eastman Kodak, Dr. Coover was the head of a team tasked with developing a heat-resistant polymer for jet canopies. One researcher went back to formulas of days past, spread some ethyl cyanoacrylate between a pair of refractometer prisms, and was surprised to find them proper stuck together. This time, being sticky worked in CA’s favor. As testing went on, Coover realized the great potential of an adhesive that would quickly bond to a variety of materials; all it needed to activate was a bit of water, like the moisture in your breath when you blow on it. It took another seven years of tinkering, and I’m assuming market research and other such nonsense, but in 1958, Eastman Kodak launched Super Glue…under the thoroughly underwhelming name “Eastman #910.” Eastman #910 was licensed to Loctite, which re-branded it as “Loctite Quick Set 404” before making its own version called “Super Bonder.” By the 1970s, everybody and their sainted brother was making CA glue, and “super glue” was fast becoming a genericized name.
Hold up, Moxie, you say. Everyone knows it was made during WWII as a battlefield wound closure. Sorry, luv, while CA glue has been and, if you’re my husband, still is used for closing minor wounds (it’s a wonder for paper cuts), CA wasn’t pressed into service as an adhesive until after the war was over. It did see a lot of use in first aid during the Vietnam conflict, or as the Vietnamese call it, the American war. While super glue can help with minor wounds, it can also cause them. If you put CA on cotton or wool, it creates a rapid exothermic chemical reaction that releases enough heat to cause minor burns, so don’t do that, no matter what you see on Five Minute Crafts. On the flipside, if you find yourself in a Bear Grylls/Naked and Afraid/Castaway situation and you just so happen to have a tube of super glue and a cotton ball or wool socks, adding a bunch of glue can generate enough heat to start a fire.
Handy thing, super glue, but if you don’t binge Forensic Files every Friday like I do, you might not know about its most interesting use — solving crimes. Okay, that’s a bit of an exaggeration, but it helps. CA can make fingerprints easier to find, and that talent was found by accident. In the 1970s, workers in a Japanese factory producing CA noticed that there were fingerprints everywhere. Like, everywhere. And they were really distinct. The fumes from the CA react with the ridiculously tiny amount of moisture in fingerprints on a non-porous surface. This not only makes the fingerprints more visible, but also creates a sort of shell over the ridge details to preserve them. How that information got to the Tokyo National Crime Lab, I cannot say, but members of the US Army CID Lab in Japan brought the idea home in the late 1970s. Early experiments used things like fish tanks and large plastic bags as fuming chambers, but today things are much more bespoke. Heat accelerates the process, and in the early days, they used the warming base from a coffee pot.
You might have heard the statistic that the deadliest animal in the world is the mosquito. By one count, mosquito-borne illness has killed half the people who have ever lived. The grand dame of them all is malaria, and for the longest time the best medicine we had for it was quinine, a drug derived from the bark of the cinchona tree in South America. Annoyingly, a lot of listicles of accidental discoveries and inventions include quinine. They annoy me because they act like white colonizers just happened across it, ignoring how indigenous people had been using it to treat fevers, some of which were malaria, for centuries.
In the mid-nineteenth century, the British Empire noticed a bunch of countries that white people weren’t running, so they decided, better fix that. There they encountered understandable resentment, spices they wouldn’t bother using back home, and malaria-carrying mosquitos. The need for quinine was great. They could get some from tonic water, which is how the G&T became the second-most British drink, but on the whole, more quinine was needed than could be imported. They needed to be able to *make quinine, or quineen, for my listeners in the home counties. Enter 18-year-old student William Henry Perkin. Over Easter break in 1856, Perkin made use of the small laboratory in his professor’s house in London to oxidize the organic compound aniline in an attempt to create quinine. As you will have guessed if you’ve caught on to today’s format, and I’m sure you have, my clever little sausage, Perkin didn’t succeed in making quinine. All he got was some black gunk in the bottom of the beaker. When he tried to clean the gunk out with alcohol, the material changed to a purplish hue. Without intending to do so, Perkin had made the world’s first synthetic dye. Blind luck was definitely a factor, but credit must be given to Perkin’s observantness and persistence. With the help of his professor and his brother, he conducted new experiments to perfect his method for making the dye, which he patented in August 1856. It’s been called aniline purple, purple aniline, and Perkin’s mallow; the molecule, science would later discover, is 3-amino-2,9-dimethyl-5-phenyl-7-(p-tolylamino)phenazine acetate. Perkin called it mauveine, but most of us call it mauve.
Perkin also had a great entrepreneurial spirit. He realized that this happy accident could replace purple dyes that hadn’t changed since Roman times, when togas were dyed with Tyrian purple, made from the shells of tens of thousands of eentsy-weentsy sea snails. Market mauveine he would and market mauveine he did. Within a few years, mauve garments were everywhere, especially in fashion hot spots like London and Paris. The color got a major boost when Queen Victoria appeared at the Royal Exhibition of 1862 in a long mauve gown. Other chemists worked out what he’d done, and within five years there were 28 dye factories making mauve, including the German company BASF, which is still around today. You may have seen one of their TV ads — high production values, no clear impression of what they actually do. Meanwhile, the still-young chemist undertook intense research on dyes, inks and paints, and also perfected coumarin, one of the first synthetic perfumes.
By the age of 21, William Henry Perkin was already a millionaire, and at 36 he retired to devote himself exclusively to research in organic chemistry. As Simon Garfield, author of the book “Mauve,” explains, Perkin’s mauve meant a revolution not only in the dye industry, but also in medicine. His work with artificial dyes was essential for Walther Flemming to colour cells and study chromosomes under a microscope. It also helped Robert Koch, winner of the Nobel Prize in Medicine in 1905, discover the bacillus responsible for tuberculosis after dyeing the sputum of a patient. What’s more, the development of Perkin’s synthetic dyes was crucial for the studies of Paul Ehrlich, winner of the Nobel Prize in Medicine in 1908 and a pioneer in chemotherapy research. All that because he cleaned a beaker of a failed experiment with alcohol.
Some discoveries were made while practicing good science in the cause of finding something else. Some were made through an are-you-kidding-me-with-this-nonsense level of lab safety recklessness, like having DeeDee run Dexter’s Lab. The questionable practices at issue belonged to one Constantin Fahlberg, and I will say that, in his defense, the discovery he’s most famous for was made in 1879, when doctors were still debating germ theory, there were no laws against patent medicine, and you could sell warming blankets treated with radium. Fahlberg was working for the H.W. Perot company when he was tasked with testing the purity of a shipment of sugar they’d had impounded by the US government. He tested the sugar and did whatever else was on his checklist for the day, except for one key task — it seems that at no point in the process did he wash his hands.
Fahlberg was home having supper when he noticed his food tasted sweet. Incredibly sweet. At first, he blamed the bread, but he was about a century too early for American bread to be chock-a-block with sugar. Bonus fact: the courts in Ireland have declared that the bread at Subway contains too much sugar to even be classified as bread; it is legally a confection.
Fahlberg came to the conclusion that he must have gotten something on his hands at work that got transferred onto his bread roll. Rather than panicking, inducing vomiting, calling a doctor, or even being concerned to any degree that has been recorded by history, he was over the moon about it. The thing was, his lab safety being what it was, he didn’t know *which of the *many chemicals he’d been working with it could have been. So he went back to the lab and began *tasting all the chemicals on his desk. We’ll call him a runner-up for the Physician, Test Thyself section of the book. Btw, have you ever seen the “can I lick the science” meme? I’ll post it in the FB group and on the reddit, both of which you can use to hang out with your fellow brainiacs; you can reach both through yourbrainonfacts.com/social.
Eventually, and without flat-out poisoning himself, Fahlberg managed to find the sweetness in a beaker filled with sulfobenzoic acid, phosphorus chloride and ammonia. This intimidating-sounding mix had boiled over, creating benzoic sulfinide, a compound Fahlberg was familiar with but had never thought to stick in his mouth, because why would you? Fahlberg rushed to write up a paper with Ira Remsen, the supervisor of the lab, describing the compound and the methods of creating it. Both men were listed as the compound’s creators, but after realising the compound’s massive commercial potential, Fahlberg changed his mind about sharing credit and patented the substance in 1886, listing himself as the sole creator behind…saccharin. What did Remsen think of that? “Fahlberg is a scoundrel. It nauseates me to hear my name mentioned in the same breath with him.”
It’s not clear where the name came from, but Fahlberg’s discovery, the first artificial sweetener, was fairly successful as soon as it hit the market. Its big moment, though, came in WWI, when rationing and sugar shortages threatened our collective sweet tooth. Saccharin was marketed as non-fattening; the body doesn’t actually metabolise saccharin, meaning it has no caloric or nutritional value. Now, let’s talk about cancer. Listeners of a certain age probably remember buying a pack of gum sweetened with saccharin and being surprised to notice a cigarette-pack-style warning box: USE OF THIS PRODUCT MAY BE HAZARDOUS TO YOUR HEALTH. THIS PRODUCT CONTAINS SACCHARIN WHICH HAS BEEN DETERMINED TO CAUSE CANCER IN LABORATORY ANIMALS. It’s almost the same warning as they put on engine cleaner. How is it legal to sell as food?!
This may come as a surprise, considering that starting in the 1970s, and as recently as a little over a decade ago, the widespread belief was that it caused cancer. This was despite the fact that in 1974 the National Academy of Sciences performed a review of all the studies done on saccharin and determined that there was no sound evidence that saccharin was a carcinogen, and that the only studies that claimed to show it was were flawed or otherwise ambiguous in their results.
One particularly flawed study from the 1970s was nearly the final nail in saccharin’s coffin, when the researchers found that saccharin could lead to bladder cancer in rats. This spurred the Saccharin Study and Labeling Act of 1977, which managed to thwart efforts to ban saccharin outright, instead simply getting it a severe warning label: “Use of this product may be hazardous to your health. This product contains saccharin which has been determined to cause cancer in laboratory animals.”
The rats in the study did indeed have a high rate of bladder tumors. However, beyond any potential flaws in methodology, there is the obvious caveat that, while similar in some ways, rodents and humans aren’t exactly the same (shocker), so further studies needed to be done to see if the same thing occurred in humans.
What was happening with the rats is that specific attributes of their urine (high pH, high protein, and high calcium phosphate), combined with the undigested saccharin, were causing microcrystals to form in their bladders. This damaged their bladder lining, which over time led to tumors forming as their bladders continually had to repair themselves.
Once the exact cause of the tumors was determined, exhaustive tests were done to see if the same thing was happening with primates. In the end, the results came up completely negative, with no such microcrystals forming.
Thanks to this, in 2000, saccharin was removed from U.S. National Toxicology Program’s list of substances that might cause cancer. The next year, both the state of California and the U.S. Food and Drug Administration removed it from their list of cancer causing substances. In 2010, the Environmental Protection Agency concurred, stating that “saccharin is no longer considered a potential hazard to human health.”
The 1970s wasn’t the first time this compound came under fire. A much earlier and equally unfounded panic occurred as a result of the Pure Food and Drug Act of 1906 [heard about in Blue Plate Special]. Harvey Wiley, the director of the Bureau of Chemistry at the USDA, considered saccharin inferior to sugar and lobbied hard against it, even going so far as telling President Teddy Roosevelt that “Everyone who ate that sweet corn was deceived. He thought he was eating sugar, when in point of fact he was eating a coal tar product totally devoid of food value and extremely injurious to health.”
While he got the “totally devoid of food value” part correct, the latter “injurious to health” part wasn’t actually backed by any vetted evidence at the time (or since). Roosevelt, who ate saccharin regularly, stated, “Anybody who says saccharin is injurious to health is an idiot.” Needless to say, Wiley soon lost much of his credibility and his job.