
It’s rare for me to return to a topic, but if it gives me an excuse to run a George Carlin clip, I’m all for it. [sfx]  Plus, I’ve got new information to share. 
Of course, that was recorded before blue corn chips were available at every mega mart and gas station.  It’s strange when you think about how there are nearly no blue foods when blue is such a common color in our world.  Birds and butterflies are blue, flowers are blue, 10% of the 7.8 billion people on earth have blue eyes, the whole dang sky is blue.  Two questions lie before us — why is there no naturally-occurring blue food, and why does artificially-colored blue food taste like a red fruit?


Blue foods are rare in nature, because leaves are green.  Let me explain.  No, there is too much, let me sum up.  The color in plant foods comes from natural pigments. In general, chlorophyll provides green and blue-green; carotenoids provide orange, yellow, red and red-orange; and anthocyanins provide red, purple and various shades of blue.  Plants use green chlorophyll to photosynthesize, to turn sunlight into energy.  Plants also need to get their flowers pollinated and their seeds spread, so many of them evolved to have flowers or fruits in bright warm colors so that they stand out against the green background.  They really “make it pop,” you could say, if you want every graphic designer in a half-mile radius to shudder in disgust.  Blue anthocyanins are chemically less stable than other pigments, so carotenoids tend to dominate them.  You need very precise environmental and chemical conditions to get a dominant blue.  Things could have been very different, though.  Scientists speculate that, before the proliferation of chlorophyll, primitive organisms used a pigment called retinal.  Retinal is purple, meaning forests and lawns would look like Crown Royal bags.  In that case, the opposing color would be yellow.  Without chlorophyll, we might well be asking why there are no yellow foods.


With blue food being rare in nature, blue was not a color that our primitive ancestors associated with nourishment.  In fact, quite the opposite.  Blue was bad.  Blue was mold. Blue would make you sick.  Early agrarians weren’t exactly going to propagate things that reminded them of the time they almost pooped themselves to death.  Diarrhea is one of the leading causes of death throughout history, after all.


Jump forward a few millennia with me to answer our second question: why is raspberry blue, and why are we all just kinda going along with it, especially at a time when we could combine any color with any flavor you could think of?  There are two stages to the answer: how raspberry became paired with blue food dye, and why blueberry specifically did not.  The weak link in the chain of making the flavor match the appearance was the rich rubescent red dye #2.  It was banned from food products as a carcinogen in 1976, but it had been controversial off and on for decades.  It was in one of these periods of controversy in the late 50’s that the Gold Medal company of Cincinnati, OH decided to stop using it.  One company discontinuing a color wouldn’t usually start a trend, but Gold Medal was the maker of the first reliable cotton candy machine, and the supplier of the colored, flavored sugar to go in them.  Gold Medal dropped red #2, but they still had heaps of raspberry flavoring.  For reasons that cannot be substantiated beyond “it was what we had,” Gold Medal opted to marry raspberry flavor to blue food color.  Blue raspberry had a toe-hold in the public consciousness, but it would take a different kind of sweet treat to make it a staple.  When red #2 was banned outright, the manufacturers of Otter Pops, those delightful mouth-lacerating frozen baggies of sugar water, and Icee, the precursor to 7-11’s Slurpee, followed suit and went blue.  Making raspberry blue had the added benefit of making it more visually distinct from things like cherry, strawberry, and watermelon.


Food manufacturers have long understood that brightly colored foods are especially appealing to that segment of the population with no money of their own, but great influence on purchasing decisions — children.  The food doesn’t even need to have a specific flavor, so long as it’s cartoonishly bright.   A beverage trade magazine from the 20’s pointed out that children preferred fuchsia lemonade, even though the only thing different from regular lemonade was the color.  Manufacturers were primed to think bright colors when the baby boom of the post-WWII era brought about another sudden increase, an ice cream boom.  Product appearance was more important than ever as the displays became increasingly crowded.  Blue raspberry left the carnival and moved into people’s homes via their grocer’s freezer case.  One more factor helped cement blue raspberry as a thing, and that was the Fourth of July, American Independence Day, an especially big deal in 1976, it being the country’s 200th birthday and whatnot.  If you’re going to make a jello mold or dish up sherbet to coordinate with your Old Glory napkins and bunting, you’re going to need blue and, like it or not, blue was raspberry.


Let’s say you didn’t like it then or you don’t like it now.  You might be asking, why didn’t they just use the blue dye with blueberry flavor?  The thing of it is, blueberries weren’t really a thing.  They were a thing that existed, of course; they’d been growing wild in the northern corners of the country since time immemorial.  But outside of the Pacific Northwest and New England, and outside of their harvest season in the summer, and maybe a few jars of jam put up for winter, most people weren’t eating them.  Blueberries have only been domesticated for about a century, as of now.  So if you were out grocery shopping in, say, the Midwest in decades past, blueberries weren’t on your radar at all.  In 1939, Americans were eating about 20 million pounds of blueberries.  Most of these were wild-picked and only half were eaten fresh; the other half were frozen or canned.  20 million pounds (about 9,000 metric tonnes) sounds like a lot, and it is a lot, but at the same time, we were eating *46 million pounds of figs.  That’s how low blueberries were in the rankings.  But blueberries were beginning to wriggle their way into mainstream consciousness, through things like the much-covered Fats Domino song Blueberry Hill and a character turning into a blueberry in Roald Dahl’s Charlie and the Chocolate Factory.    Today, we eat 660 million pounds of blueberries annually, a sizeable increase from their days as a fringe flavor.  Bonus fact: another big boost to blueberries came from the Jelly Belly company, who created a blueberry jelly bean to make a red, white and blue display for president Ronald Reagan’s inauguration.


Why all the focus on color, anyway?  If the taste of food is a combination of the work of Messrs. Nose and Tongue, how do the eyes claim to be so important?  They’re already the “windows to the soul,” isn’t that enough for them?  No, they have to predispose our brains with ideas about the food we’re about to eat.  I mean, I’ll concede that the eyes are the fastest and safest way to tell if food is furry, green, sprouting, or has that weird rainbow iridescence like you see in an oily parking lot puddle.   You can take in a lot of information about your food before you even touch it: whether your steak has been properly grilled, whether the broccoli has been cooked to death plus ten minutes, or how much garlic butter sauce your partner just covered the whole pizza in — there’s a limit to everything, Bobby.


Humans begin to associate certain colors with various types of foods from birth, and equate these colors to certain tastes and flavors throughout life.  When we see yellow pudding, for example, we expect vanilla or banana.  A red sports drink should taste vaguely of cherries.  Your eyes are setting up what your tongue will tell your brain.  If, however, that pudding were pistachio or the sports drink were grape, you’d be thrown into a moment of cognitive dissonance as your brain tried to figure out what’s going on.  If the color of a food product does not match our expectations, we may perceive its taste and flavor differently.  In a study published in the Journal of Food Science, researchers found that people confused flavors when a drink did not have the appropriate color.  A cherry-flavored drink manipulated to be orange in color was thought to taste like an orange drink, and a cherry drink manipulated to be green in color was thought to taste like lime.


There is an often-cited study from the early 1970s, where volunteers were served a plate of food in a room with colored lighting.  When they were about half-way done, the light was switched to normal lighting.  It was then that the test subjects saw that their steak was blue and their french fries were green, whereupon many of the subjects lost their appetite and a few even became ill.  This study has been referenced and repeated for decades as a testament to how the color of food affects our perception and our appetite, and a lot of theories are based on it.  For example, fast food restaurants often use red, orange, and yellow in their branding and decorating, because those colors are supposed to stimulate appetite and encourage you to eat faster, so they can get more customers in.  It’s too bad the study doesn’t seem to have actually happened.


Even still, color is, to put it mildly, important in food manufacturing and sale.  Farmed salmon, for example, tend to have grayish flesh, so it’s colored a lovely orangey-pink to match their wild cousins.  Orange juice is made during the citrus harvest season and stored in giant tanks, to be doled out throughout the year.  In addition to having orange flavor added, it also gets a color boost to make it look fresh-squeezed.  Many products are colored to ensure a uniformity of finished product, regardless of the specific shade of the raw ingredients.


The desire to color food is hardly new.  There was a big stink online a few years ago when people realized that Starbucks was using cochineal to make sure their strawberry Frappuccinos were pink enough.  I seemed to be in the minority of people who weren’t surprised by the news, but that’s because I saw a news article a long while back about Jewish women needing to be careful when shopping for lipstick.  Why did Jewish women need to be wary of certain lipsticks?  Because cochineal is made from dried and powdered insects, and insects aren’t kosher.  The cochineal insect was once the go-to red food coloring in America, a pound of which required about 70,000 insects, give or take a few.  It’s safe to eat, but has been gradually phased out because… you know, it’s bugs.


The appreciation for colorful food has been with us since the beginning.  Roman and Egyptian feasts were famous for their visual appeal and extravagance.  The Romans and Phoenicians used saffron to get bright yellow, even though it literally cost more than its weight in gold, and they got purple, an extremely rare and sought-after color, from Tyrian purple, made from the mucus of sea snails.  It would take until the 1850’s to come up with a non-snail substitute.  Think about that next time you’re eating a rainbow unicorn cupcake.


Then, as now, colors were added to make food appear better than it was: juice or even henna dye was added to wine to disguise the fact that it had been watered down, and alum was added to make bread look whiter (white bread being fancier than brown).  Medieval bakers kept up that tradition by adding bone meal or chalk to their flour, in as large a quantity as they could get away with.  This led to the creation of the first law that we know of specifically addressing food color additives.  The 13th century law read, “If any default shall be found in the bread of a baker in the city, the first time, let him be drawn upon a hurdle from the Guildhall to his own house through the great street where there be most people assembled, and through the streets which are most dirty, with the faulty loaf hanging from his neck.”  Keep up your bread badness and you’ll be sent to the pillory, and if that still doesn’t teach you, your shop will be destroyed and you’ll be banned from the town.


The Victorians were mad for food colors.  Colors in general.  That was the industrial revolution, when, long before the phrase “better living through chemistry” was coined, anyone with a few beakers and a little gumption could find a way to make food, fabric, wallpaper, toys, anything you can think of, bright and bold.  And hazardous.  Don’t be fooled into thinking that the bread assize meant there were a lot of consumer-protection laws in place.  Copper salts were used to keep pickles an inviting shade of green.  Iron compounds made red sauce redder, and if you needed orange or brown, there was always iron oxide, aka rust.  In China, they noticed Europeans bought more green tea if it was, like, really green, so they added yellow gypsum and a dye called Prussian blue to make the colors brighter.  The worst offender, as a class, was candy.  Little children lured in by window displays of fabulously colored sweets might be ingesting red candies colored with mercury, yellow candies with lead, green candies with copper and arsenic.


In 1856, William Henry Perkin discovered the first synthetic dye, mauve, while trying to synthesize quinine.  Other chemists created similar dyes, and because these dyes were first produced from by-products of coal processing, they were known as “coal-tar colors.”  Coal-tar colors were cheap to manufacture, which is why they began to replace toxic metals and poisons; federal oversight of color additives wouldn’t start until the 1880s, when the U.S. Department of Agriculture’s (USDA) Bureau of Chemistry began research on the use of colors in food, focusing first on butter and cheese.  Still, manufacturers were using all manner of toxins, irritants, sensitizers, and carcinogens.


It wasn’t until 1906 that the Pure Food and Drugs Act (aka the “Wiley Act”) tried to curb the “manufacture, sale, or transportation of adulterated or misbranded or poisonous or deleterious foods, drugs, medicines, and liquors, and for regulating traffic therein, and for other purposes.”  This was the first time in the US that broad measures were taken against additives that could be shown to be unhealthy.  By 1930, the USDA Bureau of Chemistry, which was responsible for testing products and enforcing the Wiley Act, had been spun off into its own agency, known as the Food and Drug Administration (FDA).  Between the Wiley Act and the FDA’s formation, many synthetic colors were deemed illegal.  By 1938, only 15 synthetic colors were legal for use.  They fell into three categories: those suitable for foods, drugs, and cosmetics; those suitable only for drugs and cosmetics; and those suitable only for cosmetics.  Of those 15, 8 would be banned later as ever-improving science was able to show that they too were hazardous.  Six of the remaining seven are still in use today: FD&C Blues No. 1 and 2, Green No. 3, Red No. 3, and Yellows No. 5 and 6.  And everything has been perfectly fine and safe ever since.  Apart from those incidents in 1950 when Halloween candies made with Orange #1 made several children sick.  In the 1970s, Red #2 was banned after tests showed a link to intestinal tumors.   When you think of red food dye causing problems, you might be put in mind of the widespread belief that it causes hyperactivity in children.  It’s not just an urban legend.  In recent years, six food dyes were identified as having a possible link to hyperactivity in children, including Red #40.  The E.U. now requires that food containing these additives come with warning labels.
Rather than mess up their carefully-designed packaging with a scary warning, companies like Kellogg’s and Kraft have adopted natural colorings… for the European market.  Apparently, ‘Murica can have all the red dye it wants.


When Kermit the Frog sang It’s Not Easy Being Green, he had no idea how hard it could be simply being *near green, at least in the Victorian age, and that’s thanks to one man in particular, Swedish chemist Carl Wilhelm Scheele.  Scheele could have been one of the most important names in chemistry, having discovered oxygen, molybdenum, chlorine and several organic acids, from citric to hydrofluoric, but someone else would inevitably get the credit.  The only thing he was left to claim was the green pigment copper arsenite, made by heating sodium carbonate and mixing in arsenious oxide and copper sulfate.  Being both an irresistible vibrant shade of green and cheap to manufacture, it was an immediate success.  It should go without saying that arsenic is bad news bears; it causes skin lesions, vomiting, diarrhea, and in some cases, cancer.   And, thanks in no small part to Scheele, the 1800’s were lousy with arsenic.  Called cupric green or Scheele’s green, his pigment was used in wallpapers, in dyeing cotton and linen for clothes, in paints, on toys and even in food, for fancy things like cake decorations and petits fours.  Sometimes, a baker would use arsenic without even knowing it, as it wasn’t unheard of for millers to stretch flour with substances that contained arsenic.


While we’re here today to talk about food, it’s the green wallpaper that may be the most historically significant.  Scheele’s green wallpaper may have killed Napoleon Bonaparte.  Tiny particles of the pigment tended to flake off and become airborne, to then be absorbed by the lungs, or the flakes might be eaten by toddlers or animals.  Also, when the wallpaper became damp, it would release poisonous arsine gas.  After having his derriere handed to him by the Duke of Wellington, Napoleon was exiled to the hot, humid island of St. Helena in 1815.  He was hardly marooned, but had lavish accommodations, including a room painted his favorite color – green.  Six years later he died of what was most likely stomach cancer, though analysis of samples of his hair has shown significant amounts of arsenic had been in his body for a prolonged period.  In 1903, the U.K. did pass legislation on arsenic levels in food and drink.  The law said nowt about wallpaper and paint, because Scheele’s green had thankfully fallen out of fashion and no one was decorating with it anymore.


Crystal Pepsi

Crystal Pepsi is far from the only product whose unusual color flopped with consumers; Heinz’s EZ Squirt ketchup, launched in green in 2000 and later joined by purple and other shades, was a novelty hit that fizzled out within a few years.


When David Novak, chief operating officer for Pepsi, pitched the idea for a crystal-clear variant of their flagship soft drink, other folks around the boardroom table were skeptical.  History –a rather short sliver of history, in fact– would prove them right.  The company had put huge amounts of money into advertising.  The name Crystal Pepsi was on every tongue.  The actual product, not so much.  Why did it fail?  Just as most things in life are not black and white but fall somewhere along a spectrum of grey, so do most things that happen have multiple causes.


Let’s set the scene.  [Crystal Pepsi commercial] No, earlier than that. [rewind sfx, classic commercial]  No, farther.  Hold the button down, I’ll tell you when to stop. [rewind, old-timey music]  There we are.  Clear cola, as a concept, was set up for failure from the start, all the way back to the advertising of the very first sodas.  As you know from eps # , sodas were originally medicinal tonics.  From Pepsi’s creation in [year] and Coke’s in [year], consumers have been conditioned to associate the flavor of cola with the color brown.  That’s actually thanks to caramel color, because kola nut extract is green, but that’s neither here nor there.  As sodas moved out of medicine and into junk food and new flavors were introduced, clear or pale-colored sodas were almost universally citrus-flavored.  That’s the way it was and we liked it, we loved it.


Big red X next to “looks like cola.”  Put an even bigger X next to the line “tastes like cola.”  Early in the product’s life, bottlers complained to Novak that Crystal Pepsi didn’t taste like Pepsi.  They worried it would put people off, but Novak wouldn’t hear it.  There was a big push in the market for things that were quote-unquote pure.  Remember that clear shampoo in the weird trapezoidal bottle from the late 80’s?  It’s not really germane to the topic, but if you know what I’m talking about and actually remember the name, hop on soc med.


Another issue was that Crystal Pepsi was an answer to a question no one asked.  Successful products tend to be those that resolve some pain point for the consumer, though as you might have read on page of the YBOF, sometimes all you have to do is convince people they have a problem before you sell them the solution.  Pepsi didn’t do any of that.  The marketing touted the amazing awesome coolness of Crystal Pepsi, but never once told consumers why they should want it.  Another key oversight: by and large, consumers didn’t want it.  People were beginning to move away from sugary sodas to healthier –or at least less unhealthy– alternatives, like bottled water.  Part of Pepsi’s plan seemed to be to let consumers *think the drink was healthier, but you needed only turn the bottle around to the nutrition panel to see that it wasn’t.  A 12-ounce can of Crystal Pepsi –and what is the point of putting it in an opaque container if the appearance is the main selling point?– had only 13% fewer calories than regular Pepsi.  The only real distinction was that Crystal Pepsi was caffeine-free.


While Pepsi was shelling out big marketing bucks, like licensing the hit Van Hagar single “Right Now” for TV commercials, which are expensive already, their ever-present rival Coca-Cola was watching.  Coke came up with a rather ingenious way to cut the legs out from under Pepsi.  Rather than the time-honored approach of “we’re great; they suck” marketing, Coke opted for a “we both suck” strategy.  Here’s what I mean.  Coke began selling crystal-clear Tab.  For those persons of a certain age, Tab was Coca-Cola’s first diet soda, debuting in 1963 (19 years before Diet Coke) and hanging in there until, astonishingly, last year.  I say astonishingly, because it was not regarded as tasting all that great, there were fears that the saccharin it was sweetened with would give you cancer, and never in my 41 years have I seen someone purchase, consume, or even be in possession of a Tab.  Coke made a spin-off of this limping turkey of a product that was just like Pepsi’s hot new thing.  But this wasn’t playing catch-up.  It was an astute understanding of consumers’ minds.  Tab Clear wasn’t good, but it didn’t have to be.  In fact, the badness worked in their favor.  When people began seeing clear Tab, they associated the lack of color with the soda being diet.  By extension, that meant all clear colas were diet, and diet sodas have never done as well as the full-sugar originals.  Once the damage to Pepsi was done, Coke discontinued Tab Clear.


When Crystal Pepsi first appeared on shelves, it did alright, thanks in no small part to curiosity, but once people tried one, most didn’t go back for a second.  At the height of its popularity, Crystal Pepsi had managed to claw its way to a market share of one half of one percent.  Even still, Novak maintains that it was “the best idea I may have ever had in my career.”  Crystal Pepsi saw two re-releases, one in 2016, prompted by nostalgic die-hard fan support (probably the same people that got Surge relaunched), and another in 2017, when interest spiked again after a certain video went viral.  That video?  A competitive eater drank a 20+ year old bottle of original Crystal Pepsi and promptly threw up.