Saturday, March 7, 2015

A final Anniversary Worth Celebrating, and I can’t think of a better one. A year ago, I was a Georgia-to-California transplant trying to balance school, work, and the ever-tricky third space where human connections are supposed to happen. Between all of this, I paradoxically had both too little free time and too much. If I had thought too much about it, I might never have asked out a girl who was a year ahead of me in my graduate program. The demands of a working student’s life could have easily paralyzed any attempt to add more.
But, you guys…seriously. She could talk about best practices in multi-tiered school service delivery models, and actually make you listen. She had a laugh and a smile I couldn’t stop thinking about (and I spent a few days trying). So I went for it. I pulled out every bad joke I could think of to ask her out. She wasn’t scared away.
And so, one year ago on March 7, 2014, I picked up Sarah and we drove off into the murky mists of Getting to Know Each Other. Statistically speaking, it wasn’t likely to work. (Most first dates are monuments to unwarranted optimism.) We went to Disneyland, which is setting the bar pretty damn high out of the gate anyway. If this actually worked, what was I supposed to do for an encore? With no justification, I scared her on the Haunted Mansion. With full justification, she punched me in the arm. And I knew she had some fight, and I knew there was more to find out.
I asked her out a second time. She made me wait three days for a reply. (She claims she was busy. I’m still not sure.) But friends, when my entreaty for a repeat engagement was granted, no bard of old ever saw the clouds part so beautifully. (I still smile walking past the campus fountain where I read her reply.)
One year from that fateful evening at Disneyland, I’ve learned a lot more about Mexican food (the real kind), and she’s heard a lot more bad jokes. I didn’t have time for this. Neither did she. But we found a year, and counting.
I didn’t plan this anniversary right at the end of my little anniversary series. (That would have been a little weird.) It just worked out that way. But it’s a great reminder that Anniversaries Worth Celebrating don’t have to just live in history books. New Anniversaries Worth Celebrating can happen anytime. Go out and make some.

Friday, March 6, 2015

AWC is going meta today, celebrating the anniversary of...celebrating anniversaries. Forgive the self-indulgence, but this series took its first trembling steps two years ago on March 6, 2013. The first post was an ode to the humble Oreo, which Nabisco claims to have developed at its factory in Manhattan’s Chelsea neighborhood 103 years ago on March 6, 1912. The Oreo was actually an imitation of the older Hydrox cookie, but after its introduction a funny thing happened. Nabisco did such a good job copying Hydrox that it ended up making a better cookie, and Hydrox came to be seen as the cheap knockoff.
If you’re celebrating a birthday or anniversary, Oreos aren’t a bad choice to have on hand. You can make a cake with them, or skip the middleman and just try the Birthday Cake flavored variants, one of dozens of flavors Nabisco has introduced over the years. (Others have included Red Velvet, Pumpkin Spice, Peanut Butter, Gingerbread, and Watermelon, because sure, why not?)
Since starting with cookies, this series has spent two years traveling far and wide to find anniversaries worth celebrating. (Your humble correspondent took a year-long break starting on June 2, 2013 and picked up right where we left off on the same date last summer.) In that time, we’ve covered everything from sliced bread to Superman. We’ve gone from the banks of the Halys River in modern-day Turkey in 585 BC (the earliest date on our anniversary quest) to the steps of the U.S. Supreme Court in 2013 (the most recent). The 1960s seemed to be AWC’s favorite decade. With its mix of milestones in civil rights and space exploration, along with great music, pro football, and heart surgery, the ‘60s accounted for 34 posts – nearly one in 10 of the total.
Recurring characters on our journey included Abraham Lincoln, Teddy Roosevelt, Thomas Edison, Albert Einstein, Walt Disney, and Elvis Presley. And then there was baseball, a game with a past that makes it irresistible as a prism for history. It showed up at least 17 times in posts that ranged from on-field achievements, to the Hall of Fame in Cooperstown, to “Casey at the Bat.” All of these choices reflect the biases of one person, namely me. But hopefully in a year’s worth of milestones in sports, entertainment, politics, science, and whatever weird stories grabbed my fancy, you found more than a few items to brighten up your day. This series had one simple goal from the beginning: Find something that a person of good will could celebrate on every day of the year. I think we succeeded.
If it sounds like we’re about to turn out the lights on AWC, that’s true. But there’s still one more piece of business to tend to before we lock the doors. Tune in tomorrow…
http://static7.businessinsider.com/image/4f514a78eab8eac10b000091/oreo-bday.jpg

Thursday, March 5, 2015

AWC is wrapping up its trip around the calendar with a tale of the American frontier, where life was tough 179 years ago. With no 911 service, isolated homesteaders had to defend their own territory. A hunting rifle could serve as a defensive weapon in a pinch, but the time needed to reload between shots meant you’d better aim well the first time. But Samuel Colt started a revolution in the firearms industry when he formed the Patent Arms Manufacturing Company and made the first production model of the .34 caliber revolver 179 years ago on March 5, 1836. While Colt didn’t invent the revolver, he would go on to popularize it.

Handguns are bound to trigger controversy as a topic of celebration, and in an urbanized country with an established police force, a case can be made that wide access to firearms does more harm than good. But things were different in 1836, particularly west of the Mississippi. Families often had to fend for themselves, and a reliable gun was a fact of life. With the Colt revolver (Colt’s patent gave him exclusive production rights until 1857), a major shortcoming of the rifle and single-shot pistol was overcome. The revolving cylinder rotated the next round into place as the hammer was cocked, allowing the gunman (or woman) to fire again without stopping to reload.

With the ability to get off multiple shots in quick succession from a compact weapon that traveled in a holster, settlers stood better odds against cattle thieves or Indian raids. The six-shooter could also be used for hunting, although the short barrel and lack of a sight made it less suitable than a hunting rifle for taking down game. (Colt’s company seemed to realize the value of the rifle-handgun combo. The Colt Frontier Six-Shooter was released in 1877, and was designed to fire the same .44-40 cartridges used in the popular Winchester rifle, allowing settlers to carry one type of ammunition for both guns.)

The downside of this story is that the phrase “arms race” exists for a reason. If the good guys have better weapons, it’s a matter of seconds until the bad guys have them too. Colt’s handguns were popular with criminals as well as ranchers, and with Native Americans as well as cowboys. (Who qualified as the good guys or the bad guys in that scenario is still being debated.) It’s fair to say that Colt was probably a mercenary who didn’t particularly care where his weapons ended up. (He approached both North and South to do business during the Civil War, and was said to believe that even bad publicity was fine, as long as his guns were referenced by name.)

But in lawless frontier towns where everyone carried, the revolver was seen as largely a force for good. One popular adage said that “God made men, but Samuel Colt made them equal.” (Of course, that could have been some of Colt’s famous marketing.) The Colt Single Action Army model was nicknamed “the Peacemaker,” and is so iconic that it’s become known as “the gun that won the West.” (That’s not just marketing, since Colt was long dead by the time the West was tamed.)

Regardless of Colt’s character, his innovation was crucial in allowing the boldest (or most desperate) Americans to keep pushing the outer reaches of the Western frontier. At a time when a home security system meant an alert dog and a good aim, a Colt six-shooter could make all the difference. Plus, without them, all the cowboys in those Westerns would be toting rifles on horseback…which wouldn’t look nearly as cool.

With 366 posts, AWC has hit every spot on the calendar (including February 29). We’re just about ready to turn out the lights on this series, but we’ll tend to a little more business first over the next few days. Stay tuned.
http://www.peacemakerspecialists.com/art/newgun1st.jpg

Wednesday, March 4, 2015

Ernest Hemingway might have picked up something from generations of anglers, because for his most impressive piece of work, he chose a fish story. By the 1950s, Hemingway had earned renown for novels like “The Sun Also Rises,” “A Farewell to Arms,” and “For Whom the Bell Tolls.” But since 1940’s “Bell,” Hemingway had been mostly quiet on the literary front. The Illinois native had sunk into depression during the 1940s, as friends and fellow writers – expatriated Americans he had befriended while living in Paris in the 1920s, like Gertrude Stein and F. Scott Fitzgerald – died around him. An attempted comeback with 1950’s “Across the River and into the Trees” was a dud.
In response, Hemingway churned out the draft of another book over an eight-week period in 1951. He called it “The Old Man and the Sea,” and is believed to have finished it 63 years ago on March 4, 1952. At least, that’s the date he sent off a message to his publisher about the book, in which he called it “the best I can write ever for all of my life.” Readers and critics agreed. The book made Hemingway an international celebrity when it was released in 1952, winning a Pulitzer Prize for Fiction in 1953. Hemingway himself won the Nobel Prize for Literature in 1954, and “The Old Man and the Sea” was cited as part of the reason.
True to Hemingway’s style, the book avoids overwrought language as it describes the Cuban fisherman Santiago and his epic battle with a giant marlin in the Gulf Stream. Hemingway was known for a spare writing style, perhaps following an observation by Henry James that World War I had “used up words.” (Hemingway had seen action on the Italian Front before being seriously injured, and explored the idea that grand words rang hollow after the war in 1929’s “A Farewell to Arms.”)
While the story of Santiago and the marlin seems simple, it has been interpreted as rife with symbolism. (Hemingway said a good writer should leave much of what he knows unsaid…similar to an iceberg, which keeps most of its mass submerged.) In its glorification of struggle, the book has been called a biblical allegory, with allusions to the Crucifixion, as well as an oblique commentary on Hemingway’s entire literary career. And all of this could be true…or it could just be that, under the right conditions, everything you need to know about life can be summed up in a fishing trip.
http://molempire.com/wp-content/uploads/2011/08/The-Old-Man-and-the-Sea-by-Ernest-Hemingway-Book-Cover.jpg

Tuesday, March 3, 2015

For such a loud party, America’s Mardi Gras tradition got a quiet start. Or so the story goes, anyway. Back in 1699, North America was basically a giant pizza that the European powers were each trying to grab a slice of before it disappeared. French king Louis XIV had grabbed a rather huge slice right in the middle and named it after himself (Louisiane), which was the kingly equivalent of writing your name on your underwear’s elastic band. Spain and Britain were digging into the pizza box too, and Louis was getting nervous that France might lose its hold in the New World. So he sent a pair of explorers to firm up France’s hold on Louisiane.
According to the story, the two explorers traveled up the Mississippi River from the Gulf Coast, finally stopping to make camp about 60 miles downriver from present-day New Orleans. It was Fat Tuesday, the final day of feasting before the penitential season of Lent began the countdown to Easter. The French Catholics noted the holiday by naming their campsite with the French phrase for Fat Tuesday, calling it “Point du Mardi Gras” 316 years ago on March 3, 1699.
Plenty has changed since then. Mardi Gras still marks one last day of fatty foods before taking up the Lenten fast, but the celebration extends well beyond a single day in present-day New Orleans, covering a Carnival season that picks up on the Catholic feast day of the Epiphany on January 6 and runs until the day before Ash Wednesday. (Because Easter is a moving target, so is the 46-day Lenten season preceding it. As a result, Fat Tuesday can fall anywhere from February 3 to March 9.)
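If you want to check that February 3-to-March 9 window yourself, here’s a quick Python sketch. (The helper names are mine; the Easter date comes from the well-known “anonymous Gregorian” computus, and Fat Tuesday falls 47 days before Easter.)

```python
from datetime import date, timedelta

def easter(year):
    """Gregorian Easter Sunday via the anonymous (Meeus/Jones/Butcher) computus."""
    a = year % 19
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return date(year, month, day + 1)

def fat_tuesday(year):
    # The 46 days from Ash Wednesday through Holy Saturday, plus one:
    # Fat Tuesday is 47 days before Easter.
    return easter(year) - timedelta(days=47)

print(fat_tuesday(2015))  # 2015-02-17, the day before the Feb. 18 Ash Wednesday
```

Plug in the earliest possible Easter (March 22) and the latest (April 25), and you land on the February 3 and March 9 extremes mentioned above.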
Another big difference: The original meaning of Mardi Gras, and the Carnival season, was to allow a season of indulgence and partying before the somber obligations demanded by Lent kicked in. Excess, then sacrifice. One suspects that the French Quarter is full of Mardi Gras fans who have mostly abandoned the second half of this equation.
But whether as one half of a cycle, or just as a party for its own sake, Mardi Gras has lived on in America…and not just in New Orleans. Traditionally French cities along the Gulf Coast, including Pensacola, Mobile, and Biloxi, have strong Mardi Gras traditions. (Mobile's organized Mardi Gras tradition dates back to 1703, predating even New Orleans'.) Lent got an early start this year, with Ash Wednesday falling on February 18, so if this post has you wanting to join in the fun, you'll need to wait until next year. But odds are you’ll still find a party in New Orleans, regardless of when you go. Just don’t come back to work wearing beads. That’s an awkward conversation to have with your boss.
http://darkroom.baltimoresun.com/wp-content/uploads/2013/02/REU-USA-61.jpg

Monday, March 2, 2015

On Month Three and Day Two, what a surprisal! The first day on Earth for Theodor Seuss Geisel! ‘Twas 111 years ago (no less or more) when he was born on March 2, of 1904. He began life in Springfield. (The state? Massachusetts!) His name became famous, instead of a “Whozits?” He signed college art “Seuss”…to hide his true name. (He’d been caught drinking gin – in those days quite a shame!)
He traveled to England, to learn English lit. But advice from Helen Palmer caused him to quit. He went back to drawing, his English hopes buried. With success in New York, he and Helen were married. He drew advertisements, and the money was fine. But his life’s truest calling was still down the line. In 1936 (with a Europe trip planned) he took a sea voyage, with time on his hands. The book that he wrote there would prove rather neat: “And to Think That I Saw It on Mulberry Street.” It was based on a real street in his New England hometown. The sales were so-so, but he would soon find renown. Have I left something out? Is my memory loose? Oh yes! This was the first book he signed “Dr. Seuss.”
Then came World War II, and Geisel stayed busy…with political cartoons. (Politics?? Are you dizzy?!) He stood against Hitler, and took his stance firm. (Even old Charles Lindbergh couldn’t cause him to squirm.) He joined up with the Army, and made films for the war…and after we’d whipped ‘em, turned to kids’ books once more.
“The Cat in the Hat.” “If I Ran the Zoo.” “Green Eggs and Ham.” “Horton Hears a Who.” He wrote ‘em and drew ‘em, a famous one-two. (The “Cat” book taught reading, but the kids never knew.) He wrote more than 60, and sold a whole CLEFTUS!* Then in 1991, he lay down and left us.
His art and his writing, folks haven’t forgotten. As long as kids read, they’ll stay fresh (and not rotten). One lesson he stood on: To yourself be true. (No matter the cost, only you can be you.) It’s a lesson quite fitting, for this part’s the truth-iest: Out of all of the Seusses, Dr. Seuss was the Seuss-iest.
* 1 cleftus = 14 hoodzinks (or about 600 million copies)
http://www.davidpaulkirkpatrick.com/wp-content/uploads/2013/07/Dr.-Seuss_US_stamp.jpg

Sunday, March 1, 2015

The AWC train is pulling into the station, with one more week of anniversaries worth celebrating. John F. Kennedy established the Peace Corps with an executive order 54 years ago on March 1, 1961. It fulfilled a goal he had expressed as a member of Congress a decade earlier, and campaigned on during his run for president in 1960.
The idea was to send recent college graduates into the developing world, helping locals learn basic skills of self-sufficiency and rebutting negative stereotypes of Americans as imperialists. Kennedy’s vision of the Peace Corps as an engine of cultural exchange and international cooperation was also found in his inaugural address. His famous line to Americans – “Ask not what your country can do for you, ask what you can do for your country” – was followed by this remark addressed to all world citizens: “Ask not what America will do for you, but what together we can do for the freedom of man.”
From an honest perspective, the Peace Corps can be seen as a way to address the world’s problems on the cheap. It costs the country relatively little to send young volunteers into villages. (On the flip side, these emissaries have relatively little experience and training, as some critics of the Peace Corps have noted.) The Peace Corps’ 2014 budget of $379 million might sound like a lot, but in truth it represented about 1% of 1% of the government’s annual budget. The Peace Corps hasn’t ended any world problems with that level of resources, but it has contributed to efforts to fight malaria in Africa, and to health, education, and nutrition programs worldwide.
In the end, the Peace Corps represents a vision of what the world could be. (This could be seen as covering for a lack of concrete results, and critics have charged the organization with exporting “emotionalism” above all.) Maybe the most important question to ask is whether we’re a better country with a Peace Corps than without one. An author of a critical report about the organization in 1986 concluded, “The Peace Corps is the epitome of Kennedy’s Camelot mythology. It is a tall order to expect a small program appended to an immense superpower, to make a difference, but it is a goal worth striving for.”
http://s.hswstatic.com/gif/peace-corps-9.jpg

Saturday, February 28, 2015

BONUS AWC: It’s not a leap year, but this series leaves no date behind in its quest to find an anniversary worth celebrating in every spot on the calendar. While February 29 remains in hiding for another year, we’ll take a moment to celebrate an event that happened on the rarest date of all.
For Hattie McDaniel, plenty of moments surrounding her performance in “Gone with the Wind” were not worth celebrating. When the movie debuted in Atlanta in December 1939, she was barred – along with all the film’s black actors – from the event due to Georgia’s segregation laws. Even on the night when she attended the Academy Awards ceremony in Los Angeles, she had to sit at a separate table from the rest of the cast and crew. But despite all that, she took home an Oscar for Best Supporting Actress for her Mammy role, making her the first black Oscar recipient in Academy Awards history. It happened 75 years ago (more or less) on February 29, 1940.
McDaniel’s achievement came on a banner night for “Gone with the Wind,” which took home 10 Oscars, including Best Picture. While a triumph of production, the movie is not the best vehicle for racial progress. It celebrates the antebellum South, with a black housemaid whose name has become shorthand for a brand of female racial stereotype. The Mammy performance is hard for me to watch today…but it’s also easy to see how it might have seemed progressive in 1939 for audiences to watch McDaniel's character deliver sassy tough love to a spoiled white debutante. For her part, McDaniel – who had been a washroom attendant and waitress after the stock market crash of 1929 – seemed unapologetic about taking on roles like Mammy, which were the only acting jobs she could get. She supposedly responded to criticism from liberal groups by saying "Why should I complain about making $700 a week playing a maid? If I didn't, I'd be making $7 a week being one.”
Upon accepting her Oscar, McDaniel called it one of the happiest moments of her life in a gracious speech, during which she said “I sincerely hope I shall always be a credit to my race and to the motion picture industry.” She was both. In her Los Angeles neighborhood, she helped organize black homeowners who were sued by their neighbors in an attempt to keep the neighborhood white. (A judge threw out the case, allowing McDaniel and the other families to keep their homes.) She has two stars on the Hollywood Walk of Fame, recognizing her contributions to both radio and movies. She died in 1952 from breast cancer, at just 57. The whereabouts of her Oscar are unknown.
http://www.eurweb.com/wp-content/uploads/2012/02/hattie-mcdaniel.jpg

The story goes that Francis Crick walked into a pub near the Cambridge University lab where he and James Watson had been working and interrupted all the patrons’ lunches for a rather momentous announcement 62 years ago on February 28, 1953…he and Watson had “found the secret of life.” If the anecdote is true, Crick wasn’t too far off. By discovering the molecular structure of DNA, he and Watson had indeed made a discovery worth looking up from your bangers and mash for.
Deoxyribonucleic acid, as the smart kids call it, had been on scientists’ radar for decades before Watson and Crick. DNA had been identified by the late 19th century as a microscopic substance, made up of nucleic acid, found in the nuclei of cells. Experiments in the 20th century suggested, and then confirmed, its role as an engine of hereditary traits.
What was still unknown was how the structure of the DNA molecule allowed it to pass traits through generations of living organisms. By isolating the now-famous double helix structure of DNA, the American Watson and the Englishman Crick helped birth the entire field of molecular biology. Each DNA molecule contains a spiral of two DNA strands, which in turn are composed of nucleotides containing one of four bases…the ordering and pairing of which carry all the genetic information needed to code a life.
The genetic revolution has touched every segment of modern society, impacting medicine, agriculture, forensics, and archaeology. Even if the technical details escape us, we intuit the meaning of Watson and Crick’s work anytime we say something is so central to our being that it’s “in our DNA.” Watson and Crick would formally announce their discovery in Nature magazine in April 1953. But for a few unsuspecting patrons at Cambridge’s Eagle pub on this date, a sneak peek at the future arrived along with their steak and kidney pudding. Who says you can’t find the meaning of life on a barstool?
http://rack.1.mshcdn.com/media/ZgkyMDEzLzA2LzEzLzU0L2RuYS40NjE1MC5qcGcKcAl0aHVtYgkxMjAweDYyNyMKZQlqcGc/54ec3455/275/dna.jpg

Friday, February 27, 2015

Largely forgotten today, the events of 155 years ago might have been singularly responsible for propelling Abraham Lincoln to the White House. Harnessing the dominant communication media of his time, Lincoln used a speech and a photograph on February 27, 1860 to emerge into the national awareness in a new and dramatic way.

Lincoln spent the day at Cooper Union, a private college in Manhattan’s East Village, where he delivered a 7,000-word address carefully detailing his position on slavery. Lincoln labored at length to justify his stance against its expansion into Western territories that were preparing for statehood, aligning himself with the Founding Fathers. The speech has been overshadowed since, for good reasons: Lincoln was making a tepid, conservative case against slavery’s expansion, far less inspiring than the abolitionist stances he ended up taking as a result of the Civil War.

But if modern audiences find little of interest in the speech, Lincoln’s intended audience was deeply impressed. Lincoln was a man from the frontier known for telling jokes. He had made a name for himself in Illinois by debating Stephen Douglas during the Senate campaign of 1858. But Lincoln’s careful argumentation on this stage – “informed by history, suffused with moral certainty, and marked by lawyerly precision,” one Lincoln scholar said – helped transform him from a regional curiosity into a national leader. It didn’t hurt that the speech was given in New York, the nation’s media capital. The speech was reprinted in newspapers and pamphlets across the North, and has been seen as the crucial piece in helping Lincoln gain the young Republican Party’s 1860 presidential nomination that May.

A few hours before the speech, Lincoln made another stop – at the photography studio of Mathew Brady, remembered today for his stirring photographs of Civil War battlefields. Lincoln called on Brady for a portrait. Photography was still a young medium, and Brady’s picture of a beardless Lincoln provided the perfect visual counterpoint for Lincoln’s successful speech. After he won the nomination, Harper’s Weekly used the portrait to make a triumphant front-page engraving of the nominee (erroneously captioned as the “Hon. Abram Lincoln”). In 1860, simply seeing a candidate look presidential had value in itself…particularly for Lincoln, whose opponents described him as so gangly and thin that a humanizing picture was worth a thousand words. Or as Lincoln said later, the picture allowed him to display a “human aspect and dignified bearing” to offset his opponents’ descriptions of him. (Could simply proving you look human be enough to swing an election?)

The tides of politics are tricky to discern. Lincoln towers over our national consciousness today, but the fact is he had to win election in a system just as dominated by its channels of communication as ours is today. Could a speech and a photograph really swing an election? Lincoln seemed to think so, when he was quoted as saying “Brady and the Cooper Institute made me president.” Of course, knowing Lincoln, this could have been intended as a wry comment on the outsize importance one day’s events could have on a whole campaign. But whether it was said earnestly or facetiously, there was certainly some truth to it.
http://www.printsoldandrare.com/lincoln/103linc.jpg

Thursday, February 26, 2015

As an animator in the 1930s and 1940s, Tex Avery basically had two choices: Emulate the formula that had allowed Disney to dominate the young medium, with cute animals and doe-eyed kids…or go in the complete opposite direction. He chose Plan B.

Frederick Bean Avery was born 107 years ago on February 26, 1908, in Taylor, Texas…which easily explains his more famous nickname. Explaining his unique outlook on the animated short, and what it could do, is not as simple. One theory points to a day when the young animator was horsing around with some coworkers at Walter Lantz’s studio in the early ‘30s. A thumbtack flew into his left eye, costing him the sight in it. Avery’s lack of depth perception could help explain the crazy proportions and physics he brought to his characters, taking the rubbery “squash and stretch” nature of cartoons to extreme places.

Avery’s most famous work came after leaving Lantz, when he moved to Warner Bros. and later to MGM. At Warner, he was assigned to his own production unit and oversaw a team of animators while developing the studio’s “Looney Tunes” series. Staffed with animators like Bob Clampett and Chuck Jones, the Avery unit pushed the “Looney Tunes” shorts to the top of the field, elbowing Disney’s tamer fare out of the way. (Is it any coincidence that Disney seemed to lose interest in the short and turned to feature-length animation in the ‘30s?) The Warner shorts under Avery brought a crazy style of action, and developed the studio’s top-flight characters. First, Avery’s crew took a pre-existing (and rather bland) Porky Pig, and made him a genuine star. Then they developed wackier foils for the studio’s straight characters, like Daffy Duck and Bugs Bunny – both of whom started out in much more manic forms. (The phrase “What’s up, doc?” had supposedly been popular around Avery’s North Dallas High School.)

While Avery was successful at Warner Bros., he clashed with management and disliked the working conditions. (His crew nicknamed their first bungalow “Termite Terrace.”) In 1941, he moved over to MGM, where his frantic style was less restrained. He developed new stars, most notably Droopy. His MGM cartoons were fast-paced and silly, with extreme violent gags. His characters frequently broke the fourth wall, directly addressing the audience. He also played with sexuality about as much as was possible at the time. Though not his most famous creation, his most iconic might be the MGM Wolf, a lust-crazed character who was always chasing down a curvy redhead designed to keep the dads in the audience paying attention.

Avery was unrestrained but overworked at MGM, and took a year off in 1950 to recover from fatigue. In 1953, he returned to the Lantz studio, his original employer. He didn’t stay long, directing only four cartoons, but this brief stop still produced a legacy character, as he helped develop the mute penguin Chilly Willy. He ended his career doing TV commercials and working with Hanna-Barbera. He died of liver cancer in 1980 at age 72.

He has received countless plaudits, but probably the best that can be said about him is that his work is still influencing the medium he loved. In a direct line tracing from Roger Rabbit through Ren & Stimpy to SpongeBob SquarePants, animators have never stopped leaning on the Avery style. Roger Rabbit might be the biggest tribute: In a movie developed by Disney and featuring characters like Mickey Mouse and Donald Duck, Avery’s creations…Bugs, Daffy, Droopy, and his sexy redhead (in fakeout form)…are right there too, adding their own opposing, but just as influential, perspectives to Toontown's landscape.
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg7ydsmTF_pHtNnX-c1xpaatnEHvPuf9SnGnqIw3ULgvLkWe99a2f8xGXyE0DUnCuuglGd_xR8Dlt8HA0KoHkQLczxuJmAgxZ8clVXd1qoTqYQwGquonM_IOdAPb9KxtWW_XGEi-2Ox334/s1600/Tex+Avery.jpg

Wednesday, February 25, 2015

AWC is celebrating the quiet guys today…or the relatively quiet, anyway. Zeppo Marx and George Harrison (born 42 years apart on this date) didn’t exactly lead lives of anonymity, but as members of famous quartets, both of them worked in the shadows of more flamboyant and famous team members.
Zeppo was the stage name for Herbert M. Marx, the youngest of the five Marx Brothers, born in New York 114 years ago on February 25, 1901. As part of his brothers’ famous comedy team, Zeppo was the straight man when the Marx Brothers made the leap from the vaudeville stage to the big screen. He appeared in the Marx Brothers’ first five films, playing the romantic lead or the regular guy while his brothers Chico, Harpo, and Groucho clowned around. (The fifth Marx brother, Gummo, was even more anonymous…he never appeared in a movie after the brother act went Hollywood.) One unattributed quote referred to “the plight of poor Zeppo Marx,” claiming that “While Groucho, Harpo and Chico are hogging the show, as the phrase has it, their brother hides in an insignificant role, peeping out now and then to listen to plaudits in which he has no share.”
Zeppo finally left the group to the funnymen, but he might have had the last laugh. His talents as a mechanic (he was the one who kept the family car running) led him into an engineering career where, among other things, he owned a company that produced military parts during World War II and invented a watch that tracked the pulse rate of cardiac patients. He also started a successful theatrical agency and became very wealthy from his post-Marx Brothers ventures before his death in 1979.
George Harrison (born in Liverpool 72 years ago on February 25, 1943) was known as “the quiet Beatle.” While John Lennon and Paul McCartney famously wrote most of the band’s songs and fought over its direction, and while Ringo Starr received a drummer’s unique brand of fame, George kept it mellow, writing a few major songs and playing lead guitar. His contributions to the band were nonetheless distinct. Indian culture seeped into the Beatle repertoire, thanks to his interest in Hinduism. (He introduced the band to the eerie sound of the sitar, and for better or worse, the Hare Krishna movement.)
After the Beatles broke up, Harrison set out on a successful solo career, raised money for starving Bangladeshi refugees in a major 1971 benefit concert, and formed the most awesome superband ever in 1988, bringing Bob Dylan, Tom Petty, and Roy Orbison on board for a few jam sessions that turned into the Traveling Wilburys, because that’s what you get to do when you were in the Beatles. He was only 58 when lung cancer took him in 2001, leaving a lot of guitars to gently weep. His memorial tree is being replaced in Los Angeles today, on his birthday. It turns out the quiet Beatle’s memorial was quietly killed...by beetles.
Photo collage assembled by AWC

Tuesday, February 24, 2015

This entire series revolves around it, so it’s only fair to dedicate some love to our modern calendar. When Pope Gregory XIII announced a reform to the old Julian Calendar 433 years ago on February 24, 1582, it looked like a minor tweak, compared to the major overhaul the Julian Calendar itself had been. The Julian Calendar had been in use since the Romans had introduced it over 16 centuries earlier, and for the most part, it was a good one. (The pre-Julian calendar it replaced in 45 BC, on the other hand, was a mess, with a wacky leap month of variable length inserted every few years.) The Julian Calendar was much more stable, with 365 days divided into 12 months, and an extra day squeezed in every four years. Sounds basically like today, right?

The problem was that the actual solar year is not a precise 365 ¼ days long, so the Julian Calendar overshot the Earth’s orbit around the Sun by about 11 minutes every year. This added up to an extra 3 days every 400 years. By Gregory’s time, the vernal equinox – which helped determine the date of Easter – had drifted about 10 days to March 11. Gregory knew that if he didn’t do something, Easter would eventually end up in December, and then Santa and the Easter Bunny would have some kind of holiday death match.
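For readers who like to check the math, the drift works out as advertised. (A quick Python sketch by way of illustration; the tropical-year figure here is an approximation.)

```python
# Average Julian calendar year vs. the actual tropical (solar) year, in days
julian_year = 365.25
tropical_year = 365.2422  # approximate modern value

# How far the Julian Calendar overshoots each year, in minutes
drift_per_year_minutes = (julian_year - tropical_year) * 24 * 60

# And how that accumulates over four centuries, in days
drift_per_400_years_days = (julian_year - tropical_year) * 400

print(f"{drift_per_year_minutes:.0f} minutes per year")      # ~11 minutes
print(f"{drift_per_400_years_days:.1f} days per 400 years")  # ~3 days
```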

To restore Easter back to its rightful spot, Gregory issued a papal bull on this date reforming the calendar. Well, it was sort of on this date. If you’ve been paying attention, you know that the Julian Calendar, which dated Gregory’s pronouncement, had drifted by about 10 days, making this an anniversary only in a rough sense. This is a common problem with dating events before Gregory’s calendar reforms, and our options are basically to do a lot of extra math to match any Western date before 1582 to our modern calendar, or just live with it.

Gregory’s reform made three changes: One was to cut back on the leap years, which were pulling Easter further and further up in the year. Gregory eliminated 3 leap years every 400 years, which is still what we do today. (Years ending in “00” are only leap years if they can be divided by 400. The year 2000 was a leap year, but 1700, 1800, and 1900 weren’t.) The second change involved some complicated business with the lunar calendar which makes my head hurt to think about, but which was crucial to putting Easter back where the Nicene Council had placed it in 325. (On another note, calculating the date of Easter is really complicated.)
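For the code-inclined, the modern leap-year rule fits in a single line. (A Python sketch for illustration; obviously not anything Gregory wrote down.)

```python
def is_gregorian_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year, except century years not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 2000 qualifies; 1700, 1800, and 1900 don't.
print([y for y in (1700, 1800, 1900, 2000) if is_gregorian_leap_year(y)])  # [2000]
```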

Finally, there was the issue of what to do with all those extra days that had built up under the Julian Calendar, like plaque in the calendar’s teeth. Gregory just waved them away, and people in the Catholic areas that adopted the calendar reforms right away went to bed on October 4, 1582 and woke up on October 15 (presumably very hungry and really, really needing to pee). It would take much longer for the Gregorian Calendar to catch on outside of Catholic Europe. Protestant and Orthodox countries held out for a while, but the logic of Gregory’s changes (and demands for convenience in international trade) eventually overcame religious suspicion. In 1923, Greece became the last European holdout to make the switch. Today the Gregorian Calendar is the standard for much of the world. But I’ll be honest with you, I still have no clue when Easter is. Sorry, Greg.
http://www.nta.ng/wp-content/uploads/2015/01/nta-image-gallery-calendar.jpg

Monday, February 23, 2015

Your correspondent will fully admit that today’s date is a bit of an educated guess. It’s hard to say exactly when Johannes Gutenberg made the first Bible printed on movable type available. (Oddly enough, the date wasn’t saved in print anywhere.) But there is some agreement among scholars that Gutenberg’s Bible might have been published in Mainz, Germany, on February 23, 1455 – 560 years ago today. One future pope wrote in a letter that he had seen pages of the Gutenberg Bible on display in Frankfurt that March. Considering that this would have been a 17-mile distance from Mainz (or approximately 400 million miles to people in the 15th century), dating the first publication to today is probably not too bad a guess.
After his discovery of movable type around 1439, Gutenberg had printed books using the technology. (The Chinese and Koreans had done the same earlier.) But nothing, East or West, had approached the scope and popularity of Gutenberg's Bible printing project. It’s estimated to have taken him three years to print about 180 copies of the Latin Vulgate, a project which had the blessing of the Catholic Church. The books were a labor of love, and the high standards of the ink and other materials have earned the Gutenberg Bible the reputation as one of the most beautiful books ever printed. Large margins allowed room for artists to add decorations, which could be as ornate as a buyer was willing, or able, to pay for.
Despite their high selling price (about three years’ wages for a clerk in some instances), the books sold out immediately and traveled across Europe. While expensive, the printed Bible was still more affordable than a handwritten copy.
The printing press set off a revolution in Europe. (If you’ve ever been to Epcot, you’ll know all about this part.) Print exploded across the continent in the decades following Gutenberg’s work. By the year 1500, 77 towns or cities in Italy alone boasted print shops. Scientists could share information accurately and easily. Errors introduced by scribes stopped plaguing copies of original works, and information in general was easily passed around. By the 1600’s, printing had become so cheap that the first newspapers started appearing.
Today 48 copies of Gutenberg’s Bible are known to exist, although only 21 are complete. Most of them are owned by university libraries and other scholarly institutions (like this one, on display at the University of Texas). The New York Public Library has what’s believed to be the first copy to have reached North America in the mid-19th century. A complete copy hasn’t been sold since 1978, but it’s estimated that such a sale would fetch at least $25 million today. So I guess we can forgive Gutenberg if he didn’t record the exact date of his Bible’s publication. His press was a little busy at the moment. Revolutionizing the entire Western world has its perks, but it probably doesn’t leave much time for anything else.
http://www.hrc.utexas.edu/exhibitions/permanent/gutenbergbible/images/gutenberg_case.jpg

Sunday, February 22, 2015

What can be said about the Miracle on Ice? Sports Illustrated ran this cover with no words, under the logic that everyone in the country knew what they were looking at and needed no explanation. Your humble correspondent is almost tempted to do the same. This piece of modern American mythology is better summed up in a single image than any string of words could hope to do. But we write about things here, so let’s at least try.

A good place to start is a reminder that the U.S. Olympic hockey team did not win the gold medal when they took down the mighty Soviet Union on American soil during the Lake Placid Winter Games 35 years ago on February 22, 1980. You could be forgiven for assuming they had won the gold from this victory shot, but that didn’t happen until two days later when the U.S. beat Finland 4-2.

But nobody remembers that. What has lived on for more than three decades is the image of a bunch of college guys from Boston and Minnesota (average age 21) shocking a buzz-saw Soviet team that couldn’t have been more heavily favored if God Himself had been in goal. The Soviets had lost one Olympic hockey match in 20 years. They had pummeled the best professional players from the U.S. and Canada, beating the NHL All-Stars 6-0. Now they got the chance to face a pack of American amateurs, the same team they had laughed past in a 10-3 exhibition win just two weeks earlier. If the entire Cold War had hinged on this game, a smart gambler would have started polishing his Russian before the puck dropped.

The only real prayer for the Americans was to stick with a hard-hitting, physical style of play that coach Herb Brooks had instilled in them. And while the U.S. got in their share of blows and managed to get Soviet goalie Vladislav Tretiak pulled after a fluke goal tied the game at the end of the first period, they still entered the final period down 3-2. After a long offensive drought, the Americans tied the game on a power play equalizer against the USSR’s backup goalie, and took their first lead of the day on a shot by team captain Mike Eruzione with 10 minutes left.

The Soviets wouldn’t go quietly. They started shooting wildly in the game’s closing moments, looking to get anything past American goalie Jim Craig, who stopped 36 of 39 shots on goal. As the final seconds ticked down, ABC’s Al Michaels gave voice to the disbelief he was feeling, and helped coin the name that would stick to this game for over a generation, when he asked the TV audience “Do you believe in miracles? Yes!” Another member of the broadcasting crew said it was like watching a bunch of Canadian college football players take down the Pittsburgh Steelers.

Cold War politics turned the game into a surge of national pride, at a moment when the real Cold War wasn’t going well for the West. Soviet tanks had rolled into Afghanistan two months earlier, and Americans were being held hostage in Tehran. For a country in need of a lift, it was the right result, against the right opponent, at the right time.

So those are a few words that can be said about the Miracle on Ice. But to be honest, you’ll probably still get just as much by looking at this picture with no words attached.
http://cdn-jpg.si.com/sites/default/files/si/2010/writers/joe_posnanski/02/22/miracle.on.ice/mi/ra/cl/miracle-cover.jpg

Saturday, February 21, 2015

The first telephone directory was issued on this date 137 years ago in New Haven, Connecticut. Distributed on February 21, 1878, it was a single sheet of cardboard listing 50 technologically savvy businesses in New Haven that had telephones (which had only been patented two years earlier). There were no numbers, since early telephone users simply told the operator who they wanted a connection to. At this stage, just knowing who had a phone was useful information.
Today, we take access to phone listings for granted. We can Google any business we need to contact, and our friends’ numbers are typed in once, stored, and never thought of again (until we have to replace phones). But for years, the lowly telephone directory was the workhorse of our connected world. White pages for personal listings, yellow pages for service providers, and if your community was lucky enough to have one, a reverse directory for attaching numbers to names…these were the engines that fueled countless economic and informational exchanges. (Yes, calling your friends after school to find out what’s up qualifies as an exchange of information.)
That’s all changed, of course. France had online phone listings as early as 1981 (for all six people who had internet access back then), and the U.S. started seeing listings migrate to cyberspace in 1996. The old phone book has become an outdated relic, and a wasteful one at that. (Greener cities like Seattle and San Francisco have tried to ban their unsolicited distribution.) But the death of the Yellow Pages is just a shift in format. The idea behind online listings is the same as it was in New Haven in 1878…your phone is useless unless you know who it connects you to.
http://www.haineslocalsearch.com/wp-content/uploads/2012/05/yellow-pages-directory.jpg

Friday, February 20, 2015

Ansel Adams (born 113 years ago on February 20, 1902) didn’t have the best early experience with the American West. As a four-year-old, he was thrown into a garden wall during an aftershock following the 1906 San Francisco earthquake. His nose was broken and stayed permanently crooked. To his credit, he didn’t hold it against the region, which he spent his career as a photographer helping others fall in love with.
In 1916, two things happened to a teenage Ansel that marked him forever: He visited Yosemite National Park with his family for the first time, and his dad gave him a Kodak Brownie box camera, his first. He was permanently smitten by both. He quickly became a photography enthusiast, while joining the Sierra Club and becoming a summer caretaker at Yosemite. He spent his summers hiking, camping, and taking pictures, while trying to launch a musical career the rest of the time.
His first pictures were published in 1921, and he would eventually give up his musical ambitions. (His small, easily bruised hands weren’t suited to the piano.) He spent 60 years in photography, his career defined by black-and-white photographs of the West, especially his first love, Yosemite. (This photo, "Moonrise, Hernandez, New Mexico," became one of his most popular.) He developed the Zone System to optimize film exposure and development, and the complete sense of control the system gave him might explain his lifelong preference for black-and-white film.
He taught photography workshops, and supposedly told his students "It is easy to take a photograph, but it is harder to make a masterpiece in photography than in any other art medium." He won numerous awards, and even had a photo placed on board the Voyager spacecraft. If aliens ever encounter that repository of Earth’s knowledge, one thing they’ll find is Ansel Adams’ photo of the Tetons and the Snake River. He died at age 82 in 1984.
http://i.guim.co.uk/static/w-700/h--/q-95/sys-images/Guardian/Pix/pictures/2011/11/12/1321099450442/Ansel-Adams-Moonrise-Hern-011.jpg

Thursday, February 19, 2015

Nicolaus Copernicus (born 542 years ago on February 19, 1473) knew that mankind had no desire to be moved from the center of the universe, and he tried his best to delay it. His heliocentric theories that refuted the popular views of his day were worked out during his time as a civil servant along the Baltic Coast in present day Poland, when he was supposed to be consumed by economic and administrative duties but couldn't keep from focusing on less earthbound ideas.
Copernicus dragged his feet on making his views public, perhaps intuiting some of the sharp resistance he would meet. He finally published his work “On the Revolutions of the Heavenly Spheres” in 1543, right before his death. He would never witness the upheaval it caused, although even that was delayed by the thick, technical language he used which made his work inaccessible to many readers. A small first printing of 400 didn’t even sell out.
But it didn’t take long for readers to slice through Copernicus’ thick writing to the heart of his contention: Earth revolved around the sun. Man was no longer at the center of everything. It wouldn’t stand. Popular imagination places the Pope at the head of the backlash, but this was a moment of religious unity: Protestants and Catholics alike hated what Copernicus stood for. Martin Luther called him a fool who ignored the clear account of Joshua stopping the sun in its tracks, and another Protestant theologian mocked him as the “astronomer who moves the earth and stops the sun.”
The brunt of the backlash would fall on other men who followed in Copernicus’ orbit, like Galileo. Maybe Copernicus delayed publication so long because he just didn’t want to deal with it, in which case it would be hard to blame him. Copernicus might have been looking out for his own well-being, which would have been threatened by men trying to protect their own power in turn. We might not be at the center of everything anymore, but we’re still pretty self-centered. Not even Copernicus could change that.
https://alexautindotcom.files.wordpress.com/2013/02/copernicus-picture.jpg

Wednesday, February 18, 2015

If you’re ever told that something will happen when it snows in the Sahara, feel free to point to 36 years ago today, when precisely that happened for the first time in recorded history (one of only two occasions your humble correspondent could uncover) on February 18, 1979.

Any readers from Boston might find this especially unimpressive at the moment, but for everyone else, the sight of camels standing in snowy sand in southern Algeria is just pretty darn cool…as well as a sign that no event should be dismissed as completely impossible. (AWC is unable to state for certain whether this particular camel roamed the Sahara during the 1979 snowfall or the more recent one mentioned later…so let him represent the Platonic Snow Camel on the cave wall of our hearts.)

The 1979 snowstorm, described as "the first in living memory" by the New York Times, brought traffic to a standstill in southern Algeria. It was out of the ordinary but also brief, lasting about a half-hour. Within a few hours, the snow was gone (which anyone from Boston should DEFINITELY find impressive). AWC has uncovered one other time when the Saharan lowlands received snow, this time in western Algeria on January 18, 2012. (Snow is a regular event in Saharan mountain ranges.)

Whether the 2012 snowfall is a harbinger of a new world of climate-change induced crazy weather -- where penguins fly over the desert and volcanoes spew purple lava at the South Pole -- or just another aberration remains to be seen. For now, snow in the Sahara remains a pretty cool sight, and a reminder that this crazy world we share is something we claim to understand at our peril. We don’t really know what can happen, and the possibilities are beyond us. If camels can handle snow, you can go out and tackle whatever has you nervous today. (By the way, do you guys know what day it is?)

http://i.telegraph.co.uk/multimedia/archive/02143/readers-snow-camel_2143977i.jpg
 

Tuesday, February 17, 2015

If computers someday become our masters, let it be said that we got in a few licks along the way. One of the most notable (or at least entertaining) came 19 years ago today, when grandmaster Garry Kasparov defeated IBM’s chess-playing supercomputer Deep Blue in a 6-game match which ended on February 17, 1996.

In fairness to our future robot overlords, AWC should note that an upgraded Deep Blue won a rematch against Kasparov the following year. But maybe that just enhances the impressiveness of Kasparov’s first win. Kasparov suspected IBM of cheating in the 1997 rematch by using human chess players to intervene between moves. (The rules only allowed humans to move Deep Blue’s pieces and reprogram “him” between games.) Kasparov asked for a third match, but IBM declined and retired Deep Blue. It’s possible Kasparov fell victim to the human tendency to see patterns in randomness. A random move by Deep Blue due to a bug spooked Kasparov during one game, making him think he was matched against a creative intelligence, instead of a brute force calculating machine. In all, Kasparov and Deep Blue played 12 games over the two matches, with Kasparov winning 4, Deep Blue taking 3, and 5 ending in draws.

A tale of the tape reveals that in the machine corner, Deep Blue was able to evaluate 200 million positions per second. That proved just enough to keep up with Kasparov out of the human corner, a Russian who became the world’s youngest undisputed chess champion at age 22.

Chess-playing computers have become passé since the 1990's, with developers focusing on chess software programs like Deep Junior over hardware specifically dedicated to the task. IBM would go on to develop Watson, a computer who went on “Jeopardy!” in 2011 and crushed the uber-champ Ken Jennings, who declared in his Final Jeopardy response “I for one welcome our new computer overlords.” He later wrote “'Quiz show contestant' may be the first job made redundant by Watson, but I'm sure it won't be the last.”

As for Kasparov, he retired from chess in 2005. He has spent his time trying to crack the door into Russian politics, but has found opposing Vladimir Putin to be far more confounding than putting Deep Blue in checkmate ever was. He currently sits on the board of directors for the Human Rights Foundation, a fitting spot for a guy who struck a small blow for humanity against technology on this date.

Monday, February 16, 2015

The first 911 call in North America was placed on this date, not in a major city but in Haleyville, Alabama 47 years ago on February 16, 1968. AT&T had agreed to institute a single, easy to remember emergency phone number across North America in response to a recommendation by a crime commission assembled by Lyndon Johnson. The president of the Alabama Telephone Company got wind of the idea and decided to implement the new service in Alabama first. On this date, two local politicians participated in the first 911 call, routed between Haleyville’s city hall and police station.

911 became the United States’ national emergency number in 1968 – with access to police, fire, and medical services – but it took a while to catch on. Public awareness of the service increased throughout the 1970's, and access took off outside the major cities in the 1980's. Today around 98 percent of residents in the U.S. and Canada can dial 911 and be connected with emergency services, and 96 percent within the US will be connected to Enhanced 911, which allows the dispatcher to automatically access the caller’s location (which has become more complex as cell phones and internet telephony have eclipsed the use of landlines).

Like anything, the service has its unique challenges. Among them are callers reporting emergencies in a jurisdiction outside of the area where the call originated (in which case dispatchers may not know who to contact), or calls being made in areas where emergency services have been cut, with dispatchers able to do little more than stay on the line during the long response time. (A woman in Oregon was raped while on the line with 911 during a call when no county police were on duty.) The rise of internet telephony has also allowed callers to more easily disguise their location, resulting in a rise in “swatting,” a dangerous “prank” where heavily armed police are sent to a location for no reason. Some areas are moving to make this a crime with the full costs of the response (which can run into the thousands of dollars) billed to the prankster.

Another common issue is misdialed numbers. This is especially a problem in telephone exchanges where “9” accesses an outside line and “1” begins an area code. A caller with jittery fingers might tap both buttons twice and hit 9911, which grabs an outside line and then dials emergency. Around Raleigh, North Carolina, the old 919 area code has been required on all calls since a recent area code overlay. Misdialed numbers beginning with 911 have resulted in a large number of hang-ups from the area, all of which have to be traced and called back to confirm the presence or absence of an emergency.

Despite these challenges, Americans dial 911 about 240 million times a year, while Canadians make at least 12 million calls annually. Both countries impose a fee on telephone customers to fund the service. In the United States, 911 costs telephone customers an average of 72 cents a month…which could be worth every penny if you ever find yourself needing it.
http://media.cmgdigital.com/shared/lt/lt_cache/thumbnail/960/img/photos/2013/05/03/4a/80/sns011212sherifffunds.jpg

Sunday, February 15, 2015

In the winter of 1925, the dog sled relay was a way of life in remote northern Alaska…and during that season, it also became a matter of life and death. Aircraft had yet to become reliable enough for the punishing conditions of the region, so mushing teams were responsible for delivering mail during the coldest months of the year. So when an outbreak of diphtheria threatened to decimate the small city of Nome, where Alaskan Natives had no natural immunity, it was up to a team of dogs and their sled drivers to deliver antitoxin to the one doctor in town, whose batch had expired. Without it, the estimated mortality rate was around 100 percent.
The “Great Race of Mercy,” as it came to be called, actually consisted of two relays. The first delivered a supply of 300,000 units of antitoxin, found in Anchorage, to Nome. It wasn’t enough to wipe out the outbreak, but it could hold it at bay while a supply of 1.1 million units was rounded up along the West Coast, shipped to Seattle, and delivered to the Alaska Territory. The trip from Nenana (as far as the trains could carry supplies into the Alaskan interior) to Nome normally took 25 days by dog sled. The antitoxin would only survive for about six days along the trail. Facing some of the worst conditions on the planet – with hurricane-force winds and temperatures consistently below -50 F – a team of 20 drivers and around 150 dogs covered the 674 miles in 5 ½ days from January 27 to February 2.
Two weeks later, the final shipment of antitoxin arrived in Nome. The 1.1 million units that would wipe out what threatened to become an epidemic arrived in Nome 90 years ago today on February 15, 1925. The second relay had taken a relatively leisurely seven days. The death toll from the diphtheria outbreak was officially listed at 5-7 people, but was likely higher since Alaskan Natives had a custom of burying their dead without alerting the authorities. A number of dogs also died delivering the antitoxin. Balto, the lead dog of the final leg of the relay into Nome, became emblematic of the heroic animals. He became a celebrity in his own right, and had a statue erected in New York’s Central Park. Today the Iditarod Dog Sled Race honors the serum run and the history of the dog sled in Alaska, which saw its last great hurrah come to an end on this date.

Saturday, February 14, 2015

Valentine’s Day is supposed to be a day for lovers, but a scientific discovery on this date should get its due too. With candy, wine, and roses can also come activities that leave a mark long after the champagne is gone…or would, if not for Alexander Fleming, who announced the discovery of penicillin 86 years ago on February 14, 1929.
The Scottish scientist said he had discovered the antibiotic after a stray mold spore gained a foothold in an open plate of bacteria, inhibiting the bacterial growth around it as the mold expanded. Scientists have since called this story into question, noting that it doesn’t match up that well with the observed interaction of penicillin and bacteria. But if Fleming came up with a too-cute tale of his discovery, its arrival was no less fortunate. Penicillin made a major difference during World War II, resulting in far fewer deaths and amputations than would have been the case otherwise. In 1945, Fleming shared the Nobel Prize in Medicine for his discovery.
Since 1929, different strains of penicillin have been used to treat a host of bacterial infections, although bacterial resistance over time has made it less of a “miracle drug” today than when it was introduced. Tonsillitis and gangrene have been treated with penicillin over time, as well as a few products of a Valentine’s Day gone awry, like syphilis and gonorrhea. Because while romance can go sour, an antibacterial treatment will be there for you as long as natural selection allows it to be effective. What could be more heartwarming than that?
http://www.abc.net.au/news/image/570248-16x9-700x394.jpg

Friday, February 13, 2015

It was one of history’s most notoriously overdue library books. Sometime in the late 19th century, the first half of Mark Twain’s original handwritten “Adventures of Huckleberry Finn” manuscript disappeared from the Buffalo library, where Twain had donated it. It was presumed lost forever…until 24 years ago today, when Sotheby’s auction house in New York announced that it had authenticated a set of papers as the long-lost missing half of “Huck Finn” on February 13, 1991.
The papers took a strange trip on their way back to Buffalo. The manuscript showed up in a trunk in an attic in Los Angeles, of all places. A pair of sisters had gained possession of the trunk on the death of an aunt in upstate New York. The sisters themselves were granddaughters of James F. Gluck, a lawyer and book collector who had been a friend of Twain’s and convinced him to donate the manuscript to the Buffalo library. The accepted theory is that Gluck borrowed half the manuscript from the library, then forgot he had it. He died suddenly in 1897, and the stack of papers, with no title page, was probably hurriedly stuffed in a trunk…where it stayed for nearly 100 years.
“Huck Finn” is known for its pervasive use of dialect – notoriously so in the case of a racial slur whose repeated use, while honest to the time and people the book depicts, has made the novel a thorny presence in school libraries and children’s reading lists. The recovered manuscript shows how Twain wrestled to find the right voice for his unlettered protagonist, who also narrates the novel. In the book’s opening line, Huck tells the reader "You don't know about me, without you have read a book by the name of 'The Adventures of Tom Sawyer'; but that ain't no matter." Twain labored over this introduction, originally writing “You will not know about me,” before changing it to “You do not know about me” and then settling on the final version.
In the end, the Buffalo library appeared ready to overlook Gluck’s oversight. During his life, Gluck had been a prodigious collector, and donor, of books, having given hundreds of pieces of writing to the library, where he served as curator. It seems he also pestered many a writer to donate their original scribblings. "Gluck badgered everyone into giving," William Loos, the library’s curator of rare books in 1991, told the New York Times after the “Finn” papers were recovered. He added, "If Gluck forgot to return this overdue book, we are prepared to forgive him."
http://mediad.publicbroadcasting.net/p/kalw/files/styles/placed_wide/public/201402/04_a_photo.jpg

Thursday, February 12, 2015

George Gershwin might have been the last to know he was working on “Rhapsody in Blue.” As the story goes, he got this helpful information in January 1924 while he was playing pool at a Manhattan billiard hall as his brother (and future musical collaborator) Ira read the newspaper. Ira came across a piece about an ambitious experimental concert being planned by bandleader Paul Whiteman for the following month. The story claimed “George Gershwin is at work on a jazz concerto.”
Gershwin had already turned Whiteman down, believing he didn’t have time to work on a new piece, so this was news to him. When he called the bandleader, Whiteman pleaded with Gershwin to come on board and help save the concert idea from being stolen by a rival musician. Gershwin relented, and went to work on what became “Rhapsody.” He had five weeks to get it done.
Finally the big day came. Whiteman’s concert at New York’s Aeolian Hall was intended to blend elements of jazz and classical music, and to demonstrate that the relatively new form of jazz deserved to be taken seriously. Today, just about everything about it has been forgotten…except for the piece Gershwin debuted late in the program 91 years ago on February 12, 1924. Gershwin played piano for the first performance of “Rhapsody in Blue,” which begins with a sharp clarinet wail (called a glissando by those in the know) that has become one of the most famous openings in all of instrumental music. (I found multiple suggestions that only the four notes beginning Beethoven’s Fifth rival it in popular recognition.) The glissando was improvised during rehearsal as a joke, and Gershwin kept it in.
Gershwin described his piece as “a sort of musical kaleidoscope of America, of our vast melting pot, of our unduplicated national pep, of our metropolitan madness.” Moviemakers have more often seen it as a musical portrait of Manhattan. Since 1924, “Rhapsody” has, like the forms of jazz and classical it sought to marry, occupied an interesting limbo straddling the common and the elevated. Serious musical critics have dissected it at length, and musicians like Brian Wilson of the Beach Boys and Michael Stipe of R.E.M. have cited it as an inspiration. It’s also popped up in United Airlines ads and in the background at Disney World. More than anything, it’s a piece that seems naturally at home in just about any American setting.
Whiteman got more than he bargained for when he twisted Gershwin's arm, and the fruit of what might have been a little dirty trick between colleagues (whether Whiteman intentionally leaked Gershwin's name to the press is unclear) lives on nearly a century later. If Whiteman's intention was to bring "high" and "low" together, the bandleader couldn't have orchestrated Gershwin's role more perfectly.

Wednesday, February 11, 2015

China lifted state censorship of Aristotle, Shakespeare, and Charles Dickens 37 years ago on February 11, 1978. The move coincided with China’s turn away from the excesses of Mao Zedong’s “Cultural Revolution” and toward loosened state control over citizens’ lives.

To Western eyes, China looks like a pretty repressive place. A one-party state that heavily censors the internet isn’t exactly a bastion of freedom. But a little history helps put things in context. During Chairman Mao’s reign as head of China’s Communist Party, he worried that the country was drifting toward capitalism. (To put his views in perspective, he saw the 1960s-era Soviet Union as having already gone fully capitalist.) His paranoid view led to disaster for average Chinese, when he implemented his so-called Cultural Revolution to root out supposedly subversive people and ideas from Chinese society in 1966.

For a decade, Chinese students, soldiers, laborers, and government workers turned on each other in the name of the revolution. People were thrown into prison for no reason, tortured, or had their property seized. When Mao died in 1976, the ruling Communist Party quickly acknowledged that his revolution had been a disaster. Beginning in 1978, they moved to implement economic reforms that have made China the world’s second-largest economy (and still growing). China is still officially Communist, but what Mao feared most…infiltration by market-based ideas…has become official policy. After Mao, foreign investment was encouraged, and price controls were lifted. Much of Chinese industry is still state-owned, resulting in a strange hybrid economy. But the half-steps toward open markets have been a major economic boon for the country.

Of course, China is still known for being a closed society, culturally and politically…which brings us to this anniversary. Right at the beginning of China’s economic reforms came a lesser-noted cultural one as well. Aristotle, Shakespeare, and Dickens are so commonly referred to in Western thought that we barely realize it. (I found a piece listing all the common phrases we use that come from Shakespeare, but it’s too long to even meaningfully excerpt here. Moral of the story: Dude said a lot of stuff we still say.)

China’s decision to allow these formerly banned authors into the minds and homes of its people was a remarkable step away from censorship. (The works haven’t gone ignored. This actor is performing “The Revenge of Prince Zi Dan,” a Chinese adaptation of “Hamlet.”) For now, China has managed to have it both ways…allowing limited economic freedom to boost the standard of living, while maintaining tight control on many personal freedoms. But the free translation of Western authors might look to future generations like a crack in the dike. (A free internet would be more like a burst dam.) Whether China can maintain a “middle way” between Mao’s total crackdown and an open Western-style society remains to be seen. Bringing Shakespeare to Beijing might be seen as a turning point one day; but if not, it’s still worth celebrating in its own right. The Chinese people deserve to enjoy “Hamlet” as much as anyone else.
http://i.telegraph.co.uk/multimedia/archive/01972/revenge_1972004b.jpg