Friday, October 31, 2014

Don’t be scared, it’s just AWC! Arthur Conan Doyle treated audiences to the first collection of stories featuring Baker Street’s famous detective 122 years ago when “The Adventures of Sherlock Holmes” was published on October 31, 1892. This wasn’t the first appearance of Holmes, who had been unraveling criminals’ tricks in novels and periodicals since 1887. But it was the first time Holmes had been featured in a collection of short stories, arguably the best format for his brand of brilliant deduction.
Doyle started writing Holmes stories during his ample free time as a doctor with a practically non-existent medical practice in London, before publishing the first Holmes story, “A Study in Scarlet,” in 1887. The character took off, allowing Doyle to leave medicine altogether and focus on writing. He supposedly based Holmes on a university teacher he had encountered in Scotland with extraordinary talents of insight and deduction.
A celebration of Sherlock Holmes isn’t altogether out of place on Halloween. Arguably his most famous story, “The Hound of the Baskervilles,” is a suitably creepy affair, as a horrific hellhound haunts the dark moor outside a country estate. But Holmes also reminds us that most of what goes bump in the night is not what it appears, as he sheds light on dark mysteries (including the allegedly diabolical hound). A touch of the terrifying, leavened with the sharp knife of logic: It’s a combination whose appeal seems quite elementary.
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjSLyLdUmys4RyGB13uQHYtzzwnnVZ0_KZlSZ3SG3mcg42Bc_PNaT9fmoUQiszf91mlKYk53ejnwG8Qx6SuXM60ZbNk-HZQhsaDaOtXrCl-sd8qR2fL1qsWW4UUJ-uLXHgT0uEbFfPDnaMF/s1600/adventures+of+sherlock+holmes.jpg

Thursday, October 30, 2014

On the eve of Halloween 76 years ago, Orson Welles pulled off one of history’s most memorable tricks…although how many people were actually fooled on October 30, 1938 is still debated. H.G. Wells’ “War of the Worlds” had been around for 40 years, but 23-year-old Orson Welles was finding it difficult to translate the novel into a gripping radio drama for his CBS Radio “Mercury Theatre” hour. Welles insisted the story was too dull and needed authentic-sounding news bulletins to spice up the broadcast. Frantic re-writes took place throughout the week, with a final script gaining network approval just two days before the Sunday broadcast.
Whether intentionally or not, Welles set up his prank perfectly. He opened the broadcast with an announcement that what followed was a drama, then rolled into a colorless musical program interspersed with dry news bulletins. Only after several minutes, when NBC’s competing comedy and variety show launched into a musical interlude, freeing up that audience to see what else was happening around the dial, did Welles launch into the meat of his show: A dramatized Martian invasion of New Jersey, “reported” live by on-scene “correspondents.” It was a dramatic masterpiece that also happened to sound a lot like a real newscast (assuming you didn’t have the presence of mind to check whether anyone else was reporting what would have been the biggest story of all time).
Welles announced it was all a joke at the end, but the damage was done. CBS’ switchboard lit up with calls, and the New York Times building’s lighted bulletin declared “ORSON WELLES CAUSES PANIC” around midnight. Newspapers played up the story, reporting mobs of people fleeing their homes to escape the invasion. In the lead-up to World War II, some listeners apparently thought they were listening to a German invasion. The Washington Post claimed a man had died of a heart attack, but this was never verified. In fact, many of the newspaper reports seem to have been exaggerated. The show’s audience wasn’t as large as the papers claimed, and those who did hear it largely realized it was fictional. Radio was a new medium that had started siphoning off newspapers’ advertising revenue, leading some to speculate in hindsight that the older medium had embellished a few accounts of genuine panic to sow distrust of its upstart rival.
But some people clearly were fooled, and even sued for damages. As it turned out, only one listener was paid: A man from Massachusetts who had spent his shoe money to escape the Martians, and asked for enough to afford a new pair. Welles agreed he should be paid. But that was the extent of any punishment for CBS and Welles. The FCC decided not to impose new censorship standards on radio…and Orson Welles, thanks to H.G. Wells, got away with one of the most celebrated pranks any Halloween has witnessed.
http://skepticalteacher.files.wordpress.com/2010/10/war-of-the-worlds-by-orson-welles.jpg

Wednesday, October 29, 2014

Events from the ancient world are hard to pin down, for obvious reasons. (Hey ancient world, ever heard of this little idea called “not being ancient”?) But we have enough documentation to be reasonably confident that today marks 2,552 years since Persian king Cyrus the Great peaceably marched into ancient Babylon and generally improved a lot of people’s lives on what would have been October 29, 539 BCE. (They used different dates, because again…ancient.)
Cyrus built the biggest empire the world had seen up to that point, which is how guys entertained themselves before “Call of Duty.” But in addition to his military prowess, Cyrus is remembered for his decent treatment of minority groups within his empire. He is revered in Jewish tradition as the liberator of the ancient Hebrews following their Babylonian exile. Cyrus is said to have ordered any destroyed temples rebuilt, which tracks nicely with accounts in the Jewish Tanakh and Christian Old Testament that Cyrus returned the exiled Jews to Jerusalem with a commission to rebuild their temple. Cyrus’ policies made such an impression on Jewish people that he is the only Gentile to earn the title of Messiah (a Jewish title for a divinely appointed leader, as opposed to the rather more specific meaning it acquires in Christian tradition).
Cyrus left enough impact on the ancient world to have his rule documented in multiple places, both sacred and otherwise. The clay cylinder created in his name has been called the world’s first human rights charter, although scholars have argued over whether this is overdoing it, considering the entire concept of human rights didn’t exist yet. But at any rate, Cyrus seems to have been a progressive figure for his time. In those days, “not being a homicidal jerk” was generally enough to be considered a good ruler. To be a great one, you needed to go further, and Cyrus seems to have earned his title by encouraging people under his rule to live their lives and express their faiths according to their own beliefs. He may not be mentioned much in Western thought today, but his impact on the traditions and cultures of the Middle East still resonates. Some Iranians still note this date on their calendar as “Cyrus the Great Day.”
http://i0.wp.com/www.kavehfarrokh.com/wp-content/uploads/2014/02/Cyrus-Babylon.png

Tuesday, October 28, 2014

Harvard University is pretty old. It's older than both the country and state where it's located, and old enough that something called the "Great and General Court" established it 378 years ago on October 28, 1636...and people totally took it seriously.
The first institution of higher learning established in the colonies, Harvard predates the U.S. Constitution by over 150 years. It was established in the Massachusetts Bay Colony, by the forerunner to the state legislature (the aforementioned Great and General Court). Here's another example of how old Harvard is: The October 28 when it was founded wasn't really our October 28. The British Empire (which extended to these shores at the time) didn't adopt the Gregorian calendar until 1752, so the colonists were still rocking the Julian calendar in 1636, when it ran 10 days behind (the gap had grown to 11 days by the 1752 switchover).
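For the calendar-curious, here's a minimal sketch in Python of how that conversion works (the function name is mine, and it uses the textbook Julian Day Number round-trip rather than any calendar library, so treat it as illustrative):

def julian_to_gregorian(year, month, day):
    # Julian-calendar date -> Julian Day Number (an integer count of days)
    a = (14 - month) // 12
    y = year + 4800 - a
    m = month + 12 * a - 3
    jdn = day + (153 * m + 2) // 5 + 365 * y + y // 4 - 32083
    # Julian Day Number -> Gregorian-calendar date
    a = jdn + 32044
    b = (4 * a + 3) // 146097
    c = a - 146097 * b // 4
    d = (4 * c + 3) // 1461
    e = c - 1461 * d // 4
    m = (5 * e + 2) // 153
    return (100 * b + d - 4800 + m // 10,  # year
            m + 3 - 12 * (m // 10),        # month
            e - (153 * m + 2) // 5 + 1)    # day

print(julian_to_gregorian(1636, 10, 28))  # -> (1636, 11, 7)

Run it and Harvard's Julian October 28 comes out as November 7, 1636 on our calendar...10 days later, as advertised.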
The school became known as Harvard College in 1639, when a young clergyman named John Harvard (that's him deep in thought) made a deathbed contribution to the "schoal or Colledge" as he called it. (Seriously, folks...it's old.) Harvard was initially designed to turn out Puritan preachers, providing colonial pulpits with a steady stream of Congregationalist (and eventually Unitarian) ministers, while always remaining a nondenominational institution itself. As New England's historic denominations left Puritanism behind for the ideas of the Enlightenment, Harvard began to embrace a secular mission as early as the 19th century.
As Harvard lost religion, it gained football (which is kind of the same thing as religion in other parts of the country), and its rivalry with Yale is one of the country's oldest. While Ivy League sports don't have the same draw they once did, Harvard-Yale (or just "The Game") has a storied tradition. In 1916, Yale's coach gave his team the following pre-victory pep talk: "Gentlemen, you are now going to play football against Harvard. Never again in your whole life will you do anything so important."
In the 20th century, Harvard made a concerted effort to recruit more middle-class students rather than just pulling from Boston's elite prep academies, which had been its historical practice. But with its $36 billion endowment, Harvard's role as training ground for the country's elite remains intact. If you happen to be a billionaire or Rhodes scholar (in which case, thanks for stopping by!), the odds are greater that you went to Harvard than to any other university in the United States. The old saw about "Pahking the cah in Hahvahd Yahd" is probably irrelevant if you actually went there, since you can just get your chauffeur to handle it.
http://alumniconnections.com/olc/filelib/HAA/cpages/237/Library/images/JohnHarvard-Fall-med.jpg

Monday, October 27, 2014

The Cold War nearly turned very hot in 1962, when the world’s two superpowers locked into a two-week staredown over Soviet nuclear missiles placed in Cuba. But the crisis wound down quietly over two days, starting with a thaw on this date 52 years ago – October 27, 1962.
The U.S. had obtained evidence of the missiles on October 14, and responded with a naval blockade to prevent further Russian ships from reaching Cuba. The Soviets showed no immediate signs of backing down, and the Kennedy administration started preparing to invade Cuba and initiate a nuclear strike on the USSR if Moscow retaliated. It wasn’t overstating things to say the world appeared to hang in the balance.
Things only seemed to get worse on the morning of October 27, when a U.S. reconnaissance plane was shot down over Cuba, killing the Air Force major on board. But long, tense, and complicated negotiations finally yielded a breakthrough later that day, when the two countries agreed in principle to a swap: The Russian missiles would come out of Cuba, and the Americans would remove missiles capable of striking Russia from Turkey and Italy. The next day, Soviet premier Nikita Khrushchev announced on Moscow radio that the weapons in Cuba would be crated and returned east. The Russians had blinked, and Khrushchev would pay a political price. But the stalemate was over, and the Cold War could settle back into its usual chill.
http://series.c-span.org/uploadedImages/Content/Images/AHTV/CubanCrisis.gif

Sunday, October 26, 2014

Benjamin Franklin had a lot of jobs: printer, postmaster, scientist, professional stormy weather kite flyer. But of all these hats, the one he wore as the United States' first diplomat might have had the most lasting impact, by securing the support that kept a nascent American republic from being strangled in its cradle.
One month after being appointed by the Continental Congress to a diplomatic commission, Franklin set sail for France 238 years ago on October 26, 1776. With him he took his 16-year-old grandson to act as secretary, and they settled in a Paris suburb. Franklin would spend the next nine years in France. There, he was an active Freemason, an advocate for the rights of non-Catholics, and a famous hit with French women. (Diplomacy…it's a rough job.) He charmed the French with his "rustic" fur cap, and generally became the guy everyone wanted at their parties. They put his face on medallions and rings, and celebrated him as an emblem of the "enlightened" New World.
But in the middle of all this, Franklin had a job to do. He was there to secure increased French support for his countrymen back in the American colonies. France was already contributing support to the colonists under the table, but wasn't prepared to openly oppose its British rival until the Americans looked likely to win the war. The Americans appeared to gain the upper hand after the Battle of Saratoga in October 1777, a year after Franklin's departure for France. This was the turning point the French had been waiting for, and they declared a formal alliance in February 1778. Franklin was among the American commissioners who met with representatives of Louis XVI in a Paris hotel to make the alliance official.
French support proved invaluable for the success of the American Revolution, which now had an established European power with its thumb on the scale as it took on the world's most powerful empire. When the British finally surrendered at Yorktown three years later, French forces were there to witness it. It's impossible to know how much of a role Franklin's glad-handing and charm played in the outcome, but it's clear that the infant United States couldn't have picked a better representative to send across the ocean on this date.
http://www.sas.upenn.edu/~engelis/franklin_belles.jpeg

Saturday, October 25, 2014

Pablo Picasso (born in Spain 133 years ago on October 25, 1881) showed signs of his talent early. His mother claimed his first words were a shortened version of the Spanish word for pencil, and his father, who was a painter and art teacher, supposedly gave up painting when he realized his son had surpassed his own skill by age 13.
Picasso's career as an artist moved through distinct periods, such as his "Blue Period," when he turned out somber paintings predominantly worked in cold shades of blue, followed by the "Rose Period," which saw him turn to warmer colors and brighter scenes inspired by circus performers. But his work as a founder of Cubism, along with French artist Georges Braque, was his lasting contribution to modern art. Picasso (pictured in self-portrait) moved beyond literal representation in his work, delving into a technique that broke objects down into their constituent shapes and re-assembled them, sometimes incorporating cut portions of wallpaper and newsprint in strange collages that placed a distinctive stamp on the practice of art.
Picasso spent both world wars living in France, experiences that almost certainly pushed his work into more somber directions. During one inspection by a Gestapo officer of his Paris apartment, he was supposedly asked about his painting "Guernica," depicting the bombing of a town by German and Italian warplanes during the Spanish Civil War. "Did you do that?" the officer supposedly asked, to which Picasso is said to have replied "No. You did."
Picasso was married twice, and had multiple lovers and mistresses during his life. One explanation of his career said that he invented a new style every time he fell in love with a new woman. He turned out over 1,800 paintings, 1,200 sculptures, and 2,800 ceramics during his life, many of which he kept instead of selling. After his death in 1973 at age 91, these works were turned over to the French government in the absence of a will, and are now displayed in the Musée Picasso, a Paris art gallery dedicated to his work.
http://uploads2.wikiart.org/images/pablo-picasso/self-portrait-1907.jpg

Friday, October 24, 2014

Annie Edson Taylor knew this eternal truth: When life kicks you around, you can either give up, or you can attempt an unprecedented death-defying stunt instead. She chose the second. A New York-born schoolteacher, Taylor had suffered the deaths of her infant son and husband in short succession during the Civil War, which left her not only with a world of grief, but also the need to support herself. She moved between towns and jobs, before a boat ride one day convinced her she should ride over Niagara Falls in a barrel. Folks, I don't know how to make that narrative transition work, so you're just going to have to accept it.
She might have been seeking thrills and fame, but more likely she just wanted some financial security for her later years. A custom-made barrel was constructed for the occasion, made of oak and iron and padded with a mattress. After a cat survived a pilot trip in the barrel over Horseshoe Falls, Annie took her turn. A leather harness held her in place, and 113 years ago, on October 24, 1901 (her 63rd birthday), Annie Taylor became the first person to survive the 17-story drop over Niagara Falls in a barrel.
Now you might ask yourself exactly why this is an anniversary worth celebrating. I would submit that Annie Taylor's daring spirit, and her refusal to accept the quiet role expected of a woman in her time, were noble traits. And if that doesn't convince you, then just celebrate that she survived at all (albeit with a bleeding head wound when she was recovered).
For the record, Taylor showed no interest in attempting the plunge again. She told the press "If it was with my dying breath, I would caution anyone against attempting the feat... I would sooner walk up to the mouth of a cannon, knowing it was going to blow me to pieces than make another trip over the Fall." However, she seemed to soften her stance on this later, as she never gave up her attempts to get rich. She made money taking photographs with tourists, and briefly talked about going over the Falls again. (She never did.) She died at 82 years old in 1921.
Taylor was one of the fortunate ones. At least five people have died going over Niagara Falls in conveyances less secure than Annie Taylor's barrel. (Some of the fatal plunges have happened in canoes and on jet skis.) The stunt is also illegal these days, so if you do manage to survive, expect to find criminal charges and a steep fine awaiting you on shore, whether you land in Canada or the United States.
http://upload.wikimedia.org/wikipedia/commons/4/47/Annie_Taylor.jpg

Thursday, October 23, 2014

So everybody knows the Smurfs. Little blue guys, made a cartoon that was everywhere in the '80s, decorated pasta cans during that decade, lately known for a couple of CGI cash-grab movies with Barney Stinson. Not too smurfing interesting, right? But did you know that these characters (who made their first appearance in the French-language Belgian comics magazine "Spirou" 56 years ago on October 23, 1958) have been interpreted as allegorical of topics ranging from economics to linguistics to gender hierarchies?
So the Smurfs have that deal where they just substitute "smurf" for every other word in the cartoon to make kids laugh. It's enough to make you want to smurf somebody after the first few minutes. But in their original comics incarnation, this tic was put to more clever use. There's a story from 1972 where the Smurfs from the North call a certain object a "bottle smurfer," while those from the South call it a "smurf opener." This is considered a parody of the tension between French and Dutch speakers in Belgian communities, and is actually a pretty hilarious joke in my opinion.
Then there's Smurfette…the only girl Smurf. What's up with that? In the comics, Smurfette is created by Gargamel from clay to disrupt the all-male Smurf society by sowing jealousy among its members. The only problem? She's ugly. Papa Smurf takes pity by changing her appearance, and she then exiles herself from the Smurf village to avoid disrupting the social order. Heavy stuff. (This was all a little much for the 1981 cartoon, where they just brought her back.)
But speaking of heavy, how about this? The Smurfs have no known currency. When they want food, they just go grab some Smurfberries or maybe get a free pie from the chef Smurf. (My memory isn't the clearest on the details here.) They all live in identical houses. And they all seem to have assigned jobs, many of which are spelled out right in their names. (There's no way somebody named "Handy Smurf" is going to be allowed to go into teaching.) Get it? The Smurfs are a socialist utopia, something that hasn't been lost on commentators over the years.
This can all get carried too far. When a French sociologist described the Smurf society as a totalitarian and racist utopia, the head of Studio Peyo, which owns the characters, pushed back hard. But it's clear that these stories have had a little more bite to them over the years than often recognized, especially in their original comics incarnation. The cartoon version completely excised many of these elements, which were oblique to begin with. But knowing that the Smurfs have a little more in their history than going around eating berries and saying the same word over and over makes them a little more smurfing interesting, after all.
http://bluebuddies.com/gallery/Smurf_Comic/jpg/Smurfs_Comic_Books_Complete_Collection.jpg

Wednesday, October 22, 2014

Everyone? Could I have your attention for a moment? I'm sorry, but I just wanted to note that today's AWC honors Toastmasters International, which traces its origins back 90 years to a single club founded on October 22, 1924 at the YMCA in Santa Ana, California.
Now, you might be familiar with Toastmasters, and their mission of helping members improve their public speaking and leadership skills. Not to mention their French toast is to die for! Sorry, sorry…somebody dared me to make that joke. No, but seriously folks, did you know that Toastmasters grew out of the efforts of Ralph Smedley, who noticed a need for speech training during his YMCA work in the early 20th century? Smedley established clubs where members evaluated each other on short speeches at "Y" branches in Illinois and Northern California, but none of them lasted long after he left.
Now, I know that food's getting cold, so I'll get to the point. When Smedley arrived in Santa Ana, he set up another little speech club, and this one actually stuck. Not only that, but it inspired copycats across Southern California, and soon places like Los Angeles and Long Beach had their own Toastmasters clubs. Smedley started getting so many letters that he printed up a manual and a set of lessons on public speaking, and since a club had popped up in British Columbia, Canada, he decided the "International" was earned and organized all the clubs as Toastmasters International.
Smedley died in 1970, but the need for effective public communication didn't. And today, Toastmasters has over 14,000 clubs in 126 countries, with more than 300,000 members around the world. (Thankfully, they're not all speaking here today.) If you're interested, membership is open to anyone over 18. And ladies, you'll be happy to know the organization's male-only policy ended in 1973, so if your husband says you talk too much, join up and give him all the speeches he can handle!
Alright, tough crowd. Moving right along. I'm wrapping up here, but in closing, just remember…in this age of texting, tweeting, and sideways smilies, there's still a venue that values the art of genuine communication. You might not give many speeches in your life, but if you do, it's nice to have a little practice among people who are just as nervous as you are. Salud!
http://www.noonerstoastmasters.com/wp-content/uploads/2014/01/public-speaking-with-toastmasters-e1390689945664.jpg

Tuesday, October 21, 2014

The USS Constitution has seen a few things since launching from Boston Harbor 217 years ago on October 21, 1797…and remarkably, is still around to tell the tale. (You know, if boats could talk and all…) 

Originally built to fight Barbary pirates off the North African coast, the 44-gun Navy frigate was named by George Washington in honor of the country's constitution (which was only a decade old, it's worth noting). After that, she moved on to service in the War of 1812, where she earned her famous nickname "Old Ironsides" during a naval tussle with a British warship off the coast of Nova Scotia. Witnesses to the battle claimed that British shots bounced off the sides of the Constitution as if the wooden vessel had been made of iron. That war didn't go too well for the young United States, but a victory at sea over the world's predominant naval power was a nice morale boost.

During the Civil War, Old Ironsides served as a training ship and was eventually designated a museum ship after retiring from active service. She finished a three-year national tour in 1934, and has been docked at Boston's Charlestown Navy Yard ever since, with a few notable exceptions. Thanks to numerous restorations, the Constitution is still seaworthy…the world's oldest commissioned naval vessel still afloat. She sailed under her own power in 1997 (her 200th anniversary), with Walter Cronkite among the dignitaries who took a turn at the helm. She sailed again in 2012 (pictured), marking 200 years since the 1812 battle that made her famous. The naval history group responsible for the ship's maintenance and restoration has worked hard to keep the Constitution as close to her historical configuration as possible, and estimates that at least 10% of the ship's timber is original to her 18th-century construction.
http://upload.wikimedia.org/wikipedia/commons/2/2f/USS_Constitution_underway,_August_19,_2012_by_Castle_Island_cropped.jpg

Monday, October 20, 2014

The Sydney Opera House opened 41 years ago, following 14 years of construction, on October 20, 1973. Queen Elizabeth II formally opened the building (ah, imperialism!), with fireworks and a performance of Beethoven's Ninth Symphony marking the occasion. Though its name might suggest otherwise, the distinctive "shell"-topped structure on Sydney Harbour actually contains multiple performance venues, which enable it to host over 1,500 performances a year with annual audiences of over a million. Its four primary tenants are local opera, ballet, theater, and symphony companies.
The largest venue is the 2,679-seat Concert Hall, home to a massive 10,000-pipe grand organ. The site played a key role in the triathlon when the 2000 Olympics came to Sydney, and last December it hosted New Year's fireworks for the first time in a decade in honor of its 40th anniversary. The building has become one of Australia's most famous landmarks and most popular attractions, with over 7 million annual visitors.
http://upload.wikimedia.org/wikipedia/commons/a/a1/Sydney_Opera_House_Night.jpg

Sunday, October 19, 2014

AWC always enjoys looking at dates (other than July 4) that were crucial to U.S. independence, and today's might be the most important of all. For all the noble concepts and lofty words in our founding documents, none of them would have meant too much without a decisive military victory in the Revolutionary War, which more or less ended 233 years ago with the British surrender at Yorktown, Va., on October 19, 1781.
The first image that came to my mind of this event was Charles Cornwallis surrendering his sword to George Washington, but that's not accurate. Washington had been at the head of an allied American-French force that surrounded Cornwallis' men in late September, and spent the following weeks tightening the noose around them. But neither general was directly involved in the ceremonial exchange that resulted in the surrender of 8,000 British and Hessian troops, and effectively ended the war in the colonies. Cornwallis claimed illness, and sent a brigadier general named Charles O'Hara in his place. For his part, Washington refused O'Hara's offer of Cornwallis' sword. He wanted it awarded to his second-in-command, Benjamin Lincoln, who had lost a humiliating engagement with the British at Charleston. That loss really stung Washington, by the way. When the British requested traditional honors for their surrender (flags waving, muskets shouldered, and all the rest), Washington remembered they had denied Lincoln's men the same privilege and said no.
While this is a red-letter day in American history, it's worth remembering that it probably wouldn't have happened if Britain and France hadn't constantly been at each other's throats in the 18th century. France was still steamed about its losses in the Seven Years' War and wanted to hold down British power, so they were more than happy to add to British headaches by assisting the American cause. French support was crucial for American independence, and Washington knew it. Fearful that any perceived insult to his ally could torpedo the surrender talks at Yorktown, Washington made sure the French were equal partners throughout the process. For all his military expertise, Washington's political instincts were dead-on as well. We really should honor that guy somehow.
http://cdn-5.britishbattles.com/images/yorktown/surrender-washington.jpg

Saturday, October 18, 2014

Herman Melville's novel "The Whale" (renamed "Moby-Dick" for its American release) was published in London 163 years ago on October 18, 1851. The book has always stubbornly defied classification. In places, it is a dry catalogue of contemporary zoological knowledge; in others, an epic exploration of topics no less vexing than the nature of good and evil, the existence of God, and the encroachment of madness. One scholarly observation notes that the only constant in the odd seafaring book is its skillful use of language: "nautical, biblical, Homeric, Shakespearean, Miltonic, cetological, alliterative, fanciful, colloquial, archaic, and unceasingly allusive."
Regarded today as one of the English language's crowning novels, "Moby-Dick" was a failure during Melville's life. He only saw 3,200 copies sold, and the book was out of print when he died in 1891. The book's reputation only recovered in the 20th century, when Melville's eclectic style found more sympathy. After World War I, faith in old certainties was shaken and artists were more willing to experiment with unorthodox approaches, which fit the long-dead Melville perfectly. Today any serious discussion of contenders for "Great American Novel" has to mention Melville's strange volume.
Ahab's quest to slay the white whale is dramatic enough to have been adapted into a handful of movies (most notably John Huston's 1956 version), but film isn't well suited to capturing the madness of a man who believes that all the world's evil from Adam's sin on lives in a whale's hump…and comes close to making a convincing case. Simply put, this story was made to be read. Make time for it one day if you haven't already.
http://www.openlettersmonthly.com/stevereads/wp-content/uploads/2010/09/moby-dick-signet.jpg

Friday, October 17, 2014

Benji debuted in his self-titled first film 40 years ago on October 17, 1974. The movie made $39 million on a shoestring budget of $500,000 and spawned a minor phenomenon in the '70s and '80s, inspiring follow-up movies, TV specials and a 1983 TV show. The original Benji was a rescue mutt named Higgins. His daughter Benjean played the role in follow-up stories.
Joe Camp created the "Benji" series in part as a pushback against the wide perception that G-rated movies had become unwatchable dreck. Independent filmmakers made relatively wide use of a practice called "four-walling" in the '60s and '70s, which allowed them to keep all the receipts for a movie by paying a flat fee to rent the theater for a few days. With a guaranteed revenue stream, theaters had no incentive to vet the quality of the films, which were generally abysmal family-oriented fare released in the hope that enough kids would beg to see them over a weekend to turn a profit. (There were exceptions, like the hugely successful "The Life and Times of Grizzly Adams," which was distributed the same year as "Benji" using this method.)
Benji-mania has faded now. (An attempt to revive the franchise in 2004 pretty much went to the....no, I can't.) But it left at least one enduring legacy, or so I'm told. It helped inspire the name of your humble correspondent in 1981 (along with the odd biblical patriarch or two). Plus, just look at that face. Go on, look at it. America in 1974 never stood a chance.
http://dogingtonpost.com/wp-content/uploads/2012/06/Benji.jpg

Thursday, October 16, 2014

18-month-old Jessica McClure was rescued from a well in Midland, Texas, 27 years ago after rescue workers and volunteers spent 58 hours drilling. They reached her on October 16, 1987, while viewers around the world watched it unfold live. She had fallen down the abandoned shaft while playing.
The "Baby Jessica" rescue was one of the first stories to get the round-the-clock treatment cable news is known for. CNN was still trying to make a name for itself and justify 24-hour news coverage, and the story was tailor-made for audiences who wanted minute-by-minute updates. A microphone lowered down the narrow shaft allowed people to keep track of the toddler's condition as she cried or hummed. (If you're too young to remember this story but it still seems oddly familiar, congratulations, you're a "Simpsons" fan.)
Some criticized the media circus around the event, but then-President Reagan said "everybody in America became godmothers and godfathers of Jessica while this was going on." Some people used the story for inspiration, like Charlie Musselwhite, a blues musician who released an album called "The Well" in 2010 and said Jessica's ordeal inspired him to quit drinking: "I decided I wasn't going to drink till she got out of that well. It was like I was tricking myself, telling myself that I wasn't going to quit for good, just until she got out. It took three days to get her out, and I haven't had a drink since."
After her rescue, Jessica lost a toe to gangrene, received a hospital visit from then-Vice President George H.W. Bush, and later visited the White House. After the story faded, she disappeared into a relatively quiet life. She graduated high school, married a man she met working at a day-care, and had two kids. When she turned 25 in 2011, she came into possession of an $800,000 trust fund established through donations. She has had 15 surgeries since being rescued, but claims to have no memory of the drama that made her famous.
http://kctv.images.worldnow.com/images/19834831_BG1.jpg

Wednesday, October 15, 2014

Wayne Gretzky became the NHL's all-time points leader 25 years ago, breaking his idol Gordie Howe's record on October 15, 1989. Howe's record was forged over a quarter-century-long career and thought by many to be unbreakable. But the 28-year-old Gretzky only needed 11 years in the NHL to surpass Howe's mark of 1,850 combined goals and assists. Gretzky set the mark on a return to his former home in Edmonton, after the Oilers had traded him to the Los Angeles Kings a year before.
In a truly bizarre spectacle, Oilers fans cheered deliriously when Gretzky, back in their arena in an opposing uniform, scored a tying goal with under a minute to play. Their team had lost the lead, and would lose the game in overtime on a goal by…who else? But the Edmonton crowd gave Gretzky a three-minute ovation for what he had done for them and the game. Gretzky had been so beloved in Edmonton that at least one Canadian politician had tried to block the trade to Los Angeles a year earlier.
Gretzky spent another decade adding to his stats, before retiring with the New York Rangers in 1999. Fifteen years later, he still holds at least 60 NHL records. He ended his career with 2,857 points. His assists total alone (1,963) is more than any other player in NHL history has points, which means that, all else being equal, he could have broken Howe's points record without scoring a single goal.
Post-retirement, Gretzky has racked up a slew of movie and TV appearances, and dipped into multiple business ventures, hockey-related and otherwise. He coached the Phoenix Coyotes for four seasons while serving as part-owner of the team. He owns a restaurant named after him in Toronto, and has invested in sports equipment manufacturing and roller hockey rink operation companies. He and his wife Janet have been married 26 years and have five kids.
http://www.totalprosports.com/wp-content/uploads/2010/10/wayne-gretzky-18511.jpg

Tuesday, October 14, 2014

The attempted assassination of a former president might seem an odd cause for celebration, dear reader, but that is just the strain of bad-assery that Theodore Roosevelt lends to our anthology.

It was 102 years ago that Roosevelt made a campaign stop in Milwaukee on October 14, 1912. The presidential election was quickly approaching its climax, and T.R. was mounting the most spirited third-party challenge for the White House this nation's annals have ever recorded. A former occupant of that most esteemed office from 1901-1909, Roosevelt sought the presidency once more. Disgusted with the policies of his Republican successor William H. Taft, Roosevelt had sought to wrest the party's nomination back. Failing in that endeavor, he girded himself for the general election, vowing to take on both Taft and the Democratic nominee Woodrow Wilson in a fight to the end. Lest you think my language on this matter needlessly dramatized, I shall refer you to Roosevelt's direct words on the matter while accepting the mantle as nominee of his own Progressive "Bull Moose" Party: "We stand at Armageddon and we battle for the Lord."

Returning now to Milwaukee, dear reader, we come to the unfortunate attempt on Roosevelt's mortality earlier referenced. A deranged saloon-keeper, angered by Roosevelt's attempt at a third term, lodged a bullet in his chest. But rather than felling the former president, the projectile was halted short of T.R.'s vital innards by a steel spectacle case and a 50-page copy of Roosevelt's speech folded in his jacket pocket. Realizing he was not coughing blood, Roosevelt correctly surmised that he was in no mortal danger, and delivered his speech as planned. As blood seeped into his shirt, he opened his 90-minute remarks thusly: "Ladies and gentlemen, I don't know whether you fully understand that I have just been shot; but it takes more than that to kill a Bull Moose."

Alas, Roosevelt was not re-elected in 1912. He split the Republican vote with Taft, allowing Wilson to take the prize. But of all the stories Roosevelt left behind -- tales of virility and manhood, of hunting and boxing, of war-time exploits, and peace-time conservation and common cause with the working chap -- it is this one that is perhaps my favorite. Imagine, sir or madam, a candidate for office in our present age withstanding an attempt on his or her person so gamely. And if you can, I declare your imagination to be as hairy and as finely developed as Roosevelt's own superb mustache. Bully!
http://mentalfloss.com/sites/default/files/styles/article_640x430/public/like-a-boss-e1350189178780_6.jpg

Monday, October 13, 2014

Like Columbus stumbling onto the New World, we continue to discover new AWCs. Today is 222 years since the cornerstone was laid for the White House, or the Executive Mansion as it was called at the time, on October 13, 1792. Construction finished in 1800, and John Adams was the first president to occupy the building, relocating from the temporary capital at Philadelphia for the last few months of his presidency.
The building was burned by the British in 1814 during the War of 1812. Dolley Madison is credited with personally saving a portrait of George Washington during the chaos. When the building was reconstructed after the war, white paint was supposedly applied to cover the burn marks, giving the mansion its distinctive hue.
The White House has undergone extensive renovations throughout its history. Theodore Roosevelt needed more space for his rambunctious brood of six children, and relocated all work spaces into the new West Wing in 1902. (Roosevelt also officially named the building the White House, although it had been informally referred to by that name for decades.) The first Oval Office was added during the William Howard Taft administration in 1909, and was relocated to its current spot in 1934. The White House became one of Washington's first wheelchair-accessible buildings during the presidency of Franklin Roosevelt, who was partially paralyzed. By Harry Truman's presidency, the building was in danger of collapse due to poor maintenance. Truman spent two years of his presidency living across the street while the building underwent a complete structural overhaul from 1949 to 1951, the last major architectural changes made to the building.
While the architecture has been preserved since Truman, the interior of the White House has continued to evolve. Jacqueline Kennedy undertook a major restoration and interior decorating project during her husband's presidency in the 1960s, bringing many artifacts and antiques into the building, while Pat Nixon brought over 600 artifacts into the White House. Richard Nixon had a single-lane bowling alley installed in the basement in 1970, while Jimmy Carter's administration was the first to bring computers and laser printing into the building later in the decade. The most notable change made during the Barack Obama presidency might be the solar panels on the roof, which were installed to provide power to the first family's living quarters.
Security has changed a bit since the 19th century, when it was said Thomas Jefferson was just as likely to open the door when you knocked as anyone else. The section of Pennsylvania Avenue around the building is completely closed to vehicle traffic, and the occasional fence-jumper aside, visitors are thoroughly screened. Still, the building is open for tours five days out of seven, and 30,000 people drop by for a visit every week.
http://www.whitehouse.gov/sites/default/files/image/whitehouse_historypg.jpg

Sunday, October 12, 2014

Lift a stein (or just a glass, if that's all you have), because today marks 204 years of Oktoberfest. The first Munich festival was held on October 12, 1810, to mark a Bavarian royal marriage whose details you don't need to know much about. The original festival included horse races, and was enough of a hit to be repeated the next fall. The recurring festival didn't become the suds-soaked event we think of today until at least 1892, when beer began being served in glass mugs.
For more than two centuries, Munich has celebrated Oktoberfest more or less annually, although the occasional cholera outbreak or war has caused several cancellations. The event celebrated its 100th anniversary in 1910 with around 120,000 liters of beer. If hops and barley aren't your thing, feel free to sample all the sausage, potato cakes, sauerkraut, and other Munich delicacies you can handle. Oktoberfest also hosts rides, carnival booths, an agricultural show…but no horse races. Those ended in 1960.
The actual dates of Oktoberfest have moved around a bit. Today, the event is traditionally celebrated for 16 days beginning in late September (to take advantage of some of Munich's last warm days of the year) and running through the first Sunday of October. I say "traditionally" because October 3 is German Unity Day, marking the reunification of East and West Germany in 1990; if that holiday falls on the Monday or Tuesday after the festival's final Sunday, they just extend Oktoberfest an extra day or two. Two world wars aside, the Germans know what they're doing when it comes to this thing. And if a trip to Munich isn't in the cards for you, just enjoy one of the many honorary Oktoberfests that have sprung up in other places (often founded by German immigrants) over the years. So is all this enough to forgive that unfortunate business in the 20th century? I'm not sure, Germany. We might need another round first.
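If you like your beer festivals algorithmic, that scheduling rule sketches out something like this in Python (the function name is mine, and the extension logic is my reading of the modern tradition, so treat it as illustrative rather than official):

from datetime import date, timedelta

def oktoberfest_end(year):
    # The festival traditionally runs through the first Sunday of October...
    oct1 = date(year, 10, 1)
    first_sunday = oct1 + timedelta(days=(6 - oct1.weekday()) % 7)  # Monday=0 ... Sunday=6
    # ...unless German Unity Day (October 3) falls on the following Monday or
    # Tuesday, in which case the party runs through October 3 instead.
    unity_day = date(year, 10, 3)
    return unity_day if first_sunday < unity_day else first_sunday

print(oktoberfest_end(2014))  # -> 2014-10-05, matching this year's September 20 to October 5 run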
http://i.dailymail.co.uk/i/pix/2013/09/21/article-0-1822CEBE00000578-145_964x641.jpg

Saturday, October 11, 2014

Live from my laptop, it's AWC! In the mid-'70s, NBC's Saturday night lineup consisted of reruns of Johnny Carson's "Tonight Show" before Carson asked the network to put something else in the slot. The network approached a 30-year-old writer named Lorne Michaels for a replacement idea, which he helped develop over three weeks. The show premiered as "NBC's Saturday Night" 39 years ago on October 11, 1975. (The network couldn't use the name "Saturday Night Live" because Howard Cosell was already hosting a show by that name at ABC. NBC bought the name and gave it to the weekend sketch show in 1977.)
Whatever it was called, the show had a heavy-hitting comedic lineup from the first night, with Dan Aykroyd, John Belushi, Chevy Chase, and Gilda Radner all helping launch the show. Through nearly four decades and over 700 episodes, the show has consistently served as a launchpad for careers. Outside of the original cast, Bill Murray and Eddie Murphy were some of the earliest big names to take off from "SNL," and the list has only grown from there. (This shot from 1992 is an all-star lineup in itself.) Darrell Hammond has had the longest run of any regular, with 14 years on the cast, and recently returned as the show's announcer following the August death of Don Pardo, who had voiced the show for nearly its entire history. Michaels has also been a near-constant presence. He left the show in 1980 before returning five years later, and has stayed ever since.
"SNL" has won a truckload of awards in its run, which is one of the longest in the history of network TV…but cast members and writers describe the show's development as a go-go environment where you're only as good as the bit you're putting together for the next show. For the record, tonight's episode is all new, with your host Bill Hader and featuring musical guest Hozier.
http://static3.nydailynews.com/polopoly_fs/1.1809512.1401361881!/img/httpImage/image.jpg_gen/derivatives/article_970/saturday-night-live-1992.jpg

Friday, October 10, 2014

Give humanity one thing. For all our shortcomings, nobody has turned the moon into a nuclear launchpad…at least, not yet. That's by design, because the Outer Space Treaty (yep, that's a thing) went into effect 47 years ago on October 10, 1967. It bans participating countries from placing nuclear weapons in Earth's orbit, on the moon, or anywhere else in space. It also bans any use of the moon for military purposes, like testing weapons or building bases, and forbids nations from annexing celestial bodies, such as the moon or other planets. (Don't expect Mars to become the 51st state, since the US is a party to the treaty.)

This stuff might seem pretty far out, but it's nice to know we've imagined a future where it might happen and pretty much collectively said "Nah, let's not go there." As of today, 102 countries are full parties to the treaty, while 27 more have signed it without completing ratification. Russia and China are parties. Heck, even North Korea is on board. It's easy to overstate the importance of this, since only a relative handful of countries have nuclear weapons programs, and only one has made it to the moon. ('Murica!) Countries are notorious for violating treaties as soon as they decide it's to their advantage, and we flirted with modifying this one when Reagan proposed his "Star Wars" initiative in the '80s, which never panned out. But if a future that resembles "Star Trek" more than "Star Wars" is in the cards for us, it would help if we didn't use outer space to blow ourselves up first. It's a pretty low bar. So far we've been able to clear it.
http://mkalty.org/wp-content/uploads/2014/07/Moon-and-Earth.jpg

Thursday, October 9, 2014

The Hoover Dam started transmitting electricity 266 miles to Los Angeles on this date 78 years ago, harnessing the Colorado River to generate hydroelectric power, a job it's been performing since October 9, 1936.

The dam was the largest concrete structure in the world when it was finished. It was intended to provide usable water to the arid desert Southwest, which it accomplished by diverting water from the Colorado into Lake Mead, the country's largest reservoir when full. The generation of electricity was a secondary benefit. The dam also became a key public works project, providing thousands of jobs during the Great Depression. The work was hard and dangerous, and over 100 lives were lost before it was finished.

The dam was interchangeably called Boulder Dam or Hoover Dam during the 1930s and 1940s, as members of the Franklin Roosevelt administration, including FDR himself, tried to discourage associating the dam with Herbert Hoover, under whose administration its construction had begun. By 1947, Hoover was a less polarizing figure as memories of the Great Depression faded, and Congress officially named the dam after him.

Today, the Hoover Dam is a crucial source of water for thirsty livestock, crops, and people in Nevada, Arizona, and California. It also helps keep the region's most bustling cities moving, by providing electricity for Las Vegas, Phoenix, and Los Angeles. Continued population growth along with drought conditions could make water usage a growing problem in the region, but the Hoover Dam is a symbol of man's significant, if limited, power to take control in some of the most inhospitable settings.
http://www.galavantier.com/sites/default/files/styles/single_item_image_gallery_large/public/C45-300-21143-Hoover-Dam-1280.jpg?itok=GVvtTPAy

Wednesday, October 8, 2014

The World Series is baseball's biggest stage, and plenty of big performances have happened under its glare. But only one pitcher has ever been perfect in the Fall Classic, and he did it 58 years ago on October 8, 1956.
Don Larsen didn't appear destined for baseball immortality when the Baltimore Orioles traded him to New York after the 1954 season. In his first two major league seasons, he had a 10-33 won-loss record and gave up more than 4 earned runs a game on average. He looked like an afterthought in a massive 17-player trade, but Yankees GM George Weiss and manager Casey Stengel thought the 25-year-old hurler had potential and insisted he be included in the trade.
Fast forward two years, and Larsen took the mound at Yankee Stadium as the starter for Game 5 of the 1956 World Series against the Brooklyn Dodgers. It was the fourth meeting between the crosstown rivals in the Fall Classic in five seasons, and the Yankees were seeking revenge for a loss to Brooklyn in 1955. With the series even at 2-2, Larsen needed a big day to keep the Yankees from heading back to Brooklyn down 3-2. His Game 2 performance hadn't provided much reason for optimism. He had given up four walks, opening the door for a six-run second inning by the Dodgers, and was pulled before that inning ended as Brooklyn won 13-8 to take a 2-0 series lead. The Yankees had come home to the Bronx and won two straight, and now Larsen (who was better known for his nightlife than his pitching) needed to keep the momentum going.
What followed was a masterpiece. Larsen employed an unusual no-windup delivery he had recently adopted, and needed only 97 pitches to retire every Brooklyn batter he faced. Only one Dodger got to three balls in the count all day, as Larsen sat down 27 men in a row in a 2-0 New York win…the only perfect game in the World Series before or since. Even though baseball has since expanded to a four-tiered postseason, it remains the only perfect game in playoff history. The Yankees went on to win the series in seven games, with Larsen earning the series MVP award.
Larsen won another World Series with the Yankees in 1958, and pitched with various teams before retiring in 1967. At age 85, he has lived through plenty of afternoons…but none of them can quite compare to the October day in 1956 when, for a few hours under baseball's brightest spotlight, the only word to describe him was "perfect."
http://www.lewpaper.com/images/photos/LarsenD338263eCorbis.jpg

Tuesday, October 7, 2014

Science has been coming to grips with the craziness of quantum mechanics for the last century or so, and one of the first guys to help people start wrapping their heads around it (or throwing up their hands completely) was Niels Bohr, born 129 years ago in Copenhagen on October 7, 1885. Bohr rubbed shoulders with the titans of early 20th-century physics, having friendly arguments with Einstein and critiquing papers by Werner Heisenberg. He made lasting contributions to physics like the Bohr atomic model, which describes the behavior of electrons around a nucleus, but his principle of complementarity is what will keep you up at night if you let it.

To put it bluntly, quantum mechanics (the study of reality at a really small level) is insane. The tiniest constituents of matter and energy act nothing like the macro-bodies (chairs, jet engines, Simon Cowell, etc.) that they construct. You'd send quantum mechanics home for being drunk if the existence of everything didn't depend on it. Bohr determined that quantum mechanics described "stuff" (for lack of a better term) that sometimes acts like a good ol' solid particle, and at other times acts like a misty, spread-out wave. It gets weirder: Light is the classic example of this wave-particle duality, showing whichever face your experiment is set up to see…though never both at once, which is the heart of Bohr's complementarity. It's enough to make your macroscopic head explode, but Bohr's idea is completely consistent with every observation made in the funky quantum world. His claim that objects can take on properties that seem contradictory was both a scientific and philosophical problem, and he argued for it in both domains. Combined with contemporary work like Heisenberg's uncertainty principle, Bohr's ideas gave us the beginnings of a road map to start charting the quantum realm.
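
If you want one quick number for why the weirdness hides at our scale, there's the de Broglie relation (a neighbor of Bohr's work rather than his own formula; the electron speed and baseball figures below are my illustrative picks), which assigns a wavelength to anything with momentum:

\lambda = \frac{h}{mv}, \qquad h \approx 6.63 \times 10^{-34}\ \text{J·s}

\lambda_{\text{electron}} \approx \frac{6.63 \times 10^{-34}}{(9.11 \times 10^{-31})\,(10^{6})} \approx 7 \times 10^{-10}\ \text{m}

\lambda_{\text{baseball}} \approx \frac{6.63 \times 10^{-34}}{(0.145)\,(40)} \approx 1 \times 10^{-34}\ \text{m}

An electron cruising at a million meters per second has a wavelength about the size of an atom, so its wave nature shows up everywhere; a baseball lobbed at 40 m/s has one roughly 19 orders of magnitude smaller than an atomic nucleus, which is why you'll never catch Simon Cowell visibly diffracting.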

Back here in the non-quantum world, he did some pretty cool stuff too. Half-Jewish on his mother's side, he helped refugees from Nazi-occupied countries find temporary jobs in Denmark in the 1930s, before fleeing to Sweden ahead of an imminent arrest by the Germans in 1943. He eventually consulted with nuclear experts in Britain and America, including the men working on the Manhattan Project. He claimed to have had no role in developing the atomic bomb, but Robert Oppenheimer credited him with an important technical contribution.

After the war, Bohr helped develop the research organization CERN, which is still working to unlock the secrets of reality with the Large Hadron Collider. He died of heart failure in 1962. Bohr received several honors during his life, including the 1922 Nobel Prize in Physics, and he also has an asteroid, a lunar crater, and the synthetic element bohrium named after him. To get an element named after you, you might be a great scientist, philosopher, or citizen…or you might be all of them at once. That's a complementarity principle anyone can understand.
Via http://www.nbi.ku.dk/english/namely_names/2011/anniversary_of_the_niels_bohr_archive/Bohr-200.jpg

Monday, October 6, 2014

Keep your voice down, but the American Library Association is 138 years old today, after 103 librarians convened in Philadelphia on October 6, 1876. They adopted the aim "to enable librarians to do their present work more easily and at less expense." Today, the ALA is the world's largest and oldest library association, with over 62,000 members.

You might think a library association would be limited to pretty dull work (ooh, another newsletter about the Dewey decimal system!)…and you would be wrong. Over its history, the ALA has waded into some very contentious political battles, most notably over censorship and freedom of information after "The Grapes of Wrath" was banned in many places, but also over topics like segregation, war, and library unions.

The ALA took a stand favoring equal service for all in 1961, while some members objected to a pro-Vietnam War speech a general delivered at their 1967 conference. Increasing access to libraries for the poor has also been an important topic for the ALA. Multiple times, the ALA has spoken out against censorship and in favor of the freedom of information. The ALA also founded the first professional organization centered on gay and lesbian issues, "The Task Force on Gay Liberation," today called the GLBT Round Table, and annually presents the Stonewall Book Award for exceptional books dealing with LGBT issues.

Fun fact: Anyone can join the ALA (kinda like getting a library card). There are 11 divisions dealing with issues ranging from publishing to reference services to service for children, as well as for both public and research libraries. With 25,000 attendees, the ALA's annual conference is one of the largest professional conferences anywhere…and believe me, those librarians know how to party. Just keep it down, will you? This is the really good part.
Via http://www.teleread.com/wp-content/uploads/2013/01/AmericanLibraryAssociation.jpg

Sunday, October 5, 2014

007 made his leap from page to screen 52 years ago with the London debut of "Dr. No," the first film based on Ian Fleming's James Bond books, on October 5, 1962. (It opened throughout the UK in the next few weeks and came to American theaters the following year.) The first Bond film presented financial and content-related challenges. It was produced on a $1.1 million budget, low even by 1962 standards. Some of the corners cut included using cardboard paintings in M's office and magnified goldfish footage in Dr. No's aquarium. Director Terence Young was concerned about getting the books' sex and violence past the film censors, and relied on a heavy dose of humor to lighten what could have been gritty content, an approach he referred to in British slang as "taking the mickey out."
With a 30-year-old Sean Connery in the lead role, the formula worked, and the movie's $59 million box office take guaranteed that Bond (James Bond) would be sipping martinis and swapping leading ladies on screen well into the future. In addition to kicking off the Bond series itself, the movie launched a '60s spy craze that influenced TV shows like "The Man from U.N.C.L.E.," while Ursula Andress' bikini-clad role boosted two-piece swimsuit sales and gave Fleming's Bond books a bump. And of course, without the series we'd have no Austin Powers.
More than 50 years later, Bond has remained eternally young and dapper through 23 films and six actors (and enough "shaken, not stirred" cocktails that some experts have speculated he should have long ago died of alcohol poisoning). This has led some fans to theorize that the 007 codename is actually recycled for different MI6 agents over time…or maybe they just regenerate, like Doctor Who. Whatever his secret, Bond has pulled in around $5 billion in film revenue over half a century, and it's estimated that a quarter of the globe's population has seen at least one of his movies. The next movie (#24) is set to begin production in December, with a budget that will presumably exceed $1.1 million.
Via http://www.wired.com/images_blogs/dangerroom/2012/10/dr-no-1962.jpg

Saturday, October 4, 2014

The faces of Mount Rushmore started slowly emerging from the granite 87 years ago, as a crew of 400 workers under the direction of sculptor Gutzon Borglum began carving the monument on October 4, 1927. The vision for the monument changed drastically both before and during its construction. A local historian thought that a monument in South Dakota's Black Hills would boost tourism to the area, but he envisioned a tribute to Western heroes like Lewis & Clark and Buffalo Bill Cody in the region's granite pillars known as the Needles. When Borglum came onto the project, he thought the Needles offered low-quality granite and that the Western focus was too limited. He settled on the face of Mount Rushmore as the site for a nationally-focused memorial featuring the faces of America's most admired presidents.
Washington, Jefferson, and Lincoln had long been part of the national pantheon and were easy choices. Borglum rounded out his quartet with Theodore Roosevelt, a more contemporary choice who had only died eight years before construction started. Calvin Coolidge, who was president at the time, insisted that in addition to Washington, two presidents from his Republican Party and one Democrat be represented. The choice of Roosevelt, a progressive Republican who had broken with his party and run for a third term on a third party ticket in 1912, resulted in a remarkably balanced partisan and ideological mix of faces on the mountain.
The faces were completed and dedicated one at a time between 1934 and 1939: First Washington, then Jefferson, then Lincoln, and finally Roosevelt. (Jefferson was originally supposed to go on Washington's right side, but the rock there was deemed unsuitable, so the original attempt at Jefferson's likeness was dynamited and a new one started on Washington's left.) There is no true completion date for the monument, because it was never completed to Borglum's vision and probably never will be. He originally planned to carve all four presidents to the waist, but funding for the project ran out and construction ended on Halloween in 1941, seven months after Borglum had died from an embolism. Impressively, not a single worker was killed during the 14 years of work on the monument.
Today the site has a visitor center, a trail, a chamber containing biographies of the presidents and the text of the Declaration of Independence and the Constitution, and a theater with a film about the history of Mount Rushmore. It attracts over 2 million visitors a year.
(image source: http://upload.wikimedia.org/wikipedia/commons/1/1f/Mountrushmore.jpg)

Friday, October 3, 2014

Thanksgiving is still a few weeks away, but it's a good time to start planning your turkey and football regimen, thanks to a pair of proclamations on this date. For most of the country's early history, Thanksgiving had no set date, and basically happened whenever the government thought it was a good idea. The Continental Congress had declared days of thanksgiving during the Revolution, and George Washington continued that tradition by proclaiming the first national Thanksgiving 225 years ago on October 3, 1789, in a signed declaration setting aside that November 26 as "a day of public thanksgiving and prayer" for the young country. Washington waited six years before declaring another Thanksgiving in 1795.
After that, the holiday became as fluid as a plate of congealed gravy. John Adams declared two Thanksgivings, Thomas Jefferson declared none (supposedly believing the holiday blurred the distinction between church and state), and James Madison declared two in 1815 alone…but neither was in the fall. Some states started declaring their own Thanksgivings, with Southern states recoiling at the holiday's New England Puritan overtones.
This messy state of affairs prevailed until 1863, when Abraham Lincoln signed a declaration 151 years ago on October 3, setting aside the last Thursday of November for Americans to express thanks for recent victories in the Civil War. (Presumably, this went over no better in the South than the Puritan celebrations had.)
From there, Thanksgiving found its annual home on November's final Thursday, until Franklin Roosevelt tried to goose holiday shopping during the Great Depression by moving it to the next-to-last Thursday of the month from 1939 to 1941. (It was considered poor taste for retailers to advertise Christmas wares until after Thanksgiving.) This irritated enough people that 1939 was said to be the year of two Thanksgivings, with "Republican Thanksgiving" honoring Lincoln's memory on November 30, and "Democratic Thanksgiving" sticking with FDR's plan on November 23.
In 1941, Congress finally nailed the holiday down, but it actually split the difference between Abe and Frank. Instead of returning Thanksgiving to its traditional final November Thursday, Congress set the holiday on the fourth Thursday of November, which is where we find it today. There were some holdouts in years when November had five Thursdays, with Texas holding its own Thanksgiving as late as 1956. As a born-and-bred Southerner, I'm just glad we finally got on board at all.
(Image source: http://media.salon.com/2013/11/lincoln_thanksgiving-620x412.jpg)

Thursday, October 2, 2014

Submitted for your approval: An AWC from a nondescript date 55 years ago. On October 2, 1959, Mr. and Mrs. John Q. Public of Anytown, USA, fired up the cathode ray tubes inside their electronic picture box and were greeted with a most unusual communiqué… a bizarre transmission from a set of coordinates that existed beyond any cartographer's knowledge, a new destination by turns intriguing and repulsive, known only as "The Twilight Zone."
By the late 1950s, writer-producer Rod Serling was chafing at the limits network television imposed on him. Sponsors got veto power over his scripts, while any controversial topic, like racism, could arouse the censors. Serling wanted a way to tell his own stories within the confines of a TV industry that had no cable alternative, and he found it with an ingenious approach: An anthology series that could use sci-fi tropes like aliens, robots, and time travel on the surface, while actually addressing topics that TV wouldn't touch openly.
Serling had a hand in every aspect of the show, acting as executive producer, writing or co-writing 92 of its 156 episodes, and serving as host and narrator. He wasn't necessarily an egomaniac, though: he originally wanted Richard Egan, a contemporary actor with a resonant voice, to host, and reportedly said in frustration, "It's Richard Egan or I'll do the thing myself."
"The Twilight Zone" lasted for five seasons on CBS, playing to critical acclaim but middling ratings. Since ending its run in 1964, the show has inspired two revival series, a movie, and a Disney theme park attraction. Syfy has made marathons a twice-annual tradition, on New Year's and the Fourth of July. Serling foresaw none of this in 1964, and sold all of his rights to the property to CBS, hoping to use the sale to get out from under some of the debt his studio had taken on to produce the show, which was notorious for going over budget. He died aged 50 in 1975 following a string of heart attacks, prematurely ending a life of constant activity that left a legacy he never fully witnessed. So ends a tale of industry and a reminder that hard work has its rewards, but they're sometimes deferred until long after the account can be withdrawn…a paradox common to this world, and the Twilight Zone.

(image source: http://fc01.deviantart.net/fs70/f/2013/040/e/6/he_is_the_twilight_zone_by_healer-d5ubjox.jpg)