Wednesday, December 31, 2014

There are moments when the world changes in an instant…and one of them came 135 years ago, when Thomas Edison drove out the darkness that had haunted human dreams from time immemorial. With the flick of a switch, Edison pierced the night with the first public demonstration of the incandescent light bulb. It fittingly happened on the very last day of the 1870s, on December 31, 1879. That decade would be the last in human history lived in the dark.
Edison was already famous by 1879, already dubbed “The Wizard of Menlo Park” for his amazing invention of the phonograph two years earlier. Now he claimed to have solved a puzzle that had been stumping inventors throughout the 19th century…the riddle of producing electric light, and doing so in a way that was practical for widespread use. Early attempts had been ongoing since at least 1802, but they were all plagued with problems. They burnt out too quickly, cost too much to make, or drew too much current to be practical.
Edison had fiddled with the problem, using different materials to produce light. He settled on a carbon filament that would last for 1,200 hours. After successfully testing the bulb earlier in the year, he arranged a New Year’s Eve demo in 1879 where he would light up a street in Menlo Park, New Jersey. There was so much excitement that the railroads ran special trains to Menlo Park just for the occasion.
After Edison’s successful demo, the age of the light bulb came quickly, at least to the cities. Edison and his crew outfitted the steamship Columbia with electric lights the following year. By 1882, he had set up New York’s first electrical power plant on Pearl Street in lower Manhattan, and switched on a steam-generating power station in London. By 1887, Edison had 121 power stations in the United States. He said "We will make electricity so cheap that only the rich will burn candles."
Edison achieved his goals with efficiency, and no small streak of ruthlessness. He vehemently campaigned against the use of AC (alternating current) electrical systems, which had been advanced by the work of his former employee-turned-rival Nikola Tesla, in favor of his own DC (direct current) distribution system. He told the public AC was more dangerous, and helped develop the electric chair using AC to back up his point. His employees electrocuted stray animals using AC to demonstrate its lethality. (The fact that all electrical currents are potentially dangerous was glossed over for a public still coming to grips with electricity.)
While rightly celebrated for his genius as an inventor, Edison was also powered by the instincts of business and showmanship. He hired men with the mathematical know-how to put his intuitions into practice, and then reaped the public glory for the work his team produced at Menlo Park…sort of the Walt Disney of inventing. But that application of large-scale industrial processes to science was itself part of his genius, and helped him earn his reputation as one of the greatest inventors of all time. When he died in 1931 at age 84, he held 1,093 U.S. patents…and it is impossible to imagine modern life without the innovations he helped develop, like recorded sound and motion pictures. But most of all, Edison cast his glow on all of us when he ushered in the 1880s, and a new world…when he let there be light.
http://i2.cdn.turner.com/cnn/dam/assets/111230100454-sloane-edison-story-top.jpg

Tuesday, December 30, 2014

Rudyard Kipling (born 149 years ago on December 30, 1865) lived an imperial life, and the tensions that marked this long-vanished way of living have run through his body of work and complicated his reputation. He was born in Mumbai (then called Bombay), and his parents considered themselves Anglo-Indians, reflecting both their English heritage and their decision to live in an India that was still part of the vast British Empire. His recollections of his early life reflected this strange mix of Victorian and Hindu culture, as he recalled being told “Speak English now to Papa and Mamma” by the nannies who cared for him and told him Indian nursery stories and songs in the local vernacular.
While Kipling spent much of his life in either India or England, it was during a stay in New England that the germ of what would become his most famous work came to him. Perhaps the snow piled up outside the windows of the Vermont cottage where his wife gave birth to their two daughters during a four-year stay in America made him think fondly of his time on the steamy subcontinent; he later recalled beginning work on “The Jungle Book” during a long winter. Kipling claimed to populate the stories with everything he knew or “heard or dreamed about the Indian jungle.” The stories were published in magazines in 1893-94, and then collected in two books. Some of the stories involved a boy named Mowgli being raised by wolves and learning lessons from different animals, and they are the source of Kipling’s most enduring fame.
Kipling was a prolific writer, however, and churned out many other works during his life, leading to the Nobel Prize in Literature in 1907, awarded in honor of his “power of observation, originality of imagination, virility of ideas and remarkable talent for narration.” He was the first English-language writer to win the six-year-old award.
Kipling was widely celebrated in his day, although the imperialist tone of his works has complicated his reputation since his death at age 70 in 1936. But while he celebrated aspects of empire (he was a strong proponent of England’s effort to subjugate the Boer population in South Africa), he was clear-eyed on some of the most pressing moral questions of his day. The swastika, an ancient good-luck symbol in India, appeared on the covers of some of his books throughout most of his life. But after Hitler came to power in Germany, Kipling ordered that it stop being used on his works.
He was also fond of the outdoor life, and allowed characters from the Jungle Books to be associated with Scouting. His poem “If—” is a collection of fatherly advice (written some two decades before his own son died in World War I) and ends with the words “you’ll be a Man, my son.” One Indian historian has described the Victorian verse as the essence of the Hindu scripture, the Bhagavad Gita, in English. Rudyard Kipling took the best from every culture he encountered…perhaps the most that could be asked of a life lived in the shadow of empire.
http://i.dailymail.co.uk/i/pix/2013/09/24/article-2430993-183BAD0F00000578-472_306x423.jpg

Monday, December 29, 2014

Writing about James Joyce is a unique challenge, since he paid so little heed to the conventions of writing during his career. His works are famously difficult to read, marked by a stream of consciousness that attempts to capture in print the mind’s annoying habit of jumping between sensations, smells, colors, and intelligible thoughts without structure. (When he wrote “Ulysses,” it contained the longest “sentence” in written English…4,391 words of a character’s unbroken inner monologue.)
He also wrote frankly about sex and bodily functions, shocking the Victorian sensibilities of his audiences in the early 20th century, but reasoning that such things needed to be included if he was to accurately portray the inner lives of his characters, who after all thought about such things the same as everyone else. (“Ulysses” famously survived an obscenity challenge by the U.S. Attorney in 1933, with the judge and defending lawyer noting that as they discussed a serious legal case, their thoughts drifted onto neckties or chairs in the room...trivial distractions that illustrated Joyce perfectly.) His works have some autobiographical elements thrown in as well, but filtered through fictional lenses, making it hard to distinguish reality from fiction. All of these elements were present in his first novel, “A Portrait of the Artist as a Young Man,” published 98 years ago on December 29, 1916.
Like Joyce’s books, the story of how his first novel came into being is convoluted and not easy to follow. It began as a work heavy on philosophy called “A Portrait of the Artist” that was rejected by the editor of an Irish literary magazine on the grounds that “I can't print what I can't understand.” Joyce tried to salvage some of it by reworking it into a realistic biographical novel called “Stephen Hero.” About halfway through it, Joyce again abandoned this work and decided to repurpose the whole thing into the much more experimental final product, a sort-of biographical coming of age story which also included chapters on the philosophy of St. Thomas Aquinas plucked from his own notebooks. (In recycling two failed projects and a set of theological notes, Joyce showed nothing if not efficiency.)
The final “Portrait” is, not surprisingly, difficult to explain. Its protagonist, in his wrestling with the social and religious conventions of his Irish Catholic upbringing, reflected Joyce’s own experiences. (Joyce was unable to overcome his own lack of faith to take part in confession or communion even as his mother begged him to from her deathbed.) “Portrait” has some mechanics of plot, but is largely notable as a glimpse into the developing mind and consciousness of its hero…or perhaps not. (Judgment of the protagonist is largely left to the reader.)
Joyce struggled to get the novel published in any of its forms, and one day threw the manuscript into the fire in frustration. (It was saved by his more level-headed sister.) It was finally serialized, with Ezra Pound’s help, in the magazine “The Egoist” from 1914 to 1915 before being published as a single volume on this date. From there, Joyce was on his way, and he would continue confounding readers until his death in 1941.
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiCU4zrSoSscW8vEYTPhZYTzCZbARl5VgOccsqXfEBzWZ0Kq7Zrcp0C82MrsfVn6goBoEo2wPCpZDmsgINeS9qloofTK4aHO-UExnRrthvCBzVPlVFCZ468WkZ4KVSW5DezRkNReClzEqQ/s1600/Portrait+Of+Artist.jpg

Sunday, December 28, 2014

 It’s a Sunday after Labor Day, which means the NFL is dominating living rooms and Facebook feeds again. Professional football passed baseball as America’s sport of choice years ago…but it wasn’t always so. In fact, the gridiron game’s surge in popularity can be attributed in large part to a game played 56 years ago on December 28, 1958.
The NFL was holding its 26th championship game in 1958, but television coverage of the sport had been spotty. NBC had picked up the rights to televise the league championship game three years earlier for $100,000, but was still looking for the breakout game that would justify the investment. They got it on this date, when the New York Giants and Baltimore Colts met at Yankee Stadium to decide the league title.
For a time, it looked like the Colts might take an easy win. Led by Johnny Unitas under center and paced by a big day from receiver Raymond Berry (his 12 receptions would be a championship game record until this year), Baltimore held a 14-3 lead at halftime and was looking to punch it in again from the Giants’ 1-yard line early in the third quarter. But the New York defense stood strong and stuffed Alan Ameche on third and fourth down, forcing the turnover on downs. The Giants needed just four plays to go 95 yards for the score, making it 14-10. New York took the lead 17-14 in the fourth quarter, and then punted to the Colts with two minutes remaining.
Unitas took over on his own 14-yard line and engineered a two-minute drill, long before the phrase came into common use. Berry came up big, moving the ball 62 yards on three consecutive catches and setting up a tying field goal with just seven seconds left. For the first time in the NFL playoffs, a game was going to be decided in overtime. Unitas later recalled that the players had never heard of the idea, and were standing around waiting to see what would happen before the officials walked out for the overtime coin toss.
In the sudden death period, New York got the ball first but went three-and-out. With Unitas handing off to Ameche and throwing to Berry, the Colts took advantage of New York’s fatigue, moving 80 yards up the field. It was all over when Ameche (nicknamed “The Horse”) stuck his horseshoe-painted helmet out and plunged into the end zone from a yard away, giving the Colts the 23-17 win. But the country almost didn’t see the dramatic finish. Earlier in the drive, NBC’s television feed had gone dead thanks to an unplugged cable. Someone ran onto the field and delayed the game just long enough for the network to restore the feed. Rumors later said the distraction was caused by an NBC employee.
They called it “The Greatest Game Ever Played,” and up to that point it was a fair distinction. It was exactly what the NFL needed: An exciting game in a championship setting with the whole country watching. The league’s popularity continued to grow. Two years later, the flashier AFL would come on the scene, leading to the football wars of the 1960s, before the two leagues merged into a stronger mega-league with franchises all over the country. The bigger and better NFL needed a grander title for its championship game, and so audiences got the Super Bowl. But for all its pomp, the Super Bowl has never seen a game decided in overtime…which makes the drama of Giants-Colts in 1958 still unique in the league’s history.
http://rayonsports.com/wp-content/uploads/2014/10/nfl_greatestgame12_576.jpg

Saturday, December 27, 2014

“Peter Pan” debuted on the London stage 110 years ago on December 27, 1904. The play by Scottish author and playwright J.M. Barrie was subtitled “the Boy Who Wouldn’t Grow Up,” and it was an instant success, traveling across the Atlantic to play on Broadway the following year. While it was this play that introduced the Peter Pan mythos to a wide audience, it wasn’t Barrie’s first use of the eternally young flying boy, who had appeared in a few relatively light chapters of a darker novel called “The Little White Bird” in 1902. After the success of the play, Barrie revisited the novel, pulling out the relevant chapters and re-releasing them in a children's book called “Peter Pan in Kensington Gardens.”
As a journalist, novelist and playwright, Barrie left behind plenty of writings and had multiple successes during his lifetime. But the ever-youthful Pan has outlived him, and proven to be his lasting mark on the culture. The stories are pretty airy stuff (at least on the surface), but their origin contains more somber material. Barrie devised the character of Peter Pan from stories he told the sons of a family he befriended, whom he dressed up as pirates and told that their baby brother Peter could fly. The boys’ parents both died between 1907 and 1910, leaving Barrie and the boys’ nurse as their co-guardians. It’s also been suggested that his own brother’s death in an ice skating accident two days before his 14th birthday inspired Barrie to create the character. While Barrie’s mother was devastated at losing her son, she seemed to take comfort in knowing that he would forever be a boy.
True to his nature, Peter Pan hasn’t aged a day, even as the culture has grown up around him…and he still pops up from time to time as a reminder of eternal youth. A musical adaptation from 1954 has supplanted Barrie’s original play in popularity in the United States, and it was that version which brought audiences a singing Christopher Walken as Captain Hook in a live TV production earlier this month…just the latest in a string of adaptations running from the silent film era to Disney to Robin Williams and beyond. (Gisele Bundchen was a slightly more, ahem…grown up…Wendy in this photo by Annie Leibovitz, with Tina Fey as Tinker Bell.) Back in the U.K., the original play has given way to pantomime showings of the story, which are popular around Christmas. Whether in 1904 or 2014, audiences seem to enjoy having a Neverland to escape to every now and then.
http://timerime.com/user_files/62/62736/media/peter_pan_gisele.jpg

Friday, December 26, 2014

In August 1965, the Watts section of Los Angeles exploded in a week of race riots that devastated the largely black neighborhood. In the wake of the riots, Maulana Karenga, an activist with a deep interest in African studies who went on to become a professor of the topic, came up with the idea of a holiday which would unite black Americans in a celebration of their common African roots. It was a weeklong festival called Kwanzaa, from a Swahili phrase for “first fruits of the harvest,” and it was first celebrated 48 years ago on December 26, 1966.
Kwanzaa is an African-American holiday, in the truest sense of that phrase…a celebration of African heritage conceived by a California activist and academic. It celebrates seven principles expressed through Swahili words for ideals like unity, self-determination, creativity, and faith. It runs for seven nights, beginning on the day after Christmas and ending on New Year’s Day. Like any solstice holiday worth its salt, Kwanzaa knows the value of symbolism, and symbols are a central part of its celebration, including a candle holder with seven candles, a communal cup, African drums, and crops like corn.
Karenga himself has a complicated past, which, as founders of tradition go, is not unheard of. (See Clement Clarke Moore from a few days ago for another holiday figure with some particularly rough edges.) He was born in Maryland as Ronald Everett in 1941 before moving to California and adopting his Africanized name as a student of Africana (“Maulana” for master teacher, and “Karenga” for keeper of tradition). He established a community organization called US, which clashed violently with the Black Panthers at the height of the Black Power movement in the 1960s. In 1971, he was sent to prison on some particularly nasty assault charges. He has since been released, and has maintained he was a political prisoner sent away on false charges. Guilty or not, he has devoted his life since prison to improving the lot of black Americans. He has earned two Ph.D.s and is currently a professor of African studies at Cal State-Long Beach…and the holiday he began in 1966 is still going strong.
In its early days, Karenga described the holiday as an alternative to Christmas. But as Kwanzaa has become more mainstream over the years, he appears to have softened that stance, emphasizing the inclusive and family-centered aspects of the holiday. Over almost half a century, Kwanzaa has earned a respectable place in the constellation of holidays that fall around the first days of winter. While Christmas and Hanukkah have gotten a pretty big head start, Kwanzaa has carved out a considerable space for itself around this time of year. The number of people who celebrate it is hard to know, but one 2004 survey suggested that Kwanzaa might be celebrated by around 5 million people in the United States. It has also gained a sizable following in Canada.
http://www.allianceabroad.com/wp-content/uploads/Kwanzaa1.jpg

Thursday, December 25, 2014

Christmas miracles pop up every December on TV and in movies…but the events that occurred 100 years ago on December 25, 1914 come about as close as you’ll find in the annals of history. It was the first Christmas of World War I, and hundreds of thousands of lives had already been taken by a war that was just getting started. Along the Western Front, the fighting had settled into the dreary form of trench warfare that would define the war for future generations. Young men faced the prospect of a holiday away from loved ones, with the guns of war blaring all around. But then, in places all along the Western Front, the guns fell silent.
To be clear, there was no official truce by either side. But all along the front, men just stopped fighting. In some places, it began on Christmas Eve, with German and British voices volleying Christmas carols across the muck, corpses, and barbed wire of No Man’s Land separating the trenches. (The Germans sang “Stille Nacht,” and the Brits responded to the same tune with “Silent Night.”) German troops lit small Christmas trees with candles (a tradition with deep German roots), while Allied lookouts peered over the parapets and wondered what was happening.
On Christmas morning, German troops stepped warily toward Allied positions yelling “Merry Christmas” in English. And while many suspected a trap, other British, French, and Belgian troops walked out to meet their enemies. Gifts were exchanged -- German beer and sausage for English plum pudding, along with tobacco, hats, and pins -- and a number of impromptu soccer games broke out. No one knows exactly how many troops decided to lay down their arms that holiday, but the best guess is that 100,000 soldiers celebrated Christmas with men they had previously been (and would soon return to) trying to kill.
Touching as this scene is, it should be placed in context. World War I would drag on for another four years, snuffing out millions of lives. It seems almost certain that some of the soldiers who had exchanged presents or played soccer together died at one another’s hands in the days that followed. (At least two men were killed during the unofficial truce itself.) And ensuing Christmases (as well as other holidays, like Easter) would never see a spontaneous outbreak of peace along the front lines to the extent of Christmas 1914. The officers on both sides saw the “live and let live” attitude that developed among men who fought at close quarters with the enemy, and did their best to stamp it out, sending strict orders against fraternization. The war itself also extinguished many of the brotherly feelings combatants might have felt in 1914. As the war dragged on, more and more vicious means of attack (like mustard gas) made the idea of truce impossible.
What we’re left with is one holiday celebration a century ago fueled by men dreaming of Victorian era Christmases back in London and Berlin, and doing their best with what they found in the blood and mud covering the ground in Flanders. In the middle of the bloodiest conflict the world had seen up to that point, history records a genuine outbreak of peace on Earth, good will toward men. Christmas may not be able to work miracles, but you’d be hard pressed to find a day that’s come any closer than that.
http://upload.wikimedia.org/wikipedia/en/4/42/Illustrated_London_News_-_Christmas_Truce_1914.jpg

Wednesday, December 24, 2014

Like many interesting stories, NORAD’s foray into the Santa-tracking business began with a mistake. NORAD claims it was 59 years ago on December 24, 1955 when the staff working Christmas Eve at the Continental Air Defense Command center in Colorado Springs started getting unusual phone calls. Kids kept calling asking to talk to Santa Claus, who was obviously a little busy at the moment. It turned out the source of the calls was a Sears ad that ran in a local newspaper, inviting the kiddies to call Santa on his personal hotline. But the phone number was misprinted, and all the calls were being routed to CONAD’s nerve center instead.
With no threat to North American airspace imminent, the commanding officer on duty, Col. Harry Shoup, told his crew to give anyone who called asking about Santa a “current location” as St. Nick made his trip around the world. A tradition was born which followed CONAD into its rebranding as NORAD in 1958, and it has kept up with the times ever since. (An alternate version of this story says CONAD embellished the tale of one kid mixing up the phone number to justify the Santa operation, seen as a way to make people comfortable with a pervasive military tracking apparatus that never went to sleep during the Cold War. Decide for yourself which is more likely.)
Whatever the reason for NORAD Tracks Santa (as it’s officially called), the idea is to use every bit of tracking and communication technology available to the North American Aerospace Defense Command (a combined air defense cooperative of the U.S. and Canadian militaries), and it is quite an operation. Over the years, NORAD has enlisted telephone, radio, newspaper, and TV outlets to assist in the mission of keeping youngsters updated on Santa’s whereabouts before they go to bed. In the internet age, NORAD has used smartphone apps, Google Earth, and CGI videos tracking Santa…with the help of celebrity narrators. (Ringo Starr lent his voice to the effort when Santa reached London 10 years ago.) NORAD claims to have logged 19.5 million unique online visitors to its Santa website last year.
Of course, the whole thing started with the telephone, and that’s still a major component. NORAD claims to have fielded 117,000 phone calls in 2013, answered by an army of 1,200 honorary elves (including Michelle Obama, who has participated for the last five years). This kind of reconnaissance mission is expensive, of course, but taxpayers aren’t on the hook. The whole thing is financed through corporate sponsorship, thanks to deals with companies like Google and Microsoft. So if you look up tonight and happen to spot a red-tinged anomaly in the sky, rest assured the big boys are keeping a close eye on it.
http://www.cornwallseawaynews.com/media/photos/unis/2013/12/23/2013-12-23-11-26-50-NORAD_-_Canada_-_Santa_Radar_Tracking.jpg

Tuesday, December 23, 2014

The opening line states “’Twas the night before Christmas,” but it was actually a day early when the famous poem was published anonymously in Troy, N.Y.’s Sentinel newspaper 191 years ago on December 23, 1823.
We know now that the poem, titled “A Visit From St. Nicholas” on publication, was most likely penned by Clement Clarke Moore, but he wasn’t publicly identified as the author until 1837. For 14 years, the poem was anonymous. It’s perhaps better off that way, because Moore is a complex character who could only be produced by a certain time in American history. A Northerner and a seminary professor who taught Eastern literature and the Bible, he was also a slaveholder who opposed abolition and a polemicist who penned harsh political attacks on Thomas Jefferson over his religious views. Maybe he sensed that his little poem was best left untangled from all of that. More likely, he just thought the simple verses weren’t fitting for the resume of a serious scholar like himself.
Regardless of that, Moore’s poem is the reason he is remembered at all…and it has defined a central part of our Christmas tradition in this country. Before “’Twas the Night…” came along, the imported European tradition of St. Nicholas was, like a bowlful of jelly, hard to nail down in its stateside form. The fur-clad saint, known as Sinterklaas by the Dutch, was a mysterious fellow. He gave gifts secretly, but people were free to imagine him any way they wanted.
Moore nailed down that tradition, and many parts of his interpretation have clung to St. Nick like ashes and soot ever since. A white-bearded jolly fellow, with a round belly, who comes down the chimney on Christmas Eve, with a bag full of toys, then takes off in a sleigh pulled by eight reindeer: Every word of that description is straight from Moore.
What’s also interesting is which parts of Moore’s vision didn’t stick. Moore’s text is clear that his St. Nick is not a large fellow. “Miniature sleigh,” “tiny reindeer,” “jolly old elf”…these descriptions explain why it would be easy for Moore’s saint to fit down a chimney, but they’re overlooked in a culture that doesn’t do anything small. St. Nick’s heavy smoking (to the point that smoke encircles his head like a wreath) has also been toned down, which is probably a good thing, since he’d be fined for lighting up publicly in most parts of the country these days.
The “death of the author” school of criticism holds that a writer’s intention is far less important than what readers see in a work. And while readers have been selective in what they’ve retained from Moore’s story and attached to our Santa Claus, what they have retained has been remarkably consistent and durable. Moore’s verses have been called the most famous in American writing. Tomorrow night, countless kids will be on edge because of the yarn he wove. For a holiday that had been centered around a hectic and harried Christmas Day, Moore injected quiet anticipation into the night before. He populated one night of the year with magic which hasn’t faded yet. Whatever else he may have done in his life, that’s not a bad legacy at all.
http://studentweb.cortland.edu/Karen.Jordan/finalproject/looking%20%20out%20window.jpg

Monday, December 22, 2014

If a music lover could buy a ticket to any public event in the past, there would be worse choices than the marathon concert held at Vienna’s Theater an der Wien 206 years ago on December 22, 1808. (Just be sure to pack a coat if you ever manage it.)
On the program: A night of premieres by the master himself, Ludwig van Beethoven. The hardy souls who braved the early winter temperatures in the unheated theater and the four-hour running time were rewarded with being among the first ears to hear Beethoven’s Piano Concerto No. 4, the concluding Choral Fantasy (which brought all of the night’s musical elements – piano, chorus, and orchestra – together for a stirring finale), and not one but two symphonies…Beethoven’s Sixth, and probably the best known piece of classical music in existence, Beethoven’s Symphony No. 5.
You know Beethoven’s Fifth, whether you think you do or not. (Da-da-da-DAAAAA. Yep, that one.) The opening notes were famously described by one Beethoven contemporary as fate knocking at the door. The piece became known as the “V for Victory” Symphony for the Allies during World War II, thanks to some interesting coincidences: The short-short-short-long rhythm of the opening matches Morse code for the letter “V” (dot-dot-dot-dash), and the same letter represents the number 5 in Roman numerals, creating another connection with Beethoven’s Fifth. (The fact that Beethoven was German wasn’t lost on the Allies, as they used his symphony to rally their troops against the Nazis.)
Anyway, back to Vienna in 1808. The first public performance of Beethoven’s Fifth didn’t receive the immediate praise you might expect. But give the folks a break: It was cold; the piece didn’t begin until more than two hours into the bloated program (there were eight pieces on the bill in all, along with an intermission); and the orchestra was so under-rehearsed that Beethoven, who was both pianist and conductor for the night, had to stop one piece of music and start over. (Impressive, considering that he was almost deaf.)
The Fifth Symphony quickly got the recognition it deserved. (A review by E.T.A. Hoffmann when Beethoven published the score described its drama with language like “radiant beams,” “gigantic shadows,” and “endless longing” – which just shows how much music meant to people 200 years ago.) And the people who packed into the Vienna theater for Beethoven’s mega-concert were the last crowd to see him take on something so ambitious. He would never conduct an orchestra and play a piano solo on the same night again. That’s worth a little frostbite, no?

Sunday, December 21, 2014

Some people leave massive footprints on the world, enacting bold, sweeping changes in their wake. Arthur Wynne was not one of those people. A British-born newspaper employee and violin enthusiast who emigrated to the United States at age 19, Wynne would be completely forgettable if not for one notable achievement. It was on this date 101 years ago that he published what’s widely acknowledged as the first crossword puzzle in the New York World on December 21, 1913.
Wynne’s feature was called a “Word-Cross” and ran in the World’s “Fun” section. While earlier puzzles had used some variation on the phrase “crossword” to describe themselves, Wynne’s diamond-pattern game was the first to assemble most of the conventions of the modern crossword puzzle. It required readers to provide the answers to clues in an interlocking pattern that read horizontally and vertically. (The numbering scheme was admittedly strange, and needed some refining.) Wynne continued to produce the puzzles, and went on to introduce the use of black squares to separate words into neat rows and columns. It was supposedly a typesetting error that caused one edition of the “Word-Cross” to be printed as the “Cross-Word,” and there you have it.
Wynne’s puzzles were a popular World feature, and similar puzzles spread to other papers. (The Boston Globe was running its own crossword as early as 1917.) Crossword puzzles became the “Angry Birds” or “Candy Crush” of the 1920s, spreading everywhere, including to public places…and irritating a lot of people in their wake. The first issue of the New Yorker mused in 1925 about “the number of solvers in the subway and ‘L’ trains.”
Proving that old people have always been cranky, traditionalists complained that the crossword was a childish, dumbed-down fad that would hopefully fade away. The New York Public Library complained about crossword fans hogging the encyclopedias and dictionaries at the expense of “legitimate readers,” while The New York Times called the craze “a sinful waste,” “not a game at all,” and “a primitive form of mental exercise” in a 1924 editorial. In 1930, a New York Times correspondent bragged about how the paper had never succumbed to the crossword fad, which had apparently faded away.
In 1942, the Times finally printed its first crossword puzzle. Today the New York Times crossword is considered among the most challenging and prestigious in the world…proving that today’s flashy, annoying trend can become just another boring thing old people enjoy if you wait long enough. I look forward to the daily “Candy Crush” feature to be uploaded to my CyberNews neural implant, along with the weather and market reports, in 2060. Hopefully there’ll be some dumb new thing I can gripe about then too.
http://upload.wikimedia.org/wikipedia/commons/3/32/First_crossword.png

Saturday, December 20, 2014

When Frank Capra came across the project that would become “It’s a Wonderful Life,” the RKO studio couldn’t hand it to him fast enough. The story, called “The Greatest Gift,” had been written in 1939 by Philip Van Doren Stern, a Pennsylvania-born author who spent four years trying (and failing) to get it published before including it in his private Christmas cards to family and friends in 1943. When RKO got wind of the story, they thought they could turn it into a Cary Grant movie and picked up the rights in 1944.
Three failed scripts later, they were so eager to unload the project onto Capra’s production company that they sold it for the same $10,000 they had paid to acquire it, and threw in the three scripts for free. Capra brought in a team of writers to “polish” the script, and the resulting movie opened in New York 68 years ago on December 20, 1946.
The story of George Bailey’s peek at a world without him is probably the most-loved holiday film ever made. But like many Hollywood classics, it took a while to get there. When the film entered general release in January 1947, receipts were underwhelming. (The movie’s high production costs, thanks to the elaborate set that created an entire “Bedford Falls” at a California movie ranch, were partly to blame.) Reviews were mixed, though generally positive. And on Oscar night, the film entered with five nods including Best Picture, but walked away with nothing except a technical achievement award for its snow effects. (The technical breakthrough that impressed Hollywood so much was a new method of creating fake chemical snow on movie sets. The old method of crushed corn flakes had made so much noise when actors walked on it that dubbed dialogue had to be added for snow scenes.)
One factor that helped revive the film’s reputation was a simple quirk in copyright enforcement. The rights to “It’s a Wonderful Life” have bounced around over the years, and thanks to a clerical error by the copyright holder in 1974, the film was believed to have entered the public domain for nearly two decades. This resulted in a proliferation of home video releases by different companies, and airings by hundreds of local TV stations. During the 1980s, the story was seemingly everywhere during the holidays. Republic Pictures has since reasserted copyright on the film, relying on a convoluted Supreme Court precedent, and today only NBC is allowed to show “It’s a Wonderful Life” on TV.
Capra himself seemed as thrilled as anyone to see the movie’s popularity pick up. It was his personal favorite of his own movies, and he screened it for his family privately over the holidays. He told the Wall Street Journal in 1984 “It's the damnedest thing I've ever seen. The film has a life of its own now, and I can look at it like I had nothing to do with it. I'm like a parent whose kid grows up to be president. I'm proud... but it's the kid who did the work. I didn't even think of it as a Christmas story when I first ran across it. I just liked the idea.” When he died in 1991 at age 94, the movie he had believed in 45 years earlier had stood the test of time. It still has.
http://d1zlh37f1ep3tj.cloudfront.net/wp/wblob/54592E651337D2/108/55A2/X1Ro9HORh1HnhZuggCYm3w/its-a-wonderful-life_zuzu.jpg

Friday, December 19, 2014

When Charles Dickens published “A Christmas Carol” in London 171 years ago on December 19, 1843, he was greeted with immediate success in almost every measurable way. The book’s first run of 6,000 copies sold out during the run-up to Christmas Eve, and critics hailed the book as a masterpiece. But in the financial arena, where Dickens had hoped the book would help ease the burden on his growing family (his wife was pregnant with the fifth of their ten children), things were different.
Despite being a bestseller, the book wasn’t the financial windfall its writer had hoped for. Some of the fault was Dickens’ own. He was so enamored of his story -- which he hoped would help revive some of Britain’s old Christmas traditions and spread a sense of social concern for the poor -- that he insisted it be printed as lavishly as befitted a gift book, with gold lettering on the spine and cover, gilded edges, and woodcuts. He was so picky that he insisted the book be reprinted when he saw the colors on the first run and deemed them too drab. All of this added to the costs of production, but the book was still sold at a modest price (the equivalent of about $33 a copy today) to keep it accessible…which left little profit once the lavish production costs were covered. Dickens was also plagued by piracy, which he went to court to fight. He won, but the rogue printer pleaded bankruptcy and Dickens was left to pay his own legal fees.
But if the book didn’t bring Dickens a lot of money, it achieved something more lasting. Dickens had originally thought about releasing a pamphlet pleading for Britons to do more for the poor who had been displaced by the Industrial Revolution, but he decided that pouring himself into an earnest and nostalgic Christmas tale might have more impact. Within a year of its publication, at least eight theaters in London were staging productions of “Carol.” Dickens himself was asked to do public readings of the book, which he did…more than 120 of them before he died in 1870.
The book has never gone out of print, and its impact on the Christmas season goes well beyond its memorable characters and phrases. It’s been argued that many of the trappings of Western Christmas celebrations can be attributed at least partially to Dickens’ story, which emerged at a time when Britons were exploring some of their older traditions in an attempt to revive the holiday. It has been adapted into countless media forms, including at least 28 movies. Sternly dramatic retellings, musicals, a silent film from 1901, a modernized Bill Murray fable, Mickey Mouse, and the Muppets…all of them have been brought into the service of “A Christmas Carol.”
As for its impact on public charity and benevolence, one critic said it had inspired more generosity than “all the pulpits and confessionals in Christendom." (High praise, given that Dickens had written the book as a kind of Christian allegory. His description of Jacob Marley as having no bowels was a reference to the “bowels of compassion” mentioned in I John.) Stories abound of people being moved to generosity by Scrooge’s tale of redemption. The writer Thomas Carlyle supposedly staged two Christmas dinners after reading it. In 1867, a businessman in Massachusetts is said to have closed his factory on Christmas Day and sent every employee a turkey after attending a reading the night before. And the Queen of Norway sent gifts to London’s crippled children in the early 20th century marked “with Tiny Tim’s Love.”
Much like his protagonist Scrooge, Dickens seems to have overseen a bounty of gifts more valuable than anything a money ledger could tabulate. It’s rare that a book has exactly its intended effect, but with “Carol,” Dickens not only affected his readers while they turned its pages, but he impacted how they lived their lives. Whatever financial blessings the story brought him, Charles Dickens blessed us (every one) with his enduring carol.
http://upload.wikimedia.org/wikipedia/commons/c/c1/Scrooges_third_visitor-John_Leech,1843.jpg

Thursday, December 18, 2014

AWC is classing up the joint today. Pyotr Tchaikovsky might have had mixed feelings when he read the reviews of his double-bill from critics who visited the Mariinsky Theatre in St. Petersburg 122 years ago on December 18, 1892. Reaction to the final opera by the famous Russian composer, a one-act production called “Iolanta,” was quite favorable. But reviews were more divided about the ballet he had scored that night, a two-act production whose title translated into English as “The Nutcracker.”
Tchaikovsky had been commissioned to compose an opera-ballet production (the equivalent of a double feature in those days) after his success with “The Sleeping Beauty” two years earlier. For the ballet portion, he decided to adapt “The Nutcracker and the Mouse King,” which had been around since 1816. Reaction to his music was favorable, with critics calling the score “beautiful,” “melodious,” and “astonishingly rich.” The dancing, choreographed by Marius Petipa and Lev Ivanov, came in for harsher treatment. Critics complained that the actress portraying the Sugar Plum Fairy was too fat, the battle scene was too confusing, the ballerina didn’t dance until the second act was almost over (and it was almost midnight at the theater), and most especially that children were far too prominent. Tchaikovsky preserved some of the music by extracting a 20-minute suite from the production, but it looked like the ballet was best left forgotten.
Things changed slowly in the 20th century. Alexander Gorsky staged a new version in Russia in 1919, making his own changes (most notably getting rid of the kids). By casting adult dancers as Clara and the Prince, Gorsky introduced a romantic angle into the story. He also began a tradition of adaptation that has followed “The Nutcracker” ever since, with individual choreographers putting their own stamp on the proceedings. The ballet spread to England and the United States, and gained popularity toward the middle of the 20th century.
It might be a bit cliché, but “The Nutcracker” finally hit it big when it made it to New York. George Balanchine choreographed a version for the New York City Ballet that debuted in 1954. It was a smash hit, and has been performed every year since. This was also the version that brought the ballet to TV, with live broadcasts from New York in 1957 and 1958. (It was broadcast again in 2011.) Today “The Nutcracker” is a staple for ballet troupes during the holiday season (like the Colorado Ballet, seen here). It’s been estimated that 40% of the revenues for major American ballet companies come from “The Nutcracker.”
http://catchcarri.com/wp-content/uploads/2011/12/dana-benton-and-viacheslav-buchkovskiy-in-colorado-ballet_s-the-nutcrackerat-the-ellie-caulkins-opera-house-photo-by-terry-shapiro.jpg

Wednesday, December 17, 2014

Aviation milestones are a favorite here at AWC, and today’s soars above them all. The handful of witnesses who gathered at Kitty Hawk on North Carolina’s Outer Banks 111 years ago on December 17, 1903, couldn’t have known they were witnessing the final day of an age that had lasted from the dawn of humanity…an age where men looked at birds with jealousy and longing, and where the sky really was the limit.
The successful flights of Orville and Wilbur Wright’s “Wright Flyer” that day are often shorthanded as the first airplane flight, but aviation authorities are more specific. To be precise, the Wright Brothers’ machine was the first to successfully achieve four conditions in flight: it was heavier than air, self-propelled, and controlled, and its flight was sustained. The brothers had made many successful glider flights before graduating to the gasoline-powered wooden biplane that took to the air on this date. Their last successful glider in 1902 led directly to the design of the fateful flyer the following year.
The brothers had already attempted a flight three days earlier, but Wilbur had pulled up too sharply and crashed within three seconds, damaging the plane. They spent three days repairing it, and on this date it was Orville’s turn at the controls. He flew for 12 seconds, and traveled 120 feet. They took turns from there, and Wilbur redeemed himself by traveling 852 feet in 59 seconds on the day’s last flight. The aviation age had arrived.
It wasn’t a noisy arrival. Only a handful of newspapers covered the story the next day. Many editors either didn’t believe it, or didn’t understand it. The brothers took advantage of their prolonged anonymity, since they had patents to file, work that was much easier when the world didn’t realize what they were up to. Many people assumed they hadn’t flown at all. They were called liars until their first public flight in France in 1908.
After that, they were genuine celebrities. They were two brothers from Dayton, Ohio, who never graduated high school, never went to college, and started their careers working together at a printing press and a bicycle shop, and they might have been the 20th century’s most significant figures. Wilbur, who was four years older, would only glimpse the beginnings of the change they unleashed. He died of typhoid fever in 1912 at age 45. Orville got to see more, living until 1948. When he was born in 1871, the horse and buggy was still in common use. By the time he died at age 76, the jet engine had arrived and airplanes had broken the sound barrier.
http://www.wpclipart.com/world_history/the_hand_of_man/wright_brothers_first_flight_1903.png

Tuesday, December 16, 2014

Probably the highest praise you could give Arthur C. Clarke (born 97 years ago on December 16, 1917) as a science fiction writer is that so much of what he wrote and said didn’t stay fiction for long. Known as the “Prophet of the Space Age,” he wrote about geostationary satellites in the 1940s, and predicted that they would be useful telecommunications relays. When asked in 1974 how the computer would change the life of a man in the future, he basically explained how online banking and shopping would work decades in advance, stating "[H]e will have, in his own house … a console through which he can talk, through his local computer and get all the information he needs, for his everyday life, like his bank statements, his theatre reservations ... and he will take it as much for granted as we take the telephone."
In the 20th century, Clarke became one of what was known for a time as the “big three” of science fiction writing, along with Robert Heinlein and Isaac Asimov. He wrote “2001: A Space Odyssey” as a novel while developing the screenplay concurrently with Stanley Kubrick. Book and movie were both released in 1968, and Clarke went on to write three sequels to the original novel.
While he achieved fame as a writer, Clarke also experienced life beyond a typewriter. He was a lifelong champion of space travel who joined the British Interplanetary Society and served as its chair twice. He was a deep-sea scuba diver who discovered the ruins of an ancient Hindu temple off the coast of Sri Lanka, and a TV host who explored strange phenomena in shows like “Arthur C. Clarke’s Mysterious World” and “Arthur C. Clarke’s Mysterious Universe.”
In short, he was someone who drank deeply from life’s well and used his experiences to imagine what might still come. Any thinker who speculates on what might be serves his readers well by becoming richly acquainted with what is, and Arthur C. Clarke did exactly that. He won many science fiction writing awards, took at least one science and engineering honor for his work on satellites, and was knighted in his 80s. But much cooler are all the scientific phenomena named after him, including an asteroid, a species of dinosaur, and the geostationary satellite orbit (a.k.a. the “Clarke Orbit”). He died aged 90 in 2008, leaving behind a world that looked eerily like one he had only seen in his imagination years before.
http://page247.files.wordpress.com/2010/07/avclub_review314-article.jpg

Monday, December 15, 2014

Earning equal treatment is pretty difficult for a group of people when their core identity is officially classified as a mental illness, which was exactly the situation for gay Americans in 1973. It was on this date 41 years ago that the American Psychiatric Association made a giant stride toward removing that stigma with a unanimous vote by its board of directors to de-list homosexuality as a mental illness on December 15, 1973. The vote resulted in homosexuality being removed as a diagnostic category from the second edition of the Diagnostic and Statistical Manual of Mental Disorders (or DSM-II).
The science behind viewing all gay people as mentally ill had always been weak. The classification rested more on societal norms inherited from the Victorian era than on evidence: if something was defined as a sin and a crime, then someone would obviously have to be mentally ill to do it! Long before the APA’s historic vote, there were cracks in this consensus among mental health practitioners. Freud said in 1935 that “Homosexuality is assuredly no advantage, but it is nothing to be ashamed of, no vice, no degradation, [and] it cannot be classified as an illness.”
Work by Alfred Kinsey and Evelyn Hooker in the 1940s and 1950s also poked holes in the traditional line of thought, holes that would eventually prove fatal to it. Kinsey found that same-sex attraction was much more common than had been previously assumed, while Hooker administered a series of psychological exams to groups of straight and gay men and found no difference between their psychological profiles.
And then there was the anonymous doctor who appeared at an APA convention in a heavy disguise in 1972 to give his testimony about his life as a gay psychiatrist. John Fryer came out later, but at the time he took advantage of his relationship with a drama student to appear before the APA in a mask and oversized outfit. “I had been thrown out of a residency because I was gay; I had lost a job because I was gay,” he later recalled. The anonymous Fryer made a simple argument: As a gay man in a field where his colleagues would have regarded his true identity as a sickness, he was forced to be even more mentally healthy than most psychiatrists. He also provided written statements from other gay psychiatrists.
There was no saving the DSM category of homosexuality by then, and it was struck down by a 13-0 vote on this date in 1973. Due to the political climate at the time, many psychiatrists still wanted the option to diagnose their gay patients with something. As a compromise, “sexual orientation disturbance” was inserted into the DSM, followed later by “ego-dystonic homosexuality,” which noted that many gay people were distressed about their orientation and wished to change it. Those categories have since been removed, and the only artifact of the old diagnosis in the DSM is “sexual disorder not otherwise specified,” which notes that some people can experience mental anguish about their sexual orientation.
Gay-rights activist Barbara Gittings (seen on the left staffing a booth at the APA convention in 1972) explained just how consequential the APA vote was: “The sickness label was an albatross around the neck of our early gay rights groups — it infected all our work on other issues. Anything we said on our behalf could be dismissed as ‘That’s just your sickness talking.’”
http://www.frontiersmedia.com/Pics/Blog%20Images%203/apa.jpg

Sunday, December 14, 2014

Roald Amundsen had a thing for extreme (and extremely cold) places. The Norwegian explorer was part of the first expedition to winter in Antarctica in 1898, and led the first expedition through Canada’s perilous Northwest Passage in the Arctic Ocean from 1903-1906. And on this date 103 years ago, Amundsen led a five-man party to the geographic South Pole, where they became the first explorers to stand at the bottom of the world on December 14, 1911.
Amundsen had originally aimed for the other end of the globe, planning to sail to the North Pole, an expedition he had raised significant funds for. But in 1909, Amundsen learned that American explorers Frederick Cook and Robert Peary had each claimed to reach the North Pole. He changed his plans and aimed instead for an Antarctic expedition…although he was the only person who knew it. When his crew left Norway in the summer of 1910, they all thought they were going to the North Pole.
Amundsen was also racing against the British explorer Robert Falcon Scott, although Scott didn’t know it yet. Scott was leading his own Antarctic expedition to plant the British flag at the South Pole, with plans to establish a base at McMurdo Sound. Amundsen instead made for the Bay of Whales along what’s now called the Ross Ice Shelf, placing him 60 nautical miles closer to the pole than Scott’s planned base.
Amundsen’s crew sailed south for months before sighting the icebergs of Antarctica on New Year’s Day of 1911. Outfitted in sealskin and reindeer skin, his men camped for months, enduring temperatures more than 70 degrees below zero and four months of darkness between April and August. When the sun finally rose, Amundsen’s men and dogs made a false start toward the pole, but were turned back by extreme conditions. Some dogs froze to death. Amundsen realized it was too early in the season to set out for the pole.
The fateful expedition finally set out on October 19, springtime in the Southern Hemisphere: five men, four sledges, and 52 dogs. (The dogs were intended as a source of both labor and protein in the punishing conditions.) They faced crevasses spanned by snow bridges and the Transantarctic Mountains, and scaled more than 10,000 feet up the Axel Heiberg Glacier (named for one of Amundsen’s financial backers). Finally, they reached the South Pole on this date. Looking around for signs of Scott’s rival team, they found none and planted the Norwegian flag at the bottom of the world. Amundsen, in a later recollection: "Never has a man achieved a goal so diametrically opposed to his wishes. The area around the North Pole—devil take it—had fascinated me since childhood, and now here I was at the South Pole. Could anything be more crazy?"
Scott’s team made it five weeks later, and found the evidence that they had been beaten by Amundsen. Scott’s men all died on the way back from the pole, leaving Amundsen and his crew alone to relate the stories of what they found at the South Pole. And as for the North Pole, Amundsen made it there too, flying over in a dirigible in 1926. At the time, Amundsen thought he was the fourth man to make it to the top of the world, since American Richard Byrd had claimed to have flown over the North Pole three days earlier. In the years since, all three Americans who claimed to have preceded Amundsen to the North Pole have had their stories questioned…making it very possible that Roald Amundsen was the first man to reach the top and bottom of the world.
http://static.ddmcdn.com/gif/blogs/6a00d8341bf67c53ef01543807861f970c-800wi.jpg

Saturday, December 13, 2014

AWC has observed an unwritten rule so far: Don’t post birthdays for living people. The theory is that it’s harder to say exactly what a life meant, or place it in historical context, if the person is still around to blow out the candles on their birthday cake. But rules are made to be broken, and this series is marking Dick Van Dyke’s 89th birthday today; he was born in Missouri on December 13, 1925 and is still active in a career that has spanned over half a century.
His film and TV career highlights are well known. His best-known comedic roles are comedy writer Rob Petrie on “The Dick Van Dyke Show” from 1961-1966 and Bert the chimney sweep in 1964’s “Mary Poppins.” He also took a successful dramatic turn decades later in “Diagnosis: Murder,” which ran from 1993-2001. But these just scratch the surface of a career that has seen him pop up in dozens of guest-starring roles on TV shows ranging from “The Carol Burnett Show” to “Matlock” to “The Golden Girls” to “Scrubs.” His big-screen resume is shorter, but includes at least 22 film credits since 1963’s “Bye Bye Birdie.”
Combine that with his stage career, which included two Broadway productions before he hit it big on TV and a 1980 revival of “The Music Man,” and Dick Van Dyke has been just about everywhere people gather to be entertained since the middle of the last century -- as attested by his Tony, Grammy, and five Emmy awards. (The famed “four statue” sweep of the major entertainment industry awards has eluded him, with no Oscar nominations during his career.)
As he approaches 90 years old, Dick Van Dyke hasn’t slowed down. This year alone, he (or his voice) has popped up on the Hallmark Channel’s “Signed, Sealed, Delivered” and Disney’s “Mickey Mouse Clubhouse” on TV, while movie audiences saw him play himself in “Alexander and the Terrible, Horrible, No Good, Very Bad Day.” He’ll show up again in the next “Night at the Museum” movie, which opens next week.
With an output like that, AWC doesn’t need to wait to render a verdict: Dick Van Dyke has lived a life worth celebrating, and hopefully has a number of birthdays left to do so himself.
Photo montage assembled by AWC

Friday, December 12, 2014

An Irish priest named Edward J. Flanagan opened a house for wayward and neglected boys in Omaha, Nebraska, 97 years ago on December 12, 1917, and was immediately joined by six troubled youths. It was the beginning of the Boys Town shelter, which helped pioneer new methods of caring for troubled juveniles in the 20th century.
Father Flanagan was 31 years old at the time. He left Ireland for the United States as a teenager, and was ordained in 1912. He was sent to Nebraska, where he asked to start a shelter for homeless boys in Omaha. The bishop overseeing the Omaha Diocese wasn’t sure about the idea, but he supported the young priest and agreed to let him try.
Father Flanagan’s downtown facility was soon overflowing with boys referred by courts and concerned citizens, or who simply walked in on their own. Father Flanagan was forced to search for a larger facility as enrollment soared to 100 within months.
The approach at Boys Town was built around preparing boys for life in the wider world. Boys were educated at all grade levels and learned trades in a vocational center – a relatively new approach to raising orphans at the time. In 1943, Boys Town changed its logo to a larger boy carrying a smaller boy on his back, with the caption "He ain't heavy, mister - he's my brother." The expression became popular enough to inspire a song by the British rock group The Hollies.
Boys Town was eventually relocated to its present location, a farm 10 miles outside of town, and changed its name to “Boys and Girls Town” to reflect its co-ed mission. It currently has 12 regional missions around the country.
http://immigrationinamerica.org/uploads/posts/2011-11/1322502893_father-flanagan.png

Thursday, December 11, 2014

People do extreme things for love. But very few ever have the chance to go as far as Edward VIII, who famously walked away from the British throne and into the arms of Wallis Simpson 78 years ago on December 11, 1936.
Edward had been king for less than a year when his relationship with Simpson threw the British monarchy into chaos. Simpson was unacceptable to the British aristocracy as a possible queen. She was an American, which made her socially inferior in the eyes of England’s upper crust. (English common folk were more accepting.) She had also been divorced once and was working on a second divorce while she was seeing Edward. That was a moral scandal in those days, heightened by the fact that the King of England was also the nominal head of the Church of England, which did not recognize marriage to a divorced person whose former spouse was still living.
Edward floated the idea of a so-called “morganatic” marriage, in which Simpson would not assume the title of queen upon marrying the king. That idea, too, was rejected. And so, forced to choose between the crown and the woman he loved, Edward signed a document abdicating the throne on December 10, 1936. Parliament approved it the next day, and Edward’s abdication became official. He gave a radio address to the British public and then left for Austria; he and Simpson married in France the following June, once her second divorce was final. Edward was the only British king to give up the throne voluntarily.
To be exact, it wasn’t just one crown Edward abandoned for love. The British monarch was also sovereign of the self-governing Dominions of Canada, Australia, New Zealand, South Africa, and the Irish Free State (and Emperor of India besides). Each Dominion held the crown in its own right, and each had to give its own assent to the king’s abdication. Edward would live out his days as the demoted Duke of Windsor, while his brother, George VI, assumed the throne.
Some Britons felt it was best for everyone involved that Edward left. His complete disdain for royal conventions had offended many, and he appeared to harbor pro-Nazi sentiments during the lead-up to World War II, leading some to wonder what would have happened if a Hitler sympathizer had stayed on the throne. (During the war, the former king was packed off to a royal job as governor of the Bahamas.) Simpson’s two failed marriages raised questions of their own about a third. But whatever their personal shortcomings, Edward VIII and Wallis Simpson were clearly devoted to each other. They remained together for more than 35 years, until Edward’s death in 1972. They are buried together in the Royal Burial Ground at Frogmore Estate in Windsor.
http://media-3.web.britannica.com/eb-media/60/160-004-1A7FF9D9.jpg

Wednesday, December 10, 2014

Alfred Nobel was given one of the rarest opportunities: the chance to read his own obituary. It was a mistake, of course. In 1888, Nobel’s older brother Ludvig died, and a French newspaper mistakenly reported that the deceased was Alfred himself, the Swedish chemist and prolific inventor best known for creating dynamite. Its chosen headline: “The Merchant of Death is Dead.”
When Nobel read the headline, he was deeply unsettled about how he would be remembered when his time was truly up. It bothered him so much that he changed his will, providing for a series of awards, funded by his considerable fortune, for those who conferred the “greatest benefit on mankind” in a handful of fields. On this date in 1896, Death came calling for Alfred Nobel in earnest. And 113 years ago today, on December 10, 1901 -- the fifth anniversary of his death -- the first Nobel prizes were awarded.
From the beginning, the Nobel prize has been presented in five categories: physics, chemistry, physiology or medicine, literature, and peace. (A Nobel Memorial Prize in Economic Sciences was added in 1968.) The awards are traditionally presented on this date, the anniversary of Nobel’s death, and several notable winners have accepted their prizes on December 10, including Theodore Roosevelt (the first American winner) in 1906; Selma Lagerlöf (the first female writer to win the literature award) in 1909; the International Red Cross in 1917 (the first peace prize recipient in three years as World War I raged); Woodrow Wilson in 1920; Ralph Bunche (the first black recipient) in 1950; and Menachem Begin and Anwar Sadat in 1978.
As with any high-profile award, the Nobel committees (there are four presenting organizations across Sweden and Norway) have had their share of misses. The 1926 medicine award went to a Danish physician who claimed to have discovered a parasite that caused cancer. (The scientific prizes have adopted tighter standards since.) The most high-profile arguments, though, have probably been around the peace prize. Henry Kissinger’s award while the Vietnam War was still underway caused two committee members to resign. Barack Obama’s peace prize while he was still trying to dent the cushion in his White House chair five years ago was a head-scratcher, including to Obama himself.
But the biggest Nobel whiff is probably who didn’t get the award. Mahatma Gandhi was nominated at least five times, but never received the peace prize. The Nobel Committee secretary said in 2006 that it was the biggest omission in the award’s history, noting “Gandhi could do without the Nobel Peace prize. Whether [the] Nobel committee can do without Gandhi is the question." When Gandhi was assassinated in 1948, the Nobel Committee didn’t award a peace prize, claiming that there was “no suitable living candidate.”
http://www.skeptical-science.com/wp-content/uploads/2014/10/Nobel-Prize.jpeg