Sunday, November 30, 2014

Two titans of history share a birthday on this date. Mark Twain was born Samuel L. Clemens in rural Missouri 179 years ago on November 30, 1835. Winston Churchill entered the world in the English town of Woodstock 39 years later on November 30, 1874. You can judge for yourself whose life was more influential.
Twain’s writings and lectures chronicled American life in the late 19th century, while slaughtering its sacred cows with a wit that was at once gentle, welcoming, and deeply cutting. His “Adventures of Huckleberry Finn” is seen by many as the “Great American Novel,” and while its use of unvarnished racist language common to its time makes it a hard read today, it was a brilliant send-up of the hypocrisy of a country that mixed religion and slavery into one unsavory stew. (Huck Finn’s declaration to himself of “All right then, I’ll go to hell” upon deciding to ignore the warnings of every preacher he had encountered by helping the slave Jim run away is one of the most arresting passages in literature.)
Across the Atlantic, and a few decades later, Churchill overcame early failures in British military and political life to rise to power as the steady and dogged British prime minister during World War II. At the height of Europe’s confrontation with the Nazi menace, Churchill’s calmly confrontational leadership style was seen as the perfect tonic for a country whose politicians had unsuccessfully sought to placate Hitler. In distinctively English style, however, the voters threw him out of office after the war…not necessarily out of dissatisfaction, but because maintaining the peace was seen as a different kind of job, requiring a different type of man, than winning the war had been.
One thing both men shared was a love of the written word, and an appreciation for serious ideas. (Churchill was a noted historian in addition to his political achievements.) While separated by almost four decades, they were contemporaries, and their periods of fame overlapped around the turn of the 20th century, when Twain was an aging celebrity and Churchill was a rising star in British public affairs. They met on at least one occasion in 1900, when a 65-year-old Twain introduced a 26-year-old Churchill at a speech in the ballroom of the Waldorf-Astoria in New York. In his introduction, Twain gently poked Churchill for his support of England’s Boer War in South Africa, at the same time America was tangled up in war in the Philippines. Twain said the two countries were “kin in blood, kin in religion, kin in representative government, kin in ideals, kin in just and lofty purposes; and now we are kin in sin.”
Another thing Churchill and Twain shared: They both are credited with dishing out some truly epic insults. To give just one example from each, Twain is credited with this beauty: “I did not attend his funeral, but I sent a nice letter saying I approved of it.” Churchill is said to have responded to a woman who accused him of being “disgustingly drunk” by saying “My dear, you are ugly, but tomorrow I shall be sober and you will still be ugly.”
http://infinityinsurance.com.au/wp-content/uploads/2013/06/WinstonChurchill_MarkTwain.jpg

Saturday, November 29, 2014

Atari co-founder Nolan Bushnell has said that the secret to making an addictive video game is creating something that’s easy to learn, but impossible to master. That might explain how a concept as simple as “Pong” (announced by Atari 42 years ago on November 29, 1972) managed to become such a hit that it uncorked a lust for gaming in the public that’s never been quenched.
The idea wasn’t even supposed to be a real game, at least according to one story that says Bushnell assigned it as a warm-up exercise to his designer Al Alcorn to help him learn the basics of programming. When the final product came back, Bushnell and Atari were so surprised at the quality they decided to release it. (For his part, Alcorn’s story is a little different. He says Bushnell wanted him to make a similar game to the Magnavox Odyssey’s tennis game, a detail which becomes relevant shortly.)
Whatever its origin, Atari decided to put a prototype of the game in a local dive called Andy Capp’s Tavern in Sunnyvale, California. (Sunnyvale is part of Silicon Valley today, but even four decades ago it was on the cutting edge of technology.) One key to the game’s replay value: The player’s tennis paddle didn’t go all the way to the top of the screen. This was a bug Alcorn had intended to fix, but decided to leave in place as insurance against two equally-skilled gamers playing forever. (Easy to learn, impossible to master.)
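For the curious, here’s a tiny sketch of how that gap works in code (a Python illustration added here, not Atari’s actual logic; the screen dimensions are invented): the paddle’s position is clamped so it can never quite cover the top of the screen, leaving a sliver the ball can always slip through.

# Illustrative only: clamp a Pong-style paddle so it stops short of the top edge.
SCREEN_TOP = 0
SCREEN_BOTTOM = 200           # made-up screen height, in arbitrary units
PADDLE_HEIGHT = 30
GAP = 10                      # the "bug": the paddle can never cover this top sliver

def move_paddle(y, dy):
    """Move the paddle by dy, keeping it on screen but never closing the top gap."""
    y += dy
    highest = SCREEN_TOP + GAP                # stops short of the very top
    lowest = SCREEN_BOTTOM - PADDLE_HEIGHT    # can't slide off the bottom either
    return max(highest, min(y, lowest))

print(move_paddle(15, -50))   # tries to reach the top, but stops at 10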
The game was a hit at Andy Capp’s, although there was a technical problem: The coin mechanism would fill up completely, but patrons would still try to jam money in the slot, causing the game to malfunction. Atari started producing more arcade cabinets, and by Christmas 1975 released a home version of “Pong,” which was sold exclusively at Sears. But there was a problem: Magnavox had picked up on the similarities between “Pong” and its tennis game, and took Atari to court. (In truth, the Odyssey game was even less visually interesting than Atari’s. It was nothing more than three dots of light, with a plastic overlay on the TV screen giving the illusion of looking at a tennis court.) But Magnavox had a signed guest book from a demo that proved Bushnell had seen the Odyssey game before developing “Pong,” and a financially shaky Atari decided to settle rather than pile up the legal bills needed to fight.
Atari paid Magnavox $700,000 to become a “licensee” of the tennis game, and agreed to give Magnavox the rights to any games they produced in the next year. Atari then proceeded to stop making games, for exactly one year. (If Atari had ripped off “Pong” from Magnavox, they were repaid in spades as the market now became flooded with “Pong” clones. Atari hadn’t filed the proper patents before rushing the game to market, so they had no recourse.) The “Pong” craze of the ‘70s eventually simmered down, but it inspired Atari to create a home console that could actually play different games. The Atari 2600 hit markets in 1977, and the rest is history.
http://angryblog.de/uploads/pong_nah.jpg

Friday, November 28, 2014

Before Daytona or Indianapolis, American car racing got its start in Chicago 119 years ago on November 28, 1895. The Chicago Times-Herald sponsored the race, and heavily hyped it as a way to boost the nascent automobile industry (and newspaper sales). One problem: The “horseless carriage” was only two years old at the time, and there was no set agreement on what to call these proto-cars. For the purposes of its race, the Times-Herald settled on the “Moto-cycle” and mapped out a 54-mile roundtrip course between Chicago and Evanston, Illinois. (A longer planned route was scuttled thanks to a blizzard.)
After being delayed multiple times (Chicago’s Finest had to be instructed by the city to stop detaining drivers and allow them to drive their cars on the roads without having them pulled by horses), the race finally began on this date, which was Thanksgiving Day. More than 80 cars had entered. Six showed up. Three of the starters were machines built by German inventor Karl Benz, while the American brothers Charles and Frank Duryea entered the four-wheeled “Motor Wagon” they had created (pictured). The other two entrants were electric cars, whose batteries gave out in the cold air. One of the Benz cars struck a streetcar, or possibly the horse pulling it, and had to leave the race.
In the end, it was Frank Duryea crossing the finish line first. He made the trip in seven hours and 53 minutes, at a blistering pace of 7 mph. The only other finisher, the surviving Benz car, crossed the finish line an hour and a half later. This all sounds silly today, but the race was riveting material for newspaper audiences in 1895. The publicity around this race is said to have moved up American automobile production by five years. Charles and Frank Duryea got so much attention for their victorious run that they sold 13 of their Motor Wagons in the following year…enough to make them the leading American car manufacturer of 1896.
http://media-1.web.britannica.com/eb-media/60/18360-004-7AAFCF8B.jpg

Thursday, November 27, 2014

An AWC to give thanks for: Macy’s held its first Thanksgiving Day parade on this day 90 years ago. A much more modest affair than today’s event, the first parade was staffed by Macy’s employees, many of them immigrants who had grown up with similar festivals in Europe. Dressed as clowns and cowboys, or waving from floats featuring nursery rhyme characters, they marched six miles from Harlem to Macy’s massive Herald Square retail outlet along 34th Street as a quarter-million people packed the streets to watch on November 27, 1924.
Though it was held on Thanksgiving Day, Macy’s actually called the first procession a Christmas parade. It was designed to officially welcome Santa Claus to New York (and more importantly, to welcome holiday shoppers to Macy’s, the “World’s Largest Store” with a million square feet of retail space between Broadway and Seventh Avenue). At the end of the parade, Santa climbed onto a golden throne above the Macy’s marquee and unveiled an elaborate window display with Mother Goose characters carrying out their own parade as kids jostled to get a closer look. This mashup between two holidays is still part of the parade, so anyone complaining about how Christmas encroaches on Thanksgiving’s turf can thank Macy’s!
The first Macy’s parade didn’t have any character balloons. Instead, live animals on loan from the Central Park Zoo walked the route. This wasn’t a great idea. The bears, monkeys, and elephants became tired and cranky, and their growls scared kids along the parade route. The animals were removed from future parades, and in 1927, a helium-filled Felix the Cat was introduced to make up for their absence. The balloons kept coming in later parades: Mickey Mouse, the Marx Brothers, Superman, Popeye, and on and on…right up to the Spongebob, Shrek, and Pokemon inflatables that have joined the parade in the last decade.
The parade expanded in popularity throughout the 1930s, gaining a live audience of over a million. After going dark from 1942-1944 to conserve helium and rubber for the war effort, the parade returned in 1945. In 1947, it became famous on a much bigger scale thanks to its role in “Miracle on 34th Street.” In 1948, it made the jump from local TV stations and was broadcast on the brand-new medium of network television for the first time. Broadcasts of the parade typically pull in about 44 million people a year today.
http://way2independence.com/wp-content/uploads/2014/10/Macys-Thanksgiving-Day-Parade.jpg

Wednesday, November 26, 2014

When Charles M. Schulz (born in Minneapolis 92 years ago on November 26, 1922) tried to submit some of his drawings for the yearbook at his high school in St. Paul, he was turned down. Many years later, the school atoned for this mistake by erecting a 5-foot-tall Snoopy statue in the main office.
Schulz seemed to draw widely on his own life in writing and illustrating the life of his luckless hero Charlie Brown. Charlie Brown’s unrequited love for the Little Red-Haired Girl was based on an accountant at the art school where Schulz worked; he was smitten with her and proposed, unsuccessfully. His family dog as a child resembled Snoopy. Even Charlie Brown’s absurd failures seemed to find precedent in his life. Schulz saw combat toward the end of World War II, but claimed that on the one occasion he had a chance to fire his machine gun, he forgot to load it. (What could have been a fatal mistake turned out OK, when the German he was targeting compounded the absurdity of the story by surrendering anyway.)
In 1950, Schulz landed a syndication deal for a feature he had conceived as a single panel called “Li’l Folks.” The syndicate preferred to run it as a four-panel strip and called it “Peanuts.” Schulz wrote and drew the strip for almost 50 years. It made him a rich man, but he only took one vacation during the strip’s run, a five-week break to celebrate his 75th birthday.
Declining health forced Schulz to retire from “Peanuts” in December 1999. He died just two months later, of complications from colon cancer, at age 77. The final original “Peanuts” strip ran the next day. Schulz had predicted the strip would outlive him, due to the long turnaround time between drawing and publication. In a literal sense, he was right by 24 hours. But “Peanuts” didn’t end with Schulz, although he asked not to be replaced on the strip, to preserve its authenticity. The syndicate has respected his wishes, and since his death, “Peanuts” has been in “reruns,” drawing on the 50 years of material Schulz left behind.
The popularity of “Peanuts” can be analyzed in many ways, but Bill Watterson (of “Calvin and Hobbes” fame), who claimed Schulz as an inspiration, praised it for its “unflinching emotional honesty” and “serious treatment of children.” Charlie Brown never kicked the football, however much fans asked for it. Schulz took life’s constant failures and made them enjoyable…an object lesson in the art of “good grief.”
http://img3.wikia.nocookie.net/__cb20110825213701/peanuts/images/7/70/CharlesSchulz.jpg

Tuesday, November 25, 2014

When Agatha Christie’s play “The Mousetrap” was on the verge of beginning its theatrical run in London’s West End 62 years ago on November 25, 1952, the novelist and playwright, in true English fashion, wasn’t very optimistic. She claimed to have told the producer that the engagement would last eight months.
Over six decades later, Agatha Christie is long dead (she died of natural causes at age 85 in 1976), while “The Mousetrap” has passed 25,000 performances on the London stage…the longest running play in history, and still going. Keeping a show continuously running for over 60 years requires some dexterous planning, and the show has weathered plenty of changes on the fly: Over 300 actors and actresses have filled the show's eight roles, the set has been updated at least twice, and most notably the entire production switched theaters over the course of a weekend in 1974. The cast played their last show at the Ambassadors Theatre (its original venue) on March 23, a Saturday, then moved the whole thing next door to the larger St. Martin’s Theatre in time for the next curtain on Monday…a feat that allowed the show to maintain its uninterrupted “first-run” status.
The murder mystery (based on Christie’s radio play “Three Blind Mice,” which itself was based on the actual death of a Welsh boy in 1945) has a twist ending that itself carries a twist: The entire audience is sworn to secrecy before leaving the theater. (You won’t find the secret here, though an internet search will reveal all for the curious.) Christie was serious about preserving the play’s secret: She insisted that the story “Three Blind Mice” never be published as long as “The Mousetrap” was still playing in London. Six decades later, the story has still never been published in the U.K.
https://www.londontheatredirect.com/img/news/TheMousetraptobeperformedinMandarinasitcelebrates60thanniversaryTheMousetrap60.jpg

Monday, November 24, 2014

When Charles Darwin published “On the Origin of Species” 155 years ago on November 24, 1859, he unwittingly set in motion a neat case study of the very process he was arguing for. The idea that forms of life developed over time wasn’t new, and Darwin drew on work going back to the ancient Greeks in presenting his theory of evolution by natural selection, which he had honed for nearly a quarter century since observing the variability of finches in the Galapagos Islands in 1835.
Darwin’s problem was that, while scientists were ready to accept much of what he wrote about the evolution of life forms, they didn’t find his chosen mechanism of natural selection convincing. By the 1870s, a range of competing explanations for the development of species was commonly accepted, including the idea that major evolutionary “leaps” happened all at once instead of gradually over time.
It would take longer than Darwin’s lifespan for his theory to be validated. Advances in gene theory, the fossil record, and the dating of the Earth’s age all caused scientists to take a fresh look at Darwin’s idea that small changes served to make some creatures more suitable to their habitats, compounding over unfathomable time to result in major differences and give us what we call different species. As the theory best suited to the evidence of the natural world, evolution by natural selection had changed over time and outlasted its competitors...an interesting feat of selection in its own right. (In a neat coincidence, the hominid skeleton named “Lucy,” which added significantly to the fossil record connecting humans with apes, was discovered in Ethiopia on the 115th anniversary of Darwin’s publication on November 24, 1974.)
Today Darwin’s theory is universally accepted among biologists. Its only real resistance is found among those who embrace the Genesis creation account, but that brand of resistance is a relatively recent development. In Darwin’s own time, he met little opposition based on a literal reading of Genesis, an approach that had also been uncommon in early Christianity. Instead, religious objection tended to center on ideas derived from natural theology. The Vatican called a kind of truce with evolution in 1950, declaring that it is not inconsistent with Catholic beliefs. Religious thinkers have accepted or fought evolution in a variety of ways, based on wide-ranging interpretations of theology and nature. In a way, this range of ideas is itself something Darwin might have recognized and appreciated, while saying “May the best survive.”
http://i00.i.aliimg.com/photo/v0/111680192/On_the_Origin_of_Species_Book_Charles.jpg

Sunday, November 23, 2014

Today seen as a founding legend of the blues, Robert Johnson died young and unnoticed in his time. We probably wouldn’t even know his name if it hadn’t been for a small number of recordings, the first of which were cut on this date in a San Antonio hotel room 78 years ago on November 23, 1936.
The details of Johnson’s life are mysterious, but what we know isn’t very happy. He was a widower twice over (both wives died in childbirth) who took on the life of a rambling musician, playing for tips outside of barber shops and in juke joints as he wandered the Mississippi Delta and beyond. To people who knew him, he seemed to have developed his skill with the guitar overnight, giving rise to the legend that he had met the devil at the crossroads one night and sold his soul for his talent. (We can have fun with this legend today, but it was taken very seriously by some people in Johnson’s time. Surviving relatives of his first wife saw her death as divine punishment for Robert’s decision to play secular music.)
Robert Johnson’s entire recorded catalog consists of 29 songs he cut in 1936 and 1937. His biggest lifetime hit sold 5,000 copies. He died at age 27 in 1938, and we don’t even know for sure how that happened. One story says he drank from a bottle of liquor poisoned by the husband of a woman he had flirted with, and another report says it was syphilis that did him in. Even his burial site is a mystery, with markers at three different church cemeteries near Greenwood, Mississippi, claiming to mark his resting place.
His recordings sat forgotten for over 20 years before a 1961 re-issue called “King of the Delta Blues Singers” introduced him to a new generation. His technique has made him a legend among legends. The first time Keith Richards heard him play, he asked “Who is the other guy playing with him?” mistakenly believing he was listening to two guitars. Eric Clapton has recorded many of Johnson’s songs, and might have done more to reintroduce him to modern audiences than anyone else. (Clapton’s “Crossroads” is a cover of Johnson’s “Cross Road Blues,” which has become tied up in the legend of Johnson’s supposed crossroads meeting with the devil.) Clapton has called Johnson’s distinctive bluesy whine “the most powerful cry that I think you can find in the human voice” and called him “the most important blues singer that ever lived.”
https://www.youtube.com/watch?v=Yd60nI4sa9A

Saturday, November 22, 2014

The Humane Society of the United States was founded 60 years ago as the National Humane Society on November 22, 1954. The charity’s activities are wide-ranging. Over the years, the Humane Society has advocated for reforms in animal slaughter, seal clubbing, the fur trade, puppy mills, dogfighting, and other forms of animal cruelty.
The Humane Society doesn’t call for the end of all practices that cause animal suffering, but it has lobbied for increased regulation in the pursuit of more humane practices in slaughterhouses and research labs. It has also renounced the use of violence or threats against people or property in the pursuit of its goals. It donates 1% of its budget to local animal shelters. Like many nonprofits with large budgets (the Humane Society raised over $125 million in 2012), the group has been criticized for how it allocates its sizable finances. Still, a 2011 philanthropic study named it the top high-impact animal welfare organization for potential donors.
http://upload.wikimedia.org/wikipedia/en/thumb/5/5f/HSUS_logo.svg/1280px-HSUS_logo.svg.png

Friday, November 21, 2014

Thomas Edison and Albert Einstein crammed so much awesome into their days that they could go to lunch with their friends and be like “Oh hey, I just changed everything you thought you knew about everything today. I’ll have the turkey on whole wheat.” That was more or less what they both did on this date, albeit 28 years apart.
It was 137 years ago on November 21, 1877, that Edison announced he had invented the phonograph. It was apparently as much a surprise to him as anyone else. Devices that recorded sound had already been invented, but Edison had been working on a method of capturing sound through grooves on a cylinder that could also play it back. (How earlier devices could claim to have recorded “sound” in any meaningful sense when they couldn’t reproduce it is a problem for philosophers, I guess.) Edison’s work apparently came along much more quickly than he had expected. After a worker produced the first phonograph from Edison’s sketches, Edison prepared to test it.
“I didn’t have much faith that it would work, expecting that I might possibly hear a word or so that would give hope of a future for the idea,” he said later. But when he shouted “Mary had a little lamb” into the device’s recorder and played it back, it reproduced his words exactly. “I was never so taken aback in my life. … I was always afraid of things that worked the first time,” Edison said. And just like that, recording and playing back sound was a thing.
Nearly three decades later, Einstein created what was later called the Annus Mirabilis, or “miracle year,” for physics in 1905 by contributing four extraordinary papers to the “Annalen der Physik” (“Annals of Physics”) journal. The journal had already published three of the papers by November, covering topics like the photoelectric effect and special relativity, but on this date 109 years ago they published the paper containing the idea that would be most associated with Einstein’s name.
It was called "Does the Inertia of a Body Depend Upon Its Energy Content?" and it tackled ideas about the relationship between energy and mass that physicists had been circling for years. In it, Einstein built upon previous speculation and argued that the energy and mass of a body stand in a constant relationship: the energy equals the mass multiplied by the speed of light squared. The 1905 paper expressed the idea in more cumbersome notation; the now-famous shorthand E = mc^2 came later, and after World War II Einstein wrote a popular essay using it to explain the idea to the average reader. The equation helped explain how nuclear reactions could give off massive amounts of energy from such a relatively small loss of mass. The equation alone doesn’t tell you how to build a reactor or a bomb, but it gets at the idea broadly enough that TIME put Einstein, the equation, and a mushroom cloud on its cover to help people understand the atomic bomb in 1946.
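To get a feel for the scale involved, here’s a quick back-of-the-envelope calculation (a Python sketch added for illustration, not part of the original post): converting a single gram of mass entirely into energy releases an enormous amount of it.

# Illustrative only: plugging one gram into E = mc^2.
c = 299_792_458                          # speed of light, in meters per second
mass_kg = 0.001                          # one gram of mass, fully converted to energy
energy_joules = mass_kg * c**2
tons_of_tnt = energy_joules / 4.184e9    # one ton of TNT is about 4.184e9 joules
print(f"{energy_joules:.2e} J, or about {tons_of_tnt:,.0f} tons of TNT")
# Prints roughly 8.99e13 J, on the order of 21,000 tons of TNT (more than the Hiroshima bomb).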
http://images.mid-day.com/2013/mar/Albert-Einstein-and-Thomas-.jpg

Thursday, November 20, 2014

Bill Gates and Microsoft released Windows 1.0 on this date 29 years ago -- November 20, 1985. Windows was designed to allow users to run multiple applications at the same time, and to enhance the use of graphics on computer displays, which were often blank DOS command screens at the time. To paraphrase Sheldon Cooper, “Windows 1.0 was much more user-friendly. I don’t like that.”
In actuality, the first version of Windows was a bit of a flop. It wasn’t a full operating system; instead, it piggybacked on top of a previous installation of DOS. Many reviewers thought the Windows overlay (which came on two floppy disks) clogged up system performance too much (this was in the days of 512 KB RAM). Another complaint that seems odd today: It was too dependent on the mouse. In 1985, users were still married to their keyboards, and having to reach out to grab a peripheral was apparently asking too much of many people.
Windows might have been a step too far in 1985, when home computers were mostly limited to “power users.” Its emphasis on an integrated graphical interface was intended to make navigating all of a computer’s functions easier for casual users. However, Microsoft stuck with the product. The first full upgrade, Windows 2.0, was released in 1987. Microsoft upgraded the Windows “shell” to a full operating system in the 1990s and eventually overtook Apple’s Mac OS to become the dominant operating system on personal computers. Today, around 90% of personal computers run some version of Windows (although smartphones and tablets are a different story).
As for Windows 1.0, it had a remarkably long life in the computer world, where today’s innovation is tomorrow’s piece of junk. Microsoft offered support for Windows 1.0 for 16 years. All support was finally discontinued on December 31, 2001.
http://cnet3.cbsistatic.com/hub/i/r/2012/02/29/913f5f5d-6de1-11e3-913e-14feb5ca9861/resize/620x/38225b15f7fdfa293af69628b024ee12/Windows1.0.png

Wednesday, November 19, 2014

When he delivered his two-minute address to dedicate the military cemetery at Gettysburg, Pennsylvania, seven-score and eleven years ago (that’s 151) on November 19, 1863, Abraham Lincoln said “The world will little note, nor long remember what we say here.” It was one of the worst predictions ever made.
The Battle of Gettysburg had killed or wounded around 35,000 Americans, split almost equally between North and South. What could Lincoln say to justify this carnage, and encourage people to go on? With the Gettysburg Address, a war that had been waged on the slippery idea of “Union” found more solid, and enduring, justification. Did Lincoln’s founding summary “conceived in liberty, and dedicated to the proposition that all men are created equal,” actually mean what it said? If so, the Union cause could have no greater foundation.
Lincoln’s evolving justification for the war…from a more abstract concern for holding the nation together to a rejection of the slave economy that had proved the greatest single threat to that union…is encapsulated at Gettysburg. It’s the difference between being a memorable wartime president, and having a monument in your name.
Worth noting is that not everyone realized the lasting power of Lincoln’s words at the time. The Chicago Times (which opposed Lincoln politically) derided the speech as “silly, flat, and dishwatery.” But Lincoln’s counterpart at the Gettysburg cemetery that day saw it differently. Lincoln had only been invited to the dedication as an afterthought. The keynote speaker was the noted orator Edward Everett, who spoke for two hours prior to Lincoln’s address. Afterward, Everett approached the president and said "I should be glad if I could flatter myself that I came as near to the central idea of the occasion, in two hours, as you did in two minutes."
http://i.huffpost.com/gen/1224920/thumbs/o-GETTYSBURG-ADDRESS-facebook.jpg

Tuesday, November 18, 2014

The patrons at New York’s Colony Theatre 86 years ago on November 18, 1928, weren’t actually there to see a cartoon. The main feature was the gangster film “Gang War.” But the movie has been all but forgotten today. All anyone remembers is “Steamboat Willie.”
Despite what you might have heard, this wasn’t the first time a cartoon had been produced with a synchronized soundtrack…although most of the previous attempts weren’t very good at keeping the sound lined up with the picture. It wasn’t even the first time a theater audience had seen Mickey Mouse. (An earlier test screening of the Lindbergh-themed “Plane Crazy” had failed to land a distributor.)
But it was the marriage of the two that made it all come together somehow. Mickey needed sound to be compelling, it seems. (Walt Disney had actually sold his beloved roadster to finance the soundtrack, which was kept on time with a track of audio cues placed on the actual film. The sale convinced Roy Disney that his little brother was serious about this talking mouse picture, and he stopped complaining about the cost.)
And as far as cartoons go, at least, it seems that sound needed an interesting vehicle like Mickey for people to care. The 27-year-old Disney, along with his partner Ub Iwerks, was betting just about everything he had on the mouse character taking off, having lost his earlier star, Oswald the Lucky Rabbit, to an old business associate through what was more or less legalized creative theft.
We all know how that story worked out. But it had to start somewhere, and while Disney (and the corporation he left behind) is fond of saying “It all started with a mouse,” it’s probably more accurate to say it all started when the mouse began talking, whistling, and playing barnyard animals like musical instruments. (Yeah, Mickey was kind of a jerk for a while.) The audience at the Colony ate it up, and Disney knew he had something. “Steamboat Willie” wasn’t the true genesis of sound cartoons, or Mickey, but it combined them in a way that let audiences finally see the potential of both. It was the first time either of them mattered.
http://www.disneyshorts.org/screenshots/1928/96/2large.jpg

Monday, November 17, 2014

The closing moments of a tight game always make for riveting TV, and viewers don’t have to worry about whether they’ll see every moment. It was not always so. In fact, it was 46 years ago that a perfect storm of events created a fiasco that changed how television handled sports forever.
November 17, 1968: The New York Jets travel to Oakland to meet the Raiders in one of the pre-merger AFL’s most bitter rivalries. The teams would trade the lead all day, and the high number of pass attempts, touchdowns, injuries, and penalties resulted in a slow-moving fourth quarter that had no chance of ending by 7 p.m. in the East – NBC’s strictly mandated cutoff time, at which point the network would switch to its heavily promoted sweeps event, “Heidi.”
Network execs watching the game at home realized the game would run over and tried to get through to the studio to give orders to stay with the game…but the phones were jammed with curious and annoyed viewers, some of them parents wondering if “Heidi” would start on time, many of them football fans wondering if they would see the whole game. They wouldn’t. By 7 p.m., the network programmer had received no order to the contrary, and had been told earlier in the week that “Heidi” must start on time. (The network had sold all of the movie’s sponsorship time to the Timex watch people, and their contract said the movie couldn’t be delayed for any reason.)
Meanwhile, back in Oakland, the Raiders pulled off a comeback for the ages, scoring two touchdowns in the game’s final minute to wipe out a Jets lead and win 43-32. No one east of the Mississippi saw it, and when NBC flashed the final score on screen in the middle of “Heidi,” viewers were outraged. Art Buchwald joked that men who wouldn’t leave their chairs in an earthquake went to the phone to yell at NBC.
The “Heidi Bowl” was a perfect confluence of events: An exciting game that dragged on longer than normal butting up against a heavily-promoted, exclusively-sponsored movie the network had already locked itself into, with a switchboard meltdown to boot. The Raiders’ unseen comeback was the icing on the cake. It was a disaster at the time, but it inspired NBC to install backup phone lines for network communications, and it changed how TV covers sports. The following month, NBC had scheduled a special showing of “Pinocchio” after a lead-in game. NBC’s newspaper ads assured football fans that the puppet would rather cut off his nose than cut off the end of the game. And when the Raiders and San Diego Chargers ran long in their December 15 game, NBC started “Huckleberry Finn” at 7:08, and delayed the rest of the night’s programming by eight minutes. A network spokesman said he couldn’t remember anything like it.
http://sportsrants.com/media/files/2012/07/HeidiBowl.jpg

Sunday, November 16, 2014

Benazir Bhutto became the first female leader of a majority Muslim country 26 years ago, winning election as Pakistan’s Prime Minister on November 16, 1988. She served two nonconsecutive terms, from 1988 to 1990 and again from 1993 to 1996. Like that of many politicians, her legacy is complicated. Allegations of corruption (including nuclear deals with North Korea) hounded her throughout her life, contributing to the fall of her first government in 1990.
But she was also a clear-eyed reformer, who advanced the causes of democracy and modernization (critics said “Westernization”) in the Islamic world…stances that brought her peril, both political and personal. She stood with the United States against Communism at the end of the Cold War, but recognized that both East and West were overlooking dangers grown in her own part of the world. She saw that America’s anti-Soviet mood had brought it in league with anti-Communist Afghan fighters like Osama Bin Laden, and prophetically told the first President Bush that anti-Soviet policies were “creating a Frankenstein” in Afghanistan.
She was assassinated in December 2007, while campaigning for another term in Pakistan’s Parliament. While she has been criticized for not doing enough to advance the cause of women while she was in power, the simple fact that she was elected at all spoke volumes in a part of the world often seen as anti-modern. Today, women fully participate in Pakistani politics and elections, and many of them no doubt do so with the memory of Benazir Bhutto in mind.
http://static.guim.co.uk/sys-images/Guardian/Pix/pictures/2013/9/24/1380031730904/Benazir-Bhutto-006.jpg

Saturday, November 15, 2014

It’s a rare advertisement that delivers exactly what it promises, but that was the case for Intel when it took out a half-page spot 43 years ago in the November 15, 1971 issue of Electronic News to tout its 4004 4-bit CPU, promising “a new era of integrated electronics.” If anything, the ad undersold the product. The 4004 was the world’s first commercially available single-chip microprocessor, and it would change computing forever.
You probably have at least a general idea that the central processing unit (or CPU) is the “brains” of your computer. It carries out the commands your various programs feed into it, allowing you to calculate tips, listen to music, or share pictures of your cat (or your lunch. Or your cat’s lunch…) with the push of a few buttons. The history of computing can largely be boiled down to the evolution of the CPU. The earliest computers didn’t have central command units that could carry out multiple commands, and had to be physically rewired to perform a new task. Central processing units came along to allow more versatility in what a computer could do, but they were bulky and slow, initially built from vacuum tubes before advancing to transistor-based designs.
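To make the “carrying out commands” idea concrete, here’s a toy sketch of the fetch-decode-execute loop at the heart of every CPU, from the 4004 to today’s chips (a Python illustration added here; the three-instruction machine language is invented and vastly simpler than anything Intel shipped).

# A toy CPU: fetch an instruction, decode it, execute it, repeat.
# The tiny instruction set below is made up purely for illustration.
def run(program):
    accumulator = 0
    pc = 0                       # program counter: which instruction comes next
    while pc < len(program):
        op, arg = program[pc]    # fetch
        if op == "LOAD":         # decode and execute
            accumulator = arg
        elif op == "ADD":
            accumulator += arg
        elif op == "PRINT":
            print(accumulator)
        pc += 1                  # move on to the next instruction

# Add a 15% tip ($6) to a $40 check, one tiny step at a time; prints 46.
run([("LOAD", 40), ("ADD", 6), ("PRINT", None)])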
The idea of containing a computer’s CPU on a single microchip was revolutionary. It allowed computers to become faster, smaller, and able to carry out more functions. The Intel 4004 was initially placed in a calculator released by Japanese manufacturer Busicom, beginning the era of microprocessor-powered consumer electronics.
Today a single microchip can contain multiple CPUs (or cores). A 4-bit CPU like the 4004 would be painfully slow and useless to a modern user, but the basic design of CPUs hasn’t changed much since the 1950s. A common comparison is to say that today’s cellphones have more computing power than the earliest space shuttles. For what it’s worth, the crew designing the Pioneer 10 space probe, which launched from Cape Canaveral in 1972 and became the first spacecraft to reach escape velocity from the solar system, considered including the 4004 among its components. They decided against it on the basis that it was too new.
http://upload.wikimedia.org/wikipedia/commons/thumb/a/a7/KL_National_INS4004.jpg/727px-KL_National_INS4004.jpg

Friday, November 14, 2014

Nellie Bly’s whole life was worth celebrating, but this story will work well enough. A pioneering female journalist who later became a leading industrialist, 25-year-old Elizabeth Cochrane (Bly was her pen name) decided to add world traveler to her resume 125 years ago. Specifically, she wanted to turn Jules Verne’s “Around the World in Eighty Days” into reality, while writing about it for the New York World. On November 14, 1889, she set out to do just that, boarding a steamer across the Atlantic for a journey that would take her nearly 25,000 miles. She used steamboats and railroads to traverse Europe and Asia, as well as two oceans. In France, she met the inspiration for her journey, Jules Verne himself. She visited a leper colony in China, bought a monkey in Singapore, and made it back to New York 72 days after she left, setting a new world record.

That record wouldn’t last long, but it was only one part of Bly’s legacy. She also exposed horrific conditions at a New York insane asylum by faking mental illness in 1887 to see how patients were treated firsthand. In later life, she married a millionaire manufacturer and took over the presidency of his manufacturing company. Besides being a captainess of industry, she was an inventor in her own right, patenting unique milk can and garbage can designs. (It’s been claimed she invented the 55-gallon oil drum, but this isn’t confirmed.)

She eventually went bankrupt thanks to embezzlement by her employees, and went back to journalism for a time, where she gave favorable coverage to the emerging women’s suffrage movement. Before she died in 1922 of pneumonia, she predicted right on the nose that it would be 1920 before women won the vote. Her life would have been remarkable for any person, but is even more so for a woman spanning the late 19th and early 20th centuries. I got up and dressed myself today, which felt pretty impressive until I read about her. Maybe I’ll circle the globe or invent a few things after lunch?
http://www.myhero.ws/images//ReadingRoom/books/bly.JPG

Thursday, November 13, 2014

The Vietnam Veterans Memorial was dedicated 32 years ago on November 13, 1982. Thousands of Vietnam veterans marched to the memorial’s site on the National Mall to pay tribute to their fallen comrades…more than 58,000 killed and missing service members who are memorialized on the wall’s two 246-foot-long wings.
The Vietnam memorial has become a pilgrimage site for millions today, but the design was controversial when it was announced. Created by a Yale architecture student named Maya Lin and chosen from over 1,400 entries, the somber design struck many as failing to pay adequate tribute to the heroism of those who served and died in Vietnam. The wall was called “a black gash of shame,” and Ronald Reagan’s Interior secretary initially refused to issue it a building permit.
Since its dedication, the arguments have abated and the wall has become a place for silent reflection, uniting observers who were divided on the war itself. (That reflection is literal; the surfaces of the wall’s gabbro panels are meant to reflect present-day observers, symbolically uniting them with the names from the past.) Taking a rubbing of a loved one’s name onto paper is among the most common acts by the wall’s 3 million annual visitors, as is leaving gifts. With the exception of perishable items and unaltered American flags, which are redistributed, the National Park Service catalogues the many offerings left at the wall. In addition to notes and stuffed animals, the collection includes at least one Medal of Honor returned to the government by the veteran who received it, large works of art like a glass door painted with a Vietnam scene, and a Harley-Davidson motorcycle with “HERO” on the license plate. The wall has also inspired several traveling and stationary replicas for those unable to visit Washington, D.C. At least one veteran has called it “the parade we never got.”
http://darkroom.baltimoresun.com/wp-content/uploads/2012/04/sr-sunshots-frozentime-0131.jpg

Wednesday, November 12, 2014

Scientists got their first close-up view of Saturn on this date 34 years ago, as Voyager 1 made its closest approach to the ringed planet (77,000 miles) on November 12, 1980. Voyager 1 sent back the first high-resolution photos of the planet, along with its famous rings and moons, changing much of what astronomers thought they knew about Saturn. The images Voyager 1 sent back of the planet’s rings showed there were not six rings but hundreds, and that they were intertwined, or “braided,” in ways that hadn’t been considered before. (This image was taken four days later, as Voyager 1 moved past the planet.)
Nine months later, Voyager 1’s sister craft Voyager 2 approached Saturn in August 1981. The Voyagers combined to send back a wealth of images of Saturn, confirming the existence of three new moons around the planet and a significant atmosphere around the largest moon, Titan.
Both Voyager probes were launched in 1977 and are currently heading out of the solar system. (Voyager 1 is the farthest spacecraft from Earth, at around 12 billion miles.) Voyager 2 also flew by Uranus and Neptune during its mission. The probes are expected to remain operational until around 2025, when they will lose power. If intelligent beings ever find either of them, they’ll come across documents, pictures, and recordings from Earth, including spoken greetings by human children, written messages from former leaders including Jimmy Carter, and a range of sounds, from whales to Beethoven to Chuck Berry.
http://upload.wikimedia.org/wikipedia/commons/a/a9/Vg1_p23254_hires.jpg

Tuesday, November 11, 2014

It’s Veterans Day (or Remembrance Day in places like Canada and the UK), but for a while, this holiday had a different name: Armistice Day. It was established to celebrate the end of World War I, which came 96 years ago at 11 a.m. along Europe’s Western Front on November 11, 1918 (“the eleventh hour, of the eleventh day, of the eleventh month”).
The armistice that ended World War I (known for years as The Great War) was signed in a railroad car outside Compiègne, France, as Germany faced up to its impossible position after four years of trench warfare against the Allied Powers and finally laid down its weapons. The armistice was signed at 5 a.m., but gunfire continued right up to the war’s official end at 11 a.m. An American soldier named Henry Gunther is generally recognized as the last man to die in World War I. He was killed one minute before the armistice went into effect…one of more than 2,700 deaths on the last day of the war.
The war decimated Europe, bleeding each of the Great Powers of a million men apiece, and that’s before tallying the deaths from starvation, disease, and cold. Its end was as much cause for relief as celebration…and the holiday that commemorated its armistice was intended to be a reminder of the need to preserve the peace. But that peace was short-lived, and the holiday was renamed in most countries around the time of World War II. British Commonwealth countries chose Remembrance Day, while the US went with All Veterans Day, since shortened to Veterans Day. (Four countries still recognize Armistice Day.)
World War I is nearly a century behind us now, and its carnage has been sadly replaced with no shortage of more recent bloodshed. But for the world in 1918, November 11 was blessed relief…a chance to tally up the butcher’s bill without seeing it extended, a chance to mourn, to try to rebuild, and to hope for a peace that we still look for on this day for remembrance, this day for veterans…this Armistice Day.
http://static.guim.co.uk/sys-images/Guardian/About/General/2010/11/11/1289473495580/Armistice-Day-006.jpg

Monday, November 10, 2014

Audiences found their way to “Sesame Street” for the first time 45 years ago, as the show debuted on November 10, 1969. Designed with a comprehensive curriculum and intended to teach preschool skills to low-income kids, the show was the most systematic combination of entertainment and education attempted in a children’s show. For the show's creators, the goal was to "master the addictive qualities of television and do something good with them."
Writing an episode of “Sesame Street” must be a pretty easy paycheck, right? Pick a number of the day, have Cookie Monster eat cookies, throw in a lesson about sharing, call it a day. Apparently not. One of the show’s head writers has said that any of the 15 writers on staff at a given time tend to burn out after a dozen scripts or so. Learning how to hold preschoolers’ interest is a major challenge, according to the show’s writers and producers, like Tony Geiss, who said "It's not an easy show to write. You have to know the characters and the format and how to teach and be funny at the same time, which is a big, ambidextrous stunt."
Oddly enough, “Sesame Street” has attracted controversy from its earliest days. A commission in Mississippi voted to ban it in 1970 for promoting racial integration, while the famous social scientist Urie Bronfenbrenner criticized it for being too wholesome, which is an odd criticism for a show with gluttonous blue monsters, obsessive-compulsive vampires, and poverty that forces people to live in trash cans.
The show also turned Jim Henson and his Muppets into household names. With its original government funding on shaky ground (it was revoked in 1981), the show explored other sources of funding, including selling books and toys. Henson was iffy about marketing his characters, but finally agreed on the condition that merchandising revenue be used only to fund the Children’s Television Workshop’s production costs and educational outreach.
“Sesame Street” has won its share of critical acclaim, with over 100 Emmys in the awards case, but the show has also been wildly successful in its attempts to reach young audiences. In 1996, a survey showed that 95% of American preschoolers had seen the show at least once by age 3.
http://www.kqed.org/assets/img/tv/programs/sesamestreet-300x300.jpg

Sunday, November 9, 2014

While the Cold War symbolically separated the world into east and west, there was nothing symbolic about the 96-mile barrier constructed around West Berlin, cutting people off from friends and family members on the other side. Any East Berliner who approached it ran the risk of being shot dead, a fate that befell more than 100 people between 1961 and 1989. For almost three decades, the Berlin Wall was the most concrete symbol of the forceful division Germany and Europe had inherited after World War II…and then, one day 25 years ago, it wasn’t.
To echo the famous line, the fall of the Berlin Wall happened slowly, then all at once. During the summer of 1989, Hungary had liberalized its border crossing policies at the Austrian border, allowing thousands of East Germans to make it into West Germany through Hungary and Austria…the exact migration that the wall had been built to prevent in 1961. (Communist authorities always maintained that the wall was designed to protect East Germans from “fascist” infiltration from West Berlin…although almost all of the human movement had been in the other direction before the wall was erected.)
On this day a quarter-century ago, it all came to a head. On November 9, 1989, East German authorities decided that all border crossings between East Germany and West Germany would be opened…including the infamous checkpoints along the Berlin Wall. (West Berlin was contained entirely within East Germany, and completely surrounded by the wall.) A party boss was handed a note with the policy change to read live during a press conference. He did so dutifully, and was asked by reporters when the changes would take effect. The boss had been given no guidance, and said that he guessed the change was immediate. That was all it took to send thousands of East Berliners to the wall. Overwhelmed crossing guards tried to get guidance on how to deal with the crowds before finally opening the gates. The wall would physically stand until the following year, but as a meaningful barrier, the Berlin Wall was finished.
The two larger barriers the wall had represented…the 860-mile frontier between all of East Germany and West Germany, and the longer imaginary Iron Curtain dividing Europe…were also soon to crumble. Germany was reunified in 1990, and the Soviet Union would dissolve by Christmas 1991, ending the Cold War. But that was somewhat anti-climactic. The most enduring images of the Cold War’s final days were of Berliners literally climbing over and chipping away at a hated symbol that had divided a city, a country, and in a real sense, a continent, for decades.

Saturday, November 8, 2014

While experimenting with cathode rays in his lab, German physicist Wilhelm Röntgen accidentally discovered x-rays on this date 119 years ago. Röntgen’s discovery came on November 8, 1895, when he noticed a shimmering effect on a chemically coated plate on a bench about 3 feet away from the tubes he was working with. He theorized that invisible rays emanating from the tube were causing the effect, and spent the next few weeks experimenting. He discovered the rays passed through human flesh, as well as books and paper, but not denser substances like bone or lead. When he took a picture of his wife’s hand being penetrated by the strange rays, exposing her bones, she exclaimed “I have seen my death!”
Röntgen called the discovery “x-rays” as a sort of placeholder name. (The “X” simply meant “unknown.”) The name stuck in English, but other languages name the rays for their discoverer, using names that are equivalent to “Röntgen rays.” Röntgen won the first Nobel Prize in physics in 1901, and the medical applications of his discovery were quickly apparent. For the first time, doctors could see inside the human body without surgery.
Slower to emerge was an understanding of the danger caused by exposure to radiation. Researchers reported burns after exposure to x-rays, and an assistant to Thomas Edison who frequently x-rayed his own hands died of skin cancer in 1904. Still, x-rays were regarded by many scientists as being as harmless as light. Seen as a fun novelty, x-ray machines could even be found in shoe stores in the 1930s, letting customers view the bones in their feet as they were fitted for shoes. Eventually, the dangers became clear, and Röntgen’s discovery came to be applied in more limited settings. Healthcare and the TSA might be politically sticky subjects, but one other thing ties them together: They both make frequent use of x-rays.
http://www.sunsetradiology.net/uploads/1/0/5/8/10588004/7000712_orig.jpg?278

Friday, November 7, 2014

You’ve seen the logo flashed before PBS shows, but the Corporation for Public Broadcasting got its start with an act of Congress, which was signed by Lyndon Johnson 47 years ago on November 7, 1967. The CPB is organized as a non-profit corporation funded almost exclusively by government dollars. Following its creation, the CPB helped establish the Public Broadcasting Service (PBS) in 1969 and National Public Radio in 1970.
Contrary to popular belief, PBS and NPR aren’t government-sponsored broadcasters. While their affiliate stations do receive money from the CPB, the majority of their budget comes from donations solicited from their audiences (a.k.a. “Viewers Like You”). CPB grants make up on average 15-20% of the costs to run a public broadcasting station.
Still, the presence of any government funding has opened public broadcasters up to charges of political bias. The usual claim is that PBS and NPR programming leans too far left, but not always. In 2002 and 2003, audiences of both NPR and PBS were polled and reported that they mostly saw no bias in the programming…and those who did were split over whether the bias was liberal or conservative. (For what it’s worth, the CPB’s board of directors can’t have more than five of its nine seats filled by presidential appointees from the same political party.)
Whatever might be said about news and commentary, there’s little controversy to be found around public broadcasting content like “Sesame Street,” “Mister Rogers’ Neighborhood,” “This American Life,” or “A Prairie Home Companion,” and the fact that they’re all brought to you commercial-free makes for a nice change of pace from the rest of the radio or TV dial. Except during pledge drive season…but hey, life’s full of tradeoffs. Having Cookie Monster beg for a few dollars every now and then is just part of the deal.
http://www.cpb.org/annualreports/2009/images/stories/about_cpb_logo.jpg

Thursday, November 6, 2014

America’s obsession with college football got its start 145 years ago, with the first documented game between two American colleges in New Brunswick, New Jersey on November 6, 1869. Rutgers and Princeton (then called the College of New Jersey) were the participants, with Rutgers winning 6-4.
The game had little resemblance to the one millions of Americans devote their autumn Saturdays to now. The teams lined up 25 to a side and had to advance a perfectly spherical ball by hitting or kicking it (no running or passing allowed) before scoring a “run” by kicking it into the opposing goal. Consider this proto-game a rough mix of soccer and rugby. (This painting by Arnold Friberg gives an idea of what the chaotic first game might have looked like.) Both teams made use of the flying wedge formation, a devastatingly effective tactic in which a wall of players advances by locking arms. (The move was too devastating, as it turned out. The flying wedge was a popular tactic in football’s early days, but caused so many injuries that it was banned.)
The revered tradition of smack talk from fans also got its start during this game. About 100 spectators showed up to watch the contest, including a Rutgers professor who allegedly waved his umbrella toward the field and yelled “You will come to no Christian end!” (Hey buddy, just calm down. OK?) The curse might have been directed at Princeton’s players after a particularly rough scoring play, or it might have been an expression of disgust at the whole violent spectacle.
Princeton earned some payback with an 8-0 win during a rematch a week later. (Each team won on its own campus.) A third game had been planned, but was called off. One supposed reason: Professors and college administrators worried the young men were taking too much time away from their studies to focus on football.
http://a.espncdn.com/photo/2008/0617/ncf_rutgers2_412.jpg

Wednesday, November 5, 2014

Guy Fawkes Day has a long history in Great Britain, but is probably less familiar to American readers. To summarize: In 1605, a group of English Catholics plotted to overthrow the English government by blowing up the House of Lords with the king and Parliament in session. Guy Fawkes, a zealous Catholic who had previously fought for the Spanish Army, was given the unlucky job of guarding the gunpowder in the basement. When the government received an anonymous tip, a justice of the peace found Fawkes hiding with the explosives 409 years ago on November 5, 1605. Fawkes was arrested and eventually sentenced to death. The plot was foiled, the government was saved, and the country began observing an official day of thanksgiving on the anniversary of the plot’s discovery.
So for a (largely) American audience, why celebrate this at all? First of all, a few aspects of this holiday are definitely not worth celebrating. For many years, November 5 (alternately called Gunpowder Treason Day, Bonfire Night, and Guy Fawkes Day/Night) carried a strong anti-Catholic odor in many circles. A common tradition was burning the pope in effigy. This practice followed English emigrants into other places, including the American colonies. George Washington was so annoyed by it that he issued an order reminding his soldiers of the need to win the friendship and loyalty of Catholic French-Canadians, bemoaning the existence of "Officers and Soldiers in this army so void of common sense" that they would participate in "that ridiculous and childish custom of burning the Effigy of the pope."
Since then, the holiday's religious overtones have faded, as England itself has become a less Protestant and more diverse place. Today, some Guy Fawkes revelers are just as likely to burn obnoxious celebrities as the pope or Guy himself (who has become a sympathetic symbol for many anti-government types). Fireworks and bonfires are an annual tradition on November 5, and there are few things Americans love more than an excuse to blow things up or set them on fire. Some have theorized that some of the holiday’s traditions, like children dressing up in masks and begging for pennies, evolved as a proxy for Halloween, which never caught on in Britain despite its Celtic heritage (but has become more popular lately). And finally, spare a second to remember that the whole thing began with the foiling of a terrorist plot, which saved many lives. Religion and politics aside, that’s worth a cheery nod. Guy Fawkes Day is a complicated holiday (one writer said it means “all things to all men”), but that complication contains enough bright points to make it an anniversary worth celebrating, one way or another.
http://news.bbcimg.co.uk/media/images/70933000/jpg/_70933067_70933066.jpg

Tuesday, November 4, 2014

By 1922, Howard Carter was getting desperate. The British archaeologist had spent a decade in Egypt’s Valley of the Kings looking for the legendary tomb of King Tutankhamun, lost for more than 3,000 years. But his wealthy benefactor was getting impatient and warned Carter that he was pulling his funding at the end of the season. Time and money were running out.
And then, 92 years ago, the breakthrough Carter and his crew had hoped for finally arrived. The men found a set of steps on November 4, 1922, and began their excavation. Three weeks later, Howard Carter entered the long-lost tomb of King Tut…the best preserved pharaoh’s tomb ever found in the Valley of the Kings. Carter’s first look at the tomb was lit by a candle after he breached a corner of the doorway with a chisel. With his benefactor, Lord Carnarvon, over his shoulder, he saw a room filled with gold and ebony treasures. “Can you see anything?” Carnarvon asked, and Carter replied “Yes, wonderful things!”
His crew spent months cataloging those wonderful things, and in February 1923, Carter finally found the sarcophagus of King Tut, known as the “boy king,” who took the throne around age 9 and died about a decade later, in or around 1323 BC. The gold coffin containing his mummy was nested inside two others, and his golden burial mask became the symbol for a revival of interest in ancient Egypt. As for Tutankhamun himself, the unfortunate boy-king most likely suffered from a degenerative bone disease, which probably combined with malaria to kill him as a teenager. Many of the treasures excavated from King Tut’s tomb have toured the world and are now housed at Cairo’s antiquities museum. As one author put it, “The pharaoh who in life was one of the least esteemed of Egypt’s pharaohs has become in death the most renowned.”
http://www.comeseeegypt.com/images/tut.jpg

Monday, November 3, 2014

Tokyo got its first taste of green, scaly doom 60 years ago, as Godzilla debuted to Japanese audiences in his eponymous 1954 film on November 3. Nearly 10 million tickets were sold, and that’s before counting the damage “Godzilla” wreaked on American theaters. Six decades later, the original Godzilla movie is still the second most-attended film in the series in Japan.
Godzilla’s status as a metaphor for nuclear destruction is well-worn ground for commentators, as he represented the fears of Japanese movie-goers less than a decade removed from the atomic holocausts of Hiroshima and Nagasaki. These parallels were clear from the character’s first rampage, and Japanese critics gave the film harsh marks as an exploitation of the nation’s trauma. But that couldn’t keep movie-goers away, and by 1955 “Gojira” (the closest transliteration into English) had crossed the Pacific and was playing in predominantly Japanese-American neighborhoods. In 1956, a heavily edited version, featuring Raymond Burr in added footage, was released for English-speaking North American audiences.
Following his first outing, Godzilla has spawned 27 official sequels, along with two American reboots (most recently just this year) and countless appearances outside of the movies. From his roots as a political allegory, the King of the Monsters has mined lighter territory, morphing into a type of superhero, a space traveler, and of course, a heavyweight brawler taking on all comers, including King Kong, Mothra, and Mechagodzilla.
But for all this versatility, he’s still best known in his original role as bringer of destruction. For all the ghouls, vampires, demons, and witches the movies have inserted into our nightmares, no monster can quite compare to what Godzilla represents…a force of nature, without pity or logic, which you can only hope will pass you by. There’s no talking to Godzilla. There’s no stake through the heart or magic spell to bail you out. There’s just survival. Whether it’s the atomic bomb, hurricanes, wildfires, or some other mindless doom, Godzilla is real. Tokyo, or any city, is never truly safe.
http://upload.wikimedia.org/wikipedia/en/2/29/Godzilla_'54_design.jpg