Saturday, February 28, 2015

BONUS AWC: It’s not a leap year, but this series leaves no date behind in its quest to find an anniversary worth celebrating in every spot on the calendar. While February 29 remains in hiding for another year, we’ll take a moment to celebrate an event that happened on the rarest date of all.
For Hattie McDaniel, plenty of moments surrounding her performance in “Gone with the Wind” were not worth celebrating. When the movie debuted in Atlanta in December 1939, she was barred from the event – along with all the film’s black actors – due to Georgia’s segregation laws. Even on the night when she attended the Academy Awards ceremony in Los Angeles, she had to sit at a separate table from the rest of the cast and crew. But despite all that, she took home the Oscar for Best Supporting Actress for her role as Mammy, making her the first black Oscar recipient in Academy Awards history. It happened 75 years ago (more or less) on February 29, 1940.
McDaniel’s achievement came on a banner night for “Gone with the Wind,” which took home 10 Oscars, including Best Picture. While a triumph of production, the movie is not the best vehicle for racial progress. It celebrates the antebellum South, with a black housemaid whose name has become shorthand for a brand of female racial stereotype. The Mammy performance is hard for me to watch today…but it’s also easy to see how it might have seemed progressive in 1939 for audiences to watch McDaniel's character deliver sassy tough love to a spoiled white debutante. For her part, McDaniel – who had been a washroom attendant and waitress after the stock market crash of 1929 – seemed unapologetic about taking on roles like Mammy, which were the only acting jobs she could get. She supposedly responded to criticism from liberal groups by saying "Why should I complain about making $700 a week playing a maid? If I didn't, I'd be making $7 a week being one.”
Upon accepting her Oscar, McDaniel called it one of the happiest moments of her life in a gracious speech, during which she said “I sincerely hope I shall always be a credit to my race and to the motion picture industry.” She was both. In her Los Angeles neighborhood, she helped organize black homeowners who were sued by their neighbors in an attempt to keep the neighborhood white. (A judge threw out the case, allowing McDaniel and the other families to keep their homes.) She has two stars on the Hollywood Walk of Fame, recognizing her contributions to both radio and movies. She died in 1952 from breast cancer, at just 57. The whereabouts of her Oscar are unknown.
http://www.eurweb.com/wp-content/uploads/2012/02/hattie-mcdaniel.jpg
The story goes that Francis Crick walked into a pub near the Cambridge University lab where he and James Watson had been working and interrupted all the patrons’ lunches for a rather momentous announcement 62 years ago on February 28, 1953…he and Watson had “found the secret of life.” If the anecdote is true, Crick wasn’t too far off. By discovering the molecular structure of DNA, he and Watson had indeed made a discovery worth looking up from your bangers and mash for.
Deoxyribonucleic acid, as the smart kids call it, had been on scientists’ radar for decades before Watson and Crick. By the late 19th century, DNA was known as a microscopic substance, made up of nucleic acid, found in the nuclei of cells. Experiments in the 20th century suggested, and then confirmed, its role as an engine of hereditary traits.
What was still unknown was how the structure of the DNA molecule allowed it to pass traits through generations of living organisms. By isolating the now-famous double helix structure of DNA, the American Watson and the English Crick helped birth the entire field of molecular biology. Each DNA molecule is a spiral of two strands, each composed of nucleotides containing one of four chemical bases…and the ordering and pairing of those bases contain all the genetic information needed to code a life.
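For the code-inclined, here’s a purely illustrative sketch (mine, not anything from Watson and Crick) of that pairing rule in Python – adenine pairs with thymine, cytosine with guanine – showing how one strand completely determines its partner:

```python
# Illustrative only: the base-pairing rule means each strand fully
# determines its complement (A<->T, C<->G).
PAIRS = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    """Return the sequence of bases that would pair with the given strand."""
    return "".join(PAIRS[base] for base in strand)

print(complement("ATTACG"))  # -> TAATGC
```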
The genetic revolution has touched every segment of modern society, impacting medicine, agriculture, forensics, and archaeology. Even if the technical details escape us, we intuit the meaning of Watson and Crick’s work anytime we say something is so central to our being that it’s “in our DNA.” Watson and Crick would formally announce their discovery in Nature magazine in April 1953. But for a few unsuspecting patrons at Cambridge’s Eagle pub on this date, a sneak peek at the future arrived along with their steak and kidney pudding. Who says you can’t find the meaning of life on a barstool?
http://rack.1.mshcdn.com/media/ZgkyMDEzLzA2LzEzLzU0L2RuYS40NjE1MC5qcGcKcAl0aHVtYgkxMjAweDYyNyMKZQlqcGc/54ec3455/275/dna.jpg

Friday, February 27, 2015

Largely forgotten today, the events of 155 years ago might have been uniquely responsible for propelling Abraham Lincoln to the White House. Harnessing the dominant communication media of his time, Lincoln used a speech and a photograph on February 27, 1860 to emerge into the national awareness in a new and dramatic way.

Lincoln spent the day at Cooper Union, a private college in Manhattan’s East Village, where he delivered a 7,000-word address carefully detailing his position on slavery. Lincoln labored to justify his stand against slavery’s expansion into Western territories that were preparing for statehood, aligning his position with those of the Founding Fathers. The speech has been overshadowed since, for good reasons: Lincoln was making a tepid, conservative case against slavery’s expansion, far less inspiring than the abolitionist stances he ended up taking as a result of the Civil War.

But if modern audiences find little of interest in the speech, Lincoln’s intended audience was very impressed. Lincoln was a man from the frontier known for telling jokes. He had made a name for himself in Illinois by debating Stephen Douglas during the Senate campaign of 1858. But Lincoln’s careful argumentation on this stage – “informed by history, suffused with moral certainty, and marked by lawyerly precision,” one Lincoln scholar said – helped transform him from regional curiosity to national leader. It didn’t hurt that the speech was given in New York, the nation’s media capital. The speech was reprinted in newspapers and pamphlets across the North, and has been seen as the crucial piece in helping Lincoln gain the young Republican Party’s nomination for president that May.

A few hours before the speech, Lincoln made another stop – at the photography studio of Mathew Brady, remembered today for his stirring photographs of Civil War battlefields. Lincoln called on Brady for a portrait. Photography was still a young medium, and Brady’s picture of a beardless Lincoln provided the perfect visual counterpoint to Lincoln’s successful speech. After he won the nomination, Harper’s Weekly used the portrait to make a triumphant front-page engraving of the nominee (erroneously captioned as the “Hon. Abram Lincoln”). In 1860, simply seeing a candidate look presidential had value in itself…particularly for Lincoln, whose opponents described him as so gangly and thin that a humanizing picture was worth a thousand words. Or as Lincoln said later, the picture allowed him to display a “human aspect and dignified bearing” to offset his opponents’ descriptions of him. (Could simply proving you look human be enough to swing an election?)

The tides of politics are tricky to discern. Lincoln towers over our national consciousness today, but the fact is he had to win election in a system just as dominated by its channels of communication as ours is today. Could a speech and a photograph really swing an election? Lincoln seemed to think so, when he was quoted as saying “Brady and the Cooper Institute made me president.” Of course, knowing Lincoln, this could have been intended as a wry comment on the outsize importance one day’s events could have on a whole campaign. But whether it was said earnestly or facetiously, there was certainly some truth to it.
http://www.printsoldandrare.com/lincoln/103linc.jpg

Thursday, February 26, 2015

As an animator in the 1930’s and 1940’s, Tex Avery basically had two choices: Emulate the formula that had allowed Disney to dominate the young medium, with cute animals and doe-eyed kids…or go in the complete opposite direction. He chose Plan B.

Frederick Bean Avery was born 107 years ago on February 26, 1908, in Taylor, Texas…which easily explains his more famous nickname. Attempts to explain his unique outlook on the animated short, and what it could do, are not as simple. One theory reaches into a day when the young animator was horsing around with some coworkers at Walter Lantz’s studio in the early ‘30s. A thumbtack flew into his left eye, costing him his sight in that eye. Avery’s lack of depth perception could help explain the crazy proportions and physics he brought to his characters, taking the rubbery “squash and stretch” nature of cartoons to extreme places.

Avery’s most famous work came after leaving Lantz, when he moved to Warner Bros. and later to MGM. At Warner, he was assigned to his own production unit and oversaw a team of animators while developing the studio’s “Looney Tunes” series. Staffed with animators like Bob Clampett and Chuck Jones, the Avery unit pushed the “Looney Tunes” shorts to the top of the field, elbowing Disney’s tamer fare out of the way. (Is it any coincidence that Disney seemed to lose interest in the short and turned to feature-length animation in the ‘30s?) The Warner shorts under Avery brought a crazy style of action, and developed the studio’s top-flight characters. First, Avery’s crew took a pre-existing (and rather bland) Porky Pig, and made him a genuine star. Then they developed wackier foils for the studio’s straight characters, like Daffy Duck and Bugs Bunny – both of whom started out in much more manic forms. (The phrase “What’s up, doc?” had supposedly been popular around Avery’s North Dallas High School.)

While Avery was successful at Warner Bros., he clashed with management and disliked the working conditions. (His crew nicknamed their first bungalow “Termite Terrace.”) In 1941, he moved over to MGM, where his frantic style faced fewer restraints. He developed new stars, most notably Droopy. His MGM cartoons were fast-paced and silly, with extremely violent gags. His characters frequently broke the fourth wall, directly addressing the audience. He also played with sexuality about as much as it was possible to. And while not his most famous creation, his most indelible character might be the MGM Wolf, a lust-crazed wolf always chasing down a curvy redhead designed to keep the dads in the audience paying attention.

Avery was unrestrained but overworked at MGM, and took a year off in 1950 to recover from fatigue. In 1953, he returned to the Lantz studio, his original employer. He didn’t stay long, only directing four cartoons, but this brief stop still produced a legacy character, as he helped develop the mute penguin Chilly Willy. He ended his career doing TV commercials and working with Hanna-Barbera. He died of liver cancer in 1980 at age 72.

He has received countless plaudits, but probably the best that can be said about him is that his work is still influencing the medium he loved. In a direct line tracing from Roger Rabbit through Ren & Stimpy to SpongeBob SquarePants, animators have never stopped leaning on the Avery style. Roger Rabbit might be the biggest tribute: In a movie developed by Disney and featuring characters like Mickey Mouse and Donald Duck, Avery’s creations…Bugs, Daffy, Droopy, and his sexy redhead (in fakeout form)…are right there too, adding their own opposing, but just as influential, perspectives to Toontown's landscape.
https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg7ydsmTF_pHtNnX-c1xpaatnEHvPuf9SnGnqIw3ULgvLkWe99a2f8xGXyE0DUnCuuglGd_xR8Dlt8HA0KoHkQLczxuJmAgxZ8clVXd1qoTqYQwGquonM_IOdAPb9KxtWW_XGEi-2Ox334/s1600/Tex+Avery.jpg

Wednesday, February 25, 2015

AWC is celebrating the quiet guys today…or the relatively quiet, anyway. Zeppo Marx and George Harrison (born 42 years apart on this date) didn’t exactly lead lives of anonymity, but as members of famous quartets, both of them worked in the shadows of more flamboyant and famous team members.
Zeppo was the stage name for Herbert M. Marx, the youngest of the five Marx Brothers, born in New York 114 years ago on February 25, 1901. As part of his brothers’ famous comedy team, Zeppo was the straight man when the Marx Brothers made the leap from the vaudeville stage to the big screen. He appeared in the Marx Brothers’ first five films, playing the romantic lead or the regular guy while his brothers Chico, Harpo, and Groucho clowned around. (The fifth Marx brother, Gummo, was even more anonymous…he never appeared in a movie after the brother act went Hollywood.) One unattributed quote referred to “the plight of poor Zeppo Marx,” claiming that “While Groucho, Harpo and Chico are hogging the show, as the phrase has it, their brother hides in an insignificant role, peeping out now and then to listen to plaudits in which he has no share.”
Zeppo finally left the group to the funnymen, but he might have had the last laugh. His talents as a mechanic (he was the one who kept the family car running) led him into an engineering career where, among other things, he owned a company that produced military parts during World War II and invented a watch that tracked the pulse rate of cardiac patients. He also started a successful theatrical agency and became very wealthy from his post-Marx Brothers ventures before his death in 1979.
George Harrison (born in Liverpool 72 years ago on February 25, 1943) was known as “the quiet Beatle.” While John Lennon and Paul McCartney famously wrote most of the band’s songs and fought over its direction, and while Ringo Starr received a drummer’s unique brand of fame, George kept it mellow, writing a few major songs and playing lead guitar. His contributions to the band were nonetheless distinct. Indian culture seeped into the Beatle repertoire, thanks to his interest in Hinduism. (He introduced the band to the eerie sound of the sitar, and for better or worse, the Hare Krishna movement.)
After the Beatles broke up, Harrison set out on a successful solo career, raised money for starving Bangladeshi refugees in a major 1971 benefit concert, and formed the most awesome superband ever in 1988, bringing Bob Dylan, Tom Petty, and Roy Orbison on board for a few jam sessions that turned into the Traveling Wilburys, because that’s what you get to do when you were in the Beatles. He was only 58 when lung cancer took him in 2001, leaving a lot of guitars to gently weep. His memorial tree is being replaced in Los Angeles today, on his birthday. It turns out the quiet Beatle’s memorial was quietly killed...by beetles.
Photo collage assembled by AWC

Tuesday, February 24, 2015

This entire series revolves around it, so it’s only fair to dedicate some love to our modern calendar. When Pope Gregory XIII announced a reform to the old Julian Calendar 433 years ago on February 24, 1582, it looked like a minor tweak, compared to the major overhaul the Julian Calendar itself had been. The Julian Calendar had been in use since the Romans had introduced it over 16 centuries earlier, and for the most part, it was a good one. (The pre-Julian calendar it replaced in 45 BC, on the other hand, was a mess, with a wacky leap month of variable length inserted every few years.) The Julian Calendar was much more stable, with 365 days divided into 12 months, and an extra day squeezed in every four years. Sounds basically like today, right?

The problem was that the actual solar year is not a precise 365 ¼ days long, so the Julian Calendar overshot the Earth’s orbit around the Sun by about 11 minutes every year. This added up to an extra 3 days every 400 years. By Gregory’s time, the vernal equinox – which helped determine the date of Easter – had drifted about 10 days to March 11. Gregory knew that if he didn’t do something, Easter would eventually end up in December, and then Santa and the Easter Bunny would have some kind of holiday death match.
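If you want to check the arithmetic yourself, here’s a rough back-of-the-envelope calculation in Python (the year lengths are my approximations, not anything Gregory’s astronomers wrote down):

```python
# Approximate year lengths, in days
julian_year = 365.25       # what the Julian Calendar assumes
solar_year = 365.2422      # roughly the actual solar (tropical) year

drift_per_year_min = (julian_year - solar_year) * 24 * 60
drift_per_400_years_days = (julian_year - solar_year) * 400

print(round(drift_per_year_min, 1))        # about 11.2 minutes per year
print(round(drift_per_400_years_days, 1))  # about 3.1 days every 400 years
```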

To restore Easter to its rightful spot, Gregory issued a papal bull on this date reforming the calendar. Well, it was sort of on this date. If you’ve been paying attention, you know that the Julian Calendar, which dated Gregory’s pronouncement, had drifted by about 10 days, making this an anniversary only in a rough sense. This is a common problem with dating events before Gregory’s calendar reforms, and our options are basically to do a lot of extra math to match any Western date before 1582 to our modern calendar, or just live with it.

Gregory’s reform made three changes: One was to cut back on the leap years, which were pulling Easter further and further up in the year. Gregory eliminated 3 leap years every 400 years, which is still what we do today. (Years ending in “00” are only leap years if they can be divided by 400. The year 2000 was a leap year, but 1700, 1800, and 1900 weren’t.) The second change involved some complicated business with the lunar calendar which makes my head hurt to think about, but which was crucial to putting Easter back where the Nicene Council had placed it in 325. (On another note, calculating the date of Easter is really complicated.)
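For anyone who wants to see the leap-year rule from that first change spelled out, here’s a minimal sketch in Python (the function name is my own invention, not official terminology):

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: divisible by 4, except century years not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 2000 qualifies; 1700, 1800, and 1900 do not.
print([y for y in (1700, 1800, 1900, 2000) if is_leap_year(y)])  # [2000]
```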

Finally, there was the issue of what to do with all those extra days that had built up under the Julian Calendar, like plaque in the calendar’s teeth. Gregory just waved them away, and people in the Catholic areas that adopted the calendar reforms right away went to bed on October 4, 1582 and woke up on October 15 (presumably very hungry and really, really needing to pee). It would take much longer for the Gregorian Calendar to catch on outside of Catholic Europe. Protestant and Orthodox countries held out for a while, but the logic of Gregory’s changes (and demands for convenience in international trade) eventually overcame religious suspicion. In 1923, Greece became the last European holdout to make the switch. Today the Gregorian Calendar is the standard for much of the world. But I’ll be honest with you, I still have no clue when Easter is. Sorry, Greg.
http://www.nta.ng/wp-content/uploads/2015/01/nta-image-gallery-calendar.jpg

Monday, February 23, 2015

Your correspondent will fully admit that today’s date is a bit of an educated guess. It’s hard to say exactly when Johannes Gutenberg made the first Bible printed on movable type available. (Oddly enough, the date wasn’t saved in print anywhere.) But there is some agreement among scholars that Gutenberg’s Bible might have been published in Mainz, Germany, on February 23, 1455 – 560 years ago today. One future pope wrote in a letter that he had seen pages of the Gutenberg Bible on display in Frankfurt that March. Considering that this would have been a 17-mile distance from Mainz (or approximately 400 million miles to people in the 15th century), dating the first publication to today is probably not too bad a guess.
After his invention of movable-type printing around 1439, Gutenberg had printed books using the technology. (The Chinese and Koreans had developed movable type earlier.) But nothing, East or West, had approached the scope and popularity of Gutenberg's Bible printing project. It’s estimated to have taken him three years to print about 180 copies of the Latin Vulgate, a project which had the blessing of the Catholic Church. The books were a labor of love, and the high standards of the ink and other materials have earned the Gutenberg Bible the reputation as one of the most beautiful books ever printed. Large margins allowed room for artists to add decorations, which could be as ornate as a buyer was willing, or able, to pay for.
Despite their high selling price (about three years’ wages for a clerk in some instances), the books sold out immediately and traveled across Europe. While expensive, the printed Bible was still more affordable than a handwritten copy.
The printing press set off a revolution in Europe. (If you’ve ever been to Epcot, you’ll know all about this part.) Print exploded across the continent in the decades following Gutenberg’s work. By the year 1500, 77 towns or cities in Italy alone boasted print shops. Scientists could share information accurately and easily. Errors introduced by scribes stopped plaguing copies of original works, and information in general was easily passed around. By the 1600’s, printing had become so cheap that the first newspapers started appearing.
Today 48 copies of Gutenberg’s Bible are known to exist, although only 21 are complete. Most of them are owned by university libraries and other scholarly institutions (like this one, on display at the University of Texas). The New York Public Library has what’s believed to be the first copy to have reached North America, in the mid-19th century. A complete copy hasn’t been sold since 1978, but it’s estimated that such a sale would fetch at least $25 million today. So I guess we can forgive Gutenberg if he didn’t record the exact date of his Bible’s publication. His press was a little busy at the time. Revolutionizing the entire Western world has its perks, but it probably doesn’t leave much time for anything else.
http://www.hrc.utexas.edu/exhibitions/permanent/gutenbergbible/images/gutenberg_case.jpg

Sunday, February 22, 2015

What can be said about the Miracle on Ice? Sports Illustrated ran this cover with no words, under the logic that everyone in the country knew what they were looking at and needed no explanation. Your humble correspondent is almost tempted to do the same. This piece of modern American mythology is better summed up in a single image than any string of words could hope to do. But we write about things here, so let’s at least try.

A good place to start is a reminder that the U.S. Olympic hockey team did not win the gold medal when they took down the mighty Soviet Union on American soil during the Lake Placid Winter Games 35 years ago on February 22, 1980. You could be forgiven for assuming they had won the gold from this victory shot, but that didn’t happen until two days later when the U.S. beat Finland 4-2.

But nobody remembers that. What has lived on for more than three decades is the image of a bunch of college guys from Boston and Minnesota (average age 21) shocking a buzz-saw Soviet team that couldn’t have been more heavily favored if God Himself had been in goal. The Soviets had lost one Olympic hockey match in 20 years. They had pummeled the best professional players from the U.S. and Canada, beating the NHL All-Stars 6-0. Now they got the chance to face a pack of American amateurs, the same team they had laughed past in a 10-3 exhibition win just two weeks earlier. If the entire Cold War had hinged on this game, a smart gambler would have started polishing his Russian before the puck dropped.

The only real prayer for the Americans was to stick with a hard-hitting, physical style of play that coach Herb Brooks had instilled in them. And while the U.S. got in their share of blows and managed to get Soviet goalie Vladislav Tretiak pulled after a fluke goal tied the game at the end of the first period, they still entered the final period down 3-2. After a long offensive drought, the Americans tied the game on a power play equalizer against the USSR’s backup goalie, and took their first lead of the day on a shot by team captain Mike Eruzione with 10 minutes left.

The Soviets wouldn’t go quietly. They started shooting wildly in the game’s closing moments, looking to get anything past American goalie Jim Craig, who stopped 36 of 39 shots on goal. As the final seconds ticked down, ABC’s Al Michaels echoed the disbelief he was feeling, and helped coin the name that would stick to this game for over a generation, when he asked the TV audience “Do you believe in miracles? Yes!” Another member of the broadcasting crew said it was like watching a bunch of Canadian college football players take down the Pittsburgh Steelers.

Cold War politics turned the game into a moment of national pride, at a time when the real Cold War wasn’t going well for the West. Soviet tanks had rolled into Afghanistan two months earlier. There were also Americans being held hostage in Tehran. For Americans in need of a dose of national pride, it was the right result, against the right opponent, at the right time.

So those are a few words that can be said about the Miracle on Ice. But to be honest, you’ll probably still get just as much by looking at this picture with no words attached.
http://cdn-jpg.si.com/sites/default/files/si/2010/writers/joe_posnanski/02/22/miracle.on.ice/mi/ra/cl/miracle-cover.jpg

Saturday, February 21, 2015

The first telephone directory was issued on this date 137 years ago in New Haven, Connecticut. Distributed on February 21, 1878, it was a single sheet of cardboard listing 50 technologically savvy businesses in New Haven that had telephones (which had only been patented two years earlier). There were no numbers, since early telephone users simply told the operator who they wanted a connection to. At this stage, just knowing who had a phone was useful information.
Today, we take access to phone listings for granted. We can Google any business we need to contact, and our friends’ numbers are typed in once, stored, and never thought of again (until we have to replace phones). But for years, the lowly telephone directory was the workhorse of our connected world. White pages for personal listings, yellow pages for service providers, and if your community was lucky enough to have one, a reverse directory for attaching names to numbers…these were the engines that powered countless economic and informational exchanges. (Yes, calling your friends after school to find out what’s up qualifies as an exchange of information.)
That’s all changed, of course. France had online phone listings as early as 1981 (for all six people who had internet access back then), and the U.S. started seeing listings migrate to cyberspace in 1996. The old phone book has become an outdated relic, and a wasteful one at that. (Greener cities like Seattle and San Francisco have tried to ban their unsolicited distribution.) But the death of the Yellow Pages is just a shift in format. The idea behind online listings is the same as it was in New Haven in 1878…your phone is useless unless you know who it connects you to.
http://www.haineslocalsearch.com/wp-content/uploads/2012/05/yellow-pages-directory.jpg

Friday, February 20, 2015

Ansel Adams (born 113 years ago on February 20, 1902) didn’t have the best early experience with the American West. As a four-year-old, he was thrown into a garden wall during an aftershock following the 1906 San Francisco earthquake. His nose was broken and stayed permanently crooked. To his credit, he didn’t hold it against the region; he spent his career as a photographer helping others fall in love with it.
In 1916, two things happened to a teenage Ansel that marked him forever: He visited Yosemite National Park with his family for the first time, and his dad gave him a Kodak Brownie box camera, his first. He was permanently smitten by both. He quickly became a photography enthusiast, while joining the Sierra Club and becoming a summer caretaker at Yosemite. He spent his summers hiking, camping, and taking pictures, while trying to launch a musical career the rest of the time.
His first pictures were published in 1921, and he would eventually give up his musical ambitions. (His small, easily bruised hands weren’t suited to a concert pianist’s career.) He spent 60 years in photography, with his career marked by black-and-white photographs of the West, especially his first love at Yosemite. (This photo, "Moonrise, Hernandez, New Mexico," became one of his most popular.) He developed the Zone System to optimize film exposure and development, and the complete sense of control he gained from this system might have explained his lifelong preference for black-and-white film.
He taught photography workshops, and supposedly told his students "It is easy to take a photograph, but it is harder to make a masterpiece in photography than in any other art medium." He won numerous awards, and even had a photo placed on board the Voyager spacecraft. If aliens ever encounter that repository of Earth’s knowledge, one thing they’ll find is Ansel Adams’ photo of the Tetons and the Snake River. He died at age 82 in 1984.
http://i.guim.co.uk/static/w-700/h--/q-95/sys-images/Guardian/Pix/pictures/2011/11/12/1321099450442/Ansel-Adams-Moonrise-Hern-011.jpg

Thursday, February 19, 2015

Nicolaus Copernicus (born 542 years ago on February 19, 1473) knew that mankind had no desire to be moved from the center of the universe, and he tried his best to delay it. His heliocentric theories, which refuted the popular views of his day, were worked out during his time as a civil servant along the Baltic Coast in present-day Poland, when he was supposed to be consumed by economic and administrative duties but couldn't keep from focusing on less earthbound ideas.
Copernicus dragged his feet on making his views public, perhaps intuiting some of the sharp resistance he would meet. He finally published his work “On the Revolutions of the Heavenly Spheres” in 1543, right before his death. He would never witness the upheaval it caused, although even that was delayed by the thick, technical language he used which made his work inaccessible to many readers. A small first printing of 400 didn’t even sell out.
But it didn’t take long for readers to slice through Copernicus’ thick writing to the heart of his contention: Earth revolved around the sun. Man was no longer at the center of everything. It wouldn’t stand. Popular imagination places the Pope at the head of the backlash, but this was a moment of religious unity: Protestants and Catholics alike hated what Copernicus stood for. Martin Luther called him a fool who ignored the clear account of Joshua stopping the sun in its tracks, and another Protestant theologian mocked him as the “astronomer who moves the earth and stops the sun.”
The brunt of the backlash would fall on other men who followed in Copernicus’ orbit, like Galileo. Maybe Copernicus delayed publication so long because he just didn’t want to deal with it, in which case it would be hard to blame him. Copernicus might have been looking out for his own well-being, which would have been threatened by men trying to protect their own power in turn. We might not be at the center of everything anymore, but we’re still pretty self-centered. Not even Copernicus could change that.
https://alexautindotcom.files.wordpress.com/2013/02/copernicus-picture.jpg

Wednesday, February 18, 2015

If you’re ever told that something will happen when it snows in the Sahara, feel free to point to 36 years ago, when precisely that happened for the first time in recorded history (and one of only two times your humble correspondent could uncover) on February 18, 1979.

Any readers from Boston might find this especially unimpressive at the moment, but for everyone else, the sight of camels standing in snowy sand in southern Algeria is just pretty darn cool…as well as a sign that no event should be dismissed as completely impossible. (AWC is unable to state for certain whether this particular camel roamed the Sahara during the 1979 snowfall or the more recent one mentioned later…so let him represent the Platonic Snow Camel on the cave wall of our hearts.)

The 1979 snowstorm, described as "the first in living memory" by the New York Times, brought traffic to a standstill in southern Algeria. It was out of the ordinary but also brief, lasting about a half-hour. Within a few hours, the snow was gone (which anyone from Boston should DEFINITELY find impressive). AWC has uncovered one other time when the Saharan lowlands received snow, this time in western Algeria on January 18, 2012. (Snow is a regular event in Saharan mountain ranges.)

Whether the 2012 snowfall is a harbinger of a new world of climate-change induced crazy weather -- where penguins fly over the desert and volcanoes spew purple lava at the South Pole -- or just another aberration remains to be seen. For now, snow in the Sahara remains a pretty cool sight, and a reminder that this crazy world we share is something we claim to understand at our peril. We don’t really know what can happen, and the possibilities are beyond us. If camels can handle snow, you can go out and tackle whatever has you nervous today. (By the way, do you guys know what day it is?)

http://i.telegraph.co.uk/multimedia/archive/02143/readers-snow-camel_2143977i.jpg
 

Tuesday, February 17, 2015

If computers someday become our masters, let it be said that we got in a few licks along the way. One of the most notable (or at least entertaining) came 19 years ago today, when grandmaster Garry Kasparov defeated IBM’s chess-playing supercomputer Deep Blue in a 6-game match which ended on February 17, 1996.

In fairness to our future robot overlords, AWC should note that an upgraded Deep Blue won a rematch against Kasparov the following year. But maybe that just enhances the impressiveness of Kasparov’s first win. Kasparov suspected IBM of cheating in the 1997 rematch by using human chess players to intervene between moves. (The rules only allowed humans to move Deep Blue’s pieces and reprogram “him” between games.) Kasparov asked for a third match, but IBM declined and retired Deep Blue. It’s possible Kasparov fell victim to the human tendency to see patterns in randomness. A random move by Deep Blue due to a bug spooked Kasparov during one game, making him think he was matched against a creative intelligence, instead of a brute force calculating machine. In all, Kasparov and Deep Blue played 12 games over the two matches, with Kasparov winning 4, Deep Blue taking 3, and 5 ending in draws.

A tale of the tape reveals that in the machine corner, Deep Blue was able to evaluate 200 million positions per second. That proved just enough to keep up with Kasparov out of the human corner, a Russian who became the world’s youngest undisputed chess champion at age 22.

Chess-playing computers have become passé since the 1990's, with developers focusing on chess software programs like Deep Junior over hardware specifically dedicated to the task. IBM would go on to develop Watson, a computer who went on “Jeopardy!” in 2011 and crushed the uber-champ Ken Jennings, who declared in his Final Jeopardy response “I for one welcome our new computer overlords.” He later wrote “'Quiz show contestant' may be the first job made redundant by Watson, but I'm sure it won't be the last.”

As for Kasparov, he retired from chess in 2005. He has spent his time trying to crack the door into Russian politics, but has found opposing Vladimir Putin to be far more confounding than putting Deep Blue in checkmate ever was. He currently sits on the board of directors for the Human Rights Foundation, a fitting spot for a guy who struck a small blow for humanity against technology on this date.

Monday, February 16, 2015

The first 911 call in North America was placed on this date, not in a major city but in Haleyville, Alabama 47 years ago on February 16, 1968. AT&T had agreed to institute a single, easy to remember emergency phone number across North America in response to a recommendation by a crime commission assembled by Lyndon Johnson. The president of the Alabama Telephone Company got wind of the idea and decided to implement the new service in Alabama first. On this date, two local politicians participated in the first 911 call, routed between Haleyville’s city hall and police station.

911 became the United States’ national emergency number in 1968 – with access to police, fire, and medical services – but it took a while to catch on. Public awareness of the service increased throughout the 1970's, and access took off outside the major cities in the 1980's. Today around 98 percent of residents in the U.S. and Canada can dial 911 and be connected with emergency services, and 96 percent within the US will be connected to Enhanced 911, which allows the dispatcher to automatically access the caller’s location (which has become more complex as cell phones and internet telephony have eclipsed the use of landlines).

Like any system, the service has its own challenges. Among them are callers reporting emergencies in a jurisdiction outside the area where the call originated (in which case dispatchers may not know who to contact), or calls being made in areas where emergency services have been cut, with dispatchers able to do little more than stay on the line during the long response time. (A woman in Oregon was raped while on the line with 911 during a call when no county police were on duty.) The rise of internet telephony has also allowed callers to more easily disguise their location, resulting in a rise in “swatting,” a dangerous “prank” where heavily armed police are sent to a location for no reason. Some areas are moving to make this a crime with the full costs of the response (which can run into the thousands of dollars) billed to the prankster.

Another common issue is misdialed numbers. This is especially a problem in telephone exchanges where “9” accesses an outside line and “1” precedes the area code on long-distance calls. A caller with jittery fingers might tap both buttons twice and hit 9911, which grabs an outside line and then dials emergency. Around Raleigh, North Carolina, the old 919 area code has been required on all calls after a recent area code overlay switch. Misdialed numbers beginning with 911 have resulted in a large number of hang-ups from the area, all of which have to be traced and called back to confirm the presence or absence of an emergency.

Despite these challenges, Americans dial 911 about 240 million times a year, while Canadians make at least 12 million calls annually. Both countries impose a fee on telephone customers to fund the service. In the United States, 911 costs telephone customers an average of 72 cents a month…which could be worth every penny if you ever find yourself needing it.
http://media.cmgdigital.com/shared/lt/lt_cache/thumbnail/960/img/photos/2013/05/03/4a/80/sns011212sherifffunds.jpg

Sunday, February 15, 2015

In the winter of 1925, the dog sled relay was a way of life in remote northern Alaska…and during that season, it also became a matter of life and death. Aircraft had yet to become reliable enough for the punishing conditions of the region, so mushing teams were responsible for delivering mail during the coldest months of the year. So when an outbreak of diphtheria threatened to decimate the small city of Nome, where Alaskan Natives had no natural immunity, it was up to a team of dogs and their sled drivers to deliver antitoxin to the one doctor in town, whose batch had expired. Without it, the estimated mortality rate was around 100 percent.
The “Great Race of Mercy,” as it came to be called, actually consisted of two relays. The first delivered a supply of 300,000 units of antitoxin, found in Anchorage, to Nome. It wasn’t enough to wipe out the outbreak, but it could hold it at bay while a supply of 1.1 million units was rounded up along the West Coast, shipped to Seattle, and delivered to the Alaska Territory. The trip from Nenana (which was as far as the trains could carry supplies into the Alaskan interior) to Nome normally took 25 days by dog sled. The antitoxin would only survive for about six days along the trail. Facing some of the worst conditions on the planet – with hurricane-force winds and temperatures consistently below -50 F – a team of 20 drivers and around 150 dogs covered the 674 miles in 5 ½ days from January 27 to February 2.
Two weeks later, the final shipment of antitoxin arrived in Nome. The 1.1 million units that would wipe out what threatened to become an epidemic arrived in Nome 90 years ago today on February 15, 1925. The second relay had taken a relatively leisurely seven days. The death toll from the diphtheria outbreak was officially listed at 5-7 people, but was likely higher since Alaskan Natives had a custom of burying their dead without alerting the authorities. A number of dogs also died delivering the antitoxin. Balto, the lead dog of the final leg of the relay into Nome, became emblematic of the heroic animals. He became a celebrity in his own right, and had a statue erected in New York’s Central Park. Today the Iditarod Dog Sled Race honors the serum run and the history of the dog sled in Alaska, which saw its last great hurrah come to an end on this date.

Saturday, February 14, 2015

Valentine’s Day is supposed to be a day for lovers, but a scientific discovery on this date should get its due too. With candy, wine, and roses can also come activities that leave a mark long after the champagne is gone…or would, if not for Alexander Fleming, who announced the discovery of penicillin 86 years ago on February 14, 1929.
The Scottish scientist claimed that he had discovered the antibiotic after a stray mold gained a foothold in an open plate of bacteria, inhibiting the bacterial growth around it as it expanded. Scientists have since called this story into question, noting that it doesn’t match up that well with the observed interaction of penicillin and bacteria. But if Fleming came up with a too-cute tale of his discovery, its arrival was no less fortunate. Penicillin made a major difference during World War II, resulting in far fewer deaths and amputations than would have been the case otherwise. In 1945, Fleming shared the Nobel Prize in Medicine for his discovery.
Since 1929, different strains of penicillin have been used to treat a host of bacterial infections, although bacterial resistance over time has made it less of a “miracle drug” today than when it was introduced. Tonsillitis and gangrene have been treated with penicillin over time, as well as a few products of a Valentine’s Day gone awry, like syphilis and gonorrhea. Because while romance can go sour, an antibacterial treatment will be there for you as long as natural selection allows it to be effective. What could be more heartwarming than that?
http://www.abc.net.au/news/image/570248-16x9-700x394.jpg

Friday, February 13, 2015

It was one of history’s most notoriously overdue library books. Sometime in the late 19th century, the first half of Mark Twain’s original handwritten “Adventures of Huckleberry Finn” manuscript disappeared from the Buffalo library to which Twain had donated it. It was presumed lost forever…until 24 years ago today, when Sotheby’s auction house in New York announced that it had authenticated a set of papers as the long-lost first half of “Huck Finn” on February 13, 1991.
The papers took a strange trip on their way back to Buffalo. The manuscript showed up in a trunk in an attic in Los Angeles, of all places. A pair of sisters had gained possession of the trunk on the death of an aunt in upstate New York. The sisters themselves were granddaughters of James F. Gluck, a lawyer and book collector who had been a friend of Twain’s and convinced him to donate the manuscript to the Buffalo library. The accepted theory is that Gluck borrowed half the manuscript from the library, then forgot he had it. He died suddenly in 1897, and the stack of papers, with no title page, was probably hurriedly stuffed in a trunk…where it stayed for nearly 100 years.
“Huck Finn” is known for its pervasive use of dialect – notoriously so in the case of a racial slur whose repeated use, while honest to the time and people the book depicts, has made the novel a thorny presence in school libraries and children’s reading lists. The recovered manuscript shows how Twain wrestled to find the right voice for his unlettered protagonist, who also narrates the novel. In the book’s opening line, Huck tells the reader "You don't know about me, without you have read a book by the name of 'The Adventures of Tom Sawyer'; but that ain't no matter." Twain labored over this introduction, originally writing “You will not know about me,” before changing it to “You do not know about me” and then settling on the final version.
In the end, the Buffalo library appeared ready to overlook Gluck’s oversight. During his life, Gluck had been a prodigious collector, and donor, of books, having given hundreds of pieces of writing to the library, where he served as curator. It seems he also pestered many a writer to donate their original scribblings. "Gluck badgered everyone into giving," William Loos, curator of rare books at the Buffalo library in 1991, said after the “Finn” papers were recovered, according to the New York Times. He added "If Gluck forgot to return this overdue book, we are prepared to forgive him."
http://mediad.publicbroadcasting.net/p/kalw/files/styles/placed_wide/public/201402/04_a_photo.jpg

Thursday, February 12, 2015

George Gershwin might have been the last to know he was working on “Rhapsody in Blue.” As the story goes, he got this helpful information in January 1924 while he was playing pool at a Manhattan billiard hall as his brother (and future musical collaborator) Ira read the newspaper. Ira came across a piece about an ambitious experimental concert being planned by bandleader Paul Whiteman for the following month. The story claimed “George Gershwin is at work on a jazz concerto.”
Gershwin had already turned Whiteman down, believing he didn’t have time to work on a new piece, so this was news to him. When he called the bandleader, Whiteman pleaded with Gershwin to come on board and help save the concert idea from being stolen by a rival musician. Gershwin relented, and went to work on what became “Rhapsody.” He had five weeks to get it done.
Finally the big day came. Whiteman’s concert at New York’s Aeolian Hall was intended to blend elements of jazz and classical music, and demonstrate that the relatively new form of jazz deserved to be taken seriously. Today, just about everything about it has been forgotten…except for the piece Gershwin debuted late in the program 91 years ago on February 12, 1924. Gershwin played piano for the first performance of “Rhapsody in Blue,” which begins with a sharp clarinet wail (called a glissando by those in the know) that has become one of the most famous openings in all of instrumental music. (I found multiple suggestions that only the four notes beginning Beethoven’s Fifth rival it in popular recognition.) The glissando had been improvised during rehearsal as a joke by Whiteman’s clarinetist, and Gershwin kept it in.
Gershwin described his piece as “a sort of musical kaleidoscope of America, of our vast melting pot, of our unduplicated national pep, of our metropolitan madness.” Moviemakers have more often seen it as a musical portrait of Manhattan. Since 1924, “Rhapsody” has, like the forms of jazz and classical it sought to marry, occupied an interesting limbo straddling the common and the elevated. Serious musical critics have dissected it at length, and musicians like Brian Wilson of the Beach Boys and Michael Stipe of R.E.M. have cited it as an inspiration. It’s also popped up in United Airlines ads and in the background at Disney World. More than anything, it’s a piece that seems naturally at home in just about any American setting.
Whiteman got more than he bargained for when he twisted Gershwin's arm, and the fruit of what might have been a little dirty trick between colleagues (whether Whiteman intentionally leaked Gershwin's name to the press is unclear) lives on nearly a century later. If Whiteman's intention was to bring "high" and "low" together, the bandleader couldn't have orchestrated Gershwin's role more perfectly.

Wednesday, February 11, 2015

China lifted state censorship of Aristotle, Shakespeare, and Charles Dickens 37 years ago on February 11, 1978. The move coincided with China’s turn away from the excesses of Mao Zedong’s “Cultural Revolution” and toward loosened state control over citizens’ lives.

To Western eyes, China looks like a pretty repressive place. A one-party state that heavily censors the internet isn’t exactly a bastion of freedom. But a little history helps put things in context. During Chairman Mao’s reign as head of China’s Communist Party, he worried that the country was drifting toward capitalism. (To put his views in perspective, he saw the 1960s-era Soviet Union as having already gone fully capitalist.) His paranoid view led to disaster for average Chinese, when he implemented his so-called Cultural Revolution to root out supposedly subversive people and ideas from Chinese society in 1966.

For a decade, Chinese students, soldiers, laborers, and government workers turned on each other in the name of the revolution. People were thrown into prison for no reason, tortured, or had their property seized. When Mao died in 1976, the ruling Communist Party quickly acknowledged that his revolution had been a disaster. Beginning in 1978, it moved to institute economic reforms that have made China the world’s second-largest economy (and still growing). China is still officially Communist, but what Mao feared most…infiltration by market-based ideas…has become official policy. After Mao, foreign investment was encouraged, and price controls were lifted. Much of Chinese industry is still state-owned, resulting in a strange hybrid economy. But the half-steps toward open markets have been a major economic boon for the country.

Of course, China is still known for being a closed society, culturally and politically…which brings us to this anniversary. Right at the beginning of China’s economic reforms came a lesser noted cultural one as well. Aristotle, Shakespeare, and Dickens are so commonly referred to in Western thought that we barely realize it. (I found a piece listing all the common phrases we use that come from Shakespeare, but it’s too long to even meaningfully excerpt here. Moral of the story: Dude said a lot of stuff we still say.)

China’s decision to allow these formerly banned authors into the minds and homes of its people was a remarkable step away from censorship. (The works haven’t gone ignored. This actor is performing “The Revenge of Prince Zi Dan,” a Chinese adaptation of “Hamlet.”) For now, China has managed to have it both ways…allowing limited economic freedom to boost the standard of living, while maintaining tight control on many personal freedoms. But the free translation of Western authors might look to future generations like a crack in the dike. (A free internet would be more like a burst dam.) Whether China can maintain a “middle way” between Mao’s total crackdown and an open Western-style society remains to be seen long-term. Bringing Shakespeare to Beijing might be seen as a turning point one day; but if not, it’s still worth celebrating in its own right. The Chinese people deserve to enjoy “Hamlet” as much as anyone else.
http://i.telegraph.co.uk/multimedia/archive/01972/revenge_1972004b.jpg

Tuesday, February 10, 2015

It’s not easy being the second act. Americans celebrate people who emerge from obscurity and pave their own way. But having to follow a successful parent – and in the same field, no less – presents its own challenge, and that challenge was particularly acute for Lon Chaney, Jr. (born 109 years ago on February 10, 1906). His father, Lon Chaney, was one of the most iconic actors of the silent film era, who pioneered techniques in movie makeup and became “The Man of a Thousand Faces.” How could Junior top that? Have 2,000 faces?
But the younger Chaney seemed to embrace his legacy full-on. (He was born Creighton Chaney, and was originally billed as such on-screen before taking on his father’s name.) Where the first Lon Chaney had been among the earliest actors to put grotesque figures like the Phantom of the Opera and the Hunchback of Notre Dame in front of audiences, the second Lon Chaney helped create the monster craze that fueled Universal Studios during cinema’s Golden Age. He waited until after his father’s death to begin acting in 1931, and landed his iconic role in 1941’s “The Wolf Man.” He reprised that role several times, and also took on parts as Frankenstein’s Monster, the Mummy, and Count Alucard (Dracula’s son. Alucard…Dracula…see what they did there?), making him the only actor to portray all of Universal’s big four monsters in some way.
While the monster makeup made him famous, it also put him in a box that would be hard to escape. The monster craze of the ‘40s faded, but Chaney kept revisiting horror roles even as he transitioned to a career in television. (He was rumored to be drunk during a live airing of “Frankenstein,” during which he carefully avoided breaking the on-set furniture, thinking it was a rehearsal, and mumbled “I saved it for you.”) He got plenty of work outside the horror genre in difficult supporting roles, and acted on-stage, but it was the monster roles that proved his bread-and-butter and which always called him back. His last role was in 1971’s “Dracula vs. Frankenstein,” the coda on a four-decade acting career. He died in 1973 at age 67 from heart failure, after decades of heavy drinking and smoking…a sadly premature end which lowered the curtain on a father-son act whose legacy haunts moviemakers and film buffs to this day.
http://cdn2-b.examiner.com/sites/default/files/styles/image_content_width/hash/10/66/1369339006_2643_wolfman.jpg

Monday, February 9, 2015

The Beatles’ introduction to American audiences through their appearance on “The Ed Sullivan Show” 51 years ago on February 9, 1964 is fondly remembered for ushering Beatlemania to this side of the pond and kicking off the so-called British Invasion. What’s less remembered is why there was such pent-up U.S. demand for the band in the first place. The Beatles had already exploded in Europe, dominating the British charts and doing a five-day tour of Sweden over the previous year.
But for some reason, Capitol Records dragged its feet on bringing the band across the Atlantic. (Capitol was the American imprint of EMI, which had signed the Beatles, and had first dibs on releasing their work in the U.S.) EMI tried licensing the rights to other labels in 1963, but business complications kept any of them from promoting the Fab Four effectively in the States.
Finally, in January 1964, Capitol released an album called “Meet the Beatles!” The track lineup was heavily altered from the British release it was based on, and some of the audio had been monkeyed with as well. But it was something, and that month saw DJ’s in New York start putting the Beatles in rotation for the first time. The stage was set for the band’s physical invasion of the U.S., which began with their arrival at JFK Airport to thousands of screaming fans on February 7.
Two days later came their famous appearance on the Sullivan show, 51 years ago today. Ed wisely introduced them and got out of the way, with the simple intro “Ladies and gentlemen…the Beatles!” The viewing audience was estimated around 73 million that night, or over one-third of the country. Nielsen said it was the largest TV audience in American history.
After playing Sullivan, the Beatles went on to play concerts in Washington, D.C. and at Carnegie Hall. They hit Sullivan’s show a second time and finally headed back to London two weeks after their arrival. Their exposure to the American market would have big impacts on both sides of the Atlantic. The door had swung open for British imports in the American music scene, and bands like the Rolling Stones, Herman’s Hermits, and the Who would walk right through it.
The Beatles themselves were likely changed by America as well. Later in 1964, they would hang out and get high with Bob Dylan, and it wasn’t too long before John Lennon took on the nasal twang and social conscience that would help mark the Beatles’ evolution away from their days as a long-haired teen-pop hit factory. Invading America also brought the Beatles in contact with United Artists, whose record division had noted Capitol’s less-than-enthusiastic promotion of the band and pitched the idea of a mockumentary, leading to “A Hard Day’s Night,” which served both UA’s movie and music arms quite well. So all in all, Ed Sullivan uncorked a cultural genie that would upend the musical and media scene on two continents. Not bad for a five-word introduction.
http://cbsnews2.cbsistatic.com/hub/i/r/2014/01/02/cb7e9975-4c9b-4970-a80c-c06aceb06ff5/resize/620x465/53dffdad12e2b9a90efb6c0812c8cf5c/Beatles_beatles5.jpg

Sunday, February 8, 2015

Jules Verne (born 187 years ago on February 8, 1828) is one of the most translated authors in history. (It’s estimated that only Agatha Christie’s works have been translated more frequently since 1979.) This has proved to be a double-edged sword for his reputation, revealing his vast popularity as a writer of travel adventures on the one hand, while also sacrificing some of his literary innovations in favor of his books’ exciting content.

The contrast is best illustrated by comparing Verne’s reputation in his native France with his standing in English-speaking countries. The French see their native son as far more than a genre writer or science fiction pioneer (although he was clearly both). Read in his native tongue, his stylistic touches were innovative and experimental enough to inspire surrealist and avant-garde literature in France and to earn him consideration as a literary figure in his own right. That reputation has been slower to emerge in English-speaking countries, where Verne’s works often suffered from poor translations that snuffed out his unique essence as a writer and left him regarded as a popular storyteller rather than a literary figure.

Of course, it’s also true that Americans have never held popularity against the gifted in the same way as the French. (Verne was shut out of literary academies in France during his lifetime, and his works only got a second hearing there after his death.) And the popularity of Verne’s works must be considered. He spent much of his youth fighting off his father’s attempts to pull him into practicing law, and was twice shattered when the parents of women he loved married them off to older men. (The episodes affected him deeply enough to make women married against their will a frequently recurring theme in his books.)

Finally, his break came through a partnership with the publisher Pierre-Jules Hetzel, whom he met in 1862. The relationship would provide him with what he most needed…an outlet for the writing that he had pursued at the expense of all else (even turning down his father’s law practice a decade earlier). Hetzel enjoyed Verne’s work enough to offer him a flat fee for three books a year, to be published in Hetzel’s magazine first. Verne leapt at the chance for a steady income and a guaranteed outlet for his work.

Hetzel saw the thematic glue of Verne’s work…exciting travel adventures with technological themes…and decided that Verne’s books would make up a loosely connected series called the “Voyages Extraordinaires.” Hetzel wrote that Verne’s aim was to “outline all the geographical, geological, physical, and astronomical knowledge amassed by modern science and to recount, in an entertaining and picturesque format that is his own, the history of the universe.” (Oh, is that all?) Almost all of Verne’s novels are part of the “Extraordinary Voyages” sequence, including his most beloved works: “Journey to the Center of the Earth,” “From the Earth to the Moon,” “Twenty Thousand Leagues Under the Sea,” and “Around the World in Eighty Days.”

Verne suffered from diabetes, and died at age 77 in 1905. He left behind so many unpublished novels that his “Extraordinary Voyages” series continued for several years, with changes made by his son Michel. (The books were later published unaltered.) Jules Verne is remembered in many ways: as a surrealist innovator, a sci-fi pioneer, and a technological prophet. To illustrate the last point, Verne wrote a novel in 1863 called “Paris in the Twentieth Century.” Wikipedia summarizes the plot as being about “a young man who lives in a world of glass skyscrapers, high-speed trains, gas-powered automobiles, calculators, and a worldwide communications network, yet cannot find happiness and who comes to a tragic end.” The manuscript was lost for years before his great-grandson found it in a safe, and it was finally published in 1994.
http://www.blastr.com/sites/blastr/files/jules_verne.jpg

Saturday, February 7, 2015

If it seems like Laura Ingalls Wilder (born 148 years ago on February 7, 1867) couldn’t keep all the details of her childhood on the move straight, it’s easy to sympathize. By the time she was 12, the daughter of a restless farmer and would-be businessman had experienced the following sequence in an itinerant childhood: leaving the Big Woods of western Wisconsin in 1869, with a stop in Missouri en route to Kansas Indian Country; back to the Big Woods in 1871; on to Walnut Grove, Minnesota in 1874; down to Burr Oak, Iowa, to help run a hotel around 1877; back to Walnut Grove, where her father Charles served as town butcher and justice of the peace; and then into the eastern Dakota Territory, where Charles had taken a railroad job in 1879. There, in De Smet, in what is now South Dakota, the family finally settled.

Writing about her childhood later, Wilder would switch around some events and leave others out completely (like the hotel in Iowa). This was likely done to help the story move along more easily, and possibly to satisfy her publisher, who supposedly thought her real-life memories of Kansas were too vivid for a 3-year-old and asked her to make the pseudo-fictional Laura Ingalls a bit older when the family arrived in Kansas and built a “Little House on the Prairie.”

While the Ingalls family planted their stakes in South Dakota, Laura herself kept moving. After becoming a teenage schoolteacher, she married Almanzo Wilder in 1885, when she was 18. The Wilders left South Dakota; briefly spent time with Almanzo’s parents in Minnesota; tried life in Florida, hoping the climate would help Almanzo’s health (he had been partially paralyzed after a bout of diphtheria); and finally settled down in Mansfield, Missouri in 1894, where they would both live out their days.

It was in Missouri that Laura became a local columnist and polished her writing skills. Her daughter Rose, who was already a successful writer, encouraged her mother to follow the same path. Finally, after the Great Depression wiped out the Wilders’ savings, Laura decided to give writing a try. Her mother and older sister had died, and she believed it was time to preserve her childhood memories on the page. In 1932, she published “Little House in the Big Woods,” the first in an eight-volume series. The “Little House” books served their purpose: Laura preserved her memories, and she and Almanzo made enough money from the series to live comfortably on their Missouri farm until his death at age 92 in 1949, and hers eight years later at 90. They are buried together at the Mansfield cemetery.

The “Little House” books have been the subject of a minor controversy over their primary authorship, with some suggesting that Rose was the true author, taking her mother’s rough manuscripts and polishing them until they were fit to publish. While the women maintained a lengthy correspondence over the works, the truth is impossible to tell from the record. Regardless of how we got them, the stories of Laura Ingalls Wilder’s childhood have fascinated generations, never going out of print and inspiring a TV show that ran from 1974 to 1983. A replica of Laura’s childhood cabin, called the Little House Wayside, stands in Pepin County, Wisconsin. It’s a tribute to the beginnings of a life filled with movement and hardship, and marked by optimism that home was just over the next horizon…perhaps one of the most American lives ever lived.
https://latebloomersrule.files.wordpress.com/2013/06/laura-ingalls-wilder.jpg

Friday, February 6, 2015

If you had to pick one event to summarize what Michael Jordan meant to basketball, you could do a lot worse than the one that happened 27 years ago, when he cemented his title as “Air Jordan” by winning his second straight NBA Slam Dunk Contest on February 6, 1988. His final dunk, a soaring thing of beauty launched from just inside the free throw line, encapsulated the anti-gravity quality that Jordan brought to the game, in the process inspiring countless bedroom posters, selling warehouses full of athletic shoes, and elevating the NBA from a niche league to a marketing juggernaut.

One reason the ’88 dunk showdown still lives on in so many memories is that even with Jordan’s eye-popping final dunk – which earned him a perfect 50 and secured his second straight crown in the All-Star Weekend event – his win is still controversial. That’s because it was preceded by a powerful windmill throwdown by Atlanta’s Dominique Wilkins that earned only a 45. The TV crew noted repeatedly that the event was being held in Jordan’s home arena, Chicago Stadium, and that the home-crowd vibe seemed to influence the judges. Even Jordan has said he thought Wilkins was robbed, and that he would have given him a 49 or a 50 for his last dunk. As it was, MJ pulled out a 147-145 result (he and Wilkins combined for four perfect 50s on their six dunks), and the era of the slam dunk was in full swing.

Jordan would go on to be named MVP of the next day’s All-Star Game, leading the Eastern Conference to a win. Three years later, the Chicago Bulls assembled a team worthy of their superstar, and won their first championship in 1991. Five more would follow in the decade. And through it all, Michael Jordan would become the closest thing to royalty the United States could produce. He took over the Olympics, he slam dunked over aliens next to Bugs Bunny, he inexplicably retired to play baseball for two years…then came back to the Bulls and made it seem like he hadn’t been gone a day, leading them to three more titles.

Since his second retirement after the 1998 season (let’s just skip that whole business with the Wizards), every up-and-coming star has had one question lobbed at him: Is this the next Michael Jordan? Kobe Bryant and LeBron James have built historically great careers, but everything they do is invariably measured against Jordan’s resume. With “Air Jordan” pumping up ratings (and sneakers) during the 1990s, the fact that Michael Jordan could also play defense, or beat you with the jump shot, got lost on many casual observers. A generation of kids grew up just wanting to dunk, something Jordan has blamed on the marketing image that surrounded him, which ignored the fact that he had also mastered the game’s fundamentals. But it’s hard to blame every kid on the playground (or R. Kelly) for wanting to fly during the ‘90s. Few people have made it look so easy, or so gorgeous, as Jordan did over and over…including a performance on this date that still has fans arguing over a quarter-century later.
http://iconicauctions.com/ItemImages/000021/27_21208a_lg.jpeg

Thursday, February 5, 2015

It’s never too late to shake hands and become friends. Seriously, never. For proof of this, we can look at the events of 30 years ago, when the mayors of Rome and Carthage signed a peace treaty officially ending the Punic Wars on February 5, 1985. (The Punic Wars, incidentally, hadn’t seen any fighting in over 2,100 years.)
Rome and Carthage were both major Mediterranean power centers in the ancient world, separated by only a few hundred miles. Not surprisingly, they didn’t always see eye to eye. In the third century BC, Carthage was the heart of an established empire based in North Africa, and Rome was a city-state on the move. Their competing interests led to three wars fought over the course of more than a century, collectively called the Punic Wars. Great naval battles in the Mediterranean, Hannibal crossing the Alps on elephants…this is straight-up Punic War imagery. After each of the first two Punic Wars, Carthage found itself a little worse off than before, but not defeated.
Finally Rome had enough of Carthage’s…you know, being all Carthagey and whatnot, and decided to put an end to it. Over a three-year period, from 149 to 146 BC, Rome marched on Carthage and…to put it lightly…cleaned house. Roman forces besieged the city for three years before finally breaching the walls and burning it to the ground. (It could not have been much fun for the people inside.) The Romans were even said to have salted the earth so nothing else could grow there, though that was probably a legend made up later. At any rate, Carthage was done. Rome expanded its power into North Africa, transitioned from a republic to an empire, and then fell. (This is the part where a clock spins in circles, or pages fly off the calendar.) Christ was born, the Middle Ages happened, Europe found the Americas, Rocky fought Mr. T. It’s safe to say the world moved on.
But here’s the thing. Rome had completely broken Carthage, but Carthage had never surrendered. No formal treaty had ever solemnized a cessation of hostilities, so…by our modern standards of warfare…the Punic Wars had never officially ended. Recognizing this, and seeing an opportunity for a photo op, the mayors of Rome and a rebuilt Carthage met on this date 30 years ago in Tunis (the capital of Tunisia, less than 10 miles from Carthage), where they signed a peace treaty and a pact of friendship and cooperation. Rome was much less of a global power, and Carthage was much less…destroyed. Things had changed, and for the better. Rome and Carthage were at war longer than any of us have been alive, and today it’s all good. It goes to show that anything can be worked out with enough time.
https://bluejayblog.files.wordpress.com/2014/01/punicwar-destructionofcarthagebyrome.jpg?w=640&h=379