Saturday, January 31, 2015

When the history of the end of the Cold War gets told, the Golden Arches often get overlooked…but maybe they shouldn’t. It was 25 years ago today that perhaps the most notorious symbol of Western capitalism invaded Moscow, with the Soviet Union’s first McDonald’s opening at Pushkin Square on January 31, 1990. It took 14 years for McDonald’s to work out the deal that would allow the landmark opening. Less than two years later, the Soviet Union was gone, along with the geopolitical conflict that had defined everything for nearly half a century.
The pent-up demand was clearly there from the beginning. Even though Big Macs were rather pricey by Russian standards, it’s still estimated that around 30,000 people lined up on the first day to try the burgers and fries that had been forbidden fruit for so long. The food was expensive compared to state-subsidized eateries, but the Russians were impressed by the friendliness of the employees and the cleanliness of the restaurant…the world’s largest McDonald’s, with 28 cash registers and seating for 700. (Russian regulators temporarily shut it down last year.)
By August 1991, Mikhail Gorbachev had resigned as General Secretary of the Soviet Communist Party, and by that December, the USSR had simply dissolved. Did McDonald’s bring down one of the world’s two great geopolitical superpowers? Of course not. But Gorbachev’s political and economic reforms opened up the Soviet Union in previously unheard-of ways. The invasion of the Happy Meal was just one sign things were changing. But a quarter-century later, it’s easy to look at those crowds outside a Moscow McDonald’s and assume that the end was near.
http://a.abcnews.com/images/Archives/abc_archive_WNBB1870B_wg.jpg

Friday, January 30, 2015

Fittingly for a guy in a mask, the Lone Ranger’s real-life origins are as shrouded in mystery as his fictional ones. Most important for our purposes is the question of which date to mark for his premiere on Detroit radio station WXYZ in 1933. Accounts vary as to whether it was January 30 or 31. With that in mind, AWC will note that it was roughly 82 years ago that radio audiences first heard the series on or about January 30, 1933. Then there’s the question of who should be credited with creating the character: station owner George Trendle or Fran Striker, who wrote the show. Like the Lone Ranger’s true identity, these details are lost to history. (For years, his first name was never revealed, and “John Reid” was just the most popular name for him.)

What can be asserted with much more authority is that kids (and yes, adults) ate this stuff up, and the Lone Ranger took off like gangbusters in his time. There was the radio show, of course, which lasted nearly 3,000 episodes and spread from Detroit across the fruited plain over two radio networks. Then there were movie serials, comic books, full-length movies, and one of TV’s earliest hit shows, which ran from 1949 to 1957. Marketers were happy to leverage the craze into toy guns, costumes, badges, cereal boxes, and whatever else they could put the character on (including an “Atom Bomb Ring” with an actual sample of radioactive material inside, which amazes me on multiple levels).

For over 20 years, the Lone Ranger was a steady presence in American media and merchandising. And then…he wasn’t. You can’t just blame his disappearance on the fact that the character is really old. Batman is only six years younger than the Lone Ranger, and he might as well have been created yesterday. But where Batman’s urban roots allowed him to change with the times, the Lone Ranger was more or less stuck, not just in the romantic legends of the Old West but in a time when outlaw heroes had perfect grammar and never swore. (Having Tonto around with all that “Kemosabe” and “get-em-up, Scout” business also became a really tough sell over time.) There were efforts to bring him back in the ‘80s. Gore Verbinski made a game effort to revive the character two years ago, but couldn’t even bring himself to let him yell “Hi-yo, Silver, away!” without turning it into a Johnny Depp joke.

It might be in the Lone Ranger’s DNA to resist revival. The hero who wears a mask and sells tons of plastic crap is still very much alive, but so much else about him has been left behind by the culture. Something about him still lives every time Robert Downey Jr. hides behind the Iron Man mask. (The Lone Ranger’s connection to urban masked crimefighters was even made explicit in his own time: he was cast as the fictional great-uncle of the Green Hornet.) But something else about him is still back there on the empty plain, when we still called people “Indians” without a hint of self-consciousness. And then another part of him is forever tied to the middle of the last century, when cigarette ads were everywhere but don’t curse in front of the kids, please. He just doesn’t fit too well in our time. But he couldn’t have been better suited to his.
http://store.acousticsounds.com/images/large/UMAR_590__22183__10252010052133-661.jpg

Thursday, January 29, 2015

Peter Sellers was actually supposed to carry even more of the load in “Dr. Strangelove” than ended up being the case. In the Cold War satire, he plays the title character – a wheelchair-bound ex-Nazi scientist – along with the President of the United States and a British Royal Air Force officer. But he was also intended for probably the film’s most iconic role: Maj. “King” Kong, the would-be cowboy who famously rides the nuclear bomb that will set off the Russian “doomsday machine” and kill all life on Earth.
Sellers never wanted all four roles, but he had already pulled off a similar shtick in “The Mouse That Roared,” and Columbia insisted that director Stanley Kubrick cast Sellers across the board as a condition of funding the movie. Sellers was worried about the workload, and whether his British tongue could pull off Kong’s Texas accent. As it turned out, he sprained an ankle and couldn’t film Kong’s scenes in a crowded cockpit set. The part was offered to John Wayne, and the famously conservative actor unsurprisingly turned it down. Finally, Slim Pickens ended up taking the bomb-straddling role.
Regardless of who played what, “Dr. Strangelove” was a hit when it opened 51 years ago on January 29, 1964. The world had been living in the Age of the Bomb since 1945, and the Cuban Missile Crisis in 1962 seemed to have brought the planet to the brink of nuclear war. The entire situation seemed…silly, and ripe for satire. Kubrick and Sellers stepped up to the plate with the black comedy, which was subtitled “How I Learned to Stop Worrying and Love the Bomb.”
In “Strangelove,” a rogue American officer sets off a bombing campaign against the Soviet Union. But this is guaranteed to trigger Moscow’s doomsday machine, which will wipe out all life on the planet…and has been kept secret from the world. When Strangelove points out to the Russian ambassador that this makes it a rather poor deterrent, the diplomat says it was going to be announced the following Monday – because “the Premier loves surprises.”
A mad scramble to recall the American bombers ensues, and all the planes are stopped except one – Kong’s, whose bomb famously dooms all life on Earth. The only hope for humanity is to retreat to radiation-proof mineshafts underground for 93 years and institute a breeding program that will repopulate the planet. Of course, the arms race is never totally over, as a general warns the president of a “mineshaft gap” with the Soviets.
The movie was delayed from its initial release because of the Kennedy assassination in November 1963. Thinking that the subject matter would have been off-key while the country mourned, Columbia pushed it into late January. They also changed a line about “a pretty good weekend in Dallas” to refer to Vegas. By the time it opened, plenty of Americans were ready to laugh at the Cold War. What else could they do?
http://www.newyorker.com/wp-content/uploads/2014/01/dr-strangelove-still-580.jpg

Wednesday, January 28, 2015

Once the decision was made to cram some of the world’s biggest recording artists in one studio to cut a single for African famine relief, a basic problem presented itself: Where would it happen, and how would it be kept quiet? Ken Kragen, one of the activists who helped assemble the supergroup USA for Africa, supposedly outlined the problem like this in a prep meeting three days before the recording session: "If [the location] shows up anywhere, we've got a chaotic situation that could totally destroy the project. The moment a Prince, a Michael Jackson, a Bob Dylan … drives up and sees a mob around that studio, he will never come in."
It was decided to record the track at a Hollywood studio right after the American Music Awards, with many artists coming straight from the AMAs. The secret was kept under wraps, and 30 years ago today, USA for Africa recorded “We Are the World” on January 28, 1985. The song had been written by Michael Jackson and Lionel Richie. The project itself was inspired by the previous year’s “Do They Know It’s Christmas?” – a similar charity effort with British and Irish musicians.
A sign at the recording studio said “Please check your egos at the door,” and what a set of egos it was: Prince didn’t make it, but Jackson, Richie, and Dylan were there. So were Paul Simon, Kenny Rogers, Tina Turner, Billy Joel, Diana Ross, Bruce Springsteen, Willie Nelson, and it really does go on and on. (Forty-five vocalists are credited on the track.) Stevie Wonder supposedly tried to lighten the mood and motivate the group by threatening that if the song wasn’t finished in one take, he and Ray Charles would drive everybody home.
The song was released in March, and quickly broke sales records. This was probably less due to its musical quality, which has been heavily debated, and more due to both the novelty of hearing so many diverse voices try to harmonize, and the feel-good nature of knowing the purchase was doing something for charity. Famine in Ethiopia took a million lives from 1983 to 1985, but “We Are the World” is claimed to have raised $63 million for humanitarian causes.
Ninety percent of that went to African hunger relief, split between short-term solutions (providing food immediately) and long-term strategies (food production and birth control). The other 10% went to fight hunger and homelessness in the United States. One Ethiopian famine survivor has said she will never forget Michael Jackson, or the relief bread that many Ethiopians named for him: “If you speak to anyone who was in Addis Ababa at that time they will all know what Michael Bread is and I know I will remember it for the rest of my life.”
http://www.vettri.net/gallery/celeb/michael_jackson/USA-for-Africa/MichaelJackson_USA-for-Africa_Vettri.Net-05.jpg

Tuesday, January 27, 2015

The word “celebrate” can mean different things. We can mark bright moments with unrestrained joy, and we can be thankful that some dark chapters ended before they became worse. In that vein, today marks 70 years since the Soviet Red Army marched into the Auschwitz concentration camp in occupied Poland and liberated around 7,500 survivors of the Nazis’ hell on earth on January 27, 1945. For a million or more others, the day came too late.
The number of people killed at Auschwitz has been debated. The Nazis intentionally made it impossible to tell for sure by burning corpses and mixing the ashes. The Soviets initially claimed four million, but this was probably anti-Nazi propaganda (not that any was needed). One camp commandant testified he was told the number was over 2.5 million, but later said he doubted the number, noting “Even Auschwitz had limits to its destructive possibilities.” A widely accepted historical range is somewhere between 1 and 1.5 million deaths, or about one in six of all people killed in the Holocaust. (The fact that the argument centers on how many millions the number should stop at places the exact figure in its grim context.)
However many were murdered there, it’s likely that around 90% of them were Jewish, with the remainder made up of gypsies, gays, Jehovah’s Witnesses, Soviet POWs, Polish resisters, and various other unfortunates. Their lives at Auschwitz were as horrific as their deaths. The gates to the main camp (by the end, Auschwitz was a network of three primary camps and 45 satellite camps) displayed the motto "Arbeit macht frei,” which mockingly promised “Work makes you free.” This twisted promise was true, in a sense. For many inmates at Auschwitz, the only route to “freedom” was to work until they dropped from disease or starvation. When this cruelty became too inefficient, the gas chambers were installed to help facilitate Hitler’s “final solution” at a much brisker clip. Life expectancy for inmates at one of the main camps was about three months. Some escaped work only to fall into the hands of Josef Mengele and his grisly medical experiments, which were too cruel to describe here.
The Allies were not unaware of the horrors, or at least they had no excuse to be. Reports of the Nazi camps, including the gas chambers (often disguised as showers to put the inmates at ease) had leaked out of Poland for years before the Red Army arrived, but were frequently dismissed as exaggerations. Maybe it was hard to believe that anyone, even the Nazis, was capable of such cruelty. When the Soviets got close, the Nazis forced about 58,000 remaining inmates to march out of Auschwitz, with many dying on the march. Those left behind were too weak to leave. The Soviets found thousands of survivors, including the children pictured here. They also found an array of grotesque souvenirs left by the Nazis, including over 8 tons of human hair.
Auschwitz has been converted into a memorial, and today is International Holocaust Remembrance Day in honor of its liberation. The survivors of Auschwitz did what they could to return to life and tell their stories. Psychiatrist Viktor Frankl wrote “Man’s Search for Meaning,” in which he described his ability to find meaning in the darkest circumstances. Elie Wiesel refused to talk or write about his experiences at Auschwitz for a decade before finally writing his memoirs and becoming an activist for peace. Primo Levi wrote “The Periodic Table,” connecting his experiences in the camp with his love of chemistry.
If the Soviets hadn’t made it to Auschwitz when they did, there’s no telling how many books never would have been written, how many speeches never would have been heard, and how many sunrises never would have shone on the faces of its survivors. We can’t throw a party today…but it is an anniversary worth celebrating.
http://upload.wikimedia.org/wikipedia/commons/thumb/5/51/Child_survivors_of_Auschwitz.jpeg/1024px-Child_survivors_of_Auschwitz.jpeg

Monday, January 26, 2015

It’s a national holiday in Australia and India, two countries deeply shaped by their histories as part of the British Empire that have since come into their own. America’s own patchy history with Britain makes it easy for AWC to tip its hat to the folks in the steamy subcontinent and the Land Down Under today. (Assuming they're still awake. Are they still up over there?)
Today is Australia Day, marking the 227th anniversary of the day British convicts came ashore at Sydney Cove in southeastern Australia on January 26, 1788. With the loss of the American colonies, Britain could no longer send convicts to places like Georgia, so it turned to Australia. The so-called “First Fleet” contained 11 ships carrying settlers. It was tough going at first. Poor soil made starvation a constant reality, not to mention beatings and hangings at the hands of the harsh guards the British had sent along to oversee the colony. Over time the colony began to flourish, and the men there began celebrating the anniversary of their landing with drinking and songs.
Today Australia Day is a celebration of Australia’s independence and diversity. But Australia’s history of conflict between indigenous people and white settlers mirrors the United States’, and its holidays have been subject to the same tension. Aboriginal populations don’t see the arrival of British convicts as a celebratory event, and some counter-cultural forces have called the day “Invasion Day.” Encompassing all of its peoples and traditions in a single holiday might be impossible, but Australia has grown into the effort, with January 26 becoming a day both to celebrate and question its founding event.
History took a different course in India, where the British attempted to rule rather than settle. Britain’s East India Company had been set up for trade, but ended up assuming military rule in India in 1757. For nearly 200 years, British presence in India was a source of irritation and resentment for the native Indians. The movement for Indian independence was organized around Gandhi’s example of non-violent resistance. Gandhi used the plight of poor Indian farmers, subjected to high taxes and forced to grow cash crops instead of food for their families, to incite local resistance and shame the British. The effort finally succeeded when the British “Quit India” (the organizing mantra of the movement) in 1947. Three years later, the world’s largest democratic republic was born when India’s Constitution took effect 65 years ago on January 26, 1950.
Today India has two national holidays marking the subcontinent’s transition from colony to republic: Independence Day, celebrated on August 15 and noting the day that India became an independent nation; and Republic Day, observed on this date in the world’s most populous democracy (with over 1.2 billion people at last count).
The sun once never set on the British Empire. Those days are gone, and former colonial possessions have grown into their own…places like Canada and the U.S., and two spots farther south and east holding celebrations on this date.
Collage assembled by AWC

Sunday, January 25, 2015

The effort to add winter sports to the Olympic program took a while to catch on after the Games were revived in 1896. In 1912, the Swedes were hosting the Summer Games and had no interest in adding a winter program that would compete with their own Nordic Games. In 1916, the host Germans were ready to stage a parallel winter program before World War I cancelled the Games entirely. Along the way, figure skating and ice hockey found their way into the Summer Olympics, which just seems weird.
It finally all came together 91 years ago, when the first Winter Olympic Games were held in the French ski resort village of Chamonix on January 25, 1924. Paris was hosting the Summer Games that year, and the French were happy to get a double dip of Olympic energy in 1924. It was officially called the International Winter Sports Week, and there were 9 events, including speed skating, figure skating, curling, ice hockey, and four skiing events. The final medal table had a heavy Scandinavian tilt, with Norway’s 17 medals and Finland’s 11 easily outstripping the other eight competing countries. (New Yorker Charles Jewtraw brought home the lone U.S. gold, in speed skating.)
The event was such a success that the Olympic committee made it a permanent fixture, and officially changed the name to the Winter Olympic Games, with the Chamonix games retroactively becoming the first installment. The Summer and Winter Olympics were held in the same year until 1992, when the current schedule of alternating them in even-numbered years was implemented.
Like the Summer Games, the Winter Games have seen some events come and go (and then come again, in the case of curling). The most badass-sounding event to get the axe was something called the military patrol, which combined cross-country skiing with mountaineering and rifle shooting. (The modern biathlon is similar, but it doesn’t have mountain climbing...or nearly as cool a name.) Norway has maintained its spot atop the all-time medal table with 329, outstripping the United States’ second-place showing at 281. (Our first-place all-time haul of 2,681 medals from combined Summer and Winter Games makes that a little easier to swallow.)
One crucial ingredient for the Winter Olympics is snow, and this hasn't always been reliable. In 1964, the Austrian army had to transport snow and ice to Innsbruck, a resort town that had no snowfall in time for the Games. The host Canadians and Russians resorted to the same thing in 2010 and 2014 thanks to unseasonably warm temperatures in Vancouver and Sochi, leading to speculation that climate change could create big problems for the Winter Games in coming years. I certainly hope not; I'm really starting to get into this curling business.
https://usatftw.files.wordpress.com/2014/02/gty-4703811491.jpg

Saturday, January 24, 2015

Robert Baden-Powell was already an English national hero by 1908 for his military exploits in South Africa. But it was on this date that his name became associated with something that would long outlive him, when he published the first installment of “Scouting for Boys” 107 years ago on January 24, 1908. The six-part serial would inspire the worldwide Scouting movement.
Baden-Powell hadn’t set out to appeal to youngsters. He had originally published “Aids to Scouting” in 1899 as a straightforward military field manual. To his surprise, the book caught on with boys, who were fascinated by the tracking and observation techniques he presented. He decided to repurpose the book for boys, keeping the scouting techniques and throwing in a few moral lessons along the way. In 1909, the year after he published his guide, Baden-Powell held the first Scout Rally at London’s Crystal Palace. It attracted 11,000 youth, including a group of girls calling themselves Girl Scouts.
Baden-Powell had created something far bigger than himself. Scouting spread across the British Commonwealth countries, and then worldwide. The Girl Guides were created to accommodate females, and the Wolf Cubs were designed for children under 11 – the equivalents of Girl Scouts and Cub Scouts in the United States. Eventually, there were Air Scouts and Sea Scouts for those more interested in aviation or maritime adventuring. Baden-Powell couldn’t possibly keep up with it all, or advise all the local groups that wanted to hear from him. Local troop guides, or Scoutmasters, would fill the void.
As of 2010, there were over 32 million registered Scouts around the world, with 10 million registered Guides. While all Scouting troops seek to instill a love of the outdoors along with values like honesty and self-reliance, other aspects of the movement differ worldwide…or even within the same country. While the Boy Scouts of America exclude openly gay Scoutmasters or atheists of any age, the Girl Scouts of the USA are neutral on the issues, a position roughly in line with Canadian, Australian, and many European Scouting organizations. (Baden-Powell himself, the British Empire’s first Chief Scout, has been rumored to have been a closeted homosexual.)
While politics has infiltrated Scouting life, Scouts have had their own ideas over time. The British introduced Scouting to Africa as a way to express colonial authority, but it actually inspired solidarity and resistance among the local Scouts. In the U.S., black Scouts in segregated troops worked toward the rank of Eagle Scout as a measure of their achievement, with other traditional outlets like college and military advancement closed off to them. Some totalitarian countries have banned Scouting entirely, and for good reason. In the end, Scouting works to instill rugged strains of self-reliance and independence in young people…and people like that are very hard to control.

Friday, January 23, 2015

What was it about the ‘80s? The decade that did everything big spawned a pair of muscled showmen who landed huge career breaks within a year of each other on this date. Mr. T and Hulk Hogan weren’t completely unknown before their career-defining stints. But their big breaks came exactly one year apart in 1983 and 1984, putting the rest of the decade in a chokehold that would make you pity the fools if it hadn't been so entertaining. (Maybe we could lobby Congress to call January 23 T-Hulk Day?)
Laurence Tureaud, aka Mr. T, had been a bouncer and bodyguard before winning a pair of strongman competitions aired on NBC. Sylvester Stallone saw one of the shows and hired T to play Clubber Lang in 1982's "Rocky III." This was the first time T spoke his immortal catchphrase “I pity the fool!” Stallone might have been inspired by an unscripted line T had delivered before a boxing match on the NBC competition, when he said "I just feel sorry for the guy who I have to box. I just feel real sorry for him.” The sympathy was justified, as T ended the match in 54 seconds.
But when “The A-Team” debuted 32 years ago on January 23, 1983, with Mr. T driving the car for a team of wrongly imprisoned military commandos, T fever really took off. With his trademark African warrior hairstyle and tons of jewelry (supposedly left behind by customers when he was a bouncer), T was a natural for the image-obsessed ‘80s. He was everywhere during the decade, and even had a Saturday morning cartoon where he owned a gym, had a dog with a mohawk, and helped kids solve mysteries, because obviously.
Terry Gene Bollea had been part of the pro wrestling circuit since the late ‘70s. He acquired the stage name “Hulk Hogan” along the way and began his career as a “heel” (a.k.a. a bad guy, if you’re not up on your rasslin’ lingo). Hulk had moved around and even wrestled in Japan, but when he returned for his second career stint with the WWF, it was by design. Owner Vince McMahon was determined to raise the profile of pro wrestling, and he had anointed Hulk as the face of the sport. Without time for a long character arc, his previous heel status had to be quickly papered over. Fellow wrestler Bob Backlund simply told fans “He’s changed his ways. He’s a great man,” and that was good enough. When Hogan won his first WWF title by pinning the Iron Sheik 31 years ago on January 23, 1984, announcer Gorilla Monsoon (ain't wrestling great?) said “Hulkamania is here!”
It was. The product in the ring might have been fake, but the money and popularity Hulk Hogan brought to it were real as could be. He held the WWF title for four years, and his long hair and tendency to slap “brother” onto every sentence became just as ubiquitous as Mr. T’s mohawk and “fool” shtick. The two heroes of our tale might have reached critical mass in 1985, when they teamed up to win the main event of the first WrestleMania.
This is all undeniably silly, but there are worse things to be famous for. T and Hulk had more than biceps going for them. They had over-the-top personalities perfectly suited for the decade. They also had squeaky-clean public images (at least at the time), and could credibly serve as role models for kids to take care of their bodies (and have weird hair). They’ve both toned things down since then. T claimed to have stopped wearing his gold out of guilt after seeing victims of Hurricane Katrina, and the less said about Hulk’s unfortunate foray into reality TV, the better. But for one shining moment in the ‘80s, our musclemen were as over the top as our dreams. And brother, I pity the fool who missed out.
http://s3.vidimg02.popscreen.com/original/48/VloxZW5EZVBUNkUx_o_hulk-hogan-mr-t-funny-promo-videovob.jpg

Thursday, January 22, 2015

This date seems made for controversy. In choosing not to wade into the argument over Roe v. Wade, AWC is instead profiling a man whose legacy is every bit as divisive. D.W. Griffith was born 140 years ago in Kentucky on January 22, 1875. He is most remembered for directing 1915’s “The Birth of a Nation,” which in turn is remembered for two things: Being the first blockbuster movie with techniques that revolutionized the use of film, and simultaneously being one of the most racist movies ever made.
Griffith lived at a time when the so-called Dunning School of thought dominated historical discussion of the Reconstruction era. This was a serious scholarly attempt to argue that freeing the slaves and giving them the vote had been a disaster in the South, and that white Southerners had been justified in resisting Reconstruction by all means. One of those means was, of course, the founding of the Ku Klux Klan, which Griffith presented as a heroic force in his masterpiece (a word that can fairly be used for reasons coming up). The movie plays up the idea that freed black men (played by white actors in blackface, because this wasn’t bad enough already) preyed on Southern women, and had to be kept in line by the KKK.
This is not the stuff that Oscar campaigns are made of in 2015, and even in 1915 it was a hard sell in many places. The NAACP tried to get the film banned. There were riots around its showing in Boston and Philadelphia, and it never opened in several cities, including Chicago and St. Louis. The film’s heroic portrayal of the Klan has also been credited with inspiring the group’s real-life revival at Stone Mountain, Georgia, the year it came out.
“Birth of a Nation” would be easy to dismiss, if not for the fact that Griffith combined his reprehensible views on history (his father had been a colonel in the Confederate Army, even though Kentucky never joined the Confederacy) with a flair for filmmaking that has impacted every moviemaker since. All the visual and editing techniques that help movies tell stories got their start somewhere, and for many of them, that start was in “Birth of a Nation.” Griffith pioneered things that are part of the basic vocabulary of film today, like close-ups, flashbacks, transition through dissolves, and nighttime photography. There was a choreographed battle sequence with hundreds of extras. Griffith also included an original score with the silent film, designed to be played by an in-house orchestra. In an era when filmmakers were happy to just point the camera at something and let it roll, Griffith realized that films could be presented much more creatively.
Griffith’s film was a hit with white audiences. (Woodrow Wilson saw it at the White House, and is supposed to have said “my only regret is that it is all so terribly true.” The quote has been disputed.) But the criticism it generated stung Griffith, and he tried to respond with “Intolerance” the following year. This was a sprawling 3 ½ hour epic that told the story of intolerance in different historical eras, including ancient Babylon and the life of Christ. Griffith hoped this would help blunt the criticism that he had made a bigoted movie. He ended up making around 500 films before he died in 1948 at age 73.
For better or worse, his legacy is almost entirely bound up in “Birth of a Nation,” a movie which revealed a man with one eye trained hopelessly in the past, and the other seeing far into the future. It’s a complicated legacy, but AWC submits that Griffith’s use of techniques which will forever define moviemaking make his a life worth celebrating. If you can’t get on board with that, we could always discuss something less polarizing…like Roe v. Wade.
http://upload.wikimedia.org/wikipedia/commons/a/ac/DW_Griffith_star_HWF.JPG

Wednesday, January 21, 2015

Our eternal quest to go farther and faster got two big bumps on this date, thanks to a pair of breakthroughs in human travel. The USS Nautilus, the world’s first nuclear-powered submarine, was launched in Connecticut 61 years ago on January 21, 1954. Twenty-two years later, Concorde became the first supersonic jet to offer passenger service on January 21, 1976.
The Nautilus was named after Jules Verne’s fictional vessel in “Twenty Thousand Leagues Under the Sea,” and was the fourth U.S. Navy vessel to bear the name. But this Nautilus would actually earn its comparison to Capt. Nemo’s legendary craft, thanks to a nuclear-powered propulsion system that allowed her to operate free of surface-level oxygen. The Nautilus could stay submerged indefinitely as a result, recycling breathing air and distilling seawater into fresh water. The only limit on her ability to stay underwater was the amount of food on board. She completed a record-breaking trip to the North Pole in 1958, and logged over 300,000 nautical miles (her fictional counterpart only claimed 60,000) before being decommissioned in 1980.
Two decades after the Nautilus blazed trails beneath the seas, Concorde did the same in the skies. A joint Anglo-French project, Concorde wasn’t the first supersonic transport developed. (That would be the Soviet Tupolev Tu-144.) But it was the first to begin ferrying passengers through the air at almost unimaginable speeds. On this date in 1976, two Concorde turbojet-powered airliners entered passenger service. One served the route between London and Bahrain, while the other offered transatlantic service between Paris and Rio. (The US market was off limits at first, since Congress had outlawed supersonic landings due to noise complaints over the sonic booms generated.) When that ban ended in 1977, Concorde began service from London and Paris to New York.
Concorde had a top speed of Mach 2.04 (over twice the speed of sound), and was capable of covering the route between New York and London in under three hours on a good day. Concorde became a staple of the jet set, a symbol of travel for the rich and famous. French and British heads of state used it frequently, and the Pope logged a flight in 1989. But it was always more of a symbol than a commercial hit, and only 20 were ever built. (Six of those were prototypes.) Following a deadly crash in 2000 that killed 113 people (the only fatal accident in Concorde’s history) and the general downturn in aviation following 9/11, Concorde’s glory days were over. Air France and British Airways both discontinued their Concorde routes in 2003.
Photo montage assembled by AWC

Tuesday, January 20, 2015

For 52 Americans, a 444-day nightmare ended 34 years ago with their release from Iranian captivity on January 20, 1981. The Iranian hostage crisis had dragged on for over 14 months since radical Iranian students had stormed the U.S. embassy in Tehran, taking the diplomats and other Americans in the facility as captives.
The whole incident took place against the backdrop of the 1979 Iranian Revolution, which had seen the pro-Western Shah deposed by the forces of the Ayatollah Khomeini. The Islamic revolutionaries claimed that America had interfered in the country’s affairs for decades. The claim (which was true) helped the Ayatollah as he whipped up stories of America as “The Great Satan” in Iran. When the U.S. admitted the ex-Shah for cancer treatment in New York, revolutionaries had their pretext to take the embassy in November 1979. Women, minorities, and one sick man were released…but the remaining 52 hostages would spend over a year in captivity.
Publicly, Iran insisted the captives were “guests,” a claim which matched up very poorly with later tales from the hostages of beatings, solitary confinement, and psychological torture by their Iranian guards. The crisis was a public relations boon for the Ayatollah, who benefited from the perception that he had struck a blow against the U.S., and claimed “America can’t do a thing.”
The boast seemed to be true. Negotiations for the release of the hostages went nowhere for over a year. An attempted rescue in April 1980 was a disaster, and resulted in the deaths of 8 service members. As the captivity dragged on, Walter Cronkite began signing off the nightly news with a recap of how many days the Americans had been held. Ted Koppel’s “Nightline” got its start as a nightly update on the situation, with the same running tally on each night’s show.
The incident became a slow-moving disaster for Jimmy Carter’s presidency, and probably contributed to his defeat by Ronald Reagan in the 1980 presidential election. Negotiations picked up in the fall of 1980. The Shah had died, removing any chance he might return to power in Iran. Iran had been invaded by Iraq, and might have been motivated to put the matter to bed so it could focus on the war. Finally, all the hostages were released on this date in 1981. They boarded a plane, and began crying and hugging when told they were finally out of Iranian airspace.
It was Inauguration Day in the U.S., and word came down that the hostages were headed home just as Reagan finished his inauguration speech. It’s been said that the Iranians wanted to punish Carter one last time for his support of the Shah by delaying the release until he was no longer president. But Carter has since said he didn’t care who was president when he heard the news. He was just happy the hostages would soon be back home.
http://ahha3.files.wordpress.com/2013/12/jubilant-hostages-come-home.png

Monday, January 19, 2015

The work of Edgar Allan Poe (born 206 years ago in Boston on January 19, 1809) might give the impression of a life lived among crypts and séances, but the true fear that stalked him on earth was more banal, if just as frightening: Money, specifically the lack of it.
Poe made a bold decision to support himself through writing alone, the first great American writer to try that route. It would be tough for anyone, but he picked a particularly awful time for it. Before international copyright restrictions took hold, American publishers were happy to steal British literature instead of paying homegrown writers. The Panic of 1837 also sank the American economy, meaning even writers who could get work often went unpaid. Throughout his short life, Poe found himself repeatedly reduced to little better than begging, as his list of debts grew longer.
After years of writing (his first book was published in 1827), he finally achieved his popular breakthrough in 1845, when “The Raven” was published in the New York Evening Mirror. It was an instant hit, and made Poe a household name. He got $9 from it. The following year, the journal he owned and edited failed. The year after that, his wife Virginia died of tuberculosis. Suitably humbled by life, Poe died just two years after his wife in 1849. The story of his death is fittingly mysterious and has spawned many theories: The 40-year-old Poe was spotted walking around Baltimore, spouting delirious phrases and wearing clothes that weren’t his own. He died in the hospital four days later. No medical records, including his death certificate, have survived.
Here are some of the theories that have been presented for Poe’s cause of death: Alcohol, drugs, delirium tremens, tuberculosis, heart disease, epilepsy, syphilis, meningeal inflammation, cholera and rabies. Maybe the most sinister possibility is that Poe died from the practice of cooping. This was a nasty bit of 19th-century corruption where citizens were kidnapped and forced to vote for the same candidate in an election over and over. They were often plied with alcohol, forced to change clothes to avoid detection, and sometimes beaten or killed if they didn’t cooperate.
Whatever the truth, it’s clear that Poe’s life and death were, in their own way, as dark as the stories and poems he left behind…material that became popular in Europe first, before eventually becoming appreciated in his home country. “The Murders in the Rue Morgue,” “The Pit and the Pendulum,” “The Tell-Tale Heart,” and “The Masque of the Red Death” are among the creepy tales that still tingle spines. In addition to his dark Gothic works, he was also revered later as a pioneer in detective fiction and science fiction. Arthur Conan Doyle, Jules Verne, and H.G. Wells all explicitly credited him as an inspiration. Ironically for someone who struggled so mightily to make a dollar, a copy of Poe’s first book sold at auction for more than $660,000 in 2009…the highest price ever fetched for a work of American literature. He might have died with nothing to his name, but Edgar Allan Poe left a rich library of the macabre that promises to thrill readers forevermore.
http://ekladata.com/fWqi4ThlmRxzOBiZ0oGeQcaYdqE.jpg

Sunday, January 18, 2015

A.A. Milne (born in London 133 years ago on January 18, 1882) seemed determined not to become famous for children’s writing, at least judging by his output. He contributed humor pieces to “Punch” magazine before joining the staff in 1906 and becoming an assistant editor. He wrote for the screen when such a job description barely existed, and had four of his stories filmed in 1920. He churned out 18 plays, and tried his hand as a detective writer in a murder mystery released in 1922.
But a ravenous beast was stalking his career…a stuffed bear, which would devour his reputation as thoroughly as a pot of honey. Milne observed his son, Christopher Robin, playing with his stuffed toys and saw inspiration. There were Piglet, Eeyore, Kanga, Tigger, and the favorite of the bunch, a bear called…Edward. Somehow the names of a Canadian black bear called “Winnie” (for Winnipeg) and a swan called “Pooh” got combined after visits to the London Zoo, and the stuffed bear was aptly renamed. Winnie-the-Pooh first showed up in print on Christmas Eve 1925 in a story published in the London Evening News. Two books followed in 1926 and 1928, and Milne’s fate was set.
Milne’s goal had always been the freedom to write whatever he wanted. But the Pooh stories took off beyond his imaginings and, as he saw it, locked him into the role of children’s writer. As he put it, “I said goodbye to all that” with his kids’ books. He continued branching out, adapting “The Wind in the Willows” for the stage and writing about international relations during the interwar period and World War II. (He had served in World War I, and was initially opposed to another war, but changed his position after Hitler strode onto the world stage.)
But it was Pooh that would define him, and he knew it. He died aged 74 in 1956 and split the lucrative Pooh rights between his family and multiple organizations he wanted to support. The rights now sit with Disney, which has turned the bear into a multi-billion dollar merchandising powerhouse. But while Disney has made Pooh bigger than life, his origins will always be small and cozy. Christopher Robin’s original toys are displayed under glass at the New York Public Library. And in London’s Ashdown Forest (the inspiration for Pooh’s Hundred Acre Wood) sits a plaque containing words Milne penned himself: "In that enchanted place on the top of the forest a little boy and his bear will always be playing."
https://www.profilesinhistory.com/wp-content/uploads/2012/11/A-A-Milne-Oversize-Photograph-Signed-with-son-Christopher-Robin-and-Winnie-the-Pooh-1024x904.jpg

Saturday, January 17, 2015

We take for granted that we can watch movies and TV shows at home on our own schedule, but as with many commonplace things, ‘twas not always so. A big hurdle on the way to today’s world of TiVo and Blu-ray wasn’t technological; it was legal. Movie studios were terrified of giving consumers the power to record movies and shows for private use, so much so that they were willing to go to court to stamp out the home video market completely. It was 31 years ago today that the studios lost and home entertainment won, when the Supreme Court ruled in favor of VCRs in Sony v. Universal on January 17, 1984.
At issue was whether Sony, which had released the Betamax video recorder in the 1970s, was infringing on movie studios’ copyright protection by allowing consumers to record movies for their own personal use from TV broadcasts. Sony maintained that this was simply “time-shifting,” allowing people to watch programming on their own schedule and not the TV programmers’. The studios, led by Universal and Disney, weren’t having it, and they took Sony to federal court to block the sale of Betamax recorders.
The case was filed in 1976, and dragged through court for eight years. Sony won in California District Court, but lost on appeal to the Ninth Circuit. Even as the appeals court suggested Sony might be liable for damages to the studios, home video recording sales continued to take off as competition drove down prices. By the time the case reached the Supreme Court, far more homes had VCRs than when the case had begun. Many of the justices were concerned about holding consumers liable for damages for recording a single show or movie for private use. On this date, the high court reversed the Ninth Circuit by a 5-4 vote and held that Sony and other VCR manufacturers had done no provable harm to the studios.
Ironically, market forces ended up flipping the outcome that had been decided in court. Sony walked away a winner, but its Betamax had already seen its market share eroded by the VHS format, which offered longer recording times and ultimately made Betamax obsolete. For the studios, a big loss became a big win. After realizing there was no re-corking the VCR genie, they started releasing their films on video for purchase, opening up a huge new revenue stream.
The only consistent winner in the case was consumers, who had gotten used to the idea of watching what they wanted, when they wanted, and weren’t going to give it up. Formats have changed over the years, and VCRs are museum pieces today. But the idea they represented is very much alive every time you fire up a DVD, check your DVR history, or log into Netflix. You’re in control, and a Supreme Court decision on this date went a long way toward ensuring that.
http://gautamsawala.in/wp-content/uploads/2014/02/vcr.jpg

Friday, January 16, 2015

AWC has a complicated case to make today. We fully concede that granting Octavian the title of Augustus 2,041 years ago on January 16, 27 BCE (photo taken a bit later) ushered out a rare example of (sort of) democracy in the ancient world, replacing the Roman Republic with a brutal and expansionist Roman Empire. AWC also grants that many of the endowments Rome left to Western society should have been returned to sender, including slavery, religious persecution, and military dictatorship.
But it’s just as true that Rome embodied bits of everything we call Western civilization, which included many roses among the thorns, for those fortunate enough to benefit from them. (Or to let Monty Python make the case: “Alright, but apart from the sanitation, the medicine, education, wine, public order, irrigation, roads, a freshwater system and public health…what have the Romans ever done for us?!”)
For two centuries, the Pax Romana brought peace and prosperity to Roman citizens, which was a pretty big deal for people who had to poop outdoors. Rome’s long arm brought peace (often brutally, which is ever the paradox) to an area that stretched over three continents and contained about one-fifth of the world’s population. (It was a different story on the empire’s borders, which were constantly assailed by raids, most notably from Germans looking for a piece of the Roman dream.)
Ironically, the Roman Empire also provided the infrastructure for the spread of Christianity. While some emperors tried to stamp out the dangerous new religion through torture and brutality, their network of roads and cities clustered around the Mediterranean actually had the opposite effect, allowing the Apostle Paul and other early proselytizers to spread their faith at breathtaking speed, giving it a firm foothold in the West until it eventually took over the Empire. (It’s not by accident that the Pope lives in Rome.)
Rome eventually fell, as all human endeavors will. (Those Germans on the border couldn’t be held back forever.) When it left, much of the Roman Empire wasn’t worth mourning. But much more of it still lives on, in our languages, religion, and civil structures all across the Western world. Even 2,000 years later, in some ways all roads still lead to Rome.
http://www.smh.com.au/ffximage/2004/04/19/gladiators,0.jpg

Thursday, January 15, 2015

AWC is still at the Super Bowl, although they didn’t call it that when the Green Bay Packers and Kansas City Chiefs met in Los Angeles 48 years ago on January 15, 1967. It was the first time the champions of the two major football leagues had met to finish the season, and the name was still being hammered out. It was officially called the AFL-NFL World Championship Game, which doesn’t exactly roll off the tongue.
The Packers represented the old guard NFL, which was reluctantly accepting a merger with the upstart AFL, which it had been unable to squash. Both leagues wanted to win a share of pride in the new championship game, and on the first go-around, tradition beat innovation. With Vince Lombardi on the sidelines and Bart Starr under center, the Packers easily handled the Chiefs 35-10.
Off the field, the stadium was less than two-thirds full (plenty of empty seats in this shot), and college marching bands were the halftime entertainment. All in all, it was an inauspicious start for what would become America’s unofficial national holiday. The Packers won the next year’s game too, and it looked like the upstart AFL was in over its head against the NFL’s traditional powers.
Within two years, two big changes occurred. First, the name. Chiefs owner Lamar Hunt had been jokingly calling the new game the Super Bowl, riffing on college football’s bowl game tradition. He claimed he probably got the idea from watching his kids play with a Super Ball, and said he was sure the name could be improved. The leagues evidently disagreed, and by 1969 the Super Bowl name was official.
The second big change in 1969 was that the AFL started coming to play. The Packer dynasty had faded by then, and the New York Jets shocked the world with a win in Super Bowl III. The Chiefs won the last pre-merger Super Bowl the next year, and by the 1970 season, there was no question the absorbed AFL teams would be able to compete within the new, expanded NFL.
The hype around the game continued to grow. TV spots grew more expensive, halftime shows became more elaborate, and today even a dud of a game can set viewing records. (Seattle had wrapped up last year’s laugher by the third quarter, but it still drew 111 million viewers, making it the most-watched American telecast of all time.)
http://www.history.com/news/wp-content/uploads/2012/02/super-bowl-name.jpg

Wednesday, January 14, 2015

A new Super Bowl champion will be crowned in a few weeks…but whoever it is, they won’t be able to make the same boast as the 1972 Miami Dolphins, who became the first and still only NFL team to finish a perfect season 42 years ago today, beating Washington 14-7 in Super Bowl VII in Los Angeles on January 14, 1973.
Even after rolling through a perfect 14-0 regular season and knocking out Cleveland and Pittsburgh in the AFC playoffs, the Dolphins still weren’t Super Bowl favorites. They were knocked for playing a soft regular season schedule, and were installed as 1-point underdogs for the championship. Just like they had all year, the Dolphins turned to their “No-Name Defense” in the Super Bowl, holding Washington scoreless until almost the final 2 minutes and winning what is still the lowest-scoring Super Bowl ever.
The ’72 Dolphins weren’t packed with all-stars, although players like Bob Griese and Larry Csonka got their share of praise on offense. On the other side of the ball, that “No-Name Defense” was a collection of guys who quietly put together the league’s best stopping unit that year.
At least one team routinely gets listed ahead of the ’72 Dolphins in discussions of the league’s all-time best teams: The 1985 Chicago Bears, who were far more dominant. The only problem? The Bears lost one game that year, a Monday night tilt against (naturally) the Dolphins. The 2007 New England Patriots came closest to matching the Dolphins’ feat, coming into the Super Bowl with an 18-0 record. But they were stopped one win shy of both history and a title in a massive upset loss to the New York Giants.
It’s become an urban legend that whenever the last unbeaten NFL team loses in a season, the ’72 Dolphins break open a champagne bottle. There’s some debate on how true this is, but Griese has claimed to share Diet Cokes with some of his old teammates, which seems fair enough. Because while there might have been better teams in history, there’s still never been one that defined consistency like the 1972 Miami Dolphins.
http://www.netbrawl.com/uploads/70d36011f81dc97b1e7553df603dac85.jpg

Tuesday, January 13, 2015

Johnny Cash never spent a day in prison. He was arrested seven times on misdemeanor charges, but spending the night in jail was the closest he came to incarceration. Still, that experience, along with years of heavy drinking and drug use, might have given him some insight into the life of a prison inmate. Maybe he realized his own path could have easily mirrored one of theirs, with a few different choices. At any rate, when he recorded two concerts at Folsom State Prison in northern California 47 years ago on January 13, 1968, he sang like a man who knew what life on the inside was about.
Cash had become interested in prison life after seeing the drama “Inside the Walls of Folsom Prison.” He released “Folsom Prison Blues” in 1955, more than a decade before he would record it in front of actual inmates at Folsom. Cash was said to have great compassion for inmates, and began playing prison shows with a performance at San Quentin in 1959.
By 1968, Cash had gotten his drug use under control. At 35, he had watched his career start to wane, but he was able to get enough support from Columbia Records to back the idea of a prison album. Calls to Folsom and San Quentin were made, and Folsom responded first. Cash, along with June Carter (who would become his wife two months later), Carl Perkins, and the Tennessee Three, recorded two concerts at Folsom on this date. The resulting album, “At Folsom Prison,” rejuvenated Cash’s career, even though Columbia made only a token effort to promote it. A second prison album, “At San Quentin,” followed the next year.
Listening to parts of “At Folsom Prison” might give the impression of a room of rowdy inmates, but the prisoners were actually careful to reserve their cheers, since they didn’t want to be hassled by guards for cheering Cash’s mournful descriptions of prison life. Cheers on the line “I shot a man in Reno, just to watch him die” were added during post-production.
Modern musicians are often criticized for glorifying crime, but it’s worth remembering Johnny Cash “shot a man in Reno” way back in 1955. (Cash said he wrote the line after trying to think of the worst possible reason to shoot a person.) Luckily for him, his concerts were the closest he ever came to life inside a prison before his death in 2003. And luckily for all of us, a few of them were preserved to give us a sense of how an outlaw singer was able to connect with real outlaws.
http://www.captainbluehen.com/wp-content/Johnny-Cash-At-Folsom-Prison.jpg

Monday, January 12, 2015

Hoping to make more money as a studio owner than a songwriter, Berry Gordy leveraged an $800 loan from his family to start a new record company in Detroit called Tamla Records 56 years ago on January 12, 1959. The following year, he changed the company’s name to Motown (“motor town,” riffing on Detroit’s auto industry ties). The company name, and the sound it became associated with, did much to help black musicians achieve mainstream success in the 1960s.

Motown placed 79 records in Billboard’s Top 10 singles chart during the 1960s, and 110 between 1961 and 1971. The Supremes, the Four Tops, Smokey Robinson & the Miracles, Marvin Gaye, Stevie Wonder, the Jackson 5…all of them released singles from the Motown hit factory. (The Hitsville recording studio Gordy purchased and renovated in 1959 was open 22 hours a day.)

Gordy recognized the resistance to black musicians in much of the country in the 1960s, which he countered by carefully controlling the images of his artists, many of whom had no experience with public life. He schooled them on how to dress, groom, and speak before sending them out.

Combining a squeaky-clean public image with an R&B/soul sound that was designed to have crossover appeal with black and white audiences, Motown arrived at the perfect time. White kids were getting turned on to black musicians against the backdrop of the Civil Rights Movement. Smokey Robinson described the change in the country years later: “I would come to the South in the early days of Motown and the audiences would be segregated. Then they started to get the Motown music and we would go back and the audiences were integrated and the kids were dancing together and holding hands.”


Motown couldn’t solve all of society’s problems, but it did make it impossible to go back to the days of segregated musicians for segregated audiences. After Motown, there would just be musicians and audiences, period. And once a culture changes, society itself isn’t far behind.
https://artsenglish.wikispaces.com/file/view/motown_50yrs.jpg/189070947/motown_50yrs.jpg

Sunday, January 11, 2015

Theodore Roosevelt designated 800,000 acres of the Grand Canyon as a National Monument 107 years ago on January 11, 1908, placing the area under federal protection from mining and other development. As a New Yorker who had fallen in love with the West’s natural wonders during his travels as a young man, he dedicated much of his energy as president to preserving those wonders for future generations. And arguably none of the areas he placed under federal protection have inspired as much awe as the 277-mile gorge carved through Arizona over millions of years by the Colorado River.
The canyon had been a sacred site for Native Americans for over 3,000 years before Europeans first saw it in the 16th century, and Roosevelt made his feelings on it clear: “Let this great wonder of nature remain as it now is. … You cannot improve on it. But what you can do is to keep it for your children, your children’s children, and all who come after you, as the one great sight which every American should see.”
Standing in the way was a slow-moving Congress, which had been putting off granting the canyon National Park status since 1882. Landholding and mining interests hoped to exploit the area’s resources for profit, which no doubt did plenty to gum up the works. Faced with this barrier, Roosevelt did the next best thing, using his executive powers under the Antiquities Act of 1906 to declare the Grand Canyon a National Monument as an “object of scientific interest” on this date.
One mining claimant complained that Roosevelt had overstepped the intent of the Antiquities Act by protecting an entire canyon. He challenged the designation in court, and the issue went all the way to the Supreme Court, which ruled unanimously that a 277-mile gorge carved over millions of years, with three millennia of sacred history for Native Americans, indeed qualified as an “object of historic or scientific interest.”
The Grand Canyon was eventually upgraded to National Park status in 1919, in an act approved by Congress and signed by Woodrow Wilson. Today the Grand Canyon attracts around five million visitors a year, who make their way around on foot or by raft, mule, or helicopter, or just peer over the rim at one of nature’s masterpieces, one which Teddy Roosevelt knew could never be improved by the hand of man.
http://www.travelphotoadventures.com/wp-content/uploads/2013/01/The-Beautiful-Grand-Canyon-by-Michael-Matti.jpg

Saturday, January 10, 2015

Petroleum has brought its share of problems – economic, environmental, and more. It’s also a finite resource that we will use up one day, if not in our lifetimes. But the simple fact is that our modern world, with all the comforts and conveniences we enjoy, was built on oil. And no moment might have ushered in our age of oil more dramatically than the gusher that exploded from a derrick at the Spindletop oil field in Beaumont, Texas, 114 years ago on January 10, 1901.
The air over Spindletop was filled with gushing crude oil, and it took nine days to get the flow under control. The well produced an estimated 100,000 barrels of oil a day, and it changed the face of Texas, not to mention the United States, forever. What had been a sprawling and mostly empty state became the center of a booming American oil industry. Houston exploded into a major city, home to the largest collection of oil refineries in the world. Beaumont became a boomtown, its population tripling almost overnight. The discovery of more oil deposits along the Texas Gulf Coast and in other parts of the state would make the United States the world’s largest oil producer, and Texas oil barons quickly became some of the richest and most influential men in the state and the country. (There’s a reason for the nickname “Texas tea,” after all.)
The post-World War II economic boom in this country is a celebrated part of our history. What’s not often mentioned is that the boom was powered by the increased use of petroleum. The good times couldn’t last forever, of course. By the middle of the 20th century, production had leveled off in Texas, and a tightening of the world’s oil supply caused an energy crisis that hit the U.S. hard in the 1970s. But rising oil prices allowed Texas to ride out the crisis and benefit from another, smaller boom.
We will stop powering our world with oil one day…and it’s quite possible that the oil orgy of the last 100 years or so will leave our grandkids facing major environmental challenges we’ll never have to worry about. But if the age of oil has been a shortsighted one, it’s also unquestionably made much of the world a better place to live. In addition to material comforts, we’ve made strides in science, technology, and medicine that might have never occurred without our interconnected, oil-powered society. If we’ve left a heap of problems for future generations to untangle, we can hope that we’ve also left them with many of the tools they’ll need for the job.
http://static01.nyt.com/images/2014/01/10/us/10TT-gone/10TT-gone-blog427.jpg

Friday, January 9, 2015

We don’t do commercials here, so the point of today’s post isn’t to encourage you to go out and buy anything. The point, rather, is that when Steve Jobs unveiled the iPhone just eight years ago on January 9, 2007, he announced a product that has probably influenced something you already own, whether you’ve ever bought an Apple product or not.
Combining old-fashioned telephony with new-fangled computing had been an idea since at least the 1970s, and there were phones that surfed the internet before 2007. But Apple deserves credit for turning the smartphone industry (was that really even a thing 10 years ago?) on its head with a sleek design that integrated all of the device’s functions in an elegant and intuitive fashion. People lined up to buy the things when they went on sale in June 2007, and competitors like Microsoft and BlackBerry basically went back to square one to compete in Apple’s new reality; BlackBerry’s “work phone” niche pretty much ceased to exist in the brave new world. The truest indicator that the iPhone changed the whole game? QWERTY keypads were common on cell phones before 2007. Three years later, none of the top-tier smartphones had physical keyboards; touch screens had become all the rage.
The reality of having something in our pockets that is theoretically a phone, but in truth carries out countless functions that have nothing to do with spoken conversation, is still a new one, and we’re still figuring out what it means for us. Boredom seems to have been banished, with an endless variety of games (not to mention the entire internet) within reach at every moment…but so has quiet introspection. Arguments over who starred in that one movie, with the guy and the thing, are a thing of the past. Just whip out your phone and IMDb that sucker. Convenience and endless information are the air we breathe, and while they’ve probably made us lazier and dumber than our forerunners, we’re also more efficient and knowledgeable (without needing to actually know more) than they could have hoped.
Something like the iPhone was always going to happen. Science fiction had seen around this corner for decades; it just so happens that Apple got there first. A lot has changed in just eight years. Steve Jobs is gone, and so is the first iPhone (officially declared “obsolete” by Apple in 2013). Things move fast in the digital world, and keeping up with the “flood of electronic babble” (to borrow an EPCOT phrase) is our new challenge. Only now it’s a challenge that’s always at our fingertips.
http://darkroom.baltimoresun.com/wp-content/uploads/2012/10/ed-jobs-1.jpg