Wednesday, September 27, 2006

War Games

"When you play Bobby, it is not a question if you win or lose. It is a question if you survive. - Boris Spassky on Bobby Fischer

As you read this, the world championship of chess is going on in Kalmykia, a republic within Russia on the northwest shore of the Caspian Sea (that's for the one or two of you who still know anything about geography). It's the first unified world championship in 13 years. Regrettably, in a world focused on the Internet (which should be the perfect medium for following a chess match), brain-dead television, video games, and big-money sports, chess is almost an anachronism.

It's not that chess was a huge event years ago, but it had a great deal of respect, and a great many people played it, even if they didn't always own up to it. Now, if the average person knows anything about chess, they probably relate it to speed chess, a degraded form of the game that has managed to de-intellectualize a pursuit once equated with enormous expenditures of brain power.

People also know that Garry Kasparov was the first reigning world champion to lose a game to a computer, the legendary Deep Blue, although he won that match. He was subsequently played to a draw in a match against a later program, Deep Junior. In a way, Mr. Kasparov caused the decline of championship chess by starting his own chess federation in competition with FIDE, which resulted in two separate champions who never played each other. I don't know why he did this; perhaps he got tired of defending his title against Anatoly Karpov all the time.

At any rate, we'll have one champion again, but it's unlikely to get much coverage anywhere in our too-busy-for-a-daily-hair-restorer-treatment world. It wasn't always like that.

You may recall Bobby Fischer reappearing recently when the U.S. tried to get him extradited from Japan because he violated a sanction against traveling to Yugoslavia back in 1992 to play an exhibition against Boris Spassky. He and Spassky made a pile of money, which is what got the Feds irritated more than anything else. Fischer went on the lam for years. After a lot of demonstrations, diplomatic wrangling, and general confusion, Iceland offered to let Fischer come there to honor his achievements in chess. Many would say that the greatest of these occurred in Iceland, in Reykjavik, in 1972 when he won the world championship of chess, defeating the reigning champion, that same Boris Spassky.

I could probably write about a dozen entries about that match and the craziness that surrounded it, but others have done it better (see the resource for this article below, for example). I'll try to give you an overview.

You have to understand the period. In many ways, the worst of the Cold War had passed, but there were still plenty of tensions. We had beaten the Russians to the Moon; they had beaten our basketball team in the Olympics. In general, things were moving toward what would be called detente, but we had a long way to go. And the Russians dominated the world of chess.

Bobby Fischer was the wunderkind of chess. He won the U.S. championship at 14 and became a grandmaster at 15, the youngest up to that time. He was also a nut. He was reclusive, paranoid, volatile, and he hated the Russians with a passion because he felt they were “conspiring” to keep him from winning the chess crown.

Boris Spassky was Fischer's opposite in many ways. He was gregarious, charming, and somewhat athletic. He was calm and self-composed, able to easily withstand the pressures that came with holding chess's highest prize. And he had never lost to Fischer.

That the match happened at all was a minor miracle. FIDE's rules called for a series of tournaments and candidates matches that would decide Spassky's challenger. Fischer, protesting something or other, sat out the initial phases of the qualifying events. After considerable cajoling by friends and chess aficionados, he entered the fray at what's called the Interzonal level and proceeded to mop the floor with his fellow grandmasters. He didn't just win his matches; he annihilated his opponents, at one point winning 20 games in a row. At a level where draws outnumber decisions by about 2 to 1, winning 20 straight games outright is on a par with Joe DiMaggio's 56-game hitting streak.
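
If you want a feel for just how absurd that streak is, here's a back-of-the-envelope sketch in Python. The win rates are assumptions I've picked for illustration, not measured figures:

    # Rough odds of 20 straight outright wins at top level, assuming
    # (hypothetically) that draws outnumber decisive games 2 to 1, so even
    # a dominant player wins outright only about a third of the time.
    p_win = 1 / 3
    print(f"{p_win ** 20:.1e}")   # ~2.9e-10 -- about 3 chances in 10 billion

    # Even granting an absurdly generous 50% outright win rate:
    print(f"{0.5 ** 20:.1e}")     # ~9.5e-07 -- about 1 in a million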

Fischer, who so often seemed childish and unpredictable, was almost frighteningly self-possessed over the chess board. His opponents seemed to be unnerved by his aggressive play, his ability to deflect attacks, and his complex tactics. Almost to a man, at some point in their matches against him, his opponents would request a postponement due to illness. The phenomenon became known as “Fischer fever.”

So the match was set between Fischer and Spassky. Now the haggling started, first over the site (which ended up being Iceland), then over the purse (which ended up being a couple of hundred thousand dollars). Fischer issued one demand after another. The Russians got upset. The more the Russians complained about Fischer, the more Fischer claimed they were conspiring against him. Meanwhile, Spassky waited.

Just when it appeared that the match would never happen, Fischer showed up, but his (non-chess-related) games were not at an end. He didn't show up at the opening ceremonies, sending a representative instead. This outraged the Russians who demanded an apology. Eventually, Fischer acceded, and the match that almost didn't happen started.

Fischer lost game 1. Then he went on a rampage about TV filming. He could hear the cameras; he could see the cameramen; the conditions were impossible. He refused to play until the cameras were removed. The next game went on with the cameras in place, and Fischer didn't show up, forfeiting the game. Now he was down 2-0 in a contest where the winner needed 12½ points out of a possible 24.

There were considerably more histrionics, but finally Fischer was made happy, or at least less unhappy. Whatever the reason, he decided to get down to business. Boy, did he get down to business. He won games 3, 5, 6, 8, and 10 (the others were draws) before Spassky finally broke back in game 11. After a draw in game 12, the score was 7 to 5 in favor of Fischer, but many thought the Russian had new life. Fat chance.
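
If you want to check my arithmetic (a win counts a full point, a draw a half for each player), a few lines of Python reproduce that score from the results described above:

    # Fischer's results in games 1-12 as described above: losses in games
    # 1 and 2 (the forfeit), wins in 3, 5, 6, 8, and 10, a loss in 11,
    # and draws in the rest. Win = 1 point, draw = 1/2, loss = 0.
    results = ['L', 'L', 'W', 'D', 'W', 'W', 'D', 'W', 'D', 'W', 'L', 'D']
    points = {'W': 1.0, 'D': 0.5, 'L': 0.0}

    fischer = sum(points[r] for r in results)
    spassky = len(results) - fischer   # every game splits one point between them
    print(f"Fischer {fischer} - Spassky {spassky}")   # Fischer 7.0 - Spassky 5.0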

In game 13, Fischer brilliantly dodged Spassky's attempts at attack then turned the tables. After 74 moves, Spassky resigned, looking shell-shocked. Fischer was in command, and Spassky knew it. The next day he requested a postponement of the next game due to illness. “Fischer fever” had struck again.

From there on to game 20, the games were drawn. Fischer could afford to play for draws; Spassky could not. Yet Spassky couldn't find a formula to trap the American. Then came game 21.

Chess is a timed affair, because in the olden days some players would simply wear down their opponents by taking forever to make moves. Game 21, like several of the others, was adjourned, this time after 40 moves. Spassky had had some opportunities in the game, but he made misplays and generally seemed worn out. The next day, just as play was about to resume, Spassky phoned in his resignation. Fischer was the world champion.

A lot of folks thought chess would be big-time stuff after that, but it was not to be. Fischer, if I recall, never even defended his championship, essentially retiring from public life for years. Russians and Central Europeans went back to dominating the game, and, to Americans, chess went back to being a pursuit for eggheads, an occasional bright kid, and the piece-slamming speed players.

It's a shame, really. Our brains could stand the exercise.

Reference: Fischer-Spassky: The New York Times Report on the Chess Match of the Century, Richard Roberts et al., Bantam Books, 1972.

Wednesday, September 20, 2006

What Would Gloria Steinem Think?

Do you not know I am a woman? When I think, I must speak. ~William Shakespeare, As You Like It

Colleges have changed since I was a student. No, I don't mean that they've moved out of log cabins. What I'm talking about is that colleges have drifted away from what they were created to do: create an environment where students can learn, not just facts from books, but about life in the world around them. Some institutions are beginning to realize that they need to get back on track. The students and alumni, on the other hand, seem to be put out when the schools try to fulfill that mission.

I've written about a couple of strange situations lately. First, there were the protests over the not-the-right-kind-of-deaf president of Gallaudet University. Then came the howls over 1,500-student Birmingham-Southern College deciding to move to Division III athletics from Division I in order to emphasize academics. But those incidents are nothing compared to the latest brouhaha, this time at Randolph-Macon Woman's College.

RMWC is one of a dwindling number of women-only colleges. It's losing money and draws many of its students by offering financial incentives to reduce tuition costs. Unfortunately, this strategy is beginning to eat into the school's endowment, which, as any higher education administrator will tell you, is slow death to an institution. The trustees, therefore, decided by a 25-2 vote that the only way to stay in the education business is to go co-ed.

They may as well have voted to admit orangutans.

Students have mounted protests. During one protest, the administrators tried to explain the reasoning behind the decision and were roundly shouted down. When the interim president implored the students not to turn their backs on the school, most of the crowd turned their backs on her. They posted banners, including one that said, “115 years of Women can't be wrong.”

They once said the same sort of thing about segregation, not allowing women to vote, and the Ptolemaic System.

At another news site (which, by the way, says the vote was 27-2, finding two more trustees somewhere), there were some comments posted, which, as is usually the case, ranged from the outraged to the vulgar, with everything in between. The posting that caught my eye said this:

“Women's colleges turn out a higher percentage of women who go on to high profile careers than coed colleges. Grads from women's colleges are more likely to go on for postgraduate education resulting in more female PhDs than are the female grads of coed schools.

“In fact, I'm sad to say that a lot of women who go to coed colleges are depriving themselves of a rich educational environment.”

I have no idea if the writer had any statistics to back this up, but given that there are only about 60 women-only schools left in the country, they aren't generating a lot of our female PhDs in absolute terms. With so many more women enrolled at co-ed schools, those schools can produce far more female PhDs even if a smaller percentage of their graduates go on to graduate school. The statistic, if it's legitimate, is not really meaningful.
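
To make the base-rate point concrete, here's a quick sketch with numbers I've invented purely for the arithmetic:

    # Made-up figures, chosen only to illustrate the base-rate problem:
    # a higher rate from a tiny base still yields far fewer PhDs.
    womens_grads, womens_phd_rate = 15_000, 0.10    # assumed
    coed_grads, coed_phd_rate = 600_000, 0.05       # assumed

    print(int(womens_grads * womens_phd_rate))   # 1500 headed for PhDs
    print(int(coed_grads * coed_phd_rate))       # 30000 -- 20x as many, at half the rate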

The irony here is overwhelming. It wasn't many years ago that women were working hard to break down barriers to them at men-only schools, at least partly because they felt that the educational environment was richer and potentially more rewarding. Remember the drive for women to enter the military academies? By the above poster's rationale, we'd have more female generals if women had their own academy.

A lot of people, female and male, spent a lot of effort during the twentieth century to try to ensure that anyone, regardless of race or sex, could get a quality education. Whenever a school was faced with having to remove barriers, the arguments of tradition and degradation of the college experience were trotted out routinely. Just as routinely, these arguments were tossed aside to provide opportunities to more students.

It's not that men-only or women-only gatherings are never okay. Men and women occasionally need to be in groups of just their own gender to talk freely, belch if they want to, scratch in places that aren't allowed in mixed company, and so on. But that doesn't work in the workplace, in the voting booth, or in education.

The real world is full of interactions between men and women. An educational environment that supposedly prepares women for that world yet excludes men is not preparing them for anything. Perhaps going out of business isn't such a bad thing then.

But that's wrong. I like the idea of lots of different colleges, and I think the idea of one going under is tragic. If RMWC can continue and even grow by adding men to the student body, then the trustees would be doing the school a disservice if they didn't go that route.

The trustees also face another problem. Once Randolph-Macon Woman's College goes co-ed, the school will have to drop “Woman's” from its name. It turns out that there already is a co-ed school called Randolph-Macon College. That Randolph-Macon for many years did not admit women.

I tell you, the irony is so thick you could cut it with a knife.

Monday, September 18, 2006

Go Long

When I went to Catholic high school in Philadelphia, we just had one coach for football and basketball. He took all of us who turned out and had us run through a forest. The ones who ran into the trees were on the football team. ~George Raveling

You were probably too hung over from the first weekend of college football to notice, but September 5 was a momentous day in the history of the gridiron. On September 5, 1906, the St. Louis University Billikens' Bradbury Robinson threw the first legal forward pass in the sport's history. It was incomplete.

As Woody “three yards and a cloud of dust” Hayes once said, “Three things can happen when you pass, and two of them are bad.”

Despite the inauspicious start and Woody Hayes, the forward pass has come to be the centerpiece of collegiate and professional offenses. It would be difficult to imagine the game without it. But if you want to try, picture your average felonious assault, and you'd have a pretty good idea.

In 1905, national concern about the violence of football had reached a peak. People were getting seriously injured, even killed, playing this game. Of course, this was football with the flying wedge, a formation where linemen linked arms and literally steamrollered down the field, concussing and trampling would-be tacklers. This was football where a ball carrier wasn't down until he was held down. “Piling on” wasn't a penalty; it was a required part of defensive play. People were calling for the game to be banned.

President Theodore Roosevelt, who was certainly no weenie, went on record as saying he would push for Congress to pass legislation forbidding the game unless the rules committee did something. He went so far as to make a very specific suggestion: legalize the forward pass. Now, any presidential suggestion carries a lot of weight; coming from a man with the drive and personality of Teddy Roosevelt, it was an order. Therefore, the rules, and the game itself, were changed forever.

I certainly didn't realize that St. Louis U. was the first school to use the pass. According to an AP article I read in my local paper, they overcame that initial incompletion to generate a convincing aerial offense. Immortal coach Eddie Cochems (no, I've never heard of him either) applied the pass so successfully that they went undefeated, slaughtering one opponent 71-0. Like most people, I was conned by Pat O'Brien and Ronald “The Gipper” Reagan in the film biography of Knute Rockne. In that movie, Rockne brings out the forward pass to beat Army, I think it was (without Ronald Reagan's help; he comes along later in the movie). The referees are flummoxed; they don't even know the play is legal. Of course, given the national publicity generated by Roosevelt's support, it's unlikely that the game officials would have been unaware of the play, but it made for a good movie moment.

Fans who have grown up in the last twenty years probably think that passing was the prime offensive weapon in everyone's arsenal from 1906 on. Far from it. Football was basically a ground game right into the sixties. A team that threw more than 20 times a game seldom reached the championship game (the only game in town; no playoffs, remember?). Great quarterbacks like John Brodie and Sonny Jurgensen played on also-rans. The teams that played for the title had strong running games: Cleveland, Green Bay, New York, and so on.

It's not that there weren't great passers. Otto Graham was pinpoint accurate and possibly the best clutch quarterback of all time. Bobby Layne made the Detroit Lions formidable with his passing. But, both of those teams, along with all the others, depended on the running game to set up the pass. If they couldn't run, the passing game went nowhere.

Today, we're at the other extreme; the pass sets up the run. Teams come out flinging the ball, then turn to the running game when the defense is spread out defending the receivers. I don't know that it's better, but it certainly is different.

Personally, I have always liked the balanced offense, which is still a hallmark of successful teams. You can throw the ball all over the place when you need to, and you can rumble on the ground when you want to. It's hard to defend against a team that is able to use the entire playbook on every down. It's sad to see a team that has to throw on third-and-three because it can't block for the run. It's just as sad when a team that lives by the run (like Air Force, for example) can't come back from more than 10 points down because it has limited options to score quickly.

Of course, the other problem with a lot of passing is that it lengthens the game. The clock stoppages for incomplete passes played a significant role in pushing games over three hours. Passing also creates more penalties, particularly for holding, which causes further stoppages. Of course, an hour of commercials per game is also a huge factor in creating three-and-a-half-hour games, but I won't even waste wear and tear on my fingers writing about that, since no one is going to reduce the ad stoppages.

So, when you're watching Peyton Manning or your favorite collegiate quarterback filling the air with pigskin, sit back and enjoy it, because the game would be duller without it. Especially enjoy those teams that can surprise you with the pass or the run when you don't expect it, because surprise is the essence of the game. Besides, if you took away the pass, you'd have an entirely different game. You'd have rugby.

Which, as I think of it, isn't THAT bad.

Wednesday, September 13, 2006

Getting the Business

Start with what is right rather than with what is acceptable. ~ Peter Drucker

During my recent discussions of the Space Frontier Foundation and commercial space flight over at Explorations, the gentlemen who disagreed with my assessment took issue with my attitude toward business. Although only one went so far as to imply that I am a socialist, both clearly felt that I was a typical commie-pinko-liberal-tree-hugging-private-enterprise-hating noodle brain.

Well, maybe not that bad, but they weren't happy with my attitude. Unfortunately, the business world keeps giving me reasons to feel that way.

Everyone is probably aware of the massive battery recall that Dell has had to make because of a small problem with Sony batteries bursting into flame. Now, Dell has gone to some lengths to make the recall process as painless as possible, which is great if you haven't had a laptop catch fire yet. It seems, though, that Dell may not have been as prompt as they might have been.

The Inquirer reports that Dell and Sony had some conversations months before the recall about potential problems with Sony's manufacturing methods for the lithium-ion batteries. Based on the meetings, Sony made changes to its manufacturing methods, but it was decided not to recall any of the potentially defective batteries.

Oops.

Ironically, the article, dated August 21, mentions a possibility that Apple might be affected, but that the "official line" was that no action was required. Just a few days later, Apple issued a similar recall.

That slurping sound you hear is coming from lawyers licking their chops.

If this were an isolated incident, one would be willing to write it off. But it comes virtually on the heels of Sony's attempt to place DRM software on users' PC's without their knowledge. That software was effectively a rootkit that could be used by Sony or (as it turned out) any hacker to invade your PC in a variety of ways, none of which are to the user's advantage.

It also raises haunting echoes of the Ford Pinto fiasco of many years ago, where concerns were raised about potential fires from gas tanks rupturing if the car was hit from the rear. More recently, General Motors had a similar fiasco with the so-called "side-saddle" gas tanks in some of their pickups, which also burst into flame on collision. What was bad in both situations was the presence of internal documentation showing that engineers were concerned about the dangers of the designs (evidently a different group of engineers from the ones that did the designing). The companies decided to ignore the advice, with the result that they suffered hideous publicity and expensive lawsuits.

And all of the above doesn't take into account purely illegal actions like the stock option tomfoolery that has Dell, McAfee, and about 100 other U.S. companies being investigated by the SEC. Then, there's the little matter of HP's hiring detectives to obtain phone records to determine which board member was leaking information to the press. That one is so egregious that the California Attorney General is investigating.

Of course, HP has come down hard on the person "responsible", board chair Patricia Dunn, by "firing" her. Actually, she is losing her high-paying chair job and being demoted to a slightly-less-high-paying director's position.

I won't even mention Enron.

How long does it take for corporations to learn? Don't they teach these people anything in MBA classes, like, say, ethics and the regulatory process?

Or has the pursuit of a quick buck taken complete charge of our business leaders to the point that having a few consumers get hurt is just part of the profit/loss equation? Is corporate ethics so completely dead that no company's leadership can be trusted?

Businesses have never been as pure as the driven snow. The nature of competition is such that people who run companies are going to do whatever they can to gain an advantage. The difference is that in times past there was more emphasis on making the product right. If you could make your product better than anyone else and do it at a good price, you were going to win the fight. Of course, the idea was to wipe out all your competition so you could charge whatever you wanted. But monopolistic practices are another discussion.

Business planning was based on supplying some sort of value, either a really good product at a high price (like a Cadillac) or a serviceable product at a low price (like a Volkswagen). This was a long-range proposition; you had to invest money today, and possibly make less immediate profit, to score big later. The philosophy has changed since the 1960's to one of immediate gains at the expense of long-term viability. Get the stock up now; we'll worry about next year when it comes. Once the stock is up, you can rake in the dough using those not-quite-regulation stock options.

My father, who was in the restaurant business, had very strong feelings about how this came about. He blamed it on the MBAs, based in part on his own experiences. He worked for restaurant management companies, the sort that ran many restaurants, often in hotel chains. As the companies began to hire MBAs, the emphasis moved, in his mind, from the quality and variety of the menus to cost-cutting measures that would look good for a while. Trouble was, in the long run, customers would drift away because the food wasn't as good as it was or because favorite menu items had disappeared.

I don't know that it's that simple. There's no doubt that corporate leadership doesn't understand manufacturing methods and products as well as it once did, but it would be simplistic to blame everything on that. It would also be simplistic to think that, in the olden days, all businesses were ethical, because they weren't. But the cut-throat behavior used to be aimed at the competition. Now it's the consumer and the ordinary shareholder who are getting it, more and more often. That's a worrisome trend.

These things have happened before with the railroad trusts, stock manipulations during the late nineteenth and early twentieth centuries, the Standard Oil monopoly, and the like. Sooner or later, there is always a backlash, from consumers and stockholders. The backlash now is evident in the ever-increasing market share of Asian companies in the auto and electronics industry.

Some might take comfort in Sony's troubles, but the Koreans, Taiwanese, and other Japanese corporations are still doing fine, making better products all the time at low prices. So American corporations had better learn that doing the right thing is going to be the only way to get a grip on domestic markets again. If they don't, well, there are plenty of others willing to move in. And American investment dollars will follow them. That doesn't even count China's potential, which should scare the heebie-jeebies out of boards of directors.

Better start thinking about the long-term effects of poor products and fast-and-loose ethics, folks, because the short-term is shorter than you think.

Monday, September 11, 2006

Don't Forward This E-Mail

Information on the Internet is subject to the same rules and regulations as conversation at a bar. ~George Lundberg

Over at Explorations, I wrote a piece about a viral e-mail that seems to pop up every August claiming that Mars is going to be at its closest approach ever and will look bigger in the sky than the Moon. This is, of course, what I will politely call “hogwash.”

“Viral e-mails” are e-mail messages that are wildly circulated among Internet users. They don't necessarily have viruses in them, but they get sent out in such quantity that they seem to breed in a virus-like fashion. Viral e-mails are dumb jokes, political character assassinations, stupid pictures, and, most often, bad information on a ridiculous scale. I think the term “viral e-mail” dates back to the 1990's, when real viruses were just starting to have an impact.

The “Good Times Virus” e-mail was arguably the first really impressive viral e-mail. In fact, most often, it wasn't even an e-mail; it was a group posting on AOL (where it apparently started) and CompuServe. Basically, the message was to not open any e-mail or posting with the subject “Good Times”. If you did, horrific things would happen to your PC, to your friends' PC's, and possibly to all living things on the planet. It was soon joined by a warning about the “Pen Pal” virus, which had the same sort of legs. Eventually, parodies of these appeared, which really did say that the offending messages would cause all the food in your refrigerator to spoil, rape your grandmother, dig up your grandmother if she was dead and then rape her, and so on. They were hilarious.

They also got picked up by some idiots who thought they were legitimate, beginning a new round of viral e-mailing.

The thing that makes these things so effective is that they almost all end with the same deadly sentence: “Send this to everyone you know!” And people did, over and over again.
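
In fact, that closing line is so dependable a marker that a few lines of Python could flag most of these things. The phrasing patterns below are my own guesses, purely for illustration:

    import re

    # A toy heuristic, not a real filter: look for the classic chain-letter
    # tell. The phrase patterns are illustrative guesses on my part.
    FORWARD_BAIT = re.compile(
        r"send this to (everyone you know|all your friends)", re.IGNORECASE)

    def looks_viral(body: str) -> bool:
        """Crude check for the 'Send this to everyone you know!' hallmark."""
        return bool(FORWARD_BAIT.search(body))

    print(looks_viral("Mars will look as big as the Moon! Send this to everyone you know!"))  # True
    print(looks_viral("Lunch on Tuesday?"))                                                   # False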

What makes these so disgusting, aside from the toll they take on mail systems, is that people actually believe some of them. For example, there's the Mars thing, which, if true, would mean serious death and destruction on Earth, thanks to earthquakes and volcanic activity. Or there was the Hillary Clinton e-mail of a year or so ago, which will probably be making the rounds again as Presidential election time rolls around. I forget the exact details, but it had something to do with Senator Clinton having defended a notorious Black Panther or some similar extremist. Trouble was that, at the time the extremist was being tried, the Senator was still in law school.

There was National Don't Buy Gas Day, supposedly a national boycott of the petroleum companies. Or, if you like oldies but goodies, there's the Neiman Marcus cookie recipe (or maybe it's brownies) that supposedly cost some woman a small fortune when she published it and was sued by the retailer. If you've got a gullible friend or two, you've gotten one.

As a public service, I'm going to let you in on the next big-time viral e-mail. Best of all, this one is going to come from, of all things, a 419 scam. I've described these before, so I won't do it again except to say that 419 scams are fraudulent attempts to get you to send money and/or your bank information to someone on the pretext that you are going to get very rich by assisting the scammer in removing copious amounts of money from a foreign country.

A new one has the potential to turn into a phony “news” piece claiming that WMD's were found in Iraq. As you will see if you follow the link, the letter claims to be from a member of the “US Marine Force on Monitoring and Peace keeping mission”, a mouthful if there ever was one. The scrappy Sgt. Tao, the supposed author of this message, was part of this group when they attacked a terrorist stronghold, killing a lot of people, and finding guns, bombs, cocaine, and nuclear weapons. Thanks to the rather strange syntax of the message, it could be that the cocaine was actually in one of the nukes, but I could be wrong.

At any rate, the good Sergeant also turned up $25 million lying about as well. If you are willing to help him violate U.S. law and smuggle the dough out of “Baghdad-Iraq”, you score 20% of the swag (or 10 years in jail, depending on your luck).

Of course, this is patent nonsense, although it will snag some slightly dishonest and completely stupid people. However, what I think is that the scam portion of this message is going to get dropped by one of the wags who starts those viral e-mails. Within days, I expect to have someone forward me “Proof Saddam Had WMD's” in which I will find the exploits of Sgt. Tao and the “US Marine Force”, complete with body count related in gory detail, and ending with explicit instructions that I should send this important information to everyone I know.

So consider yourself forewarned. I'd say the odds are good that you'll be seeing this tripe in a mailbox near you soon. Just delete it and move on. My only concern is what happens when a copy of this gets sent to a member of a certain news service.

I mean, can't you just hear Fox News campaigning for Sgt. Tao to be decorated?

Wednesday, September 6, 2006

What's In A Name?

I shall ne'er be ware of mine own wit till I break my shins against it. ~William Shakespeare, As You Like It

I was reading an article the other day dealing with the issue of who wrote the plays of William Shakespeare. To be honest, until a few years ago, I'd always thought the debate over whether Shakespeare in fact authored the works attributed to him was something of a gag, reserved for comedies of manners where someone was supposed to be a stuffy so-and-so and proved it by declaiming that Macbeth, Hamlet, Othello, and all those other works were created by Francis Bacon.

It turns out that, for some people, this business of authorship is still a live issue. A couple of years ago, there was a PBS series on the topic, which concluded that one William Shakespeare of Stratford-on-Avon did, in fact, write the body of plays and sonnets attributed to him. The September “Smithsonian Magazine”, however, brings the whole canard up again, thanks to a new exhibit of portraits of the bard, none of which, apparently, can be proved to actually depict him.

Despite all the old jokes about Bacon, it appears that the number one suspect is Edward de Vere, Earl of Oxford. Other candidates have included Christopher Marlowe and Queen Elizabeth I (who one suspects would have been a bit too occupied with fighting Spain to be knocking out “Troilus and Cressida”), but Oxford (as his friends no doubt called him) seems to be the candidate of choice.

Now I'm not going to go into all the gory details of why most people feel this is bunk; if you're interested, invest a few bucks and buy the magazine. But, the arguments center on how little we really know about Will Shakespeare, how there are no definite portraits, and how a commoner from an illiterate family could have written all these plays, some with classical themes, others with classical references, and some borrowing from other literary sources. The answers to these issues?
  • Shakespeare was a commoner; we don't know a lot about any of them. That we can learn much about him at all is reasonably amazing. Ordinary folks didn't get books written about them, and playwrights weren't regarded any more highly than, say, accountants, perhaps less so, if that's possible.
  • You didn't drop into the paint-o-mat and get a portrait knocked out. You had to pay for it, and commoners didn't have that kind of money; Shakespeare, in fact, like J. S. Bach later, was constantly trying to get money owed to him. He also wasn't all that famous in his day; he was a writer of popular plays, sort of like being a film writer today. Once he retired, other plays were produced by other authors and Shakespeare sort of dropped from view. He just wasn't famous enough in his day to rate the attention of a painter. Most, if not all, of the images of Shakespeare were done much later, when he began to gain posthumous fame.
  • Shakespeare got a grammar school education. In those days, that meant studying the classics of Greece and Rome. Topics like Julius Caesar would have been bread and butter to someone like Shakespeare; making allusions to classical poetry and mythology would have been second nature to him. As to borrowing from other literature (like Hamlet and Othello), well, as an actor he would have become familiar with works like these. Doing his own remakes would have been a normal activity, just as it is today.

I don't know about you, but it doesn't bother me one way or the other. I still enjoy the plays, and it doesn't matter a whit whether William Shakespeare or the Earl of Oxford knocked them out. Perhaps the name Shakespeare was used by many, just as Alan Smithee was used by directors who, for one reason or another, didn't want their names to appear in the credits of a film. Maybe some noblemen (or noblewomen) or some writers' collective used Shakespeare's name as a cover. It doesn't make “Macbeth” any the less riveting.

But, I really wanted to write about something else that the article brought to mind. Back when I was in high school, walking four miles to school, uphill, both ways, in the snow, killing sabretooth cats on the way, I knew a guy whom I shall call Bob, mostly because that was his name.

Bob was a genial character who was one of the primary class comedians. He was also a pretty sharp cookie, but for one reason or another, he downplayed that side of his personality. He was an okay student because that's all he wanted to be. For example, he was a very good writer, but he was technically sloppy, particularly with spelling.

Bob overcame these deficiencies as time went on, especially on the spelling front. I know about the spelling because he cleaned my clock in a long-running “hangman” match we had. We played tough rules: no words of fewer than five letters and only six misses allowed; we didn't even draw the little stick figure, just tallied the misses. I figure he would have killed me at Scrabble (TM).

Before he developed these skills, though, he was the bane of our English teacher during our sophomore year. She used to give content and mechanics grades on each essay. Bob would get a score in the high 90's for content accompanied by something in the 70's for mechanics. But he outdid himself on an essay concerning the Bard of Avon.

Consider, if you will, how many times the word “Shakespeare” would occur in a two- or three-page essay on “Hamlet” (I think it was). Consider further how a mechanics grade might suffer if, each time you wrote that name, you spelled it “Shakspeare” or “Shakespear”. That's what Bob did. I don't recall the exact mechanics grade he got, but the teacher mentioned that she stopped marking that error after about thirty of them.

It is totally ironic that, in his own time, the Bard's name was variously rendered as “Shaxspere”, “Shagspere”, and “Shaxberd”, according to the Smithsonian article. Variants of names were not uncommon in those times, particularly if you weren't the Earl of Whatever. Now, Bob could debate with the best of them, and I wonder what the confrontation between him and the teacher would have been like had he been aware of the variability of spellings. It definitely would have made for one lively class. I certainly would have taken his side.

After all, he hadn't started beating my brains out at “hangman” yet.

Monday, September 4, 2006

Web 2.0, Users 0

The Internet is just a world passing around notes in a classroom. ~Jon Stewart

A movie came out recently entitled “Snakes on a Plane”, which is about a bunch of snakes getting loose on – you guessed it – an airplane. Now this is pretty standard B-picture fare, but there was a lot of talk around the media about this flick before it was released. The talk was not about the plot or the actors' performances; it was about the Internet and how “Snakes” was going to be a blockbuster because of all the Web buzz.

There is a term called “viral marketing”, which is a fancy way of saying “word of mouth.” Word-of-mouth advertising has long been a part of making small, independent movies successful, as well as creating cult classics like “The Rocky Horror Picture Show.” But this time, the word-of-mouth was on the World Wide Web, and, by golly, you know that means big money.

Well, surprise, surprise, it hasn't worked out that way. Oh, the film had a respectable start, making about half its cost the first weekend, but the “smart money” was saying that the film would score very big because of all that free publicity on the Internet. Well, listen up, “smart money”: The Internet isn't what you seem to think it is.

Perhaps you've heard about a Japanese video game that had been translated into English very badly. The most quoted line from the game was, “All your base are belong to us.” This tortured translation became so common on the 'Net that it became a cliché, fit for use only by newbies. “Snakes on a Plane” got to be the same sort of joke, either via terrible plays on words (grilling T-bones are “Steaks on a Flame”; I didn't say they were good jokes) or poking fun at Samuel L. Jackson's constant potty mouth.

Like most things about the 'Net, the real impact is overrated. It's great for shopping, although it's also an excellent way to get ripped off, either through fraudulent vendors or having your payment info stolen. It's a huge source of information, if you know how to use a search engine like Google effectively, but it also contains sources of misinformation like Wikipedia. It has many news sources, but the pressure to put up content has even reputable sources publishing stories that are ill-founded or outright false.

The 'Net is also a great mode of communication, if you're fortunate enough to have a high-speed connection. Why, you can work from home, if your ISP allows VPN connections; if they don't, well, you can't. You have e-mail, which is wonderful if you can sort out the legitimate mail from the tons of garbage you have brought down on your head by registering your e-mail address every time a web site asks for it. Oh, and, as a bonus, you get the added feature of having viruses and trojans e-mailed directly to you by friends whose PC's have been compromised.

So, the Internet is both useful and dangerous, just like a chef's knife, and, as with a chef's knife, you can get hurt if you're not careful. Also, there are things to which the Internet is suited and things for which it may not be so well suited, just like using that chef's knife as a screwdriver is not necessarily such a good idea.

Which brings us to Web 2.0, whatever that is.

For months now, the IT intelligentsia have been touting Web 2.0. It's going to walk your dog, weed your garden, and change the oil on your car. That is, it'll do all sorts of wonderful things, as soon as someone figures out what it is. While there is all sorts of talk about the coming wonders of Web 2.0, there's a distinct shortage of actual information about what will make it different from Web 1.0.

Oh, there's the customary talk about new means of collaboration, which has been a buzzword since the earliest days of computing. Strangely, with all these supposed tools for collaborating, we seem to be taking longer and having more meetings to get done what we used to accomplish more easily without the tools.

Okay, if it's not collaboration, maybe it's the web applications. Web apps have been bandied about for several years now. Basically, you take a perfectly good application and make it run in a browser. So, instead of a small client running on your local machine, you have to run the client from a server inside your browser, which is a memory hog by itself. The browser version won't have all the features available in the locally run client software. This is called “progress”. This is working so well that Microsoft is changing the interface of its wildly popular WSUS application from a web-based front-end to a traditional management console.

(I wasn't being sarcastic. WSUS is used by local networks to automate the process of applying MS security patches. It's free and is, quite possibly, the best application MS has ever put out. However, its web front end had severe management limitations, so the console is back.)

Perhaps it's the hosted web apps, such as Google's calendar and pending spreadsheet and word processor. Well, application hosting has already pretty much faded into the sunset because no one has figured out what to do if a backhoe comes along and severs your connection to the Internet. If the backhoe doesn't get you, then you might have to contend with a server failure at the host. Or, best of all, the host could be hacked, and all your data gets exposed and sold to the highest bidder.

Tim Berners-Lee invented the Web. Well, he invented the HyperText Transfer Protocol, which you will recognize as the “http” you're always typing into your browser. When Tim Berners-Lee talks, I listen. And he thinks this whole Web 2.0 thing is “useless jargon.” According to Mr. Berners-Lee, all that's happening is that people are using a bunch of new technology to do precisely what they've been doing all along.

Apparently in an attempt to add some sort of mystique to this Web 2.0 nonsense, a conference is being held. Usually, such shindigs are open events designed to get the word out. This one, though, is by invitation only. In other words, if you aren't a Web 2.0 player, you aren't getting in. Why do I get the feeling that this is a group that's trying to figure out how to make a quick buck before “dot com bust 2.0” occurs?

No matter how hard people try to make the Internet into the be-all and end-all of information processing, it's just another tool. Finding complex ways to do simple things may make the 'Net fancier, but it won't make it better. In fact, just as computers have to keep getting faster merely to run modern bloated software as well as our old 8086's ran their software, the 'Net needs more and more bandwidth just to support the endless advertising that seems to be the hallmark (so far, at least) of Web 2.0.

Just as a bunch of people mentioning a mediocre flick on the Web will not turn it into “Gone With The Wind”, a bunch of new ways of foisting advertising on us does not constitute an improved Web. Besides, some of us use ad blockers. We'll probably miss the whole thing.

Not that there'll be much to miss.

Friday, September 1, 2006

Weeping and Wailing

There are people who have an appetite for grief; pleasure is not strong enough and they crave pain. They have mithridatic stomachs which must be fed on poisoned bread, natures so doomed that no prosperity can soothe their ragged and dishevelled desolation. ~Ralph Waldo Emerson

I don't know if other countries are getting to be like us, but I hope not, because we have become a spiritless people.

Here we are, living in one of the most, if not the most, prosperous countries in the world. Despite the best efforts of the current administration, we are still a free society. We have made tremendous strides in the last 50 years in civil rights and gender equity. Certainly we have problems, but rather than working to solve them, we spend our time apologizing and wallowing in self-pity.

Apologizing

The other night, NBC aired the Emmy Awards. The show featured a sketch, recorded earlier, that had Conan O'Brien aboard a jet that crash-landed on an island. It turned out that a commuter flight had crashed in Kentucky the day the show aired. Now, Kentucky has nothing in common with a desert island, but apparently the NBC affiliate in Lexington was outraged that the network would air the sketch. NBC obsequiously apologized. Now, had NBC apologized for inflicting Conan O'Brien on the public, that might be appropriate, but apologizing for showing a humorous sketch that doesn't depict a bunch of people being killed and has no relation to the event in Kentucky whatsoever is absolutely ridiculous.

We seem to expect apologies for everything. And when the apologies are made, they aren't considered to have sufficient groveling associated with them. Bill Clinton apologized for slavery in the U.S. Unless Mr. Clinton is a lot older than he looks, I don't believe he had anything to do with it. Not only did he sound silly delivering the apology, but many groups also decided that the apology wasn't enough, beginning demands for “reparations” (which has become a fertile spam scam, by the way).

Mel Gibson makes a complete ass of himself and apologizes. Some people apparently would prefer burning him at the stake. Mr. Gibson is a jerk for doing what he did drunk, and he may be just as big a bigoted jerk when he's sober. But he publicly apologized, so can we please let it go? If you're really that mad, stay away from his movies.

The President admits that FEMA screwed up, and he's oh-so-sorry, but over 70% of New Orleans is still without power. The Army Corps of Engineers thinks the levees will fail again if hit by a hurricane this year. I guess saying you're sorry means never having to do anything to make it right.

And that's the point. If we've got a serious problem, quit worrying about apologies and do something about it. Slavery was abolished, civil rights legislation has been passed, so what does any apology do? We need to do more about bigotry and unfairness on all sides, so let's get on with it. The guys who needed to be sorry have been dead for over 100 years. Let's quit apologizing for screwing up in New Orleans and fix it. All the regrets in the world won't do anything about those levees.

As to things like NBC's unintentional faux pas, maybe we should just grow up. In the case of Mel Gibson, maybe HE should just grow up.

Wallowing

Katrina happened a year ago; 9/11 happened five years ago. I don't think a week has gone by since without an article in the paper or a TV program wailing about how awful it all was, how our lives were changed forever, and how everything that happens is a consequence of one of those events. If we aren't moaning about disasters that happened, we're wailing over ones that might happen. Oh lordy, what if a hurricane hits New York City? Saints preserve us, is it true Yellowstone might erupt tomorrow? Omigawd, what if a big meteor clobbers us next week? Global warming or a new Ice Age is going to kill us in the next decade, poor, poor us.

What a bunch of crybabies we've become. Katrina happened a year ago; maybe if we quit moaning about it and pitched in to do something, New Orleans would be a lot further down the road to recovery. It's been five years since 9/11. Are we angry because lousy construction caused the towers to collapse? Have we done anything to deal with the problems that allowed it to happen in the first place? Nope (and, no, killing over 2,000 more Americans in Iraq does not qualify as “doing something”).

The fact is that we have become a people who spend lots of time feeling sorry for ourselves. We wallow in self-pity instead of moving forward. Ultimately, the prevailing attitude is, “Gee, that's awful for those folks, but I'm sure glad it didn't happen to me.” Then, when we do nothing to help, to assuage our guilt, we piously look at the events over and over again and feel bad each time. And while we're spending all that time looking back, the next thing comes along and whacks us in the head.

We have become paralyzed. Certainly there are potential global disasters, but we, in concert with the rest of the world, have more than sufficient resources to find ways to deal with these things. But, rather than do that, we kill each other over oil.

The Community

I don't know if the World War II generation is the “Greatest Generation” as it is often called; my vote would be for the Founding Fathers' generation. Both of them, though, went through greater trials and tribulations than any of us have known. There was a sense of community, a sense that we were a national family, ready to come to each other's aid. There was a sense that there was nothing we couldn't do if we pooled our resources and determination. The sense of community is dying. The determination is gone. Both have been replaced by a what's-in-it-for-me attitude of pure selfishness.

There are big challenges ahead, but we're going to have to grow up and face them squarely. If the current population had been faced with two world wars bracketed around an economic depression, I don't think we'd be sitting here doing blogs, downloading music, or watching DVD's. We'd be living in caves, banging rocks together for entertainment.

It's our choice: Come together or suck rocks.