Wednesday, December 20, 2006

A Yankee Amongst the Magnolias – 5

The origin of grits: Cherokee Indians, native to the Southern region of the United States, first discovered grits trees growing wild during the thirteenth century. Chief Big Bear's squaw, Jemima Big Bear, is said to have run out of oatmeal one day, so she gathered the tiny grits growing from the grits trees and cooked them in water for Chief Big Bear. After eating the grits, Chief Big Bear ordered his squaw, Jemima, burned at the stake. ~ Lewis Grizzard, Don't Sit Under the Grits Tree with Anyone Else but Me.

The other day a coworker, a lady who is a native Alabamian, commented on how Northerners never really pick up Southern accents. We try, but we fail miserably. Anyone, even a non-Southerner, can immediately detect that hokey drawl, along with the incorrectly used “you-alls”.

(Note to Northerners who do not wish to embarrass themselves in Atlanta or Birmingham: It's “y'all” not “you-all”, and it's only used when referring to a group of people, not individuals. You don't say to an individual, “Have you-all been here before?”)

In fact, Northerners from heavily accented regions like New England or New York never completely lose their own distinctive, if somewhat abrasive, accents. My coworker noted that the opposite was not true. Southerners who move northward seem to lose their own accents, unless they happen to speak to other Southerners. I've witnessed this on my own. When I lived in Ohio, I knew a couple of relocated Southerners, and the longer one spoke to them, the fainter their accent got. When they were on the phone to a relative or talked to each other, though, their accents came back in force.

I think this has to do with the innate politeness that pervades the South. For example, when a Northerner is attempting a Georgia accent, the average Atlanta native will not start laughing out loud. The genuine Southerner will simply say, “You're not from around here, are you?” When the Southerner goes North, he or she is aware that the gentility of Southern speech stands out amidst the “youse guys” and “warsh your hands” of the locals (I was nine years old before I was sure that “wash” was not spelled with an “r” in the middle). So, the Southerner blends in, adding to everyone's comfort level.

Perhaps the one subject on which they allow themselves to pull the Northerner's leg is grits.

When I first came to Alabama, the company where I worked was a division of a corporation headquartered in Erie, Pa. Whenever we visited potential vendors, one of us from Alabama would meet up with a buyer from Erie and go to the site. Generally, our vendors were located up North. On one such trip, I sat down at breakfast with Bert, the buyer, who looked at me sternly and said, “Don't you dare order grits!”

Of course, I knew better than to do that in Minnesota, but I was curious as to why. It seemed that Ed, one of my coworkers who also came from Ohio, had a penchant for tweaking the Erie buyers by ordering grits at every opportunity.

Now a real Southerner wouldn't do that. What a real Southerner does is confuse Northerners who come South.

I had moved on to another company down here that was regrettably owned by a Northern corporation. Periodically, we would be invaded by managers from New Jersey. One day at lunch, one of them commented on the fact that, when he ordered some bacon and eggs for breakfast, he was served a side of grits. I attempted to explain to him that grits are a staple of the Southern breakfast; if he had ordered coffee and a slice of pie, he'd have gotten a side of grits. That's just the way things are down here.

“Yeah, I understand that, but what I'm wondering is, what are grits, anyway?”

Before I could speak, our personnel manager, a native of Montgomery, spoke up and said, “They mine them.” At the manager's puzzled look, she launched into a fairly detailed discussion of the mining and grading of grits.

The next manager that visited got the grits tree treatment.

To be honest, I don't really know where grits come from, other than being reasonably sure that trees and mines aren't involved, but I do like them. I do, however, take a certain amount of abuse about the way I eat them. Traditionally, Southerners like their grits with a little salt and pepper and a whole lot of butter. They might add cheese or bacon crumbles (real bacon, not those wretched things you sprinkle on salads), but they seldom sweeten them.

I prefer sugar on mine.

The first time I did this in the presence of some of the native-born folks, one of them merely rolled his eyes heavenward and sighed, “Yankees just don't understand grits.” Another one said to him, “Now don't get riled. At least he appreciates 'em, even if he doesn't know how to eat 'em properly.”

Then he looked at me and said, “You jest enjoy them grits any way that suits you.” And I did, and still do.

I do understand one traditional Southern favorite, though, and that's corn bread. I have actually seen recipes prepared by Northerners that contain sugar. Sweet cornbread is not real cornbread. If you're going to have cornbread with your jambalaya or barbecue, by golly you don't need sweet bread. I also understand that the only proper way to bake cornbread is in a cast iron skillet that's coated with bacon grease and preheated in the oven so that the batter sizzles when you pour it in.

I may eat grits wrong, but even my Southern friends will admit I understand cornbread.

Postscript: A day after I originally wrote this, I was walking by a local eatery which proclaimed as one of its lunch specials, "Fish and Grit". Just the one, but, man, it's huge.
More about being a Yankee Amongst the Magnolias (with links to earlier episodes)

Wednesday, December 13, 2006

High Anxiety

What, me worry? ~ Alfred E. Neuman

CNN, hyping an article in Time Magazine, another Time-Warner outlet (sorry, the link is gone), advised that Americans worry too much and about all the wrong things. Some of their examples:
  • We are scared to death about avian flu, which has killed exactly no one in the United States, while neglecting to get shots against the regular forms of flu, which do people in every year.
  • We won't buy spinach because it may be contaminated by E. coli, but we'll stuff our faces with greasy burgers and artery-clogging lardshakes.
  • We fear catastrophes but avoid doing anything in advance that could mitigate the effects of such disasters.
We may not be number one in a lot of things any more, but I am reasonably sure that America leads the world in neurotic worrying. Of course, it's articles like the one at CNN that make us this way. The media has learned that if you want to have readers or viewers, throw a scare headline out there.

“Next on Newscenter! Strawberry shortcake: Dessert or terrorist weapon?”

"When we come back, the latest in our series on dangers facing our children. Tonight: Jawbreakers."

“Don't miss the next report in our continuing series, '1001 things in your home that will kill you!'” (Actually, I think they've done this one.)

And, of course, the ever-popular, “Today's thing that causes cancer!”

The only thing that gets more attention is a series on sex, usually disguised as either social commentary, important health information, or criticism of all the sex on TV (with lots of teaser scenes from shows on that news program's very network). But, the sex stories also cause worry, hinting that our daughters are going to become prostitutes, implying that we're all going to get AIDS, and pretty much calling anyone who watches a top-rated TV show a pervert.

It's not that we shouldn't worry about things. We just don't prioritize well. The ordinary person can do little about global warming, but he or she can do something about hunger in the community. The average Joe can't do much about international terrorism, but Joe can go to the polls and vote against those who are wasting our tax dollars uselessly in the name of “homeland security” while attacking our rights to life, liberty, and property.

We should be concerned with the well-being of our families, but we should do the right sorts of things to minimize our risks and then move on with enjoying our lives. And therein lies the problem, I think. The average repressed American feels guilty about enjoying life. I mean, people are working long hours, often both partners in a household, so they can enjoy all the material pleasures that such work can provide. Then they feel so guilty about taking a vacation that they find ways to stay in touch with their problems via cell phone and/or the Internet. It's natural, therefore, as part of our national guilt trip, that we should take on all the worries that the media dish out.

Well, maybe people should consider some of these things instead.
  • In spite of the best efforts of the economic geniuses in Washington, we still enjoy the highest standard of living in the world. We also have so many resources, we can hold out a helping hand to those in need.
  • Terrorism is a decided problem, but there is an infinitesimal chance of some ragged weirdo building a nuclear bomb (dirty or otherwise), mass-producing sarin, or conducting biological warfare.
  • Yes, there's a lot of sex and violence on TV and scattered around the Internet, but there are also thousands of alternatives that can educate, entertain, and, yes, even relax us.
  • You really don't have to stay in constant touch with the world when you're taking time off. Honest. The world will get along without you for a while.
The trouble is that, even though things are rarely as dire as they seem, Americans must worry. Therefore, I would like to suggest some lower-stress worries. If you must worry, worry about these things.

  • Will the Cubs ever win a World Series?
  • Just how many times can one stand hearing “White Christmas” over the store Muzak system?
  • How will the expansion of the universe affect my shoe size?
  • Sure, e-mail spam is bad, but will velveeta come back?
  • Why can't I let the water out of a leaking boat by opening another hole for the water to go out of?
  • How badly will Ohio State beat Florida in the BCS championship game?
I mean, how low-stress can you get?

Monday, December 4, 2006

The Quality Conundrum

“Quality” is best measured by those who “use” a product rather than by those who make it. ~ Hunter S. Thompson

Sony is making another recall, a minor one, involving camera display screens, which do not appear to be catching fire or anything. Sony, as everyone well knows, has been shooting itself in the foot with an AK-47 over the last few months. First there was the rootkit fiasco. Basically, Sony DRM-protected CD's required that you place software on your machine to play them. That software, it turned out, behaved in a manner that allowed system files to be replaced, not only by Sony (which they denied was ever their intent), but by any hacker who knew where to look.

After that shot to the old corporate image, there was the battery recall that impacted a wide swath of laptops, including Sony's own Vaio. The problem was tiny little metal bits in the battery that ultimately caused the battery to overheat and rather spectacularly burst into flame. What made the situation even more damning was that Sony evidently was aware of the problem and even discussed it with Dell (who got the lion's share of these fire bombs).

None of this is particularly newsy, except that in one of the reports on the most recent recall, someone suggested that Sony's quality control people weren't doing their job.

As a former quality professional, I take umbrage at that sentiment. It raises my hackles -- and you know how painful that can be.

I started in quality control in 1974 and kept at it until I was “re-engineered” out of a job in 1994. To show just how stupid I was, after getting the boot in '94, I actually tried to find another job in quality for a couple of months, before I had an epiphany that it might actually be time to do something I enjoyed doing. If you wonder why it took so long, I suppose it's rather like the guy who's banging his head against a wall. When asked why he keeps doing that, he responds, “Because it feels so good when I stop.”

Boy, has it felt good.

At any rate, I wanted to make a point about how things like the Sony battery can happen. I don't know that this is what actually went on, but given the discussions with Dell mentioned above, I think I might be on the right track.

Quality Control departments don't catch everything that isn't made correctly. There are two reasons for this. First, most of the time it's prohibitively expensive to inspect every single part. With automation techniques, it is sometimes possible to test each product, but even those tests would not detect a problem that takes time to develop. Second, Production people will try to hide defective product. Yes, they will. The problem is that their goal is to generate quantity, and stopping to inspect everything or, worse, remaking a large quantity of parts due to a rejection gets in the way of that goal. For all the proud words companies spout about the importance of quality, their production supervisors know full well that they will get a mild chastising for bad products but big rewards for making high production levels.

If it wasn't like that, you wouldn't need quality control departments in the first place.
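As to that first reason, the arithmetic of sampling is unforgiving. Here is a minimal sketch, using entirely made-up lot sizes and defect rates (nothing from any plant I ever worked in), of how easily a sampling plan can wave a bad lot through:

```python
# Illustration only: the chance that a random sample drawn from a lot
# contains zero defective parts, so the lot sails through inspection.
# The lot size, defect count, and sample sizes below are hypothetical.

from math import comb

def prob_sample_all_good(lot_size, defectives, sample_size):
    # Hypergeometric probability that a sample drawn without
    # replacement misses every defective part in the lot.
    good = lot_size - defectives
    if sample_size > good:
        return 0.0
    return comb(good, sample_size) / comb(lot_size, sample_size)

LOT_SIZE = 10_000    # parts in the lot (made up)
DEFECTIVES = 50      # a 0.5% defect rate (made up)

for sample_size in (13, 50, 125, 500):
    p = prob_sample_all_good(LOT_SIZE, DEFECTIVES, sample_size)
    print(f"sample of {sample_size:>3}: lot accepted untouched {p:.1%} of the time")
```

Even the 500-piece sample waves that lot through a noticeable fraction of the time, and nobody is going to pay to inspect all 10,000 parts, which is how a perfectly diligent inspection department still lets the occasional bad part out the door.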

So, what I'm saying is that sometimes, stuff gets out that shouldn't have. Sometimes, though, the quality people discover a defect issue and notify management, which proceeds to make a “management decision” to ship the stuff anyway.

Now, I can tell you in all honesty that many of these decisions are okay. Customers will over-specify products with a vengeance. When I worked in the rubber industry, I used to spend a lot of time trying to explain to customers that rubber parts are not like steel parts. Most of them are squishy. You can't hold the kinds of tolerances on a squishy material that you can a rigid one. Many's the time I decided on my own to let a washer go which was slightly too thick because I knew how the part was used, and I knew that being .001” over spec wasn't going to cause a problem.

But, I've also been a party to some bad decisions. At one company, I was overridden by a sales manager three times and by the president once. None of these involved life-threatening characteristics, but functionality could have been impaired in each case. Each one of those shipments was rejected by the customer. After a while, they stopped overruling my calls.

For the record, I often went to the sales manager myself to see if a customer might accept a condition, and on those occasions, he would call the customer and get a waiver. Those three occasions where he overrode me all came on the same day, when the company was trying to generate some nice year-end shipping numbers.

Sony knew about the problem; their big customer Dell knew about the problem. The implication is that the quality people did their job, and the managers made one of those “decisions.”

In all the years I was in quality, I was with a company that had to deal with one of those situations (a really serious defect that could cause injury) only once. A defect was discovered during routine testing that showed that a manufacturing procedure had not been followed properly. To that company's enduring credit, it spent around $400,000 testing parts, remaking parts, revising procedures, and certifying the production employees involved in the key process, to make sure that none of the defects had gotten to customers and to ensure that the problem didn't occur again.

I don't know how much it would have cost Sony to toss those defective batteries, but it would almost surely have cost a fraction of what recalling and replacing them cost, not to mention the cost of the bad press. But, I'll bet if you go looking for the people who let the situation get so far out of hand, you will find that the quality control group had been screaming like their hair was on fire long before “management decisions” were being made.

Quality control people have a tough job; they're the bad guys most of the time and get little or no credit when things go well. At one company, we were running one of those bad patches where a department was generating a lot of bad product. Our inspectors were identifying the problems, but the production supervisors just continued to run the jobs, with the result that every lot had to be inspected or reworked and a lot of stuff had to be remade. Finally, my boss and I held a council of war. We considered arming the inspectors but decided that bloodshed wasn't the answer. Eventually, my boss got the president to issue strict instructions to the head of Production that all shutdown notices from inspectors were to be honored immediately. Anyone failing to do so would be subject to disciplinary action.

Well, lo and behold, scrap and rework went way down, processes got corrected, and, just as the quality people keep saying, productivity went up. So who got the credit? The president sent a letter to the Production Manager praising him and his supervisors for the great job they did – said “great job” consisting of following the procedures they had been violating. The Quality Department? We got squat recognition.

You know the worst part of that story? It happened around 1976, yet I was dumb enough to stay in that profession for 18 more years. Oh, well, it could have been worse.

I could have stayed in quality and gone to work for Sony.

Monday, November 27, 2006

Gartner Fancies

Computers make it easier to do a lot of things, but most of the things they make it easier to do don't need to be done. ~Andy Rooney

Those wacky folks at Gartner are at it again. You know Gartner, the people who said a company's annual cost of owning a computer was $25,000. Bob Lewis, writing a column for InfoWorld at the time, showed that, using Gartner's methods, the annual cost of ownership of a Daytimer was $3000. That's right, the thing you write in with a pen and carry in your briefcase.

Now, Gartner's been around for a long time, and they have surely made some savvy predictions, if only accidentally. After all, when you make as many prognostications as they do, you're bound to hit one once in a while. But it seems that the ones the news media pick up are the ones that give one pause, if not outright belly laughs. The Register is reporting that Gartner thinks that the next few months will see the biggest changes in computing in a generation. Windows Vista is going to be disruptive to the organization, new hardware is going to change the price-performance relationship, users want more freedom while IT wants more control, and blah, blah, blah.

I presume they issued this same report when Windows 3.1, Windows 95, Windows 98, Windows 2000, and even Linux came out. In fact, the fundamental issues of computing haven't changed much in the last 10 or 15 years.

In the beginning, computer work meant banging away on a terminal, and relatively few people within a company actually had one. Most of the ones who did were data entry types or production control workers. At more advanced companies, you might even find Computer Aided Design (CAD) rigs with digitizer tablets the size of a card table. Meanwhile, the rest of us were using ledger paper, graph paper, and calculators. In fact, to do all the graphs I used to do as a quality control engineer, I had a really neat set of colored pencils which were better than any Crayola set I had as a kid.

PC's started creeping into companies during the mid-1980's. Sometimes it was a finance department wanting to use spreadsheet software to do what-if projections easily, sometimes it was technical departments wanting to crunch numbers with software more flexible than anything the mainframe offered. Initially, the PC's came in without IT blessing. In fact, the IT people couldn't have cared less. As long as these guys were playing with the new toys, they bothered the programmers less, so that was good.

Eventually, PC's got more affordable, and people started creating ad hoc networks, sometimes with the cooperation of IT. People who never had a terminal might now get a PC. And if one person got a PC, someone else had to get one, and so on until most everyone had one. That was all right, until people began to play with the software.

For those few who had been working on the terminals, there were few options to play around with. The mainframe applications had their basic functionality, and the user could live with it or get out the ledger paper and colored pencils. Suddenly, with the PC, here were users running programs that gave them all sorts of options to play with. There also seemed to be software to do most anything anyone wanted to do. The trouble was that people spent more time fooling around with the software than they did actually generating output.

They also started playing games. Then came Internet access, and what little productivity was left went into the tank.

The computing vicious circle also began. Computer makers would bring out new hardware, and software makers, dominated by Microsoft, would bring out new software and operating systems. No sooner did you get hardware that could run the old stuff than you got new software that needed new hardware again. We've been in this cycle for years now; Vista is just the newest iteration.

But there was a more subtle problem. For reasons which I have never understood, people seem to feel that the computer on their desktop is their personal property. The desk, the phone, they belong to the company, but it's the user's computer.

It goes back to those aforementioned early days of PC's, I guess, when users could finally control their computing environment, as opposed to being limited to the mainframe's few options. Even when PC's became more widespread, individual departments got whatever software they wanted. Individual users within departments got their own flavor of software to put on "their" computers, and they weren't interested in using anything else. IT has been trying to get that horse back into the barn ever since.

Gartner seems to have finally discovered that fact.

If I had been transported from the first network environment I was ever in to one of today's modern networks, I know I would have marveled at the new technology and at the Internet, but the thing that would amaze me most is how little things have changed between 1986 and today. Sure, I would find it striking how Microsoft had wiped out most of its competition, and how many network servers companies now use. But as to the way people use computers? It would seem quite familiar.

Something is going to occur one of these days to change the computing environment, but, whatever it is, it isn't on the horizon yet. Maybe Linux will finally offer a real alternative to Windows, maybe companies will realize that bloated "productivity software" is actually wasting employees' time and computing resources. Whatever the change will be, history shows that the pundits, who have predicted the "year of ISDN", ATM to the desktop, and thin clients all running Java applications, haven't got a clue.

Who knows? Maybe someone will decide that large centralized servers are the wave of the future. Might even call them mainframes.

Monday, November 20, 2006

A Yankee Amongst the Magnolias – 4

Wherein our hero discourses on the wonders of Southern collegiate football.

To play this game you must have fire in you, and there is nothing that stokes fire like hate. ~ Vince Lombardi

Given that this past weekend featured any number of well-known college football rivalry games, it seems an appropriate time to consider the difference in attitude toward the fall's premier sport.

I lived in Ohio for over thirty years, and, like most Ohioans, I rooted for Ohio State to beat Michigan each year (well, nearly every year; Big Ten teams only played 8 of their conference opponents each season, so every 8 years or so a given pair didn't meet). Even though I've moved to Alabama, I still like to see the Buckeyes come out on top. This year's game got a lot of national attention because OSU came in ranked number 1 and Michigan came in as number 2, meaning that the winner was going to the BCS championship game.

The sports networks made a huge deal about the game, especially ABC-ESPN, who were showing the match. It was supposed to be the most important game ever between the two. Every time they interviewed an ex-player or coach about it, though, the reaction was the same. Listen, they would say, back in the day, the winner of this game generally went to the Rose Bowl, while the loser went home. Period. The Big Ten (and the Pac Ten) wanted the Rose Bowl to be special, so none of their teams could go to any other bowl games.

Eventually, of course, money began to talk, and bowl games began to proliferate, so that changed. But, the idea of the rivalry game being a big one has always been huge based on the consequences of winning and losing. So, adding the championship game into the mix was no bigger deal than the olden days.

Even given that titles were perennially on the line for OSU and Michigan, the buzz for the game didn't really start to build until a week or two before the game. Once the game was over, that was that for another year. Thoughts turned to the Rose Bowl, and once that was over, we began to think about hockey and baseball. It ain't that way down here.

My first inkling about how serious people are about football in Alabama came the first week I arrived in April of 1985. I was reading the morning newspaper and got to the sports section expecting to read about the upcoming opening of baseball season. Instead, the paper was full of college football news: Recruiting news, spring practice news, all kinds of football news. Baseball was relegated to a small section on part of one page. I began to suspect there was something different about sports attitudes in the South.

I got serious reinforcement at my new job, as I heard people in the spring of the year already arguing over the superiority of Alabama or Auburn in the coming annual Iron Bowl game. I presume the name “Iron Bowl” came about because the game was played for many years in Birmingham, once the steel capital of the South. At any rate, I found that I would be asked if I followed college football. Once I answered in the affirmative, I was asked whether my allegiance was to Auburn or Alabama. The answer, “Well, actually, I'm an Ohio State fan,” was not acceptable. I had to state a preference or be regarded as some sort of non-football-appreciating pariah.

In case you're curious (and anyone from the state of Alabama is), I chose the perennial underdog, Auburn. Of late, that's looked like a good choice, hasn't it, Tide boosters?

This year's Iron Bowl was for pride only, with neither team having a chance at the SEC title game or a national championship. I can assure you, however, that the bulk of Alabamians were watching the Iron Bowl Saturday rather than the big one in Columbus, which is where my attention was focused. They're far more serious about their rivalries here.

It's not just the Iron Bowl. Both teams have other rivalries, Alabama with Tennessee and Auburn with Georgia. People worry over those games almost as much as they do over the Alabama-Auburn tilt. But, every Southern college has one really, really serious rivalry, and it is a subject of discussion, debate, and downright arguing 365 days a year (366 in leap years).

I don't really understand it. Football is a sport associated with cold and snow, at least to anyone living north of Memphis. I can remember freezing my nether parts off watching many a high school, college, and pro game. Down here, they postpone games because it's raining. Of course, down here, games are played in heat that will cause the football to sweat, so I guess things have a way of evening out.

It still doesn't explain the football madness that pervades the South. Texas is legendary in its mania for high school football; there are high school stadiums that some colleges would kill for. As mentioned above, people follow the recruiting circus year round, looking to see if the home team can sign that hot quarterback from some bitty high school in Arkansas.

My guess is that it has to do with the lack of professional sports that the South endured for so many years. Today, there are pro baseball, football, basketball, even hockey teams throughout the southern United States, but for decades, pro sports were the domain of the North, the Midwest, and the West. Down here in the land of magnolias, grits and barbecue, it was the colleges that provided the sports fix for people. Interestingly, it was only the outdoor sports that really caught on. Except for the Carolinas, basketball was something to do while you were waiting for baseball (and spring football practice) to start. In recent years, basketball has gained in popularity, but even last year, we were treated to the Alabama basketball coach pleading for more fans to show up.

So, ultimately, a Southerner's first love is still football. It's hard to understand how this land of gentility and charm has given its soul to a game that is based on collisions, concussions, and general mayhem. Well, I think that's the point. Southerners are unfailingly polite and friendly people (okay, in Craigsville, Virginia, they shoot first and ask questions later, but that's atypical), so they have to put their aggressions somewhere. What better place than on a football field? For the fan, it brings together the elements of Southern sociability (tailgating and game parties) with that universal characteristic of mankind, violence (the game itself).

I don't know if any of that is correct, but I do know that when I get back to work, the discussions will not be about Ohio State going to the BCS national championship game. People will still be replaying Saturday's Iron Bowl and already talking about how next year's game will go. Me, I'm focused on Ohio State ... although I would like to say, for the record, War Eagle!

If you don't understand why it was necessary to add that last, you don't understand Southern football.

More Southern musings:
Southerners are nice people
Human Nature overcomes all
Northerners just don't understand

Wednesday, November 15, 2006

Left Out in New Jersey

If winning isn't everything, why do they keep score? ~Vince Lombardi

Okay, let me see if I've got this straight. There are currently four undefeated Division I-A college football teams: Ohio State, Michigan, Rutgers, and Boise State. According to the Bowl Championship Series (also known as the BCS or “that stupid piece of ...”) poll, they are ranked 1, 2, 6, and 12 respectively. According to the computer portion of the BCS, they are ranked 3, 1, 2, and 10 respectively. As things stand now, once Ohio State beats Michigan, there could still be three undefeated teams, but only one of them, Ohio State (oh, okay, or Michigan if they should defy the gods and win in Columbus), will get into the national championship game.

Now, Boise State, which plays its home games on an azure blue field, is in a weak conference and doesn't play any significant non-conference foes, so one could understand how they might be excluded from the party. But the State University of New Jersey at Rutgers, as it is properly known, is getting the shaft.

Here's a team that is 9-0. Louisville was ranked number 3 when they met Rutgers and staggered home after losing to a team that was outmanned, out-talented, and outgunned. All Rutgers did was win. This was the same Louisville team that trashed West Virginia the previous week, when West Virginia was ranked number 3. Yet most people are saying that West Virginia will beat Rutgers.

What they really mean is that they hope like the dickens that West Virginia beats Rutgers. If they don't, we're going to have the embarrassment of an undefeated team that played some of the same opponents as teams above them, and beat them more impressively, yet will be left out of the championship picture.

Some of the rationale for promoting teams with 1 loss ahead of the Scarlet Knights goes like this:
  • Rutgers is in the crummy old Big East Conference. Last time I checked, there were three teams from the Big East in the top 10. There's only one from the PAC 10, that supposedly mighty conference that USC waltzes through each year. Only the Big 10 has as many in the top 10 as the Big East.
  • Rutgers doesn't have as much talent as Ohio State or USC or insert-name-of-traditional-powerhouse-here. You want talent? Auburn has talent; they got whupped by Georgia, a team that can barely get out of its own way this year. I thought the idea of football was to win games, not to look good on scouting reports. If you go 9-0, there's some sort of talent on that field.
  • Rutgers can't beat Ohio State in a playoff game (no, really, I actually heard this one). Well, it's entirely possible that no one can beat OSU in the championship game, assuming they get there. So far, 11 teams have failed to beat OSU. If you used that criterion, according to most sports pundits, OSU should show up at the championship game and just conduct a scrimmage. But, after the Louisville game, no one should be saying that Rutgers has any worse chance than anyone else.
  • Rutgers is Mr. Magoo's alma mater. Well, maybe that one hasn't come up, but that's about as good an argument as the others.
The human pollsters are mostly to blame, but the BCS is at the root of the problem because, as it is currently constituted, there's no playoff structure to give a Rutgers a chance.

The BCS apologists always go on about how well the system works. It works so well that it's been changed every year since its inception. Simply put, it's no better than the “mythical” championships that were handed out over the years by AP, UPI, and other press outlets, magazines, and networks. What makes the BCS worse is that it masquerades as some sort of “real” title game. Hell, it's not even what it calls itself; it's not a championship series. It's one game based mostly on two human polls with a computer poll thrown in just to confuse things.

The BCS has given us such farcical situations as:
  • A one-loss Miami team not getting to the championship game in 2000, while a one-loss Florida State team did. Who beat Florida State? Why, Miami, of course.
  • In 2001, a team ranked number 4 in the human polls, Nebraska, going to the championship (and getting pounded) over number 2-ranked Oregon. Why? Strength of schedule, even though Nebraska didn't even win their conference, and Oregon played in the mighty PAC 10 (which is now supposedly the toughest conference in the country).
  • Oklahoma loses its championship game but plays LSU for the title in 2003. USC was ranked number 2 in both polls, but, whaddya know, that ol' weak sister PAC 10 conference bit them again in the BCS poll.
  • Auburn, 13-0 in 2004, not only didn't get to the title game, despite playing in what most people regarded as the toughest conference in the nation (the SEC), but also watched Texas get voted past them (thanks to politicking by coach Mack Brown) in the final poll.
Now, the lame apologists will say that this sort of stupidity makes for spirited discussion and arguments among fans, which is somehow good for the game. Well, here's a bulletin for you guys: People have been arguing about the national champ ever since Parke-Davis called both Princeton and Rutgers national champions in 1869. We don't need any more arguments; we need a playoff system. And, please, don't give me that stuff about adding games to the schedule. The NCAA saw no problem with giving teams a 12-game schedule, allowing teams in a conference with a title game to play 13. So, what are two or three more in a playoff series?

After Ohio State, the team I'll be rooting for most will be Rutgers. I don't know if it's possible to embarrass the NCAA, but leaving a school that beat two top-10 teams to stay undefeated (assuming they get by West Virginia) on the sidelines while some team with one loss is playing for the championship ought to do the trick. Then, perhaps the school presidents will think about doing what much smaller schools and all other Division I sports have been doing for years: Determine the champion in a playoff.

I won't be holding my breath.

Monday, November 13, 2006

Election Musings

Politics, it seems to me, for years, or all too long, has been concerned with right or left instead of right or wrong. ~Richard Armour

So, we've survived another political season, although the next one has already started, since someone I never heard of announced himself as a candidate for the Democratic nomination for the presidency. The city I work for will have a mayoral election next November, so candidates for that post will start announcing themselves any day now, too. The political season never really ends anymore.

At any rate, as a card-carrying pundit, commentator on the passing scene, and the best blogger in my house (the Daughter has her own place, where she, presumably, is the best blogger), I feel I must utilize my wit, wisdom, and incredible command of the English language to offer a few thoughts for your consideration.

Pardon me while I admire the incredible construction of that last sentence.

Oh, well, back to the mundane stuff. Incumbent governor Bob Riley easily beat Lt. Governor Lucy Baxley. The campaign was a most unusual one for Alabama in that it was conducted with restraint by both combatants. Lt. Gov. Baxley was a huge underdog, which usually means below-the-belt ads in a vain attempt to close the gap. But both candidates seemed to limit themselves to the usual “the other guy is going to raise taxes, while I'm going to lower them”, which no one believed, but at least no one was calling anyone a fascist or a socialist.

It was a little strange that the Lt. Governor ran for governor this time around. Gov. Riley has done a quiet, creditable job, with no scandals or hints of scandal during his first term. Given some of the recent tenants of the governor's mansion, Bob Riley was a pleasant change. It's hard to imagine that he would have been vulnerable. Conventional wisdom would have suggested that Lt. Governor Baxley should have waited until the next election, which would have been wide open (in Alabama, a governor can serve unlimited terms, but only two consecutively). But the Democrats lacked a good candidate, given that former Governor Don Siegelman was being tried for various improprieties at the time (he was ultimately convicted), and no one else seemed willing to lose to the incumbent.

It would be nice to see Lt. Gov. Baxley run again next time, but it's tough to stay in the public eye for four years when you're no longer in office.

The incoming lieutenant governor's race was a veritable joke by comparison. In this corner, we had Luther Strange, a former lobbyist, carrying all the baggage that implies, especially with the Abramoff mess being exposed on a daily basis in Washington. In the other corner stood former governor “Little” Jim Folsom, Jr., son of Big Jim Folsom. Big Jim was the legendary former governor who once exhorted voters to vote for him for a second term because he had stolen all he was going to, while a new governor would just start stealing all over again. I don't know if that statement is true, but I've heard it from so many Alabamians, there must be something to it.

Jim Jr. has his own history. He was lieutenant governor during the Guy Hunt administration and became governor when Hunt was found guilty of ethics violations (misuse of some campaign money to pay for inaugural events and using the state airplane to fly around the country preaching; not serious stuff, but not legal, either). Mr. Folsom proceeded to blow his opportunity to be elected in his own right when his relatives started flying around on personal trips using the state plane.

I think Gov. Riley had the thing grounded.

At any rate, Little Jim pulled out the election, proving that Alabamians, like any sensible voters, would rather have a somewhat inept son-of-a-corrupt-politician than a lobbyist.

Oh, and the big news, of course, is that the Democrats have taken control of Congress. It's only taken six years of getting shafted by the most corrupt administration since the Teapot Dome scandal for Americans to realize it might be a good idea to make some changes. Frankly, given the pathetic state of the war in Iraq (which many notable Democrats did nothing to oppose when they could have) and the egregious handing over of power to corporate interests, the Whigs could have made a comeback.

The problem is that there is still no sign that the Democrats actually have any sort of program of their own. Anything they do pass will be vetoed by President Cheney, er, Bush. So, we've got two years of posturing, rhetoric, and general gridlock to look forward to.

It's time to revise the electoral system. We don't need a two-party system; we need a no-party system. Since neither of the big parties has a meaningful platform, they are largely irrelevant. Let's go to an open election process where individuals run against individuals. Basically, we could have non-partisan primaries, with the top two candidates going into a runoff. Let's cut the confusion over moderate Democrats trying to act like Republicans and liberal Republicans trying to act like Democrats. No more party campaign war chests to pump out supposedly “issue-oriented” ads actually aimed at specific opposing candidates.

Let's end the nonsense of a party controlling Congress. Elect the Speaker of the House and the President Pro Tem from the full House and Senate, instead of getting whatever hack the party-in-charge wants to hand the gavel to. Committee assignments? Draw 'em from a hat, then let the committee members elect a chairman. It can't work any worse than the system of handing out favors that exists now.

Oh, and dump the electoral college while we're at it.

Of course, it would be tough on the voters, having to decide what a candidate actually stands for, rather than just blindly voting the party line. But, political parties are an anachronism, dating back to when the populace was uneducated and politically naive. There's no excuse for that now, with TV, newspapers, and the Internet providing reams of information.

Sure it's a crazy idea, but, then, so was the American Revolution. Maybe it's time for another one, this time against the fat, dumb, and happy politicians lousing up a beautiful system.

It could happen.

Wednesday, November 8, 2006

Feel the Excitement

It is better to have a permanent income than to be fascinating. ~ Oscar Wilde

The Inquirer had a story about a software company in the UK running a competition to find the nation's most exciting accountant. Now, the term “exciting accountant” would appear to be as much of an oxymoron as “jumbo shrimp” or “military intelligence”.

I've known a lot of accountants over the years, but I can honestly say that I never met one that I could consider “exciting.” In fact, I can't think of one that was even moderately interesting compared to his or her fellow number-crunchers. They were all uniformly staid, bland, and accountant-ish. I suppose, though, that there must be one or two actuarial types who might be construed as less unexciting than their colleagues, possibly engaging in wild and crazy outside activities like philately.

Well, I did know one that had seven kids, but I'm not going to go there.

There are certain professions that seem to draw the more sedate types (to choose the least offensive term I could think of) to their ranks. For example, before I entered the high-energy world of network system administration (lord, it's such a rush deleting someone's files), I spent over twenty years in Quality Control. Let me tell you, I may have picked on the accountants above, but the most boring person at Price-Waterhouse (not counting those under indictment) is a virtual wild man compared to the average QC practitioner.

Perhaps it's more a matter of the profession taming the employee. Both QC and accountancy involve a lot of number processing, maintaining of charts and reports, and following copious and involved procedures (and inflicting them on your fellow employees). No one likes such people because their main function seems to be to hinder people from having a good time, either by preventing the spending of money or by making people remake products because they're defective. Accounting outranks QC, though, because Quality Managers don't inspect the work of the bookkeepers while the bookkeepers can stop QC from having the only fun they can have, buying new measurement equipment.

When you have no friends, you become a dull person.

All of this depressing prelude gets me to thinking about where other professions stack up in the excitement realm. Oh, sure, some jobs obviously attract exciting people and make dull people more interesting by their very nature. For instance, I can't imagine a dull test pilot, Mt. Everest guide, or race car driver (although Ryan Newman comes close). So, as a public service to all the teenagers out there who are surfing blogs looking for salacious material but have had the misfortune to arrive here and had the tenacity to have read this far, I'd like to provide the definitive Gog's Blog What's Exciting and What's Not Career Guide. Within each field, I will list the Potential for Exciting People (PEP) and the Welcome to Dullsville (WTD) jobs.

Science PEP jobs:
  • Physics: Deals with cosmic questions and energy sources sufficient to vaporize Detroit.
  • Chemistry: Explosions and toxic substances; need I say more?
  • Paleontology: You get to camp out and lift really heavy things that you dig up. Back in the lab you get to make up incredible stories about how the animal that tibia belonged to was 82 feet long and ate a forest each morning.
Science WTD jobs:
  • Mathematics: If accounting is dull, how exciting can its basis be?
  • Astronomy: Sure, you get to take those pretty pictures. All it takes is sitting in a cold observatory at high altitude for hours on end. You're oxygen-deprived (which admittedly can be exciting, for a while), and you work nights. Not much good for the social life.
  • Biology: Spend your life looking through a microscope rooting for the red paramecium to eat the green one.
Engineering PEP jobs:
  • Electrical Engineering: Not only do you get to design neat stuff like computers and microwave ovens, you get to leave large capacitors charged up and lying around where your friends can get serious shocks from them.
  • Aeronautical Engineering: Nothing like designing things that have no visible means of support.
Engineering WTD jobs:
  • Civil Engineering: Aside from routing a highway through someone's house, what fun can you have?
  • Materials Engineering: This should be exciting, but the people who actually design the materials don't actually get to do anything with them. Besides, all you ever get are unreasonable requests: “Hey, Fred, I need something lighter than air, stronger than steel, and flexible as string. Can you have that by next week?”
Medical PEP jobs:
  • Brain surgeons: Now, THAT'S getting into people's heads. One wrong poke, and some poor patient is going to wake up speaking in tongues.
  • Gynecologists: Always the life of the party.
Medical WTD jobs:
  • Dentists: Sure, you can understand people who have a drawerful of hardware in their mouth, but you don't know how to talk with people who can form actual words.
  • Podiatrists: Feet are just dull. You see one falling arch, you've seen them all.
Business PEP jobs:
  • Finance: Not to be confused with Accounting. These are the people who actually spend the money. In fact, they're the only ones who know where the money is actually hidden. Everyone wants to be friends with these guys.
  • Marketing: Talk about a fantasy world. You get to invent features that the R&D people never dreamed could be put into your products. Big plus: Guess who gets to go to all those trade shows in Vegas?
  • IT: You get to play with all the new toys, surf the web with impunity, and decide who gets the newest equipment. Power, toys, and play. What more can you ask?
Business WTD jobs:
  • Accounting and Quality Control: Didn't you read the stuff at the top?
No need to thank me. Adding to the number of exciting people in the world is a reward in itself.

Wednesday, November 1, 2006

A Question of Relevance

University politics are vicious precisely because the stakes are so small. ~ Henry Kissinger

The mob got its way: Jane Fernandes will not become president of Gallaudet University.

I use the term “mob” because that's what the protesters had become. They had blocked access to the school and refused all attempts to negotiate. All that mattered was that Ms. Fernandes' appointment be rescinded. In the articles I read, I had a hard time finding out what exactly was supposed to be so wrong about this woman. I read about her assessment that some at Gallaudet thought she wasn't “deaf enough” because she was a lip reader who didn't learn American Sign Language until she was 23. That seemed a pretty lame excuse, so I was intrigued by the information in the article cited in the first paragraph.
  • Ms. Fernandes is currently university provost. Students and faculty accused her of being “divisive and ineffectual” in her leadership role.
  • She had wanted to “reach out to the broader population of deaf and hard-of-hearing students”, the vast majority of whom attend conventional schools.
  • Gallaudet has serious problems.
That last is in the form of decreasing enrollment, a rating of “ineffectual” from the Office of Management and Budget (the school receives over $100 million of federal funding every year), and a graduation rate of under 50%. That's Trouble, with a capital “T”, which rhymes with “G” and that stands for “Gone”, which is what Gallaudet will be if current trends continue.

Meanwhile, in a sort of “Lord of the Flies” scene, students and faculty were celebrating their ability to bring the University to its knees, with no apparent direction of their own except to defend “deaf culture”, which sounds suspiciously like elitism.

Who would imagine that someone would treat deafness as a cultural element?

This sort of nonsense is nothing new for the students and faculty of Gallaudet. In 1988, they forced a new president (interestingly, also a woman) out after a week, because she was not deaf. That brought in current president I. King Jordan, who has been unflaggingly supportive of Ms. Fernandes. I suspect that he was having flashbacks.

The trustees' statement said, “Now is the time for healing.” The protesters said, “Her resignation is not the end.” Basically, while making statements about the need for the search process to be “fair, equitable, transparent, and diverse”, they're really saying, “You won't appoint anyone we don't like. And we're not going to tell you who that is until you appoint someone.”

There's an interesting parallel between Gallaudet and Randolph-Macon Women's College. You may recall, if you are that 2/3 of a regular reader that I have, that RMWC, a women-only school, voted to begin admitting men. As at Gallaudet, there were howls of protest, replete with demonstrations and comments like this one: “115 years of women can't be wrong.” Right, and 200 years of slave-owners couldn't have been wrong, either.

At least at RMWC, there wasn't an indication that the faculty was opposing the change, but both student bodies share the same blinkered attitude toward society. The thrust of the last 100 years has been to try to make society more inclusive, despite the ongoing attempts of racists, sexists, and generally intolerant people to thwart the efforts.

The disabled and minorities have been trying to get their piece of the American Dream, to be regarded first as “people” and secondarily as black, female, or disabled. That's why places like Gallaudet and Randolph-Macon are withering; they are anachronisms. They're better than the segregated institutions of years past only because they're now an option, not the only choice. It's because there are other options that they are failing.

RMWC has recognized that it needs to acknowledge the existence of the real world and may succeed. Gallaudet tried and was rebuffed by its own students for whatever real and imagined reasons they may have. What's worse is that Gallaudet's faculty is fighting change as much as the students (one might suspect that there was considerable encouragement of the student protesters by at least some faculty members). The very people who should be forward-looking are choosing to turn their backs just as some RMWC protesters did.

Perhaps the faculty is enjoying the 100 million taxpayer dollars they receive each year. If they continue marginalizing the school, attempting to exclude those who are “not deaf enough”, they are liable to find themselves facing a public that questions their relevance and the relevance of the entire University. After all, if the majority of students don't graduate, that's an indictment of those faculty members and of the school itself.

The next time the school is shut down, it may not be done by the students.

Monday, October 30, 2006

Jellied Brains

Television has done much for psychiatry by spreading information about it, as well as contributing to the need for it. ~Alfred Hitchcock

When I was a kid, some parents worried that TV would turn our minds to jelly. It appears that they were right. I've seen a number of articles lately saying that there is an autism epidemic. Since autism does not appear to be some sort of viral condition that would be contagious, increasing numbers of autistic children would seem to be mysterious. Could it be Bin Ladin at work with some bio-psycho-nucleo-weapon? Might it be due to global warming or cosmic rays? Perhaps all that fluoridation that the American Dental Association has foisted on us is creeping out of youthful teeth into youthful brains?

Nope. It's television. I saw an article the other day that said that it appeared that excessive TV viewing could be causing all this extra autism we're being told is abounding. I can believe that, although I think video games could be included as a factor. I'm not one of those who thinks all video games are evil and should be banned, but I do think that anything that reduces the need for one to use one's imagination is going to limit mental development.

At least video games, though, provide some sort of visceral stimulation. I mean, there you are, walking through barely lit hallways when some gruesome creature jumps out, determined to rip your virtual head off, and you manage to reduce it to a pile of goo with a chain gun. If that doesn't get your heart started, nothing will. And don't give me that stuff about violence. Before video games, kids played cops and robbers, cowboys and Native Americans, or soldiers, all of which involved bloodily dispatching each other with glee. And they did that long before you could buy a toy ivory-handled Colt .45 with quick-draw action.

We're just a very violent species.

No, it's TV that is really scary, because it's essentially mind-numbing. For all the paeans offered by programmers to the idea of original programming, imitation still remains the sincerest (and most profitable) form of flattery. Currently, no network can do without reality shows because the shows are cheap to produce and provide something that everyone seems to love: People being humiliated. If you add some sort of talent component, you apparently have gold. If American Idol wasn't bad enough (and it was), we've had fashion designer faceoffs, model challenges, and, a current up-and-comer, chef competitions.

Of course, none of this is new, even the humiliation concept. Years ago, it was done in quiz-show format, like Beat the Clock, which involved people having to perform stunts in a set time (usually 1 to 2 minutes) that involved getting wet or having cream pies tossed in their faces. Or there was Truth or Consequences, where bad things could happen if you didn't answer the question correctly.

Over the years, trends came and trends went ... and often came and went again. For example:
  • Variety shows. Once the staple of the networks, all of them had a similar format. Opening musical number with the star (or, if the star was a comic, opening monologue, followed by a musical number by the show's chorus line); introduce the guest star; guest star number; skits; musical number with star and guest star; skits; humble closing by star. Some big-time shows could afford a couple of guest stars, which really took the pressure off the star.
  • Quiz shows. Just like now, quiz shows were cheap to produce and drew huge numbers of viewers. Champions would return week after week, and people actually would begin to root for or against certain contestants. Unfortunately, the Charles Van Doren scandal brought the whole house down, relegating quiz shows to a minor niche until that Regis Philbin millionaire thing, and, of course, Wheel of Fortune.
  • Westerns. Gunsmoke, Wagon Train, Bonanza, The Lone Ranger, and on and on. For years, westerns were such a staple that, some seasons, there was practically nothing but westerns and variety shows. The proliferation of cowboys, saloons, dance hall girls, and gunfights was one of the reasons that Newton Minow, one-time head of the FCC, referred to TV as “a vast wasteland.” I think he had the Ponderosa in mind.
  • “Professional” shows. Doctors and lawyers took over after the westerns. Ben Casey and Dr. Kildare led the field, although Medic was the pioneer. And of course, there was Marcus Welby, a sort of Doctor Knows Best. For the lawyers, we had Perry Mason, The Defenders, and Arrest and Trial (a predecessor to Law and Order that showed the crime, the pinch, and the trial, only it took Arrest and Trial 90 minutes to do it).
  • Primetime soaps. Peyton Place started it, but soon there was Dynasty and Falcon Crest. Interestingly, Dynasty and its clones always seemed more like westerns, sort of Bonanza with drugs, money, and Joan Collins.
But there are two formats that were there at the start and have never gone away: Sitcoms and cop shows. Except for a short time during the ascendancy of the westerns (when there was virtually nothing else on), sitcoms and cop shows have always been the mainstay. And, frankly, if you've seen one sitcom or cop show, you've seen them all.

It's not that there haven't been some very good television programs over the years. There have been, but there haven't been hours and hours and hours of them every week. In any given year, if there are more than two or three really fine shows (either very entertaining or very thought-provoking), that's a banner season. Most of what was (and is) on is repetitive, commercial-riddled, and insulting to the intelligence.

Worse, the well-intentioned attempt to offer some education to kids turned into television-as-baby-sitter. Parents were more than happy to plunk the kids down in front of Sesame Street, Mister Rogers, and whatever else PBS was showing during the afternoon. The trouble is that they just left them there for the evening. So kids that grew up watching hours of TV became parents who were more than happy to let their own kids watch even more of the mindless fare. It's gotten to the horrific point of a baby channel, a cable channel aimed at infants, so parents don't have to sing lullabies. For slightly older kids, PBS Kids Sprout relieves parents of reading bedtime stories. Then, there's Teletubbies, which is just plain disturbing.

And we're wondering where all the autistic kids are coming from?

Amazingly, an argument for letting kids watch all the TV they want is that, otherwise, they won't be able to talk about what was on last night with their similarly brain-stunted friends. Even adults seem to feel that they might be ostracized if they didn't watch CSI:Podunk last night; they won't be able to bond with their co-workers around the water cooler.

I haven't watched a network (CBS, NBC, ABC, Fox (ugh), or even UPN) program in years. I've mentioned before that my viewing axis is science, history, sports (less and less all the time), and very old movies (if it's newer than 1949, bleah). However, I've never felt out of place when people start talking about their TV viewing because all you have to say is that you've never seen some top-rated show and stand back. I don't have to watch Law and Order:Weirdos Unit because people will tell me all about the show (and everything else that's on) at the drop of a hat.

Of course, they will, at some point, insist that they really don't watch much TV, and they certainly don't let their kids watch much.

Yeah, right. And I got my excessive girth by eating nothing but carrots.

Wednesday, October 25, 2006

Blather On, Brothers and Sisters

It seems to me that the problem with diaries, and the reason that most of them are so boring, is that every day we vacillate between examining our hangnails and speculating on cosmic order. ~Ann Beattie

I was proofreading "We Deserve A Break" ... I beg your pardon? Yes, I proofread these entries. Granted, I don't catch all the typos, but I do make an attempt. What? Why bother to proofread? Well, I'd like to present a piece with some degree of polish, and ... All right, that'll be enough of that. Even 2/3 of a reader a day deserves to be able to read a piece without having to decode bad sentence structure, misspellings, and missing words.

Everyone's a critic.

Anyway, I was proofreading the aforementioned piece, when I had an epiphany. Quit snickering; they don't hurt a bit. I got to thinking about why a member of the Reformed Church of God should be so bent out of shape about bloggers. Oh, certainly, he gave reasons, such as they were, including the fact that a blogger might have the gall to feel good about expressing an opinion when those opinions were so much “blather.” But, really now, if it is “blather”, then sooner or later the blogger will tire of it and move on to other, more obvious vices.

But then I got to thinking about another rumination where I dwelt on the media frenzy about blogs, and suddenly it all became clear: Vox Populi.

Vox Populi is Latin for “the voice of the people.” Blogs are providing a platform for people to speak, and that has the media and the people “in charge” worried.

It's not that people haven't had their say when they really wanted to, but, for a long time, they had to work at it. In early societies, the voice of the people was the spoken voice. It might be at the Agora or at the Forum, but you had to be willing to stand up and speak up, which was sometimes a risky thing to do. However, during those periods when the exchange of ideas was not totally feared by the leadership, people honed their speaking skills to be able to deliver their opinions. Aristotle gave a wonderful set of lectures, collected in “Rhetoric”, explaining not only how one should speak but also a lot of the tricks of the trade for making your case if, say, you had to defend yourself against a charge someone had brought against you.

Speaking was nice if no one made you drink hemlock as a result, but, unless you were famous enough to end up drinking hemlock, your voice only reached a few. Beyond the folks who happened to be standing in the area where you were orating, it was unlikely that anyone would ever know what you thought.

It wasn't that there wasn't writing. Of course, there was, and you could always pen a scroll yourself (assuming you could write), but, again, unless you were well-known or rich, no one was going to undertake the labor-intensive copying of your scroll to disseminate it to the masses.

Later, the church door became a place to post your opinion. When Martin Luther posted his 95 theses on the church door at Wittenberg, he was doing so because this is where people placed announcements, protests, and cow-for-sale ads. Everyone went to church, and the church was a focal point of the community, so anything on the doors was going to be visible to many people. But this still limited the number of viewers to the local folks. Then along came Gutenberg.

When Gutenberg introduced the printing press to Europe, the ability to broadcast ideas increased vastly. Once the process became economical, anyone could prepare a pamphlet for a mass printing. The writer might even sell copies for a shilling or a farthing or whatever passed for small change in those days, but a well-off person could afford to subsidize the printing and send his ideas far and wide.

Pamphleteering was the blogging of its age. Judging by the number of them and the varied subject matter, anyone who could put pen to paper might toss off a pamphlet at some time in his life. But pamphleteers went beyond little homilies and witty stories. Revolutionary thought spread through pamphlets. Printing became a weapon for those who opposed the people in power. As a result, printing presses were often hunted down and smashed by authorities trying to protect their turf. But, as fast as their presses were wrecked, the revolutionaries always seemed to get another set up.

Oddly, it seems that the 18th century was the last great age of the pamphleteer. After the American and French revolutions, the writers seemed to content themselves more with writing to newspapers or publishing longer works. The press seemed to be the one to collect opinion and spread it about during the 19th century, although the first quarter of the 20th century did see a flurry of rogue printers turning out revolutionary and reactionary material in countries like Russia and Germany. Once dictatorships were established in those countries, those presses fell silent.

Underground newspapers provided a limited forum from the 1960's onward, but these were mostly confined to intellectual and pseudo-intellectual thought. The ordinary person really had only the letter to the editor. The pollsters, the news services, and the television news departments now seemed to control the flow of opinion. At least they did, until the Internet.

Initially, web sites began to provide a means for average folks to broadcast their ideas. But web sites have to be maintained and, as I've noted in one of the earlier articles, are harder to set up to provide access to archival material. Blogs are another matter.

The mechanics of blog software are ideal for journaling, providing different points of view, and, of course, blathering. It would probably be fair to say that blather outweighs meaningful thought many times over, but, when you have millions of blogs, that still means there are a lot of people putting out worthwhile reading. And that scares the bejeebers out of the media and others who would direct your thinking.

The church member was upset because people might think their opinions mattered. By extension, that bothered him most likely because, if they thought their own opinions mattered, then they might be inclined to question those of church authorities. Church authorities have historically not taken well to being questioned.

The media is upset because good blogs often demonstrate just how shallow the mainstream media sources are. Just look at Dick Destiny, written by George Smith, a Senior Fellow at GlobalSecurity.org. He debunks the media hysteria about ricin, liquid explosives, and other “imminent” terrorist threats that the evening news warns us about on a regular basis.

Or look at the fear the media has of “political” blogs. The mainstream guys have been telling us what we think for a long time; they feel threatened when there's a body of information available that shows it's not the way they say it is.

So, gang, let's all keep blathering. Our seemingly infinite number of bloggers may not turn out Shakespeare's plays, but we just might send out an occasional wakeup call.

Besides, what's wrong with thinking your opinion matters?

Monday, October 23, 2006

We Deserve A Break

Man is a Religious Animal. He is the only Religious Animal. He is the only animal that has the True Religion - several of them. He is the only animal that loves his neighbor as himself and cuts his throat if his theology isn't straight. He has made a graveyard of the globe in trying his honest best to smooth his brother's path to happiness and heaven. ~Mark Twain

All right, the Islamists and Christians need to take a time-out right now. Go to your rooms and stay there until you can learn some tolerance for others and for your own brethren. And you Jewish folks, don't go acting smug, or I'll send you to your rooms as well.

I could deal with Islamic people getting upset with the depictions of Mohammed in the Danish funny pages (or editorial pages, wherever) because it is a blatant disrespect of a basic tenet of their faith. Just as free speech doesn't apply to shouting "FIRE!" in a crowded theater, it also doesn't protect incendiary speech, especially when there are other ways to say the same thing. Of course, getting upset doesn't excuse some of the behavior that was exhibited by the faithful, but the nature of the incident made it difficult for more moderate voices to be heard.

However, when Muslims start claiming that every rectangular solid is a copy of the Ka'ba, the holy building in Mecca, they're just being ridiculous, especially when the solid is just a glass storefront for a computer store. What next? Perhaps it will be an insult to Islam to use any of the letters in the Prophet's name or for a non-Muslim to say Koran.

There is insult, there is over-sensitivity, and then there's just plain stupid.

Conservative Christians have their own techniques for making everyone's life miserable. When they're not trying to dictate what we see or read or teach in our schools, they're accusing their own flock of falling prey to the evils of the modern world. In particular, it seems some Evangelicals are worried about that new and powerful source of evil, blogs.

The Register reported that the monthly newsletter of the Reformed Church of God had a scathing article against the evils of blogging. The author of the newsletter does allow as how some blogs, written by professionals and specialists, are all right, but we ordinary folk are engaging in a socially accepted practice - just as dating seriously too young, underage drinking, and general misbehaving are. I certainly can't qualify for either of the first two at the age of 57, so I guess I must be generally misbehaving.

Perhaps most damning of all (at least according to the author), blogging often makes the blogger "feel good or makes him feel as if his opinion counts - when it is mostly mindless blather!"

I wonder how he feels about Fox News.

Now, I'll be the first to admit that there's a lot of mindless blather in the blogosphere. Goodness knows, I try to contribute my share. But, if blather is a sin, then the bulk of humanity is going to Hell.

The author quotes Proverbs 17:27-28 as his basis:
  • He who has knowledge spares his words, and a man of understanding is of calm spirit.
  • Even a fool is counted wise when he holds his peace; when he shuts his lips, he is considered perceptive.
Thus, in this man's universe, there are wise people who never say anything and fools who seem wise because they don't talk. Where that leaves the average person is a bit mysterious, although stating an opinion seems guaranteed to get you labeled as a fool. Besides, the Bible isn't saying sounding like a fool is a sin; it's only saying that when a fool opens his mouth (or writes in his blog) he simply removes all doubt that he is a fool (female fools exist, too, but I get tired of writing he/she all the time).

Of course, what the newsletter writer really means is that if you don't agree with him, then you're a fool. And, heaven forbid that anyone should have the temerity to feel good about themselves.

And then there's the latest from Pope Benedict XVI. Not long ago, the Pope decided to hold a private seminar to assess the Church's position on Darwinian evolution. If anything has come out of that, I've missed it altogether, but I didn't miss the latest idea being floated by the Vatican. Seems that the Pope favors returning to the Latin mass.

As someone who grew up attending the Latin mass, I was one of those who wasn't in favor of having the mass in the local language. One of the wonderful things about the Catholic mass was that it didn't matter whether you entered a church in Montevideo or Monticello; it was the same Mass, which could be followed in your missal. After all, "catholic" means universal, and what could be more universal than a mass that was consistent in all Catholic churches? So, on the one hand, I ought to be pleased.

However, it appears that this move has nothing to do with universality; it has to do with a return to a more conservative Catholic Church. Instead of John Paul II's attempts to move the Church forward (in everything but birth control), we now have a desire to return to some idealized past. One can hardly wait to hear the next pronouncement out of Rome. Perhaps Galileo will be re-condemned or, better yet, maybe the Pope will call for a new Crusade.

Go to your rooms. Now.

Wednesday, October 18, 2006

Sound and Fury at Gallaudet

The right to swing my fist ends where the other man's nose begins. ~ Oliver Wendell Holmes

Back in May, I wrote a little piece that described protests going on at Gallaudet University, in which I managed to misspell the name of the school as Gaulledet, for which I abase myself. However, if you look past that, you find a commentary about protests going on concerning the appointment of a new university president, one Jane Fernandes. Students and faculty were, for reasons which were not very clear to me, very upset over her selection for the post, mostly because she wasn't their sort of deaf person.

As I said in the article, Ms. Fernandes was born deaf but grew up speaking, not learning American Sign Language until the age of 23. She is married to a former Gallaudet professor, who is not deaf, and has two children who can hear. She was opposed, according to one protesting faculty member, as not representing the deaf community.

Frankly, this sounded lame. I figured there had to be more to the story, but I found nothing more informative at the time. Ultimately, the coverage died down, and I lost track of the story until the other day when this brief story showed that things had not quieted down at Gallaudet. Quite the opposite, in fact. Things have gotten out of hand, with protesters closing the school.

It is still completely unclear to an outsider like me what the issues are here. All that is clear is that the protesters have no desire to negotiate, preferring to deny access to the school's facilities to students who wish to attend classes and to others who use the facilities. The rhetoric is hot and heavy, though.
  • “We will not let the campus go unless Jane Fernandes resigns.” - Noah Beckman, student body president
  • “The whole school is speaking now.” - student and protest leader Chris Corrigan
  • “This illegal and unlawful behavior must stop.” - outgoing university president I. King Jordan.
The irony here is that Mr. Jordan was installed as president thanks to another round of protests from students many years ago, calling for the school to appoint a deaf president. I sense a pattern here. The irony gets thicker when you consider that the faculty that no doubt participated in those protests has now passed a no-confidence vote in that same Mr. Jordan and in the Board of Trustees (whom the faculty probably never liked in the first place; trustees and faculty members are natural enemies). The vote doesn't really do anything except indicate that the faculty is ticked off.

I know a little about protests. I was in college until 1971, and, even though I was at a place where demonstrations were minimal, events did not pass us by (see Radio Daze parts III and IV). Even if we had not had such activity on our campus, we would have had to be dead not to see the level of dissent on campuses around the country, driven by opposition to the Viet Nam War and support of the Civil Rights movement. The most profound effect of student protests of that day was to shine a spotlight on the war and social problems.

When students were shot at Kent State, virtually every campus in the country boiled over, some just a heavy simmer and some completely over the top. In many cases, students demanded the universities close down, primarily so they could continue their protests without having to go to class.

At least, that's the best reason I can see now in looking back at it. Even Case shut down after a fashion. Students were given the choice of finishing out the semester (it was late in the term) or taking a Pass/Fail grade on their work to that point in the year and going off to do whatever they wanted. So the school was open, but students were given the option of ending their year. I did that so I could work at the radio station, since I was program director and we were on a 24-hour schedule. The university chancellor praised us for our work in squelching rumors, and I got to avoid taking final exams, definitely a win-win situation.

Most of the schools that chose to shut down didn't offer such choices; they either stayed open or closed completely for the summer. Looking back, I recognize that many students who wanted to continue going to class were deprived of the opportunity, ostensibly in the name of freedom of choice.

The protest movement had a clear target, focused primarily on the war. Despite that focus, it seemed that demonstrations would take on a life of their own, leading to building takeovers or outlandish demands that no school could sensibly meet. Ultimately, these sorts of events petered out of their own accord. In fact, some of the schools that took the closure route did so just to let things cool off for a while.

This is not to say that the protests didn't have an important and often positive impact. But as time went on, many of them were simply “me-too” events featuring tortured rhetoric and weird mixtures of socialism and anarchy, with a heavy dose of iconoclasm thrown in. Keep in mind that those long-haired liberal weirdos you see in films of these demonstrations are now your Congressmen.

(That has nothing to do with Gallaudet, but it bothers me from time to time.)

The Gallaudet protests have the feel of iconoclastic demonstrations that are feeding on their own momentum. Football players volunteer to block entrances to the university; protest leaders refuse any negotiation or attempt to find a middle ground; a “coup d'universite” is declared. The captain of the football team said he would say to Ms. Fernandes, “Resign now. It's as simple as that. If you resign, we can move on with our lives.”

I'm sorry, but it's not as simple as that. A can of worms has been opened, and there's no putting them back. Ms. Fernandes is not about to resign; the students and faculty have burned all the bridges to a middle ground. The school is open again, but for how long? It would not be surprising to see it closed again when Ms. Fernandes takes over her post in January.

Ultimately, the faculty and students have got to come up with something more damning than Ms. Fernandes not representing the deaf community, or, as she puts it, “not being deaf enough.” In the olden days, demonstrators and universities would attempt to open “dialogs.” The idea was that if protest leaders and school leaders actually discussed the issues, some accommodation could be reached that would satisfy both sides (or at least, minimally dissatisfy both sides), allowing life to go on. Most importantly, it would allow those who wished to be students to be students.

No one has the right to block the right of another to an education, not George Wallace standing in the door of the University of Alabama nor a football player standing in the gate at Gallaudet.

Monday, October 16, 2006

The Man Behind the Network

In my memory, Dad was always president of one company or another. As a kid, I didn't know what this meant, so I asked, “What does a president do?” Dad said, “A president is the guy who sticks around to empty the trash after everyone else has gone home.” ~ Brent Noorda, remembering his father, Ray

Ray Noorda died recently, at the age of 82. Chances are good you have no idea who he was, but 23 years ago he took over a little 17-person company in Provo, Utah, and changed the face of computing. By the time he left, Novell had become the titan of server-based networks, and Netware, its flagship network operating system, had become a de facto standard.

Netware provided a simpler means to set up a file and printer sharing network than had been the case with Unix. Now, that's a relative term; you still had to know what you were doing, but one big advantage of Netware was that it ran on x86 hardware, the “PC” chipset (CISC), rather than the more expensive RISC systems that ran Unix.

Noorda made Novell a customer-driven company and listened to his users' complaints, suggestions, and needs. As a result, Netware got more solid yet more versatile. Oh, sure, Microsoft is the big guy now, but when Bill Gates was struggling to get past peer-to-peer networking, Ray Noorda was providing software that would make the mainframe people nervous.

Netware servers just ran. By version 3.12 (around 1993 or '94), Novell servers would run for months without issues. Novell Directory Services (NDS; originally Netware Directory Services) was well on its way when Noorda left in 1995. Noorda had a huge jump on Bill Gates when it came to the network and stood to build on that lead, but he made a huge mistake. He decided to compete on Gates' turf, the desktop, instead of letting Gates try to catch up with him in the server arena.

Novell thought it could command the computing landscape from desktop to server, which it realized was also the ultimate goal of Microsoft. Gates had pulled a major coup by offering his MS Office suite of applications at an affordable price. For less than the price of a copy of WordPerfect, a user could get word processing, a spreadsheet, presentation software, and a database program. Noorda saw Office selling like hotcakes and decided that he had to fight back.

It didn't work. He bought WordPerfect, which by this time had fallen woefully behind MS Word in features; he also bought Paradox, a mediocre database program. Oh, I know there are still people out there running it, but the sad fact is that dBase, FoxPro, and even Access were better. Noorda might have had a shot at FoxPro, but for whatever reason he went with Paradox. The spreadsheet was Quattro Pro, which was a good program, but it too had fallen behind the likes of Lotus and even Excel, because Borland (which owned Quattro) had spent its time and energy losing to Lotus in court over “look-and-feel” issues.

The time and effort spent trying to win the desktop sapped strength from the Netware effort, which ultimately opened the door for Microsoft. But the real killer was Noorda leaving.

By this time, Novell had 12,000 employees, and Netware was still king of the networking hill. But Noorda's desktop move had cost the company dearly, as it had sold its suite at low prices to try to cut into Microsoft's lead. It seemed that Noorda, who knew how to sell networks, didn't know how to capture the desktop market. Still, it was a disaster for Novell when Noorda left, because the company lost touch with its customers. I know of many instances, some from personal experience, of Novell ignoring its main cash source, the network customers.

I was working for a client who had a large Novell network. We had decided that it was time to upgrade from version 3.12 to 4.11, so we called Novell to talk about it. We called and called and called. Finally, when we had all but completed our planning, a salesman returned our calls and offered to bring out a Netware engineer. Now, this was a 3,000-user network, so we're not talking chump change, even for Novell, yet we had to beg to get someone to talk to us. This was a typical scenario, one that would not have occurred when Ray Noorda was in charge.

It didn't help that Netware was hell for third-party developers. One thing about Windows is that it was easier to create server-based software on Microsoft's platform than on Novell's. That was partly due to Netware's structure, built around “Netware Loadable Modules” (NLMs), which were notoriously difficult to write. It was also due to the fact that Netware demanded that software behave itself. No program could operate at certain restricted levels that protected the operating system from crashing. Windows had no such restrictions, so it was easier to write for. Of course, that also made Windows servers more prone to crashing.
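For anyone who never had the pleasure, here's a rough idea of what an NLM looked like from the programmer's side. This is only a sketch under assumptions, not Novell's actual SDK or build setup; the point is that the C code itself was ordinary enough, while the misery lay in the build process, the libraries, and the discipline the server demanded.

    /* A minimal, hypothetical sketch of an NLM source file -- illustrative
     * only, not Novell's actual SDK headers or build configuration.
     *
     * NLMs built against Novell's CLIB were essentially C programs whose
     * main() became the module's entry point when the server console LOADed
     * them. The code ran inside the server itself, and scheduling was
     * cooperative, so long-running work was expected to yield the CPU
     * periodically -- the "behave yourself" discipline described above.
     */
    #include <stdio.h>

    int main(void)
    {
        /* Write a message to the server console. */
        printf("Hello from a (hypothetical) NLM.\n");

        /* A real module doing lengthy work would yield to the scheduler at
         * intervals (via a CLIB threading call, named here only as an
         * assumption) so other modules could keep running. */

        return 0;   /* When main() returns, the module terminates. */
    }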

But, if you're an IT guy who needs a server-based database, and all the good ones are written for Windows, you're going to bring in some Windows servers. Since Windows servers could be integrated into a Novell network without much trouble, Microsoft got their foot in the door through applications. Eventually, they kicked the door in.

Noorda, meanwhile, went on to found a venture capital company called Canopy, which is still around. Canopy specialized in investing in start-ups with potential and had some successes. Word was that Noorda had a lot of fun at Canopy, which is more than can be said of the parade of executives that followed him at Novell.

I am amazed at how little was written in the technical press when Noorda passed away. I learned of it from a British news site, The Register. American tech outlets seemed to let Noorda's passing slide by in favor of the latest hype about Vista. The first article on a U.S. site that I saw was Dave Kearns' charming remembrance, a couple of days later.

That's a crying shame. Without Noorda having paved the way, Bill Gates would have had a much tougher time cracking the Unix networking world. By the time Gates had gotten Windows NT to where it only crashed occasionally, Netware had muscled aside Unix and the mainframe, establishing server-based computing as the way of the future.

A guy who could do that deserves some words of praise.