Computers make it easier to do a lot of things, but most of the things they make it easier to do don't need to be done. ~Andy Rooney
Those wacky folks at Gartner are at it again. You know Gartner, the people who said a company's annual cost of owning a computer was $25,000. Bob Lewis, writing a column for InfoWorld at the time, showed that, using Gartner's methods, the annual cost of ownership of a Day-Timer was $3,000. That's right, the thing you write in with a pen and carry in your briefcase.
Now, Gartner's been around for a long time, and they have surely made some savvy predictions, if only accidentally. After all, when you make as many prognostications as they do, you're bound to hit one once in a while. But it seems that the ones the news media pick up are the ones that give one pause, if not outright belly laughs. The Register is reporting that Gartner thinks that the next few months will see the biggest changes in computing in a generation. Windows Vista is going to be disruptive to the organization, new hardware is going to change the price-performance relationship, users want more freedom while IT wants more control, and blah, blah, blah.
I presume they issued this same report when Windows 3.1, Windows 95, Windows 98, Windows 2000, and even Linux came out. In fact, the fundamental issues of computing haven't changed much in the last 10 or 15 years.
In the beginning, computer work meant banging away on a terminal, and relatively few people within a company actually had one. Most of those who did were data entry types or production control workers. At more advanced companies, you might even find Computer Aided Design (CAD) rigs with digitizer tablets the size of a card table. Meanwhile, the rest of us were using ledger paper, graph paper, and calculators. In fact, to do all the graphs I used to produce as a quality control engineer, I had a really neat set of colored pencils that was better than any Crayola set I had as a kid.
PCs started creeping into companies during the mid-1980s. Sometimes it was a finance department wanting to use spreadsheet software to run what-if projections easily, sometimes it was a technical department wanting to crunch numbers with software more flexible than what the mainframe offered. Initially, the PCs came in without IT's blessing. In fact, the IT people couldn't have cared less. As long as these guys were playing with the new toys, they bothered the programmers less, so that was good.
Eventually, PCs became more affordable, and people started creating ad hoc networks, sometimes with the cooperation of IT. People who had never had a terminal might now get a PC. And if one person got a PC, someone else had to get one, and so on until almost everyone had one. That was all right, until people began to play with the software.
For the few who had been working on terminals, there had been little to play around with. The mainframe applications had their basic functionality, and the user could live with it or get out the ledger paper and colored pencils. Suddenly, with the PC, users had programs that gave them all sorts of options to fiddle with. There also seemed to be software to do almost anything anyone wanted to do. The trouble was that people spent more time fooling around with the software than they did actually generating output.
They also started playing games. Then came Internet access, and what little productivity was left went into the tank.
The computing vicious circle also began. Computer makers would bring out new hardware, and software makers, dominated by Microsoft, would bring out new software and operating systems. No sooner did you get hardware that could run the old stuff than you got new software that needed new hardware again. We've been in this cycle for years now; Vista is just the newest iteration.
But there was a more subtle problem. For reasons which I have never understood, people seem to feel that the computer on their desktop is their personal property. The desk, the phone, they belong to the company, but it's the user's computer.
It goes back to those aforementioned early days of PCs, I guess, when users could finally control their computing environment, as opposed to being limited to the mainframe's few options. Even when PCs became more widespread, individual departments got whatever software they wanted. Individual users within departments got their own flavor of software to put on "their" computers, and they weren't interested in using anything else. IT has been trying to get that horse back into the barn ever since.
Gartner seems to have finally discovered that fact.
If I had been transported from the first network environment I ever worked in to one of today's modern networks, I know I would have marveled at the new technology and at the Internet, but what would amaze me most is how little had changed between 1986 and today. The biggest surprises would be how thoroughly Microsoft had wiped out its competition and the ever-increasing number of network servers companies use. But as to the way people actually use computers? It would seem quite familiar.
Something is going to occur one of these days to change the computing environment, but, whatever it is, it isn't on the horizon yet. Maybe Linux will finally offer a real alternative to Windows; maybe companies will realize that bloated "productivity software" is actually wasting employees' time and computing resources. Whatever the change will be, history shows that the pundits, who have predicted the "year of ISDN," ATM to the desktop, and thin clients all running Java applications, haven't got a clue.
Who knows? Maybe someone will decide that large centralized servers are the wave of the future. Might even call them mainframes.