Y2K Plus One: The 21st Century And The Third Millennium Arrive For Real

By Applelinks Contributing Editor Charles W. Moore

As of the date this column is posted, there are three days left in the 20th Century, or three days to go until the beginning of the third Christian millennium, depending upon which direction you're looking.

There was no year 0, so all the hoopla a year ago about "the turn of the 21st Century" jumped the gun by 12 months, and the real 21st Century will be ushered in on a relatively quiet note at midnight on Sunday night.

That is, unless you are an old-calendarist Eastern Orthodox, in which case the millennium turnover comes 13 days later by the Julian calendar's reckoning. Most of the Roman Catholic West switched to the Gregorian calendar in 1582, correcting for the fact that the Julian calendar's year is longer than the true solar year by 11 minutes and 14 seconds.

The English, however, waited until 1752 to institute calendar reform, at which time 11 days were dropped, the day after September 3rd (or Sept. 2; sources differ) becoming September 14th. Also in 1752 (or perhaps 1751; sources contradict each other on this too), New Year's Day was moved from March 25th to January 1st. The Sixth Century monk who was responsible for setting the original Christian calendar, one Dionysius Exiguus (AKA Dennis the Humble), had designated the year one to have begun on March 25th, with the birth of Christ taking place on December 25th of that year.
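For the arithmetically inclined, that 11-minute-14-second annual error is enough to account for all of the dropped days. Here is a rough back-of-the-envelope sketch in Python, my own illustration rather than anything from the history books, reckoning the drift from AD 325, the Council of Nicaea date that the Gregorian reform took as its reference point:

```python
# Rough back-of-the-envelope check on the Julian calendar's drift.
# ASSUMPTION: drift is reckoned from AD 325 (the Council of Nicaea),
# the reference point the Gregorian reform of 1582 aimed to restore.

DRIFT_SECONDS_PER_YEAR = 11 * 60 + 14   # the 11 minutes 14 seconds cited above
SECONDS_PER_DAY = 24 * 60 * 60

def accumulated_drift_days(from_year, to_year):
    """Days (fractional) the Julian calendar falls behind the sun between two years."""
    return (to_year - from_year) * DRIFT_SECONDS_PER_YEAR / SECONDS_PER_DAY

print(round(accumulated_drift_days(325, 1582)))  # ~10 days, dropped by Gregory XIII
print(round(accumulated_drift_days(325, 1752)))  # ~11 days, dropped by the English
print(round(accumulated_drift_days(325, 2000)))  # ~13 days, the gap today
```

Carried forward to the year 2000, the same arithmetic gives roughly 13 days, which is why the Julian New Year now falls on our January 14th.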

Since the millennium nominally commemorates the 2000th anniversary of the birth of Jesus Christ, both the Julian and the Gregorian calendars are almost certainly inaccurate, since recent scholarship suggests that Jesus was likely born around 4 BC, which means that the third millennium actually started with no fanfare at all back in 1996.

In any case, by mid-January all the bases should be covered, and we can finally say without equivocation that we are in the 21st Century.

This month also marks the tenth anniversary of the World Wide Web. British computer engineer Tim Berners-Lee first proposed the Web in 1989 while working at CERN, the Geneva-based European Organization for Nuclear Research, as an unsanctioned side project for linking and sharing documents scattered across the lab's many computers. Given that NeXTstep forms the basis of Mac OS X, Applelinks readers may be interested to note that Berners-Lee wrote the original Web software on a NeXT computer, beginning in October 1990. The first crude command-line browser was working by mid-November, and the program went beta at CERN by Christmas Day, 1990. At that point Berners-Lee and a colleague were the World Wide Web's only users. There are now some seven million Websites.

Berners-Lee went on to found the World Wide Web Consortium in 1994 to develop Web standards, around the same time that Marc Andreessen and his Mosaic team got their graphical user interface browser working. Mosaic became Netscape, and the rest, as they say, is history.

This time last year, of course, we were all waiting with varying degrees of apprehension to see what would happen when the computer clocks rolled over from "19" to "20." As we all know now, not very much happened at all, making the Y2K bug one of the most monumental fizzles in history, at least for the Y2K apocalypse die-hards who were still predicting (hoping for?) a global cataclysm due to the computer date bug.

It wasn't that there was nothing at all to worry about. Businesses and governments around the world had spent some $200 billion to fix the problem created when software engineers, back in the days when computer memory was astronomically expensive, decided to use two digits instead of four to represent the year.
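The nature of the bug itself is simple to show. What follows is a minimal sketch in Python, a made-up illustration of my own rather than anyone's actual legacy code, of what goes wrong when a program keeps only the last two digits of the year:

```python
# A minimal illustration of the two-digit-year problem.
# Hypothetical example (mine), not code from any real legacy system.

def years_elapsed_two_digit(start_yy, end_yy):
    """The way much old code figured elapsed years: last two digits only."""
    return end_yy - start_yy

def years_elapsed_four_digit(start_yyyy, end_yyyy):
    """The same calculation with full four-digit years."""
    return end_yyyy - start_yyyy

# A policy issued in 1965, checked on December 31, 1999...
print(years_elapsed_two_digit(65, 99))       # 34  -- correct
print(years_elapsed_four_digit(1965, 1999))  # 34  -- correct

# ...and checked again a day later, on January 1, 2000 ("00"):
print(years_elapsed_two_digit(65, 0))        # -65 -- the Y2K bug
print(years_elapsed_four_digit(1965, 2000))  # 35  -- correct
```

Saving two bytes per date looked like a sound economy when memory cost real money; unwinding the decision roughly three decades later cost that estimated $200 billion.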

Harris Miller, president of the Information Technology Association of America, was quoted this week by the Associated Press as saying: "Problems did occur, and the fact that it was so minimal means that people did a good job."

A rash of small Y2K glitches did crop up, but unbelievably few considering the level of hysteria that had been generated over the issue in the lead-up months. I was a Y2K skeptic all along, and became even more so as purportedly "critical" dates rolled by in 1999 with barely a hiccup. When I expressed my growing conviction, in a column here on Applelinks, that the Y2K scare was being vastly overblown, it was not well received by some Y2K true believers, who e-mailed suggesting that I was a fool or worse for my complacency.

One could only surmise that the notion of Y2K as the agent of apocalypse was never essentially a computer-bug issue at all, and that the Y2K alarmists were just the latest permutation of a cultural sub-genre that specializes in obsessing about the end of civilization. I know otherwise sensible people who were storing dozens of five-gallon plastic containers filled with grain, legumes, and other dried foodstuffs in their basements way back in the 1970s, in anticipation that apocalypse was due any day. Remember the predictions of stock market meltdowns that were supposed to come in 1989, 1990, and almost any other year in the past 20 or so? Meanwhile, the Dow Jones Industrial Average advanced from the 1,000s to the 11,000s with just a few expectable and temporary corrections along the way, one of which we are experiencing in late 2000, completely unrelated to any Y2K bug.

It seems that apocalyptic dread appeals to certain philosophical and ideological mindsets, such as:

• Some Reconstructionists and other fundamentalist Christians of the pre-millennial persuasion who are convinced that secularist society must collapse and be rebuilt to the blueprint of their particular form of Christianity.
• Control-freaks who can't tolerate uncertainty.
• Anarchist/survivalist/radical populist fringe-dwellers who hate government and would like nothing better than to see the business and financial establishment come crashing down.
• Neo-Gnostics who fancy that they belong to an elite whose special "gnosis" (privileged knowledge) provides them with the vision to see clearly what is happening, as opposed to the ignorant masses (one Y2K prognosticator referred to Y2K non-believers as "the fools of America").
• The perpetually paranoid, always primed to latch on to any conspiracy theory -- the sort of people who believe that big oil companies bribed (or murdered) the inventor of a device that would allow cars to get 200 miles per gallon; who think that the CIA is conspiring with the UN to impose world government; who scan the skies for "black helicopters;" who are sure that the U.S. Government is concealing evidence of alien UFO landings; and so on.
• Gun-nuts (as opposed to responsible gun-owners) who hoard weapons and "ammo" and would relish a civil war with the hated forces of ordered authority.

All of these sub-categories and more glommed on to Y2K as the latest nexus of their paranoia, and/or apocalyptic fantasy, and/or cultural resentment, all too many of them delivering their lugubrious predictions with an indecent degree of schadenfreude. You got the impression that they would be delighted if the Y2K bug actually did destroy the international banking system and undermine governmental authority, not to mention punish the great, unwashed masses of "fools" who refused to listen to their doomsday message.

It is fascinating to note, a year later, how quickly the Y2K Cassandras went silent after last January 1st, becoming even thinner on the ground than people who admitted voting for Richard Nixon after Watergate.

Personally, twelve months after Y2K computer doomsday, I have yet to encounter any noticeable Y2K glitches on my own hard drive, but of course I use a Mac.

Moving along in the Mac orbit, January 2000 rolled in with a lot of anticipation about what might be introduced by Apple at MacWorld Expo San Francisco. Notably, there was high expectation that a new G4 PowerBook would be announced to replace the then eight-month-old Lombard, and some of the more extravagant optimists suggested that the chimerical 17-inch iMac would debut.

Much to the chagrin of the rumoristas, none of this happened; indeed, no new hardware at all appeared with Steve Jobs on the stage at San Francisco. Instead there was a preview demo of Mac OS X, and AppleWorks 6 was announced, although it would be a couple more months before it shipped, and several bug-fix patch releases after that before AppleWorks 6 was really usable. Not one of Apple's finest efforts, and many users, including yours truly, still think that AppleWorks 5 is the superior product.

A new PowerBook did finally take its bow at MacWorld Expo Japan in Tokyo in February, although it had speedbumped G3 chips -- not G4. I was not especially surprised, as I had thought all along that stuffing the G4 as we knew it into a PowerBook seemed an unlikely feat of engineering, and the 400 MHz and 500 MHz G3 chips seemed to make a lot more sense for portable use.

Also released at Tokyo were modestly upgraded iBooks with larger, six-gigabyte hard drives and 64 megabytes of standard RAM, which they should have had from the get-go. There was also a new iBook SE in graphite livery, with a 366 MHz G3 processor. FireWire and DVD would have to wait, however.

The big news at MacWorld Expo New York in July was the G4 Cube, whose intro was tarnished by an inadvertent, albeit vague, leak from an employee of graphics card maker ATI, letting the cat out of the bag that a new G4 Power Mac product would be released. The slip incited Steve Jobs to purge all mention of ATI from his keynote presentation, in spite of the fact that the new ATI Radeon graphics card was being introduced.

Also new in New York were dual-processor G4 Power Macs -- a stopgap attempt at narrowing the growing MHz gap as Motorola remained unable to push the G4 past the 500 MHz threshold. The iMac got an "Earth Tone" facelift and lower prices, and there was a long-overdue new Apple professional USB keyboard, along with an optical USB mouse to replace the widely-reviled "hockey puck" unit that had been introduced with the original iMac.

Apple Expo 2000 in Paris in September ushered in the new, second-generation iBooks with FireWire (finally), new 366 MHz and 466 MHz IBM 750cx G3 chips with onboard cache, a new multimedia port, and a DVD-ROM drive in the iBook SE, as well as two new colors -- Indigo and Key Lime, with Graphite held over. The entry-level iBook's price was dropped by $100 as well, and the upgrades turned the iBook into a no-apologies great value. Also released in Paris was the eagerly-anticipated Mac OS X Public Beta.

Paris definitely marked the zenith of 2000's trajectory for Apple and the Mac platform, as the success of the new iBooks and OS X Beta launches was followed all too soon by the now infamous "lower profit warning" that caused Apple stock to lose roughly half its value overnight. Unhappily, things have gotten even worse from there, with AAPL's price continuing to decline, and the "lower profit" turning out to be Apple's first quarterly loss in three years.

So, turning our focus from retrospective to prospective, what does the first year of the new millennium hold for Apple and the Mac?

MacWorld Expo 2001 is only about two weeks away, and once again a G4 PowerBook is topping the list of expected hardware introductions, this time with considerably livelier prospects of becoming reality. With the current Pismo PowerBook now more than ten months old with no significant changes, and its form factor dating back to May 1999, this time I will be surprised if a new PowerBook does not appear on stage at San Francisco. However, I will also be surprised if it has a G4 processor. The G4, as we know it, is still too hot-running and power-hungry to make a good portable processor chip. Of course, Apple might go with an Intel-style workaround that drops power consumption (and performance) when the machine is running on battery power, but a cooler-running G3 would be a lot more elegant and make a lot more sense, as well as offering a higher clock speed number for marketing purposes.

Speedbumped iMacs and iBooks are also likely a good bet, if Apple can convince itself that having consumer machines with higher clock speed numbers than the professional Power Macs is acceptable from a PR and marketing standpoint. IBM's 750cxe chips at 700 MHz or 733 MHz may find a home in the iMac and iBook. There are also rumors of a possible G4 speed bump to 600 MHz.

Software-wise, Mac OS 9.1, which went final several weeks ago, is virtually a lock for introduction in San Francisco; not a revolutionary upgrade, but some refinements and a few new wrinkles added to the Classic Macintosh OS. QuickTime 5 is also coming (already out in public beta), and may be formally announced as a final release.

As for later in the year, the crystal ball becomes cloudier. Mac OS X will definitely be released in final form in 2001. The question is, when? Some are speculating that it could be as soon as MacWorld Expo Tokyo in February. My best guess would be the Worldwide Developers Conference in May, and it wouldn't greatly surprise me if they held it back for MacWorld Expo New York, although I suspect it will be released earlier than that.

Back on the hardware side, the iMac is overdue for a major revamp, which raises the question of a 17-inch display again. Falling LCD flat screen prices make a flat-screen iMac a dark horse possibility. There are persistent rumors of a "CubeBook" portable that would restore symmetry to Apple's product lineup. The big Power Mac desktop form factor is getting very long in the tooth, its second birthday approaching in two weeks with no significant changes over that interval save for the color scheme shift from Blue & White to Graphite & Ice in September 1999. Lower-priced Cubes, perhaps with G3 rather than G4 processors, may also be in the works.

But what do I know? In any event, let's hope that whatever Apple comes up with in the first year of the new century will be insanely great enough to restore the company to profitability and pull AAPL out of its swoon.

Happy New Year and New Millennium.

Charles W. Moore

