
"Net Apocalypse", by Keith Ferrell (1996)

There's a disaster waiting to happen on the Internet.

More than one disaster, in fact. A number of them are gaining strength, gathering their forces, testing the Net for vulnerabilities, or lying in wait for the right moment--which may come sooner rather than later.

If you want a soundtrack for Net doomsday, start humming Wagner. And think horses--with very scary riders.

They've saddled up and pointed their steeds toward us, the Four Horsemen of the Net apocalypse: Pestilence (the millennium bug), War (cyberterrorism), Famine (Internet address limitations), and Death (traffic overload). To help you dodge destruction at the hands of these wrathful riders, we'll introduce you to each of them (what we call the apocalypse scenario), tell you how each one works (apocalypse explained), and then describe how to keep these catastrophes from happening to you (apocalypse avoided).

Think you've got nothing to worry about? Listen carefully. The rumble you hear may just be the sound of their hooves as they approach.


It's Saturday, January 1, the first morning of the year 2000--and suddenly your aching head is the least of your problems. You stagger to your computer, intending to log on and check out the newsgroup: alt.mygod.hangover. But your headache is nothing compared to the one many of the world's computers began suffering at midnight--the minute they lost track of time.

You turn on your machine and get an invalid date reading at start-up. You switch to another PC that boots up and dial in to your Internet service provider (ISP). But you're denied entry because of unpaid bills dating back decades. Trying another provider, you get on the Net, but it's...different. Sites you've visited every day are suddenly gone. Your email queue is confused, too, with the newest messages all the way at the bottom. Whole networks have disappeared. You try to enter an online transaction, only to be told that your credit card has expired. You check with your electronic bank, and it tells you the same thing: you're out-of-date.

Millions of computers and billions of lines of code are currently incapable of recognizing the calendar's rollover from 1999 to 2000. At midnight on 01/01/00, many computers will start counting time all over again--from the year 1900. That's what those two zeros at the end of "2000" still mean to too many systems.

You think your head hurts? This particular computer hangover has been building its headache for nearly 40 years. Back in the '60s, when memory and storage were pricey beyond belief, every digit saved meant valuable space that could be assigned to other information. Digital shorthand helped wherever possible, and one place where memory and space could easily be saved was with dates--hence the now-familiar mm/dd/yy format.

Unfortunately, once every 100 years, those two-digit date fields become incomprehensible to computers whose software has not been updated to account for the century shift. To a lot of code around the world, 01/01/00 means January 1, 1900.
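A minimal sketch (in present-day Python, purely for illustration) shows how two-digit date handling produces that 1900 reading:

```python
# Illustrative sketch: software that stored dates as mm/dd/yy had to
# guess the century, and naive legacy code simply prefixed "19".
def parse_date_naive(mmddyy):
    """Parse an mm/dd/yy string the way much legacy code did."""
    mm, dd, yy = mmddyy.split("/")
    return (1900 + int(yy), int(mm), int(dd))  # century hardwired to 1900

year, month, day = parse_date_naive("01/01/00")
print(year)  # 1900, not 2000
```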

Say you decide to call your sweet, silver-haired granny a minute or two before midnight on December 31, 1999, wanting to share the revelry with her. You chat for a few minutes, then hang up. A dozen or so terrific minutes for Grandma, a dozen or so not-too-bad ones for you--and half a dozen ways the millennium bug can make you sick.

If the phone company's time and billing software is designed to reconcile negative numbers into positive ones, you could find yourself charged for more than 50 million minutes. As far as the phone company is concerned, that call lasted from 1900 to 1999. (If the software doesn't reconcile, you might actually get a credit for that much time!) And if you were connected to the Internet while you were on the phone, you'd owe your ISP for each month of the century.
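The arithmetic behind that phone bill can be sketched in a few lines of present-day Python. The billing logic here is hypothetical; only the two-digit-year behavior is the point:

```python
from datetime import datetime

def billing_datetime(mm, dd, yy, hh, mi):
    # Hypothetical billing system: two-digit years get "19" prefixed.
    return datetime(1900 + yy, mm, dd, hh, mi)

start = billing_datetime(12, 31, 99, 23, 58)  # 11:58 p.m., Dec. 31, 1999
end = billing_datetime(1, 1, 0, 0, 10)        # read back as Jan. 1, *1900*
minutes = int((end - start).total_seconds() // 60)
print(minutes)       # about -52.6 million: the call "ended" a century early
print(abs(minutes))  # reconciled to positive: more than 50 million minutes
```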

And we didn't even mention date-dependent banks, international financial exchanges, or insurance companies--plus spreadsheets not equipped to handle double zeros in the date field.

Net result?
In some ways the Net is in better shape than many closed systems. Many Internet servers run Unix, which counts time in seconds from a fixed starting point (January 1, 1970) and therefore has no two-digit-date problem. But those servers are often part of networks that include other types of machines that may be susceptible to the bug. A likelier contributor to your millennial headache may be your service provider, which may not have updated the code in its systems. You should check with your ISP now.

PC problems
Remember that your own computer and software may not be fully millennium-compliant. How's your BIOS, by the way? Test it yourself: go to the DOS prompt, set the date to December 31, 1999, and the time to a minute or so before midnight, then see what date your machine reports after the rollover. (Have a boot disk ready in case you encounter any problems, and don't forget to reset your machine to the accurate date and time when you're done.) Do the same thing with any software you use that includes a date field. Not every millennium glitch bounces you to an "invalid" message or even to a 1900 date; some of them toss you back to 1980, or 1901, or some other random year. It all depends on how the code was cut.
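The same rollover test can be sketched in code. A modern language's date library crosses the boundary cleanly; the DOS-prompt test above checks whether your BIOS does too:

```python
from datetime import datetime, timedelta

# Set a clock two minutes before the boundary and advance it across.
clock = datetime(1999, 12, 31, 23, 59)
clock += timedelta(minutes=2)
print(clock.date())  # 2000-01-01 on a compliant system; a broken BIOS
                     # might report 1900, 1980, or some other year
```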

server scenarios
Now think about some of those servers you've gotten so attached to using as you surf the Net. A server that incorporates two-digit date fields and hasn't been updated may not be functional come 2000. And all it takes is one computer whose dating system is out of whack to crash an entire network.

Just how big is the year-2000 problem? The Gartner Group, an information-technology research firm, estimates that 180 billion lines of COBOL code worldwide will be affected by the date change. While COBOL remains one of the major languages of the worldwide computer infrastructure, it's only one language; plenty of other code will have to be addressed. To get an idea of how much work must be done, take a look at The Year 2000 Information Center.

So it's a programming problem, and by some calculations it can be solved easily enough: just put an additional five or six hundred thousand COBOL programmers to work over the next few years. The most conservative cost estimates for correcting the problem start at $200 billion over the next three years--and that's to reprogram the troublesome software we know about. The Gartner Group estimates that the cost may climb past half a trillion dollars.

It gets worse. The source code for many applications was lost long ago, making the applications uncorrectable. Some older languages, such as variations of COBOL, Pascal, FORTRAN, and even Basic, have fallen into disuse, which means that finding programmers to do the fixes will be harder than ever. Millions of computers have out-of-date BIOS files installed. The list goes on and on.

The good news is that the millennium bug is starting to get a lot of attention. It's this year's big topic on the conference and symposium circuit. CNET Radio's Brian Cooley recently interviewed Eliot Weinman and William Ulrich, the two main organizers of the Year 2000 Conference & Exposition; their interviews provide more details about the millennium bug and why it's so difficult to solve.

Most of the problems will be fixed. Today's PCs have BIOSes that can be updated with software. High-end programmers with COBOL under their belts will get plenty of work. Fortunes will be made by companies that specialize in fixing the millennium bug. (It's likely to be a source of healthy returns if you pick the right program-repair companies to invest in.) For a look at some of the companies tackling the millennium bug, check out Year 2000 Tool Vendors. The scope of the problem, and the challenges facing companies seeking to solve it, are enormous. Every aspect of every computer and piece of software on every network will have to be examined, upgraded or replaced, and then tested as a whole to ensure that no glitches have slipped through the process. And barely three years remain to accomplish the feat.

As far as your home PC goes, you can take a few steps to be sure it's millennium compliant. First, test your machine, as described in "Pestilence: apocalypse explained." Then, with the PC set for 01/01/00 (if it accepts that date), shake down your date-dependent software--calendars, date stamps in word processing programs, and so on.

Should your computer have trouble making the roll to 2000, talk to your hardware vendor about how it could be upgraded. If individual software applications have a dating hiccup, get in touch with the makers--through the companies' Web sites, if possible--for patches that eliminate the glitch.

Midnight on January 1, 2000, is coming right on schedule--and along with it, one humongous lesson in how well the computer industry and individual users respond to an apocalypse we knew about well in advance.


It feels like Bosnia on the bad days--scuttling from place to place, hiding, never knowing where the next shot is coming from or whom to trust.

It wasn't supposed to be this way. The Web was supposed to change everything--make it all more open, friendlier, one big happy electronic community.

Yeah, right. Already this year you may have...

* had your online bank account cracked and your account overdrawn.
* been cut off from your own Web site for two days when your ISP was flooded with false synchronization requests--a SYN attack.
* lost threads on half of your favorite newsgroups because roving cancelbots didn't like what was posted there.
* had all of your electronic subscriptions snuffed when someone grabbed your passwords.

You've had it easy...Right up until the next time you switch on your computer and the message, "Bang! You're snuffed!!" flashes on the screen as the virus you picked up during your last Web ride destroys all the data on your hard disk.

This horseman is War, and his belligerent minions are smart, dangerous, and adaptable--adjusting and evolving their weapons to every new technology that's introduced. Hackers, crackers, cyberterrorists, infowarriors: whatever name you assign them, they may well prove to be the most dangerous of all the troubles headed for the Net. And there may be very little we can do about it.

Unleashing digital weapons that take advantage of the very openness of the Internet--every connected computer accessible to every other connected computer--these evildoers rove at will through cyberspace, destroying data and altering internal systems so they're incapable of executing even the most common commands. At best, their work creates an environment of paranoia, forcing users to overlay their systems with software shields that slow operation and require constant updating.

In the worst case, computer viruses and other scourges will become so prevalent and dangerous that they corrupt hardware and software at major switching points, potentially bringing the Internet to its cyberknees and transforming it into an electronic hot zone, riddled with infectious pieces of deadly code. Net surfers will live on a battlefield where they won't know whom to trust, what to look at, which links are safe, and where the deadly programs breed--an online world where the only safe passage requires sealing ourselves in suits of electronic armor that must be constantly updated and upgraded, ever vigilant against unseen attackers.

Take a breath...but make sure you're wearing a filtration mask.

The Internet makes an appealing target to terrorists, thieves, and malicious pranksters: all that information, all that electronic vulnerability, all that money...and most important, all that access.

Add to that the relative lack of expense involved in mounting a cyberattack. Bombs cost money and have to be physically transported to hard targets. Creating electronic weapons often costs no more than some programming time. Delivering the blow can be as simple as clicking on an email package's Send button.

Think about how much damage cyberterror can do. The Net is not the only thing that's vulnerable. Suppose some evil bit of code takes down the air traffic control system. Or scrambles the computers in a large hospital. Or introduces a destructive program into the computers of a stock exchange or a bank. Or knocks out power to a whole region of the United States. (Indeed, some media commentators invoked this very possibility earlier this year when much of the West went dark.) The possibilities are endless--and endlessly terrifying.

Finally, tack on the relative ease with which a terrorist can maintain anonymity. No airport checkpoints to pass through. No fingerprints left on steering wheels or bomb fragments. No human presence at Ground Zero. It's no wonder, then, that so many cyberterrorists are out there with so many different types of weapons at their disposal.

The oldest and best-known software weapons, computer viruses come in all shapes and flavors, from "harmless" prank messages to electronic forms of Ebola that chew up your data and spit it out as garbage. The very openness of the Internet--and the number of relatively inexperienced newcomers using it--makes it likely we'll be hearing about a lot of virus-ridden computers in the next few years.

According to experts at McAfee Associates, a maker of virus detection and protection software, as many as 10,000 viruses may be currently in circulation. And the company estimates that 300 to 400 new viruses are being created and circulated per month. That's a dozen or so new ones every day.

Some viruses infect your PC's boot sector--the first data area your computer seeks when you start it up--and rewrite the sector, crippling your system. Others infect the files that launch or run most of your software, rendering your programs unusable. (According to McAfee, macro viruses, which take effect when you execute a macro command from an application such as a word processor or spreadsheet, are now the most common type of virus being developed.)

Other deadly viruses erase your computer's CMOS setup tables (the records that tell your machine what sort of system it is), making it impossible for your computer to work.

Or consider a virus that makes only the smallest and most subtle of changes to your computer's data, the sort of thing you wouldn't notice until the moment when you really need something--and it's been corrupted.

But the nature of viruses and the fear they engender has led to another weapon of cyberterrorism, even subtler and more insidious than an actual virus: the false virus warning. The most infamous of these is the Good Times virus announced in December 1994, with warnings appearing on computers around the world. In fact, there was no Good Times virus, but the warning and the paranoia it created live on.

Worms are breeder programs, reproducing themselves endlessly to fill up memory and hard disks. Worms are often designed to send themselves throughout a network, making their spread active and deliberate, rather than taking the hitchhiker approach used by most viruses.

logic bombs
Logic bombs are embedded pieces of destructive code that detonate on preset dates or when a specified set of instructions is executed, unleashing destructive actions within a computer or throughout a network. Often left by disgruntled employees to wreak their havoc years later, logic bombs can be very hard to find.

Bots (from robot) are pieces of code designed to rove the Internet and perform specific actions. A newsbot, for example, might fetch only the news you want. In the wrong hands, though, bots can be destructive--cancelbots that erase newsgroup messages, censorbots that delete postings that their creators find offensive, and so on. On September 22, 1996, Usenet groups lost about 25,000 messages to a cancelbot.

SYN attacks
SYN attacks involve sending a torrent of connection requests--the same sort you make every time you click on a Web site--to targeted sites. In effect, a SYN flood creates a major traffic jam at the site, cutting it off. SYN floods are spreading, and any site can be a target. In September, a popular chess site was checkmated as a result of such an attack. SYN floods are attractive to wanna-be hackers because they require only simple programming--in some cases just a few lines of code. And sample SYN code is readily available both online and in print.
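A toy model (in present-day Python, all names and sizes invented) shows why the flood works: real TCP servers keep a fixed-size backlog of half-open connections, and that backlog is the resource a SYN flood exhausts:

```python
BACKLOG = 5  # slots for connections that have sent SYN but not finished

half_open = []

def receive_syn(source):
    """Model a server receiving a connection request."""
    if len(half_open) >= BACKLOG:
        return "dropped"        # queue full: real visitors are locked out
    half_open.append(source)    # wait for a final ACK that never arrives
    return "half-open"

# The attacker sends SYNs from forged addresses and never completes
# the handshake, so the backlog fills up and stays full.
for i in range(BACKLOG):
    receive_syn(f"forged-address-{i}")

print(receive_syn("legitimate-visitor"))  # dropped
```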

A mere threat can be as effective as an actual attack. Within the past year, according to unnamed sources, several U.S. banks have paid six-figure fees to buy off hackers who cracked the banks' security codes.

As more money moves on the Internet, the more appealing the Net will become to crooks. A survey by Science Applications International Corporation reported that computer break-ins at 40 corporations resulted in losses exceeding $800 million last year alone. It's a battlefield out there--and it may already be expanding into your machine.

Electronic terrors are very real and are being met by very real activity on a number of fronts. Government and military groups are mounting aggressive research programs and other forms of defense aimed at stymieing the likeliest data-doomsday scenarios. According to McAfee Associates, virus detection software is already a multibillion-dollar industry. Investment in corporate information security dwarfs that, with tens of billions being spent on firewalls, encryption technologies, and secure-communications protocols. Already we are seeing many corporations and institutions cutting themselves off from the Internet, sealing their information transfers into intranets to keep outsiders--and their schemes--out.

Firewalls themselves are spawning corporations devoted to protecting the firewalls against vulnerability. (For more information, consult the National Computer Security Association's firewall certification program.) One risk is that as more and more firewalls are erected, we may find ourselves cut off from previously public Internet sites.

Computer security has always been a thriving business; now it's explosively expanding. The federal government's National Security Agency plans to assign as many as 1,000 people to an information warfare department. Its shape, goals, and ultimate budget are not yet known, but its establishment sends a clear signal that federal security officials are watching cyberterrorism with growing concern.

The Internet itself may be relatively safe from full-scale attack, but the aftermath of the Morris worm continues to be felt. All it takes is a single vulnerable spot and a smart antagonist, and we could experience a Net-wide apocalypse.

But what about us, the individual users? In our case, the very openness of the Internet--what makes it so vulnerable--also works to our advantage: word about new viruses and other forms of sabotage spreads quickly. Software to deal with the problem can be distributed with equal speed.

You can and should do a few things to protect yourself:

* Get a good, frequently updated virus detection program. (You can download McAfee's VirusScan from the Web; or invest in a commercial package such as Trend's PC-cillin, Symantec's Norton AntiVirus, or Dr. Solomon's Anti-Virus Toolkit.) Use it every time you use your computer, and be obsessed with keeping it current.
* Change your passwords frequently, never give them out, and make sure they're random and unrelated to any of your personal information that may be on the Net.
* Be careful what you download and where it comes from.
* Use encryption software such as PGP to protect your electronic communications from the prying eyes of interceptors.
* Stay informed of the latest cyberterrorist goings-on; you need to know what you're up against before you can protect yourself from it.
* Make sure everyone who uses your computer follows the same precautions you do.

This is one potential apocalypse that will not go away as a result of software fixes or hardware upgrades. Money alone will not keep determined crooks at bay. Whether or not cyberterrorism brings down the Net depends on staying a step ahead of the best and brightest of the cyberwarriors.


You've got a great idea for a site. You've got the content, you've got the programming. You've even got the funding. Everything's in place for a Web triumph. If you build it, they will come. So you build it. But you can't get an address. There's no place for them to come to.

You've finally cleared your schedule and want to spend an afternoon just Net surfing. The junk food is at hand; the chair is comfortable. You dial up your ISP only to discover that the pool of address numbers available to subscribers is used up at the moment. Try again later.

These are the scenarios we face as the current Internet address structure becomes strained to the breaking point. More and more people want to get connected, but soon we won't have a sufficient number of addresses for all the computers on the Net. We're face to face with Famine, starving for numbers on the Net.

It all began with the U.S. Department of Defense. Seeking to design a network of networks that could withstand an apocalyptic conflagration such as nuclear war, the engineers of the original ARPANet designed it to allow different types of networks to communicate with each other through sets of instructions called protocols. The heart of these protocols is the Transmission Control Protocol (TCP). Its first cousin, the Internet Protocol (IP), extends the internetworking communications standards to the Net itself.

Together they are called TCP/IP, and they comprise the rules that allow all of the different routers and switches throughout the Internet to find the specific host computer of the site you're looking for. Currently, TCP/IP incorporates more than 1,000 different communications protocols, smoothing them into a relatively seamless whole that lets you surf anywhere on the Internet. This is accomplished through numerical addresses assigned to each computer on the Net. Whether large or small, corporate or private, every computer on the Internet must have a unique IP address. When the Net began to take shape back in 1969, an address structure was created that allowed what seemed like reasonable expansion of this network of networks.

The addressing system is composed of two basic parts--a network address and the address of the host computer itself. To enable what seemed a quarter of a century ago to be reasonable network expansion, the protocol designers assigned 32 bits of space for each address. Within that 32-bit space are four numbers separated by periods--the familiar dotted-decimal notation. (These numbers are what you sometimes see when you move from site to site on the World Wide Web.) Everything on the Net depends on these numbers and on their accurate and rapid recognition by the routers that direct Internet traffic.
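The packing can be sketched directly (in present-day Python, with an arbitrary example address): four numbers, eight bits apiece, fitting into one 32-bit value.

```python
def dotted_quad_to_int(dotted):
    """Pack a dotted-quad IP address into its 32-bit integer form."""
    a, b, c, d = (int(part) for part in dotted.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

print(dotted_quad_to_int("192.168.1.1"))      # 3232235777
print(dotted_quad_to_int("255.255.255.255"))  # 4294967295, the 32-bit maximum
```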

The apocalypse that's building now is occurring because the original Internet designers failed to foresee the rapid growth of the very network they created. They anticipated an internetwork that might include as many as 256 different networks--32 bits of address space could handle far more than that, if need be--and they designed accordingly.

Theoretically, 32 bits of space is enough to provide more than 4 billion host addresses; but as with so many theories, this one is impractical in the real world. Why? Because those addresses are stored on routing tables, which are used to locate the various computers on the Internet. The larger the routing table, the more time it takes for even the most sophisticated router to resolve your route. Indeed, many engineers feel that the theoretical IP limit is not the real problem. Long before we reach that 4 billion mark, our routers may no longer be able to handle the necessary number of addresses.

Both address famine and router overload are approaching now, at the speed of the Net. Collapse as a result of router overload could be just five years away if the Internet's growth continues at its present pace--and will happen even sooner if the Net's rate of growth increases. Even if we solve router overload one year from now, collapse due to address famine is on tap about five years after that. The Internet is about to reach its limit.

Just as the Internet itself is in constant evolution, so are the protocols that govern its operation. Currently taking shape is IPng--Internet Protocol Next Generation.

The Internet Engineering Task Force is working furiously to design and incorporate a new system of protocols to enable the Net's continued explosive expansion.

The first step is to extend the available address space from 32 bits to 128 bits, making a multibillion-node Net both possible and practical. One way would be to wrap an additional address structure (more numbers) around the existing structure, extending the information capacity of the address and the number of usable addresses in the system. Another approach would eliminate the need for wholly unique addresses for every computer on the Net, requiring them only for the networks themselves and duplicating individual addresses only on different networks.
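The arithmetic behind the extension is simple to check:

```python
# 32 bits versus the proposed 128 bits of next-generation addressing.
print(2 ** 32)   # 4294967296 -- the theoretical 4-billion-address ceiling
print(2 ** 128)  # roughly 3.4 x 10**38 -- room for a multibillion-node Net
                 # many times over
```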

In the meantime, engineers are working on protocol upgrades involving software fixes that make managing address tables easier and more efficient for current and next-generation routers, easing some of the burden on hardware and buying time for a wholesale revision of how the Internet handles addresses.

It won't be easy. Changing the way the Internet works means not only upgrading or updating every router, but also ensuring that every computer on the Net--yours, mine, everyone's--is updated as well. Hardware must be replaced and software distributed and installed. There will be inconveniences and errors, minor problems, and major screw-ups. But a future with a multibillion-node Internet will likely take a path that skirts the edge of address famine and the apocalypse that would result.

You click on the site you want to visit, then check your watch. Since it's not the world's most popular site, you've only got time to mow the front lawn before the page loads on your screen.

That's OK. This afternoon you've got to grab some video footage for a presentation you're giving next week. That ought to give you time to spruce up the backyard and clean the rain gutters. Then again, it's Saturday and the traffic volume will be higher. Maybe you'll paint the garage, too.

The "World Wide Wait" may once have seemed like a clever neologism, but these days time spent waiting on the Web seems to outstrip time actually spent exploring it.

Not every horseman gallops toward us bearing instant and dramatic doom, devouring us in a gulp. You can be nibbled to death as well, and the increasingly large chunks of time wasted could well eat up even your interest in staying on the Net--not to mention eliminating the commercial zeal of those who are building the sites that will supposedly finance the further expansion of the Internet.

Death comes for the Web.

In a way, the Internet is a victim of its own success. The more people online, the more people--and companies--who put up content for them. The more sophisticated the computers they're using, the more multimedia elements put up for them to find. Which brings more people online.

"Information superhighway?" "Information patchwork" is more like it--and those patches are being mightily strained by growing traffic. Strained enough to crash the Net altogether? Probably not, but certainly strained enough to degrade our Internet experiences.

Much of the Internet congestion has occurred because the network is growing faster than the planners can plan it. The Internet originated as a system to link governmental, military, and academic computers. This network backbone served its original purposes, handling a trillion or so bytes a month in 1991--before the World Wide Web caught the public fancy and put the Net on everyone's mind. Since then, traffic has grown at an exponential rate, passing 17 trillion bytes a month by the end of 1994. And to make things worse, the backbone of the Internet is no longer maintained by the government; rather, it consists of a loose conglomeration of international telephone companies, service providers, private networks, and so on. You know the result. We all know the result. The Net has slowed down.

For individual Net surfers, the most noticeable problem occurs when a lot of traffic tries to move through one of the Internet's narrow arteries all at once. For Internet traffic in the aggregate, the problem is more complex and more fundamental. Every packet of information traveling across the Net moves into and out of networks and phone lines, through interchanges and switches, encountering routers that move you from one network node to another.

It works like this: you find a site you want to visit, and click on a link. Your request goes to a router at your local service provider, which forwards it to a router at a regional service provider, which forwards it to yet another router--this one national in scope (and usually owned by a major telecommunications operator such as MCI or Sprint). This last step is the backbone network. And once your request has climbed up that ladder, you have to climb down the other side to get to your destination, encountering heavily taxed routers all the way.
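The ladder can be sketched as a toy trace (all router names here are invented for illustration):

```python
# Hypothetical path from your click to a distant Web server.
route = [
    "local ISP router",
    "regional provider router",
    "national backbone router",
    "destination's regional router",
    "destination's local router",
    "destination Web server",
]

def forward(request):
    """Print each hop a request traverses on the way to its target."""
    for hop, router in enumerate(route, start=1):
        print(f"hop {hop}: {request} -> {router}")

forward("GET http://www.example.com/")
```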

Most of the routers in use today were not designed to handle this kind of volume, and they're starting to buckle under the strain. (See "Famine: not enough addresses" for details on the problems facing today's routers. If you're using Windows 95, you can get a good picture of what your poor request has to go through by running the tracert utility from a DOS prompt.)

Add to this the increasing size of the data packets that return in response to your request. So even though you may send out a simple click on a link, what comes back to you--especially from large commercial sites--may include sound, graphics, animation, even video, all of which eat bandwidth. And you're not the only one asking for those glitzy graphics and vivacious videos. Every request contributes to the snarl.

Web crawl. Traffic jams. The death of the Net? Probably not. It's more likely that we'll run into increasing numbers of brownouts, slowdowns, and hiccups in service. They're small annoyances once in a while--but major problems if you're trying to build a business on the Web. It's not apocalyptic...or is it?

Death from a thousand small nibbles can be more painful than having your head bitten off. Traffic overload may not wipe out the Net, but it may turn it into a medium not worth bothering with.


You'd think that the easiest way to solve traffic problems is simply to build bigger roads, and for some of the Net's problems, that's true. Fiber-optic lines are being strung at faster and faster rates around the country, increasing the telephone companies' data-moving capacity. Fiber is expensive, though, and you can bet that the providers will supply the lines only as long as they can see a return on their investment.

At the same time, cable television companies are very interested in the Internet market. Their lines are already high-capacity, and as much as 70 percent of the United States is currently wired for cable.

Cable-based Internet services are being tested and deployed around the country, but in most communities, cable is a one-way street. More and more cable companies are investing in the infrastructure required for two-way communications, though, and by early in the next century, cable paths to the Net may be quite common.

Faster routers are on the way, too, designed to handle the growing volume of traffic coming their way. Implementation of new routers will be constant and ongoing over the next few months--provided (as always) that it makes financial sense.

What about the Net equivalents of telecommuting and flex-time at work, where employees adjust their schedules to suit various needs? A lot of Internet schemes now call for downloading only limited stuff in real time and moving the bulk of information-gathering to off-hours, or even delivering some of the high-bandwidth multimedia stuff on CD-ROM. CompuServe and the new Microsoft Network are turning to the CD-ROM approach.

But flex-time for the Net--shifting downloads and time-intensive operations to off-peak, low-traffic periods--is a solution that's not really a solution: at best, a temporary fix. As the Web spreads its strands across the world, there will be fewer and fewer off-peak periods. Lots of people will be online all the time, every day. And the Internet is about convenience, about community, about information on demand. I want my CNET now, not in the middle of the night when I'm asleep.

For all the noise the government has made about embracing the information superhighway, you might suppose that a great federal initiative along the lines of the interstate highway system is in the works. Forget it. In the lean and mean 1990s, the government contribution will likely take the form of deregulation and incentives for private investment. So it will be up to the telecommunications companies, cable television industry, and Internet service providers to pony up the money needed to break the traffic jam.

Market economics say they'll probably do just that. After all, there's a traffic problem only in places where people want to go. And if people want to go somewhere, other people--in the form of banks, investors, and infrastructure builders--will put up the wherewithal to help them on their way.

We may face a few more years of traffic jams, and there may always be spots on the Net that get snarled. But it's early, and of the four horsemen of the Net, traffic is the one getting the most--and the most immediate--attention.

With luck, we won't have to wait long to break the log-on jam.
