Just before Christmas, in an annual telebriefing designed to predict future technology trends, John Gantz, chief research officer of International Data Corp. (IDC), in Framingham, Mass., remarked that “a major cyber-terrorism event will disrupt the economy and bring the Internet to its knees for at least a day or two.”
“The event could take the form of a denial-of-service attack, a network intrusion, or even a physical attack on key network assets,” Gantz said.
Earlier that same month, Ken Watson, the president and chairman of the Partnership for Critical Infrastructure Security, remarked to CNET, “The worst threat is a combination of a physical attack and then a cyber-attack that would disable the response.” Then, lumping the Internet, emergency response networks, and the computers that run traffic signals into one massive construct, Watson added, “So, if there was another horrendous bombing attack and then someone disabled 911 emergency responders or screwed with the traffic lights, that would be a pretty significant nightmare scenario.”
For as long as the Internet continues to thrive, the pundits will predict its demise. That is the sexy part of the story, the unexpected twist to the world’s embrace of inter-connectivity. Predicting its continued existence isn’t nearly as exciting; however, history suggests that the demise of the Internet is closer to science fiction than to science.
In fact, reading such comments makes a long-term Internet geek wonder what these folks think the Internet is and just how it is constructed.
No Knees to Kick
The Internet is an inter-net: a gigantic network of networks. There are over 200 million machines connected via this network, as listed in the most recent survey by the Internet Software Consortium. Most media distribution mechanisms are single-source. For example, a newspaper has a printing plant and radio and television stations have studios and antennas (or cable connections). In contrast, the Internet is distributed and egalitarian. Every “host” machine is on a par with every other host machine. And while there are major connection points, there are no unique connections in the Internet’s core.
This construction makes the success of a physical attack incredibly unlikely: what would the attacker target? In fact, the Internet has already survived a severe physical attack on network assets. At just past 9 a.m. EDT on September 11, 2001, Matrix NetSystems’ global Internet performance sensors registered an 8 percent decrease in performance speed; a similar percentage of the world’s servers were inaccessible from Matrix’s worldwide beacons. Ninety minutes later, overall performance had rebounded to within 2 percent of what it had been at 8 a.m. Despite an incredible amount of physical destruction, the fabric of the Internet suffered only slightly. Transatlantic cables that came into lower Manhattan were fused by the destruction, two switching stations were destroyed, and tens of thousands of computers around Ground Zero were demolished. Yet global Internet performance suffered only a mild degradation and rebounded quickly.
Some experts see denial-of-service (DoS) attacks as a more likely threat than physical destruction. However, statistical evidence gathered from previous attacks does not support this claim, either. The DoS/Smurf attack of February 9, 2000, produced only a 3 percent notch in worldwide ISP reachability, according to Matrix NetSystems, which was monitoring 1,701 ISPs at the time. Because reachability reflects the availability of routers and servers to respond to requests for information, it is the measurement that best reflects the overall performance of the Internet at any given time. The “Code Red” DoS attack had a similar effect a year later, a 3 percent drop in reachability, even though the perpetrators claimed they could “take over.” Even the much-publicized Slammer worm of January 25 only slowed the Internet for several hours; it didn’t come close to bringing the global network down.
Despite alarming statements that the end of the Internet is near, it has withstood massive physical destruction and multiple instances of cyber attack. Do not assume, however, that physical or software attacks are totally innocuous (they’re not), but they aren’t likely to cripple an entity as large and diverse as the Internet. Even in catastrophic events such as Sept. 11, traffic flow only slowed; it didn’t stop.
Such slowdowns occur because large amounts of traffic suddenly begin passing through wires and connection points designed for lower rates. Such slowing matters primarily to those whose business depends on rapid turnaround: fast-moving markets where prices change by the second, or arbitrageurs chasing tenths of a percentage point. It doesn’t substantially impact such common Internet operations as transferring files or sending email, which make up the overwhelming majority of the activity the Internet carries at any given time.
As the Internet grows larger and society increases its dependence upon it, it becomes a bigger target for both attackers and pundits. In 1985, just prior to the period in which the Internet grew from 2,000 to 5,000 hosts in about 10 months, Eugene Spafford, a Georgia Tech researcher who is now the director of Purdue University’s Center for Education and Research in Information Assurance and Security (CERIAS), predicted the imminent demise of the Internet. In September 1994, Elizabeth Lear-Newman, a prominent technology columnist, foresaw the same imminent death in an article in Internet World. At that time the Internet had waxed to over 750,000 hosts. As previously noted, there are over 200 million hosts on the Internet now. The Internet isn’t dying; it’s thriving. And the more it grows, the more resilient it becomes.
The whole point of a distributed, packet-switching system is that it isn’t centralized. There are several routes between most points. If one is fishing with a hook and line, severing that line handicaps the ability to fish. If one is dragging an enormous net, snagging a thread and cutting one knot does not handicap the ability to fish. The Internet is an enormous net, not a single line. As it expands, the number of connections increases, and it becomes increasingly difficult to impact overall performance by attacking any single or even multiple points.
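The hook-and-line versus net analogy can be made concrete with a toy graph model. This is a minimal sketch, not a model of real Internet topology; the node names and both topologies are hypothetical:

```python
from collections import deque

def reachable(graph, start, dest, removed=frozenset()):
    """Breadth-first search: is dest still reachable from start
    after the nodes in 'removed' have failed?"""
    if start in removed or dest in removed:
        return False
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == dest:
            return True
        for nbr in graph.get(node, ()):
            if nbr not in seen and nbr not in removed:
                seen.add(nbr)
                queue.append(nbr)
    return False

# A "line": exactly one path, so a single failure severs it.
line = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}

# A "net": a mesh with redundant routes between the same endpoints.
mesh = {
    "A": ["B", "C"], "B": ["A", "C", "D"],
    "C": ["A", "B", "D"], "D": ["B", "C"],
}

print(reachable(line, "A", "D", removed={"B"}))  # False: the line is cut
print(reachable(mesh, "A", "D", removed={"B"}))  # True: traffic reroutes via C
```

The same failure that severs the line merely diverts traffic in the mesh, and every node added to the mesh makes a disabling cut harder to find.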
Certainly there are Internet exchange points, but there are over 10,000 of them. There are at least that many router banks. And there are several hundred Domain Name System (DNS) servers. Late last year there was a DoS attack on the 13 “root” DNS servers. It wasn’t a disaster. The Internet’s built-in redundancy meant that more than nine of those 13 servers would have had to go out of service simultaneously before users felt any effect. Last year’s attacks never came close; few people experienced anything more than a minor inconvenience.
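That redundancy works because a resolver simply tries the next root server when one fails to answer. The following is a simulation, not real DNS code; the “knock out 9 of 13” failure set is a hypothetical illustration of the point above:

```python
def resolve(servers, is_up):
    """Return the first root server that answers; fail only if all do."""
    for server in servers:
        if is_up(server):
            return server
    raise RuntimeError("all root servers unreachable")

# The 13 root server identities are conventionally labeled a through m.
roots = [f"{letter}.root-servers.net" for letter in "abcdefghijklm"]

# Hypothetical attack: 9 of the 13 are taken out of service.
down = set(roots[:9])
print(resolve(roots, lambda s: s not in down))  # j.root-servers.net still answers
```

As long as even a handful of the 13 respond, queries succeed; users see at worst a slight delay, not an outage.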
Vulnerability Starts at Home
However, businesses need to learn to mimic the resiliency of the Internet, especially those businesses that rely on it. Many businesses are still fishing with a hook and line when it comes to their Internet resiliency. When their carrier’s cable gets cut, or a DoS attack targets their provider, they have no back-up options. For them, the end of the Internet is just a click away.
What can a business do? If possible, a business should mimic the resiliency of the Internet and “multi-home.” That is, a business should subscribe to more than one ISP. In past events, some ISPs have been more affected than others: occasionally this has been the result of attackers specifically targeting that provider; in other cases it has been because the systems and network administrators of one ISP have been more attentive than those of another. The adage “don’t put all of your eggs in one basket” applies. Basing all of your Internet connectivity on a single ISP makes your personal risk of an Internet disaster high, even if the impact to the Internet overall is small.
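The multi-homing idea reduces to the same try-the-next-path logic. This sketch models failover across two uplinks; the provider names are hypothetical, and a real deployment would use BGP announcements or policy routing rather than application-level retries:

```python
def send(request, uplinks, is_healthy):
    """Send via the first healthy uplink; a multi-homed site has a fallback."""
    for isp in uplinks:
        if is_healthy(isp):
            return f"sent {request!r} via {isp}"
    raise ConnectionError("all uplinks down")

# Hypothetical providers for a multi-homed business.
uplinks = ["isp-alpha", "isp-beta"]

# Simulate the primary carrier's cable being cut: only the backup is healthy.
healthy = {"isp-beta"}
print(send("GET /", uplinks, lambda isp: isp in healthy))
```

With one subscription, the cut cable is an outage; with two, it is a routing decision.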
There have certainly been malicious attacks on the Internet. There will undoubtedly be more. Problems occur from other causes as well: cables get cut inadvertently, incorrect routing tables get loaded, and human error surfaces from time to time. But in a distributed network, these events are a nuisance, not a disaster. This is as true for the Internet as a whole as it is for each organization that relies upon it.
Over the centuries, there have been warnings of great floods and volcanic eruptions, of hostile alien invasions, of nation-swallowing earthquakes and life-ending asteroids: of the end of the world. Prophetic announcements of imminent disaster don’t seem to slow or cease over time. As we become more reliant upon the Internet, it is only natural that the doomsayers bring their fearmongering to this medium.
How should reasonable people react to these warnings? Simply put, consider the source and act accordingly. A healthy dose of skepticism should greet any charlatan who predicts disaster in one breath and peddles the cure in the next. While there will be events from time to time that affect individual connections to the Internet, we are not going to see the whole thing disappear. Businesses and individuals should take precautions to mitigate isolated incidents and pay little heed to pundits trying to play off of fear and uncertainty.