Can the internet keep up?

Spent an hour or so reading a great report, “The State of the Internet,” by Analysys Mason for the Internet Society. It is a rebuttal to those who think the Internet is falling apart and needs fixing, including the proposed shift to ‘sending network pays’.

The document is an excellent primer on the state of the internet today, the crucial role of IXPs, and how, historically, three forces (technology, investment, and changes in traffic flows) have collectively met the challenge of the internet's exploding use.

A good case in point: I caught the tail end of a report on my local NPR station about how researchers at MIT and elsewhere are using algebraic equations to reconstitute dropped packets, thereby removing data-congestion bottlenecks. The results are very impressive. To quote:

Testing the system on Wi-Fi networks at MIT, where 2 percent of packets are typically lost, Medard’s group found that a normal bandwidth of one megabit per second was boosted to 16 megabits per second. In a circumstance where losses were 5 percent—common on a fast-moving train—the method boosted bandwidth from 0.5 megabits per second to 13.5 megabits per second. In a situation with zero losses, there was little if any benefit, but loss-free wireless scenarios are rare. [source]
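The technique behind those numbers is network coding: instead of retransmitting each lost packet, the sender transmits random algebraic combinations of packets, and the receiver solves for the originals once it has enough independent combinations. The sketch below is an illustrative toy over GF(2) (plain XOR), not the actual system from Médard's group, which uses larger finite fields and integrates with TCP; all function names here are my own.

```python
import random

def encode(packets, n, rng):
    """Produce n coded packets, each a random XOR combination of the
    source packets. Each coded packet carries its coefficient vector."""
    k = len(packets)
    coded = []
    for _ in range(n):
        # Random nonzero coefficient vector over GF(2).
        coeffs = [rng.randint(0, 1) for _ in range(k)]
        if not any(coeffs):
            coeffs[rng.randrange(k)] = 1
        payload = bytes(len(packets[0]))
        for c, p in zip(coeffs, packets):
            if c:
                payload = bytes(a ^ b for a, b in zip(payload, p))
        coded.append((coeffs, payload))
    return coded

def decode(coded, k):
    """Gaussian elimination over GF(2). Returns the k source packets,
    or None if the received combinations don't yet span the space."""
    rows = [(list(c), bytearray(p)) for c, p in coded]
    for col in range(k):
        pivot = next((i for i in range(col, len(rows)) if rows[i][0][col]), None)
        if pivot is None:
            return None  # not enough independent packets received
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for i in range(len(rows)):
            if i != col and rows[i][0][col]:
                rows[i] = ([a ^ b for a, b in zip(rows[i][0], rows[col][0])],
                           bytearray(x ^ y for x, y in zip(rows[i][1], rows[col][1])))
    return [bytes(rows[i][1]) for i in range(k)]
```

The point the quote makes falls out of this structure: the receiver doesn't care *which* coded packets arrive, only that enough of them do, so individual losses no longer trigger the round-trip retransmissions that stall Wi-Fi throughput.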

Looks as if the internet will be around for a couple more years after all.