Service providers operate their networks based on a series of calculations. Among the variables they consider is concurrency, or the number of subscribers likely to be tuned in or logged on at any given time. In recent modeling, cable operators have planned for roughly 10% concurrency with video and 1% concurrency with high-speed data. (The 1% number may be higher now; I have that figure from 2006.) If there are a thousand subscribers, that means the network is built so that 100 can watch video streams via cable TV at once, and 10 can simultaneously use the promised 6 Mbps (or 8 Mbps, or 10 Mbps…) for Internet surfing. (Or 20 subscribers could use 3 Mbps. Or 60 could use 1 Mbps. You get the idea.)
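To make the arithmetic concrete, here is a back-of-the-envelope sketch of the math above. The function name and the 60 Mbps figure are illustrative, not anything from an operator's actual planning tools; the 60 Mbps simply falls out of the article's own numbers (1% of 1,000 subscribers × 6 Mbps).

```python
def provisioned_capacity_mbps(subscribers: int, concurrency: float, speed_mbps: float) -> float:
    """Shared bandwidth an operator must provision if `concurrency`
    (a fraction) of `subscribers` all use `speed_mbps` at once."""
    return subscribers * concurrency * speed_mbps

# 1% of 1,000 subscribers at the promised 6 Mbps:
capacity = provisioned_capacity_mbps(1000, 0.01, 6)
print(capacity)       # 60.0 Mbps of shared capacity

# That same 60 Mbps serves 20 users at 3 Mbps, or 60 users at 1 Mbps:
print(capacity / 3)   # 20.0 concurrent users
print(capacity / 1)   # 60.0 concurrent users
```

The point of the model: the operator provisions for the expected concurrent load, not for every subscriber running at full speed at once.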
I bring this up because Ars Technica has posted an article about the FCC hearing on Comcast’s broadband traffic management policies. Set the controversy itself aside for a moment. The article includes a paragraph arguing that Comcast cannot fully support its product because its network is not built to handle peak usage all the time. The last sentence acknowledges Comcast’s defense that no networks are built that way, but the acknowledgement is a throw-away line at best, as if Comcast had invented the claim just to bolster its case.
Concurrency modeling is most certainly not something Comcast made up. In fact, if I understand things correctly, our power grids operate the same way, which is why on hot days there are brownouts and blackouts in some regions as everyone turns the air conditioner on high. Until now, the 10% and 1% concurrency models have worked quite well in the cable industry. They make economic sense and they’ve usually met subscriber bandwidth needs. Comcast shouldn’t be blamed for operating on an industry standard. The problem is, that standard has to change.
On the data side, multimedia applications (P2P and otherwise) are raising the bandwidth consumers use for common tasks, making it much more likely that they’ll hit peak usage in greater numbers. On the video side, on-demand video is raising concurrency rates because subscribers suddenly have far more video streams to choose from. Chances are there’s always something on that a subscriber would want to watch. By conservative estimates, on-demand services will raise concurrency rates from 10% to 25% by 2012.
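To see what that estimate implies, here is a quick illustrative calculation, again using the article's hypothetical 1,000-subscriber system. The numbers are just the two concurrency figures cited above applied to that example.

```python
# Illustrative only: how required video capacity scales if concurrency
# rises from 10% to 25%, per the conservative estimate cited above.
subscribers = 1000
old_streams = int(subscribers * 0.10)   # 100 concurrent streams today
new_streams = int(subscribers * 0.25)   # 250 concurrent streams projected

print(old_streams, new_streams)         # 100 250
print(new_streams / old_streams)        # 2.5x the video capacity needed
```

In other words, a network provisioned to the old standard would need to carry two and a half times as many simultaneous streams.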
We’ve used the term “always on” since broadband first became popular. Maybe we should change that now to “always on high.”