Apple's criteria for bandwidth limitations on users of the mac.com website hosting service:
"Apple has recently implemented bandwidth limitations on iTools HomePages. If an iTools user's website receives more than 500 hits in a six-hour period, the user is limited to roughly two times their iDisk capacity of data throughput in that six-hour period.
iTools users who have upgraded their iDisk capacity will have additional bandwidth. If an upgraded user's website receives FEWER than 500 hits in a six-hour period, the user is limited to roughly 14 times their iDisk capacity in that six-hour period. If an upgraded user's website receives MORE than 500 hits in a six-hour period, the user is limited to 2.5 times their iDisk capacity in that 6 hour period.
When these limitations are exceeded, that iTools user's websites will be disabled for a period of up to, but not greater than, 12 hours. The limits stated here are subject to change without notice."
[Macintouch]
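Just to make the rules above concrete, here's a quick sketch of how I read the policy (my own interpretation, not Apple's actual code; the behavior for a standard account under 500 hits isn't stated, so I assume no cap there):

    def bandwidth_cap_mb(idisk_capacity_mb, hits_in_6h, upgraded):
        """Rough data-throughput cap (in MB) for a six-hour window,
        per my reading of Apple's stated iTools policy."""
        if upgraded:
            # Upgraded iDisk accounts get more headroom.
            multiplier = 14 if hits_in_6h < 500 else 2.5
        else:
            # The policy only states a cap for standard accounts that
            # receive MORE than 500 hits in six hours; I'm assuming
            # there's no limit below that threshold.
            if hits_in_6h <= 500:
                return None  # assumption: not limited
            multiplier = 2
        return multiplier * idisk_capacity_mb

    # A standard 20 MB iDisk account getting heavy traffic:
    print(bandwidth_cap_mb(20, hits_in_6h=1200, upgraded=False))   # 40 MB
    # An upgraded 100 MB account staying under the 500-hit threshold:
    print(bandwidth_cap_mb(100, hits_in_6h=300, upgraded=True))    # 1400 MB

Exceed the cap and, per the quoted policy, the site gets disabled for up to 12 hours.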
Now, setting aside the fact that maybe they should have warned users first, and that I think this policy is hurting small developers who use Apple's servers to distribute their software, I'm very interested in any kind of approach to this problem.
It looks like companies are realizing that bandwidth has a cost and that rules are needed to manage these costs. On our servers, we have customers who pay very low prices and use *a lot* of bandwidth and processor time, while others pay more and use almost nothing. But it's hard to build a statistical model when there's almost no data to work with.
The other issue is quality: are people willing to pay for services such as backups, redundant servers, and more bandwidth? If somebody offered such services, would webloggers be interested in paying more (or, in some cases, in starting to pay) for their tools and services?