This is awful. I've noticed significant jumps in bandwidth consumption in the last month or two. This is what Awstats shows now:
Traffic viewed = 2.37 GB
Traffic not viewed = 14.29 GB
As Awstats tells us, "not viewed" includes "traffic generated by robots, worms, or replies with special HTTP status codes."
Sadly, as of this writing, I've exceeded my allotted bandwidth -- and I run a "modest" personal site, in terms of both scale and popularity.
Some possible reasons:
* It looks like most of the traffic can be blamed on busy/nosy search engine bots (some consuming several gigabytes per visit -- that ain't right; see the rough log-tally sketch at the end of this post).
* I host a few blogs. These get spammed heavily (the vast majority of the spam ads are caught by filters, though that's beside the point here -- the requests still hit the server).
* I run a web feed aggregator with some 100 blog RSS feeds and the like. Until recently the feed cache hadn't been configured correctly, so that could account for some of the usage (not sure how much).
* I used to host a calendar that ran on into infinity, but quite a while ago I marked it as disallowed in my robots file, so that shouldn't be an issue (rogue bots notwithstanding); the entry is along the lines of the snippet below.
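
For reference, the robots.txt entry is roughly this (the /calendar/ path is just a placeholder for the calendar's actual URL prefix):

```
User-agent: *
# /calendar/ is a placeholder -- substitute the calendar's real path
Disallow: /calendar/
```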
Is your bandwidth anywhere near this ridiculous a proportion (mostly non-human traffic)? What do you do about it? Any suggestions? Thanks.
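
P.S. For anyone wanting to size up the same thing in their own logs, something along these lines should tally bytes per user agent (a quick sketch, assuming an Apache combined-format access log; the access.log path is a placeholder):

```python
# Quick sketch: tally bytes served per user agent from an Apache
# combined-format access log. The log path and regex are assumptions --
# adjust them to your own server's setup.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder path

# Combined log format:
# host ident authuser [date] "request" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} (\d+|-) "[^"]*" "([^"]*)"'
)

bytes_by_agent = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        size, agent = match.groups()
        if size != "-":
            bytes_by_agent[agent] += int(size)

# Show the ten heaviest user agents, in megabytes
for agent, total in bytes_by_agent.most_common(10):
    print(f"{total / 1048576:8.1f} MB  {agent}")
```

Sorting by total makes it fairly obvious which crawlers are doing the damage.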