Rachel by the Bay's atom.xml file is about 544 KiB (and, for some reason, it always seems to be sent uncompressed regardless of the "Accept-Encoding" header, but I digress). It's by far the most requested file, with every subscriber's feed reader polling it periodically. For example, Feeder's default polling interval is ten minutes, but some readers poll more frequently.
That file is served over HTTPS, so every response must be encrypted individually. If it's cached by the client, that's a 304 and a couple hundred bytes of response headers to encrypt. If it's not, that's a 200 and a couple hundred thousand bytes to encrypt.
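What separates a 304 from a 200 is whether the reader replays the validators (ETag and Last-Modified) it saw on its previous fetch. Here's a minimal sketch of building such a conditional request with Python's urllib; the URL and validator values are just examples, not taken from the real server:

```python
import urllib.request

def build_poll_request(url, etag=None, last_modified=None):
    """Build a conditional GET: the server can answer with a tiny 304
    instead of the full feed if nothing has changed since last time."""
    req = urllib.request.Request(url)
    if etag:
        req.add_header("If-None-Match", etag)
    if last_modified:
        req.add_header("If-Modified-Since", last_modified)
    return req

# Example values; a real reader stores these from the previous response.
req = build_poll_request("https://rachelbythebay.com/w/atom.xml",
                         etag='"abc123"',
                         last_modified="Mon, 06 May 2024 00:00:00 GMT")
# urllib stores header names in capitalized form internally.
print(req.get_header("If-none-match"))  # '"abc123"'
```

A reader that never stores these values can only ever get the full 200, which is exactly the failure mode described below.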
Now imagine a garbage reader that doesn't cache the feed and requests the full file every time, as Tiny Tiny RSS tends to do judging by my access logs. Multiply that by the number of subscribers using these busted readers. Multiply that by the number of requests each of them makes per day.
That's a whole lot of load on the server's CPU wasted for nothing.
These readers download 544 KiB per request. At a hypothetical 1,000 requests per second from badly behaved readers, that's roughly 530 MiB of uncached responses to encrypt and upload every second, enough to saturate a gigabit Ethernet link more than four times over.
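The back-of-envelope arithmetic checks out; the 1,000 rps figure is the hypothetical rate from above, not a measurement:

```python
# One full atom.xml response at the stated size, times the assumed
# request rate from readers that never send conditional requests.
size_bytes = 544 * 1024      # 544 KiB per uncached response
rps = 1_000                  # hypothetical bad-reader request rate
bytes_per_sec = size_bytes * rps
gbit_per_sec = bytes_per_sec * 8 / 1e9
print(f"{bytes_per_sec / 2**20:.0f} MiB/s ≈ {gbit_per_sec:.1f} Gbit/s")
# → 531 MiB/s ≈ 4.5 Gbit/s, more than four gigabit links' worth
```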
The server also has to process the useful requests on top of that.