
We need mitochondria.


They are technically no longer individual life forms. They sure used to be, but we merged quite some time ago. Of course, that opens a whole other can of worms with respect to who you really are. You're trillions of microorganisms living together, and quite a few of them don't even share your DNA.


Cats do attack bats. See https://www.sciencedirect.com/science/article/abs/pii/S16165...

More anecdotally, I knew a cat that brought in bats almost daily, which is why the owner had her vaccinated against rabies (the classical rabies virus, RABV, has been extirpated here).


You may find the recently published article “A General Principle of Neuronal Evolution Reveals a Human-Accelerated Neuron Type Potentially Underlying the High Prevalence of Autism in Humans” interesting.

https://academic.oup.com/mbe/article/42/9/msaf189/8245036


Some of these crawlers appear to be designed to avoid rate limiting based on IP. I regularly see millions of unique IPs making strange requests, each sending just one or at most a few per day. When a response contains a unique redirect, I often see a geographically distinct address fetching the destination.
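
A minimal sketch of that unique-redirect correlation, assuming a Flask app; the /trace endpoints, the in-memory token store, and the log line are illustrative, not a production setup:

    import secrets
    from flask import Flask, redirect, request

    app = Flask(__name__)
    issued = {}  # token -> IP address the redirect was issued to

    @app.route("/trace")
    def issue_redirect():
        # Hand each client a one-time token embedded in the redirect target.
        token = secrets.token_urlsafe(16)
        issued[token] = request.remote_addr
        return redirect(f"/trace/{token}", code=302)

    @app.route("/trace/<token>")
    def follow_redirect(token):
        first = issued.pop(token, None)
        if first and first != request.remote_addr:
            # Same logical client, two source addresses: a hint that a
            # distributed crawler is rotating IPs between requests.
            app.logger.warning("redirect issued to %s, fetched by %s",
                               first, request.remote_addr)
        return "ok"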


"I regularly see millions of unique ips doing strange requests, each just one or at most a few per day."

How would UA string help

For example, a crawler making "strange" requests can send _any_ UA string, and a crawler doing "normal" requests can also send _any_ UA string.

The "doing requests" is what I refer to as "behaviour"

A website operator might think, "Crawlers making strange requests send UA string X but not Y."

Let's assume the "strange" requests cause a "website load" problem^1.

Then a crawler, or any www user, makes a "normal" request and sends UA string X; the operator blocks or redirects the request unnecessarily.

Then a crawler makes a "strange" request and sends UA string Y; the operator allows the request and the website "blows up".

What matters for the "blowing up websites" problem^1 is behaviour, not UA string.

1. The article's title calls it the "blowing up websites" problem, but the article text calls it a problem with "website load". As always, the details are missing. For example, what is the "load" at issue? Is it TCP connections or HTTP requests? What number of simultaneous connections and/or requests per second is acceptable, and what number is not? Again, behaviour is the issue, not UA string.

The acceptable numbers need to be published; for example, see the documentation for "web APIs".
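
A sketch of what publishing those numbers in-band could look like; the 60-requests-per-minute figure is illustrative, the header names follow a draft of the IETF RateLimit-headers convention, and the in-memory sliding window is a toy, not a production limiter:

    import time
    from collections import defaultdict, deque
    from flask import Flask, jsonify, make_response, request

    app = Flask(__name__)
    LIMIT, WINDOW = 60, 60.0      # illustrative: 60 requests per minute
    hits = defaultdict(deque)     # IP -> timestamps of recent requests

    @app.route("/api/resource")
    def resource():
        now, q = time.monotonic(), hits[request.remote_addr]
        while q and now - q[0] > WINDOW:    # drop hits outside the window
            q.popleft()
        if len(q) >= LIMIT:
            resp = make_response("rate limit exceeded", 429)
            resp.headers["Retry-After"] = str(int(WINDOW - (now - q[0])) + 1)
            return resp
        q.append(now)
        resp = make_response(jsonify(ok=True))
        resp.headers["RateLimit-Limit"] = str(LIMIT)      # published limit
        resp.headers["RateLimit-Remaining"] = str(LIMIT - len(q))
        return resp

This way the acceptable numbers travel with every response instead of living only in prose documentation.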


I am not making any point about UA strings, just about the difficulty of rate limiting.


"Some of these crawlers appear to be designed to avoid rate limiting based on IP."

Unless the rate is exceeded, the limit is not being avoided.

"I regularly see millions of unique IPs making strange requests, each sending just one or at most a few per day."

Assuming the rate limit is more than one or a few requests every 24h, this would be complying with the limit, not avoiding it.
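
For scale, a back-of-envelope calculation with illustrative figures, showing what per-IP-compliant traffic can still add up to in aggregate:

    # Each IP individually stays under any plausible per-IP limit,
    # yet the aggregate request rate is substantial. Both figures
    # below are illustrative assumptions.
    ips = 2_000_000                        # distinct source addresses per day
    per_ip_per_day = 3                     # requests from each address
    print(ips * per_ip_per_day / 86_400)   # ~69.4 requests/second on average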

It could be that sometimes the problem website operators are concerned about is not "website load", i.e., the problem the article is discussing, but something else (NB. I am not speculating about this particular operator; I am making a general observation).

If a website is able to fulfill all requests from unique IPs without affecting quality of service, then it stands to reason that "website load" is not a problem the website operator is having.

For example, the article's title claims Meta is amongst the "worst offenders" in creating excessive website load via "AI crawlers, fetchers".

Meta has been shown to have used third-party proxy services with rotating IP addresses in order to scrape other websites; it also sued one of these services because it was being used to scrape Meta's website, Facebook.

https://brightdata.com/blog/general/meta-dismisses-claim-aga...

Whether the problem that Meta was having with this "scraping" was "website load" is debatable; if the requests were being fulfilled without affecting QoS, then arguably "website load" was not a problem.

Rate-limiting addresses the problem of website load; it allows website operators to ensure that requests from all IP addresses are adequately served, as opposed to preferentially servicing some IP addresses to the detriment of others (degraded QoS).

Perhaps some website operators become concerned that many unique IP addresses may be under the control of a single entity, and that this entity may be a competitor; this could be a problem for them.

But if their website is able to fulfill all the requests it receives without degrading QoS, then arguably "website load" is not a problem they are having.

NB. I am not suggesting that a high volume of requests from a single entity, each complying with a rate limit, is acceptable, nor am I making any comment about the practice of "scraping" for commercial gain. I am only commenting on what rate-limiting is designed to do and whether it works for that purpose.


I would not expect this to make a difference as pasteurization does not inactivate Clostridium endospores.


Mosses and ferns also have motile, flagellated reproductive cells.


Intraguild killing (such as lions killing leopards or cheetahs) has been described in multiple species.

Perhaps interesting: https://pubmed.ncbi.nlm.nih.gov/34816428/ (still need to read it).


"Hang up and call back" may not save you on some landline systems, where scammers are able to keep the connection open even after the called party hangs up.


Can you elaborate? How can a caller influence my landline connection?


They can't, unless they're physically down the street from you with a buttinski (a lineman's test handset) plugged into a junction box. That post was nonsense.



Well, that's cool. Now I'm going down a rabbit hole of learning the differences between recombination and reassortment.


Thanks, that looks like the right kind of thing. :)


Clostridium endospores can survive just fine in the presence of oxygen.


Oxygen is literally toxic to anaerobes in the class Clostridia, which means their endospores are likely to be found where the bacteria are.

Like the agents of botulism or gas gangrene, C. tetani can survive for short periods (as can humans sans oxygen) but is unlikely to be found in sufficient quantities to cause concern.

