I just started using this myself; it seems pretty great so far!
It clearly doesn’t stop all AI crawlers, but it does stop a significant chunk of them.
I’m not the person who brought up git. I was just saying that work is work. Sure, git does something useful with its work; this is arguably useful even without the work itself being important. The work is what you’re complaining about, not the proof.
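To make the "work vs. proof" distinction concrete, here's a minimal hashcash-style sketch (my own illustration, not any particular tool's scheme; the difficulty value is arbitrary): the client grinds through nonces, while the server verifies with a single hash.

```python
# Minimal hashcash-style proof-of-work sketch (illustrative only).
import hashlib
import os

DIFFICULTY = 16  # leading zero bits required; higher = more client-side work

def solve(challenge: bytes) -> int:
    """Client side: grind nonces until the hash clears the difficulty bar (the 'work')."""
    nonce = 0
    while True:
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") >> (256 - DIFFICULTY) == 0:
            return nonce
        nonce += 1

def verify(challenge: bytes, nonce: int) -> bool:
    """Server side: one hash, no matter how long the client ground (the 'proof' check)."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") >> (256 - DIFFICULTY) == 0

challenge = os.urandom(16)
nonce = solve(challenge)          # expensive for the client
print(verify(challenge, nonce))   # cheap for the server -> True
```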
Yeah, but the effect on legitimate usage is trivial; the cost lands on illegitimate scrapers. Scrapers not paying this cost has an environmental impact too. In fact, this theoretically adds nothing: they’ll spend the same time scraping either way, only now they get delayed and spend more of that time gathering nothing useful.
To use your salesman analogy, it’s similar to that, except their car is going to be running regardless; this just keeps them from reaching as many houses. They’ll visit as many as they can, so if you can stall them, they burn the same amount of gas but reach fewer houses.
This is probably wrong, because you’re reasoning from the salesman analogy. Computers have threads: if a crawler is waiting on something, it can switch to another task. This protects a site, but it doesn’t slow the crawler down. It doesn’t actually waste their time, because they’re doing other work while they wait.
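A rough sketch of that point, assuming the stall is a pure wait rather than CPU-bound work (the site names and delays below are made up): with async I/O, a stalled request just yields to the others, so total wall time is roughly the slowest site, not the sum of the delays.

```python
import asyncio
import time

async def fetch(url: str, delay: float) -> str:
    # Simulate a per-request stall (e.g. a challenge delay or a slow response).
    await asyncio.sleep(delay)
    return f"{url}: done after {delay}s"

async def main() -> None:
    start = time.monotonic()
    # Hypothetical crawl list: one stalled site among many fast ones.
    urls = [("slow-site.example", 5.0)] + [(f"site-{i}.example", 0.5) for i in range(20)]
    # All requests run concurrently; while the slow one waits, the rest proceed,
    # so the stall barely costs the crawler any wall-clock time overall.
    results = await asyncio.gather(*(fetch(u, d) for u, d in urls))
    print(f"fetched {len(results)} pages in {time.monotonic() - start:.1f}s")

asyncio.run(main())
```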
If they’re going to use the energy anyway, we might as well make them get less value for it. Eventually the cost may exceed the benefit; if it doesn’t, they’ll spend all the energy they have access to anyway, and that part isn’t going to change.