@joeo10 yep. I don’t use robots.txt, though. The problem with the text file is that only good actors abide by it. Instead, the API itself checks whether a request should be fulfilled: if it comes from a blocklisted agent or one that is clearly scraping, a 422 is returned.
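Roughly this shape, as a minimal sketch (Go stdlib; the blockedAgents list is an illustrative placeholder, and the "clearly scraping" heuristics aren’t shown):

```go
package main

import (
	"net/http"
	"strings"
)

// Placeholder blocklist; a real one would live in config or a shared store.
var blockedAgents = []string{"GPTBot", "CCBot", "Bytespider"}

// blockScrapers rejects requests whose User-Agent matches the blocklist,
// returning 422 and never reaching the wrapped handler.
func blockScrapers(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		ua := r.UserAgent()
		for _, agent := range blockedAgents {
			if strings.Contains(ua, agent) {
				http.Error(w, "Unprocessable Entity", http.StatusUnprocessableEntity)
				return
			}
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok"))
	})
	http.ListenAndServe(":8080", blockScrapers(mux))
}
```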