Originally posted by amiremami April 9, 2024
In some programs, we need to blacklist a specific path, such as https://www.example.com/blog/.
However, this doesn't seem to be possible with BBOT, so I'd like to suggest adding keyword-based blacklisting.
Then, if I add blog, it won't scan or crawl any links that contain blog.
Thanks 🙏
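The requested keyword blacklist could be sketched as a simple URL filter. This is only an illustration of the idea, not BBOT's actual API; the function name `is_blacklisted` and the `BLACKLIST_KEYWORDS` list are hypothetical:

```python
# Hypothetical keyword-based blacklist filter (not a real BBOT feature).
BLACKLIST_KEYWORDS = ["blog"]

def is_blacklisted(url: str, keywords=BLACKLIST_KEYWORDS) -> bool:
    """Return True if any blacklisted keyword appears anywhere in the URL."""
    return any(kw in url for kw in keywords)

discovered = [
    "https://www.example.com/blog/post-1",
    "https://www.example.com/products/widget",
]
# Only URLs that pass the filter would be queued for crawling.
to_crawl = [u for u in discovered if not is_blacklisted(u)]
```

With the keyword blog configured, anything under /blog/ would be dropped before crawling, while other paths proceed as normal.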
Update: I was also thinking about a way to limit the crawling of similar links. For example, a site can have 100k products, but I only want to crawl one of them, because the others are similar to it. Likewise, a site can have 50k posts, but I only want to crawl one of them. It would be great if this could be implemented.
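One way the "crawl only one of many similar links" idea could work is to collapse variable path segments (numeric IDs, slugs) into a placeholder and keep only the first URL seen for each resulting "shape". This is a hedged sketch of that heuristic, not anything BBOT implements; `url_shape` and its digit-based rule are assumptions:

```python
import re
from urllib.parse import urlsplit

def url_shape(url: str) -> str:
    """Collapse variable-looking path segments (anything containing a digit)
    to a {id} placeholder, so /products/101 and /products/102 share a shape."""
    parts = urlsplit(url)
    segments = []
    for seg in parts.path.strip("/").split("/"):
        if re.fullmatch(r"[\w-]*\d[\w-]*", seg):
            segments.append("{id}")
        else:
            segments.append(seg)
    return f"{parts.netloc}/" + "/".join(segments)

# Keep only the first URL per shape, deduplicating the similar ones.
seen = set()
unique = []
for u in ["https://site.example/products/101",
          "https://site.example/products/102",
          "https://site.example/about"]:
    shape = url_shape(u)
    if shape not in seen:
        seen.add(shape)
        unique.append(u)
```

Here both product pages map to the shape site.example/products/{id}, so only the first one would be crawled; a real implementation would likely need smarter normalization (query strings, non-numeric slugs).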
Discussed in #1243