
Commit

update docs

github-actions committed Nov 7, 2024
1 parent 442aa57 commit f255b45
Showing 2 changed files with 23 additions and 1 deletion.
2 changes: 1 addition & 1 deletion bbot/presets/spider.yml
@@ -5,7 +5,7 @@ modules:

blacklist:
# Prevent spider from invalidating sessions by logging out
-  - "RE:/.*(sign[_-]?out|log[_-]?out)"
+  - "RE:/.*(sign|log)[_-]?out"

config:
web:
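The two patterns above are intended to be equivalent: the new form simply factors the shared `[_-]?out` suffix out of the alternation. A quick sanity check (not part of the commit, and using Python's `re` module rather than BBOT's own URL matching) confirms both accept and reject the same sample paths:

```python
import re

# Old and new blacklist patterns, without BBOT's "RE:" prefix marker
OLD = r"/.*(sign[_-]?out|log[_-]?out)"
NEW = r"/.*(sign|log)[_-]?out"

samples = [
    "/account/sign-out",   # matched by both
    "/logout",             # matched by both
    "/app/log_out",        # matched by both
    "/signout.aspx",       # matched by both
    "/profile/settings",   # matched by neither
]

# Both patterns agree on every sample path
for path in samples:
    assert bool(re.search(OLD, path)) == bool(re.search(NEW, path))
```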
22 changes: 22 additions & 0 deletions docs/scanning/index.md
@@ -217,6 +217,28 @@

If you only want to blacklist the URL, you could narrow the regex like so:

```bash
bbot -t evilcorp.com --blacklist 'RE:signout\.aspx$'
```

Similar to targets and whitelists, blacklists can be specified in your preset. The `spider` preset makes use of this to prevent the spider from following logout links:

```yaml title="spider.yml"
description: Recursive web spider

modules:
- httpx

blacklist:
# Prevent spider from invalidating sessions by logging out
- "RE:/.*(sign|log)[_-]?out"

config:
web:
# how many links to follow in a row
spider_distance: 2
# don't follow links whose directory depth is higher than 4
spider_depth: 4
# maximum number of links to follow per page
spider_links_per_page: 25
```
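To make the `spider_depth` setting concrete, directory depth here means the number of path segments in a URL. A hypothetical helper (an illustration only, not BBOT's internal implementation) might compute it like this:

```python
from urllib.parse import urlparse

def url_depth(url: str) -> int:
    """Count non-empty path segments (hypothetical helper, not BBOT internals)."""
    return len([seg for seg in urlparse(url).path.split("/") if seg])

# With spider_depth: 4, a link at this depth would still be followed...
assert url_depth("https://evilcorp.com/a/b/c/d") == 4
# ...but one nested a level deeper would be skipped
assert url_depth("https://evilcorp.com/a/b/c/d/e") == 5
```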
## DNS Wildcards
BBOT has robust wildcard detection built-in. It can reliably detect wildcard domains, and will tag them accordingly:
