Experimenting with yarrr' Burp Proxy tab going brrrrrrrrrrrrr.
# burpference
"burpference" started as a research idea of offensive agent capabilities and is a fun take on Burp Suite and running inference. The extension is open-source and designed to capture in-scope HTTP requests and responses from Burp's proxy history and ship them to a remote LLM API in JSON format. It's designed with a flexible approach where you can configure custom system prompts, store API keys and select remote hosts from numerous model providers as well as the ability for you to create your own API configuration. The idea is for an LLM to act as an agent in an offensive web application engagement to leverage your skills and surface findings and lingering vulnerabilities. By being able to create your own configuration and model provider allows you to also host models locally via Ollama to prevent potential high inference costs and potential network delays or rate limits.
Some key features:
- Automated Response Capture: Burp Suite acts as your client monitor, automatically capturing responses that fall within your defined scope. This extension listens for, captures, and processes these details with an offensive-focused agent.
- API Integration: Once requests and response streams are captured, they are packaged and forwarded to your configured API endpoint in JSON format, including any necessary system-level prompts or authentication tokens.
- Only in-scope items are sent, optimizing resource usage and avoiding unnecessary API calls.
- By default, certain MIME types are excluded.
- Color-coded tabs display critical/high/medium/low/informational findings from your model for easy visualization.
- Comprehensive Logging: A logging system allows you to review intercepted responses, API requests sent, and replies received—all clearly displayed for analysis.
- A clean table interface displaying all logs, intercepted responses, API calls, and status codes for comprehensive engagement tracking.
- Stores inference logs in both the "Inference Logger" tab as a live preview and a timestamped file in the /logs directory.
- Flexible Configuration: Customize system prompts, API keys, or remote hosts as needed. Use your own configuration files for seamless integration with your workflow.
- Supports custom configurations, allowing you to load and switch between system prompts, API keys, and remote hosts
- Several examples are provided in the repository, and contributions for additional provider plugins are welcome.
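The scope and MIME-type gating described above could be sketched roughly as follows. This is a hypothetical helper, not the extension's actual code, and the excluded MIME types listed are assumptions:

```python
# Two gates before a response leaves the proxy: it must be in scope,
# and its MIME type must not be on the exclusion list.
EXCLUDED_MIME_TYPES = {"image/png", "image/jpeg", "font/woff2", "video/mp4"}

def should_forward(url_in_scope, mime_type):
    """Return True if a captured response should be sent for inference."""
    if not url_in_scope:
        return False  # out-of-scope traffic never leaves the proxy
    return mime_type not in EXCLUDED_MIME_TYPES
```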
So grab yer compass, hoist the mainsail, and let burpference be yer guide as ye plunder the seven seas of HTTP traffic! Yarrr'!
Before using burpference, ensure you have the following:
- Due to its awesomeness, burpference may require higher system resources to run optimally, especially if using local models. Trust the process and make the machines go brrrrrrrrrrrrr!
- Installed Burp Suite (Community or Professional edition).
- Downloaded and set up the Jython standalone `.jar` file (a Python interpreter compatible with Java) to run Python-based extensions in Burp Suite.
  - You do not need a Python 2.x runtime in your environment for this to work.
- The `registerExtenderCallbacks` method reads a configuration file specific to the remote endpoint's input requirements. Ensure this exists in your environment and that Burp has the necessary permissions to access its location on the filesystem.
  - Important: as Burp Suite cannot read from the filesystem's `os` environment, you will need to explicitly include API key values in the per-provider `configuration.json` files.
  - If you intend to fork or contribute to burpference, ensure that you have excluded these files from git tracking via `.gitignore`.
  - There's also a pre-commit hook in the repo as an additional safety net. Install pre-commit hooks here.
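Since keys live in the provider configuration file itself, a per-provider config might look roughly like the sketch below. The field names here are assumptions for illustration; check the example configs shipped in the repo for the real schema:

```python
import json

# Illustrative only: these field names are assumptions, not burpference's
# real schema -- see the example configs in the repository.
example_config = {
    "host": "http://localhost:11434/api/generate",
    "model": "mistral-small",
    # Key stored directly in the file, since Burp cannot read it
    # from the environment.
    "api_key": "YOUR-API-KEY-HERE",
}

with open("example_config.json", "w") as f:
    json.dump(example_config, f, indent=2)
```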
- Set up the relevant directory permissions for burpference to create log files:
chmod -R 755 logs configs
In some cases you may experience directory permission write issues when loading the extension; if so, it's recommended to restart Burp Suite after running the above.
- Ollama installed locally if using this provider plugin and example config, with the model running locally, i.e. `ollama run mistral-small` (model docs).
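For reference, Ollama's local HTTP API listens on port 11434 by default, and its `/api/generate` endpoint takes a model name and a prompt. A request to it could be assembled along these lines (a sketch, not the extension's actual request code):

```python
import json

def build_ollama_request(model, prompt):
    """Assemble the URL and JSON body for Ollama's /api/generate endpoint.

    Setting "stream" to False asks Ollama for a single JSON reply
    instead of a stream of chunks.
    """
    url = "http://localhost:11434/api/generate"
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return url, body
```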
If Burp Suite is not already installed, download it from: Burp Suite Community/Professional
Jython enables Burp Suite to run Python-based extensions. You will need to download and configure it within Burp Suite.
- Go to the Jython Downloads Page.
- Download the standalone Jython `.jar` file (e.g., `jython-standalone-2.7.4.jar`).
- Open Burp Suite.
- Go to the `Extensions` tab in Burp Suite.
- Under the `Options` tab, scroll down to the Python Environment section.
- Click Select File, and choose the `jython-standalone-2.7.4.jar` file you just downloaded.
- Click Apply to load the Jython environment into Burp Suite.
Download the latest supported release from the repo, unzip it, and add it as a Python-based extension in Burp Suite. It's recommended to save this in a `~/git` directory, based on how the current code structures the logs and configs.
- Open Burp Suite.
- Navigate to the Extensions tab.
- Click on Add to install a new extension.
- In the dialog box:
  - Extension Type: Choose Python and select the `burpference/burpference.py` file. This instructs Burp Suite to initialize the extension by invoking the `registerExtenderCallbacks` method.
- Click Next and the extension will be loaded. 🚀
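For context on what happens at load time, here is a minimal Jython extension skeleton showing the `registerExtenderCallbacks` entry point. This is a generic sketch of the Burp Extender pattern, not burpference's actual source; the `try`/`except` stubs exist only so the sketch also runs outside Burp:

```python
try:
    # These Java interfaces are available inside Burp's Jython interpreter.
    from burp import IBurpExtender, IHttpListener
except ImportError:
    # Stub the interfaces so the sketch is runnable standalone too.
    class IBurpExtender(object):
        pass
    class IHttpListener(object):
        pass

class BurpExtender(IBurpExtender, IHttpListener):
    def registerExtenderCallbacks(self, callbacks):
        # Burp invokes this once when the extension is loaded.
        callbacks.setExtensionName("example-extension")
        callbacks.registerHttpListener(self)

    def processHttpMessage(self, toolFlag, messageIsRequest, messageInfo):
        # Called for every HTTP message passing through Burp's tools;
        # a real extension would filter and process messages here.
        pass
```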
If you prefer to build from source, clone the repo and follow the steps above:
- Download or clone the burpference project from GitHub:
git clone https://github.com/dreadnode/burpference.git
Head over to the configuration docs!
We also recommend setting up a custom hotkey in Burp to save clicks.
The longer-term roadmap is a potential Kotlin-based successor (mainly due to the limitations of Jython with the Extender API), or additionally something to complement burpference.
The bullets below are cool ideas for the repo at a later stage, or features still under active development.
- Scanner
- An additional custom one-click "scanner" tab which scans an API target/schema with a selected model and reports findings/payloads and PoCs.
- Conversations
- Enhanced conversation turns with the model to reflect turns for both HTTP requests and responses to build context.
- Prompt Tuning:
- Modularize a centralized source of prompts sent to all models.
- Grounding and context: Equip the model with context, providing links to OpenAPI schemas and developer documentation.
- Offensive Agents and Tool Use
- Equip agents with burpference result details and tool use for the weaponization and exploitation phase.
- Optimization:
- Extend functionality of selecting multiple configurations and sending results across multiple endpoints for optimal results.
- Introduce judge reward systems for findings.
The following known issues have been reported so far and are tracked against issues in the repo.
We welcome any issues or contributions to the project, share the treasure! If you like our project, please feel free to drop us some love <3
By watching the repo, you can also be notified of any upcoming releases.