An example Python app demonstrating how to integrate Pangea's Secure Audit Log service into a LangChain app to maintain an audit log of prompts being sent to LLMs.
- Python v3.12 or greater.
- pip v24.2 or uv v0.4.29.
- A Pangea account with the Secure Audit Log service enabled and configured with the AI Audit Log schema.
- An OpenAI API key.
```shell
git clone https://github.com/pangeacyber/langchain-python-prompt-tracing.git
cd langchain-python-prompt-tracing
```
If using pip:

```shell
python -m venv .venv
source .venv/bin/activate
pip install .
```

Or, if using uv:

```shell
uv sync
source .venv/bin/activate
```
The sample can then be executed with:

```shell
python -m langchain_prompt_tracing "Give me information on John Smith"
```
```
Usage: python -m langchain_prompt_tracing [OPTIONS] PROMPT

Options:
  --model TEXT             OpenAI model. [default: gpt-4o-mini; required]
  --audit-token SECRET     Pangea Secure Audit Log API token. May also be set
                           via the `PANGEA_AUDIT_TOKEN` environment variable.
                           [required]
  --audit-config-id TEXT   Pangea Secure Audit Log configuration ID.
  --pangea-domain TEXT     Pangea API domain. May also be set via the
                           `PANGEA_DOMAIN` environment variable. [default:
                           aws.us.pangea.cloud; required]
  --openai-api-key SECRET  OpenAI API key. May also be set via the
                           `OPENAI_API_KEY` environment variable. [required]
  --help                   Show this message and exit.
```
Auditing does not modify the input or output, so it is transparent to both the LLM and the end user.
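This transparency comes from hooking LangChain's callback mechanism rather than altering the chain itself. Below is a minimal, self-contained sketch of the idea: a handler that records each prompt on its way to the LLM. The `AuditSink` protocol and `StubAudit` class are hypothetical stand-ins for Pangea's `Audit` service client; in the real app the handler would subclass `langchain_core.callbacks.BaseCallbackHandler` (whose `on_llm_start` receives the outgoing prompts) and forward each prompt to Secure Audit Log.

```python
from typing import Any, Protocol


class AuditSink(Protocol):
    """Anything exposing a Pangea-style log(message=...) call (hypothetical stand-in)."""

    def log(self, *, message: str) -> None: ...


class PromptTracingHandler:
    """Callback-handler sketch that audits each prompt before it reaches the LLM.

    The real implementation would subclass
    langchain_core.callbacks.BaseCallbackHandler and wrap pangea.services.Audit.
    """

    def __init__(self, audit: AuditSink) -> None:
        self.audit = audit

    def on_llm_start(
        self, serialized: dict[str, Any], prompts: list[str], **kwargs: Any
    ) -> None:
        # Record each outgoing prompt; the prompt itself passes through unchanged.
        for prompt in prompts:
            self.audit.log(message=prompt)


class StubAudit:
    """In-memory stand-in for the Pangea Audit client, for demonstration only."""

    def __init__(self) -> None:
        self.entries: list[str] = []

    def log(self, *, message: str) -> None:
        self.entries.append(message)


audit = StubAudit()
handler = PromptTracingHandler(audit)
handler.on_llm_start({}, ["Give me information on John Smith"])
print(audit.entries)
```

Because the handler only observes the prompts, attaching it (e.g. via the `callbacks` argument when invoking a chain) leaves the chain's behavior unchanged.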
Audit logs can be viewed in the Pangea Console's Secure Audit Log Viewer.