Python auto-instrumentation library for MistralAI's Python SDK.
The traces emitted by this instrumentation are fully OpenTelemetry-compatible and can be sent to an OpenTelemetry collector, such as arize-phoenix, for viewing.
pip install openinference-instrumentation-mistralai
In this example we will instrument a small program that uses the MistralAI chat completions API and observe the traces via arize-phoenix.
Install packages.
pip install openinference-instrumentation-mistralai mistralai arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp
Start the phoenix server so that it is ready to collect traces. The Phoenix server runs entirely on your machine and does not send data over the internet.
python -m phoenix.server.main serve
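Alternatively, if you are working in a notebook, Phoenix can be launched from within Python instead of the command line. A minimal sketch, assuming the arize-phoenix package installed above:

import phoenix as px

# Launches the Phoenix app in-process; it prints the local URL
# where collected traces can be viewed.
px.launch_app()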
In a Python file, set up the MistralAIInstrumentor and configure the tracer to send traces to Phoenix.
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage
from openinference.instrumentation.mistralai import MistralAIInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
# Optionally, you can also print the spans to the console.
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace_api.set_tracer_provider(tracer_provider)
MistralAIInstrumentor().instrument()
if __name__ == "__main__":
    client = MistralClient()
    response = client.chat(
        model="mistral-large-latest",
        messages=[
            ChatMessage(
                content="Who won the World Cup in 2018?",
                role="user",
            )
        ],
    )
    print(response.choices[0].message.content)
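Streaming calls made through the client's chat_stream method should be traced as well, though exactly which attributes are recorded for streamed responses may depend on the instrumentation version. A sketch that reuses the client and the ChatMessage import from the example above:

# Streaming variant; spans should also be emitted for chat_stream
# (attribute coverage for streamed responses may vary by version).
for chunk in client.chat_stream(
    model="mistral-large-latest",
    messages=[
        ChatMessage(
            content="Who won the World Cup in 2018?",
            role="user",
        )
    ],
):
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")

If you need to remove the instrumentation at runtime (for example in tests), MistralAIInstrumentor().uninstrument() undoes the patching; this method comes from the standard OpenTelemetry BaseInstrumentor interface.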
Since we are using MistralAI, we must set the MISTRAL_API_KEY
environment variable to authenticate with the MistralAI API.
export MISTRAL_API_KEY=[your_key_here]
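If you prefer to keep everything in the Python file, the key can also be set via os.environ before the client is constructed; the value below is a placeholder:

import os

# Placeholder; substitute your actual MistralAI API key.
# This must run before MistralClient() is created.
os.environ["MISTRAL_API_KEY"] = "your_key_here"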
Now run the Python file and observe the traces in Phoenix.
python your_file.py
For details about tracing with OpenInference and Phoenix, consult the Phoenix documentation.
For AI/ML observability solutions in production, check out the docs on Arize.