Commit

Adding Arize to the integrations page (#253)
* adding arize to the integrations page

* resolving some comments

* Fixes to the arize integration docs (#256)

* add more links, fix spelling

* fix spacing

* add import

* split readmes in two

* docs: cleanup arize integration (#257)

* docs: cleanup arize integration

* clean up

* add an image

* final image

---------

Co-authored-by: Xander Song <[email protected]>

* remove this gif in favor of already existing one

---------

Co-authored-by: Mikyo King <[email protected]>
Co-authored-by: Xander Song <[email protected]>
3 people authored Aug 19, 2024
1 parent a4c6b7d commit 6720ca1
Showing 4 changed files with 225 additions and 0 deletions.
121 changes: 121 additions & 0 deletions integrations/arize-phoenix.md
@@ -0,0 +1,121 @@
---
layout: integration
name: Arize Phoenix
description: Trace your Haystack pipelines with Arize Phoenix
authors:
- name: Arize AI
socials:
github: Arize-ai
twitter: ArizePhoenix
linkedin: arizeai
pypi: https://pypi.org/project/openinference-instrumentation-haystack/
repo: https://github.com/Arize-ai/phoenix
type: Monitoring Tool
report_issue: https://github.com/Arize-ai/openinference/issues
logo: /logos/arize-phoenix.png
version: Haystack 2.0
toc: true
---

### **Table of Contents**

- [Overview](#overview)
- [Installation](#installation)
- [Usage](#usage)
- [Resources](#resources)

## Overview

**Arize Phoenix** is Arize's open-source platform that offers developers the quickest way to troubleshoot, evaluate, and experiment with LLM applications.

For a detailed integration guide, see the [documentation for Phoenix + Haystack](https://docs.arize.com/phoenix/tracing/integrations-tracing/haystack).

## Installation

```bash
pip install openinference-instrumentation-haystack haystack-ai opentelemetry-sdk opentelemetry-exporter-otlp arize-phoenix
```

## Usage

To trace any Haystack pipeline with Phoenix, initialize OpenTelemetry and the `HaystackInstrumentor`. Any Haystack pipeline that runs in the same environment will then send its traces to Phoenix.

First, start a Phoenix instance to send traces to.

```sh
python -m phoenix.server.main serve
```
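
Alternatively, if you are working in a notebook, Phoenix can be launched directly from Python. This is a minimal sketch using the `arize-phoenix` package installed above; it assumes you want a local, in-process instance rather than a separately hosted one.

```python
import phoenix as px

# Launches a local Phoenix instance in the background of the current process.
# The UI is served at http://localhost:6006 by default.
session = px.launch_app()
print(session.url)
```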

Now let's connect our Haystack pipeline to Phoenix using OpenTelemetry.

```python
from openinference.instrumentation.haystack import HaystackInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import (
    OTLPSpanExporter,
)
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

endpoint = "http://localhost:6006/v1/traces" # The URL to your Phoenix instance
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

HaystackInstrumentor().instrument(tracer_provider=tracer_provider)
```
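
The `SimpleSpanProcessor` above exports each span synchronously, which is convenient for local debugging. For longer-running applications you may prefer OpenTelemetry's `BatchSpanProcessor`, a drop-in variation that buffers spans and exports them in the background; the `HaystackInstrumentor` call stays the same.

```python
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import BatchSpanProcessor

endpoint = "http://localhost:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
# BatchSpanProcessor buffers spans and exports them asynchronously,
# adding less overhead to each pipeline run than SimpleSpanProcessor.
tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter(endpoint)))
```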

Now, you can run a Haystack pipeline within the same environment, resulting in the following trace:

> To run the example below, export your OpenAI API key to the `OPENAI_API_KEY` environment variable.

![Arize Phoenix Demo](https://raw.githubusercontent.com/deepset-ai/haystack-integrations/main/images/arize-demo.gif)

```python
from haystack import Document, Pipeline
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

document_store = InMemoryDocumentStore()
document_store.write_documents([
    Document(content="My name is Jean and I live in Paris."),
    Document(content="My name is Mark and I live in Berlin."),
    Document(content="My name is Giorgio and I live in Rome.")
])

prompt_template = """
Given these documents, answer the question.
Documents:
{% for doc in documents %}
{{ doc.content }}
{% endfor %}
Question: {{question}}
Answer:
"""

retriever = InMemoryBM25Retriever(document_store=document_store)
prompt_builder = PromptBuilder(template=prompt_template)
llm = OpenAIGenerator()

rag_pipeline = Pipeline()
rag_pipeline.add_component("retriever", retriever)
rag_pipeline.add_component("prompt_builder", prompt_builder)
rag_pipeline.add_component("llm", llm)
rag_pipeline.connect("retriever", "prompt_builder.documents")
rag_pipeline.connect("prompt_builder", "llm")

question = "Who lives in Paris?"
results = rag_pipeline.run(
    {
        "retriever": {"query": question},
        "prompt_builder": {"question": question},
    }
)
```
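
The pipeline's output is a dictionary keyed by component name, so the generated answer can be read as shown below. The corresponding trace appears in the Phoenix UI, by default at http://localhost:6006.

```python
# OpenAIGenerator returns its generated answers under the "replies" key
print(results["llm"]["replies"][0])
```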

## Resources

- Check out the Phoenix [GitHub repository](https://github.com/Arize-ai/phoenix)
- For an in-depth guide on how to host your own Phoenix instance, see the [Phoenix documentation](https://docs.arize.com/phoenix/deployment)
- Try out free hosted Phoenix instances at [phoenix.arize.com](https://phoenix.arize.com/)
- Check out the [Phoenix documentation](https://docs.arize.com/phoenix)
104 changes: 104 additions & 0 deletions integrations/arize.md
@@ -0,0 +1,104 @@
---
layout: integration
name: Arize AI
description: Trace and Monitor your Haystack pipelines with Arize AI
authors:
- name: Arize AI
socials:
github: Arize-ai
twitter: arizeai
linkedin: arizeai
pypi: https://pypi.org/project/openinference-instrumentation-haystack/
repo: https://github.com/Arize-ai/openinference
type: Monitoring Tool
report_issue: https://github.com/Arize-ai/openinference/issues
logo: /logos/arize.png
version: Haystack 2.0
toc: true
---

### **Table of Contents**

- [Overview](#overview)
- [Installation](#installation)
- [Usage](#usage)

## Overview

Arize is an AI observability and evaluation platform designed to help you troubleshoot, evaluate, and experiment with LLM and ML applications. Developers use Arize to get applications working quickly, evaluate performance, detect and prevent production issues, and curate datasets.

- [Documentation for Arize AI + Haystack](https://docs.arize.com/arize/large-language-models/tracing/auto-instrumentation/haystack)

## Installation

```bash
pip install openinference-instrumentation-haystack haystack-ai arize-otel opentelemetry-sdk opentelemetry-exporter-otlp
```

## Usage

To trace any Haystack pipeline with Arize, initialize OpenTelemetry and the `HaystackInstrumentor`. Any Haystack pipeline that runs in the same environment will then send its traces to Arize.

```python
from openinference.instrumentation.haystack import HaystackInstrumentor
# Import OpenTelemetry dependencies
from arize_otel import register_otel, Endpoints

# Set up OTEL via Arize's convenience function
register_otel(
    endpoints=Endpoints.ARIZE,
    space_id="<your-space-id>",  # from the space settings page
    api_key="<your-api-key>",  # from the space settings page
    model_id="<your-haystack-app-name>",  # name this whatever you would like
)

# Instrument Haystack so that pipeline runs are traced and sent to Arize;
# register_otel configures the tracer provider, so no argument is needed here.
HaystackInstrumentor().instrument()
```
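
To avoid hardcoding credentials, you can read them from the environment instead. This is a sketch only: `ARIZE_SPACE_ID` and `ARIZE_API_KEY` are illustrative variable names, not names required by `arize_otel`.

```python
import os

from arize_otel import register_otel, Endpoints

register_otel(
    endpoints=Endpoints.ARIZE,
    # ARIZE_SPACE_ID and ARIZE_API_KEY are hypothetical variable names;
    # use whatever secret-management convention your deployment follows.
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    model_id="haystack-rag-pipeline",  # illustrative app name
)
```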

Now, you can run a Haystack pipeline within the same environment, resulting in the following trace:

> To run the example below, export your OpenAI API key to the `OPENAI_API_KEY` environment variable.

![Arize Demo](https://raw.githubusercontent.com/deepset-ai/haystack-integrations/main/images/arize-demo.gif)

```python
from haystack import Document, Pipeline
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

document_store = InMemoryDocumentStore()
document_store.write_documents([
    Document(content="My name is Jean and I live in Paris."),
    Document(content="My name is Mark and I live in Berlin."),
    Document(content="My name is Giorgio and I live in Rome.")
])

prompt_template = """
Given these documents, answer the question.
Documents:
{% for doc in documents %}
{{ doc.content }}
{% endfor %}
Question: {{question}}
Answer:
"""

retriever = InMemoryBM25Retriever(document_store=document_store)
prompt_builder = PromptBuilder(template=prompt_template)
llm = OpenAIGenerator()

rag_pipeline = Pipeline()
rag_pipeline.add_component("retriever", retriever)
rag_pipeline.add_component("prompt_builder", prompt_builder)
rag_pipeline.add_component("llm", llm)
rag_pipeline.connect("retriever", "prompt_builder.documents")
rag_pipeline.connect("prompt_builder", "llm")

question = "Who lives in Paris?"
results = rag_pipeline.run(
    {
        "retriever": {"query": question},
        "prompt_builder": {"question": question},
    }
)
```
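
If you run this as a short-lived script, spans may still be queued for export when the process exits. A hedged sketch for flushing them before exit, assuming `register_otel` installed an OpenTelemetry SDK `TracerProvider` as the global provider:

```python
from opentelemetry import trace

# force_flush() is part of the OpenTelemetry SDK TracerProvider API;
# it blocks until queued spans have been exported.
provider = trace.get_tracer_provider()
if hasattr(provider, "force_flush"):
    provider.force_flush()
```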
Binary file added logos/arize-phoenix.png
Binary file added logos/arize.png
