
# [8.x] [inference] Add support for inference connectors (#204541) #205078

Merged 2 commits into elastic:8.x on Dec 23, 2024

## Conversation

**kibanamachine** (Contributor) commented:
Backport

This will backport the following commits from main to 8.x:

Questions?

Please refer to the Backport tool documentation.

## Summary

~Depends on~ elastic#200249, now merged!

Fix elastic#199082

- Add support for the `inference` stack connectors to the `inference` plugin (everything is inference)
- Adapt the o11y assistant to use the `inference-common` utilities for connector filtering / compatibility checking (sketched below)
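
To make the second bullet a bit more concrete, here is a minimal sketch of how a consumer could narrow a list of action connectors down to the ones the inference plugin can drive. `isSupportedConnectorType` is an assumed `@kbn/inference-common` export (this PR moves connector filtering/compat helpers into that package); check the package for the actual names and signatures.

```ts
// Hypothetical sketch: filter action connectors down to the types the inference
// plugin understands. `isSupportedConnectorType` is an assumed export of
// `@kbn/inference-common`; verify the real name and signature in the package.
import { isSupportedConnectorType } from '@kbn/inference-common';

interface ActionConnectorLike {
  id: string;
  name: string;
  actionTypeId: string; // e.g. '.gen-ai', '.bedrock', '.gemini', or the new inference connector type
}

export function getInferenceCompatibleConnectors(
  connectors: ActionConnectorLike[]
): ActionConnectorLike[] {
  // Only keep connectors whose action type the inference plugin can drive,
  // which now includes the new AI (inference) connector added by this PR.
  return connectors.filter((connector) => isSupportedConnectorType(connector.actionTypeId));
}
```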

## How to test

**1. Start ES with the unified completion feature flag**

```sh
yarn es snapshot --license trial ES_JAVA_OPTS="-Des.inference_unified_feature_flag_enabled=true"
```

**2. Enable the inference connector for Kibana**

In the Kibana config file:
```yaml
xpack.stack_connectors.enableExperimental: ['inferenceConnectorOn']
```

**3. Start Dev Kibana**

```sh
node scripts/kibana --dev --no-base-path
```

**4. Create an inference connector**

Go to
`http://localhost:5601/app/management/insightsAndAlerting/triggersActionsConnectors/connectors`
and create an inference connector with:

- Type: `AI connector`

then

- Service: `OpenAI`
- API Key: Gwzk... Kidding, please ping someone
- Model ID: `gpt-4o`
- Task type: `completion`

-> save
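
If clicking through the UI is inconvenient, the same connector can in principle be created through Kibana's actions HTTP API. The endpoint and the top-level body fields below are the standard connector-creation API; the `.inference` type id and the inner `config`/`secrets` field names are assumptions mirroring the form above, so adjust them to whatever the connector UI actually submits.

```ts
// Hypothetical sketch (TypeScript, Node 18+): create the AI connector via
// POST /api/actions/connector instead of the UI. Top-level fields (name,
// connector_type_id, config, secrets) are the standard actions API; the
// '.inference' type id and the inner field names are assumptions.
async function createInferenceConnector() {
  const response = await fetch('http://localhost:5601/api/actions/connector', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'kbn-xsrf': 'true', // Kibana rejects write requests without this header
      Authorization: 'Basic ' + Buffer.from('elastic:changeme').toString('base64'),
    },
    body: JSON.stringify({
      name: 'openai-completion',
      connector_type_id: '.inference', // assumed type id for the new AI connector
      config: {
        provider: 'openai', // assumed field names, see the UI form in step 4
        taskType: 'completion',
        providerConfig: { model_id: 'gpt-4o' },
      },
      secrets: {
        providerSecrets: { api_key: '<your OpenAI API key>' }, // never commit real keys
      },
    }),
  });
  console.log(response.status, await response.json());
}

createInferenceConnector().catch(console.error);
```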

**5. Test the o11y assistant**

Use the assistant as you would with any other connector (just make sure
the inference connector is the one selected) and run through your usual
testing.
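
For a quick server-side smoke test outside the assistant UI, a consuming plugin could also drive the new connector through the inference plugin's `chatComplete` API. The start-contract shape below is simplified and partly assumed, so treat it as a sketch rather than the plugin's exact types.

```ts
// Hypothetical smoke test from a consuming server-side plugin: drive the new
// connector through the inference plugin's chatComplete API. The contract
// shape shown here is simplified/assumed; check the inference plugin's types.
import type { KibanaRequest } from '@kbn/core/server';

interface InferenceServerStartLike {
  chatComplete(options: {
    request: KibanaRequest;
    connectorId: string;
    messages: Array<{ role: 'user' | 'assistant'; content: string }>;
  }): Promise<{ content: string }>;
}

export async function smokeTestInferenceConnector(
  inference: InferenceServerStartLike,
  request: KibanaRequest,
  connectorId: string // id of the AI connector created in step 4
) {
  const response = await inference.chatComplete({
    request,
    connectorId,
    messages: [{ role: 'user', content: 'Say hello in one short sentence.' }],
  });
  // If this returns text, the connector is wired up end to end.
  return response.content;
}
```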

---------

Co-authored-by: kibanamachine <[email protected]>
(cherry picked from commit 3dcae51)
**@elasticmachine** (Contributor) commented:

Pinging @elastic/obs-ai-assistant (Team:Obs AI Assistant)

@kibanamachine merged commit a08a128 into elastic:8.x on Dec 23, 2024 (8 checks passed).
**@elasticmachine** (Contributor) commented:

💚 Build Succeeded

### Metrics [docs]

#### Module Count

Fewer modules leads to a faster build time

| id | before | after | diff |
| --- | --- | --- | --- |
| inference | 25 | 26 | +1 |
| observabilityAIAssistant | 119 | 118 | -1 |
| observabilityAIAssistantApp | 435 | 436 | +1 |
| observabilityAiAssistantManagement | 381 | 395 | +14 |
| searchAssistant | 267 | 281 | +14 |
| total | | | +29 |

#### Public APIs missing comments

Total count of every public API that lacks a comment. Target amount is 0. Run `node scripts/build_api_docs --plugin [yourplugin] --stats comments` for more detailed information.

| id | before | after | diff |
| --- | --- | --- | --- |
| @kbn/inference-common | 40 | 46 | +6 |
| observabilityAIAssistant | 383 | 379 | -4 |
| total | | | +2 |

#### Async chunks

Total size of all lazy-loaded chunks that will be downloaded as the user navigates the app

| id | before | after | diff |
| --- | --- | --- | --- |
| observabilityAIAssistantApp | 295.1KB | 295.2KB | +128.0B |
| searchAssistant | 165.0KB | 165.1KB | +128.0B |
| total | | | +256.0B |

#### Public APIs missing exports

Total count of every type that is part of your API that should be exported but is not. This will cause broken links in the API documentation system. Target amount is 0. Run `node scripts/build_api_docs --plugin [yourplugin] --stats exports` for more detailed information.

| id | before | after | diff |
| --- | --- | --- | --- |
| @kbn/inference-common | 3 | 4 | +1 |
| inference | 6 | 5 | -1 |
| total | | | -0 |

#### Page load bundle

Size of the bundles that are downloaded on every page load. Target size is below 100kb

| id | before | after | diff |
| --- | --- | --- | --- |
| observabilityAIAssistant | 48.3KB | 48.1KB | -247.0B |

#### Unknown metric groups

**API count**

| id | before | after | diff |
| --- | --- | --- | --- |
| @kbn/inference-common | 141 | 150 | +9 |
| observabilityAIAssistant | 385 | 381 | -4 |
| total | | | +5 |

cc @pgayvallet

Labels: backport, Team:Obs AI Assistant (Observability AI Assistant)