From 81ffb1f28385eaf7528c25d7e8843e3fbe22971c Mon Sep 17 00:00:00 2001
From: Edu González de la Herrán <25320357+eedugon@users.noreply.github.com>
Date: Wed, 18 Dec 2024 10:29:45 +0100
Subject: [PATCH] Differentiation of managed and self-managed connectors in AI Assistant doc (#4679)

* managed and self-managed connectors mentioned

* Update docs/en/observability/observability-ai-assistant.asciidoc

Co-authored-by: Liam Thompson <32779855+leemthompo@users.noreply.github.com>

* Update docs/en/observability/observability-ai-assistant.asciidoc

Co-authored-by: Liam Thompson <32779855+leemthompo@users.noreply.github.com>

---------

Co-authored-by: Liam Thompson <32779855+leemthompo@users.noreply.github.com>
(cherry picked from commit a5a72b40b7af42365f43d0c5aa08874a9c260e82)
---
 .../observability-ai-assistant.asciidoc | 13 ++++++++-----
 1 file changed, 8 insertions(+), 5 deletions(-)

diff --git a/docs/en/observability/observability-ai-assistant.asciidoc b/docs/en/observability/observability-ai-assistant.asciidoc
index 889dab537d..7dac0e3bc3 100644
--- a/docs/en/observability/observability-ai-assistant.asciidoc
+++ b/docs/en/observability/observability-ai-assistant.asciidoc
@@ -11,7 +11,7 @@ The AI Assistant uses generative AI to provide:
 [role="screenshot"]
 image::images/obs-assistant2.gif[Observability AI assistant preview, 60%]
 
-The AI Assistant integrates with your large language model (LLM) provider through our supported Elastic connectors:
+The AI Assistant integrates with your large language model (LLM) provider through our supported {stack} connectors:
 
 * {kibana-ref}/openai-action-type.html[OpenAI connector] for OpenAI or Azure OpenAI Service.
 * {kibana-ref}/bedrock-action-type.html[Amazon Bedrock connector] for Amazon Bedrock, specifically for the Claude models.
@@ -41,7 +41,7 @@ The AI assistant requires the following:
 ** OpenAI `gpt-4`+.
 ** Azure OpenAI Service `gpt-4`(0613) or `gpt-4-32k`(0613) with API version `2023-07-01-preview` or more recent.
 ** AWS Bedrock, specifically the Anthropic Claude models.
-* An {enterprise-search-ref}/server.html[Enterprise Search] server if {ref}/es-connectors.html[search connectors] are used to populate external data into the knowledge base.
+* An {enterprise-search-ref}/server.html[Enterprise Search] server if Elastic managed {ref}/es-native-connectors.html[search connectors] are used to populate external data into the knowledge base.
 * An account with a third-party generative AI provider that preferably supports function calling. If your AI provider does not support function calling, you can configure AI Assistant settings under **Stack Management** to simulate function calling, but this might affect performance.
 +
@@ -156,13 +156,16 @@ Search connectors are only needed when importing external data into the Knowledg
 {ref}/es-connectors.html[Connectors] allow you to index content from external sources, thereby making it available for the AI Assistant.
 This can greatly improve the relevance of the AI Assistant’s responses.
 Data can be integrated from sources such as GitHub, Confluence, Google Drive, Jira, AWS S3, Microsoft Teams, Slack, and more.
-These connectors are managed under the Search Solution in {kib}, and they require an {enterprise-search-ref}/server.html[Enterprise Search] server connected to the Elastic Stack.
+UI affordances for creating and managing search connectors are available in the Search Solution in {kib}.
+You can also use the {es} {ref}/connector-apis.html[Connector APIs] to create and manage search connectors.
+
+The infrastructure for deploying connectors can be Elastic managed or self-managed. Managed connectors require an {enterprise-search-ref}/server.html[Enterprise Search] server connected to the Elastic Stack. Self-managed connectors run on your own infrastructure and don't require the Enterprise Search service.
 
 By default, the AI Assistant queries all search connector indices. To override this behavior and customize which indices are queried, adjust the *Search connector index pattern* setting on the <> page. This allows precise control over which data sources are included in the AI Assistant knowledge base.
 
-To create a connector and make its content available to the AI Assistant knowledge base, follow these steps:
+To create a connector in the {kib} UI and make its content available to the AI Assistant knowledge base, follow these steps:
 
-. To open **Connectors**, find `Content / Connectors` in the {kibana-ref}/introduction.html#kibana-navigation-search[global search field].
+. Open **Connectors** by finding `Content / Connectors` in the {kibana-ref}/introduction.html#kibana-navigation-search[global search field].
 +
 [NOTE]
 ====
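
Beyond the UI flow described in the patch, a minimal sketch of what wiring up the LLM side can look like through the {kib} Connectors HTTP API. This is illustrative rather than the documented procedure: the `api/actions/connector` endpoint and the `.gen-ai` connector type id come from the Kibana connector APIs, while the connector name, URLs, and environment variables are placeholders, and the exact `config`/`secrets` fields should be confirmed against the {kibana-ref}/openai-action-type.html[OpenAI connector] reference.

[source,sh]
----
# Illustrative only: create an OpenAI connector for the AI Assistant via the
# Kibana Connectors HTTP API. KIBANA_URL, KIBANA_API_KEY, and OPENAI_API_KEY
# are placeholder environment variables.
curl -X POST "${KIBANA_URL}/api/actions/connector" \
  -H "kbn-xsrf: true" \
  -H "Content-Type: application/json" \
  -H "Authorization: ApiKey ${KIBANA_API_KEY}" \
  -d '{
    "name": "observability-ai-assistant-openai",
    "connector_type_id": ".gen-ai",
    "config": {
      "apiProvider": "OpenAI",
      "apiUrl": "https://api.openai.com/v1/chat/completions"
    },
    "secrets": { "apiKey": "'"${OPENAI_API_KEY}"'" }
  }'
----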
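
In the same spirit, a sketch of the {es} {ref}/connector-apis.html[Connector APIs] route that the patch points to for registering a self-managed search connector whose index the AI Assistant can then query. The connector id, index name, and `service_type` below are placeholders chosen for illustration; use the service type that matches your data source.

[source,sh]
----
# Illustrative only: register a self-managed search connector with the
# Elasticsearch Connector API. ES_URL, ES_API_KEY, the connector id, and the
# index name are placeholders.
curl -X PUT "${ES_URL}/_connector/obs-github-connector" \
  -H "Content-Type: application/json" \
  -H "Authorization: ApiKey ${ES_API_KEY}" \
  -d '{
    "index_name": "search-obs-github",
    "name": "GitHub content for the AI Assistant",
    "service_type": "github"
  }'
----

Once the connector syncs content into its index, the AI Assistant picks the index up according to the *Search connector index pattern* setting described above.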