diff --git a/docs/AI-for-security/api/chat-complete-api.asciidoc b/docs/AI-for-security/api/chat-complete-api.asciidoc
index 41b12aa51a..b5b502c9e5 100644
--- a/docs/AI-for-security/api/chat-complete-api.asciidoc
+++ b/docs/AI-for-security/api/chat-complete-api.asciidoc
@@ -1,7 +1,7 @@
 [[chat-complete-api]]
 === Complete chat

-The Chat Complete API allows to communicate to the configured LLM and if needed to persist the result as a conversation (new or extend existing).
+The complete chat API allows you to communicate with the configured large language model (LLM) and, if needed, persist the result as a conversation (create new or extend existing).

 [discrete]
 === Request URL
@@ -44,7 +44,7 @@ The Chat Complete API allows to communicate to the configured LLM and if needed

 *Example 1*

-Sending the message to LLM and getting the response. The data is anonymized with the central anonimization applied and the extended with the list of the fields to anonymize.
+Sends a message to the LLM. The data is anonymized with central anonymization applied and extended with a list of fields to anonymize.

 [source,console]
 --------------------------------------------------
@@ -72,7 +72,7 @@ POST api/security_ai_assistant/chat/complete

 *Example 2*

-Sending the message to LLM within the conversation and the data as a context. The data is anonymized with the central anonimization applied and the extended with the list of the fields to anonymize. Adding the LLM response with the role "assistant" to existing Conversation.
+Sends a message to the LLM within an existing conversation and provides data as a context. The data is anonymized with central anonymization applied and extended with a list of fields to anonymize. Adds the LLM response with the role `assistant` to the existing conversation.

 [source,console]
 --------------------------------------------------
@@ -110,7 +110,7 @@ POST api/security_ai_assistant/chat/complete

 *Example 3*

-Sending the message to LLM. Creating the new Conversation and adding the LLM response with the role "assistant".
+Sends a message to the LLM. Creates a new conversation and adds the LLM response with the role `assistant`.

 [source,console]
 --------------------------------------------------
diff --git a/docs/AI-for-security/api/conversation-api-find.asciidoc b/docs/AI-for-security/api/conversation-api-find.asciidoc
index 37ee57932f..87456f3d2a 100644
--- a/docs/AI-for-security/api/conversation-api-find.asciidoc
+++ b/docs/AI-for-security/api/conversation-api-find.asciidoc
@@ -45,7 +45,7 @@ Retrieve a list of Elastic AI Assistant conversations for the current user.

 *Example 1*

-Get Conversation list for the current user.
+Get a conversation list for the current user.

 [source,console]
 --------------------------------------------------
diff --git a/docs/siem-apis.asciidoc b/docs/siem-apis.asciidoc
index c2d2f86c24..a0a2f8e930 100644
--- a/docs/siem-apis.asciidoc
+++ b/docs/siem-apis.asciidoc
@@ -13,7 +13,7 @@ NOTE: Console supports sending requests to {kib} APIs. Prepend any {kib} API end
 * <>: Create source event value lists for use with rule exceptions
 * <>: Import and export timelines
 * <>: Open and manage cases
-* <>: Security AI Assistant APIs
+* <>: Interact with and manage Elastic AI Assistant
 * <>: Create and manage asset criticality records

 Additionally, the {kib} <> is partially
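
For reference while reviewing the changed descriptions above, a request to the complete chat endpoint might look like the minimal sketch below. The actual request bodies fall outside the hunks in this diff, so the field names shown here (`connectorId`, `conversationId`, `persist`, `messages`) and their values are illustrative assumptions rather than the documented schema.

[source,console]
--------------------------------------------------
POST api/security_ai_assistant/chat/complete
{
  // Hypothetical connector ID; substitute the ID of your configured LLM connector
  "connectorId": "my-llm-connector-id",
  // Assumed flags: persist the exchange and append it to an existing conversation
  "persist": true,
  "conversationId": "my-existing-conversation-id",
  "messages": [
    {
      "role": "user",
      "content": "How many open alerts were triggered in the last 24 hours?"
    }
  ]
}
--------------------------------------------------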