[Security Solution] Integrate pre-configured Kibana AI Connector to the Security GenAI powered functionality #202625
Labels: `8.18 candidate`, `Team:Security Generative AI`, `Team:Security-Scalability`
Currently in Security Solution we use three different GenAI connector types, which integrate directly with OpenAI, Gemini, and Bedrock.

The long-term vision is to migrate from these explicit connector implementations to the generic `.inference` connector, which is integrated with the Elasticsearch Inference API. This connector supports multiple LLM integrations (including OpenAI, Gemini, and Bedrock), but due to the large scope of the complete adoption and migration path to the `.inference` connector, the scope was reduced to an MVP integration using the pre-configured connector for the EIS service.

In 8.18 there is a plan to provide the Elastic Default LLM experience, which will be exposed within the Kibana `.inference` connector type.

Requirements:

- Add the `.inference` pre-configured connector to the list of available connectors and select/use it by default (if the connector selection was not changed)
- `.inference` connector experience
- `ChatOpenAI` or `BaseChatModelParams`
- GenAI functionality to cover:
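For context, pre-configured connectors in Kibana are defined in `kibana.yml` under `xpack.actions.preconfigured`. A minimal sketch of what an `.inference` pre-configured connector entry could look like follows; the connector id, display name, and config values here are illustrative assumptions, not the actual EIS defaults:

```yaml
# kibana.yml — hypothetical pre-configured .inference connector.
# Only `actionTypeId: .inference` is taken from this issue; every other
# value below is an assumed placeholder.
xpack.actions.preconfigured:
  my-elastic-llm:                  # assumed connector id
    name: Elastic Default LLM      # assumed display name
    actionTypeId: .inference
    config:
      provider: elastic            # assumed provider key
      taskType: chat_completion    # assumed task type
      inferenceId: my-endpoint     # assumed inference endpoint id
```

Because pre-configured connectors exist only in configuration, the Security GenAI connector picker would need to merge them with connectors stored as saved objects when building its default selection.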