Gemini Connector Assistant Integration #184741

Merged: 171 commits, Jun 17, 2024
Commits (171)
47a3e94
feat: Gemini dashboard creation
honeyn303 Apr 10, 2024
fa2b7b9
feat: Added the API Key attribute
honeyn303 Apr 22, 2024
5893522
fix: Fixed the api error.
honeyn303 Apr 23, 2024
a2c919a
fix: Remove the access & secret keys input field and populated the de…
honeyn303 Apr 24, 2024
918699c
feat: Added the configuration parameters for generateContent api
honeyn303 Apr 24, 2024
7704bd5
feat: Modified the code to use Vertex AI endpoint instead of AI studio
honeyn303 May 7, 2024
31efa17
feat: Added Project ID & location configuration field.
honeyn303 May 7, 2024
0afa1fe
fix: modified variable apiKey to accessToken
honeyn303 May 7, 2024
2575b19
feat: changed the default gemini model version to gemini-1.5-pro
honeyn303 May 8, 2024
98a4d40
Rebased this branch to the gemini branch
honeyn303 May 8, 2024
289b644
removed unused dependencies
honeyn303 May 9, 2024
156264e
removing file
honeyn303 May 9, 2024
f6f3e32
rebasing with the main branch
honeyn303 May 9, 2024
d179be0
Merge pull request #5 from honeyn303/gemini-ai-assistant
honeyn303 May 9, 2024
49a9dc0
Merge branch 'elastic:main' into gemini-assistant
honeyn303 May 9, 2024
b8099c0
Merge branch 'elastic:main' into gemini-assistant
honeyn303 May 10, 2024
ba83a51
Merge branch 'elastic:main' into gemini-assistant
honeyn303 May 13, 2024
a9411c0
Merge branch 'elastic:main' into gemini-assistant
honeyn303 May 14, 2024
230c713
ai assistant issue
honeyn303 May 14, 2024
a30af87
fix
honeyn303 May 14, 2024
2cedc33
Merge branch 'elastic:main' into gemini-bug-fix
honeyn303 May 14, 2024
01fc5fe
bug fix
honeyn303 May 14, 2024
0a9a26f
Merge branch 'elastic:main' into gemini-bug-fix
honeyn303 May 15, 2024
3d8472f
fix: Fixed the issue with the test api throwing an error
honeyn303 May 15, 2024
e689126
corrected signal parameter missing issue in gemini connector
rohanxz May 15, 2024
1654aea
Merge branch 'elastic:main' into gemini-bug-fix
honeyn303 May 15, 2024
6426c8d
Adding unit test cases for Gemini
shivankawasthi11 May 15, 2024
c371347
Revert "Adding unit test cases for Gemini"
shivankawasthi11 May 16, 2024
23032b2
Merge branch 'elastic:main' into gemini-bug-fix
honeyn303 May 16, 2024
8b80d4a
feat: Removed the Observability featureID, code clean up and added do…
honeyn303 May 16, 2024
2b680d9
feat: uncommented code
honeyn303 May 16, 2024
d68ca63
Merge branch 'elastic:main' into gemini-connector-integration
honeyn303 May 16, 2024
5da317f
fxi
honeyn303 May 16, 2024
bd47bfc
Merge branch 'elastic:main' into gemini-connector-integration
honeyn303 May 16, 2024
1c0605a
fixed sceret parameters
rohanxz May 16, 2024
90c1714
Merge branch 'elastic:main' into gemini-connector-integration
honeyn303 May 16, 2024
332ef77
Merge branch 'elastic:main' into gemini-connector-integration
honeyn303 May 16, 2024
57fcfbb
Merge branch 'elastic:main' into gemini-connector-integration
honeyn303 May 19, 2024
53d0ba1
fix: addressing the PR#183668 comments
honeyn303 May 19, 2024
644edc4
fix: code tweaks
honeyn303 May 19, 2024
0b12b47
fix: code clean up
honeyn303 May 19, 2024
f4f632d
Adding unit tests for Gemini
shivankawasthi11 May 17, 2024
223ddce
Updating unit test to implement OAuth
shivankawasthi11 May 20, 2024
ace52eb
feat: add token tracking for non stream responses
honeyn303 May 20, 2024
41673a8
initial commit: Adding functional tests (might need further tweaks)
honeyn303 May 20, 2024
d28edad
functional test cases for gemini
honeyn303 May 20, 2024
f626e00
fix: tweaks to functional test
honeyn303 May 21, 2024
4d748dc
Merge branch 'elastic:main' into gemini-connector-integration
honeyn303 May 21, 2024
ef455e8
fix: tweaks to oauth implementation
honeyn303 May 21, 2024
0522284
feat: Adding client side test cases
honeyn303 May 21, 2024
26f6776
feat: added the required documentation
honeyn303 May 21, 2024
da7fd52
fixes to the documentation
honeyn303 May 21, 2024
f652084
fix: Fixed token tracking issue for run and test sub-action
honeyn303 May 21, 2024
73dfdbe
feat: Added server side test cases
honeyn303 May 21, 2024
c2f3ff3
Delete .yarnrc
honeyn303 May 21, 2024
df11787
Delete .yarn/releases/yarn-1.22.19.cjs
honeyn303 May 21, 2024
e965f89
restored .yarnrc
rohanxz May 21, 2024
42c4eaa
Delete yarn.lock.orig
honeyn303 May 21, 2024
b6132cc
Update .yarnrc
honeyn303 May 21, 2024
54e34d4
feat: added docs images
honeyn303 May 21, 2024
a8a4598
Update llm.ts
honeyn303 May 21, 2024
047f8e7
Update constants.ts
honeyn303 May 21, 2024
4491673
Update index.tsx
honeyn303 May 21, 2024
4755c45
Merge branch 'elastic:main' into gemini-connector-integration
honeyn303 May 21, 2024
c48b14f
Merge branch 'main' into gemini-connector-integration
kibanamachine May 22, 2024
25cf790
fix: pr review comments
honeyn303 May 22, 2024
dbc6f70
fix: removed invoke ai and invoke stream
honeyn303 May 22, 2024
6c471a5
adding gemini files to CODEOWNERS
honeyn303 May 22, 2024
33fa26a
fix: es lint errors
honeyn303 May 22, 2024
1c9419b
fix: eslint error
honeyn303 May 22, 2024
1e19e54
fix: unique translations id
honeyn303 May 22, 2024
97526c0
fix: linting errors and added checks to validate the credentials JSON…
honeyn303 May 22, 2024
3c2363b
Fixing PR comments on unit tests
shivankawasthi11 May 23, 2024
776d940
Updating test body for Gemini
shivankawasthi11 May 23, 2024
0da2eb9
fix: removed invokeai and invoke stream schema
honeyn303 May 23, 2024
5d26a2b
fix: add proper credentials json file to the test case and few code t…
honeyn303 May 23, 2024
a20af20
fix: changed the credentials json format
honeyn303 May 23, 2024
1e3f330
fix: updated the json file to include the updated config and secret p…
honeyn303 May 23, 2024
0b02118
fix: the CI error related to node scripts/yarn_deduplicate
honeyn303 May 23, 2024
2fe2fa4
Merge branch 'main' into gemini-connector-integration
kibanamachine May 23, 2024
47af483
[CI] Auto-commit changed files from 'node scripts/eslint --no-cache -…
kibanamachine May 23, 2024
ee2a23a
fix: pr review comment changes
honeyn303 May 23, 2024
ccf2647
Updating unit tests to fix PR comments
shivankawasthi11 May 24, 2024
eb478a9
Merge branch 'main' into gemini-connector-integration
kibanamachine May 24, 2024
40098b1
[CI] Auto-commit changed files from 'node scripts/eslint --no-cache -…
kibanamachine May 24, 2024
21c8479
Merge branch 'main' into gemini-connector-integration
honeyn303 May 26, 2024
9c51343
fix: liniting issues
honeyn303 May 28, 2024
f11e36d
fix: eslint issues
honeyn303 May 28, 2024
0888b57
fix: linitng errors
honeyn303 May 28, 2024
94939e2
fix: using ES storage for storing access tokens
honeyn303 May 26, 2024
f6b570b
fix: few tweaks
honeyn303 May 27, 2024
78422fb
fix: test failures
honeyn303 May 28, 2024
2dac8ff
reverting changes to connector_token_client.ts
honeyn303 May 28, 2024
21bd29d
fix: removed console logs
honeyn303 May 28, 2024
375a316
fix: linting errors
honeyn303 May 28, 2024
85fa2dc
fix: changed the sorted field
honeyn303 May 28, 2024
123f206
fix: linting errors
honeyn303 May 29, 2024
5cd663a
Merge branch 'main' into gemini-connector-integration
kibanamachine May 29, 2024
9e162e3
Test and lint errors fix
shivankawasthi11 May 29, 2024
2e5b95d
fix: test failures
honeyn303 May 29, 2024
5ff956b
fix: test failures
honeyn303 May 29, 2024
19cca4f
Fixing test case
shivankawasthi11 May 30, 2024
883be6b
resolved PR review comments and refactored get_oauth_token file
rohanxz May 30, 2024
ce59195
added Unit tests for get_gcp_oauth_access_token.test
rohanxz May 30, 2024
ac0fbe3
Merge remote-tracking branch 'origin' into gemini-connector-integration
rohanxz May 30, 2024
b04aaca
updated yarn.lock
rohanxz May 30, 2024
bf07435
Removing the test
honeyn303 May 31, 2024
6abbcd8
resolved linting errors
rohanxz May 31, 2024
a391b96
fix: replaced the update function
honeyn303 May 31, 2024
a80be35
[CI] Auto-commit changed files from 'node scripts/lint_ts_projects --…
kibanamachine May 31, 2024
906f028
resolving conflit in connector_token_client.ts
honeyn303 May 31, 2024
2dced08
Merge branch 'main' into gemini-connector-integration
honeyn303 May 31, 2024
c9d3afb
[CI] Auto-commit changed files from 'node scripts/lint_ts_projects --…
kibanamachine May 31, 2024
b493737
resolving linting issue
rohanxz May 31, 2024
67032a1
Merge remote-tracking branch 'origin' into gemini-connector-integration
honeyn303 May 31, 2024
671e7d2
Merge branch 'elastic:main' into gemini-connector-integration
honeyn303 May 31, 2024
fd66af2
resolved conflicts
honeyn303 May 31, 2024
04ba7df
Merge branch 'main' into gemini-connector-integration
kibanamachine May 31, 2024
1ecee62
feat: adding invokeai and invoke stream subactions. Token tracking fo…
honeyn303 Jun 1, 2024
a9599f5
Merge branch 'elastic:main' into gemini-assistant-integration-latest
honeyn303 Jun 1, 2024
4af9c68
fix: code tweaks
honeyn303 Jun 2, 2024
b46c497
fix: code clean uo & eslint fix
honeyn303 Jun 2, 2024
b8b926b
fix: code clean up
honeyn303 Jun 2, 2024
10ca7fb
feat: added unit tests
honeyn303 Jun 2, 2024
b646622
fix: eslint issues
honeyn303 Jun 2, 2024
eac47e7
fix: code clean up
honeyn303 Jun 3, 2024
5f75c63
fix: code clean up
honeyn303 Jun 3, 2024
2da728c
feat: unit tests
honeyn303 Jun 3, 2024
b3b447a
fix: addressing pr review comments
honeyn303 Jun 3, 2024
35b727b
partial test
honeyn303 Jun 3, 2024
1cabe37
fix: bug
honeyn303 Jun 4, 2024
71cf579
fix: unit tests and eslint errors
honeyn303 Jun 4, 2024
204c551
fix: eslint errors
honeyn303 Jun 4, 2024
73559e4
feat: unit tests
honeyn303 Jun 4, 2024
19285df
feat: unit tests
honeyn303 Jun 4, 2024
0449f67
fix: build test failures
honeyn303 Jun 4, 2024
b5e26d2
Merge branch 'main' into gemini-assistant-integration-latest
honeyn303 Jun 4, 2024
048883d
[CI] Auto-commit changed files from 'node scripts/eslint --no-cache -…
kibanamachine Jun 4, 2024
16b19b1
Merge branch 'main' into gemini-assistant-integration-latest
honeyn303 Jun 5, 2024
4190ca1
fix: changes to gemini icon
honeyn303 Jun 5, 2024
c84990e
removing .yarn/releases/yarn-1.22.19.cjs
honeyn303 Jun 5, 2024
e84896c
revert connector_types.test.ts.snap
honeyn303 Jun 5, 2024
0ff4d38
Merge branch 'main' into gemini-assistant-integration-latest
kibanamachine Jun 5, 2024
212d954
fix: liniting errors
honeyn303 Jun 5, 2024
3620213
Merge branch 'main' into gemini-assistant-integration-latest
kibanamachine Jun 5, 2024
e9ba540
adding test snap
rohanxz Jun 6, 2024
2068d65
wip
stephmilovic Jun 6, 2024
c5f4497
fix: PR review comments
honeyn303 Jun 6, 2024
5fcece8
added streaming character by character
rohanxz Jun 6, 2024
9ae9988
Merge branch 'main' into gemini-assistant-integration-latest
honeyn303 Jun 6, 2024
54cc437
fix: resolved conflicts
honeyn303 Jun 6, 2024
d9c04fa
fix: code tweaks
honeyn303 Jun 6, 2024
8f49d63
[CI] Auto-commit changed files from 'node scripts/eslint --no-cache -…
kibanamachine Jun 7, 2024
a196288
fix: updating snapshots to fix build failures
honeyn303 Jun 7, 2024
c295348
fix: test failures
honeyn303 Jun 7, 2024
d3fda56
Merge branch 'main' into gemini-assistant-integration-latest
kibanamachine Jun 10, 2024
47edd78
Merge branch 'gemini-assistant-integration-latest' of github.com:hone…
stephmilovic Jun 10, 2024
92e52f9
Merge branch 'main' into gemini-assistant-integration-latest
kibanamachine Jun 10, 2024
5210e07
streaming for LangChain gemini
stephmilovic Jun 10, 2024
7f0957f
Merge branch 'gemini-assistant-integration-latest' of github.com:hone…
stephmilovic Jun 10, 2024
b701f78
Merge branch 'main' into gemini-assistant-integration-latest
kibanamachine Jun 10, 2024
1df5f80
fix: streaming modifications
honeyn303 Jun 10, 2024
3eafc96
fix tests
stephmilovic Jun 10, 2024
87b93f9
Merge branch 'gemini-assistant-integration-latest' of github.com:hone…
stephmilovic Jun 10, 2024
c3ef066
Merge branch 'main' into gemini-assistant-integration-latest
kibanamachine Jun 11, 2024
46592cd
Merge branch 'main' into gemini-assistant-integration-latest
kibanamachine Jun 11, 2024
e77a033
Merge branch 'main' into gemini-assistant-integration-latest
kibanamachine Jun 11, 2024
8b88cee
add tests for gemini stream utils
stephmilovic Jun 11, 2024
8cfd4d2
Merge branch 'main' into gemini-assistant-integration-latest
kibanamachine Jun 12, 2024
32f7210
Merge branch 'main' into gemini-assistant-integration-latest
honeyn303 Jun 17, 2024
2df6232
Merge branch 'main' into gemini-assistant-integration-latest
kibanamachine Jun 17, 2024
Files changed

@@ -134,7 +134,7 @@ describe('API tests', () => {
);
});

it('calls the non-stream API when assistantStreamingEnabled is true and actionTypeId is gemini and isEnabledKnowledgeBase is true', async () => {
it('calls the stream API when assistantStreamingEnabled is true and actionTypeId is gemini and isEnabledKnowledgeBase is true', async () => {
const testProps: FetchConnectorExecuteAction = {
...fetchConnectorArgs,
apiConfig: apiConfig.gemini,
@@ -145,13 +145,13 @@ describe('API tests', () => {
expect(mockHttp.fetch).toHaveBeenCalledWith(
'/internal/elastic_assistant/actions/connector/foo/_execute',
{
...staticDefaults,
body: '{"message":"This is a test","subAction":"invokeAI","conversationId":"test","actionTypeId":".gemini","replacements":{},"isEnabledKnowledgeBase":true,"isEnabledRAGAlerts":false}',
...streamingDefaults,
body: '{"message":"This is a test","subAction":"invokeStream","conversationId":"test","actionTypeId":".gemini","replacements":{},"isEnabledKnowledgeBase":true,"isEnabledRAGAlerts":false}',
}
);
});

it('calls the non-stream API when assistantStreamingEnabled is true and actionTypeId is gemini and isEnabledKnowledgeBase is false and isEnabledRAGAlerts is true', async () => {
it('calls the stream API when assistantStreamingEnabled is true and actionTypeId is gemini and isEnabledKnowledgeBase is false and isEnabledRAGAlerts is true', async () => {
const testProps: FetchConnectorExecuteAction = {
...fetchConnectorArgs,
apiConfig: apiConfig.gemini,
@@ -164,8 +164,8 @@
expect(mockHttp.fetch).toHaveBeenCalledWith(
'/internal/elastic_assistant/actions/connector/foo/_execute',
{
...staticDefaults,
body: '{"message":"This is a test","subAction":"invokeAI","conversationId":"test","actionTypeId":".gemini","replacements":{},"isEnabledKnowledgeBase":false,"isEnabledRAGAlerts":true}',
...streamingDefaults,
body: '{"message":"This is a test","subAction":"invokeStream","conversationId":"test","actionTypeId":".gemini","replacements":{},"isEnabledKnowledgeBase":false,"isEnabledRAGAlerts":true}',
}
);
});

@@ -64,13 +64,7 @@ export const fetchConnectorExecuteAction = async ({
traceOptions,
}: FetchConnectorExecuteAction): Promise<FetchConnectorExecuteResponse> => {
// TODO add streaming support for gemini with langchain on
const isStream =
assistantStreamingEnabled &&
(apiConfig.actionTypeId === '.gen-ai' ||
apiConfig.actionTypeId === '.bedrock' ||
// TODO add streaming support for gemini with langchain on
// tracked here: https://github.com/elastic/security-team/issues/7363
(apiConfig.actionTypeId === '.gemini' && !isEnabledRAGAlerts && !isEnabledKnowledgeBase));
const isStream = assistantStreamingEnabled;

const optionalRequestParams = getOptionalRequestParams({
isEnabledRAGAlerts,

@@ -30,6 +30,7 @@ export interface Props {
const actionTypeKey = {
bedrock: '.bedrock',
openai: '.gen-ai',
gemini: '.gemini',
};

export const useLoadConnectors = ({
@@ -44,7 +45,9 @@ export const useLoadConnectors = ({
(acc: AIConnector[], connector) => [
...acc,
...(!connector.isMissingSecrets &&
[actionTypeKey.bedrock, actionTypeKey.openai].includes(connector.actionTypeId)
[actionTypeKey.bedrock, actionTypeKey.openai, actionTypeKey.gemini].includes(
connector.actionTypeId
)
? [
{
...connector,
2 changes: 2 additions & 0 deletions x-pack/packages/kbn-langchain/server/index.ts
@@ -9,10 +9,12 @@ import { ActionsClientChatOpenAI } from './language_models/chat_openai';
import { ActionsClientLlm } from './language_models/llm';
import { ActionsClientSimpleChatModel } from './language_models/simple_chat_model';
import { parseBedrockStream } from './utils/bedrock';
import { parseGeminiResponse } from './utils/gemini';
import { getDefaultArguments } from './language_models/constants';

export {
parseBedrockStream,
parseGeminiResponse,
getDefaultArguments,
ActionsClientChatOpenAI,
ActionsClientLlm,

@@ -11,6 +11,10 @@ export const getDefaultArguments = (llmType?: string, temperature?: number, stop
temperature: temperature ?? DEFAULT_BEDROCK_TEMPERATURE,
stopSequences: stop ?? DEFAULT_BEDROCK_STOP_SEQUENCES,
}
: llmType === 'gemini'
? {
temperature: temperature ?? DEFAULT_GEMINI_TEMPERATURE,
}
: { n: 1, stop: stop ?? null, temperature: temperature ?? DEFAULT_OPEN_AI_TEMPERATURE };

export const DEFAULT_OPEN_AI_TEMPERATURE = 0.2;
@@ -19,4 +23,5 @@ export const DEFAULT_OPEN_AI_TEMPERATURE = 0.2;
export const DEFAULT_OPEN_AI_MODEL = 'gpt-4';
const DEFAULT_BEDROCK_TEMPERATURE = 0;
const DEFAULT_BEDROCK_STOP_SEQUENCES = ['\n\nHuman:', '\nObservation:'];
const DEFAULT_GEMINI_TEMPERATURE = 0;
export const DEFAULT_TIMEOUT = 180000;
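
As a quick reference for reviewers, the following is a minimal sketch (not part of the diff; it only restates the constants and branches visible in this hunk) of what the new per-llmType defaults resolve to:

// Sketch only: re-derives the behaviour shown in the hunk above so the
// resolved defaults can be read at a glance.
const DEFAULT_OPEN_AI_TEMPERATURE = 0.2;
const DEFAULT_BEDROCK_TEMPERATURE = 0;
const DEFAULT_BEDROCK_STOP_SEQUENCES = ['\n\nHuman:', '\nObservation:'];
const DEFAULT_GEMINI_TEMPERATURE = 0;

const getDefaultArguments = (llmType?: string, temperature?: number, stop?: string[]) =>
  llmType === 'bedrock'
    ? {
        temperature: temperature ?? DEFAULT_BEDROCK_TEMPERATURE,
        stopSequences: stop ?? DEFAULT_BEDROCK_STOP_SEQUENCES,
      }
    : llmType === 'gemini'
    ? { temperature: temperature ?? DEFAULT_GEMINI_TEMPERATURE }
    : { n: 1, stop: stop ?? null, temperature: temperature ?? DEFAULT_OPEN_AI_TEMPERATURE };

// getDefaultArguments('gemini')  -> { temperature: 0 }
// getDefaultArguments('bedrock') -> { temperature: 0, stopSequences: ['\n\nHuman:', '\nObservation:'] }
// getDefaultArguments()          -> { n: 1, stop: null, temperature: 0.2 }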

@@ -14,6 +14,7 @@ import { mockActionResponse } from './mocks';
import { BaseMessage } from '@langchain/core/messages';
import { CallbackManagerForLLMRun } from '@langchain/core/callbacks/manager';
import { parseBedrockStream } from '../utils/bedrock';
import { parseGeminiStream } from '../utils/gemini';

const connectorId = 'mock-connector-id';

@@ -94,6 +95,7 @@ const defaultArgs = {
streaming: false,
};
jest.mock('../utils/bedrock');
jest.mock('../utils/gemini');

describe('ActionsClientSimpleChatModel', () => {
beforeEach(() => {
@@ -216,6 +218,7 @@
describe('_call streaming: true', () => {
beforeEach(() => {
(parseBedrockStream as jest.Mock).mockResolvedValue(mockActionResponse.message);
(parseGeminiStream as jest.Mock).mockResolvedValue(mockActionResponse.message);
});
it('returns the expected content when _call is invoked with streaming and llmType is Bedrock', async () => {
const actionsClientSimpleChatModel = new ActionsClientSimpleChatModel({
@@ -238,7 +241,7 @@
it('returns the expected content when _call is invoked with streaming and llmType is Gemini', async () => {
const actionsClientSimpleChatModel = new ActionsClientSimpleChatModel({
...defaultArgs,
actions: mockActions,
actions: mockStreamActions,
llmType: 'gemini',
streaming: true,
});
@@ -248,8 +251,8 @@
callOptions,
callRunManager
);
const subAction = mockExecute.mock.calls[0][0].params.subAction;
expect(subAction).toEqual('invokeAI');
const subAction = mockStreamExecute.mock.calls[0][0].params.subAction;
expect(subAction).toEqual('invokeStream');

expect(result).toEqual(mockActionResponse.message);
});

@@ -17,6 +17,7 @@ import { KibanaRequest } from '@kbn/core-http-server';
import { v4 as uuidv4 } from 'uuid';
import { get } from 'lodash/fp';
import { CallbackManagerForLLMRun } from '@langchain/core/callbacks/manager';
import { parseGeminiStream } from '../utils/gemini';
import { parseBedrockStream } from '../utils/bedrock';
import { getDefaultArguments } from './constants';

@@ -75,8 +76,7 @@ export class ActionsClientSimpleChatModel extends SimpleChatModel {
this.llmType = llmType ?? 'ActionsClientSimpleChatModel';
this.model = model;
this.temperature = temperature;
// only enable streaming for bedrock
this.streaming = streaming && llmType === 'bedrock';
this.streaming = streaming;
}

_llmType() {
@@ -154,7 +154,6 @@ export class ActionsClientSimpleChatModel extends SimpleChatModel {
return content; // per the contact of _call, return a string
}

// Bedrock streaming
const readable = get('data', actionResult) as Readable;

if (typeof readable?.read !== 'function') {
@@ -182,13 +181,9 @@
}
}
};
const streamParser = this.llmType === 'bedrock' ? parseBedrockStream : parseGeminiStream;

const parsed = await parseBedrockStream(
readable,
this.#logger,
this.#signal,
handleLLMNewToken
);
const parsed = await streamParser(readable, this.#logger, this.#signal, handleLLMNewToken);

return parsed; // per the contact of _call, return a string
}
9 changes: 1 addition & 8 deletions x-pack/packages/kbn-langchain/server/utils/bedrock.ts
@@ -6,17 +6,10 @@
*/

import { finished } from 'stream/promises';
import { Readable } from 'stream';
import { Logger } from '@kbn/core/server';
import { EventStreamCodec } from '@smithy/eventstream-codec';
import { fromUtf8, toUtf8 } from '@smithy/util-utf8';

type StreamParser = (
responseStream: Readable,
logger: Logger,
abortSignal?: AbortSignal,
tokenHandler?: (token: string) => void
) => Promise<string>;
import { StreamParser } from './types';

export const parseBedrockStream: StreamParser = async (
responseStream,
89 changes: 89 additions & 0 deletions x-pack/packages/kbn-langchain/server/utils/gemini.test.ts
@@ -0,0 +1,89 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

import { Readable } from 'stream';
import { parseGeminiStream, parseGeminiResponse } from './gemini';
import { loggerMock } from '@kbn/logging-mocks';

describe('parseGeminiStream', () => {
const mockLogger = loggerMock.create();
let mockStream: Readable;

beforeEach(() => {
jest.clearAllMocks();
mockStream = new Readable({
read() {},
});
});

it('should parse the stream correctly', async () => {
const data =
'data: {"candidates":[{"content":{"role":"system","parts":[{"text":"Hello"}]},"finishReason":"stop","safetyRatings":[{"category":"safe","probability":"low"}]}],"usageMetadata":{"promptTokenCount":10,"candidatesTokenCount":10,"totalTokenCount":20}}\n';
mockStream.push(data);
mockStream.push(null);

const result = await parseGeminiStream(mockStream, mockLogger);
expect(result).toBe('Hello');
});

it('should handle abort signal correctly', async () => {
const abortSignal = new AbortController().signal;
setTimeout(() => {
abortSignal.dispatchEvent(new Event('abort'));
}, 100);

const result = parseGeminiStream(mockStream, mockLogger, abortSignal);

await expect(result).resolves.toBe('');
expect(mockLogger.info).toHaveBeenCalledWith('Bedrock stream parsing was aborted.');
});

it('should call tokenHandler with correct tokens', async () => {
const data =
'data: {"candidates":[{"content":{"role":"system","parts":[{"text":"Hello world"}]},"finishReason":"stop","safetyRatings":[{"category":"safe","probability":"low"}]}],"usageMetadata":{"promptTokenCount":10,"candidatesTokenCount":10,"totalTokenCount":20}}\n';
mockStream.push(data);
mockStream.push(null);

const tokenHandler = jest.fn();
await parseGeminiStream(mockStream, mockLogger, undefined, tokenHandler);

expect(tokenHandler).toHaveBeenCalledWith('Hello ');
expect(tokenHandler).toHaveBeenCalledWith('world ');
});

it('should handle stream error correctly', async () => {
const error = new Error('Stream error');
const resultPromise = parseGeminiStream(mockStream, mockLogger);

mockStream.emit('error', error);

await expect(resultPromise).rejects.toThrow('Stream error');
});
});

describe('parseGeminiResponse', () => {
it('should parse response correctly', () => {
const response =
'data: {"candidates":[{"content":{"role":"system","parts":[{"text":"Hello"}]},"finishReason":"stop","safetyRatings":[{"category":"safe","probability":"low"}]}],"usageMetadata":{"promptTokenCount":10,"candidatesTokenCount":10,"totalTokenCount":20}}\n';
const result = parseGeminiResponse(response);
expect(result).toBe('Hello');
});

it('should ignore lines that do not start with data: ', () => {
const response =
'invalid line\ndata: {"candidates":[{"content":{"role":"system","parts":[{"text":"Hello"}]},"finishReason":"stop","safetyRatings":[{"category":"safe","probability":"low"}]}],"usageMetadata":{"promptTokenCount":10,"candidatesTokenCount":10,"totalTokenCount":20}}\n';
const result = parseGeminiResponse(response);
expect(result).toBe('Hello');
});

it('should ignore lines that end with [DONE]', () => {
const response =
'data: {"candidates":[{"content":{"role":"system","parts":[{"text":"Hello"}]},"finishReason":"stop","safetyRatings":[{"category":"safe","probability":"low"}]}],"usageMetadata":{"promptTokenCount":10,"candidatesTokenCount":10,"totalTokenCount":20}}\ndata: [DONE]';
const result = parseGeminiResponse(response);
expect(result).toBe('Hello');
});
});
80 changes: 80 additions & 0 deletions x-pack/packages/kbn-langchain/server/utils/gemini.ts
@@ -0,0 +1,80 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/

import { StreamParser } from './types';

export const parseGeminiStream: StreamParser = async (
stream,
logger,
abortSignal,
tokenHandler
) => {
let responseBody = '';
stream.on('data', (chunk) => {
const decoded = chunk.toString();
const parsed = parseGeminiResponse(decoded);
if (tokenHandler) {
const splitByQuotes = parsed.split(`"`);
splitByQuotes.forEach((chunkk, index) => {
// add quote back on except for last chunk
const splitBySpace = `${chunkk}${index === splitByQuotes.length - 1 ? '' : '"'}`.split(` `);

for (const char of splitBySpace) {
tokenHandler(`${char} `);
}
});
}
responseBody += parsed;
});
return new Promise((resolve, reject) => {
stream.on('end', () => {
resolve(responseBody);
});
stream.on('error', (err) => {
reject(err);
});
if (abortSignal) {
abortSignal.addEventListener('abort', () => {
logger.info('Bedrock stream parsing was aborted.');
stream.destroy();
resolve(responseBody);
});
}
});
};

/** Parse Gemini stream response body */
export const parseGeminiResponse = (responseBody: string) => {
return responseBody
.split('\n')
.filter((line) => line.startsWith('data: ') && !line.endsWith('[DONE]'))
.map((line) => JSON.parse(line.replace('data: ', '')))
.filter(
(
line
): line is {
candidates: Array<{
content: { role: string; parts: Array<{ text: string }> };
finishReason: string;
safetyRatings: Array<{ category: string; probability: string }>;
}>;
usageMetadata: {
promptTokenCount: number;
candidatesTokenCount: number;
totalTokenCount: number;
};
} => 'candidates' in line
)
.reduce((prev, line) => {
if (line.candidates[0] && line.candidates[0].content) {
const parts = line.candidates[0].content.parts;
const text = parts.map((part) => part.text).join('');
return prev + text;
}
return prev;
}, '');
};
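
For orientation, here is a small usage sketch (not part of the diff) showing how a caller such as ActionsClientSimpleChatModel above is expected to drive parseGeminiStream; it assumes the snippet sits next to gemini.ts and reuses the loggerMock helper already used in gemini.test.ts:

import { Readable } from 'stream';
import { loggerMock } from '@kbn/logging-mocks';
import { parseGeminiStream } from './gemini';

// Sketch: push one Gemini SSE "data:" line through the parser and collect
// both the word-by-word tokens and the final concatenated string.
const logger = loggerMock.create();
const stream = new Readable({ read() {} });

stream.push(
  'data: {"candidates":[{"content":{"role":"system","parts":[{"text":"Hi there"}]},"finishReason":"stop","safetyRatings":[]}],"usageMetadata":{"promptTokenCount":1,"candidatesTokenCount":2,"totalTokenCount":3}}\n'
);
stream.push(null); // end the stream so the returned promise can resolve

const tokens: string[] = [];
parseGeminiStream(stream, logger, undefined, (token) => tokens.push(token)).then((full) => {
  // full === 'Hi there'; the token handler received 'Hi ' and 'there '
});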