diff --git a/README.md b/README.md
index af0f001..ec88ec5 100644
--- a/README.md
+++ b/README.md
@@ -6,12 +6,16 @@ ## Introduction
-The LLM Interface project is a versatile and comprehensive wrapper designed to interact with multiple Large Language Model (LLM) APIs. It simplifies integrating various LLM providers, including **OpenAI, Anthropic, Google Gemini, Groq, Reka AI, and LLaMA.cpp**, into your applications. This project aims to provide a simplified and unified interface for sending messages and receiving responses from different LLM services, making it easier for developers to work with multiple LLMs without worrying about the specific intricacies of each API.
+The LLM Interface project is a versatile and comprehensive wrapper designed to interact with multiple Large Language Model (LLM) APIs. It simplifies integrating various LLM providers, including **OpenAI, Anthropic, Google Gemini, Goose AI, Groq, Reka AI, and LLaMA.cpp**, into your applications. This project aims to provide a simplified and unified interface for sending messages and receiving responses from different LLM services, making it easier for developers to work with multiple LLMs without worrying about the specific intricacies of each API.
+
+## New in 0.0.7
+
+- **Goose AI**: Added support for Goose AI
 
 ## Features
 
 - **Unified Interface**: A single, consistent interface to interact with multiple LLM APIs.
-- **Dynamic Module Loading**: Automatically loads and manages different LLM handlers.
+- **Dynamic Module Loading**: Automatically loads and manages different LLM interfaces.
 - **Error Handling**: Robust error handling mechanisms to ensure reliable API interactions.
 - **Extensible**: Easily extendable to support additional LLM providers as needed.
 - **JSON Output**: Simple to use JSON output for OpenAI and Gemini responses.
@@ -20,7 +24,7 @@ The LLM Interface project is a versatile and comprehensive wrapper designed to i
 
 The project relies on several npm packages and APIs. Here are the primary dependencies:
 
-- `axios`: For making HTTP requests (used for LLaMA.cpp and Reka).
+- `axios`: For making HTTP requests (used for Goose AI, LLaMA.cpp, and Reka).
 - `@anthropic-ai/sdk`: SDK for interacting with the Anthropic API.
 - `@google/generative-ai`: SDK for interacting with the Google Gemini API.
 - `groq-sdk`: SDK for interacting with the Groq API.
@@ -42,19 +46,19 @@
 npm install llm-interface
 ```
 
 Import `llm-interface` using:
 
 ```javascript
-const handlers = require("llm-interface");
+const { LLMInterface } = require("llm-interface");
 ```
 
 or
 
 ```javascript
-import handlers from "llm-interface";
+import { LLMInterface } from "llm-interface";
 ```
 
 then call the handler you want to use:
 
 ```javascript
-const openai = new handlers.openai(process.env.OPENAI_API_KEY);
+const openai = new LLMInterface.openai(process.env.OPENAI_API_KEY);
 
 const message = {
   model: "gpt-3.5-turbo",
   messages: [
diff --git a/config.js b/config.js
index 180dabf..94e6adc 100644
--- a/config.js
+++ b/config.js
@@ -12,4 +12,5 @@ module.exports = {
   llamaURL: process.env.LLAMACPP_URL,
   anthropicApiKey: process.env.ANTHROPIC_API_KEY,
   rekaApiKey: process.env.REKA_API_KEY,
+  gooseApiKey: process.env.GOOSE_API_KEY,
 };
diff --git a/docs/USAGE.md b/docs/USAGE.md
index 07ec91f..d38df78 100644
--- a/docs/USAGE.md
+++ b/docs/USAGE.md
@@ -2,10 +2,10 @@
 
 ## Usage Examples
 
-First, require the handlers from the `llm-interface` package:
+First, require the `LLMInterface` object from the `llm-interface` package:
 
 ```javascript
-const handlers = require("llm-interface");
+const { LLMInterface } = require("llm-interface");
 ```
 
 ### OpenAI Interface
 
@@ -15,7 +15,7 @@ The OpenAI interface allows you to send messages to the OpenAI API.
 
 #### Example
 
 ```javascript
-const openai = new handlers.openai(process.env.OPENAI_API_KEY);
+const openai = new LLMInterface.openai(process.env.OPENAI_API_KEY);
 
 const message = {
   model: "gpt-3.5-turbo",
@@ -42,7 +42,7 @@ The Anthropic interface allows you to send messages to the Anthropic API.
 
 #### Example
 
 ```javascript
-const anthropic = new handlers.anthropic(process.env.ANTHROPIC_API_KEY);
+const anthropic = new LLMInterface.anthropic(process.env.ANTHROPIC_API_KEY);
 
 const message = {
   model: "claude-3-opus-20240229",
@@ -74,7 +74,7 @@ The Gemini interface allows you to send messages to the Google Gemini API.
 
 #### Example
 
 ```javascript
-const gemini = new handlers.gemini(process.env.GEMINI_API_KEY);
+const gemini = new LLMInterface.gemini(process.env.GEMINI_API_KEY);
 
 const message = {
   model: "gemini-1.5-flash",
@@ -101,7 +101,7 @@ The Groq interface allows you to send messages to the Groq API.
 
 #### Example
 
 ```javascript
-const groq = new handlers.groq(process.env.GROQ_API_KEY);
+const groq = new LLMInterface.groq(process.env.GROQ_API_KEY);
 
 const message = {
   model: "llama3-8b-8192",
@@ -128,7 +128,7 @@ The Reka AI interface allows you to send messages to the Reka AI REST API.
 
 #### Example
 
 ```javascript
-const reka = new handlers.reka(process.env.REKA_API_KEY);
+const reka = new LLMInterface.reka(process.env.REKA_API_KEY);
 
 const message = {
   model: "reka-core",
@@ -156,7 +156,7 @@ The LLaMA.cpp interface allows you to send messages to the LLaMA.cpp API; this i
 
 #### Example
 
 ```javascript
-const llamacpp = new handlers.llamacpp(process.env.LLAMACPP_URL);
+const llamacpp = new LLMInterface.llamacpp(process.env.LLAMACPP_URL);
 
 const message = {
   model: "some-llamacpp-model",
diff --git a/env b/env
index 6579d0e..833a03f 100644
--- a/env
+++ b/env
@@ -3,4 +3,5 @@
 GEMINI_API_KEY=
 ANTHROPIC_API_KEY=
 REKA_API_KEY=
+GOOSE_API_KEY=
 LLAMACPP_URL=http://localhost:8080/completions
\ No newline at end of file
diff --git a/package.json b/package.json
index abf1540..48deb56 100644
--- a/package.json
+++ b/package.json
@@ -1,6 +1,6 @@
 {
   "name": "llm-interface",
-  "version": "0.0.6",
+  "version": "0.0.7",
   "main": "src/index.js",
   "description": "A simple, unified interface for integrating and interacting with multiple Large Language Model (LLM) APIs, including OpenAI, Anthropic, Google Gemini, Groq, and LlamaCPP.",
   "scripts": {
diff --git a/src/goose.js b/src/goose.js
new file mode 100644
index 0000000..69454cc
--- /dev/null
+++ b/src/goose.js
@@ -0,0 +1,88 @@
+/**
+ * @file goose.js
+ * @class Goose
+ * @description Wrapper class for the Goose AI API.
+ * @param {string} apiKey - The API key for Goose AI.
+ */
+
+const axios = require("axios");
+
+class Goose {
+  /**
+   * Creates an instance of the Goose class.
+   *
+   * @constructor
+   * @param {string} apiKey - The API key for Goose AI.
+   */
+  constructor(apiKey) {
+    this.client = axios.create({
+      baseURL: "https://api.goose.ai/v1/engines",
+      headers: {
+        "Content-Type": "application/json",
+        Authorization: `Bearer ${apiKey}`,
+      },
+    });
+  }
+
+  /**
+   * Sends a message to the Goose AI API and returns the response.
+   *
+   * @param {Object} message - The message object containing model and messages.
+   * @param {string} message.model - The model to use for the request.
+   * @param {Array} message.messages - The array of message objects to form the prompt.
+   * @param {Object} [options] - Optional parameters for the request.
+   * @returns {Promise<string|null>} The response text from the API, or null if no text was returned.
+   * @throws {Error} Throws an error if the request fails.
+   *
+   * @example
+   * const message = {
+   *   model: "gpt-neo-20b",
+   *   messages: [
+   *     { role: "system", content: "You are a helpful assistant." },
+   *     { role: "user", content: "Explain the importance of low latency LLMs." }
+   *   ],
+   * };
+   * goose.sendMessage(message, { max_tokens: 100 })
+   *   .then(response => console.log(response))
+   *   .catch(error => console.error(error));
+   */
+  async sendMessage(message, options = {}) {
+    const { model, messages } = message;
+
+    // Convert messages array to a single prompt string
+    const formattedPrompt = messages
+      .map((msg) => msg.content)
+      .join(" ");
+
+    const payload = {
+      prompt: formattedPrompt,
+      ...options,
+    };
+
+    try {
+      const url = `/${model}/completions`; // resolved against the client's baseURL
+      const response = await this.client.post(url, payload);
+      const responseText =
+        response &&
+        response.data &&
+        response.data.choices &&
+        response.data.choices[0] &&
+        response.data.choices[0].text
+          ? response.data.choices[0].text.trim()
+          : null;
+
+      return responseText;
+    } catch (error) {
+      if (error.response) {
+        console.error("Response data:", error.response.data);
+      } else if (error.request) {
+        console.error("No response received:", error.request);
+      } else {
+        console.error("Error setting up the request:", error.message);
+      }
+      throw error;
+    }
+  }
+}
+
+module.exports = Goose;
diff --git a/src/index.js b/src/index.js
index 5959e10..8dbd58f 100644
--- a/src/index.js
+++ b/src/index.js
@@ -1,6 +1,6 @@
 /**
  * @file index.js
- * @description Entry point for the LLM interface module, dynamically loading handlers for different LLM providers.
+ * @description Entry point for the LLM interface module, dynamically loading interfaces for different LLM providers.
  */
 
 const modules = {
@@ -10,11 +10,12 @@ const modules = {
   llamacpp: "./llamacpp",
   reka: "./reka",
   groq: "./groq",
+  goose: "./goose",
 };
 
-const handlers = {};
+const LLMInterface = {};
 
 Object.keys(modules).forEach((key) => {
-  Object.defineProperty(handlers, key, {
+  Object.defineProperty(LLMInterface, key, {
     get: function () {
       if (!this[`_${key}`]) {
         this[`_${key}`] = require(modules[key]);
@@ -26,4 +27,5 @@ Object.keys(modules).forEach((key) => {
   });
 });
 
-module.exports = handlers;
+const handlers = LLMInterface;
+module.exports = { LLMInterface, handlers };
diff --git a/test/anthropic.test.js b/test/anthropic.test.js
index 276c9c9..d3e8051 100644
--- a/test/anthropic.test.js
+++ b/test/anthropic.test.js
@@ -29,7 +29,7 @@ test("Anthropic API Client should send a message and receive a response", async
     ],
   };
 
-  const response = await anthropic.sendMessage(message, { max_tokens: 150 });
+  const response = await anthropic.sendMessage(message, { max_tokens: 100 });
   console.log(response);
   expect(typeof response).toBe("string");
 }, 30000); // Extend timeout to 30 seconds
diff --git a/test/goose.test.js b/test/goose.test.js
new file mode 100644
index 0000000..4e7f316
--- /dev/null
+++ b/test/goose.test.js
@@ -0,0 +1,29 @@
+/**
+ * @file goose.test.js
+ * @description Tests for the Goose AI API client.
+ */
+
+const Goose = require("../src/goose"); // Adjust path as needed
+const { gooseApiKey } = require("../config");
+
+test("Goose API Client should send a message and receive a response", async () => {
+  expect(typeof gooseApiKey).toBe("string");
+
+  const goose = new Goose(gooseApiKey);
+  const message = {
+    model: "gpt-neo-20b",
+    messages: [
+      {
+        role: "system",
+        content: "You are a helpful assistant.",
+      },
+      {
+        role: "user",
+        content: "Explain the importance of low latency LLMs.",
+      },
+    ],
+  };
+
+  const response = await goose.sendMessage(message, { max_tokens: 100 });
+  expect(typeof response).toBe("string");
+}, 30000); // Extend timeout to 30 seconds
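
The new `src/goose.js` performs two pure transformations: flattening the chat-style `messages` array into a single prompt string, and defensively extracting the completion text from the response. These can be exercised without network access; the sketch below mirrors that logic (the helper names `flattenMessages` and `extractText` are illustrative only, not part of the package's API):

```javascript
// Sketch of the two pure transformations inside Goose.sendMessage:
// message flattening and defensive response-text extraction.
// Helper names are illustrative; they are not exported by the package.

// Join the content of each message into the single prompt string that
// a completions-style endpoint expects.
function flattenMessages(messages) {
  return messages.map((msg) => msg.content).join(" ");
}

// Return the trimmed completion text, or null when any part of the
// expected axios-style response shape is missing.
function extractText(response) {
  return response &&
    response.data &&
    response.data.choices &&
    response.data.choices[0] &&
    response.data.choices[0].text
    ? response.data.choices[0].text.trim()
    : null;
}

const prompt = flattenMessages([
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Explain the importance of low latency LLMs." },
]);
console.log(prompt);
// → "You are a helpful assistant. Explain the importance of low latency LLMs."

console.log(extractText({ data: { choices: [{ text: " Low latency matters. " }] } }));
// → "Low latency matters."

console.log(extractText({ data: {} }));
// → null
```

Because the extraction returns `null` rather than throwing on a malformed response, callers only need a truthiness check on the result.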
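
The lazy-loading pattern in `src/index.js` — one getter per provider that loads the module on first access and caches it — can be sketched in isolation. In this hypothetical stand-in, plain factory functions replace the `require(modules[key])` calls, and a counter makes the caching observable:

```javascript
// Sketch of the lazy-loading pattern used in src/index.js: each provider
// is exposed as a getter that constructs its module on first access and
// caches it. Factories stand in for require() calls; the provider objects
// are placeholders, not the real wrapper classes.
const factories = {
  openai: () => ({ name: "openai" }),
  goose: () => ({ name: "goose" }),
};

const LLMInterface = {};
let loads = 0; // counts how many times a factory actually runs

Object.keys(factories).forEach((key) => {
  Object.defineProperty(LLMInterface, key, {
    get: function () {
      if (!this[`_${key}`]) {
        loads += 1;
        this[`_${key}`] = factories[key](); // cache on first access
      }
      return this[`_${key}`];
    },
  });
});

console.log(LLMInterface.goose.name); // → "goose"
const again = LLMInterface.goose; // second access hits the cache
console.log(again === LLMInterface.goose); // → true
console.log(loads); // → 1
```

The benefit for the real package is that requiring `llm-interface` does not pull in every provider SDK; only the interfaces you actually touch are loaded.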