v0.0.7
- Added support for Goose AI
- Renamed the main handler (while maintaining backwards compatibility)
samestrin committed Jun 15, 2024
1 parent 82323f3 commit 52f45d5
Showing 9 changed files with 145 additions and 20 deletions.
16 changes: 10 additions & 6 deletions README.md
@@ -6,12 +6,16 @@

## Introduction

The LLM Interface project is a versatile and comprehensive wrapper designed to interact with multiple Large Language Model (LLM) APIs. It simplifies integrating various LLM providers, including **OpenAI, Anthropic, Google Gemini, Groq, Reka AI, and LLaMA.cpp**, into your applications. This project aims to provide a simplified and unified interface for sending messages and receiving responses from different LLM services, making it easier for developers to work with multiple LLMs without worrying about the specific intricacies of each API.
The LLM Interface project is a versatile and comprehensive wrapper designed to interact with multiple Large Language Model (LLM) APIs. It simplifies integrating various LLM providers, including **OpenAI, Anthropic, Google Gemini, Goose AI, Groq, Reka AI, and LLaMA.cpp**, into your applications. This project aims to provide a simplified and unified interface for sending messages and receiving responses from different LLM services, making it easier for developers to work with multiple LLMs without worrying about the specific intricacies of each API.

## New in 0.0.7

- **Goose AI**: Added support for Goose AI

## Features

- **Unified Interface**: A single, consistent interface to interact with multiple LLM APIs.
- **Dynamic Module Loading**: Automatically loads and manages different LLM handlers.
- **Dynamic Module Loading**: Automatically loads and manages the different LLM interfaces.
- **Error Handling**: Robust error handling mechanisms to ensure reliable API interactions.
- **Extensible**: Easily extendable to support additional LLM providers as needed.
- **JSON Output**: Simple to use JSON output for OpenAI and Gemini responses.
@@ -20,7 +24,7 @@ The LLM Interface project is a versatile and comprehensive wrapper designed to i

The project relies on several npm packages and APIs. Here are the primary dependencies:

- `axios`: For making HTTP requests (used for LLaMA.cpp and Reka).
- `axios`: For making HTTP requests (used for Goose AI, LLaMA.cpp and Reka).
- `@anthropic-ai/sdk`: SDK for interacting with the Anthropic API.
- `@google/generative-ai`: SDK for interacting with the Google Gemini API.
- `groq-sdk`: SDK for interacting with the Groq API.
@@ -42,19 +46,19 @@ npm install llm-interface
Import `llm-interface` using:

```javascript
const handlers = require("llm-interface");
const LLMInterface = require("llm-interface");
```

or

```javascript
import handlers from "llm-interface";
import LLMInterface from "llm-interface";
```

then call the handler you want to use:

```javascript
const openai = new handlers.openai(process.env.OPENAI_API_KEY);
const openai = new LLMInterface.openai(process.env.OPENAI_API_KEY);
const message = {
model: "gpt-3.5-turbo",
messages: [
1 change: 1 addition & 0 deletions config.js
@@ -12,4 +12,5 @@ module.exports = {
llamaURL: process.env.LLAMACPP_URL,
anthropicApiKey: process.env.ANTHROPIC_API_KEY,
rekaApiKey: process.env.REKA_API_KEY,
gooseApiKey: process.env.GOOSE_API_KEY,
};
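As the one-line addition above shows, `config.js` simply mirrors environment variables into an exported object. A standalone sketch of that pattern (key names from the repo's env template; the value set here is illustrative):

```javascript
// Sketch of the pattern used in config.js: each key is read straight from
// process.env, so an unset variable simply surfaces as undefined.
process.env.GOOSE_API_KEY = "example-goose-key"; // illustrative value for demonstration

const config = {
  gooseApiKey: process.env.GOOSE_API_KEY,
  llamaURL: process.env.LLAMACPP_URL, // remains undefined unless exported
};

console.log(config.gooseApiKey); // → "example-goose-key"
```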
16 changes: 8 additions & 8 deletions docs/USAGE.md
@@ -2,10 +2,10 @@

## Usage Examples

First, require the handlers from the `llm-interface` package:
First, require the LLMInterface from the `llm-interface` package:

```javascript
const handlers = require("llm-interface");
const LLMInterface = require("llm-interface");
```

### OpenAI Interface
@@ -15,7 +15,7 @@ The OpenAI interface allows you to send messages to the OpenAI API.
#### Example

```javascript
const openai = new handlers.openai(process.env.OPENAI_API_KEY);
const openai = new LLMInterface.openai(process.env.OPENAI_API_KEY);

const message = {
model: "gpt-3.5-turbo",
@@ -42,7 +42,7 @@ The Anthropic interface allows you to send messages to the Anthropic API.
#### Example

```javascript
const anthropic = new handlers.anthropic(process.env.ANTHROPIC_API_KEY);
const anthropic = new LLMInterface.anthropic(process.env.ANTHROPIC_API_KEY);

const message = {
model: "claude-3-opus-20240229",
@@ -74,7 +74,7 @@ The Gemini interface allows you to send messages to the Google Gemini API.
#### Example

```javascript
const gemini = new handlers.gemini(process.env.GEMINI_API_KEY);
const gemini = new LLMInterface.gemini(process.env.GEMINI_API_KEY);

const message = {
model: "gemini-1.5-flash",
@@ -101,7 +101,7 @@ The Groq interface allows you to send messages to the Groq API.
#### Example

```javascript
const groq = new handlers.groq(process.env.GROQ_API_KEY);
const groq = new LLMInterface.groq(process.env.GROQ_API_KEY);

const message = {
model: "llama3-8b-8192",
@@ -128,7 +128,7 @@ The Reka AI interface allows you to send messages to the Reka AI REST API.
#### Example

```javascript
const reka = new handlers.reka(process.env.REKA_API_KEY);
const reka = new LLMInterface.reka(process.env.REKA_API_KEY);

const message = {
model: "reka-core",
@@ -156,7 +156,7 @@ The LLaMA.cpp interface allows you to send messages to the LLaMA.cpp API; this i
#### Example

```javascript
const llamacpp = new handlers.llamacpp(process.env.LLAMACPP_URL);
const llamacpp = new LLMInterface.llamacpp(process.env.LLAMACPP_URL);

const message = {
model: "some-llamacpp-model",
1 change: 1 addition & 0 deletions env
@@ -3,4 +3,5 @@
GEMINI_API_KEY=
ANTHROPIC_API_KEY=
REKA_API_KEY=
GOOSE_API_KEY=
LLAMACPP_URL=http://localhost:8080/completions
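The same keys can be exported directly in the shell instead of being kept in an env file (key names from the template above; the values are placeholders):

```shell
# Placeholders only — substitute real credentials before running.
export GOOSE_API_KEY="your-goose-api-key"
export LLAMACPP_URL="http://localhost:8080/completions"
echo "$GOOSE_API_KEY"
```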
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "llm-interface",
"version": "0.0.6",
"version": "0.0.7",
"main": "src/index.js",
"description": "A simple, unified interface for integrating and interacting with multiple Large Language Model (LLM) APIs, including OpenAI, Anthropic, Google Gemini, Groq, and LlamaCPP.",
"scripts": {
88 changes: 88 additions & 0 deletions src/goose.js
@@ -0,0 +1,88 @@
/**
* @file goose.js
* @class Goose
* @description Wrapper class for the Goose AI API.
* @param {string} apiKey - The API key for Goose AI.
*/

const axios = require("axios");

class Goose {
/**
* Creates an instance of the Goose class.
*
* @constructor
* @param {string} apiKey - The API key for Goose AI.
*/
constructor(apiKey) {
this.client = axios.create({
baseURL: "https://api.goose.ai/v1/engines",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${apiKey}`,
},
});
}

/**
* Sends a message to the Goose AI API and returns the response.
*
* @param {Object} message - The message object containing model and messages.
* @param {string} message.model - The model to use for the request.
* @param {Array<Object>} message.messages - The array of message objects to form the prompt.
* @param {Object} [options] - Optional parameters for the request.
* @returns {Promise<Array<string>>} The response text from the API.
* @throws {Error} Throws an error if the request fails.
*
* @example
* const message = {
* model: "gpt-neo-20b",
* messages: [
* { role: "system", content: "You are a helpful assistant." },
* { role: "user", content: "Explain the importance of low latency LLMs." }
* ],
* };
* goose.sendMessage(message, { max_tokens: 100 })
* .then(response => console.log(response))
* .catch(error => console.error(error));
*/
async sendMessage(message, options = {}) {
const { model, messages } = message;

// Convert messages array to a single prompt string
const formattedPrompt = messages
.map((msg) => msg.content)
.join(" ");

const payload = {
prompt: formattedPrompt,
...options,
};

try {
// baseURL already points at /v1/engines, so only the engine path is needed
const response = await this.client.post(`/${model}/completions`, payload);
const responseText =
response &&
response.data &&
response.data.choices &&
response.data.choices[0] &&
response.data.choices[0].text
? response.data.choices[0].text.trim()
: null;

return responseText;
} catch (error) {
if (error.response) {
console.error("Response data:", error.response.data);
} else if (error.request) {
console.error("No response received:", error.request);
} else {
console.error("Error setting up the request:", error.message);
}
throw error;
}
}
}

module.exports = Goose;
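Because Goose AI's completions endpoint takes a single prompt string rather than a chat-style messages array, `sendMessage` flattens the array by joining each message's content on spaces. That transformation in isolation:

```javascript
// Standalone sketch of the prompt flattening performed inside Goose.sendMessage.
const messages = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Explain the importance of low latency LLMs." },
];

// Role information is discarded; only the content fields survive the join.
const formattedPrompt = messages.map((msg) => msg.content).join(" ");
console.log(formattedPrompt);
// → "You are a helpful assistant. Explain the importance of low latency LLMs."
```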
10 changes: 6 additions & 4 deletions src/index.js
@@ -1,6 +1,6 @@
/**
* @file index.js
* @description Entry point for the LLM interface module, dynamically loading handlers for different LLM providers.
* @description Entry point for the LLM interface module, dynamically loading the interface classes for different LLM providers.
*/

const modules = {
@@ -10,11 +10,12 @@ const modules = {
llamacpp: "./llamacpp",
reka: "./reka",
groq: "./groq",
goose: "./goose",
};

const handlers = {};
const LLMInterface = {};
Object.keys(modules).forEach((key) => {
Object.defineProperty(handlers, key, {
Object.defineProperty(LLMInterface, key, {
get: function () {
if (!this[`_${key}`]) {
this[`_${key}`] = require(modules[key]);
@@ -26,4 +27,5 @@ Object.keys(modules).forEach((key) => {
});
});

module.exports = handlers;
const handlers = LLMInterface;
module.exports = { LLMInterface, handlers };
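The getter-based lazy loading in the diff above can be seen in miniature below: each provider module is only `require`d on first access and then cached, and aliasing `handlers` to `LLMInterface` is what preserves backwards compatibility. The module map here is an illustrative stand-in, not the repo's real one:

```javascript
// Miniature of the lazy-loading pattern in src/index.js; the string value
// stands in for the module object that require() would return.
const modules = { goose: "goose-module-stub" };

const LLMInterface = {};
Object.keys(modules).forEach((key) => {
  Object.defineProperty(LLMInterface, key, {
    get: function () {
      if (!this[`_${key}`]) {
        this[`_${key}`] = modules[key]; // real code: require(modules[key])
      }
      return this[`_${key}`]; // cached after the first access
    },
  });
});

// The old name is an alias for the same object, so existing code keeps working.
const handlers = LLMInterface;
console.log(handlers.goose === LLMInterface.goose); // → true
```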
2 changes: 1 addition & 1 deletion test/anthropic.test.js
@@ -29,7 +29,7 @@ test("Anthropic API Client should send a message and receive a response", async
],
};

const response = await anthropic.sendMessage(message, { max_tokens: 150 });
const response = await anthropic.sendMessage(message, { max_tokens: 100 });
console.log(response);
expect(typeof response).toBe("string");
}, 30000); // Extend timeout to 30 seconds
29 changes: 29 additions & 0 deletions test/goose.test.js
@@ -0,0 +1,29 @@
/**
* @file goose.test.js
* @description Tests for the Goose AI API client.
*/

const Goose = require("../src/goose"); // Adjust path as needed
const { gooseApiKey } = require("../config");

test("Goose API Client should send a message and receive a response", async () => {
expect(typeof gooseApiKey).toBe("string");

const goose = new Goose(gooseApiKey);
const message = {
model: "gpt-neo-20b",
messages: [
{
role: "system",
content: "You are a helpful assistant.",
},
{
role: "user",
content: "Explain the importance of low latency LLMs.",
},
],
};

const response = await goose.sendMessage(message, { max_tokens: 100 });
expect(typeof response).toBe("string");
}, 30000); // Extend timeout to 30 seconds
