How can I integrate the third-party API interface? #3942

Closed
1 task done
Karenina-na opened this issue May 7, 2023 · 12 comments
Labels: potential plugin (This may fit better into our plugin system.), Stale

@Karenina-na

Duplicates

  • I have searched the existing issues

Summary 💡

I would like to switch the interface of Auto-GPT from the official OpenAI API to a third-party interface, such as https://openai.api2d.net/. How do I proceed with the change?

Examples 🌈

None

Motivation 🔦

Free usage.

@Karenina-na
Author

For reference, I would like to modify the interface so that it can be invoked in a manner similar to the JavaScript code below. What steps do I need to take to modify the code? Thank you.

const https = require('https');

const postData = JSON.stringify({
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'Hello, what is your name?' }],
});

const options = {
  hostname: 'openai.api2d.net',
  port: 443,
  path: '/v1/chat/completions',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: 'Bearer fkxxxxx', // Forward Key; this is the third-party API interface.
  },
};

const req = https.request(options, (res) => {
  console.log('statusCode:', res.statusCode);
  console.log('headers:', res.headers);

  res.on('data', (d) => {
    process.stdout.write(d);
  });
});

req.on('error', (e) => {
  console.error(e);
});

req.write(postData);
req.end();

@Boostrix
Contributor

Boostrix commented May 12, 2023

Personally, my suggestion would be to add a new command to configure the API based on a JSON template and then pass that in. That way, people could tinker with different APIs, and as long as the JSON contains some comments/meta info, the agent could also tinker with variations of it (e.g. by fetching the API schema first).

See: #3651

@Karenina-na
Author

OK, I will give it a try.

@Boostrix
Contributor

Boostrix commented May 16, 2023

First, see: #2594

Note that you can probably take existing code snippets you may already have (like the JavaScript code above) and adapt them as needed with a little guidance from GPT itself.

Also, if this is possibly supposed to be reviewed/integrated at some point, it would make sense to use the equivalent of a "profile" (a Python dict with key/value pairs) to generalize the current config. That way, you would introduce profiles for different API interfaces.

This could work analogously to the "model" parameter (see llm_utils.py).
If more changes are needed to accommodate a different API, you will want to come up with an APIProvider abstraction and sub-class its interface.
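
As a rough sketch only (none of these names exist in the codebase; API_PROFILES, APIProvider and OpenAICompatibleProvider are all hypothetical), the profile plus abstraction idea could look something like this:

from abc import ABC, abstractmethod

import requests

# One "profile" (plain key/value pairs) per API interface, analogous to the "model" parameter.
API_PROFILES = {
    "openai": {
        "base_url": "https://api.openai.com",
        "auth_header": "Authorization",
        "chat_path": "/v1/chat/completions",
    },
    "api2d": {
        "base_url": "https://openai.api2d.net",
        "auth_header": "Authorization",
        "chat_path": "/v1/chat/completions",
    },
}

class APIProvider(ABC):
    """Abstract base class; sub-class this for APIs that need special handling."""

    def __init__(self, profile: dict, api_key: str):
        self.profile = profile
        self.api_key = api_key

    @abstractmethod
    def create_chat_completion(self, messages: list[dict], model: str) -> str:
        ...

class OpenAICompatibleProvider(APIProvider):
    """Covers any endpoint that mirrors OpenAI's /chat/completions schema."""

    def create_chat_completion(self, messages: list[dict], model: str) -> str:
        response = requests.post(
            self.profile["base_url"] + self.profile["chat_path"],
            headers={self.profile["auth_header"]: f"Bearer {self.api_key}"},
            json={"model": model, "messages": messages},
            timeout=60,
        )
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]

With something like that, a new OpenAI-compatible API is just a new profile entry, and only a genuinely different API needs its own sub-class.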

If in doubt, consider joining Discord to discuss the details. But with these hints (and some help from GPT) you can probably come up with a basic prototype rather quickly.

And before you spend more than 15 minutes coding up something like this, please do consider opening a draft PR and announcing your work on Discord, so that others are aware of your effort; that can probably save you a ton of time. Realistically, a simple prototype should not take longer than an hour to draft. If you find yourself needing more time, Discord is your friend.

@Boostrix Boostrix added the good first issue Good for newcomers label May 16, 2023
@DGdev91
Contributor

DGdev91 commented May 16, 2023

#2594 makes it possible to connect AutoGPT to any third-party service which is compliant with OpenAI's API, even if the underlying model is different.
There are already many projects which do that, like https://github.com/keldenl/gpt-llama.cpp, https://github.com/go-skynet/LocalAI and even the openai plugin of https://github.com/oobabooga/text-generation-webui.
That API seems to be OpenAI-compliant, so that PR should be enough.

Connecting to a non-compliant API would be a different matter. But I'm not sure that adding an "APIProvider" would be the right way; that should probably be handled by a plugin. And before making a PR (or even a draft), it would be a good idea to discuss it on Discord.

@Boostrix Boostrix added potential plugin This may fit better into our plugin system. and removed good first issue Good for newcomers labels May 16, 2023
@Karenina-na
Author

#2594 makes it possible to connect AutoGPT to any third-party service which is compliant with OpenAI's API, even if the underlying model is different. There are already many projects which do that, like https://github.com/keldenl/gpt-llama.cpp, https://github.com/go-skynet/LocalAI and even the openai plugin of https://github.com/oobabooga/text-generation-webui. That API seems to be OpenAI-compliant, so that PR should be enough.

Connecting to a non-compliant API would be a different matter. But I'm not sure that adding an "APIProvider" would be the right way; that should probably be handled by a plugin. And before making a PR (or even a draft), it would be a good idea to discuss it on Discord.

So, if my third-party interface follows OpenAI's API specification, can I use it this way? Thank you for your help.

@DGdev91
Contributor

DGdev91 commented May 18, 2023

So, if my third-party interface follows OpenAI's API specification, can I use it this way? Thank you for your help.

Precisely. Your API should expose the same methods offered by OpenAI's API, and then it should work just fine.

Note: There's still an open discussion about that PR and it may be implemented differently
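
As a minimal sanity check (an illustration, not part of the PR), you can point the openai Python package, i.e. the pre-1.0 interface Auto-GPT uses, at the third-party base URL; the key and URL below are placeholders:

import openai

openai.api_key = "fk-xxxxxxxx"                   # placeholder Forward Key
openai.api_base = "https://openai.api2d.net/v1"  # assumes the provider mirrors OpenAI's /v1 paths

# The same kind of call Auto-GPT issues through the openai library.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello, what is your name?"}],
)
print(response["choices"][0]["message"]["content"])

If that returns a normal chat completion, the endpoint is close enough to OpenAI's schema for the base-URL approach to work.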

@Karenina-na
Author

So, if my third-party interface follows OpenAI's API specification, can I use it this way? Thank you for your help.

Precisely. Your API should expose the same methods offered by OpenAI's API, and then it should work just fine.

Note: There's still an open discussion about that PR and it may be implemented differently

So, I have looked through your PR and added the following

        elif os.getenv("OPENAI_API_BASE_URL", None):
            openai.api_base = os.getenv("OPENAI_API_BASE_URL")

to config.py in the autogpt\config directory. Is it then sufficient to add OPENAI_API_BASE_URL=xxx to the .env file? I am new to GitHub and not very familiar with the workflow.

@DGdev91
Contributor

DGdev91 commented May 19, 2023

to config.py in the autogpt\config directory. Is it then sufficient to add OPENAI_API_BASE_URL=xxx to the .env file? I am new to GitHub and not very familiar with the workflow.

Indeed, and you'll also have to set EMBED_DIM according to the model you are using.
In the config file you'll find some examples for llama and derivatives.
Also, you'll have to use my fork if you want to try that. The code has just been proposed, but not yet merged into the main project. https://github.com/DGdev91/Auto-GPT
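
A hypothetical .env excerpt for that setup might look like the following (the variable names follow the discussion above; whether the /v1 suffix belongs in the base URL depends on the provider, and 1536 is the embedding dimension of OpenAI's text-embedding-ada-002, so llama-based models need the different values shown in the config examples):

OPENAI_API_KEY=fk-xxxxxxxx
OPENAI_API_BASE_URL=https://openai.api2d.net/v1
EMBED_DIM=1536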

@Karenina-na
Author

to config.py in the autogpt\config directory. Is it then sufficient to add OPENAI_API_BASE_URL=xxx to the .env file? I am new to GitHub and not very familiar with the workflow.

Indeed, and you'll also have to set EMBED_DIM according to the model you are using.
In the config file you'll find some examples for llama and derivatives.
Also, you'll have to use my fork if you want to try that. The code has just been proposed, but not yet merged into the main project. https://github.com/DGdev91/Auto-GPT

OK, I will give it a try. Thank you.

@github-actions
Contributor

github-actions bot commented Sep 6, 2023

This issue has automatically been marked as stale because it has not had any activity in the last 50 days. You can unstale it by commenting or removing the label. Otherwise, this issue will be closed in 10 days.
