Merge branch 'fix/supportscompletions' of https://github.com/bufferoverflow/continue into bufferoverflow-fix/supportscompletions
sestinj committed Apr 18, 2024
2 parents 88fcace + e571e1c commit 78dd9af
Showing 2 changed files with 8 additions and 1 deletion.
3 changes: 2 additions & 1 deletion core/llm/index.ts
```diff
@@ -52,7 +52,8 @@ export abstract class BaseLLM implements ILLM {
     if (this.providerName === "openai") {
       if (
         this.apiBase?.includes("api.groq.com") ||
-        this.apiBase?.includes(":1337")
+        this.apiBase?.includes(":1337") ||
+        this._llmOptions.useLegacyCompletionsEndpoint?.valueOf() === false
       ) {
         // Jan + Groq don't support completions : (
         return false;
```
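The rule this hunk adds can be sketched as a standalone predicate: the legacy `completions` endpoint is skipped when the API base points at Groq, at Jan's default port, or when the user explicitly sets `useLegacyCompletionsEndpoint` to `false`. The names `LLMOptions` and `supportsCompletions` below are illustrative, not Continue's actual API, and the `?.valueOf()` call from the diff is simplified to a plain strict comparison:

```typescript
// Illustrative sketch of the endpoint-selection rule in this commit;
// interface and function names are assumptions, not Continue's real API.
interface LLMOptions {
  apiBase?: string;
  useLegacyCompletionsEndpoint?: boolean;
}

function supportsCompletions(opts: LLMOptions): boolean {
  if (
    opts.apiBase?.includes("api.groq.com") || // Groq has no completions endpoint
    opts.apiBase?.includes(":1337") || // Jan's default local port
    opts.useLegacyCompletionsEndpoint === false // explicit user opt-out
  ) {
    return false;
  }
  return true;
}

console.log(supportsCompletions({ apiBase: "https://api.groq.com/openai/v1" })); // false
console.log(supportsCompletions({})); // true
```

Note that only an explicit `false` disables the endpoint; leaving the option `undefined` keeps the previous behavior, which is why the strict `=== false` comparison matters.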
6 changes: 6 additions & 0 deletions docs/docs/reference/Model Providers/openai.md
````diff
@@ -52,4 +52,10 @@ If you are [using an OpenAI compatible server / API](../../model-setup/select-pr
 }
 ```
 
+To force usage of the `chat/completions` endpoint instead of `completions`, you can set
+
+```json
+"useLegacyCompletionsEndpoint": false
+```
+
 [View the source](https://github.com/continuedev/continue/blob/main/core/llm/llms/OpenAI.ts)
````
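In context, the new option would sit alongside the other fields of a model entry in `config.json`. A minimal sketch follows; every field other than `useLegacyCompletionsEndpoint` (title, model name, `apiBase` URL) is an assumed placeholder, not taken from this commit:

```json
{
  "models": [
    {
      "title": "My OpenAI-compatible server",
      "provider": "openai",
      "model": "MODEL_NAME",
      "apiBase": "http://localhost:8000/v1",
      "useLegacyCompletionsEndpoint": false
    }
  ]
}
```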
