Changelog page December 6, 2024 #89

Merged: 1 commit, Dec 8, 2024
10 changes: 9 additions & 1 deletion changelog/2024-12-05.mdx → fern/changelog/2024-12-05.mdx
@@ -1,4 +1,6 @@
1. **OAuth2 Support for Custom LLM Credentials and Webhooks**: You can now secure access to your [custom LLMs](https://docs.vapi.ai/customization/custom-llm/using-your-server#step-2-configuring-vapi-with-custom-llm) and [server urls (aka webhooks)](https://docs.vapi.ai/server-url) using OAuth2 (RFC 6749). Create a webhook credential with `CreateWebhookCredentialDTO` and specify the following information.
1. **OAuth2 Support for Custom LLM Credentials and Webhooks**: You can now authorize access to your [custom LLMs](https://docs.vapi.ai/customization/custom-llm/using-your-server#step-2-configuring-vapi-with-custom-llm) and [server urls (aka webhooks)](https://docs.vapi.ai/server-url) using OAuth2 (RFC 6749).

For example, create a webhook credential with `CreateWebhookCredentialDTO` with the following payload:

```json
{
@@ -13,4 +15,10 @@
}
```

This returns a [`WebhookCredential`](https://api.vapi.ai/api) object as follows:

<Frame caption="Refer to the `WebhookCredential` schema for more information">
<img src="../static/images/changelog/webhook-credential.png" />
</Frame>
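Under the hood, the `authenticationPlan` above drives a standard OAuth2 (RFC 6749) token exchange against your configured token URL. A minimal sketch of the form-encoded request body such an exchange would carry, assuming the client-credentials grant type (the exact parameters Vapi sends are an assumption, not documented here):

```python
from urllib.parse import urlencode

def build_token_request(client_id: str, client_secret: str) -> bytes:
    """Build the form-encoded body of an RFC 6749 client-credentials
    token request, as would be POSTed to the `authenticationPlan.url`
    configured in the credential above. Illustrative sketch only; the
    exact grant type and headers used by Vapi are assumptions."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode("utf-8")

# Placeholder values mirroring the payload fields shown above.
body = build_token_request("your-client-id", "your-client-secret")
```

The token endpoint's JSON response (an `access_token` plus an expiry) is what then authorizes the webhook calls.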

3. **Removal of Canonical Knowledge Base**: The ability to create, update, and use canonical knowledge bases in your assistant has been removed from the API (custom knowledge bases and the Trieve integration support a superset of this functionality). Please update your implementations, as endpoints and models referencing canonical knowledge base schemas are no longer available.
24 changes: 24 additions & 0 deletions fern/changelog/2024-12-06.mdx
@@ -0,0 +1,24 @@
1. **OAuth 2 Authentication for Custom LLM Models and Webhooks**: In addition to [AuthZ](https://www.okta.com/identity-101/authentication-vs-authorization/), you can now authenticate users accessing your [custom LLMs](https://docs.vapi.ai/customization/custom-llm/using-your-server#step-2-configuring-vapi-with-custom-llm) and [server urls (aka webhooks)](https://docs.vapi.ai/server-url) using OAuth2 (RFC 6749). Use the `authenticationSession` dictionary, which contains an `accessToken` and an `expiresAt` datetime, to authenticate further requests to your custom LLM or server URL.

For example, create a webhook credential with `CreateCustomLLMCredentialDTO` with the following payload:
```json
{
"provider": "custom-llm",
"apiKey": "your-api-key-max-10000-characters",
"authenticationPlan": {
"type": "oauth2",
"url": "https://your-url.com/your/path/token",
"clientId": "your-client-id",
"clientSecret": "your-client-secret"
},
"name": "your-credential-name-between-1-and-40-characters"
}
```

This returns a [`CustomLLMCredential`](https://api.vapi.ai/api) object as follows:

<Frame caption="Refer to the `CustomLLMCredential` schema for more information">
<img src="../static/images/changelog/custom-llm-credential.png" />
</Frame>

This can be used to authenticate successive requests to your custom LLM or server URL.
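Before reusing a cached `authenticationSession`, you would typically check its `expiresAt` and fetch a fresh token once it has lapsed. A minimal sketch, assuming `expiresAt` is an ISO 8601 UTC timestamp (the field's exact format is an assumption based on the description above):

```python
from datetime import datetime, timezone

def session_is_valid(session: dict) -> bool:
    """Return True if the authenticationSession's accessToken has not
    yet expired. Assumes `expiresAt` is an ISO 8601 UTC timestamp;
    this format is an assumption, not confirmed by the changelog."""
    # Python's fromisoformat (pre-3.11) does not accept a trailing "Z",
    # so normalize it to an explicit UTC offset first.
    expires_at = datetime.fromisoformat(session["expiresAt"].replace("Z", "+00:00"))
    return datetime.now(timezone.utc) < expires_at

# Hypothetical session payloads for illustration.
live_session = {"accessToken": "abc123", "expiresAt": "2099-01-01T00:00:00Z"}
stale_session = {"accessToken": "abc123", "expiresAt": "2000-01-01T00:00:00Z"}
```

A caller would re-run the OAuth2 token exchange whenever this check fails, rather than sending a request with an expired token.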