feat(Traefik Hub): add APICatalogItem and ManagedSubscription support
jspdown authored Nov 26, 2024
1 parent 4c2a65e · commit 6bfdd50
Showing 8 changed files with 687 additions and 1 deletion.
245 changes: 245 additions & 0 deletions traefik/crds/hub.traefik.io_aiservices.yaml
@@ -0,0 +1,245 @@
---
apiVersion: apiextensions.k8s.io/v1
kind: CustomResourceDefinition
metadata:
  annotations:
    controller-gen.kubebuilder.io/version: v0.14.0
  name: aiservices.hub.traefik.io
spec:
  group: hub.traefik.io
  names:
    kind: AIService
    listKind: AIServiceList
    plural: aiservices
    singular: aiservice
  scope: Namespaced
  versions:
  - name: v1alpha1
    schema:
      openAPIV3Schema:
        description: AIService is a Kubernetes-like Service to interact with a text-based
          LLM provider. It defines the parameters and credentials required to interact
          with various LLM providers.
        properties:
          apiVersion:
            description: |-
              APIVersion defines the versioned schema of this representation of an object.
              Servers should convert recognized schemas to the latest internal value, and
              may reject unrecognized values.
              More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#resources
            type: string
          kind:
            description: |-
              Kind is a string value representing the REST resource this object represents.
              Servers may infer this from the endpoint the client submits requests to.
              Cannot be updated.
              In CamelCase.
              More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#types-kinds
            type: string
          metadata:
            type: object
          spec:
            description: The desired behavior of this AIService.
            properties:
              anthropic:
                description: Anthropic configures Anthropic backend.
                properties:
                  model:
                    type: string
                  params:
                    description: Params holds the LLM hyperparameters.
                    properties:
                      frequencyPenalty:
                        type: number
                      maxTokens:
                        type: integer
                      presencePenalty:
                        type: number
                      temperature:
                        type: number
                      topP:
                        type: number
                    type: object
                  token:
                    type: string
                required:
                - token
                type: object
              azureOpenai:
                description: AzureOpenAI configures AzureOpenAI.
                properties:
                  apiKey:
                    type: string
                  baseUrl:
                    type: string
                  deploymentName:
                    type: string
                  model:
                    type: string
                  params:
                    description: Params holds the LLM hyperparameters.
                    properties:
                      frequencyPenalty:
                        type: number
                      maxTokens:
                        type: integer
                      presencePenalty:
                        type: number
                      temperature:
                        type: number
                      topP:
                        type: number
                    type: object
                required:
                - apiKey
                - baseUrl
                - deploymentName
                type: object
              bedrock:
                description: Bedrock configures Bedrock backend.
                properties:
                  model:
                    type: string
                  params:
                    description: Params holds the LLM hyperparameters.
                    properties:
                      frequencyPenalty:
                        type: number
                      maxTokens:
                        type: integer
                      presencePenalty:
                        type: number
                      temperature:
                        type: number
                      topP:
                        type: number
                    type: object
                  region:
                    type: string
                  systemMessage:
                    type: boolean
                type: object
              cohere:
                description: Cohere configures Cohere backend.
                properties:
                  model:
                    type: string
                  params:
                    description: Params holds the LLM hyperparameters.
                    properties:
                      frequencyPenalty:
                        type: number
                      maxTokens:
                        type: integer
                      presencePenalty:
                        type: number
                      temperature:
                        type: number
                      topP:
                        type: number
                    type: object
                  token:
                    type: string
                required:
                - token
                type: object
              gemini:
                description: Gemini configures Gemini backend.
                properties:
                  apiKey:
                    type: string
                  model:
                    type: string
                  params:
                    description: Params holds the LLM hyperparameters.
                    properties:
                      frequencyPenalty:
                        type: number
                      maxTokens:
                        type: integer
                      presencePenalty:
                        type: number
                      temperature:
                        type: number
                      topP:
                        type: number
                    type: object
                required:
                - apiKey
                type: object
              mistral:
                description: Mistral configures Mistral AI backend.
                properties:
                  apiKey:
                    type: string
                  model:
                    type: string
                  params:
                    description: Params holds the LLM hyperparameters.
                    properties:
                      frequencyPenalty:
                        type: number
                      maxTokens:
                        type: integer
                      presencePenalty:
                        type: number
                      temperature:
                        type: number
                      topP:
                        type: number
                    type: object
                required:
                - apiKey
                type: object
              ollama:
                description: Ollama configures Ollama backend.
                properties:
                  baseUrl:
                    type: string
                  model:
                    type: string
                  params:
                    description: Params holds the LLM hyperparameters.
                    properties:
                      frequencyPenalty:
                        type: number
                      maxTokens:
                        type: integer
                      presencePenalty:
                        type: number
                      temperature:
                        type: number
                      topP:
                        type: number
                    type: object
                required:
                - baseUrl
                type: object
              openai:
                description: OpenAI configures OpenAI.
                properties:
                  model:
                    type: string
                  params:
                    description: Params holds the LLM hyperparameters.
                    properties:
                      frequencyPenalty:
                        type: number
                      maxTokens:
                        type: integer
                      presencePenalty:
                        type: number
                      temperature:
                        type: number
                      topP:
                        type: number
                    type: object
                  token:
                    type: string
                required:
                - token
                type: object
            type: object
        type: object
    served: true
    storage: true
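
For reference, a minimal manifest that this schema would accept might look like the sketch below. It uses only fields defined above (the openai backend with token, model, and params); the resource name, namespace, model name, and token value are hypothetical placeholders.

apiVersion: hub.traefik.io/v1alpha1
kind: AIService
metadata:
  name: my-openai-service     # hypothetical name
  namespace: default
spec:
  openai:
    token: "sk-..."           # hypothetical credential placeholder
    model: gpt-4o             # hypothetical model name
    params:
      temperature: 0.2
      maxTokens: 1024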
4 changes: 3 additions & 1 deletion traefik/crds/hub.traefik.io_apiaccesses.yaml
@@ -14,7 +14,9 @@ spec:
     singular: apiaccess
   scope: Namespaced
   versions:
-  - name: v1alpha1
+  - deprecated: true
+    deprecationWarning: APIAccess is deprecated in favor of APICatalogItems and ManagedSubscription
+    name: v1alpha1
     schema:
       openAPIV3Schema:
         description: APIAccess defines who can access to a set of APIs.
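
Because the version is now marked deprecated: true with a deprecationWarning, the Kubernetes API server returns that message as a warning on requests to this version, and kubectl surfaces it when applying an APIAccess. The output would look roughly like the following (resource name and file name are hypothetical):

$ kubectl apply -f apiaccess.yaml
Warning: APIAccess is deprecated in favor of APICatalogItems and ManagedSubscription
apiaccess.hub.traefik.io/my-api-access created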