# [Go] weave models.md #473

Merged (2 commits) on Jun 25, 2024
## docs-go/generate.sh (3 additions, 2 deletions)

```diff
@@ -6,6 +6,7 @@ if [[ ! -f $weave ]]; then
   go -C ../go install ./internal/cmd/weave
 fi
 
-$weave flows > flows.md
+for file in flows models; do
+  $weave $file > $file.md
+done
```

## docs-go/models (new file, 159 additions)
# Generating content

Firebase Genkit provides an easy interface for generating content with LLMs.

## Models

Models in Firebase Genkit are libraries and abstractions that provide access to
various Google and non-Google LLMs.

Models are fully instrumented for observability and come with tooling
integrations provided by the Genkit Developer UI: you can try any model using
the model runner.

When working with models in Genkit, you first need to configure the model you
want to work with. Model configuration is performed by the plugin system. In
this example, you configure the Vertex AI plugin, which provides access to the
Gemini models.

- {Go}

%include ../go/internal/doc-snippets/models.go import

%include ../go/internal/doc-snippets/models.go init

Note: Different plugins and models use different methods of
authentication. For example, the Vertex AI plugin uses the Google Auth Library,
so it can pull required credentials using Application Default Credentials.

To use models provided by the plugin, you need a reference to the specific model
and version:

- {Go}

%include ../go/internal/doc-snippets/models.go model

## Supported models

Genkit provides model support through its plugin system. The following plugins
are officially supported:

| Plugin | Models |
| ------------------------- | ------------------------------------------------------------------------ |
| [Google Generative AI][1] | Gemini Pro, Gemini Pro Vision |
| [Google Vertex AI][2] | Gemini Pro, Gemini Pro Vision, Gemini 1.5 Flash, Gemini 1.5 Pro, Imagen2 |
| [Ollama][3] | Many local models, including Gemma, Llama 2, Mistral, and more |

[1]: plugins/google-genai.md
[2]: plugins/vertex-ai.md
[3]: plugins/ollama.md

See the docs for each plugin for setup and usage information.

<!-- TODO: There's also a wide variety of community supported models available
you can discover by ... -->

## How to generate content

Genkit provides a simple helper function for generating content with models.

To make a basic model call:

- {Go}

%include ../go/internal/doc-snippets/models.go call

You can pass options along with the model call. The options that are supported
depend on the model and its API.

- {Go}

%include ../go/internal/doc-snippets/models.go options
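The included snippet is not rendered here. As a rough sketch of the general
pattern, per-call options such as temperature and output-token limits travel
with the request; the struct and field names below are illustrative, not
Genkit's actual API:

```go
package main

import "fmt"

// GenerationConfig mirrors the kind of per-call options a model API
// typically accepts. These names are illustrative, not Genkit's API.
type GenerationConfig struct {
	Temperature     float64
	MaxOutputTokens int
}

// generate stands in for a real model call; it just shows how the
// options accompany the prompt in the request.
func generate(prompt string, cfg GenerationConfig) string {
	return fmt.Sprintf("prompt=%q temp=%.1f max=%d",
		prompt, cfg.Temperature, cfg.MaxOutputTokens)
}

func main() {
	fmt.Println(generate("Tell me a joke.", GenerationConfig{
		Temperature:     0.7,
		MaxOutputTokens: 256,
	}))
}
```

Which options are honored, and their valid ranges, depends on the model and
its API.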

### Streaming responses

Genkit supports chunked streaming of model responses:

- {Go}

To use chunked streaming, pass a callback function to `Generate()`:

%include ../go/internal/doc-snippets/models.go streaming
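The shape of callback-based streaming can be sketched in plain Go; here the
model is simulated and `streamChunks` is a hypothetical stand-in for calling
`Generate()` with a callback:

```go
package main

import (
	"fmt"
	"strings"
)

// streamChunks simulates a model emitting its response in chunks and
// invoking a callback for each one. Returning an error from the
// callback aborts the stream; otherwise the full response is still
// returned at the end, as with Generate().
func streamChunks(chunks []string, cb func(chunk string) error) (string, error) {
	var full strings.Builder
	for _, c := range chunks {
		if err := cb(c); err != nil {
			return "", err
		}
		full.WriteString(c)
	}
	return full.String(), nil
}

func main() {
	full, err := streamChunks([]string{"Once", " upon", " a time"},
		func(chunk string) error {
			fmt.Print(chunk) // display each chunk as it arrives
			return nil
		})
	if err != nil {
		panic(err)
	}
	fmt.Println("\nfull:", full)
}
```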

## Multimodal input

If the model supports multimodal input, you can pass image prompts:

- {Go}

%include ../go/internal/doc-snippets/models.go multimodal

<!-- TODO: gs:// wasn't working for me. HTTP? -->

The exact format of the image prompt (`https` URL, `gs` URL, `data` URI) is
model-dependent.
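For the `data` URI case, a minimal sketch of packaging raw image bytes with
the standard library (the helper name is illustrative):

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// dataURI encodes raw image bytes as a data URI, one of the image
// prompt formats some models accept.
func dataURI(mimeType string, data []byte) string {
	return fmt.Sprintf("data:%s;base64,%s",
		mimeType, base64.StdEncoding.EncodeToString(data))
}

func main() {
	// In practice the bytes would come from os.ReadFile("photo.jpg").
	uri := dataURI("image/jpeg", []byte{0xFF, 0xD8, 0xFF})
	fmt.Println(uri)
}
```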

## Function calling (tools)

For models that support it, Genkit provides an interface for function calling
(tools).

- {Go}

%include ../go/internal/doc-snippets/models.go tools

Genkit automatically calls the tools as needed to fulfill the user prompt.
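Under the hood, a function-calling loop dispatches each tool request the model
emits to the matching Go function and returns the result to the model. A
minimal illustration of that dispatch step, with hypothetical tool names and
types (Genkit performs this loop for you):

```go
package main

import "fmt"

// toolRequest is an illustrative stand-in for a model's request to
// invoke a named tool with some input.
type toolRequest struct {
	Name  string
	Input string
}

// tools maps tool names to their implementations. The tool here is a
// stub; a real tool would call an external API.
var tools = map[string]func(input string) string{
	"currentWeather": func(city string) string {
		return "sunny in " + city
	},
}

// runTool executes the requested tool, or fails if it is unknown.
func runTool(req toolRequest) (string, error) {
	tool, ok := tools[req.Name]
	if !ok {
		return "", fmt.Errorf("unknown tool %q", req.Name)
	}
	return tool(req.Input), nil
}

func main() {
	out, err := runTool(toolRequest{Name: "currentWeather", Input: "Paris"})
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // the result is sent back to the model as context
}
```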

<!-- TODO: returnToolRequests: true` -->

<!--

### Adding retriever context

Documents from a retriever can be passed directly to `generate` to provide
grounding context:

```javascript
const docs = await companyPolicyRetriever({ query: question });

await generate({
model: geminiPro,
prompt: `Answer using the available context from company policy: ${question}`,

context: docs,
});
```

The document context is automatically appended to the content of the prompt
sent to the model.

-->

### Recording message history

Genkit models support maintaining a history of the messages sent to the model
and its responses, which you can use to build interactive experiences, such as
chatbots.

- {Go}

In the first prompt of a session, the "history" is simply the user prompt:

%include ../go/internal/doc-snippets/models.go hist1

When you get a response, add it to the history:

%include ../go/internal/doc-snippets/models.go hist2

You can serialize this history and persist it in a database or session storage.
For subsequent user prompts, add them to the history before calling
`Generate()`:

%include ../go/internal/doc-snippets/models.go hist3

If the model you're using supports the system role, you can use the initial
history to set the system message:

- {Go}

%include ../go/internal/doc-snippets/models.go hist4