From 6706dce5779f212271fdd55281b238051424ed2c Mon Sep 17 00:00:00 2001 From: gagb Date: Tue, 3 Dec 2024 16:54:53 -0800 Subject: [PATCH] Improves tutorial (#4507) * Change nav depth * Remove repetitive module names * Increase nav depth * Decrease base font size of docs * Improve docs * Undo css change --- .../user-guide/agentchat-user-guide/index.md | 11 +- .../agentchat-user-guide/installation.md | 4 +- .../agentchat-user-guide/quickstart.ipynb | 10 +- .../agentchat-user-guide/tutorial/index.md | 6 +- .../tutorial/models.ipynb | 368 +++++++++--------- 5 files changed, 188 insertions(+), 211 deletions(-) diff --git a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/index.md b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/index.md index 82ecb204b752..02a0b95178ae 100644 --- a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/index.md +++ b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/index.md @@ -13,18 +13,9 @@ For beginner users, AgentChat is the recommended starting point. For advanced users, [`autogen-core`](../core-user-guide/index.md)'s event-driven programming model provides more flexibility and control over the underlying components. -AgentChat aims to provide intuitive defaults, such as **Agents** with preset +AgentChat provides intuitive defaults, such as **Agents** with preset behaviors and **Teams** with predefined [multi-agent design patterns](../core-user-guide/design-patterns/index.md). -to simplify building multi-agent applications. -```{include} warning.md - -``` - -```{tip} -If you are interested in implementing complex agent interaction behaviours, defining custom messaging protocols, or orchestration mechanisms, consider using the [ `autogen-core`](../core-user-guide/index.md) package. - -``` ::::{grid} 2 2 2 2 :gutter: 3 diff --git a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/installation.md b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/installation.md index c3d256e20aef..3f1b1d1496bb 100644 --- a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/installation.md +++ b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/installation.md @@ -7,7 +7,7 @@ myst: # Installation -## Create a virtual environment (optional) +## Create a Virtual Environment (optional) When installing AgentChat locally, we recommend using a virtual environment for the installation. This will ensure that the dependencies for AgentChat are isolated from the rest of your system. 
@@ -55,7 +55,7 @@ conda deactivate `````` -## Intall the AgentChat package using pip +## Install Using pip Install the `autogen-agentchat` package using pip: diff --git a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/quickstart.ipynb b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/quickstart.ipynb index 378e05e220dc..e4f5817f1ddc 100644 --- a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/quickstart.ipynb +++ b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/quickstart.ipynb @@ -11,15 +11,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "```{include} warning.md\n", - "\n", - "```\n", - "\n", - ":::{note}\n", - "For installation instructions, please refer to the [installation guide](./installation).\n", - ":::\n", - "\n", - "In AutoGen AgentChat, you can build applications quickly using preset agents.\n", + "Via AgentChat, you can build applications quickly using preset agents.\n", "To illustrate this, we will begin with creating a team of a single agent\n", "that can use tools and respond to messages.\n", "\n", diff --git a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/index.md b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/index.md index fa0ed5f6caa3..597063a0c01a 100644 --- a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/index.md +++ b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/index.md @@ -7,11 +7,7 @@ myst: # Tutorial -Tutorial to get started with AgentChat. - -```{include} ../warning.md - -``` +Get started with AgentChat through this comprehensive tutorial. ::::{grid} 2 2 2 3 :gutter: 3 diff --git a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb index bd2bcaf230f8..dc521efa550b 100644 --- a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb +++ b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb @@ -1,187 +1,185 @@ { - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Models\n", - "\n", - "In many cases, agents need access to model services such as OpenAI, Azure OpenAI, and local models.\n", - "AgentChat utilizes model clients provided by the\n", - "[`autogen-ext`](../../core-user-guide/framework/model-clients.ipynb) package." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## OpenAI\n", - "\n", - "To access OpenAI models, you need to install the `openai` extension to use the {py:class}`~autogen_ext.models.OpenAIChatCompletionClient`." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": { - "vscode": { - "languageId": "shellscript" - } - }, - "outputs": [], - "source": [ - "pip install 'autogen-ext[openai]==0.4.0.dev8'" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "You will also need to obtain an [API key](https://platform.openai.com/account/api-keys) from OpenAI." 
- ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": {}, - "outputs": [], - "source": [ - "from autogen_ext.models import OpenAIChatCompletionClient\n", - "\n", - "opneai_model_client = OpenAIChatCompletionClient(\n", - " model=\"gpt-4o-2024-08-06\",\n", - " # api_key=\"sk-...\", # Optional if you have an OPENAI_API_KEY environment variable set.\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "To test the model client, you can use the following code:" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "CreateResult(finish_reason='stop', content='The capital of France is Paris.', usage=RequestUsage(prompt_tokens=15, completion_tokens=7), cached=False, logprobs=None)\n" - ] - } - ], - "source": [ - "from autogen_core.components.models import UserMessage\n", - "\n", - "result = await opneai_model_client.create([UserMessage(content=\"What is the capital of France?\", source=\"user\")])\n", - "print(result)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "```{note}\n", - "You can use this client with models hosted on OpenAI-compatible endpoints, however, we have not tested this functionality.\n", - "See {py:class}`~autogen_ext.models.OpenAIChatCompletionClient` for more information.\n", - "```" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Azure OpenAI\n", - "\n", - "Install the `azure` and `openai` extensions to use the {py:class}`~autogen_ext.models.AzureOpenAIChatCompletionClient`." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": { - "vscode": { - "languageId": "shellscript" - } - }, - "outputs": [], - "source": [ - "pip install 'autogen-ext[openai,azure]==0.4.0.dev8'" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "To use the client, you need to provide your deployment id, Azure Cognitive Services endpoint, api version, and model capabilities.\n", - "For authentication, you can either provide an API key or an Azure Active Directory (AAD) token credential.\n", - "\n", - "The following code snippet shows how to use AAD authentication.\n", - "The identity used must be assigned the [Cognitive Services OpenAI User](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/role-based-access-control#cognitive-services-openai-user) role." 
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "from autogen_ext.models import AzureOpenAIChatCompletionClient\n",
- "from azure.identity import DefaultAzureCredential, get_bearer_token_provider\n",
- "\n",
- "# Create the token provider\n",
- "token_provider = get_bearer_token_provider(DefaultAzureCredential(), \"https://cognitiveservices.azure.com/.default\")\n",
- "\n",
- "az_model_client = AzureOpenAIChatCompletionClient(\n",
- " azure_deployment=\"{your-azure-deployment}\",\n",
- " model=\"{model-name, such as gpt-4o}\",\n",
- " api_version=\"2024-06-01\",\n",
- " azure_endpoint=\"https://{your-custom-endpoint}.openai.azure.com/\",\n",
- " azure_ad_token_provider=token_provider, # Optional if you choose key-based authentication.\n",
- " # api_key=\"sk-...\", # For key-based authentication.\n",
- ")"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "See [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/managed-identity#chat-completions) for how to use the Azure client directly or for more info."
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "## Local Models\n",
- "\n",
- "We are working on it. Stay tuned!"
- ]
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": ".venv",
- "language": "python",
- "name": "python3"
- },
- "language_info": {
- "codemirror_mode": {
- "name": "ipython",
- "version": 3
- },
- "file_extension": ".py",
- "mimetype": "text/x-python",
- "name": "python",
- "nbconvert_exporter": "python",
- "pygments_lexer": "ipython3",
- "version": "3.11.5"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 2
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Models\n",
+ "\n",
+ "In many cases, agents need access to LLM services such as OpenAI, Azure OpenAI, or local models. Since there are many different providers with different APIs, `autogen-core` implements a protocol for [model clients](../../core-user-guide/framework/model-clients.ipynb), and `autogen-ext` implements a set of model clients for popular model services. AgentChat can use these model clients to interact with model services."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## OpenAI\n",
+ "\n",
+ "To access OpenAI models, install the `openai` extension, which allows you to use the {py:class}`~autogen_ext.models.OpenAIChatCompletionClient`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "vscode": {
+ "languageId": "shellscript"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "pip install 'autogen-ext[openai]==0.4.0.dev8'"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "You will also need to obtain an [API key](https://platform.openai.com/account/api-keys) from OpenAI."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 10,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from autogen_ext.models import OpenAIChatCompletionClient\n",
+ "\n",
+ "openai_model_client = OpenAIChatCompletionClient(\n",
+ " model=\"gpt-4o-2024-08-06\",\n",
+ " # api_key=\"sk-...\", # Optional if you have an OPENAI_API_KEY environment variable set.\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "To test the model client, you can use the following code:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 11,
+ "metadata": {},
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "CreateResult(finish_reason='stop', content='The capital of France is Paris.', usage=RequestUsage(prompt_tokens=15, completion_tokens=7), cached=False, logprobs=None)\n"
+ ]
+ }
+ ],
+ "source": [
+ "from autogen_core.components.models import UserMessage\n",
+ "\n",
+ "result = await openai_model_client.create([UserMessage(content=\"What is the capital of France?\", source=\"user\")])\n",
+ "print(result)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "```{note}\n",
+ "You can use this client with models hosted on OpenAI-compatible endpoints; however, we have not tested this functionality.\n",
+ "See {py:class}`~autogen_ext.models.OpenAIChatCompletionClient` for more information.\n",
+ "```"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Azure OpenAI\n",
+ "\n",
+ "Similarly, install the `azure` and `openai` extensions to use the {py:class}`~autogen_ext.models.AzureOpenAIChatCompletionClient`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "vscode": {
+ "languageId": "shellscript"
+ }
+ },
+ "outputs": [],
+ "source": [
+ "pip install 'autogen-ext[openai,azure]==0.4.0.dev8'"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "To use the client, you need to provide your deployment ID, Azure Cognitive Services endpoint, API version, and model capabilities.\n",
+ "For authentication, you can either provide an API key or an Azure Active Directory (AAD) token credential.\n",
+ "\n",
+ "The following code snippet shows how to use AAD authentication.\n",
+ "The identity used must be assigned the [Cognitive Services OpenAI User](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/role-based-access-control#cognitive-services-openai-user) role."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from autogen_ext.models import AzureOpenAIChatCompletionClient\n", + "from azure.identity import DefaultAzureCredential, get_bearer_token_provider\n", + "\n", + "# Create the token provider\n", + "token_provider = get_bearer_token_provider(DefaultAzureCredential(), \"https://cognitiveservices.azure.com/.default\")\n", + "\n", + "az_model_client = AzureOpenAIChatCompletionClient(\n", + " azure_deployment=\"{your-azure-deployment}\",\n", + " model=\"{model-name, such as gpt-4o}\",\n", + " api_version=\"2024-06-01\",\n", + " azure_endpoint=\"https://{your-custom-endpoint}.openai.azure.com/\",\n", + " azure_ad_token_provider=token_provider, # Optional if you choose key-based authentication.\n", + " # api_key=\"sk-...\", # For key-based authentication.\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "See [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/managed-identity#chat-completions) for how to use the Azure client directly or for more information." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Local Models\n", + "\n", + "We are working on it. Stay tuned!" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": ".venv", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.11.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 }
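The note in the Models tutorial above mentions that {py:class}`~autogen_ext.models.OpenAIChatCompletionClient` can also be pointed at OpenAI-compatible endpoints, and the Local Models section is still a stub. Below is a minimal, untested sketch of that idea: reusing the OpenAI client against a locally hosted, OpenAI-compatible server. The endpoint URL and model name are placeholders, and the `base_url` and `model_capabilities` keyword arguments are assumptions about the client's parameters rather than something confirmed by the tutorial; check the {py:class}`~autogen_ext.models.OpenAIChatCompletionClient` reference before relying on them.

```python
# Minimal sketch: reusing the OpenAI client against a local, OpenAI-compatible server.
# Assumptions (not confirmed by the tutorial above): the server URL, the model name,
# and the `base_url` / `model_capabilities` keyword arguments.
from autogen_core.components.models import UserMessage
from autogen_ext.models import OpenAIChatCompletionClient

local_model_client = OpenAIChatCompletionClient(
    model="llama3.1",  # placeholder name of a model served by the local endpoint
    api_key="not-needed",  # many local servers ignore the API key
    base_url="http://localhost:11434/v1",  # placeholder OpenAI-compatible endpoint
    model_capabilities={
        "vision": False,
        "function_calling": True,
        "json_output": False,
    },
)

# Same usage pattern as the OpenAI client above.
result = await local_model_client.create([UserMessage(content="What is the capital of France?", source="user")])
print(result)
```

As with the other examples, `create` is awaited, so this snippet assumes an async context such as a notebook cell.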