Commit

wip

eliasecchig committed Sep 17, 2024
1 parent d88f6f9 commit 67da1a6
Showing 5 changed files with 23 additions and 29 deletions.
Empty file added: core
@@ -4,11 +4,13 @@ We'd love to accept your patches and contributions to this sample. There are
just a few small guidelines you need to follow.

## Contributor License Agreement
Contributions to this project must be accompanied by a Contributor License Agreement. You (or your employer) retain the copyright to your contribution; this simply gives us permission to use and redistribute your contributions as part of the project. Head over to https://cla.developers.google.com/ to see your current agreements on file or to sign a new one.

Contributions to this project must be accompanied by a Contributor License Agreement. You (or your employer) retain the copyright to your contribution; this simply gives us permission to use and redistribute your contributions as part of the project. Head over to [Google Developers CLA](https://cla.developers.google.com/) to see your current agreements on file or to sign a new one.

You generally only need to submit a CLA once, so if you've already submitted one (even if it was for a different project), you probably don't need to do it again.

## Community Guidelines, Code Reviews, Contributor Guide

Please refer to the [root repository CONTRIBUTING.md file](https://github.com/GoogleCloudPlatform/generative-ai/blob/main/CONTRIBUTING.md) for Community Guidelines, Code Reviews, Contributor Guide, or specific guidance for Google Employees.

## Code Quality Checks
@@ -26,8 +28,8 @@ Then, execute the following Makefile targets:
```bash
make lint
```

This command runs the following linters to check for code style, potential errors, and type hints:

- **codespell**: Detects common spelling mistakes in code and documentation.
- **ruff**: A fast linter that combines the functionality of several popular tools, including flake8, isort, and pycodestyle.
- **mypy**: Performs static type checking to catch type errors before runtime.
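
For reference, here is a minimal sketch of the commands a `lint` target like this typically wraps, assuming the tools run under Poetry; the exact flags and paths in this template's Makefile may differ:

```bash
# Illustrative equivalents of `make lint`; flags and paths are assumptions.
poetry run codespell .    # catch common spelling mistakes in code and docs
poetry run ruff check .   # style and common-error linting
poetry run mypy .         # static type checking
```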
@@ -37,10 +39,8 @@ make test
```

This command runs the test suite using pytest, covering both unit and integration tests:

- **`poetry run pytest tests/unit`**: Executes unit tests located in the `tests/unit` directory.
- **`poetry run pytest tests/integration`**: Executes integration tests located in the `tests/integration` directory.
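
When iterating on a single failure, pytest can also be invoked directly; the `-k` filter below is a placeholder, not a test name guaranteed to exist in this template:

```bash
# Placeholders for targeted runs; substitute real test paths and names.
poetry run pytest tests/unit -v            # verbose unit-test run
poetry run pytest tests/unit -k "chain"    # only tests whose names match "chain"
```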


Your pull request will also be automatically checked by these tools using GitHub Actions. Ensuring your code passes these checks locally will help expedite the review process.


@@ -1,6 +1,6 @@
FROM python:3.11-slim

RUN pip install poetry==1.6.1
RUN pip install --no-cache-dir poetry==1.6.1

RUN poetry config virtualenvs.create false

@@ -14,4 +14,4 @@ RUN poetry install --no-interaction --no-ansi --no-dev

EXPOSE 8080

CMD exec uvicorn app.server:app --host 0.0.0.0 --port 8080
CMD ["uvicorn", "app.server:app", "--host", "0.0.0.0", "--port", "8080"]
22 changes: 7 additions & 15 deletions gemini/sample-apps/conversational-genai-app-template/app/README.md
@@ -4,7 +4,7 @@ This folder implements a chatbot application using FastAPI and Google Cloud services.

## Folder Structure

```
```plaintext
.
├── server.py # Main FastAPI server
├── chain.py # Default chain implementation
@@ -14,6 +14,7 @@ This folder implements a chatbot application using FastAPI and Google Cloud services.
├── utils/ # Utility functions and classes
└── eval/ # Evaluation tools and data
```

## Generative AI Application Patterns

### 1. Default Chain
@@ -38,7 +39,6 @@ To switch between different patterns, modify the import statement in `server.py`

All chains have the same interface, allowing for seamless swapping without changes to the Streamlit frontend.
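
Because the patterns are interchangeable, the swap is a one-line edit. Purely as a hypothetical illustration (neither module path below is confirmed by this README):

```bash
# Hypothetical module names: replace the default chain import in server.py.
sed -i 's/from app.chain import chain/from app.patterns.rag_qa import chain/' app/server.py
```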


## Monitoring and Observability

![monitoring_flow](../images/monitoring_flow.png)
@@ -47,29 +47,21 @@ All chains have the same interface, allowing for seamless swapping without changes to the Streamlit frontend.

This application utilizes [OpenTelemetry](https://opentelemetry.io/) and [OpenLLMetry](https://github.com/traceloop/openllmetry) for comprehensive observability, emitting events to Google Cloud Trace and Google Cloud Logging. Every interaction with LangChain and VertexAI is instrumented (see [`server.py`](server.py)), enabling detailed tracing of request flows throughout the application.

Leveraging the [CloudTraceSpanExporter](https://cloud.google.com/python/docs/reference/spanner/latest/opentelemetry-tracing), the application captures and exports tracing data. To address the limitations of Cloud Trace ([256-byte attribute value limit](https://cloud.google.com/trace/docs/quotas#limits_on_spans)) and [Cloud Logging](https://cloud.google.com/logging/quotas) ([256KB log entry size](https://cloud.google.com/logging/quotas)), a custom extension of the CloudTraceSpanExporter is implemented in [`app/utils/tracing.py`](app/utils/tracing.py).

This extension enhances observability by:

* **Creating a corresponding Google Cloud Logging entry for every captured event.**
* **Automatically storing event data in Google Cloud Storage when the payload exceeds 256KB.**
- Creating a corresponding Google Cloud Logging entry for every captured event.
- Automatically storing event data in Google Cloud Storage when the payload exceeds 256KB.

Logged payloads are associated with the original trace, ensuring seamless access from the Cloud Trace console.

### Log Router

Events are forwarded to BigQuery through a [log router](https://cloud.google.com/logging/docs/routing/overview) for long-term storage and analysis. The deployment of the log router is done via Terraform code in [deployment/terraform](../deployment/terraform).
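
The Terraform code is the supported way to provision this sink; purely for illustration, an equivalent sink could be created by hand with `gcloud`. The sink name, dataset, and filter below are placeholders:

```bash
# Placeholder names and filter; the real sink is defined in deployment/terraform.
gcloud logging sinks create genai-app-telemetry-sink \
  bigquery.googleapis.com/projects/$PROJECT_ID/datasets/telemetry \
  --log-filter='labels.type="tracing"'
```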

### Looker Studio Dashboard

Once the data is written to BigQuery, it can be used to populate a [Looker Studio dashboard](TODO: Add link).

This dashboard, offered as a template, provides a starting point for building custom visualizations on top of the data being captured.
@@ -1,9 +1,11 @@
## Deployment README.md
# Deployment README.md

This folder contains the infrastructure-as-code and CI/CD pipeline configurations for deploying a conversational Generative AI application on Google Cloud.

The application leverages [**Terraform**](http://terraform.io) to define and provision the underlying infrastructure, while [**Cloud Build**](https://cloud.google.com/build/) orchestrates the continuous integration and continuous deployment (CI/CD) pipeline.

### Deployment Workflow
## Deployment Workflow

![Deployment Workflow](../images/deployment_workflow.png)

**Description:**
@@ -24,8 +26,7 @@ The application leverages [**Terraform**](http://terraform.io) to define and provision the underlying infrastructure
- Deploys to production environment



### Setup
## Setup

**Prerequisites:**

@@ -40,7 +41,7 @@ The application leverages [**Terraform**](http://terraform.io) to define and provision the underlying infrastructure
gcloud services enable serviceusage.googleapis.com cloudresourcemanager.googleapis.com cloudbuild.googleapis.com secretmanager.googleapis.com
```

### Step-by-Step Guide
## Step-by-Step Guide

1. **Create a Git Repository using your favorite git provider (GitHub, GitLab, Bitbucket, etc.)**

@@ -82,7 +83,8 @@ Other optional variables include: telemetry and feedback BigQuery dataset IDs, l
After completing these steps, your infrastructure will be set up and ready for deployment!
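
A typical apply from the Terraform directory looks like the sketch below; the directory and var-file paths are inferred from the links in this README and may differ in your checkout:

```bash
# Paths are assumptions based on the folders referenced above.
cd deployment/terraform
terraform init
terraform apply --var-file=vars/env.tfvars
```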


### Dev Deployment
## Dev Deployment

This workflow supports end-to-end testing of the application, including tracing and feedback sinking to BigQuery, without the need to trigger a CI/CD pipeline.

After editing the relevant [`env.tfvars` file](../terraform/dev/vars/env.tfvars), follow these instructions:
@@ -98,7 +100,7 @@ gcloud run deploy conversational-app-sample --source . --project $YOUR_DEV_PROJE
```


**E2E Demo video**
### E2E Demo video

<a href="https://storage.googleapis.com/test-elia-us-central1/template%20deployment%20demo.mp4">
<img src="../images/preview_video.png" alt="Watch the video" width="300"/>
