Update integration libraries (#13277)
Co-authored-by: Bill Palombi <[email protected]>
discdiver and billpalombi authored May 13, 2024
1 parent 2159357 commit c9e22de
Showing 16 changed files with 665 additions and 1,089 deletions.
171 changes: 49 additions & 122 deletions docs/guides/dask-ray-task-runners.md

Large diffs are not rendered by default.

31 changes: 16 additions & 15 deletions docs/integrations/prefect-aws/index.md
<div class="terminal">
```bash
pip install -U prefect-aws
```
</div>

### Register newly installed block types

Register the block types in the prefect-aws module to make them available for use.

<div class="terminal">
```bash
prefect block register -m prefect_aws
```
</div>

## Examples

### Run flows on AWS ECS

Run flows on [AWS Elastic Container Service (ECS)](https://aws.amazon.com/ecs/) to dynamically scale your infrastructure.

See the [ECS guide](/ecs_guide/) for a walkthrough of using ECS in a hybrid work pool.

If you're using Prefect Cloud, [ECS push work pools](https://docs.prefect.io/latest/guides/deployment/push-work-pools/#__tabbed_1_1) provide all the benefits of ECS with a quick setup and no worker needed.
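
As a rough sketch of the hybrid setup, assuming `prefect-aws` is installed where the worker runs (the pool name is a placeholder):

<div class="terminal">
```bash
# Create a work pool backed by the ECS worker type
prefect work-pool create -t ecs my-ecs-pool

# Start a worker that polls the pool and submits flow runs as ECS tasks
prefect worker start -p my-ecs-pool
```
</div>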

In the examples below, you create blocks with Python code.
Alternatively, each block can be created through the Prefect UI.

### Save credentials to an AWS Credentials block

Use of most AWS services requires an authenticated session.
Prefect makes it simple to provide credentials via an AWS Credentials block.
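
For example, here is a minimal sketch of creating and saving a credentials block; the key values, region, and block name are placeholders:

```python
from prefect_aws import AwsCredentials

# Placeholder values -- substitute your own keys, region, and block name.
AwsCredentials(
    aws_access_key_id="PLACEHOLDER",
    aws_secret_access_key="PLACEHOLDER",
    region_name="us-east-2",
).save("BLOCK-NAME-PLACEHOLDER")
```
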
Prefect uses the Boto3 library under the hood.
Any credential values not provided to the block are sourced at runtime in the order shown in the [Boto3 docs](https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#configuring-credentials).
Prefect creates the session object from the values in the block, and Boto3 resolves any missing values in that order.

For an example of using the `AwsCredentials` block with [AWS Secrets Manager](#aws-secrets-manager) and third-party services without storing credentials in the block itself, see [this guide](https://docs.prefect.io/latest/guides/secrets/).

Here's how to load the saved credentials:

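Assuming the placeholder block name used above:

```python
from prefect_aws import AwsCredentials

aws_credentials = AwsCredentials.load("BLOCK-NAME-PLACEHOLDER")
```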

The AWS Credentials block is often nested within other blocks, such as `S3Bucket` or `AwsSecret`, and provides authentication for those services.

### Read and write files to AWS S3

Upload a file to an AWS S3 bucket and download the same file under a different file name.
The following code assumes that the bucket already exists:
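
A minimal sketch of such a flow, assuming an `AwsCredentials` block saved as `BLOCK-NAME-PLACEHOLDER` and an existing bucket (the bucket name is a placeholder):

```python
from pathlib import Path

from prefect import flow
from prefect_aws import AwsCredentials, S3Bucket


@flow
def s3_flow():
    # Create a small local file to upload.
    Path("example.txt").write_text("Hello from Prefect!")

    aws_credentials = AwsCredentials.load("BLOCK-NAME-PLACEHOLDER")
    s3_bucket = S3Bucket(
        bucket_name="BUCKET-NAME-PLACEHOLDER",
        credentials=aws_credentials,
    )

    # Upload the file, then download it under a different name.
    s3_path = s3_bucket.upload_from_path("example.txt")
    s3_bucket.download_object_to_path(s3_path, "example-downloaded.txt")


if __name__ == "__main__":
    s3_flow()
```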

### Access secrets with AWS Secrets Manager

Write a secret to AWS Secrets Manager, read the secret data, delete the secret, and return the secret data.

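A minimal sketch of such a flow, assuming an `AwsCredentials` block saved as `BLOCK-NAME-PLACEHOLDER`; the secret name is a placeholder:

```python
from prefect import flow
from prefect_aws import AwsCredentials
from prefect_aws.secrets_manager import AwsSecret


@flow
def secrets_manager_flow():
    aws_credentials = AwsCredentials.load("BLOCK-NAME-PLACEHOLDER")
    aws_secret = AwsSecret(
        secret_name="example-secret",
        aws_credentials=aws_credentials,
    )

    # Write, read, and then delete the secret.
    aws_secret.write_secret(secret_data=b"Hello from Prefect!")
    secret_data = aws_secret.read_secret()
    aws_secret.delete_secret()
    return secret_data


if __name__ == "__main__":
    secrets_manager_flow()
```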


## Resources

For assistance using AWS, consult the [AWS documentation](https://docs.aws.amazon.com/) and, in particular, the [Boto3 documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/index.html).
123 changes: 47 additions & 76 deletions docs/integrations/prefect-azure/index.md
# prefect-azure

<p align="center">
<a href="https://pypi.python.org/pypi/prefect-azure/" alt="PyPI version">
<img alt="PyPI" src="https://img.shields.io/pypi/v/prefect-azure?color=26272B&labelColor=090422"></a>
<a href="https://pepy.tech/badge/prefect-azure/" alt="Downloads">
<img src="https://img.shields.io/pypi/dm/prefect-azure?color=26272B&labelColor=090422" /></a>
</p>

`prefect-azure` makes it easy to leverage the capabilities of Azure in your workflows.
For example, you can retrieve secrets, read and write Blob Storage objects, and deploy your flows on Azure Container Instances (ACI).

## Getting Started

### Prerequisites

- [Prefect installed](https://docs.prefect.io/latest/getting-started/installation/) in a virtual environment.
- An [Azure account](https://azure.microsoft.com/) and the necessary permissions to access desired services.

### Install prefect-azure

<div class = "terminal">
```bash
pip install "prefect-azure[blob_storage]"
pip install -U prefect-azure
```
</div>

If necessary, see [additional installation options for Blob Storage, Cosmos DB, and ML Datastore](#additional-installation-options).

To install with all additional functionality, use the following command:

<div class = "terminal">
```bash
pip install "prefect-azure[cosmos_db]"
pip install -U "prefect-azure[all_extras]"
```
</div>

### Register newly installed block types

Register the block types in the module to make them available for use.

<div class = "terminal">
```bash
pip install "prefect-azure[ml_datastore]"
prefect block register -m prefect_azure
```
</div>

## Examples

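A minimal sketch of a blob download flow (used by the `with_options` example below); the connection string, container, and blob names are placeholders:

```python
from prefect import flow
from prefect_azure import AzureBlobStorageCredentials
from prefect_azure.blob_storage import blob_storage_download


@flow
def example_blob_storage_download_flow():
    # Placeholder connection string -- substitute your own.
    blob_storage_credentials = AzureBlobStorageCredentials(
        connection_string="connection_string",
    )
    data = blob_storage_download(
        blob="prefect.txt",
        container="prefect",
        blob_storage_credentials=blob_storage_credentials,
    )
    return data


example_blob_storage_download_flow()
```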

Use `with_options` to customize options on any existing task or flow:

```python
custom_blob_storage_download_flow = example_blob_storage_download_flow.with_options(
name="My custom task name",
)
```

### Run flows on Azure Container Instances

Run flows on [Azure Container Instances (ACI)](https://learn.microsoft.com/en-us/azure/container-instances/) to dynamically scale your infrastructure.

See the [Azure Container Instances Worker Guide](/aci_worker/) for a walkthrough of using ACI in a hybrid work pool.

If you're using Prefect Cloud, [ACI push work pools](/guides/deployment/push-work-pools/#__tabbed_1_2) provide all the benefits of ACI with a quick setup and no worker needed.
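
As a rough sketch of the hybrid setup, assuming `prefect-azure` is installed where the worker runs (the pool and worker names are placeholders):

<div class="terminal">
```bash
# Create a work pool of the Azure Container Instance type
prefect work-pool create -t azure-container-instance my-aci-work-pool

# Start a worker that polls the pool and runs flows in ACI containers
prefect worker start -n my-aci-worker -p my-aci-work-pool
```
</div>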

## Resources

For assistance using Azure, consult the [Azure documentation](https://learn.microsoft.com/en-us/azure).

Refer to the prefect-azure API documentation linked in the sidebar to explore all the capabilities of the prefect-azure library.

### Additional installation options

To use Blob Storage:

<div class="terminal">
```bash
pip install -U "prefect-azure[blob_storage]"
```
</div>

To use Cosmos DB:

<div class="terminal">
```bash
pip install -U "prefect-azure[cosmos_db]"
```
</div>

To use ML Datastore:

<div class="terminal">
```bash
pip install -U "prefect-azure[ml_datastore]"
```
</div>
