docs: update README/DEVELOPMENT
Signed-off-by: Morgan Epp <[email protected]>
epmog committed Apr 1, 2024
1 parent fe15a89 commit 966023e
Showing 7 changed files with 207 additions and 78 deletions.
77 changes: 75 additions & 2 deletions DEVELOPMENT.md
@@ -1,4 +1,77 @@
# The AWS Deadline Cloud Client Library (`deadline.client`)
# Development documentation

This documentation provides guidance on developer workflows for working with the code in this repository.

## Code organization

This repository is split up into two main modules:
1. `src/client`
2. `src/job_attachments`

The `src/client` organization is laid out below.

For more information on job attachments, see [here](src/deadline/job_attachments/README.md).

### `src/client/api`

This submodule contains utilities to call boto3 in a standardized way
using an AWS profile configured for AWS Deadline Cloud, helpers for working with the
AWS Deadline Cloud monitor login/logout, and objects representing AWS Deadline Cloud
resources.
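
For example, a minimal sketch of calling the service through this submodule,
using the `api.list_farms()` helper demonstrated in the README:

```
from deadline.client import api

# list_farms() resolves the AWS profile configured for AWS Deadline Cloud
# before calling boto3, and returns plain dicts.
response = api.list_farms()
for farm in response["farms"]:
    print(farm["farmId"], farm["displayName"])
```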

### `src/client/cli`

This submodule contains entry points for the CLI applications provided
by the library.
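
For example, once the package is installed, the main entry point is available
as the `deadline` command:

```
deadline --help
```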

### `src/client/config`

This submodule contains an interface to the machine-specific AWS Deadline Cloud
configuration, specifically settings stored in `~/.deadline/*`.
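
As a short sketch, assuming the submodule exposes the `get_setting` accessor
used by the CLI code in this repository:

```
from deadline.client.config import get_setting

# Reads a value such as the default AWS profile name from the
# machine-specific configuration.
profile_name = get_setting("defaults.aws_profile_name")
```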

### `src/client/ui`

This submodule contains Qt GUIs, based on PySide(2/6), for common controls
and widgets used in interactive submitters, and for displaying the status
of various AWS Deadline Cloud resources.

### `src/client/job_bundle`

This submodule contains code related to the history of job submissions
performed on the workstation. Its initial functionality is to create
job bundle directories in a standardized manner.
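
For illustration only (nothing below is the submodule's actual API), the
standardized layout amounts to a timestamped bundle directory under the job
history directory:

```
import datetime
import pathlib

# Default job history location; see the README's "Job bundles" section.
history_dir = pathlib.Path.home() / ".deadline" / "job_history"
stamp = datetime.datetime.now().strftime("%Y-%m-%d-%H-%M-%S")
bundle_dir = history_dir / f"{stamp}-my-job"
bundle_dir.mkdir(parents=True, exist_ok=True)

(bundle_dir / "template.yaml").write_text("# OpenJD job template\n")
(bundle_dir / "parameter_values.yaml").write_text("# selected parameter values\n")
```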

# Build / Test / Release

## Build the package
```
hatch build
```

## Run tests
```
hatch run test
```

## Run integration tests
```
hatch run integ:test
```

## Run linting
```
hatch run lint
```

## Run formatting
```
hatch run fmt
```

## Run tests for all supported Python versions
```
hatch run all:test
```

## Qt and Calling AWS (including AWS Deadline Cloud) APIs

@@ -90,7 +163,7 @@ class MyCustomWidget(QWidget):

**We recommend you set up your runtimes via `asdf`.**
**We recommend you set up your runtimes via `mise`.**

## Running Docker-based Unit Tests

176 changes: 111 additions & 65 deletions README.md
@@ -1,111 +1,157 @@
# The AWS Deadline Cloud Client Library (`deadline.client`)
# AWS Deadline Cloud client

## Overview
[![pypi](https://img.shields.io/pypi/v/deadline.svg?style=flat)](https://pypi.python.org/pypi/deadline)
[![python](https://img.shields.io/pypi/pyversions/deadline.svg?style=flat)](https://pypi.python.org/pypi/deadline)
[![license](https://img.shields.io/pypi/l/deadline.svg?style=flat)](https://github.com/aws-deadline/deadline/blob/mainline/LICENSE)

This is a shared Python library that implements functionality to support
client applications using AWS Deadline Cloud.

It is divided into the following submodules:
AWS Deadline Cloud client is a multi-purpose Python library and command line tool for interacting with and submitting [Open Job Description (OpenJD)][openjd] jobs to [AWS Deadline Cloud][deadline-cloud].

### api
To support building workflows on top of AWS Deadline Cloud, it implements its own user interaction, job creation, file upload/download, and other useful helpers around the service's API. It can function as a pipeline tool, a standalone GUI application, or even be embedded within other applications' runtimes.

This submodule contains utilities to call boto3 in a standardized way
using an aws profile configured for AWS Deadline Cloud, helpers for working with
Deadline Cloud Monitor Desktop login/logout, and objects representing AWS Deadline Cloud
resources.
[deadline-cloud]: https://docs.aws.amazon.com/deadline-cloud/latest/userguide/what-is-deadline-cloud.html
[deadline-cloud-monitor]: https://docs.aws.amazon.com/deadline-cloud/latest/userguide/working-with-deadline-monitor.html
[deadline-cloud-samples]: https://github.com/aws-deadline/deadline-cloud-samples
[deadline-jobs]: https://docs.aws.amazon.com/deadline-cloud/latest/userguide/deadline-cloud-jobs.html
[openjd]: https://github.com/OpenJobDescription/openjd-specifications/wiki

### cli

This submodule contains entry points for the CLI applications provided
by the library.

### config

This submodule contains an interface to the machine-specific AWS Deadline Cloud
configuration, specifically settings stored in `~/.deadline/*`

### ui
## Compatibility

This submodule contains Qt GUIs, based on PySide(2/6), for common controls
and widgets used in interactive submitters, and to display the status
of various AWS Deadline Cloud resources.
This library requires:

### job_bundle
1. Python 3.7 or higher; and
2. Linux, Windows, or macOS operating system.

This submodule contains code related to the history of job submissions
performed on the workstation. Its initial functionality is to create
job bundle directories in a standardized manner.
## Getting Started

## Compatibility
AWS Deadline Cloud client can be installed with the standard Python packaging mechanisms:
```
$ pip install deadline
```

This library requires:
or, if you want the optional GUI dependencies:
```
$ pip install "deadline[gui]"
```

1. Python 3.7 or higher; and
2. Linux, MacOS, or Windows operating system.
## Usage

## Versioning
After installation, it can be used as a command line tool:
```
$ deadline farm list
- farmId: farm-1234567890abcdefg
displayName: my-first-farm
```

This package's version follows [Semantic Versioning 2.0](https://semver.org/), but is still considered to be in its
initial development, thus backwards incompatible versions are denoted by minor version bumps. To help illustrate how
versions will increment during this initial development stage, they are described below:
or as a Python library:
```
>>> from deadline.client import api
>>> api.list_farms()
{'farms': [{'farmId': 'farm-1234567890abcdefg', 'displayName': 'my-first-farm', ...},]}
```

1. The MAJOR version is currently 0, indicating initial development.
2. The MINOR version is currently incremented when backwards incompatible changes are introduced to the public API.
3. The PATCH version is currently incremented when bug fixes or backwards compatible changes are introduced to the public API.
## Job attachments

## Downloading
Job attachments enable you to transfer files between your workstations and AWS Deadline Cloud by using Amazon S3 buckets as [content-addressed storage](https://en.wikipedia.org/wiki/Content-addressable_storage) in your AWS account.

You can download this package from:
- [GitHub releases](https://github.com/casillas2/deadline-cloud/releases)
See [job attachments](src/deadline/job_attachments/README.md) for a more in-depth look at how files are uploaded, stored, and retrieved.
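
As a conceptual sketch only (not the library's actual implementation; the
bucket and key names below are made up), content addressing means a file's
hash becomes its storage key, so a file already present in S3 never needs to
be uploaded again:

```
import hashlib

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def upload_if_missing(path: str, bucket: str = "my-job-attachments-bucket") -> str:
    """Upload a file keyed by its content hash, skipping files already stored."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    key = f"Data/{digest}"
    try:
        s3.head_object(Bucket=bucket, Key=key)  # already stored; skip the upload
    except ClientError:
        s3.upload_file(path, bucket, key)
    return key
```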

## Development
## Job bundles

See instructions in DEVELOPMENT.md
At minimum, a job bundle is a folder that contains an [OpenJD][openjd] template. It can optionally include (a hypothetical layout is sketched after this list):
1. an `asset_references.yaml` - lists file inputs and outputs,
2. a `parameter_values.yaml` - contains the selected values for the job template's parameters,
3. and any number of additional files required for the job.
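
Assuming hypothetical file and folder names, a minimal bundle might be laid out as:

```
my_job_bundle/
├── template.yaml           # OpenJD job template
├── asset_references.yaml   # input and output file references (optional)
└── parameter_values.yaml   # values for the template's parameters (optional)
```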

## Telemetry
For example job bundles, visit the [samples repository][deadline-cloud-samples].

This library collects telemetry data by default. Telemetry events contain non-personally-identifiable information that helps us understand how users interact with our software so we know what features our customers use, and/or what existing pain points are.
To submit a job bundle, you can run:
```
$ deadline bundle submit <path/to/bundle>
```

You can opt out of telemetry data collection by either:
or, if you have the optional GUI components installed, you can load a job bundle for submission by running:
```
$ deadline bundle gui-submit --browse
```

1. Setting the environment variable: `DEADLINE_CLOUD_TELEMETRY_OPT_OUT=true`
2. Setting the config file: `deadline config set telemetry.opt_out true`
On submission, a job bundle is created in the job history directory (default: `~/.deadline/job_history`).

Note that setting the environment variable supersedes the config file setting.
For more information on jobs and job bundles, see [AWS Deadline Cloud jobs][deadline-jobs].

# Build / Test / Release
## Configuration

## Build the package.
You can see the current configuration by running:
```
hatch build
$ deadline config show
```
and change the settings by running the associated `get` and `set` commands.
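
For example, with the `defaults.aws_profile_name` setting (the profile name
below is a placeholder):

```
$ deadline config get defaults.aws_profile_name
$ deadline config set defaults.aws_profile_name my-profile
```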

## Run tests
To see a list of settings that can be configured, run:
```
hatch run test
$ deadline config --help
```

## Run integration tests
Or you can manage settings through a graphical interface if you have the optional GUI dependencies:
```
hatch run integ:test
$ deadline config gui
```

## Run linting
By default, the AWS Deadline Cloud configuration is stored at `~/.deadline/config`; this location can be overridden with the `DEADLINE_CONFIG_FILE_PATH` environment variable.
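
For example, to point the CLI at an alternate configuration file for a single command (the path below is a placeholder):

```
$ DEADLINE_CONFIG_FILE_PATH=/tmp/deadline-test-config deadline config show
```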

## Authentication

In addition to the standard AWS credential mechanisms (AWS Profiles, instance profiles, and environment variables), AWS Deadline Cloud monitor credentials are also supported.

To view the authentication status of the currently configured credentials, run:

```
hatch run lint
$ deadline auth status
Profile Name: (default)
Source: HOST_PROVIDED
Status: AUTHENTICATED
API Availability: True
```

## Run formatting
If the currently selected AWS profile is set up to use [AWS Deadline Cloud monitor][deadline-cloud-monitor] credentials, you can authenticate by logging in:

```
hatch run fmt
$ deadline auth login
```

## Run tests for all supported Python versions.
and remove them by logging out:
```
hatch run all:test
$ deadline auth logout
```

# Optional Third Party Dependencies - GUI
## Versioning

This package's version follows [Semantic Versioning 2.0](https://semver.org/), but is still considered to be in its
initial development, thus backwards incompatible versions are denoted by minor version bumps. To help illustrate how
versions will increment during this initial development stage, they are described below:

1. The MAJOR version is currently 0, indicating initial development.
2. The MINOR version is currently incremented when backwards incompatible changes are introduced to the public API.
3. The PATCH version is currently incremented when bug fixes or backwards compatible changes are introduced to the public API.

## Contributing

See [`CONTRIBUTING.md`](https://github.com/aws-deadline/deadline-cloud/blob/mainline/CONTRIBUTING.md)
for information on reporting issues, requesting features, and developer information.

## Security

See [security issue notifications](https://github.com/aws-deadline/deadline-cloud/blob/release/CONTRIBUTING.md#security-issue-notifications) for more information.

## Telemetry

See [telemetry](https://github.com/aws-deadline/deadline-cloud/blob/release/docs/telemetry.md) for more information.

## Optional third party dependencies - GUI

N.B.: Although this repository is released under the Apache-2.0 license, its optional GUI feature
uses the third party Qt && PySide projects. The Qt and PySide projects' licensing includes the LGPL-3.0 license.
uses the third party Qt and PySide projects. The Qt and PySide projects' licensing includes the LGPL-3.0 license.

## License

This project is licensed under the Apache-2.0 License.
10 changes: 10 additions & 0 deletions docs/telemetry.md
@@ -0,0 +1,10 @@
# Telemetry

This library collects telemetry data by default. Telemetry events contain non-personally-identifiable information that helps us understand how users interact with our software, what features our customers use, and what existing pain points are.

You can opt out of telemetry data collection by either:

1. Setting the environment variable: `DEADLINE_CLOUD_TELEMETRY_OPT_OUT=true`
2. Setting the config file: `deadline config set telemetry.opt_out true`

Note that setting the environment variable supersedes the config file setting.
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -8,7 +8,7 @@ dynamic = ["version"]
readme = "README.md"
license = "Apache-2.0"
requires-python = ">=3.7"
description = "Shared Python library that implements functionality to support client applications using AWS Deadline Cloud."
description = "Multi-purpose library and command line tool that implements functionality to support applications using AWS Deadline Cloud."
# https://pypi.org/classifiers/
classifiers = [
"Development Status :: 5 - Production/Stable",
4 changes: 2 additions & 2 deletions src/deadline/client/cli/_deadline_cli.py
Expand Up @@ -55,8 +55,8 @@
@click.pass_context
def main(ctx: click.Context, log_level: str):
"""
The AWS Deadline Cloud CLI provides functionality to work with the AWS Deadline Cloud
closed beta service.
The AWS Deadline Cloud CLI provides functionality to interact with the AWS Deadline Cloud
service.
"""
logging.basicConfig(level=log_level)
if log_level == "DEBUG":
2 changes: 1 addition & 1 deletion src/deadline/client/cli/_groups/auth_group.py
@@ -85,7 +85,7 @@ def auth_logout():
)
@_handle_error
def auth_status(output, **args):
"""EXPERIMENTAL - Gets the authentication status for the given AWS profile"""
"""Gets the authentication status for the given AWS profile"""
# Get a temporary config object with the standard options handled
config = _apply_cli_options_to_config(**args)
profile_name = get_setting("defaults.aws_profile_name", config=config)
14 changes: 7 additions & 7 deletions src/deadline/job_attachments/README.md
@@ -1,12 +1,12 @@
# AWS Deadline Cloud Job Attachments
# Job attachments

[Job attachments][job-attachments] enable you to transfer files back and forth between your workstation and [AWS Deadline Cloud][deadline-cloud], using an Amazon S3 bucket in your AWS account associated with your [Deadline Cloud queues][queue].
[Job attachments][job-attachments] enable you to transfer files back and forth between your workstation and [AWS Deadline Cloud][deadline-cloud], using an Amazon S3 bucket in your AWS account associated with your [AWS Deadline Cloud queues][queue].

Job attachments uses your configured S3 bucket as a [content-addressable storage](https://en.wikipedia.org/wiki/Content-addressable_storage), which creates a snapshot of the files used in your job submission in [asset manifests](#asset-manifests), only uploading files that aren't already in S3. This saves you time and bandwidth when iterating on jobs. When an [AWS Deadline Cloud Worker Agent][worker-agent] starts working on a job with job attachments, it recreates the file system snapshot in the worker agent session directory, and uploads any outputs back to your S3 bucket.
Job attachments uses your configured S3 bucket as [content-addressable storage](https://en.wikipedia.org/wiki/Content-addressable_storage): it creates a snapshot of the files used in your job submission in [asset manifests](#asset-manifests), only uploading files that aren't already in S3. This saves you time and bandwidth when iterating on jobs. When an [AWS Deadline Cloud worker agent][worker-agent] starts working on a job with job attachments, it recreates the file system snapshot in the worker agent session directory and uploads any outputs back to your S3 bucket.

You can then easily download your outputs with the [Deadline client](../client/) `deadline job download-output` command, or using the [protocol handler](#protocol-handler) to download from a click of a button in the [Deadline Cloud Monitor][monitor].
You can then easily download your outputs with the [deadline CLI](../client/) `deadline job download-output` command, or by using the [protocol handler](#protocol-handler) to download with the click of a button in the [AWS Deadline Cloud monitor][monitor].

Job attachments also works as an auxiliary storage when used with [AWS Deadline Cloud Storage Profiles][shared-storage], allowing you to flexibly upload files to your Amazon S3 bucket that aren't on your configured shared storage.
Job attachments also works as an auxiliary storage when used with [AWS Deadline Cloud storage profiles][shared-storage], allowing you to flexibly upload files to your Amazon S3 bucket that aren't on your configured shared storage.

See the [`examples`](../../../examples/) directory for simple examples of how to use job attachments.

@@ -53,12 +53,12 @@ In order to further improve submission time, there are currently two local [`cache

## Protocol Handler

On Windows and Linux operating systems, you can choose to install the [Deadline client](../client/) protocol handler in order to run AWS Deadline Cloud commands sent from a web browser. Of note is the ability to download job attachments outputs from your jobs through the [Deadline Cloud monitor][downloading-output].
On Windows and Linux operating systems, you can choose to install the [Deadline client](../client/) protocol handler in order to run AWS Deadline Cloud commands sent from a web browser. Of note is the ability to download job attachments outputs from your jobs through the [AWS Deadline Cloud monitor][downloading-output].

You can install the protocol handler by running the command: `deadline handle-web-url --install`

[downloading-output]: https://docs.aws.amazon.com/deadline-cloud/latest/userguide/download-finished-output.html

## Security

When creating a queue, provide the name of an S3 bucket in the same account and region as the queue you are creating, and provide a 'root prefix' name for files to be uploaded to. You must also provide an IAM role that has access to the S3 bucket. See the [security best practices][ja-security] documentation for more information on securely configuring job attachments.
