Add dagster_databricks package for Databricks integration (#2468)
* Add dagster-databricks package

  This package is closely modeled on the dagster_aws.emr subpackage and provides the
  databricks_pyspark_step_launcher resource and the DatabricksRunJobSolidDefinition solid
  for running Databricks jobs.

* Reference Databricks docs in dagster-databricks configs module
* Move build_pyspark_zip into dagster_pyspark utils module
* Fix style/minor issues in dagster-databricks. Specifically:
  - triple single quotes instead of triple double quotes for docstrings
  - single quotes instead of double quotes everywhere else
  - one-line docstrings where possible; start on the same line everywhere else
  - rename 'is_terminal' to 'has_terminated'
  - use 'databricks_run_id' instead of 'run_id' for clarity
  - make DatabricksJobRunner.client a property
  - remove unnecessary blank lines
* Add references to Databricks storage docs in 'main' script
* Add comment explaining global vars in databricks_step_main.py
* Fix Python 2 issues in dagster-databricks
* Check invariants when setting up storage in Databricks job
* Fix dependencies in dagster-databricks/tox.ini
* Move 'secret_scope' field into inner credentials object to simplify Databricks storage
* isort dagster-databricks
* Add pylint to tox.ini for dagster_databricks
* Install dagster-databricks in 'make install_dev_python_modules'
* Reference GitHub issue for better storage setup in databricks_step_main.py
* Uncomment dagster-azure related config
* Replace assert_called_once with call_count for Python 3.5 compat
* Fix lint errors in databricks.py
* Improve handling of libraries by including required libs by default
* Fix version to match other dagster libraries
* Specify supported_pythons to exclude Python 3.8 from dagster-databricks tests on buildkite (see #1960)
* Add README for dagster-databricks
* Install dagster-databricks in dagster-examples tox.ini
* Update snapshot test for dagster example using databricks
* Add API docs for dagster_databricks
* Add coveragerc for dagster-databricks
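One refactor above moves ``build_pyspark_zip`` into the ``dagster_pyspark`` utils module. A minimal sketch of what such a helper might do (the name comes from the commit message; the signature and exact behavior here are assumptions, not the package's real implementation):

```python
import os
import zipfile


def build_pyspark_zip(zip_file, path):
    """Zip the contents of a directory so it can be shipped to a Spark cluster.

    Sketch only: mirrors the helper named in the commit message; the real
    implementation in dagster_pyspark may filter files or lay them out
    differently.
    """
    with zipfile.ZipFile(zip_file, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _, files in os.walk(path):
            for fname in files:
                abs_path = os.path.join(root, fname)
                # Store paths relative to the package root so imports resolve
                # the same way on the cluster as they do locally.
                zf.write(abs_path, os.path.relpath(abs_path, path))
```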
Authored by Ben Sully on Jun 9, 2020. Parent: f8c89ee. Commit: 19146d4. 36 changed files with 2,318 additions and 75 deletions.
docs/sections/api/apidocs/libraries/dagster_databricks.rst (26 additions, 0 deletions)
Databricks (dagster_databricks)
-------------------------------

The ``dagster_databricks`` package provides two main pieces of functionality:

- a resource, ``databricks_pyspark_step_launcher``, which will execute a solid within a Databricks
  context on a cluster, such that the ``pyspark`` resource uses the cluster's Spark instance; and
- a solid, ``DatabricksRunJobSolidDefinition``, which submits an external, configurable job to
  Databricks using the 'Run Now' API.

See the 'simple_pyspark' Dagster example for an example of how to use the resource.

Note that either S3 or Azure Data Lake Storage config **must** be specified for solids to succeed;
the credentials for this storage must also be stored as a Databricks Secret and referenced in the
resource config so that the Databricks cluster can access storage.

.. currentmodule:: dagster_databricks

.. autodata:: dagster_databricks.databricks_pyspark_step_launcher
  :annotation: ResourceDefinition

.. autoclass:: dagster_databricks.DatabricksRunJobSolidDefinition

.. autoclass:: dagster_databricks.DatabricksJobRunner

.. autoclass:: dagster_databricks.DatabricksError
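The step launcher has to wait for a submitted Databricks run to finish before it can collect results; the commit message mentions renaming this check to ``has_terminated``. A hedged sketch of such a polling loop, using the run life-cycle states from the Databricks Runs REST API (``get_run_state`` is a stand-in callable, not the package's real client):

```python
import time

# Life-cycle states in which a Databricks run will make no further progress,
# per the Databricks Jobs/Runs REST API.
TERMINAL_STATES = frozenset({"TERMINATED", "SKIPPED", "INTERNAL_ERROR"})


def has_terminated(life_cycle_state):
    """Return True once the run has reached a terminal life-cycle state."""
    return life_cycle_state in TERMINAL_STATES


def wait_for_run(get_run_state, poll_interval=10.0, max_polls=360):
    """Poll a run until it terminates.

    get_run_state is any callable returning the current life-cycle state
    string; in the real step launcher this would query the Databricks API.
    """
    for _ in range(max_polls):
        state = get_run_state()
        if has_terminated(state):
            return state
        time.sleep(poll_interval)
    raise TimeoutError("Databricks run did not terminate in time")
```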
examples/dagster_examples/simple_pyspark/environments/prod_databricks.yaml (32 additions, 0 deletions)
resources:
  pyspark_step_launcher:
    config:
      run_config:
        cluster:
          new:
            nodes:
              node_types:
                node_type_id: Standard_DS3_v2
            size:
              num_workers: 1
            spark_version: 6.5.x-scala2.11
        run_name: dagster-tests
      databricks_host: uksouth.azuredatabricks.net
      databricks_token:
        env: DATABRICKS_TOKEN
      local_pipeline_package_path: .
      staging_prefix: /dagster-databricks-tests
      storage:
        s3:
          secret_scope: dagster-databricks-tests
          access_key_key: aws-access-key
          secret_key_key: aws-secret-key
solids:
  make_weather_samples:
    inputs:
      file_path: s3://dagster-databricks-tests/sfo_q2_weather_fixed_header.txt
storage:
  s3:
    config:
      s3_bucket: dagster-databricks-tests
      s3_prefix: simple-pyspark
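In the config above, ``databricks_token`` is given as ``env: DATABRICKS_TOKEN`` rather than as a literal, so the secret stays out of the YAML file. Dagster's config system resolves this indirection itself; the helper below is only an illustrative sketch of that idea, not Dagster's implementation:

```python
import os


def resolve_scalar(value):
    """Resolve a config scalar that may be either a literal value or an
    {'env': VAR_NAME} mapping pointing at an environment variable.

    Illustrative only; Dagster's real config machinery handles this.
    """
    if isinstance(value, dict) and set(value) == {"env"}:
        var = value["env"]
        if var not in os.environ:
            raise KeyError("environment variable %s is not set" % var)
        return os.environ[var]
    return value
```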
File renamed without changes.