

external help file: azure.databricks.cicd.tools-help.xml
Module Name: azure.databricks.cicd.tools
online version:
schema: 2.0.0

Add-DatabricksLibrary

SYNOPSIS

Installs a library to a Databricks cluster.

SYNTAX

Add-DatabricksLibrary [[-BearerToken] <String>] [[-Region] <String>] [-LibraryType] <String>
 [-LibrarySettings] <String> [-ClusterId] <String> [<CommonParameters>]

DESCRIPTION

Attempts to install a library on the cluster. The install happens asynchronously, so you must check that it completed successfully; see Get-DatabricksLibraries. Also note that libraries installed via the API do not show in the UI (again, see Get-DatabricksLibraries); this is a known Databricks issue which may be addressed in the future. The API does not yet support the option to automatically install a library on all clusters. The cluster must not be in a terminated state (PENDING is OK).
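
Because the install is asynchronous, the sketch below installs a jar and then calls Get-DatabricksLibraries to confirm the result. Get-DatabricksLibraries is part of this module, but the parameters shown for it here (-BearerToken, -Region, -ClusterId) are assumed to mirror this cmdlet and should be verified against its own help page.

# Sketch only: install a library, then confirm the async install via Get-DatabricksLibraries.
Add-DatabricksLibrary -BearerToken $BearerToken -Region $Region `
    -LibraryType "jar" -LibrarySettings "dbfs:/mnt/libraries/library.jar" -ClusterId "bob-1234"

# Inspect the install status for the cluster; the parameter names here are assumptions.
Get-DatabricksLibraries -BearerToken $BearerToken -Region $Region -ClusterId "bob-1234"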

EXAMPLES

EXAMPLE 1

Add-DatabricksLibrary -BearerToken $BearerToken -Region $Region -LibraryType "jar" -LibrarySettings "dbfs:/mnt/libraries/library.jar" -ClusterId "bob-1234"

This example installs a library from a jar that already exists in DBFS.

EXAMPLE 2

Add-DatabricksLibrary -BearerToken $BearerToken -Region $Region -LibraryType "pypi" -LibrarySettings 'simplejson2' -ClusterId 'Bob-1234'

This example installs a PyPI library on a cluster identified by its cluster id.

PARAMETERS

-BearerToken

Your Databricks bearer token used to authenticate to your workspace (see User Settings in the Databricks web UI).

Type: String
Parameter Sets: (All)
Aliases:

Required: False
Position: 1
Default value: None
Accept pipeline input: False
Accept wildcard characters: False

-Region

Azure region; must match the URL of your Databricks workspace, for example northeurope.

Type: String
Parameter Sets: (All)
Aliases:

Required: False
Position: 2
Default value: None
Accept pipeline input: False
Accept wildcard characters: False

-LibraryType

One of: egg, jar, pypi, whl, cran, maven.

Type: String
Parameter Sets: (All)
Aliases:

Required: True
Position: 3
Default value: None
Accept pipeline input: False
Accept wildcard characters: False

-LibrarySettings

Settings can be a path to a jar (starting with dbfs:), a PyPI package name (optionally with a repo), an egg, or a JSON specification as described below (see the sketch after this parameter's details).

If jar, URI of the jar to be installed. DBFS and S3 URIs are supported. For example: { "jar": "dbfs:/mnt/databricks/library.jar" } or { "jar": "s3://my-bucket/library.jar" }. If S3 is used, make sure the cluster has read access on the library. You may need to launch the cluster with an IAM role to access the S3 URI.

If egg, URI of the egg to be installed. DBFS and S3 URIs are supported. For example: { "egg": "dbfs:/my/egg" } or { "egg": "s3://my-bucket/egg" }. If S3 is used, make sure the cluster has read access on the library. You may need to launch the cluster with an IAM role to access the S3 URI.

If whl, URI of the wheel or zipped wheels to be installed. DBFS and S3 URIs are supported. For example: { "whl": "dbfs:/my/whl" } or { "whl": "s3://my-bucket/whl" }. If S3 is used, make sure the cluster has read access on the library. You may need to launch the cluster with an IAM role to access the S3 URI. Also the wheel file name needs to use the correct convention. If zipped wheels are to be installed, the file name suffix should be .wheelhouse.zip.

If pypi, specification of a PyPI library to be installed. For example: { "package": "simplejson" }

If maven, specification of a Maven library to be installed. For example: { "coordinates": "org.jsoup:jsoup:1.7.2" }

If cran, specification of a CRAN library to be installed.

Type: String
Parameter Sets: (All)
Aliases:

Required: True
Position: 4
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
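
For the JSON-based library types, the settings documented above can be passed as a JSON string. A minimal sketch for a Maven library, assuming -LibrarySettings accepts the raw JSON quoted from the Databricks Libraries API above (simple types such as a jar path or a PyPI package name are passed as plain strings, as in the examples):

# Sketch only: pass the Maven specification as the JSON string shown in the Databricks Libraries API.
# Whether -LibrarySettings accepts this exact form is an assumption based on the payloads quoted above.
$mavenSettings = '{ "coordinates": "org.jsoup:jsoup:1.7.2" }'
Add-DatabricksLibrary -BearerToken $BearerToken -Region $Region `
    -LibraryType "maven" -LibrarySettings $mavenSettings -ClusterId "bob-1234"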

-ClusterId

The id of the cluster to install the library on. Note that the API does not support automatically installing on all clusters. See Get-DatabricksClusters.

Type: String
Parameter Sets: (All)
Aliases:

Required: True
Position: 5
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
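
Get-DatabricksClusters (referenced above) can be used to look up a cluster id by name. A minimal sketch; the property names on its output (cluster_name, cluster_id) and the parameters passed to it are assumed from the Databricks Clusters API and should be checked against that cmdlet's own help.

# Sketch only: resolve a cluster id by name, then install a library on it.
# The cluster_name / cluster_id property names and the Get-DatabricksClusters parameters are assumptions.
$clusters = Get-DatabricksClusters -BearerToken $BearerToken -Region $Region
$clusterId = ($clusters | Where-Object { $_.cluster_name -eq "bob" }).cluster_id
Add-DatabricksLibrary -BearerToken $BearerToken -Region $Region `
    -LibraryType "pypi" -LibrarySettings "simplejson" -ClusterId $clusterId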

CommonParameters

This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see about_CommonParameters.

INPUTS

OUTPUTS

NOTES

Author: Simon D'Morias / Data Thirst Ltd

RELATED LINKS
