This repository has been archived by the owner on Jun 25, 2024. It is now read-only.

Start DatabricksJob

Simon D'Morias edited this page Sep 15, 2019 · 2 revisions

external help file: azure.databricks.cicd.tools-help.xml
Module Name: azure.databricks.cicd.tools
online version:
schema: 2.0.0

Start-DatabricksJob

SYNOPSIS

Starts a Databricks Job by id or name.

SYNTAX

Start-DatabricksJob [[-BearerToken] <String>] [[-Region] <String>] [[-JobName] <String>] [[-JobId] <String>]
 [[-PythonParameters] <String[]>] [[-JarParameters] <String[]>] [[-SparkSubmitParameters] <String[]>]
 [[-NotebookParametersJson] <String>] [<CommonParameters>]

DESCRIPTION

Starts a Databricks Job by id or name.

EXAMPLES

Example 1

PS C:\> Start-DatabricksJob -BearerToken $token -Region "northeurope" -JobName "MyJob"

Starts the job (or jobs) named MyJob in the northeurope region. The token, region, and job name shown are illustrative values - substitute your own.

PARAMETERS

-BearerToken

Your Databricks Bearer token to authenticate to your workspace (see User Settings in the Databricks WebUI)

Type: String
Parameter Sets: (All)
Aliases:

Required: False
Position: 1
Default value: None
Accept pipeline input: False
Accept wildcard characters: False

-Region

Azure Region - must match the URL of your Databricks workspace, for example northeurope

Type: String
Parameter Sets: (All)
Aliases:

Required: False
Position: 2
Default value: None
Accept pipeline input: False
Accept wildcard characters: False

-JobName

Optional. Starts all jobs matching this name (job names are not unique in Databricks, so more than one job may be started)

Type: String
Parameter Sets: (All)
Aliases:

Required: False
Position: 3
Default value: None
Accept pipeline input: False
Accept wildcard characters: False

-JobId

Optional. Starts the job with this Id.

Type: String
Parameter Sets: (All)
Aliases:

Required: False
Position: 4
Default value: None
Accept pipeline input: False
Accept wildcard characters: False

-PythonParameters

Optional. Array of parameters passed to the Python file as command-line arguments, for example "--input", "dbfs:/data/in.csv" (illustrative values)

Type: String[]
Parameter Sets: (All)
Aliases:

Required: False
Position: 5
Default value: None
Accept pipeline input: False
Accept wildcard characters: False

-JarParameters

Optional. Array of parameters passed to the main class of the jar, for example "param1", "param2" (illustrative values)

Type: String[]
Parameter Sets: (All)
Aliases:

Required: False
Position: 6
Default value: None
Accept pipeline input: False
Accept wildcard characters: False

-SparkSubmitParameters

Optional. Array of spark-submit parameters, for example "--class", "org.apache.spark.examples.SparkPi" (illustrative values)

Type: String[]
Parameter Sets: (All)
Aliases:

Required: False
Position: 7
Default value: None
Accept pipeline input: False
Accept wildcard characters: False

-NotebookParametersJson

Optional. JSON string of name/value pairs passed to the notebook as widget parameters, for example '{"environment": "test"}'

Type: String
Parameter Sets: (All)
Aliases:

Required: False
Position: 8
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
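
Rather than hand-writing the JSON string for -NotebookParametersJson, it can be built from a hashtable with ConvertTo-Json so quoting and escaping are handled for you. A minimal sketch - the token, region, job name, and parameter values below are hypothetical:

```powershell
# Illustrative values - substitute your own token, region, and job name.
$token = "dapi1234567890"

# Build the notebook parameters from a hashtable; ConvertTo-Json handles quoting.
$notebookParams = @{
    environment = "test"
    rundate     = "2019-09-15"
} | ConvertTo-Json -Compress

Start-DatabricksJob -BearerToken $token -Region "northeurope" `
    -JobName "MyNotebookJob" -NotebookParametersJson $notebookParams
```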

CommonParameters

This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see about_CommonParameters.

INPUTS

OUTPUTS

NOTES

Author: Simon D'Morias / Data Thirst Ltd

RELATED LINKS
