Add-DatabricksLibrary
NAME Add-DatabricksLibrary
SYNOPSIS
Installs a library to a Databricks cluster.
SYNTAX
Add-DatabricksLibrary [[-BearerToken] <String>] [[-Region] <String>] [-LibraryType] <String> [-LibrarySettings]
<String> [-ClusterId] <String> [<CommonParameters>]
DESCRIPTION
Attempts to install a library on the cluster. Note that the install happens asynchronously, so you must
check whether it completed successfully - see Get-DatabricksLibraries.
Also note that libraries installed via the API do not show in the UI (again, see Get-DatabricksLibraries).
This is a known Databricks issue which may be addressed in the future.
The API does not yet support the option to auto-install to all clusters.
The cluster must not be in a TERMINATED state (PENDING is OK).
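Because the install is asynchronous, a typical pattern is to submit the install and then poll
Get-DatabricksLibraries until the library reaches a terminal state. The sketch below assumes the
returned objects expose a library specification and a status property named as in the Databricks
Libraries API; inspect the actual output of Get-DatabricksLibraries in your module version before
relying on these names.

```powershell
# Submit the install (returns immediately; the install continues in the background)
Add-DatabricksLibrary -BearerToken $BearerToken -Region $Region `
    -LibraryType "jar" -LibrarySettings "dbfs:/mnt/libraries/library.jar" `
    -ClusterId "bob-1234"

# Poll until the library reports a terminal state.
# NOTE: the 'library.jar' and 'status' property names are assumptions
# based on the Databricks Libraries API response shape.
do {
    Start-Sleep -Seconds 10
    $libs = Get-DatabricksLibraries -BearerToken $BearerToken -Region $Region `
        -ClusterId "bob-1234"
    $status = ($libs |
        Where-Object { $_.library.jar -eq "dbfs:/mnt/libraries/library.jar" }).status
} until ($status -in @("INSTALLED", "FAILED"))
```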
PARAMETERS
-BearerToken <String>
Your Databricks Bearer token to authenticate to your workspace (see User Settings in Databricks WebUI)
Required? false
Position? 1
Default value
Accept pipeline input? false
Accept wildcard characters? false
-Region <String>
Azure Region - must match the URL of your Databricks workspace, example northeurope
Required? false
Position? 2
Default value
Accept pipeline input? false
Accept wildcard characters? false
-LibraryType <String>
egg, jar, pypi, whl, cran, maven
Required? true
Position? 3
Default value
Accept pipeline input? false
Accept wildcard characters? false
-LibrarySettings <String>
Settings can be the path to a jar (starting with dbfs:), a PyPI package name (optionally with a repo), or an egg path.
If jar, URI of the jar to be installed. DBFS and S3 URIs are supported. For example: { "jar":
"dbfs:/mnt/databricks/library.jar" } or { "jar": "s3://my-bucket/library.jar" }. If S3 is used, make sure the
cluster has read access on the library. You may need to launch the cluster with an IAM role to access the S3
URI.
If egg, URI of the egg to be installed. DBFS and S3 URIs are supported. For example: { "egg": "dbfs:/my/egg" }
or { "egg": "s3://my-bucket/egg" }. If S3 is used, make sure the cluster has read access on the library. You
may need to launch the cluster with an IAM role to access the S3 URI.
If whl, URI of the wheel or zipped wheels to be installed. DBFS and S3 URIs are supported. For example: {
"whl": "dbfs:/my/whl" } or { "whl": "s3://my-bucket/whl" }. If S3 is used, make sure the cluster has read
access on the library. You may need to launch the cluster with an IAM role to access the S3 URI. Also the
wheel file name needs to use the correct convention. If zipped wheels are to be installed, the file name
suffix should be .wheelhouse.zip.
If pypi, specification of a PyPI library to be installed. For example: { "package": "simplejson" }
If maven, specification of a Maven library to be installed. For example: { "coordinates":
"org.jsoup:jsoup:1.7.2" }
If cran, specification of a CRAN library to be installed.
Required? true
Position? 4
Default value
Accept pipeline input? false
Accept wildcard characters? false
-ClusterId <String>
The cluster to install the library to. Note that the API does not support auto installing to
all clusters. See Get-DatabricksClusters.
Required? true
Position? 5
Default value
Accept pipeline input? false
Accept wildcard characters? false
<CommonParameters>
This cmdlet supports the common parameters: Verbose, Debug,
ErrorAction, ErrorVariable, WarningAction, WarningVariable,
OutBuffer, PipelineVariable, and OutVariable. For more information, see
about_CommonParameters (https://go.microsoft.com/fwlink/?LinkID=113216).
INPUTS
OUTPUTS
NOTES
Author: Simon D'Morias / Data Thirst Ltd
-------------------------- EXAMPLE 1 --------------------------
C:\PS>Add-DatabricksLibrary -BearerToken $BearerToken -Region $Region -LibraryType "jar" -LibrarySettings
"dbfs:/mnt/libraries/library.jar" -ClusterId "bob-1234"
This example installs a library from a jar which exists in dbfs.
-------------------------- EXAMPLE 2 --------------------------
C:\PS>Add-DatabricksLibrary -BearerToken $BearerToken -Region $Region -LibraryType "pypi" -LibrarySettings
'simplejson2' -ClusterId 'Bob-1234'
This example installs a PyPI library on the cluster specified by ID.
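The two examples above cover jar and PyPI installs; Maven and CRAN libraries instead take the API's
JSON specification as the settings string, as described under -LibrarySettings. The following is a
sketch based on that payload format; the exact JSON accepted may depend on your module version.

```powershell
# Install a Maven library by passing the Libraries API JSON specification
# as the LibrarySettings string.
$settings = '{ "coordinates": "org.jsoup:jsoup:1.7.2" }'
Add-DatabricksLibrary -BearerToken $BearerToken -Region $Region `
    -LibraryType "maven" -LibrarySettings $settings -ClusterId "bob-1234"
```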
RELATED LINKS