Add-DatabricksDBFSFile
NAME
Add-DatabricksDBFSFile
SYNOPSIS
Upload a file or folder of files from your local filesystem into DBFS.
SYNTAX
Add-DatabricksDBFSFile [[-BearerToken] <String>] [[-Region] <String>] [-LocalRootFolder] <String> [-FilePattern]
<String> [-TargetLocation] <String> [<CommonParameters>]
DESCRIPTION
Upload a file or folder of files to DBFS. Supports exact paths or pattern matching. Target folders in DBFS do not
need to exist; they will be created as needed.
Existing files will be overwritten.
Use this as part of a CI/CD pipeline to publish your code & libraries (see Example 3 below for a sketch).
PARAMETERS
-BearerToken <String>
Your Databricks Bearer token to authenticate to your workspace (see User Settings in Databricks WebUI)
Required? false
Position? 1
Default value
Accept pipeline input? false
Accept wildcard characters? false
-Region <String>
Azure Region - must match the URL of your Databricks workspace, for example northeurope
Required? false
Position? 2
Default value
Accept pipeline input? false
Accept wildcard characters? false
-LocalRootFolder <String>
Path to the file(s) to upload; can be relative or full. Note that subfolders are always recursed.
Required? true
Position? 3
Default value
Accept pipeline input? false
Accept wildcard characters? false
-FilePattern <String>
File pattern to match. Examples: *.py, *.*, ProjectA*.*
Required? true
Position? 4
Default value
Accept pipeline input? false
Accept wildcard characters? false
-TargetLocation <String>
Target folder in DBFS. Should start with /.
Does not need to exist.
Required? true
Position? 5
Default value
Accept pipeline input? false
Accept wildcard characters? false
<CommonParameters>
This cmdlet supports the common parameters: Verbose, Debug,
ErrorAction, ErrorVariable, WarningAction, WarningVariable,
OutBuffer, PipelineVariable, and OutVariable. For more information, see
about_CommonParameters (https://go.microsoft.com/fwlink/?LinkID=113216).
INPUTS
OUTPUTS
NOTES
Author: Simon D'Morias / Data Thirst Ltd
-------------------------- EXAMPLE 1 --------------------------
C:\PS>Add-DatabricksDBFSFile -BearerToken $BearerToken -Region $Region -LocalRootFolder "Samples" -FilePattern "Test.jar" -TargetLocation '/test' -Verbose
This example uploads a single file called Test.jar from the Samples folder, a path relative to your working directory.
-------------------------- EXAMPLE 2 --------------------------
C:\PS>Add-DatabricksDBFSFile -BearerToken $BearerToken -Region $Region -LocalRootFolder Samples/DummyNotebooks -FilePattern "*.py" -TargetLocation '/test2/' -Verbose
This example uploads a folder of .py files.
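-------------------------- EXAMPLE 3 --------------------------
A sketch of CI/CD usage, not part of the original help: the DATABRICKS_BEARER_TOKEN secret variable, the dist folder, and the /releases/myproject target are illustrative assumptions for a release pipeline; adjust the names to your environment.
C:\PS>Add-DatabricksDBFSFile -BearerToken $env:DATABRICKS_BEARER_TOKEN -Region 'northeurope' -LocalRootFolder "dist" -FilePattern "*.*" -TargetLocation '/releases/myproject' -Verbose
This example uploads every file under the local dist folder (subfolders included) to /releases/myproject in DBFS, overwriting any existing files, as a release pipeline might do after a build.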
RELATED LINKS