Write-DbaDbTableData
NAME Write-DbaDbTableData
SYNOPSIS
Writes data to a SQL Server table.
SYNTAX
Write-DbaDbTableData -SqlInstance <DbaInstanceParameter> [-SqlCredential <PSCredential>] [-Database
<Object>] -InputObject <Object> [-Table] <String> [[-Schema] <String>] [-BatchSize <Int32>]
[-NotifyAfter <Int32>] [-AutoCreateTable] [-NoTableLock] [-CheckConstraints]
[-FireTriggers] [-KeepIdentity] [-KeepNulls] [-Truncate] [-BulkCopyTimeOut
<Int32>] [-EnableException] [-UseDynamicStringLength] [<CommonParameters>]
DESCRIPTION
Writes a .NET DataTable to a SQL Server table using SQL Bulk Copy.
PARAMETERS
-AutoCreateTable [<Switch>]
If this switch is enabled, the table will be created if it does not already exist. The table will be created
with sub-optimal data types such as nvarchar(max).
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-BatchSize [<Int>]
The batch size for the import. Defaults to 5000.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-BulkCopyTimeOut [<Int>]
Value in seconds for the bulk copy operation's timeout. The default is 30 seconds.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-CheckConstraints [<Switch>]
If this switch is enabled, the SqlBulkCopy option to process check constraints will be enabled.
Per Microsoft "Check constraints while data is being inserted. By default, constraints are not checked."
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-Database [<System.Object>]
The database to import the table into.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-EnableException [<Switch>]
By default, when something goes wrong, we try to catch it, interpret it, and give you a friendly warning message.
This avoids overwhelming you with "sea of red" exceptions, but is inconvenient because it basically disables
advanced scripting.
Using this switch turns this "nice by default" feature off and enables you to catch exceptions with your own
try/catch.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
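A minimal sketch of how -EnableException enables your own error handling (the instance and table names here are placeholders, not part of the original examples):

```powershell
# With -EnableException, failures surface as real exceptions that a
# standard try/catch can intercept, instead of a friendly warning.
try {
    $DataTable = Import-Csv C:\temp\customers.csv
    Write-DbaDbTableData -SqlInstance sql2014 -InputObject $DataTable -Table mydb.dbo.customers -EnableException
} catch {
    Write-Warning "Bulk insert failed: $_"
    throw  # rethrow or handle as your script requires
}
```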
-FireTriggers [<Switch>]
If this switch is enabled, the SqlBulkCopy option to fire insert triggers will be enabled.
Per Microsoft "When specified, cause the server to fire the insert triggers for the rows being inserted into
the Database."
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-InputObject [<System.Object>]
This is the DataTable (or data row) to import to SQL Server.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-KeepIdentity [<Switch>]
If this switch is enabled, the SqlBulkCopy option to preserve source identity values will be enabled.
Per Microsoft "Preserve source identity values. When not specified, identity values are assigned by the
destination."
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-KeepNulls [<Switch>]
If this switch is enabled, the SqlBulkCopy option to preserve NULL values will be enabled.
Per Microsoft "Preserve null values in the destination table regardless of the settings for default values.
When not specified, null values are replaced by default values where applicable."
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-NoTableLock [<Switch>]
If this switch is enabled, a table lock (TABLOCK) will not be placed on the destination table. By default,
this operation will lock the destination table while running.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-NotifyAfter [<Int>]
Sets the number of imported rows after which a progress notification is shown.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
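One way -BatchSize and -NotifyAfter combine in practice (the values and object names below are illustrative, not recommendations):

```powershell
# Commit rows in batches of 10000 and report progress every 2000 rows.
# sql2014 and mydb.dbo.customers are placeholder names.
$DataTable = Import-Csv C:\temp\customers.csv
Write-DbaDbTableData -SqlInstance sql2014 -InputObject $DataTable `
    -Table mydb.dbo.customers -BatchSize 10000 -NotifyAfter 2000
```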
-Schema [<String>]
Defaults to dbo if no schema is specified.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-SqlCredential [<Pscredential>]
Login to the target instance using alternative credentials. Accepts PowerShell credentials (Get-Credential).
Windows Authentication, SQL Server Authentication, Active Directory - Password, and Active Directory -
Integrated are all supported.
For MFA support, please use Connect-DbaInstance.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-SqlInstance [<DbaInstanceParameter>]
The target SQL Server instance or instances.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-Table [<String>]
The table name to import data into. You can specify a one, two, or three part table name. If you specify a one
or two part name, you must also use -Database.
If the table does not exist, you can use -AutoCreateTable to automatically create the table with inefficient
data types.
If the object name contains special characters, please wrap it in square brackets [ ].
Using dbo.First.Table will try to import to a table named 'Table' on schema 'First' and database 'dbo'.
The correct way to import to a table named 'First.Table' on schema 'dbo' is by passing dbo.[First.Table]
Any actual usage of the ] must be escaped by duplicating the ] character.
The correct way to import to a table Name] in schema Schema.Name is by passing [Schema.Name].[Name]]]
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-Truncate [<Switch>]
If this switch is enabled, the destination table will be truncated after prompting for confirmation.
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
-UseDynamicStringLength [<Switch>]
By default, all string columns will be NVARCHAR(MAX).
If this switch is enabled, string columns will instead use the length specified by the column's MaxLength
property (if specified).
Required? false
Position? named
Default value
Accept pipeline input? False
Accept wildcard characters? false
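A sketch of how a column's MaxLength interacts with this switch, assuming a hand-built DataTable (the column size and object names are illustrative):

```powershell
# Build a DataTable whose string column carries an explicit MaxLength.
$dt = New-Object System.Data.DataTable
$col = $dt.Columns.Add("Name", [string])
$col.MaxLength = 50    # with -UseDynamicStringLength this length is honored
[void]$dt.Rows.Add("Example")
Write-DbaDbTableData -SqlInstance sql2014 -InputObject $dt -Table mydb.dbo.names `
    -AutoCreateTable -UseDynamicStringLength
```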
<CommonParameters>
This cmdlet supports the common parameters: Verbose, Debug,
ErrorAction, ErrorVariable, WarningAction, WarningVariable,
OutBuffer, PipelineVariable, and OutVariable. For more information, see
about_CommonParameters (https://go.microsoft.com/fwlink/?LinkID=113216).
INPUTS
OUTPUTS
NOTES
Tags: DataTable, Insert
Author: Chrissy LeMaire (@cl), netnerds.net
Website: https://dbatools.io
Copyright: (c) 2018 by dbatools, licensed under MIT
License: MIT https://opensource.org/licenses/MIT
-------------------------- EXAMPLE 1 --------------------------
PS C:\>$DataTable = Import-Csv C:\temp\customers.csv
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $DataTable -Table mydb.dbo.customers
Performs a bulk insert of all the data in customers.csv into database mydb, schema dbo, table customers. A
progress bar will be shown as rows are inserted. If the destination table does not exist, the import will be
halted.
-------------------------- EXAMPLE 2 --------------------------
PS C:\>$tableName = "MyTestData"
PS C:\> $query = "SELECT name, create_date, owner_sid FROM sys.databases"
PS C:\> $dataset = Invoke-DbaQuery -SqlInstance 'localhost,1417' -SqlCredential $containerCred -Database master
-Query $query
PS C:\> $dataset | Select-Object name, create_date, @{L="owner_sid";E={$_."owner_sid"}} | Write-DbaDbTableData
-SqlInstance 'localhost,1417' -SqlCredential $containerCred -Database tempdb -Table $tableName -Schema dbo
-AutoCreateTable
Pulls data from a SQL Server instance and then performs a bulk insert of the dataset to a new, auto-generated
table tempdb.dbo.MyTestData.
-------------------------- EXAMPLE 3 --------------------------
PS C:\>$DataTable = Import-Csv C:\temp\customers.csv
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $DataTable -Table mydb.dbo.customers
-AutoCreateTable -Confirm
Performs a bulk insert of all the data in customers.csv. If mydb.dbo.customers does not exist, it will be created
with inefficient but forgiving data types.
Prompts for confirmation before a variety of steps.
-------------------------- EXAMPLE 4 --------------------------
PS C:\>$DataTable = Import-Csv C:\temp\customers.csv
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $DataTable -Table mydb.dbo.customers -Truncate
Performs a bulk insert of all the data in customers.csv. Prior to importing into mydb.dbo.customers, the user is
informed that the table will be truncated and asked for confirmation. The user is prompted again to perform the
import.
-------------------------- EXAMPLE 5 --------------------------
PS C:\>$DataTable = Import-Csv C:\temp\customers.csv
PS C:\> Write-DbaDbTableData -SqlInstance sql2014 -InputObject $DataTable -Database mydb -Table customers
-KeepNulls
Performs a bulk insert of all the data in customers.csv into mydb.dbo.customers. Because Schema was not specified,
dbo was used. NULL values in the destination table will be preserved.
-------------------------- EXAMPLE 6 --------------------------
PS C:\>$passwd = ConvertTo-SecureString "P@ssw0rd" -AsPlainText -Force
PS C:\> $AzureCredential = New-Object System.Management.Automation.PSCredential("AzureAccount", $passwd)
PS C:\> $DataTable = Import-Csv C:\temp\customers.csv
PS C:\> Write-DbaDbTableData -SqlInstance AzureDB.database.windows.net -InputObject $DataTable -Database mydb
-Table customers -KeepNulls -SqlCredential $AzureCredential -BulkCopyTimeOut 300
This performs the same operation as the previous example, but against an Azure SQL Database instance using the
required credentials.
-------------------------- EXAMPLE 7 --------------------------
PS C:\>$process = Get-Process
PS C:\> Write-DbaDbTableData -InputObject $process -SqlInstance sql2014 -Table
"[[DbName]]].[Schema.With.Dots].[`"[Process]]`"]" -AutoCreateTable
Creates a table based on the Process object, with over 60 columns converted from PowerShell data types to SQL
Server data types. After the table is created, a bulk insert is performed to add the process information to the
table.
Writes the results of Get-Process to a table named "[Process]" in a schema named Schema.With.Dots in a database
named [DbName].
The table name, schema name, and database name must each be wrapped in square brackets [ ].
Special characters like " must be escaped by a ` character.
In addition, any actual instance of the ] character must be escaped by being duplicated.
This is an example of the type conversion in action. All process properties are converted, including special types
like TimeSpan. Script properties are resolved before the type conversion starts thanks to ConvertTo-DbaDataTable.
RELATED LINKS
https://dbatools.io/Write-DbaDbTableData