

# Full load SQL Server database migration
<a name="chap-manageddatabases.sql-server-rds-sql-server-full-load"></a>

The full load migration phase populates the target database with a copy of the source data. In each section, you can find detailed information about one full load method and its results to help you choose the method that fits your use case. For all three methods, we use the [dms_sample](https://github.com/aws-samples/aws-database-migration-samples/blob/master/sqlserver/sampledb/v1/README.md) database as an example. The `dms_sample` database includes tables, views, indexes, stored procedures, and other database objects.

**Topics**
+ [SQL Server database backup and restore using Amazon S3](chap-manageddatabases.sql-server-rds-sql-server-full-load-backup-restore.md)
+ [SQL Server import and export wizard](chap-manageddatabases.sql-server-rds-sql-server-full-load-import-export.md)
+ [Generate and Publish Scripts wizard and Bulk Copy Program Utility](chap-manageddatabases.sql-server-rds-sql-server-full-load-bcp.md)

# SQL Server database backup and restore using Amazon S3
<a name="chap-manageddatabases.sql-server-rds-sql-server-full-load-backup-restore"></a>

Backup and restore is the easiest and usually the preferred method for the initial load of the target database. In this method, you create a full backup of your self-managed SQL Server database, transfer it to an Amazon S3 bucket, and restore it to your Amazon RDS for SQL Server instance. For more information, see [Importing and exporting SQL Server databases using native backup and restore](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/SQLServer.Procedural.Importing.html) in the *Amazon RDS User Guide*.

The backup and restore method is suitable for the following use cases:
+ Your database size is less than 64 TiB.
+ You want to carry out a lift and shift migration with no changes or minimal changes to the database. For example, you want to migrate secondary database objects such as users, views, stored procedures, triggers, and so on in addition to your data.
+ Network connectivity between your on-premises data center and AWS is often congested or has frequent disconnects. Backup and restore gives you the flexibility to transmit backup files during non-business hours.

The backup and restore method has the following limitations:
+ Amazon RDS for SQL Server supports native restore of databases up to 64 TiB in size. For SQL Server Express Edition databases, Amazon RDS supports native restore of up to 10 GiB.
+ On Multi-AZ database instances, you can only natively restore databases that are backed up in the full recovery model.
+ The Amazon S3 bucket where you store your data must be located in the same AWS Region as your target Amazon RDS for SQL Server database instance.
+ Restoring backups from one time zone to a different time zone isn’t recommended.
+ You can’t transform or filter data at the table level when you use backup and restore.
+ When you need to migrate a subset of tables, you can’t use backup and restore.

## Migration steps
<a name="chap-manageddatabases.sql-server-rds-sql-server-full-load-backup-restore-steps"></a>

At a high level, the steps involved in backup and restore are the following:
+ Perform a full backup of the source database.
+ Copy the backup file to an Amazon S3 bucket.
+ Restore the backup from the Amazon S3 bucket onto the target Amazon RDS for SQL Server database.

We use the [dms_sample](https://github.com/aws-samples/aws-database-migration-samples/blob/master/sqlserver/sampledb/v1/README.md) database in the following example.

## Perform full backup
<a name="chap-manageddatabases.sql-server-rds-sql-server-full-load-backup-restore-full-backup"></a>

First, perform a full backup of the source database. RDS for Microsoft SQL Server currently limits data files to 5 TiB. If the database backup size is less than 5 TiB, you can use the following command.

```
USE [dms_sample]
GO

BACKUP DATABASE [dms_sample] TO
DISK = 'C:\Backup\dms_sample.bak'
WITH NOFORMAT, NOINIT,
NAME = 'Full Backup of dms_sample', SKIP, NOREWIND, NOUNLOAD, STATS = 10
GO
```

If your database is larger than 5 TiB, split the backup into multiple files, and make sure that each file is less than 5 TiB in size. For example, the `dms_sample` database is 15 TiB in size, so we use three backup files.

```
USE [dms_sample]
GO

BACKUP DATABASE [dms_sample] TO
DISK = 'C:\Backup\dms_sample1.bak',
DISK = 'C:\Backup\dms_sample2.bak',
DISK = 'C:\Backup\dms_sample3.bak'
WITH NOFORMAT, NOINIT,
NAME = 'Full Backup of dms_sample', SKIP, NOREWIND, NOUNLOAD, STATS = 10
GO
```
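The file-count arithmetic above can be sketched in code: divide the database size by the 5 TiB per-file limit, round up, then emit one `DISK` clause per file. This is an illustrative helper only, not an AWS tool; the `backup_statement` name and the numbered file-name pattern are assumptions for this sketch.

```python
import math

MAX_FILE_TIB = 5  # RDS for SQL Server limits each data file to 5 TiB

def backup_statement(db_name, db_size_tib, backup_dir=r"C:\Backup"):
    """Build a BACKUP DATABASE statement striped across enough files
    to keep each file under the 5 TiB limit."""
    n_files = max(1, math.ceil(db_size_tib / MAX_FILE_TIB))
    disks = ",\n".join(
        f"DISK = '{backup_dir}\\{db_name}{i}.bak'" for i in range(1, n_files + 1)
    )
    return (
        f"BACKUP DATABASE [{db_name}] TO\n{disks}\n"
        "WITH NOFORMAT, NOINIT,\n"
        f"NAME = 'Full Backup of {db_name}', SKIP, NOREWIND, NOUNLOAD, STATS = 10"
    )

print(backup_statement("dms_sample", 15))
```

For the 15 TiB `dms_sample` example, the helper emits three `DISK` clauses, matching the statement shown above.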

## Copy backup files to Amazon S3
<a name="chap-manageddatabases.sql-server-rds-sql-server-full-load-backup-restore-copy-backup"></a>

Now, use the AWS CLI to upload the backup file to an Amazon S3 bucket.

```
aws s3 cp C:\Backup\dms_sample.bak s3://sampledatabaseuswest2/
```

For multiple backup files, use the folder path to copy the backup files to an Amazon S3 bucket.

```
aws s3 cp "C:\Backup" s3://sampledatabaseuswest2/ --recursive
```

Make sure that you define an AWS Identity and Access Management (IAM) role that grants access to the Amazon S3 bucket, and add this role to an option group. An option group can specify features, called options, that are available for a particular Amazon RDS DB instance. When you associate a DB instance with an option group, the specified options and option settings are enabled for that DB instance.

When you create this IAM role, attach a trust relationship and a permissions policy. For more information, see [Manually creating an IAM role for native backup and restore](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/SQLServer.Procedural.Importing.html#SQLServer.Procedural.Importing.Native.Enabling.IAM).

We create the `sql-server-backup-restore` role, and then use it when we configure the target Amazon RDS database.
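The trust relationship and permissions policy for this role can be sketched as JSON documents built in code. The following is a minimal sketch based on the policy described in the linked RDS guide; treat the exact action list as an assumption and verify it against the current documentation. The bucket name is the one used in this walkthrough.

```python
import json

bucket = "sampledatabaseuswest2"  # the bucket used in this walkthrough

# Trust relationship: allows Amazon RDS to assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "rds.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# Permissions policy: grants the role access to the backup bucket
# and to the backup files (objects) in it.
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": f"arn:aws:s3:::{bucket}",
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:ListMultipartUploadParts",
                "s3:AbortMultipartUpload",
            ],
            "Resource": f"arn:aws:s3:::{bucket}/*",
        },
    ],
}

print(json.dumps(trust_policy, indent=2))
print(json.dumps(permissions_policy, indent=2))
```

You can save these documents to files and pass them to `aws iam create-role` and `aws iam put-role-policy` when you create the `sql-server-backup-restore` role.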

## Restore your backup to the target database
<a name="chap-manageddatabases.sql-server-rds-sql-server-full-load-backup-restore-restore-backup"></a>

To restore your backup, do the following:

1. Create an option group for the target database.

   1. In the Amazon RDS console, choose **Option groups**, and then choose **Create option group**.

   1. For **Name**, enter **SQLServerrestore**.

   1. For **Description**, enter **SQLServerrestore**.

   1. For **Engine**, choose **sqlserver-se**.

   1. For **Major engine version**, choose **14.00**.

   1. Choose **Create**.

1. Add the `SQLSERVER_BACKUP_RESTORE` option and the `sql-server-backup-restore` role to this option group so that Amazon RDS can access the S3 bucket.

   1. On the **Option groups** page, choose the option group that you created.

   1. For **Options**, choose **Add option**. The **Add option** page opens.

   1. For **Option name**, choose **SQLSERVER\_BACKUP\_RESTORE**.

   1. For **IAM role**, choose the `sql-server-backup-restore` role.

1. Modify your Amazon RDS for SQL Server DB instance and attach this option group.

   1. In the Amazon RDS console, choose **Databases**, and then choose your target database.

   1. Choose **Modify**. The **Modify DB instance** page opens.

   1. In the **Additional configuration** section, choose **SQLServerrestore** for **Option group**.

Now, you can restore the backup file from Amazon S3 into the target Amazon RDS for SQL Server database. To restore your database, call the `rds_restore_database` stored procedure. For more information, see [Restoring a database](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/SQLServer.Procedural.Importing.html#SQLServer.Procedural.Importing.Native.Using.Restore).

```
exec msdb.dbo.rds_restore_database
@restore_db_name='DMS',
@s3_arn_to_restore_from='arn:aws:s3:::sampledatabaseuswest2/dms_sample.bak';
```

To restore multiple backup files, use the following command.

```
exec msdb.dbo.rds_restore_database
@restore_db_name='DMS',
@s3_arn_to_restore_from='arn:aws:s3:::sampledatabaseuswest2/dms_sample*';
```

The preceding statement returns the ID of the restore task. To check the status of the restore, use the following command with this task ID.

```
exec msdb.dbo.rds_task_status
 [@db_name='DMS'],
 [@task_id=<ID_number>];
```
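If you script the restore, note that the ARN passed to `rds_restore_database` follows a simple pattern: the bucket name plus the object key, with a trailing `*` when the backup is split across multiple files. The sketch below illustrates this; the `restore_statement` helper is hypothetical, for illustration only.

```python
def restore_statement(db_name, bucket, key, split=False):
    """Build the rds_restore_database call for a backup stored in
    Amazon S3; append a wildcard for multi-file (split) backups."""
    arn = f"arn:aws:s3:::{bucket}/{key}" + ("*" if split else "")
    return (
        "exec msdb.dbo.rds_restore_database\n"
        f"@restore_db_name='{db_name}',\n"
        f"@s3_arn_to_restore_from='{arn}';"
    )

# Single-file and split-backup variants from this walkthrough.
print(restore_statement("DMS", "sampledatabaseuswest2", "dms_sample.bak"))
print(restore_statement("DMS", "sampledatabaseuswest2", "dms_sample", split=True))
```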

Finally, use the following SQL command to get the log sequence number (LSN) of the on-premises source database backup. Then use this LSN to set up the change data capture (CDC) task in AWS DMS.

```
USE [dms_sample]
GO

SELECT [Current LSN], [Begin Time], Description
FROM fn_dblog(NULL, NULL)
WHERE [Transaction Name] = 'Backup:CommitDifferentialBase'
```

# SQL Server import and export wizard
<a name="chap-manageddatabases.sql-server-rds-sql-server-full-load-import-export"></a>

Microsoft SQL Server Import and Export Wizard is a high-performance option for data migration. It uses the SQL Server Integration Services (SSIS) framework. For more information, see [Import and Export Data with the SQL Server Import and Export Wizard](https://docs.microsoft.com/en-us/sql/integration-services/import-export-data/import-and-export-data-with-the-sql-server-import-and-export-wizard?view=sql-server-ver15) and [SQL Server Integration Services](https://docs.microsoft.com/en-us/sql/integration-services/sql-server-integration-services?view=sql-server-ver15).

The Import and Export Wizard is suitable for the following use cases:
+ To achieve high migration performance.
+ To transform data during the migration. You can use the wizard to create SSIS packages and modify them in Visual Studio with an SSIS extension to achieve this.
+ To rename the target tables or schemas during the migration.
+ To migrate only the tables and avoid migrating secondary database objects such as users, views, stored procedures, triggers, foreign keys, or functions.

The migration performance is affected by resource constraints of the host where you run the wizard. During the migration, all data is funneled through this host.

## Migration steps
<a name="chap-manageddatabases.sql-server-rds-sql-server-full-load-import-export-steps"></a>

Use the following steps to migrate all the tables and views from the `dms_sample` database to your target database.

Disable all constraints on the target DB instance before the migration. The Import and Export Wizard copies tables in a random order, which can lead to failures if you enforce referential integrity on the target.

```
EXEC sp_msforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT all'
```

Make sure that you capture the current log sequence number (LSN) from the source database before you start the full load. To capture the current LSN, use the following command.

```
SELECT max([Current LSN]) FROM fn_dblog(NULL, NULL)
```

Then you can use this LSN to set up the change data capture (CDC) task in AWS DMS.

Open the SQL Server Import and Export Wizard from the Windows Start menu. Connect to your source and target databases and select the source tables and views. The following image shows the SQL Server Import and Export Wizard application window.

![\[SQL Server Import and Export Wizard application window\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sql-server-rds-sql-server-full-load-import-export.png)


Choose **Next**, then choose **Run immediately**, and then choose **Finish**. The SQL Server Import and Export Wizard starts the migration. You can monitor the progress of your migration using the Performing Operation screen. For more information, see [Performing Operation (SQL Server Import and Export Wizard)](https://docs.microsoft.com/en-us/sql/integration-services/import-export-data/performing-operation-sql-server-import-and-export-wizard?view=sql-server-ver15).

Make sure that you turn constraints back on after you complete the migration.

```
EXEC sp_msforeachtable 'ALTER TABLE ? CHECK CONSTRAINT all'
```

For more information, see [Get started with this simple example of the Import and Export Wizard](https://docs.microsoft.com/en-us/sql/integration-services/import-export-data/get-started-with-this-simple-example-of-the-import-and-export-wizard?view=sql-server-ver15).

# Generate and Publish Scripts wizard and Bulk Copy Program Utility
<a name="chap-manageddatabases.sql-server-rds-sql-server-full-load-bcp"></a>

You can use the SQL Server Generate and Publish Scripts wizard to create Transact-SQL scripts for objects in your database. Then you can run the Bulk Copy Program Utility (bcp) to copy data from your Microsoft SQL Server instance into data files. Also, you can use bcp to import data into a table from data files. For more information, see [How to: Generate a Script (SQL Server Management Studio)](https://docs.microsoft.com/en-us/previous-versions/sql/sql-server-2008-r2/ms178078(v=sql.105)?redirectedfrom=MSDN) and [bcp Utility](https://docs.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver15).

This approach is suitable for the following use cases:
+ You don’t transform data during the migration.
+ You don’t rename tables or schemas during the migration.
+ You use referential integrity (RI) on target tables. In this case, bcp automatically suspends RI constraints on target tables during data import.
+ You can script and migrate all database objects using the SQL Server Generate and Publish Scripts wizard.

This approach has the following limitations:
+ You need to create schemas on your target database before you can use migration scripts.
+ This approach is slower than the Import and Export Wizard.
+ Data transformations aren’t supported.
+ In bcp, the error messages are limited to 512 bytes. This can make troubleshooting complicated.
+ You run the bcp command for each table. This increases the complexity for large migrations.

## Migration steps
<a name="chap-manageddatabases.sql-server-rds-sql-server-full-load-bcp-steps"></a>

At a high level, the steps involved in this approach are the following:
+ Use the Microsoft Generate and Publish Scripts wizard to create Transact-SQL scripts from the source database.
+ Use the created Transact-SQL scripts to create database objects in the target database.
+ Use the SQL Server Bulk Copy Program Utility (bcp) to export data from the source database to data files. Then, use bcp to import data from the data files into the target database tables.

The following example shows how to migrate the `dms_sample` database using the Generate and Publish Scripts wizard and the Bulk Copy Program Utility.

Generate a Transact-SQL script for the source database tables. You can save the script as a single file or in a new query window.

![\[Microsoft Generate and Publish Scripts wizard\]](http://docs.aws.amazon.com/dms/latest/sbs/images/sql-server-rds-sql-server-full-load-bcp.png)


Next, create database objects on the target database using the script that you generated in the previous step.

Run the following command on the source database to capture the current log sequence number (LSN). Then use this LSN to set up the change data capture (CDC) task in AWS DMS.

```
SELECT max([Current LSN]) FROM fn_dblog(NULL, NULL)
```

Use the Windows command prompt to export the source tables to data files with the bcp utility.

```
bcp [database_name.]schema.table_name out "data_file" -c -S [server_name] -d [database_name] -U [login] -P [password]
```

Import the data files created in the previous step into the target database with the bcp utility.

```
bcp [database_name.]schema.table_name in "data_file" -S [server_name] -d [database_name] -c
```

You can create a .bat file with all the bcp commands to avoid running the scripts one by one. The following code example shows the contents of this .bat file.

```
bcp dbo.export1 out C:\BCP\export1.dat -c -S source-server-name -d dms_sample -U dms_user -P password
bcp dbo.export2 out C:\BCP\export2.dat -c -S source-server-name -d dms_sample -U dms_user -P password
bcp dbo.export3 out C:\BCP\export3.dat -c -S source-server-name -d dms_sample -U dms_user -P password
bcp dbo.export1 in C:\BCP\export1.dat -c -S target-server-name -d dms_sample -U dms_user -P password
bcp dbo.export2 in C:\BCP\export2.dat -c -S target-server-name -d dms_sample -U dms_user -P password
bcp dbo.export3 in C:\BCP\export3.dat -c -S target-server-name -d dms_sample -U dms_user -P password
```
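Instead of maintaining the .bat file by hand, you can generate one export/import pair per table. The following is a minimal sketch; the `bcp_commands` helper and its parameters are assumptions for this illustration, and the generated lines mirror the .bat file above.

```python
def bcp_commands(tables, source, target, database, user, password,
                 data_dir=r"C:\BCP"):
    """Generate matching bcp export (out) and import (in) commands
    for each table: all exports first, then all imports."""
    common = f"-c -S {{server}} -d {database} -U {user} -P {password}"
    exports = [
        f"bcp dbo.{t} out {data_dir}\\{t}.dat " + common.format(server=source)
        for t in tables
    ]
    imports = [
        f"bcp dbo.{t} in {data_dir}\\{t}.dat " + common.format(server=target)
        for t in tables
    ]
    return exports + imports

# Build the .bat contents for the three example tables.
lines = bcp_commands(["export1", "export2", "export3"],
                     "source-server-name", "target-server-name",
                     "dms_sample", "dms_user", "password")
print("\n".join(lines))
```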