

# Creating source data providers in DMS Schema Conversion
<a name="data-providers-source"></a>

You can use a Microsoft SQL Server, Oracle, PostgreSQL, MySQL, IBM Db2 for LUW, IBM Db2 for z/OS, or SAP ASE (Sybase ASE) database as a source data provider in migration projects for DMS Schema Conversion. 

Make sure that you configure the network to permit interaction between your source data provider and DMS Schema Conversion. For more information, see [Setting up a network for DMS Schema Conversion](instance-profiles-network.md). 

**Topics**
+ [Using a Microsoft SQL Server database as a source in DMS Schema Conversion](data-providers-sql-server.md)
+ [Using an Oracle database as a source in DMS Schema Conversion](data-providers-oracle.md)
+ [Using an Oracle Data Warehouse database as a source in DMS Schema Conversion](data-providers-oracle-dw.md)
+ [Using a PostgreSQL database as a source in DMS Schema Conversion](sc-data-providers-postgresql.md)
+ [Using a MySQL database as a source in DMS Schema Conversion](sc-data-providers-mysql.md)
+ [Using an IBM Db2 for Linux, UNIX and Windows database as a source in DMS Schema Conversion](sc-data-providers-db2luw.md)
+ [Using an IBM Db2 for z/OS database as a source in DMS Schema Conversion](sc-data-providers-db2zos.md)
+ [Using a SAP ASE (Sybase ASE) database as a source in AWS DMS Schema Conversion](dm-data-providers-source-sybase-ASE.md)

# Using a Microsoft SQL Server database as a source in DMS Schema Conversion
<a name="data-providers-sql-server"></a>

You can use SQL Server databases as a migration source in DMS Schema Conversion.

You can use DMS Schema Conversion to convert database code objects from SQL Server to the following targets:
+ Aurora MySQL
+ Aurora PostgreSQL
+ RDS for MySQL
+ RDS for PostgreSQL

For information about the supported SQL Server database versions, see [Source data providers for DMS Schema Conversion](CHAP_Introduction.Sources.md#CHAP_Introduction.Sources.SchemaConversion).

For more information about using DMS Schema Conversion with a source SQL Server database, see the [SQL Server to MySQL migration step-by-step walkthrough](https://docs.aws.amazon.com/dms/latest/sbs/schema-conversion-sql-server-mysql.html).

## Privileges for Microsoft SQL Server as a source
<a name="data-providers-sql-server-permissions"></a>

The following privileges are required for Microsoft SQL Server as a source: 
+ VIEW DEFINITION
+ VIEW DATABASE STATE

The `VIEW DEFINITION` privilege lets users who have public access see object definitions. DMS Schema Conversion uses the `VIEW DATABASE STATE` privilege to check the features of the SQL Server Enterprise edition.

Repeat the grant for each database whose schema you are converting.

In addition, grant the following privileges on the `master` database:
+ VIEW SERVER STATE
+ VIEW ANY DEFINITION

DMS Schema Conversion uses the `VIEW SERVER STATE` privilege to collect server settings and configuration. Make sure that you grant the `VIEW ANY DEFINITION` privilege to view data providers.
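These grants can be sketched in T-SQL as follows. The user name `min_privs` is a hypothetical placeholder, not part of the documented procedure; replace it with your own user.

```
-- Run in each database whose schema you are converting
GRANT VIEW DEFINITION TO min_privs;
GRANT VIEW DATABASE STATE TO min_privs;

-- Run on the master database (server-level permissions)
USE master;
GRANT VIEW SERVER STATE TO min_privs;
GRANT VIEW ANY DEFINITION TO min_privs;
```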

To read information about Microsoft Analysis Services, run the following command on the `master` database.

```
EXEC master..sp_addsrvrolemember @loginame = N'<user_name>', @rolename = N'sysadmin'
```

In the preceding example, replace the `<user_name>` placeholder with the name of the user to whom you previously granted the required privileges.

To read information about SQL Server Agent, add your user to the SQLAgentUser role. Run the following command on the `msdb` database.

```
EXEC sp_addrolemember <SQLAgentRole>, <user_name>;
```

In the preceding example, replace the `<SQLAgentRole>` placeholder with the name of the SQL Server Agent role. Then replace the `<user_name>` placeholder with the name of the user to whom you previously granted the required privileges. For more information, see [Adding a user to the SQLAgentUser role](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.SQLServer.CommonDBATasks.Agent.html#SQLServerAgent.AddUser) in the *Amazon RDS User Guide*.

To detect log shipping, grant the `SELECT ON dbo.log_shipping_primary_databases` privilege on the `msdb` database.

To use the notification approach for data definition language (DDL) replication, grant the `RECEIVE ON <schema_name>.<queue_name>` privilege on your source databases. In this privilege, replace the `<schema_name>` placeholder with the schema name of your database. Then, replace the `<queue_name>` placeholder with the name of a queue table.
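The two grants above can be sketched as follows. The user name `min_privs` is a hypothetical placeholder, and the angle-bracket placeholders follow the conventions of this section.

```
-- On the msdb database, to detect log shipping
USE msdb;
GRANT SELECT ON dbo.log_shipping_primary_databases TO min_privs;

-- On each source database, for the DDL notification approach
GRANT RECEIVE ON <schema_name>.<queue_name> TO min_privs;
```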

# Using an Oracle database as a source in DMS Schema Conversion
<a name="data-providers-oracle"></a>

You can use Oracle databases as a migration source in DMS Schema Conversion.

To connect to your Oracle database, use the Oracle System ID (SID). To find the Oracle SID, submit the following query to your Oracle database:

```
SELECT sys_context('userenv','instance_name') AS SID FROM dual;
```

You can use DMS Schema Conversion to convert database code objects from Oracle Database to the following targets:
+ Aurora MySQL
+ Aurora PostgreSQL
+ RDS for MySQL
+ RDS for PostgreSQL

For information about the supported Oracle database versions, see [Source data providers for DMS Schema Conversion](CHAP_Introduction.Sources.md#CHAP_Introduction.Sources.SchemaConversion).

For more information about using DMS Schema Conversion with a source Oracle database, see the [Oracle to PostgreSQL migration step-by-step walkthrough](https://docs.aws.amazon.com/dms/latest/sbs/schema-conversion-oracle-postgresql.html).

## Privileges for Oracle as a source
<a name="data-providers-oracle-permissions"></a>

The following privileges are required for Oracle as a source: 
+ CONNECT 
+ SELECT\_CATALOG\_ROLE 
+ SELECT ANY DICTIONARY 
+ SELECT ON SYS.ARGUMENT$
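As a sketch, the grants might look like the following, assuming a hypothetical user named `min_privs` (replace with your own user).

```
GRANT CONNECT TO min_privs;
GRANT SELECT_CATALOG_ROLE TO min_privs;
GRANT SELECT ANY DICTIONARY TO min_privs;
GRANT SELECT ON SYS.ARGUMENT$ TO min_privs;
```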

# Using an Oracle Data Warehouse database as a source in DMS Schema Conversion
<a name="data-providers-oracle-dw"></a>

You can use Oracle Data Warehouse databases as a migration source in DMS Schema Conversion to convert database code objects and application code to Amazon Redshift.

For information about supported Oracle database versions, see [Source data providers for DMS Schema Conversion](CHAP_Introduction.Sources.md#CHAP_Introduction.Sources.SchemaConversion). For more information about using DMS Schema Conversion with a source Oracle database, see the [Oracle to PostgreSQL migration step-by-step walkthrough](https://docs.aws.amazon.com/dms/latest/sbs/schema-conversion-oracle-postgresql.html). 

## Privileges for using an Oracle Data Warehouse database as a source
<a name="data-providers-oracle-dw-privileges"></a>

The following privileges are required for Oracle Data Warehouse as a source:
+ CONNECT
+ SELECT\_CATALOG\_ROLE
+ SELECT ANY DICTIONARY

## Oracle Data Warehouse to Amazon Redshift conversion settings
<a name="data-providers-oracle-dw-settings"></a>

For information about editing DMS Schema Conversion settings, see [Specifying schema conversion settings for migration projects](schema-conversion-settings.md). 

Oracle Data Warehouse to Amazon Redshift conversion settings include the following:
+ **Add comments in the converted code for the action items of selected severity and higher**: This setting limits the number of comments with action items in the converted code. DMS adds comments in the converted code for action items of the selected severity and higher.

  For example, to minimize the number of comments in your converted code, choose **Errors only**. To include comments for all action items in your converted code, choose **All messages**.
+ **The maximum number of tables for the target Amazon Redshift cluster**: This setting sets the maximum number of tables that DMS can apply to your target Amazon Redshift cluster. Amazon Redshift has quotas that limit the number of tables for different cluster node types. This setting supports the following values:
  + **Auto**: DMS determines the number of tables to apply to your target Amazon Redshift cluster depending on the node type.
  + **Set a value**: Set the number of tables manually.

  DMS converts all your source tables, even if the number of tables is more than your Amazon Redshift cluster can store. DMS stores the converted code in your project and doesn't apply it to the target database. If you reach the Amazon Redshift cluster quota for the tables when you apply the converted code, DMS displays a warning message. Also, DMS applies tables to your target Amazon Redshift cluster until the number of tables reaches the limit.

  For information about Amazon Redshift table quotas, see [Quotas and limits in Amazon Redshift](https://docs.aws.amazon.com/redshift/latest/mgmt/amazon-redshift-limits.html).
+ **Use the UNION ALL view**: This setting lets you set the maximum number of target tables that DMS can create for a single source table.

  Amazon Redshift doesn't support table partitioning. To emulate table partitioning and make queries run faster, DMS can migrate each partition of your source table to a separate table in Amazon Redshift. Then, DMS creates a view that includes data from all of the target tables it creates.

  DMS automatically determines the number of partitions in your source table. Depending on the type of source table partitioning, this number can exceed the quota for the tables that you can apply to your Amazon Redshift cluster. To avoid reaching this quota, enter the maximum number of target tables that DMS can create for partitions of a single source table. The default option is 368 tables, which represents a partition for 366 days of a year, plus two tables for NO RANGE and UNKNOWN partitions.
+ **Datetime format elements that you use in the Oracle code are similar to datetime format strings in Amazon Redshift**: Use this setting to convert data type formatting functions such as `TO_CHAR`, `TO_DATE`, and `TO_NUMBER` with datetime format elements that Amazon Redshift doesn't support. By default, DMS uses extension pack functions to emulate these unsupported format elements in the converted code.

  The datetime format model in Oracle includes more elements than the datetime format strings in Amazon Redshift. When your source code includes only datetime format elements that Amazon Redshift supports, set this value to avoid extension pack functions in the converted code. Avoiding the extension functions makes the converted code run faster.
+ **Numeric format elements that you use in the Oracle code are similar to numeric format strings in Amazon Redshift**: Use this setting to convert numeric data type formatting functions that Amazon Redshift doesn't support. By default, DMS uses extension pack functions to emulate these unsupported format elements in the converted code. 

  The numeric format model in Oracle includes more elements than the numeric format strings in Amazon Redshift. When your source code includes only numeric format elements that Amazon Redshift supports, set this value to avoid extension pack functions in the converted code. Avoiding the extension functions makes the converted code run faster.
+ **Use the NVL function to emulate the behavior of Oracle LEAD and LAG functions**: If your source code doesn't use the default values for offset in the `LEAD` and `LAG` functions, DMS can emulate these functions with the `NVL` function. By default, DMS raises an action item for each use of the `LEAD` and `LAG` functions. Emulating these functions using `NVL` makes the converted code run faster.
+ **Emulate the behavior of primary and unique keys**: Turn on this setting to have DMS emulate the behavior of primary and unique key constraints on the target Amazon Redshift cluster. Amazon Redshift doesn't enforce primary and unique key constraints, and uses them for informational purposes only. If your source code relies on primary or unique key constraints, turn on this setting so that DMS emulates their behavior.
+ **Use compression encoding**: Set this setting to apply compression encoding to Amazon Redshift table columns. DMS assigns compression encoding automatically using the default Redshift algorithm. For information about compression encoding, see [Compression encodings](https://docs.aws.amazon.com/redshift/latest/dg/c_Compression_encodings.html) in the *Amazon Redshift Database Developer Guide*.

  Amazon Redshift doesn't apply compression by default to columns that are defined as sort and distribution keys. To apply compression to these columns, set **Use compression encoding for KEY columns**. You can only select this option when you set **Use compression encoding**.
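To illustrate the **Use the UNION ALL view** setting described earlier, the following sketch shows how a source table partitioned by year might be emulated in Amazon Redshift. The table and column names are hypothetical.

```
-- One target table per source partition
CREATE TABLE sales_2023 (sale_id INTEGER, sale_date DATE, amount NUMERIC(10,2));
CREATE TABLE sales_2024 (sale_id INTEGER, sale_date DATE, amount NUMERIC(10,2));

-- A UNION ALL view presents the partition tables as a single table
CREATE VIEW sales AS
    SELECT * FROM sales_2023
    UNION ALL
    SELECT * FROM sales_2024;
```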

# Using a PostgreSQL database as a source in DMS Schema Conversion
<a name="sc-data-providers-postgresql"></a>

You can use PostgreSQL databases as a migration source in DMS Schema Conversion. 

You can use DMS Schema Conversion to convert database code objects from a PostgreSQL database to the following targets:
+ MySQL
+ Aurora MySQL

The privileges required for PostgreSQL as a source are as follows: 
+ CONNECT ON DATABASE <database\_name>
+ USAGE ON SCHEMA <database\_name>
+ SELECT ON ALL TABLES IN SCHEMA <database\_name>
+ SELECT ON ALL SEQUENCES IN SCHEMA <database\_name>
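These privileges can be granted with statements like the following. The role name `min_privs` is a hypothetical placeholder, and the angle-bracket placeholders match the list above.

```
GRANT CONNECT ON DATABASE <database_name> TO min_privs;
GRANT USAGE ON SCHEMA <database_name> TO min_privs;
GRANT SELECT ON ALL TABLES IN SCHEMA <database_name> TO min_privs;
GRANT SELECT ON ALL SEQUENCES IN SCHEMA <database_name> TO min_privs;
```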

# Using a MySQL database as a source in DMS Schema Conversion
<a name="sc-data-providers-mysql"></a>

You can use MySQL databases as a migration source in DMS Schema Conversion.

You can use DMS Schema Conversion to convert database code objects from a MySQL database to the following targets: 
+ PostgreSQL
+ Aurora PostgreSQL

The privileges required for MySQL as a source are as follows:
+ `SELECT ON *.*`
+ `SHOW VIEW ON *.*`
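For example, assuming a hypothetical user `'min_privs'@'%'` (replace with your own user and host), the grants might look like this:

```
GRANT SELECT ON *.* TO 'min_privs'@'%';
GRANT SHOW VIEW ON *.* TO 'min_privs'@'%';
```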

## MySQL to PostgreSQL conversion settings
<a name="sc-data-providers-mysql-settings"></a>

For information about editing DMS Schema Conversion settings, see [Specifying schema conversion settings for migration projects](schema-conversion-settings.md). 

MySQL to PostgreSQL conversion settings include the following:
+ **Comments in converted SQL code**: Set this setting to add comments in the converted code for the action items of the selected severity and higher. 

  Valid values:
  + **Errors only**
  + **Errors and warnings**
  + **All messages**

# Using an IBM Db2 for Linux, UNIX and Windows database as a source in DMS Schema Conversion
<a name="sc-data-providers-db2luw"></a>

You can use IBM Db2 for Linux, UNIX and Windows databases as a migration source in DMS Schema Conversion.

You can use DMS Schema Conversion to convert database code objects from Db2 for Linux, UNIX and Windows databases to the following targets: 
+ Aurora PostgreSQL
+ Amazon RDS for PostgreSQL

For information about the supported IBM Db2 for Linux, UNIX and Windows database versions, see [Source data providers for DMS Schema Conversion](CHAP_Introduction.Sources.md#CHAP_Introduction.Sources.SchemaConversion).

## Prerequisites for IBM Db2 for Linux, UNIX and Windows as a source database
<a name="sc-data-providers-db2luw-prereq"></a>

You can use the following code example to check the version of your source IBM Db2 for Linux, UNIX and Windows database: 

```
SELECT SERVICE_LEVEL FROM SYSIBMADM.ENV_INST_INFO;
```

## Privileges for IBM Db2 for Linux, UNIX and Windows as a source database
<a name="sc-data-providers-db2luw-privileges"></a>

The privileges needed to connect to a Db2 for Linux, UNIX and Windows database and read system catalogs and tables are as follows:

```
EXECUTE ON PACKAGE NULLID.SYSSH200
EXECUTE ON PACKAGE NULLID.SQLC2O26 /* If Db2 LUW runs on a Linux operating system */
EXECUTE ON FUNCTION SYSPROC.ENV_GET_INST_INFO
EXECUTE ON FUNCTION SYSPROC.AUTH_LIST_AUTHORITIES_FOR_AUTHID
EXECUTE ON FUNCTION SYSPROC.AUTH_LIST_GROUPS_FOR_AUTHID
EXECUTE ON FUNCTION SYSPROC.AUTH_LIST_ROLES_FOR_AUTHID
SELECT ON SYSIBMADM.ENV_INST_INFO
SELECT ON SYSIBMADM.ENV_SYS_INFO
SELECT ON SYSIBMADM.PRIVILEGES
SELECT ON SYSCAT.ATTRIBUTES
SELECT ON SYSCAT.CHECKS
SELECT ON SYSCAT.COLIDENTATTRIBUTES
SELECT ON SYSCAT.COLUMNS
SELECT ON SYSCAT.DATAPARTITIONEXPRESSION
SELECT ON SYSCAT.DATAPARTITIONS
SELECT ON SYSCAT.DATATYPEDEP
SELECT ON SYSCAT.DATATYPES
SELECT ON SYSCAT.HIERARCHIES
SELECT ON SYSCAT.INDEXCOLUSE
SELECT ON SYSCAT.INDEXES
SELECT ON SYSCAT.INDEXPARTITIONS
SELECT ON SYSCAT.KEYCOLUSE
SELECT ON SYSCAT.MODULEOBJECTS
SELECT ON SYSCAT.MODULES
SELECT ON SYSCAT.NICKNAMES
SELECT ON SYSCAT.PERIODS
SELECT ON SYSCAT.REFERENCES
SELECT ON SYSCAT.ROUTINEPARMS
SELECT ON SYSCAT.ROUTINES
SELECT ON SYSCAT.ROWFIELDS
SELECT ON SYSCAT.SCHEMATA
SELECT ON SYSCAT.SEQUENCES
SELECT ON SYSCAT.TABCONST
SELECT ON SYSCAT.TABLES
SELECT ON SYSCAT.TRIGGERS
SELECT ON SYSCAT.VARIABLEDEP
SELECT ON SYSCAT.VARIABLES
SELECT ON SYSCAT.VIEWS
SELECT ON SYSIBM.SYSDUMMY1
```
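Each entry in the list above corresponds to a `GRANT` statement. For example, assuming a hypothetical user named `min_privs`:

```
GRANT EXECUTE ON PACKAGE NULLID.SYSSH200 TO USER min_privs;
GRANT SELECT ON SYSCAT.TABLES TO USER min_privs;
```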

# Using an IBM Db2 for z/OS database as a source in DMS Schema Conversion
<a name="sc-data-providers-db2zos"></a>

You can use IBM Db2 for z/OS databases as a migration source in DMS Schema Conversion.

You can use DMS Schema Conversion to convert database code objects from Db2 for z/OS databases to the following targets: 
+ Amazon RDS for Db2
+ Amazon RDS for PostgreSQL
+ Aurora PostgreSQL

For information about the supported IBM Db2 for z/OS database versions, see [Source data providers for DMS Schema Conversion](https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Introduction.Sources.html#CHAP_Introduction.Sources.SchemaConversion).

## Prerequisites for IBM Db2 for z/OS as a source database
<a name="sc-data-providers-db2zos-prereq"></a>

The IBM Db2 for z/OS version 12 function level 100 database version doesn't support most new capabilities of IBM Db2 for z/OS version 12. This database version provides support only for fallback to Db2 version 11 and data sharing with Db2 version 11. To avoid the conversion of unsupported features of Db2 version 11, we recommend that you use an IBM Db2 for z/OS database with function level 500 or higher as a source for DMS Schema Conversion.

You can use the following code example to check the version of your source IBM Db2 for z/OS database:

```
SELECT GETVARIABLE('SYSIBM.VERSION') as version FROM SYSIBM.SYSDUMMY1;
```

Ensure that this code returns version `DSN12015` or higher.

You can use the following code example to check the value of the `APPLICATION COMPATIBILITY` special register in your source IBM Db2 for z/OS database:

```
SELECT CURRENT APPLICATION COMPATIBILITY as version FROM SYSIBM.SYSDUMMY1;
```

Ensure that this code returns version `V12R1M500` or higher.

## Privileges for IBM Db2 for z/OS as a source database
<a name="sc-data-providers-db2zos-privileges"></a>

The privileges needed to connect to a Db2 for z/OS database and read system catalogs and tables are as follows:

```
SELECT ON SYSIBM.LOCATIONS
SELECT ON SYSIBM.SYSCHECKS
SELECT ON SYSIBM.SYSCOLUMNS
SELECT ON SYSIBM.SYSDATABASE
SELECT ON SYSIBM.SYSDATATYPES
SELECT ON SYSIBM.SYSDUMMY1
SELECT ON SYSIBM.SYSFOREIGNKEYS
SELECT ON SYSIBM.SYSINDEXES
SELECT ON SYSIBM.SYSKEYCOLUSE
SELECT ON SYSIBM.SYSKEYS
SELECT ON SYSIBM.SYSKEYTARGETS
SELECT ON SYSIBM.SYSJAROBJECTS
SELECT ON SYSIBM.SYSPACKAGE
SELECT ON SYSIBM.SYSPARMS
SELECT ON SYSIBM.SYSRELS
SELECT ON SYSIBM.SYSROUTINES
SELECT ON SYSIBM.SYSSEQUENCES
SELECT ON SYSIBM.SYSSEQUENCESDEP
SELECT ON SYSIBM.SYSSYNONYMS
SELECT ON SYSIBM.SYSTABCONST
SELECT ON SYSIBM.SYSTABLES
SELECT ON SYSIBM.SYSTABLESPACE
SELECT ON SYSIBM.SYSTRIGGERS
SELECT ON SYSIBM.SYSVARIABLES
SELECT ON SYSIBM.SYSVIEWS
```
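Each entry in the list above maps to a `GRANT SELECT` statement. For example, for a hypothetical authorization ID `MIN_PRIVS`:

```
GRANT SELECT ON SYSIBM.SYSTABLES TO MIN_PRIVS;
GRANT SELECT ON SYSIBM.SYSVIEWS TO MIN_PRIVS;
```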

# Using a SAP ASE (Sybase ASE) database as a source in AWS DMS Schema Conversion
<a name="dm-data-providers-source-sybase-ASE"></a>

You can use SAP ASE (Sybase ASE) databases as a migration source in DMS Schema Conversion.

You can use DMS Schema Conversion to convert database code objects from SAP ASE (Sybase ASE) databases to the following targets:
+ Aurora PostgreSQL
+ RDS for PostgreSQL

For information about the supported SAP ASE (Sybase ASE) database versions, see [Source data providers for DMS Schema Conversion](CHAP_Introduction.Sources.md#CHAP_Introduction.Sources.SchemaConversion).

## Privileges for SAP ASE (Sybase ASE) as a source database
<a name="dm-data-providers-source-sybase-ASE.privileges"></a>

The following privileges are required when using SAP ASE (Sybase ASE) as a source database:
+ USE master
+ select on dbo.spt\_values
+ select on asehostname

For each database to be migrated, grant the following privileges:
+ USE db\_name *(replace db\_name with the name of the database being migrated)*
+ select on dbo.sysusers
+ select on dbo.sysobjects
+ select on dbo.sysindexes
+ select on dbo.syscolumns
+ select on dbo.sysreferences
+ select on dbo.syscomments
+ select on dbo.syspartitions
+ select on dbo.syspartitionkeys
+ select on dbo.sysconstraints
+ select on dbo.systypes
+ select on dbo.sysqueryplans
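
As a sketch, the server-level grants might look like the following. The user name `min_privs` is a hypothetical placeholder; replace it with your own user.

```
use master
go
grant select on dbo.spt_values to min_privs
go
grant select on asehostname to min_privs
go
```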