

# Change Python and Perl applications to support database migration from Microsoft SQL Server to Amazon Aurora PostgreSQL-Compatible Edition
<a name="change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition"></a>

*Dwarika Patra and Deepesh Jayaprakash, Amazon Web Services*

## Summary
<a name="change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition-summary"></a>

This pattern describes changes to application repositories that might be required when you migrate databases from Microsoft SQL Server to Amazon Aurora PostgreSQL-Compatible Edition. The pattern assumes that these applications are Python-based or Perl-based, and provides separate instructions for these scripting languages.

Migrating SQL Server databases to Aurora PostgreSQL-Compatible involves schema conversion, database object conversion, data migration, and data loading. Because of the differences between PostgreSQL and SQL Server (relating to data types, connection objects, syntax, and logic), the most difficult migration task involves making the necessary changes to the code base so that it works correctly with PostgreSQL.

For a Python-based application, connection objects and classes are scattered throughout the system. Also, the Python code base might use multiple libraries to connect to the database. If the database connection interface changes, the objects that run the application’s inline queries also require changes.

For a Perl-based application, changes involve connection objects, database connection drivers, static and dynamic inline SQL statements, and how the application handles complex dynamic DML queries and results sets.

When you migrate your application, you can also consider possible enhancements on AWS, such as replacing the FTP server with Amazon Simple Storage Service (Amazon S3) access.

The application migration process involves the following challenges:
+ Connection objects. If connection objects are scattered in the code with multiple libraries and function calls, you might have to find a generalized way to change them to support PostgreSQL.
+ Error or exception handling during record retrieval or updates. If conditional create, read, update, and delete (CRUD) operations on the database return variables, results sets, or data frames, any errors or exceptions can cascade into application failures. Handle these carefully with proper validations and save points. One approach is to run large inline SQL queries or database objects inside `BEGIN...EXCEPTION...END` blocks.
+ Controlling transactions and their validation. This includes manual and automatic commits and rollbacks. The PostgreSQL driver for Perl requires you to always set the auto-commit attribute explicitly.
+ Handling dynamic SQL queries. This requires a strong understanding of the query logic and iterative testing to ensure that queries work as expected.
+ Performance. You should ensure that code changes don’t result in degraded application performance.
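
As an illustration of the save-point approach mentioned above, the following sketch (hypothetical helper and table names, not part of this pattern) wraps an inline SQL statement in a PostgreSQL `DO` block with `BEGIN...EXCEPTION...END`, so a failure is caught inside the database instead of cascading into the application:

```python
def wrap_in_exception_block(inline_sql, on_error_notice="query failed"):
    """Wrap an inline SQL statement in a DO block so errors are
    caught by the EXCEPTION handler instead of aborting the session."""
    return (
        "DO $$\n"
        "BEGIN\n"
        f"    {inline_sql};\n"
        "EXCEPTION WHEN OTHERS THEN\n"
        f"    RAISE NOTICE '{on_error_notice}';\n"
        "END\n"
        "$$;"
    )

# Example: protect a bulk price update behind an exception handler
stmt = wrap_in_exception_block("UPDATE items SET price = price * 1.1")
print(stmt)
```

You would pass the resulting statement to your normal database execution call; the `RAISE NOTICE` branch is only a placeholder for whatever error handling your application needs.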

This pattern explains the conversion process in detail.

## Prerequisites and limitations
<a name="change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition-prereqs"></a>

**Prerequisites**
+ Working knowledge of Python and Perl syntax.
+ Basic skills in SQL Server and PostgreSQL.
+ Understanding of your existing application architecture.
+ Access to your application code, SQL Server database, and PostgreSQL database.
+ Access to Windows or Linux (or other Unix) development environment with credentials for developing, testing, and validating application changes.
+ For a Python-based application, standard Python libraries that your application might require, such as **Pandas** to handle data frames, and **psycopg2** or **SQLAlchemy** for database connections.
+ For a Perl-based application, required Perl packages with dependent libraries or modules. The Comprehensive Perl Archive Network (CPAN) module can support most application requirements.
+ All required dependent customized libraries or modules. 
+ Database credentials for read access to SQL Server and read/write access to Aurora.
+ Access to a PostgreSQL environment, with the required services and users, for validating and debugging application changes.
+ Access to development tools, such as Visual Studio Code, Sublime Text, or **pgAdmin**, during application migration.

**Limitations**
+ Some Python or Perl versions, modules, libraries, and packages aren’t compatible with the cloud environment.
+ Some third-party libraries and frameworks used for SQL Server cannot be replaced to support PostgreSQL migration. 
+ Performance variations might require changes to your application, to inline Transact-SQL (T-SQL) queries, database functions, and stored procedures.
+ PostgreSQL folds unquoted identifiers, such as table names, column names, and other database object names, to lowercase. 
+ Some values, such as UUIDs, are stored in lowercase only. Python and Perl applications must handle these case differences. 
+ Character encoding differences must be handled with the correct data type for the corresponding text columns in the PostgreSQL database.                                
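
To handle the lowercase behavior noted above, one simple approach is to compare UUIDs through a typed value rather than as raw strings. A minimal sketch with hypothetical example values:

```python
import uuid

# A value as it might come back from SQL Server (uppercase)
mssql_value = "123E4567-E89B-12D3-A456-426614174000"
# The same value as PostgreSQL stores it (lowercase)
pg_value = "123e4567-e89b-12d3-a456-426614174000"

# Comparing through the uuid type ignores case differences
assert uuid.UUID(mssql_value) == uuid.UUID(pg_value)

# Or normalize case explicitly when only strings are available
match = mssql_value.lower() == pg_value.lower()
print(match)
```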

**Product versions**
+ Python 3.6 or later (use the version that supports your operating system)
+ Perl 5.8.3 or later (use the version that supports your operating system)
+ Aurora PostgreSQL-Compatible Edition 4.2 or later (see [details](https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraPostgreSQL.Updates.20180305.html#AuroraPostgreSQL.Updates.20180305.42))

## Architecture
<a name="change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition-architecture"></a>

**Source technology stack**
+ Scripting (application programming) language: Python 2.7 or later, or Perl 5.8 
+ Database: Microsoft SQL Server version 13
+ Operating system: Red Hat Enterprise Linux (RHEL) 7 

**Target technology stack**
+ Scripting (application programming) language: Python 3.6 or later, or Perl 5.8 or later 
+ Database: Aurora PostgreSQL-Compatible 4.2
+ Operating system: RHEL 7 

**Migration architecture**

![Migrating a Perl or Python application with SQL Server to Aurora PostgreSQL-Compatible](http://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/images/pattern-img/b64de64a-bd55-4db7-ba7b-0a2557862af1/images/b8fab3e2-ded5-4f58-86bf-3f645252e9fc.png)


## Tools
<a name="change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition-tools"></a>

**AWS services and tools**
+ [Aurora PostgreSQL–Compatible Edition](https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/Aurora.AuroraPostgreSQL.html) is a fully managed, PostgreSQL-compatible, and ACID-compliant relational database engine that combines the speed and reliability of high-end commercial databases with the cost-effectiveness of open-source databases. Aurora PostgreSQL is a drop-in replacement for PostgreSQL and makes it easier and more cost-effective to set up, operate, and scale your new and existing PostgreSQL deployments.
+ [AWS Command Line Interface (AWS CLI) ](https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-welcome.html)is an open-source tool that enables you to interact with AWS services by using commands in your command-line shell.

**Other tools**
+ [Python](https://www.python.org/) and PostgreSQL database connection libraries such as [psycopg2](https://pypi.org/project/psycopg2/) and [SQLAlchemy](https://www.sqlalchemy.org/)
+ [Perl](https://www.perl.org/) and its [DBI modules](https://metacpan.org/pod/DBD::Pg)
+ [PostgreSQL interactive terminal](https://www.postgresql.org/docs/13/app-psql.html) (psql)

## Epics
<a name="change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition-epics"></a>

### Migrate your application repository to PostgreSQL – high-level steps
<a name="migrate-your-application-repository-to-postgresql-ndash-high-level-steps"></a>


| Task | Description | Skills required | 
| --- | --- | --- | 
| Follow these code conversion steps to migrate your application to PostgreSQL. | [See the AWS documentation website for more details](http://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition.html)The following epics provide detailed instructions for some of these conversion tasks for Python and Perl applications. | App developer | 
| Use a checklist for each step of the migration. | Add the following to your checklist for each step of application migration, including the final step:[See the AWS documentation website for more details](http://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition.html) | App developer | 

### Analyze and update your application – Python code base
<a name="analyze-and-update-your-application-ndash-python-code-base"></a>


| Task | Description | Skills required | 
| --- | --- | --- | 
| Analyze your existing Python code base.  | Your analysis should include the following to facilitate the application migration process:[See the AWS documentation website for more details](http://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition.html) | App developer | 
| Convert your database connections to support PostgreSQL.  | Most Python applications use the **pyodbc** library to connect with SQL Server databases as follows.<pre>import pyodbc<br />....<br />try:<br />    conn_string = "Driver=ODBC Driver 17 for SQL Server;UID={};PWD={};Server={};Database={}".format(conn_user, conn_password, conn_server, conn_database)<br />    conn = pyodbc.connect(conn_string)<br />    cur = conn.cursor()<br />    result = cur.execute(query_string)<br />    for row in result:<br />        print(row)<br />except Exception as e:<br />    print(str(e))</pre>Convert the database connection to support PostgreSQL by using **SQLAlchemy** with the **psycopg2** driver as follows.<pre>from sqlalchemy import create_engine<br />....<br />try:<br />    conn_string = "postgresql+psycopg2://{}:{}@{}/{}".format(conn_user, conn_password, conn_server, conn_database)<br />    engine = create_engine(conn_string, connect_args={'options': '-csearch_path=dbo'})<br />    conn = engine.connect()<br />    result = conn.execute(query_string)<br />    for row in result:<br />        print(row)<br />except Exception as e:<br />    print(str(e))</pre> | App developer | 
| Change inline SQL queries to PostgreSQL. | Convert your inline SQL queries to a PostgreSQL-compatible format. For example, the following SQL Server query retrieves a string from a table.<pre>dtype = "type1"<br />stm = "SELECT TOP 1 searchcode FROM TypesTable (NOLOCK) WHERE code='" + str(dtype) + "'"<br /># For the Microsoft SQL Server database connection<br />engine = create_engine('mssql+pyodbc:///?odbc_connect=%s' % urllib.parse.quote_plus(conn_string), connect_args={'connect_timeout': login_timeout})<br />conn = engine.connect()<br />rs = conn.execute(stm)<br />for row in rs:<br />    print(row)</pre>After conversion, the PostgreSQL-compatible inline SQL query looks like the following.<pre>dtype = "type1"<br />stm = "SELECT searchcode FROM TypesTable WHERE code='" + str(dtype) + "' LIMIT 1"<br /># For the PostgreSQL database connection<br />engine = create_engine('postgresql+psycopg2://%s' % conn_string, connect_args={'connect_timeout': login_timeout})<br />conn = engine.connect()<br />rs = conn.execute(stm)<br />for row in rs:<br />    print(row)</pre> | App developer | 
| Handle dynamic SQL queries. | Dynamic SQL can be present in one script or in multiple Python scripts. Earlier examples showed how to use Python's string formatting and concatenation to insert variables when constructing dynamic SQL queries. An alternate approach is to append variables to the query string wherever applicable. In the following example, the query string is constructed on the fly based on the values returned by a function.<pre>query = "SELECT id FROM equity e JOIN issues i ON e.permId = i.permId WHERE e.id "<br />query += get_id_filter(ids) + " AND e.id IS NOT NULL"</pre>These types of dynamic queries are very common during application migration. Follow these steps to handle dynamic queries:[See the AWS documentation website for more details](http://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition.html) | App developer | 
| Handle results sets, variables, and data frames. | For Microsoft SQL Server, you use Python methods such as `fetchone()` or `fetchall()` to retrieve the results set from the database. You can also use `fetchmany(size)` and specify the number of records to return from the results set. To do this, you can use the **pyodbc** connection object as shown in the following example.**pyodbc (Microsoft SQL Server)**<pre>import pyodbc <br />server = 'tcp:myserver.database.windows.net' <br />database = 'exampledb' <br />username = 'exampleusername' <br />password = 'examplepassword' <br />conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER='+server+';DATABASE='+database+';UID='+username+';PWD='+password)<br />cursor = conn.cursor()<br />cursor.execute("SELECT * FROM ITEMS") <br />row = cursor.fetchone() <br />while row: <br />    print(row[0])<br />    row = cursor.fetchone()</pre>In Aurora, to perform similar tasks such as connecting to PostgreSQL and fetching results sets, you can use either **psycopg2** or **SQLAlchemy**. These Python libraries provide the connection module and cursor object to traverse the PostgreSQL database records, as shown in the following example.**psycopg2 (Aurora PostgreSQL-Compatible)**<pre>import psycopg2<br />query = "SELECT * FROM ITEMS;"<br /># Initialize connection variables<br />host = dbname = user = password = port = ""<br />connstring = "host='{host}' dbname='{dbname}' user='{user}' \<br />password='{password}' port='{port}'".format(host=host, dbname=dbname, \<br />user=user, password=password, port=port) <br />conn = psycopg2.connect(connstring)<br />cursor = conn.cursor()<br />cursor.execute(query)<br />column_names = [column[0] for column in cursor.description]<br />print("Column names: ", column_names)<br />for row in cursor:<br />    print("itemid :", row[0])<br />    print("itemdescription :", row[1])<br />    print("itemprice :", row[3])</pre>**SQLAlchemy (Aurora PostgreSQL-Compatible)**<pre>from sqlalchemy import create_engine<br />import pandas as pd<br />conn_string = 'postgresql://core:database@localhost:5432/exampledatabase'<br />engine = create_engine(conn_string)<br />conn = engine.connect()<br />result = conn.execute("SELECT * FROM ITEMS")<br />df = pd.DataFrame(result.fetchall())<br />df.columns = result.keys()<br />print("df=", df)</pre> | App developer | 
| Test your application during and after migration. | Testing the migrated Python application is an ongoing process. Because the migration includes connection object changes (**psycopg2** or **SQLAlchemy**), error handling, new features (data frames), inline SQL changes, bulk copy functionality (`COPY` instead of `bcp`), and similar changes, it must be tested carefully during and after application migration. Check for:[See the AWS documentation website for more details](http://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition.html) | App developer | 
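
The `fetchmany(size)` pattern mentioned above works the same way with **psycopg2** cursors. The following sketch uses the standard library's `sqlite3` module as a stand-in database so the cursor pattern can be shown without a live PostgreSQL server; with **psycopg2**, only the connect call changes:

```python
import sqlite3

# In-memory stand-in; with psycopg2 you would call psycopg2.connect(...)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE items (itemid INTEGER, descr TEXT)")
cur.executemany("INSERT INTO items VALUES (?, ?)",
                [(i, f"item-{i}") for i in range(10)])

cur.execute("SELECT itemid, descr FROM items")
batches = []
while True:
    rows = cur.fetchmany(4)   # pull records in batches of 4
    if not rows:
        break
    batches.append(rows)
print([len(b) for b in batches])   # → [4, 4, 2]
```

Fetching in fixed-size batches keeps memory use bounded when a migrated query returns far more rows than expected.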

### Analyze and update your application – Perl code base
<a name="analyze-and-update-your-application-ndash-perl-code-base"></a>


| Task | Description | Skills required | 
| --- | --- | --- | 
| Analyze your existing Perl code base. | Your analysis should include the following to facilitate the application migration process. You should identify:[See the AWS documentation website for more details](http://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition.html) | App developer | 
| Convert the connections from the Perl application and DBI module to support PostgreSQL. | Perl-based applications generally use the Perl DBI module, which is a standard database access module for the Perl programming language. You can use the same DBI module with different drivers for SQL Server and PostgreSQL. For more information about required Perl modules, installations, and other instructions, see the [DBD::Pg documentation](https://metacpan.org/pod/DBD::Pg). The following example connects to Aurora PostgreSQL-Compatible at `exampletest-aurorapg-database.cluster-sampleclusture.us-east.-rds.amazonaws.com`.<pre>#!/usr/bin/perl<br />use DBI;<br />use strict;<br />my $driver   = "Pg";<br />my $host     = "exampletest-aurorapg-database-sampleclusture.us-east.rds.amazonaws.com";<br />my $port     = 5432;<br />my $dbname   = "exampledb";<br />my $username = "postgres";<br />my $password = "pass123";<br />my $dbh = DBI->connect("dbi:$driver:dbname=$dbname;host=$host;port=$port",<br />      $username,<br />      $password,<br />      {AutoCommit => 0, RaiseError => 1, PrintError => 0}<br />      );</pre> | App developer | 
| Change inline SQL queries to PostgreSQL. | Your application might have inline SQL queries with `SELECT`, `DELETE`, `UPDATE`, and similar statements that include query clauses that PostgreSQL doesn’t support. For example, query keywords such as `TOP` and `NOLOCK` aren’t supported in PostgreSQL. The following examples show how you can handle `TOP`, `NOLOCK`, and Boolean variables.In SQL Server:<pre>$sqlStr = $sqlStr<br />. "WHERE a.student_id in (SELECT TOP $numofRecords c_student_id   \<br />FROM active_student_record b WITH (NOLOCK) \<br />INNER JOIN student_contributor c WITH (NOLOCK) on c.contributor_id = b.c_st)"</pre>For PostgreSQL, convert to:<pre>$sqlStr = $sqlStr<br />. "WHERE a.student_id in (SELECT c_student_id  \<br />FROM active_student_record b INNER JOIN student_contributor c  \<br />on c.contributor_id = b.c_student_contr_id WHERE b_current_1 is true \<br />LIMIT $numofRecords)"</pre> | App developer | 
| Handle dynamic SQL queries and Perl variables. | Dynamic SQL queries are SQL statements that are built at application runtime. These queries are constructed dynamically while the application is running, depending on certain conditions, so the full text of the query isn’t known until runtime. An example is a financial analytics application that analyzes the top 10 shares on a daily basis, and these shares change every day. The SQL tables are created based on top performers, and the values aren’t known until runtime. Let’s say that the inline SQL queries for this example are passed to a wrapper function to get the results set in a variable, and then a variable uses a condition to determine whether the table exists:[See the AWS documentation website for more details](http://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition.html)Here’s an example of variable handling, followed by the SQL Server and PostgreSQL queries for this use case.<pre>my $tableexists = db_read( arg 1, $sql_qry, undef, 'writer');<br />my $table_already_exists = $tableexists->[0]{table_exists};<br />if ($table_already_exists){<br /># do something<br />}<br />else {<br /># do something else<br />}</pre>SQL Server:<pre>my $sql_qry = "SELECT OBJECT_ID('$backendTable', 'U') table_exists";</pre>PostgreSQL:<pre>my $sql_qry = "SELECT TO_REGCLASS('$backendTable') table_exists";</pre>The following example uses a Perl variable in inline SQL, which runs a `SELECT` statement with a `JOIN` to fetch the primary key of the table and the position of the key column.SQL Server:<pre>my $sql_qry = "SELECT column_name, character_maximum_length \<br />FROM INFORMATION_SCHEMA.COLUMNS \<br />WHERE TABLE_SCHEMA='$example_schemaInfo' \<br />AND TABLE_NAME='$example_table' \<br />AND DATA_TYPE IN ('varchar','nvarchar');";</pre>PostgreSQL:<pre>my $sql_qry = "SELECT c1.column_name, c1.ordinal_position \<br />FROM information_schema.key_column_usage AS c1 LEFT \<br />JOIN information_schema.table_constraints AS t1 \<br />ON t1.constraint_name = c1.constraint_name \<br />WHERE t1.table_schema = '$example_schemaInfo' \<br />AND t1.table_name = '$example_table' \<br />AND t1.constraint_type = 'PRIMARY KEY';";</pre> | App developer | 

### Make additional changes to your Perl-based or Python-based application to support PostgreSQL
<a name="make-additional-changes-to-your-perl-based-or-python-based-application-to-support-postgresql"></a>


| Task | Description | Skills required | 
| --- | --- | --- | 
| Convert additional SQL Server constructs to PostgreSQL. | The following changes apply to all applications, regardless of programming language.[See the AWS documentation website for more details](http://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition.html) | App developer | 

### Improve performance
<a name="improve-performance"></a>


| Task | Description | Skills required | 
| --- | --- | --- | 
| Take advantage of AWS services to make performance enhancements. | When you migrate to the AWS Cloud, you can refine your application and database design to take advantage of AWS services. For example, if the queries from your Python application, which is connected to an Aurora PostgreSQL-Compatible database server, take more time than your original Microsoft SQL Server queries, you could consider creating a feed of historical data directly to an Amazon Simple Storage Service (Amazon S3) bucket from the Aurora server, and use Amazon Athena-based SQL queries to generate reports and analytic data queries for your user dashboards. | App developer, Cloud architect | 

## Related resources
<a name="change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition-resources"></a>
+ [Perl](https://www.perl.org/)
+ [Perl DBI Module](https://metacpan.org/pod/DBI)
+ [Python](https://www.python.org/)
+ [psycopg2](https://pypi.org/project/psycopg2/)
+ [SQLAlchemy](https://www.sqlalchemy.org/)
+ [Bulk Copy - PostgreSQL](https://www.postgresql.org/docs/9.2/sql-copy.html)
+ [Bulk Copy - Microsoft SQL Server](https://docs.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver15)
+ [PostgreSQL](https://www.postgresql.org/)
+ [Working with Amazon Aurora PostgreSQL](https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/Aurora.AuroraPostgreSQL.html)

## Additional information
<a name="change-python-and-perl-applications-to-support-database-migration-from-microsoft-sql-server-to-amazon-aurora-postgresql-compatible-edition-additional"></a>

Both Microsoft SQL Server and Aurora PostgreSQL-Compatible are ANSI SQL-compliant. However, you should still be aware of any incompatibilities in syntax, column data types, native database-specific functions, bulk inserts, and case sensitivity when you migrate your Python or Perl application from SQL Server to PostgreSQL.

The following sections provide more information about possible inconsistencies.

**Data type comparison**

Data type changes from SQL Server to PostgreSQL can lead to significant differences in the resulting data that applications operate on. For a comparison of data types, see the table on the [Sqlines website](https://www.sqlines.com/sql-server-to-postgresql).
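
A few of the most common mappings can be captured in a lookup table. The following sketch is a partial, illustrative mapping only; consult the Sqlines comparison linked above for the full list:

```python
# Partial, illustrative mapping of SQL Server types to PostgreSQL types
SQLSERVER_TO_POSTGRESQL = {
    "NVARCHAR": "VARCHAR",
    "DATETIME": "TIMESTAMP",
    "BIT": "BOOLEAN",
    "UNIQUEIDENTIFIER": "UUID",
    "TINYINT": "SMALLINT",
    "MONEY": "NUMERIC(19,4)",
}

def convert_type(mssql_type):
    """Return the PostgreSQL equivalent, defaulting to the input type."""
    return SQLSERVER_TO_POSTGRESQL.get(mssql_type.upper(), mssql_type)

print(convert_type("datetime"))   # → TIMESTAMP
```

A table like this is useful when scanning DDL or `INFORMATION_SCHEMA` output for columns that need conversion, but each mapping should still be validated against your data.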

**Native or built-in SQL functions**

The behavior of some functions differs between SQL Server and PostgreSQL databases. The following table provides a comparison.


| Microsoft SQL Server | Description | PostgreSQL | 
| --- | --- | --- | 
| `CAST`  | Converts a value from one data type to another. | `CAST` or the `::` operator | 
| `GETDATE()` | Returns the current database system date and time, in a `YYYY-MM-DD hh:mm:ss.mmm` format. | `CLOCK_TIMESTAMP` | 
| `DATEADD` | Adds a time/date interval to a date. | `INTERVAL` expression | 
| `CONVERT` | Converts a value to a specific data format. | `TO_CHAR` | 
| `DATEDIFF` | Returns the difference between two dates. | `DATE_PART` | 
| `TOP` | Limits the number of rows in a `SELECT` results set. | `LIMIT/FETCH` | 
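
As a simple aid for the `TOP`-to-`LIMIT` conversion in the table above, a regex-based rewrite can handle basic `SELECT TOP n` statements. This is a naive sketch, not a full SQL parser, so verify each rewritten query:

```python
import re

def top_to_limit(query):
    """Rewrite 'SELECT TOP n ...' as 'SELECT ... LIMIT n' (naive)."""
    m = re.match(r"(?is)^\s*SELECT\s+TOP\s+(\d+)\s+(.*)$", query.strip())
    if not m:
        return query  # no TOP clause; leave the query unchanged
    n, rest = m.groups()
    return f"SELECT {rest.rstrip(';').rstrip()} LIMIT {n}"

print(top_to_limit("SELECT TOP 1 searchcode FROM TypesTable WHERE code='type1'"))
```

A helper like this is most useful for an initial sweep of a code base; queries with subselects or `TOP` in nested scopes still need manual conversion.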

**Anonymous blocks**

A structured SQL query is organized into sections such as declaration, executables, and exception handling. The following table compares the Microsoft SQL Server and PostgreSQL versions of a simple anonymous block. For complex anonymous blocks, we recommend that you call a custom database function within your application.


| Microsoft SQL Server | PostgreSQL | 
| --- | --- | 
| <pre>my $sql_qry1 =<br />my $sql_qry2 =<br />my $sql_qry = "BEGIN TRAN<br />$sql_qry1 $sql_qry2<br />if @\@error != 0 ROLLBACK TRAN<br />else COMMIT TRAN";</pre> | <pre>my $sql_qry1 =<br />my $sql_qry2 =<br />my $sql_qry = "DO \$\$<br />BEGIN<br />$sql_qry1 $sql_qry2<br />END<br />\$\$";</pre> | 

 

**Other differences**
+ **Bulk inserts of rows:** The PostgreSQL equivalent of the [Microsoft SQL Server bcp utility](https://docs.microsoft.com/en-us/sql/tools/bcp-utility?view=sql-server-ver15) is [COPY](https://www.postgresql.org/docs/9.2/sql-copy.html).
+ **Case sensitivity:** Unquoted identifiers are folded to lowercase in PostgreSQL, so you have to convert your SQL Server column names to lowercase (or quote them consistently). This becomes a factor when you extract or compare data, or place column names in results sets or variables. The following example identifies columns where values might be stored in uppercase or lowercase.

```
my $sql_qry = "SELECT $record_id FROM $exampleTable WHERE LOWER($record_name) = \'failed transaction\'"; 
```
+ **Concatenation:** SQL Server uses `+` as an operator for string concatenation, whereas PostgreSQL uses `||`.
+ **Validation:** You should test and validate inline SQL queries and functions before you use them in application code for PostgreSQL.
+ **ORM library inclusion:** You can also consider replacing your existing database connection library with a Python object-relational mapping (ORM) library such as [SQLAlchemy](https://www.sqlalchemy.org/) or [PynamoDB](https://pynamodb.readthedocs.io/en/latest/quickstart.html). An ORM helps you query and manipulate data from a database by using an object-oriented paradigm.
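
The ORM approach above can be sketched with SQLAlchemy's declarative mapping. The model, table, and values here are hypothetical, and an in-memory SQLite database stands in for the real server; for Aurora PostgreSQL-Compatible you would pass a `postgresql+psycopg2://` URL to `create_engine` instead:

```python
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, Session

Base = declarative_base()

class Item(Base):
    __tablename__ = "items"
    itemid = Column(Integer, primary_key=True)
    descr = Column(String)

# SQLite in-memory stand-in; for Aurora use a postgresql+psycopg2:// URL
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Item(itemid=1, descr="widget"))
    session.commit()
    found = session.query(Item).filter(Item.descr == "widget").one()
    found_id = found.itemid
    print(found_id)
```

Because the ORM generates the SQL, dialect differences such as `TOP` versus `LIMIT` are handled by the engine's dialect rather than by hand-edited query strings.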