Migrate from Oracle to SQL Server 2016

There has been plenty of activity and excitement recently concerning Microsoft’s aggressive move to capture a large portion of Oracle’s market share.  There are always pros and cons to any situation, so I’m going to illustrate both, although as a Microsoft partner, we see more pros than cons.

We’re going to dive into five major questions that anyone considering migrating needs to answer specific to their own organizational culture and business needs.

  1. WHY move from Oracle to SQL Server 2016?
  2. How will it empower our business?
  3. What are the business, financial and migration impacts for us?
  4. How do we assess and mitigate risk when considering migration?
  5. What assistance is available to us to make this move?

So the biggest, and obviously most important, question here is: WHY should a company move away from Oracle to SQL Server 2016?

The fact is that Oracle requires a significant investment. Anyone who has adopted this technology also realizes how large a footprint it probably has throughout their technology stack.  Nobody purchases Oracle licensing for small, or even medium, businesses.  This is, and always has been, a very large-scale solution.

Pragmatic Works has completed several conversions from Oracle to SQL Server, and the one thing we constantly hear is, “We have a lot of money, as well as processes, tied to this product. Why would we want to change?”

Let’s dive into that, but first let’s cover some Economics 101.  I received my undergraduate education in computer science, so when I began my MBA, most of these concepts were new to me.  The most important aspect of any technology investment is the Return on Investment, or ROI. EVERY single senior financial officer is going to complete this analysis and come up with concrete numbers to justify the investment.  It is calculated as:

ROI = (Gain from Investment – Cost of Investment) / Cost of Investment

So if the gain from the investment exceeds the cost of the investment by a decent margin (which is why we divide by the cost), then it’s considered a solid investment, worth the time and money, and things move forward.
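To make the math concrete, here is the formula with purely hypothetical numbers, not drawn from any real migration:

```sql
-- Hypothetical figures for illustration only
DECLARE @GainFromInvestment money = 500000;
DECLARE @CostOfInvestment   money = 200000;

SELECT (@GainFromInvestment - @CostOfInvestment) / @CostOfInvestment AS ROI;
-- A result of 1.5 means a 150% return, well worth moving forward on
```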

The Cost of Investment can be defined as the physical cost of the SQL Server hardware and software, plus the cost of migrating all objects to SQL Server. The physical cost is clear and definable. The migration cost is highly variable, depends on the current implementation, and will be discussed in more depth shortly (hint: Microsoft will help foot the bill).

So let’s dig into the costs we know.

SQL Server vs. Oracle Cost

Comparing SQL Server Enterprise’s total price of $27,496 to Oracle’s $346,000 reveals a HUGE difference.  The reason is quite simple.  SQL Server 2016 is being labeled as “all inclusive”: a buffet that includes steak, potatoes, a drink, dessert, etc.  Oracle sells individual products “a la carte,” which would be akin to ordering a steak, then ordering potatoes, then a drink, then a dessert, and so on.

Let’s look a little closer at those individual costs to make sure they add up to what is truly advertised.

SQL Server vs. Oracle Database (not all Oracle options shown)

Enterprise Base License (includes 1 year of support)
  • SQL Server: $27,496 (everything below included)
  • Oracle: (base license price not shown)

Data Available
  • SQL Server: Included
  • Oracle: $11,500 (Active Data Guard) × 2 nodes = $23,000 total

Performance and Scale
  • SQL Server: Included
  • Oracle: $11,500 (Advanced Compression), $23,000 (Database In-Memory), $11,500 (Partitioning), $11,500 (Real Application Testing), $7,500 (Diagnostics Pack), $5,000 (Tuning Pack) = $140,000 total

Enterprise Security
  • SQL Server: Included (free download)
  • Oracle: $15,000 (Advanced Security), $11,500 (Label Security) = $35,000 total

Any Data, Built In
  • SQL Server: Included
  • Oracle: $17,500 (Spatial & Graph) = $35,000 total

Total Cost
  • SQL Server: $27,496
  • Oracle: $346,000


What does this all mean?  Going back to Economics 101….


In economics and business decision-making, a sunk cost is a cost that has already been incurred and cannot be recovered. Sunk costs (also known as retrospective costs) are sometimes contrasted with prospective costs, which are future costs that may be incurred or changed if an action is taken.

For a great example of sunk costs, check out the U.S. Department of Defense’s troubled F-35 program.  Continuing to spend because of sunk costs is another way of saying, “Throwing good money after bad.”

The point here is that it doesn’t matter what has already been spent on Oracle licensing and infrastructure; what matters is whether the technology will satisfy the business in every aspect going forward: financial, business, technical, and so on.

In microeconomics, economies of scale are the cost advantages obtained due to size, output, or scale of operation, with cost per unit of output generally decreasing with increasing scale as fixed costs are spread out over more units of output.  So it’s the cost of the product decreasing as the quantity increases.

Economies of scope is an economic theory stating that the average total cost of production decreases as a result of increasing the number of different goods produced.  The concept here is that Microsoft is not investing all of their resources in just ONE product, they have the same resources contributing to several products, thus reducing the overall cost for the collective unit.  This all results in the following:

  • Extreme flexibility in product design and product mix
  • Rapid responses to changes in market demand, product design and mix, output rates, and equipment scheduling
  • Greater control, accuracy, and repeatability of processes
  • Reduced costs from less waste and lower training and changeover costs
  • More predictability (e.g., maintenance costs)
  • Less risk
  • ….and more

The Core of Everything Is: Am I Going to Get My Gain from Investment?

Yes, you will, by empowering the business through data.

Gain from the Investment by Providing for the Decision Makers

  • Robust Data Infrastructure
  • Powerful Integration
  • Analytics Tools
  • Free and Simple Access to Information
  • Intuitive Interface

Providing for Decision Makers

So There are Lots of Goodies, but Does the Technology Perform?

Yes, it does!  But don’t take it from me; let’s check out what Gartner, Inc., the leading authority on technology assessment and a completely unbiased company, has to say.

Breaking it down by subject, here’s what they say about Advanced Analytics Platforms and BI:

BI and Analytics Leader

How about database performance?  Does it perform?  Gotcha covered….

Database Leader

Okay, but at SOME point, you may want to move to the cloud.  Gotcha covered….

Cloud Leader

  • Ask 10 Oracle DBAs the top reason to use Oracle and 9 times out of 10, you’ll hear “It has better security!”  Wrong….
  • The tenth may say, “It performs better.”  Nope….
  • “What about this whole Mobile BI movement I hear so much about?”  Gotcha covered….

Secure Performing Mobile

  • Want to use R in-database?
  • In-memory across workloads?
  • Consistent experience from on-premises to cloud?

In-Database In-Memory Consistent

But wait, there’s more!!

SQL Server and Linux

Built on SQL Server 2016, SQL Server on Linux gives you the power to build and deploy intelligent applications on a single data management and business analytics platform. The preview is currently available to play around with, but the production version is slated for a mid-2017 release.

Migration Process

This process generally consists of the following five phases, not necessarily executed in strict order and sometimes reiterated.

Analyze and Strategize

  • Project Road map
  • Validate ROI

Architect and Validate

  • Complete High Level Design
  • Justify Investment
  • Assess Migration Technical Issues

Complete Detailed Design

  • Implementation Plan
  • Create POC

Implement

  • Complete Implementation
  • Testing & UAT
  • Production Cut-over

Monitor and Optimize

  • Monitor Results
  • Optimize

Sometimes phases affect other phases and require re-work.  For example, if we began our detailed design but discover technology inadequacies, or the architecture cannot fully satisfy the business, the Architect and Validate step may need to be re-iterated.

Thankfully, there are a couple of very helpful tools that can mitigate risk, as well as speed up the process.

Migration Tools

Microsoft Assessment and Planning (MAP) Toolkit for SQL Server


Using the MAP tool, get a complete network-wide inventory of SQL Server and Oracle instances, edition identification, and hardware assessment.

With MAP Toolkit, utilize the following reports:

  • Discovery and Inventory: Database inventory and reporting of Oracle instances on Windows and Linux based servers
  • Detailed Reporting: Covers specific SQL Server instances, including component name, version, edition, and more. Includes wide-ranging details of databases and server instances that can be used for consolidation. Oracle schema reporting identifies the size and use of each schema, along with estimates of the complexity of migration.
  • Hardware and platform details: Details for computers that are running heterogeneous database instances, including determining whether the machines are virtual or physical.

At the end of this process, you should have the capacity to do each of the following effectively:

  1. Scope for Server & Storage Consolidation, Migrations, and Upgrades
  2. Utilize the Existing Infrastructure
  3. Streamline SQL Server Licensing
  4. Reduce Administrative Costs

SQL Server Migration Assistant (SSMA) for Oracle


According to Microsoft, use the tool to “quickly convert Oracle database schemas to SQL Server schemas, upload the resulting schemas into SQL Server and migrate data from Oracle to SQL Server.”

According to me personally, use the tool to gather all metadata for all Oracle objects to perform deeper analysis.  Utilize Visual Studio for creating and maintaining databases and objects, as well as for ETL development for data population.  Using formal development tools allows for a more iterative, reproducible, quantifiable, and testable implementation.

Use SSMA for Oracle to Discover Objects


The first 3 columns can be discovered by SSMA; however, all Scheduler items must be manually discovered and assessed.

So we’ve found a list of all objects; now we just complete them all at the same time in a nice, linear fashion, right?  Wrong!

Which of These Groups has the Highest Risk?

Programmability Risk

The highest risk item, BY FAR, is programmability.  Creating every other object takes a fraction of the time that re-writing programmable items will require.  I estimate that 75% or more of the object conversion effort will be converting programmability, specifically PL/SQL.

This group includes Packages, Stored Procedures, Functions, and Triggers.

Fortunately, MOST programmable objects perform CRUD (Create, Read, Update, and Delete) operations.

  • If they follow ANSI SQL standards and platform-independent syntax, they should already be using COALESCE, CASE, JOIN, CAST, etc.
  • The majority of the conversion work comes from directly converting PL/SQL to T-SQL

Additional business logic embedded in the PL/SQL increases conversion time substantially, if not exponentially.
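As a sketch of what platform-independent conversion looks like, here is a typical Oracle idiom next to its ANSI-friendly T-SQL equivalent (the table and columns are hypothetical, for illustration only):

```sql
-- Oracle PL/SQL original (hypothetical table and columns):
--   SELECT NVL(commission, 0),
--          DECODE(status, 'A', 'Active', 'Inactive')
--   FROM   sales_rep;

-- ANSI-friendly T-SQL equivalent:
SELECT COALESCE(commission, 0) AS commission,
       CASE status WHEN 'A' THEN 'Active' ELSE 'Inactive' END AS status_desc
FROM   sales_rep;
```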

Complete In-Depth Programmability Assessment

Compile and detail a matrix consisting of the following items:

Programmability Assessment

This can also be modified to a working conversion document by adding columns such as ‘Converted to T-SQL’, ‘Unit Tested’, ‘UAT’, ‘Deployed’, notes for each column, etc.

This should be a deliverable that can be taken at the completion of the assessment and handed to any technology solution company (hopefully Pragmatic Works) for completion. The time and cost will have clear and defined boundaries.

Use ONE Tool to Kick off a Changeover Process, or “flip the switch”, on an Instance-by-Instance Basis

I prefer SSIS, because it does SEVERAL things very well (and I’m kinda good at it), AND it’s a Microsoft BI product.

  1. Execute Script Task (Create Objects)
  2. Data Flows (Migration of Data)
  3. Troubleshoot (Using Pragmatic Works’s BI xPress)
  • Execution Time (at almost any granularity)
  • Error Logging
  • Audit Framework
  • Troubleshooting
  • Notifications for Scheduled Migration Tests

Using SSIS, perform the following tasks programmatically in order (3 steps):
(Following Microsoft Data Warehouse Toolkit Recommendation)

Create (and drop) objects

  • Schemas
  • Tables
  • Synonyms, Sequences, Views
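A minimal sketch of this first step, using hypothetical object names:

```sql
-- Create the target schema and table (hypothetical names, for illustration)
IF SCHEMA_ID('Sales') IS NULL
    EXEC ('CREATE SCHEMA Sales');

IF OBJECT_ID('Sales.Orders', 'U') IS NOT NULL
    DROP TABLE Sales.Orders;

CREATE TABLE Sales.Orders (
    OrderID   int IDENTITY(1,1) NOT NULL,
    OrderDate datetime2         NOT NULL,
    Amount    decimal(18,2)     NOT NULL
);
```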

Populate Data

  • Enable Identity Insert
  • Cast/Convert
  • Lookup Whenever Necessary
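For example, the load step might preserve the Oracle surrogate keys like this (all object names are hypothetical):

```sql
-- Keep the source system's key values during the load
SET IDENTITY_INSERT Sales.Orders ON;

INSERT INTO Sales.Orders (OrderID, OrderDate, Amount)
SELECT CAST(ORDER_ID   AS int),
       CAST(ORDER_DATE AS datetime2),
       CAST(AMOUNT     AS decimal(18,2))
FROM   Staging.OracleOrders;   -- hypothetical staging table of Oracle data

SET IDENTITY_INSERT Sales.Orders OFF;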

Create Optimizations/Integrity

  • Partitions
  • Indexes
  • Relationships
  • Constraints
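Sketching the final step with the same hypothetical table:

```sql
-- Apply keys, indexes, and constraints only after the data is loaded
ALTER TABLE Sales.Orders
    ADD CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderID);

CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
    ON Sales.Orders (OrderDate);

ALTER TABLE Sales.Orders
    ADD CONSTRAINT CK_Orders_Amount CHECK (Amount >= 0);
```

Loading the data before creating indexes and constraints generally makes the bulk load substantially faster, which is why the steps run in this order.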

Migration Support

So how can we get a little bit of help with all of this?!  Who can help us?


Why, our buddies at Microsoft can help us, of course!

Easier than ever to switch: simplified migration tools, free training for Oracle DBAs, and starter deployment services help you break free of your current platform. Enjoy flexible deployment options for your applications, whether you maintain on-premises installations or optimize costs and adopt the cloud with Azure, the most trusted cloud in the industry.

….and let’s be honest about all of this: it isn’t free or cheap.

Show me the money

And they have!  Microsoft has put its money where its mouth is, offering to help out with this not just with tools but financially.

Training and services are only available through June 30, 2016, but take advantage of:

  • Free SQL Server licenses (with Software Assurance)
  • Free training – SQL Server Essentials for the Oracle DBA
  • Subsidized deployment services to kick-start migration

Check out the Offer Datasheet for additional details.

We may or may not have convinced you to pick SQL Server 2016 over Oracle, but you have to admit that it’s a rather compelling argument and AT LEAST needs serious consideration.


The SQL Server 2016 Query Store

What is the Query Store?

A SQL Server 2016 (and Azure SQL Database) Feature That Can Be Enabled

  • Provides insight on query plan choice and performance
  • Simplifies performance troubleshooting by quickly finding performance differences caused by query plan changes
  • Automatically captures a history of queries, plans, and runtime statistics, and retains these for review
  • Separates data by time windows to better illustrate database usage patterns and understand when query plan changes happened on the server

Why Use the Query Store?

Performance and Cost

Keeping performance high and costs low


How the Query Store “Works”

How Query Store Works

  1. A SQL statement is executed
  2. The statement is compiled
    1. A compilation message is generated
  3. The statement is executed
    1. An execution message is generated
  4. Both messages are sent to the Query Store
    1. The Plan Store stores the execution plans
    2. The Runtime Stats are collected
    3. The Query Store Schema is updated with all information
      1. The Write-Back is asynchronous, which reduces latency

Does It Work with In-Memory OLTP?


  • SQL Server Query Store allows you to monitor the performance of natively compiled code for workloads running in-memory OLTP
  • Compile and runtime statistics are collected and exposed the same way as for disk-based workloads
  • When you migrate to in-memory OLTP you can continue using Query Store views in SQL Server Management Studio (SSMS) as well as custom scripts you have developed for disk-based workloads before migration
  • This saves your investment in learning Query Store technology and makes it generally usable for troubleshooting all types of workloads

How to Use the Query Store

In Just 3 Easy Steps, You TOO can be a DBA!!

Steps to Use Query Store

  1. Enable the Query Store
  2. Let the Query Store Collect the Data
  3. Pinpoint and fix “problematic” queries

Admittedly, the last step requires a bit more expertise and effort.

Setting up the Query Store

There are primarily 2 ways to accomplish this.

  1. Using the Query Store page in SSMS

In Object Explorer, right-click a database, and then click Properties

Setting up Query Store 1

In the Database Properties dialog box, select the Query Store page

Select related attributes

  • General
    • Operation Mode (Read or Read/Write)
  • Monitoring
    • Data Flush Interval (Manual Entry in Minutes)
    • Statistics Collection (5 mins, 10 mins, 15 mins, 30 mins, 1 hr, 1 day)
  • Query Store Retention
    • Max Size (Manual Entry in MB)
    • Query Store Capture Mode (All or Auto)
    • Size Based Cleanup Mode (Off or Auto)
    • Stale Query Threshold (Manual Entry in Days)

Setting up Query Store 2

  2. By Using T-SQL Statements

Use the ALTER DATABASE statement

Setting up Query Store 3

Query Store Options with T-SQL

Query Store Options with TSQL
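As a sketch, enabling the Query Store with a few of the options above might look like this (the database name and the option values are hypothetical):

```sql
ALTER DATABASE MyDatabase
SET QUERY_STORE = ON
    (OPERATION_MODE              = READ_WRITE,
     DATA_FLUSH_INTERVAL_SECONDS = 900,
     INTERVAL_LENGTH_MINUTES     = 30,
     MAX_STORAGE_SIZE_MB         = 100,
     QUERY_CAPTURE_MODE          = AUTO,
     SIZE_BASED_CLEANUP_MODE     = AUTO,
     CLEANUP_POLICY              = (STALE_QUERY_THRESHOLD_DAYS = 30));
```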

What Information Does the Query Store Provide?

Everything in the following data model:

Query Store Data Model

Query Store Views

Query Store Best Practices

  • Information and performance are generally a give and take, but there ARE some best practices
  • Every single setting in the Query Store has associated costs and benefits
  • Treat Query Store performance tuning like an agile project with Sprints
    • Sprint 1 – Keep Operation Mode in READ/WRITE mode for information collection
      • Because no reporting is occurring during collection, the write overhead has minimal impact on overall database performance
    • Sprint 2 – Change Operation Mode to READ mode for analysis and actual performance tuning
      • This increases report performance because no writes are occurring. It also avoids “dirty reads” caused by the asynchronous writes


  • Information and performance are generally a give and take, and it ultimately relies on each and every business and its processes
  • Flush the data at specific, business-driven intervals.  For example, if DB performance seems poor around the end of the month due to financial reporting, consider using the Agile methodology to complete one iteration, then flush the data before the next iteration.
  • Keep the Max Size small enough to keep the reporting relatively snappy, however, keep enough data to capture all of the business needs
  • Keep an accurate Cleanup Policy as the business and reporting solution evolves
  • Collect statistics as frequently as possible to see exactly which execution plans are being executed.  If the statistics are not updated for some time, you may be looking at an old execution plan because the statistics have changed

Query Store Reporting

For Azure SQL DB

Azure Query Store Reporting

Quick Performance Insight

The top queries view opens, and the top CPU-consuming queries are listed

  • The top line shows overall DTU% for the database, while the bars show CPU% consumed by the selected queries during the selected interval
  • The bottom grid represents aggregated information for the visible queries
    • Average CPU per query during observable interval
    • Total duration per query
    • Total number of executions for a particular query

Query Performance Insight

Viewing Individual Query Details

On-Premises Solution

Use SQL Server Management Studio (SSMS)


Query Store Dashboards

Query Store Dashboards

Custom Views and SQL

Leverage the data model

Query Store Data Model
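For example, a custom query like this one (a sketch, not a tuned report) surfaces the slowest queries on average:

```sql
-- Top 10 queries by average duration across collected intervals
SELECT TOP (10)
       q.query_id,
       t.query_sql_text,
       AVG(rs.avg_duration) AS avg_duration_microseconds
FROM   sys.query_store_query         AS q
JOIN   sys.query_store_query_text    AS t  ON t.query_text_id = q.query_text_id
JOIN   sys.query_store_plan          AS p  ON p.query_id      = q.query_id
JOIN   sys.query_store_runtime_stats AS rs ON rs.plan_id      = p.plan_id
GROUP BY q.query_id, t.query_sql_text
ORDER BY avg_duration_microseconds DESC;
```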

Query Store Administration with T-SQL

Utilize the Query Store stored procedures

Query Store Stored Procedures
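A few common administrative calls (the IDs are hypothetical; look them up in the Query Store views first):

```sql
EXEC sp_query_store_flush_db;                        -- flush in-memory data to disk
EXEC sp_query_store_reset_exec_stats @plan_id  = 3;  -- clear stats for one plan
EXEC sp_query_store_remove_query     @query_id = 42; -- drop a query, its plans, and stats
```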

Forcing Execution Plans

Old Way

Old Way

New Way

New Way
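With the Query Store, forcing (and later unforcing) a plan is a pair of stored procedure calls; the IDs below are hypothetical and would come from the Query Store views or dashboards:

```sql
-- Force the known-good plan for a regressed query
EXEC sp_query_store_force_plan   @query_id = 42, @plan_id = 3;

-- Remove the forcing later if the data or workload changes
EXEC sp_query_store_unforce_plan @query_id = 42, @plan_id = 3;
```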

Error State

  • Remember to verify intermittently that the Query Store is still collecting data

SELECT actual_state_desc, desired_state_desc, current_storage_size_mb,
       max_storage_size_mb, readonly_reason, interval_length_minutes,
       stale_query_threshold_days, size_based_cleanup_mode_desc,
       query_capture_mode_desc
FROM   sys.database_query_store_options;

If there is a difference between the actual_state_desc and desired_state_desc, this indicates that a change of operation mode occurred automatically.

The most common change is for the Query Store to silently switch to read-only mode. In extremely rare circumstances, the Query Store can end up in the ERROR state because of internal errors.

When the actual state is read-only, use the readonly_reason column to determine the root cause. Typically, the Query Store transitioned to read-only mode because the size quota was exceeded. In that case the readonly_reason is set to 65536.

How to Fix Error State


Fix Error State
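A sketch of the typical remediation when the size quota was exceeded (the database name and new quota are hypothetical):

```sql
-- Raise the quota, optionally purge old data, then re-enable writes
ALTER DATABASE MyDatabase
SET QUERY_STORE (MAX_STORAGE_SIZE_MB = 200);

ALTER DATABASE MyDatabase
SET QUERY_STORE CLEAR;               -- optional: discard collected data

ALTER DATABASE MyDatabase
SET QUERY_STORE (OPERATION_MODE = READ_WRITE);
```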

Live Query Statistics

This is a new feature of SSMS 2016.  Even if your organization chooses not to upgrade its databases to 2016, this is still an option with SQL Server 2014 SP1.

This feature provides the ability to view the live execution plan of an active query. This live query plan provides real-time insights into the query execution process as the controls flow from one query plan operator to another. The live query plan displays the overall query progress and operator-level run-time execution statistics such as the number of rows produced, elapsed time, operator progress, etc. Because this data is available in real time without needing to wait for the query to complete, these execution statistics are extremely useful for debugging query performance issues.

Live Query Statistics