Database DevOps – UrbanCode Deploy & DBmaestro Integration

In this post I’m going to talk about one of the trending topics of the DevOps journey: Database DevOps.

DevOps has transformed the software release cycle, but it is not easily applied to databases.

A database crash can be exponentially more expensive than a buggy application because database errors take longer to identify and fix.

Still, configuration drift, team conflicts, and manual coding errors can be drastically reduced simply by applying DevOps principles and tools to the database.

Over the last few months I have had the opportunity to work with the DBmaestro DevOps Platform.

DBmaestro’s DevOps Platform (DOP) paves the way for safe implementation of CI/CD for the database. With automatic drift prevention mechanisms, customizable role management, and a complete audit trail, the platform makes unplanned database downtime a thing of the past. The platform combines several key features that make its value greater than the sum of its parts: repeatable release automation, database version control, governance and security modules, and a business activity monitor. The result is complete database oversight from a single source and a vital edge over the competition.

My goal was to explore and find a reliable and efficient way to integrate UrbanCode Deploy and the DBmaestro platform, aiming to include Database DevOps in the CDRA (Continuous Delivery Release Automation) process.

Benefits of UrbanCode Deploy and DBmaestro integration
– Enables a unified process of continuous delivery across all application tiers.
– Enforces organizational policies and automates the critical code review process.
– Segregates roles and responsibilities for regulatory compliance and governance.
– Supports complete audit reports through a managed repository of database objects and static data.
– Reduces development time by eliminating manual releases of database scripts.
– Generates upgrade and rollback scripts, shortening feedback loops.

The scenario of this POC includes:

– DBmaestro Cloud Platform as a Service
– DBmaestro Agent
– UrbanCode Deploy Server (CD server from where all automation is going to be orchestrated)
– UrbanCode Deploy Agent
– Oracle 19 DB Instance (Amazon RDS).

DBmaestro configuration

I was provided with a DBmaestro as a Service instance, a 30-day trial environment, and a document as a hands-on reference. The guide covers prerequisites, installation of the DBmaestro agent, and connecting to the dedicated cloud container.

All communication with the database goes through the DBmaestro agent, installed on an AWS EC2 Windows Server instance.

For my purposes I created two empty DB schemas on my Oracle 19 DB instance, one for a notional SIT environment and another for a notional UAT environment.

Then I logged in to the DBmaestro DOP web console:


and created a Project:

A DBmaestro Project is a group of Environments representing the same developed application.

Each environment in the project represents a different stage (development, testing, or production) in the delivery pipeline. Database changes flow from development environments to production environments.

Project creation is done in three stages using the Add Project wizard:

– Project Settings – specify the Project name, Project group, Source Control Folder, Deploy Mode type, Auto options, Database type, Advanced settings, Additional options.

Deploy Mode Type
Deploy by Version – Releases are based on sequential, cumulative release packages.
Deploy by Task – Releases are based on tasks or tickets and the changes associated with them.

For this project, I picked the Deploy by Version mode:

– Environment Types – select the Project template and Environment types

Hybrid – contains development environments (and can run Build in MS-SQL Server projects), which create scripts according to the changes performed in the database
Static – has only upgrade and validation options; the scripts are created and inserted manually

In my case, the Static type is selected, and the Release Source and UAT environments are added.

Release Source is the mandatory DevOps environment that serves as the basis for Upgrade actions on the QA through Prod environments.

You must first promote the package to the Release Source environment. Packages that have been deployed to the Release Source environment can then be promoted to all higher level environments.

All validations are performed between the environment you want to validate and the Release Source environment. There can be only one Release Source environment per project.

– Assign Schemas – configure the Environments by assigning schemas to them

Starting with the Release Source environment, change the Environment Name to a unique name and select a database to assign to it by clicking “Discover Unmanaged Databases”.

I added the SIT schema created earlier and tested the connection:

then added the UAT schema and completed the project:

UrbanCode Deploy configuration

– Installed a uDeploy Agent on the Windows Server (where the DBmaestro Agent is also installed).

– Installed the DBmaestro automation plugin.

Below are some of the DBmaestro automation commands available within the plugin:

Then I created my application, OT_ORACLE_VER:

Two environments: SIT and UAT

A component: OT-DBPackage

Several Component Processes: DBPackage, DBUpgrade, DBRollback

Application Processes:

and Resource Groups:

For convenience, as the Source Configuration Type for my component, I used File System (Versioned):
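With File System (Versioned), each subdirectory of the configured base path is imported as a new component version named after that folder. As a minimal sketch (the base path and the second version name are hypothetical placeholders), the layout on the agent file system could look like this:

    C:\ucd-artifacts\OT-DBPackage\      <-- base path configured in the source configuration (hypothetical)
    ├── V5.0\                           <-- imported as component version "V5.0"
    │   └── (package contents, shown further below)
    └── V5.1\
        └── ...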

In a real environment you can use Git or Nexus, or push component versions with a CI tool like Jenkins.

Now, let’s have a look at the customized DBmaestro deployment package.

Automated packaging has two parts:

– The directory containing the SQL scripts.
– The package manifest.

The directory name must be the Release Source Environment schema name: OT_SIT_VER in this case.

The directory should include the version scripts, which can be Upgrade or Downgrade scripts; a Downgrade script executes the reverse operations of its Upgrade script.

For example, if an Upgrade script contains “CREATE TABLE X”, the Downgrade script must contain “DROP TABLE X”.
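As an illustration, a minimal Upgrade/Downgrade pair following this rule could look like the sketch below (the table name and columns are hypothetical; only the pattern matters):

    -- ot_data_upgrade_5_0.sql : hypothetical Upgrade script
    CREATE TABLE OT_DISCOUNTS (
        DISCOUNT_ID   NUMBER PRIMARY KEY,
        DISCOUNT_PCT  NUMBER(5,2) NOT NULL
    );

    -- ot_data_downgrade_5_0.sql : matching Downgrade script, reversing the change
    DROP TABLE OT_DISCOUNTS;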

Below is an example of a package version of the OT-DBPackage component:
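For version V5.0, the package contents might be laid out roughly as follows (a sketch only; it assumes the Package.json manifest sits at the package root, next to the scripts directory):

    V5.0\
    ├── OT_SIT_VER\
    │   ├── ot_data_upgrade_5_0.sql
    │   └── ot_data_downgrade_5_0.sql
    └── Package.json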

The Package.json manifest consists of a header section and then an array of script sections in the order in which the scripts should be executed when an environment is upgraded:

{
    "operation": "create",
    "type": "regular",
    "enabled": true,
    "closed": false,
    "scripts": [
        {
            "name": "ot_data_upgrade_5_0.sql",
            "tags": [],
            "scriptType": "Upgrade"
        },
        {
            "name": "ot_data_downgrade_5_0.sql",
            "tags": [],
            "scriptType": "Downgrade"
        }
    ],
    "tags": []
}

Now let’s take a closer look at the component processes, starting with DBPackage.

The DBPackage process:

– Cleans the uDeploy Agent Working Directory.
– Downloads the artifact.
– Creates a zip file ${p:version.name}.zip, where version.name is the version of the package (V5.0 in the example).
– Cleans and copies the file into the package directory (a rough sketch of these two steps follows the list).
– Executes the DBmaestro package automation command.
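As a rough illustration, the zip-and-copy steps could be scripted on the Windows agent as sketched below. This is not the exact process implementation: in the component process these are separate uDeploy steps, the ${p:...} tokens are resolved by uDeploy before the script runs, and the zip layout assumes the manifest sits at the package root next to the scripts directory.

    REM Create the package zip from the scripts directory and the manifest (illustrative only)
    powershell -Command "Compress-Archive -Path 'OT_SIT_VER','Package.json' -DestinationPath '${p:version.name}.zip' -Force"
    REM Clean the DBmaestro package directory, then copy the new package into it
    del /Q "${p:DBmaestroPackageDir}\*.zip"
    copy "${p:version.name}.zip" "${p:DBmaestroPackageDir}\"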

Below is the DBPackage script command:

java -jar ${p:DBmaestroAgentJarPath} -Package -ProjectName ${p:DBmaestroProject} -IgnoreScriptWarnings TRUE -FilePath "${p:DBmaestroPackageDir}\${version.name}.zip" -Server ${p:DBmaestroServer}:8017 -AuthType ${p:DBmaestroAccountType} -UserName ${p:DBmaestroAccount} -Password ${p:DBmaestroPassword}

You can see the different properties used in the above command:

Application Properties:

System Properties:
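For reference, these are roughly the kinds of values behind the properties referenced in the command (everything below is a hypothetical placeholder, not the actual configuration):

    DBmaestroAgentJarPath = C:\DBmaestro\agent\<automation agent jar>
    DBmaestroPackageDir   = C:\DBmaestro\packages
    DBmaestroProject      = <DBmaestro project name>
    DBmaestroServer       = <DBmaestro cloud host>
    DBmaestroAccountType  = <authentication type>
    DBmaestroAccount      = <service account>
    DBmaestroPassword     = ****** (stored as a secured property)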

Please note that I used a shell script and not the “Package” plugin step, because the -FilePath option, which is mandatory when using the “DevOps Platform as a Service”, is not included in that step.

Once the Package process is executed, the package will be available in DBmaestro, ready to be deployed to the Release Source environment:

The Upgrade Component Process is used to deploy the package:

In this case we can use the DBUpgrade plugin step, providing all the properties needed.

Same thing for DBRollback:

Check out the YouTube video to see how the overall process works:

Have a nice and safe day!

Operations Engineer working with Cloud Infrastructure, Containers, Kubernetes, CI/CD pipelines, CDRA processes, GitOps.