Written by Wissem EL-Khlifi
Before diving into database operations in the cloud, it is important to understand the concepts and the differences between a database as a service offered in the cloud industry (especially by Amazon) and a fully managed database.
Amazon's Oracle database as a service (Oracle RDS) option offers Oracle databases as managed services. These managed services free the Oracle DBA and DevOps teams from time-consuming database administration tasks, including provisioning, backups, software patching, monitoring, and hardware scaling. The goal is to reduce maintenance, acquisition, and operations costs and to focus on customer service and service quality. Let's see what features Amazon Oracle RDS offers its customers.
By default, Oracle RDS enables the following features:
At the time of writing this article, Oracle RDS supports the following:
As a database administrator, you may ask how your role is affected in an RDS environment.
DevOps is an approach that brings development and operations together into a single profile. The goal is to assure the quality of the delivered stack by avoiding unnecessary effort on operational tasks that can be automated and fully managed by Oracle RDS.
Read more about IaaC here: http://www.toadworld.com/platforms/oracle/w/wiki/11715.iaac-for-devops-infrastructure-automation-using-aws-cloudformation
Since the service is maintained by Amazon (AWS), DevOps engineers are not given all of the configuration settings and operations they are used to having in a fully managed Oracle database. For example, they do not have system access to the Oracle RDS database server. As an Oracle DBA, operating in an Oracle RDS environment leaves you with very few options and tools. It is true that an Oracle RDS instance is, in the end, a normal Oracle database, but some operations have to be done differently. For example, you won't use the DBCA tool to create an Oracle RDS instance; instead, as a DBA or DevOps engineer, you can use the AWS console or the AWS command line tool (AWS CLI). In the following sections of the article, we are going to show you how to move data between two RDS instances, given that we don't have access to the database servers.
Because there is no system access to the database server in an Oracle RDS environment, we have two options for using the Oracle Data Pump tool to move data and schemas between two different Oracle RDS instances.
In this section, we will use the DBMS_DATAPUMP package.
The first step involves creating an export procedure on the SOURCEDB RDS database. You can use Toad for Oracle to create the procedure DB_PROC_EXPDP_SCHEMA.
CREATE OR REPLACE PROCEDURE appadmin.DB_PROC_EXPDP_SCHEMA
IS
  V_H1          NUMBER;                -- Data Pump job handle
  V_DIR_NAME    VARCHAR2(30);
  V_SCHEMA_NAME VARCHAR2(30) := 'WIND';
BEGIN
  V_DIR_NAME := 'DATA_PUMP_DIR';
  V_H1 := DBMS_DATAPUMP.OPEN(OPERATION => 'EXPORT',
                             JOB_MODE  => 'SCHEMA',
                             VERSION   => 'COMPATIBLE');

  -- Dump file (REUSEFILE => 1 behaves like REUSE_DUMPFILES=Y)
  DBMS_DATAPUMP.ADD_FILE(HANDLE    => V_H1,
                         FILENAME  => 'exp_' || V_SCHEMA_NAME || '.dmp',
                         DIRECTORY => V_DIR_NAME,
                         FILETYPE  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE,
                         REUSEFILE => 1);

  -- Log file
  DBMS_DATAPUMP.ADD_FILE(HANDLE    => V_H1,
                         FILENAME  => 'exp_' || V_SCHEMA_NAME || '.log',
                         DIRECTORY => V_DIR_NAME,
                         FILETYPE  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Filter for schema
  DBMS_DATAPUMP.METADATA_FILTER(HANDLE => V_H1,
                                NAME   => 'SCHEMA_LIST',
                                VALUE  => '''' || V_SCHEMA_NAME || '''');

  DBMS_DATAPUMP.START_JOB(HANDLE => V_H1);
  DBMS_DATAPUMP.DETACH(HANDLE => V_H1);
EXCEPTION
  WHEN OTHERS THEN
    DBMS_OUTPUT.PUT_LINE('Error DB_PROC_EXPDP_SCHEMA:::' || SQLERRM);
END;
/
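With the procedure in place, you can run the export and, if you wish, watch the job while it is running. A minimal sketch, assuming the procedure compiled successfully under the APPADMIN account:

```sql
-- Run the export job on SOURCEDB
EXEC appadmin.DB_PROC_EXPDP_SCHEMA;

-- Optionally monitor the Data Pump job while it runs
SELECT owner_name, job_name, state
  FROM dba_datapump_jobs
 WHERE state != 'NOT RUNNING';
```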
Let’s explain the procedures and functions invoked from this PL/SQL procedure:
- KU$_FILE_TYPE_DUMP_FILE (dump file for a job)
- KU$_FILE_TYPE_LOG_FILE (log file for a job)
- KU$_FILE_TYPE_SQL_FILE (output for SQL_FILE job)
After creating and executing the DB_PROC_EXPDP_SCHEMA procedure on the SOURCEDB RDS database, we need a way to access and read the export log file and check for any errors in the export execution.
We can create an Oracle external table to read the export log file, and later query that external table for any errors in the export. Let’s create the external table:
CREATE TABLE appadmin.exp_wind_ext (message VARCHAR2(4000))
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY DATA_PUMP_DIR
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '[#EOC#]'
    MISSING FIELD VALUES ARE NULL
    ( message CHAR(4000) )
  )
  LOCATION ('exp_WIND.log')
)
PARALLEL 5
REJECT LIMIT UNLIMITED;
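Once the external table exists, checking the export for problems is a simple query. A minimal sketch, assuming the export job has finished and written its log:

```sql
-- List any error lines from the export log
SELECT message
  FROM appadmin.exp_wind_ext
 WHERE message LIKE '%ORA-%'
    OR UPPER(message) LIKE '%ERROR%';
```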
Network connectivity between the two Oracle RDS instances is required to move the export file from SOURCEDB to TARGETDB. On SOURCEDB, create the following database link:
create database link DBLINK_TO_TARGETDB connect to appadmin identified by xxx using '//xxxx.us-east-1.rds.amazonaws.com:1521/TARGETDB';
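Before transferring the dump file, it is worth a quick sanity check that the link resolves and the credentials work. This sketch assumes the TARGETDB security group allows inbound traffic from SOURCEDB on port 1521:

```sql
-- Should return one row ('X') from the remote instance
SELECT * FROM dual@DBLINK_TO_TARGETDB;
```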
Run the following PL/SQL block to transfer the dump file between the two RDS instances:
SQL> set serveroutput on
SQL>
SQL> BEGIN
  DBMS_FILE_TRANSFER.PUT_FILE(
    source_directory_object      => 'DATA_PUMP_DIR',
    source_file_name             => 'exp_WIND.dmp',
    destination_directory_object => 'DATA_PUMP_DIR',
    destination_file_name        => 'exp_WIND.dmp',
    destination_database         => 'DBLINK_TO_TARGETDB');
EXCEPTION
  WHEN OTHERS THEN
    DBMS_OUTPUT.PUT_LINE('Error::: ' || SQLERRM);
END;
/

PL/SQL procedure successfully completed
SQL>
Before running the Data Pump import on the TARGETDB RDS instance, make sure no sessions are connected as the WIND user; if there are, kill those sessions first.
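On RDS you cannot run ALTER SYSTEM KILL SESSION directly; Amazon provides the rdsadmin.rdsadmin_util.kill procedure instead. A minimal sketch (the SID and serial# values below are hypothetical and come from v$session):

```sql
-- Find sessions connected as WIND
SELECT sid, serial# FROM v$session WHERE username = 'WIND';

-- Kill each one through the RDS admin package
BEGIN
  rdsadmin.rdsadmin_util.kill(sid => 123, serial => 456);  -- hypothetical values
END;
/
```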
You can use Toad for Oracle to create the following import procedure, DB_PROC_IMPDP_SCHEMA.
CREATE OR REPLACE PROCEDURE APPADMIN.DB_PROC_IMPDP_SCHEMA
IS
  V_H1            NUMBER;             -- Data Pump job handle
  V_JOB_NOT_EXIST EXCEPTION;
  PRAGMA EXCEPTION_INIT(V_JOB_NOT_EXIST, -31626);
  V_SCHEMA_NAME   VARCHAR2(30) := 'WIND';
  V_DUMPFILENAME  VARCHAR2(30);
BEGIN
  V_DUMPFILENAME := 'exp_' || V_SCHEMA_NAME || '.dmp';
  V_H1 := DBMS_DATAPUMP.OPEN(OPERATION => 'IMPORT',
                             JOB_MODE  => 'SCHEMA',
                             JOB_NAME  => NULL);

  -- Replace tables that already exist in the target schema
  DBMS_DATAPUMP.SET_PARAMETER(V_H1, 'TABLE_EXISTS_ACTION', 'REPLACE');

  DBMS_DATAPUMP.ADD_FILE(V_H1, V_DUMPFILENAME, 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.ADD_FILE(V_H1, 'imp_' || V_SCHEMA_NAME || '.log',
                         'DATA_PUMP_DIR', NULL,
                         DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  DBMS_DATAPUMP.START_JOB(HANDLE => V_H1);
  DBMS_DATAPUMP.DETACH(HANDLE => V_H1);
EXCEPTION
  WHEN OTHERS THEN
    DBMS_OUTPUT.PUT_LINE('Error DB_PROC_IMPDP_SCHEMA:::' || SQLERRM);
END;
/
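Once the dump file has been transferred, run the import on TARGETDB the same way as the export. A minimal sketch, assuming the procedure compiled under APPADMIN:

```sql
-- Run the import job on TARGETDB
EXEC appadmin.DB_PROC_IMPDP_SCHEMA;
```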
Compared to the export procedure, the import procedure calls an additional procedure, DBMS_DATAPUMP.SET_PARAMETER. In the SET_PARAMETER call we specified TABLE_EXISTS_ACTION => 'REPLACE', so tables that already exist in the WIND schema are replaced.
We can create an Oracle external table to read the import log file. Let’s create the external table:
CREATE TABLE APPADMIN.imp_wind_ext (message VARCHAR2(4000))
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY DATA_PUMP_DIR
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '[#EOC#]'
    MISSING FIELD VALUES ARE NULL
    ( message CHAR(4000) )
  )
  LOCATION ('imp_WIND.log')
)
PARALLEL 5
REJECT LIMIT UNLIMITED;
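As with the export, the import log can then be checked with a simple query. A sketch, assuming the import job has finished:

```sql
-- Surface any errors, and the final job status line, from the import log
SELECT message
  FROM APPADMIN.imp_wind_ext
 WHERE message LIKE '%ORA-%'
    OR message LIKE '%successfully completed%';
```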
In this article, we have introduced Oracle RDS features and operations. We have seen that in an RDS environment, DBAs and DevOps engineers have to choose different methods for database operations such as database creation and replication: the DBCA and Data Pump command-line executables are no longer available. With no access to the database servers, our options for using Data Pump are limited, and we have walked through the steps of moving database schemas between different RDS instances.