Oracle Multi-Directional Data Replication

Thursday May 11th 2006 by Vincent Chan

Part one of this series illustrates the procedures for setting up bi-directional data replication and conflict resolution between two databases using Oracle 10g Streams. This month we look at the steps to add a new source site into our existing bi-directional replicated environment.

In part one of this series, we illustrated the procedures for setting up bi-directional data replication and conflict resolution between two databases using Oracle 10g Streams. In the second half of this series, we provide the steps to add a new source site to our existing bi-directional replicated environment. At the end of the setup, we will have a three-way, multi-directional replicated environment: changes originating at any one database are propagated to the other databases, keeping all sites synchronized.

Adding a New Source Site

An overview of the current Oracle bi-directional Streams environment:

Host Name   Instance Name   Global Database Name   Database Role
---------   -------------   --------------------   -----------------------------
trout3      ladb            ladb.world             Source / Destination database
trout4      sfdb            sfdb.world             Source / Destination database

An overview of the Oracle multi-directional Streams environment:

Host Name   Instance Name   Global Database Name   Database Role
---------   -------------   --------------------   -----------------------------
trout1      dcdb            dcdb.world             Source / Destination database
trout3      ladb            ladb.world             Source / Destination database
trout4      sfdb            sfdb.world             Source / Destination database

Step 1: Configure archive log mode

Archived redo logs are required by the capture process to extract changes. Verify dcdb is in archive log mode and configure your log archive destination or flash recovery area.
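If dcdb is not already running in archive log mode, a minimal sequence along the following lines can be used to verify and enable it (this assumes the instance can be briefly restarted; adjust the recovery destination to your environment, as shown in Step 2):

-- Check the current log mode
select log_mode from v$database;

-- If the database is in NOARCHIVELOG mode, enable archiving
shutdown immediate
startup mount
alter database archivelog;
alter database open;

-- Confirm the new mode and archive destination
archive log list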

Step 2: Modify initialization parameters

Parameter                   dcdb                        Description
--------------------------  --------------------------  -----------------------------------------------
compatible                  10.2.0                      All three databases are running Oracle 10gR2.
global_names                true                        Database link names must match the global
                                                        database name.
job_queue_processes         2                           The maximum number of job queue processes used
                                                        to propagate messages. Set this parameter to at
                                                        least 2.
db_recovery_file_dest_size  10G                         The storage size allotted to database recovery
                                                        files such as archived logs.
db_recovery_file_dest       /u01/app/oracle/flashdest   The location of database recovery files such as
                                                        archived logs.
parallel_max_servers        6                           The maximum number of parallel execution
                                                        processes available to the capture and apply
                                                        processes. You may encounter ORA-01372 or
                                                        ORA-16081 during Streams capture and apply if
                                                        this value is set too low. Set this parameter
                                                        to at least 6.
sga_target                  500M                        Enables Automatic Shared Memory Management
                                                        (ASMM). In Oracle 10gR2, ASMM automatically
                                                        manages streams_pool_size, which provides the
                                                        buffer area for Streams processing. If this
                                                        parameter is not set, manually set
                                                        streams_pool_size to at least 200M; otherwise,
                                                        memory for Streams is allocated from the shared
                                                        pool.
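As a sketch, these parameters might be set on dcdb as follows (assuming an spfile is in use; compatible takes effect only after a restart, and sga_target can be raised dynamically only up to sga_max_size):

alter system set compatible='10.2.0' scope=spfile;
alter system set global_names=true scope=both;
alter system set job_queue_processes=2 scope=both;
alter system set db_recovery_file_dest_size=10G scope=both;
alter system set db_recovery_file_dest='/u01/app/oracle/flashdest' scope=both;
alter system set parallel_max_servers=6 scope=both;
alter system set sga_target=500M scope=both;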

Step 3: Set up tnsnames.ora

Add the TNS entries on trout1, trout3 and trout4.

DCDB.WORLD =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = trout1)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = dcdb)
    )
  )
LADB.WORLD =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = trout3)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = ladb)
    )
  )
SFDB.WORLD =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = trout4)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = sfdb)
    )
  )
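You can verify name resolution and listener connectivity from each host with tnsping, for example:

$> tnsping dcdb.world
$> tnsping ladb.world
$> tnsping sfdb.world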

Step 4: Create tablespaces

Create the tablespaces for the application schema and Streams administrator in dcdb.

create tablespace appsts datafile '/u02/oradata/dcdb/appsts01.dbf' size 100M;
 
create tablespace streamsts datafile '/u02/oradata/dcdb/streamsts01.dbf' size 100M;

Step 5: Create Streams administrator

Create the Streams administrator in dcdb.

create user strmadmin identified by strmadmin default tablespace streamsts temporary tablespace temp;
grant dba,select_catalog_role to strmadmin;
exec dbms_streams_auth.grant_admin_privilege('strmadmin',true);

Step 6: Create application schema

Create the application schema in dcdb.

create user apps identified by apps default tablespace appsts temporary tablespace temp;
grant connect,resource to apps;
grant select on v_$database to apps;

Step 7: Create database link

As the Streams administrator, create the following private database links in each database.

On dcdb

create database link ladb.world connect to strmadmin identified by strmadmin using 'ladb.world';
create database link sfdb.world connect to strmadmin identified by strmadmin using 'sfdb.world';

On ladb

create database link dcdb.world connect to strmadmin identified by strmadmin using 'dcdb.world';

On sfdb

create database link dcdb.world connect to strmadmin identified by strmadmin using 'dcdb.world';

Step 8: Create Streams queues

As the Streams administrator, create the capture and apply queues in dcdb.

begin
dbms_streams_adm.set_up_queue(
  queue_table   => 'apply_dcqtab',
  queue_name    => 'apply_dcq',
  queue_user    => 'strmadmin');
end;
/
begin
dbms_streams_adm.set_up_queue(
  queue_table   => 'capture_dcqtab',
  queue_name    => 'capture_dcq',
  queue_user    => 'strmadmin');
end;
/

Step 9: Configure capture, apply and propagation processes

As the Streams administrator on each source database, execute:

On ladb

begin
 dbms_streams_adm.add_schema_propagation_rules (
  schema_name  => 'apps',
  streams_name  => 'prop_ladb_to_dcdb',
  source_queue_name => 'capture_laq',
  destination_queue_name => 'apply_dcq@dcdb.world',
  include_dml  => true,
  include_ddl  => true,
  source_database => 'ladb.world');
end;
/
begin
 dbms_streams_adm.add_schema_rules (
  schema_name  => 'apps',
  streams_type  => 'apply',
  streams_name  => 'apply_src_dcdb',
  queue_name  => 'apply_laq',
  include_dml  => true,
  include_ddl  => true,
  source_database => 'dcdb.world');
end;
/

On sfdb

begin
 dbms_streams_adm.add_schema_propagation_rules (
  schema_name  => 'apps',
  streams_name  => 'prop_sfdb_to_dcdb',
  source_queue_name => 'capture_sfq',
  destination_queue_name => 'apply_dcq@dcdb.world',
  include_dml  => true,
  include_ddl  => true,
  source_database => 'sfdb.world');
end;
/
begin
 dbms_streams_adm.add_schema_rules (
  schema_name  => 'apps',
  streams_type  => 'apply',
  streams_name  => 'apply_src_dcdb',
  queue_name  => 'apply_sfq',
  include_dml  => true,
  include_ddl  => true,
  source_database => 'dcdb.world');
end;
/

On dcdb

begin
 dbms_streams_adm.add_schema_rules (
  schema_name  => 'apps',
  streams_type  => 'capture',
  streams_name  => 'capture_dcstrm',
  queue_name  => 'capture_dcq',
  include_dml  => true,
  include_ddl  => true,
  inclusion_rule => true);
end;
/
begin
 dbms_streams_adm.add_schema_propagation_rules (
  schema_name  => 'apps',
  streams_name  => 'prop_dcdb_to_ladb',
  source_queue_name => 'capture_dcq',
  destination_queue_name => 'apply_laq@ladb.world',
  include_dml  => true,
  include_ddl  => true,
  source_database => 'dcdb.world');
end;
/
begin
 dbms_streams_adm.add_schema_propagation_rules (
  schema_name  => 'apps',
  streams_name  => 'prop_dcdb_to_sfdb',
  source_queue_name => 'capture_dcq',
  destination_queue_name => 'apply_sfq@sfdb.world',
  include_dml  => true,
  include_ddl  => true,
  source_database => 'dcdb.world');
end;
/
begin
 dbms_streams_adm.add_schema_rules (
  schema_name  => 'apps',
  streams_type  => 'apply',
  streams_name  => 'apply_src_ladb',
  queue_name  => 'apply_dcq',
  include_dml  => true,
  include_ddl  => true,
  source_database => 'ladb.world');
end;
/
begin
 dbms_streams_adm.add_schema_rules (
  schema_name  => 'apps',
  streams_type  => 'apply',
  streams_name  => 'apply_src_sfdb',
  queue_name  => 'apply_dcq',
  include_dml  => true,
  include_ddl  => true,
  source_database => 'sfdb.world');
end;
/

Step 10: Prepare schema instantiation

As the Streams administrator, prepare all current and future objects in the apps schema for instantiation at ladb and sfdb. The lowest SCN at which each object in the apps schema can be instantiated is recorded.

On ladb

exec dbms_capture_adm.prepare_schema_instantiation(schema_name =>'apps');

On sfdb

exec dbms_capture_adm.prepare_schema_instantiation(schema_name =>'apps');

Step 11: Stop the apply process at ladb

Before exporting the apps schema objects from ladb, stop applying the LCRs from sfdb. While the apply process is stopped, changes propagated from sfdb are queued at ladb; these LCRs are applied to the apps schema when the apply process is re-enabled later in the procedure.

exec dbms_apply_adm.stop_apply('apply_src_sfdb');

Step 12: Export application schema on ladb

On trout3, create or use an existing directory on the filesystem to store the export dump file.

$> mkdir -p /u01/app/oracle/admin/ladb/export

As the sys user on ladb, create an Oracle directory:

create directory expdir as '/u01/app/oracle/admin/ladb/export';

Obtain the current SCN on ladb:

select dbms_flashback.get_system_change_number() from dual;

Export the application schema using the current SCN obtained above:

$> expdp system/system  schemas=apps  directory=expdir \
    logfile=expapps_dcdb.log dumpfile=apps_for_dcdb.dmp FLASHBACK_SCN=<current SCN>

Step 13: Import application schema on dcdb

On trout1, create or use an existing directory on the filesystem to store the export dump file from ladb. Transfer the dump file from trout3 to this directory.

$> mkdir -p /u01/app/oracle/admin/dcdb/import

As the sys user on dcdb, create an Oracle directory:

create directory impdir as '/u01/app/oracle/admin/dcdb/import';

Import the application schema from ladb:

impdp system/system directory=impdir dumpfile=apps_for_dcdb.dmp logfile=impapps.log

The supplemental log groups are imported together with the rest of the application schema’s objects.

SQL> connect apps/apps@dcdb.world
SQL> select log_group_name, table_name
  2  from dba_log_groups
  3  where owner='APPS';
LOG_GROUP_NAME                 TABLE_NAME
------------------------------ ------------------------------
LOGGRP_EMP                     EMP
LOGGRP_DEPT                    DEPT

Step 14: Configure conflict resolution on dcdb

Set up an update conflict handler that resolves conflicts at dcdb by overwriting the local row with the values arriving from ladb or sfdb. The overwrite handler is invoked when an update conflict occurs on at least one of the columns in the column_list.

connect strmadmin/strmadmin@dcdb.world
declare
 field  dbms_utility.name_array;
begin
 field(1) := 'ename';
 field(2) := 'site';
 field(3) := 'job';
 field(4) := 'deptno';
 
   dbms_apply_adm.set_update_conflict_handler(
      object_name            =>  'apps.emp',
      method_name           =>  'overwrite',
      resolution_column      =>  'site',  
      column_list            =>  field);
end;
/
declare
 field  dbms_utility.name_array;
begin
 field(1) := 'dname';
 field(2) := 'site';
 
   dbms_apply_adm.set_update_conflict_handler(
      object_name            =>  'apps.dept',
      method_name            =>  'overwrite',
      resolution_column      =>  'site',  
      column_list            =>  field);
end;
/

Step 15: Set schema instantiation SCN for sfdb at dcdb

Obtain the applied_message_number of the last change from sfdb that was applied at ladb. This number is used to set the instantiation SCN for sfdb at dcdb; changes from sfdb with message numbers greater than this value are propagated to and applied at dcdb.

connect strmadmin/strmadmin@dcdb.world
declare
 v_scn  number;
begin
 select applied_message_number into v_scn from dba_apply_progress@ladb.world where apply_name='APPLY_SRC_SFDB';
 dbms_apply_adm.set_schema_instantiation_scn (
 source_schema_name  => 'apps',
 source_database_name => 'sfdb.world',
 instantiation_scn => v_scn,
 recursive  => true);    
end;
/

Step 16: Set schema instantiation SCN for dcdb at ladb and sfdb

connect strmadmin/strmadmin@dcdb.world
declare
 v_scn number;
begin
 v_scn := dbms_flashback.get_system_change_number();
 dbms_apply_adm.set_schema_instantiation_scn@ladb.world (
 source_schema_name  => 'apps',
 source_database_name => 'dcdb.world',
 instantiation_scn => v_scn,
 recursive             => true);    
 dbms_apply_adm.set_schema_instantiation_scn@sfdb.world (
 source_schema_name  => 'apps',
 source_database_name => 'dcdb.world',
 instantiation_scn => v_scn,
 recursive             => true);    
end;
/

Step 17: Start capture and apply processes on dcdb

Setting the disable_on_error parameter to ‘N’ allows the apply process to continue applying row LCRs even when it encounters errors. The default value is ‘Y’, which disables the apply process automatically on the first error encountered.

connect strmadmin/strmadmin@dcdb.world
begin
 dbms_apply_adm.set_parameter (
  apply_name => 'apply_src_ladb',
  parameter => 'disable_on_error',
  value => 'N');
end;
/
begin
 dbms_apply_adm.start_apply (
  apply_name => 'apply_src_ladb');
end;
/
begin
 dbms_apply_adm.set_parameter (
  apply_name => 'apply_src_sfdb',
  parameter => 'disable_on_error',
  value => 'N');
end;
/
begin
 dbms_apply_adm.start_apply (
  apply_name => 'apply_src_sfdb');
end;
/
begin
 dbms_capture_adm.start_capture (
  capture_name => 'capture_dcstrm');
end;
/

Step 18: Start apply processes on ladb

Restart the apply_src_sfdb process that was stopped in Step 11 so that the LCRs queued from sfdb are applied, then enable and start the new apply_src_dcdb process.

connect strmadmin/strmadmin@ladb.world
begin
 dbms_apply_adm.start_apply (
  apply_name => 'apply_src_sfdb');
end;
/

begin
 dbms_apply_adm.set_parameter (
  apply_name => 'apply_src_dcdb',
  parameter => 'disable_on_error',
  value => 'N');
end;
/
begin
 dbms_apply_adm.start_apply (
  apply_name => 'apply_src_dcdb');
end;
/

Step 19: Start apply process on sfdb

connect strmadmin/strmadmin@sfdb.world
begin
 dbms_apply_adm.set_parameter (
  apply_name => 'apply_src_dcdb',
  parameter => 'disable_on_error',
  value => 'N');
end;
/
begin
 dbms_apply_adm.start_apply (
  apply_name => 'apply_src_dcdb');
end;
/
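Before running the test cases below, it may help to confirm that the Streams processes are enabled at each site. A quick check as the Streams administrator might look like the following (dba_capture and dba_apply report the process status; in 10gR2, dba_propagation also carries a status column):

select capture_name, status from dba_capture;
select apply_name, status from dba_apply;
select propagation_name, status from dba_propagation;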

Step 20: It’s time to test drive …

We will replicate DMLs and DDLs and test the overwrite conflict handlers on dcdb by simulating update conflicts.

SQL> connect apps/apps@ladb.world
SQL> select * from emp;
     EMPNO ENAME                SITE       JOB            DEPTNO D
---------- -------------------- ---------- ---------- ---------- -
         1 MIKE                 LADB       CLERK               1 N
         6 JANE                 LADB       ANALYST             1 N
        11 ZACH                 LADB       ENGINEER           11 N
SQL> select * from dept;
    DEPTNO DNAME                SITE
---------- -------------------- ----------
         1 FINANCE              LADB
         6 HR                   LADB
        11 RESEARCH             LADB
SQL> connect apps/apps@sfdb.world
SQL> select * from emp;
     EMPNO ENAME                SITE       JOB            DEPTNO D
---------- -------------------- ---------- ---------- ---------- -
         1 MIKE                 LADB       CLERK               1 N
         6 JANE                 LADB       ANALYST             1 N
        11 ZACH                 LADB       ENGINEER           11 N
SQL> select * from dept;
    DEPTNO DNAME                SITE
---------- -------------------- ----------
         1 FINANCE              LADB
         6 HR                   LADB
        11 RESEARCH             LADB
SQL> connect apps/apps@dcdb.world
SQL> select * from emp;
     EMPNO ENAME                SITE       JOB            DEPTNO D
---------- -------------------- ---------- ---------- ---------- -
         1 MIKE                 LADB       CLERK               1 N
         6 JANE                 LADB       ANALYST             1 N
        11 ZACH                 LADB       ENGINEER           11 N
SQL> select * from dept;
    DEPTNO DNAME                SITE
---------- -------------------- ----------
         1 FINANCE              LADB
         6 HR                   LADB
        11 RESEARCH             LADB

Test DDL

SQL> connect apps/apps@dcdb.world
SQL> create sequence deptno_dcseq start with 3 increment by 5;  
Sequence created.
SQL> connect apps/apps@ladb.world
SQL> select sequence_name from user_sequences where sequence_name='DEPTNO_DCSEQ';
SEQUENCE_NAME
------------------------------
DEPTNO_DCSEQ
SQL> connect apps/apps@sfdb.world
SQL> select sequence_name from user_sequences where sequence_name='DEPTNO_DCSEQ';
SEQUENCE_NAME
------------------------------
DEPTNO_DCSEQ

Test DML

SQL> connect apps/apps@sfdb.world
SQL> insert into emp (empno,ename,job,deptno) values (empno_sfseq.nextval,'AL','SCIENTIST',11);
1 row created.
SQL> commit;
Commit complete.
SQL> connect apps/apps@ladb.world
SQL> select * from emp;
     EMPNO ENAME                SITE       JOB            DEPTNO D
---------- -------------------- ---------- ---------- ---------- -
         1 MIKE                 LADB       CLERK               1 N
         6 JANE                 LADB       ANALYST             1 N
        11 ZACH                 LADB       ENGINEER           11 N
       102 AL                   SFDB       SCIENTIST          11 N
SQL> connect apps/apps@dcdb.world
SQL> select * from emp;
     EMPNO ENAME                SITE       JOB            DEPTNO D
---------- -------------------- ---------- ---------- ---------- -
         1 MIKE                 LADB       CLERK               1 N
         6 JANE                 LADB       ANALYST             1 N
        11 ZACH                 LADB       ENGINEER           11 N
       102 AL                   SFDB       SCIENTIST          11 N

Test Update Conflict Resolution – Overwrite handler at dcdb

To test the update conflict handler, we create a conflict on the dname column for deptno=6 at dcdb. By setting the Streams tag to a value other than the default NULL at dcdb, we instruct the capture process not to capture the local changes from the redo logs; as a result, the SALES update is not propagated to or applied at ladb and sfdb.

SQL> connect apps/apps@dcdb.world
SQL> exec dbms_streams.set_tag(hextoraw('1'));
PL/SQL procedure successfully completed.
SQL> select dbms_streams.get_tag from dual;
GET_TAG
---------------------------------------------------------------
01
SQL> update dept set dname='SALES' where dname='HR';
1 row updated.
SQL> commit;
Commit complete.
SQL> select * from dept;
    DEPTNO DNAME                SITE
---------- -------------------- ----------
         1 FINANCE              LADB
         6 SALES                LADB
        11 RESEARCH             LADB
SQL> connect apps/apps@sfdb.world
SQL> select * from dept;
    DEPTNO DNAME                SITE
---------- -------------------- ----------
         1 FINANCE              LADB
         6 HR                   LADB
        11 RESEARCH             LADB
SQL> connect apps/apps@ladb.world
SQL> select * from dept;
    DEPTNO DNAME                SITE
---------- -------------------- ----------
         1 FINANCE              LADB
         6 HR                   LADB
        11 RESEARCH             LADB
SQL> update dept set dname='PARTS' where dname='HR';
1 row updated.
SQL> commit;
Commit complete.
SQL> select * from dept;
    DEPTNO DNAME                SITE
---------- -------------------- ----------
         1 FINANCE              LADB
         6 PARTS                LADB
        11 RESEARCH             LADB
SQL> connect apps/apps@sfdb.world
SQL> select * from dept;
    DEPTNO DNAME                SITE
---------- -------------------- ----------
         1 FINANCE              LADB
         6 PARTS                LADB
        11 RESEARCH             LADB
SQL> connect apps/apps@dcdb.world
SQL> select * from dept;
    DEPTNO DNAME                SITE
---------- -------------------- ----------
         1 FINANCE              LADB
         6 PARTS                LADB
        11 RESEARCH             LADB

The overwrite handler was invoked in dcdb and the conflict was resolved by overwriting SALES with PARTS.

Conclusion

We have demonstrated the procedure to add a third source site into a bi-directional replicated environment. The same steps outlined in this article can also be used to add subsequent source databases.

