Thursday, July 25, 2013

Create Metric Extension to Monitor Archivelog Gap on Standby Database

I want to be alerted when a standby database falls behind by a certain number of archive log files, and I want to do this regardless of platform or database version, using one standard monitor across the database ecosystem. In this case I used a metric extension to create my own metric for monitoring the archive log gap on a standby database.

1. Log in to OEM.

2. Click Enterprise>Monitoring>Metric Extensions.

3. Click Create.

4. Fill in the information, set the collection schedule as needed, and click Next.
Target Type- Database Instance
Name- <any_name>
Display Name- <any_name>
Adapter- SQL
Description- <any_description>



5. Insert the SQL query below into the SQL Query box and click Next. This query runs on the primary and compares, per thread, the highest log sequence in gv$log against the highest sequence marked as applied on the standby destination in gv$archived_log. If the standby has not applied an archive log, APPLIED remains NO on the primary. There are also cases where the MRP process stops sending apply confirmations back to the primary; that is still an issue, and this alert will notify you to take action.
select sum(local.sequence# - target.sequence#) total_gap
from
  (select thread#, max(sequence#) sequence#
     from gv$archived_log
    where dest_id = (select dest_id from v$archive_dest where target = 'STANDBY')
      and applied = 'YES'
    group by thread#) target,
  (select thread#, max(sequence#) sequence#
     from gv$log
    group by thread#) local
where target.thread# = local.thread#;
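As a quick cross-check outside Enterprise Manager, you can also query the standby itself. V$DATAGUARD_STATS is a standard Data Guard view, though the exact metrics it exposes vary slightly by version:

```sql
-- Run on the standby: transport and apply lag as seen by the standby
select name, value, time_computed
  from v$dataguard_stats
 where name in ('transport lag', 'apply lag');
```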
6. Click the Add button to set up the column info.
7. Fill in the information and click OK, then Next.
Name- column name
Display Name- how to display the column
Column Type- Data Column
Value Type- Number
Comparison Operator- Greater than or equal to the metric threshold needed
Set the advanced values as needed.

8. Set credentials as needed.
9. Click the Add button to add a target to test the metric.
10. Select a database to test and click Select.
11. Click the Run Test button.
12. Testing progress will start.
13. Test results will show in the Test Results section; once complete, click Next.
14. Review the metric extension settings and click Finish.
15. You will now see the new metric extension, but its status is Editable. Select the new metric and click Actions>Save As Deployable Draft.
16. The status of the metric extension is now Deployable, and you can deploy it to database instance targets. Click Actions>Deploy to Targets.
17. Click the Add button.
18. Select the target or targets to deploy the metric to. In this case we have the primary database targets in a group called Data Guard Primary Databases.
19. Click the Submit button; the deployment process will start.
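If you manage many targets, the same save-and-deploy flow can be scripted with emcli. The verbs below should match EM 12c, but verify them with `emcli help` on your install; the metric name, version, and target names are placeholders:

```
emcli login -username=sysman
emcli save_metric_extension_draft -target_type=oracle_database \
      -name="ME$ARCHIVELOG_GAP" -version=1
emcli deploy_metric_extension -target_type=oracle_database \
      -name="ME$ARCHIVELOG_GAP" -version=1 \
      -target_names="proddb1;proddb2"
```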

Example of an alert:

Saturday, July 20, 2013

Provision Pluggable Database 12c with Data Pump Transportable Export and Import using Enterprise Manager 12c

In this demo I show how to take an existing Oracle Database 11.2.0.3 non-CDB and use Enterprise Manager 12.1.0.2 to provision it as a pluggable database in my existing CDB with Data Pump.

This blog post is part of my Oracle database consolidation blog which you can read here.


Demo

I will be using my dbtest1 11.2.0.3 non-CDB database to plug into my container database testdbs. The PDB will be called dev1db.

Note:
The non-CDB must be version 11.2.0.3.
The compatible parameter must be set to 11.2.0.3.0.
The character set of the non-CDB must match the CDB character set.



1. Check the character set and the compatible parameter in the 11.2.0.3 database.
  • SQL>SHOW PARAMETER COMPATIBLE
  • SQL>select * from NLS_DATABASE_PARAMETERS where parameter ='NLS_CHARACTERSET';
2. Log in to OEM and go to the testdbs container database home page.

3. Click Oracle Database>Provisioning>Provision Pluggable Databases.

4. Select Migrate Existing Databases and click Launch.

5. Set the database login credential and click Login.

6. Select Use Oracle Data Pump Full Transportable Export and Import, set the Oracle home credentials, and click Next.

7. Click Add, select the database we are going to adopt, then click Select.

8. Set the database and host credentials, set the export directory, fill in the destination information, select the datafile location and import directory, then click Next. Set the action to take if the object already exists.

In my case I used the same directory for export and import, but these directories can be different.
I set the name of the PDB to dev1db, but you can leave the PDB name the same as the non-CDB.
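Behind the scenes the wizard drives a Data Pump full transportable export and import. A rough manual equivalent is sketched below; the directory object, dump file name, and datafile path are illustrative, and the user tablespaces must be read-only during the export:

```
# On the 11.2.0.3 source database
expdp system full=y transportable=always version=12 \
      directory=DP_DIR dumpfile=dbtest1_full.dmp logfile=dbtest1_exp.log

# On the target, connected to the new PDB
impdp system full=y directory=DP_DIR dumpfile=dbtest1_full.dmp \
      transport_datafiles='/u01/oradata/dev1db/users01.dbf' \
      logfile=dev1db_imp.log
```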

9. The pre-check process will begin...

10. When validation is complete, click Close.

11. Give the job a name and description, then set the schedule and click Next.

In my case I left the name as is and the schedule as Immediately.

12. Review your migration job and click Submit.

13. A confirmation that the job was successfully submitted will pop up; click View Job.

14. You can review the procedure activity to see the status of each step in the process. Expand the procedure steps and select the step. In the step details you can review the details of the step or steps you have selected.

15. In the screenshot below, the step details list the logs being created for the procedure step "Create Pluggable Database Step". Click the log link to see what is written to the log.

16. Here we can see the output of the log as the step is running.

17. To get back to the job activity, click Enterprise>Provisioning and Patching>Procedure Activity, then click the job name.

18. On the server we can see the files created in the directory we specified.

19. When the job completes, all procedure steps will show Completed.

20. From the container home page we can see information about the pluggable database.

21. From the server we log into the container database and can see the new PDB and its active service.
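The same verification can be done from SQL*Plus while connected to the CDB root; V$PDBS and V$ACTIVE_SERVICES are standard 12c views:

```sql
-- List pluggable databases and their open mode
select name, open_mode from v$pdbs;

-- Each PDB gets a default service matching its name
select name, con_id from v$active_services order by con_id;
```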

If you want to see a video of this demo watch below.


My webpages
http://db12c.blogspot.com/
http://cloudcontrol12c.blogspot.com/

http://www.youtube.com/user/jfruiz11375

Follow me on Twitter



Thursday, July 18, 2013

Refresh Database Target Configuration

In this demo I will show how to refresh a database target's configuration so it picks up the updated compatible initialization parameter.

The database dbtest1 just had its compatible parameter changed to 11.2.0.3.0 and has been restarted. Enterprise Manager still shows the compatible parameter as 11.2.0.0.0.
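For reference, the change made on dbtest1 looks like this. COMPATIBLE is not dynamic (and can only be raised), so it must be set in the spfile and followed by a restart:

```sql
alter system set compatible = '11.2.0.3.0' scope = spfile;
shutdown immediate
startup
```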

Refresh Database Target Configuration

1. Log in to Enterprise Manager.


2. Navigate to the database home page


3. Select Oracle Database>Configuration>Last Collected.



4. Select Initialization Parameters, type comp in the search field, and click Search.



5. You can see the collection date shows Jul 5, which was more than two weeks ago. You can also see that the compatible parameter still says 11.2.0.0.0.



6. Right-click the database target name and select Refresh. This will start the refresh process.


You will see a confirmation message that the refresh was successful.






Note: If you receive an error message like the one below, click here to go to the resolve section.




7. Select Initialization Parameters, type comp in the search field, and click Search. You can now see the collection date shows Jul 19, and compatible now shows 11.2.0.3.0.


Resolve Error from Refresh Configuration

1. Go to the agent home page and select Agent>Resynchronization.



2. Click Continue.



3. Click the job name link.
4. Click the Running link to monitor the job; it completes in about two minutes.

5. Once the job is successful, click here to follow the steps in the Refresh Target Configuration section.

Saturday, July 13, 2013

Use Enterprise Manager 12c to take RMAN Backup of Oracle Database 12c PDB

I am going to use Enterprise Manager to take an RMAN backup of my Oracle Database 12c PDB.


1. Log in to OEM.

2. Go to the container database home page.

3. Select Availability>Backup & Recovery>Schedule Backup.

4. Select Pluggable Database, set your host credentials, then click Schedule Customized Backup.

5. Click the Add button to select the PDB.

6. Select the PDB and click Select.

7. Click Next.

8. The backup type is Full; in my case I want to delete the archive logs when the backup completes, as well as any obsolete backups. Then click Next.

9. You can leave the default settings or update them as needed, then click Next.


10. Set the schedule for the backup job; in my case I want it to run immediately. Then click Next.

11. Here you can review the backup job configuration; you can even copy the RMAN script that will be executed if you want to run it outside of Enterprise Manager.

Once you complete the review, click Submit Job.
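The generated script will look roughly like the sketch below. The exact channel, format, and retention clauses depend on your settings, so treat this as illustrative rather than the exact script Enterprise Manager produces:

```
connect target /
# Back up just the one PDB from the CDB root
backup pluggable database dev1db;
# Clean up, matching the wizard options chosen above
delete noprompt archivelog all backed up 1 times to disk;
delete noprompt obsolete;
```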

12. Click View Job.

13. Click the Running link for the backup step to review execution.

14. You can monitor the RMAN script execution output as needed.

15. After the backup succeeds you can review the output to see everything that was backed up. The PDB backup is faster than a whole database backup, since a whole database backup includes the CDB and all PDBs.
