Monday, November 23, 2009

Oracle Data Pump for data refreshes part 4

Part one, part two and part three.

This is a follow-up to the third part of this series with some real-world examples to make corrections and to justify some points.

I ran into a known 10g bug when attempting a DATA_ONLY export with parallel processing (ORA-00001: unique constraint () violated). The bug was fixed in 11g (Oracle forums). The source database was small, so I ran with PARALLEL=1 to get it to work. For larger databases that might not be feasible. The next test would be to modify the first export to grab both data and metadata, and then use DATA_ONLY on the first import step instead.
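The workaround can be sketched as a Data Pump parameter file. This is a hedged example, not my actual parameter file; the directory, dump file, and schema names are placeholders:

```
# expdp_data_only.par -- illustrative sketch; all names are placeholders
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=refresh_data.dmp
LOGFILE=refresh_data_exp.log
SCHEMAS=APP_OWNER
CONTENT=DATA_ONLY
# PARALLEL=1 sidesteps the 10g ORA-00001 bug; raise it once you're on 11g
PARALLEL=1
```

Run it with something like `expdp system PARFILE=expdp_data_only.par`. The alternative approach mentioned above would drop CONTENT=DATA_ONLY from the export and instead put it on the import side.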

Other than the error above, the export and import for the first refresh went smoothly. The second import was more of a complete account copy, so I could not use exactly what I specified in part three.

I did one export and one import. The export went quickly and without error. The import ground to a halt at the TABLE_STATISTICS step due to a bug in Oracle Data Pump when schema and/or tablespace remapping is used. I had to terminate the import. Forums suggest adding EXCLUDE=STATISTICS to the parameter file to skip these two incredibly slow steps and then rebuilding the statistics after the import (through Enterprise Manager or whatever). The export of 57 GB of data took roughly 12 minutes. The import of that data took roughly 20 minutes for just the data. Source objects seemed to take a while and, as mentioned, the statistics took forever.
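Putting that advice together, the import parameter file would look something like the sketch below. Again, the schema and tablespace names are placeholders, not my environment:

```
# impdp_refresh.par -- illustrative sketch; all names are placeholders
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=refresh_data.dmp
LOGFILE=refresh_data_imp.log
REMAP_SCHEMA=APP_OWNER:APP_OWNER_TEST
REMAP_TABLESPACE=APP_DATA:APP_DATA_TEST
# Skip the TABLE_STATISTICS/INDEX_STATISTICS steps that crawl under remapping
EXCLUDE=STATISTICS
```

After the import finishes, the statistics can be rebuilt from SQL*Plus with the standard package, e.g. `EXEC DBMS_STATS.GATHER_SCHEMA_STATS('APP_OWNER_TEST');`, or scheduled through Enterprise Manager as noted above.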

The data pump is a remarkable improvement over the previous import and export utilities. What took hours before can now be accomplished in less than an hour.

More to come.
