So today I wanted to export a 35GB partitioned table and really didn't feel like going through the process of firing up Data Pump (expdp); the thought of creating a parameter file for one lousy table didn't appeal to me. So I used trusty old Export (exp). It barfed with an internal consistency error. Hmmmm. I can't be bothered solving that one right now; I just want my data. And no, Toad, you are not going to put 38 million rows into Excel or a pretty INSERT script file. I don't think so.
Why make it simple, eh? Let's use the DBMS_DATAPUMP API to do it. Heck, yeah!
SET SERVEROUTPUT ON SIZE 1000000

DECLARE
   l_dp_handle        NUMBER;
   l_last_job_state   VARCHAR2 (30) := 'UNDEFINED';   -- unused here, but handy for a status loop
   l_job_state        VARCHAR2 (30) := 'UNDEFINED';
   l_sts              KU$_STATUS;
BEGIN
   -- Open a table-mode export job
   l_dp_handle :=
      DBMS_DATAPUMP.open (operation   => 'EXPORT',
                          job_mode    => 'TABLE',
                          remote_link => NULL,
                          job_name    => 'TX_EXPORT',
                          version     => 'LATEST');

   -- The dump file, written to the pre-created directory object
   DBMS_DATAPUMP.add_file (handle    => l_dp_handle,
                           filename  => 'SPB.dmp',
                           directory => 'DATAPUMP_EXP');

   -- The log file, same directory
   DBMS_DATAPUMP.add_file (handle    => l_dp_handle,
                           filename  => 'SPB.log',
                           directory => 'DATAPUMP_EXP',
                           filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

   -- Only export the one table
   DBMS_DATAPUMP.metadata_filter (handle => l_dp_handle,
                                  name   => 'NAME_EXPR',
                                  value  => '= ''TRANSACTION_AD''');

   DBMS_DATAPUMP.start_job (l_dp_handle);
   DBMS_DATAPUMP.detach (l_dp_handle);
END;
/
So there it is. What did I do?
- Defined some variables
- Opened an export job in TABLE mode and gave it a job name
- Added a file to receive the data, referencing the pre-created directory object that I'd been granted read/write on (directory setup sketch below)
- Added a file to receive the log, ditto on the directory
- Created a metadata filter with a name expression so only the one table, TRANSACTION_AD, gets exported
- Started the job
- Detached from the job. As it turns out, START_JOB doesn't block anyway; the job trundles on in the background unless you add a WAIT_FOR_JOB call or a GET_STATUS loop, so the block came straight back and I could walk away, get coffee and have a natter (wait sketch below).
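For reference, the DATAPUMP_EXP directory object has to exist (and be readable and writable by the exporting user) before any of this will fly. A minimal setup sketch, run as a suitably privileged user; the filesystem path and the grantee are placeholders, not my real ones:

CREATE OR REPLACE DIRECTORY DATAPUMP_EXP AS '/u01/app/oracle/dp_exports';   -- path is a placeholder
GRANT READ, WRITE ON DIRECTORY DATAPUMP_EXP TO some_schema;                 -- grantee is a placeholder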
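And if I hadn't fancied wandering off, swapping the DETACH for a wait would make the block sit there until the export finished, which is what that otherwise-unused l_job_state variable is begging for. A sketch, reusing the same handle:

-- instead of DBMS_DATAPUMP.detach, block until the job completes
DBMS_DATAPUMP.wait_for_job (handle => l_dp_handle, job_state => l_job_state);
DBMS_OUTPUT.put_line ('Export finished with state: ' || l_job_state);

Or, from another session, a quick peek at DBA_DATAPUMP_JOBS tells you how TX_EXPORT is getting on:

SELECT job_name, state FROM dba_datapump_jobs WHERE job_name = 'TX_EXPORT';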
That was it. I ended up with a pair of dump/log files that I could use elsewhere for my nefarious purposes.
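And should I want the table back in somewhere else, that's a one-liner with impdp; the credentials are placeholders and a DATAPUMP_EXP directory object on the target side is assumed:

impdp some_user/some_password DIRECTORY=DATAPUMP_EXP DUMPFILE=SPB.dmp TABLES=TRANSACTION_AD LOGFILE=SPB_imp.log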
The benefits of being idle.