If you have a small schema in the Oracle Cloud Autonomous Database, you can actually run DataPump from SQL Developer Web. DATA_PUMP_DIR maps to a DBFS mount inside the Oracle Database.
Logged in to my Oracle ADB as "ADMIN"
I check that DATA_PUMP_DIR exists and find that it maps to a DBFS path:
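For anyone who wants to run the same check, a query like this against DBA_DIRECTORIES (run as "ADMIN") should show the mapping:

select directory_name, directory_path
  from dba_directories
 where directory_name = 'DATA_PUMP_DIR';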
I run a PL/SQL block to export the HEMANT schema using the DBMS_DATAPUMP API:
After dropping the two tables in the schema, I run the import using the DBMS_DATAPUMP API and then refresh the list of tables owned by "HEMANT":
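If you'd rather verify from SQL than from the worksheet's table list, a simple query like this confirms the tables are back after the import:

select table_name
  from dba_tables
 where owner = 'HEMANT'
 order by table_name;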
This method is a quick way of using the Autonomous Database itself when you don't have an external Object Store to hold the DataPump dump file. So, I'd use this only for very small schemas, as the dump file itself is written into DBFS and consumes database storage.
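In ADB you can check how much space the dump pieces are actually taking; a quick look with DBMS_CLOUD.LIST_FILES (available in Autonomous Database) should show the sizes:

select object_name, bytes
  from dbms_cloud.list_files('DATA_PUMP_DIR')
 order by object_name;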
The PL/SQL code is:
REM Based on a script by Dick Goulet, posted to oracle-l@freelists.org
REM With modifications by me.
REM Hemant K Chitale

REM Export schema "HEMANT"
declare
  h1  NUMBER := 0;            -- DataPump job handle
  h2  varchar2(1000);         -- final job state / error message
  ex  boolean := TRUE;
  fl  number := 0;
  schema_exp varchar2(1000) := 'in(''HEMANT'')';
  f_name  varchar2(50)  := 'My_DataPump';
  dp_mode varchar2(100) := 'export';
  blksz number := 0;
  SUCCESS_WITH_INFO exception;
  PRAGMA EXCEPTION_INIT(SUCCESS_WITH_INFO, -31627);  -- "API call succeeded but more information is available"
begin
  -- remove a previous export.log, if one exists
  utl_file.fgetattr('DATA_PUMP_DIR', dp_mode||'.log', ex, fl, blksz);
  if (ex = TRUE) then
    utl_file.fremove('DATA_PUMP_DIR', dp_mode||'.log');
  end if;

  h1 := dbms_datapump.open(operation => 'EXPORT', job_mode => 'SCHEMA',
                           job_name  => upper(dp_mode)||'_EXP', version => 'COMPATIBLE');
  dbms_datapump.set_parallel(handle => h1, degree => 2);
  -- %U generates numbered dump-file pieces for the parallel workers
  dbms_datapump.add_file(handle => h1, filename => f_name||'.dmp%U',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  dbms_datapump.add_file(handle => h1, filename => f_name||'.log',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  dbms_datapump.set_parameter(handle => h1, name => 'KEEP_MASTER', value => 0);
  dbms_datapump.set_parameter(handle => h1, name => 'INCLUDE_METADATA', value => 1);
  dbms_datapump.metadata_filter(handle => h1, name => 'SCHEMA_EXPR', value => schema_exp);
  dbms_datapump.start_job(handle => h1, skip_current => 0, abort_step => 0);
  dbms_datapump.wait_for_job(handle => h1, job_state => h2);
exception
  when SUCCESS_WITH_INFO then
    null;
  when others then
    h2 := sqlerrm;
    if (h1 != 0) then
      dbms_datapump.stop_job(h1, 1, 0, 0);
    end if;
    dbms_output.put_line(h2);
end;
/

REM Import schema "HEMANT"
declare
  h1  NUMBER := 0;            -- DataPump job handle
  h2  varchar2(1000);         -- final job state / error message
  ex  boolean := TRUE;
  fl  number := 0;
  schema_exp varchar2(1000) := 'in(''HEMANT'')';
  f_name  varchar2(50)  := 'My_DataPump';
  dp_mode varchar2(100) := 'import';
  blksz number := 0;
  SUCCESS_WITH_INFO exception;
  PRAGMA EXCEPTION_INIT(SUCCESS_WITH_INFO, -31627);  -- "API call succeeded but more information is available"
begin
  -- remove a previous import.log, if one exists
  utl_file.fgetattr('DATA_PUMP_DIR', dp_mode||'.log', ex, fl, blksz);
  if (ex = TRUE) then
    utl_file.fremove('DATA_PUMP_DIR', dp_mode||'.log');
  end if;

  h1 := dbms_datapump.open(operation => 'IMPORT', job_mode => 'SCHEMA',
                           job_name  => upper(dp_mode)||'_EXP');
  dbms_datapump.set_parallel(handle => h1, degree => 2);
  -- the same dump file set written by the export above
  dbms_datapump.add_file(handle => h1, filename => f_name||'.dmp%U',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  dbms_datapump.add_file(handle => h1, filename => f_name||'.log',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  dbms_datapump.set_parameter(handle => h1, name => 'KEEP_MASTER', value => 0);
  -- skip any table that already exists in the target schema
  dbms_datapump.set_parameter(handle => h1, name => 'TABLE_EXISTS_ACTION', value => 'SKIP');
  dbms_datapump.metadata_filter(handle => h1, name => 'SCHEMA_EXPR', value => schema_exp);
  dbms_datapump.start_job(handle => h1, skip_current => 0, abort_step => 0);
  dbms_datapump.wait_for_job(handle => h1, job_state => h2);
exception
  when SUCCESS_WITH_INFO then
    null;
  when others then
    h2 := sqlerrm;
    if (h1 != 0) then
      dbms_datapump.stop_job(h1, 1, 0, 0);
    end if;
    dbms_output.put_line(h2);
end;
/
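To read the DataPump log back from DATA_PUMP_DIR without leaving the worksheet, a small UTL_FILE loop works. This is just a sketch, using the My_DataPump.log name the script above creates (remember to enable DBMS_OUTPUT first):

declare
  f  utl_file.file_type;
  ln varchar2(32767);
begin
  f := utl_file.fopen('DATA_PUMP_DIR', 'My_DataPump.log', 'r');
  begin
    loop
      utl_file.get_line(f, ln);     -- raises NO_DATA_FOUND at end of file
      dbms_output.put_line(ln);
    end loop;
  exception
    when no_data_found then null;   -- end of file reached
  end;
  utl_file.fclose(f);
end;
/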
Again, I emphasise that this is only for small dumps.