Oracle Autonomous Data Warehouse supports loading data from object stores such as Oracle Object Storage, AWS S3, and Azure Blob Storage.
For this, a new package called DBMS_CLOUD has been introduced: with it you can load and unload data and also manage files in the object store. This makes it easy to manage objects in the object store from the database and also provides programmatic access to the object stores.
Copying Data from Oracle Object Store or S3
Create Credentials (AWS S3)
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'awss3cred',
    username        => 'geeks',
    password        => 'secretkey'
  );
END;
/
Create Credentials (Oracle Object Store)
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'oraclecred',
    username        => 'geeks',
    password        => 'secretkey'
  );
END;
/
Copy the data (example: Oracle Object Store)
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'CHANNELS',
    credential_name => 'oraclecred',
    file_uri_list   => 'https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/geeksinsights/datalake3/channels.txt'
  );
END;
/
Copy the data (example: AWS S3)
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'CHANNELS',
    credential_name => 'awss3cred',
    file_uri_list   => 'https://s3-eu-west-1.amazonaws.com/datalake/channels.txt'
  );
END;
/
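After a COPY_DATA call you can check whether the load succeeded and how many rows were loaded by querying the USER_LOAD_OPERATIONS view that Autonomous Database maintains for DBMS_CLOUD loads. A minimal sketch (the selected columns are the commonly documented ones; adjust to what your version exposes):
SELECT table_name, type, status, rows_loaded, logfile_table, badfile_table
FROM   user_load_operations
ORDER BY start_time DESC;
If a load fails, the LOGFILE_TABLE and BADFILE_TABLE columns name the tables holding the log and the rejected rows for that operation.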
Copying Data from the Data Warehouse to Object Store or S3
In addition, to copy data back to the object store you can use the DBMS_CLOUD.PUT_OBJECT procedure with a directory name and file name; the size limit per file is 5 GB.
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'awss3cred',
    object_uri      => 'https://s3-eu-west-1.amazonaws.com/datalake/',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'EXPORT_ORCL_28NOV2018.dmp'
  );
END;
/
Listing Files in a Directory, similar to AWS RDS's RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR')
SELECT * FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');
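LIST_FILES shows files in a database directory; to list the objects in the object store bucket itself, DBMS_CLOUD also provides the LIST_OBJECTS table function. A sketch reusing the S3 credential and bucket URL from the examples above (assumptions, not new values):
SELECT object_name, bytes
FROM   DBMS_CLOUD.LIST_OBJECTS(
         credential_name => 'awss3cred',
         location_uri    => 'https://s3-eu-west-1.amazonaws.com/datalake/'
       );
This is a convenient way to confirm that a PUT_OBJECT upload actually landed in the bucket.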
Thanks
Suresh