Oracle Autonomous Data Warehouse: Loading Data from Oracle Object Store or AWS S3

Oracle Autonomous Data Warehouse can load data from object stores such as Oracle Object Storage, AWS S3, or Azure Blob Storage.

For this, a new package called DBMS_CLOUD has been introduced. With it you can load and unload data and manage the files in the object store, giving the database easy, programmatic access to object stores.
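
For example, you can query the contents of a bucket straight from SQL with DBMS_CLOUD.LIST_OBJECTS. A minimal sketch, assuming the oraclecred credential created below and a placeholder bucket URI:

-- List the objects in a bucket (credential and URI are placeholders)
SELECT object_name, bytes
FROM   DBMS_CLOUD.LIST_OBJECTS(
         'oraclecred',
         'https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/geeksinsights/datalake3/');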

Copying Data from Oracle Object Store or S3

Create Credentials (AWS S3)

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'awss3cred',
    username        => 'geeks',     -- AWS access key ID
    password        => 'secretkey'  -- AWS secret access key
  );
END;
/

Create Credentials (Oracle Object Store)

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'oraclecred',
    username        => 'geeks',     -- Oracle Cloud username
    password        => 'secretkey'  -- auth token for the user
  );
END;
/

Copy the data (example: Oracle Object Store)
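
Note that DBMS_CLOUD.COPY_DATA loads into an existing table, so the target must be created first. A minimal sketch of a possible CHANNELS table (the columns are assumptions for illustration and must match the layout of channels.txt):

-- Hypothetical target table; adjust columns to match the source file
CREATE TABLE channels (
  channel_id   NUMBER,
  channel_desc VARCHAR2(100)
);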

BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'CHANNELS',
    credential_name => 'oraclecred',
    file_uri_list   => 'https://swiftobjectstorage.us-ashburn-1.oraclecloud.com/v1/geeksinsights/datalake3/channels.txt'
  );
END;
/

Copy the data (example: AWS S3)

BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'CHANNELS',
    credential_name => 'awss3cred',
    file_uri_list   => 'https://s3-eu-west-1.amazonaws.com/datalake/channels.txt'
  );
END;
/
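
COPY_DATA also takes an optional format parameter describing the source file layout. A sketch, assuming channels.txt is comma-delimited with a header row:

BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'CHANNELS',
    credential_name => 'awss3cred',
    file_uri_list   => 'https://s3-eu-west-1.amazonaws.com/datalake/channels.txt',
    -- describe the file: comma delimiter, skip the first (header) row
    format          => JSON_OBJECT('delimiter' VALUE ',', 'skipheaders' VALUE '1')
  );
END;
/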

Copying Data from the Data Warehouse to Object Store or S3

In addition, to copy data back to the object store you can use the DBMS_CLOUD.PUT_OBJECT procedure with a directory name and file name; the size limit is 5 GB per file.

BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'awss3cred',
    -- target object URI, including the object name
    object_uri      => 'https://s3-eu-west-1.amazonaws.com/datalake/EXPORT_ORCL_28NOV2018.dmp',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'EXPORT_ORCL_28NOV2018.dmp'
  );
END;
/
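
The reverse direction, downloading an object from the store into a database directory, is handled by DBMS_CLOUD.GET_OBJECT. A minimal sketch reusing the same placeholder credential and URI:

BEGIN
  DBMS_CLOUD.GET_OBJECT(
    credential_name => 'awss3cred',
    object_uri      => 'https://s3-eu-west-1.amazonaws.com/datalake/EXPORT_ORCL_28NOV2018.dmp',
    directory_name  => 'DATA_PUMP_DIR'  -- file_name defaults to the object name
  );
END;
/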

Listing Files in a Directory (similar to AWS RDSADMIN.RDS_FILE_UTIL.LISTDIR('DATA_PUMP_DIR'))

SELECT * FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');
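
The package also covers cleanup. A sketch, again with placeholder names, that removes a file from the directory and an object from the store:

BEGIN
  -- Remove a file from a database directory
  DBMS_CLOUD.DELETE_FILE(
    directory_name => 'DATA_PUMP_DIR',
    file_name      => 'EXPORT_ORCL_28NOV2018.dmp');

  -- Remove an object from the object store
  DBMS_CLOUD.DELETE_OBJECT(
    credential_name => 'awss3cred',
    object_uri      => 'https://s3-eu-west-1.amazonaws.com/datalake/EXPORT_ORCL_28NOV2018.dmp');
END;
/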

Thanks

Suresh