
Introduction
In multicloud environments, where you use resources from multiple cloud vendors, you typically need to migrate data across environments or to keep it in one cloud while accessing it from another. The Oracle Database DBMS_CLOUD PL/SQL package lets you access data in OCI Object Storage, Azure Blob Storage, Amazon S3 buckets, and Google Cloud Storage buckets, or import it from these sources into an Oracle Database.
This blog post provides a step-by-step guide on accessing data in a Google Cloud Storage bucket and importing it into an Oracle Database.
The Environment
- Data Pump dump file in a Google Cloud Storage bucket.
- Oracle Autonomous Database on OCI, where the DBMS_CLOUD package comes pre-installed. Alternatively, you can use any Oracle Database version 19.10 or higher on which the DBMS_CLOUD package can be installed.
For testing purposes, network connectivity can be established over the internet. For production environments, set up a private dedicated connection between Google Cloud and OCI via Oracle Interconnect for Google Cloud or a third-party network service provider. For Oracle Database@Google Cloud, no network connectivity setup between the clouds is needed, as the Oracle databases reside in the Google Cloud data center.
Preparation on Google Cloud
Step 1: Create an Access Key and a Secret
To enable the database to authenticate and access the objects in your Google Cloud Storage bucket, you need to create an access key and a secret for your user account.
From the Google Cloud Storage Settings page, click on the INTEROPERABILITY tab.

Scroll down to the Access keys for your user account section and click CREATE A KEY. An access key and a secret are generated; copy both values for use in the next steps.

In this case (some characters are replaced):
- Access key: GOOGA3N3xxx7MDD
- Secret: sVQHCkZVn8VvmG2+mHLDExxx
Step 2: Get your file’s URL in the Google Cloud bucket
From the Google Cloud Storage Buckets page, get your bucket name.

In this case:
- Bucket name: adbbucket
Click on the bucket name to get the object file name in the bucket.

In this case, it’s a Data Pump export file:
- Object name: hr.dmp
The access URL is as follows:
https://<bucket_name>.storage.googleapis.com/<object_name>
In this case:
- Object URL: https://adbbucket.storage.googleapis.com/hr.dmp
Preparation on OCI
Step 3: Create the credentials in the Oracle Database
Create the credentials using the DBMS_CLOUD PL/SQL package to allow the database to access the Google Cloud Storage bucket. The username parameter is the access key and the password parameter is the secret created in step 1:
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => '<Credential_Name>',
    username        => '<Access_Key>',
    password        => '<Secret>'
  );
END;
/
-- In this example:
SQL> BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'GCP_CRED',
    username        => 'GOOGA3N3xxx7MDD',
    password        => 'sVQHCkZVn8VvmG2+mHLDExxx'
  );
END;
/
PL/SQL procedure successfully completed.
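As an optional check, you can confirm that the credential was stored by querying the USER_CREDENTIALS dictionary view (the credential name is stored in uppercase):

```sql
SQL> SELECT credential_name, username
       FROM user_credentials
      WHERE credential_name = 'GCP_CRED';
```

The query should return one row showing the credential name and the access key used as the username.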
Step 4: Test the access to Google Cloud Storage
To test the access to the bucket, list the files using the LIST_OBJECTS function of the DBMS_CLOUD package. Use the credential name from step 3 and the location URL from step 2, but without the object name at the end:
SQL> select object_name from dbms_cloud.list_objects('GCP_CRED','https://adbbucket.storage.googleapis.com');
OBJECT_NAME
--------------------------------------------------------------------------------
hr.dmp
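If you prefer to copy the dump file into a database directory first (for example, on a non-Autonomous database), the same package's GET_OBJECT procedure can download it. This is a sketch assuming the credential and URL from the previous steps:

```sql
BEGIN
  DBMS_CLOUD.GET_OBJECT(
    credential_name => 'GCP_CRED',
    object_uri      => 'https://adbbucket.storage.googleapis.com/hr.dmp',
    directory_name  => 'DATA_PUMP_DIR');
END;
/

-- Confirm the download:
SELECT object_name, bytes FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');
```

With the file staged locally, a regular impdp run against DATA_PUMP_DIR works without the credential parameter.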
Importing Data
Step 5: Import data from Google Cloud Storage to Oracle Database
Create a Data Pump parameter file. The dumpfile parameter is the object URL from step 2:
# data pump parameter file: impgcp.par
directory=DATA_PUMP_DIR
credential=GCP_CRED
schemas=hr
remap_tablespace=USERS:DATA
dumpfile=https://adbbucket.storage.googleapis.com/hr.dmp
Execute the Data Pump import command using the parameter file.
impdp userid=ADMIN@adbzdm_high parfile=impgcp.par
...
Job "ADMIN"."SYS_IMPORT_SCHEMA_01" successfully completed at Tue Oct 1 13:55:38 2024 elapsed 0 00:00:07
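If you only need to query data in the bucket rather than import it, DBMS_CLOUD can also create an external table over a file in the bucket. The following is a minimal sketch for a hypothetical CSV file; the file name, columns, and format options are illustrative assumptions, not part of this example's bucket:

```sql
BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'EMPLOYEES_EXT',
    credential_name => 'GCP_CRED',
    file_uri_list   => 'https://adbbucket.storage.googleapis.com/employees.csv',
    format          => json_object('type' value 'csv', 'skipheaders' value '1'),
    column_list     => 'employee_id NUMBER,
                        first_name  VARCHAR2(50),
                        last_name   VARCHAR2(50)');
END;
/

-- Query the bucket data in place:
SELECT COUNT(*) FROM employees_ext;
```

The data stays in Google Cloud Storage and is fetched on each query, which suits occasional access; for frequent access, importing as in step 5 is usually the better choice.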
Conclusion
The DBMS_CLOUD package provides the flexibility to access data across multiple cloud vendors and a simple way to import it into an Oracle Database, simplifying cross-cloud data migrations.
