
Introduction
In multicloud environments, where you use resources from multiple cloud vendors, you typically either migrate data between the environments or keep it in one cloud while accessing it from another. The Oracle Database DBMS_CLOUD PL/SQL package lets you access data on OCI Object Storage, Azure Blob Storage, and Amazon S3 buckets, or import it into an Oracle Database from these sources.
This blog post provides a step-by-step guide on accessing and importing data from Azure Blob Storage into an Oracle Database.
The Environment
- Data files in an Azure Blob Storage container.
- Oracle Autonomous Database on OCI, where the DBMS_CLOUD package comes pre-installed. Alternatively, you can use any Oracle Database version 19.10 or higher on which you install the DBMS_CLOUD package.
For testing purposes, network connectivity can be established over the internet. For production environments, set up a private, dedicated connection between Azure and OCI via the Azure and OCI Interconnect or a third-party network service provider.
Preparation on Azure
Step 1: Assign an Azure role for access to blob data in a storage account
Follow the Azure documentation to assign the roles needed to access data stored on Azure Blob Storage. For example, using the Azure CLI:
# replace the placeholder values in brackets with your own values
az role assignment create \
--role "Storage Blob Data Contributor" \
--assignee <email> \
--scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>"
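To confirm that the assignment took effect, you can list it back with the Azure CLI (same placeholder values as above):
# verify the role assignment
az role assignment list \
--assignee <email> \
--scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/blobServices/default/containers/<container>"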
Step 2: Copy the Access Key of your storage account
From your storage account page, choose Access Keys in the Security + Networking section, click Show keys (the label changes to Hide keys), and copy the key to a secure location.

For example:
Storage account name: sptstorageaccount
Key: BfU6yU/doVAxmIAaknLTWFwcXrVn9AUHOn33v8+Fa2bZ2aMiqRJV19Y5+5NymnV251OLc0VHNyQo6jcNxBz/Jg==
We will use these later in step 4.
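If you prefer the command line, you can also retrieve the access keys with the Azure CLI (a quick sketch; replace the placeholders with your own values):
# list the access keys of the storage account
az storage account keys list \
--account-name <storage-account> \
--resource-group <resource-group>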
Step 3: Get your Azure Blob file URL
Get the URL of the object (the data file) stored in an Azure Blob Storage container that you want to import into an Oracle Database. From the storage account page, choose Containers, and click on the container name where your objects reside. Click on the object name and copy the Object URL:

In this case, the data.csv file is very simple and contains only two rows of comma-separated data:
1,name1
2,name2
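As an alternative to the portal, the Azure CLI can also print the object URL directly. A sketch using the example storage account and container names from this post; --auth-mode login relies on the role assignment from step 1:
# print the URL of a blob
az storage blob url \
--auth-mode login \
--account-name sptstorageaccount \
--container-name spt-container \
--name data.csv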
Preparation on OCI
Step 4: Create the credentials in the Oracle Database
Create the credentials using the DBMS_CLOUD PL/SQL package to allow the database to access the Azure Blob Storage. Use the Access Key from step 2:
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => '<Credential_Name>',
    username => '<Storage_Account_Name>',
    password => '<Key>'
  );
END;
/
-- for example
SQL> BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'AZURECRED',
    username => 'sptstorageaccount',
    password => 'BfU6yU/doVAxmIAaknLTWFwcXrVn9AUHOn33v8+Fa2bZ2aMiqRJV19Y5+5NymnV251OLc0VHNyQo6jcNxBz/Jg=='
  );
END;
/
PL/SQL procedure successfully completed.
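To double-check that the credential was stored, query the USER_CREDENTIALS data dictionary view; it should list AZURECRED with the storage account name as the username:
SQL> select credential_name, username, enabled from user_credentials;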
Step 5: Test the access to Azure Blob Storage
To test the access to the container, list the files using the LIST_OBJECTS function of the DBMS_CLOUD package. Use the credential name from step 4 and the location URL from step 3, but without the object name at the end:
SQL> select * from dbms_cloud.list_objects('AZURECRED','https://sptstorageaccount.blob.core.windows.net/spt-container/');
OBJECT_NAME BYTES CHECKSUM CREATED LAST_MODIFIED
_________________ _________ ___________________________ ______________________________________ ______________________________________
data.csv 16 cXUnXSt44GOpDwnMVPeepA== 16-FEB-22 09.51.56.000000000 AM GMT 16-FEB-22 09.51.56.000000000 AM GMT
data.txt 16 EbMV1qfTnnKxH5RAxIvKHA== 16-FEB-22 09.51.56.000000000 AM GMT 16-FEB-22 09.51.56.000000000 AM GMT
sampleuser.dmp 356352 BE0WmZOd7+l4NsR44e0QWQ== 16-FEB-22 09.51.56.000000000 AM GMT 16-FEB-22 09.51.56.000000000 AM GMT
Importing Data
Step 6: Import data from Azure Blob Storage into the Oracle Database
Now we are ready to import the data using the COPY_DATA procedure of the DBMS_CLOUD package.
Import from .csv file:
SQL> create table mydata (id number, name VARCHAR2(64));
Table MYDATA created.
SQL> BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name => 'mydata',
    credential_name => 'AZURECRED',
    file_uri_list => 'https://sptstorageaccount.blob.core.windows.net/spt-container/data.csv',
    format => json_object('delimiter' value ',')
  );
END;
/
PL/SQL procedure successfully completed.
SQL> select * from mydata;
ID NAME
_____ ________
1 name1
2 name2
Import from .txt file:
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name => 'mydata',
    credential_name => 'AZURECRED',
    file_uri_list => 'https://sptstorageaccount.blob.core.windows.net/spt-container/data.txt',
    format => json_object('delimiter' value ',')
  );
END;
/
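Note that COPY_DATA appends to the existing table, so after this second load the mydata table holds the rows from both files. Assuming data.txt contains the same two rows as data.csv, a quick count confirms it:
SQL> select count(*) from mydata;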
Oracle Data Pump import:
# data pump parameter file: impazure.par
directory=DATA_PUMP_DIR
credential=AZURECRED
schemas=sampleuser
remap_tablespace=USERS:DATA
dumpfile=https://sptstorageaccount.blob.core.windows.net/spt-container/sampleuser.dmp
logfile=azureimp.log
# data pump import command
impdp userid=ADMIN@adbfra_high parfile=impazure.par
Access to Import Logfile in Autonomous Database
After a Data Pump import into Oracle Autonomous Database, you can check the files created in the DATA_PUMP_DIR directory on the database server:
SQL> select object_name from DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR') order by last_modified desc;
OBJECT_NAME
-------------------
azureimp.log
You can copy the log file directly to Azure Blob Storage and download it from there:
SQL> BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    'AZURECRED',
    'https://sptstorageaccount.blob.core.windows.net/spt-container/',
    'DATA_PUMP_DIR',
    'azureimp.log');
END;
/
PL/SQL procedure successfully completed.
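From there, you can download the log file with the Azure CLI, for example (a sketch using the example account and container; --auth-mode login relies on the role assignment from step 1):
# download the import logfile from Azure Blob Storage
az storage blob download \
--auth-mode login \
--account-name sptstorageaccount \
--container-name spt-container \
--name azureimp.log \
--file azureimp.log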
External Tables
If you want to keep your files on Azure Blob Storage while accessing the data through an Oracle Database, you can create an external table pointing to the files on Azure Blob Storage using the CREATE_EXTERNAL_TABLE procedure of the DBMS_CLOUD package:
SQL> BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name => 'exttab',
    credential_name => 'AZURECRED',
    file_uri_list => 'https://sptstorageaccount.blob.core.windows.net/spt-container/data.csv',
    column_list => 'id NUMBER, name VARCHAR2(64)',
    format => json_object('delimiter' value ',')
  );
END;
/
PL/SQL procedure successfully completed.
SQL> select * from exttab;
ID NAME
_____ ________
1 name1
2 name2
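If a query against the external table fails, for example because the source file format changed, DBMS_CLOUD also provides the VALIDATE_EXTERNAL_TABLE procedure, which scans the source files and reports rejected rows. A minimal call:
SQL> BEGIN
  DBMS_CLOUD.VALIDATE_EXTERNAL_TABLE(
    table_name => 'exttab'
  );
END;
/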
Conclusion
The DBMS_CLOUD package provides the flexibility to access data stored with multiple cloud vendors and a simple way to import it into an Oracle Database, simplifying data migrations across clouds.
Further Reading
- 10 Reasons to Adopt a Multicloud Strategy
- Considerations and Challenges of Multicloud and how to Overcome them
- AWS RDS for Oracle to Autonomous Database using Zero Downtime Migration
- Import your Data directly from Object Storage in Oracle Database 21.3 and 19.12
- Save Storage Cost with Hybrid Partitioned Tables in Oracle Autonomous Database
- What should I consider when migrating to Autonomous Database using Data Pump?