Loading Data from Local Files with SQL Developer in Autonomous Database

Hello,
In this blog post, I will describe how to load data into an existing table in Autonomous Database with the SQL Developer Web import-from-file feature. Before loading data, we need to create the table in Autonomous Database. The file formats that you can upload with SQL Developer Web are CSV, XLS, XLSX, TSV and TXT.
Let me show the necessary steps:

1- Open SQL Developer Web from Autonomous Database.

2- Sign in as the ADMIN user.

3- To import data, select the Worksheet tab in SQL Developer Web and create the table into which you want to load data.
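For example, a minimal table matching a hypothetical inventory CSV might look like this (the table and column names are illustrative, not from the post):

```sql
-- Illustrative only: the target table must exist before importing the file
CREATE TABLE db_inventory (
  product_id   NUMBER        PRIMARY KEY,
  product_name VARCHAR2(100),
  quantity     NUMBER,
  updated_at   DATE
);
```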

4- In the Navigator, right-click the table where you want to load data and select Data Loading → Upload Data.

5- In the Import Data dialog box, drag and drop files, or click Select Files to open a file browser and select the files to import.

6- Complete the mapping for the columns you are importing. There are a number of options for column mapping.

7- After you finish selecting format and mapping options, click Next to preview the column mapping.

8- Finally click Finish.

9- Click OK to confirm the import.

10- SQL Developer Web provides a history that shows the status of the import and lets you review the results or errors associated with the import operation.

Now, our data is loaded into our table without any errors.

Using Database Links with Autonomous Databases

Hello,

In this blog post, I will describe how to create database links from other databases to Autonomous Database and from Autonomous Database to other databases. Use DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK to create database links from an Autonomous Database to another database.

To use database links with Autonomous Database, the target database must be configured to use TCP/IP with SSL (TCPS) authentication. Autonomous Databases use TCPS authentication by default, so if your target is an Autonomous Database you do not need any additional configuration. Autonomous Database supports creating database links only if the target database is accessible through a public IP.

Only one wallet file is valid per directory for use with database links. To use multiple cwallet.sso files with database links, you need to create additional directories and put each cwallet.sso in a different directory. When you create database links with DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK, specify the directory that contains the wallet with the directory_name parameter.

Let me show how to create a database link to a target database.

1- Copy your target database wallet, cwallet.sso, containing the certificates for the target database to Object Store.

2- Upload the target database wallet to the DATA_PUMP_DIR directory.

BEGIN
  DBMS_CLOUD.GET_OBJECT(
    credential_name => 'ADMIN',
    object_uri      => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/wallets/o/cwallet.sso',
    directory_name  => 'DATA_PUMP_DIR');
END;
/

In this example:
my bucket name: wallets
my namespace: frskjd0nbatp
my region: eu-frankfurt-1
my credential_name: ADMIN
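To confirm the wallet landed in the directory, you can list its contents with DBMS_CLOUD.LIST_FILES (the same query used later in this series):

```sql
-- Verify cwallet.sso is now present in DATA_PUMP_DIR
SELECT object_name, bytes
FROM   DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');
```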

3- On Autonomous Database create credentials to access the target database. The username and password you specify with DBMS_CLOUD.CREATE_CREDENTIAL are the credentials for the target database that you use to create the database link.

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'DB_LINK_USER',
    username        => 'ADMIN',
    password        => '*********');
END;
/

This operation stores the credentials in the database in an encrypted format. If the password of the target user changes you can update the credential that contains the target user’s credentials as follows:

BEGIN
  DBMS_CLOUD.UPDATE_CREDENTIAL(
    credential_name => 'DB_LINK_USER',
    attribute       => 'PASSWORD',
    value           => '*******');
END;
/

4- Create the database link to the target database.

BEGIN
  DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK(
    db_link_name       => 'MYLINK',
    hostname           => 'adb.eu-frankfurt-1.oraclecloud.com',
    port               => 1522,
    service_name       => 'vjdijq7k4au1w9x_myatp_medium.atp.oraclecloud.com',
    ssl_server_cert_dn => 'CN=adwc.eucom-central-1.oraclecloud.com,OU=Oracle BMCS FRANKFURT,O=Oracle Corporation,L=Redwood City,ST=California,C=US',
    credential_name    => 'DB_LINK_USER',
    directory_name     => 'DATA_PUMP_DIR');
END;
/

If you need to drop the database link you created, you can do so as follows:

BEGIN
  DBMS_CLOUD_ADMIN.DROP_DATABASE_LINK(
    db_link_name => 'MYLINK');
END;
/

5- Test the database link you created to access data on the target database.

select * from dba_db_links;

select * from dual@MYLINK;

Now, let's create a database link from another Oracle Database to an Autonomous Database.

1- Download your Autonomous Database wallet. I explained how in my previous post.

2- Upload the wallet to the database instance where you want to create the link to the Autonomous database.

3- Set GLOBAL_NAMES to FALSE.

ALTER SYSTEM SET GLOBAL_NAMES = FALSE;

4- Create the database link to the target Autonomous Database. The security section of the connect string includes my_wallet_directory, the path where you uploaded the Autonomous Database wallet.

CREATE DATABASE LINK AUTONOM_DBLINK
CONNECT TO ADMIN IDENTIFIED BY "*********"
USING
'(description=
   (retry_count=20)(retry_delay=3)
   (address=(protocol=tcps)(port=1522)(host=adb.eu-frankfurt-1.oraclecloud.com))
   (connect_data=(service_name=vjdijq7k4au1w9x_myatp_high.atp.oraclecloud.com))
   (security=(my_wallet_directory=/home/oracle/wallet)
     (ssl_server_dn_match=true)
     (ssl_server_cert_dn="CN=adwc.eucom-central-1.oraclecloud.com,OU=Oracle BMCS FRANKFURT,O=Oracle Corporation,L=Redwood City,ST=California,C=US")))';

Test the database link you created to access data on your Autonomous Database instance.
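A quick sanity check against the AUTONOM_DBLINK link created above might look like this:

```sql
-- Should return one row from the remote Autonomous Database
SELECT * FROM dual@AUTONOM_DBLINK;

-- Remote database banner, to confirm which database answered
SELECT banner FROM v$version@AUTONOM_DBLINK;
```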

Now, I want to do a quick data test. I will copy the DB_INVENTORY table to my local 12.1.0.2 database over the database link, and then insert the rows from that copy back into my 19c Autonomous Database.
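That test can be sketched as SQL; DB_INVENTORY and the link names come from earlier in this post, and each statement runs on the database that owns the link (for the insert-back step, the link on the Autonomous side would have to point at the local database):

```sql
-- On the local 12.1.0.2 database: copy the table over AUTONOM_DBLINK
CREATE TABLE db_inventory_copy AS
  SELECT * FROM db_inventory@AUTONOM_DBLINK;

-- On the Autonomous Database: pull the rows back over a link to the local database
INSERT INTO db_inventory
  SELECT * FROM db_inventory_copy@MYLINK;
COMMIT;
```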

Let me check the data from my Autonomous Database using SQL Developer Web.

Yes! My data is here.

Importing Data Using Oracle Data Pump on Autonomous Database

Hello,

In this blog post, I want to describe the necessary steps to import data using Oracle Data Pump on Autonomous Database. You need to use Oracle Data Pump Export to export your existing Oracle Database schemas and then migrate them to Autonomous Database using Oracle Data Pump Import. Oracle Data Pump Export provides several export modes; Oracle recommends using schema mode for migrating to Autonomous Database. For a faster migration, export your schemas into multiple Data Pump files and use parallelism. You can specify the dump file name format with the dumpfile parameter. Set the parallel parameter to at least the number of CPUs in your Autonomous Database.

With encryption_pwd_prompt=yes, Oracle Data Pump Export prompts for an encryption password to encrypt the dump files. The following example exports the HR schema from a source Oracle Database for migration to an Autonomous Database with 4 CPUs.

expdp HR/HR@MYORCL exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link data_options=group_partition_table_data parallel=4 schemas=HR dumpfile=export%u.dmp encryption_pwd_prompt=yes

If you used the encryption_pwd_prompt=yes parameter during the export with expdp, then also use encryption_pwd_prompt=yes with your import and enter the same password at the impdp prompt to decrypt the dump files.

1- Store your Cloud Object Storage credential using DBMS_CLOUD.CREATE_CREDENTIAL.

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'ADMIN',
    username        => 'talip…..',
    password        => 'Zgkz:2lE)zw:duc}K1>8');
END;
/

2- Set the credential as the default credential for your Autonomous Database, as the ADMIN user.

ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'ADMIN.ADMIN';
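You can verify that the property was set by querying DATABASE_PROPERTIES:

```sql
-- Should return ADMIN.ADMIN
SELECT property_value
FROM   database_properties
WHERE  property_name = 'DEFAULT_CREDENTIAL';
```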

3- Run Data Pump Import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage, and set the default_credential keyword.

impdp admin/********@veridataatp_high directory=data_pump_dir dumpfile=default_credential:https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/export%u.dmp parallel=4 encryption_pwd_prompt=yes partition_options=merge transform=segment_attributes:n exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link logfile=import.log tables=TALIP

If you used the encryption_pwd_prompt=yes parameter during the export with expdp, then use encryption_pwd_prompt=yes and enter the same password at the impdp prompt that you specified during the export.

In this example:
my bucket name: my_dumpfiles
my namespace: frskjd0nbatp
my region: eu-frankfurt-1
my credential_name: ADMIN

For the best import performance use the HIGH database service for your import connection and set the PARALLEL parameter to the number of OCPUs in your Autonomous Database.
The log files for Data Pump Import operations are stored in the directory you specify with the impdp directory parameter. To view the log file, you need to move it to your Cloud Object Storage using the procedure DBMS_CLOUD.PUT_OBJECT.

BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'ADMIN',
    object_uri      => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'import.log');
END;
/

If required, download the import log file from the Cloud Object Store.
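One way to download it is with the OCI CLI, assuming it is installed and configured; the bucket and namespace match this example, and the flags shown are my assumption of the current CLI syntax:

```shell
# Download import.log from the my_dumpfiles bucket to the current directory
oci os object get \
  --namespace frskjd0nbatp \
  --bucket-name my_dumpfiles \
  --name import.log \
  --file ./import.log
```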

Exporting Data from Autonomous Database for Other Oracle Databases

Hello,
In this blog post, I will describe the necessary steps to export data from Autonomous Database for Other Oracle Databases. Oracle recommends using the latest Oracle Data Pump version for exporting data from Autonomous Database to other Oracle databases.

First, we need to create a directory in Autonomous Database.

CREATE DIRECTORY creates the database directory object and also creates the file system directory if it does not already exist. If the file system directory already exists, CREATE DIRECTORY only creates the database directory object. For example, the following command creates the database directory object called mydir and creates the file system directory mydir:

CREATE DIRECTORY mydir AS 'mydir';

You can also specify subdirectories:

CREATE DIRECTORY mydir AS 'veridata/mydir';

To add a directory, you must have the CREATE ANY DIRECTORY system privilege. The ADMIN user is granted the CREATE ANY DIRECTORY system privilege. After you create the MYDIR directory, use the following command to list all files:

SELECT * FROM DBMS_CLOUD.LIST_FILES('MYDIR');

After creating the directory, you can export the data with the following steps.

1- Download the client connection credentials (wallet), as I explained in my previous blog post.

2- Add your Autonomous Database TNS alias to your tnsnames.ora file. For example:

myatp_high =
  (description=
    (retry_count=20)(retry_delay=3)
    (address=(protocol=tcps)(port=1522)(host=adb.eu-frankfurt-1.oraclecloud.com))
    (connect_data=(service_name=vjdijq7k4au1w9x_myatp_high.atp.oraclecloud.com))
    (security=(ssl_server_cert_dn="CN=adwc.eucom-central-1.oraclecloud.com,OU=Oracle BMCS FRANKFURT,O=Oracle Corporation,L=Redwood City,ST=California,C=US")))

3- Run Data Pump Export with the dumpfile parameter set, the filesize parameter set to less than 5 GB, and the directory parameter set.

expdp ADMIN/*******@myatp_high directory=data_pump_dir dumpfile=exp%U.dmp parallel=4 encryption_pwd_prompt=yes filesize=1G logfile=export.log

If you used the encryption_pwd_prompt=yes parameter during the export with expdp, then also use encryption_pwd_prompt=yes with your import and enter the same password at the impdp prompt to decrypt the dump files.

4- After the export finishes, you can see the generated dump files by running the following query:

SELECT * FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');

5- Now move the dump file set to your Cloud Object Store by uploading the files from the Autonomous Database directory.

6- Connect to your Autonomous Database.

7- Store your object store credentials using the procedure DBMS_CLOUD.CREATE_CREDENTIAL.

BEGIN
DBMS_CLOUD.CREATE_CREDENTIAL(
credential_name => ‘ADMIN’,
username => ‘talip…..’,
password => ‘Zgkz:2lE)zw:duc}K1>8’
);
END;
/

This operation stores the credentials in the database in an encrypted format. You can use any name for the credential. Note that this step is required only once, unless your object store credentials change; after that, you can keep using the same credential name.
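You can confirm that the credential exists (the password itself is never exposed) with a query like:

```sql
-- List stored credentials for the current user
SELECT credential_name, username, enabled
FROM   user_credentials;
```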

8- Move the dump files from the Autonomous Database to your Cloud Object Store by calling DBMS_CLOUD.PUT_OBJECT.
In my previous blog post, I explained how to set up a bucket to store Autonomous Database files.

BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'ADMIN',
    object_uri      => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/exp01.dmp',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'exp01.dmp');
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'ADMIN',
    object_uri      => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/exp02.dmp',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'exp02.dmp');
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'ADMIN',
    object_uri      => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/exp03.dmp',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'exp03.dmp');
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'ADMIN',
    object_uri      => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/exp04.dmp',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'exp04.dmp');
END;
/

In this example:
my bucket name: my_dumpfiles
my namespace: frskjd0nbatp
my region: eu-frankfurt-1
my credential_name: ADMIN (specified in step 7)

9- Now let me check the dump files in the bucket. Open the navigation menu. Under Core Infrastructure, click Object Storage.

10- Select a compartment from the Compartment list on the left side of the page. A list of existing buckets is displayed.

11- Click the bucket name to see the bucket details. Under Objects, you will see the moved dump files.

12- If required, download the dump files from Cloud Object Store and use Oracle Data Pump Import to import the dump file set to the target database.

Select the dump files and click Download.

13- Run Data Pump Import to import the dump file set to the target database.

How To Send Mail with Email Delivery on Autonomous Database?

Hello,

In this blog post, I will describe sending mail using UTL_SMTP on Autonomous Database. The only supported email provider is Oracle Cloud Infrastructure Email Delivery service.

Let's do it step by step.

1- Identify your SMTP connection endpoint for Email Delivery. You may need to subscribe to additional Oracle Cloud Infrastructure regions if Email Delivery is not available in your current region.

For example, select one of the following for the SMTP connection endpoint:

smtp.us-phoenix-1.oraclecloud.com
smtp.us-ashburn-1.oraclecloud.com
smtp.email.uk-london-1.oci.oraclecloud.com
smtp.email.eu-frankfurt-1.oci.oraclecloud.com

I will select smtp.email.eu-frankfurt-1.oci.oraclecloud.com as the SMTP connection endpoint.

2- Generate SMTP credentials for Email Delivery. UTL_SMTP uses credentials to authenticate with Email Delivery servers when you send email.

To generate SMTP credentials for a user;
a- Open the navigation menu. Under Governance and Administration, go to Identity and click Users. Locate the user in the list that has permissions to manage email, and then click the user’s name to view the details.

b- Click user link

c- Click SMTP Credentials and Generate SMTP Credentials.

d- Enter a Description of the SMTP Credentials in the dialog box.
e- Click Generate SMTP Credentials. A user name and password is displayed.

f- Copy the user name and password for your records and click Close.

3- Create an approved sender for Email Delivery. Complete this step for every email address you use as the "From" address with UTL_SMTP.MAIL.
To create an approved sender for Email Delivery;
a- Open the navigation menu. Under Solutions and Platform, go to Email Delivery and click Email Approved Senders. Ensure that you are in the correct compartment. Your user must be in a group with permissions to manage approved-senders in this compartment.
b- Click Create Approved Sender within the Approved Senders view.
c- Enter the email address you want to list as an approved sender in the Create Approved Sender dialog box.
d- Click Create Approved Sender. The email address is added to your Approved Senders list.

4- Allow SMTP access for the ADMIN user by appending an Access Control Entry (ACE).

BEGIN
  DBMS_NETWORK_ACL_ADMIN.APPEND_HOST_ACE(
    host       => '*',
    lower_port => 587,
    upper_port => 587,
    ace        => xs$ace_type(
                    privilege_list => xs$name_list('SMTP'),
                    principal_name => 'ADMIN',
                    principal_type => xs_acl.ptype_db));
END;
/
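To check that the ACE was appended, you can query the DBA_HOST_ACES view:

```sql
-- Should show the SMTP privilege granted to ADMIN on port 587
SELECT host, lower_port, upper_port, privilege, principal
FROM   dba_host_aces
WHERE  principal = 'ADMIN';
```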

5- Let me create a PL/SQL procedure to send email. For example:

CREATE OR REPLACE PROCEDURE SEND_MAIL (
  msg_to      VARCHAR2,
  msg_subject VARCHAR2,
  msg_text    VARCHAR2)
IS
  mail_conn UTL_SMTP.connection;
  username  VARCHAR2(1000) := 'ocid1.user.oc1.user';
  passwd    VARCHAR2(50)   := 'passs';
  msg_from  VARCHAR2(50)   := 'autonomous@taliphakanozturk.com';
  mailhost  VARCHAR2(50)   := 'smtp.email.eu-frankfurt-1.oci.oraclecloud.com';
BEGIN
  mail_conn := UTL_SMTP.open_connection(mailhost, 587);
  UTL_SMTP.starttls(mail_conn);

  UTL_SMTP.auth(mail_conn, username, passwd, schemes => 'PLAIN');

  UTL_SMTP.mail(mail_conn, msg_from);
  UTL_SMTP.rcpt(mail_conn, msg_to);

  UTL_SMTP.open_data(mail_conn);

  UTL_SMTP.write_data(mail_conn, 'Date: ' || TO_CHAR(SYSDATE, 'DD-MON-YYYY HH24:MI:SS') || UTL_TCP.crlf);
  UTL_SMTP.write_data(mail_conn, 'To: ' || msg_to || UTL_TCP.crlf);
  UTL_SMTP.write_data(mail_conn, 'From: ' || msg_from || UTL_TCP.crlf);
  UTL_SMTP.write_data(mail_conn, 'Subject: ' || msg_subject || UTL_TCP.crlf);
  UTL_SMTP.write_data(mail_conn, 'Reply-To: ' || msg_to || UTL_TCP.crlf || UTL_TCP.crlf);
  UTL_SMTP.write_data(mail_conn, msg_text || UTL_TCP.crlf || UTL_TCP.crlf);

  UTL_SMTP.close_data(mail_conn);
  UTL_SMTP.quit(mail_conn);
EXCEPTION
  WHEN UTL_SMTP.transient_error OR UTL_SMTP.permanent_error THEN
    UTL_SMTP.quit(mail_conn);
    DBMS_OUTPUT.put_line(SQLERRM);
  WHEN OTHERS THEN
    UTL_SMTP.quit(mail_conn);
    DBMS_OUTPUT.put_line(SQLERRM);
END;
/

6- Now let me send a test email using the PL/SQL procedure.
execute send_mail('dba@taliphakanozturk.com', 'Email from Oracle Autonomous Database', 'Sent using UTL_SMTP');