04 June 2024

Reading and writing files on Amazon S3 from Oracle Autonomous Database and PL/SQL

In our mixed cloud environment of AWS and Oracle Autonomous Database, we sometimes need to move files/blobs between the database and AWS S3 storage.  This post describes one way of moving them directly between the two.

We'll start by setting up an AWS user and user group with the permissions needed.  Then we'll create a database credential using PL/SQL and the DBMS_CLOUD package.  Finally, we'll move some image blobs over and back between our Oracle database's PL/SQL environment and AWS S3.


Stage One - create an AWS group and user, and get access keys


Create a User Group and assign the permission policy AmazonS3FullAccess to it.

Using the "User Groups -> Create group" wizard, we'll create a new group called S3FullAccess2024.  We'll also attach the AWS managed policy "AmazonS3FullAccess" to this group.









Create an S3 User

Log in to your AWS Identity and Access Management (IAM) Console and create a new user.  Let's call this user S3PLSQL.  We'll follow the "Create user" wizard, which we launch from the Users option on the Identity and Access Management menu.


Create user - step 1: User details




Create user - step 2:
Set permissions by adding the user to the S3FullAccess2024 group
 that we created earlier




Create user - step 3: Review and create user




Create user - finished: User created



Create and get your Access Key and Secret Key

Next we'll view the user and create an Access Key.  To keep things as simple as possible for now, we'll select "Other" as our use case.

Create Access key - step 1: Choose a use case




Create Access key - step 2: Add an optional description




Create Access key - step 3: Retrieve the access key and secret


Important: make sure that you store your Access Key and Secret Key somewhere safe, as this is the only time that you will see the secret.  If you lose it later on, you'll have to create a new access key.





Stage Two - create a database credential using DBMS_CLOUD

The good news is that it's all in an Oracle environment from here on.  

Firstly, from an admin account, make sure that your database user (in this case "myuser") can execute procedures in the dbms_cloud package.
grant execute on dbms_cloud to myuser;
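
If you'd like to double-check that the grant has taken effect, a quick query against the standard USER_TAB_PRIVS view from the myuser account will confirm it:

-- run as myuser: confirm that execute on DBMS_CLOUD has been granted
select table_name, privilege
  from user_tab_privs
 where table_name = 'DBMS_CLOUD';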

Connect as myuser and, using the Access Key (created earlier at the end of Stage One) as your username and the Secret Key as your password, create a credential using DBMS_CLOUD.CREATE_CREDENTIAL.  Let's call this credential "s3test_cred".

begin
  dbms_cloud.create_credential
  (credential_name => 's3test_cred',
   username => 'A*****************7U',
   password => 'y*********************************V');
end;
/
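
As a quick sanity check, the owning account can list its credentials through the USER_CREDENTIALS view.  And should you ever rotate the AWS keys later on, DBMS_CLOUD provides UPDATE_CREDENTIAL (and DROP_CREDENTIAL), so nothing needs to be recreated.  A minimal sketch:

-- list the credentials owned by the current user
select credential_name, username, enabled
  from user_credentials;

-- if the AWS secret key is ever rotated, update the stored secret in place
begin
  dbms_cloud.update_credential
    (credential_name => 's3test_cred',
     attribute       => 'PASSWORD',
     value           => 'new-secret-access-key');
end;
/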
Once the above is complete, we're finished with the admin account and can carry on working in the myuser database account.



Stage Three - access and update an S3 bucket using PL/SQL and DBMS_CLOUD


Firstly, ensure that we're connected as the myuser account (or whatever name we gave it).  Using the credential that we've just created, we'll try to list the contents of an existing bucket.  In this case, let's use a pre-created bucket called plsql-s3 containing three image files.

Before we start, we'll just take a quick look at the bucket's contents using the S3 web interface.  In this example, we can see that it currently contains three JPEG files.






Let's start by listing all the files in the bucket using DBMS_CLOUD.LIST_OBJECTS.  This can be done with a simple SQL query.
 
We'll pass two parameters: the credential that we created in Stage Two and the URI of the S3 bucket that we want to list.

-- list the contents of an S3 bucket
select f.*
  from dbms_cloud.list_objects
         ('s3test_cred'
         ,'https://s3.eu-west-1.amazonaws.com/plsql-s3/') f;

Query output - 3 jpegs
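
Because LIST_OBJECTS behaves like any other table function, we can filter and sort its output with ordinary SQL.  For example (assuming the object_name and bytes columns that LIST_OBJECTS returns), to show only the JPEG files, largest first:

-- list only the JPEG objects in the bucket, largest first
select f.object_name, f.bytes
  from dbms_cloud.list_objects
         ('s3test_cred'
         ,'https://s3.eu-west-1.amazonaws.com/plsql-s3/') f
 where lower(f.object_name) like '%.jpeg'
 order by f.bytes desc;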




Let's get one of the files and read it into a PL/SQL blob variable using the DBMS_CLOUD.GET_OBJECT function.  We'll then check the length of the retrieved blob, just to show ourselves that the get was successful and that the blob is the same size as the file on S3.

-- read a file from S3 into a blob
set serveroutput on
declare
  l_file blob;
begin

  l_file := 
  dbms_cloud.get_object
    (credential_name => 's3test_cred',
     object_uri      
      => 'https://s3.eu-west-1.amazonaws.com/plsql-s3/Sheep.jpeg');

  dbms_output.put_line
    ('retrieved blob length is: '||dbms_lob.getlength(l_file));
end;
/

retrieved blob length is: 2622899

PL/SQL procedure successfully completed.
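
If we want to keep the retrieved file in the database rather than just measure it, the blob can simply be inserted into a table.  Here's a minimal sketch using a hypothetical s3_files staging table (the table name and columns are purely for illustration):

-- hypothetical staging table for files pulled down from S3
create table s3_files
  (file_name varchar2(400) primary key,
   contents  blob);

declare
  l_file blob;
begin
  -- read the file from S3 into a blob, exactly as above
  l_file :=
  dbms_cloud.get_object
    (credential_name => 's3test_cred',
     object_uri
      => 'https://s3.eu-west-1.amazonaws.com/plsql-s3/Sheep.jpeg');

  -- store the blob in the database table
  insert into s3_files (file_name, contents)
  values ('Sheep.jpeg', l_file);
  commit;
end;
/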


Now we'll read the file into a blob again and use the DBMS_CLOUD.PUT_OBJECT procedure to write that blob back to S3 as a new file.

set serveroutput on
declare
  l_file blob;
begin
  -- read the file from S3 into a blob
  l_file := 
  dbms_cloud.get_object
    (credential_name => 's3test_cred',
     object_uri      
      => 'https://s3.eu-west-1.amazonaws.com/plsql-s3/Sheep.jpeg');

  -- using the blob that we read, we'll create a new file on S3
  dbms_cloud.put_object (
    credential_name => 's3test_cred',
    object_uri      
     => 'https://s3.eu-west-1.amazonaws.com/plsql-s3/Sheep2.jpeg',
    contents => l_file);

end;
/

PL/SQL procedure successfully completed.


-- let's check if the new file "Sheep2.jpeg" has been created

select f.*
  from dbms_cloud.list_objects
         ('s3test_cred'
         ,'https://s3.eu-west-1.amazonaws.com/plsql-s3/') f;

Results: 






And, checking the contents of our S3 bucket in the AWS web interface, we can see that the new file Sheep2.jpeg is now visible in the bucket.
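
Since Sheep2.jpeg was only created as a test, it's worth mentioning that DBMS_CLOUD can also remove objects.  A quick cleanup, using DBMS_CLOUD.DELETE_OBJECT, looks like this:

-- remove the test copy from the S3 bucket
begin
  dbms_cloud.delete_object
    (credential_name => 's3test_cred',
     object_uri
      => 'https://s3.eu-west-1.amazonaws.com/plsql-s3/Sheep2.jpeg');
end;
/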



So now we've used the Oracle DBMS_CLOUD package and its LIST_OBJECTS, GET_OBJECT and PUT_OBJECT routines to list, read and write objects in AWS S3.

To follow on from here, we could, for example, load files into an Oracle database table using Oracle APEX and then move them to S3 for permanent storage.  The approach and techniques described above can be used in this and many other similar scenarios.
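
As a rough sketch of that idea, reusing the hypothetical s3_files table from earlier (in a real APEX application you would read from your own upload table instead), a simple loop over PUT_OBJECT is all that's needed:

-- push every file stored in the (hypothetical) staging table out to S3
begin
  for r in (select file_name, contents from s3_files) loop
    dbms_cloud.put_object
      (credential_name => 's3test_cred',
       object_uri
        => 'https://s3.eu-west-1.amazonaws.com/plsql-s3/' || r.file_name,
       contents        => r.contents);
  end loop;
end;
/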

I hope that this is useful to some of you.  Wishing you lots of fun on your Oracle/AWS journeys!
 

P.S. added by popular demand, here's the Sheep Photo... 🐑

taken on a beautiful summer's day on Achill Island, Co. Mayo, Ireland       
© 2024 Niall Mc Phillips