
Inserting Large File Content into a Table Examples - Mule 4

Anypoint Connector for Database (Database Connector) Insert operation supports the insertion of character large objects (CLOBs) into tables. The following example illustrates how to read a large file’s content and insert it into a table that supports the type of the content.

To accomplish this example, you must:

  • Create the table

  • Create the Mule application

  • Run and test the application with curl commands

Example Mule app flow in Studio canvas

Create the Table

Use your preferred SQL client tool to run an .sql script containing the following SQL statement, which creates the table:

    CREATE TABLE AWESOMECLOBDATA (
        clob_data CLOB
    );

Create the Mule Application

Follow these steps to create the Mule application:

  1. In Studio, drag an HTTP Listener source to the canvas.

  2. Click the plus sign (+) next to the Connector configuration field to configure an HTTP global connection.

  3. Configure the HTTP global connection and click OK.

  4. Set Path to /read-insert.

  5. Drag a Logger component to the right of the HTTP Listener source.

  6. Set Message to Starting LARGE CLOB reading.

  7. Drag the File Connector Read operation to the right of the Logger component.

  8. In the General tab of the operation configuration screen, click the plus sign (+) next to the Connector configuration field to access the global element configuration fields.

  9. Set Working Directory to /dir/containing/file/ to read the file properly.

  10. Click OK.

  11. Set File Path to awesomeFile.json.

  12. In the MIME Type tab, set MIME Type to application/octet-stream.

  13. Drag another Logger component to the right of the Read operation.

  14. Set Display Name to LARGE CLOB’s content on flow.

  15. Drag a Database Connector Insert operation to the right of the second Logger component.

  16. Click the plus sign (+) next to the Connector configuration field to configure a database global connection.

  17. Configure the database connection and click OK.

  18. Set SQL Query Text to INSERT INTO AWESOMECLOBDATA (clob_data) VALUES (:payload).
    The SQL text inserts a new record with the file’s content in the table.

  19. Set Input Parameters to the expression:

	{
		payload: payload
	}

  20. Drag another Logger component to the right of the Insert operation.

  21. Set Display Name to LARGE CLOB’s content inserted.

  22. Set Message to the expression payload.

  23. Save and run your Mule application.

  24. Test the app by using the following curl command in your terminal: curl localhost:8081/read-insert.
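After the curl request completes, you can confirm that the record was inserted from your SQL client. A minimal check, assuming the AWESOMECLOBDATA table created earlier and an Oracle database (as used in this example's connection configuration):

    -- Count the inserted rows and report the length of the stored CLOB
    SELECT COUNT(*) FROM AWESOMECLOBDATA;
    SELECT DBMS_LOB.GETLENGTH(clob_data) FROM AWESOMECLOBDATA;

The second query is useful for large files: it verifies the full content arrived without fetching the entire CLOB back to the client.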

Application MIME Type

The application/octet-stream MIME type keeps the file content in binary format, enabling the data to stream properly from the Read operation to the Insert operation.

For example, if you use the text/plain MIME type instead, an out-of-memory (OOM) error can arise. This occurs because the DataWeave expression defined in the Input Parameters field of the Insert operation tries to convert the entire data stream from the file to text before delivering the data to the operation itself.

You can also use the application/json MIME type. However, to reference the payload in the Insert operation, you must use payload.^raw.
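For instance, when the Read operation uses application/json, the Insert operation's Input Parameters would reference the raw content. A sketch of the corresponding db:input-parameters element under that assumption:

```xml
<!-- Sketch: Input Parameters when the file is read as application/json.
     payload.^raw accesses the underlying raw content instead of the parsed JSON. -->
<db:input-parameters><![CDATA[#[{
	payload: payload.^raw
}]]]></db:input-parameters>
```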

XML for the Mule Application

Paste this code into your Studio XML editor to quickly load the flow for this example into your Mule app:

<?xml version="1.0" encoding="UTF-8"?>

<mule xmlns:ee="http://www.mulesoft.org/schema/mule/ee/core"
	xmlns:db="http://www.mulesoft.org/schema/mule/db" xmlns:file="http://www.mulesoft.org/schema/mule/file"
	xmlns:http="http://www.mulesoft.org/schema/mule/http"
	xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
http://www.mulesoft.org/schema/mule/db http://www.mulesoft.org/schema/mule/db/current/mule-db.xsd
http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.mulesoft.org/schema/mule/ee/core http://www.mulesoft.org/schema/mule/ee/core/current/mule-ee.xsd">
	<http:listener-config name="HTTP_Listener_config">
		<http:listener-connection host="0.0.0.0" port="8081" />
	</http:listener-config>

	<file:config name="File_Config">
		<file:connection workingDir="/dir/containing/file/" />
	</file:config>

	<db:config name="dbConfig">
		<db:oracle-connection host="" user="sys as sysdba" password="password" serviceName="orcl"/>
	</db:config>

	<flow name="read-insert-large-clob-file">
		<http:listener doc:name="/read-insert" config-ref="HTTP_Listener_config" path="/read-insert" />

		<logger level="INFO" message="Starting LARGE CLOB reading" />

		<file:read config-ref="File_Config" path="awesomeFile.json" outputMimeType="application/octet-stream" />

		<logger level="INFO" doc:name="LARGE CLOB's content on flow" />

		<db:insert doc:name="Insert LARGE CLOB" config-ref="dbConfig">
			<db:sql><![CDATA[INSERT INTO AWESOMECLOBDATA (clob_data) VALUES (:payload)]]></db:sql>
			<db:input-parameters><![CDATA[#[{
				payload: payload
			}]]]></db:input-parameters>
		</db:insert>

		<logger level="INFO" message="#[payload]" />
	</flow>
</mule>