
Salesforce Analytics Cloud Connector

Introduction

The Anypoint™ Connector for Salesforce Analytics Cloud lets you connect to the Salesforce Analytics Cloud application using the External Data API. The connector exposes convenient methods for creating and populating datasets in the Salesforce Analytics Cloud system. Load data into Analytics Cloud from many different data sources, whether on-premises or in the cloud. Go beyond .csv files with this connector.

Read through this User Guide to understand how to set up and configure a basic integration flow using the connector. Read through the Technical Reference to understand how the connector operations tie in with the External Data API calls. You will also find demo applications here that illustrate how to use the connector operations using a static data set.

Prerequisites

This document assumes you are familiar with Mule, Anypoint Connectors, and Anypoint Studio Essentials. To increase your familiarity with Studio, consider completing one or more Anypoint Studio Tutorials. Further, this page assumes that you have a basic understanding of Mule flows and Mule Global Elements.

Requirements

To use the Salesforce Analytics Cloud connector, you need the applications and services listed in the Compatibility table below.

Compatibility

Application/Service    Version

Mule Runtime           3.5.0 and higher

External Data API      34.0

Installing and Configuring

Installing

You can install a connector in Anypoint Studio using the instructions in Installing a Connector from Anypoint Exchange.

Using the Connector in a New Project

To use the Salesforce Analytics Cloud connector in a Mule application project:

  1. In Studio, select File > New > Mule Project. Create new project

  2. Enter a name for your new project and leave the remaining options with their default values. Create new project dialog box

  3. If you plan to use Git, select Create a .gitignore file for the project with default .gitignore settings for Studio Projects, and then click Next.

  4. Click Finish to create the project.

Configuring the Salesforce Analytics Cloud Global Element

To use the Salesforce Analytics Cloud connector in your Mule application, you must configure a global Salesforce Analytics Cloud element that can be used by all the Salesforce Analytics Cloud connectors in the application (read more about global elements).

Salesforce Analytics Cloud Connector Authentication

To access the data in a Salesforce Analytics Cloud instance, the connector supports the following authentication mechanisms:

  • Basic authentication

  • OAuth 2.0 Web Flow

  • OAuth 2.0 JWT Bearer

  • OAuth 2.0 SAML Bearer

Basic authentication is the easiest to implement. All you need to do is provide your credentials in a global configuration, then reference the global configuration from any Salesforce Analytics Cloud connector in your application. Basic authentication is generally recommended for internal applications. The OAuth 2.0 mechanisms, on the other hand, involve a few extra steps, but may be preferred if your service is exposed to external users, as they provide better security. More technical information on these authentication mechanisms is available at the following links: Basic Auth, OAuth 2.0 Web Flow, OAuth 2.0 SAML Bearer, and OAuth 2.0 JWT Bearer.

  1. Click the Global Elements tab at the base of the canvas.

  2. On the Global Mule Configuration Elements screen, click Create.

  3. In the Choose Global Type wizard, filter by "salesforce analytics", expand Connector Configuration, and then select one of the four available configurations depending on your needs. Create global element

  4. Click OK.

  5. Enter the global element properties:

    All of the configurations support connecting through a proxy, using the following fields:

    Field Description

    Host

    Host name of the proxy server. If this is not set, no proxy is used.

    Port

    The port number on which the proxy server runs.

    Username

    The username to log in to the server. If this is not set, no authentication is used.

    Password

    The password to log in to the server.

    1. For Salesforce Analytics Cloud: Basic Authentication: Basic authentication configuration

      In the image above, the placeholder values refer to a configuration file placed in the src folder of your project (Learn how to configure properties). You can either hardcode your credentials into the global configuration properties, or reference a configuration file that contains these values. For simpler maintenance and better re-usability of your project, Mule recommends that you use a configuration file. Keeping these values in a separate file is useful if you need to deploy to different environments, such as production, development, and QA, where your access credentials differ. See Deploying to Multiple Environments for instructions on how to manage this.
      Field Description

      Name

      Enter a name for this connector to reference it later.

      Username

      Enter a Salesforce Analytics Cloud username.

      Password

      Enter the corresponding password.

      Security token

      Enter the Security Token for the username.

      Do not confuse the Security Token required in Basic Authentication with the one required in OAuth authentication. Here, the token refers to your user, not to your application, as it does in OAuth.

      Read timeout

      Specifies the amount of time, in milliseconds, that the consumer will wait for a response before it times out. Default value is 0 which means infinite.

      Connection timeout

      Specifies the amount of time, in milliseconds, that the consumer will attempt to establish a connection before it times out. Default value is 0 which means infinite.

      Enable Data Sense

      When enabled, DataSense extracts metadata for Salesforce Analytics Cloud objects to automatically determine the data type and format that your application must deliver to, or can expect from, the Salesforce Analytics Cloud system. By enabling this functionality, Mule discovers the type of data you must send to, or receive from, Salesforce Analytics.

      Metadata file name

      Enter the path for the file that contains the descriptions of the object structure of the row that is uploaded into the Salesforce Analytics Cloud system. This path has to be relative to the src/main/resources directory.

    2. For Salesforce Analytics Cloud: Salesforce Analytics Cloud (OAuth):

      1. On the General tab, configure the following fields: OAuth Web Flow

        Field Description

        Name

        Enter a name for this connector to reference it later.

        Consumer Key

        Enter the consumer key for your connected app from Salesforce.

        Consumer Secret

        Enter the consumer secret for your connected app from Salesforce.

        On No Token

        Select the action that the connector must take if it finds no access token.

        Read timeout

        Specifies the amount of time, in milliseconds, that the consumer will wait for a response before it times out. Default value is 0 which means infinite.

        Connection timeout

        Specifies the amount of time, in milliseconds, that the consumer will attempt to establish a connection before it times out. Default value is 0 which means infinite.

        Enable Data Sense

        When enabled, DataSense extracts metadata for Salesforce Analytics Cloud objects to automatically determine the data type and format that your application must deliver to, or can expect from Salesforce Analytics Cloud system. By enabling this functionality, Mule discovers the type of data you must send to, or receive from Salesforce Analytics.

        Metadata file name

        Enter the path for the file that contains the descriptions of the object structure of the row that is uploaded into the Salesforce Analytics Cloud system. This path has to be relative to the src/main/resources directory.

        For more information on how to create a connected app, see: Creating a Connected App
      2. On the OAuth tab, configure the following fields: OAuth Web Flow OAuth tab

        Field Description

        Domain

        Enter the domain name to use as the callback endpoint. This is not a full URL, but a domain name, IP address, or hostname.

        Local Port

        Enter the local port to use for the callback endpoint.

        Remote Port

        Enter the remote port to use to build the callback URL.

        Path

        Enter the path to use for the callback endpoint.

        Http Connector Reference

        Enter the HTTP Connector Reference to use for the callback endpoint.

        Default Access Token Id

        Enter the Mule Expression to use as an access token.

        Object Store Reference

        Enter the name of the Object Store Reference.

    3. For Salesforce Analytics Cloud: OAuth 2.0 JWT Bearer: OAuth JWT Bearer

      Field Description

      Consumer key

      Enter the consumer key for your connected app from Salesforce.

      Key store

      Enter the path to a Java keystore file that is used to sign the JWT. The path should be relative to the src/main/resources folder.

      Store password

      Enter the password for the keystore provided above.

      Principal

      Enter the username of the user on whose behalf you are going to act.

      Token endpoint

      Enter the URL of the server providing the token. For more info see: Understanding OAuth Endpoints.

      Read timeout

      Specifies the amount of time, in milliseconds, that the consumer will wait for a response before it times out. Default value is 0 which means infinite.

      Connection timeout

      Specifies the amount of time, in milliseconds, that the consumer will attempt to establish a connection before it times out. Default value is 0 which means infinite.

      Enable Data Sense

      When enabled, DataSense extracts metadata for Salesforce Analytics Cloud objects to automatically determine the data type and format that your application must deliver to, or can expect from Salesforce Analytics Cloud system. By enabling this functionality, Mule discovers the type of data you must send to, or receive from Salesforce Analytics.

      Metadata file name

      Enter the path for the file that contains the descriptions of the object structure of the row that is uploaded into the Salesforce Analytics Cloud system. This path has to be relative to the src/main/resources directory.

      How to generate a Keystore file

      1. Go to your Mule workspace, and open the command prompt (for Windows) or Terminal (for Mac).

      2. Type keytool -genkeypair -alias salesforce-cert -keyalg RSA -keystore salesforce-cert.jks and press enter.

      3. Enter the following details:

        1. Password for the key store.

        2. Your first name and last name.

        3. Your organization unit.

        4. The name of your city, your state, and the two-letter code of your country.

      4. The system generates a Java keystore file containing a private/public key pair in your workspace. You need to provide a file path for the keystore in your connector configuration.

      5. Type keytool -exportcert -alias salesforce-cert -file salesforce-cert.crt -keystore salesforce-cert.jks and press enter.

      6. The system now exports the public key from the keystore into the workspace. This is the public key that you need to enter in your Salesforce instance.

      7. Make sure that you have both the key store (salesforce-cert.jks) and the public key (salesforce-cert.crt) files in your workspace.

    4. For Salesforce Analytics Cloud: OAuth 2.0 SAML Bearer: OAuth SAML Bearer

      Field Description

      Consumer key

      Enter the consumer key for your connected app from Salesforce.

      Key store

      Enter the path to a Java keystore file that is used to sign the SAML assertion. The path should be relative to the src/main/resources folder.

      Store password

      Enter the password for the keystore provided above.

      Principal

      Enter the username of the user on whose behalf you are going to act.

      Token endpoint

      Enter the URL to the server providing the token. For more info see: Understanding OAuth Endpoints.

      Read timeout

      Specifies the amount of time, in milliseconds, that the consumer will wait for a response before it times out. Default value is 0 which means wait indefinitely.

      Connection timeout

      Specifies the amount of time, in milliseconds, that the consumer will attempt to establish a connection before it times out. Default value is 0 which means wait indefinitely.

      Enable Data Sense

      When enabled, DataSense extracts metadata for Salesforce Analytics Cloud objects to automatically determine the data type and format that your application must deliver to, or can expect from Salesforce Analytics Cloud system. By enabling this functionality, Mule discovers the type of data you must send to, or receive from Salesforce Analytics.

      Metadata file name

      Enter the path for the file that contains the descriptions of the object structure of the row that is uploaded into the Salesforce Analytics Cloud system. This path has to be relative to the src/main/resources directory.

      How to generate a Keystore file

    1. Go to your Mule workspace, and open the command prompt (for Windows) or Terminal (for Mac).

    2. Type keytool -genkeypair -alias salesforce-cert -keyalg RSA -keystore salesforce-cert.jks and press enter.

    3. Enter the following details:

      1. Password for the key store.

      2. Your first name and last name.

      3. Your organization unit.

      4. The name of your city, your state, and the two-letter code of your country.

    4. The system generates a Java keystore file containing a private/public key pair in your workspace. You need to provide a file path for the keystore in your connector configuration.

    5. Type keytool -exportcert -alias salesforce-cert -file salesforce-cert.crt -keystore salesforce-cert.jks and press enter.

    6. The system now exports the public key from the keystore into the workspace. This is the public key that you need to enter in your Salesforce instance.

    7. Make sure that you have both the key store (salesforce-cert.jks) and the public key (salesforce-cert.crt) files in your workspace.
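Whichever dialog you use, Studio persists the global element as XML in your project file. For reference, the Basic Authentication configuration described above corresponds to an element along these lines (the ${...} values are property placeholders resolved from a properties file, and the name is whatever you entered in the Name field):

```xml
<!-- Basic Authentication global element; credential values resolve from a properties file -->
<sfdc-analytics:config name="Salesforce_Analytics_Cloud__Basic_authentication"
        username="${salesforce.username}"
        password="${salesforce.password}"
        securityToken="${salesforce.securityToken}"
        metadataFileName="${metadata.file.analytics}"
        url="${salesforce.url}"
        doc:name="Salesforce Analytics Cloud: Basic authentication"/>
```

The example use cases later in this guide generate exactly this element through the Studio dialogs.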

Using the Connector

You can use the Salesforce Analytics Cloud connector as an outbound connector in your flow to push data into the Salesforce Analytics Cloud system. To use it as an outbound connector, simply place the connector in your flow at any point after an inbound endpoint. Note that you can also use the Salesforce Analytics Cloud connector in a batch process to push data to the Salesforce Analytics Cloud system in batches.

Use cases

The following are the common use cases for the Salesforce Analytics Cloud connector:

  1. Create a data set in the Salesforce Analytics Cloud system, upload data into the data set from an input file, and trigger the system to start processing the data. Use this when dealing with smaller files, preferably less than 10 MB.

  2. Create a data set in the Salesforce Analytics Cloud system, read the data from an input file and split it into batches, upload batches of data into the data set, and trigger the system to start processing the data. This approach is recommended for ingesting large volumes of data. Make sure that your batch commit size is less than or equal to 10 MB for optimal performance. The connector throws a warning if the batch commit size is greater than 10 MB.
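The second use case maps onto Mule's batch module. The following is a hypothetical skeleton only: the batch elements come from the Mule 3 batch module, while the step name, commit size, and the placement of the connector operations (shown as comments) are illustrative, not prescribed by the connector.

```xml
<!-- Hypothetical shape of use case 2; assumes the batch namespace
     xmlns:batch="http://www.mulesoft.org/schema/mule/batch" is declared -->
<batch:job name="analyticsUploadJob">
    <batch:input>
        <!-- read the CSV file, create the data set,
             and keep the returned data set id in a variable -->
    </batch:input>
    <batch:process-records>
        <batch:step name="uploadStep">
            <!-- accumulate records and upload each group as one chunk;
                 keep each committed chunk at or below 10 MB -->
            <batch:commit size="5000">
                <!-- the connector's "Upload external data" operation goes here -->
            </batch:commit>
        </batch:step>
    </batch:process-records>
    <batch:on-complete>
        <!-- the connector's "Start data processing" operation goes here -->
    </batch:on-complete>
</batch:job>
```

Example Use Case 2 below builds this structure step by step in Studio.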

Adding the Salesforce Analytics Cloud Connector to a Flow

  1. Create a new Mule project in Anypoint Studio.

  2. Drag the Salesforce Analytics Cloud connector onto the canvas, then select it to open the properties editor.

  3. Configure the connector’s parameters:

    Analytics operation config

    Field Description

    Display Name

    Enter a unique label for the connector in your application.

    Connector Configuration

    Select a global Salesforce Analytics connector element from the dropdown.

    Operation

    Select an operation for the connector to perform.

  4. Save your configurations.

Example Use Case 1

Create a dataset and upload data into it by processing all the data in one big chunk.

  1. Create a new Mule Project by clicking on File > New > Mule Project. In the new project dialog box, the only thing you are required to enter is the name of the project. Click on Finish.

    New project dialog

  2. Now let’s create the flow. Navigate through the project’s structure and double-click on src/main/app/project-name.xml and follow the steps below.

  3. On the right side of Studio, search for File.

    Search for File

  4. Drag the File element onto the canvas.

  5. Search for DataMapper and drag it after File.

  6. Search for Salesforce Analytics Cloud and drag it after DataMapper.

  7. After completing the previous steps you should see:

    Unconfigured All In One flow

  8. Let’s start configuring each element. Double-click on the File element.

    File component

  9. Click on …​ next to the Path field.

  10. Choose a folder with only the csv file that you want to upload. You can download our example file and save it into the chosen folder.

  11. Double-click on Salesforce Analytics Cloud connector.

  12. Click on the plus sign next to the Connector configuration dropdown.

    Create data set config

  13. A pop-up appears asking for the type of configuration. Choose the Salesforce Analytics Cloud: Basic Authentication option and click OK.

  14. A new pop-up appears asking for the information required for basic authentication. For more info see the Installing and Configuring section.

    Basic Auth config

  15. In the Connection section enter the credentials used to access the Salesforce instance.

  16. In the DataSense metadata section for the Metadata file name field enter the filename that describes the data structure you are going to upload. The filename has to be relative to the src/main/resources directory of your Studio project. For the file provided a few steps earlier (CsvDemoTestData.csv) you can use the metadata file provided below, but do not forget to copy it into the src/main/resources directory.

  17. Click OK to return to the Salesforce Analytics Cloud tab.

  18. From the Operation dropdown in the Basic Settings section choose Upload external data into new data set and start processing.

  19. From the Operation dropdown in the DataSet info section choose OVERWRITE.

  20. In the Description field enter Test data set.

  21. In the Label field under DataSet info enter Test data set.

  22. In the Name field under DataSet info enter test_data_set.

  23. Double-click on the DataMapper element.

  24. Click on the Type dropdown in the Input section and choose CSV.

    Data Mapper set CSV as input type

  25. Click on …​ next to the CSV field of the Input section and browse to the csv file in the same folder you selected for the File connector.

  26. Click the Create mapping button and you should see something similar to the picture below.

    Data mapper mappings

  27. Now everything is set up and the application can be deployed.

It’s time to test the app. Run the app in Anypoint Studio (Right-click on project name > Run as > Mule Application). Monitor the Studio console and check the Salesforce Wave Analytics UI to see if the data was uploaded. The same flow can also be built directly in the XML editor, as the following steps show:

  1. Add the sfdc-analytics namespace to the mule element as follows:

    xmlns:sfdc-analytics="http://www.mulesoft.org/schema/mule/sfdc-analytics"
  2. Add the location of the analytics schema referred to by the sfdc-analytics namespace:

    http://www.mulesoft.org/schema/mule/sfdc-analytics http://www.mulesoft.org/schema/mule/sfdc-analytics/current/mule-sfdc-analytics.xsd
  3. Add the data-mapper namespace as follows:

    xmlns:data-mapper="http://www.mulesoft.org/schema/mule/ee/data-mapper"
  4. Add the location of the data mapper schema referred to by the data-mapper namespace with the following value:

    http://www.mulesoft.org/schema/mule/ee/data-mapper http://www.mulesoft.org/schema/mule/ee/data-mapper/current/mule-data-mapper.xsd
  5. Add a context:property-placeholder element to your project, then configure its attributes as follows:

    <context:property-placeholder location="mule-app.properties"/>
  6. Add a data-mapper:config element to your project, then configure its attributes as follows:

    <data-mapper:config name="CSV_To_List_Record_" transformationGraphPath="csv_to_list_record_.grf" doc:name="CSV_To_List_Record_"/>
  7. Add a sfdc-analytics:config element to your project, then configure its attributes as follows:

    <sfdc-analytics:config name="Salesforce_Analytics_Cloud__Basic_authentication" username="${salesforce.username}" password="${salesforce.password}" securityToken="${salesforce.securityToken}" metadataFileName="${metadata.file.analytics}" doc:name="Salesforce Analytics Cloud: Basic authentication" url="${salesforce.url}"/>
  8. Add an empty flow element to your project as follows:

    <flow name="analytics_performanceFlow">
    </flow>
  9. Within the flow element add a file:inbound-endpoint element as follows:

    <file:inbound-endpoint path="path_to_folder_to_monitor" moveToDirectory="path_to_folder_where_to_move_processed_files" responseTimeout="10000" doc:name="File">
    </file:inbound-endpoint>
  10. Within the flow element add a data-mapper:transform element as follows:

    <data-mapper:transform config-ref="CSV_To_List_Record_" doc:name="CSV To List&lt;Record&gt;"/>
  11. Within the flow element add a sfdc-analytics:upload-external-data-into-new-data-set-and-start-processing element as follows:

    <sfdc-analytics:upload-external-data-into-new-data-set-and-start-processing config-ref="Salesforce_Analytics_Cloud__Basic_authentication" type="recordId" operation="UPSERT" description="Test upload of 2500 records all in one step" label="records_2500_in_one_step" dataSetName="records_2500_in_one_step_with_app" edgemartContainer="TestContainer" notificationSent="ALWAYS" notificationEmail="name@email.com" doc:name="Salesforce Analytics Cloud">
        <sfdc-analytics:payload ref="#[payload]"/>
    </sfdc-analytics:upload-external-data-into-new-data-set-and-start-processing>
  12. In the end the xml file should look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <mule xmlns:file="http://www.mulesoft.org/schema/mule/file"
            xmlns:context="http://www.springframework.org/schema/context"
            xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
            xmlns:spring="http://www.springframework.org/schema/beans" version="EE-3.7.0"
            xmlns:sfdc-analytics="http://www.mulesoft.org/schema/mule/sfdc-analytics"
            xmlns:data-mapper="http://www.mulesoft.org/schema/mule/ee/data-mapper"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:schemaLocation="http://www.mulesoft.org/schema/mule/sfdc-analytics http://www.mulesoft.org/schema/mule/sfdc-analytics/current/mule-sfdc-analytics.xsd
    http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
    http://www.mulesoft.org/schema/mule/ee/data-mapper http://www.mulesoft.org/schema/mule/ee/data-mapper/current/mule-data-mapper.xsd
    http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-current.xsd
    http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
    http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd">
            <context:property-placeholder location="mule-app.properties"/>
            <sfdc-analytics:config name="Salesforce_Analytics_Cloud__Basic_authentication" username="${salesforce.username}" password="${salesforce.password}" securityToken="${salesforce.securityToken}" metadataFileName="${metadata.file.analytics}" doc:name="Salesforce Analytics Cloud: Basic authentication" url="${salesforce.url}"/>
            <data-mapper:config name="CSV_To_List_Record_" transformationGraphPath="csv_to_list_record_.grf" doc:name="CSV_To_List_Record_"/>
            <flow name="analytics_performanceFlow">
            <file:inbound-endpoint path="path_to_folder_to_monitor" moveToDirectory="path_to_folder_where_to_move_processed_files" responseTimeout="10000" doc:name="File">
            </file:inbound-endpoint>
        <data-mapper:transform config-ref="CSV_To_List_Record_" doc:name="CSV To List&lt;Record&gt;"/>
            <sfdc-analytics:upload-external-data-into-new-data-set-and-start-processing config-ref="Salesforce_Analytics_Cloud__Basic_authentication" type="recordId" operation="UPSERT" description="Test upload of 2500 records all in one step" label="records_2500_in_one_step" dataSetName="records_2500_in_one_step_with_app" edgemartContainer="TestContainer" notificationSent="ALWAYS" notificationEmail="name@email.com" doc:name="Salesforce Analytics Cloud">
                <sfdc-analytics:payload ref="#[payload]"/>
            </sfdc-analytics:upload-external-data-into-new-data-set-and-start-processing>
        </flow>
    </mule>
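The ${...} placeholders in the configuration above are resolved from mule-app.properties by the context:property-placeholder element. A hypothetical set of entries follows; every value is a placeholder that you must replace with your own credentials and paths:

```properties
# Hypothetical values only; replace with your own credentials
salesforce.username=user@example.com
salesforce.password=myPassword
salesforce.securityToken=mySecurityToken
# Salesforce SOAP login endpoint for the API version in use
salesforce.url=https://login.salesforce.com/services/Soap/u/34.0
# Path relative to src/main/resources
metadata.file.analytics=metadata.json
```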

Example Use Case 2

Create a dataset and upload data into it by processing the data in several chunks.

  1. Create a new Mule Project by clicking on File > New > Mule Project. In the new project dialog box, the only thing you are required to enter is the name of the project. Click Finish.

    New project dialog

  2. Now let’s create the flow. Navigate through the project’s structure, double-click on src/main/app/project-name.xml, and follow the steps below.

  3. On the right side of Studio search for Batch.

    Search for batch

  4. Select Batch and drag it onto the canvas.

    Batch component on canvas

    When using the Batch component, tune it based on the amount of memory that you provide to the Mule ESB server.
  5. As before, search for File.

  6. Drag File into the Input section of the batch element created earlier.

  7. Search for Message Enricher, then drag and drop it after File.

  8. Search for DataMapper and drag it after Message Enricher.

  9. Search for the connector named Salesforce Analytics Cloud and drag it into Message Enricher.

  10. Search for the Batch Commit component in the palette and drag it into the Batch step section of Batch.

    When using DataMapper, ensure the "Streaming" option in the Batch Commit component is enabled. This way you avoid loading the entire input in memory.
  11. Search for the connector named Salesforce Analytics Cloud and drag it into the Batch Commit section of Batch step.

    Bear in mind that the default threading profile uses 16 threads, and each thread loads data in chunks of 100 records until it reaches the Commit Size set on the Batch Commit component. You can reduce the memory used by decreasing the number of threads.

    Finally, be aware that the Salesforce Analytics Cloud connector also uses some memory internally, so tune the Commit Size in the Batch Commit component for efficiency; do not set the parameter too low.

  12. Drag another Salesforce Analytics Cloud connector into the On complete section of Batch.

  13. After completing all the above steps you should see:

    Unconfigured Batch flow

  14. Let’s start configuring each element. Double-click on the File element.

    File component

  15. Click on …​ next to the Path field.

  16. Choose a folder with only the csv file that you want to upload. You can download our example file and save it into your chosen folder.

  17. Double-click on the Salesforce Analytics Cloud connector in the Message Enricher.

  18. Click on the plus sign next to the Connector configuration dropdown.

    Create data set config

  19. A pop-up appears asking for the type of configuration. Choose the Salesforce Analytics Cloud: Basic Authentication option and click OK.

  20. A new pop-up asks for the information required for basic authentication. For more info see the Installing and Configuring section.

    Basic Auth config

  21. In the Connection section enter the credentials used to access the Salesforce instance.

  22. In the DataSense metadata section for the Metadata file name field enter the filename that describes the data structure you are going to upload. The filename has to be relative to the src/main/resources directory of your Studio project. For the file provided a few steps earlier (CsvDemoTestData.csv) you can use the metadata file provided below, but do not forget to copy it into the src/main/resources directory.

  23. Click OK to return to the Salesforce Analytics Cloud tab.

  24. From the Operation dropdown in the Basic Settings section choose Create data set.

  25. From the Operation dropdown in the DataSet info section choose OVERWRITE.

  26. In the Description field enter Test data set.

  27. In the Label field under DataSet info enter Test data set.

  28. In the Name field under DataSet info enter test_data_set.

  29. Double-click on Message Enricher and fill in the fields as below.

    Message Enricher Config

  30. Double-click on Batch Commit from Batch step.

  31. For Commit size enter the number of records you want to process in one step. (e.g. 5000)

    The application logs a warning if the data provided for processing in one step exceeds the maximum size accepted by the Analytics Cloud system in one step. The message looks like this: "The size of data provided for processing in one step exceeded the maximum size of one chunk allowed by Analytics Cloud System. In order to optimize the memory used you should decrease the size of data provided in one step." If you see this message, decrease the Commit Size until the message no longer appears.
  32. Double-click on Salesforce Analytics Cloud from Batch Commit.

  33. From the Connector configuration dropdown choose Salesforce_Analytics_Cloud__Basic_authentication (only this option should be available).

  34. Choose Upload external data as the operation.

  35. Check the bottom right-hand corner of Studio and wait for DataSense to fetch the metadata.

    Fetch metadata progress bar

  36. For Data Set Id enter #[variable:dataSetId]

  37. Double-click on the DataMapper element.

  38. Click on the Type dropdown in the Input section and choose CSV.

    Data Mapper set CSV as input type

  39. Click on …​ next to the CSV field of the Input section and browse to the csv file in the same folder you selected for the File connector.

  40. Click the Create mapping button and you should see something like the following.

    Data mapper mappings

  41. Double-click on Salesforce Analytics Cloud from the On complete section of Batch.

  42. From the Connector configuration dropdown select Salesforce_Analytics_Cloud__Basic_authentication (only this option should be available).

  43. From the Operation dropdown select Start data processing.

  44. In the Data Set Id field enter #[variable:dataSetId]

  45. At this point, everything should be set up and the application can be deployed.

It is time to test the application. Run it in Anypoint Studio (right-click the project name in Studio's Package Explorer and select Run As > Mule Application). Monitor the Studio console and check the Salesforce Wave Analytics UI to verify that the data was uploaded.
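For a quick test, you can drop a small CSV file into the folder monitored by the File endpoint. The columns below are purely hypothetical; use whatever structure matches the metadata file and DataMapper mapping you configured:

    ```csv
    Name,Amount,Date
    Acme Corp,1200,2015-06-01
    Globex,850,2015-06-02
    ```

Once the file is picked up and moved to the processed-files directory, the dataset should appear in the Analytics Cloud UI after the on-complete processing step finishes.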

To create the equivalent application directly in the XML editor:

  1. Add the sfdc-analytics namespace to the mule element as follows:

    xmlns:sfdc-analytics="http://www.mulesoft.org/schema/mule/sfdc-analytics"
  2. Add the location of the analytics schema referenced by the sfdc-analytics namespace, with the following value:

    http://www.mulesoft.org/schema/mule/sfdc-analytics http://www.mulesoft.org/schema/mule/sfdc-analytics/current/mule-sfdc-analytics.xsd
  3. Add the data-mapper namespace as follows:

    xmlns:data-mapper="http://www.mulesoft.org/schema/mule/ee/data-mapper"
  4. Add the location of the data mapper schema referenced by the data-mapper namespace, with the following value:

    http://www.mulesoft.org/schema/mule/ee/data-mapper http://www.mulesoft.org/schema/mule/ee/data-mapper/current/mule-data-mapper.xsd
  5. Add a context:property-placeholder element to your project, then configure its attributes as follows:

    <context:property-placeholder location="mule-app.properties"/>
  6. Add a data-mapper:config element to your project, then configure its attributes as follows:

    <data-mapper:config name="CSV_To_List_Record_" transformationGraphPath="csv_to_list_record_.grf" doc:name="CSV_To_List_Record_"/>
  7. Add a sfdc-analytics:config element to your project, then configure its attributes as follows:

    <sfdc-analytics:config name="Salesforce_Analytics_Cloud__Basic_authentication" username="${salesforce.username}" password="${salesforce.password}" securityToken="${salesforce.securityToken}" metadataFileName="${metadata.file.analytics}" doc:name="Salesforce Analytics Cloud: Basic authentication" url="${salesforce.url}"/>
  8. Add an empty batch:job element to your project as follows:

    <batch:job name="demoBatch">
        <batch:input>
        </batch:input>
        <batch:process-records>
        </batch:process-records>
        <batch:on-complete>
        </batch:on-complete>
    </batch:job>
  9. Add a file:inbound-endpoint element into batch:input of batch:job, then configure it as follows:

    <file:inbound-endpoint path="path_to_folder_to_monitor" moveToDirectory="path_to_folder_where_to_move_processed_files" responseTimeout="10000"
                           doc:name="File For Batch">
    </file:inbound-endpoint>
  10. Add an empty enricher element into batch:input of batch:job, then configure it as follows:

    <enricher source="#[payload]" target="#[variable:dataSetId]" doc:name="Message Enricher">
    </enricher>
  11. Add a sfdc-analytics:create-data-set element into enricher, then configure it as follows:

    <sfdc-analytics:create-data-set config-ref="Salesforce_Analytics_Cloud__Basic_authentication" operation="OVERWRITE" description="${batch.dataSetDescription}" label="${batch.dataSetLabel}" dataSetName="${batch.dataSetName}" edgemartContainer="${batch.dataSetEdgemartContainer}" notificationSent="ALWAYS" notificationEmail="name@email.com" doc:name="Salesforce Analytics Cloud"/>
  12. Add a data-mapper:transform element into batch:input of batch:job, then configure it as follows:

    <data-mapper:transform config-ref="CSV_To_List_Record_" doc:name="CSV To List<Record>"/>
  13. Add an empty batch:step element into batch:process-records of batch:job, then configure it as follows:

    <batch:step name="Batch_Step">
    </batch:step>
  14. Add an empty batch:commit element into batch:step of batch:process-records, then configure it as follows:

    <batch:commit doc:name="Batch Commit" size="3000">
    </batch:commit>
  15. Add a sfdc-analytics:upload-external-data element into batch:commit of batch:step of batch:process-records, then configure it as follows:

    <sfdc-analytics:upload-external-data config-ref="Salesforce_Analytics_Cloud__Basic_authentication" type="recordId" dataSetId="#[variable:dataSetId]" doc:name="Salesforce Analytics Cloud">
        <sfdc-analytics:payload ref="#[payload]"/>
    </sfdc-analytics:upload-external-data>
  16. Add a sfdc-analytics:start-data-processing element into batch:on-complete of batch:job, then configure it as follows:

    <sfdc-analytics:start-data-processing config-ref="Salesforce_Analytics_Cloud__Basic_authentication" dataSetId="#[variable:dataSetId]" doc:name="Salesforce Analytics Cloud"/>
  17. In the end, the XML file should look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <mule xmlns:batch="http://www.mulesoft.org/schema/mule/batch"
            xmlns:file="http://www.mulesoft.org/schema/mule/file"
            xmlns:context="http://www.springframework.org/schema/context"
            xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
            xmlns:spring="http://www.springframework.org/schema/beans" version="EE-3.7.0"
            xmlns:sfdc-analytics="http://www.mulesoft.org/schema/mule/sfdc-analytics"
            xmlns:data-mapper="http://www.mulesoft.org/schema/mule/ee/data-mapper"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:schemaLocation="
    http://www.mulesoft.org/schema/mule/batch http://www.mulesoft.org/schema/mule/batch/current/mule-batch.xsd
    http://www.mulesoft.org/schema/mule/sfdc-analytics http://www.mulesoft.org/schema/mule/sfdc-analytics/current/mule-sfdc-analytics.xsd
    http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
    http://www.mulesoft.org/schema/mule/ee/data-mapper http://www.mulesoft.org/schema/mule/ee/data-mapper/current/mule-data-mapper.xsd
    http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-current.xsd
    http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
    http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd">
            <context:property-placeholder location="mule-app.properties"/>
            <sfdc-analytics:config name="Salesforce_Analytics_Cloud__Basic_authentication" username="${salesforce.username}" password="${salesforce.password}" securityToken="${salesforce.securityToken}" metadataFileName="${metadata.file.analytics}" doc:name="Salesforce Analytics Cloud: Basic authentication" url="${salesforce.url}"/>
            <data-mapper:config name="CSV_To_List_Record_" transformationGraphPath="csv_to_list_record_.grf" doc:name="CSV_To_List_Record_"/>
            <batch:job name="demoBatch">
            <batch:input>
                <file:inbound-endpoint path="path_to_folder_to_monitor" moveToDirectory="path_to_folder_where_to_move_processed_files" responseTimeout="10000"
                                       doc:name="File For Batch">
                </file:inbound-endpoint>
                <enricher source="#[payload]" target="#[variable:dataSetId]" doc:name="Message Enricher">
                    <sfdc-analytics:create-data-set config-ref="Salesforce_Analytics_Cloud__Basic_authentication" operation="OVERWRITE" description="${batch.dataSetDescription}" label="${batch.dataSetLabel}" dataSetName="${batch.dataSetName}" edgemartContainer="${batch.dataSetEdgemartContainer}" notificationSent="ALWAYS" notificationEmail="name@email.com" doc:name="Salesforce Analytics Cloud"/>
                </enricher>
                <data-mapper:transform config-ref="CSV_To_List_Record_" doc:name="CSV To List<Record>"/>
            </batch:input>
            <batch:process-records>
                <batch:step name="Batch_Step">
                <batch:commit doc:name="Batch Commit" size="3000">
                        <sfdc-analytics:upload-external-data config-ref="Salesforce_Analytics_Cloud__Basic_authentication" type="recordId" dataSetId="#[variable:dataSetId]" doc:name="Salesforce Analytics Cloud">
                            <sfdc-analytics:payload ref="#[payload]"/>
                        </sfdc-analytics:upload-external-data>
                    </batch:commit>
                </batch:step>
            </batch:process-records>
            <batch:on-complete>
                <sfdc-analytics:start-data-processing config-ref="Salesforce_Analytics_Cloud__Basic_authentication" dataSetId="#[variable:dataSetId]" doc:name="Salesforce Analytics Cloud"/>
            </batch:on-complete>
        </batch:job>
    </mule>
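
    The configuration above resolves its `${…}` placeholders from the `mule-app.properties` file referenced by the context:property-placeholder element. A minimal sketch of that file follows; every value is a placeholder you must replace with your own, and the URL format shown is an assumption you should verify against your Salesforce org:

    ```properties
    # Salesforce Analytics Cloud credentials (replace with your own values)
    salesforce.username=user@example.com
    salesforce.password=your_password
    salesforce.securityToken=your_security_token
    # Assumed login endpoint format; verify for your org
    salesforce.url=https://login.salesforce.com/services/Soap/u/34.0

    # Path to the metadata JSON file describing the dataset structure
    metadata.file.analytics=metadata.json

    # Dataset attributes used by the create-data-set operation
    batch.dataSetDescription=Test data set description
    batch.dataSetLabel=Test data set
    batch.dataSetName=test_data_set
    batch.dataSetEdgemartContainer=your_edgemart_container
    ```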