Using Anypoint Studio to Configure MuleSoft AI Chain Connector 1.0 - Mule 4
Anypoint Studio (Studio) editors help you design and update your Mule applications, properties, and configuration files.
To add and configure a connector in Studio:
- Create a Mule project.
- Add the connector to your Mule project.
- Configure a source for the connector's flow.
- Add a connector operation to the flow.
- Configure a global element for the connector.
- Configure the additional connector fields.
When you run the connector, you can view the app log to check for problems in real time, as described in View the App Log.
If you are new to configuring connectors in Studio, see Using Anypoint Studio to Configure a Connector. If, after reading this topic, you need additional information about the connector fields, see the MuleSoft AI Chain Connector Reference.
Create a Mule Project
In Studio, create a new Mule project in which to add and configure Anypoint Connector for MuleSoft AI Chain (MuleSoft AI Chain Connector):
- In Studio, select File > New > Mule Project.
- Enter a name for your Mule project and click Finish.
Add the Connector to Your Mule Project
Add MuleSoft AI Chain Connector to your Mule project to automatically populate the XML code with the connector’s namespace and schema location and add the required dependencies to the project’s pom.xml file:
- In Mule Palette, click (X) Search in Exchange.
- In Add Dependencies to Project, type mulesoft ai chain in the search field.
- Click MuleSoft AI Chain Connector in Available modules.
- Click Add.
- Click Finish.
Adding a connector to a Mule project in Studio does not make that connector available to other projects in your Studio workspace.
Configure a Source
A source initiates a flow when a specified condition is met. You can configure one of these sources to use with MuleSoft AI Chain Connector:
- HTTP > Listener
  Initiates a flow each time it receives a request on the configured host and port.
- Scheduler
  Initiates a flow when a time-based condition is met.
For example, to configure an HTTP > Listener source, follow these steps:
- In Mule Palette, select HTTP > Listener.
- Drag Listener to the Studio canvas.
- On the Listener configuration screen, optionally change the value of the Display Name field.
- Specify a value for the Path field.
- Click the plus sign (+) next to the Connector configuration field to configure a global element that can be used by all instances of the HTTP > Listener source in the app.
- On the General tab, specify the connection information for the connector.
- On the TLS tab, optionally specify the TLS information for the connector.
- On the Advanced tab, optionally specify reconnection information, including a reconnection strategy.
- Click Test Connection to confirm that Mule can connect with the specified server.
- Click OK.
Add a Connector Operation to the Flow
When you add a connector operation to your flow, you are specifying an action for that connector to perform.
To add an operation for MuleSoft AI Chain Connector, follow these steps:
- In Mule Palette, select MuleSoft AI Chain Connector and then select the operation to add.
- Drag the operation onto the Studio canvas, next to the source.
Configure a Global Element for the Connector
When you configure a connector, configure a global element that all instances of that connector in the app can use. Configuring a global element requires you to provide the authentication credentials that the connector requires to access the target system.
You can reference a configuration file that contains ANT-style property placeholders (recommended), or you can enter your authorization credentials in the global configuration properties. For information about the benefits of using property placeholders and how to configure them, see Anypoint Connector Configuration.
To configure the global element for MuleSoft AI Chain Connector, follow these steps:
- Select the operation in the Studio canvas.
- On the properties screen for the operation, click the Add (+) icon to access the global element configuration fields.
- In the Global Elements Properties > General tab, choose your preferred large language model (LLM) type from LLM:
  - Anthropic
  - Azure OpenAI
  - Mistral AI
  - Ollama
  - OpenAI
  - GroqAI
- From Config type, choose from these configuration types:
  - Environment Variables
    This configuration requires you to set the environment variables in the operating system where the Mule runtime is deployed. When you choose this option, enter a hyphen (-) in the File path field. Based on the LLM type you chose, you must set different environment variables. These are the supported environment variables for each supported LLM type (see the DataWeave sketch after these steps for a quick way to verify that they are set):
    - Anthropic: ANTHROPIC_API_KEY
    - AWS Bedrock: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY
    - Azure OpenAI: AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_KEY, AZURE_OPENAI_DEPLOYMENT_NAME
    - Mistral AI: MISTRAL_AI_API_KEY
    - OpenAI: OPENAI_API_KEY
    - Ollama: OLLAMA_BASE_URL
    - Groq AI: GROQ_API_KEY
  - Configuration JSON
    This configuration requires you to provide a configuration JSON file with all the required LLM properties. This is an example of the configuration JSON file:

    {
      "OPENAI": {
        "OPENAI_API_KEY": "YOUR_OPENAI_API_KEY"
      },
      "MISTRAL_AI": {
        "MISTRAL_AI_API_KEY": "YOUR_MISTRAL_AI_API_KEY"
      },
      "OLLAMA": {
        "OLLAMA_BASE_URL": "http://baseurl.ollama.com"
      },
      "GROQAI_OPENAI": {
        "GROQ_API_KEY": "YOUR_GROQAI_APIKEY"
      },
      "ANTHROPIC": {
        "ANTHROPIC_API_KEY": "YOUR_ANTHROPIC_API_KEY"
      },
      "AZURE_OPENAI": {
        "AZURE_OPENAI_KEY": "YOUR_AZURE_OPENAI_KEY",
        "AZURE_OPENAI_ENDPOINT": "http://endpoint.azure.com",
        "AZURE_OPENAI_DEPLOYMENT_NAME": "YOUR_DEPLOYMENT_NAME"
      }
    }

    Make sure you fill out the required properties for your LLM type. You can store the file externally or add it directly in the Mule application under the src/main/resources folder by using this DataWeave expression (a second DataWeave sketch after these steps shows one way to read the bundled file):

    mule.home ++ "/apps/" ++ app.name ++ "/envVars.json"
- In File path, enter a hyphen (-) if you chose Environment Variables as the configuration type. If you chose Configuration JSON, enter the path to the JSON file, or the DataWeave expression if you are storing the file in the Mule application under the src/main/resources folder.
- In Model name, choose from the supported models for the LLM provider.
- In Temperature, enter a number between 0 and 2, or use the default value 0.7. The temperature controls the randomness of the output: higher values produce more random outputs, while values closer to 0 produce more deterministic outputs.
- In LLM Timeout, enter the number of seconds after which the request times out. The default is 60 seconds.
- In Max tokens, enter the maximum number of LLM tokens to use when generating a response. This parameter helps control the usage and costs when engaging with LLMs.
- Click OK.
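If you chose the Environment Variables configuration type, you can check that the variables required for your LLM type are actually visible to the Mule runtime before you deploy. The following is a minimal, illustrative DataWeave sketch (for example, evaluated in a Transform Message component); it assumes OpenAI as the LLM type, so replace the variable names with the ones listed for your provider.

%dw 2.0
output application/json
import envVar from dw::System

// Variable names required for the OpenAI LLM type; swap in the names
// listed above for the provider you chose.
var requiredVars = ["OPENAI_API_KEY"]
---
{
    // Any required variable that is not set in the runtime's environment.
    missingVariables: requiredVars filter ((name) -> envVar(name) == null)
}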
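Similarly, if you chose the Configuration JSON type and bundled the file in the Mule application, you can sanity-check its contents with a short DataWeave script. This sketch is only an example: it assumes the file is named envVars.json, sits under src/main/resources (and is therefore on the application classpath), and follows the example file shown above.

%dw 2.0
output application/json

// Reads the bundled envVars.json from the application classpath and checks
// that the OPENAI section contains an API key.
var llmConfig = readUrl("classpath://envVars.json", "application/json")
---
{
    openAiKeyConfigured: llmConfig.OPENAI.OPENAI_API_KEY != null
}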
Configure Additional Connector Fields
After you configure a global element for the connector, configure the other required fields. The required fields vary depending on which connector operation you use.
View the App Log
To check for problems, you can view the app log as follows:
- If you’re running the app from Anypoint Platform, the app log output is visible on the Anypoint Studio console window.
- If you’re running the app using Mule from the command line, the app log output is visible in your operating system console.
Unless the log file path is customized in the app’s log file (log4j2.xml), you can also view the app log in the default location MULE_HOME/logs/<app-name>.log. You can configure the location of the log path in the app log file log4j2.xml.