Set Up a Caching Strategy
You can create a caching strategy from a Cache scope properties panel or a Global Elements configuration in Anypoint Studio. After you create a strategy, a Cache component in your flow can reference it.
You can configure the cache size, expiration time, and maximum allowed entries by setting these values in the object store that you define or reference from the caching strategy.
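As a sketch, a global Object Store with size and expiration settings might look like the following. The name and values here are illustrative assumptions, not part of this procedure:

```xml
<!-- Illustrative global Object Store; name and values are assumptions -->
<os:object-store name="Caching_ObjectStore"
                 maxEntries="500"
                 entryTtl="30"
                 entryTtlUnit="SECONDS"
                 expirationInterval="10"
                 expirationIntervalUnit="SECONDS"
                 persistent="false" />
```

A caching strategy that references this store evicts entries when they expire after `entryTtl` or when `maxEntries` is exceeded, with expired entries removed on each `expirationInterval` tick.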
Follow these steps to configure a caching strategy:
Open the caching strategy configuration window, either from the Cache scope properties panel or from the Caching Strategies option in the Global Elements tab in Studio.
Define the Name of the caching strategy.
Define the Object Store to use by selecting one of the following options:
* Create an Object Store configuration that is specific to this caching strategy.
* Select an existing Object Store, or create a new global Object Store that this caching strategy and any other component in your application can reference.
Select a mechanism for generating the key used to store events within the caching strategy:
* A DataWeave expression (for example, an expression that derives the key from a unique field in the payload). Two requests that are equal must generate the same key; otherwise, the cache can return incorrect results.
* A custom key generator, which requires you to implement a key generator class.
(Optional) Open the Advanced tab in the properties window to configure advanced settings:
Select or create a Response Generator. This step requires you to implement a response generator class.
Select the Event Copy Strategy:
* Simple event copy strategy: Data is immutable. This is the default value.
* Serializable event copy strategy: Data is mutable.
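The expression-based key generation step above can be sketched in XML. This is a minimal, illustrative fragment; the strategy name and the `customerId` payload field are assumptions, not values from this procedure:

```xml
<!-- Illustrative only: the caching strategy computes the cache key per event
     with a DataWeave expression; customerId is an assumed payload field -->
<ee:object-store-caching-strategy name="Expression_Key_Strategy"
                                  keyGenerationExpression="#[payload.customerId]" />
```

Because the key is derived from the payload, two equal requests produce the same key and therefore hit the same cached entry.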
Mule enables you to synchronize access to a cache, which can avoid unexpected results if multiple message processors (on the same or different Mule instances) use the cache at the same time.
Consider a scenario where two message processors attempt to retrieve a value from a cache but do not find the value in the cache. So each message processor calculates the value independently and inserts it into the cache. The value inserted by the second message processor overwrites the value inserted by the first message processor. If the values are different, then two different results are obtained for the same input, with only the last one stored in the cache.
In some scenarios this is valid, but it can be a problem if the application requires cache coherence. Synchronizing the caching strategy ensures this coherence. A synchronized cache is locked when it is being modified by a message processor. In the previous scenario, a locked cache forces the second message processor to wait until the first message processor has calculated the value, and then it retrieves the value from the cache.
Synchronization affects performance, so enable this feature only if it is needed. Performance degradation is most severe in cluster mode.
To enable synchronization, define the synchronized property in the <ee:object-store-caching-strategy> element. Accepted values are true and false.
The following XML snippet shows the configuration of a caching strategy that synchronizes access to the cache, and defines a persistent Object Store to store the cached responses. The caching strategy is then referenced by a Cache scope:
<!-- Caching strategy definition -->
<ee:object-store-caching-strategy name="Caching_Strategy"
                                  doc:name="Caching Strategy"
                                  synchronized="true">
  <!-- Object Store defined for the caching strategy -->
  <os:private-object-store alias="CachingStrategy_ObjectStore"
                           maxEntries="100"
                           entryTtl="10"
                           expirationInterval="5"
                           config-ref="ObjectStore_Config" />
</ee:object-store-caching-strategy>

<!-- Cache scope referencing the strategy -->
<ee:cache doc:name="Cache" cachingStrategy-ref="Caching_Strategy">
  <!-- Some processing logic to cache -->
</ee:cache>