Create a Tokenization Service
The tokenization service enables you to replace a sensitive data element with a non-sensitive equivalent, called a token.
Prerequisites
To configure and use the tokenization service, you must:
- Have the Anypoint Security - Edge entitlement for your Anypoint Platform account.
  If you don’t see Security in Management Center or the Tokenization Service tab in Runtime Manager, contact your customer success manager to enable the tokenization service for your account.
- Have the correct permissions to manage tokenization.
  See Granting Tokenization Permissions.
- Have Runtime Fabric 1.1.153 or later with inbound traffic configured.
  See the Runtime Fabric documentation.
  Runtime Fabric requires the Anypoint Integration Advanced package or a Platinum or Titanium subscription to Anypoint Platform.
- Have a secrets group to store the tokenization table encryption keys that the tokenization service creates.
- Have a tokenization format, which describes the shape of the data and how it is tokenized (a hypothetical example follows this list).
  See Tokenization Formats.
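The format prerequisite is easier to reason about with a concrete, if simplified, picture. The Python snippet below sketches what a format for US Social Security numbers might capture: the character set, the expected shape of the value, and which characters are replaced. The field names are invented for this illustration and are not the product's configuration schema; see Tokenization Formats for the actual options.

# Hypothetical sketch only: invented field names, not the tokenization format schema.
ssn_format = {
    "name": "ssn",
    "input_pattern": r"\d{3}-\d{2}-\d{4}",  # shape of the incoming value
    "character_set": "0123456789",          # only digits are replaced
    "preserved_characters": ["-"],          # delimiters pass through unchanged
}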
Create the Tokenization Service
- Go to the Runtime Manager > Tokenization Service page.
- Click Create Tokenization Service.
  The prerequisites you completed provide the information you need to make the following selections.
- Select the Runtime Fabric to which to deploy the tokenization service.
- Select the tokenization format.
  You can assign one or more formats, or all formats, to a single tokenization service.
- Select the number of cores to use for the tokenization service replicas (see the sizing sketch at the end of this page).
  The tokenization service runs on worker nodes in Runtime Fabric. You can select a minimum and maximum number of cores, defined as:
  - Reserved vCPU: the amount of vCPU guaranteed and reserved for the application’s use.
  - vCPU Limit: the maximum amount of vCPU the application can use (the level to which it can burst). This is shared CPU on the worker node.
- Select the log level for the tokenization service or keep the default (ERROR).
- Click Build and Deploy to create the tokenization table.
The service prebuilds a mapping table containing a large number of randomizations that are used at the core of tokenizing and detokenizing. This prebuilt table is not a table of one-to-one mappings; it is used during internal steps to swap randomizations in and out in place of the actual data.
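As a rough mental model only (this is not MuleSoft's algorithm, and the real table is not a simple substitution table), the Python sketch below shows the general idea of consulting a prebuilt table of randomizations to swap randomized characters in during tokenization and swap the original characters back out during detokenization. It also hints at why a broader character set means a much larger table.

import secrets

# Conceptual sketch only -- not MuleSoft's algorithm. It illustrates a prebuilt
# table of randomizations consulted while tokenizing and detokenizing, rather
# than a lookup table of whole values.

ALPHABET = "0123456789"   # an SSN-style, digits-only format; a lax alphanumeric
                          # format would need a much larger alphabet (and table)
ROUNDS = 4
rng = secrets.SystemRandom()

def build_table(width: int) -> list:
    """Prebuild one shuffled copy of the alphabet per round and per position."""
    table = []
    for _ in range(ROUNDS * width):
        shuffled = list(ALPHABET)
        rng.shuffle(shuffled)
        table.append(shuffled)
    return table

def tokenize(value: str, table: list) -> str:
    chars, row = list(value), 0
    for _ in range(ROUNDS):
        for pos, ch in enumerate(chars):
            chars[pos] = table[row][ALPHABET.index(ch)]  # swap in a randomization
            row += 1
    return "".join(chars)

def detokenize(token: str, table: list) -> str:
    chars, row = list(token), ROUNDS * len(token) - 1
    for _ in range(ROUNDS):
        for pos in reversed(range(len(chars))):
            chars[pos] = ALPHABET[table[row].index(chars[pos])]  # swap the data back
            row -= 1
    return "".join(chars)

table = build_table(width=9)
token = tokenize("123456789", table)
assert detokenize(token, table) == "123456789"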
The tokenization mapping table build is a one-time action and takes some time to complete, depending on the size of the tokenization format. For example, a table that contains only an SSN format, at less than 200 MB, might build in about 2 minutes, but a table that uses a larger format, such as lax alphanumeric, might take up to 20 minutes to build.
If many or all of the formats are selected, building the table takes much longer, because a table with all formats is approximately 2 GB in size.
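For the core-sizing step above, Reserved vCPU acts as a guaranteed per-replica allocation and vCPU Limit as a burst ceiling into shared worker-node CPU. The sketch below uses hypothetical numbers (the replica count, helper name, and node capacity are invented for illustration, not product values) to show how the two settings relate.

# Illustrative only: hypothetical replica count and worker-node capacity.
def check_core_selection(reserved_vcpu: float, vcpu_limit: float,
                         replicas: int, available_node_vcpu: float) -> None:
    # A replica can burst above its reservation, never below it.
    assert reserved_vcpu <= vcpu_limit
    # Reserved vCPU is guaranteed, so the total reservation must fit in the
    # vCPU the worker nodes can provide.
    total_reserved = reserved_vcpu * replicas
    assert total_reserved <= available_node_vcpu, "not enough worker-node capacity"
    # CPU between the reservation and the limit is shared with other workloads.
    print(f"guaranteed per replica: {reserved_vcpu} vCPU, "
          f"burst headroom per replica: {vcpu_limit - reserved_vcpu} vCPU, "
          f"total reserved: {total_reserved} vCPU")

check_core_selection(reserved_vcpu=0.5, vcpu_limit=1.0, replicas=2,
                     available_node_vcpu=4.0)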