
Using Logs

Anypoint Monitoring aggregates your log files so that you can manage, search, filter, and analyze your logs. You can use aggregated logs in conjunction with monitoring to help identify causes and investigate failures quickly.

Searching Logs

You can search for text in logs in a couple of ways:

  • Typing your search query into the search field.

    You can provide full terms, such as "code 400" to find bad requests. For exact matches to multi-word searches, surround your search terms with quotation marks. Here are results from a "code 400" search:

    Log Search

    You can also provide wildcards such as payload=org.glassfish.grizzly*. Here are the results of that search:

    Log Search Using a Wildcard
  • Using Add A Filter to create a query that you can save as a filter.

    • You can create a query from predefined filters and operators, for example:

      Using a Filter Query
      Figure 1. Using Search Filter Values
    • You can create an Elasticsearch query.

      Note that the UI can convert a query into an Elasticsearch Query DSL for you. This example shows a query for log-level INFO:

      Edit Query DSL (Elasticsearch Query)
      {
        "query": {
          "match": {
            "log-level": {
              "query": "INFO",
              "type": "phrase"
            }
          }
        }
      }

      The Elasticsearch query UI provides a link to the query documentation so that you can perform more complex queries than otherwise available, for example:

      More Complex Elasticsearch Query
      {
        "query": {
          "range": {
            "workerId": {
              "gte": 0,
              "lte": 20
            }
          }
        }
      }

      Note that some complex Elasticsearch queries do not have an equivalent in the Search Filter Values UI, so you can only create and view such queries in the Elasticsearch query UI.
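Query DSL bodies like the two examples above are plain JSON, so you can assemble them programmatically before pasting them into the Elasticsearch query UI. The following is a minimal sketch in Python; the helper names are illustrative, not part of any Anypoint API:

```python
import json

def match_phrase(field, value):
    """Build a match query for an exact phrase on one field."""
    return {"query": {"match": {field: {"query": value, "type": "phrase"}}}}

def range_query(field, gte, lte):
    """Build a range query bounding a numeric field (inclusive)."""
    return {"query": {"range": {field: {"gte": gte, "lte": lte}}}}

# The same queries shown in the examples above:
info_logs = match_phrase("log-level", "INFO")
worker_range = range_query("workerId", 0, 20)

print(json.dumps(worker_range, indent=2))
```

Because the UI accepts any valid Query DSL body, building the JSON this way lets you keep frequently used queries in version control and paste them in as needed.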

Working with Predefined Filters

When the Quick Filter list is open, you can select and use predefined log filters from it. This example selects the ERROR log level.

Predefined Log Filter

After you select a predefined filter from the Quick Filter list, you see it near the top of the page, next to +Add a Filter.

Filter Actions

When you mouse over a filter, you can click a button for any of these actions:

  • Check (enable) or uncheck (disable) the filter.

  • Pin or unpin the filter.

  • Invert the search filter. For example, if the search is log level IS ERROR, the inversion is log level IS NOT ERROR.

  • Delete the filter.

  • Edit the filter configuration.
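To make the invert action concrete, here is a minimal sketch of a filter modeled as an in-memory object; the class and field names are hypothetical, not part of Anypoint Monitoring:

```python
from dataclasses import dataclass

@dataclass
class LogFilter:
    """Hypothetical in-memory model of one search filter."""
    field: str
    operator: str   # "IS" or "IS NOT"
    value: str
    enabled: bool = True   # check/uncheck action
    pinned: bool = False   # pin/unpin action

    def invert(self):
        # log level IS ERROR  <->  log level IS NOT ERROR
        self.operator = "IS NOT" if self.operator == "IS" else "IS"

f = LogFilter("log level", "IS", "ERROR")
f.invert()
print(f.operator)  # IS NOT
```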

Adding Filter Columns to Logs

From the Quick Filters list, you can add a filter column to, or remove one from, your log list.

You simply click the blue plus icon for a filter field, for example:

To Add a Filter Column to the Logs

The resulting log-level column looks like this in the logs:

Filter Column in the Logs

To remove the filter column, simply click the red X icon for the filter field, for example:

To Remove a Filter Column from the Logs

Search Filters

In addition to using predefined filters, you can create and use your own filters when searching for logs that are important to your organization.

  • application: Full domain of the Mule app in CloudHub. Example: am-flights.us-e2.stgx.cloudhub.io

  • class: Java class that generates the log. Example: [am-flights].am-flights-api-httpListenerConfig.worker.34

  • environment: CloudHub environment name. Example: myEnv

  • environment type: CloudHub environment type. Examples: Sandbox, Design, Production

  • host: Host IP. Example: ip-172-25-175-175

  • log level: Log level of the entry: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN. Example: INFO

  • logger: Log4j Logger class. Example: HttpListenerRegistry

  • message: Log4j message. Example: No listener found for request: (POST)/zaraza

  • timestamp: Timestamp on the log. Accepted formats: MMM dd yyyy, HH:mm:ss.SSS

    Supported time settings are s for seconds, m for minutes, h for hours, d for days, w for weeks, M for months, and y for years, with now for relative times, for example:

      • Last 5 minutes: now-5m

      • The day so far: now/d

      • This week: now/w

      • Week to date: now/w

      • Previous Month: now-1M/M

  • worker: CloudHub worker ID. Examples: 0, 1
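The relative-time expressions accepted by the timestamp filter (now-5m, now/d, and so on) can be resolved with simple date math. The following Python sketch is illustrative only: it handles a single offset or a single rounding, while real date-math grammars support more (months, years, and combined expressions such as now-1M/M are omitted here):

```python
from datetime import datetime, timedelta

UNITS = {"s": "seconds", "m": "minutes", "h": "hours", "d": "days", "w": "weeks"}

def resolve(expr, now=None):
    """Resolve a relative-time expression such as 'now-5m' or 'now/d'.

    Illustrative sketch: one offset (now-5m) or one rounding (now/d, now/w).
    """
    now = now or datetime.utcnow()
    if expr == "now":
        return now
    if "-" in expr:                       # now-5m -> now minus 5 minutes
        amount, unit = expr.split("-")[1][:-1], expr[-1]
        return now - timedelta(**{UNITS[unit]: int(amount)})
    if "/" in expr:                       # now/d -> start of the current day
        unit = expr.split("/")[1]
        if unit == "d":
            return now.replace(hour=0, minute=0, second=0, microsecond=0)
        if unit == "w":                   # week starts on Monday here
            start = now - timedelta(days=now.weekday())
            return start.replace(hour=0, minute=0, second=0, microsecond=0)
    raise ValueError(f"unsupported expression: {expr}")

t = datetime(2018, 4, 26, 0, 9, 53)
print(resolve("now-5m", t))   # 2018-04-26 00:04:53
```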

Operators

You can use these operators when creating a search filter, for example, log level IS ERROR:

  • is: Available to all filters.

  • is not: Available to all filters.

  • is between: Available to timestamp.

  • is not between: Available to timestamp.

  • exists: Available to all filters.

  • does not exist: Available to all filters.
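Applied to a parsed log entry (here modeled as a dict of filter fields), the operators behave as boolean predicates. The following helper is a hypothetical sketch, not an Anypoint API:

```python
def matches(entry, field, op, value=None, low=None, high=None):
    """Evaluate one filter operator against a parsed log entry (a dict)."""
    present = field in entry
    if op == "exists":
        return present
    if op == "does not exist":
        return not present
    if op == "is":
        return present and entry[field] == value
    if op == "is not":
        return not present or entry[field] != value
    if op == "is between":          # timestamp only in the filter UI
        return present and low <= entry[field] <= high
    if op == "is not between":
        return not (present and low <= entry[field] <= high)
    raise ValueError(f"unknown operator: {op}")

entry = {"log level": "ERROR", "worker": "0"}
print(matches(entry, "log level", "is", "ERROR"))   # True
print(matches(entry, "host", "does not exist"))     # True
```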

Viewing Log Data

By default, log entries are abbreviated. You can expand and view the log message and details as a table or in JSON format.

To Expand a Log Message

Tabular Logs

This example shows an expanded log entry as a table. The table contains the log filter fields that you see in the Quick Filter list.

application         test40x.us-e1.qax.cloudhub.io
class               qtp437897409-31
environment         Sandbox
environment type    %{[fields][env_type]}
log level           ERROR
logger              DefaultMessagingExceptionStrategy
message
                    ****************************************************
                    Message               : No record could be found in payload or in flow variable BATCH_RECORD (java.lang.IllegalStateException).
                    Element               : null
                    ----------------------------------------------------
                    Exception stack is:
                    No record could be found in payload or in flow variable BATCH_RECORD (java.lang.IllegalStateException). (org.mule.api.transformer....
                    (72 more...)

                    (set debug level logging or '-Dmule.verbose.exceptions=true' for everything)
                    ****************************************************
timestamp           April 25th 2018, 17:09:53.517
worker              0

JSON-Formatted Logs

This example shows an expanded log entry in JSON format.

{
  "_version": 1,
  "_source": {
    "class": "qtp437897409-31",
    "logger": "DefaultMessagingExceptionStrategy",
    "_ending": 0,
    "timestamp": "2018-04-26T00:09:53.517Z",
    "message": ".Example*** .ExampleMessage: No record could be found in payload or in flow variable BATCH_RECORD (java.lang.IllegalStateException)..ExampleElement...",
    "log level": "ERROR",
    "worker": "0",
    "environment": "Sandbox",
    "environment type": "%{[fields][env_type]}",
    "application": "test40x.us-e1.qax.cloudhub.io"
  },
  "fields": {
    "timestamp": [
      "2018-04-26T00:09:53.517Z"
    ]
  },
  "highlight": {
    "orgId": [
      "@kibana-highlighted-field@6046b96d@/..."
    ]
  },
  "sort": [
    1524701393517
  ]
}
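Entries in this JSON shape can also be consumed programmatically, for example to rebuild the columns of the tabular view. A minimal sketch, assuming only the field layout shown above (the raw string is a trimmed subset of the example entry):

```python
import json

raw = """
{"_version": 1,
 "_source": {"log level": "ERROR",
             "logger": "DefaultMessagingExceptionStrategy",
             "timestamp": "2018-04-26T00:09:53.517Z",
             "worker": "0",
             "application": "test40x.us-e1.qax.cloudhub.io"}}
"""

entry = json.loads(raw)
source = entry["_source"]

# Pull out the fields shown as columns in the tabular view.
row = {k: source.get(k) for k in ("timestamp", "log level", "logger", "worker")}
print(row["log level"])   # ERROR
```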