In a separator log, the log data is structured according to a specified separator, and each complete log ends with a line break `\n`. When CLS processes separator logs, you need to define a unique key for each separated field.
Suppose your raw log data is:
10.20.20.10 - ::: [Tue Jan 22 14:49:45 CST 2019 +0800] ::: GET /online/sample HTTP/1.1 ::: 127.0.0.1 ::: 200 ::: 647 ::: 35 ::: http://127.0.0.1/
If the separator for log parsing is specified as `:::`, the log will be segmented into eight fields, and a unique key will be defined for each of them as shown below:
- IP: 10.20.20.10 -
- bytes: 35
- host: 127.0.0.1
- length: 647
- referer: http://127.0.0.1/
- request: GET /online/sample HTTP/1.1
- status: 200
- time: [Tue Jan 22 14:49:45 CST 2019 +0800]
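The parsing described above can be sketched in a few lines of Python (an illustration of the mechanism, not CLS's actual implementation): split the raw log on the `:::` separator, then pair each field with its defined key.

```python
# Minimal sketch of separator-log parsing: split on ::: and map fields to keys.
RAW_LOG = ("10.20.20.10 - ::: [Tue Jan 22 14:49:45 CST 2019 +0800] ::: "
           "GET /online/sample HTTP/1.1 ::: 127.0.0.1 ::: 200 ::: 647 ::: "
           "35 ::: http://127.0.0.1/")

# The unique keys defined for the eight fields, in the order they appear.
KEYS = ["IP", "time", "request", "host", "status", "length", "bytes", "referer"]

def parse_separator_log(line: str, separator: str = ":::") -> dict:
    """Split one log line on the separator and pair each field with its key."""
    fields = [field.strip() for field in line.split(separator)]
    return dict(zip(KEYS, fields))

parsed = parse_separator_log(RAW_LOG)
print(parsed["status"])  # → 200
```

Note that the key order must follow the field order in the raw log, since the separator assigns keys positionally.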
Log in to the CLS console and click Logset on the left sidebar.
Select the target logset, click Create Log Topic, enter the log topic name "test-separator", and click OK.
The log collection path is in the format of [directory prefix expression]/**/[filename expression]. LogListener will match all common prefix paths that meet the [directory prefix expression] rule and listen on all log files in the directories (including subdirectories) that meet the [filename expression] rule. The parameters are as detailed below:
|Field|Description|
|---|---|
|Directory prefix|Directory structure of the log file prefix. Only the wildcards `*` and `?` are supported|
|/**/|The current directory and all of its subdirectories|
|Filename|Log file name. Only the wildcards `*` and `?` are supported|
Common configuration modes are as follows:
- [Common directory prefix]/**/[common filename prefix]*
- [Common directory prefix]/**/*[common filename suffix]
- [Common directory prefix]/**/[common filename prefix]*[common filename suffix]
- [Common directory prefix]/**/*[Common string]*
|No.|Directory Prefix Expression|Filename Expression|Description|
|---|---|---|---|
|1|/var/log/nginx|access.log|The log path is configured as `/var/log/nginx/**/access.log`, matching files named access.log in the /var/log/nginx directory and all of its subdirectories|
|2|/var/log/nginx|*.log|The log path is configured as `/var/log/nginx/**/*.log`, matching files suffixed with .log in the /var/log/nginx directory and all of its subdirectories|
|3|/var/log/nginx|error*|The log path is configured as `/var/log/nginx/**/error*`, matching files prefixed with error in the /var/log/nginx directory and all of its subdirectories|
- Only LogListener 2.3.9 or above allows adding multiple collection paths.
- By default, a log file can only be collected by one log topic. If you want to have multiple collection configurations for the same file, please add a soft link to the source file and add it to another collection configuration.
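As a rough illustration (not LogListener's actual matching code), the `[directory prefix]/**/[filename]` rule above can be sketched with Python's standard library: a file is collected if it lives anywhere under the directory prefix and its name matches the filename expression's wildcards.

```python
import fnmatch
from pathlib import Path

def matches_collection_path(file_path: str, dir_prefix: str, filename_expr: str) -> bool:
    """Sketch of the matching rule: the file must be under dir_prefix
    (at any subdirectory depth) and its name must match the filename
    expression, which supports the wildcards * and ?."""
    path = Path(file_path)
    try:
        path.relative_to(dir_prefix)  # under the prefix (incl. subdirectories)?
    except ValueError:
        return False
    return fnmatch.fnmatch(path.name, filename_expr)

# Mirrors example 2 above: prefix /var/log/nginx, filename expression *.log.
print(matches_collection_path("/var/log/nginx/sub/access.log", "/var/log/nginx", "*.log"))  # True
print(matches_collection_path("/var/log/nginx/error_log", "/var/log/nginx", "*.log"))       # False
```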
Select the target machine group from the machine group list and associate it with the current log topic. Please note that the associated machine group must be in the same region as the log topic. For detailed directions, please see Machine Group Management.
You need to select a unique separator first. The system segments the sample log according to the selected separator and displays the result in the extraction result box, where you define a unique key for each field. Log collection supports a variety of separators, including space, tab, comma, semicolon, and vertical bar. If your log data uses another separator such as `:::`, it can also be parsed by specifying a custom separator.
The time configuration is as described below:
Keep Collection Time enabled.
Disable Collection Time and enter the time key of the original timestamp and the corresponding time parsing format in Time Key and Time Parsing Format respectively. For more information on the time parsing format, please see Configuring Time Format.
Below are examples of how to enter a time parsing format:
- Example 1: if the original timestamp of the sample log is `10/Dec/2017:08:00:00`, the parsing format is `%d/%b/%Y:%H:%M:%S`.
- Example 2: if the original timestamp of the sample log is `2017-12-10 08:00:00`, the parsing format is `%Y-%m-%d %H:%M:%S`.
- Example 3: if the original timestamp of the sample log is `12/10/2017, 08:00:00`, the parsing format is `%m/%d/%Y, %H:%M:%S`.
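These parsing formats follow standard strftime conventions, so you can verify a format against a sample timestamp with Python's `datetime` before entering it (an illustration, not CLS code):

```python
from datetime import datetime

# (timestamp, parsing format) pairs from the three examples above.
examples = [
    ("10/Dec/2017:08:00:00", "%d/%b/%Y:%H:%M:%S"),
    ("2017-12-10 08:00:00",  "%Y-%m-%d %H:%M:%S"),
    ("12/10/2017, 08:00:00", "%m/%d/%Y, %H:%M:%S"),
]

# strptime raises ValueError if the format does not match the timestamp,
# so a clean run confirms each format is correct for its sample.
for raw, fmt in examples:
    parsed = datetime.strptime(raw, fmt)
    print(raw, "->", parsed.isoformat())
```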
The log time unit can be set to seconds. If you enter a log time in an incorrect format, the collection time will be used as the log time.
Filters are designed to help you extract valuable log data by adding log collection filter rules based on your business needs. Filter rules are Perl regular expressions, and each created rule is a hit rule; in other words, only logs that match the regular expression will be collected and reported.
You need to configure filter rules for separator logs according to the defined custom key-value pairs. For example, if you want to collect all log data whose `status` field is 400 or 500 after the sample log is parsed in separator mode, set the key to `status` and the filter rule to `400|500`.
The relationship between multiple filter rules is a logical "AND". If multiple filter rules are configured for the same key name, previous rules will be overwritten.
Log in to the CLS console, select Log Search on the left sidebar, select the target logset and log topic, and click Search to search for logs.
Index configuration must be enabled first before you can perform searches.