In a separator log, the log data is structured according to a specified separator, and each complete log ends with a line break (`\n`). When CLS processes separator logs, you need to define a unique key for each separated field.
Assume the raw data of a log is as follows:
10.20.20.10 - ::: [Tue Jan 22 14:49:45 CST 2019 +0800] ::: GET /online/sample HTTP/1.1 ::: 127.0.0.1 ::: 200 ::: 647 ::: 35 ::: http://127.0.0.1/
If the separator for log parsing is specified as `:::`, the log will be segmented into eight fields, and a unique key will be defined for each field as shown below:
- IP: 10.20.20.10 -
- bytes: 35
- host: 127.0.0.1
- length: 647
- referer: http://127.0.0.1/
- request: GET /online/sample HTTP/1.1
- status: 200
- time: [Tue Jan 22 14:49:45 CST 2019 +0800]
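The segmentation above can be sketched in a few lines of Python (a minimal illustration of the parsing rule, not CLS internals; the keys are listed in the order the fields occur in the raw log):

```python
# Split the raw log on the configured separator ":::" and pair each
# field with its user-defined key.
raw = ("10.20.20.10 - ::: [Tue Jan 22 14:49:45 CST 2019 +0800] ::: "
       "GET /online/sample HTTP/1.1 ::: 127.0.0.1 ::: 200 ::: 647 ::: "
       "35 ::: http://127.0.0.1/")

keys = ["IP", "time", "request", "host", "status", "length", "bytes", "referer"]

fields = [field.strip() for field in raw.split(":::")]
log = dict(zip(keys, fields))

print(len(fields))     # 8 fields
print(log["status"])   # 200
print(log["referer"])  # http://127.0.0.1/
```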
Enter `test-separator` as the Log Topic Name and click OK.
On the Collection Configuration page, set Collection Path according to the log collection path format as shown below:
Log collection path format:
[directory prefix expression]/**/[filename expression].
After the log collection path is entered, LogListener will match all common prefix paths that meet the [directory prefix expression] rule and listen for all log files in the directories (including subdirectories) that meet the [filename expression] rule. The parameters are as detailed below:
|Parameter|Description|
|---|---|
|Directory prefix|Directory structure of the log file prefix. Only the wildcards `*` (matching multiple characters) and `?` (matching a single character) are supported.|
|/**/|The current directory and all its subdirectories.|
|Filename|The log filename. Only the wildcards `*` (matching multiple characters) and `?` (matching a single character) are supported.|
Below are some common configuration examples:
|No.|Directory Prefix Expression|Filename Expression|Description|
|---|---|---|---|
|1|/var/log/nginx|access.log|The log path is configured as `/var/log/nginx/**/access.log`: LogListener listens for log files named `access.log` in the `/var/log/nginx` directory and all its subdirectories.|
|2|/var/log/nginx|*.log|The log path is configured as `/var/log/nginx/**/*.log`: LogListener listens for log files suffixed with `.log` in the `/var/log/nginx` directory and all its subdirectories.|
|3|/var/log/nginx|error*|The log path is configured as `/var/log/nginx/**/error*`: LogListener listens for log files prefixed with `error` in the `/var/log/nginx` directory and all its subdirectories.|
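The matching behavior of `[directory prefix expression]/**/[filename expression]` can be illustrated with Python's `pathlib` globbing, whose `**` likewise covers the prefix directory itself and every subdirectory (a sketch of the matching rule only, not of LogListener's implementation):

```python
# Build a small throwaway directory tree and show what the pattern
# /var/log/nginx/**/*.log would match.
import pathlib
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    root = pathlib.Path(tmp, "var/log/nginx")
    (root / "backup").mkdir(parents=True)
    (root / "access.log").touch()             # directly under the prefix
    (root / "backup" / "access.log").touch()  # in a subdirectory
    (root / "error.log").touch()              # different name, same suffix

    matches = sorted(p.relative_to(root).as_posix() for p in root.glob("**/*.log"))
    print(matches)  # ['access.log', 'backup/access.log', 'error.log']
```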
- Only LogListener 2.3.9 and above support adding multiple collection paths.
- The system does not support uploading logs whose contents are in multiple text formats, as this may cause write failures.
- You are advised to configure the collection path as `log/*.log` and, after log rotation, rename the old file to a name that no longer matches the collection path (for example, `log/*.log.1`).
- By default, a log file can only be collected by one log topic. If you want to have multiple collection configurations for the same file, please add a soft link to the source file and add it to another collection configuration.
- In addition to common separators such as `:::`, logs can also be parsed with a custom delimiter.
- The log time is accurate to the second. If the log time is entered in an incorrect format, the collection time is used as the log time.
- The time attribute of a log is defined in two ways: collection time and original timestamp.
- Collection time: the time attribute of a log is determined by the time when CLS collects the log.
- Original timestamp: the time attribute of a log is determined by the timestamp in the raw log.
Using the collection time as the time attribute of logs
Keep Collection Time enabled.
Using the original timestamp as the time attribute of logs
Disable Collection Time and enter the time key of the original timestamp and the corresponding time parsing format in Time Key and Time Parsing Format respectively. For more information on the time parsing format, please see Configuring Time Format.
Below are examples of how to enter a time parsing format:
- Example 1: the parsing format of the original timestamp `2017-12-10 08:00:00` is `%Y-%m-%d %H:%M:%S`.
- Example 2: the parsing format of the original timestamp `12/10/2017, 08:00:00` is `%d/%m/%Y, %H:%M:%S`.
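Python's `datetime.strptime` uses the same `%`-style directives, so it offers a quick way to sanity-check a time parsing format before entering it (an illustration only; CLS itself does not run this code):

```python
from datetime import datetime

# "2017-12-10 08:00:00" parsed with %Y-%m-%d %H:%M:%S
t1 = datetime.strptime("2017-12-10 08:00:00", "%Y-%m-%d %H:%M:%S")
print(t1)  # 2017-12-10 08:00:00

# "12/10/2017, 08:00:00" parsed with %d/%m/%Y, %H:%M:%S
# (note: %d/%m reads this as October 12; use %m/%d for December 10)
t2 = datetime.strptime("12/10/2017, 08:00:00", "%d/%m/%Y, %H:%M:%S")
print(t2)  # 2017-10-12 08:00:00
```

A format that does not match the timestamp raises `ValueError`, which is exactly the "incorrect format" case where CLS falls back to the collection time.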
The log time is accurate to the second. If the time is entered in an incorrect format, the collection time is used as the log time.
Filters help you extract valuable log data by adding log collection filter rules based on your business needs. Filter rules are Perl regular expressions: only logs that match the specified regular expression will be collected and reported.
For separator-formatted logs, configure filter rules based on the custom key-value pairs you defined. For example, to collect only the logs whose `status` field has a value of 400 or 500 after the sample log is parsed in separator mode, set the key to `status` and the filter rule to `400|500`.
The logical relationship between multiple filter rules is "AND". If multiple filter rules are configured for the same key name, earlier rules will be overwritten.
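The filter semantics described above can be sketched as follows (assumed behavior for illustration: the regex per key, full-match semantics, and the AND across different keys mirror the description, but this is not CLS source code):

```python
import re

# One Perl-style regex per key; rules on different keys are ANDed.
rules = {"status": r"400|500"}  # collect only logs with status 400 or 500

def keep(log: dict) -> bool:
    # A log is reported only if every rule's regex matches its field in full.
    return all(re.fullmatch(pattern, log.get(key, ""))
               for key, pattern in rules.items())

print(keep({"status": "400"}))  # True
print(keep({"status": "200"}))  # False
```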