JSON Shipping

Last updated: 2019-09-25 17:09:29


Overview

You can log in to the CLS Console and ship JSON-formatted data to Cloud Object Storage (COS). This topic describes how to create a task for shipping JSON-formatted data.

Prerequisites

  1. You have activated CLS, created a logset and a log topic, and successfully collected log data.
  2. You have activated COS and created a bucket in the target region of the log topic to be shipped. For more information, see Creating Bucket.
  3. You have ensured that the current account has permission to configure shipping tasks.

Directions

  1. Log in to the CLS Console.
  2. Click Logset Management in the left sidebar.
  3. Click the ID/name of the logset for which you want to configure shipping tasks to go to its details page.
  4. Locate the log topic to be shipped and click Configure -> Shipping to COS Configuration to go to the Shipping Configuration page.
  5. Click Add Shipping Configuration to go to the Ship to COS page and complete the configuration items in order.

The configuration items are as follows:

| Configuration Item | Description | Rule | Required |
| Shipping Task Name | Sets the name of the shipping task. | Letters, digits, underscores (_), and hyphens (-) | Yes |
| COS Bucket | The bucket in the same region as the current log topic, used as the shipping destination. | Selected from the list | Yes |
| Directory Prefix | A directory prefix that you define; log files are shipped into this directory of the COS bucket. By default, files are stored under the path {COS bucket}{directory prefix}{partition format}_{random}_{index}.{type}, where {random}_{index} is a random number. | Cannot start with a forward slash (/) | No |
| Partition Format | A directory automatically generated from the shipping task creation time according to the strftime syntax. A forward slash (/) represents a level-1 COS directory. | strftime format | Yes |
| File Size | The maximum size of a single uncompressed file shipped within one shipping interval. A file larger than this size is split into multiple log files. | 100 MB to 10,000 MB | Yes |
| Shipping Interval | The time interval of shipping. For example, if you set it to 5 minutes, a log file is generated from your log data every 5 minutes, and multiple log files are shipped to your bucket together at a regular interval (within half an hour). | 60 seconds to 3,600 seconds | Yes |

Enter the partition format according to the strftime syntax. Different partition formats affect the paths of files shipped to COS. For example, if a file is shipped to the bucket_test bucket, the directory prefix is logset/, and the shipping time is 2018/7/31 17:14, the corresponding shipping file paths are as follows:

| Bucket Name | Directory Prefix | Partition Format | COS File Path |
| bucket_test | logset/ | %Y/%m/%d | bucket_test:logset/2018/07/31_{random}_{index} |
| bucket_test | logset/ | %Y%m%d/%H | bucket_test:logset/20180731/17_{random}_{index} |
| bucket_test | logset/ | %Y%m%d/log | bucket_test:logset/20180731/log_{random}_{index} |
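
The mapping from partition format to file path can be sketched with Python's strftime. This is a minimal illustration of how the strftime syntax expands, not CLS's actual naming code; the {random}_{index} suffix is kept as a placeholder because CLS fills it with a random number at shipping time:

```python
from datetime import datetime

def cos_file_path(bucket, prefix, partition_format, shipping_time,
                  random_part="{random}", index_part="{index}"):
    """Expand a strftime partition format into an example COS file path.

    Follows the documented template {COS bucket}{directory prefix}{partition
    format}_{random}_{index}; the random/index parts stay as placeholders here.
    """
    partition = shipping_time.strftime(partition_format)
    return f"{bucket}:{prefix}{partition}_{random_part}_{index_part}"

# Shipping time from the example above: 2018/7/31 17:14
shipping_time = datetime(2018, 7, 31, 17, 14)
for fmt in ("%Y/%m/%d", "%Y%m%d/%H", "%Y%m%d/log"):
    print(cos_file_path("bucket_test", "logset/", fmt, shipping_time))
```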
  6. Click Next to go to the advanced configuration page. Set the shipping format to JSON and enter the relevant parameters in order.

The configuration items are as follows:

| Configuration Item | Description | Rule | Required |
| Compressed Shipping | Determines whether log files are compressed before shipping. The size of a single uncompressed file to be shipped is limited to 10 GB. Files can be compressed into a GZIP or LZOP package. | Enabled/Disabled | Yes |
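
If GZIP compression is enabled, the shipped files can be read back with standard tooling once downloaded from COS. The sketch below assumes the downloaded file contains one JSON log per line and uses an illustrative file name and field names; it is not part of the CLS product itself, and LZOP files would need a third-party decompressor:

```python
import gzip
import json

def read_shipped_logs(path):
    """Yield JSON logs from a GZIP-compressed shipped file (one log per line assumed)."""
    with gzip.open(path, mode="rt", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

# Example usage with a hypothetical downloaded file name.
for log in read_shipped_logs("logset/2018/07/31_123456_0.gz"):
    print(log.get("status"), log.get("http_host"))
```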

Advanced Options (Optional)
You can expand Advanced Options to filter logs by content before shipping.

Up to 5 filtering rules are allowed. The rules are combined with "AND" logic, i.e., a log is shipped only when it meets all the rules.

a. Specify a key and set a filtering rule to perform regular expression extraction on its value.
b. Use "()" in the regular expression to capture the content to be matched, and enter the value to compare against. The system first matches the key's value against the regular expression in the shipping rule, extracts the content of the capture group "()", and compares it with the specified value. The log is shipped only when the captured content equals the value.
Sample 1:
The specified field is status, e.g., the key-value pair is status:404. To ship only logs whose status field is 404, set the filtering rule to (.*) and the value to 404.

Sample 2:
The specified field is http_host, e.g., the key-value pair is http_host:172.16.19.20. To ship only logs whose http_host field starts with "172.16", set the filtering rule to ^(\d+\.\d+)\..* and the value to 172.16.
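
The capture-and-compare behavior described above can be approximated in Python as follows. This is only an illustrative sketch of the documented logic (the rule tuples and sample log fields are assumptions for demonstration), not the actual CLS implementation:

```python
import re

# Each rule: (key, regex with one capture group, expected value).
# A log (a dict of key-value pairs) is shipped only if every rule matches,
# i.e. the rules are combined with AND logic.
rules = [
    ("status", r"(.*)", "404"),                    # Sample 1: status field equals 404
    ("http_host", r"^(\d+\.\d+)\..*", "172.16"),   # Sample 2: http_host starts with 172.16
]

def should_ship(log, rules):
    for key, pattern, expected in rules:
        value = log.get(key, "")
        m = re.match(pattern, value)
        # Ship only if the regex matches and the captured group equals the value.
        if not m or m.group(1) != expected:
            return False
    return True

print(should_ship({"status": "404", "http_host": "172.16.19.20"}, rules))  # True
print(should_ship({"status": "200", "http_host": "172.16.19.20"}, rules))  # False
```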

  7. Click OK. The shipping task is then enabled.