As the Kafka community has flourished, more and more users have adopted Kafka for activities such as log collection, big data analysis, and streaming data processing. CKafka makes the following optimizations on top of open-source Kafka:
SCF is deeply integrated with CKafka, and many practical features have been launched. With the help of SCF and the CKafka trigger, you can easily dump CKafka messages to COS, ES, and TencentDB. This document describes how to use SCF in place of Logstash to dump CKafka messages to ES, as shown below:
SCF can consume CKafka messages in real time in various scenarios such as data storage, log cleansing, and real-time consumption. The data dump feature is integrated into the CKafka console and can be enabled quickly, making it easier to use, as shown below:
Compared to a CVM-based self-built CKafka consumer, SCF has the following advantages:
Compared to a CVM-based self-built Logstash service, SCF has the following advantages:
This document uses the Guangzhou region as an example:
Log in to the SCF Console and click Functions on the left sidebar.
At the top of the "Functions" page, select the region where the function should be created, and click Create to enter the function creation process.
On the "Create Function" page, configure "Basic Info" as follows and click Next.
On the "Function configuration" page, keep the default configuration and click Complete.
Enter the "Function configuration" page of the created function, click Edit in the top-right corner, and complete the function configuration as follows:
The main parameters are as follows. Keep the remaining parameters as default:

| Parameter | Description | Required |
|---|---|---|
| ES_User | ES username, which is `elastic` by default. | Yes |
| ES_Index_KeyWord | ES index keyword. | Yes |
| ES_Log_IgnoreWord | Keyword to be deleted. If this parameter is left empty, all keywords will be written. For example, you can enter `name` or `password`. | No |
| ES_Index_TimeFormat | Index by day or by hour. If this parameter is left empty, the index will be split by day. For example, you can enter `hour`. | No |
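To make the `ES_Index_TimeFormat` behavior concrete, the following is a minimal sketch of how a function might derive the ES index name from these parameters. The helper `build_index_name` and the `keyword-YYYY-MM-DD` naming pattern are illustrative assumptions, not the actual template code:

```python
from datetime import datetime, timezone

def build_index_name(keyword, time_format=None, now=None):
    """Build an ES index name from the configured keyword.

    Mirrors ES_Index_TimeFormat: indexes are split by day by
    default, or by hour when time_format is "hour".
    """
    now = now or datetime.now(timezone.utc)
    if time_format == "hour":
        suffix = now.strftime("%Y-%m-%d-%H")
    else:
        suffix = now.strftime("%Y-%m-%d")
    return f"{keyword}-{suffix}"

# In the deployed function, the inputs would come from the
# environment variables configured above, e.g.:
#   keyword = os.environ["ES_Index_KeyWord"]
#   time_format = os.environ.get("ES_Index_TimeFormat")
```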
If you have not ingested actual data into CKafka, you can use the client tool to simulate message production.
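If no client tool is at hand, a short script like the one below can generate test messages. The payload shape, broker address, and topic name are placeholders, and the (commented) send step assumes the third-party `kafka-python` client; any Kafka-compatible producer works:

```python
import json

def make_test_messages(n):
    """Build n JSON-encoded fake log lines to feed into CKafka."""
    return [
        json.dumps({"id": i, "name": f"user-{i}", "msg": "hello ckafka"})
        for i in range(n)
    ]

# Sending them requires a Kafka client; kafka-python is one option.
# The broker address and topic below are placeholders for your
# CKafka instance:
#
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="<ckafka-instance-ip>:9092")
# for m in make_test_messages(10):
#     producer.send("<topic-name>", m.encode("utf-8"))
# producer.flush()
```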
Select Log Query on the sidebar of the function to view the function execution log.
View the results in Kibana. For more information, see Accessing Clusters from Kibana.
If you want to implement advanced log cleansing logic, you can modify the logic in the code location as shown below:
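As an illustration only (the actual template code differs), cleansing logic that drops the keys listed in `ES_Log_IgnoreWord` before a record is written to ES might look like this; `cleanse` is a hypothetical helper:

```python
import json

def cleanse(raw_message, ignore_words):
    """Parse a JSON log line and drop the ignored keys
    (mirroring ES_Log_IgnoreWord) before indexing to ES."""
    record = json.loads(raw_message)
    return {k: v for k, v in record.items() if k not in ignore_words}
```

More advanced cleansing, such as field renaming or value masking, would go in the same place in the function code.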