Common terminology for SCS:
|Stream Compute Service (SCS)||SCS continuously reads data from streaming sources, processes it with a user-written SQL program, and writes the results to a sink. It can also chain multiple tasks in series.|
|Source||Provides input data for SCS, such as Tencent Cloud CKafka.|
|Sink||Indicates a place where the computing results of SCS are output, such as Tencent Cloud CKafka.|
|Schema||Indicates the structural information of a table, such as column names and column types.|
|Time mode||Instructs the system on how to get timestamps when processing data.|
|Event Time||In the Event Time mode, timestamps are provided by a field in the input data. You can use the WATERMARK FOR statement to specify the field and enable the Event Time mode. This mode is suitable for scenarios where the source data contains exact timestamps.|
|Watermark||Indicates a specified point in time before which all data is assumed to have been processed. Watermarks are generated automatically by the system, and you can specify the maximum tolerated timestamp lag through the WATERMARK FOR BOUNDED statement.|
|Processing Time||In the Processing Time mode, timestamps are automatically generated by the system and added to the source (named after|
|Time window||Defines multiple time periods and the relationship between them (for example, whether they can overlap, or whether they are fixed in size). Supported values include TUMBLE, HOP, and SESSION.|
|Stream Connector||A high-performance, highly available messaging system provided by Tencent Cloud. It supports Schema definitions and inputs and outputs data in that format, making it the best-supported source and sink for SCS. Stream Connector topic types include Tuple, Upsert, and Blob (not available). You can create a project and topics under it on the "Stream Connector" page.|
|Integrator||Stream Connector's integrator is responsible for outputting the data in Stream Connector to other sinks, such as CDB (for MySQL or PostgreSQL) and COS (Cloud Object Storage), thus producing the final output of the computing results.|
|Tuple and Append stream||Tuple is a type of Stream Connector topic that can store incremental Append streaming data. An Append stream is a data stream to which new data is continuously appended; previously issued data is never updated. A variety of sources and sinks support Append stream input and output.|
|Upsert and Upsert stream||Upsert is a type of Stream Connector data table that can store Upsert streaming data. An Upsert (update or insert) stream is generated by queries such as DISTINCT, non-window GROUP BY, and non-window JOIN, and carries a primary key. If data issued at a later point in time has the same primary key as a previous record, that record is updated to the new value; otherwise, a new row is added. This ensures that previously issued data always reflects the latest value. Upsert streams can only be written into Stream Connector.|
|CKafka||CKafka is a distributed, high-throughput, and highly scalable messaging system provided by Tencent Cloud, fully compatible with the Kafka 0.9 API. It supports CSV and JSON as input and output formats.|
|CDB (for MySQL)||CDB is a high-performance, high-reliability and scalable database hosting service provided by Tencent Cloud. It allows users to easily deploy and use MySQL databases on the cloud.|
|DDL||DDL, short for Data Definition Language, is a subset of the SQL language and consists of CREATE statements. It can be used to define tables, views, user-defined functions (UDFs), etc.|
|DML||DML, short for Data Manipulation Language, is a subset of the SQL language and consists of INSERT and SELECT statements. It can be used to select, transform, filter, and insert data into tables and views.|
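To illustrate the Event Time mode and watermarks described above, here is a minimal DDL sketch. The table name, columns, and the exact watermark syntax are assumptions (Blink-style `BOUNDED WITH DELAY` is shown); the precise form depends on the SCS version in use.

```sql
-- Hypothetical source table for illustration only.
CREATE TABLE clicks (
    user_id    VARCHAR,
    url        VARCHAR,
    click_time TIMESTAMP,
    -- Enable Event Time mode: take timestamps from click_time and
    -- tolerate records arriving up to 5,000 ms late (the watermark lag).
    WATERMARK FOR click_time AS BOUNDED WITH DELAY 5000
);
```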
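A time window query combines the DML statements above with one of the supported window types. The sketch below uses a tumbling window; the `clicks` and `click_counts` tables are hypothetical, and the `TUMBLE` functions follow standard Flink SQL conventions, which SCS may differ from in detail.

```sql
-- Count clicks per user in fixed (tumbling) 1-minute windows and write
-- the results to a sink table. HOP and SESSION windows follow the same
-- pattern with different window functions and parameters.
INSERT INTO click_counts
SELECT
    user_id,
    TUMBLE_START(click_time, INTERVAL '1' MINUTE) AS window_start,
    COUNT(*) AS click_cnt
FROM clicks
GROUP BY TUMBLE(click_time, INTERVAL '1' MINUTE), user_id;
```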
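The difference between Append and Upsert streams can be seen in a non-windowed aggregation. In this sketch (table names are hypothetical), each incoming row re-emits the latest count for its key, updating the previously issued value rather than appending a new row, so the sink must be an Upsert-type Stream Connector topic keyed on `user_id`.

```sql
-- A non-windowed GROUP BY produces an Upsert stream: the result for a
-- given user_id is updated in place as new clicks arrive.
INSERT INTO total_clicks  -- Upsert-type Stream Connector topic
SELECT user_id, COUNT(*) AS total_cnt
FROM clicks
GROUP BY user_id;
```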