
Implementing Tencent Cloud TCHouse-D Read and Write Operations in DLC

Last updated: 2026-01-30 15:57:51
Data Lake Compute (DLC) has a built-in Tencent Cloud TCHouse-D connector that enables you to connect to the Tencent Cloud TCHouse-D cluster by adding the necessary configurations during development. This document explains how to implement read and write operations for Tencent Cloud TCHouse-D in DLC.

Background Information

Tencent Cloud TCHouse-D is built on the industry-leading OLAP database kernel Apache Doris. It is compatible with the MySQL protocol and integrates with the cloud big data ecosystem. With extensive cluster control capabilities and a comprehensive inspection and alarm system, Tencent Cloud TCHouse-D offers fully managed cloud services that are easy to use and maintain, helping customers quickly perform real-time OLAP data analysis. For more information, see the Tencent Cloud TCHouse-D Overview.

Prerequisites

You have purchased a Tencent Cloud TCHouse-D cluster.
You have activated the DLC service.
Note:
Cross-regional read/write for Tencent Cloud TCHouse-D is not currently supported. Plan your environment accordingly to ensure that both Tencent Cloud TCHouse-D and DLC are in the same region.

Operation Steps

Step 1: Creating Tencent Cloud TCHouse-D Catalogs

1. Log in to the DLC console, select the service region, and ensure that the logged-in account has the catalog creation permission. To enable sub-account permissions, refer to Sub-account Permission Management.
2. Go to Data Management and click Create Data Catalog.
3. On the data source creation page, select Tencent Cloud TCHouse-D as the connection type. After entering the connection information, configure the network so that the engine can reach the external data source.

4. Enter the data source information and click Confirm to create the data source.
5. In the data catalog list, you can view the connection information, status, creator, and other details.
Note:
You need to enter a username and password for an account with the corresponding Tencent Cloud TCHouse-D data permissions. Otherwise, queries will return an error.
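Once the catalog is created, you can sanity-check the connection from the Data Explore query editor. The catalog, database, and table names below (tchouse, tpcds, f_table) are examples; substitute your own.

```sql
-- List the databases visible through the new catalog
SHOW DATABASES IN tchouse;

-- Preview a few rows to confirm that the configured account
-- actually has read permission on the underlying data
SELECT * FROM tchouse.tpcds.f_table LIMIT 10;
```

If either statement fails with a permission error, re-check the username and password entered in the data source configuration.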

Step 2: Querying or Writing Data to Tencent Cloud TCHouse-D

1. Log in to the DLC console, select Data Explore, and switch the data catalog in the top-left corner to the Tencent Cloud TCHouse-D catalog created in the previous step (assuming that the catalog is named "tchouse").
2. Create a query. Assume that a TPCDS benchmark dataset exists in Tencent Cloud TCHouse-D. Using the following sample SQL, you can join the dataset with the internal DLC data and write the results back to Tencent Cloud TCHouse-D.
INSERT INTO tchouse.tpcds.d_table
SELECT
  a.sk AS ctr_customer_sk,
  b.sk AS ctr_store_sk
FROM
  tchouse.tpcds.f_table a
LEFT JOIN
  DataLakeCatalog.dlc.dlc_table b
ON
  a.id = b.id
WHERE
  a.sssk = '123'
GROUP BY
  a.sk,
  b.sk;
3. Run the above query. The DLC computing engine reads the Tencent Cloud TCHouse-D dataset, joins it with the internal DLC table, and writes the results back to Tencent Cloud TCHouse-D.
Note:
Currently, dynamic overwrite is not supported for writing the query results back to Tencent Cloud TCHouse-D.
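The connector works in both directions, so the same catalog can also be used to pull TCHouse-D data into DLC. A minimal sketch, assuming a DLC native table DataLakeCatalog.dlc.dlc_copy already exists with a matching schema (all table and column names here are examples):

```sql
-- Copy a filtered slice of the TCHouse-D table into a DLC native table
-- for further processing on the DLC engine
INSERT INTO DataLakeCatalog.dlc.dlc_copy
SELECT
  a.sk,
  a.id
FROM
  tchouse.tpcds.f_table a
WHERE
  a.sssk = '123';
```

Because dynamic overwrite is not supported when writing to TCHouse-D, writes in that direction are append-style inserts; if you need replace semantics, clean up the target table on the TCHouse-D side before inserting.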
