
TDLC Command Line Interface Tool Access
Last updated: 2026-03-04 11:17:11
TDLC is a client command-line tool provided by Tencent Cloud Data Lake Compute (DLC). With TDLC, you can submit SQL and Spark tasks to the DLC data engine.
TDLC is written in Go, based on the Cobra framework, and supports configuring multiple buckets and cross-bucket operations. You can view TDLC usage by running ./tdlc [command] --help.

Downloading and Installation

The TDLC Tencent Cloud command line tool provides binary packages for the Windows, Mac, and Linux operating systems. After a simple installation and configuration it is ready to use. Select the download that matches your client's operating system.
Operating System
TDLC Binary Package Download Link
Windows
Mac
Linux
Rename the downloaded file to tdlc. Open a command line on your client and navigate to the download path. On Mac/Linux, run chmod +x tdlc to grant the file execute permission. Then run ./tdlc; if the following content is displayed, the installation succeeded and the tool is ready to use.
Tencentcloud DLC command tools is used to play around with DLC.
With TDLC user can manger engines, execute SQLs and submit Spark Jobs.

Usage:
tdlc [flags]
tdlc [command]

Available Commands:
config
help Help about any command
spark Submit spark app to engines.
sql Executing SQL.
version

Flags:
--endpoint string Endpoint of Tencentcloud account. (default "dlc.tencentcloudapi.com")
--engine string DLC engine. (default "public-engine")
-h, --help help for tdlc
--region string Region of Tencentcloud account.
--role-arn string Required by spark jar app.
--secret-id string SecretId of Tencentcloud account.
--secret-key string SecretKey of Tencentcloud account.
--token string Token of Tencentcloud account.

Use "tdlc [command] --help" for more information about a command.
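The Mac/Linux installation steps above can be sketched as a short shell session. The filename tdlc-download is a placeholder for whatever file you actually fetched (an empty stand-in is created here with touch so the sequence can be followed end to end):

```shell
# Placeholder for the real downloaded binary; in practice this is the
# file fetched from the download link for your operating system.
touch tdlc-download
# Rename the downloaded file to tdlc, as described above.
mv tdlc-download tdlc
# Mac/Linux only: grant execute permission.
chmod +x tdlc
# Confirm the file is now executable; running ./tdlc would then print
# the usage text shown above.
test -x tdlc && echo "tdlc is executable"
```

On Windows no chmod step is needed; run tdlc.exe from the download directory instead.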

Usage Instructions 

Global Parameters

TDLC provides the following global parameters.
Global Parameters
Description
--endpoint string
Service connection address. Defaults to dlc.tencentcloudapi.com.
--engine string
DLC data engine name. The default value is public-engine. It is recommended to use a Dedicated Data Engine.
--region string
Region to use. Examples: ap-nanjing, ap-beijing, ap-guangzhou, ap-shanghai, ap-chengdu, ap-chongqing, na-siliconvalley, ap-singapore, ap-hongkong
--role-arn string
When submitting Spark jobs, you must specify a role with permission to access COS files by passing the appropriate role-arn. For details on the role-arn, see Configuring Data Access Policy.
--secret-id string
Tencent Cloud account secretId
--secret-key string
Tencent Cloud account secretKey
--token string
(Optional) Tencent Cloud account temporary token

CONFIG Command

The config command configures commonly used parameters, which are then used as default values. Parameters passed on the command line override configured values.
Command
 Note
list
List current default configurations.
set
Change configuration.
unset
Reset configuration.
Example:
./tdlc config list
./tdlc config set secret-id={1} secret-key={2} region={3}
./tdlc config unset region
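As a sketch of the precedence rule above (the values are placeholders, and the commands are echoed rather than executed so no credentials are required):

```shell
# Persist defaults once; later invocations pick them up automatically.
echo "./tdlc config set secret-id={1} secret-key={2} region=ap-beijing"
# A flag given on the command line overrides the stored value for that run.
echo "./tdlc sql -e 'SELECT 1' --region ap-guangzhou"
```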

SQL Subcommands

The SQL subcommand currently supports Presto and SparkSQL clusters and accepts the following parameters.
Parameter
 Note
-e, --exec
Execute SQL statements
-f, --file
Execute SQL files. Separate multiple files with ;.
--no-result
Do not fetch results after execution.
-p, --progress
Display execution progress
-q, --quiet
Quiet mode, do not wait for task execution status after submission.
Example:
./tdlc sql -e "SELECT 1" --secret-id aa --secret-key bb --region ap-beijing --engine public-engine
./tdlc sql -f ~/biz.sql --no-result 
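For the -f form, a minimal SQL file might look like the following. The database and table names are hypothetical, and the final invocation assumes credentials have already been configured:

```shell
# Write a small SQL file; SQL statements end with semicolons.
cat > biz.sql <<'EOF'
SELECT 1;
SELECT COUNT(*) FROM demo_db.demo_table;
EOF
# Inspect the file contents.
cat biz.sql
# Then execute it against the engine (requires configured credentials):
#   ./tdlc sql -f biz.sql --no-result
```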

SPARK Subcommands

Spark subcommands include the following commands, which can be used to submit Spark jobs, view running logs, and terminate tasks.
Command
Description
submit
Submit a task through spark-submit.
run 
Execute Spark jobs.
log
View running logs.
list
View the list of Spark jobs.
kill
Terminate tasks.
The spark submit subcommand supports the following parameters. File-related parameters accept local files or cosn:// paths.
Parameter
 Note
--driver-size 
Driver specification: small (default), medium, large, or xlarge. For memory-optimized clusters, use m.small, m.medium, m.large, or m.xlarge.
--executor-size 
Executor specification: small (default), medium, large, or xlarge. For memory-optimized clusters, use m.small, m.medium, m.large, or m.xlarge.
--executor-num
Number of executors.
--files
Dependent files.
--archives
Dependent compressed files
--class
Main class of the Java/Scala application.
--jars
Dependent jar files, separated by commas.
--name
Application name
--py-files
Dependent Python files, supports .zip, .egg, .py formats.
--conf
Additional configurations
--network
Network configuration, for example --network "network name".
Example:
./tdlc spark submit --name spark-demo1 --engine sparkjar --jars /root/sparkjar-dep.jar --class com.demo.Example /root/sparkjar-main.jar arg1
./tdlc spark submit --name spark-demo2 cosn://bucket1/abc.py arg1
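The remaining spark subcommands fit a simple job lifecycle. The sketch below only echoes the commands (so it runs without an engine), and the task-id argument to log and kill is an assumption; run ./tdlc spark log --help and ./tdlc spark kill --help to see the actual argument forms:

```shell
# Submit a job, then inspect and manage it.
echo "./tdlc spark submit --name spark-demo1 --engine sparkjar /root/sparkjar-main.jar"
echo "./tdlc spark list"            # view the list of Spark jobs
echo "./tdlc spark log <task-id>"   # view running logs (id form assumed)
echo "./tdlc spark kill <task-id>"  # terminate the task (id form assumed)
```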