Operating System | TDLC Binary Packet Download Link |
Windows | |
Mac | |
Linux | |
Download the binary package for your operating system and rename it tdlc. Open the command line on your client and navigate to the download path; if you are using Mac/Linux, run chmod +x tdlc to grant executable permissions to the file. Execute ./tdlc, and if the following content is displayed, the installation is complete and the tool is ready for use.

Tencentcloud DLC command tools is used to play around with DLC.
With TDLC user can manger engines, execute SQLs and submit Spark Jobs.

Usage:
  tdlc [flags]
  tdlc [command]

Available Commands:
  config
  help        Help about any command
  spark       Submit spark app to engines.
  sql         Executing SQL.
  version

Flags:
      --endpoint string     Endpoint of Tencentcloud account. (default "dlc.tencentcloudapi.com")
      --engine string       DLC engine. (default "public-engine")
  -h, --help                help for tdlc
      --region string       Region of Tencentcloud account.
      --role-arn string     Required by spark jar app.
      --secret-id string    SecretId of Tencentcloud account.
      --secret-key string   SecretKey of Tencentcloud account.
      --token string        Token of Tencentcloud account.

Use "tdlc [command] --help" for more information about a command.
Global Parameters | Description |
--endpoint string | Service connection address, and dlc.tencentcloudapi.com is used by default. |
--engine string | DLC data engine name. The default value is public-engine. It is recommended to use a Dedicated Data Engine. |
--region string | Region to be used. Examples: ap-nanjing, ap-beijing, ap-guangzhou, ap-shanghai, ap-chengdu, ap-chongqing, na-siliconvalley, ap-singapore, ap-hongkong. |
--role-arn string | When submitting Spark jobs, you need to specify the permissions used to access COS files by assigning a role arn that carries the required permissions. For details on the role arn, see Configuring Data Access Policy. |
--secret-id string | Tencent Cloud account secretId |
--secret-key string | Tencent Cloud account secretKey |
--token string | (Optional) Tencent Cloud account temporary token |
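The global parameters above can be combined with any command. A minimal sketch, assuming placeholder credentials and a hypothetical dedicated engine name (replace all values with your own):

```shell
# All values below are placeholders; substitute your own SecretId, SecretKey,
# region, and data engine name before running.
./tdlc sql -e "SHOW DATABASES" \
  --secret-id AKIDxxxxxxxxxxxx \
  --secret-key xxxxxxxxxxxxxxxx \
  --region ap-beijing \
  --engine my-dedicated-engine
```

Passing credentials on every invocation quickly becomes tedious; the config command described next stores them as defaults.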
Command | Note |
list | List current default configurations. |
set | Change configuration. |
unset | Reset configuration. |
./tdlc config list
./tdlc config set secret-id={1} secret-key={2} region={b}
./tdlc config unset region
Parameter | Note |
-e, --exec | Execute SQL statements |
-f, --file | Execute SQL files. Separate multiple SQL files with ;. |
--no-result | Do not fetch results after execution. |
-p, --progress | Display execution progress |
-q, --quiet | Quiet mode, do not wait for task execution status after submission. |
./tdlc sql -e "SELECT 1" --secret-id aa --secret-key bb --region ap-beijing --engine public-engine./tdlc sql -f ~/biz.sql --no-result
Command | Description |
submit | Submit a task through spark-submit. |
run | Execute Spark jobs. |
log | View running logs. |
list | View the list of Spark jobs. |
kill | Terminate tasks. |
Parameter | Note |
--driver-size | Driver specification. Valid values: small (default), medium, large, xlarge. For memory-optimized clusters, use m.small, m.medium, m.large, m.xlarge. |
--executor-size | Executor specification. Valid values: small (default), medium, large, xlarge. For memory-optimized clusters, use m.small, m.medium, m.large, m.xlarge. |
--executor-num | Number of executors |
--files | Dependent files, separated by commas. |
--archives | Dependent compressed files |
--class | Primary function run by the Java/Scala application. |
--jars | Dependent jar files, separated by commas. |
--name | Application name |
--py-files | Dependent Python files, supports .zip, .egg, .py formats. |
--conf | Additional configurations |
--network | Network configuration, for example --network "network name" |
./tdlc spark submit --name spark-demo1 --engine sparkjar --jars /root/sparkjar-dep.jar --class com.demo.Example /root/sparkjar-main.jar arg1
./tdlc spark submit --name spark-demo2 cosn://bucket1/abc.py arg1
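The resource-related flags from the table above can be combined in a single submit. A sketch, assuming a hypothetical engine name, COS bucket, jar path, and main class (none of these names come from the examples above):

```shell
# Engine name, bucket, jar path, and class are all placeholders.
./tdlc spark submit --name wordcount-demo \
  --engine my-spark-engine \
  --driver-size medium \
  --executor-size medium \
  --executor-num 4 \
  --conf spark.default.parallelism=200 \
  --class com.example.WordCount \
  cosn://my-bucket/jars/wordcount.jar cosn://my-bucket/input/
```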