πŸ›  Steps to Set Up the Databricks CLI on Linux


1. Install Python & pip

The Databricks CLI is distributed as a Python package, so it needs Python 3 and pip. Check whether Python 3 is installed:

python3 --version

If not, install it:

sudo apt update
sudo apt install python3 python3-pip -y
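pip is included in the install command above; a quick check that both tools are on your PATH before moving on:

# Confirm the interpreter and the package manager are both available
python3 --version
pip3 --version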

2. Install the Databricks CLI

Run:

pip3 install databricks-cli --upgrade

Check installation:

databricks --version
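If databricks isn't recognized after the pip install, the entry point most likely landed in a per-user directory that isn't on your PATH yet (on most Linux distributions this is ~/.local/bin). A minimal fix, assuming a bash shell:

# Assumption: pip placed the CLI under ~/.local/bin (typical for user / non-root installs)
export PATH="$HOME/.local/bin:$PATH"
# Persist the change for future shells
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
# Re-check
databricks --version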

3. Generate a Personal Access Token (PAT) in Databricks

  1. Go to your Databricks workspace in the browser.
  2. Click on your username (top right) β†’ User Settings.
  3. Under Access Tokens, click Generate New Token.
  4. Copy the token right away (you won’t see it again); an optional way to hand it to the CLI without retyping is sketched after this list.
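If you'd rather not paste the token interactively in the next step, the pip-installed CLI also recognizes credentials supplied as environment variables. A minimal sketch, assuming a bash-style shell; both placeholder values are yours to replace:

# Optional: export credentials for the current shell session instead of
# (or in addition to) running "databricks configure --token" in step 4.
export DATABRICKS_HOST="https://<workspace-url>.cloud.databricks.com"
export DATABRICKS_TOKEN="<paste-your-token-here>"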

4. Configure the Databricks CLI

Run:

databricks configure --token

It will prompt for two values:

  • Databricks Host (URL) β†’ for example:
      • AWS: https://<workspace-url>.cloud.databricks.com
      • Azure: https://<workspace-name>.azuredatabricks.net
  • Token β†’ paste the token you generated in step 3.
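Once both prompts are answered, the CLI stores the host and token in a configuration file in your home directory; for the pip-installed CLI this is ~/.databrickscfg. A quick way to confirm it was written:

# Inspect the generated config file (it contains your token, so keep it private)
cat ~/.databrickscfg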

5. Test the CLI

Run a test command to list clusters:

databricks clusters list

If successful, you’ll see details of your clusters.
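If the workspace doesn't have any clusters yet, the listing may come back empty; in that case a workspace listing is another quick connectivity check:

# Another smoke test: list the top-level workspace folders
databricks workspace ls /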


6. (Optional) Store Multiple Profiles

You can save logins for multiple workspaces as profiles in ~/.databrickscfg. Example:

[DEFAULT]
host = https://myworkspace.azuredatabricks.net
token = dapi123abc

[staging]
host = https://staging-workspace.azuredatabricks.net
token = dapi456def

Then use:

databricks --profile staging clusters list
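Instead of editing ~/.databrickscfg by hand, you can also create a named profile through the same interactive flow; the --profile flag controls which section of the file gets written:

# Write credentials into the [staging] section instead of [DEFAULT]
databricks configure --token --profile staging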

7. Use the CLI for Common Tasks

Examples:

# List jobs
databricks jobs list

# Upload a Python file to DBFS
databricks fs cp myscript.py dbfs:/FileStore/scripts/myscript.py

# Run a job
databricks jobs run-now --job-id <job_id>
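A few more commands in the same vein; the paths and IDs below are placeholders to replace with your own:

# List files already uploaded to DBFS
databricks fs ls dbfs:/FileStore/scripts/

# Copy a file back from DBFS to the local machine
databricks fs cp dbfs:/FileStore/scripts/myscript.py ./myscript.py

# Check the status of a run started with run-now
databricks runs get --run-id <run_id>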