You can achieve that by piping several values at once to the configure command, as is done for Databricks Connect here:
echo "$(databricks_host)
$(databricks_token)" | databricks configure --token
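If the multi-line echo is awkward in your shell, a single printf produces the same two-line input. This is a runnable sketch: the host and token values are placeholders, and cat stands in for databricks configure --token so it executes without a workspace.

```shell
# Placeholder credentials for illustration only.
DATABRICKS_HOST="https://example.azuredatabricks.net"
DATABRICKS_TOKEN="dapi-example-token"

# printf emits the host on line 1 and the token on line 2, which is
# exactly what `databricks configure --token` reads from stdin.
# `cat` is a stand-in for the real command here.
printf '%s\n%s\n' "$DATABRICKS_HOST" "$DATABRICKS_TOKEN" | cat
```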
But in reality, you can just use the environment variables DATABRICKS_HOST
and DATABRICKS_TOKEN
as described in the documentation - the Databricks CLI will pick them up.
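A minimal sketch of the environment-variable route (the values are placeholders; in a real pipeline they come from secret variables). Once both variables are exported, CLI calls authenticate without any databricks configure step:

```shell
# Placeholder values; real ones come from pipeline secret variables.
export DATABRICKS_HOST="https://example.azuredatabricks.net"
export DATABRICKS_TOKEN="dapi-example-token"

# Any subsequent CLI call, e.g. `databricks workspace list /`,
# will read these variables instead of ~/.databrickscfg.
echo "CLI will authenticate against: $DATABRICKS_HOST"
```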
The only thing you need to take into account is that sensitive data, like the Databricks token, needs to be stored securely in the pipeline definition, and you'll need to access it in a special way - via env
(full definition is here):
- script: |
    echo "[{{cookiecutter.profile}}]" >> ~/.databrickscfg
    echo "host = $DATABRICKS_HOST" >> ~/.databrickscfg
    echo "token = $DATABRICKS_TOKEN" >> ~/.databrickscfg
  env:
    DATABRICKS_HOST: $(DATABRICKS_HOST)
    DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
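Run outside the pipeline, the script portion above produces a config file like the following. In this sketch a temporary file stands in for ~/.databrickscfg, a literal DEFAULT profile replaces the {{cookiecutter.profile}} placeholder, and the host/token values are placeholders:

```shell
# Placeholders; in the pipeline the real values are injected via `env`.
DATABRICKS_HOST="https://example.azuredatabricks.net"
DATABRICKS_TOKEN="dapi-example-token"

CFG="$(mktemp)"  # stand-in for ~/.databrickscfg

# Same three lines as the pipeline step writes.
echo "[DEFAULT]" >> "$CFG"
echo "host = $DATABRICKS_HOST" >> "$CFG"
echo "token = $DATABRICKS_TOKEN" >> "$CFG"

cat "$CFG"
```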
P.S. You can use the cicd-templates project to generate a template for Azure DevOps (or for GitHub Actions) that works out of the box with Databricks.