5 changes: 4 additions & 1 deletion NEXT_CHANGELOG.md
@@ -4,14 +4,17 @@

### CLI

* Promote the aitools skills-management surface (`install`, `update`, `uninstall`, `list`, `version`) from `databricks experimental aitools` to top-level `databricks aitools`. The old paths under `databricks experimental aitools` continue to work as silent backward-compat aliases. The `tools` subtree (`query`, `discover-schema`, `get-default-warehouse`, `statement …`) and the `skills` alias group remain under `databricks experimental aitools`.
* Promote the aitools skills-management surface (`install`, `update`, `uninstall`, `list`, `version`) from `databricks experimental aitools` to top-level `databricks aitools`. The old paths under `databricks experimental aitools` continue to work as silent backward-compat aliases. The `tools` subtree (`query`, `discover-schema`, `get-default-warehouse`, `statement …`) and the `skills` alias group remain under `databricks experimental aitools`. The `--project` and `--global` flags on `install`, `update`, `uninstall`, and `list` are deprecated in favor of `--scope=project|global` (with `--scope=both` accepted by `update` and `list`); the booleans keep working with a stderr deprecation warning and will be removed in a later release. Example invocations follow this list.
* `databricks api` now works against unified hosts. Adds `--account` to scope a call to the account API and `--workspace-id` to override the workspace routing identifier per call. A `?o=<workspace-id>` query parameter on the path (the SPOG URL convention used by the Databricks UI) is also recognized as a per-call workspace override, so URLs pasted from the browser route correctly.
* JSON output for single objects now uses standard `"key": "value"` spacing (matching list output and `encoding/json` defaults).
* `databricks auth describe` now reports where U2M (`databricks-cli`) tokens are stored: `plaintext` (`~/.databricks/token-cache.json`) or `secure` (OS keyring), and the source of the choice (env var, config setting, or default).
* Marked the default profile in the interactive pickers shown by `databricks auth switch`, `databricks auth logout`, `databricks auth token`, and `databricks auth login`, and moved it to the top of the list. `databricks auth login` and `databricks auth logout` now offer the same selectors as `databricks auth token` and `databricks auth switch` respectively.
* `databricks aitools list` honors `--output json`, emitting a structured `{release, skills[…], summary{}}` document so coding agents and CI can consume the skill/version/installation matrix without scraping the tabular text output.
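
For illustration, a few example invocations of the CLI changes described above; the commands and flags are taken from the changelog entries themselves, while the paths, IDs, and comments are placeholders rather than captured output:

```bash
# Scope flag on the promoted aitools commands (replaces the deprecated booleans):
databricks aitools install --scope project   # was: databricks aitools install --project
databricks aitools list --scope both         # "both" is accepted only by update and list
databricks aitools update --global           # still works; prints a deprecation warning on stderr

# Unified-host support for `databricks api` (endpoint paths and IDs are illustrative):
databricks api get /api/2.0/clusters/list --workspace-id 1234567890
databricks api get "/api/2.0/clusters/list?o=1234567890"              # browser-style ?o= override on the path
databricks api get /api/2.0/accounts/abc-123/workspaces --account     # scope the call to the account API
```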

### Bundles
* Stop applying `presets.name_prefix` (and the dev-mode `[dev <user>]` rename) to `vector_search_endpoints` ([#5209](https://github.com/databricks/cli/pull/5209)).

* Fix `bundle generate job` to preserve the nested notebook directory structure ([#4596](https://github.com/databricks/cli/pull/4596)); see the sketch after this list.
* Propagate authentication environment (including `DATABRICKS_CONFIG_PROFILE`) to the `experimental.python` subprocess so bundle validate/deploy no longer fails with a multi-profile host ambiguity error when several profiles in `~/.databrickscfg` share the same host.
* Fixed `--force-pull` on `bundle summary` and `bundle open` so the flag bypasses the local state cache and reads state from the workspace.
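
As a sketch of the nested-notebook fix referenced above, the job_nested_notebooks acceptance test added in this PR drives roughly the following flow (the job ID and paths are test fixtures, not real resources):

```bash
# Generate bundle config from an existing job whose notebooks live in nested workspace folders.
databricks bundle generate job --existing-job-id 1234 --key out --config-dir . --source-dir src --force

# Previously both notebooks were flattened directly into src/; the fix preserves the
# folder structure of the workspace notebook paths:
find src -type f | sort
# src/my_folder/my_notebook.py
# src/other_folder/other_notebook.py
```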

2 changes: 1 addition & 1 deletion acceptance/bundle/debug/list-targets/databricks.yml
@@ -15,7 +15,7 @@ targets:
mode: development
workspace:
host: https://dev.example.com
staging: {}
staging:
prod:
mode: production
workspace:
@@ -0,0 +1,2 @@
bundle:
name: nested_notebooks
11 changes: 11 additions & 0 deletions acceptance/bundle/generate/job_nested_notebooks/out.job.yml
@@ -0,0 +1,11 @@
resources:
jobs:
out:
name: dev.my_repo.my_job
tasks:
- task_key: my_notebook_task
notebook_task:
notebook_path: src/my_folder/my_notebook.py
- task_key: other_notebook_task
notebook_task:
notebook_path: src/other_folder/other_notebook.py
3 changes: 3 additions & 0 deletions acceptance/bundle/generate/job_nested_notebooks/out.test.toml

Some generated files are not rendered by default.

9 changes: 9 additions & 0 deletions acceptance/bundle/generate/job_nested_notebooks/output.txt
@@ -0,0 +1,9 @@
File successfully saved to src/my_folder/my_notebook.py
File successfully saved to src/other_folder/other_notebook.py
Job configuration successfully saved to out.job.yml
=== old flattened files should be gone ===
src/my_notebook.py removed
src/other_notebook.py removed
=== new nested files ===
src/my_folder/my_notebook.py
src/other_folder/other_notebook.py
12 changes: 12 additions & 0 deletions acceptance/bundle/generate/job_nested_notebooks/script
@@ -0,0 +1,12 @@
mkdir -p src
echo "old" > src/my_notebook.py
echo "old" > src/other_notebook.py

$CLI bundle generate job --existing-job-id 1234 --config-dir . --key out --force --source-dir src 2>&1 | sort

echo "=== old flattened files should be gone ==="
test ! -f src/my_notebook.py && echo "src/my_notebook.py removed" || echo "src/my_notebook.py still exists"
test ! -f src/other_notebook.py && echo "src/other_notebook.py removed" || echo "src/other_notebook.py still exists"

echo "=== new nested files ==="
find src -type f | sort
42 changes: 42 additions & 0 deletions acceptance/bundle/generate/job_nested_notebooks/test.toml
@@ -0,0 +1,42 @@
Ignore = ["src"]

[[Server]]
Pattern = "GET /api/2.2/jobs/get"
Response.Body = '''
{
"job_id": 11223344,
"settings": {
"name": "dev.my_repo.my_job",
"tasks": [
{
"task_key": "my_notebook_task",
"notebook_task": {
"notebook_path": "/my_data_product/dev/my_folder/my_notebook"
}
},
{
"task_key": "other_notebook_task",
"notebook_task": {
"notebook_path": "/my_data_product/dev/other_folder/other_notebook"
}
}
]
}
}
'''

[[Server]]
Pattern = "GET /api/2.0/workspace/get-status"
Response.Body = '''
{
"object_type": "NOTEBOOK",
"language": "PYTHON",
"repos_export_format": "SOURCE"
}
'''

[[Server]]
Pattern = "GET /api/2.0/workspace/export"
Response.Body = '''
print("Hello, World!")
'''
@@ -0,0 +1,2 @@
bundle:
name: python_job_and_deploy

Some generated files are not rendered by default.

29 changes: 29 additions & 0 deletions acceptance/bundle/generate/python_job_and_deploy/output.txt
@@ -0,0 +1,29 @@

=== Upload notebook to a workspace path
>>> [CLI] workspace import /Workspace/Users/[USERNAME]/test_notebook.py --file test_notebook.py --format AUTO --overwrite

=== Create a job that references the notebook
Created job

=== Generate bundle config from the job
>>> [CLI] bundle generate job --existing-job-id [JOB_ID] --key out --config-dir resources --source-dir src --force
File successfully saved to src/test_notebook.py
Job configuration successfully saved to resources/out.job.yml

=== Verify generated yaml has expected fields
=== Deploy the generated bundle
>>> [CLI] bundle deploy
Uploading bundle files to /Workspace/Users/[USERNAME]/.bundle/python_job_and_deploy/default/files...
Deploying resources...
Deployment complete!

=== Destroy the deployed bundle
>>> [CLI] bundle destroy --auto-approve
All files and directories at the following location will be deleted: /Workspace/Users/[USERNAME]/.bundle/python_job_and_deploy/default

Deleting files...
Destroy complete!

=== Cleanup: delete the original job and notebook
>>> errcode [CLI] jobs delete [JOB_ID]

>>> errcode [CLI] workspace delete /Workspace/Users/[USERNAME]/test_notebook
40 changes: 40 additions & 0 deletions acceptance/bundle/generate/python_job_and_deploy/script
@@ -0,0 +1,40 @@
title "Upload notebook to a workspace path"
trace $CLI workspace import "/Workspace/Users/${CURRENT_USER_NAME}/test_notebook.py" --file test_notebook.py --format AUTO --overwrite

title "Create a job that references the notebook"
JOB_ID=$($CLI jobs create --json '{
"name": "test-job",
"max_concurrent_runs": 1,
"queue": {"enabled": true},
"tasks": [
{
"task_key": "test_task",
"notebook_task": {
"notebook_path": "/Workspace/Users/'${CURRENT_USER_NAME}'/test_notebook"
}
}
]
}' | jq -r '.job_id')
echo "Created job"
# Disable MSYS_NO_PATHCONV when invoking python scripts: with it set, Git Bash on Windows
# fails to translate the script path so the python interpreter can't find the file.
env -u MSYS_NO_PATHCONV add_repl.py "$JOB_ID" JOB_ID

cleanup() {
title "Cleanup: delete the original job and notebook"
trace errcode $CLI jobs delete "$JOB_ID"
trace errcode $CLI workspace delete "/Workspace/Users/${CURRENT_USER_NAME}/test_notebook"
}
trap cleanup EXIT

title "Generate bundle config from the job"
trace $CLI bundle generate job --existing-job-id "$JOB_ID" --key out --config-dir resources --source-dir src --force

title "Verify generated yaml has expected fields"
cat resources/out.job.yml | env -u MSYS_NO_PATHCONV contains.py "task_key: test_task" "notebook_task:" "notebook_path: ../src/test_notebook.py" > /dev/null

title "Deploy the generated bundle"
trace $CLI bundle deploy

title "Destroy the deployed bundle"
trace $CLI bundle destroy --auto-approve
13 changes: 13 additions & 0 deletions acceptance/bundle/generate/python_job_and_deploy/test.toml
@@ -0,0 +1,13 @@
Local = true
Cloud = true

Ignore = [
"databricks.yml",
"resources/*",
"src/*",
".databricks",
]

[Env]
# MSYS2 automatically converts absolute paths on Windows; disable for the workspace path.
MSYS_NO_PATHCONV = "1"
@@ -0,0 +1,2 @@
# Databricks notebook source
print("Hello, World!")
19 changes: 17 additions & 2 deletions aitools/cmd/install.go
@@ -52,7 +52,7 @@ func defaultPromptAgentSelection(ctx context.Context, detected []*agents.Agent)
}

func NewInstallCmd() *cobra.Command {
var skillsFlag, agentsFlag string
var skillsFlag, agentsFlag, scopeFlag string
var includeExperimental bool
var projectFlag, globalFlag bool

@@ -62,17 +62,30 @@ func NewInstallCmd() *cobra.Command {
Long: `Install Databricks AI skills for detected coding agents.

By default, skills are installed globally to each agent's skills directory.
Use --project to install to the current project directory instead.
Use --scope=project to install to the current project directory instead.
When multiple agents are detected, skills are stored in a canonical location
and symlinked to each agent to avoid duplication.

Use --skills name1,name2 to install specific skills.

Agent selection:
--agents <name>[,<name>...] Install only for the named agents.
(unset, interactive) Multi-select prompt over detected agents.
(unset, non-interactive) Install for every detected agent.

The list of agents the command will act on is always logged to stderr before
the install runs, so callers can verify what was picked.

Supported agents: Claude Code, Cursor, Codex CLI, OpenCode, GitHub Copilot, Antigravity`,
Args: cobra.NoArgs,
RunE: func(cmd *cobra.Command, args []string) error {
ctx := cmd.Context()

projectFlag, globalFlag, err := parseScopeFlag(scopeFlag, projectFlag, globalFlag, false)
if err != nil {
return err
}

// Resolve scope.
scope, err := resolveScopeWithPrompt(ctx, projectFlag, globalFlag)
if err != nil {
@@ -131,8 +144,10 @@ Supported agents: Claude Code, Cursor, Codex CLI, OpenCode, GitHub Copilot, Anti
cmd.Flags().StringVar(&skillsFlag, "skills", "", "Specific skills to install (comma-separated)")
cmd.Flags().StringVar(&agentsFlag, "agents", "", "Agents to install for (comma-separated, e.g. claude-code,cursor)")
cmd.Flags().BoolVar(&includeExperimental, "experimental", false, "Include experimental skills")
cmd.Flags().StringVar(&scopeFlag, "scope", "", "Install scope: project or global (default: global, or prompt when interactive)")
cmd.Flags().BoolVar(&projectFlag, "project", false, "Install to project directory (cwd)")
cmd.Flags().BoolVar(&globalFlag, "global", false, "Install globally (default)")
markScopeBoolsDeprecated(cmd)
return cmd
}

39 changes: 39 additions & 0 deletions aitools/cmd/install_test.go
@@ -409,6 +409,45 @@ func TestInstallGlobalAndProjectErrors(t *testing.T) {
assert.Contains(t, err.Error(), "cannot use --global and --project together")
}

func TestInstallScopeFlag(t *testing.T) {
tests := []struct {
name string
args []string
wantScope string
wantErr string
}{
{name: "scope project", args: []string{"--scope", "project"}, wantScope: installer.ScopeProject},
{name: "scope global", args: []string{"--scope", "global"}, wantScope: installer.ScopeGlobal},
{name: "scope both rejected", args: []string{"--scope", "both"}, wantErr: "--scope=both is not supported"},
{name: "scope invalid value", args: []string{"--scope", "all"}, wantErr: `invalid --scope "all"`},
{name: "scope conflicts with legacy", args: []string{"--scope", "global", "--project"}, wantErr: "cannot use --scope with --project or --global"},
}

for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
setupTestAgents(t)
calls := setupInstallMock(t)

ctx := cmdio.MockDiscard(t.Context())
cmd := NewInstallCmd()
cmd.SetContext(ctx)
cmd.SetArgs(tt.args)
cmd.SilenceErrors = true
cmd.SilenceUsage = true

err := cmd.Execute()
if tt.wantErr != "" {
require.Error(t, err)
assert.Contains(t, err.Error(), tt.wantErr)
return
}
require.NoError(t, err)
require.Len(t, *calls, 1)
assert.Equal(t, tt.wantScope, (*calls)[0].opts.Scope)
})
}
}

func TestInstallNoFlagNonInteractiveUsesGlobal(t *testing.T) {
setupTestAgents(t)
calls := setupInstallMock(t)
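
To make the flag validation encoded by the table-driven test above concrete, here is a hedged sketch of the corresponding CLI behavior (the error substrings come from the test expectations; full messages may differ):

```bash
databricks aitools install --scope project            # resolves to project scope
databricks aitools install --scope global             # resolves to global scope
databricks aitools install --scope both               # rejected: --scope=both is not supported
databricks aitools install --scope all                # rejected: invalid --scope "all"
databricks aitools install --scope global --project   # rejected: cannot use --scope with --project or --global
databricks aitools install --global --project         # rejected: cannot use --global and --project together
```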