
Conversation

@fjakobs fjakobs commented Nov 13, 2025

Changes

This PR adds an MCP server for building Databricks apps as a new command: databricks experimental mcp.

The code is a direct port of the Rust code base at https://github.com/appdotbuild/agent.

Why

Tests

  • The new code has ~80% test coverage

@fjakobs fjakobs changed the title Apps mcp Add experimental Apps MCP server Nov 13, 2025
eng-dev-ecosystem-bot commented Nov 13, 2025

Run: 19463623415

Env 🟨​KNOWN 🔄​flaky 💚​RECOVERED 🙈​SKIP ✅​pass 🙈​skip Time
🟨​ aws linux 7 2 360 613 16:57
🟨​ aws windows 7 2 361 612 16:34
💚​ aws-ucws linux 7 2 498 503 17:36
💚​ aws-ucws windows 7 2 499 502 22:36
🔄​ azure linux 4 1 4 356 612 21:19
💚​ azure windows 1 4 361 611 22:18
💚​ azure-ucws linux 1 4 494 502 23:10
💚​ azure-ucws windows 1 4 495 501 21:40
💚​ gcp linux 1 4 356 615 19:04
💚​ gcp windows 1 4 357 614 19:39
13 failing tests:
Test Name aws linux aws windows aws-ucws linux aws-ucws windows azure linux azure windows azure-ucws linux azure-ucws windows gcp linux gcp windows
TestAccept 🟨​K 🟨​K 💚​R 💚​R 💚​R 💚​R 💚​R 💚​R 💚​R 💚​R
TestAccept/bundle/resources/permissions 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/with_permissions 🟨​K 🟨​K 💚​R 💚​R 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/with_permissions/DATABRICKS_BUNDLE_ENGINE=direct 🟨​K 🟨​K 💚​R 💚​R
TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/with_permissions/DATABRICKS_BUNDLE_ENGINE=terraform 🟨​K 🟨​K 💚​R 💚​R
TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/without_permissions 🟨​K 🟨​K 💚​R 💚​R 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/without_permissions/DATABRICKS_BUNDLE_ENGINE=direct 🟨​K 🟨​K 💚​R 💚​R
TestAccept/bundle/resources/permissions/jobs/destroy_without_mgmtperms/without_permissions/DATABRICKS_BUNDLE_ENGINE=terraform 🟨​K 🟨​K 💚​R 💚​R
TestAccept/bundle/run/app-with-job 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S 🙈​S
TestWorkspaceFilesExtensions_ExportFormatIsPreserved/jupyter_r ✅​p ✅​p ✅​p ✅​p 🔄​f ✅​p ✅​p ✅​p ✅​p ✅​p
TestWorkspaceFilesExtensions_ExportFormatIsPreserved/jupyter_scala ✅​p ✅​p ✅​p ✅​p 🔄​f ✅​p ✅​p ✅​p ✅​p ✅​p
TestWorkspaceFilesExtensions_ExportFormatIsPreserved/jupyter_sql ✅​p ✅​p ✅​p ✅​p 🔄​f ✅​p ✅​p ✅​p ✅​p ✅​p
TestWorkspaceFilesExtensions_ExportFormatIsPreserved/source_python ✅​p ✅​p ✅​p ✅​p 🔄​f ✅​p ✅​p ✅​p ✅​p ✅​p
Top 30 slowest tests (at least 2 minutes):
duration env testname
8:25 azure windows TestAccept/bundle/resources/clusters/deploy/data_security_mode/DATABRICKS_BUNDLE_ENGINE=direct
6:22 gcp linux TestAccept/bundle/resources/clusters/deploy/data_security_mode/DATABRICKS_BUNDLE_ENGINE=direct
6:03 azure linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
5:48 gcp linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
5:39 aws-ucws windows TestAccept/bundle/resources/clusters/deploy/data_security_mode/DATABRICKS_BUNDLE_ENGINE=direct
5:39 gcp windows TestAccept/bundle/resources/clusters/deploy/data_security_mode/DATABRICKS_BUNDLE_ENGINE=direct
5:35 gcp linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
5:31 aws windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
5:31 aws-ucws windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
5:30 aws-ucws windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
5:28 aws windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
5:25 gcp windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
5:19 aws windows TestAccept/bundle/resources/clusters/deploy/data_security_mode/DATABRICKS_BUNDLE_ENGINE=direct
5:16 gcp windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
4:59 aws linux TestAccept/bundle/resources/clusters/deploy/data_security_mode/DATABRICKS_BUNDLE_ENGINE=direct
4:01 azure linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
3:27 azure-ucws linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
3:09 azure-ucws linux TestAccept/bundle/resources/synced_database_tables/basic
2:52 aws linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
2:36 azure windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
2:28 aws linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
2:27 azure-ucws windows TestAccept/bundle/resources/synced_database_tables/basic
2:26 azure windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
2:17 azure-ucws linux TestAccept/bundle/resources/clusters/deploy/data_security_mode/DATABRICKS_BUNDLE_ENGINE=direct
2:17 aws-ucws linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
2:14 azure-ucws windows TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=terraform
2:10 aws-ucws linux TestAccept/bundle/resources/clusters/deploy/data_security_mode/DATABRICKS_BUNDLE_ENGINE=direct
2:10 azure linux TestAccept/bundle/resources/clusters/deploy/data_security_mode/DATABRICKS_BUNDLE_ENGINE=direct
2:06 aws-ucws linux TestAccept/bundle/resources/clusters/deploy/update-after-create/DATABRICKS_BUNDLE_ENGINE=direct
2:04 azure-ucws windows TestAccept/bundle/resources/clusters/deploy/data_security_mode/DATABRICKS_BUNDLE_ENGINE=direct


@lennartkats-db lennartkats-db left a comment


Mostly non-blocking comments; I want to get this in ASAP, please review.



The server communicates via stdio using the Model Context Protocol.`,
Example: ` # Start MCP server with required warehouse
databricks experimental apps-mcp --warehouse-id abc123

Non-blocking: we should apply the "default warehouse" logic from https://github.com/databricks/cli/pull/3907/files?w=1#diff-5af5f7c0874402273882f7e0c97c9bc7a0471ad92771d90ef23c415da090f09aR74 here. That same logic, discussed with the default warehouse team, is also applied in many other client-side tools. We'll get a proper API for it in the future.

@fjakobs (author) replied:

I think we should pull in the complete getting started logic from your PR


// Define flags
cmd.Flags().StringVar(&warehouseID, "warehouse-id", "", "Databricks SQL Warehouse ID")
cmd.Flags().BoolVar(&allowDeployment, "allow-deployment", false, "Enable deployment tools")

Non-blocking: I don't think we want to keep this concept. And it seems odd to default to false?

)

// JSON-RPC 2.0 error codes as defined in the specification.
const (

return t.writer.WriteEntry(entry)
}

func (t *Tracker) RecordToolCall(toolName string, args any, result *mcpsdk.CallToolResult, err error) {

Non-blocking: this is something we should also have for a CLI-based approach!

}

cmd.AddCommand(aitools.New())
cmd.AddCommand(mcp.NewMcpCmd())

The aitools package should also nest a cmd package.

databricks experimental apps-mcp check
# Check with specific profile
databricks experimental apps-mcp check --profile production`,

Is this a tool call or for real users?

We have databricks auth describe that does the same.

@fjakobs (author) replied:

I think this was a carry-over from Rust. I'll remove it.

- github.com/databricks/databricks-sdk-go v0.90.0 (updated)
- Dependencies will be fully populated when code is copied

For parity-40 (Phase 2: Code Structure Migration)
Copied all library code with original structure:
- Utilities (pathutil, fileutil, version, errors)
- Templates (code + embedded)
- Sandbox (interface + local + dagger)
- Session and trajectory
- Providers (databricks, io, workspace, deployment)
- MCP server

Import paths still reference old package. Will fix in next phase.

For parity-40 (Phase 2: Code Structure Migration)
Changed all imports:
- github.com/databricks/go-mcp/pkg → github.com/databricks/cli/libs/mcp
- github.com/appdotbuild/go-mcp/pkg → github.com/databricks/cli/libs/mcp
- github.com/databricks/go-mcp/internal → github.com/databricks/cli/internal/mcp

Added MCP dependencies:
- github.com/modelcontextprotocol/go-sdk v1.1.0
- github.com/zeebo/blake3 v0.2.4
- dagger.io/dagger v0.19.6

Build still fails due to missing config/logging integration.

For parity-40 (Phase 2: Code Structure Migration)
Created libs/mcp/config.go with MCP configuration types.
Removed dependency on pkg/config (which used Viper).
Configuration will be populated from command flags in Phase 3.

Updated all imports:
- config.Config → mcp.Config
- config.IoConfig → mcp.IoConfig
- config.ValidationConfig → mcp.ValidationConfig
- config.DaggerConfig → mcp.DaggerConfig

Build still fails due to missing logging integration.

For parity-40 (Phase 2: Code Structure Migration)
- Fixed registry.go to use context instead of logger
- Fixed trajectory.go to alias mcp SDK import
- Removed logger fields from provider structs
- Started replacing logger calls with log package

Still TODO:
- Fix provider init functions
- Fix mcp SDK import aliasing in all providers
- Complete logger call replacements
- Fix ctx variable name conflicts

For parity-40 (Phase 2: Code Structure Migration)
- Aliased mcp SDK imports to avoid package conflicts (mcpsdk)
- Removed logger parameters from all providers
- Added context parameters for logging throughout
- All providers now use libs/log package
- Renamed server package correctly
- Fixed trajectory tracker to use context

Build succeeds: go build ./libs/mcp/...

For parity-40 (Phase 2: Code Structure Migration)
fjakobs and others added 10 commits November 17, 2025 17:13
This renames the command from `databricks experimental mcp` to
`databricks experimental apps-mcp` to better reflect its purpose as
an Apps-focused MCP server.

Changes:
- Rename folder: experimental/mcp → experimental/apps-mcp
- Update command name: mcp → apps-mcp
- Update all import paths across Go files
- Update documentation and examples in README
- Update acceptance test folder structure

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
Replace the custom version package in apps-mcp with the top-level
build.GetInfo() to ensure version consistency across the CLI.

Changes:
- Remove experimental/apps-mcp/lib/version directory
- Update server.go to import github.com/databricks/cli/internal/build
- Use build.GetInfo().Version instead of custom version package
- Update MCP server name to "databricks-apps-mcp"

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
Replaced github.com/modelcontextprotocol/go-sdk with a minimal local
implementation in experimental/apps-mcp/lib/mcp/ that provides all the
functionality needed for the apps-mcp command.

Changes:
- Created experimental/apps-mcp/lib/mcp/ package with:
  - types.go: Core MCP types (Implementation, Tool, Content, etc.)
  - protocol.go: JSON-RPC and MCP protocol types
  - transport.go: STDIO transport for line-delimited JSON
  - server.go: MCP server with tool registration and request handling
  - tool.go: Typed tool handler with automatic schema generation

- Updated all providers to use local MCP SDK:
  - lib/providers/databricks/provider.go
  - lib/providers/deployment/provider.go
  - lib/providers/io/provider.go
  - lib/providers/workspace/provider.go
  - lib/server/server.go
  - lib/session/engine_guide.go
  - lib/trajectory/tracker.go

- Removed github.com/modelcontextprotocol/go-sdk from go.mod
- Kept github.com/google/jsonschema-go for schema generation

All tests pass and linter shows 0 issues.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
Removes the github.com/zeebo/blake3 external dependency and uses the
standard library crypto/sha256 instead for computing file checksums in
the MCP apps state management.

Benefits:
- Reduces binary size by ~955 KB (1.72%)
- Eliminates external dependency
- Uses stable standard library API

Both hash implementations follow the same hash.Hash interface, making
this a drop-in replacement.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
Replace the custom databricks.Client wrapper with direct use of the
workspace client from context (cmdctx.WorkspaceClient), following the
pattern used in the rest of the CLI codebase.

Changes:
- Remove lib/providers/databricks/client.go
- Convert all Client methods to standalone functions that accept
  context and config parameters
- Update provider to get workspace client from context instead of
  storing a custom client instance
- Update deployment provider to pass config instead of client

This makes the code consistent with other parts of the CLI and
eliminates unnecessary abstraction.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
This commit removes all Dagger-related code, configuration, and documentation
from the codebase. The Dagger sandbox implementation has been replaced with a
minimal stub that returns "not implemented" errors.

Changes:
- Removed Dagger sandbox implementation (dagger.go, dagger_test.go, metrics.go)
- Created stub implementation that registers with sandbox factory
- Removed `databricks experimental apps-mcp check` command
- Removed `--use-dagger` and `--docker-image` CLI flags
- Removed DaggerConfig and related fields from ValidationConfig
- Simplified validation logic to only use local sandbox
- Updated README to remove all Dagger/containerization mentions
- Removed dagger.io/dagger dependency from go.mod
- Updated doc comments to reflect stub-only status

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>

@pietern pietern left a comment


Please update the go.mod entries before merging.

Now that it is a single dep, it can be included in the main list.

go.mod Outdated
google.golang.org/api v0.249.0 // indirect
google.golang.org/genproto/googleapis/rpc v0.0.0-20250922171735-9219d122eba9 // indirect
google.golang.org/grpc v1.75.1 // indirect
google.golang.org/grpc v1.76.0 // indirect

The other bumps are unrelated, please revert (will come in separately).

@fjakobs (author) replied:

done

UpdateTime string `json:"update_time"`
Updater string `json:"updater"`
URL string `json:"url"`
}

For later PR, these structs duplicate the SDK structs. Why?

@fjakobs (author) replied:

This was an artefact from the port. The Rust version had a vibe-coded SDK. I'll strip that code.

cmd/cmd.go Outdated
"github.com/databricks/cli/cmd/sync"
"github.com/databricks/cli/cmd/version"
"github.com/databricks/cli/cmd/workspace"
ssh "github.com/databricks/cli/experimental/ssh/cmd"

Can also revert.

@fjakobs fjakobs merged commit df0f5cc into main Nov 18, 2025
13 checks passed
@fjakobs fjakobs deleted the apps-mcp branch November 18, 2025 12:12
