Databricks

Integrate Databricks with your AI workspace

Databricks is the lakehouse company, helping data teams solve the world's toughest problems with a unified analytics platform for big data and AI.

Explore Triggers and Actions

Add Block to DBFS Stream

Tool to append a block of data to an open DBFS stream. Use when uploading large files in chunks as part of the DBFS streaming upload workflow: 1) create a stream handle, 2) add blocks, 3) close the stream. Each block is limited to 1 MB of base64-encoded data.
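
Below is a minimal sketch of the full three-step streaming workflow, assuming a plain requests client; DATABRICKS_HOST and DATABRICKS_TOKEN are illustrative environment variable names.

```python
# Hedged sketch of the DBFS streaming upload workflow: create -> add-block -> close.
import base64
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]  # e.g. https://<workspace>.cloud.databricks.com
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
CHUNK = 768 * 1024  # raw bytes; base64 expansion (~4/3) keeps each block under 1 MB


def upload_file(local_path: str, dbfs_path: str) -> None:
    # 1) Open a stream and get a handle (idle timeout: 10 minutes).
    resp = requests.post(f"{HOST}/api/2.0/dbfs/create", headers=HEADERS,
                         json={"path": dbfs_path, "overwrite": True})
    resp.raise_for_status()
    handle = resp.json()["handle"]

    # 2) Append the file one base64-encoded block at a time.
    with open(local_path, "rb") as f:
        while chunk := f.read(CHUNK):
            requests.post(f"{HOST}/api/2.0/dbfs/add-block", headers=HEADERS,
                          json={"handle": handle,
                                "data": base64.b64encode(chunk).decode("ascii")}
                          ).raise_for_status()

    # 3) Close the stream to finalize the file.
    requests.post(f"{HOST}/api/2.0/dbfs/close", headers=HEADERS,
                  json={"handle": handle}).raise_for_status()


upload_file("model.bin", "/tmp/uploads/model.bin")  # hypothetical paths
```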

Add Compute Instance Profile

Tool to register an instance profile in Databricks for cluster launches. Use when administrators need to grant users permission to launch clusters using that profile. Requires admin access. Successfully registered profiles enable clusters to use the associated IAM role.

Add Member to Security Group

Tool to add a user or group as a member to a Databricks security group. Use when you need to grant group membership for access control.

Assign Metastore to Workspace

Tool to assign a Unity Catalog metastore to a workspace. Use when you need to link a workspace to a Unity Catalog metastore, enabling shared data access with consistent governance policies. Requires account admin privileges. If an assignment for the same workspace_id exists, it will be overwritten by the new metastore_id and default_catalog_name.

Batch Create Access Requests

Tool to batch create access requests for Unity Catalog permissions. Use when you need to request access to catalogs, schemas, tables, or other Unity Catalog securables. Maximum 30 requests per API call, and maximum 30 securables per principal per call.

Batch Get Marketplace Consumer Listings

Retrieve multiple published marketplace listings by their IDs in a single API call. Use this action when you need to fetch details for multiple listings efficiently instead of making individual GET requests. The action accepts up to 50 listing IDs per request and returns complete listing information including summaries, detailed descriptions, asset types, pricing, and metadata. Only listings that exist and are accessible to the caller are returned; invalid or inaccessible IDs are silently filtered out. Returns an empty array if no listings match or if the IDs parameter is omitted. Ideal for bulk operations like displaying multiple listings, syncing catalog data, or checking listing availability.

Batch Get Marketplace Consumer Providers

Retrieve multiple marketplace provider details in a single batch API call. Returns information about Databricks Marketplace providers that have at least one publicly visible listing. Use this tool when you need to get details for multiple providers efficiently (up to 50 providers per request). Provider information includes contact details, descriptions, branding assets, and policy links.

Cancel All Databricks Job Runs

Cancel all active runs of a Databricks job asynchronously. Requires either job_id or all_queued_runs=true. With job_id: cancels all active runs of the specified job. With all_queued_runs=true (no job_id): cancels all queued runs in the workspace. Cancellation is asynchronous and does not prevent new runs from starting. Use with caution when cancelling workspace-wide runs.
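
A hedged sketch of the two request shapes against the Jobs 2.1 cancel-all endpoint; the job ID and environment variable names are placeholders.

```python
# Cancel all active runs of one job, or all queued runs workspace-wide.
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
URL = f"{HOST}/api/2.1/jobs/runs/cancel-all"

# Cancel all active runs of a specific job (job_id 123 is a placeholder).
requests.post(URL, headers=HEADERS, json={"job_id": 123}).raise_for_status()

# Or cancel every queued run in the workspace; use with caution.
requests.post(URL, headers=HEADERS, json={"all_queued_runs": True}).raise_for_status()
```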

Cancel Databricks Job Run

Tool to cancel a Databricks job run asynchronously. Use when you need to terminate a running job. The run will be terminated shortly after the request completes. If the run is already in a terminal state, this is a no-op.

Cancel SQL Statement Execution

Tool to cancel an executing SQL statement on a Databricks warehouse. Use when you need to terminate a running SQL query. The response indicates successful receipt of the cancel request, but does not guarantee cancellation. Callers must poll the statement status to confirm the terminal state (CANCELED, SUCCEEDED, FAILED, or CLOSED).
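
Since the cancel response is only an acknowledgement, callers typically poll until a terminal state. A minimal sketch; the statement ID and environment variable names are placeholders.

```python
# Request cancellation, then poll the statement until it reaches a terminal state.
import os
import time

import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
statement_id = "STATEMENT_ID"  # placeholder

requests.post(f"{HOST}/api/2.0/sql/statements/{statement_id}/cancel",
              headers=HEADERS).raise_for_status()

state = None
while state not in ("CANCELED", "SUCCEEDED", "FAILED", "CLOSED"):
    time.sleep(1)
    state = requests.get(f"{HOST}/api/2.0/sql/statements/{statement_id}",
                         headers=HEADERS).json()["status"]["state"]
print(f"terminal state: {state}")
```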

Check Table Exists

Tool to check if a table exists in Unity Catalog metastore. Use when you need to verify table existence before performing operations. Requires metastore admin privileges, table ownership with SELECT privilege, or USE_CATALOG and USE_SCHEMA privileges on parent objects.

Create Catalog Connection

Tool to create a new Unity Catalog connection to external data sources. Use when you need to establish connections to databases and services such as MySQL, PostgreSQL, Snowflake, etc. Requires metastore admin privileges or CREATE CONNECTION privilege on the metastore.

Create Catalog Credential

Tool to create a new credential for Unity Catalog access to cloud services. Use when you need to establish authentication for STORAGE (cloud storage) or SERVICE (external services like AWS Glue) purposes. Requires metastore admin or CREATE_STORAGE_CREDENTIAL/CREATE_SERVICE_CREDENTIAL privileges. Exactly one cloud credential type must be provided.

Create Clean Room

Tool to create a new Databricks Clean Room for secure data collaboration with specified collaborators. Use when you need to establish a collaborative environment for multi-party data analysis. This is an asynchronous operation; the clean room starts in PROVISIONING state and becomes ACTIVE when ready. Requires metastore admin privileges or CREATE_CLEAN_ROOM privilege on the metastore.

Create Clean Room Auto-Approval Rule

Tool to create a new auto-approval rule for a Databricks Clean Room. Use when you need to automatically approve notebooks shared by other collaborators that meet specific criteria. In 2-person clean rooms, auto-approve notebooks from the other collaborator using author_collaborator_alias. In multi-collaborator clean rooms, use author_scope=ANY_AUTHOR to auto-approve from any author.

Create Compute Cluster Policy

Tool to create a new cluster policy with prescribed settings for controlling cluster creation. Use when you need to establish policies that govern cluster configurations. Only admin users can create cluster policies.

Create Compute Instance Pool

Tool to create a new Databricks instance pool with specified configuration. Use when you need to set up a pool that reduces cluster start and auto-scaling times by maintaining idle, ready-to-use cloud instances. When attached to a pool, a cluster allocates driver and worker nodes from the pool.

Create Data Quality Monitor

Tool to create a data quality monitor for a Unity Catalog Delta table. Use when you need to set up monitoring for table quality, track data drift, or monitor ML model inference logs. Supports snapshot, time series, and inference log monitoring types. Only one monitor can be created per table.

Create Database Instance

Creates a new Lakebase Provisioned database instance in Databricks. Lakebase is Databricks' serverless PostgreSQL-compatible database service for OLTP workloads. Use this action to provision a new database instance with specified compute capacity and configuration. The creator automatically receives database owner privileges with the databricks_superuser role, granting full administrative capabilities on the instance including schema management, user provisioning, and data operations. The instance will initially be in STARTING state during provisioning, then transition to AVAILABLE when ready for connections. Use the returned read_write_dns endpoint to connect PostgreSQL clients and applications. Note: Requires available credits/billing setup in your Databricks workspace. Instance creation may take several minutes to complete.

Create Databricks App

Tool to create a new Databricks app with specified configuration. Use when you need to create apps hosted on Databricks serverless platform to deploy secure data and AI applications. The app name must be unique within the workspace, contain only lowercase alphanumeric characters and hyphens, and cannot be changed after creation.

Create Databricks Cluster

Tool to create a new Databricks Spark cluster with specified configuration. Use when you need to provision compute resources for data processing. This is an asynchronous operation that returns a cluster_id immediately with the cluster in PENDING state. The cluster transitions through states until reaching RUNNING.

Create DBFS File Stream

Tool to open a stream for writing to a DBFS file and return a handle. Use when uploading files to DBFS using the streaming workflow: 1) create a stream handle, 2) add blocks of data, 3) close the stream. The returned handle has a 10-minute idle timeout and must be used within that period.

Create External Location

Tool to create a new Unity Catalog external location combining a cloud storage path with a storage credential. Use when you need to establish access to cloud storage in Azure Data Lake Storage, AWS S3, or Cloudflare R2. Requires metastore admin or CREATE_EXTERNAL_LOCATION privilege on both the metastore and the associated storage credential.

Create Genie Message

Tool to create a message in a Genie conversation and get AI-generated responses. Use when you need to ask questions or send messages to Genie for data analysis. The response initially has status 'IN_PROGRESS' and should be polled every 1-5 seconds until reaching COMPLETED, FAILED, or CANCELLED status. Subject to 5 queries-per-minute rate limit during Public Preview.
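
An illustrative create-and-poll loop, assuming an existing conversation; the space and conversation IDs are placeholders, and the message ID field name is an assumption based on the preview API.

```python
# Create a Genie message, then poll every few seconds until a terminal status.
import os
import time

import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
base = (f"{HOST}/api/2.0/genie/spaces/SPACE_ID"
        "/conversations/CONVERSATION_ID/messages")  # placeholder IDs

msg = requests.post(base, headers=HEADERS,
                    json={"content": "What were total sales last quarter?"}).json()
msg_id = msg.get("id") or msg.get("message_id")  # field name per preview API (assumption)

while msg.get("status") not in ("COMPLETED", "FAILED", "CANCELLED"):
    time.sleep(3)  # the docs suggest polling every 1-5 seconds
    msg = requests.get(f"{base}/{msg_id}", headers=HEADERS).json()
print(msg["status"])
```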

Create Genie Space

Tool to create a new Genie space from a serialized payload for programmatic space management. Use when you need to create a Genie workspace for AI-powered data analysis. The space requires a SQL warehouse ID and a serialized configuration with at minimum a version field (1 or 2). Optionally specify data source tables to enable querying specific datasets.

Create Global Init Script

Tool to create a new global initialization script in Databricks workspace. Use when you need to run scripts on every node in every cluster. Global init scripts run on all cluster nodes and only workspace admins can create them. Scripts execute in position order and clusters must restart to apply changes. The script cannot exceed 64KB when decoded.
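
A minimal sketch: base64-encode the script body and register it (workspace admin required). The script name and environment variable names are illustrative.

```python
# Register a global init script; the script field must be base64-encoded.
import base64
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

script = "#!/bin/bash\necho 'node initialized' >> /tmp/init.log\n"
assert len(script.encode()) <= 64 * 1024  # decoded script must stay within 64KB

resp = requests.post(f"{HOST}/api/2.0/global-init-scripts", headers=HEADERS,
                     json={"name": "log-node-start",  # illustrative name
                           "script": base64.b64encode(script.encode()).decode(),
                           "enabled": True,
                           "position": 0})  # scripts execute in position order
resp.raise_for_status()
print(resp.json()["script_id"])
```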

Create IAM Group V2

Tool to create a new group in Databricks workspace using SCIM v2 protocol. Use when you need to create a new security group with a unique display name, optionally with initial members, entitlements, and roles.

Create IAM Service Principal V2

Tool to create a new service principal in Databricks workspace using SCIM v2 protocol. Use when you need to create a service principal that already exists in the Databricks account. Required for identity-federated workspaces where you must specify a valid UUID applicationId.

Create IAM User V2

Tool to create a new user in Databricks workspace using SCIM v2 protocol. Use when you need to provision a new user account with a unique userName (email), optionally with display name, activation status, group memberships, entitlements, and roles.

Create IP Access List

Tool to create a new IP access list for workspace access control. Use when you need to allow or block specific IP addresses and CIDR ranges from accessing the Databricks workspace. The API will reject creation if the resulting list would block the caller's current IP address. Changes may take a few minutes to take effect.

Create Lakeview Dashboard

Tool to create a new Lakeview dashboard in Databricks. Use when you need to create AI/BI dashboards for data visualization and analytics. Both display_name and serialized_dashboard are required. To create a blank dashboard, provide a minimal serialized_dashboard with an empty pages array like '{"pages":[{"name":"page_001","displayName":"Page 1"}]}'.
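
A sketch of creating a blank dashboard with the minimal payload from the description; environment variable names are illustrative.

```python
# Create a blank Lakeview dashboard from the minimal serialized payload.
import json
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

serialized = json.dumps({"pages": [{"name": "page_001", "displayName": "Page 1"}]})
resp = requests.post(f"{HOST}/api/2.0/lakeview/dashboards", headers=HEADERS,
                     json={"display_name": "Blank dashboard",
                           "serialized_dashboard": serialized})
resp.raise_for_status()
print(resp.json()["dashboard_id"])
```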

Create Legacy SQL Alert

Tool to create a legacy SQL alert that periodically runs a query and notifies when conditions are met. Use when you need to create alerts using the legacy API endpoint. Note: This is a legacy endpoint that has been replaced by /api/2.0/sql/alerts and is deprecated.

Create Legacy SQL Query

Tool to create a new SQL query definition using the legacy API. Use when you need to create queries with the legacy /preview/sql/queries endpoint that uses data_source_id. Note: This is a legacy endpoint. The API has been replaced by /api/2.0/sql/queries which uses warehouse_id instead of data_source_id.

Create Legacy SQL Query Visualization

Tool to create a visualization in a SQL query using the legacy API. Use when you need to add a visual representation (table, chart, counter, pivot, etc.) to an existing saved query. Note: This is a deprecated endpoint; users should migrate to the current /api/2.0/sql/visualizations API. Databricks does not recommend modifying visualization settings in JSON.

Create Logged Model

Tool to create a new logged model in MLflow that ties together model metadata, parameters, metrics, and artifacts. Use when you need to create a LoggedModel object as part of the unified 'log + register' workflow introduced in MLflow 3. LoggedModel objects persist throughout a model's lifecycle and provide a centralized way to track model information.

Create Marketplace Consumer Installation

Tool to create a marketplace consumer installation for Databricks Marketplace listings. Use when you need to install data products, datasets, notebooks, models, or other marketplace offerings into a workspace. Requires acceptance of consumer terms and the listing ID to proceed with installation.

Create Marketplace Provider Listing

Tool to create a new listing in Databricks Marketplace for data providers. Use when you need to publish data products, datasets, models, or notebooks to the marketplace. Requires a listing object with summary information (name and listing_type). For free and instantly available data products, a share must be included during creation.

Create Metastore

Tool to create a new Unity Catalog metastore. Use when you need to establish a top-level container for data in Unity Catalog, registering metadata about securable objects (tables, volumes, external locations, shares) and access permissions. Requires account admin privileges. By default, the owner is the user calling the API; setting owner to empty string assigns ownership to System User.

Create ML Experiment

Tool to create a new MLflow experiment for tracking machine learning runs and models. Use when you need to organize and track ML experiments within Databricks. Returns RESOURCE_ALREADY_EXISTS error if an experiment with the same name already exists.

Create ML Feature Store Online Store

Tool to create a Databricks Online Feature Store for real-time feature serving. Use when you need to establish serverless infrastructure for low-latency access to feature data at scale. Requires Databricks Runtime 16.4 LTS ML or above, or serverless compute.

Create ML Forecasting Experiment

Tool to create a new AutoML forecasting experiment for time series prediction. Use when you need to automatically train and optimize forecasting models on time series data. The experiment will train multiple models and select the best one based on the primary metric.

Create MLflow Experiment Run

Tool to create a new MLflow run within an experiment for tracking machine learning execution. Use when starting a new ML training run, experiment execution, or data pipeline that needs parameter and metric tracking. Returns the created run with a unique run_id for subsequent metric and parameter logging.

Create Notification Destination

Tool to create a notification destination for alerts and jobs. Use when you need to set up destinations for sending notifications outside of Databricks (email, Slack, PagerDuty, Microsoft Teams, or webhooks). Only workspace admins can create notification destinations. Requires HTTPS for webhooks with SSL certificates signed by a trusted certificate authority.

Create OAuth Service Principal Secret

Tool to create an OAuth secret for service principal authentication. Use when you need to obtain OAuth access tokens for accessing Databricks Accounts and Workspace APIs. A service principal can have up to five OAuth secrets, each valid for up to two years (730 days). The secret value is only shown once upon creation.

Create Personal Access Token

Tool to create a personal access token (PAT) for Databricks API authentication. Use when you need to generate a new token for REST API requests. Each PAT is valid for only one workspace. Users can create up to 600 PATs per workspace. Databricks automatically revokes PATs that haven't been used for 90 days.
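
A minimal sketch; the token value is only returned at creation time, so capture it immediately. The comment and lifetime are illustrative.

```python
# Mint a PAT valid for 24 hours and capture the one-time token value.
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

resp = requests.post(f"{HOST}/api/2.0/token/create", headers=HEADERS,
                     json={"comment": "ci-pipeline", "lifetime_seconds": 86400})
resp.raise_for_status()
token_value = resp.json()["token_value"]  # store securely; not retrievable later
```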

Create Provider Analytics Dashboard

Tool to create a provider analytics dashboard for monitoring Databricks Marketplace listing metrics. Use when you need to establish analytics tracking for listing views, requests, installs, conversion rates, and consumer information. Requires Marketplace admin role and system tables to be enabled in the metastore.

Create Provisioned Throughput Endpoint

Tool to create a provisioned throughput serving endpoint for AI models in Databricks. Use when you need to provision model units for production GenAI applications with guaranteed throughput. The endpoint name must be unique across the workspace and can consist of alphanumeric characters, dashes, and underscores. Returns a long-running operation that completes when the endpoint is ready.

Create Quality Monitor V2

Tool to create a quality monitor for a Unity Catalog table. Use when you need to set up monitoring for data quality metrics, track drift over time, or monitor ML inference logs. Monitor creation is asynchronous; dashboard and metric tables take 8-20 minutes to complete. Exactly one monitor type (snapshot, time_series, or inference_log) must be specified.

Create Secret Scope

Tool to create a new secret scope in Databricks workspace. Use when you need to establish a secure location to store credentials and sensitive information. Scope names are case-insensitive, must be unique within the workspace, and cannot exceed 128 characters. By default, the scope is Databricks-backed with MANAGE permission for the creator.

Create Share

Tool to create a new share for data objects in Unity Catalog. Use when you need to establish a share for distributing data assets via Delta Sharing protocol. Data objects can be added after creation with update. Requires metastore admin or CREATE_SHARE privilege on the metastore.

Create Sharing Provider

Tool to create a new authentication provider in Unity Catalog for Delta Sharing. Use when establishing a provider object for receiving data from external sources that aren't Unity Catalog-enabled. Requires metastore admin privileges or CREATE_PROVIDER permission on the metastore. Most recipients should not need to create provider objects manually as they are typically auto-created during Delta Sharing.

Create Sharing Recipient

Tool to create a Delta Sharing recipient in Unity Catalog metastore. Use when you need to create a recipient object representing an identity who will consume shared data. Recipients can be configured for Databricks-to-Databricks sharing or open sharing with token authentication. Requires metastore admin or CREATE_RECIPIENT privilege.

Create SQL Alert

Tool to create a new Databricks SQL alert for query monitoring. Use when you need to set up alerts that monitor query results and trigger notifications when specified conditions are met. The alert will evaluate the query results and send notifications when the condition threshold is crossed.

Create SQL Query

Tool to create a saved SQL query object in Databricks. Use when you need to create a new saved query definition that includes the target SQL warehouse, query text, name, description, tags, and parameters. Note: This creates a saved query object, not an immediate execution. Use Statement Execution API for immediate query execution.

Create SQL Query Visualization

Tool to create a new visualization for a Databricks SQL query. Use when you need to add a visual representation (table, chart, counter, funnel, or pivot table) to an existing saved query. The visualization will be attached to the specified query and can be added to dashboards.

Create Storage Credential

Tool to create a new storage credential in Unity Catalog for cloud data access. Use when you need to establish authentication for accessing cloud storage paths. Requires metastore admin or CREATE_STORAGE_CREDENTIAL privilege on the metastore. Exactly one cloud credential type must be provided.

Create Tag Policy

Tool to create a new tag policy (governed tag) in Databricks with built-in rules for consistency and control. Use when you need to establish governed tags with restricted values and define who can assign them. Maximum of 1,000 governed tags per account. Each governed tag can have up to 50 allowed values. Requires appropriate account-level permissions.

Create Vector Search Endpoint

Tool to create a new vector search endpoint to host indexes in Databricks Mosaic AI Vector Search. Use when you need to provision compute resources for hosting vector search indexes. The endpoint will be in PROVISIONING state initially and transition to ONLINE when ready.

Create Workspace Directory

Tool to create a directory and necessary parent directories in the workspace. Use when you need to create new directories. The operation is idempotent - if the directory already exists, the command succeeds without action. Returns error RESOURCE_ALREADY_EXISTS if a file (not directory) exists at any prefix of the path.

Create Workspace Git Credentials

Tool to create Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to set up Git integration for version control operations. Multiple credentials can be created, typically one per Git provider. Use the is_default_for_provider flag to set a credential as the default for its provider.

Create Workspace Repo

Tool to create and optionally checkout a Databricks Repo linking a Git repository to the workspace. Use when you need to connect a Git repository to Databricks for collaborative development. Can optionally specify branch or tag to checkout after creation and configure sparse checkout for performance.

Delete AI/BI Dashboard Embedding Access Policy

Tool to delete the AI/BI dashboard embedding access policy, reverting to the default. Use when you need to remove the workspace-level policy for AI/BI published dashboard embedding. Upon deletion, the workspace reverts to the default setting (ALLOW_APPROVED_DOMAINS), which conditionally permits AI/BI dashboards to be embedded on approved domains.

Delete AI/BI Dashboard Embedding Approved Domains

Tool to delete the list of approved domains for AI/BI dashboard embedding, reverting to default. Use when you need to remove the workspace-level approved domains list for hosting embedded AI/BI dashboards. Upon deletion, the workspace reverts to an empty approved domains list. The approved domains list cannot be modified when the current access policy is not configured to ALLOW_APPROVED_DOMAINS.

Delete Catalog

Tool to delete a catalog from Unity Catalog metastore. Use when you need to permanently remove a catalog and optionally its contents. By default, the catalog must be empty (except for information_schema). Use force=true to delete non-empty catalogs. Do not delete the main catalog as it can break existing data operations.

Delete Catalog Connection

Tool to delete a Unity Catalog connection to external data sources. Use when you need to remove connections to databases and services. Deleting a connection removes the abstraction used to connect from Databricks Compute to external data sources.

Delete Catalog Credential

Tool to delete a Unity Catalog credential for cloud storage or service access. Use when you need to remove credentials that authenticate access to cloud resources. By default, deletion will fail if the credential has dependent resources. Use force=true to delete credentials with dependencies.

Delete Catalog Table

Tool to delete a table from Unity Catalog. Use when you need to permanently remove a table from its parent catalog and schema. The operation requires appropriate permissions on the parent catalog, schema, and table.

Delete Compute Cluster Policy

Tool to delete a cluster policy. Use when you need to remove a cluster policy from the workspace. Clusters governed by this policy can still run, but cannot be edited. Only workspace admin users can delete policies. This operation is permanent and cannot be undone.

Delete Compute Instance Pool

Tool to delete a Databricks compute instance pool. Use when you need to permanently remove an instance pool. The idle instances in the pool are terminated asynchronously after deletion.

Delete Custom LLM Agent

Tool to delete a Custom LLM agent created through Agent Bricks. Use when you need to remove a custom LLM and all associated data. This operation is irreversible and deletes all data including temporary transformations, model checkpoints, and internal metadata.

Delete Dashboard Email Subscriptions Setting

Tool to delete the dashboard email subscriptions setting, reverting it to its default value. Use when you need to revert the workspace setting that controls whether schedules or workload tasks for refreshing AI/BI Dashboards can send subscription emails. Upon deletion, the setting reverts to its default value (enabled/true). This is a workspace-level setting.

Delete Database Instance

Delete a Databricks Lakebase Postgres database instance. Permanently removes the database instance and all associated data. Prerequisites: The instance should be stopped before deletion, and you must have CAN_MANAGE permissions on the instance. Warning: This operation cannot be undone. All data will be permanently deleted. Consider deleting associated Unity Catalog catalogs and synced tables first to avoid orphaned references.

Delete Databricks App

Tool to delete a Databricks app from the workspace. Use when you need to remove an app and its associated service principal. When an app is deleted, Databricks automatically deletes the provisioned service principal.

Delete Databricks Cluster

Tool to terminate a Databricks Spark cluster asynchronously. Use when you need to stop and remove a cluster. The cluster is terminated asynchronously and removed after completion. Cluster configuration is retained for 30 days after termination.

Delete Databricks Job Run

Tool to delete a non-active Databricks job run. Use when you need to remove a job run from the workspace. The run must be in a non-active state; attempting to delete an active run will return an error. Runs are automatically removed after 60 days.

Delete Databricks Pipeline

Tool to delete a Databricks Delta Live Tables pipeline permanently and stop any active updates. Use when you need to remove a pipeline completely. If the pipeline publishes to Unity Catalog, deletion will cascade to all pipeline tables. This action cannot be easily undone without Databricks support assistance.

Delete DBFS File or Directory

Tool to delete a file or directory from DBFS. Use when you need to remove files or directories from the Databricks File System. For large deletions (>10K files), use dbutils.fs in a cluster context instead of the REST API. Operation may return 503 PARTIAL_DELETE for large deletions and should be re-invoked until completion.
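
A hedged sketch of the re-invoke pattern for large recursive deletes, retrying while the API reports a partial delete; the path and environment variable names are placeholders.

```python
# Re-invoke the delete until it completes; 503 signals a partial delete.
import os
import time

import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

while True:
    resp = requests.post(f"{HOST}/api/2.0/dbfs/delete", headers=HEADERS,
                         json={"path": "/tmp/big-dir", "recursive": True})
    if resp.status_code != 503:  # anything else is success or a real error
        resp.raise_for_status()
        break
    time.sleep(1)  # partial delete; call again until completion
```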

Delete Default Namespace Setting

Tool to delete the default namespace setting for the workspace, removing the default catalog configuration. Use when you need to remove the default catalog used for queries without fully qualified names. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).
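
The recommended read-delete etag pattern looks roughly like this, assuming the workspace settings path below; environment variable names are illustrative.

```python
# Read the setting to obtain the current etag, then delete with that etag.
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
URL = f"{HOST}/api/2.0/settings/types/default_namespace_ws/names/default"

etag = requests.get(URL, headers=HEADERS).json()["etag"]

resp = requests.delete(URL, headers=HEADERS, params={"etag": etag})
if resp.status_code == 409:  # concurrent modification: re-read and retry
    raise RuntimeError("setting changed concurrently; repeat the read-delete")
resp.raise_for_status()
```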

Delete Default Warehouse ID Setting

Tool to delete the default warehouse ID setting for the workspace, reverting to default state. Use when you need to remove the default SQL warehouse configuration. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Delete Disable Legacy Access Setting

Tool to delete the disable legacy access workspace setting, re-enabling legacy features. Use when you need to revert to allowing legacy Databricks features. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409). Changes take up to 5 minutes and require cluster/warehouse restart.

Delete Disable Legacy DBFS Setting

Tool to delete the disable legacy DBFS workspace setting, reverting to default DBFS access behavior. Use when you need to re-enable access to legacy DBFS root and mounts. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Delete External Location

Tool to delete an external location from Unity Catalog metastore. Use when you need to remove an external location that combines a cloud storage path with a storage credential. The caller must be the owner of the external location. Use force=true to delete even if there are dependent external tables or mounts.

Delete Genie Conversation

Tool to delete a conversation from a Genie space programmatically. Use when you need to remove conversations to manage the Genie space limits (10,000 conversations per space). Useful for deleting older or test conversations that are no longer needed.

Delete Genie Conversation Message

Tool to delete a specific message from a Genie conversation. Use when you need to remove individual messages from conversations. This operation permanently deletes the message and cannot be undone.

Delete Global Init Script

Tool to delete a global initialization script from Databricks workspace. Use when you need to remove a script that runs on every cluster node. Requires workspace administrator privileges. Clusters must restart to reflect the removal of the script.

Delete IAM Group V2

Tool to delete a group from Databricks workspace using SCIM v2 protocol. Use when you need to permanently remove a security group. Requires appropriate permissions to delete the group.

Delete IAM Service Principal V2

Tool to delete a service principal from Databricks workspace using SCIM v2 protocol. Use when you need to permanently remove a service principal and revoke its access to the workspace. The operation is idempotent - subsequent DELETE requests to the same ID will return 404 Not Found.

Delete IAM User V2

Tool to delete a user from Databricks workspace using SCIM v2 protocol. Use when you need to inactivate a user and revoke their access to the workspace. Note that users are automatically purged 30 days after deletion if they do not own or belong to any workspace. Applications or scripts using tokens generated by the deleted user will no longer be able to access Databricks APIs.

Delete Lakeview Dashboard Schedule

Tool to delete a dashboard schedule from a Lakeview dashboard. Use when you need to remove scheduled refreshes or updates for a dashboard. Provide the etag parameter to ensure the schedule hasn't been modified since last retrieval (optimistic concurrency control).

Delete Legacy SQL Alert

Tool to permanently delete a legacy SQL alert. Use when you need to permanently remove an alert using the legacy API endpoint. Note: Unlike the newer /api/2.0/sql/alerts endpoint, this legacy endpoint deletes alerts permanently; they cannot be restored from trash.

Delete Legacy SQL Query

Tool to delete a legacy SQL query (soft delete to trash). Use when you need to remove a legacy query from searches and list views. The query is moved to trash and permanently deleted after 30 days. Note: This is a deprecated legacy API that will be phased out; use the non-legacy endpoint instead.

Delete Legacy SQL Query Visualization

Tool to permanently delete a legacy SQL query visualization. Use when you need to remove a visualization from a SQL query using the legacy API endpoint. Note: This is a deprecated legacy endpoint. Databricks recommends migrating to /api/2.0/sql/visualizations/{id} instead.

Delete Listing From Exchange

Tool to remove the association between a marketplace exchange and a listing. Use when you need to disassociate an exchange from a provider listing. This removes the listing from the private exchange, and it will no longer be shared with the curated set of customers in that exchange.

Delete LLM Proxy Partner Powered Setting

Tool to delete (revert to default) the partner-powered AI features workspace setting. Use when you need to revert the workspace to default configuration for AI features powered by partner providers. By default, this setting is enabled for workspaces without a compliance security profile. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Delete Logged Model

Tool to delete a logged model from MLflow tracking. Use when you need to permanently remove a LoggedModel from the tracking server. The deletion is permanent and cannot be undone. LoggedModels track a model's lifecycle across different training and evaluation runs.

Delete Logged Model Tag

Tool to delete a tag from a logged model in MLflow. Use when you need to remove metadata from a LoggedModel object. This operation is irreversible and permanently removes the tag from the logged model. Part of MLflow 3's logged model management capabilities.

Delete Marketplace Consumer Installation

Tool to uninstall a Databricks Marketplace installation. Use when you need to remove an installed data product from your workspace. When an installation is deleted, the shared catalog is removed from the workspace. Requires CREATE CATALOG and USE PROVIDER permissions on the Unity Catalog metastore, or metastore admin role.

Delete Metastore

Tool to delete a Unity Catalog metastore. Use when you need to permanently remove a metastore and its managed data. Before deletion, you must delete or unlink any workspaces using the metastore. All objects managed by the metastore will become inaccessible. Requires metastore admin privileges.

Delete ML Experiment

Tool to delete an MLflow experiment and associated metadata, runs, metrics, params, and tags. Use when you need to remove an experiment from Databricks. If the experiment uses FileStore, artifacts associated with the experiment are also deleted.

Delete ML Experiment Run

Tool to mark an MLflow run for deletion in ML experiments. Use when you need to remove a specific run from Databricks. This is a soft delete operation - the run is marked for deletion rather than immediately removed and can be restored unless permanently deleted.

Delete ML Experiment Run Tag

Tool to delete a tag from an MLflow experiment run. Use when you need to remove run metadata. This operation is irreversible and permanently removes the tag from the run.

Delete ML Experiment Runs

Tool to bulk delete an ML experiment's runs that were created before a specified timestamp. Use when you need to clean up old experiment runs. Only runs created prior to or at the specified timestamp are deleted. The maximum number of runs that can be deleted in one operation is 10,000.

Delete ML Feature Engineering Kafka Config

Tool to delete a Kafka configuration from ML Feature Engineering. Use when you need to remove Kafka streaming source configurations. The deletion is permanent and cannot be undone. Kafka configurations define how features are streamed from Kafka sources.

Delete ML Feature Store Online Store

Tool to delete an online store from ML Feature Store. Use when you need to remove online stores that provide low-latency feature serving infrastructure. The deletion is permanent and cannot be undone. Online stores are used for real-time feature retrieval in production ML serving.

Delete ML Feature Tag

Delete a metadata tag from a specific feature column in a Databricks ML Feature Store table. This operation removes the tag association from the feature but does not affect the actual feature data. The operation is idempotent - it succeeds even if the tag doesn't exist, making it safe to call multiple times. Use this when you need to remove metadata tags from feature columns, such as removing deprecated labels or cleaning up temporary tags.

Delete Notification Destination

Tool to delete a notification destination from the Databricks workspace. Use when you need to permanently remove a notification destination. Only workspace administrators have permission to perform this delete operation.

Delete OAuth2 Service Principal Secret

Tool to delete an OAuth secret from a service principal at the account level. Use when you need to revoke OAuth credentials for service principal authentication. Once deleted, applications or scripts using tokens generated from that secret will no longer be able to access Databricks APIs.

Delete Online Table

Tool to delete an online table by name. Use when you need to permanently remove an online table and stop data synchronization. This operation deletes all data in the online table permanently and releases all resources. Note: online tables are deprecated and will not be accessible after January 15, 2026.

Delete Restrict Workspace Admins Setting

Tool to delete/revert the restrict workspace admins setting to its default state. Use when you need to restore default workspace administrator capabilities for service principal token creation and job ownership settings. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Delete Secret Scope

Tool to delete a secret scope and all associated secrets and ACLs. Use when you need to permanently remove a secret scope. This operation cannot be undone. The API throws errors if the scope does not exist or the user lacks authorization.

Delete Secrets ACL

Tool to delete an access control list from a Databricks secret scope. Use when you need to revoke permissions for a principal on a secret scope. Requires MANAGE permission on the scope. Fails if the ACL does not exist.

Delete Serving Endpoint

Tool to delete a model serving endpoint and all associated data. Use when you need to permanently remove an endpoint. Deletion is permanent and cannot be undone. This operation disables usage and deletes all data associated with the endpoint.

Delete Share

Tool to delete a Unity Catalog share from the metastore. Use when you need to permanently remove a share object. Deletion immediately revokes recipient access to the shared data. This operation is permanent and requires share owner privileges.

Delete Sharing Recipient

Tool to delete a Delta Sharing recipient from Unity Catalog metastore. Use when you need to permanently remove a recipient object. Deletion invalidates all access tokens and immediately revokes access to shared data for users represented by the recipient. Requires recipient owner privileges.

Delete SQL Alert

Tool to delete a Databricks SQL alert (soft delete to trash). Use when you need to remove an alert from active monitoring. The alert is moved to trash and can be restored through the UI. Trashed alerts are automatically cleaned up after 30 days. Note: Deleting an already-deleted alert will return an error (not idempotent).

Delete SQL Dashboard

Tool to delete a legacy Databricks SQL dashboard by moving it to trash (soft delete). Use when you need to remove a dashboard from active use. The dashboard is moved to trash and can be restored later through the UI. Trashed dashboards do not appear in searches and cannot be shared.

Delete SQL Query

Tool to delete a Databricks SQL query (soft delete to trash). Use when you need to remove a query from searches and list views. The query is moved to trash and can be restored through the UI within 30 days, after which it is permanently deleted.

Delete SQL Results Download Setting

Tool to delete SQL results download workspace setting, reverting to default state where users are permitted to download results. Use when you need to restore the factory default configuration. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409).

Delete SQL Warehouse

Deletes a SQL warehouse from the Databricks workspace. Use this tool to permanently remove a SQL warehouse (compute resource) that is no longer needed. The warehouse must exist and you must have appropriate permissions to delete it. Important notes:

- Deleted warehouses may be restored within 14 days by contacting Databricks support
- The operation is idempotent - deleting an already deleted warehouse will succeed
- This is a destructive operation and cannot be undone through the API

Delete Storage Credential

Tool to delete a storage credential from the Unity Catalog metastore. Use when you need to remove storage credentials that provide authentication to cloud storage. The caller must be the owner of the storage credential. Use force=true to delete even if there are dependent external locations, tables, or services.

Delete Synced Database Table

Tool to delete a synced table from Unity Catalog and stop data refreshes. Use when you need to deregister a synced table connection between Unity Catalog and a database instance. Note: The underlying Postgres table remains and must be manually dropped to free space.

Delete Tag Policy

Tool to delete a tag policy by its key, making the tag ungoverned. Use when you need to remove governance from a tag without deleting the tag itself. Requires MANAGE permission on the governed tag. System governed tags cannot be deleted.

Delete Token via Token Management

Tool to delete a token specified by ID via token management. Use when you need to revoke or remove access tokens. Admins can delete tokens for any user.

Delete Vector Search Index

Tool to delete a vector search index from Databricks workspace. Use when you need to remove unused or obsolete vector search indexes. When an index is deleted, any associated writeback tables are automatically removed. This operation is irreversible.

Delete Workspace Git Credentials

Tool to delete Git credentials for remote repository authentication in Databricks. Use when you need to remove a Git credential entry from the workspace, typically when a credential must be revoked or replaced.

Delete Workspace Object

Tool to permanently delete a workspace object or directory. Use when you need to remove notebooks, files, or directories from the workspace. This is a hard delete operation that cannot be undone. Recursive deletion of non-empty directories is not atomic and may partially complete if it fails.

Delete Workspace Repo

Tool to delete a Git repository from Databricks workspace. Use when you need to permanently remove a repository. The repository cannot be recovered after deletion completes successfully.

Delete Workspace Secret

Tool to delete a secret from a Databricks secret scope. Use when you need to remove a secret stored in a scope. Requires WRITE or MANAGE permission on the scope. Not supported for Azure KeyVault-backed scopes.

Deploy Databricks App

Tool to create a deployment for a Databricks app. Use when you need to deploy an app with source code from a workspace path. The deployment process provisions compute resources and uploads the source code. Deployments can be in states: IN_PROGRESS, SUCCEEDED, FAILED, or CANCELLED.

Disable System Schema

Tool to disable a system schema in Unity Catalog metastore. Use when you need to remove a system schema from the system catalog. System schemas store information about customer usage patterns such as audit logs, billing information, and lineage data. Requires account admin or metastore admin privileges.

Edit Compute Cluster Policy

Tool to update an existing Databricks cluster policy. Use when you need to modify policy settings like name, definition, or restrictions. Note that this operation may make some clusters governed by the previous policy invalid.

Edit Compute Instance Pool

Tool to modify the configuration of an existing Databricks instance pool. Use when you need to update pool settings like capacity, termination minutes, or preloaded images. Note that the pool's node type cannot be changed after creation, though it must still be provided with the same value.

Edit Compute Instance Profile

Tool to modify an existing AWS EC2 instance profile registered with Databricks. Use when you need to update the IAM role ARN associated with an instance profile. This operation is only available to admin users. The IAM role ARN is required if both of the following are true: your role name and instance profile name do not match, and you want to use the instance profile with Databricks SQL Serverless.

Edit Databricks Cluster

Tool to edit an existing Databricks cluster configuration. Use when you need to modify cluster settings such as size, Spark version, node types, or cloud-specific attributes. The cluster must be in RUNNING or TERMINATED state. If updated while RUNNING, it will restart to apply changes.

Edit SQL Warehouse

Tool to update the configuration of an existing SQL warehouse. Use when you need to modify warehouse settings like cluster size, scaling parameters, auto-stop behavior, or enable features like Photon acceleration and serverless compute. The warehouse is identified by its ID, and you can update various properties including resource allocation and performance optimizations.

Enable System Schema

Tool to enable a system schema in Unity Catalog metastore. Use when you need to activate a system schema to track customer usage patterns. System schemas store information about audit logs, billing, compute usage, storage, lineage, and marketplace data. Requires account admin or metastore admin privileges.

Enforce Cluster Policy Compliance

Tool to update a cluster to be compliant with the current version of its policy. Use when you need to enforce policy compliance on a cluster. The cluster can be updated if it is in a RUNNING or TERMINATED state. Note: Clusters created by Databricks Jobs, DLT, or Models cannot be enforced by this API.

Execute Genie Message Query

Execute the SQL query associated with a Genie message and retrieve result data. DEPRECATED: This endpoint is deprecated in favor of Execute Message Attachment Query (execute_message_attachment_query). The new method requires an attachment_id parameter in addition to space_id, conversation_id, and message_id. Use this action to run the query generated by Genie AI for a specific message. Returns query execution results including data rows, execution status, and schema information. This is useful for re-executing queries when needed.

Execute Message Attachment Query

Tool to execute the SQL query for an expired message attachment in a Genie space. Use when a query attachment has expired and needs to be re-executed to retrieve fresh results. Returns SQL statement execution results with schema, metadata, and data.

Execute SQL Statement

Execute a SQL statement on a Databricks SQL warehouse. Returns results inline if the query completes within the wait timeout, otherwise returns a statement_id to poll for results. Use this to run SQL queries against your Databricks warehouse. For large result sets, use disposition=EXTERNAL_LINKS and fetch chunks separately.
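
A minimal sketch: take inline results if the statement finishes within the wait timeout, otherwise keep polling by statement_id. The warehouse ID and environment variable names are placeholders.

```python
# Execute a statement, then poll until it leaves the PENDING/RUNNING states.
import os
import time

import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

resp = requests.post(f"{HOST}/api/2.0/sql/statements", headers=HEADERS,
                     json={"warehouse_id": "WAREHOUSE_ID",  # placeholder
                           "statement": "SELECT 1 AS one",
                           "wait_timeout": "10s"}).json()

while resp["status"]["state"] in ("PENDING", "RUNNING"):
    time.sleep(2)
    resp = requests.get(f"{HOST}/api/2.0/sql/statements/{resp['statement_id']}",
                        headers=HEADERS).json()

if resp["status"]["state"] == "SUCCEEDED":
    print(resp["result"]["data_array"])  # inline rows under the default disposition
```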

Export Workspace Object

Tool to export a workspace object (notebook, dashboard, or file) as file content or base64-encoded string. Use when you need to retrieve the content of workspace objects for backup, migration, or analysis. By default, returns base64-encoded content with file type information. Set direct_download=true to get raw file content directly.
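
A sketch of both export modes; the workspace path and environment variable names are placeholders.

```python
# Export a notebook twice: once as a base64 envelope, once as raw bytes.
import base64
import os

import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
URL = f"{HOST}/api/2.0/workspace/export"
path = "/Users/someone@example.com/my-notebook"  # placeholder

# Default mode: JSON envelope with base64-encoded content.
envelope = requests.get(URL, headers=HEADERS,
                        params={"path": path, "format": "SOURCE"}).json()
source = base64.b64decode(envelope["content"])

# direct_download mode: the response body is the file itself.
raw = requests.get(URL, headers=HEADERS,
                   params={"path": path, "format": "SOURCE",
                           "direct_download": "true"}).content
```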

Finalize Logged Model

Tool to finalize a logged model in MLflow by updating its status to READY or FAILED. Use when custom model preparation logic is complete and you need to mark the model as ready for use or indicate that the upload failed. This is part of the logged models feature introduced in MLflow 3.

Find Database Instance By UID

Tool to find a database instance by its unique identifier (UID). Use when you need to retrieve instance details using the immutable UUID instead of the instance name.

Generate Database Credential

Tool to generate an OAuth token for database instance authentication. Use when you need to authenticate to Databricks database instances. The generated token is workspace-scoped and expires after one hour, though open connections remain active past expiration.

Generate Temporary Path Credentials

Tool to generate short-lived, scoped temporary credentials for accessing external storage locations registered in Unity Catalog. Use when you need temporary access to cloud storage paths with specific read/write permissions. The credentials inherit the privileges of the requesting principal and are valid for a limited time. The requesting principal must have EXTERNAL USE LOCATION privilege on the external location.

Generate Temporary Service Credential

Tool to generate temporary credentials from a service credential with admin access. Use when you need short-lived, scoped credentials for accessing cloud resources. The caller must be a metastore admin or have the ACCESS privilege on the service credential.

Get Access Request Destinations

Tool to retrieve access request destinations for a Unity Catalog securable. Use when you need to find where notifications are sent when users request access to catalogs, schemas, tables, or other securables. Any caller can see URL destinations or destinations on the metastore. For other securables, only those with BROWSE permissions can see destinations.

Get AI/BI Dashboard Embedding Access Policy

Tool to retrieve workspace AI/BI dashboard embedding access policy setting. Use when you need to check whether AI/BI published dashboard embedding is enabled, conditionally enabled, or disabled. The default setting is ALLOW_APPROVED_DOMAINS which permits AI/BI dashboards to be embedded on approved domains.

Get AI/BI Dashboard Embedding Approved Domains

Tool to retrieve the list of domains approved to host embedded AI/BI dashboards. Use when you need to check which external domains are permitted to embed AI/BI dashboards. The approved domains list cannot be modified unless the workspace access policy is set to ALLOW_APPROVED_DOMAINS.

Get All Library Statuses

Tool to retrieve status of all libraries across all Databricks clusters. Use when you need to check library installation status on all clusters, including libraries set to be installed on all clusters via the API or libraries UI. Returns detailed status information for each library on each cluster.

Get App Update Status

Retrieves the current update status of a Databricks app. This endpoint returns whether the app's most recent configuration update succeeded, failed, is in progress, or has never been updated. Use this to monitor the status of app configuration changes (such as description, compute size, or resource modifications) rather than deployment status.

Get Automatic Cluster Update Setting

Tool to retrieve automatic cluster update setting for the workspace. Use when you need to check whether automatic cluster updates are enabled, view maintenance window configuration, or get restart behavior settings. This setting controls whether clusters automatically update during maintenance windows. Currently in Public Preview.

Get Catalog Artifact Allowlist

Tool to retrieve artifact allowlist configuration for a specified artifact type in Unity Catalog. Use when you need to check which artifacts are permitted for use in your Databricks environment. Requires metastore admin privileges or MANAGE ALLOWLIST privilege on the metastore.

Get Catalog Connection

Tool to retrieve detailed information about a specific Unity Catalog connection. Use when you need to get connection metadata, configuration, and properties for external data source connections.

Get Catalog Credential

Tool to retrieve detailed information about a specific Unity Catalog credential. Use when you need to get credential metadata, configuration, and cloud provider details for storage or service credentials.

Get Catalog Details

Tool to retrieve details of a specific catalog in Unity Catalog. Use when you need to get information about a catalog including its metadata, owner, properties, and configuration. Requires metastore admin privileges, catalog ownership, or USE_CATALOG privilege.

Get Catalog Grants

Tool to get permissions (grants) for a securable in Unity Catalog without inherited permissions. Use when you need to see direct privilege assignments on a catalog or other securable object. Returns only privileges directly assigned to principals, excluding inherited permissions from parent securables. For inherited permissions, use the get-effective endpoint instead.

Get Catalog Schema

Tool to retrieve details of a specific schema from Unity Catalog metastore. Use when you need to get schema metadata, ownership, storage configuration, and properties. Requires metastore admin privileges, schema ownership, or USE_SCHEMA privilege.

Get Catalog Table Details

Tool to retrieve comprehensive metadata about a table from Unity Catalog metastore. Use when you need detailed table information including columns, type, storage, constraints, and governance metadata. Requires metastore admin privileges, table ownership, or SELECT privilege on the table, plus USE_CATALOG and USE_SCHEMA privileges on parent objects.

Get Catalog Volume Details

Tool to retrieve detailed information about a specific Unity Catalog volume. Use when you need to get volume metadata including type, storage location, owner, and timestamps. Requires metastore admin privileges or volume ownership with appropriate USE_CATALOG and USE_SCHEMA privileges on parent objects.

Get Clean Room Asset

Tool to retrieve detailed information about a specific asset within a Databricks Clean Room. Use when you need to get metadata and configuration for clean room assets such as tables, views, notebooks, volumes, or foreign tables.

Get Cluster Information

Tool to retrieve comprehensive metadata and configuration details for a Databricks cluster by its unique identifier. Use when you need to check cluster state, configuration, resources, or operational details. Returns cluster information including state, compute configuration, cloud-specific settings, and resource allocations.

Get Cluster Policy Compliance

Tool to retrieve policy compliance status for a specific cluster. Use when you need to check whether a cluster meets the requirements of its assigned policy and identify any policy violations. Clusters could be out of compliance if their policy was updated after the cluster was last edited.

Get Compliance Security Profile Setting

Tool to retrieve workspace compliance security profile setting. Use when you need to check whether CSP is enabled or view configured compliance standards. The CSP enables additional monitoring, enforced instance types for inter-node encryption, hardened compute images, and other security controls. Once enabled, this setting represents a permanent workspace change that cannot be disabled.

ActionTry it

Get Compute Cluster Permission Levels

Tool to retrieve available permission levels for a Databricks compute cluster. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific cluster. Returns permission levels like CAN_ATTACH_TO, CAN_RESTART, and CAN_MANAGE with their descriptions.

ActionTry it

Get Compute Cluster Policy

Tool to retrieve detailed information about a specific cluster policy by its ID. Use when you need to view the configuration and settings of an existing cluster policy.

ActionTry it

Get Compute Cluster Policy Permission Levels

Tool to retrieve available permission levels for a Databricks cluster policy. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific cluster policy. Returns permission levels like CAN_USE with their descriptions.

ActionTry it

Get Compute Cluster Policy Permissions

Tool to retrieve permissions for a Databricks cluster policy. Use when you need to check who has access to a specific cluster policy and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects.

ActionTry it

Get Compute Policy Families

Tool to retrieve information for a policy family by identifier and optional version. Use when you need to view Databricks-provided templates for configuring clusters for a particular use case. Policy families cannot be created, edited, or deleted by users.

ActionTry it

Get Current Metastore Assignment

Tool to retrieve the current metastore assignment for the workspace being accessed. Use when you need to determine which metastore is assigned to the current workspace context.

ActionTry it

Get Current User Information

Tool to retrieve details about the currently authenticated user or service principal making the API request. Use when you need to get information about the current user's identity, groups, roles, and entitlements within the Databricks workspace.

ActionTry it

Get Dashboard Email Subscriptions Setting

Tool to retrieve dashboard email subscriptions setting for the workspace. Use when you need to check whether schedules or workload tasks for refreshing AI/BI Dashboards can send subscription emails. By default, this setting is enabled.

ActionTry it

Get Database Instance

Retrieves detailed information about a Databricks Lakebase (managed PostgreSQL) database instance by name. Returns comprehensive instance details including:

- Current state (AVAILABLE, STARTING, STOPPED, UPDATING, DELETING, FAILING_OVER)
- Capacity/SKU configuration (CU_1, CU_2, CU_4, CU_8)
- Connection endpoints (read-write DNS, read-only DNS)
- PostgreSQL version and node configuration
- Retention window and backup settings
- High availability settings (readable secondaries)
- Custom tags and usage policies
- Parent/child instance relationships

Use this when you need to check instance status, retrieve connection information, or verify configuration settings.

ActionTry it
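
A sketch of what a direct call might look like, assuming the Database Instances REST path /api/2.0/database/instances/{name}; the instance name and the response field names shown are assumptions to verify against your workspace's API reference.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Assumed endpoint path; "my-lakebase" is a placeholder instance name.
resp = requests.get(f"{host}/api/2.0/database/instances/my-lakebase", headers=headers)
resp.raise_for_status()
inst = resp.json()
# Field names are illustrative; check the actual response shape.
print(inst.get("state"), inst.get("read_write_dns"))
```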

Get Databricks App Details

Tool to retrieve details about a specific Databricks app by name. Use when you need to get comprehensive information about an app including configuration, deployment status, compute resources, and metadata.

ActionTry it

Get Databricks App Permission Levels

Tool to retrieve available permission levels for a Databricks app. Use when you need to understand what permission levels can be assigned to users or groups for a specific app. Returns permission levels like CAN_USE and CAN_MANAGE with their descriptions.

ActionTry it

Get Databricks App Permissions

Tool to retrieve permissions for a Databricks app. Use when you need to check who has access to an app and their permission levels. Returns the access control list including inherited permissions from parent or root objects.

ActionTry it

Get Databricks Job Details

Tool to retrieve detailed information about a single Databricks job. Use when you need to get comprehensive job configuration including tasks, schedules, notifications, and cluster settings. For jobs with more than 100 tasks or job clusters, use the page_token parameter to paginate through results.

ActionTry it
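
A hedged sketch of collecting all tasks from a large job, assuming the Jobs API 2.2 response shape in which next_page_token continues the task listing; the job ID is a placeholder.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

tasks, page_token = [], None
while True:
    params = {"job_id": 123456789}            # placeholder job ID
    if page_token:
        params["page_token"] = page_token
    resp = requests.get(f"{host}/api/2.2/jobs/get", headers=headers, params=params)
    resp.raise_for_status()
    body = resp.json()
    tasks.extend(body.get("settings", {}).get("tasks", []))
    page_token = body.get("next_page_token")  # absent once every task has been returned
    if not page_token:
        break
print(f"job has {len(tasks)} tasks")
```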

Get DBFS File Status

Tool to get the information of a file or directory in DBFS. Use when you need to check if a file or directory exists, retrieve its size, type, or last modification time. Throws RESOURCE_DOES_NOT_EXIST exception if the file or directory does not exist.

ActionTry it

Get Default Namespace Setting

Tool to retrieve the default catalog namespace setting for the workspace. Use when you need to check which catalog is used for unqualified table references in Unity Catalog-enabled compute. Changes to this setting require a restart of clusters and SQL warehouses to take effect.

ActionTry it

Get Default Warehouse ID Setting

Tool to retrieve the default SQL warehouse ID setting for the workspace. Use when you need to check which warehouse is configured as the default for SQL authoring surfaces, AI/BI dashboards, Genie, Alerts, and Catalog Explorer.

ActionTry it

Get Disable Legacy Access Setting

Tool to retrieve the disable legacy access workspace setting. Use when you need to check whether legacy feature access is disabled, including direct Hive Metastore access, Fallback Mode on external locations, and Databricks Runtime versions prior to 13.3 LTS.

ActionTry it

Get Disable Legacy DBFS Setting

Tool to retrieve the disable legacy DBFS workspace setting. Use when you need to check whether legacy DBFS root and mount access is disabled across all interfaces (UI, APIs, CLI, FUSE). When enabled, this setting also disables Databricks Runtime versions prior to 13.3 LTS and requires manual restart of compute clusters and SQL warehouses to take effect.

ActionTry it

Get Effective Catalog Permissions

Tool to get effective permissions for a securable in Unity Catalog, including inherited permissions from parent securables. Use when you need to understand what privileges are granted to principals through direct assignments or inheritance. Returns privileges conveyed to each principal through the Unity Catalog hierarchy (metastore → catalog → schema → table/view/volume).

ActionTry it
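
As a minimal sketch, assuming the standard Unity Catalog effective-permissions path; "main" is a placeholder catalog name.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Effective permissions fold in privileges inherited from the metastore hierarchy.
resp = requests.get(
    f"{host}/api/2.1/unity-catalog/effective-permissions/catalog/main",
    headers=headers)
resp.raise_for_status()
for assignment in resp.json().get("privilege_assignments", []):
    privileges = [p["privilege"] for p in assignment.get("privileges", [])]
    print(assignment["principal"], privileges)
```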

Get Enable Export Notebook Setting

Tool to retrieve workspace setting controlling notebook export functionality. Use when you need to check whether users can export notebooks and files from the Workspace UI. Administrators use this setting to manage data exfiltration controls.

ActionTry it

Get Enable Notebook Table Clipboard Setting

Tool to retrieve notebook table clipboard setting for the workspace. Use when you need to check whether notebook table clipboard functionality is enabled. This setting controls whether users can copy data from tables in notebooks to their clipboard.

ActionTry it

Get Enable Results Downloading Setting

Tool to retrieve workspace setting controlling notebook results download functionality. Use when you need to check whether users can download notebook query results. Requires workspace administrator privileges to access.

ActionTry it

Get Enhanced Security Monitoring Setting

Tool to retrieve enhanced security monitoring workspace setting. Use when you need to check whether Enhanced Security Monitoring is enabled for the workspace. Enhanced Security Monitoring provides a hardened disk image and additional security monitoring agents. It is automatically enabled when compliance security profile is active, and can be manually toggled when compliance security profile is disabled.

ActionTry it

Get Entity Tag Assignment

Tool to retrieve a specific tag assignment for a Unity Catalog entity by tag key. Use when you need to get details about a tag assigned to catalogs, schemas, tables, columns, or volumes. Requires USE CATALOG and USE SCHEMA permissions on parent resources, and ASSIGN or MANAGE permissions on the tag policy for governed tags.

ActionTry it

Get External Location Details

Tool to retrieve details of a specific Unity Catalog external location. Use when you need to get information about an external location including its URL, storage credential, and configuration. Requires metastore admin privileges, external location ownership, or appropriate privileges on the external location.

ActionTry it

Get Genie Message

Tool to retrieve details of a specific message from a Genie conversation. Use when you need to get message content, status, attachments, or check processing status of a previously created message.

ActionTry it

Get Genie Message Attachment Query Result

Tool to retrieve SQL query results from a Genie message attachment. Use when the message status is EXECUTING_QUERY or COMPLETED and you need to fetch the actual query execution results. Returns statement execution details including query data, schema, and metadata with a maximum of 5000 rows.

ActionTry it

Get Genie Message Query Result

Tool to retrieve SQL query execution results for a Genie message (up to 5000 rows). Use when message status is EXECUTING_QUERY or COMPLETED and the message has a query attachment. Returns query results with schema, metadata, and data in inline or external link format.

ActionTry it

Get Genie Message Query Result

Tool to retrieve SQL query execution results for a message attachment in a Genie space conversation. Use when you need to fetch query results from a Genie conversation message. Note: This endpoint is deprecated; consider using GetMessageAttachmentQueryResult instead. Returns results only when message status is EXECUTING_QUERY or COMPLETED. Maximum 5,000 rows per result.

ActionTry it

Get Genie Space Details

Tool to retrieve detailed information about a specific Databricks Genie space by ID. Use when you need to get configuration details, metadata, and optionally the serialized space content for backup or promotion across workspaces. Requires at least CAN EDIT permission to retrieve the serialized space content.

ActionTry it

Get Global Init Script

Tool to retrieve complete details of a global initialization script in the Databricks workspace. Use when you need to view script configuration, Base64-encoded content, or metadata. Returns all script details including creation/update timestamps and whether the script is enabled.

ActionTry it

Get IAM Account Group V2

Tool to retrieve a specific group resource by its unique identifier from a Databricks account using the SCIM v2 protocol. Use when you need to get complete group details including members, roles, and entitlements.

ActionTry it

Get IAM Permission Levels

Tool to retrieve available permission levels for a Databricks workspace object. Use when you need to understand what permission levels can be assigned to users or groups for a specific object type. Returns permission levels like CAN_READ, CAN_RUN, CAN_EDIT, CAN_MANAGE with their descriptions. Available levels vary by object type.

ActionTry it

Get IAM Permissions

Tool to retrieve IAM permissions for a Databricks workspace object. Use when you need to check who has access to a specific resource and their permission levels. Returns the access control list (ACL) including user, group, and service principal permissions with inheritance information.

ActionTry it

Get IAM Service Principal V2

Tool to retrieve details of a specific service principal by ID from the Databricks workspace using the SCIM v2 protocol. Use when you need to get complete service principal information including groups, roles, entitlements, and metadata.

ActionTry it

Get Instance Pool Details

Tool to retrieve detailed information about a Databricks instance pool by its ID. Use when you need to get instance pool configuration, capacity settings, preloaded images, and usage statistics. Instance pools reduce cluster start and auto-scaling times by maintaining idle, ready-to-use cloud instances.

ActionTry it

Get Instance Pool Permission Levels

Tool to retrieve available permission levels for a Databricks instance pool. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific instance pool. Returns permission levels like CAN_ATTACH_TO and CAN_MANAGE with their descriptions.

ActionTry it

Get Instance Pool Permissions

Tool to retrieve permissions for a Databricks instance pool. Use when you need to check who has access to a specific instance pool and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects.

ActionTry it

Get IP Access List

Tool to retrieve details of a specific IP access list by its ID. Use when you need to view the configuration of allowed or blocked IP addresses and subnets for accessing the workspace or workspace-level APIs. Requires workspace admin privileges.

ActionTry it

Get Job Permission Levels

Tool to retrieve available permission levels for a Databricks job. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific job. Returns permission levels like CAN_VIEW, CAN_MANAGE_RUN, CAN_MANAGE, and IS_OWNER with their descriptions.

ActionTry it

Get Job Policy Compliance

Tool to retrieve policy compliance status for a specific job. Use when you need to check whether a job meets the requirements of its assigned policies and identify any policy violations. Jobs could be out of compliance if a policy they use was updated after the job was last edited and some of its job clusters no longer comply with their updated policies.

ActionTry it

Get Job Run By ID

Tool to retrieve metadata of a single Databricks job run by ID. Use when you need to get detailed information about a specific job run including state, timing, and cluster configuration. Runs are automatically removed after 60 days.

ActionTry it

Get Lakeview Dashboard Details

Tool to retrieve details about a draft AI/BI Lakeview dashboard from the workspace. Use when you need to get comprehensive information about a dashboard including metadata, configuration, state, and serialized dashboard content.

ActionTry it

Get Lakeview Dashboard Schedule

Tool to retrieve a specific schedule for a Databricks AI/BI Lakeview dashboard. Use when you need to get schedule details including cron expressions, pause status, warehouse configuration, and subscription information. Each dashboard can have up to 10 schedules, with each schedule supporting up to 100 subscriptions.

ActionTry it

Get Latest Provider Analytics Dashboard Version

Tool to retrieve the latest logical version of the provider analytics dashboard template. Use when you need to get the current dashboard template version for monitoring consumer usage metrics including listing views, requests, and installs.

ActionTry it

Get Legacy SQL Alert

Tool to retrieve details of a specific legacy SQL alert by its ID. Use when you need to get information about a legacy alert including its configuration, state, query details, and notification settings. Note: This is a legacy endpoint (/api/2.0/preview/sql/alerts) that is deprecated and being replaced by /api/2.0/sql/alerts.

ActionTry it

Get Legacy SQL Query

Tool to retrieve details of a specific legacy SQL query by its UUID. Use when you need to get information about a legacy query including its SQL text, parameters, configuration, and metadata. Note: This is a legacy endpoint (/api/2.0/preview/sql/queries) that has been replaced by /api/2.0/sql/queries and will be supported for six months to allow migration time.

ActionTry it

Get LLM Proxy Partner Powered Setting

Tool to retrieve workspace-level setting that controls whether partner-powered AI features are enabled. Use when you need to check if features like Databricks Assistant, Genie, and Data Science Agent can use models hosted by partner providers (Azure OpenAI or Anthropic). By default, this setting is enabled for non-CSP workspaces.

ActionTry it

Get Logged Model

Tool to fetch logged model metadata by unique ID. Use when you need to retrieve a LoggedModel object representing a model logged to an MLflow Experiment. Returns comprehensive model information including metrics, parameters, tags, and artifact details.

ActionTry it

Get Marketplace Consumer Listing

Tool to retrieve a published listing from the Databricks Marketplace that the consumer has access to. Use when you need to get detailed information about a specific marketplace listing by its ID. Requires Unity Catalog permissions to access marketplace assets.

ActionTry it

Get Marketplace Consumer Personalization Requests

Tool to retrieve personalization requests for a specific marketplace listing. Use when you need to check the status of customization or commercial transaction requests for a listing. Each consumer can make at most one personalization request per listing.

ActionTry it

Get Marketplace Consumer Provider

Tool to retrieve information about a specific provider in the Databricks Marketplace with visible listings. Use when you need to get provider details including contact information, description, and metadata.

ActionTry it

Get Marketplace Provider Listing

Tool to retrieve a specific marketplace provider listing by its identifier. Use when you need to get detailed information about a published or draft listing including metadata, configuration, and assets.

ActionTry it

Get Metastore Details

Retrieves comprehensive details about a Unity Catalog metastore by its unique ID. Returns metastore configuration including name, cloud provider, region, owner, storage settings, Delta Sharing configuration, privilege model version, and audit metadata (creation/update timestamps and users). Use this to inspect metastore properties, verify configurations, or gather information for metastore management operations. Note: Requires appropriate metastore access permissions.

ActionTry it

Get Metastore Summary

Tool to retrieve summary information about the metastore associated with the current workspace. Use when you need metastore configuration overview including cloud vendor, region, storage, and Delta Sharing details.

ActionTry it

Get ML Experiment

Tool to retrieve metadata for an MLflow experiment by ID. Use when you need to get experiment details including name, artifact location, lifecycle stage, and tags. Works on both active and deleted experiments.

ActionTry it

Get ML Experiment By Name

Tool to retrieve MLflow experiment metadata by name. Use when you need to get experiment details using the experiment name. Returns deleted experiments but prefers active ones if both exist with the same name. Throws RESOURCE_DOES_NOT_EXIST if no matching experiment exists.

ActionTry it

Get ML Experiment Permission Levels

Tool to retrieve available permission levels for a Databricks ML experiment. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific experiment. Returns permission levels (CAN_READ, CAN_EDIT, CAN_MANAGE) with their descriptions.

ActionTry it

Get ML Experiment Permissions

Tool to retrieve permissions for an MLflow experiment. Use when you need to check who has access to an experiment and their permission levels. Note that notebook experiments inherit permissions from their corresponding notebook, while workspace experiments have independent permissions.

ActionTry it

Get ML Feature Tag

Tool to retrieve a specific tag from a feature in a feature table in ML Feature Store. Use when you need to get metadata tag details from specific features. This operation returns the tag name and value associated with the feature.

ActionTry it

Get ML Model Registry Permission Levels

Retrieves the list of available permission levels that can be assigned to users or groups for a Databricks ML registered model. This endpoint returns metadata about what permission levels are available (e.g., CAN_READ, CAN_EDIT, CAN_MANAGE, CAN_MANAGE_PRODUCTION_VERSIONS, CAN_MANAGE_STAGING_VERSIONS) with descriptions of what each level grants. Use this to understand permission options before setting or updating model permissions. The returned permission levels are standard across all registered models. Note: This returns the available permission level definitions, not the actual permissions currently assigned to the model. To view actual permissions, use the get permissions endpoint.

ActionTry it

Get MLflow Run

Tool to retrieve complete information about a specific MLflow run including metadata, metrics, parameters, tags, inputs, and outputs. Use when you need to get details of a run by its run_id. Returns the most recent metric values when multiple metrics with the same key exist.

ActionTry it
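
A minimal sketch of fetching one run over REST; the run ID is a placeholder.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

resp = requests.get(f"{host}/api/2.0/mlflow/runs/get", headers=headers,
                    params={"run_id": "0123456789abcdef0123456789abcdef"})  # placeholder
resp.raise_for_status()
run = resp.json()["run"]
print(run["info"]["status"])
for metric in run["data"].get("metrics", []):  # latest value per metric key
    print(metric["key"], metric["value"])
```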

Get Model Version

Tool to retrieve detailed information about a specific version of a registered model in Unity Catalog. Use when you need to get metadata, status, source location, and configuration of a model version. Requires metastore admin privileges, model ownership, or EXECUTE privilege on the registered model with appropriate catalog and schema privileges.

ActionTry it

Get Notification Destination

Tool to retrieve details of a notification destination by its UUID identifier. Use when you need to get configuration details, display name, and type information for a specific notification destination. Only users with workspace admin permissions will see the full configuration details.

ActionTry it

Get Pipeline Permission Levels

Tool to retrieve available permission levels for a Databricks Delta Live Tables pipeline. Use when you need to understand what permission levels can be assigned to users or groups for a specific pipeline. Returns permission levels like CAN_VIEW, CAN_RUN, CAN_MANAGE, and IS_OWNER with their descriptions.

ActionTry it

Get Pipeline Permissions

Tool to retrieve permissions for a Databricks Delta Live Tables pipeline. Use when you need to check who has access to a pipeline and their permission levels. Returns the complete permissions information including access control lists with user, group, and service principal permissions.

ActionTry it

Get Provider Analytics Dashboard

Tool to retrieve provider analytics dashboard information for monitoring consumer usage metrics. Use when you need to access the dashboard ID to view marketplace listing performance including views, requests, installs, and conversion rates.

ActionTry it

Get Public Workspace Setting

Retrieves the current configuration of a workspace-level setting. Use this to get workspace settings such as admin restrictions, automatic cluster updates, default SQL namespaces, security profiles, and monitoring configurations. The response includes an etag for version control, enabling safe concurrent updates. Common use cases include checking current security settings, verifying cluster update policies, or retrieving namespace defaults before making changes.

ActionTry it

Get Published Dashboard Token Info

Retrieves authorization information needed to generate a downscoped OAuth token for embedding a published Lakeview dashboard for external viewers. This endpoint returns the custom claims, scopes, and authorization details required to create a secure, limited-access OAuth token that allows external users to view a specific published dashboard. The returned information ensures the token has minimal necessary permissions and supports row-level security through custom claims. Use this when implementing embedded dashboard experiences for external users who should not have direct access to your Databricks workspace. The dashboard must first be published with embed_credentials enabled.

ActionTry it

Get Published Lakeview Dashboard

Tool to retrieve the current published version of a Lakeview dashboard. Use when you need to get information about the published dashboard including its display name, embedded credentials status, warehouse configuration, and last revision timestamp.

ActionTry it

Get Quality Monitor

Tool to retrieve quality monitor configuration for a Unity Catalog table. Use when you need to get monitor status, metrics tables, custom metrics, notifications, scheduling, and monitoring configuration details. Requires catalog and schema privileges plus SELECT on the table.

ActionTry it

Get Redash V2 Config

Tool to retrieve workspace configuration for Redash V2 in Databricks SQL. Use when you need to get Redash configuration settings for the current workspace.

ActionTry it

Get Registered Model

Tool to retrieve detailed information about a registered model in Unity Catalog. Use when you need to get metadata, owner, storage location, and configuration of a registered model. Requires metastore admin privileges, model ownership, or EXECUTE privilege on the registered model with appropriate catalog and schema privileges.

ActionTry it

Get Resource Quota Information

Tool to retrieve usage information for a Unity Catalog resource quota defined by a child-parent pair. Use when you need to check quota usage for a specific resource type (tables per metastore, schemas per catalog, etc.). The API also triggers an asynchronous refresh if the count is out of date. Requires account admin authentication with OAuth.

ActionTry it

Get Restrict Workspace Admins Setting

Tool to retrieve the restrict workspace admins setting for the workspace. Use when you need to check whether workspace administrators are restricted in their ability to create service principal tokens, change job owners, or modify job run_as settings. This setting controls security boundaries for admin privileges.

ActionTry it

Get Secret Value

Tool to get a secret value from a Databricks secret scope. Use when you need to retrieve the actual value of a secret stored in a scope. Important: This API can only be called from the DBUtils interface (from within a cluster/notebook). There is no API to read the actual secret value outside of a cluster. Requires READ permission on the scope.

ActionTry it
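
Because the value is only readable from inside a cluster, the usual pattern is the dbutils interface in a notebook; the scope and key names below are placeholders.

```python
# Runs inside a Databricks notebook, where dbutils is provided by the runtime.
api_key = dbutils.secrets.get(scope="my-scope", key="my-api-key")

# Secret values are redacted in notebook output if printed directly; use them
# programmatically instead, e.g. to authenticate an outbound client.
print(len(api_key))
```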

Get Secrets ACL

Tool to retrieve ACL details for a principal on a Databricks secret scope. Use when you need to check the permission level granted to a specific user, service principal, or group. Requires MANAGE permission on the scope. Each permission level is hierarchical - WRITE includes READ, and MANAGE includes both WRITE and READ.

ActionTry it

Get Serving Endpoint Details

Retrieves comprehensive details about a specific Databricks serving endpoint by name. Use this action to:

- Get endpoint status and readiness state (READY, NOT_READY, UPDATE_FAILED)
- View served model/entity configurations including foundation model details
- Check endpoint capabilities (function calling, image input, long context, reasoning)
- Retrieve pricing information (input/output token costs in DBUs)
- Inspect AI Gateway settings (usage tracking, safety/PII guardrails)
- Review traffic routing and workload configurations
- Verify permission levels and access controls

Returns detailed information including endpoint type (FOUNDATION_MODEL_API or CUSTOM_MODEL_API), task type (llm/v1/chat, llm/v1/completions, llm/v1/embeddings), model documentation links, and configuration versions.

ActionTry it
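
For illustration, a minimal sketch of the REST call; the endpoint name is a placeholder and the fields printed are a small subset of the response described above.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
name = "my-serving-endpoint"  # placeholder endpoint name

resp = requests.get(f"{host}/api/2.0/serving-endpoints/{name}", headers=headers)
resp.raise_for_status()
ep = resp.json()
print(ep["state"]["ready"], ep.get("task"), ep.get("endpoint_type"))
```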

Get Serving Endpoint OpenAPI Spec

Tool to retrieve the OpenAPI 3.1.0 specification for a serving endpoint. Use when you need to understand the endpoint's schema, generate client code, or visualize the API structure. The endpoint must be in a READY state and the served model must have a model signature logged.

ActionTry it

Get Serving Endpoint Permission Levels

Tool to retrieve available permission levels for a Databricks serving endpoint. Use when you need to understand what permission levels can be assigned to users or groups for access control. Returns permission levels like CAN_MANAGE, CAN_QUERY, and CAN_VIEW with their descriptions.

ActionTry it

Get Share Details

Tool to retrieve details of a specific share from Unity Catalog. Use when you need to get information about a share including its metadata, owner, and optionally the list of shared data objects. Requires metastore admin privileges or share ownership.

ActionTry it

Get Share Permissions

Tool to retrieve permissions for a Delta Sharing share from Unity Catalog. Use when you need to check which principals have been granted privileges on a share. Requires metastore admin privileges or share ownership.

ActionTry it

Get Sharing Provider

Tool to retrieve information about a specific Delta Sharing provider in Unity Catalog. Use when you need to get provider details including authentication type, ownership, and connection information. Requires metastore admin privileges or provider ownership.

ActionTry it

Get Sharing Recipient

Tool to retrieve a Delta Sharing recipient from Unity Catalog metastore by name. Use when you need to get information about a recipient object representing an entity that receives shared data. Requires recipient ownership or metastore admin privileges.

ActionTry it

Get SQL Alert Details

Tool to retrieve details of a specific Databricks SQL alert by its UUID. Use when you need to get information about an alert including its configuration, trigger conditions, state, and notification settings.

ActionTry it

Get SQL Dashboard

Tool to retrieve a complete legacy dashboard definition with metadata, widgets, and queries. Use when you need to get detailed information about a SQL dashboard. Note: the legacy dashboards API is deprecated as of January 12, 2026. Databricks recommends using AI/BI dashboards (Lakeview API) for new implementations.

ActionTry it

Get SQL Object Permissions

Tool to retrieve the access control list for legacy DBSQL (Redash-based) objects including alerts, dashboards, and queries. Use when you need to check who has access to these legacy SQL objects and their permission levels. IMPORTANT: This API is deprecated as of January 2026. For permissions on modern Lakeview dashboards or other workspace objects, use the IAM Permissions API (DATABRICKS_IAM_PERMISSIONS_GET) instead. Legacy DBSQL objects are no longer directly accessible and should be migrated to AI/BI dashboards.

ActionTry it

Get SQL Query Details

Tool to retrieve detailed information about a specific SQL query by its UUID. Use when you need to get query configuration including SQL text, warehouse ID, parameters, ownership, and metadata.

ActionTry it

Get SQL Results Download Setting

Tool to retrieve the SQL results download workspace setting. Use when you need to check whether users within the workspace are allowed to download results from the SQL Editor and AI/BI Dashboards UIs. By default, this setting is enabled (set to true). Returns an etag for use in subsequent update/delete operations.

ActionTry it

Get SQL Statement Result Chunk

Get a specific chunk of results from a SQL statement execution. Use this to paginate through large result sets. The chunk_index is zero-based. Use the manifest from the execute_statement or get_statement response to determine the total number of chunks available.

ActionTry it

Get SQL Statement Status

Get the status, manifest, and first chunk of results for a SQL statement execution. Use this to poll for completion after executing a statement asynchronously. The statement result is available for one hour after completion. Returns HTTP 404 if the statement has been in a terminal state for more than 12 hours.

ActionTry it
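
Taken together with the chunk endpoint above, a typical flow is to poll this status endpoint and then walk the result chunks. A minimal sketch, assuming an inline (data_array) result disposition; the statement ID is a placeholder.

```python
import os
import time
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
stmt_id = "placeholder-statement-id"

# Poll until the statement leaves PENDING/RUNNING.
while True:
    resp = requests.get(f"{host}/api/2.0/sql/statements/{stmt_id}", headers=headers)
    resp.raise_for_status()
    status = resp.json()
    if status["status"]["state"] not in ("PENDING", "RUNNING"):
        break
    time.sleep(2)

# Walk every chunk; chunk_index is zero-based.
for i in range(status["manifest"]["total_chunk_count"]):
    chunk = requests.get(f"{host}/api/2.0/sql/statements/{stmt_id}/result/chunks/{i}",
                         headers=headers).json()
    for row in chunk.get("data_array", []):
        print(row)
```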

Get SQL Warehouse Details

Tool to retrieve detailed information about a specific SQL warehouse by its ID. Use when you need to get configuration, state, connection details, and resource allocation for a SQL warehouse. Returns comprehensive warehouse information including cluster settings, JDBC/ODBC connection strings, and health status.

ActionTry it

Get SQL Warehouse Permission Levels

Tool to retrieve available permission levels for a Databricks SQL warehouse. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific SQL warehouse. Returns permission levels like CAN_USE, CAN_MANAGE, IS_OWNER, CAN_VIEW, and CAN_MONITOR with their descriptions.

ActionTry it

Get SQL Warehouse Permissions

Tool to retrieve permissions for a Databricks SQL warehouse. Use when you need to check who has access to a specific SQL warehouse and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects.

ActionTry it

Get Storage Credential

Tool to retrieve storage credential details from Unity Catalog metastore by name. Use when you need to get information about a storage credential's configuration and properties. Requires metastore admin privileges, credential ownership, or appropriate permissions on the storage credential.

ActionTry it

Get Tag Policy

Tool to retrieve a specific tag policy by its associated governed tag's key. Use when you need to get details about tag governance policies including allowed values and metadata.

ActionTry it

Get Token Information

Tool to retrieve detailed information about a specific token by its ID from the token management system. Use when you need to get token metadata including creation time, expiry, owner, and usage information. Requires appropriate permissions to access token information.

ActionTry it

Get Token Management Permission Levels

Tool to retrieve available permission levels for personal access token management. Use when you need to understand what permission levels can be assigned for managing tokens in the workspace. Returns permission levels like CAN_USE and CAN_MANAGE with their descriptions.

ActionTry it

Get Token Management Permissions

Tool to retrieve permissions for workspace token management. Use when you need to check which users, groups, and service principals have permissions to create and manage personal access tokens. Requires workspace admin privileges and is available only in Databricks Premium plan.

ActionTry it

Get User by ID

Tool to retrieve information for a specific user in the Databricks workspace by their ID. Use when you need to get complete user details including identity, contact information, group memberships, roles, and entitlements. Implements the SCIM 2.0 protocol standard for retrieving User resources.

ActionTry it

Get Workspace Access Detail

Retrieves workspace access details for a specific principal (user, service principal, or group) in Databricks. Returns information about the principal's workspace access including their principal ID, workspace ID, account ID, principal type (USER/SERVICE_PRINCIPAL/GROUP), access type (DIRECT/INHERITED), and access status (ACTIVE/INACTIVE). Use this to verify workspace access assignments and understand how identities are granted access to the workspace.

ActionTry it

Get Workspace Git Credentials

Tool to retrieve Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to get details of existing Git integration credentials by credential ID.

ActionTry it

Get Workspace IAM Group V2

Tool to retrieve details of a specific group by ID from the Databricks workspace using the SCIM v2 protocol. Use when you need to get complete group information including members, roles, entitlements, and metadata.

ActionTry it

Get Workspace Object Status

Tool to retrieve status and metadata for any workspace object including notebooks, directories, dashboards, and files. Use when you need to get object type, path, identifier, and additional metadata fields. Returns error with code RESOURCE_DOES_NOT_EXIST if the specified path does not exist.

ActionTry it

Get Workspace Repo Permission Levels

Tool to retrieve available permission levels for a Databricks workspace repository. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific Git repository. Returns permission levels like CAN_READ, CAN_RUN, CAN_EDIT, and CAN_MANAGE with their descriptions.

ActionTry it

Get Workspace Warehouse Config

Tool to retrieve workspace-level SQL warehouse configuration settings. Use when you need to check security policies, serverless compute settings, channel versions, or warehouse type restrictions that apply to all SQL warehouses in the workspace.

ActionTry it

Import Workspace Object

Import a notebook or file into the Databricks workspace from base64-encoded content. Supports multiple formats: SOURCE (requires language parameter), JUPYTER, R_MARKDOWN, HTML, DBC, or AUTO for automatic detection. For SOURCE format with single files, you must specify the language (PYTHON, R, SCALA, or SQL). Maximum content size: 10 MB. Use overwrite=true to replace existing objects at the same path.

ActionTry it
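
A minimal sketch of importing a single-file Python notebook; the workspace path is a placeholder.

```python
import base64
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

source = "print('hello from an imported notebook')\n"
resp = requests.post(f"{host}/api/2.0/workspace/import", headers=headers, json={
    "path": "/Workspace/Users/someone@example.com/hello",  # placeholder path
    "format": "SOURCE",
    "language": "PYTHON",  # required for single-file SOURCE imports
    "content": base64.b64encode(source.encode()).decode(),
    "overwrite": True,
})
resp.raise_for_status()
```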

List All Databricks Jobs (API 2.0)

Tool to list all jobs in the Databricks workspace using API 2.0. Use when you need to retrieve all jobs without pagination. Note: API 2.0 does not support pagination or filtering. For pagination support, use the API 2.2 endpoint instead.

ActionTry it

List Catalog Schemas

Tool to retrieve all schemas in a specified catalog from Unity Catalog. Use when you need to discover available schemas within a catalog based on user permissions. If the caller is the metastore admin or owner of the parent catalog, all schemas will be retrieved. Otherwise, only schemas owned by the caller or for which the caller has the USE_SCHEMA privilege will be retrieved.

ActionTry it

List Catalog Tables

Tool to list all tables in a Unity Catalog schema with pagination support. Use when you need to retrieve tables from a specific catalog and schema combination. The API is paginated by default - continue reading pages using next_page_token until it's absent to ensure all results are retrieved.

ActionTry it
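
A minimal pagination sketch against the Unity Catalog tables endpoint; the catalog and schema names are placeholders.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

tables, token = [], None
while True:
    params = {"catalog_name": "main", "schema_name": "default"}  # placeholders
    if token:
        params["page_token"] = token
    resp = requests.get(f"{host}/api/2.1/unity-catalog/tables",
                        headers=headers, params=params)
    resp.raise_for_status()
    body = resp.json()
    tables += body.get("tables", [])
    token = body.get("next_page_token")
    if not token:  # an absent token means the listing is complete
        break
print([t["full_name"] for t in tables])
```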

List Clusters

Tool to list all pinned, active, and recently terminated Databricks clusters. Use when you need to retrieve cluster information, monitor cluster status, or get an overview of available compute resources. Returns clusters terminated within the last 30 days along with currently active clusters. Supports filtering by state, source, and policy, with pagination for large result sets.

ActionTry it

List Compute Cluster Availability Zones

Tool to list availability zones where Databricks clusters can be created. Use when you need to determine available zones for cluster deployment or planning redundancy. Returns the default zone and a list of all zones available in the workspace's cloud region. This endpoint is available for AWS workspaces.

ActionTry it

List Compute Cluster Node Types

Tool to list all supported Spark node types available for cluster launch in the workspace region. Use when you need to determine which instance types are available for creating or configuring clusters. Returns detailed specifications including compute resources, storage capabilities, and cloud-specific attributes for each node type.

ActionTry it

List Compute Cluster Spark Versions

Tool to list all available Databricks Runtime Spark versions for cluster creation. Use when you need to determine which Spark versions are available for creating or configuring clusters. The 'key' field from the response should be used as the 'spark_version' parameter when creating clusters.

ActionTry it

List Databricks Job Runs

Tool to list Databricks job runs in descending order by start time. Use when you need to retrieve job runs with optional filtering by job ID, run status, and type. Supports pagination via offset and limit parameters. Runs are automatically removed after 60 days.

ActionTry it

List DBFS Directory Contents

Tool to list the contents of a directory or get details of a file in DBFS. Use when you need to browse DBFS directories or check file details. Note: Recommended for directories with fewer than 10,000 files due to the ~60-second timeout limitation. Throws a RESOURCE_DOES_NOT_EXIST error if the path doesn't exist.

ActionTry it

List Delta Live Tables Pipelines

Tool to list Delta Live Tables pipelines in the workspace. Use when you need to retrieve a paginated list of pipelines with summary information. The pipeline specification field is not returned by this endpoint - only summary information is provided. For complete pipeline details, use the get pipeline endpoint.

ActionTry it

List Genie Conversation Messages

Tool to retrieve all messages from a specific conversation thread in a Genie space. Use when you need to view the complete message history of a conversation including user queries and AI responses. Supports pagination for conversations with many messages.

ActionTry it

List Genie Conversations

Tool to retrieve all existing conversation threads within a Genie space. Use when you need to view conversations in a Genie space, either for the current user or all users if you have CAN MANAGE permission. Supports pagination for spaces with many conversations.

ActionTry it

List Genie Spaces

Tool to retrieve all Genie spaces in the workspace that the authenticated user has access to. Use when you need to list available Genie spaces, their metadata, and warehouse associations. Supports pagination for workspaces with many spaces.

ActionTry it

List Instance Pools

Tool to retrieve a list of all active instance pools in the Databricks workspace with their statistics and configuration. Use when you need to get an overview of all available instance pools.

ActionTry it

List Job Compliance for Policy

Tool to retrieve policy compliance status of all jobs using a given cluster policy. Use when you need to identify jobs that are out of compliance because the policy was updated after the job was last edited. Jobs are non-compliant when their job clusters no longer meet the requirements of the updated policy.

ActionTry it

List Legacy SQL Alerts

Tool to list all legacy SQL alerts accessible to the authenticated user. Use when you need to retrieve a list of all legacy alerts in the workspace. Note: This is a legacy endpoint (/api/2.0/preview/sql/alerts) that is deprecated and being replaced by /api/2.0/sql/alerts.

ActionTry it

List Members of a Security Group

Tool to retrieve all members (users and nested groups) of a Databricks security group. Use when you need to see who belongs to a specific group for access control auditing or management. This method is non-recursive and does not expand nested group memberships.

ActionTry it

List Model Serving Endpoints

Lists all model serving endpoints in the Databricks workspace. This includes both foundation model endpoints (e.g., GPT, Claude, Llama) and custom model endpoints. Returns comprehensive information for each endpoint:

- Endpoint name, ID, type, and task (chat, completions, embeddings)
- Current state and readiness status
- Served entities/models configuration with pricing details
- Endpoint capabilities (function calling, image input, long context)
- Traffic routing configuration
- Permission level and metadata

Use this to discover available AI models, check endpoint status, or gather configuration details for making API calls.

ActionTry it

List Pipeline Updates

Tool to retrieve a paginated list of updates for a Databricks Delta Live Tables pipeline. Use when you need to view the update history for a specific pipeline. Returns information about each update including state, creation time, and configuration details such as full refresh and table selection.

ActionTry it

List Quality Monitor Refreshes

Tool to retrieve the refresh history for a quality monitor on a Unity Catalog table. Use when you need to check the status and history of monitor refresh operations. Returns up to 25 most recent refreshes including their state, timing, and status messages.

ActionTry it

List Repos

Tool to list Git repos that the calling user has Manage permissions on. Use when you need to retrieve all available repos in the workspace. Supports pagination and filtering by path prefix.

ActionTry it

List Secret Scopes

Tool to list all secret scopes available in the Databricks workspace. Use when you need to retrieve all secret scopes including their names, backend types (DATABRICKS or AZURE_KEYVAULT), and Key Vault metadata for Azure-backed scopes.

ActionTry it

List Secrets

Tool to list all secret keys stored in a Databricks secret scope. Use when you need to retrieve metadata about secrets in a scope (does not return secret values). Requires READ permission on the scope.

ActionTry it

List SQL Query History

Tool to retrieve the history of SQL queries executed against SQL warehouses and serverless compute. Use when you need to list queries by time range, status, user, or warehouse. Returns most recently started queries first (up to max_results). Supports filtering and pagination.

ActionTry it

List SQL Warehouses

Tool to list all SQL warehouses in the Databricks workspace. Use when you need to retrieve information about available SQL compute resources for running SQL commands. Returns the full list of SQL warehouses the user has access to, including their configuration, state, and connection details.

ActionTry it

List Tokens

Tool to list all valid personal access tokens (PATs) for a user-workspace pair. Use when you need to retrieve all tokens associated with the authenticated user in the current workspace. Note that each PAT is valid for only one workspace, and Databricks automatically revokes PATs that haven't been used for 90 days.

ActionTry it

List Unity Catalogs

Tool to retrieve a list of all catalogs in the Unity Catalog metastore. Use when you need to discover available catalogs based on user permissions. If the caller is the metastore admin, all catalogs will be retrieved. Otherwise, only catalogs owned by the caller or for which the caller has the USE_CATALOG privilege will be retrieved.

ActionTry it

List Users

Tool to list all users in a Databricks workspace using the SCIM 2.0 protocol. Use when you need to retrieve user identities and their attributes. Supports filtering, pagination, and sorting.

ActionTry it

List Vector Search Endpoints

Tool to list all vector search endpoints in the Databricks workspace. Use when you need to retrieve information about vector search endpoints which represent compute resources hosting vector search indexes. Supports pagination for handling large result sets.

ActionTry it

List Workspace Directory

Tool to list the contents of a directory in Databricks workspace. Use when you need to view notebooks, files, directories, libraries, or repos at a specific path. Returns object information including paths, types, and metadata. Use object_id for setting permissions via the Permissions API.

ActionTry it

List Workspace Groups

Tool to list all groups in the Databricks workspace using the SCIM v2 protocol. Use when you need to retrieve all groups or search for specific groups using filters and pagination.

ActionTry it

Log Batch MLflow Data

Tool to log a batch of metrics, parameters, and tags for an MLflow run in a single request. Use when you need to efficiently log multiple metrics, params, or tags simultaneously. Items within each type are processed sequentially in the order specified. The combined total of all items across metrics, params, and tags cannot exceed 1000.

ActionTry it
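
A minimal sketch of one batched logging call; the run ID is a placeholder, and the 1000-item cap applies to metrics, params, and tags combined.

```python
import os
import time
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
now_ms = int(time.time() * 1000)

resp = requests.post(f"{host}/api/2.0/mlflow/runs/log-batch", headers=headers, json={
    "run_id": "0123456789abcdef0123456789abcdef",  # placeholder run ID
    "metrics": [{"key": "accuracy", "value": 0.94, "timestamp": now_ms, "step": 10}],
    "params": [{"key": "learning_rate", "value": "0.001"}],
    "tags": [{"key": "stage", "value": "baseline"}],
})
resp.raise_for_status()
```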

Log Logged Model Parameters

Tool to log parameters for a logged model in MLflow. Use when you need to attach hyperparameters or metadata to a LoggedModel object. A param can be logged only once for a logged model, and attempting to overwrite an existing param will result in an error. Available in MLflow 2.8+.

ActionTry it

Log MLflow Dataset Inputs

Tool to log dataset inputs to an MLflow run for tracking data sources used during model development. Use when you need to track metadata about datasets used in ML experiment runs, including information about the dataset source, schema, and tags. Enables logging of dataset inputs to a run, allowing you to track data sources throughout the ML lifecycle.

ActionTry it

Log MLflow Dataset Outputs

Tool to log dataset outputs from an MLflow run for tracking data generated during model development. Use when you need to track metadata about datasets produced by ML experiment runs, including information about predictions, model outputs, or generated data. Enables logging of dataset outputs to a run, allowing you to track generated data throughout the ML lifecycle.

ActionTry it

Log MLflow Metric

Tool to log a metric for an MLflow run with timestamp. Use when you need to record ML model performance metrics like accuracy, loss, or custom evaluation metrics. Metrics can be logged multiple times with different timestamps and values are never overwritten - each log appends to the metric history for that key.

ActionTry it

Log MLflow Model

Tool to log a model artifact for an MLflow run (Experimental API). Use when you need to record model metadata including artifact paths, flavors, and versioning information for a training run. The model_json parameter should contain a complete MLmodel specification in JSON string format.

ActionTry it

Log MLflow Parameter

Tool to log a parameter for an MLflow run as a key-value pair. Use when you need to record hyperparameters or constant values for ML model training or ETL pipelines. Parameters can only be logged once per run and cannot be changed after logging. Logging identical parameters is idempotent.

ActionTry it

Migrate Permissions

Tool to migrate ACL permissions from workspace groups to account groups. Use when adopting Unity Catalog and migrating permissions from workspace-level groups to account-level groups. Primarily used by the Unity Catalog Migration (UCX) tool. Supports batch processing with configurable size limits.

ActionTry it

Move DBFS File or Directory

Tool to move a file or directory from one location to another within DBFS. Use when you need to relocate files or directories in the Databricks File System. Recursively moves all files if the source is a directory. Not recommended for large-scale operations (>10k files), as it may time out after ~60 seconds.

ActionTry it

Patch IAM Group V2

Tool to partially update a Databricks workspace group using SCIM 2.0 PATCH operations. Use when you need to modify group attributes like displayName, add/remove members, or update entitlements/roles. All operations in a single request are atomic.

ActionTry it

Patch IAM Service Principal V2

Tool to partially update a service principal using SCIM 2.0 PATCH operations. Use when you need to modify service principal attributes like active status, displayName, groups, entitlements, or roles without replacing the entire resource. All operations in a single request are atomic.

ActionTry it

Patch IAM User V2

Tool to partially update a user using SCIM 2.0 PATCH operations. Use when you need to modify user attributes like active status, displayName, userName, name fields, emails, groups, entitlements, or roles without replacing the entire resource. All operations in a single request are atomic.

ActionTry it

Permanently Delete Compute Cluster

Tool to permanently delete a Databricks compute cluster. Use when you need to irreversibly remove a cluster and its resources. After permanent deletion, the cluster will no longer appear in the cluster list and cannot be recovered.

ActionTry it

Pin Compute Cluster

Tool to pin a Databricks compute cluster configuration. Use when you need to preserve a cluster's configuration beyond the standard 30-day retention period. This operation is idempotent - pinning an already-pinned cluster has no effect. Requires workspace administrator privileges.

ActionTry it

Publish Lakeview Dashboard

Tool to publish an AI/BI Lakeview dashboard, making it accessible via a public link. Use when you need to publish a draft dashboard with embedded credentials and assign a warehouse for query execution. After successful publication, the dashboard becomes accessible at https://<deployment-url>/dashboardsv3/<resource_id>/published.

ActionTry it

Put Secret in Scope

Tool to insert or update a secret in a Databricks secret scope. Use when you need to store sensitive information like passwords, API keys, or credentials. Overwrites existing secrets with the same key. Requires WRITE or MANAGE permission on the scope. Maximum 1,000 secrets per scope, with a 128 KB limit per secret.

ActionTry it
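
A minimal sketch of writing a string-valued secret; the scope, key, and value are placeholders.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Overwrites the key if it already exists in the scope.
resp = requests.post(f"{host}/api/2.0/secrets/put", headers=headers, json={
    "scope": "my-scope",
    "key": "my-api-key",
    "string_value": "s3cr3t-value",  # placeholder; never hard-code real secrets
})
resp.raise_for_status()
```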

Put Secrets ACL

Tool to create or overwrite access control list for a principal on a Databricks secret scope. Use when you need to grant or modify permissions for a user, group, or service principal on a secret scope. Requires MANAGE permission on the scope. Overwrites existing permission level for the principal if one already exists.

ActionTry it

Query Vector Search Index

Tool to query vector search index to find similar vectors and return associated documents. Use when performing similarity search, hybrid keyword-similarity search, or full-text search on Databricks Vector Search indexes. Supports filtering, reranking, and returns configurable columns from matched documents with similarity scores. Must provide either query_vector or query_text.

ActionTry it
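
A minimal similarity-search sketch, assuming the index query path /api/2.0/vector-search/indexes/{name}/query; the index name, columns, and query text are placeholders.

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]
headers = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
index = "main.default.docs_index"  # placeholder index name

# Text query (the service embeds it); pass query_vector instead for raw vectors.
resp = requests.post(f"{host}/api/2.0/vector-search/indexes/{index}/query",
                     headers=headers, json={
                         "columns": ["id", "text"],
                         "query_text": "how do I rotate credentials?",
                         "num_results": 5,
                     })
resp.raise_for_status()
print(resp.json().get("result", {}).get("data_array", []))
```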

Read DBFS File Contents

Tool to read the contents of a file from DBFS. Returns base64-encoded file data with a maximum read size of 1 MB per request. Use when you need to retrieve file contents from the Databricks File System. Throws RESOURCE_DOES_NOT_EXIST if the file does not exist, INVALID_PARAMETER_VALUE if the path is a directory, and MAX_READ_SIZE_EXCEEDED if the read length exceeds 1 MB.

ActionTry it

Register Lakebase Database as Catalog

Tool to register a Lakebase Postgres database as a Unity Catalog catalog in Databricks. Use when you need to register a Lakebase database instance as a catalog in Unity Catalog for data governance and analytics. Requires CREATE CATALOG privilege on the metastore and access to the database instance.

ActionTry it

Remove Compute Instance Profile

Tool to remove an instance profile from Databricks. Use when you need to unregister an AWS instance profile ARN from Databricks. This operation is only accessible to admin users. Existing clusters with this instance profile will continue to function normally after removal.

ActionTry it

Restore ML Experiment

Tool to restore a deleted MLflow experiment and its associated metadata, runs, metrics, params, and tags. Use when you need to recover a previously deleted experiment from Databricks. If the experiment uses FileStore, underlying artifacts are also restored.

ActionTry it

Restore ML Experiment Run

Tool to restore a deleted MLflow run and its associated metadata, metrics, params, and tags. Use when you need to recover a previously deleted run from Databricks ML experiments. The operation cannot restore runs that were permanently deleted.

ActionTry it

Restore ML Experiment Runs

Tool to bulk restore runs in an ML experiment that were deleted at or after a specified timestamp. Use when you need to recover multiple deleted experiment runs. Only runs deleted at or after the specified timestamp are restored. The maximum number of runs that can be restored in one operation is 10000.

ActionTry it

Restore SQL Query (Legacy)

Tool to restore a trashed SQL query to active state. Use when you need to recover a deleted query within 30 days of deletion. Once restored, the query reappears in list views and searches and can be used for alerts again. This is a legacy/deprecated API endpoint.

ActionTry it

Search Logged Models

Tool to search for logged models in MLflow experiments based on various criteria. Use when you need to find models that match specific metrics, parameters, tags, or attributes using SQL-like filter expressions. Supports pagination, ordering results, and filtering by datasets.

ActionTry it

Search MLflow Experiments

Tool to search for MLflow experiments with filtering, ordering, and pagination support. Use when you need to find experiments based on name patterns, tags, or other criteria. Supports SQL-like filtering expressions and ordering by experiment attributes.

ActionTry it

Search MLflow Runs

Tool to search for MLflow runs with filtering, ordering, and pagination support. Use when you need to find runs based on metrics, parameters, tags, or other criteria. Supports complex filter expressions with operators like =, !=, >, >=, <, <= for metrics, params, and tags.

ActionTry it

Send Genie Message Feedback

Tool to send feedback for a Genie message. Use when you need to provide positive, negative, or no feedback rating for AI-generated messages in Genie conversations. Positive feedback on responses that join tables or use SQL expressions can prompt Genie to suggest new SQL snippets to space managers for review and approval.

ActionTry it

Set Compute Cluster Policy Permissions

Tool to set permissions for a Databricks cluster policy, replacing all existing permissions. Use when you need to configure access control for a cluster policy. This operation replaces ALL existing permissions; non-admin users must be granted permissions to access the policy. Workspace admins always have permissions on all policies.

ActionTry it

Set Compute Instance Pool Permissions

Tool to set permissions for a Databricks instance pool, replacing all existing permissions. Use when you need to configure access control for an instance pool. This operation replaces ALL existing permissions. You must have CAN_MANAGE permission on a pool to configure its permissions.

ActionTry it

Set Databricks App Permissions

Tool to set permissions for a Databricks app, replacing all existing permissions. Use when you need to configure access control for an app. This operation replaces ALL existing permissions; for incremental updates, use the update permissions endpoint instead. Admin permissions cannot be removed.

ActionTry it

Set Databricks Job Permissions

Tool to set permissions for a Databricks job, completely replacing all existing permissions. Use when you need to configure access control for a job. This operation replaces ALL existing permissions with the provided list. To remove all permissions except the owner, provide an empty array. The job must have exactly one owner (cannot be a group).

ActionTry it
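
Since the Set ... Permissions actions in this catalog all share the workspace Permissions API, a single sketch illustrates the pattern: a PUT to /api/2.0/permissions/jobs/{job_id} replaces the whole ACL. The job ID and principals are placeholders:

```python
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

job_id = "123"   # placeholder job ID
acl = {
    "access_control_list": [
        # Exactly one IS_OWNER entry, and it must be a user or service principal.
        {"user_name": "owner@example.com", "permission_level": "IS_OWNER"},
        {"group_name": "data-engineers", "permission_level": "CAN_MANAGE_RUN"},
    ]
}
# PUT replaces the job's entire ACL; anything not listed here is removed.
r = requests.put(f"{HOST}/api/2.0/permissions/jobs/{job_id}", headers=HEADERS, json=acl)
r.raise_for_status()
```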

Set IAM Permissions

Tool to set IAM permissions for a Databricks workspace object, replacing all existing permissions. Use when you need to configure complete access control for a resource. This operation replaces the entire access control list - existing permissions are overwritten. Admin permissions on the admins group cannot be removed.

ActionTry it

Set Logged Model Tags

Tool to set tags on a logged model in MLflow. Use when you need to add or update metadata tags on a LoggedModel object for organization and tracking. Tags are key-value pairs that can be used to search and filter logged models. Part of MLflow 3's logged model management capabilities.

ActionTry it

Set ML Experiment Permissions

Tool to set permissions for an MLflow experiment, replacing all existing permissions. Use when you need to configure access control for an experiment. This operation replaces ALL existing permissions; for incremental updates, use the update permissions endpoint instead.

ActionTry it

Set ML Experiment Tag

Tool to set a tag on an MLflow experiment. Use when you need to add or update experiment metadata. Experiment tags are metadata that can be updated at any time.

ActionTry it

Set MLflow Run Tag

Tool to set a tag on an MLflow run. Use when you need to add custom metadata to runs for filtering, searching, and organizing experiments. Tags with the same key can be overwritten by successive writes. Logging the same tag (key, value) is idempotent.

ActionTry it

Set or Update ML Feature Tag

Tool to set or update a tag on a feature in a feature table in ML Feature Store. Use when you need to add or modify metadata tags on specific features. If the tag already exists, it will be updated with the new value. If the tag doesn't exist, it will be created automatically. This operation is idempotent and can be used to ensure a tag has a specific value.

ActionTry it

Set SQL Object Permissions

Tool to set the access control list for legacy SQL objects (alerts, dashboards, queries, or data_sources). Use when you need to configure permissions for legacy SQL objects created via the /preview/sql endpoints. IMPORTANT:
- This operation REPLACES ALL existing permissions. To retain existing permissions, include them in the access_control_list.
- This is a DEPRECATED API; Databricks recommends the Workspace API (/api/2.0/permissions) instead.
- It only works with LEGACY SQL objects: legacy SQL queries (/preview/sql/queries), legacy Redash-based SQL dashboards (NOT Lakeview dashboards), legacy SQL alerts, and SQL warehouses (data_sources).
- It does NOT work with modern Lakeview dashboards (/lakeview/dashboards) or modern SQL queries (/sql/queries).

ActionTry it

Set SQL Warehouse Permissions

Tool to set permissions for a Databricks SQL warehouse, replacing all existing permissions. Use when you need to configure access control for a SQL warehouse. This operation is authoritative and overwrites all existing permissions. Exactly one IS_OWNER must be specified. Groups cannot have IS_OWNER permission.

ActionTry it

Set Token Management Permissions

Tool to set permissions for personal access token management, replacing all existing permissions. Use when configuring which users, groups, and service principals can create and use tokens. This operation replaces ALL existing permissions; if you need to add or modify permissions without replacing existing ones, use the update_permissions method instead. Workspace admins always retain CAN_MANAGE permissions.

ActionTry it

Set Workspace Configuration Status

Set workspace-level configuration settings for a Databricks workspace. Commonly used to configure the maximum token lifetime for personal access tokens (maxTokenLifetimeDays). Updates are applied immediately and affect workspace behavior. Requires workspace admin permissions. Invalid or unsupported configuration keys will return a 400 Bad Request error.

ActionTry it
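
A minimal sketch of the underlying workspace-conf endpoint; note that all values are passed as strings, even numeric ones:

```python
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# Unknown or unsupported keys return 400 Bad Request, as noted above.
r = requests.patch(f"{HOST}/api/2.0/workspace-conf",
                   headers=HEADERS, json={"maxTokenLifetimeDays": "90"})
r.raise_for_status()

# Read the setting back to confirm it took effect.
r = requests.get(f"{HOST}/api/2.0/workspace-conf",
                 headers=HEADERS, params={"keys": "maxTokenLifetimeDays"})
print(r.json())
```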

Set Workspace Repo Permissions

Tool to set permissions for a workspace repository, replacing all existing permissions. Use when you need to configure access control for a workspace repo. This operation replaces ALL existing permissions; admin users cannot have their permissions lowered. Repos can inherit permissions from their root object.

ActionTry it

Set Workspace Warehouse Config

Tool to configure workspace-level SQL warehouse settings shared by all SQL warehouses. Use when you need to set security policies, enable serverless compute, configure channel versions, or manage warehouse type restrictions across the workspace.

ActionTry it

Start Compute Cluster

Tool to start a terminated Databricks compute cluster asynchronously. Use when you need to restart a stopped cluster. The cluster transitions through PENDING state before reaching RUNNING. Poll cluster status to verify when fully started.

ActionTry it

Start Databricks App

Tool to start the last active deployment of a Databricks app. Use when you need to start a stopped app. The start operation is asynchronous; the app transitions to the ACTIVE state after it completes.

ActionTry it

Start Genie Conversation

Tool to start a new Genie conversation in a Databricks space for natural language data queries. Use when you need to ask questions about data using natural language. The message processes asynchronously, so initial status will be IN_PROGRESS. Poll the message status to get the completed response with query results.

ActionTry it
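
A rough sketch of the start-then-poll flow, assuming the Genie conversation endpoints under /api/2.0/genie/spaces; the response field names and status values shown are best-effort assumptions, and the space ID is a placeholder:

```python
import os
import time
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
SPACE_ID = "01ef0000000000000000000000000000"   # placeholder Genie space ID

# Kick off the conversation; the call returns immediately while Genie
# processes the question asynchronously.
r = requests.post(f"{HOST}/api/2.0/genie/spaces/{SPACE_ID}/start-conversation",
                  headers=HEADERS,
                  json={"content": "What were total sales last month?"})
r.raise_for_status()
data = r.json()
conv_id, msg_id = data["conversation_id"], data["message_id"]

# Poll the message until it leaves the assumed in-flight statuses.
while True:
    r = requests.get(
        f"{HOST}/api/2.0/genie/spaces/{SPACE_ID}/conversations/{conv_id}/messages/{msg_id}",
        headers=HEADERS)
    r.raise_for_status()
    status = r.json().get("status")
    if status not in ("IN_PROGRESS", "EXECUTING_QUERY"):
        break
    time.sleep(5)
print(status)
```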

Start SQL Warehouse

Tool to start a stopped Databricks SQL warehouse asynchronously. Use when you need to restart a stopped warehouse. The warehouse transitions through STARTING state before reaching RUNNING. Requires CAN MONITOR permissions or higher.

ActionTry it

Stop Databricks App

Tool to stop the active deployment of a Databricks app. Use when you need to stop a running app. The stop operation is asynchronous; the app transitions to the STOPPED state after it completes.

ActionTry it

Submit One-Time Run

Tool to submit a one-time run without creating a job. Use when you need to execute a task directly without saving it as a job definition. After submission, use the jobs/runs/get API with the returned run_id to check the run state and monitor progress.

ActionTry it
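
A minimal sketch of submit-then-poll using /api/2.1/jobs/runs/submit and jobs/runs/get; the notebook path and cluster spec are placeholders:

```python
import os
import time
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

payload = {
    "run_name": "ad-hoc-etl",                                # placeholder name
    "tasks": [{
        "task_key": "main",
        "notebook_task": {"notebook_path": "/Shared/etl"},   # placeholder notebook
        "new_cluster": {
            "spark_version": "13.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
    }],
}
resp = requests.post(f"{HOST}/api/2.1/jobs/runs/submit", headers=HEADERS, json=payload)
resp.raise_for_status()
run_id = resp.json()["run_id"]

# Poll jobs/runs/get until the run reaches a terminal life-cycle state.
while True:
    state = requests.get(f"{HOST}/api/2.1/jobs/runs/get", headers=HEADERS,
                         params={"run_id": run_id}).json()["state"]
    if state["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
        break
    time.sleep(30)
print(state.get("result_state"))
```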

Trash Genie Space

Tool to move a Genie space to trash instead of permanently deleting it. Use when you need to remove a Genie space while retaining recovery options. Trashed spaces follow standard Databricks trash behavior with 30-day retention before permanent deletion. Requires CAN MANAGE permission on the space.

ActionTry it

Trash Lakeview Dashboard

Tool to move a Lakeview dashboard to trash instead of permanently deleting it. Use when you need to remove a dashboard while retaining recovery options. Trashed dashboards can be recovered within 30 days before permanent deletion.

ActionTry it

Unassign Metastore from Workspace

Tool to unassign a Unity Catalog metastore from a workspace. Use when you need to remove the association between a workspace and its assigned metastore, leaving the workspace with no metastore. The metastore itself is not deleted, only the workspace assignment is removed. Requires account admin privileges.

ActionTry it

Unpin Compute Cluster

Tool to unpin a Databricks compute cluster configuration. Use when you need to allow a cluster's configuration to be removed after termination. This operation is idempotent - unpinning an already-unpinned cluster has no effect. Requires workspace administrator privileges.

ActionTry it

Unpublish Lakeview Dashboard

Tool to unpublish an AI/BI Lakeview dashboard while preserving its draft version. Use when you need to remove the published version of a dashboard. The draft version remains available and can be republished later if needed.

ActionTry it

Update Access Request Destinations

Tool to update access request notification destinations for Unity Catalog securables. Use when you need to configure where access request notifications are sent for catalogs, schemas, external locations, connections, or credentials. Requires metastore admin, owner privileges, or MANAGE permission on the securable. Maximum 5 emails and 5 external destinations allowed per securable. Note: Destinations cannot be updated for securables underneath schemas (tables, volumes, functions, models) as they inherit from parent securables.

ActionTry it

Update AI/BI Dashboard Embedding Access Policy

Tool to update the AI/BI dashboard embedding access policy at the workspace level. Use when you need to control whether AI/BI published dashboard embedding is enabled, conditionally enabled, or disabled. Follows a read-modify-write workflow with etag-based optimistic concurrency control to prevent race conditions.

ActionTry it

Update AI/BI Dashboard Embedding Approved Domains

Tool to update the list of domains approved to host embedded AI/BI dashboards at the workspace level. Use when you need to modify the approved domains list. The approved domains list can only be modified when the current access policy is set to ALLOW_APPROVED_DOMAINS.

ActionTry it

Update Automatic Cluster Update Setting

Tool to update workspace automatic cluster update configuration with etag-based concurrency control. Use when you need to enable/disable automatic cluster updates, configure maintenance windows, or adjust restart behavior. Requires Enhanced Security Compliance SKU entitlement and admin access. If the setting is updated concurrently, the PATCH request fails with HTTP 409 requiring retry with fresh etag. Note: The enabled field can only be modified if the workspace has the necessary entitlement; other fields like maintenance_window and restart_even_if_no_updates_available can be configured regardless.

ActionTry it

Update Catalog Connection

Tool to update an existing Unity Catalog connection configuration. Use when you need to modify connection properties, credentials, ownership, or metadata for external data sources.

ActionTry it

Update Catalog Credential

Tool to update an existing Unity Catalog credential with new properties. Use when you need to modify credential configuration, ownership, or cloud provider settings. The caller must be the owner of the credential, a metastore admin, or have MANAGE permission on the credential. If the caller is a metastore admin, only the owner field can be changed.

ActionTry it

Update Catalog Function

Tool to update function owner in Unity Catalog. Use when you need to change the ownership of a catalog function. Only the owner of the function can be updated via this endpoint. The caller must be a metastore admin, the owner of the function's parent catalog, the owner of the parent schema with USE_CATALOG privilege, or the owner of the function with both USE_CATALOG and USE_SCHEMA privileges.

ActionTry it

Update Catalog Grants

Tool to update permissions for Unity Catalog securables by adding or removing privileges for principals. Use when you need to grant or revoke permissions on catalogs, schemas, tables, or other Unity Catalog objects. Only metastore admins, object owners, users with MANAGE privilege, or parent catalog/schema owners can update permissions.

ActionTry it
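
A minimal sketch of the grants delta update via PATCH /api/2.1/unity-catalog/permissions/{securable_type}/{full_name}; the table name and principal are placeholders:

```python
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

changes = {
    "changes": [{
        "principal": "data-analysts",   # placeholder group
        "add": ["SELECT"],
        "remove": ["MODIFY"],
    }]
}
# PATCH applies a delta: privileges not mentioned are left untouched.
r = requests.patch(f"{HOST}/api/2.1/unity-catalog/permissions/table/main.sales.orders",
                   headers=HEADERS, json=changes)
r.raise_for_status()
```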

Update Catalog Table

Tool to update Unity Catalog table properties. Use when you need to change the owner or comment of a table. The caller must be the owner of the parent catalog, have the USE_CATALOG privilege on the parent catalog and be the owner of the parent schema, or be the owner of the table and have the USE_CATALOG privilege on the parent catalog and the USE_SCHEMA privilege on the parent schema.

ActionTry it

Update Catalog Workspace Bindings

Tool to update workspace bindings for a Unity Catalog securable (catalog). Use when you need to control which workspaces can access a catalog. Allows adding or removing workspace bindings with read-write or read-only access. Caller must be a metastore admin or owner of the catalog.

ActionTry it

Update Cluster Policy Permissions

Tool to incrementally update permissions on a Databricks cluster policy. Use when you need to modify permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

ActionTry it

Update Compute Cluster

Tool to partially update a Databricks compute cluster configuration using field masks. Use when you need to update specific cluster attributes without providing a full configuration. The update_mask specifies which fields to modify. Running clusters restart to apply changes; terminated clusters apply changes on next startup.

ActionTry it

Update Dashboard Email Subscriptions Setting

Tool to update the Dashboard Email Subscriptions setting for the workspace with etag-based concurrency control. Use when you need to enable or disable whether dashboard schedules can send subscription emails. If the setting is updated concurrently, the PATCH request fails with HTTP 409 requiring retry with fresh etag.

ActionTry it

Update Databricks App

Tool to update an existing Databricks app configuration. Use when you need to modify app settings such as description, resources, compute size, budget policy, or API scopes. This is a partial update operation - only fields provided in the request will be updated, other fields retain their current values.

ActionTry it

Update Databricks App Permissions

Tool to incrementally update permissions for a Databricks app. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. For replacing all permissions, use SetPermissions instead.

ActionTry it

Update Databricks Job By ID

Tool to completely reset all settings for a Databricks job. Use when you need to overwrite all job configuration at once. Changes to timeout_seconds apply immediately to active runs; other changes apply to future runs only. Consider using the update endpoint for partial updates instead of reset to minimize disruption.

ActionTry it
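
A minimal sketch of the reset call; because reset overwrites the whole settings object, the sketch sends a complete new_settings block (all values are placeholders):

```python
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# reset replaces the ENTIRE settings object; omitted fields fall back to
# defaults, so fetch the current settings first if you only mean to change one value.
body = {
    "job_id": 123,
    "new_settings": {
        "name": "nightly-etl",
        "timeout_seconds": 7200,   # applies to active runs immediately, per the note above
        "tasks": [{
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Shared/etl"},
            "existing_cluster_id": "0000-000000-abcdefgh",   # placeholder cluster ID
        }],
    },
}
r = requests.post(f"{HOST}/api/2.1/jobs/reset", headers=HEADERS, json=body)
r.raise_for_status()
```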

Update Default Namespace Setting

Tool to update the default catalog namespace configuration for workspace queries with etag-based concurrency control. Use when you need to configure the default catalog used for queries without fully qualified three-level names. Requires a restart of clusters and SQL warehouses to take effect. Only applies to Unity Catalog-enabled compute. If concurrent updates occur, the request fails with 409 status requiring retry with fresh etag.

ActionTry it
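
The same etag-based read-modify-write loop applies to most of the workspace settings actions in this section. A sketch against what is assumed here to be the default-namespace settings resource (/api/2.0/settings/types/default_namespace_ws/names/default); the type name and body shape are assumptions worth verifying against the API reference:

```python
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
URL = f"{HOST}/api/2.0/settings/types/default_namespace_ws/names/default"

def set_default_namespace(catalog: str, retries: int = 3) -> None:
    for _ in range(retries):
        # Read first to obtain the current etag (may 404 if never set; hence allow_missing).
        get = requests.get(URL, headers=HEADERS)
        etag = get.json().get("etag", "") if get.ok else ""
        patch = requests.patch(URL, headers=HEADERS, json={
            "allow_missing": True,
            "field_mask": "namespace.value",
            "setting": {"etag": etag, "namespace": {"value": catalog}},
        })
        if patch.status_code != 409:   # 409 => concurrent update; loop to retry with a fresh etag
            patch.raise_for_status()
            return
    raise RuntimeError("could not update setting after retries")

set_default_namespace("main")   # placeholder catalog name
```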

Update Default Warehouse ID Setting

Tool to update the default SQL warehouse configuration for the workspace with etag-based concurrency control. Use when you need to configure which warehouse is used as the default for SQL operations and queries in the workspace. If concurrent updates occur, the request fails with 409 status requiring retry with fresh etag.

ActionTry it

Update Disable Legacy Access Setting

Tool to enable the workspace disable legacy access setting with optional etag-based concurrency control. Use when you need to enable (not disable) restrictions on legacy features including direct Hive Metastore access, external location fallback mode, and Databricks Runtime versions prior to 13.3LTS. Note: This setting can only be set to true (enabled), not false. The etag parameter is optional but recommended for preventing concurrent modification conflicts.

ActionTry it

Update Disable Legacy DBFS Setting

Tool to update the workspace disable legacy DBFS setting with etag-based concurrency control. Use when you need to enable or disable legacy DBFS features, including DBFS root access, mounts, and Databricks Runtime versions prior to 13.3 LTS. Changes take up to 20 minutes to take effect and require a manual restart of compute clusters and SQL warehouses.

ActionTry it

Update Enable Export Notebook

Tool to update workspace notebook and file export setting. Use when you need to enable or disable users' ability to export notebooks and files from the Workspace UI. Requires admin access.

ActionTry it

Update Enable Notebook Table Clipboard

Tool to update workspace setting for notebook table clipboard. Use when you need to enable or disable users' ability to copy tabular data from notebook result tables to clipboard. Requires workspace admin privileges.

ActionTry it

Update Enable Results Downloading

Tool to update workspace notebook results download setting. Use when you need to enable or disable users' ability to download notebook results. Requires admin access.

ActionTry it

Update Enhanced Security Monitoring

Tool to update enhanced security monitoring workspace settings. Use when you need to enable or disable Enhanced Security Monitoring (ESM) for the workspace. Requires the etag from a previous GET request for optimistic concurrency control.

ActionTry it

Update External Location

Tool to update the properties of an existing Unity Catalog external location. Use when you need to modify the cloud storage path, credentials, ownership, or configuration of an external location. The caller must be the owner of the external location or a metastore admin. Use the force parameter to update even if URL changes invalidate dependencies.

ActionTry it

Update External Metadata

Tool to update an external metadata object in Unity Catalog. Use when you need to modify metadata about external systems registered within Unity Catalog. The user must have metastore admin status, own the object, or possess the MODIFY privilege. Note that changing ownership requires the MANAGE privilege, and callers cannot update both the owner and other metadata in a single request.

ActionTry it

Update Genie Space

Tool to update an existing Genie space configuration. Use when you need to modify a Genie space's title, description, warehouse assignment, or complete serialized configuration. Supports partial updates (only provide fields you want to change) or full replacement via serialized_space. Useful for CI/CD pipelines, version control, and automated space management.

ActionTry it

Update Global Init Script

Tool to update a global initialization script in Databricks workspace. Use when you need to modify script content, name, enabled status, or execution order. All fields are optional; unspecified fields retain their current value. Existing clusters must be restarted to pick up changes.

ActionTry it
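
A minimal sketch of the PATCH call; the script content must be base64-encoded, and the script ID is a placeholder:

```python
import base64
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
SCRIPT_ID = "ABCD1234"   # placeholder script ID

script = "#!/bin/bash\necho 'hello from init' >> /tmp/init.log\n"
body = {
    "name": "log-on-startup",
    "script": base64.b64encode(script.encode()).decode(),   # content must be base64
    "enabled": True,
    # "position": 0,  # optional: execution order relative to other scripts
}
r = requests.patch(f"{HOST}/api/2.0/global-init-scripts/{SCRIPT_ID}",
                   headers=HEADERS, json=body)
r.raise_for_status()
```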

Update IAM Account Access Control Rule Set

Tool to update account-level access control rule set for service principals, groups, or budget policies. Use when you need to replace the entire set of access control rules for a resource. This is a PUT operation that replaces all existing roles - to preserve existing roles, they must be included in the grant_rules array.

ActionTry it

Update IAM Group V2

Tool to update an existing group in a Databricks workspace using the SCIM v2 protocol. This performs a complete replacement of the group resource. Use when you need to update group properties, members, entitlements, or roles. For partial updates, consider using PATCH instead.

ActionTry it

Update IAM Permissions

Tool to incrementally update permissions on Databricks workspace objects including dashboards, jobs, clusters, warehouses, notebooks, and more. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

ActionTry it
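
Contrast with the PUT-based Set actions earlier in this catalog: a PATCH to the same Permissions API merges rather than replaces. A minimal sketch against a notebook object (the ID and principal are placeholders):

```python
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

# PATCH merges: only the entries below are added or changed; all other ACL
# entries on the notebook keep their current permission levels.
body = {
    "access_control_list": [
        {"user_name": "analyst@example.com", "permission_level": "CAN_READ"},
    ]
}
notebook_id = "4567"   # placeholder workspace object ID
r = requests.patch(f"{HOST}/api/2.0/permissions/notebooks/{notebook_id}",
                   headers=HEADERS, json=body)
r.raise_for_status()
```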

Update IAM Service Principal V2

Tool to update an existing service principal in a Databricks workspace using the SCIM v2 protocol. This performs a complete replacement of the service principal resource (PUT operation). Use when you need to update service principal properties, group memberships, entitlements, or roles. Note: applicationId and id are immutable fields.

ActionTry it

Update IAM User V2

Tool to update a user in a Databricks workspace using the SCIM v2 protocol. This performs a complete replacement (PUT) of the user resource. Use when you need to update user properties including userName, displayName, active status, emails, entitlements, or roles. IMPORTANT LIMITATIONS:
- Groups cannot be updated via the workspace-level API; groups for account-level users are managed at the account level only.
- For partial updates (updating specific fields without replacing the entire resource), use the PATCH operation instead.
- The 'groups' parameter is included for response compatibility but is ignored in requests to avoid API errors.

ActionTry it
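
A minimal sketch of the SCIM PUT, sending the complete desired user state; the user ID and attributes are placeholders:

```python
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {
    "Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}",
    "Content-Type": "application/scim+json",
}
user_id = "8935703967"   # placeholder SCIM user ID

# PUT is a full replacement: send the complete desired state of the user.
body = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
    "userName": "jane@example.com",
    "displayName": "Jane Doe",
    "active": True,
    "entitlements": [{"value": "workspace-access"}],
}
r = requests.put(f"{HOST}/api/2.0/preview/scim/v2/Users/{user_id}",
                 headers=HEADERS, json=body)
r.raise_for_status()
```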

Update Instance Pool Permissions

Tool to incrementally update permissions on a Databricks instance pool. Use when you need to modify permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

ActionTry it

Update Lakeview Dashboard

Tool to update a draft Lakeview dashboard configuration and metadata. Use when you need to modify dashboard properties such as display name, warehouse, location, or content. This is a partial update operation - only provided fields will be updated. The etag field can be used for optimistic concurrency control to prevent conflicts from concurrent modifications.

ActionTry it

Update Legacy SQL Alert

Tool to update a legacy SQL alert configuration including name, query reference, trigger conditions, and notification settings. Use when you need to modify existing alerts using the legacy API endpoint. Note: This is a legacy endpoint that has been replaced by /api/2.0/sql/alerts and is deprecated.

ActionTry it

Update Legacy SQL Query

Tool to update an existing SQL query definition using the legacy API. Use when you need to modify queries with the legacy /preview/sql/queries endpoint. Note: This is a legacy/deprecated endpoint. The newer API uses PATCH /api/2.0/sql/queries/{id} instead.

ActionTry it

Update Legacy SQL Query Visualization

Tool to update a visualization in a SQL query using the legacy API. Use when you need to modify visualization properties such as name, description, type, and options. Note: This is a deprecated endpoint; users should migrate to the current queryvisualizations/update method. Databricks does not recommend modifying visualization settings in JSON.

ActionTry it

Update LLM Proxy Partner Powered Setting

Updates workspace-level setting that controls whether AI features are powered by partner-hosted models. Use to enable/disable partner-powered AI features (Azure OpenAI or Anthropic on Databricks). When disabled, Databricks-hosted models are used instead. IMPORTANT: You must first call DATABRICKS_SETTINGS_LLM_PROXY_PARTNER_POWERED_WORKSPACE_GET to obtain the current etag before updating. If concurrent updates occur, request fails with 409 status - retry with the fresh etag from the 409 response.

ActionTry it

Update Marketplace Consumer Installation

Tool to update marketplace consumer installation fields and rotate tokens for marketplace listings. Use when you need to modify installation attributes or refresh access credentials. The token will be rotated if the rotate_token flag is true.

ActionTry it

Update Metastore

Tool to update configuration settings for an existing Unity Catalog metastore. Use when you need to modify metastore properties like name, owner, Delta Sharing settings, or storage credentials. Requires metastore admin permissions.

ActionTry it

Update Metastore Assignment

Tool to update a metastore assignment for a workspace. Use when you need to update the metastore_id or default_catalog_name for a workspace that already has a metastore assigned. Account admin privileges are required to update metastore_id, while workspace admin can update default_catalog_name.

ActionTry it

Update ML Experiment

Tool to update MLflow experiment metadata, primarily for renaming experiments. Use when you need to rename an existing experiment. The new experiment name must be unique across all experiments in the workspace.

ActionTry it

Update ML Experiment Permissions

Tool to incrementally update permissions for an MLflow experiment. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

ActionTry it

Update ML Experiment Run

Tool to update MLflow run metadata including status, end time, and run name. Use when a run's status changes outside normal execution flow or when you need to rename a run. This endpoint allows you to modify a run's metadata after it has been created.

ActionTry it

Update Model Version

Tool to update a Unity Catalog model version. Use when you need to modify the comment of a specific model version. Currently only the comment field can be updated. The caller must be a metastore admin or owner of the parent registered model with appropriate catalog and schema privileges.

ActionTry it

Update Notification Destination

Tool to update an existing notification destination configuration. Use when you need to modify display name or configuration settings for email, Slack, PagerDuty, Microsoft Teams, or webhook destinations. Requires workspace admin permissions. At least one field (display_name or config) must be provided.

ActionTry it

Update Pipeline Permissions

Tool to incrementally update permissions on a Databricks pipeline. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. Pipelines can inherit permissions from their root object.

ActionTry it

Update Restrict Workspace Admins Setting

Tool to update the restrict workspace admins setting with etag-based concurrency control. Use when you need to modify workspace administrator capabilities for service principal token creation and job ownership/run-as settings. Requires account admin permissions and workspace membership. If concurrent updates occur, the request fails with HTTP 409 requiring retry with fresh etag.

ActionTry it

Update Serving Endpoint AI Gateway

Tool to update AI Gateway configuration of a Databricks serving endpoint. Use when you need to configure traffic fallback, AI guardrails, payload logging, rate limits, or usage tracking. Supports external model, provisioned throughput, and pay-per-token endpoints; agent endpoints currently only support inference tables.

ActionTry it

Update Serving Endpoint Rate Limits

Tool to update rate limits for a Databricks serving endpoint. Use when you need to control the number of API calls allowed within a time period. Note: This endpoint is deprecated; consider using AI Gateway for rate limit management instead.

ActionTry it

Update Share

Tool to update an existing share in Unity Catalog with changes to metadata or data objects. Use when you need to modify share properties (comment, owner, name) or manage shared data objects (add, remove, or update tables/views/volumes). The caller must be a metastore admin or the owner of the share. For table additions, the owner must have SELECT privilege on the table.

ActionTry it

Update Sharing Provider

Tool to update an existing Delta Sharing authentication provider in Unity Catalog. Use when you need to modify provider properties like comment, owner, or name. The caller must be either a metastore admin or the owner of the provider. To rename the provider, the caller must be BOTH a metastore admin AND the owner.

ActionTry it

Update SQL Alert

Tool to update an existing Databricks SQL alert using partial update with field mask. Use when you need to modify alert properties including display name, query reference, trigger conditions, notification settings, or ownership.

ActionTry it

Update SQL Dashboard

Tool to update legacy Databricks SQL dashboard attributes (name, run_as_role, tags). Use when you need to modify dashboard metadata. Note: This operation only affects dashboard object attributes and does NOT add, modify, or remove widgets.

ActionTry it

Update SQL Query

Tool to update a saved SQL query object in Databricks using partial field updates. Use when you need to modify specific fields of an existing query without replacing the entire object. Requires update_mask parameter to specify which fields to update. Supports updating query text, configuration, parameters, and metadata.

ActionTry it
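
A minimal sketch, assuming the update_mask travels in the request body alongside the query object (worth verifying against the current API reference); the query ID is a placeholder:

```python
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
QUERY_ID = "ae25e731-0000-0000-0000-000000000000"   # placeholder query ID

body = {
    # Only fields named in update_mask are touched; everything else is preserved.
    "update_mask": "query_text,display_name",
    "query": {
        "display_name": "Daily revenue",
        "query_text": "SELECT order_date, sum(amount) FROM main.sales.orders GROUP BY 1",
    },
}
r = requests.patch(f"{HOST}/api/2.0/sql/queries/{QUERY_ID}", headers=HEADERS, json=body)
r.raise_for_status()
```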

Update SQL Query Visualization

Tool to update an existing Databricks SQL query visualization using partial update with field mask. Use when you need to modify visualization properties. Only two fields are updatable: display_name (the name shown in UI) and type (TABLE, CHART, COUNTER, FUNNEL, or PIVOT). The visualization type can be changed to reorganize how query results are displayed.

ActionTry it

Update SQL Results Download Setting

Tool to update workspace SQL results download setting controlling whether users can download results from SQL Editor and AI/BI Dashboards. Use when you need to enable or disable SQL query results download capability. Requires workspace admin access and uses etag-based optimistic concurrency control to prevent conflicting updates.

ActionTry it

Update SQL Warehouse Permissions

Tool to incrementally update permissions for a Databricks SQL warehouse. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. For replacing all permissions, use SetPermissions instead.

ActionTry it

Update Storage Credential

Tool to update an existing storage credential in Unity Catalog. Use when you need to modify credential properties, cloud provider configuration, or ownership. The caller must be the owner of the storage credential or a metastore admin. Metastore admins can only modify the owner field.

ActionTry it

Update Tag Policy

Tool to update an existing tag policy (governed tag) with specified fields. Use when you need to modify tag policy properties like description, tag key, or allowed values. Users must have MANAGE permission on the governed tag to edit it.

ActionTry it

Update Token Management Permissions

Tool to incrementally update permissions for personal access token management. Use when you need to modify who can create and use personal access tokens. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request.

ActionTry it

Update Workspace Git Credentials

Tool to update existing Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to modify Git provider credentials, email, username, or access tokens. Note that the git_provider field cannot be changed after initial creation. For azureDevOpsServicesAad provider, do not specify personal_access_token or git_username.

ActionTry it

Update Workspace Repo

Tool to update a workspace repo to a different branch or tag. Use when you need to switch branches, pull latest changes, or update sparse checkout settings. When updating to a tag, the repo enters a detached HEAD state and must be switched back to a branch before committing.

ActionTry it
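
A minimal sketch of the branch/tag switch via PATCH /api/2.0/repos/{repo_id}; the repo ID is a placeholder:

```python
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
REPO_ID = "5249608814"   # placeholder repo ID

# Checking out a branch also pulls its latest commit.
r = requests.patch(f"{HOST}/api/2.0/repos/{REPO_ID}",
                   headers=HEADERS, json={"branch": "main"})
r.raise_for_status()

# Checking out a tag instead leaves the repo in a detached HEAD state:
# requests.patch(f"{HOST}/api/2.0/repos/{REPO_ID}", headers=HEADERS, json={"tag": "v1.2"})
```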

Update Workspace Repo Permissions

Tool to incrementally update permissions on a Databricks workspace repository. Use when you need to modify specific permissions for users, groups, or service principals without replacing the entire permission set. This PATCH operation only modifies the permissions specified while leaving other existing permissions intact. Repos can inherit permissions from their root object.

ActionTry it

Upsert Data Vector Index

Tool to upsert (insert or update) data into a Direct Vector Access Index. Use when you need to manually add or update vectors in a Databricks vector search index. The index must be a Direct Vector Access Index type where updates are managed via REST API or SDK calls. Data structure must match the schema defined when the index was created, including the primary key field.

ActionTry it
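
A minimal sketch of the upsert call, assuming the body carries the rows as a JSON-encoded string under inputs_json; the index name and row schema are placeholders that must match the schema defined at index creation:

```python
import json
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}
INDEX = "main.default.docs_index"   # placeholder Direct Vector Access index

# Rows must match the index schema, including the primary key and the
# embedding column; inputs_json is a JSON-encoded string, not a JSON array.
rows = [
    {"id": "doc-1", "text": "hello world", "embedding": [0.1, 0.2, 0.3]},
    {"id": "doc-2", "text": "goodbye",     "embedding": [0.4, 0.5, 0.6]},
]
r = requests.post(f"{HOST}/api/2.0/vector-search/indexes/{INDEX}/upsert-data",
                  headers=HEADERS, json={"inputs_json": json.dumps(rows)})
r.raise_for_status()
```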

Validate Catalog Credential

Tool to validate a Unity Catalog credential for external access. Use when you need to verify that a credential can successfully perform its intended operations. For SERVICE credentials, validates cloud service access. For STORAGE credentials, tests READ, WRITE, DELETE, LIST operations on the specified location.

ActionTry it

Validate Storage Credential

Tool to validate a storage credential configuration for Unity Catalog. Use when you need to verify that a storage credential can successfully access a cloud storage location. Requires metastore admin, storage credential owner, or CREATE_EXTERNAL_LOCATION privilege.

ActionTry it