This article is an overview of your options for creating and managing workspaces.
What is a workspace?
A workspace is an Azure Databricks deployment in a cloud service account. It provides a unified environment for a specified set of users to work with Azure Databricks assets.
There are two types of Databricks workspaces available:
- Serverless workspaces (Public Preview): A workspace deployment in your Databricks account that comes pre-configured with serverless compute and default storage to provide a completely serverless experience. You can still connect to your cloud storage from serverless workspaces.
- Classic workspaces: A workspace deployment in your Databricks account that provisions storage and compute resources in your existing cloud account. Serverless compute is still available in classic workspaces.
Requirements
Before you create an Azure Databricks workspace, you must have an Azure subscription that isn't a Free Trial Subscription.
If you have a free account, complete the following steps:
- Go to your profile and change your subscription to pay-as-you-go. See Azure free account.
- Remove the spending limit.
- Request a quota increase for vCPUs in your region.
Required Azure permissions
To create an Azure Databricks workspace, you must be one of the following:
- A user with the Azure Contributor or Owner role at the subscription level.
- A user with a custom role definition that has the following list of permissions:
  - Microsoft.Databricks/workspaces/*
  - Microsoft.Resources/subscriptions/resourceGroups/read
  - Microsoft.Resources/subscriptions/resourceGroups/write
  - Microsoft.Databricks/accessConnectors/*
  - Microsoft.Compute/register/action
  - Microsoft.ManagedIdentity/register/action
  - Microsoft.Storage/register/action
  - Microsoft.Network/register/action
  - Microsoft.Resources/deployments/validate/action
  - Microsoft.Resources/deployments/write
  - Microsoft.Resources/deployments/read
Note
The Microsoft.Compute/register/action, Microsoft.ManagedIdentity/register/action, Microsoft.Storage/register/action, and Microsoft.Network/register/action permissions aren't required if these providers are already registered in the subscription. See Register resource provider.
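If you want to grant these permissions through a custom role rather than Contributor or Owner, the following Python sketch builds a standard Azure custom role definition from the list above. The role name, description, and subscription ID are hypothetical placeholders; the permission strings come from the list in this article.

```python
import json

subscription_id = "00000000-0000-0000-0000-000000000000"  # hypothetical placeholder

# Standard Azure custom role definition covering the permissions listed above.
role_definition = {
    "Name": "Databricks Workspace Deployer",  # hypothetical role name
    "IsCustom": True,
    "Description": "Minimum permissions to create an Azure Databricks workspace.",
    "Actions": [
        "Microsoft.Databricks/workspaces/*",
        "Microsoft.Resources/subscriptions/resourceGroups/read",
        "Microsoft.Resources/subscriptions/resourceGroups/write",
        "Microsoft.Databricks/accessConnectors/*",
        "Microsoft.Compute/register/action",
        "Microsoft.ManagedIdentity/register/action",
        "Microsoft.Storage/register/action",
        "Microsoft.Network/register/action",
        "Microsoft.Resources/deployments/validate/action",
        "Microsoft.Resources/deployments/write",
        "Microsoft.Resources/deployments/read",
    ],
    "NotActions": [],
    "AssignableScopes": [f"/subscriptions/{subscription_id}"],
}

# Write the definition to a file for use with the Azure CLI.
with open("databricks-deployer-role.json", "w") as f:
    json.dump(role_definition, f, indent=2)
```

You can then create the role with `az role definition create --role-definition @databricks-deployer-role.json` and assign it to the user who creates the workspace.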
Choosing a workspace type
The following sections describe which workspace type is best for common use cases. Use these recommendations to decide whether to deploy a serverless or a classic workspace.
When to choose serverless workspaces
Serverless workspaces are the best choice for the following use cases:
- Enable business users to access Databricks One
- Create AI/BI dashboards
- Create Databricks Apps
- Perform exploratory analytics using notebooks or SQL warehouses
- Connect to SaaS providers via Lakehouse Federation (but not Lakeflow Connect)
- Use Genie Spaces for business use cases
- Test new Mosaic AI features before moving them into production
- Create serverless Lakeflow Spark Declarative Pipelines
When to choose classic workspaces
Classic workspaces are the best choice for the following use cases:
- Do AI or ML development work that requires GPUs
- Use Databricks Runtime for Machine Learning or Apache Spark MLlib
- Port over existing legacy Spark code that uses Spark RDDs
- Use Scala or R as your primary coding language
- Stream data that requires default or time-based trigger intervals
- Connect to the Databricks APIs over an Azure Private Link connection
- Connect to on-premises systems or private databases directly, through Lakeflow Connect
Workspace creation options
There are multiple ways to deploy an Azure Databricks workspace. The standard deployment method is through the Azure Portal or Terraform.
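If you prefer to script deployments, the following is a minimal sketch using the Azure SDK for Python (the azure-identity and azure-mgmt-databricks packages). The subscription ID, resource group, workspace name, and region are hypothetical placeholders, and model names can vary across SDK versions.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.databricks import AzureDatabricksManagementClient
from azure.mgmt.databricks.models import Sku, Workspace

subscription_id = "00000000-0000-0000-0000-000000000000"  # hypothetical placeholder
resource_group = "my-resource-group"                      # hypothetical placeholder
workspace_name = "my-databricks-workspace"                # hypothetical placeholder

# DefaultAzureCredential picks up Azure CLI, managed identity, or environment credentials.
client = AzureDatabricksManagementClient(DefaultAzureCredential(), subscription_id)

# Create (or update) a classic workspace. The managed resource group holds the
# workspace's managed resources and must not already exist.
poller = client.workspaces.begin_create_or_update(
    resource_group_name=resource_group,
    workspace_name=workspace_name,
    parameters=Workspace(
        location="eastus2",
        sku=Sku(name="premium"),
        managed_resource_group_id=(
            f"/subscriptions/{subscription_id}/resourceGroups/{workspace_name}-managed-rg"
        ),
    ),
)

workspace = poller.result()  # blocks until the deployment completes
print(f"Workspace {workspace.name} provisioned in {workspace.location}")
```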
Additionally, you can create workspaces using the following tools: