

Overview

Databricks Connect is a client library for Databricks Runtime. It allows you to write jobs using Spark APIs and run them remotely on an Azure Databricks cluster instead of in the local Spark session. For example, when you run the DataFrame command spark.read.format("parquet").load(...).groupBy(...).agg(...).show() using Databricks Connect, the parsing and planning of the job runs on your local machine.

Databricks recommends that you use dbx by Databricks Labs for local development instead of Databricks Connect.

Databricks Connect allows you to connect your favorite IDE (Eclipse, IntelliJ, PyCharm, RStudio, Visual Studio Code), notebook server (Jupyter Notebook, Zeppelin), and other custom applications to Azure Databricks clusters.

This article explains how Databricks Connect works, walks you through the steps to get started with Databricks Connect, explains how to troubleshoot issues that may arise when using Databricks Connect, and describes the differences between running with Databricks Connect and running in an Azure Databricks notebook.
