Databricks Lakeflow GA (2025): Goodbye DLT, Hello Unified Data Engineering
A collaborative team of Data Engineers, Data Analysts, Data Scientists, AI researchers, and industry experts delivering concise insights and the latest trends in data and AI.
Why another name change?
“I already have Delta Tables—why do I need Delta Live Tables?” If you’ve ever been asked (or asked yourself) that question, you’re not alone. Community threads and LinkedIn rants have been peppered with engineers conflating the storage layer (Delta Table) with the pipeline service (Delta Live Tables).
Databricks has seized GA as the perfect time to re-brand: DLT is now Lakeflow Declarative Pipelines. The fresh Lakeflow umbrella groups ingestion (Connect), transformation (Declarative Pipelines) and orchestration (Jobs) into one product—removing two big pain points:
- Naming confusion (DLT ≠ a table).
- Tool sprawl (Auto Loader + DLT + Workflows meant three separate SKUs).
Quick-look table — what actually changed?
| Layer / Term | Before (Jan 2025) | Now (June 2025 GA) | What it means for you |
|---|---|---|---|
| Storage | Delta Table | Delta Table | Unchanged ACID lakehouse tables |
| Transformation | Delta Live Tables (DLT) | Lakeflow Declarative Pipelines | Same declarative SQL/Python syntax, now inside Lakeflow UI |
| Ingestion | Auto Loader + Partner Connect bits | Lakeflow Connect | 40+ managed connectors + high-throughput Zerobus API (100 MB/s, 5 s latency) |
| Orchestration | Workflows | Lakeflow Jobs | Jobs UI renamed, adds loops, table-update triggers & default serverless runs |
| New goodies | — | IDE for Data Engineering, Auto CDC, Lakeflow Designer (no-code) | Faster debugging, zero-code option for analysts, simpler CDF pipelines |
Deep dive into the GA feature set
Lakeflow Connect — plug-and-play ingestion
- 40+ GA connectors: Salesforce, Workday, SharePoint, SQL Server, Oracle, Postgres, Redshift, BigQuery, SFTP and more. (databricks.com)
- Zerobus write API streams events straight into Unity Catalog at ~100 MB/s with 5 s end-to-lake latency.
- Source-aware CDC optimisations so you don’t babysit log positions.
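Managed connectors are configured point-and-click, so there is little code to show for them; for file drops, though, the underlying mechanics are still Auto Loader. Here is a minimal sketch of incremental file ingestion feeding a declarative pipeline; the bucket paths and table name are placeholders, not anything taken from the Lakeflow docs:

```python
import dlt

# 'spark' is injected automatically inside a Declarative Pipelines notebook.

@dlt.table(comment="Raw events landed incrementally via Auto Loader")
def raw_events():
    return (
        spark.readStream.format("cloudFiles")              # Auto Loader source
        .option("cloudFiles.format", "json")                # format of the drop zone
        .option("cloudFiles.schemaLocation",
                "s3://my-bucket/_schemas/raw_events")        # placeholder path
        .load("s3://my-bucket/events/")                      # placeholder landing path
    )
```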
Lakeflow Declarative Pipelines — DLT 2.0
- Same `CREATE LIVE TABLE` DSL, now built on Spark Declarative Pipelines—an open standard you can run anywhere Spark runs. (databricks.com)
- New IDE for Data Engineering: side-by-side code & DAG, inline data previews, Git integration and an AI copilot. (databricks.com)
- `AUTO CDC …` SQL / `create_auto_cdc_flow()` Python replace manual `APPLY CHANGES`.
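To make the "same syntax" claim concrete, here is a minimal Python sketch in the existing DLT dialect; the `raw_orders` source table and the `amount` column are hypothetical:

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Cleaned orders; the engine builds the DAG and manages state")
@dlt.expect_or_drop("valid_amount", "amount > 0")   # declarative data-quality rule
def orders_clean():
    return (
        dlt.read_stream("raw_orders")                # upstream table in the same pipeline
        .withColumn("ingested_at", F.current_timestamp())
    )
```

The same definition keeps working after the rename; only the UI it runs in changes.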
Lakeflow Jobs — modernised orchestration
- Everything Workflows did, plus loops, conditionals, parameter-scoped retries and table-update triggers.
- Serverless performance mode GA: 3–5× faster cold starts, or cost-optimised mode if pennies matter.
- Unified “Jobs & Pipelines” left-nav entry—no more hunting for two icons.
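Most of this is UI, but table-update triggers are also scriptable. As a rough sketch with the Databricks Python SDK (the job name, notebook path and table are placeholders, and the trigger class names are my assumption; check them against your SDK version):

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # picks up credentials from the environment or a config profile

# Hypothetical job that re-runs a notebook whenever the named table is updated.
# TableUpdateTriggerConfiguration is assumed to be the SDK class behind the
# REST "table_update" trigger; verify the exact name in your SDK release.
created = w.jobs.create(
    name="orders-refresh",
    tasks=[
        jobs.Task(
            task_key="refresh",
            notebook_task=jobs.NotebookTask(notebook_path="/Repos/team/refresh"),
        )
    ],
    trigger=jobs.TriggerSettings(
        table_update=jobs.TableUpdateTriggerConfiguration(
            table_names=["main.sales.orders_clean"]
        )
    ),
)
print(created.job_id)
```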
Lakeflow Designer — drag-and-drop pipelines (Preview)
Data analysts can build pipelines visually; the canvas spits out the same Declarative Pipelines code your engineers already trust.
Migration in three steps
- Upgrade your workspace – The GA switch is opt-in; click Enable Lakeflow in the admin console. Pipelines re-index under the new UI instantly. (docs.databricks.com)
- Leave your code alone – Notebooks, Terraform and REST calls with `/dlt/` endpoints keep working for at least 12 months (Databricks deprecation SLA).
- Adopt the new bits gradually – Try `AUTO CDC`, move ingestion to Zerobus, or let ops run Jobs in serverless mode first.
Pricing & availability
- Included SKU: Everything lives in the existing Data Engineering edition—no extra licence.
- Metered extras:
  - Zerobus event ingest — DBUs per GiB.
  - Serverless Jobs / Pipelines — DBUs per compute-second.
- Clouds & regions: GA on AWS, Azure and GCP as of 12 June 2025.
Common questions
Will my `dlt` REST endpoints break?
Not until June 2026 at the earliest. Databricks promises a one-year notice before deleting any alias.
Can I still run dbt?
Yes—Lakeflow Jobs orchestrates dbt tasks right next to notebooks, SQL and pipeline runs.
Do I need to learn a new syntax?
No. All existing SQL/Python for DLT works; new features like `AUTO CDC` are additive.
Best-practice checklist
- Rename gradually – Update internal docs and dashboards to say “Lakeflow Declarative Pipelines” so newcomers don’t revert to “DLT”.
- Enable Unity Catalog lineage – Lakeflow pipes rich lineage back to UC; surfacing that from day one pays off during audits.
- Benchmark serverless – In tests, serverless Jobs cut orchestration latency by 60-80 % for bursty DAGs; worth flipping the toggle.
- Leverage Auto CDC – Swap `APPLY CHANGES` for `AUTO CDC` to halve code and get built-in replay safety.
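For the Auto CDC swap, a minimal before/after sketch in Python; the table names and keys are placeholders, and I'm assuming `create_auto_cdc_flow()` keeps the same arguments as the `apply_changes()` call it replaces:

```python
import dlt

dlt.create_streaming_table("customers")     # target table the CDC flow maintains

# Before: dlt.apply_changes(target=..., source=..., keys=..., sequence_by=...)
# After (GA); argument names assumed to mirror apply_changes():
dlt.create_auto_cdc_flow(
    target="customers",             # streaming table declared above
    source="customers_cdc_feed",    # placeholder change-feed view or table
    keys=["customer_id"],           # primary key used to merge changes
    sequence_by="updated_at",       # ordering column for late or out-of-order events
)
```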
Tecyfy Takeaway
Lakeflow is more than a cosmetic rename; it’s Databricks’ answer to a decade of fragmented data-engineering stacks. You get:
- One SKU, one UI, one lineage graph from file ingestion to scheduled job.
- A modern IDE and drag-and-drop Designer that make pipelines first-class citizens.
- Backwards compatibility that respects every line of DLT code you’ve already shipped.
And—perhaps best of all—the next time someone asks whether a Delta Live Table is “just another table format,” you can simply smile and point them to Lakeflow Declarative Pipelines. Problem solved.
Happy flowing!