Databricks

Databricks Lakeflow GA (2025): Good-bye DLT, Hello Unified Data Engineering

Data & AI Insights Collective · Jun 29, 2025
3 min read

Why another name change?

“I already have Delta Tables—why do I need Delta Live Tables?” If you’ve ever been asked (or asked yourself) that question, you’re not alone. Community threads and LinkedIn rants have been peppered with engineers conflating the storage layer (Delta Table) with the pipeline service (Delta Live Tables).

Databricks has seized GA as the perfect time to re-brand: DLT is now Lakeflow Declarative Pipelines. The fresh Lakeflow umbrella groups ingestion (Connect), transformation (Declarative Pipelines) and orchestration (Jobs) into one product—removing two big pain points:

  1. Naming confusion (DLT ≠ a table).
  2. Tool sprawl (Auto Loader + DLT + Workflows meant three separate SKUs).

Quick-look table — what actually changed?

| Layer / Term | Before (Jan 2025) | Now (June 2025 GA) | What it means for you |
| --- | --- | --- | --- |
| Storage | Delta Table | Delta Table | Unchanged ACID lakehouse tables |
| Transformation | Delta Live Tables (DLT) | Lakeflow Declarative Pipelines | Same declarative SQL/Python syntax, now inside the Lakeflow UI |
| Ingestion | Auto Loader + Partner Connect bits | Lakeflow Connect | 40+ managed connectors + high-throughput Zerobus API (100 MB/s, 5 s latency) |
| Orchestration | Workflows | Lakeflow Jobs | Jobs UI renamed; adds loops, table-update triggers & default serverless runs |
| New goodies | — | IDE for Data Engineering, Auto CDC, Lakeflow Designer (no-code) | Faster debugging, zero-code option for analysts, simpler CDF pipelines |

Deep dive into the GA feature set

Lakeflow Connect: plug-and-play ingestion

  • 40+ GA connectors: Salesforce, Workday, SharePoint, SQL Server, Oracle, Postgres, Redshift, BigQuery, SFTP and more. (databricks.com)
  • Zerobus write API streams events straight into Unity Catalog at ~100 MB/s with 5 s end-to-lake latency.
  • Source-aware CDC optimisations so you don’t babysit log positions. (For contrast, a sketch of the hand-rolled ingestion these connectors replace follows this list.)
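
To appreciate what the managed connectors remove, here is a minimal sketch of the hand-rolled pattern many teams run today: Auto Loader feeding a table inside a declarative pipeline. The paths and table names are illustrative only; Lakeflow Connect aims to make this boilerplate unnecessary for SaaS and database sources.

```python
# Hand-rolled "before" ingestion: Auto Loader inside a declarative pipeline.
# `spark` is provided by the pipeline runtime; paths below are hypothetical.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders landed from cloud storage via Auto Loader")
def raw_orders():
    return (
        spark.readStream.format("cloudFiles")           # Auto Loader source
        .option("cloudFiles.format", "json")            # incoming file format
        .option("cloudFiles.schemaLocation", "/Volumes/main/default/_schemas/orders")
        .load("/Volumes/main/default/landing/orders/")  # hypothetical landing path
        .withColumn("_ingested_at", col("_metadata.file_modification_time"))
    )
```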

Lakeflow Declarative Pipelines: DLT 2.0

  • Same CREATE LIVE TABLE DSL, now built on Spark Declarative Pipelines—an open standard you can run anywhere Spark runs. (databricks.com)
  • New IDE for Data Engineering: side-by-side code & DAG, inline data previews, Git integration and an AI copilot. (databricks.com)
  • New AUTO CDC … syntax in SQL and create_auto_cdc_flow() in Python replace the manual APPLY CHANGES pattern (see the sketch after this list).
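
Here is a minimal Python sketch of an Auto CDC flow. Databricks names create_auto_cdc_flow(); the parameter names below are assumed to mirror the long-standing dlt.apply_changes() API (keys, sequence_by, SCD type), and the source/target table names are made up, so treat this as illustrative rather than canonical.

```python
# A minimal Auto CDC sketch; parameter names assumed to mirror dlt.apply_changes().
import dlt

# Target streaming table that the CDC flow keeps up to date.
dlt.create_streaming_table("orders_silver")

dlt.create_auto_cdc_flow(            # replaces the manual APPLY CHANGES pattern
    target="orders_silver",          # table declared above
    source="raw_orders_cdc",         # upstream change feed (hypothetical name)
    keys=["order_id"],               # primary key used to match rows
    sequence_by="commit_timestamp",  # ordering column for late/out-of-order events
    stored_as_scd_type=1,            # overwrite in place (SCD type 1)
)
```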

Lakeflow Jobs: modernised orchestration

  • Everything Workflows did, plus loops, conditionals, parameter-scoped retries and table-update triggers (a rough API sketch follows this list).
  • Serverless performance mode GA: 3–5× faster cold starts, or cost-optimised mode if pennies matter.
  • Unified “Jobs & Pipelines” left-nav entry—no more hunting for two icons.
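
For the curious, here is a rough sketch of wiring a table-update trigger through the existing Jobs REST API (/api/2.1/jobs/create). The endpoint itself predates Lakeflow; the exact trigger field names ("table_update", "table_names") and the pipeline ID placeholder are assumptions on my part, so verify them against the current API reference before copying.

```python
# Sketch: create a job that runs a pipeline whenever an upstream table is updated.
# Trigger field names are assumed; check the Jobs API reference for the final shape.
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]  # PAT or OAuth token

job_spec = {
    "name": "refresh-orders-on-table-update",
    "tasks": [
        {
            "task_key": "run_orders_pipeline",
            "pipeline_task": {"pipeline_id": "<your-pipeline-id>"},  # placeholder
        }
    ],
    # Fire the job when the source table receives new commits (assumed field names).
    "trigger": {
        "table_update": {"table_names": ["main.sales.raw_orders"]}
    },
}

resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=job_spec,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])
```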

Lakeflow Designer: drag-and-drop pipelines (Preview)

Data analysts can build pipelines visually; the canvas spits out the same Declarative Pipelines code your engineers already trust.

Migration in three steps

  1. Upgrade your workspace – The GA switch is opt-in; click Enable Lakeflow in the admin console. Pipelines re-index under the new UI instantly. (docs.databricks.com)
  2. Leave your code alone – Notebooks, Terraform and REST calls with /dlt/ endpoints keep working for at least 12 months (Databricks deprecation SLA).
  3. Adopt the new bits gradually – Try AUTO CDC, move ingestion to Zerobus, or let ops run Jobs in serverless mode first.

Pricing & availability

  • Included SKU: Everything lives in the existing Data Engineering edition—no extra licence.

  • Metered extras:

    • Zerobus event ingest — DBUs per GiB.
    • Serverless Jobs / Pipelines — DBUs per compute-second.
  • Clouds & regions: GA on AWS, Azure and GCP as of 12 June 2025.

Common questions

Will my dlt REST endpoints break? Not until June 2026 at the earliest. Databricks promises a one-year notice before deleting any alias.

Can I still run dbt? Yes—Lakeflow Jobs orchestrates dbt tasks right next to notebooks, SQL and pipeline runs.

Do I need to learn a new syntax? No. All existing SQL/Python for DLT works; new features like AUTO CDC are additive.

Best-practice checklist

  • Rename gradually – Update internal docs and dashboards to say “Lakeflow Declarative Pipelines” so newcomers don’t revert to “DLT”.
  • Enable Unity Catalog lineage – Lakeflow pipes rich lineage back to UC; surfacing that from day one pays off during audits.
  • Benchmark serverless – In tests, serverless Jobs cut orchestration latency by 60-80 % for bursty DAGs; worth flipping the toggle.
  • Leverage Auto CDC – Swap APPLY CHANGES for AUTO CDC to halve code and get built-in replay safety.

Tecyfy Takeaway

Lakeflow is more than a cosmetic rename; it’s Databricks’ answer to a decade of fragmented data-engineering stacks. You get:

  • One SKU, one UI, one lineage graph from file ingestion to scheduled job.
  • A modern IDE and drag-and-drop Designer that make pipelines first-class citizens.
  • Backwards compatibility that respects every line of DLT code you’ve already shipped.

And—perhaps best of all—the next time someone asks whether a Delta Live Table is “just another table format,” you can simply smile and point them to Lakeflow Declarative Pipelines. Problem solved.

Happy flowing!
