
Key Takeaways from Wednesday’s Keynote at the Databricks Data + AI Summit 2025

By Kate Huber

A recap for data leaders shaping the future of AI.

At the 2025 Data + AI Summit, Databricks outlined not just its product roadmap but a cohesive vision for how organizations like yours can move faster, govern smarter, and deliver value with AI more effectively.

With over 22,000 attendees packed into San Francisco's Moscone Center—and more than 65,000 joining virtually from across the globe—Wednesday’s keynote wasn't just a product update. It was a masterclass in building a future-proof AI stack. This wasn’t about theory or trends. It was about practical moves your team can make today to unlock AI at an enterprise scale.


The keynote featured insights from thought leaders across the data and AI ecosystem, including:

  • Ali Ghodsi, Co-Founder and CEO, Databricks
  • Hanlin Tang, CTO, Neural Networks, Databricks
  • Justin DeBrabant, Director, Product Management, Databricks
  • Kasey Uhlenhuth, Director, Product, Mosaic AI Platform, Databricks
  • Reynold Xin, Co-Founder and Chief Architect, Databricks
  • Jamie Dimon, Chairman and CEO, JPMorgan Chase
  • Dario Amodei, Co-Founder and CEO, Anthropic
  • Holly Smith, Staff Developer Advocate, Databricks
  • Nikita Shamgunov, CEO, Neon
  • Richard Masters, VP Data & AI, Virgin Atlantic Airways
  • Greg Ulrich, Chief AI & Data Officer, Mastercard


Let’s walk through what Databricks unveiled and how your organization can leverage that momentum to drive business value.

AI at the Core of the Modern Data Stack

Ali Ghodsi opened with a message that resonated clearly: AI is now the native behavior of a modern data platform. Databricks isn’t just offering tools. It’s positioning itself as the operating system for enterprise AI.

“We’re not just talking about making AI accessible. We’re building the foundation, so it works at enterprise scale.” – Ali Ghodsi

Ghodsi emphasized that unifying data and AI into a single platform eliminates the historical tradeoffs between performance, governance, and innovation. This architectural convergence reduces latency and complexity for teams working across data pipelines, machine learning (ML) models, and large language models (LLMs), resulting in better time-to-value.

He also pointed out that enterprise adoption isn’t limited to tech-first companies. Every industry—financial services, healthcare, and travel—is accelerating its transformation by placing AI at the center of its data stack.

This shift isn’t theoretical. It’s happening now as companies consolidate their tech stacks and embed AI in core workflows—from forecasting and planning to customer experience and risk modeling.

Ghodsi reflected on the journey from Apache Spark to the current AI-native Lakehouse vision, emphasizing that the architectural foundation matters now more than ever. He pointed out that while enterprises have long struggled to unify their data systems, the convergence of data engineering, analytics, and AI finally makes that goal achievable.

Mosaic AI: Production-Ready GenAI

Hanlin Tang provided a behind-the-scenes look at Mosaic AI’s neural architecture strategy, showcasing how Databricks is optimizing LLM deployment to balance latency, cost, and accuracy. With structured workflows, observability baked in, and scalable agent management, Mosaic AI turns LLM pipelines into secure, governed assets.

The introduction of Agent Bricks allows developers to compose multi-step AI workflows natively inside Databricks. This means you can automate retrieval, summarization, classification, and action-triggering—all from within a governed, traceable framework.

What sets Mosaic AI apart is its enterprise-readiness. With built-in observability through MLflow 3.0, organizations can finally track and audit LLMs the same way they monitor traditional models. That’s critical as GenAI moves from labs to production use cases across industries.

The Mosaic AI platform introduced key capabilities, including Agent Bricks, vector search, and MLflow 3.0 observability. These tools are designed to make GenAI development more scalable, auditable, and enterprise-friendly.

“Agentic workflows are how we scale GenAI beyond prototypes. That’s why we built Agent Bricks.” – Kasey Uhlenhuth
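The keynote did not show Agent Bricks code, so the sketch below is purely conceptual: a pure-Python pipeline where each step (retrieval, summarization, classification) is traced, illustrating how a governed, auditable multi-step agent workflow can be structured. All names here are illustrative, not the Agent Bricks API.

```python
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class TracedPipeline:
    """Conceptual multi-step agent workflow with a built-in audit trail.
    Not the Agent Bricks API; names are hypothetical."""
    steps: list = field(default_factory=list)
    trace: list = field(default_factory=list)

    def add_step(self, name: str, fn: Callable[[Any], Any]) -> "TracedPipeline":
        self.steps.append((name, fn))
        return self

    def run(self, payload: Any) -> Any:
        for name, fn in self.steps:
            payload = fn(payload)
            # Record every intermediate output so the run is reviewable later.
            self.trace.append({"step": name, "output": payload})
        return payload

# Toy stand-ins for retrieval, summarization, and classification.
docs = {"q3": "Q3 revenue grew 12% on strong travel demand."}
pipeline = (
    TracedPipeline()
    .add_step("retrieve", lambda q: docs.get(q, ""))
    .add_step("summarize", lambda text: text.split(".")[0])
    .add_step("classify", lambda s: "positive" if "grew" in s else "neutral")
)

result = pipeline.run("q3")
print(result)                                  # final classification
print([t["step"] for t in pipeline.trace])     # full, ordered audit trail
```

The point of the sketch is the trace: every intermediate output is captured, which is what makes an agentic pipeline governable rather than a black box.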

Lakebase: Online Transaction Processing (OLTP), Reimagined for AI

Databricks announced Lakebase, a transactional database engine designed for high-concurrency, AI-native applications. Compatible with Postgres, it enables real-time workloads without duplicating data across systems.

“This is a new OLTP engine, not a fork of the past. Built for the Lakehouse, serverless, and AI-native.” – Reynold Xin

Lakebase addresses a gap most enterprise teams know well: how to support both transactional and analytical needs without maintaining duplicate systems. It brings OLTP-style workloads directly into the Lakehouse—no syncing, no separate infrastructure.

It’s built on Delta Kernel and the Photon engine, offering Postgres compatibility and scalable concurrency with strong consistency. It’s a foundational shift that empowers developers to use AI on top of live, production-grade data.

Unlike legacy OLTP systems, Lakebase was built cloud-native and optimized for Delta Lake. That means less duplication, faster iteration, and simpler orchestration—benefits that compound across large-scale AI workloads.

Lakebase supports multi-region, multi-tenant use cases with near-instant recovery and failover. For enterprises managing global applications or real-time customer data, this level of reliability is critical.
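Because Lakebase is Postgres-compatible, ordinary SQL and standard drivers are the programming model. The sketch below uses Python's built-in sqlite3 purely as a local stand-in so the example runs anywhere; against Lakebase you would point a Postgres driver at your connection string instead. It illustrates the OLTP-style transactional pattern, not any Lakebase-specific API.

```python
import sqlite3

# Local stand-in for a Postgres-compatible transactional store.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, status TEXT)"
)

# OLTP-style transaction: both statements commit together or not at all.
try:
    with conn:  # the connection context manager commits on success, rolls back on error
        conn.execute(
            "INSERT INTO orders (amount, status) VALUES (?, ?)", (42.5, "new")
        )
        conn.execute("UPDATE orders SET status = 'paid' WHERE amount = 42.5")
except sqlite3.Error:
    pass  # on failure, neither statement is applied

row = conn.execute("SELECT status FROM orders").fetchone()
print(row[0])
```

The atomic commit-or-rollback behavior shown here is exactly the guarantee that distinguishes transactional (OLTP) workloads from append-oriented analytical pipelines.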

Governance Built for LLMs

Traditional governance tools weren’t built for AI. Unity Catalog is changing that with new features like prompt-level lineage, model visibility, and output traceability. This provides enterprise teams with the necessary controls to move quickly without compromising accountability.

With Unity Catalog now extending governance to models and prompts, teams can track lineage, ensure compliance, and manage outputs alongside traditional data assets.

New features, such as model access policies and prompt lineage reports, help ensure that models aren’t just performant but also compliant with internal and regulatory guidelines.

Richard Masters from Virgin Atlantic shared how Unity Catalog has improved collaboration between analytics and security teams by standardizing data definitions, permissions, and lineage tracking within one system.

“Unity Catalog lets us track prompts, models, outputs—and make that lineage reviewable and governed.” – Richard Masters

The update also enables developers and governance leaders to work from a shared system of record—aligning incentives and reducing friction between innovation and oversight.
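Prompt-level lineage boils down to one idea: every inference is logged as a record tying prompt, model, and output together so the chain can be reviewed after the fact. The sketch below illustrates that idea in plain Python; it is not the Unity Catalog API, and the record fields and fingerprinting scheme are assumptions for illustration.

```python
import hashlib
import json
from datetime import datetime, timezone

def lineage_record(prompt: str, model: str, output: str) -> dict:
    """Illustrative lineage entry: ties a prompt, model, and output together."""
    record = {
        "prompt": prompt,
        "model": model,
        "output": output,
        "logged_at": datetime.now(timezone.utc).isoformat(),
    }
    # A content hash lets auditors verify the record hasn't been altered.
    payload = json.dumps(
        {k: record[k] for k in ("prompt", "model", "output")}, sort_keys=True
    )
    record["fingerprint"] = hashlib.sha256(payload.encode()).hexdigest()
    return record

ledger = [
    lineage_record("Summarize Q3 results", "llm-v1", "Revenue grew 12%.")
]
print(ledger[0]["fingerprint"][:8])  # short, tamper-evident identifier
```

Whatever the real implementation, the reviewable unit is the same: prompt in, model used, output out, with a tamper-evident identifier for each link in the chain.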

Databricks One: The AI-Native Experience Layer

Databricks One combines data discovery, natural language querying, and workflow orchestration into a unified user interface. It’s a step toward making AI-driven insights more accessible to business teams.

Databricks One is more than a rebrand. It’s an experience shift. Users can now discover data assets, track lineage, and launch natural language queries all from one integrated interface.

That consolidation matters. It means domain experts, analysts, and engineers can collaborate more easily while working within governed boundaries.

By connecting search, lineage, AI querying, and monitoring in one UI, Databricks One reduces friction for domain experts who need fast, interpretable access to data and AI.

“We want to enable every user to ask questions of their data—with language, not SQL.” – Justin DeBrabant

It also integrates seamlessly with Mosaic AI, Unity Catalog, and SQL AI Functions—making Databricks One not just a new interface but a control layer across the entire platform.
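"Language, not SQL" means a translation layer sits between the user's question and the query engine. The toy sketch below makes that layer concrete with a trivial keyword-to-SQL router over sqlite3; the real Databricks One translation is vastly more capable, and everything here (table, routing rules) is invented for illustration.

```python
import re
import sqlite3

# Toy dataset standing in for governed warehouse tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flights (route TEXT, passengers INTEGER)")
conn.executemany(
    "INSERT INTO flights VALUES (?, ?)", [("LHR-JFK", 320), ("LHR-BOS", 210)]
)

def answer(question: str):
    """Trivial natural-language router: map a question to a SQL query."""
    if re.search(r"\btotal\b.*\bpassengers\b", question, re.IGNORECASE):
        sql = "SELECT SUM(passengers) FROM flights"
    else:
        sql = "SELECT COUNT(*) FROM flights"
    return conn.execute(sql).fetchone()[0]

print(answer("What is the total number of passengers?"))  # 530
print(answer("How many flights?"))                        # 2
```

Even this toy version shows the architectural point: the user never writes SQL, but every answer is still produced by a governed query the platform can log and audit.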

Architecting for Agility

A recurring theme was the pursuit of agility without compromise. The architecture underlying Databricks enables experimentation without risking core systems.

Speakers emphasized the importance of separating storage from compute and enabling ephemeral environments, which are essential for experimentation and scalability.

“Separation of compute and storage makes branching, experimentation, and safety possible at scale.” – Nikita Shamgunov

Developers can now clone databases with a single command, test logic in parallel environments, and push updates without fear of breaking production. This branching capability is native to the Lakehouse—no Git hacks or expensive snapshots are required.

The platform also supports streaming data, multi-table transactions, and isolation across tenants—key features for modern, AI-native applications that serve real-time use cases, such as fraud detection or personalized recommendations.

This model supports rapid iteration, safe testing, and GenAI experimentation—all with governance controls and audit trails embedded by default.
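Database branching of this kind is typically built on copy-on-write: a branch starts as a thin overlay on its parent, reads fall through to shared data, and writes stay in the branch until merged or discarded. The sketch below illustrates the mechanism in plain Python; it is a conceptual model, not Databricks' or Neon's implementation.

```python
class Branch:
    """Copy-on-write branch: shares the parent's data until written."""

    def __init__(self, parent: dict):
        self._parent = parent   # shared view of "production"
        self._overlay = {}      # branch-local writes only

    def get(self, key):
        # Reads check the overlay first, then fall through to the parent.
        return self._overlay.get(key, self._parent.get(key))

    def put(self, key, value):
        # Writes land in the overlay; production is never touched.
        self._overlay[key] = value

prod = {"users": 1000, "orders": 250}
dev = Branch(prod)
dev.put("orders", 999)          # experiment safely on the branch

print(dev.get("orders"))   # 999: the branch sees its own write
print(prod["orders"])      # 250: production is unchanged
print(dev.get("users"))    # 1000: unwritten keys are shared, not copied
```

Because the branch copies nothing up front, creating one is nearly free regardless of database size, which is what makes "clone with a single command" economical compared to full snapshots.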

The Takeaway for Leadership

The keynote closed with a clear message for leaders: it’s time to rethink how data, AI, governance, and applications come together. These aren’t separate disciplines anymore. They’re facets of the same platform problem.

Leaders across industries emphasized that AI can’t be bolted on. It requires a unified platform where governance, experimentation, and production coexist seamlessly.

Greg Ulrich from Mastercard put it plainly: businesses that cannot manage AI and governance together will struggle to scale. Leaders need to invest in architectures that align with the goals of legal, compliance, product, and engineering teams.

“Governance and AI aren’t at odds. They need to advance together.” – Greg Ulrich

Databricks is betting that its unified stack—complete with governance, observability, and compute efficiency—will become the default for enterprise AI. And based on the keynote, that bet is already paying off.

Forward-looking CIOs, CDOs, and product execs will be the ones who make AI scale not just technically but operationally. And Databricks wants to be the ecosystem where that happens.

Your AI Strategy Is Only as Strong as Your Data Foundation

As we enter the second half of 2025, AI is quickly becoming a staple across every sector. Whether you’re in healthcare, finance, manufacturing, or tech, the ability to derive insight from your data—and act on it through intelligent applications—will define your competitiveness.

The Databricks keynote session from Wednesday didn’t just outline product launches. It provided a map for transitioning from experimentation to operational excellence. And Databricks wants to be the place where it all comes together—open, governed, and ready for scale.

We at Concord can help your business connect the dots and guide your organization toward execution. Contact us today!  
