Databricks

Software Development

San Francisco, CA · 969,781 followers

About us

Databricks is the Data and AI company. More than 10,000 organizations worldwide — including Block, Comcast, Condé Nast, Rivian, Shell and over 60% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to take control of their data and put it to work with AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow.

Note to Databricks applicants: please apply through our official Careers page at databricks.com/company/careers. All official communication from Databricks will come from email addresses ending with @databricks.com or @goodtime.io (our meeting tool).

Website
https://databricks.com
Industry
Software Development
Company size
5,001-10,000 employees
Headquarters
San Francisco, CA
Type
Privately Held
Specialties
Apache Spark, Apache Spark Training, Cloud Computing, Big Data, Data Science, Delta Lake, Data Lakehouse, MLflow, Machine Learning, Data Engineering, Data Warehousing, Data Streaming, Open Source, Generative AI, Artificial Intelligence, Data Intelligence, Data Management, Data Governance, and AI/ML Ops

Products

Locations

Employees at Databricks

Updates

  • Our new Databricks AI Governance Framework is your comprehensive guide to implementing enterprise AI programs responsibly and effectively. The framework provides a structured approach to AI development, spanning 5 foundational pillars for building a responsible and resilient AI program. Explore best practices across risk management, legal compliance, ethical oversight, operational monitoring and more. Read the full guide here: https://lnkd.in/ginBQyw4

  • MCP is quickly becoming a standard for equipping LLMs with tools and context. That's why we're excited to give organizations the best of all worlds with our new MCP integrations! Use MCP for your agents to take action, Mosaic AI for building and evaluating agents, and Unity Catalog for governance and discovery. With these new capabilities, we’re handling the hard parts of MCP for you, so you can focus on building and deploying agents. https://lnkd.in/gPWtd-Dr
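
    For readers curious what the client side of MCP looks like, here is a minimal sketch in Python using the open-source MCP SDK. The server URL, the "query_table" tool name, and its arguments are illustrative assumptions rather than actual Databricks endpoints, and authentication is omitted.

        # pip install mcp
        import asyncio

        from mcp import ClientSession
        from mcp.client.streamable_http import streamablehttp_client

        async def main() -> None:
            # Hypothetical MCP server endpoint; a real deployment would also
            # supply authentication headers.
            url = "https://example.com/mcp"
            async with streamablehttp_client(url) as (read, write, _):
                async with ClientSession(read, write) as session:
                    await session.initialize()

                    # Discover the tools the server exposes to the agent
                    tools = await session.list_tools()
                    for tool in tools.tools:
                        print(tool.name, "-", tool.description)

                    # Invoke one tool; "query_table" and its arguments are
                    # made up for illustration.
                    result = await session.call_tool(
                        "query_table", {"sql": "SELECT 1"}
                    )
                    print(result.content)

        asyncio.run(main())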

  • In the high-stakes world of Major League Baseball, every decision and action can be the difference between winning and losing. ⚾️ Knowing the competitive advantage of real-time analytics, the Cincinnati Reds modernized their legacy infrastructure with Databricks — prioritizing serverless capabilities and unified orchestration. Databricks Lakeflow Jobs helped streamline ETL and deliver real-time insights for actionable decision-making. “With serverless Databricks Lakeflow Jobs, we’ve achieved a 3–5x improvement in latency. What used to take 10 minutes now takes just 2–3 minutes, significantly reducing processing times.” https://lnkd.in/gfzitfeZ

  • LLMs are powerful, but getting consistent accuracy remains a major challenge. Lamini CEO Sharon Zhou, PhD, shares how to build compound AI systems using SLMs and high-accuracy mini-agents within agentic workflows. She introduces techniques like memory RAG, which reduces hallucinations by using embed-time compute for contextual embeddings, and memory tuning, which applies a Mixture of Memory Experts (MoME) to specialize models with proprietary data. The takeaway: with the right building blocks, high-accuracy mini-agents can be composed into larger, more reliable AI systems: https://lnkd.in/gMyXFs45
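
    As a generic illustration of the "compose small, specialized agents" idea (not Lamini's memory RAG or memory tuning, which the talk covers), here is a toy Python router that sends each question to the narrow mini-agent best suited to answer it; the agent names and keyword routing rule are invented for the example.

        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class MiniAgent:
            """A narrow, high-accuracy component: in practice a specialized
            SLM; here just a callable plus the topics it handles."""
            name: str
            keywords: tuple
            run: Callable[[str], str]

        def sql_agent(q: str) -> str:
            return f"[sql-agent] generated query for: {q}"

        def doc_agent(q: str) -> str:
            return f"[doc-agent] retrieved passage for: {q}"

        AGENTS = [
            MiniAgent("sql", ("revenue", "table", "query"), sql_agent),
            MiniAgent("docs", ("policy", "manual", "guide"), doc_agent),
        ]

        def route(question: str) -> str:
            # Send the question to the first mini-agent whose keywords
            # match, falling back to the docs agent.
            q = question.lower()
            for agent in AGENTS:
                if any(k in q for k in agent.keywords):
                    return agent.run(question)
            return AGENTS[-1].run(question)

        print(route("What was Q3 revenue by region, per the sales table?"))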

  • Oracle Autonomous Database now supports Delta Sharing 🔁 Until now, sharing data across enterprise systems meant pipelines, duplication, and lock-in. With Oracle’s support for Delta Sharing, Databricks users can securely access Oracle ADB data (no ETL, no copies) for real-time analytics and AI with full governance. The same approach enables sharing from Oracle Fusion Data Intelligence. https://lnkd.in/gYNXS6yc
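
    Because Delta Sharing is an open protocol, reading a shared Oracle ADB table can be sketched with the open-source delta-sharing Python client. The profile file name and the share/schema/table names below are placeholders for whatever the provider actually grants.

        # pip install delta-sharing
        import delta_sharing

        # Profile file issued by the data provider (placeholder name)
        profile = "oracle_adb.share"

        # Discover the tables shared with this recipient
        client = delta_sharing.SharingClient(profile)
        for table in client.list_all_tables():
            print(table.share, table.schema, table.name)

        # Read one shared table straight into pandas -- no ETL, no copies
        url = f"{profile}#sales_share.analytics.orders"
        df = delta_sharing.load_as_pandas(url)
        print(df.head())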

  • We introduced powerful new Azure Databricks capabilities at #DataAISummit to help organizations modernize data architectures, scale secure collaboration, and accelerate AI adoption. Highlights:
    • Databricks One and AI/BI Genie enhancements to empower every user (including business users) with data
    • Unity Catalog upgrades: ABAC, auto-publish to Power BI, and external metadata access
    • Declarative Pipelines + Lakeflow Designer for Azure users looking to modernize workflows
    • Enhanced interoperability through Apache Iceberg™ support and Azure Databricks mirrored catalog
    • Build AI-native apps faster with Lakebase and native support for open formats on Azure
    Full rundown below 👇 https://lnkd.in/gQc2_-qb

  • Lakeflow Designer is a new tool to bridge the gap between data engineers and business analysts. Production-quality ETL with no code required! This short demo shows Lakeflow Designer in action: import spreadsheets + transform data with drag-and-drop, translate and classify data with Databricks Assistant, specify pipeline transformations using examples, and deploy a visual pipeline to production. Full demo here: https://lnkd.in/gnGsnPG8

  • “One of the really important things to think about when leveraging AI in healthcare is trust. The decisions are high stakes and we need it to be believable and explainable.” Watch Srinivas Sridhara, PhD, MHS, Chief Data and Analytics Officer at Penn Medicine, University of Pennsylvania Health System, discuss how Databricks is transforming healthcare through centralized data and AI: https://lnkd.in/gVHsqFk9

Funding

Databricks: 15 total rounds

Last round: Debt financing, US$ 5.3B

See more info on Crunchbase