Learn how NOW Insurance works with Onehouse to harness data, cut costs, and drive innovation with AI and ML for insurance policies.
Join this special panel event to hear live from universal data lakehouse practitioners at Robinhood, Uber, and Walmart, and Apache Hudi creator and Onehouse Founder and CEO Vinoth Chandar.
Apache Kafka and the data lakehouse are used by the world’s most sophisticated data teams to ingest data at scale, quickly, and cost-effectively.
Learn how to replicate operational databases to the data lakehouse in a manner that is easy, fast, cost-efficient, and opens your data to multiple downstream engines.
Learn all about the benefits of the universal data lakehouse architecture and see Onehouse in action for use cases such as Postgres change data capture!
Tired of making tradeoffs between data lake formats? Learn how OneTable opens your data to any - or all - formats including Apache Hudi, Delta Lake and Apache Iceberg, and see a live demo!
Join thought leaders from Onehouse, AWS, Confluent, Uber, Walmart, Tesla, Netflix and more as they discuss how open source projects have taken over as the standard for data architectures at companies of all sizes.
Join Onehouse Founder and CEO Vinoth Chandar for an overview and demo of the Onehouse platform.
Nadine Farah, an Apache Hudi Contributor, and Prashant Wason, the release manager for Hudi 0.14.0, delve deep into the groundbreaking record-level index feature.
Join Onehouse Head of Product Management Kyle Weller as he discusses the ins and outs of the most popular open source lakehouse projects.
NOW Insurance is a rapidly growing pioneer in the insurtech space. Read about why they chose the Universal Data Lakehouse architecture - and partnered with Onehouse to make it happen, fast.
Apna, Notion, Uber, Walmart, Zoom. What do these companies have in common? Aside from their businesses generating massive volumes of data - at high velocity - all of their teams have chosen the universal data lakehouse as a core component of their data stack and pipelines.
You shouldn’t have to move copies of data around, never knowing which is the real source of truth for different applications such as reporting, AI/ML, data science, and analytics. Learn how the universal data lakehouse architecture is reshaping how businesses like Uber, Walmart, and TikTok handle vast and diverse data with unparalleled efficiency.
The data lakehouse is gaining strong interest from organizations looking to build a centralized data platform. Many are struggling to choose between the three popular lakehouse projects: Hudi, the original data lakehouse developed at Uber; Iceberg, developed at Netflix; and Delta Lake, an open source version of the Databricks lakehouse. Learn about the goals and differences of each project.
Combine the scalability and flexibility of data lakes with the stability and accessibility of data warehouses, and open it to your entire ecosystem.
Build real-time workloads in minutes to power use cases across your entire ecosystem, including change data capture, analytics, AI and ML, and more.
Onehouse builds on the data lakehouse architecture with a universal approach that makes data from all your favorite sources - streams, databases, and cloud storage, for example - available to all the common query engines, languages, and data lakehouse formats your data consumers use every day.
See how you can replicate Postgres tables into the lakehouse using Onehouse's new Confluent CDC source. This demo showcases a fully automated integration between Onehouse and Confluent that provisions and manages resources in Confluent to facilitate CDC data ingestion into the lakehouse.
This guide shows how to seamlessly integrate Amazon Athena with your Onehouse Managed Lakehouse, allowing you to power serverless analytics at scale on top of the data in your lakehouse.
This guide will show how to ingest data from DynamoDB Tables into your Onehouse Managed Lakehouse using Onehouse's deep integration with Kafka.
This guide provides a pattern for creating a cross-region disaster recovery solution for Hudi using savepoints - enabling highly resilient lakehouses.
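As a rough illustration of the savepoint mechanism the guide builds on, here is a hedged sketch using Hudi's Spark SQL procedures (the table name and commit timestamp are hypothetical placeholders; exact procedure availability depends on your Hudi version):

```sql
-- Mark a known-good commit as a savepoint so cleaning/archival will not remove it
CALL create_savepoint(table => 'orders', commit_time => '20240101120000000');

-- After a failure, roll the table back to the savepointed state
CALL rollback_to_savepoint(table => 'orders', instant_time => '20240101120000000');
```

In a cross-region setup, savepoints give you stable, restorable table states to replicate to the secondary region's storage.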
Gain maximum value from your data lakehouse while ensuring robust security and tailored access control.
Integrate the Onehouse Universal Data Lakehouse with Amazon SageMaker to build machine learning models in near real-time.
Replicate your operational PostgreSQL database to the Onehouse Universal Data Lakehouse with up-to-the-minute data.
Integrate Confluent and Onehouse to seamlessly replicate operational databases in near real-time.
The workshop will leverage a 10 GB TPC-DS dataset to demonstrate the various read and write capabilities of Hudi with Presto. The dataset will be made available at a common S3 location accessible to workshop attendees.
Apna is the largest and fastest-growing site for professional opportunities in India. Read on to learn how they rearchitected their data infrastructure to move from daily batch workloads to near real-time insights about their business while reducing costs.