Data Mesh Architecture: A Modern Distributed Data Model
Data mesh isn’t software you can download and install, so how do you build a data mesh? In this episode, Adam Bellemare (Staff Technologist, Office of the CTO, Confluent) discusses his data mesh proof of concept and how it can help you conceptualize the ways in which implementing a data mesh could benefit your organization.
Adam begins by noting that while data mesh is a modern data architecture, it is only partly a technical concern: it covers how data sets should be stored and made accessible to other teams across a distributed organization. It is equally a social concern, since it requires the various teams in an organization to commit to publishing high-quality versions of their data and making them widely available to everyone else. Adam explains that the four data mesh principles provide the shared language needed to start discussing the social transitions a company must make to arrive at a better, more effective, and more efficient data strategy.
The data mesh proof of concept created by Adam's team showcases an event-stream-based data mesh in a fully functional model. He explains that there is no single, widely accepted way to build a data mesh, so the implementation is necessarily opinionated. The proof of concept demonstrates what self-service data discovery looks like: for each data product you can see its schema, owner, SLA, and data quality, and you can also model an app consuming data products or publish your own data products.
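To make that discovery experience concrete, here is a minimal, hypothetical sketch (in Python) of the metadata a self-service catalog might expose for each event-stream data product. The field names and values are illustrative assumptions, not the proof of concept's actual schema.

```python
# Hypothetical sketch of the per-product metadata a self-service data mesh
# catalog could surface to consumers. Field names are illustrative only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class DataProduct:
    name: str                 # e.g. the Kafka topic backing the product
    owner: str                # owning domain team
    schema_subject: str       # Schema Registry subject holding the data contract
    sla: str                  # freshness/availability promise to consumers
    quality_tier: str         # e.g. "gold", "silver", "bronze"
    tags: List[str] = field(default_factory=list)


# Example entry a consumer might browse when deciding what to subscribe to.
orders = DataProduct(
    name="orders.v1",
    owner="checkout-team",
    schema_subject="orders.v1-value",
    sla="99.9% availability, under 5 min end-to-end latency",
    quality_tier="gold",
    tags=["ecommerce", "transactions"],
)

print(f"{orders.name} is owned by {orders.owner} (tier: {orders.quality_tier})")
```

In practice this kind of metadata would typically live alongside the stream itself, for example in a schema registry or a dedicated catalog service, rather than in application code; the sketch only shows the shape of the information a consumer would see.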
In addition to discussing data mesh concepts and the proof of concept, Adam shares experiences with organizational data from his time as a staff data platform engineer at Shopify. His primary focus there was getting the company's core e-commerce data out of sharded MySQL and into Apache Kafka® topics using Kafka Connect and Debezium. He describes how he came to appreciate the flexibility of having important business data available in Kafka topics: it let people experiment with new data combinations and come up with new products, novel solutions, and different ways of looking at problems. That kind of data sharing and experimentation lies at the heart of data mesh.
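As an illustration of that change-data-capture pattern (not Shopify's actual setup), the following sketch registers a Debezium MySQL source connector with a Kafka Connect worker over its REST API. Hostnames, credentials, table names, and the worker URL are placeholder assumptions, and exact property names vary between Debezium versions.

```python
# Hypothetical sketch: register a Debezium MySQL source connector with Kafka
# Connect so that row-level changes from one MySQL shard land in Kafka topics.
# All hostnames, credentials, and names below are placeholders.
import json
import requests

connector = {
    "name": "orders-shard-01-cdc",  # placeholder connector name
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "mysql-shard-01.example.internal",
        "database.port": "3306",
        "database.user": "cdc_user",
        "database.password": "cdc_password",
        "database.server.id": "184001",          # must be unique per connector
        "topic.prefix": "shop.shard01",          # topics become shop.shard01.<db>.<table>
        "table.include.list": "shop.orders,shop.customers",
        "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
        "schema.history.internal.kafka.topic": "schema-history.shard01",
    },
}

# POST the connector definition to a Kafka Connect worker (assumed at localhost:8083).
resp = requests.post(
    "http://localhost:8083/connectors",
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
resp.raise_for_status()
print(resp.json())
```

Once a connector like this is running, each shard's change stream becomes an ordinary set of Kafka topics that downstream teams can subscribe to and combine, which is the flexibility Adam describes.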
Adam has been working in the data space for over a decade, with experience in big-data architecture, event-driven microservices, and streaming data platforms. He’s also the author of the book “Building Event-Driven Microservices.”
EPISODE LINKS
- The Definitive Guide to Building a Data Mesh with Event Streams
- What is data mesh?
- Saxo Bank’s Best Practices for Distributed Domain-Driven Architecture Founded on the Data Mesh
- Watch the video version of this podcast
- Kris Jenkins’ Twitter
- Join the Confluent Community
- Learn more with Kafka tutorials at Confluent Developer
- Live demo: Intro to Event-Driven Microservices with Confluent
- Use PODCAST100 to get an additional $100 of Confluent Cloud usage (details)
CHAPTERS
1. Intro (00:00:00)
2. Event streaming use case (00:01:08)
3. Microservices (00:05:05)
4. The available data and tooling you have matters (00:09:12)
5. Business requirements (00:12:40)
6. Make data available (00:15:21)
7. Data mesh/Immutable data (00:19:25)
8. React to data in real time (00:23:10)
9. Data governance (00:26:36)
10. Self-service data (00:27:52)
11. Data mesh prototype (00:31:22)
12. What’s next? (00:43:41)
13. It’s a wrap (00:46:40)