Player FM - Internet Radio Done Right
38 subscribers
Checked 6d ago
Added seven years ago
Content provided by The Data Flowcast. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by The Data Flowcast or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://fa.player.fm/legal
Player FM - Podcast App
Go offline with the Player FM app!
GDPR, Self-Service Data, and Infrastructure Automation with Typeform
The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI
Welcome back to the Airflow Podcast. This week, we met up with Albert Franzi and Carlos Escura from Typeform. Typeform is a tool that allows you to build beautiful interactive forms that you can use for a wide variety of use cases, including customer surveys, employee engagement, product feedback, and market research to name a few. In our conversation, we discussed Airflow as a tool for GDPR compliance, the concept of self-service data and how it allows your data operations team to function as a data platform team, and some of the more specialized infrastructure tooling that the Typeform team has built out to support their internal teams. For folks interested, our team at Astronomer is growing rapidly and we're on the hunt for new folks to join in a variety of different roles. If you're passionate about Airflow and interested in building the future of data engineering, please get in touch. You can check our current job postings at careers.astronomer.io, but we're constantly updating our listings to accommodate new hiring needs. Please feel free to email me directly at pete@astronomer.io if you're passionate about what we're doing and think you'd be a good addition to the team. Mentioned Resources: Dag Factory: https://github.com/ajbosco/dag-factory Astronomer Careers: https://careers.astronomer.io Guest Profiles: Albert Franzi: https://www.linkedin.com/in/albertfranzi/?originalSubdomain=es Carlos Escura: https://www.linkedin.com/in/carlosescura/en-us/
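For readers following up on the Dag Factory link above: it turns a YAML file into Airflow DAGs, which is one way to offer the self-service pipelines discussed in the episode. A minimal sketch, assuming the ajbosco/dag-factory package; the config path, YAML keys, and task commands are illustrative and may vary between releases:

```python
# A minimal sketch of YAML-driven, self-service DAG generation with
# ajbosco/dag-factory. The config path and YAML keys are illustrative and
# may differ between dag-factory releases.
from pathlib import Path

import dagfactory

# Example etl_config.yml (one top-level key per generated DAG):
#
#   typeform_etl:
#     default_args:
#       owner: data-platform
#       start_date: 2020-11-01
#     schedule_interval: "0 3 * * *"
#     tasks:
#       extract:
#         operator: airflow.operators.bash_operator.BashOperator
#         bash_command: "python extract.py"
#       load:
#         operator: airflow.operators.bash_operator.BashOperator
#         bash_command: "python load.py"
#         dependencies: [extract]

config_file = str(Path(__file__).parent / "etl_config.yml")
dag_factory = dagfactory.DagFactory(config_file)

# Register the generated DAGs at module level so the scheduler can find them.
dag_factory.clean_dags(globals())
dag_factory.generate_dags(globals())
```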
63 episodes
All episodes
Telemetry has the potential to guide the future of Airflow, but only if it’s implemented transparently and with community trust. In this episode, we’re joined by Bolke de Bruin , Director at Metyis and a long-time Airflow PMC member. Bolke discusses how telemetry has been handled in the past, why it matters now and what it will take to get it right. Key Takeaways: (03:20) The role of foundations in establishing credibility and sustainability. (04:52) Why data collection is critical to open-source project direction. (07:24) Lessons learned from previous approaches to user data collection. (10:23) The current state of telemetry in the project. (10:53) Community trust as a prerequisite for technical implementation. (12:54) The importance of managing sensitive data within trusted ecosystems. (16:37) Ethical considerations in balancing participation and access. (18:45) Forward-looking ideas for improving workflow design and usability. Resources Mentioned: Bolke de Bruin https://www.linkedin.com/in/bolke/ Metyis | LinkedIn https://www.linkedin.com/company/metyis/ Metyis | Website http://www.metyis.com Apache Airflow https://airflow.apache.org/ Airflow Summit https://airflowsummit.org/ Airflow Dev List https://lists.apache.org/list.html?dev@airflow.apache.org https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
Contributing to open-source projects can be daunting, but it can also unlock unexpected innovation. This episode showcases how one engineer’s journey with Apache Airflow led to impactful UI enhancements and infrastructure solutions at scale. Shubham Raj , Software Engineer II at Cloudera , shares how his team built a drag-and-drop DAG editor for non-coders, and how his contributions helped shape the Airflow 3.0 UI and introduce features like external XCom control and bulk APIs. Key Takeaways: (02:30) Day-to-day responsibilities building platforms that simplify orchestration. (05:27) Factors that make onboarding into large open-source projects accessible. (07:35) The value of improved user interfaces for task state visibility and control. (09:49) Enabling faster debugging by exposing internal data through APIs. (13:00) Balancing frontend design goals with backend functionality. (14:19) Creating workflow editors that lower the barrier to entry. (16:54) Supporting a variety of task types within a visual DAG builder. (19:32) Common infrastructure challenges faced by orchestration users. (20:37) Addressing dependency management across distributed environments. Resources Mentioned: Shubham Raj https://www.linkedin.com/in/shubhamrajofficial/ Cloudera | LinkedIn https://www.linkedin.com/company/cloudera/ Cloudera | Website https://www.cloudera.com/ Apache Airflow https://airflow.apache.org/ 2023 Airflow Summit https://airflowsummit.org/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
Managing data pipelines at scale is not just a technical challenge. It is also an organizational one. At Lyft, success means empowering dozens of teams to build with autonomy while enforcing governance and best practices across thousands of workflows. In this episode, we speak with Yunhao Qing , Software Engineer at Lyft , about building a governed data-engineering platform powered by Airflow that balances flexibility, standardization and scale. Key Takeaways: (03:17) Supporting internal teams with a centralized orchestration platform. (04:54) Migrating to a managed service to reduce infrastructure overhead. (06:04) Embedding platform-level governance into custom components. (08:02) Consolidating and regulating the creation of custom code. (09:48) Identifying and correcting inefficient workflow patterns. (11:17) Replacing manual workarounds with native platform features. (14:32) Preparing teams for major version upgrades. (16:03) Leveraging asset-based scheduling for smarter triggers. (18:13) Envisioning GenAI and semantic search for future productivity. Resources Mentioned: Yunhao Qing https://www.linkedin.com/in/yunhao-qing Lyft | LinkedIn https://www.linkedin.com/company/lyft/ Lyft | Website https://www.lyft.com/ Apache Airflow https://airflow.apache.org/ Astronomer https://www.astronomer.io/ Kubernetes https://kubernetes.io/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
Understanding the complexities of Apache Airflow can be daunting for newcomers and seasoned data engineers alike. But with the right guidance, mastering the tool becomes an achievable milestone. In this episode, Marc Lamberti , Head of Customer Education at Astronomer , joins us to share his journey from Udemy instructor to driving education at Astronomer, and how he's helping over 100,000 learners demystify Airflow. Key Takeaways: (02:36) Early exposure to Airflow while addressing inefficiencies in data workflows. (04:10) Common barriers to implementing open source tools in enterprise settings. (06:18) The shift from part-time teaching to a full-time focus on Airflow education. (07:53) A modular, guided approach to structuring educational content. (09:57) The value of highlighting underused Airflow features for broader adoption. (12:35) Certifications as a method to assess readiness and uncover knowledge gaps. (13:25) Coverage of essential Airflow concepts in the Fundamentals exam. (16:07) The DAG Authoring exam’s emphasis on practical, advanced features. (20:08) A call for more visible integration of Airflow with AI workflows. Resources Mentioned: Marc Lamberti https://www.linkedin.com/in/marclamberti/ Astronomer | LinkedIn https://www.linkedin.com/company/astronomer/ Astronomer Academy https://academy.astronomer.io/ Airflow Fundamentals Certification https://www.astronomer.io/certification/ DAG Authoring Certification https://academy.astronomer.io/plan/astronomer-certification-dag-authoring-for-apache-airflow-exam The Complete Hands-On Introduction to Airflow https://www.udemy.com/course/the-complete-hands-on-course-to-master-apache-airflow/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

Embracing Data Mesh and SQL Sensors for Scalable Workflows at lastminute.com with Alberto Crespi 30:09
The flexibility of Airflow plays a pivotal role in enabling decentralized data architectures and empowering cross-functional teams. In this episode, we speak with Alberto Crespi , Data Architect at lastminute.com , who shares how his team scales Airflow across 12 teams while supporting both vertical and horizontal structures under a data mesh approach. Key Takeaways: (02:17) Defining responsibilities within data architecture teams. (04:15) Consolidating multiple orchestrators into a single solution. (07:00) Scaling Airflow environments with shared infrastructure and DevOps practices. (10:59) Managing dependencies and readiness using SQL sensors. (14:23) Enhancing visibility and response through Slack-integrated monitoring. (19:28) Extending Airflow’s flexibility to run legacy systems. (22:28) Integrating transformation tools into orchestrated pipelines. (25:54) Enabling non-engineers to contribute to pipeline development. (27:33) Fostering adoption through collaboration and communication. Resources Mentioned: Alberto Crespi https://www.linkedin.com/in/crespialberto/ lastminute.com | Website https://lastminute.com Apache Airflow https://airflow.apache.org/ dbt Labs https://www.getdbt.com/ Astronomer Cosmos https://github.com/astronomer/astronomer-cosmos GitLab https://gitlab.com/ Kubernetes https://kubernetes.io/ Confluence https://www.atlassian.com/software/confluence Slack https://slack.com/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
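The SQL-sensor readiness pattern mentioned at (10:59) can be sketched with Airflow's SqlSensor, which polls a query until it returns a truthy result. A minimal sketch assuming Airflow 2.4+ and the common SQL provider (older releases use a different import path); the connection ID, table, and dbt command are hypothetical:

```python
# A minimal sketch of gating a pipeline on upstream data readiness with
# SqlSensor. conn_id, table names and the dbt command are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.common.sql.sensors.sql import SqlSensor

with DAG(
    dag_id="bookings_mart",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Poll the warehouse until today's partition has landed. mode="reschedule"
    # releases the worker slot between pokes instead of holding it.
    wait_for_bookings = SqlSensor(
        task_id="wait_for_bookings",
        conn_id="dwh",
        sql="SELECT COUNT(*) FROM raw.bookings WHERE load_date = '{{ ds }}'",
        success=lambda first_cell: first_cell > 0,
        mode="reschedule",
        poke_interval=300,    # seconds between checks
        timeout=6 * 60 * 60,  # give up after six hours
    )

    build_mart = BashOperator(
        task_id="build_mart",
        bash_command="dbt run --select bookings_mart",
    )

    wait_for_bookings >> build_mart
```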
Innovation in orchestration is redefining how engineers approach both traditional ETL pipelines and emerging AI workloads. Understanding how to harness Airflow’s flexibility and observability is essential for teams navigating today’s evolving data landscape. In this episode, Anu Pabla , Principal Engineer at The ODP Corporation , joins us to discuss her journey from legacy orchestration patterns to AI-native pipelines and why she sees Airflow as the future of AI workload orchestration. Key Takeaways: (03:43) Engaging with external technology communities fosters innovation. (05:05) Mentoring early-career engineers builds confidence in a complex tech landscape. (07:51) Orchestration patterns continue to evolve with modern data needs. (08:41) Managing AI workflows requires structured and flexible orchestration. (10:35) High-quality, meaningful data remains foundational across use cases. (15:08) Community-driven open source tools offer lasting value. (16:59) Self-healing systems support both legacy and AI pipelines. (20:20) Orchestration platforms can drive future AI-native workloads. Resources Mentioned: Anu Pabla https://www.linkedin.com/in/atomicap/ The ODP Corporation https://www.linkedin.com/company/the-odp-corporation/ The ODP Corporation | Website https://www.theodpcorp.com/homepage Apache Airflow https://airflow.apache.org/ LlamaIndex https://www.llamaindex.ai/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
The orchestration layer is foundational to building robust AI- and ML-powered data pipelines, especially in complex hybrid enterprise environments. IBM’s partnership with Astronomer reflects a strategic alignment to simplify and scale Airflow-based workflows across industries. In this episode, we’re joined by IBM ’s Senior Product Manager, BJ Adesoji , and GTM PM and Growth Leader , Ryan Yackel . We discuss how IBM customers are using Airflow in production, the challenges they face at scale and what the new IBM–Astronomer collaboration unlocks. Key Takeaways: (03:09) The growing importance of orchestration tools in enterprise environments. (04:48) How organizations are expanding orchestration beyond traditional use cases. (05:24) Common patterns across industries adopting orchestration platforms. (07:16) Why orchestration is essential for supporting business-critical workloads. (10:00) The role of orchestration in compliance and regulatory processes. (13:02) Challenges enterprises face when managing orchestration infrastructure. (14:58) Opportunities to simplify and centralize orchestration at scale. (19:11) The value of integrating orchestration with broader data toolchains. (20:54) How AI is shaping the future of orchestrated data workflows. Resources Mentioned: BJ Adesoji https://www.linkedin.com/in/bj-soji/ Ryan Yackel https://www.linkedin.com/in/ryanyackel/ IBM | LinkedIn https://www.linkedin.com/company/databand-ai/ IBM Databand https://www.ibm.com/products/databand IBM DataStage https://www.ibm.com/products/datastage IBM watsonx.governance https://www.ibm.com/products/watsonx-governance IBM Knowledge Catalog https://www.ibm.com/products/knowledge-catalog Apache Airflow https://airflow.apache.org/ watsonx Orchestrate https://www.ibm.com/products/watsonx-orchestrate Domino https://domino.ai/ Astronomer https://www.astronomer.io/ Snowflake https://www.snowflake.com/en/ dbt Labs https://www.getdbt.com/ Amazon SageMaker https://aws.amazon.com/sagemaker/ Cloudera https://www.cloudera.com/ MongoDB https://www.mongodb.com/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
Efficient orchestration and maintainability are crucial for data engineering at scale. Gil Reich , Data Developer for Data Science at Wix , shares how his team reduced code duplication, standardized pipelines, and improved Airflow task orchestration using a Python-based framework built within the data science team. In this episode, Gil explains how this internal framework simplifies DAG creation, improves documentation accuracy, and enables consistent task generation for machine learning pipelines. He also shares lessons from complex DAG optimization and maintaining testable code. Key Takeaways: (03:23) Code duplication creates long-term problems. (08:16) Frameworks bring order to complex pipelines. (09:41) Shared functions cut down repetitive code. (17:18) Auto-generated docs stay accurate by design. (22:40) On-demand DAGs support real-time workflows. (25:08) Task-level sensors improve run efficiency. (27:40) Combine local runs with automated tests. (30:09) Clean code helps teams scale faster. Resources Mentioned: Gil Reich https://www.linkedin.com/in/gilreich/ Wix | LinkedIn https://www.linkedin.com/company/wix-com/ Wix | Website https://www.wix.com/ DS DAG Framework https://airflowsummit.org/slides/2024/92-refactoring-dags.pdf Apache Airflow https://airflow.apache.org/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

Modernizing Legacy Data Systems With Airflow at Procter & Gamble with Adonis Castillo Cordero 22:13
Legacy architecture and AI workloads pose unique challenges at scale, especially in a global enterprise with complex data systems. In this episode, we explore strategies to proactively monitor and optimize pipelines while minimizing downstream failures. Adonis Castillo Cordero , Senior Automation Manager at Procter & Gamble , joins us to share actionable best practices for dependency mapping, anomaly detection and architecture simplification using Apache Airflow. Key Takeaways: (03:13) Integrating legacy data systems into modern architecture. (05:51) Designing workflows for real-time data processing. (07:57) Mapping dependencies early to avoid pipeline failures. (09:02) Building automated monitoring into orchestration frameworks. (12:09) Detecting anomalies to prevent performance bottlenecks. (15:24) Monitoring data quality to catch silent failures. (17:02) Prioritizing responses based on impact severity. (18:55) Simplifying dashboards to highlight critical metrics. Resources Mentioned: Adonis Castillo Cordero https://www.linkedin.com/in/adoniscc/ Procter & Gamble | LinkedIn https://www.linkedin.com/company/procter-and-gamble/ Procter & Gamble | Website http://www.pg.com Apache Airflow https://airflow.apache.org/ OpenLineage https://openlineage.io/ Azure Monitor https://azure.microsoft.com/en-us/products/monitor/ AWS Lookout for Metrics https://aws.amazon.com/lookout-for-metrics/ Monte Carlo https://www.montecarlodata.com/ Great Expectations https://greatexpectations.io/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
Building reliable data pipelines starts with maintaining strong data quality standards and creating efficient systems for auditing, publishing and monitoring. In this episode, we explore the real-world patterns and best practices for ensuring data pipelines stay accurate, scalable and trustworthy. Joseph Machado , Senior Data Engineer at Netflix , joins us to share practical insights gleaned from supporting Netflix’s Ads business as well as over a decade of experience in the data engineering space. He discusses implementing audit publish patterns, building observability dashboards, defining in-band and separate data quality checks, and optimizing data validation across large-scale systems. Key Takeaways: (03:14) Supporting data privacy and engineering efficiency within data systems. (10:41) Validating outputs with reconciliation checks to catch transformation issues. (16:06) Applying standardized patterns for auditing, validating and publishing data. (19:28) Capturing historical check results to monitor system health and improvements. (21:29) Treating data quality and availability as separate monitoring concerns. (26:26) Using containerization strategies to streamline pipeline executions. (29:47) Leveraging orchestration platforms for better visibility and retry capability. (31:59) Managing business pressure without sacrificing data quality practices. (35:46) Starting simple with quality checks and evolving toward more complex frameworks. Resources Mentioned: Joseph Machado https://www.linkedin.com/in/josephmachado1991/ Netflix | LinkedIn https://www.linkedin.com/company/netflix/ Netflix | Website https://www.netflix.com/browse Start Data Engineering https://www.startdataengineering.com/ Apache Airflow https://airflow.apache.org/ dbt Labs https://www.getdbt.com/ Great Expectations https://greatexpectations.io/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
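The audit-publish pattern discussed at (16:06) maps naturally onto Airflow's TaskFlow API: build into staging, check, and only then swap into the published table. A minimal sketch; the run_query(...) warehouse calls are hypothetical placeholders left as comments, and table names are illustrative:

```python
# A minimal sketch of the write-audit-publish pattern with the TaskFlow API.
# The run_query(...) calls are hypothetical placeholders for a real
# warehouse client.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.exceptions import AirflowFailException


@dag(start_date=datetime(2024, 1, 1), schedule="@daily", catchup=False)
def ads_metrics():
    @task
    def write_staging(ds=None) -> str:
        # Build the day's output in a staging table, never in place.
        staging_table = f"staging.ads_metrics_{ds.replace('-', '_')}"
        # run_query(f"CREATE TABLE {staging_table} AS SELECT ...")
        return staging_table

    @task
    def audit(staging_table: str) -> str:
        # In-band checks: fail the run before bad data can be published.
        row_count = 1  # run_query(f"SELECT COUNT(*) FROM {staging_table}")
        null_keys = 0  # run_query(f"SELECT COUNT(*) ... WHERE ad_id IS NULL")
        if row_count == 0 or null_keys > 0:
            raise AirflowFailException(f"Audit failed for {staging_table}")
        return staging_table

    @task
    def publish(staging_table: str) -> None:
        # Swap the audited table in, so consumers never see unchecked data.
        # run_query(f"ALTER TABLE prod.ads_metrics EXCHANGE WITH {staging_table}")
        pass

    publish(audit(write_staging()))


ads_metrics()
```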
Creating consistency across data pipelines is critical for scaling engineering teams and ensuring long-term maintainability. In this episode, Snir Israeli , Senior Data Engineer at Next Insurance , shares how enforcing coding standards and investing in developer experience transformed their approach to data engineering. He explains how implementing automated code checks, clear documentation practices and a scoring system helped drive alignment across teams, improve collaboration and reduce technical debt in a fast-growing data environment. Key Takeaways: (02:59) Inconsistencies in code style create challenges for collaboration and maintenance. (04:22) Programmatically enforcing rules helps teams scale their best practices. (08:55) Performance improvements in data pipelines lead to infrastructure cost savings. (13:22) Developer experience is essential for driving adoption of internal tools. (19:44) Dashboards can operationalize standards enforcement and track progress over time. (22:49) Standardization accelerates onboarding and reduces friction in code reviews. (25:39) Linting rules require ongoing maintenance as tools and platforms evolve. (27:47) Starting small and involving the team leads to better adoption and long-term success. Resources Mentioned: Snir Israeli https://www.linkedin.com/in/snir-israeli/ Next Insurance | LinkedIn https://www.linkedin.com/company/nextinsurance/ Next Insurance | Website https://www.nextinsurance.com/ Apache Airflow https://airflow.apache.org/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
Airflow’s adaptability is driving Tekmetric’s ability to unify complex data workflows, deliver accurate insights and support both internal operations and customer-facing services — all within a rapidly growing startup environment. In this episode, Ipsa Trivedi , Lead Data Engineer at Tekmetric , shares how her team is standardizing pipelines while supporting unique customer needs. She explains how Airflow enables end-to-end data services, simplifies orchestration across varied sources and supports scalable customization. Ipsa also highlights early wins with Airflow, its intuitive UI and the team's roadmap toward data quality, observability and a future self-serve data platform. Key Takeaways: (02:26) Powering auto shops nationwide with a unified platform. (05:17) A new data team was formed to centralize and scale insights. (07:23) Flexible, open source and made to fit — Airflow wins. (10:42) Pipelines handle anything from email to AWS. (12:15) Custom DAGs fit every team’s unique needs. (17:01) Data quality checks are built into the plan. (18:17) Self-serve data mesh is the end goal. (19:59) Airflow now fits so well, there's nothing left on the wishlist. Resources Mentioned: Ipsa Trivedi https://www.linkedin.com/in/ipsatrivedi/ Tekmetric | LinkedIn https://www.linkedin.com/company/tekmetric/ Tekmetric | Website https://www.tekmetric.com/ Apache Airflow https://airflow.apache.org/ AWS RDS https://aws.amazon.com/free/database/ Astro by Astronomer https://www.astronomer.io/product/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
The Airflow 3.0 release marks a significant leap forward in modern data orchestration, introducing architectural upgrades that improve scalability, flexibility and long-term maintainability. In this episode, we welcome Vikram Koka , Chief Strategy Officer at Astronomer , and Jed Cunningham , Principal Software Engineer at Astronomer , to discuss the architectural foundations, new features and future implications of this milestone release. They unpack the rationale behind DAG versioning and task execution interface, explain how Airflow now integrates more seamlessly within broader data ecosystems and share how these changes lay the groundwork for multi-cloud deployments, language-agnostic workflows and stronger enterprise security. Key Takeaways: (02:28) Modern orchestration demands new infrastructure approaches. (05:02) Removing legacy components strengthens system stability. (06:26) Major releases provide the opportunity to reduce technical debt. (08:31) Frontend and API modernization enable long-term adaptability. (09:36) Event-based triggers expand integration possibilities. (11:54) Version control improves visibility and execution reliability. (14:57) Centralized access to workflow definitions increases flexibility. (21:49) Decoupled architecture supports distributed and secure deployments. (26:17) Community collaboration is essential for sustainable growth. Resources Mentioned: Astronomer Website https://www.astronomer.io Apache Airflow https://airflow.apache.org/ Git Bundle https://git-scm.com/book/en/v2/Git-Tools-Bundling FastAPI https://fastapi.tiangolo.com/ React https://react.dev/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “ The Data Flowcast: Mastering Apache Airflow® for Data Engineering and AI .” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
The evolution of data orchestration at Instacart highlights the journey from fragmented systems to robust, standardized infrastructure. This transformation has enabled scalability, reliability and democratization of tools for diverse user personas. In this episode, we’re joined by Anant Agarwal , Software Engineer at Instacart , who shares insights into Instacart's Airflow journey, from its early adoption in 2019 to the present-day centralized cluster approach. Anant discusses the challenges of managing disparate clusters, the implementation of remote executors, and the strategic standardization of infrastructure and DAG patterns to streamline workflows. Key Takeaways: (03:49) The impact of external events on business growth and technological evolution. (04:31) Challenges of managing decentralized systems across multiple teams. (06:14) The importance of standardizing infrastructure and processes for scalability. (09:51) Strategies for implementing efficient and repeatable deployment practices. (12:17) Addressing diverse user personas with tailored solutions. (14:47) Leveraging remote execution to enhance flexibility and scalability. (18:36) Benefits of transitioning to a centralized system for organization-wide use. (20:57) Maintaining an upgrade cadence to stay aligned with the latest advancements. (23:35) Anticipation for new features and improvements in upcoming software versions. Resources Mentioned: Anant Agarwal https://www.linkedin.com/in/anantag/ Instacart | LinkedIn https://www.linkedin.com/company/instacart/ Instacart | Website https://www.instacart.com Apache Airflow https://airflow.apache.org/ Amazon ECS https://aws.amazon.com/ecs/ Terraform https://www.terraform.io/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

From ETL to Airflow: Transforming Data Engineering at Deloitte Digital with Raviteja Tholupunoori 27:42
Data orchestration at scale presents unique challenges, especially when aiming for flexibility and efficiency across cloud environments. Choosing the right tools and frameworks can make all the difference. In this episode, Raviteja Tholupunoori, Senior Engineer at Deloitte Digital , joins us to explore how Airflow enhances orchestration, scalability and cost efficiency in enterprise data workflows. Key Takeaways: (01:45) Early challenges in data orchestration before implementing Airflow. (02:42) Comparing Airflow with ETL tools like Talend and why flexibility matters. (04:24) The role of Airflow in enabling cloud-agnostic data processing. (05:45) Key lessons from managing dynamic DAGs at scale. (13:15) How hybrid executors improve performance and efficiency. (14:13) Best practices for testing and monitoring workflows with Airflow. (15:13) The importance of mocking mechanisms when testing DAGs. (17:57) How Prometheus, Grafana and Loki support Airflow monitoring. (22:03) Cost considerations when running Airflow on self-managed infrastructure. (23:14) Airflow’s latest features, including hybrid executors and dark mode. Resources Mentioned: Raviteja Tholupunoori https://www.linkedin.com/in/raviteja0096/?originalSubdomain=in Deloitte Digital https://www.linkedin.com/company/deloitte-digital/ Apache Airflow https://airflow.apache.org/ Grafana https://grafana.com/solutions/apache-airflow/monitor/ Astronomer Presents: Exploring Apache Airflow® 3 Roadshows https://www.astronomer.io/events/roadshow/ https://www.astronomer.io/events/roadshow/london/ https://www.astronomer.io/events/roadshow/new-york/ https://www.astronomer.io/events/roadshow/sydney/ https://www.astronomer.io/events/roadshow/san-francisco/ https://www.astronomer.io/events/roadshow/chicago/ Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
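The testing and mocking practices mentioned at (14:13) and (15:13) usually start with a DagBag import test in CI, which catches broken DAG files before the scheduler ever sees them. A minimal sketch; the dags/ path and the owner convention are illustrative:

```python
# A minimal sketch of CI checks for a DAG repository; the dags/ path and
# the owner convention are illustrative.
import pytest
from airflow.models import DagBag
from airflow.utils.dag_cycle_tester import check_cycle


@pytest.fixture(scope="session")
def dag_bag() -> DagBag:
    # Parse the repository's DAG folder exactly as the scheduler would.
    return DagBag(dag_folder="dags/", include_examples=False)


def test_no_import_errors(dag_bag):
    # Any DAG file that raises at parse time shows up here.
    assert dag_bag.import_errors == {}


def test_dag_hygiene(dag_bag):
    for dag_id, dag in dag_bag.dags.items():
        check_cycle(dag)  # raises if task dependencies form a loop
        # Team convention (illustrative): every DAG declares an owner.
        assert dag.default_args.get("owner"), f"{dag_id} has no owner"
```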

How Uber Manages 1 Million Daily Tasks Using Airflow, with Shobhit Shah and Sumit Maheshwari 28:44
When data orchestration reaches Uber’s scale, innovation becomes a necessity, not a luxury. In this episode, we discuss the innovations behind Uber’s unique Airflow setup. With our guests Shobhit Shah and Sumit Maheshwari , both Staff Software Engineers at Uber , we explore how their team manages one of the largest data workflow systems in the world. Shobhit and Sumit walk us through the evolution of Uber’s Airflow implementation, detailing the custom solutions that support 200,000 daily pipelines. They discuss Uber's approach to tackling complex challenges in data orchestration, disaster recovery and scaling to meet the company’s extensive data needs. Key Takeaways: (02:03) Airflow as a service streamlines Uber’s data workflows. (06:16) Serialization boosts security and reduces errors. (10:05) Java-based scheduler improves system reliability. (13:40) Custom recovery model supports emergency pipeline switching. (15:58) No-code UI allows easy pipeline creation for non-coders. (18:12) Backfill feature enables historical data processing. (22:06) Regular updates keep Uber aligned with Airflow advancements. (26:07) Plans to leverage Airflow’s latest features. Resources Mentioned: Shobhit Shah - https://www.linkedin.com/in/shahshobhit/ Sumit Maheshwari - https://www.linkedin.com/in/maheshwarisumit/ Uber - https://www.linkedin.com/company/uber-com/ Apache Airflow - https://airflow.apache.org/ Airflow Summit - https://airflowsummit.org/ Uber - https://www.uber.com/tw/en/ Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24 Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

Efficient data orchestration is the backbone of modern analytics and AI-driven workflows. Without the right tools, even the best data can fall short of its potential. In this episode, Andrea Bombino , Co-Founder and Head of Analytics Engineering at Astrafy , shares insights into his team’s approach to optimizing data transformation and orchestration using tools like datasets and Pub/Sub to drive real-time processing. Andrea explains how they leverage Apache Airflow and Google Cloud to power dynamic data workflows. Key Takeaways: (01:55) Astrafy helps companies manage data using Google Cloud. (04:36) Airflow is central to Astrafy’s data engineering efforts. (07:17) Datasets and Pub/Sub are used for real-time workflows. (09:59) Pub/Sub links multiple Airflow environments. (12:40) Datasets eliminate the need for constant monitoring. (15:22) Airflow updates have improved large-scale data operations. (18:03) New Airflow API features make dataset updates easier. (20:45) Real-time orchestration speeds up data processing for clients. (23:26) Pub/Sub enhances flexibility across cloud environments. (26:08) Future Airflow features will offer more control over data workflows. Resources Mentioned: Andrea Bombino - https://www.linkedin.com/in/andrea-bombino/ Astrafy - https://www.linkedin.com/company/astrafy/ Apache Airflow - https://airflow.apache.org/ Google Cloud - https://cloud.google.com/ dbt - https://www.getdbt.com/ Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24 Thanks for listening to “The Data Flowcast: Mastering Airflow for Data Engineering & AI.” If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
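The dataset-driven orchestration Andrea describes is native to Airflow 2.4+: a producer task declares an outlet, and a consumer DAG schedules on it instead of polling. (Bridging separate Airflow environments over Pub/Sub, as discussed in the episode, is layered on top and not shown.) A minimal sketch; the dataset URI and task bodies are illustrative:

```python
# A minimal sketch of dataset-aware scheduling (Airflow 2.4+). The dataset
# URI and task bodies are illustrative.
from datetime import datetime

from airflow.datasets import Dataset
from airflow.decorators import dag, task

orders = Dataset("bigquery://analytics/raw_orders")


@dag(start_date=datetime(2024, 1, 1), schedule="@hourly", catchup=False)
def ingest_orders():
    @task(outlets=[orders])
    def load():
        # When this task succeeds, Airflow marks the dataset as updated.
        ...

    load()


# No time-based schedule and no sensor: this DAG runs whenever the
# `orders` dataset is updated by any producer.
@dag(start_date=datetime(2024, 1, 1), schedule=[orders], catchup=False)
def transform_orders():
    @task
    def build_models():
        ...

    build_models()


ingest_orders()
transform_orders()
```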

Data orchestration is evolving faster than ever and Apache Airflow 3 is set to revolutionize how enterprises handle complex workflows. In this episode, we dive into the exciting advancements with Vikram Koka , Chief Strategy Officer at Astronomer and PMC Member at The Apache Software Foundation . Vikram shares his insights on the evolution of Airflow and its pivotal role in shaping modern data-driven workflows, particularly with the upcoming release of Airflow 3. Key Takeaways: (02:36) Vikram leads Astronomer’s engineering and open-source teams for Airflow. (05:26) Airflow enables reliable data ingestion and curation. (08:17) Enterprises use Airflow for mission-critical data pipelines. (11:08) Airflow 3 introduces major architectural updates. (13:58) Multi-cloud and edge deployments are supported in Airflow 3. (16:49) Event-driven scheduling makes Airflow more dynamic. (19:40) Tasks in Airflow 3 can run in any language. (22:30) Multilingual task support is crucial for enterprises. (25:21) Data assets and event-based integration enhance orchestration. (28:12) Community feedback plays a vital role in Airflow 3. Resources Mentioned: Vikram Koka - https://www.linkedin.com/in/vikramkoka/ Astronomer - https://www.linkedin.com/company/astronomer/ The Apache Software Foundation LinkedIn - https://www.linkedin.com/company/the-apache-software-foundation/ Apache Airflow LinkedIn - https://www.linkedin.com/company/apache-airflow/ Apache Airflow - https://airflow.apache.org/ Astronomer - https://www.astronomer.io/ The Apache Software Foundation - https://www.apache.org/ Join the Airflow slack and/or Dev list - https://airflow.apache.org/community/ Apache Airflow Survey - https://astronomer.typeform.com/airflowsurvey24 Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

Data and AI are revolutionizing HR, empowering leaders to measure performance and drive strategic decisions like never before. In this episode, we explore the transformation of HR technology with Guy Dassa , Chief Technology Officer at 15Five , as he shares insights into their evolving data platform. Guy discusses how 15Five equips HR leaders with tools to measure and take action on team performance, engagement and retention. He explains their data-driven approach, highlighting how Apache Airflow supports their data ingestion, transformation, and AI integration. Key Takeaways: (01:54) 15Five acts as a command center for HR leaders. (03:40) Tools like performance reviews, engagement surveys, and an insights dashboard guide actionable HR steps. (05:33) Data visualization, insights, and action recommendations enhance HR effectiveness to improve their people's outcomes. (07:08) Strict data confidentiality and sanitized AI model training. (09:21) Airflow is central to data transformation and enrichment. (11:15) Airflow enrichment DAGs integrate AI models. (13:33) Integration of Airflow and DBT enables efficient data transformation. (15:28) Synchronization challenges arise with reverse ETL processes. (17:10) Future plans include deeper Airflow integration with AI. (19:31) Emphasizing the need for DAG versioning and improved dependency visibility. Resources Mentioned: Guy Dassa - https://www.linkedin.com/in/guydassa/ 15Five - https://www.linkedin.com/company/15five/ Apache Airflow - https://airflow.apache.org/ MLflow - https://mlflow.org/ DBT - https://www.getdbt.com/ Kubernetes - https://kubernetes.io/ RedShift - https://aws.amazon.com/redshift/ 15Five - https://www.15five.com/ Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

Unlocking engineering productivity goes beyond coding — it’s about managing knowledge efficiently. In this episode, we explore the innovative ways in which Dosu leverages Airflow for data orchestration and supports the Airflow project. Devin Stein , Founder of Dosu , shares his insights on how engineering teams can focus on value-added work by automating knowledge management. Devin dives into Dosu’s purpose, the significance of AI in their product, and why they chose Airflow as the backbone for scheduling and data management. Key Takeaways: (01:33) Dosu's mission to democratize engineering knowledge. (05:00) AI is central to Dosu's product for structuring engineering knowledge. (06:23) The importance of maintaining up-to-date data for AI effectiveness. (07:55) How Airflow supports Dosu’s data ingestion and automation processes. (08:45) The reasoning behind choosing Airflow over other orchestrators. (11:00) Airflow enables Dosu to manage both traditional ETL and dynamic workflows. (13:04) Dosu assists the Airflow project by auto-labeling issues and discussions. (14:56) Thoughtful collaboration with the Airflow community to introduce AI tools. (16:37) The potential of Airflow to handle more dynamic, scheduled workflows in the future. (18:00) Challenges and custom solutions for implementing dynamic workflows in Airflow. Resources Mentioned: Apache Airflow - https://airflow.apache.org/ Dosu Website - https://dosu.dev/ Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

Harnessing data at scale is the key to driving innovation in autonomous vehicle technology. In this episode, we uncover how advanced orchestration tools are transforming machine learning operations in the automotive industry. Serjesh Sharma, Supervisor ADAS Machine Learning Operations (MLOps) at Ford Motor Company, joins us to discuss the challenges and innovations his team faces working to enhance vehicle safety and automation. Serjesh shares insights into the intricate data processes that support Ford’s Advanced Driver Assistance Systems (ADAS) and how his team leverages Apache Airflow to manage massive data loads efficiently. Key Takeaways: (01:44) ADAS involves advanced features like pre-collision assist and self-driving capabilities. (04:47) Ensuring sensor accuracy and vehicle safety requires extensive data processing. (05:08) The combination of on-prem and cloud infrastructure optimizes data handling. (09:27) Ford processes around one petabyte of data per week, using both CPUs and GPUs. (10:33) Implementing software engineering best practices to improve scalability and reliability. (15:18) GitHub Issues streamline onboarding and infrastructure provisioning. (17:00) Airflow's modular design allows Ford to manage complex data pipelines. (19:00) Kubernetes pod operators help optimize resource usage for CPU-intensive tasks. (20:35) Ford's scale challenges led to customized Airflow configurations for high concurrency. (21:02) Advanced orchestration tools are pivotal in managing vast data landscapes in automotive innovation. Resources Mentioned: Serjesh Sharma - www.linkedin.com/in/serjeshsharma/ Ford Motor Company - www.linkedin.com/company/ford-motor-company/ Apache Airflow - airflow.apache.org/ Kubernetes - kubernetes.io/ Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
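The pod-level resource tuning described at (19:00) is typically expressed with KubernetesPodOperator. A minimal sketch; the image, namespace and resource figures are illustrative, and the import path is for newer versions of the cncf.kubernetes provider (older releases import from operators.kubernetes_pod instead):

```python
# A minimal sketch of isolating a CPU-heavy step in its own pod with
# KubernetesPodOperator. Image, namespace and resource figures are
# illustrative; older cncf.kubernetes providers import the operator from
# airflow.providers.cncf.kubernetes.operators.kubernetes_pod instead.
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator
from kubernetes.client import models as k8s

with DAG(
    dag_id="adas_sensor_preprocess",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    decode_logs = KubernetesPodOperator(
        task_id="decode_sensor_logs",
        name="decode-sensor-logs",
        namespace="data-pipelines",
        image="registry.example.com/adas/decoder:latest",
        cmds=["python", "decode.py", "--date", "{{ ds }}"],
        # Explicit requests/limits let Kubernetes bin-pack CPU-bound work
        # without starving neighbouring pods.
        container_resources=k8s.V1ResourceRequirements(
            requests={"cpu": "4", "memory": "8Gi"},
            limits={"cpu": "8", "memory": "16Gi"},
        ),
        get_logs=True,
    )
```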

Data failures are inevitable but how you manage them can define the success of your operations. In this episode, we dive deep into the challenges of data engineering and AI with Brendan Frick, Senior Engineering Manager, Data at GumGum. Brendan shares his unique approach to managing task failures and DAG issues in a high-stakes ad-tech environment. Brendan discusses how GumGum leverages Apache Airflow to streamline data processes, ensuring efficient data movement and orchestration while minimizing disruptions in their operations. Key Takeaways: (02:02) Brendan’s role at GumGum and its approach to ad tech. (04:27) How GumGum uses Airflow for daily data orchestration, moving data from S3 to warehouses. (07:02) Handling task failures in Airflow using Jira for actionable, developer-friendly responses. (09:13) Transitioning from email alerts to a more structured system with Jira and PagerDuty. (11:40) Monitoring task retry rates as a key metric to identify potential issues early. (14:15) Utilizing Looker dashboards to track and analyze task performance and retry rates. (16:39) Transitioning from Kubernetes operator to a more reliable system for data processing. (19:25) The importance of automating stakeholder communication with data lineage tools like Atlan. (20:48) Implementing data contracts to ensure SLAs are met across all data processes. (22:01) The role of scalable SLAs in Airflow to ensure data reliability and meet business needs. Resources Mentioned: Brendan Frick - https://www.linkedin.com/in/brendan-frick-399345107/ GumGum - https://www.linkedin.com/company/gumgum/ Apache Airflow - https://airflow.apache.org/ Jira - https://www.atlassian.com/software/jira Atlan - https://atlan.com/ Kubernetes - https://kubernetes.io/ Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…
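The actionable failure handling described at (07:02) hangs off Airflow's on_failure_callback hook, which fires once a task has exhausted its retries. A minimal sketch; file_jira_issue is a hypothetical stand-in for a real Jira client call, and the DAG and command are illustrative:

```python
# A minimal sketch of routing task failures to an issue tracker with
# on_failure_callback. file_jira_issue is a hypothetical stand-in for a
# real Jira client call.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator


def file_jira_issue(context):
    """Turn a failed task instance into an actionable ticket."""
    ti = context["task_instance"]
    summary = f"[Airflow] {ti.dag_id}.{ti.task_id} failed on {context['ds']}"
    body = f"Log: {ti.log_url}\nTry number: {ti.try_number}"
    # jira.create_issue(project="DATA", summary=summary, description=body)
    print(summary, body)  # placeholder for the real client call


with DAG(
    dag_id="s3_to_warehouse",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    # The callback fires only after all retries are exhausted.
    default_args={"retries": 2, "on_failure_callback": file_jira_issue},
) as dag:
    BashOperator(task_id="copy_s3_partition", bash_command="echo copy step here")
```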

From Sensors to Datasets: Enhancing Airflow at Astronomer with Maggie Stark and Marion Azoulai 22:25
A 13% reduction in failure rates — this is how two data scientists at Astronomer revolutionized their data pipelines using Apache Airflow. In this episode, we enter the world of data orchestration and AI with Maggie Stark and Marion Azoulai, both Senior Data Scientists at Astronomer. Maggie and Marion discuss how their team re-architected their use of Airflow to improve scalability, reliability and efficiency in data processing. They share insights on overcoming challenges with sensors and how moving to datasets transformed their workflows. Key Takeaways: (02:23) The data team’s role as a centralized hub within Astronomer. (05:11) Airflow is the backbone of all data processes, running 60,000 tasks daily. (07:13) Custom task groups enable efficient code reuse and adherence to best practices. (11:33) Sensor-heavy architectures can lead to cascading failures and resource issues. (12:09) Switching to datasets has improved reliability and scalability. (14:19) Building a control DAG provides end-to-end visibility of pipelines. (16:42) Breaking down DAGs into smaller units minimizes failures and improves management. (19:02) Failure rates improved from 16% to 3% with the new architecture. Resources Mentioned: Maggie Stark - https://www.linkedin.com/in/margaretstark/ Marion Azoulai - https://www.linkedin.com/in/marionazoulai/ Astronomer | LinkedIn - https://www.linkedin.com/company/astronomer/ Apache Airflow - https://airflow.apache.org/ Astronomer | Website - https://www.astronomer.io/ Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

Mastering the flow of data is essential for driving innovation and efficiency in today’s competitive landscape. In this episode, we explore the evolution of data orchestration and the pivotal role of Apache Airflow in modern data workflows. Ben Tallman, Chief Technology Officer at M Science, joins us and shares his extensive experience with Airflow, detailing its early adoption, evolution and the profound impact it has had on data engineering practices. His insights reveal how leveraging Airflow can streamline complex data processes, enhance observability and ultimately drive business success. Key Takeaways: (02:31) Benjamin’s journey with Airflow and its early adoption. (05:36) The transition from legacy schedulers to Airflow at Apigee and later Google. (08:52) The challenges and benefits of running production-grade Airflow instances. (10:46) How Airflow facilitates the management of large-scale data at M Science. (11:56) The importance of reducing time to value for customers using data products. (13:32) Airflow’s role in ensuring observability and reliability in data workflows. (17:00) Managing petabytes of data and billions of records efficiently. (19:08) Integration of various data sources and ensuring data product quality. (20:04) Leveraging Airflow for data observability and reducing time to value. (22:04) Benjamin’s vision for the future development of Airflow, including audit trails for variables. Resources Mentioned: Ben Tallman - https://www.linkedin.com/in/btallman/ M Science - https://www.linkedin.com/company/m-science-llc/ Apache Airflow - https://airflow.apache.org/ Astronomer - https://www.astronomer.io/ Databricks - https://databricks.com/ Snowflake - https://www.snowflake.com/ Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations. #AI #Automation #Airflow #MachineLearning…

Welcome to The Data Flowcast: Mastering Airflow for Data Engineering & AI, the podcast where we keep you up to date with insights and ideas propelling the Airflow community forward. Join us each week as we explore the current state, future and potential of Airflow with leading thinkers in the community, and discover how best to leverage this workflow management system to meet the ever-evolving needs of data engineering and AI ecosystems.

#AI #Automation #Airflow #MachineLearning
The Data Flowcast: Mastering Apache Airflow ® for Data Engineering and AI
Data orchestration is revolutionizing the way companies manage and process data. In this episode, we explore the critical role of data orchestration in modern data workflows and how Apache Airflow is used to enhance data processing and AI model deployment. Hannan Kravitz, Data Engineering Team Leader at Artlist, joins us to share his insights on leveraging Airflow for data engineering and its impact on their business operations.

Key Takeaways:
(01:00) Hannan introduces Artlist and its mission to empower content creators.
(04:27) The importance of collecting and modeling data to support business insights.
(06:40) Using Airflow to connect multiple data sources and create dashboards.
(09:40) Implementing a monitoring DAG for proactive alerts within Airflow.
(12:31) Customizing Airflow for business metric KPI monitoring and setting thresholds.
(15:00) Addressing decreases in purchases due to technical issues with proactive alerts.
(17:45) Customizing data quality checks with dynamic task mapping in Airflow.
(20:00) Desired improvements in Airflow UI and logging capabilities.
(21:00) Enabling business stakeholders to change thresholds using Streamlit.
(22:26) Future improvements desired in the Airflow project.

Resources Mentioned:
Hannan Kravitz - https://www.linkedin.com/in/hannan-kravitz-60563112/
Artlist - https://www.linkedin.com/company/art-list/
Apache Airflow - https://airflow.apache.org/
Snowflake - https://www.snowflake.com/
Streamlit - https://streamlit.io/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
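The KPI monitoring and dynamic task mapping ideas from this episode can be combined in a few lines of Airflow. In this sketch the thresholds, metric names and warehouse query are all invented placeholders; the episode’s real setup reads thresholds that business stakeholders edit via Streamlit.

import pendulum
from airflow.decorators import dag, task
from airflow.exceptions import AirflowFailException

# Hypothetical KPI thresholds; in the episode these live where stakeholders can tune them.
THRESHOLDS = {"purchases": 1000, "signups": 500}

def fetch_metric(name):
    # Stand-in for a warehouse query (e.g. against Snowflake) returning the latest KPI value.
    return 1200

@dag(schedule="@hourly", start_date=pendulum.datetime(2024, 1, 1), catchup=False)
def kpi_monitoring():
    @task
    def metric_names():
        return list(THRESHOLDS)

    @task
    def check_metric(name: str):
        observed = fetch_metric(name)
        if observed < THRESHOLDS[name]:
            # Fail without retries so an alerting callback fires immediately.
            raise AirflowFailException(f"{name}={observed} below {THRESHOLDS[name]}")

    # One mapped task instance per KPI, expanded at run time.
    check_metric.expand(name=metric_names())

kpi_monitoring()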
The Data Flowcast: Mastering Apache Airflow ® for Data Engineering and AI
Data engineering is constantly evolving, and staying ahead means mastering tools like Apache Airflow. In this episode, we explore the world of data engineering with Alexandre Magno Lima Martins, Senior Data Engineer at Teya. Alexandre talks about optimizing data workflows and the smart solutions they've created at Teya to make data processing easier and more efficient.

Key Takeaways:
(02:01) Alexandre explains his role at Teya and the responsibilities of a data platform engineer.
(02:40) The primary use cases of Airflow at Teya, especially with dbt and machine learning projects.
(04:14) How Teya creates self-service DAGs for dbt models.
(05:58) Automating DAG creation with CI/CD pipelines.
(09:04) Switching to a multi-file method for better Airflow performance.
(12:48) Challenges faced with Kubernetes Executor vs. Celery Executor.
(16:13) Using Celery Executor to handle fast tasks efficiently.
(17:02) Implementing KEDA autoscaler for better scaling of Celery workers.
(19:05) Reasons for not using Cosmos for DAG generation and cross-DAG dependencies.
(21:16) Alexandre's wish list for future Airflow features, focusing on multi-tenancy.

Resources Mentioned:
Alexandre Magno Lima Martins - https://www.linkedin.com/in/alex-magno/
Teya - https://www.linkedin.com/company/teya-global/
Apache Airflow - https://airflow.apache.org/
dbt - https://www.getdbt.com/
Kubernetes - https://kubernetes.io/
KEDA - https://keda.sh/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
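The multi-file approach Alexandre describes can be sketched as a CI step that renders one DAG file per pipeline config, so the scheduler parses many small files instead of one file that builds every DAG at parse time. The config schema, paths and template below are assumptions for illustration, not Teya’s code.

from pathlib import Path

import yaml  # assumes PyYAML is available in the CI image

TEMPLATE = '''
import pendulum
from airflow.decorators import dag, task

@dag(schedule="{schedule}", start_date=pendulum.datetime(2024, 1, 1),
     catchup=False, tags=["generated"])
def {dag_id}():
    @task
    def run_model():
        print("dbt run --select {model}")  # placeholder; a real task would invoke dbt
    run_model()

{dag_id}()
'''

def render_dags(config_dir="configs", out_dir="dags/generated"):
    # Render one standalone DAG file per YAML config found in the repo.
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    for cfg_path in Path(config_dir).glob("*.yaml"):
        cfg = yaml.safe_load(cfg_path.read_text())
        code = TEMPLATE.format(dag_id=cfg["dag_id"], schedule=cfg["schedule"],
                               model=cfg["model"])
        (Path(out_dir) / f'{cfg["dag_id"]}.py').write_text(code)

if __name__ == "__main__":
    render_dags()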
The Data Flowcast: Mastering Apache Airflow ® for Data Engineering and AI
Managing data workflows well can change the game for any company. In this episode, we talk about how Airflow makes this possible. Larry Komenda, Chief Technology Officer at Campbell, shares how Airflow supports their operations and improves efficiency. Larry discusses his role at Campbell, their switch to Airflow, and its impact. We look at their strategies for testing and maintaining reliable workflows and how these help their business.

Key Takeaways:
(02:26) Strong technology and data systems are crucial for Campbell’s investment process.
(05:03) Airflow manages data pipelines efficiently in the market data team.
(07:39) Airflow supports various departments, including trading and operations.
(09:22) Machine learning models run on dedicated Airflow instances.
(11:12) Reliable workflows are ensured through thorough testing and development.
(13:45) Business tasks are organized separately from Airflow for easier testing.
(15:30) Non-technical teams have access to Airflow for better efficiency.
(17:20) Thorough testing before deploying to Airflow is essential.
(19:10) Non-technical users can interact with Airflow DAGs to solve their issues.
(21:55) Airflow improves efficiency and reliability in trading and operations.
(24:40) Enhancing the Airflow UI for non-technical users is important for accessibility.

Resources Mentioned:
Larry Komenda - https://www.linkedin.com/in/larrykomenda/
Campbell - https://www.linkedin.com/company/campbell-and-company/
30% off Airflow Summit Ticket - https://ti.to/airflowsummit/2024/discount/30DISC_ASTRONOMER
Apache Airflow - https://airflow.apache.org/
NumPy - https://numpy.org/
Python - https://www.python.org/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
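Larry’s point about keeping business tasks separate from Airflow is usually implemented by writing pure Python functions with no Airflow imports and wrapping them in thin tasks. A minimal sketch, with an invented toy calculation standing in for Campbell’s real logic:

# business_logic.py: pure Python, no Airflow imports, unit-testable with plain pytest.
def compute_exposure(prices, weights):
    """Toy stand-in for a real calculation kept outside Airflow."""
    return sum(p * w for p, w in zip(prices, weights))

# dags/exposures.py: a thin wrapper that only handles orchestration.
import pendulum
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=pendulum.datetime(2024, 1, 1), catchup=False)
def exposures():
    @task
    def run():
        # In a real repo this would import from the Airflow-free business package.
        print(compute_exposure([100.0, 50.0], [0.6, 0.4]))

    run()

exposures()

The pure function gets tested directly, while DAG-level tests only need to confirm wiring and scheduling, which is what makes thorough pre-deployment testing practical.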
The Data Flowcast: Mastering Apache Airflow ® for Data Engineering and AI
How Laurel Uses Airflow To Enhance Machine Learning Pipelines with Vincent La and Jim Howard (23:58)
The world of timekeeping for knowledge workers is transforming through the use of AI and machine learning. Understanding how to leverage these technologies is crucial for improving efficiency and productivity. In this episode, we’re joined by Vincent La, Principal Data Scientist at Laurel, and Jim Howard, Principal Machine Learning Engineer at Laurel, to explore the implementation of AI in automating timekeeping and its impact on legal and accounting firms.

Key Takeaways:
(01:54) Laurel's mission in time automation.
(03:39) Solving clustering, prediction and summarization with AI.
(06:30) Daily batch jobs for user time generation.
(08:22) Knowledge workers touch 300 items daily.
(09:01) Mapping 300 activities to seven billable items.
(11:38) Retraining models for better performance.
(14:00) Using Airflow for retraining and backfills.
(17:06) RAG-based summarization for user-specific tone.
(18:58) Testing Airflow DAGs for cost-effective summarization.
(22:00) Enhancing Airflow for long-running DAGs.

Resources Mentioned:
Vincent La - https://www.linkedin.com/in/vincentla/
Jim Howard - https://www.linkedin.com/in/jameswhowardml/
Laurel - https://www.linkedin.com/company/laurel-ai/
Apache Airflow - https://airflow.apache.org/
Ernst & Young - https://www.ey.com/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
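Vincent and Jim’s daily batch jobs and backfills rest on Airflow’s logical-date model. A minimal sketch, with an invented DAG name and a print standing in for real retraining; catchup=True is what lets past intervals be re-run when models change:

import pendulum
from airflow.decorators import dag, task

@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1),
    catchup=True,       # enables scheduler catchup and `airflow dags backfill` over past days
    max_active_runs=1,  # keeps backfills from training many models concurrently
)
def retrain_timekeeping_model():
    @task
    def retrain(data_interval_start=None):
        # TaskFlow injects the logical interval, which scopes that day's activity data.
        print(f"retraining on activity from {data_interval_start}")

    retrain()

retrain_timekeeping_model()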
The Data Flowcast: Mastering Apache Airflow ® for Data Engineering and AI
Discover the cutting-edge methods Vibrant Planet uses to revolutionize geospatial data processing and resource management. In this episode, we delve into the intricacies of scaling geospatial data processing and resource allocation with experts from Vibrant Planet. Joining us are Cyrus Dukart, Engineering Lead, and David Sacerdote, Staff Software Engineer, who share their innovative approaches to handling large datasets and optimizing resource use in Airflow.

Key Takeaways:
(00:00) Inefficiencies in resource allocation.
(03:00) Scientific validity of sharded results.
(05:53) Tech-based solutions for resource management.
(06:11) Retry callback process for resource allocation.
(08:00) Running database queries for resource needs.
(10:05) Importance of remembering resource usage.
(13:51) Generating resource predictions.
(14:44) Custom task decorator for resource management.
(20:28) Massive resource usage gap in sharded data.
(21:14) Fail-fast model for long-running tasks.

Resources Mentioned:
Cyrus Dukart - https://www.linkedin.com/in/cyrus-dukart-6561482/
David Sacerdote - https://www.linkedin.com/in/davidsacerdote/
Vibrant Planet - https://www.linkedin.com/company/vibrant-planet/
Apache Airflow - https://airflow.apache.org/
Kubernetes - https://kubernetes.io/
Vibrant Planet - https://vibrantplanet.net/

Thanks for listening to The Data Flowcast: Mastering Airflow for Data Engineering & AI. If you enjoyed this episode, please leave a 5-star review to help get the word out about the show. And be sure to subscribe so you never miss any of the insightful conversations.

#AI #Automation #Airflow #MachineLearning
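The retry-callback idea for resource allocation can be roughed out as below. This is a simplified illustration of the general pattern, with an invented Variable key, budget formula and shard list; per the episode, the real system also records observed usage and feeds predictions into the resources each task requests. A production version would also key hints per shard rather than per task id.

import pendulum
from airflow.decorators import dag, task
from airflow.models import Variable
from airflow.operators.python import get_current_context

def remember_resource_hint(context):
    # Retry callback: persist how many attempts this work needed so far,
    # so a later attempt (or run) can budget more resources up front.
    ti = context["ti"]
    Variable.set(f"resource_hint_{ti.task_id}", str(ti.try_number))

@dag(schedule=None, start_date=pendulum.datetime(2024, 1, 1), catchup=False)
def adaptive_shards():
    @task(retries=2, on_retry_callback=remember_resource_hint)
    def process_shard(shard: str):
        ti = get_current_context()["ti"]
        hint = int(Variable.get(f"resource_hint_{ti.task_id}", default_var="0"))
        # Double the memory budget per recorded failure; fail fast if it is exceeded.
        budget_gib = 2 * (2 ** max(hint, ti.try_number - 1))
        print(f"processing {shard} with a {budget_gib} GiB budget")

    process_shard.expand(shard=["tile-001", "tile-002"])

adaptive_shards()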