Fostering AI Literacy - On Tech Ethics
Manage episode 475873080 series 3440731
This episode discusses the importance of fostering AI literacy in research and higher education.
Our guest today is Sarah Florini, Associate Director and Associate Professor in the Lincoln Center for Applied Ethics at Arizona State University. Sarah's work focuses on technology, social media, technology ethics, digital ethnography, and Black digital culture. Among other things, Sarah is dedicated to fostering critical AI literacy and ethical engagement with AI/ML technologies. She founded the AI and Ethics Workgroup to serve as a catalyst for critical conversations about the role of AI models in higher education.
This episode is co-hosted by Alexa McClellan, MA, Associate Director of Research Foundations at CITI Program.
Additional resources:
- Distributed AI Research Institute: https://www.dair-institute.org/
- Mystery AI Hype Theater 3000: https://www.dair-institute.org/maiht3k/
- Tech Won’t Save Us: https://techwontsave.us/
- CITI Program’s Essentials of Responsible AI course: https://about.citiprogram.org/course/essentials-of-responsible-ai/