Content provided by Declarations: The Human Rights Podcast. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Declarations: The Human Rights Podcast or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://fa.player.fm/legal

Deepfakes and Non-Consensual Pornography

37:47
 
Manage episode 324150049 series 1918661

The deepfake detection platform Sensity published a report in 2019 finding that 96% of deepfakes on the internet are pornographic, and that 90% of those depict women. Deepfakes are a modern form of synthetic media created by two competing AIs with the goal of producing hyper-realistic videos, images, and voices. Over the past five years this has raised major concerns about the technology being used to spread mis- and disinformation, carry out fraudulent cybercrimes, tamper with human rights evidence, and, most relevant to this episode, create non-consensual pornography. In this episode, the last of this season of the Declarations podcast, host Maryam Tanwir sat down with panellist Neema Jayasinghe and Henry Ajder, who is not only responsible for the 2019 Sensity report but is also a seasoned expert on deepfakes and synthetic media. He is currently Head of Policy and Partnerships at Metaphysic.AI and co-authored the report ‘Deeptrace: The State of Deepfakes’ while at Sensity. This was the first major report published to map the landscape of deepfakes, and it found that the overwhelming majority are used in pornography.


96 episodes

