Content provided by Cybercrime Magazine. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Cybercrime Magazine or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://fa.player.fm/legal

Security Nudge. Voice Cloning Scam Risks. Sponsored By CybSafe.

1:25
 
Manage episode 426778258 series 2423485
If you got a phone call from someone who sounded exactly like your partner, child, or parent, and they begged you to send money because they were stuck somewhere – would you do it? Of course you would – and you might well become the latest victim of voice cloning scams. Companies have demonstrated new AI tools that can emulate someone’s voice after hearing just three seconds of that individual speaking. Once they have your voiceprint, cybercriminals can use it any way they want – and it’s happening so frequently that the FTC recently held a competition to find the best tools for detecting AI voice clones. If you do get a phone call or voice message asking for money, even if it sounds like a loved one, don’t be fooled. Take a moment to contact that person on a known phone number; text them and ask them to call you; or ask someone else you trust to verify your loved one’s location and status. No matter what the voice on the phone says, you just might find them at home, safe and sound. The 60-second "Security Nudge" is brought to you by CybSafe, developers of the Human Risk Management Platform. Learn more at https://cybsafe.com
4437 episodes


