

This story was originally published on HackerNoon at: https://hackernoon.com/why-is-it-so-hard-to-learn-basic-facts-about-government-algorithms.
It took six years, from the algorithm’s deployment in 2017 until “Inside the Suspicion Machine” was published, for the public to get a full picture of how it worked.
Check more stories related to society at: https://hackernoon.com/c/society. You can also check exclusive content about #society, #algorithms, #fraud-detection-algorithm, #journalism, #usa-government, #fast-enterprises, #midas, #the-markup, and more.
This story was written by: @TheMarkup. Learn more about this writer by checking @TheMarkup's about page, and for more stories, please visit hackernoon.com.
The investigation showed that the algorithm, built for Rotterdam by the consultancy Accenture, discriminated on the basis of ethnicity and gender. And most impressively, it demonstrated in exacting detail how and why the algorithm behaved the way it did. (Congrats to the Lighthouse/Wired team, including Dhruv Mehrotra, who readers may recall helped us investigate crime prediction algorithms in 2021.) Cities around the world and quite a few U.S. states are using similar algorithms built by private companies to flag citizens for benefits fraud. Yet, not for lack of trying, we know very little about how they work.