
Apple delays plan to check iPhones for child abuse images

The pushback against Apple's plan to scan iPhone photos for child exploitation images was swift and apparently effective.

Apple said Friday that it is delaying the previously announced system that would scan iPhone users' photos for digital fingerprints that indicated the presence of known Child Sexual Abuse Material (CSAM). The change is in response to criticism from privacy advocates and public outcry against the idea.

"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," a September 3 update at the top of the original press release announcing the program reads. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."


Announced in August, the new feature for iOS 15 would have checked photos in an iPhone user's photo library — on the device before sending the photos to iCloud — against a database of known CSAM images. If the automated system found a match, the content would be sent to a human reviewer, and ultimately reported to child protection authorities.
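The matching step described above can be sketched in simplified form. Note this is only an illustration of the general idea of checking photos against a database of known digital fingerprints: Apple's actual system used a perceptual hash (NeuralHash) with cryptographic private set intersection, not the plain exact hashing shown here, and all values below are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known images.
# (In the real system these would be perceptual hashes supplied
# by child protection organizations, not SHA-256 digests.)
known_fingerprints = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image.

    An exact cryptographic hash is used here for simplicity; a
    perceptual hash would also match near-duplicate images.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(photos: list[bytes]) -> list[int]:
    """Return indices of photos whose fingerprints match the database.

    In Apple's design, flagged matches would go to a human reviewer
    before any report was made, rather than being reported automatically.
    """
    return [i for i, p in enumerate(photos)
            if fingerprint(p) in known_fingerprints]

library = [b"vacation-photo", b"known-image-bytes", b"cat-photo"]
print(scan_before_upload(library))  # [1]
```

The key design point critics focused on is that this check runs on the device itself, before upload, rather than server-side on photos already in the cloud.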


The fact that the scanning happened on the device alarmed both experts and users. Beyond it being generally creepy that Apple would have the ability to inspect photos users hadn't even sent to the cloud yet, many criticized the move as hypocritical for a company that has leaned so heavily into privacy. Additionally, the Electronic Frontier Foundation criticized the capability as a "backdoor" that could eventually serve as a way for law enforcement or other government agencies to gain access to an individual's device.

"Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," the EFF said at the time.

Experts who had criticized the move were generally pleased with the decision to do more research.

Others said the company should go further to protect users' privacy. The digital rights organization Fight for the Future said Apple should focus on strengthening encryption.

While other companies, like Google, scan cloud-based photo libraries for CSAM, and the overall goal of protecting children is obviously a good one, Apple thoroughly bungled the rollout of this product, with privacy concerns justifiably overshadowing the intended purpose. Better luck next time, folks.

