正文阅读
Are Apple’s Tools Against Child Abuse Bad for Your Privacy?
苹果针对虐待儿童的新功能,恐危及隐私保护?
Apple unveiled a plan two weeks ago founded on good intentions: Root out images of child sexual abuse from iPhones. But as is often the case when changes are made to digital privacy and security, technology experts quickly identified the downside: Apple’s approach to scanning people’s private photos could give law enforcement authorities and governments a new way to surveil citizens and persecute dissidents. Once one chink in privacy armor is identified, anyone can attack it, they argued.
两周前,苹果公司公布了一项出于善意的计划:根除iPhone上的儿童性虐待图像。但正如数字隐私和安全领域每逢变革时常见的那样,技术专家很快就发现了其弊端:苹果扫描用户私人照片的做法,可能会给执法部门和政府提供一种监视公民、迫害异见人士的新途径。他们认为,一旦隐私护甲上的一道缺口被发现,任何人都可以攻击它。
The conflicting concerns laid bare an intractable issue that the tech industry seems no closer to solving today than when Apple first fought with the F.B.I. over a dead terrorist’s iPhone five years ago. The technology that protects the ordinary person’s privacy can also hamstring criminal investigations. But the alternative, according to privacy groups and many security experts, would be worse.
这些相互冲突的担忧暴露出一个棘手的问题:自五年前苹果首次就一名已死恐怖分子的iPhone与联邦调查局交锋以来,科技行业在解决这一问题上似乎并无进展。保护普通人隐私的技术,同样可能妨碍刑事调查。但在隐私组织和许多安全专家看来,另一种选择会更糟。
“Once you create that back door, it will be used by people whom you don’t want to use it,” said Eva Galperin, the cybersecurity director at the Electronic Frontier Foundation, a digital-rights group. “That is not a theoretical harm. That is a harm we’ve seen happen time and time again.” Apple was not expecting such backlash. When the company announced the changes, it sent reporters complex technical explainers and laudatory statements from child-safety groups, computer scientists and Eric H. Holder Jr., the former U.S. attorney general.
数字权利组织电子前沿基金会的网络安全主管伊娃·加尔佩林说,“一旦你创建了这个后门,它就会被你不希望使用它的人利用。这并非理论层面的危害,而是我们一次又一次看到的危害。”苹果没有料到会出现这样的强烈抵制。该公司宣布这些变化时,向记者发送了复杂的技术说明,以及来自儿童安全组织、计算机科学家和美国前司法部长小埃里克·H·霍尔德的赞扬声明。