Computer Ethics, Fall 2022

Thursdays 4:15-6:45

Class 10, Nov 3

Readings

Read Chapter 3 on Speech

Videos:

Section 230 (subsequent cases; 50 min)

Threat Speech (PP v ACLA) (week 8, 37 min)

Software as Speech (week 9, 31 min)


Elon: "The bird is freed"

Thierry Breton, EU market commissioner: "In Europe, the bird will fly by our rules"




LICRA v Yahoo

We have covered most of the French side of the case.

Encryption

EARN IT Act

Summary at cyberlaw.stanford.edu/blog/2022/02/earn-it-act-back-and-it%E2%80%99s-more-dangerous-ever: the idea is that websites would lose Section 230 protection (we'll get to this) if they fail to put a stop to Child Sexual Abuse Material (CSAM) on their platforms. The bill was originally proposed in 2020 and returned in 2022.

One near-certain consequence is that sites such as Facebook and YouTube would crack down harder on user-contributed content. There are already automated algorithms that attempt to detect CSAM, but they do so imperfectly, and if the penalties for misidentification are raised, sites will err on the side of removal, and much more harmless material will be thrown out.
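
To make the misidentification tradeoff concrete, here is a toy Python sketch of the hash-matching idea behind such detectors. It is only an illustration under assumptions of mine: real systems such as Microsoft's PhotoDNA or Apple's NeuralHash compute perceptual hashes of image content, whereas this toy just compares 64-bit integers, and the function names and threshold values are invented for the example.

    # Toy sketch of blocklist matching; these names and hashes are
    # illustrative stand-ins, not any real detector's API.

    def hamming_distance(a: int, b: int) -> int:
        """Number of differing bits between two 64-bit image hashes."""
        return bin(a ^ b).count("1")

    def is_flagged(photo_hash: int, blocklist: set[int], threshold: int) -> bool:
        """Flag a photo whose hash is 'near' any hash on the blocklist.
        threshold=0 flags exact matches only (few false positives, but
        trivially evaded by re-encoding); larger thresholds catch altered
        copies of known images but also sweep in harmless look-alikes."""
        return any(hamming_distance(photo_hash, bad) <= threshold
                   for bad in blocklist)

    blocklist = {0x0123456789ABCDEF}
    candidate = 0x0123456789ABCDEF ^ 0b0011                # two bits of "re-encoding" noise
    print(is_flagged(candidate, blocklist, threshold=0))   # False: exact match only
    print(is_flagged(candidate, blocklist, threshold=4))   # True: fuzzy match

The threshold is the knob that matters: raising the legal stakes for a missed image pushes platforms to loosen it, which is precisely what throws out more harmless material.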

It is also worth pointing out that Section 230 does not protect websites today for hosting CSAM; the statute has never shielded sites from federal criminal law. So, while EARN IT would surely turn up the heat, it would not add any new tools for prosecution.

Finally, EARN IT is closely associated with discouraging encryption. The original EARN IT would have withdrawn Section 230 protection from sites that supported any form of end-to-end encryption. The new version says only that encryption alone cannot be taken as evidence of lawbreaking; but if there is any other evidence of insufficient zeal in cracking down on CSAM, then encryption can be used as additional evidence.

European Union

In May 2022 the EU proposed a "child safety" regulation that would require messaging providers to scan all messages for CSAM and report any that is found. A provider has no way to do this if true end-to-end encryption is in place, because it never sees the plaintext.
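
A minimal sketch of why, in Python, using the third-party cryptography package; Fernet's single symmetric key stands in here for a real end-to-end protocol's negotiated session keys. Since the key exists only on the two endpoints, the provider's relay server holds ciphertext it cannot decrypt, let alone match against a CSAM hash database.

    # Sketch only: Fernet stands in for a full E2E messaging protocol.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()             # held by sender and recipient only
    sender = Fernet(key)

    ciphertext = sender.encrypt(b"hello")   # all the provider's server ever sees

    # Without key, the server cannot recover b"hello" from ciphertext, so a
    # scanning mandate could only be met before encryption, on the device.
    recipient = Fernet(key)
    assert recipient.decrypt(ciphertext) == b"hello"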

Apple and CSAM

In the fall of 2021, Apple announced a plan -- now suspended -- to monitor your iPhone for child pornography:

Here is an article about this: "Your Phone Is Your Private Space", theatlantic.com/ideas/archive/2021/09/spyware-your-iphone-step-too-far-privacy/619987. The author writes that "future iPhones will almost inevitably scan for more than child porn". First, is this the real risk we should be worrying about? Certainly Facebook is censoring more and more content. Would Apple start censoring iMessages that were insufficiently respectful of minority groups?

That your phone is your private space is an excellent point, but the article partly misses it. Apple was not proposing to check all the photos on your phone; it was proposing to check the photos you save to iCloud. This gives Apple a little more credibility; they simply don't want to be storing anyone's child pornography for them. But does it matter that Apple was going to do the scanning on the phone itself, as part of the process of uploading to iCloud?
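
Here is a hedged sketch of that on-device flow. Every name in it is a placeholder of mine, not Apple's actual design, and SHA-256 stands in for the perceptual hash (NeuralHash) Apple described; the architectural point is simply that the scan runs on the phone, but only as a step in the iCloud upload path.

    # Placeholder sketch of client-side scanning on the upload path;
    # none of these names are Apple APIs.
    import hashlib

    BLOCKLIST = {hashlib.sha256(b"known-bad-image").hexdigest()}  # stand-in database

    def upload_to_icloud(photo: bytes) -> None:
        digest = hashlib.sha256(photo).hexdigest()   # computed on the device itself
        if digest in BLOCKLIST:
            print("match reported")                  # Apple learns only about matches
        print("uploading", digest[:8])               # photos never uploaded are never scanned

    upload_to_icloud(b"vacation photo")    # uploads quietly, no match
    upload_to_icloud(b"known-bad-image")   # flagged at upload time

On this design the provider still learns only about photos it would have stored anyway, which is the distinction the question above turns on.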

Finally, Apple's CSAM approach here was doubtless driven by the specter of EARN IT.