Computer Ethics, Fall 2022

Mondays 4:15-6:45

Class 9

Nov 7

Readings

Read Chapter 3 of Baase on Speech



Elon: "The bird is freed"

Thierry Breton, EU market commissioner: "In Europe, the bird will fly by our rules"

A Twitter story

https://twitter.com/stevekrenzel/status/1589700721121058817

1. A large telco wanted to pay us to log signal strength data in N. America and send it to them.

2. When we sent this data to the telco they said the data was useless. They switched their request and said they want to be able to tell how many of our users are entering their competitors’ stores.

3. The Director [of the telco] said “We should know when users leave their house, their commute to work, and everywhere they go throughout the day. Anything less is useless. We get a lot more than that from other tech companies.”

Twitter apparently did not follow through at that point.



Debates

Source Code as Speech

Patents

EARN IT Act

Summary at cyberlaw.stanford.edu/blog/2022/02/earn-it-act-back-and-it%E2%80%99s-more-dangerous-ever: the idea is that websites would lose Section 230 protection (we'll get to this) if they fail to put a stop to Child Sexual Abuse Material (CSAM) on their platforms. The bill was originally proposed in 2020 and returned in 2022.

One near-certain consequence is that sites such as Facebook and YouTube would crack down on user-contributed content. Automated algorithms that attempt to detect CSAM already exist, but they are imperfect, and if the penalty for letting CSAM slip through goes up, platforms will tune those imperfect filters more aggressively, so much more harmless material will be thrown out.
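As a rough illustration of why such filters sweep in harmless material, here is a minimal sketch of perceptual-hash matching in Python (using the Pillow library). It is not any platform's actual system; the known-hash value and distance threshold are hypothetical. Real systems such as PhotoDNA work on the same match-against-known-hashes principle, but with far more robust hashes.

    # Minimal perceptual-hash matching sketch. KNOWN_HASHES and MAX_DISTANCE
    # are hypothetical; real systems use more robust hashes and databases.
    from PIL import Image

    MAX_DISTANCE = 5                      # hypothetical Hamming-distance threshold
    KNOWN_HASHES = {0x3C3C7E7E7E7E3C18}   # hypothetical database of flagged hashes

    def average_hash(path):
        """64-bit average hash: shrink to 8x8 grayscale, compare each pixel to the mean."""
        img = Image.open(path).convert("L").resize((8, 8))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def flag(path):
        """Flag the image if its hash is within MAX_DISTANCE of any known hash."""
        h = average_hash(path)
        return any(bin(h ^ known).count("1") <= MAX_DISTANCE for known in KNOWN_HASHES)

Everything turns on the threshold: set it low and edited copies of real CSAM slip through; set it high and visually similar but innocent photos get flagged. A platform facing bigger penalties has every incentive to widen the net.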

It is also worth pointing out that websites are not protected today by Section 230 for hosting CSAM. So, while EARN IT would surely turn up the heat, it would not add any new tools for prosecution.

Finally, EARN IT is closely associated with banning encryption. The original EARN IT would have withdrawn Section 230 protection for sites that supported any form of end-to-end encryption. The new version only says that encryption alone cannot be taken as evidence of lawbreaking, but that if there is any other evidence of insufficient zeal in cracking down on CSAM, then encryption can be used as additional evidence.

European Union

In May 2022 the EU proposed a "child safety" regulation that would require messaging providers to scan all messages for CSAM, and report it if found. There is no way to do this if true end-to-end encryption is in place.
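To make the incompatibility concrete, here is a minimal sketch in Python using the cryptography package's Fernet recipe as a stand-in for a real end-to-end protocol (the message and the provider-side "scan" are made up). The provider relays only ciphertext, so any scanning would have to happen before encryption on the sender's device or after decryption on the recipient's.

    # Minimal sketch: with end-to-end encryption, the provider only ever sees
    # ciphertext. Fernet stands in here for a real messaging protocol.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()              # known only to sender and recipient
    ciphertext = Fernet(key).encrypt(b"meet at the usual place")

    # Provider-side "scan": the ciphertext reveals nothing about the content.
    print(b"usual place" in ciphertext)      # False
    print(Fernet(key).decrypt(ciphertext))   # only the endpoints can do this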

Apple and CSAM

Last fall Apple announced a plan -- now suspended -- to monitor your iPhone for child pornography.

Here is an article about this, "Your Phone Is Your Private Space": theatlantic.com/ideas/archive/2021/09/spyware-your-iphone-step-too-far-privacy/619987. The author writes that "future iPhones will almost inevitably scan for more than child porn." First, is this the real risk we should be worried about? Facebook is certainly censoring more and more content; would Apple start censoring iMessages that were insufficiently respectful of minority groups?

That your phone is your private space is an excellent point, but the article partly misses it. Apple is not proposing to check the photos on your phone; they are proposing to check the photos you save to iCloud. This gives Apple a little more credibility: they simply don't want to be storing anyone's child pornography for them. But does it matter that Apple was going to do the scanning on the phone itself, as part of the process of uploading to iCloud?
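The distinction can be made concrete with a small sketch: the check runs on the device, but only on photos that are about to be uploaded. Here flag() is the hypothetical perceptual-hash check sketched earlier, and upload() and report() are stand-ins for the cloud API and the reporting path; none of this is Apple's actual (and considerably more elaborate) protocol.

    # Minimal sketch: on-device scanning that applies only to photos being
    # synced to the cloud. flag, upload, and report are hypothetical hooks.
    def sync_to_cloud(photos_to_upload, flag, upload, report):
        for path in photos_to_upload:
            if flag(path):      # hash computed on the phone, just before upload
                report(path)    # hypothetical reporting hook
            upload(path)
        # Photos the user never syncs to the cloud are never hashed at all.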

Finally, Apple's CSAM approach here is doubtless driven by the specter of EARN IT.