Computer Ethics, Spring 2022

Week 4, Feb 8

Class 4 Readings

Start reading Chapter 2 on privacy.


Reputation

What would you do if there were something bad about you on the Internet?

Here's what the rich might do:

restofworld.org/2022/documents-reputation-laundering-firm-eliminalia/

There are two general approaches to reputation management: getting the offending material taken down, or burying it under a flood of more favorable content.

EARN IT Act

Summary at cyberlaw.stanford.edu/blog/2022/02/earn-it-act-back-and-it%E2%80%99s-more-dangerous-ever: the idea is that websites would lose Section 230 protection (we'll get to this) if they fail to put a stop to Child Sexual Abuse Material (CSAM) on their platforms. The bill was originally proposed in 2020 and has returned in 2022.

One near-certain consequence is that sites such as Facebook and YouTube would crack down on user-contributed content. There are already automated algorithms that attempt to detect CSAM, but they do so imperfectly; if the legal consequences of missing real CSAM are raised, platforms will tolerate many more false positives, and much more legal material will be thrown out along with the illegal.
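
To make that tradeoff concrete, here is a minimal sketch of hash matching, the general technique these detectors are built on (Microsoft's PhotoDNA is the best-known example). Everything in it -- the hash values, the database, the threshold -- is hypothetical, and real systems are considerably more elaborate.

    # Sketch of perceptual-hash matching against a database of known
    # images; all values here are made up for illustration.

    def hamming_distance(a: int, b: int) -> int:
        """Count the bits on which two 64-bit perceptual hashes differ."""
        return bin(a ^ b).count("1")

    # Hypothetical database of perceptual hashes of known illegal images.
    KNOWN_HASHES = {0x9F3A6C01D4E2B718, 0x0B44F19E23A7C6D5}

    def flag_image(image_hash: int, threshold: int = 10) -> bool:
        """Flag an image whose hash is close to any known hash.

        Unlike a cryptographic hash, a perceptual hash of a slightly
        cropped or recompressed image differs in only a few bits, so
        matching uses a distance threshold rather than exact equality.
        """
        return any(hamming_distance(image_hash, h) <= threshold
                   for h in KNOWN_HASHES)

The threshold is where the tradeoff lives: raise it and the system catches more altered copies of known images, but it also flags more innocent images that merely happen to hash nearby. If EARN IT raises the cost of a miss, platforms will push the threshold up, and the extra flags will be legal material.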

It is also worth pointing out that websites are not protected today by Section 230 for hosting CSAM. So, while EARN IT would surely turn up the heat, it would not add any new tools for prosecution.

Finally, EARN IT is closely associated with efforts to discourage encryption. The original EARN IT would have withdrawn Section 230 protection from sites that supported any form of end-to-end encryption. The new version says only that a site's use of encryption cannot by itself be taken as evidence of lawbreaking; if there is any other evidence of insufficient zeal in cracking down on CSAM, then the encryption can be brought in as additional evidence.
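
For context on why encryption is in EARN IT's crosshairs at all: on an end-to-end encrypted service the platform's servers hold only ciphertext, so a server-side CSAM scanner has nothing meaningful to examine. Here is a minimal sketch, using the Python cryptography package's Fernet cipher as a stand-in for a real end-to-end protocol such as Signal's:

    from cryptography.fernet import Fernet

    # In end-to-end encryption the key lives only on the users'
    # devices; the platform relaying the message never sees it.
    key = Fernet.generate_key()
    sender = Fernet(key)

    ciphertext = sender.encrypt(b"photo or message contents")

    # The platform stores and forwards this opaque blob; without the
    # key there is nothing to match against a database of known images.
    print(ciphertext[:40])

    # Only the recipient, who also holds the key, recovers the content.
    recipient = Fernet(key)
    assert recipient.decrypt(ciphertext) == b"photo or message contents"

This is why proposals to scan end-to-end encrypted traffic end up meaning scanning on the device itself, before encryption -- which is exactly the design Apple arrived at, below.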

Apple and CSAM

Last fall Apple announced a plan -- now suspended -- to monitor your iPhone for child pornography.

Here is an article about this: "Your Phone Is Your Private Space": theatlantic.com/ideas/archive/2021/09/spyware-your-iphone-step-too-far-privacy/619987. The author writes that "future iPhones will almost inevitably scan for more than child porn". First, is this the real risk we're worrying about? Certainly Facebook is censoring more and more content. Would Apple start censoring iMessages that were insufficiently respectful of minority groups?

That your phone is your private space is an excellent point, but the article partly misses it: Apple was not proposing to check the photos that stay on your phone, only the photos you save to iCloud. This gives Apple a little more credibility; they simply don't want to be storing anyone's child pornography for them. But does it matter that Apple was going to do the scanning on the phone itself, as part of the process of uploading to iCloud?
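
In outline, the disputed design looks something like the following sketch. It is deliberately simplified -- Apple's actual proposal wrapped each match result in a cryptographic "safety voucher" that Apple could open only after roughly thirty matches accumulated -- and all the names are illustrative, not Apple's API. The structural point survives the simplification: the scan runs on the phone, but only on the iCloud upload path.

    # Simplified sketch of client-side scanning on the upload path.
    # perceptual_hash() stands in for Apple's NeuralHash and
    # matches_known_csam() for the blinded database check.

    def perceptual_hash(photo: bytes) -> int:
        return hash(photo) & 0xFFFFFFFFFFFFFFFF   # placeholder hash

    def matches_known_csam(h: int) -> bool:
        return False                    # placeholder database lookup

    def icloud_upload(photo: bytes, flagged: bool) -> None:
        print(f"uploading {len(photo)} bytes, flagged={flagged}")

    def save_to_icloud(photo: bytes) -> None:
        # The check runs on the device, but only as a step in the
        # upload; a photo that never leaves the phone is never hashed.
        flagged = matches_known_csam(perceptual_hash(photo))
        icloud_upload(photo, flagged)

The argument for putting the check on the device rather than the server is that it still works if iCloud uploads are end-to-end encrypted; the argument against is that code which scans local files is one policy change away from scanning photos that were never destined for iCloud.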

Finally, Apple's CSAM approach here is doubtless driven by the specter of EARN IT.



Google v Oracle

Google Books

Does the Napster model work for film?


Laws

Sony v Universal, 1984

Dowling, 1985

Feist v Rural, 1991: it has to be original

MGM v Grokster

DMCA

    OCILLA

    Mail & Guardian

Viacom v YouTube

Server-based filesharing and SOPA/PIPA (bills that seemed inevitable but which ultimately did not pass)

Lawsuits against Users