
The Knee Jerk

November 19, 1996

This month, the very fundamentals of the First Amendment are being called into question.

On February 1, 1996, Congress passed the Telecommunications Act of 1996 (S.652). A week later, President Clinton signed it into law. One section of the act laid down new rules for decency on the Internet. This section was entitled “The Communications Decency Act,” or the CDA for short. The CDA’s proponents wanted to address a key concern about the Internet: the ability of children to access indecent and pornographic material. Senator Exon of Nebraska, the chief proponent of the CDA, submitted as evidence lewd, disgusting, and revealing photographs of women. With his cries that every child in the world with a computer could be subjected to such material, the bill naturally passed swiftly through Congress. And why shouldn’t it have?

Indeed, the text of the CDA, which makes it illegal to give minors material that “is obscene or indecent” (CDA, subsection A.I.B), seems sensible and rational. To accept it is almost instinctive, a “knee-jerk” reaction. After all, if a minor cannot in real life walk into a store and (legally) buy a Playboy, why should they be able to view the same in cyberspace? Nearly all the members of the House endorsed the CDA, seeing it as a way to make the reckless Internet family-safe; the CDA seems to make sense.

But within five minutes of the bill’s passage, the American Civil Liberties Union (ACLU) filed a complaint against US Attorney General Janet Reno, asking that the bill not be enforced until it had undergone complete judicial review. The complaint was granted, and the case was handed over to a panel of three federal judges in Philadelphia. After scrutinizing mounds of technical documents, all three judges independently concluded that Congress had no right to regulate the Internet. The CDA was declared unconstitutional. The panel did recognize that this was an important act, dealing directly with the First Amendment, so the final decision now rests with the Supreme Court. The judges are so confident that the Supreme Court, given the proper information, will accept their verdict that they are encouraging the Court to use a less-than-extensive review process just to get the CDA permanently revoked. Why did the ACLU protest this seemingly innocuous bill? And further still, why would a panel of judges declare the CDA unconstitutional?

I intend to make it clear how and why the CDA is unconstitutional. First I will clear up some common technical misunderstandings about content regulation. I will then show how the CDA could not possibly be enforced, how it suppresses constitutionally protected free speech and hence violates the First Amendment, how it is unconstitutionally vague, and why it is not necessary.

Many people compare the Internet to TV, and point out that network television is, to a degree, censored. This is true, and the government does censor network television. But the analogy to the Internet does not hold. With television, a small number of large companies transmit information over a medium that the government owns: a specific set of radio frequencies. Companies pay the government yearly to “lease” a section of that set of frequencies for broadcasting. If something indecent appears, locating the offending station is a trivial matter. The station is warned by the government, and if it does not desist, the government revokes its license and/or takes it to court. But the Internet is almost completely different. The US government owns essentially no part of the actual medium in which digital messages get passed from one place to another. And instead of a small number of content providers, there are hundreds of thousands: anybody in the world can publish any information they want. I, a mere college student, own and administer my own web site containing pictures and text that can be accessed from anywhere in the world. I could publish pornography, or hate literature, or anything I want. This is because I do not have to submit my work to a larger authority for approval — I am my own voice. And the US government would have to prevent every “little guy” in the US like myself from transmitting “material inappropriate for minors.”

One other important way in which the Internet radically differs from television is that it is “active content,” meaning that you, the user, have to actively seek out information. You select exactly the information you want. Television, on the other hand, is an example of “passive content.” You are exposed to the information that the television producers wish you to see. There’s no pause, rewind, or skip button on a TV. The content available to you is merely the sum of what the producers of each station wish to give you at any given instant. If and when children are viewing pornography on the Internet, it is of their own volition — in fact, it is likely that they searched long and hard to find it. So children are not “subjected” to objectionable material on the Internet; rather, they play an active role in obtaining it. The problem is not with the computer, or even with the content. The problem lies with the parents and the children. Many, many computer-illiterate people are unclear about this important distinction between active and passive content.

Billions upon billions of ones and zeroes march through the Internet every day. Think about that. The machines that send these ones and zeroes back and forth have little to no information about the content they are passing: pictures, text, sounds, and movies all look exactly the same to them. And they have no time to care: they are already working as hard as possible just to keep the data flowing. There is no way they could spare any processing time for anything else. Even if, hypothetically, they could determine whether they were passing a picture or text or a movie, a human would then have to evaluate whether the content was obscene. If something was found to be obscene, it would have to be verified that it was passed to a minor. Then the transmitter of this information would have to be looked up and located, which again would take more processing. Single machines on the Internet can deliver millions of files per day, so you can see just how infeasible such a scheme would be. Yet the CDA dictates monitoring Internet traffic in just such a way. This is very different from television, where a select few corporations are in control of broadcast content. The supporters of the CDA tend not to be aware of the technological impossibility of its enforcement, let alone the ridiculous number of “content-raters” that would need to be hired to monitor the Internet for obscenity.

Compounding this technical difficulty is the problem/gift of cryptography. Cryptography is the art of using mathematical manipulations to keep digital information private to only those you want to share it with. The cryptography currently available to the US public is so good that “breaking your code” — accessing your information without your permission — would take a few hundred thousand years of processing on thousands of machines. In short, cryptography allows you to keep secrets, even from the government. Computers passing encrypted information could not possibly hope to decipher it. Cryptography ensures the impossibility of Internet-wide “brute-force” enforcement of the CDA. It is clear from a number of standpoints that, technically, the CDA cannot be strictly enforced.
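
To get a feel for the scale involved, here is a rough back-of-the-envelope sketch. The key length, search rate, and machine count below are illustrative assumptions of mine, not figures from the essay, the court record, or any particular cipher; the point is simply that the time for an exhaustive key search grows exponentially with key length.

```python
# Rough estimate of an exhaustive ("brute-force") key search.
# KEY_BITS, KEYS_PER_SECOND, and MACHINES are illustrative assumptions,
# not figures taken from the essay or from any specific cipher.

KEY_BITS = 80            # hypothetical key length in bits
KEYS_PER_SECOND = 10**6  # keys a single mid-1990s machine might try per second
MACHINES = 5000          # machines searching in parallel

keyspace = 2 ** KEY_BITS                           # total number of possible keys
seconds = keyspace / (KEYS_PER_SECOND * MACHINES)  # worst-case search time
years = seconds / (60 * 60 * 24 * 365)

# Under these assumptions the answer is on the order of millions of years;
# a somewhat shorter key gives the essay's "few hundred thousand years."
print(f"Worst-case search time: about {years:,.0f} years")
```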

But even if the US government could wave a magic wand and censor all content within the US, this would not solve the problem. There remains the fact that the panel determined that “a large percentage, perhaps 40% or more,” of the content provided on the Internet “originates outside of the United States” (ACLU v. Reno [1996] Counterstatement: Section B.2), and hence outside the jurisdiction of the CDA. Were the CDA to be enforced, there is no doubt that many organizations containing material potentially in violation of the CDA would move their content to international servers. The users would hardly notice, and the whole purpose of the CDA, to protect American children, would be defeated. Kids can just look elsewhere for their pornography. The CDA cannot be internationally enforced.

I have discussed the technical reasons why the Internet could not be effectively subjected to the CDA. Now I will discuss the moral and constitutional reasons why it should not be subjected to the CDA.

The CDA, for constitutional reasons alone, should not be upheld by the Supreme Court. It suppresses free speech among adults in flagrant violation of the First Amendment; it would technically censor a vast quantity of artistic and classic works, the Bible included; and it might prohibit the distribution of vital, possibly life-saving information to youths.

Free speech is in our Constitution, in the very First Amendment: “Congress shall make no law…abridging the freedom of speech or of the press….” This means that speech, text, and pictures which have been determined to be legal for adults to view must not be censored for an adult in any way by the government. The CDA requires content providers to ensure that minors may not view material that is inappropriate for them. The CDA suggests that providers of offensive content use credit cards to do age verification; after all, you do need to be at least 18 years of age to obtain a credit card. (This being true, it should be noted that the mere production of a credit card number does not ensure that the user is not a minor. Kids know how to borrow credit cards from parents and older siblings.) While many commercial pornography sites have already adopted the card-pay practice in a combined attempt to verify the age of their users and to be paid for their services, this solution is impracticable for non-commercial information providers. I, for instance, being 18 years of age, have a constitutional right to transmit adult material, such as an article containing one of the “seven dirty words,” to other adults. If the CDA were upheld by the Supreme Court, I could not do this on my web site. In truth, as a poor college student, I have no technology that I could employ to deny users under 18 access to portions of my Internet site. My only option? Simply not to provide that information which might be deemed offensive to minors. My legal, rightful constitutional speech has been censored, and the First Amendment has just been violated.

You can see how in this way, non-commercial providers of potentially offensive material become censored down to the lowest common denominator: material acceptable for children. All adult material on the Internet would be under the domain of commercial services, due to the prohibitive cost of verifying users’ ages — all publicly accessible material would have to be self-censored and watered down. The government itself notes alongside the CDA that “users of all…forms of [Internet] communication always have the option to tone down their communication so that it does not contain material” criminalized by the CDA (CDA Jurisdictional Statement 24). What did the panel think of this? In their own words: “There could be no clearer admission that the CDA…imposes a ban on the dissemination of constitutionally protected speech” (ACLU v. Reno Argument, Section B.2.C). The CDA is unconstitutional.

What comes to mind when I ask what kinds of material fall under the CDA’s definition of “indecent”? For most people, degrading images of naked women and senseless, smutty literature — in short, works of little to no artistic or useful value. Let’s examine the facts: the CDA prohibits making available information containing direct references to “sexual or excretory activities or organs,” in terms “patently offensive as measured by contemporary community standards” (CDA Section 1.D.1). Okay, this seems reasonable enough. Take rape, for example. That’s an activity generally considered patently offensive when detailed in a work. Yet the Bible itself contains specific, detailed accounts of rape! Am I not allowed to put the Bible on my web site if I cannot provide age verification? Even if I could provide age verification, should this material really be kept from minors? What about a 16-year-old high school student who wants to study Freudian psychosexual analysis? Clearly, Freud’s works contain specific references to sexual activities and organs. These works, which would technically be illegal to publish on the Internet under the CDA, have genuine artistic, historical, and/or scientific value. The CDA very noticeably lacks a clause allowing for these sorts of materials.

In some cases, the CDA might ban pages that are crucial for minors to be able to access. The “Safer Sex” Web page, with detailed and graphic instructions on how to put on and remove a condom and general information about sex, may have already prevented many unwanted pregnancies, abortions, and diseases. Many people do have sex before they are 18. This material, which deals with “adult” matters, needs to be made accessible to youths who need it. The CDA would prevent this.

It should be noted that in all of these cases, minors can obtain the works in print! For example, any minor can walk into a public library and see nudes, read lewd novels, and even (gasp!) read Freud. It seems natural that this same minor should have access on the Internet to publicly available print information. There should not be different rules for the Internet than for other media. If a minor can (legally) obtain a work in print, he or she should be able to legally access that material online. The CDA disagrees.

The CDA violates our Constitution by censoring legal adult-to-adult communication where age verification is not possible, by prohibiting the open distribution of works of artistic or scientific merit that contain adult or sexual matter, and by preventing minors from accessing material online that they can legally obtain in the physical world. The CDA should not be upheld.

Finally, the CDA is not required to achieve its stated goals. Ever in the spirit of democracy and capitalism, the desire of parents to prevent content they deem “obscene” from reaching their children has spawned a whole new business of “filtering.” Filters are programs, run on the machine viewing the content, that limit children to “acceptable” material. There is usually a bypass, so that adults can access adult information by providing the correct password. There are two kinds of filtering: client-rated and server-rated.

In client-rated filtering, the client simply obtains a list of acceptable and unacceptable web sites from a central server. When the user attempts to view a page that is on the “unacceptable” list, the user is notified and blocked from viewing it. Optionally, parents can prevent children from accessing any sites not on the “acceptable” list — the unrated sites. The downside of client filtering is that one server is in charge of essentially rating the entire Internet. This is expensive, so these services usually charge a fee to provide regularly updated versions of the lists. It is also impossible to select between different “levels” of content. For instance, I may want to allow my 15-year-old son to view pages that contain mildly offensive language, but the server simply divides everything into “naughty” and “nice.”
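
To make the mechanism concrete, here is a minimal sketch of the list lookup described above. The host names and the “block unrated sites” switch are hypothetical stand-ins of mine; real filtering products of the era shipped their own (often paid, regularly updated) lists.

```python
# Minimal sketch of client-side list filtering: check a host against the
# "unacceptable" and "acceptable" lists fetched from a central rating service.
# The hosts below are placeholders, not entries from any real product's lists.

ACCEPTABLE = {"www.example-schoolwork.org", "www.example-museum.org"}
UNACCEPTABLE = {"www.example-adult-site.com"}

def may_view(host, block_unrated=False):
    """Return True if a child account may view pages on this host."""
    if host in UNACCEPTABLE:
        return False             # explicitly on the "naughty" list
    if host in ACCEPTABLE:
        return True              # explicitly on the "nice" list
    return not block_unrated     # unrated site: allowed only in the permissive mode

print(may_view("www.example-adult-site.com"))                # False: blocked by list
print(may_view("www.example-museum.org"))                    # True: approved by list
print(may_view("www.unknown-site.net", block_unrated=True))  # False: unrated sites blocked
```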

Server-based filtering, where web pages rate themselves, is a much more practical, efficient, and sensible way to filter content. PICS (Platform for Internet Content Selection) is a good example of effective self-rating. Web pages rate themselves in the categories of sex, violence, and language. Parents can choose what level of Internet content their children can view in each category. Web browsers that are “PICS-enabled” simply check the ratings for a given page: if the rating exceeds what is allowed, the page is not displayed. Optionally, non-rated sites can be blocked. (This is usually not necessary, because most sites containing objectionable material have no inherent interest in the information being given to a minor — they are almost always the first adopters of rating technologies.) In this way, the power of censorship is returned to its proper place: the parents. I believe that parents should have the right to select the types of information their children are presented with. This is a role that I do not think the government, or even a single third-party rating bureau, should have.
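
Here is a minimal sketch of that browser-side check, under simplified assumptions: the category names and numeric scale are stand-ins of my own, not the actual PICS label syntax. The point is only that a page’s self-declared levels are compared against thresholds the parent has configured.

```python
# Minimal sketch of a self-rating check: a page declares its own levels for
# sex, violence, and language, and the browser compares them against the
# limits a parent has configured. The categories and numeric scale here are
# simplified stand-ins, not the real PICS label format.

PARENT_LIMITS = {"sex": 0, "violence": 1, "language": 2}

def page_allowed(page_rating, block_unrated=False):
    """Return True if a page's self-declared rating fits within the parent's limits."""
    if page_rating is None:           # page carries no rating at all
        return not block_unrated      # optionally block unrated pages
    return all(page_rating.get(category, 0) <= limit
               for category, limit in PARENT_LIMITS.items())

print(page_allowed({"sex": 0, "violence": 0, "language": 1}))  # True: within every limit
print(page_allowed({"sex": 3, "violence": 0, "language": 0}))  # False: sex level too high
print(page_allowed(None, block_unrated=True))                  # False: unrated pages blocked
```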

PICS is not a dream or a fuzzy standard that people will hopefully implement in the future. PICS is here, and rapidly gaining popularity. The number of web pages with PICS ratings is growing rapidly. I myself am looking into how to rate my own content for outside viewers. New web browsers are now PICS-enabled, and easily allow parents to filter content. I believe that before long, web servers containing objectionable material will automatically check that the browser is PICS-enabled before displaying information. This is easy to do, and would prevent such simple attacks on the system as a child installing an old, non-PICS-enabled browser to view adult materials.

PICS was not started or sponsored by the government. It was a movement originating purely from the desire of parents to have the ability to filter the content presented to their children. The net is allowing parents to self-censor. The purpose of the CDA was ultimately to make it difficult for children to gain access to material deemed inappropriate for them, and this is precisely what PICS-rated pages and PICS-enabled browsers allow parents to do, without the government deciding what may or may not be published. The Internet is taking care of itself: the CDA is unnecessary.

I hope I have clearly shown the technical difficulty and unconstitutionality inherent in the CDA, along with the reasons why it is not necessary to implement it to achieve its stated goals and why its implementation would not achieve those goals. I believe that information leads to truth, and I believe that the panel of three federal judges decided as they did, and for the better, because they were exposed to the greatest amount of information. They were presented with hundreds of technical documents and presentations on the workings of the Internet, and as a result of their hard work and learning, they made a clear and educated decision. I hope that in the Supreme Court the truth will be found; that the underlying implications of the CDA will be revealed to the justices, and the “knee-jerk” reaction found incorrect. Let freedom ring, even in cyberspace.

 


Works Used and Further Reading

“Citizens Internet Empowerment Coalition Response to the Department of Justice Jurisdictional Statement.” October 31, 1996. Online. Internet. World Wide Web: www.cdt.org/ciec/SC_appeal/Juris_resp.html

“Communications Decency Act Page.” Online. Internet. World Wide Web: www.epic.org/CDA/

“ACLU v. Reno: Overview.” Online. Internet. World Wide Web: www.aclu.org/issues/cyber/trial/trial.htm

“The Electronic Frontier Foundation Censorship Page.” Online. Internet. World Wide Web: www.eff.org/pub/Censorship/Internet_censorship_bills/

“The Electronic Frontier Foundation ACLU v. Reno Page.” Online. Internet. World Wide Web: www.eff.org/pub/Legal/Cases/EFF_ACLU_v_DoJ/

“ACLU v. Reno — The Case to Overturn the CDA.” Online. Internet. World Wide Web: www.spectacle.org/cda/cdamn.html

“Platform for Internet Content Selection [PICS].” Online. Internet. World Wide Web: www.w3.org/PICS/

“PoliticsNow Issues: The Communications Decency Act.” Online. Internet. World Wide Web: www.politicsnow.com/issues/internet/cda.htm

“Citizens Internet Empowerment Coalition.” Online. Internet. World Wide Web: www.cdt.org/ciec/

 


Quotes of Note

from the CIEC Response to the US DOJ Jurisdictional Statement on the CDA

Counterargument, Section B:

Appellants’ statement of the case entirely omits any reference to or discussion of the extensive and unusually detailed factual findings of the three-judge district court that formed the foundation for the decision on appeal. See J.S. App. 11a-61a. The Jurisdictional Statement fails to acknowledge the district court’s critical factual findings that (1) it is impossible or infeasible for most Internet speakers to comply with the CDA’s affirmative defenses; (2) because of the global nature of the Internet, the CDA will not effectively shield minors from indecent or patently offensive speech since a very substantial percentage of such speech is posted abroad, and will not be deterred by the CDA; and (3) parents can use currently available software and access provider options to control what Internet sites their children may access. These findings undergird the district court’s legal conclusions that the Act effectively bans constitutionally protected speech among adults, would not substantially further the government’s stated interest in shielding minors from indecent online speech, and is not the least restrictive means available to serve the government’s interest.

Counterargument, Section B.1:

(inability to check for age causes censorship)

Chief Judge Sloviter summarized the court’s factual conclusions as follows: we have found that no technology exists which allows those posting [protected but indecent material] on the category of newsgroups, mail exploders or chat rooms to screen for age. Speakers using those forms of communication cannot control who receives the communication, and in most instances are not aware of the identity of the recipients. If it is not feasible for speakers who communicate via these forms of communication to conduct age screening, they would have to reduce the level of communication to that which is appropriate for children in order to be protected under the statute. This would effect a complete ban even for adults of some expression, albeit “indecent,” to which they are constitutionally entitled … .

Counterargument, Section B.2:

(Overseas content)

Judge Dalzell concluded, “the CDA will almost certainly fail to accomplish the Government’s interest in shielding children from pornography on the Internet.”

Argument, Section B.1

Government may not constitutionally “quarantin[e] the general reading public against books not too rugged for grown men and women in order to shield juvenile innocence … . Surely this is to burn the house to roast the pig.” Butler v. Michigan, 352 U.S. 380, 383 (1957).

Argument, Section B.1

See also Denver Area Educ. Telecomm. Consortium v. FCC, 116 S.Ct. 2374, 2393 (1996) noting that “[n]o provision, we concede, short of an absolute ban, can offer certain protection against assault by a determined child,” and affirming that the Court has “not, however, generally allowed this fact alone to justify reduc[ing] the adult population … to … only what is fit for children.”

Argument, Section B.2.C

Even more offensive to the First Amendment, the government suggests that “users of all three forms of communication always have the option to tone down their communication so that it does not contain material” criminalized by the Act. J.S. 24. In other words, they can self-censor their speech in order to avoid federal prosecution. There could be no clearer admission that the CDA’s “display” provision imposes a ban on the dissemination of constitutionally protected speech.

