Hate Speech On The Internet

I. Hate Speech on the Internet
Generally, hate speech receives constitutional protection and is not prosecuted, which is why there are relatively few court cases addressing this issue on the Internet. For this reason, sites containing speech that discriminates against people because of their race or sexual orientation are available on the Internet. These include sites run by the Ku Klux Klan, neo-Nazis, the White Socialist Party, skinheads, or the Aryan Nation, for example, whose speech is not directed at any person in particular and is therefore not punishable. In addition, the nature of the medium makes it difficult to trace the perpetrators of hate crimes; Web sites are easily relocated or abandoned when legal problems arise. In R.A.V. v. St. Paul, the Supreme Court indicated that speech leading to racially motivated violence could be punished. Hence, a threatening private message containing racial epithets sent to someone over the Internet, as well as a public message on a Web site, is legally actionable.

II. The Internet
At the dawn of the new century, the rise of new media such as the Internet seems to raise new issues about the limitations of free speech. However, the core of these free speech cases remains the same as it has been for the past 100 years.
The Internet is an outgrowth of a military program called "ARPANET," which began in 1969. The ARPANET no longer exists, and today the Internet is an international network of interconnected computers. The Internet is "a unique and wholly new medium of worldwide human communication." People can access the Internet from many different sources; several major national "online services," such as America Online or CompuServe, provide access to their own networks as well as broader links to the Internet. The Internet offers a wide variety of communication and information methods, such as "e-mail," automatic mailing list services ("listservs"), "chat rooms," and the "World Wide Web." These tools can be used to transmit text, sounds, pictures, or animated video images. The environment as a whole is commonly called "cyberspace," because it is not tied to any particular geographical location.
It is "no exaggeration to conclude that the content on the Internet is as diverse as human thought," said a District Court. Hence, the Internet is in itself a "marketplace of ideas," a concept once adopted by the Supreme Court. An important difference between the Internet and broadcasting, for example, is that users need to type the address of a known page or enter one or more keywords into a "search engine" to locate a site. This makes the Internet a less intrusive medium than broadcasting, since users must actively look for the information they want.
The Web is like a large library where millions of publications are available. The problem, however, is that anyone can become a publisher and use this huge platform to address a worldwide audience. This makes it easy for extremist groups, for instance, to spread their message to the entire world from relative safety. It allows them to communicate with other bigots, promote their doctrine, and recruit people while remaining anonymous. These groups nonetheless have the right to express their ideas, even if they are offensive. Litigation occurs when an individual targets or harasses someone else because of his or her race, religion, or sexual orientation, for instance. "Real life" cases have set precedents that also apply to cyberspace, as some of the online hate speech cases discussed below show.
While extremists have the right to create Web pages that contain offensive speech, users have the means to deny them access to their homes. Indeed, filtering software is available for parents who wish to block this kind of material. These filtering tools give users a strong weapon and allow the Internet to remain as free as possible from governmental regulation.
III. Hate Speech and The First Amendment
The First Amendment states that "Congress shall make no law … abridging the freedom of speech…" Every American therefore has the right to express his or her opinion, even if the statement is offensive. The United States Supreme Court once adopted the concept of a "marketplace of ideas," a laissez-faire policy that allows good and bad ideas to compete freely. The logic is that harmful speech will ultimately be rejected and that it is better to tolerate a little harm for the sake of greater freedom. Indeed, the same tools used to censor hate speech could also be used to restrict legitimate speech. Hence, it is rare for federal, state, or local governments to intrude upon this right. In order to restrict speech, they must prove a compelling interest in doing so. Thus, they must act under the "strict scrutiny" standard and demonstrate that their goal is compelling and that their approach is narrowly tailored to meet it. The government therefore has relatively few means of restricting the content of speech. However, it can more easily regulate the time, place, and manner in which speech is delivered, regardless of its content. These elements pertain to context, a concept that is hard to apply to the Internet.

IV. Limitations of the First Amendment: Unprotected Speech
Offensive speech generally falls under the First Amendment's protection, but in some cases the Supreme Court has ruled differently. The following cases all set precedents that also apply to the Internet.

In Chaplinsky v. New Hampshire, Walter Chaplinsky, a Jehovah's Witness, was distributing literature in the streets of Rochester, N.H., when he called someone a "damned fascist." He was arrested under a state law that prohibited addressing "any offensive, derisive or annoying word to any person who is lawfully in any street or other public place." Chaplinsky appealed, but the Supreme Court ruled that his speech contained "fighting words," defined as words "which by their very utterance inflict injury or tend to incite an immediate breach of the peace." In the decades since this 1942 decision, the Court has limited its effect to the most challenging and confrontational words, spoken in a face-to-face encounter and likely to lead to immediate fighting.
In Brandenburg v. Ohio, the defendant, a leader of a Ku Klux Klan group, was convicted in an Ohio state court for having said "Bury the niggers" and "the niggers should return to Africa" at a rally. In Brandenburg, the Court distinguished between words and action. The speech did not meet the Court's standard for incitement, because it did not raise the prospect of imminent lawless action, and it was therefore protected by the First Amendment.

In R.A.V. v. City of St. Paul, the defendant, Robert A. Viktora, along with other teenagers, burned a cross in a black family's yard. Although this conduct could have been punished under any of a number of laws, R.A.V.'s action was prosecuted under a city ordinance targeting the "fighting words" category of speech, which, consistent with the First Amendment, can be regulated. The ordinance made it a crime to use "fighting words" knowing that they would "arouse anger, alarm or resentment in others on the basis of race, color, creed, religion or gender." The Minnesota Supreme Court ruled that the action was not protected by the First Amendment, based on the precedent set in Chaplinsky v. New Hampshire, and concluded that the ordinance was narrowly tailored and served a compelling interest in protecting the community against bias-motivated threats to public safety. The United States Supreme Court, however, reversed the conviction. Justice Scalia stated that "The First Amendment does not permit St. Paul to impose special prohibitions on those speakers who express views on disfavored subjects."
In 1989, in Kenosha, Wisconsin, a group of young black men and boys, including Todd Mitchell, beat a young white man after being inspired by a scene from the movie "Mississippi Burning." The victim was rendered unconscious and remained in a coma for four days. Mitchell was convicted of aggravated battery, an offense that normally carries a maximum sentence of two years' imprisonment. In Wisconsin v. Mitchell, the jury found that Mitchell had intentionally selected his victim because of his race, and his sentence was increased to seven years. The Court rejected Mitchell's claim that it was unconstitutional to punish his ideas. Relying on R.A.V. v. St. Paul, the Court held that the statute was directed at the defendant's conduct, namely committing a crime. The statute did not punish Mitchell's bigoted ideas, but rather the actions resulting from them.

V. Hate Speech on the Internet and Court Decisions
The precedents set by the above cases have also been applied to the Internet in court rulings. There are relatively few cases involving hate speech on the Internet, but the following show the limits of free speech with regard to this new medium.
In United States v. Machado, the court held that transmitting racially motivated harassing speech over the Internet was a violation of the law. In 1996, Richard Machado, 21, a student at the University of California, sent a threatening e-mail signed "Asian Hater" to 60 Asian students. In the message, he said he hated Asians and wanted to kill them all. Although the e-mail was sent from an anonymous account, Machado was questioned by the campus police and admitted being the author of the messages. In August 1997, the Central District of California charged Machado with violating a law that prohibits interference with a federally protected activity, in this case attending a public college, because of race, color, religion, or national origin. The first trial ended without a conviction; at the second trial, where there was evidence that Machado had sent other threatening messages to campus staff before the "Asian Hater" one, the jury found him guilty in February 1998 of violating the Asian students' civil rights. He was sentenced to one year in prison. Machado was prosecuted under federal law because his messages were motivated by racial bias and aimed at people engaged in a federally protected activity.

A similar case, decided in 1999, is United States v. Kingman Quon. Quon sent hateful e-mails to Latino faculty members and students at California State University, Los Angeles, and at the Massachusetts Institute of Technology. The messages stated that he would "come down and kill" them and that Latinos were "too stupid to get jobs." He pleaded guilty and received a two-year sentence.

In 1999, the case Commonwealth of Pennsylvania v. ALPHA HQ involved a complaint against Ryan Wilson, owner of the white supremacist Web site ALPHA HQ. He was charged with making terroristic threats, harassment, and ethnic intimidation after placing harassing material on the Internet that targeted employees of the Reading-Berks Human Relations Council. One of the pictures on the site showed a bomb blowing up the office of Bonnie Jouhari, an employee involved in anti-hate activities; a comment next to the image stated that she should be hanged from a tree. Because of these threats, the white supremacists were ordered to remove their Web page from the Internet. In addition, Stormfront, Inc., the company that provided ALPHA HQ its domain name service, was also named in the case and was ordered to stop providing service. Wilson did not contest the charges, and the site was removed from the Internet. The case set a precedent: a judge ordered a Web site shut down because the information on it was harmful.
The Brandenburg standard makes it hard to punish online hate speech. A call for lawless action may be provable, but the imminence of any resulting action is hard to demonstrate. Thus, people can post messages on Internet bulletin boards calling for action, and as long as the messages do not provoke imminent violence, they will be protected.
Regardless, when a person is targeted because of race, religion, or sexual orientation, the perpetrator can face enhanced penalties. If there is evidence that a racist thought led to a racist assault, as in Wisconsin v. Mitchell, and that the victim was selected because of his or her race, the conduct receives no First Amendment protection. The same is true for hateful views expressed on the Internet.
VI. Internet Service Providers' Liability: The Communications Decency Act
In an attempt to protect children from indecent material, Congress passed the Communications Decency Act (CDA) in 1996 as part of the Telecommunications Act of 1996. It made it a crime to send or display "obscene or indecent" material to minors on the Internet. As soon as President Clinton signed the bill, the American Civil Liberties Union challenged it. In 1997, the Supreme Court heard its first case about cyberspace, Reno v. American Civil Liberties Union. The CDA's indecency provisions were declared overbroad, since they banned offensive material in general, and the Court recognized that the Internet is a unique medium that cannot be regulated like broadcasting. However, the provisions Congress included concerning the liability of Internet Service Providers remained unchanged. Section 230 states that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Because hate speech on the Internet must be posted on a Web site, which in turn has to be hosted by a provider, the question of ISP liability is important. Even in cases involving child pornography, libel, and defamation, the provider America Online was not found liable by the courts.
VII. AOL Case Study
The service provider America Online (AOL) was involved in three major cases in which Section 230 of the Communications Decency Act, which prevents Internet Service Providers from being held responsible for the content of messages posted on their networks, was upheld.
In 1994, Richard Russell photographed and videotaped John Doe (a pseudonym) and two other minors engaging in sexual activity with each other and with him. He then used an America Online (AOL) bulletin board to sell the photos and videotapes, without actually showing the images on the Internet. In 1997, John Doe's mother, Jane Doe (also a pseudonym), sued Russell and AOL, asking the court to order Russell and/or AOL to pay her and her son $8 million in compensation for their emotional injuries.
In Doe v. America Online, Jane Doe claimed that AOL was negligent because it knew that Russell and other pedophiles used AOL to market and distribute child pornography. The issue in the case was "whether a computer service provider with notice of a defamatory third party posting is entitled to immunity under section 230 of the CDA." AOL asked the trial court to dismiss Doe's complaint, arguing that it was immune from the lawsuit under the federal statute, and the court ruled in the ISP's favor.
In Blumenthal v. Drudge, Matt Drudge, a cyber-columnist, wrote an article alleging domestic violence by Sidney Blumenthal (an aide to President Clinton) against his wife. Although Drudge retracted the story and apologized, Blumenthal filed a libel suit against him and AOL. Even though the article was posted on Drudge's site within AOL's network, the U.S. District Court judge dismissed AOL from the suit because it was protected by Section 230 of the CDA.
In Zeran v. America Online, the issue was whether AOL could be held liable for being slow to remove a series of allegedly defamatory messages posted on its bulletin board by an unidentified third party. An anonymous message posted on AOL's message board wrongly stated that Kenneth Zeran was selling T-shirts with offensive slogans about the Oklahoma City bombing. Because Zeran's phone number was listed in the message, he received death threats and insulting calls. He sued AOL for taking too long to remove the initial message and the ones that followed, and for failing to post a retraction. The U.S. District Court for the Eastern District of Virginia cited Section 230 and held in AOL's favor, and the Court of Appeals for the Fourth Circuit affirmed, likewise relying on Section 230. In the courts' view, the purpose of Section 230 was precisely to protect freedom of speech on the Internet, since "It would be impossible for service providers to screen each of their millions of postings for possible problems."
Although these three cases are not about hate speech per se, they show how hard it is to regulate the Internet, since service providers are not held liable for the content of the messages found on their servers. Nevertheless, ISPs have discretion over the material hosted on their networks; most choose to obey the law, and many ban libelous or defamatory speech, for instance. As private corporations, service providers have the right to expel customers or to delete the content of their messages.
VIII. Internet Regulations of Hate Speech
So far, most Internet regulations are designed to protect children, for instance by making it illegal to transmit child pornography. However, some states have legislation punishing harassment and fighting words on the Internet. Two examples are the Connecticut and Georgia Internet laws, which respectively prohibit harassment and terroristic threats. These laws do not prevent people from holding personal opinions, whatever they may be, but rather aim to stop criminal activity over the Internet.
IX. Blocking Hate Speech: Filters
As discussed above, ISPs can choose whether or not to host a given Web site; users, however, can also decide for themselves what they want to be exposed to. By using a filter, they can deny their computer access to certain Web sites. The World Wide Web Consortium, an international computer industry organization, proposed a technology called the Platform for Internet Content Selection (PICS). Programs based on it rate the content of a site against various criteria, such as violence, language, or nudity, and then allow or deny access.
Filtering software, when used by private individuals or corporations, involves no governmental regulation and therefore does not raise First Amendment issues. Indeed, in the Communications Decency Act, Congress encourages parents to use such tools to protect their children. The use of filters in public places such as libraries or schools has, however, raised legal issues, since these governmental institutions are not allowed to ban constitutionally protected speech. They may screen out threatening, obscene, or libelous speech, but rarely hate speech.
Most filters focus on screening out pornography, but some, like SurfWatch, do block hate speech and prevent access to sites that discriminate against individuals based on their race or religion, for instance.
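The rating-and-threshold scheme described above can be illustrated with a short sketch. This is not the actual PICS format or any real product's implementation; the category names, rating levels, and function are hypothetical examples of how a client-side filter might compare a site's content ratings against a user's configured limits before allowing access.

```python
# Illustrative sketch of a rating-based content filter (hypothetical,
# not the real PICS specification or any shipping product).

# Hypothetical rating categories, similar to those mentioned in the text.
RATING_CATEGORIES = ("violence", "language", "nudity")

def is_allowed(site_ratings, user_limits):
    """Allow a site only if every rated category is at or below the
    user's limit. Unrated categories are treated as level 0 (none)."""
    return all(
        site_ratings.get(category, 0) <= user_limits.get(category, 0)
        for category in RATING_CATEGORIES
    )

# Example: a parent permits mild language but no violence or nudity.
limits = {"violence": 0, "language": 2, "nudity": 0}

print(is_allowed({"language": 1}, limits))                 # allowed: True
print(is_allowed({"violence": 3, "language": 1}, limits))  # blocked: False
```

The design point is that the decision happens entirely on the user's machine: the government is not involved, which is why, as noted above, private filtering raises no First Amendment issue.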
X. Discussion
Combating online extremism presents enormous technological and legal difficulties. Even if it were technically possible to keep sites off the Internet, the global nature of the medium makes legal regulation almost impossible. In addition, in the United States the First Amendment guarantees freedom of speech to everyone, even people with offensive opinions. As the cases discussed above show, hateful speech that does not lead to action is hardly punishable. Furthermore, ISPs can host harmful material without being held responsible, which leaves it to their goodwill to create a safe Web environment.
Internet Service Providers (ISPs) such as America Online, which is based in the United States, are treated like "common carriers" and as such enjoy full protection. They are not legally responsible for the content of the sites they host, but it is their decision whether or not to host hate sites. Some carriers do host haters, while others have adopted strict terms of service prohibiting subscribers from using their facilities to promote hate.
Just as an Internet Service Provider can remove a hate site from its servers, private individuals can remove such sites from their screens. Filtering software products help people or concerned parents to block offensive material from their home computers.
Bibliography
“Citizen Internet Empowerment coalition.” www.ciec.org
“Constitutional Challenge to Hate Crimes Statutes.” Available at: www.adl.org
“Findlaw: Cyber Space Law Center: Internet/Freedom of Expression: Defamation and Libel.” http://cyber.findlaw.com/expression
“Hatecrime.” http://ucl.boward.cc.fl.us
“Hate Speech: The Speech that Kills.” http://www.indexoncensorship.org
“Indecency, Ignorance, and Intolerance: The First Amendment and the Regulation of Electronic Expression.” http://warthog.cc.wm.edu
Internet Law Library. Available at: http://www.priweb.com
“Legal Information Institute.” www.law.edu/topics/communication
Middleton, Kent R., Trager, Robert, and Chamberlin, Bill F. The Law of Public Communication. New York: Addison Wesley Longman, 2000.
“Pending Court Cases and Legislation.” http://www.nlp.cs.umass.edu/aw/ch13
Perkins Coie LLP. www.perkinscoie.com
“State Law on Hate Crime.” http://gsulaw.gsv.edu/lawland
“Telecommunications and The First Amendment” Available at http://www.bsos.umd.edu
“Terrorism on the Internet.” www.loundy.com