Does AOL Discriminate Against Sexual Minorities in its Blocking System for AOL Members?

37 Pages
Posted: 21 Apr 2014

Alex Marthews

Digital Fourth / Restore The Fourth

Date Written: March 22, 2001

Abstract

Blocking systems, usually referred to as “filtering software”, are becoming steadily more pervasive. Commercial blocking systems such as “FamilyFilter” and “CyberPatrol” are widely available on the World Wide Web. They market themselves as being able to block unacceptable content of various kinds, from pornography to hate speech. In practice, this ability is considerably circumscribed by the ambiguity and context-dependent nature of human language.

For example, a page containing information on breast cancer may be inadvertently filtered out because it contains many repetitions of a word that in another context might be considered inappropriate. It is very hard for these software programs to be sensitive to context, to distinguish mere information about drugs or sex from pages that lubriciously encourage drug use or sexual experimentation.
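To make this context problem concrete, the following minimal sketch shows how a purely keyword-based filter behaves. It is an illustration only: the word list and the sample page text are hypothetical and are not drawn from any actual product.

```python
# Minimal sketch of a context-blind keyword filter. The word list and the
# sample page text are hypothetical; this is not any vendor's actual code.
BLOCKED_WORDS = {"breast", "sex", "drugs"}

def is_blocked(page_text: str) -> bool:
    """Block a page if any listed word appears, regardless of context."""
    words = {w.strip(".,;:!?").lower() for w in page_text.split()}
    return bool(BLOCKED_WORDS & words)

health_page = "Early detection of breast cancer saves lives. Ask about a breast exam."
print(is_blocked(health_page))  # True -- medical information is over-blocked
```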

The Children’s Internet Protection Act, a recent piece of federal legislation, mandates the use of blocking systems in schools and libraries that receive certain types of federal funding. Civil liberties groups, including my client the Online Policy Group, oppose mandating the use of blocking systems and generally believe that a filtered Internet represents a threat to freedom of speech and expression.

Commercial blocking systems are not the only examples of such systems: they are merely the most visible form. Online service providers such as AOL and Earthlink incorporate blocking systems into their browsers. AOL, for example, markets a “Parental Controls” system to adults, which is intended to enable parents to control what their children see on the Internet.

This immediately presents some important concerns: what does the AOL system block? Who decides what gets blocked, and what criteria do they use? One of the difficulties in assuaging these concerns is that the online service providers (OSPs) cannot be wholly explicit about their criteria: if they were, the webmasters of blocked pages could that much more easily circumvent the blocking. If we wish to study AOL’s blocking system, we must therefore do so surreptitiously, by testing various categories of webpage and determining which of them get blocked.
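As a rough illustration of how such testing might be recorded, the sketch below tallies blocked/not-blocked observations for test pages by category. The URLs, category labels, and access-level name are illustrative placeholders, not the study’s actual data; the blocked status of each page would be observed manually through an AOL account configured at the relevant setting.

```python
from collections import Counter

# Bookkeeping sketch for testing which pages are blocked. Whether each page
# is blocked would be observed manually through an AOL account at a given
# Parental Controls setting; every entry below is an illustrative placeholder.
observations = [
    # (url, category, access_level, blocked)
    ("http://example.org/page-a", "pro-gay phraseology", "teen", True),
    ("http://example.org/page-b", "anti-gay phraseology", "teen", False),
    ("http://example.org/page-c", "pro-gay phraseology", "teen", False),
]

# Tally blocked / not-blocked counts per category at this access level.
tallies = Counter((category, blocked) for _, category, _, blocked in observations)
for (category, blocked), n in sorted(tallies.items()):
    print(f"{category:25s} blocked={blocked}: {n}")
```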

Particular problems may arise if one believes that an OSP is using criteria biased against a specific group or Internet community. AOL’s relationship with sexual minorities over the last five years has been uncomfortable. Its CEO, Steve Case, regularly contributes to organizations which oppose homosexuality. In a widely publicized case, AOL released to the US Navy the login details of a subscriber who had identified themselves as a naval officer and as homosexual. AOL chatroom monitors have been criticized for applying a double standard of obscenity to the postings of heterosexual and homosexual chatroom participants.

Given this history, it is of particular interest to determine as precisely as possible what AOL blocks, what it does not block, and whether there is a pattern in which AOL over-blocks “webpages containing pro-gay phraseology” relative to “webpages containing anti-gay phraseology”. These categories, and our reasons for choosing them, are described more precisely below. If such a pattern is found, it may help to establish whether the blocking system used by AOL is facially discriminatory towards sexual minorities. We make no assertions about the intent behind blocking any particular webpage: we seek simply to ascertain whether the fact that a given page contains pro-gay or anti-gay phraseology significantly affects the likelihood that that page will be blocked at a given access level.

While this may seem a vague and subjective way to model a blocking system, it in fact allows the system to be modeled while preserving a significant measure of statistical objectivity. In order to defuse allegations of bias, we aim to be as explicit as possible about our methodology, and to adhere to the highest possible standards of qualitative research.
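As a rough illustration of the kind of comparison involved, the following sketch applies a standard two-proportion z-test to hypothetical blocking counts for the two categories. It is not the paper’s actual statistical procedure, and all counts are invented for the example.

```python
import math

def two_proportion_z(blocked_a, total_a, blocked_b, total_b):
    """Two-proportion z-test: is the blocking rate for category A different
    from the rate for category B? Returns the z statistic."""
    p_a, p_b = blocked_a / total_a, blocked_b / total_b
    p_pool = (blocked_a + blocked_b) / (total_a + total_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Hypothetical counts: 30 of 100 "pro-gay phraseology" pages blocked,
# versus 12 of 100 "anti-gay phraseology" pages at the same access level.
z = two_proportion_z(30, 100, 12, 100)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests a difference at the 5% level
```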

The subject of blocking systems is an extremely contentious one, with enormous significance for Internet users. It can be argued that one of the most valuable aspects of the Internet is the wide variety of material it contains. In the offline world, we can usually tell the source of a piece of information and judge its reliability; but if online service providers begin to constrain what adults may see on the Internet, then it is a worthy object of an advanced policy analysis to determine what we are not allowed to see, and to suggest ways to reinforce freedom of information on the Internet.

Keywords: filtering systems, blocking systems, natural language analysis, online discrimination, AOL, children and the Internet

JEL Classification: Y4

Suggested Citation

Marthews, Alex, Does AOL Discriminate Against Sexual Minorities in its Blocking System for AOL Members? (March 22, 2001). Available at SSRN: https://ssrn.com/abstract=2425201 or http://dx.doi.org/10.2139/ssrn.2425201

Alex Marthews (Contact Author)

Digital Fourth / Restore The Fourth

28 Temple St.
Belmont, MA 02478
United States
