Tuesday, October 2, 2007

Assignment 6, Option 1: Facebook's Leviathan

There are many generally accepted "norms" on Facebook. Unlike MySpace, a similar social networking site, Facebook shows almost no trace of spam mail, chain letters, or the ever-obnoxious surveys. New users seem to pick up the norms quickly, either by observing more seasoned users or by being reprimanded by one of their friends. Eventually and inevitably, they fall in line with the rest of the users.

Every day, millions of new Facebook wall posts appear. Aside from instant messenger, it is probably the most popular way to communicate online. And while for the most part users don't have to worry about censorship, there are times when people cross the accepted line of Facebook netiquette, and the Leviathan has to step in.

The times when the Leviathan (in this case a Facebook moderator) has to step in usually occur in a group setting. The groups Facebook allows you to join offer so many different viewpoints and opinions that the content will inevitably offend someone. When this happens, one of two scenarios usually follows: a) the offended individual reports the person to the Leviathan immediately, or b) they join the group, talk trash, and start the inevitable flame war, which eventually results in a report being filed anyway, possibly by more than one user.

When filing a report to the Leviathan, the user is greeted by a screen similar to this:

[Screenshot of Facebook's report form]

First, the user is required to confirm that the person, group, or picture they are reporting is in violation of the Terms of Use. The Terms of Use is basically a rulebook of everything you can and can't do or say on the site. If the accused is in violation, the user is then asked to give the reason they are reporting them.
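
As a rough illustration (not Facebook's actual code), the two-step report flow described above could be modeled as a small data structure and a validation function: the report is only accepted once the reporter confirms a Terms of Use violation and supplies a reason. All field names and reason categories here are assumptions made for the sketch.

```python
from dataclasses import dataclass

# Illustrative sketch of the two-step report flow described above.
# Field names and reason categories are assumptions, not Facebook's API.

REASON_CATEGORIES = {"nudity", "harassment", "spam", "hate speech", "other"}

@dataclass
class AbuseReport:
    reporter_id: int
    target_id: int             # the person, group, or picture being reported
    confirms_tou_violation: bool
    reason: str

def file_report(report: AbuseReport) -> str:
    """Accept a report only if both required steps were completed."""
    if not report.confirms_tou_violation:
        return "rejected: reporter must confirm a Terms of Use violation"
    if report.reason not in REASON_CATEGORIES:
        return "rejected: a reason must be selected"
    # In a real system the report would now be queued for a human moderator.
    return "queued for moderator review"
```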

The Facebook Leviathan is much like the moderator described by Wallace, one who "can have a calming influence and ensures participants that a means is available to resolve disputes should they arise" (Wallace 70). The Leviathan has ultimate authority and can do many things: it can delete posts, groups, and users' accounts, and even ban e-mail addresses from ever using Facebook again (a more severe punishment when the site was college-exclusive). Facebook even has an automated Leviathan of sorts that notices when a person is adding friends too fast. It first warns the user to slow down, and eventually blocks the user temporarily from adding friends. This is an effective way to curb spamming and other unsolicited activity.
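
To make that automated check concrete, here is a minimal sketch of how a friend-request throttle like the one described above might work: warn once a user passes a soft limit within a time window, then temporarily block them past a hard limit. The thresholds and names are illustrative assumptions, not Facebook's actual values.

```python
import time
from collections import deque

# Hypothetical friend-request throttle: warn past a soft limit,
# temporarily block past a hard limit. All thresholds are assumed.

WINDOW_SECONDS = 60 * 60       # look at requests made in the last hour
WARN_LIMIT = 20                # soft limit: warn the user to slow down
BLOCK_LIMIT = 40               # hard limit: temporarily block adding friends
BLOCK_SECONDS = 60 * 60 * 24   # length of the temporary block

class FriendRequestThrottle:
    def __init__(self):
        self.recent = {}         # user_id -> deque of request timestamps
        self.blocked_until = {}  # user_id -> time when the block expires

    def record_request(self, user_id, now=None):
        """Record one friend request; return 'ok', 'warn', or 'blocked'."""
        now = time.time() if now is None else now

        # Still inside a temporary block?
        if self.blocked_until.get(user_id, 0) > now:
            return "blocked"

        # Drop timestamps that have fallen outside the window.
        history = self.recent.setdefault(user_id, deque())
        while history and now - history[0] > WINDOW_SECONDS:
            history.popleft()

        history.append(now)

        if len(history) >= BLOCK_LIMIT:
            self.blocked_until[user_id] = now + BLOCK_SECONDS
            return "blocked"
        if len(history) >= WARN_LIMIT:
            return "warn"
        return "ok"
```

A caller would check the returned status before sending the request: "warn" maps to the slow-down message the post mentions, and "blocked" to the temporary restriction on adding friends.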

Overall the Leviathan is necessary, because if the site went unmoderated all hell would break loose, with flame wars constantly breaking out and no check on unruly users, similar to what occurred on the MUD LambdaMOO (Wallace 71-73). And while moderators are certainly necessary, they would have too much trouble on their hands if not for conformity and overall good behavior by most users: "The fact that humans tend to conform to group norms may be one of the key reasons Internet communities continue to thrive and flourish" (Wallace 73). So moderators are a good thing, especially in moderation.


Wallace, Patricia. The Psychology of the Internet. Cambridge University Press, 1999.

1 comment:

Katelyn McClellan said...

Tyler, I found your post to be very interesting. I knew that Facebook had a Leviathan, but I had no idea there were so many restrictions on users. For example, your explanation of the warning against friending too many people was news to me. I thought you did a good job of relating the Facebook features to Wallace's Chapter 4. You really described the site in full, and the picture you posted certainly was an added bonus!

I think you are correct about Facebook needing a Leviathan. If everyone could post anything they wanted, I do not think the site would operate as well as it does. Certain restrictions have to be in place, especially since Facebook is a site for people of all ages.