In urban planning, there is an idea called “eyes on the street”: crime tends to be lower in areas under heavy public surveillance. A fourth-grader, for example, is less likely to have his lunch money stolen on a walk to and from elementary school that passes through busy streets than on one that takes him through empty ones.

A busy street, or forum, like this is way safer than one… Image courtesy of wikipedia.org.

Vendors, pedestrians, and residents who are watering plants, drying laundry, or simply people-watching from the high-rises all serve as monitoring agents for the community. And it’s not just that they intervene when they sense criminal activity brewing, though that may happen. Their very presence is preventative. A crowded street is safer than a deserted one not merely because there are more people to intervene or to call the police. The crowd itself is a regulatory agent.

The simplest explanation for this phenomenon is that public crimes are more dangerous than private crimes. Mugging someone in a dark alley is different from mugging someone on a main street, and not just because the mugger is less likely to get caught in the alley. It’s different because it’s not just a crime against the specific victim; it’s a crime against the entire community. More eyes amp it up from a private act to a public one.

The lesson of “eyes on the street” may prove useful in solving the problems that anonymity and unregulated discussion forums have created on the web. What happens when bullying is put out in the open? What happens when we move it from the dark alleys of private IM sessions into the crowded streets of social media platforms? What if our usernames were not masks that we get to hide behind but, rather, forms of identification that we carry with us into both our public and private lives?

Openness is a powerful force for good, far more powerful than any of the censoring mechanisms currently being proposed by social-network researchers. A recent article published in the MIT Technology Review suggested a natural language filter to identify “offensive” tweets, giving the example of a homophobic-speech filter that would ban tweets containing the words “gay” and “disgusting”. Aside from the blatant crudeness of such a metric, this would undoubtedly turn into a battleground for political agendas. For that matter, whether “offensiveness” even ought to be conflated with “bullying” is an assumption worth challenging in its own right.
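To see just how crude such a metric is, consider a minimal sketch of a keyword co-occurrence filter of the sort the article describes. The word pairing comes from the example above, but the matching rule and function name are my own illustrative assumptions, not the researchers’ actual method; the point is that a supportive message and an abusive one look identical to it.

```python
import string

# Hypothetical word pairs; a tweet is flagged if it contains a word
# from both sides of any pair. The pairing below is the example cited
# in the article, the rest of the design is an assumption.
OFFENSIVE_PAIRS = [
    ({"gay"}, {"disgusting"}),
]

def is_flagged(tweet: str) -> bool:
    """Flag a tweet if every word on both sides of some pair appears in it."""
    stripped = tweet.lower().translate(str.maketrans("", "", string.punctuation))
    words = set(stripped.split())
    return any(a <= words and b <= words for a, b in OFFENSIVE_PAIRS)

# Only word presence is checked, so a supportive message trips the
# same filter as an abusive one.
print(is_flagged("Being gay is not disgusting, ignore the haters."))  # True
print(is_flagged("You are gay and disgusting."))                      # True
print(is_flagged("That behavior was disgusting."))                    # False
```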

…than this. Image courtesy of wikipedia.org.

Instead of resorting to censorship or any sort of content filter, which would sow distrust between users and the owners of the system, we’re better off taking advantage of something people are more than willing to provide: their eyes. People love to people-watch, and this is as true on a park bench as it is on the web. In the crusade against cyberbullying, we’re better off trying to create a more transparent web than a more heavily regulated one. The latter is likely to create more problems than it solves, and the former is likely to solve problems beyond this particular one.

Eyes on the streets, eyes on the web. The more transparency, the merrier. It’s one of those things you can never have too much of.
