Erin Bradley of Reprise Media asks this week on the SearchViews blog whether search engines should manually re-order suicide-related search listings to shift sites aimed at helping people above sites that give tips on how to commit the act. The debate was sparked by an article in The Guardian about two British Internet users who formed an Internet suicide pact after meeting in an online chat room.

From the article:

Internet companies are being urged by the Home Office to make so-called suicide web sites and chat rooms more difficult to access. The move comes after two strangers forged Britain's first Internet suicide pact, dying side by side two days after making contact for the first time on a chat room dedicated to discussions about suicide.

Talks are taking place with a number of service providers, including Yahoo! and AOL, and search engine companies, in an attempt to reprioritise the results that are thrown up during a trawl on the Internet. "When somebody keys in 'suicide' and 'UK', we would like them to be offered a link to the Samaritans long before they find a website showing them what they can do with a car exhaust and a hosepipe," one official said.

Perhaps surprisingly, one of the voices in defence of suicide chat rooms yesterday was that of a close relative of Ms Williams, who believes that parental control may be needed, but not legislation.

"The web is there as a source of information for all of us, and it's better that these discussions aren't driven underground," he said. "Building high-rise blocks didn't increase the suicide rate, and I don't think the Internet will either."

The article and the blog post raise a good question: should search engines deliver the content deemed most relevant by their algorithms (which aim to surface the content Internet users themselves have effectively voted most relevant), or should they fiddle with the results to serve up what some group "thinks" should be there? Google has faced this issue in the past over terms like "miserable failure," "scientology," and "Jew."

The reality is that for searches like "suicide" or even "suicide ideas," the results at the major search engines are already heavily skewed toward resources that aim to help people contemplating suicide. That makes the argument somewhat moot for this particular query, but the article and the idea raise a larger point.

Let's say the results weren't skewed toward help lines, but instead featured site after site of detailed how-to guides. If there were a "need" for this type of censorship, would it be a good idea? It may seem like an easy call on an issue like suicide, where it's generally agreed that it's better to get people help than to assist them in following through with the act, but what about more controversial issues? Abortion? Euthanasia? Building your own meth lab? Pornography? At what point does it become OK for a search engine to taint the results with its own prevailing morals?

Undoubtedly, some search engines are already playing this game. Although it has since backtracked, Google once inadvertently admitted to altering search results in a Wired article titled "Inside the Soul of the Web."

A snippet:

This query hasn't come from Kuala Lumpur or Genoa or Montevideo, but just outside Google's front door. A drama is unfolding only a few miles away, and there is no way to help; I don't even know the person's name. I can only sit and watch the words crawl up the screen and disappear. This is a contract between man and machine, and I can only observe, not intervene.

Stricken, I glance over at Rae, who has returned from night league volleyball, his spiky blond hair still wet. He, too, has seen the query and is typing away furiously. Finally he stops and looks up at me. "They're going to be OK. They got referred to the right places."

"You can do that?"

"Yeah, well, I can see how the system responds. And if it doesn't give the right information, I'll find better sites and attach them for future queries."

"But you can't help the people who ask the original question."

"No."

"Just the ones that follow?" Rae nods.

"You've just got to do the right thing. The hard part is figuring out what the right thing is."He thinks a moment, then gestures at the screen. "I know people trust in this thing. They believe it will have the answer. And I don't want it to fail them." As Rae talks, 50 more queries scroll up the screen.

Further discussion of the issue is taking place at the Search Engine Watch forums in the thread "Should Google & Other Search Engines Censor Suicide Searches?"

Discuss this article in our own Small Business Ideas forum.
October 21, 2005

Jennifer Laycock is the Editor of Search Engine Guide, the Social Media Faculty Chair for MarketMotive and offers small business social media strategy & consulting. Jennifer enjoys the challenge of finding unique and creative ways to connect with consumers without spending a fortune in marketing dollars. Though she now prefers to work with small businesses, Jennifer’s clients have included companies like Verizon, American Greetings and Highlights for Children.