For anyone not following German politics, let me retell the story; it's not all that important anyway, since there are cases like this one in many countries. The issue is simple: our former First Lady was rumoured to have worked at an escort service years ago. The claim seems to be completely made up, but that's what rumours are all about. Be that as it may, it spread around the internet, and after some time Google's autocomplete suggested it to anyone searching for her. She was and is quite peeved about it and sued Google; the result of that lawsuit is yet to be determined.
Now the quintessence of the problem is whether Google is required to censor its results (and the suggestions are results, in a way). It has done so in the past: ordered by court or law (e.g. Italy/France/Egypt, or the DMCA), semi-voluntarily (e.g. China a few years ago, or recently RS/torrents in the suggestions) or completely voluntarily (e.g. the recent anti-Muslim video scandal).
Let's state a few facts that I feel are not obvious to everyone.
- It is technically possible, but only to a degree. I've heard both that it is and that it isn't, and both extremes are pretty much nonsense. I'm not going to discuss how searching works on the inside, but you could always put a filter on the end results; there is no denying that. It wouldn't be costly either, compared to the processing the data undergoes already, and the fact that it has happened in the past proves this. Still, it would be completely out of the question to try and filter everything that might be offensive. It would be close to impossible to come up with a list that fits everywhere, human intervention is too costly, and last but not least, what's acceptable and what isn't is just not canonical (think different countries, cultures, religions).
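To make the "filter on the end results" point concrete, here is a minimal sketch of what such post-processing could look like. This is purely my illustration, not Google's actual pipeline; the blocklist entries and the per-region lookup are made-up assumptions:

```python
# Hypothetical post-hoc filter: the search/suggestion machinery runs
# unchanged, and a cheap blocklist pass is applied to the finished list.
# Per-region lists illustrate why one global list can't fit everywhere.
BLOCKLISTS = {
    "de": {"some banned phrase"},   # assumption: entries maintained per jurisdiction
    "fr": {"another banned phrase"},
}

def filter_suggestions(suggestions, region):
    """Drop any suggestion that appears on the region's blocklist."""
    blocked = BLOCKLISTS.get(region, set())
    return [s for s in suggestions if s.lower() not in blocked]

print(filter_suggestions(["cats", "some banned phrase"], "de"))  # → ['cats']
```

Compared to generating the results in the first place, a set lookup per suggestion is negligible, which is the cost argument above. The hard part is not the code but deciding what goes on each list.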
- It's not a new problem. I don't even understand why my favourite newspaper made that claim in the first place; it's just way too far out there. There are so many precedents for every aspect of the story, and I'm not just talking about the internet here. For the specific case of Mrs Wulff, just think of all the garbage the yellow press rumours every day. It's just a sad fact of being in the spotlight.
- This is not about free speech. Although I can't argue with active undertakings to hide certain results being "censorship" of some kind, so is not promoting certain others. Every little tweak Google makes to its search algorithm has similar, albeit indirect, effects. Also, at least from a German point of view, it is OK to prevent certain things from being said. The German constitution deliberately weighs freedom of speech against demagoguery and slander, preventing you from saying things considered incitement of the people or insults. This is difficult, since there are fine invisible lines that are not obvious or the same to everyone, but in practice it works and is reasonable.
- This is not a completely naturally occurring phenomenon. Yes, the automatic suggestions are solely generated by the number of users that search for a specific phrase, but (big but) a lot of them only search for those phrases because they were suggested to try them (I wouldn't have come up with half the junk I have searched for if the suggestions hadn't been so outlandish).
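The feedback loop described above can be shown with a toy model. The query counts, the prefix matching, and the "every display causes one extra search" rate are all my own simplifying assumptions, not how Google actually works:

```python
from collections import Counter

# Toy model of frequency-ranked autosuggestions. Made-up query counts:
counts = Counter({"wulff biography": 10, "wulff rumour": 8})

def suggest(prefix, k=2):
    """Return the k most-searched queries starting with the prefix."""
    matches = [(q, n) for q, n in counts.items() if q.startswith(prefix)]
    return [q for q, _ in sorted(matches, key=lambda x: -x[1])[:k]]

# Assumption: each time a suggestion is displayed, one extra user searches
# it out of curiosity, boosting the very count that got it suggested.
for _ in range(5):
    for q in suggest("wulff"):
        counts[q] += 1  # the feedback loop

print(counts)
```

After five rounds both queries have gained searches they would not have gotten organically; being suggested keeps them suggested, which is exactly why the phenomenon is not purely natural.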
Is it then justified to force Google to censor its results? I'd say yes, it can be, though the case at hand would have to be rather severe. I don't think it was a bad idea of them to remove the anti-Islam hate video, despite the response in the Islamic world being ridiculously over the top. I doubt even half the people rioting have even seen this load of tripe.
Is it OK to force Google to remove autosuggestions? I'd say yes, but the case at hand would have to be even clearer than for mere search results. It is definitely desirable to censor (child) pornography from there, for example. Mrs Wulff, on the other hand, has no right to demand this. The tabloids would have to be outlawed long before that.
The only compromise I could imagine would be to make a list of the results/suggestions not shown, alongside their frequency, accessible somewhere. This wouldn't solve the problem of technical feasibility though.