Bing Is Suggesting the Worst Things You Can Imagine (howtogeek.com)
55 points by basicplus2 on Oct 28, 2018 | hide | past | favorite | 62 comments


At least searching "muslims are" in Bing gives you honest autocompletes that reflect what people actually search for. Google, on the other hand, evidently censors content it deems offensive. If you search "muslims are" in Google, there are no autocomplete results at all. I doubt Google lacks data on "muslims are *" searches.

I have safe search off, meaning I don't want to be patronized when I am searching things. I would much rather know what people search for than have Google treat me like I'm their child with their hands covering my eyes during the dirty scenes.


    > I don't want to be patronized when I am searching things.
It's more like they're removing the penis graffiti.

Most of the time, I search to find some content. You are probably the same. It's a search engine. Considerations aside from whether a search suggestion helps one locate desired content are beside the point. What's more, in some of the author's cases, the content would be illegal.

If you are serious about researching search trends, then you could use tools, or lobby for more tools, like https://trends.google.com


I have just tried "muslims are" on bing.com right now and I get a single autocomplete result: "muslims are treated fairly". And "black people are" yields nothing at all.

Unless it's a regional thing, it seems that someone at Bing has read this news and now they are censoring exactly like Google.

If I try those same searches in Spanish, I do get some racist suggestions though. Probably not for long...


Some things should not be autocompleted. I can't think of a reason why a search engine should suggest a completion if you start typing something generic like "xyz are ...".


Searching "muslims are vegetarian" could be seen as another way to search "are muslims vegetarian". A good search engine lets you search query soup like this. They don't necessarily return the same results, but people are pretty bad at querying.
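The "query soup" idea above can be sketched as an order-insensitive normalization step. This is purely illustrative (not Bing's or Google's actual pipeline): a retrieval layer keyed on a bag of tokens would treat word-order variants of a query alike.

```python
# Illustrative sketch: reduce a query to an order-insensitive set of
# lowercase tokens, so reorderings map to the same key.

def normalize(query: str) -> frozenset:
    """Return the query as a bag (set) of lowercase tokens."""
    return frozenset(query.lower().split())

# Reorderings normalize to the same key, so a lookup keyed on this
# form would treat "X are Y" and "are X Y" as the same question.
a = normalize("muslims are vegetarian")
b = normalize("are muslims vegetarian")
assert a == b
```

Real engines do far more (stemming, stopword handling, ranking), but this is the core of why the two phrasings can land on similar results.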


Yeah, there's nothing wrong with a query like that, and I think autocomplete is fine once you've typed "muslims are vege...".

But I don't see why you'd autocomplete a sentence that could go anywhere. Just because many people type that isn't a good reason.


Why isn't that a good reason?

It's not a search engine's job to be the thought police.


Can we please stop wildly exaggerating every attempt to remove graffiti and trolling from the Internet as equivalent to invoking the “thought police”? Nobody is sending the authorities after you for entering overtly-racist searches into Google.

Keep in mind that you are currently enjoying the benefits of discussing this topic on a highly-moderated discussion forum. How relevant do you think Hacker News would be if such moderation were ended entirely?

Have you learned literally nothing from the utter disaster that was Tay[1]?

[1] https://en.wikipedia.org/wiki/Tay_(bot)


Because it's a stupid thing to do. It makes sense to complete a query once it is obvious what the user wants to write. That's helpful, you saved me from typing a few characters.

But if you suggest completions for queries at a point where it isn't yet obvious what the user wants, you are no longer just helpful. You are steering people in a certain direction, whether that's your intent or not. You are effectively telling people, hey, try this query, it's popular, even if the user had no intent of typing that query.
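The "only complete when it's obvious" position above can be sketched as a confidence gate: show a completion only when one candidate dominates the observed continuations of the prefix. The query log and the 0.8 threshold below are invented for illustration; no search engine is claimed to work this way.

```python
# Hypothetical sketch: suggest a completion only when the user's intent
# is "obvious", i.e. a single candidate accounts for most continuations
# of the typed prefix. Counts and threshold are made up.
from collections import Counter

def suggest(prefix: str, log: dict[str, int], threshold: float = 0.8):
    """Return a completion only if it dominates all continuations of prefix."""
    continuations = Counter(
        {q: n for q, n in log.items() if q.startswith(prefix) and q != prefix}
    )
    total = sum(continuations.values())
    if total == 0:
        return None
    query, count = continuations.most_common(1)[0]
    return query if count / total >= threshold else None

log = {
    "muslims are vegetarian": 90,
    "muslims are vegan": 5,
    "muslims are a large group": 40,
}
print(suggest("muslims are vege", log))  # dominant completion -> suggested
print(suggest("muslims are", log))       # ambiguous prefix -> no suggestion
```

Under this gate, a short generic prefix like "muslims are" yields nothing because no single continuation dominates, while a longer prefix that pins down the intent does get completed.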


Okay, I see your point now. Content discovery can be decomposed into "push" and "pull". In the 90s, searching was typically associated with pull. Over time, engines started providing a push component (e.g. autocomplete). It sounds like you want search engines to be mostly pull with less push.

You want the push to be more local to what you specifically choose to pull, which means greatly reducing the current scope of the push component. I guess that's fine, but a lot of people really like the push component, especially if they don't quite know what they're looking for. I think the whole of the two methods is more valuable than the sum of the parts.


except it does suggest if you type 'people are...'

some fucker somewhere again decided what you should or shouldn't see, there's no other explanation :)


> some fucker somewhere again decided what you should or shouldn't see

Yes, that's what a suggestion algorithm is.

There is no pure Platonic ideal neutral objective suggestion algorithm. Every suggestion algorithm is going to have biases based on both its training data and the goals (explicit or otherwise) of its creators.

You've gotten as far as, apparently, noticing that tweaking the algorithm after the public has seen it constitutes a choice on the part of the designers about what they do and don't want to show you. Now take the final logical step, and observe that they already made choices about what they do and don't want to show you in the initial design, before the algorithm ever went public.

If you have a problem with the former, I don't see how you can be logically consistent while not having a problem with the latter.


when I enter something, I expect to see what other people search for, not "what people search for, except...". This talk about suggestion algos has nothing to do with the issue, I think.


When I enter something, I want to see what's relevant to me, not what other people searched for. As such, it makes sense to not suggest anything when I type in 'Muslims are', as Google doesn't know what I want to see yet; it could decide that once I started typing in more strings on the next word. It'd make more sense to me if, instead of completing search queries, it just completed individual words in the queries.


> when I enter something, I expect to see what other people search for

That's never been the point, though it's sometimes been an implementation method because it's easy: the point is to provide a useful guess of what you might be searching for.

Some search engines do provide tools for exploring patterns of other people's searches, but recommendation systems have never been that except as a matter of leaky implementation details.


it was just an example, you can choose another if you think it's more appropriate, my point is that this idiotic censorship has no place in search

if it has something to show me regarding my search (and it certainly does), it must


Have you only started using the internet in the last 2-3 years? Google was not censoring suggestions until a lot of noise was made about racist results, etc., so this hasn't come out of the blue. And anyway, search suggestions are just that: suggestions. I don't think Google is censoring the results themselves unless it is forced to, e.g. by DMCA takedowns, the European right to be forgotten, or whatever restrictions different countries place on search results for their users.


The point of autocompletes is to make it easier for you to search by prompting you to choose a readymade option that might be relevant. They happen to use what other people search for, but that's an implementation detail. It's not that Bing is showing you the uncensored truth about what people search for, it's that it's prompting you to search for really bad things.


When one types a query, why should a search engine model search success based on what everyone else in the world searches up? Doesn't it matter more that Google finds what that person wants?

And if individuals want to understand Google Search Trends, as noted already, it's there but I don't think many people are going there, because it's not what they want.


> When one types a query, why should a search engine model search success based on what everyone else in the world searches up? Doesn't it matter more that Google finds what that person wants?

That's exactly what I want, at least. It'd make much more sense for them to just work on autocompleting individual words instead of queries. That way it's useful to the person who's searching and it doesn't matter what everyone else searched.


Does that include things like React, Rust, or Ruby?


Personally, I find that brutal honesty in its search suggestions refreshing, as opposed to manually excluding anything remotely controversial. Although the reality is probably that somebody is spoofing the search suggestions for laughs, above and beyond morbid curiosity and jokes of debatable taste akin to searching for nuclear warheads on eBay.


I think you underestimate how many pedophiles are out there.


The issue is that once these thoughts and articles are suggested, they become much more mainstream.

Try something less controversial, robbery. No doubt, people search for "how to break into houses", and you might even click on that link if it was a top suggestion. But if people saw that all the time, there would be an increase in robberies, and a decrease in the feeling that it matters.

And just because it generates traffic, doesn't mean it is what most people are really looking for!

It just means that it has appeal because of the way we are wired to respond to things that are outrageous. Once it is a suggestion (even at the end of the list) it will get clicks, and move up till it dominates.


> Once it is a suggestion (even at the end of the list) it will get clicks, and move up till it dominates.

So why isn't every article that has ever been suggested moving up until it dominates?


Because most articles don't trigger the same responses.

It is the outrageous and the radical that trigger the interest.

Have you not noticed that the links on the side of all the trash blogs are full of "The craziest snake!" or whatever?

Which clickbait works better - "Hitler loved his black maid" or "cheap diapers"? Does that really mean that more people are searching for Hitler's black maid than for cheap diapers?


So in your opinion why isn't "Hitler loved his black maid" the top result for every search query?


This article assumes that Bing is in the business of telling people what to think.

It's not; it's in the business of showing what people think.


No, a search engine literally tells you what to look at, based on what you told it. And for many people there's a short path between seeing and believing.

The autocomplete suggestions are a suggestion: would you like to search for this? And in the minds of many people that normalises it.

"Stochastic terrorism" is a real problem. You've got one guy mailing out bombs and another shooting up a synagogue based on lies they read on the internet, just this past week. This stuff has consequences.


This article is straight up asking for censorship. Nothing else.


Correct. They are literally asking that the censorship feature work as expected.


SafeSearch definition says: "filter out adult text, images, and videos". It's not about racism. So it's working as expected


Did you read further along? There's also suggestions for bestiality and borderline child pornography, both of which definitely fall under adult images and in one or both cases are very illegal.


You said borderline, because it is not past the line of illegality.

Thus your second sentence makes no sense.


There is a debate over whether nonsexualized images of minors (e.g., toddlers playing in baths, which is mentioned in the linked article) actually constitute child pornography. Jurisdictions vary in their interpretation, hence "borderline".

I didn't feel the need to define this so precisely in my original post because it distracts from my main point, which is that Bing is failing to filter not only hate material but also material that many would generally agree falls under adult content.

I will not respond to any replies to this post because I will assume that further argument is in bad faith aimed at obscuring my main point.


Borderline but legal child pornography would still be classified as "adult", no? To me, the second sentence makes perfect sense.


If you have safe search on, all the results should be non nude, non adult.

I reread the article, and see where I went wrong. It mixes up two issues: Bing returning odd suggestions with safe search on, and Bing returning results for searches for illegal material with safe search off.

I had thought they were only talking about safe search being on.

In my misreading, I made the assumption that Bing would never return illegal results for searches for children, as it would never return pornography with safe search on. Apologies to the OP.


Flip this around:

There are at the very least billions of possible search term refinements and pieces of content which are not currently actively suggested. Is your position that, because they are not being suggested at this moment, they are being censored?


The suggestions are being computed by an algorithm. If you take things away from those it's censorship.


The algorithm was designed to do certain things.

Does this mean it's censoring the things it wasn't designed to do?

(there's no way to claim that changing the algorithm would be censorship without acknowledging that the algorithm already censors things)


And who designed the algorithm?


curation != censorship.


Why does everyone need to have a safe space? Bing suggestions work perfectly fine without patronizing and treating you like a dumb child that doesn't know how to use internet.


A bunch of people who use Bing are children who don't know how to use the internet.


So the slippery slope here is: if you're 12 and see an algorithm autocomplete "muslims are [insert racist thing]," it's going to turn you into a racist by suggestion?

The reason autocomplete results end up this way is because the humans this kid lives around are racists. Those people will be the reason the kid grows up racist, not some 2nd tier search engine. Why lie to the kid and pretend like the community he/she lives in doesn't have a problem?

I've never understood this bizarre desire we have to hide reality from children to keep them ignorant (more commonly referred to as "innocent"). As if they can't observe the objective reality around them. The sooner they can understand the problems with the world, the sooner their generation can work toward fixing them.


I once did a search on Bing for a .NET topic and Bing didn't even return any Microsoft documentation results.


What? I tried these searches out when the article was just three hours old on HN. I could not reproduce any of these results. Safe Search was set "Moderate", and my results did not change significantly when I switched it to "Off".

The worst it got for me was when I asked for autocomplete suggestions for "muslims are":

    good
    hindu
    worst
    cancer
    killed
Disclosure: I performed these searches from India. I'm open to the possibility that there could be regional factors worsening the search results in the US. Even so, I can't imagine what Indian customizations would make my search suggestions for "Michelle Obama" so innocuous compared to what I see in the article.


Not so much autocomplete suggestions, but the autosuggested related searches under Image Search.

In my case I got shown "muslims are scum" and "muslims are terrorists", in the UK. "Michelle Obama" got suggested "... muscles"?


No, I had looked at those too. Both under Image Search (for Jews, Muslims, and "gril") and under Videos (for Michelle Obama). All the related searches were completely innocuous. The autocomplete suggestions were the least innocuous ones I got. Looking for "gril" got me images of girls, but all completely SFW or child-safe.


As a jew, I can't decide which one is more offensive: The dumb Antisemitic suggestions from Bing, or Jew Jitsu being the top autocompletion on Chrome. Hey Google! There are 14M of us, give some credit.


Reading your comment, my first reaction was that 14M is not a lot at all. I was surprised to find out it is true.

Having no knowledge of this before, I'd always assumed the number would be higher, based on influence I guess.

https://en.m.wikipedia.org/wiki/Jewish_population_by_country


It would have been a lot higher if six million hadn't been murdered.


That was my first thought too. But then, I live in Bangalore, and my sense of proportion is biased by this. The Jewish population is about the same as my city's.


Not the main point of the article, so off topic, but isn’t “underage children” a pleonasm? Or is it a term that respects that we are all someone’s child, so also when we reach adulthood?

The term seems to suggest there is an age from which it is acceptable for children to receive sexual attention from adults. I prefer to think that there is no such age. Most people would probably agree with this.

Pleonasms are very common; “your personal belongings” is a funny one. In the case of ‘underage children’ though, it undermines the message.


> Most people would probably agree with this.

I think you might be overestimating how common your opinion is. In many (maybe most) countries, children are considered capable of consent even before they are granted all privileges of adulthood. Sometimes it's only if the partner is underage as well, but then one will be an adult sooner than the other, so there are exceptions for cases where the age difference is small.

Because of the various exceptions in traditional consent law in contrast to bright-line rules against child pornography (below 18 is illegal, no exceptions), it's possible for legal sex to become illegal as soon as there's a camera between the participants.

The best-known cases are probably of teens being prosecuted for child pornography when they take nude selfies.


What you say is true for interpersonal relationships. The article is about imagery with an erotic element. So in the context of erotic photography I think that most countries would consider participation of children illegal and I think most people would agree with that.

So I'd think that in the context of the article it could be a pleonasm, but an understandable one then.


Well, you can have shared belongings, as with things you own with your family members (usually spouse). So my personal belongings is a valid distinction, not a pleonasm.


It depends on the context. In case of selling a house while you have shared belongings with the neighbours, then yes. When the train conductor says: "Don't forget your personal belongings when leaving the train", then it is a pleonasm as the distinction is probably not intentional; I doubt the conductor wants you to leave your shared belongings in the train.


It doesn't suggest those things for me - did they fix it or is this a case of personalisation?


As the article states, Microsoft took note of it and started correcting some of the suggestions.


The article is 2.5 weeks old. Maybe the people at Bing cleaned it up a bit.


Sure, but google image search gives spam results from Pinterest, so Bing still wins.



