
If you actually read the article most of the statements of fact are about the omnibox autocomplete system, and then they use innuendo to imply some things about search engine ranking. But these are two completely separate systems, and it makes sense that a system that is literally telling you what to type is more sensitive than the search result ranking. It is not a flaw of Google that it won't suggest "is hillary clinton still controlled by the jews" when you type "is hillary clinton". If it was just a big trie of what everyone typed, it would be completely dominated by 4chan troll bots.
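The "big trie of what everyone typed" can be sketched in a few lines. A minimal, purely illustrative Python version (hypothetical names, nothing to do with Google's actual system) makes the spam problem obvious: with no ranking signal beyond raw counts, whoever submits a query most often wins the suggestion slots.

```python
# Illustrative sketch only: a naive autocomplete built as "a big trie of
# what everyone typed", with no moderation and no ranking beyond counts.
from collections import defaultdict


class TrieNode:
    def __init__(self):
        self.children = defaultdict(TrieNode)
        self.count = 0  # number of logged queries ending at this node


class NaiveAutocomplete:
    def __init__(self):
        self.root = TrieNode()

    def log_query(self, query):
        """Record one typed query, character by character."""
        node = self.root
        for ch in query:
            node = node.children[ch]
        node.count += 1

    def suggest(self, prefix, k=5):
        """Return up to k completions of prefix, most-typed first."""
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return []
            node = node.children[ch]
        results = []

        def walk(n, path):
            if n.count:
                results.append((n.count, prefix + path))
            for ch, child in n.children.items():
                walk(child, path + ch)

        walk(node, "")
        results.sort(key=lambda t: (-t[0], t[1]))
        return [q for _, q in results[:k]]
```

A bot that logs the same query a few hundred times would immediately dominate every matching prefix, which is exactly why a raw frequency trie is not a usable suggestion system.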


First, I don't agree with your assessment of the article at all. There are multiple concrete assertions regarding manipulations of the search rankings themselves. I also don't agree that it is somehow "more okay" to manipulate autocomplete results than the search results proper.

Second, I think a big chunk of the problem here is the lack of transparency. Google has traditionally been very secretive about its algorithms to avoid tipping off spammers. So if you ask them directly they will hem and haw, when in fact they ban spammers and also, as the article reports, moderate inflammatory content and manually boost the rankings of specific websites. The question is: what is the exact scope of these activities? Where is the red line that they will not cross? I think the public deserves to know.


What’s a site they’ve manually boosted?


> Google made algorithmic changes to its search results that favor big businesses over smaller ones, and in at least one case made changes on behalf of a major advertiser, eBay Inc., contrary to its public position that it never takes that type of action. The company also boosts some major websites, such as Amazon.com Inc. and Facebook Inc., according to people familiar with the matter.

Of course the exact nature of the changes and boosts remains unknown, but that just underlines the need for transparency.


"algorithmic changes" implies the boost is not manual.


An "algorithm" still implies human intent. Heck, even a blacklisting system is still a form of an "algorithm." Even if each changes to the algorithms Google have made in the past may be justified, the public can't make an informed decision about it if it's not transparent about what it actually does.


Fact: the overwhelming majority of users aren't going to type a search query that completely specifies the information they are looking for.

The user isn't going to supply the very detailed information necessary to objectively filter down (and rank) everything on the web to a set of relevant results, without the search engine making any judgment calls. That would be a ton of work, more than almost any user would want to do. (And many users wouldn't have the technical skills to do it.) It would be like going into a restaurant and handing the chef a recipe for everything in your meal, down to the level of detail saying the vanilla extract in your dessert should use this type of vanilla bean, infuse it in this type of alcohol, and for this long.

The corollary is that any useful search engine will have to guide you a bit. It will take the incomplete specification you gave about what you want, and it will fill in reasonable guesses about everything you didn't say about what you want (but that it needs to know to find it), and then it will give you an answer that's useful.

Obviously, it's ideal to make this guidance objective or unbiased, but how do you even ensure that you're doing that? The whole point here is that you are making guesses about what the user prefers. It's not useful to guess randomly. Objectivity is a good goal, and you should avoid any unnecessary subjectivity, but the idea that you're going to be totally objective seems like a fantasy.


Isn't the autocomplete feature a sort of search in its own right? I agree there is a distinction between the two tools, but deliberately steering autocomplete carries the same potential for abuse as deliberately curating search results.

Your jews example is a positive use case, but not all conspiracy theories are created equal. Before Epstein killed himself, Google's autocomplete steered you away from any negative searches involving Prince Andrew, despite the fact that it was public knowledge at that point that one of Epstein's victims had named the prince as an abuser.


Does DDG do the same then?

Its autocompletes:

is hillary clinton running for president

is hillary clinton running in 2020

is hillary clinton going to jail

is hillary clinton an alcoholic

is hillary clinton democrat or republican

is hillary clinton sick

is hillary clinton under investigation

FYI, Google's "is donald trump" includes "is donald trump the antichrist" so they're certainly not consistently selective.


DDG effectively uses Google's search results, because Bing will copy them for specific terms, so yes.


If you actually read the article, the very first bulleted list contradicts what you're saying. It openly talks about messing with search results, without any innuendo.


So that query is inflammatory. The question would be more about other political queries which might not be PC but also aren’t inflammatory.

Should they suppress queries about area51, Scientology, The_Donald, The_Müller, r/politics, Trump is a Russian stooge, etc?


You can still search for it, but it shouldn't be auto-completing conspiracy theories


If some conspiracy theory is popular, it should absolutely show up in search and autocompletion. Google isn't the arbiter of truth and shouldn't be reducing visibility of anything on the basis of its "conspiracy theory" status. There are many theories that were ultimately proven true.


It’s possible that censoring them drives more attention to them. I’m not sure it’s actually helping Hillary to hide that content.

“Check out what they won’t let you see or talk about!” is one of their main marketing draws.

And if Google was just a cold representation of human behavior (not bots though) then it might help people develop their own editorial voice.

I don’t know. It’s their call at the end of the day.


Some conspiracy theories are just that. Quack theories with no basis.

But others turn out to be true. How do you deal with that?


You "suicide" everyone with the credibility and potential motivation to expose you, and you use your board-level blackmail influence at numerous publicly traded compaines to establish narratives that associate your political opponents with your own crimes, I think.


The cynical part of me would find that funny. But this also poses the question of what the responsibilities of web search are. In some sense the highest-quality output (including autocomplete) is also the conventionally responsible one. I'm not sure if this is always the case.


>It is not a flaw of Google that it won't suggest "is hillary clinton still controlled by the jews" when you type "is hillary clinton". If it was just a big trie of what everyone typed, it would be completely dominated by 4chan troll bots.

Would it be a "flaw of Google" in your opinion if Google blacklisted such an autocomplete if that is actually what a lot of (real non-troll) people were actually typing in and interested in finding results for?


How much value is even added by the autocompletion of queries?


A better question would be how much value is removed by the autocomplete suggesting inadequate searches.


Showing suggestions manipulates what someone might search for, and therefore the results they get, especially as they build up search history.

A better option would be to remove search suggestions and only match proper nouns at most.


It matters a lot more on mobile, though; the difficulty of typing makes it much more likely that people will tap those suggested searches.


From

> hillary clinton emails

to

> is hillary clinton still controlled by the jews

Nice try.


Why would that be wrong? If most people are searching for the truth, wouldn't it be better to give them both sides of the issue and have them decide for themselves? Why is Google deciding that that specific Hillary query is taboo? And who gave them that right?


> if most people are searching for the truth wouldn't it be better to give them both sides of the issue and have them decide for themselves?

i think the wording of this is insufficiently nuanced. consider that many issues involve more than two clear dominant points of view, and that for many issues most people would not consider all points of view to be equally credible.

> Why is Google deciding why that specific Hillary query is taboo?

because, as another poster pointed out in another subthread, any search engine that is usable (at the level of time and technical skill that most people have) will necessarily have to make essentially editorial judgements. after almost a lifetime of being the sort of nerd that likes to make lists, categorize things, geek out over philosophical classifications, etc, and after a few years of working in the library world, i'm convinced that coming up with any system of abstraction or classification necessarily implies making editorial judgements and value judgements. i think objectivity is a great and important thing to strive for (in reporting and in information classification), but i think achieving it perfectly is definitely not possible, especially on divisive issues, and especially where lots of people disagree (or claim to disagree) on the basic facts.

> And who gave them that right?

i don't know about right, but effectively they have the ability because: 1) they built a really good search engine, 2) they built a really successful ad business on top of that to monetize it, 3) through ignorance and laziness we let them hoover up our data and use it to greatly improve their ad business, which let them provide us even more free services that everyone got hooked on, 4) everyone seems too apathetic to make the effort to move away and no one seems interested in competing with them as a search or email provider for most people. and here we are.



