
"Spammy links ... are both unnatural linking behaviors that result when sites are more concerned with pleasing search engine algorithms"

ForHackernews, we say in our quality guidelines to "Make pages primarily for users, not for search engines." If the SEO whose email was quoted in the article had followed that advice, they wouldn't be in the position of trying to clean up spammy links after getting caught spamming.



>ForHackernews, we say in our quality guidelines to "Make pages primarily for users, not for search engines."

The problem with that is that it's not true (in my admittedly very limited experience). I used to believe it, until I created a small website, found it wasn't getting listed in Google, and had to guess at why and carefully read the rules.

Making pages primarily for users means you unthinkingly do things like putting duplicate content on multiple pages/subdomains/sites, or linking to a site that reviewed your product after you gave them a free sample (which counts as a paid link).

I'm sure any even vaguely experienced webdev knows not to do these kinds of things (use canonical links, nofollow and all that jazz). But somebody just throwing a few pages together, with no experience in this stuff, easily falls foul of the rules. They may seem obvious to people who understand how search engines work; if you've never heard of PageRank and have no idea how people try to game and spam search engines, the rules are not obvious at all, and they are certainly not the same as the natural behaviour of somebody building a legitimate site with users in mind.
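
For anyone else who'd never heard of these: the paid-link fix is a one-attribute hint on the anchor tag. Something like this (made-up URLs, Flask just for concreteness, not a claim about exactly what Google wants):

    # Rough sketch only: marking a "paid" link (a review obtained by sending
    # a free sample) so crawlers don't treat it as an organic endorsement.
    # The route, product name, and review URL are all made up for illustration.
    from flask import Flask

    app = Flask(__name__)

    @app.route("/widget")
    def widget_page():
        return """
        <h1>Widget</h1>
        <!-- rel="nofollow sponsored" tells search engines this link was
             compensated and should not pass ranking credit -->
        <p>Read an <a href="https://blog.example/widget-review"
                      rel="nofollow sponsored">independent review</a>.</p>
        """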


How is duplicate content made "for users"? Why would I want to find exactly the same prose in multiple places?


* Paginated content with different filters or sorting? Almost the same content on multiple pages

* A product with a category name in the URL, or without one, or under multiple categories

Both are very useful for humans, but have to be massaged with canonical URLs to please Google's machines (rough sketch below).
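
Roughly what that massaging looks like (a sketch with made-up paths, Flask just for concreteness): the same product page is reachable under several category URLs, and every copy declares a single canonical URL for crawlers.

    # Sketch: one product served under several category paths, all pointing
    # crawlers at a single canonical URL. Paths and domain are made up.
    from flask import Flask

    app = Flask(__name__)

    PRODUCT_PAGE = """
    <head>
      <link rel="canonical" href="https://shop.example/widget">
    </head>
    <body><h1>Widget</h1></body>
    """

    # Humans can arrive via any of these; crawlers are told which URL counts.
    @app.route("/widget")
    @app.route("/tools/widget")
    @app.route("/gifts/widget")
    def widget():
        return PRODUCT_PAGE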


I'm reasonably sure Google is familiar with pagination and filtering, seeing as that's pretty much the entire thing their own website does.

I don't see how the same product under a variety of URLs, one for each category it's in, is at all useful to me as a human. It actually rather annoys me when the same page has a whole bunch of URLs: I can't tell which is the "real" one, my address bar frecency gets messed up, visited links don't work correctly, etc.



Maybe you run a store where you sell Widget and Deluxe Widget. The only difference is Deluxe Widget has a higher max RPM, but the operating instructions for Widget and Deluxe Widget are identical.

As a consumer, if I came to the website looking for instructions on how to use my [Deluxe] Widget, I would reasonably expect to find the same operating guide on both product pages.


Maybe you should collapse these admittedly identical items into variations of the same product, which will reduce the useless duplicated clutter I'd experience when searching your catalogue.


So are you saying that "clean up" and suspicious-link notices only happen when a site hits a threshold? A state machine of sorts?

Frankly, I think "spammy link penalties" is an approach that cannot work long term; any negative weight put on links is too easy for competitors to game.



