Hacker News

Literally none of the alternative methods listed worked on my moderate-traffic wiki. Recaptcha (and, before Microsoft retired it, their identify-the-cats-and-dogs Captcha) is the only solution that stopped us from getting spammed. I wonder how much experience the author of this article really has in this domain.

Recaptcha has saved the internet as far as I'm concerned.



I'm almost sure the traditional (ha!) wiki model, where everyone is free to edit, was never meant to function on the modern internet anyway. A verified and/or interested-user model (the OP mentions this possibility under the heading "Community-specific questions" — too bad the second example image is too ambiguous even for anime fans) does seem to work; I've seen it be much more effective on, for example, the Esolang wiki [1].

[1] https://esolangs.org/wiki/Special:CreateAccount "Which number does this Befunge code output: [...]"
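The Esolang-style question gate can be sketched as a simple server-side check. This is a minimal illustration, not the wiki's actual implementation: the question pool, the Befunge snippet, the salt, and the answer-hashing scheme are all my own assumptions.

```python
import hmac
import hashlib
import secrets

SALT = b"wiki-captcha-salt"  # assumption: a per-site secret

def hash_answer(answer: str) -> str:
    # Normalize so " 10 " and "10" both pass, then store only a keyed hash
    # so plaintext answers never sit in the config.
    normalized = answer.strip().lower()
    return hmac.new(SALT, normalized.encode(), hashlib.sha256).hexdigest()

# Hypothetical community-specific question pool; "52*.@" is a made-up
# Befunge example (push 5, push 2, multiply, print), not the wiki's real one.
QUESTIONS = {
    "Which number does this Befunge code output: 52*.@": hash_answer("10"),
}

def pick_question() -> str:
    """Choose a random question to show on the signup form."""
    return secrets.choice(list(QUESTIONS))

def check_answer(question: str, answer: str) -> bool:
    """Constant-time comparison of the submitted answer's hash."""
    expected = QUESTIONS.get(question)
    return expected is not None and hmac.compare_digest(
        expected, hash_answer(answer)
    )
```

The point of a pool like this is that answers require niche community knowledge, so generic spam bots that solve image CAPTCHAs at scale have nothing to pattern-match against.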


For my small wiki, refusing all submissions with external links eliminated virtually all spam. Yes, it's drastic, but in my case external links were not essential. I still use ReCaptcha to cut down on spam account signups.
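That filter can be a few lines: refuse any submission whose body contains an external link. The URL pattern below is a deliberate simplification (a real filter would also catch bare domains, protocol-relative links, and so on).

```python
import re

# Matches http(s):// URLs anywhere in the submitted text.
LINK_RE = re.compile(r"https?://\S+", re.IGNORECASE)

def reject_edit(edit_body: str) -> bool:
    """Return True if the edit should be refused (contains an external link)."""
    return LINK_RE.search(edit_body) is not None
```

The trade-off is exactly the one described above: it only works when external links aren't essential to the wiki's content.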


We do that as well, apart from a small whitelist of allowed external links. It turns out that's not enough, and we still need recaptcha for a few reasons.

One is that we get a small number of anonymous edits every day, which must pass recaptcha to be accepted.
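The whitelist variant looks much the same, except that links whose host is on an allow-list pass through. The domains below are placeholders, and the URL regex is the same simplified pattern a real filter would tighten up.

```python
import re
from urllib.parse import urlparse

ALLOWED_HOSTS = {"example.org", "wikipedia.org"}  # placeholder allow-list

# Simplified: matches http(s):// URLs anywhere in the submitted text.
LINK_RE = re.compile(r"https?://\S+", re.IGNORECASE)

def host_allowed(url: str) -> bool:
    """True if the URL's host is a whitelisted domain or a subdomain of one."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == h or host.endswith("." + h) for h in ALLOWED_HOSTS)

def reject_edit(edit_body: str) -> bool:
    """Refuse the edit if it contains any non-whitelisted external link."""
    return any(not host_allowed(u) for u in LINK_RE.findall(edit_body))
```

Matching subdomains (`en.wikipedia.org` under `wikipedia.org`) by suffix on the parsed hostname, rather than substring-matching the raw URL, avoids the classic bypass of `wikipedia.org.spam.example`.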



