I stand by what I said. It's about a balance of effort-- no automated system stops everything, so complete spam protection involves some human assistance. The question is how to best balance two kinds of effort:
1. The effort involved in manually deleting spam.
2. The effort involved in designing an automated system to prevent spam.
The best answer is the one that involves the least effort as a sum of (1) and (2).
That's great, and I believe you. I had a no-protection form on my website a few years ago and started getting 10 spam emails per day. I added an incredibly simple (but unique) anti-spam measure [you can see it on my website if you want], and I have had 0 spam since as well.
He implemented my system, and since then he has had 0 bots register.
It's not very hard to prevent spam like that-- all you need is anything that differs from what bots expect of a normal website. In fact, this even includes CAPTCHAs. In some cases a unique measure is better protection than a theoretically strong but common CAPTCHA, because spammers will invest in defeating the common CAPTCHA (it guards a lot of websites), while they won't think about your site or design a bot specifically to attack it.
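One common example of a cheap measure that "differs from expectation" is a honeypot field: a form input hidden from humans that generic form-filling bots complete anyway. This is a hypothetical sketch for illustration-- the post never reveals which unique measure its author actually used:

```python
# Honeypot anti-spam check: a form field hidden from humans (e.g. via
# CSS "display:none") that naive, generic bots fill in anyway.
# Hypothetical sketch -- not the measure described in the original post.

def is_probable_bot(form_data: dict) -> bool:
    """Reject the submission if the invisible 'website' field was filled in."""
    # The matching HTML would be: <input name="website" style="display:none">
    return bool(form_data.get("website", "").strip())

# A human never sees the hidden field, so it stays empty:
print(is_probable_bot({"name": "Alice", "message": "Hi!", "website": ""}))  # False
# A naive bot fills every field it finds:
print(is_probable_bot({"name": "x", "message": "spam", "website": "http://spam.example"}))  # True
```

Note this only stops generic bots (type 1 below); a bot written for your specific site would simply skip the hidden field.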
What is important here is the distinction between:
1. Generic, automatic bots that don't specifically target your site.
2. Bots that are specialized for your website.
The methods you've described are not very good for (2), but they will work very well for (1).
Personally, I think this is the most important point. There are three kinds of attacks:
1. Generic, automatic bots.
2. Targeted bots.
3. Humans (or bots with human assistance).
You can prevent (1) as I said. You can try to prevent (2) by using something like a CAPTCHA (see below). And you can't do anything about (3).
Disagree if you want, but I'm not wrong. It's true that some CAPTCHAs are defeated by bots. It may even be true that the age of CAPTCHAs is over (I'd be happy as a website visitor) because bots have become smarter than them. BUT... that doesn't mean that any other method is technically or theoretically better at preventing bots. If you held a bot-designing competition, CAPTCHAs are basically the only system out there that bots can't pass but humans (reliably) can.
So when you say a CAPTCHA system is the only "real effective way to stop all automated attacks", I would have to completely disagree.
There are lots of methods that prevent type (1) above, but there are very few methods that can effectively block type (2). If someone specifically wants to attack your website, you will not be able to stop them with anything aside from some kind of human-targeted puzzle like a CAPTCHA.
The real trick is finding a better CAPTCHA-- one interesting type (though possibly hard for humans) is to say "click on the cat" and display 4 images. Computers can't do that very well, at least not yet. (However, there are problems: out of a multiple-choice set of 4 images, pure random guessing is right 25% of the time, and a bot with some image heuristics might manage 50%. That's not great protection.) The other issue is that methods like this rely on language, so your visitors must speak English. A solution to that problem would be very useful.
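The guessing problem can be reduced by chaining independent rounds: if a bot passes one 4-image challenge with probability p, it passes k rounds with probability p**k. A quick sketch of the arithmetic:

```python
# Probability that a guessing bot passes k independent "click the cat"
# rounds. With 4 candidate images, pure random guessing gives p = 1/4
# per round; a bot with some image heuristics might reach p = 1/2.

def pass_probability(p_per_round: float, rounds: int) -> float:
    """Chance of passing every one of `rounds` independent challenges."""
    return p_per_round ** rounds

# Three rounds of pure guessing: (1/4)^3 = 1/64, about 1.6%.
print(pass_probability(0.25, 3))  # 0.015625
# Even a smarter bot (p = 0.5) passes 3 rounds only 12.5% of the time.
print(pass_probability(0.5, 3))   # 0.125
```

The cost is borne by humans too: each extra round adds friction for legitimate visitors, so there's a practical limit on k.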
Correct. But they actually block bots, rather than just confusing them. A bot can easily be programmed to circumvent most other measures, but there is no way around a CAPTCHA: the bot must actually solve it. Therefore, even if it's not 100% effective, it's still better than everything else, which is about 0% effective against a targeted attack.
CAPTCHAs are effective to a degree, as is everything else you can put out, but nothing is ever 100% effective.
ReCAPTCHA is pretty good. The reason it doesn't work that well is because it is so popular. That makes bot-makers interested in breaking it. The lesson: don't rely on the most popular technology. (Imagine if you could use a special kind of key to lock your website. Now imagine if there was a single best kind of key. Now imagine if everyone had that exact key. Now imagine how hard it would be to break into all websites-- very easy, once you figure out the key!)
Even systems like ReCAPTCHA that can be integrated are not 100% effective.
In this case, the situation is exactly the opposite of "safety in numbers". In fact, the internet may be the first place in history where the opposite of that rule applies. (And perhaps with diseases.)
Obviously not. But your methods won't stop them either. As you said, nothing will. That's not an argument against CAPTCHAs. That's an argument against trying to stop human spam (type 3 above). The only effective methods there are to: 1) filter suspicious messages (e.g., by keywords), and 2) deal with the rest manually.
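Step (1) can be as simple as flagging messages containing suspicious terms for manual review rather than silently deleting them. A minimal sketch (the keyword list is made up for illustration):

```python
# Minimal keyword filter for step (1): flag suspicious messages so a
# human can review them (step 2). The keyword list here is purely
# illustrative, not taken from the original post.

SUSPICIOUS_KEYWORDS = {"viagra", "casino", "cheap pills", "free money"}

def flag_for_review(message: str) -> bool:
    """Return True if the message should be queued for manual review."""
    text = message.lower()
    return any(keyword in text for keyword in SUSPICIOUS_KEYWORDS)

print(flag_for_review("Buy CHEAP PILLS online!!!"))          # True
print(flag_for_review("Thanks for the article, very helpful."))  # False
```

Flagging rather than auto-deleting matters here: keyword lists produce false positives, and the whole point of type (3) spam is that a human has to make the final call anyway.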
When it comes to spammers who have human assistance or are human, nothing out there will stop them, not even CAPTCHAs.
Of course. That's why I discussed a balance of effort above. Nothing is 100% effective, so it is a waste of time/effort to look for a perfect solution. I'm not trying to describe a perfect solution. I'm describing a good (perhaps the best) solution. It involves minimizing effort. Here, I make some effort to stop generic bots (by making my website unique), and I may also try to stop specialized bots by adding a CAPTCHA or something like that. For especially well-designed specialized bots, and of course for humans, I don't make any attempt to stop them. That's a waste of time. The best method is applied, and that's done. Whatever extra spam comes through can be dealt with by me (or whoever is helping me on the website), manually. That's how these things work.
All in all, every little precaution that can be taken should be taken if you really want to protect your site, instead of running on the assumption that your site is protected. I always tell people to assume their site is vulnerable, never let their guard down, and never believe that anything is 100% foolproof, because anyone with even basic knowledge knows that is BS.
Finally, you have to be aware of scale. As a moderator here, I'm aware of just how strong the force of attacking spammers can be. This is a relatively popular website (within the top 3000 in the world), so it's not surprising that spammers make a lot of effort to attack it. We have automated methods in place, but not all spam can be stopped. It just can't be. Therefore, the moderators manually delete spam when it comes (and we do a good job-- it's very often deleted within 5 minutes of being posted, so there's not much incentive to keep posting it!). On a smaller site (my personal website, for example), it's just not a problem like that. So if you really want to prove that your methods work as well as you claim they do, you'd need to try them on a giant site like Facebook. The point is... your methods wouldn't work well at all, because a specialized bot could destroy them with little effort. A CAPTCHA would work very well until someone put in a lot of time, and then a specialized bot would eventually be able to break it. In terms of strength, a CAPTCHA is better. In terms of practical purposes, it depends on the situation. The biggest situational variables are how much spam you would get without any protection, and whether anyone cares enough to design a spam bot for your website.