Let’s be honest: No one really likes CAPTCHAs. The process of typing in (often unreadable) letters is tedious at best, and insulting at worst (after all, who enjoys having to prove that they are “not a robot”?). So why do websites need CAPTCHAs? Mostly because spammers are notorious for using automated input systems, aka robots. By asking users to decode letters, a website can easily verify that those users are human.
Fortunately, there are some easier versions of CAPTCHA, and many website owners are waking up to these new methods so that visitors won’t be turned off. Here are a few of the most popular CAPTCHAs:
Voice CAPTCHA Codes: These give the user the option to hear the CAPTCHA verification code, in addition to seeing it on the screen. It’s there for the visually impaired, but it can be helpful for anyone who is frustrated by the CAPTCHA refreshing because of an “invalid entry”.
Math Questions: For some people, these are actually fun. Instead of asking the user to decipher distorted text, the web page asks the user to solve a simple math question, e.g., 2 + 2. If the answer is correct, the destination page is loaded. If not, the page can refresh and present the user with another math problem. If you opt to use this on your website, make sure the math problems are basic. A simple addition question is all that’s needed to eliminate the possibility of robots.
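The server-side logic behind a math CAPTCHA is only a few lines. Here's a minimal sketch in Python (the function names are illustrative, not from any particular library): generate a question and its expected answer, keep the answer in the user's session, and compare it against the submitted value.

```python
import random

def make_math_captcha():
    """Generate a simple addition question and its expected answer.

    The question string is shown to the user; the answer would be
    stored server-side (e.g., in the session), never in the page.
    """
    a, b = random.randint(1, 9), random.randint(1, 9)
    return f"What is {a} + {b}?", a + b

def check_answer(expected, submitted):
    """Return True only if the submission parses to the expected sum."""
    try:
        return int(submitted.strip()) == expected
    except (ValueError, AttributeError):
        return False
```

The key design point is that the expected answer never appears in the HTML sent to the browser; a bot would have to actually parse and solve the question text.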
Simple Trivia: Speaking of “simple” and “questions”, answering a simple trivia question is another alternative to CAPTCHA that doesn’t seem terribly inconvenient. Some examples are questions about the alphabet, the President of the United States, or the old standby, “What color is the sky?” These are all quick ways for people to verify their human-ness, and they are fairly pleasant to complete.
Friend Recognition: This is really only applicable to social networking sites, but it is certainly groundbreaking and therefore worthy of our reflection. Last year, Facebook began experimenting with “social authentication” (aka identifying friends in photos) as a way of verifying account authenticity. Here’s how it worked, in Facebook’s own words: “We will show you a few pictures of your friends and ask you to name the person in those photos. Hackers halfway across the world might know your password, but they don’t know who your friends are.”
There are also some invisible CAPTCHAs out there that are gaining traction. Invisible CAPTCHAs run in the background, so your user doesn’t actually need to do anything to prove they’re human. The robots fall into a “hidden trap”, so to speak, while humans push right through without any hassle. This is by far the most user-friendly option for your website.
Automated and Manual Spam Detection: These detection services, like Akismet, Mollom and SBlam!, analyze user-submitted data and flag spam automatically, rather than using a CAPTCHA to verify that a user is human. Occasionally, Mollom presents a CAPTCHA, but only when the system wants extra verification.
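As a rough sketch of how such a service is used, here is what a request to Akismet's public comment-check endpoint looks like (endpoint and parameter names per Akismet's documentation; `YOUR_KEY` and the helper functions are placeholders, and the actual HTTP call is left out):

```python
from urllib.parse import urlencode

def build_akismet_request(api_key, blog_url, user_ip, user_agent, content):
    """Assemble the URL and form body for an Akismet comment-check call.

    The caller would POST `payload` to `url` with an HTTP client;
    no network request is made here.
    """
    url = f"https://{api_key}.rest.akismet.com/1.1/comment-check"
    payload = urlencode({
        "blog": blog_url,              # site being protected
        "user_ip": user_ip,            # IP of the commenter
        "user_agent": user_agent,      # commenter's browser UA string
        "comment_content": content,    # the submitted text itself
    })
    return url, payload

def is_spam_response(body):
    """Akismet replies with the literal string 'true' (spam) or 'false'."""
    return body.strip() == "true"
```

Notice that the user does nothing at all here; the classification happens entirely on the server, which is what makes these services feel invisible.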
The Honeypot Method: Sounds strange? It’s not. With the Honeypot Method, website forms include an additional field that is hidden from users. Because spam robots process the raw HTML rather than rendering the page, they can’t tell that the field is invisible, so they fill it in along with everything else. If data shows up in this “honeypot”, the code can detect that the submission did not come from a legitimate user and block it.
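A honeypot needs only a CSS-hidden field and a one-line server-side check. Here's a minimal sketch in Python (the field name `website` and the markup are illustrative choices, not a standard):

```python
# The form would include a field that real users never see, e.g.:
#   <input type="text" name="website"
#          style="display:none" tabindex="-1" autocomplete="off">
# A human leaves it empty; a bot filling in every field does not.

def is_honeypot_spam(form_data):
    """Reject any submission whose hidden honeypot field is non-empty."""
    return bool(form_data.get("website", "").strip())
```

Using an innocuous-sounding name like `website` (rather than `honeypot`) helps, since some bots skip fields with suspicious names.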
Centralizing the User Base: This is relevant to social sites and blogs, where publishing to third-party websites is a common practice. Often, doing so means either registering a full-fledged account or submitting anonymously (although the latter is becoming less common). Both of these leave the door open to spam. But by centralizing the user base (e.g., via Facebook Connect, which lets users sign in to third-party sites with their Facebook credentials), website owners can relax their own registration requirements before users post comments.