Affirmative! - I'm not a robot - Am I?
So - you walk into a shop and, before you can ask the shopkeeper a question, he stops you mid-sentence and demands your name, email address and company name, as well as proof that you're human. Not exactly ideal customer relations, is it?
So how come so many commercial websites do just this before you can even ask a simple question? If you hadn't already guessed, I'm talking about CAPTCHA forms - those frustrating online tests that force you to decipher a barely legible group of letters and type them into a form field to prove that you're not a spambot.
Here's a classic example of a CAPTCHA from Google's signup page - not exactly enticing, is it?
So what are CAPTCHAs and why do companies choose to use them on one of the most critical parts of their website?
- CAPTCHA is a contrived acronym (chosen to echo the word "capture") standing for "Completely Automated Public Turing test to tell Computers and Humans Apart".
- They are primarily used as an automated way for a website to ascertain whether a user is human or not.
So - why would a website need to differentiate between a human and an automated robot? In most cases websites are trying to prevent automated software programmes (known as robots or crawlers) from signing up for illegitimate user accounts (like a new Gmail account). They're also often used to stop spambots from sending spam through website contact forms.
Spambots are automated software applications designed to scour the web and, in this case, find a website's contact forms and fill them out automatically. When this starts to happen, website owners can become inundated with more spam flowing into their inbox than genuine visitor enquiries.
The natural next step for website owners was to ask their developers for a way to stop the spam - and so the CAPTCHA was born. The reasoning is simple: if you want to stop robots from spamming you, use some software on the website to establish whether the user is a human or a robot - ignore the robot but let the human proceed.
So far so good. Or is it? As you may have guessed, or indeed experienced yourself, these CAPTCHA forms are rarely enjoyable to use - most users find them frustrating, or at the very least a barrier to entry. If your business model revolves around getting users to interact with your company, the last thing you want is for a visitor to have to jump through hoops just to get in touch.
The key problem with implementing a CAPTCHA on your website is that you are effectively shifting the burden of proof onto the user (at the cost of a poor user experience), rather than asking whether you can work out if a visitor is real without them even knowing.
That's not to say they don't have their place on the web - there are cases where sheer volume and economics mean that a simple Turing test is the most effective way to process millions of requests without a large operational team sifting real requests from fake ones. However, CAPTCHAs can be, and often are, deployed on websites where they could easily be avoided - especially when used as the first line of defence against day-to-day spam.
So, if we're receiving a lot of spam through our website, how can we avoid using CAPTCHAs?
- The first line of defence is the most obvious one: if you're getting a load of spam sent through to your mailbox from your website, just deal with it. I know it sounds harsh, but is it not better to receive a few hundred or even a few thousand spam messages in a dedicated inbox than to force your prospective customers through a frustrating user experience? By filtering the spam ourselves, we shift the burden of proof away from the user - it becomes an operational problem to solve rather than a barrier to entry for your visitor.
Clearly, if you're receiving thousands of spam messages on a regular basis, then incorporating a few spam filters to separate the real emails from the automated ones should be a relatively simple task for anyone in your IT department. If this really doesn't work for you and you need a second line of defence, we'd recommend adding what's known as a 'honeypot'. This is simply an additional form element added to your subscription or contact form. The only difference? It's hidden from the visitor's view - it's there in the code, but a human user can't see it. When a typical spambot comes along (programmed to automatically fill out every form field it finds on a webpage), it can't tell the difference between a visible form element and a hidden one - so it fills in the honeypot and gives itself away.
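To make the honeypot idea concrete, here's a minimal sketch of the server-side check, assuming the form submission arrives as a simple dictionary of field names to values. The field name "website" is hypothetical - any plausible-sounding name a bot would happily fill in works, as long as your CSS hides it from real visitors.

```python
# A minimal honeypot check. The corresponding HTML would include a field
# hidden via CSS, along the lines of:
#   <input type="text" name="website" style="display:none"
#          tabindex="-1" autocomplete="off">
# A human never sees it, so it should always arrive empty.

def is_probable_spambot(form_data: dict) -> bool:
    """Return True if the hidden honeypot field was filled in.

    Any non-empty value suggests an automated submission that
    filled out every input it found, visible or not.
    """
    return bool(form_data.get("website", "").strip())


# A human leaves the hidden field empty...
human = {"name": "Ada", "email": "ada@example.com", "website": ""}
# ...while a naive spambot fills everything it sees.
bot = {"name": "Buy now", "email": "x@spam.biz", "website": "http://spam.biz"}

print(is_probable_spambot(human))  # False
print(is_probable_spambot(bot))    # True
```

Silently discarding flagged submissions (rather than showing an error) is the usual choice here, so smarter bots get no feedback to learn from.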
Now, like most solutions, this isn't a silver bullet against spam - spam crawlers are getting smarter and are learning to tell a honeypot from a visible form element - but this technique should still help to reduce the volume of spam you receive.
The reality is that there is no silver bullet for telling a human from a spambot. For the foreseeable future there will be an ongoing battle between website owners and spammers - as new defences become popular, spammers will innovate and evolve to outwit them. Indeed, there are offshore outfits that employ very poor, but very real, people to complete these forms by hand - so fake users and robot crawlers are always going to be a reality in the digital age. But as you can see, there are things we can do to avoid forcing our website visitors through poor user experiences unless absolutely necessary.
Let's strive to keep the web an engaging, inviting environment to explore, learn and share. Don't force your visitors through a mini Mensa test just to prove they're human until you've explored a few other options first.