When I think of bots, my mind always goes back to The Matrix, when the Sentinels rip into the Nebuchadnezzar on a search-and-destroy mission. On the World Wide Web we are surrounded by bots, and they currently outnumber human users. Not all bots are the same, however. Some bots were created to take your place on eBay and win the auction by placing a bid at the very last second. Other bots were programmed to watch restaurants on OpenTable and snag reservations the moment a table becomes available. Even Google uses bots to crawl websites, which helps determine how you rank in Google searches.
While these are just a few examples of bots that mean no harm, there are many out there that can cause you a lot of frustration. These bots are programmed to flood your website with comments (on forum- and blog-style sites), tie up your inbox, and in some cases knock your website offline. Bots can account for over half of all traffic on some websites.
There are some precautions we can take to limit the damage a malicious bot can do. One great example is Google's reCAPTCHA. This verification step helps determine whether information is being submitted by a human or by a bot. There are many variations of this type of verification, some better than others, but as the technology gets smarter, bots will follow suit.
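To give a feel for how that verification works behind the scenes, here is a minimal sketch of the server side of a reCAPTCHA v2 check in Python. When a visitor solves the challenge, the widget puts a token in the form; your server then sends that token to Google's siteverify endpoint and trusts the submission only if Google says the token is valid. The function names here (`verify_recaptcha`, `token_is_valid`) are made up for illustration; the endpoint URL and the `secret`/`response` parameters come from Google's reCAPTCHA documentation.

```python
import json
import urllib.parse
import urllib.request

# Google's official verification endpoint for reCAPTCHA.
VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"


def token_is_valid(body: str) -> bool:
    """Parse Google's JSON reply and report whether the token passed."""
    return bool(json.loads(body).get("success"))


def verify_recaptcha(secret: str, token: str) -> bool:
    """POST the token produced by the browser widget back to Google.

    `secret` is your site's private reCAPTCHA key; `token` is the value
    the widget placed in the form field (g-recaptcha-response).
    """
    data = urllib.parse.urlencode(
        {"secret": secret, "response": token}
    ).encode()
    with urllib.request.urlopen(VERIFY_URL, data=data) as resp:
        return token_is_valid(resp.read().decode())
```

In a real form handler you would call `verify_recaptcha(...)` before processing the submission and reject the request when it returns False, which is what stops a bot that simply replays the form fields without ever solving the challenge.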