Everything You Need To Know About Bots
You might not see bots, but they are there.
You may not have heard of bots, but you’ll certainly have encountered them. Abbreviated from the word ‘robot’, a bot is a simple software application designed to perform repetitive automated tasks online. Bots have become a central plank of today’s internet, involved in everything from collating search engine results to flooding message boards with spam.
Not All Bots Are Bad
Many bots have positive uses, with chatbots becoming increasingly pivotal in corporate communications. Search engines also rely heavily on bots – commonly known as crawlers or spiders – to scan ever-changing website data en masse.
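To make the idea concrete, the sketch below shows a web crawler reduced to its bare essentials: fetch a page, pull out its links, and move on. It is only an illustration – real search-engine crawlers are vastly more sophisticated – and the example.com URL is simply a placeholder.

    # Minimal sketch of a web-crawling bot, for illustration only.
    # Real search-engine crawlers are far more sophisticated; the URL is a placeholder.
    import urllib.request
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        """Collects the href attribute of every anchor tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(url):
        """Fetch one page and return the links it contains."""
        with urllib.request.urlopen(url, timeout=10) as response:
            html = response.read().decode("utf-8", errors="replace")
        collector = LinkCollector()
        collector.feed(html)
        return collector.links

    if __name__ == "__main__":
        for link in crawl("https://example.com/"):
            print(link)

A real crawler would repeat this step for every link it finds, throttle its requests and respect each site’s wishes – a point we return to below.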
However, the potential for bots to play malicious and disruptive roles has long been exploited by the internet’s less wholesome communities. At their mildest, malicious bots spam-bomb user forums or sabotage online auctions by placing bids at the very last second. At their worst, they can bring down websites and compromise the security of personal or corporate data. Unless it’s prefixed with the word ‘chat’, the term ‘bot’ has thus become a largely pejorative one.
The Dark Side
A ‘bad bot’ is typically a self-propagating piece of malware, designed to interact with other network services or users. Having found a way to infect an individual computer or device (often through Trojans or compromised websites), it begins reporting back to a central command-and-control server that manages it alongside many other enslaved devices. These machines are at the disposal of the botnet’s creator, who may pool their combined resources to perform further malicious actions. Alternatively, the bot may log keystrokes, capture passwords or harvest sensitive data such as financial records from each infected machine. Identity fraud and data theft can subsequently take place with relative ease – in many cases without the victim being aware of anything untoward going on.
Don’t Blame the Bot
It goes without saying that bots wouldn’t exist without human creators; like worms, bots aren’t self-aware enough to realize they are being used inappropriately. A single piece of well-designed code can spread across the web like wildfire, particularly since many bots are specifically designed to co-opt infected computers into a larger network. It can be difficult to identify a compromised machine (beyond noticing slow performance), yet that computer could have surrendered its resources to a cybercriminal half a world away. From there, it could be used for a huge variety of purposes, from mass spamming to distributed denial-of-service (DDoS) attacks.
Simple Creatures
Because of their inherent simplicity, bots can often be defeated with equally simple measures. An unprotected single-page contact form on a website can be bombarded with bot spam, whereas a CAPTCHA challenge is too complicated for a basic bot to solve. The same is true of two-factor authentication, or anything else that requires multiple stages of data entry. It is also possible to place a robots.txt file on a server to stipulate how bots are expected to behave – search engine results will sometimes display a message explaining that this file prevents them from showing details of a particular website’s content. However, compliance is voluntary, and malicious bots can simply ignore the robots.txt instructions.
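As a sketch of how a well-behaved bot consults robots.txt, the snippet below uses Python’s standard urllib.robotparser module. The site URL, page path and ‘ExampleBot’ user-agent name are placeholders for illustration, and, as noted above, nothing forces a malicious bot to perform this check at all.

    # Sketch of how a well-behaved bot checks robots.txt before fetching a page.
    # The site URL and user-agent name are placeholders; malicious bots simply skip this step.
    from urllib.robotparser import RobotFileParser

    robots = RobotFileParser("https://example.com/robots.txt")
    robots.read()  # download and parse the robots.txt file

    user_agent = "ExampleBot"  # hypothetical crawler name
    target = "https://example.com/private/reports.html"

    if robots.can_fetch(user_agent, target):
        print("Allowed: the bot may request", target)
    else:
        print("Disallowed: a polite bot should skip", target)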
Preventative Maintenance
Keeping bots at bay often comes down to good housekeeping, with operating system updates and regularly updated antivirus software playing key roles. Firewalls provide an additional barrier that bots struggle to cross, and avoiding disreputable websites also reduces the risk of infection. Consistently sluggish computer performance (particularly regarding internet access) should be investigated as soon as possible, since this is typically the most overt indicator that a machine has been co-opted into a botnet.
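One simple starting point for that kind of investigation is to review which remote hosts a machine is actually talking to. The sketch below lists established network connections; it assumes the third-party psutil package is installed (pip install psutil), may require elevated privileges on some systems, and is a supplement to – not a substitute for – proper antivirus and firewall tooling.

    # Sketch: list established network connections so unexpected remote hosts can be spotted.
    # Assumes the third-party psutil package is installed (pip install psutil).
    # May require administrator/root privileges on some systems to see every process.
    import psutil

    for conn in psutil.net_connections(kind="inet"):
        if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
            print(f"PID {conn.pid}: {conn.laddr.ip}:{conn.laddr.port} -> "
                  f"{conn.raddr.ip}:{conn.raddr.port}")

Any connection to an unfamiliar remote address is worth a closer look, though plenty of legitimate software phones home too, so treat the output as a prompt for investigation rather than proof of infection.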