More than half of web traffic is not human

Companies often don't realize they need Symantec SSL certificates and other website security measures until it is too late, but a new survey from Incapsula shows that 51 percent of website traffic comes from automated software programs, much of it potentially damaging. Companies need to protect themselves from hackers, spammers, scammers and others who intend not only to hurt a business but to steal from its customers.

ZDNet spoke with Marc Gaffan, co-founder of Incapsula, who said few people and companies realize how much traffic is non-human and potentially harmful. He added that companies should secure their websites as best they can to minimize the risk posed by these automated programs.

“Because we have thousands of websites as customers, we spot exploits way ahead of others and we can then block them for all our customers," Gaffan said. "That’s the benefit of scale. We also maintain a virtual patch service that prevents harmful exploits days and sometimes weeks before a patch is ready.”
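The "virtual patch" Gaffan describes can be pictured as filtering requests against known exploit signatures at the edge, before the vulnerable application itself is fixed. A minimal sketch of that idea follows; the signatures and function names here are hypothetical illustrations, not rules or code that Incapsula actually ships.

```python
import re

# Hypothetical signatures: a path-traversal attempt and a SQL-injection probe.
EXPLOIT_SIGNATURES = [
    re.compile(r"\.\./\.\./"),          # e.g. file=../../etc/passwd
    re.compile(r"(?i)union\s+select"),  # e.g. q=1 UNION SELECT password
]

def virtual_patch(path: str, query: str) -> bool:
    """Return True if the request matches a known exploit signature
    and should be blocked before it reaches the web application."""
    target = path + "?" + query
    return any(sig.search(target) for sig in EXPLOIT_SIGNATURES)

print(virtual_patch("/download", "file=../../etc/passwd"))  # blocked: True
print(virtual_patch("/search", "q=ssl+certificates"))       # allowed: False
```

The benefit of scale Gaffan mentions is that a signature learned from an attack on one customer's site can immediately protect every other site behind the same service.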

According to the study, average daily web traffic breaks down as follows: 5 percent hacking tools searching for unpatched sites or vulnerabilities in other websites, 2 percent automated content spammers, 19 percent spies collecting information, 20 percent benign search engine traffic and 49 percent ordinary people browsing the internet.
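One naive way a site operator might begin separating human from automated traffic is by inspecting the User-Agent header each request carries. The sketch below illustrates that heuristic only; the marker list is an assumption for illustration, and real services like Incapsula's rely on many stronger signals (behavior, request rate, IP reputation), since bots can trivially spoof this header.

```python
# Hypothetical markers that commonly appear in automated clients' User-Agents.
KNOWN_BOT_MARKERS = ("bot", "crawler", "spider", "curl", "python-requests")

def classify_user_agent(user_agent: str) -> str:
    """Return 'automated' if the User-Agent contains a known bot marker,
    otherwise 'human'. A naive heuristic, easily evaded by spoofing."""
    ua = user_agent.lower()
    if any(marker in ua for marker in KNOWN_BOT_MARKERS):
        return "automated"
    return "human"

for ua in (
    "Mozilla/5.0 (Windows NT 6.1; rv:17.0) Gecko/17.0 Firefox/17.0",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "python-requests/1.0.4",
):
    print(classify_user_agent(ua))
```

Even a crude filter like this makes the survey's point concrete: every request it tags "automated" is server capacity not spent on a real visitor.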

"This means that the human user experience suffers because my server is trying to deal with all the ‘non-human’ traffic generated by software programs hitting the site," Tom Foremski writes on his IMHO blog on ZDNet, adding that having a service program security for a company may be a good option. "I don’t have time to keep up with the many security patches sent out, and then installing and upgrading multiple programs is a chore I’d rather do without."

Companies should invest in security tools such as high assurance SSL certificates to help protect themselves online.