The rapid growth and spread of data over the last few years has provided us with new conveniences, and this technology is more essential to daily life than ever before. Because so much of what we do is connected, though, there are more opportunities than ever for that information to be compromised. Hackers and thieves no longer need an elaborate scheme to obtain data from bigger corporations. There are other, smaller ways to get the same information.
What Big Data Has Meant for Cybercrime
Forbes magazine columnist Greg Martin, who is also the CEO of the Silicon Valley cybersecurity startup JASK, gave the example of someone using a company card to order a pizza online. To target that company, all a hacker has to do is breach the pizza delivery service, whose security measures are probably far weaker. If the person using the company card wasn't careful, he or she may have reused the same login and password that grants access to other accounts within that organization. Through this small-time avenue, a hacker has the opportunity to compromise critical data.
It's this phenomenon of big data that presents one of the most complex security issues of all time. Modern cyberattacks are far more insidious, making them hard to detect without the help of real-time analytics technology. As Martin explained it, this calls for "behavior-based security," which would comb through all of a company's internal data looking for threats, something a signature-based security approach couldn't handle.
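To make the behavior-based idea concrete, here is a minimal sketch of the core pattern: instead of matching known attack signatures, each user's activity is compared against that user's own historical baseline, and large statistical deviations are flagged. All names and numbers below are hypothetical, and a real deployment would use far richer features and streaming infrastructure.

```python
from statistics import mean, stdev

# Hypothetical per-user hourly request counts, gathered from internal logs.
history = {
    "alice": [12, 15, 11, 14, 13, 12, 16, 13],
    "bob":   [40, 38, 45, 42, 39, 41, 44, 40],
}

def is_anomalous(user, count, threshold=3.0):
    """Flag activity that deviates sharply from the user's own baseline.

    Uses a simple z-score: how many standard deviations the observed
    count sits above this user's historical mean.
    """
    samples = history[user]
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:
        return count != mu
    return (count - mu) / sigma > threshold

print(is_anomalous("alice", 13))   # typical volume -> False
print(is_anomalous("alice", 500))  # sudden spike  -> True
```

A signature-based scanner would miss the spike entirely unless it matched a known malware pattern; the behavioral check catches it because it is abnormal *for that user*, which is the distinction Martin is drawing.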
What Big Data Means for the Future of Cybersecurity
For this kind of security approach to work, enterprises need to be able to run real-time data applications with as few issues as possible, and that challenge grows with the size of the company in question.
Computerworld touched on this topic in a March 2015 article, remarking on the sheer volume of data that now exists and the conundrum it alone creates for cybersecurity professionals. It gave the example of a medium-sized network of 20,000 devices that transmits more than 50 TB of data in 24 hours, meaning roughly 5 Gbit of traffic would need to be analyzed every second to detect cyberattacks, threats and malware.
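That per-second figure is easy to sanity-check with back-of-the-envelope arithmetic: spread 50 TB of daily traffic evenly over 86,400 seconds and convert bytes to bits.

```python
# Sanity check of the Computerworld figure: 50 TB per 24 hours as a bit rate.
TB = 10**12                      # decimal terabyte, in bytes
total_bits = 50 * TB * 8         # bits transmitted per day
seconds_per_day = 24 * 60 * 60   # 86,400

gbit_per_s = total_bits / seconds_per_day / 10**9
print(round(gbit_per_s, 2))      # -> 4.63
```

With decimal terabytes this works out to about 4.6 Gbit/s; counting in binary terabytes (2^40 bytes) pushes it just past 5 Gbit/s, which matches the article's round figure either way.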
These numbers don't present an impossibility, though. They are meant to remind organizations of what's necessary to meet today's security standards. Martin wrote that speed is of the essence and that companies need to deploy more processing power, calling "fast data" the key term for 2016. To achieve fast-data processing, Martin points to cloud computing as a solution for large or rapidly growing organizations that need the ability to scale up to fit their needs.