Prevention is always better than cure when it comes to information security, particularly in today’s big data environments, where a daily flood of data threatens to drown the enterprise.
Gerald Naidoo, CEO of Logikal Consulting, says it’s not only enterprises that are going big: vast quantities of consolidated data are a very tempting target for threat actors. “Breaching a large enterprise’s store of big data can produce a massive payoff for cyber criminals. Moreover, the effects of a data breach can be catastrophic for the affected company. These stores hold terabytes upon terabytes of data, including a company’s most valuable information assets: financial credentials, proprietary information, intellectual property and staff records. The cost of the theft of this sort of data is all but incalculable, not just in terms of money, but in terms of reputation.”
To protect a big data environment, what is needed is proactive monitoring, which ensures real-time detection and allows security defence measures to be applied when needed, he says.
Securing big data comes with a set of unique challenges beyond the sheer volume of potentially saleable data that makes it a target. “In no way are we suggesting that big data security is fundamentally different from traditional data security, but there are different challenges and elements to consider when securing a big data environment.”
Naidoo says big data environments carry big risks, including random security attacks, data breaches, outdated or ineffective security policies and malicious insiders, among others. “A big data environment that isn’t properly secured can result in compromised data.”
The first step in securing a big data environment is ensuring the monitoring and analysis of audit logs to better understand and monitor large clusters, Naidoo says. He explains that IBM’s InfoSphere Guardium solution does exactly that, monitoring systems for any unauthorised or potentially dangerous activity to give the organisation enough time to mitigate, avoid, or reduce the impact of a breach or attack.
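To make the idea concrete, the sketch below shows, in Python, the general pattern of audit-log monitoring: tailing a log and flagging unauthorised or potentially dangerous activity as it happens. It is a hypothetical illustration only; the log format, user allow-list and path prefixes are assumptions, and it is not Guardium’s API.

```python
import json
import time

# Hypothetical audit-log monitor; this is NOT Guardium's API.
# Assumes one JSON event per line, e.g.:
# {"user": "alice", "action": "write", "resource": "/hr/payroll"}
AUTHORISED_USERS = {"alice", "bob"}          # assumed allow-list
SENSITIVE_PREFIXES = ("/hr/", "/finance/")   # assumed sensitive paths

def alert(event):
    # In practice this would feed a SIEM or ticketing system.
    print(f"ALERT: {event['user']} did {event['action']} on {event['resource']}")

def watch(log_path):
    """Follow the audit log and flag suspicious events in near real time."""
    with open(log_path) as f:
        f.seek(0, 2)  # jump to the end of the file, like `tail -f`
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)  # wait for new events to arrive
                continue
            event = json.loads(line)
            if (event["user"] not in AUTHORISED_USERS
                    or event["resource"].startswith(SENSITIVE_PREFIXES)):
                alert(event)

if __name__ == "__main__":
    watch("audit.log")
```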
He says detecting whether a big data cluster has been breached is an often overlooked necessity that demands a proactive and efficient approach.
“The constant monitoring of transaction logs is a practical solution. Logging can be added to the existing cluster, the cluster’s shared Web features can be used to manage log files, or a SIEM or other log-management product can be deployed. There is no doubt that logging helps identify attacks, diagnose failures and root out anomalous behaviour by tracing events to their root cause. Guardium does all of this.”
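What makes such root-cause tracing possible is structured, correlated logging. The sketch below is a generic Python illustration, not tied to Guardium or any particular SIEM; the field names and the correlation-ID scheme are assumptions.

```python
import json
import logging
import uuid

# Generic structured-logging sketch; field names are illustrative assumptions.
logger = logging.getLogger("cluster.audit")
handler = logging.StreamHandler()  # in production, ship records to a SIEM
handler.setFormatter(logging.Formatter("%(message)s"))
logger.addHandler(handler)
logger.setLevel(logging.INFO)

def log_event(correlation_id, node, action, detail):
    """Emit one JSON log line; the correlation ID ties related events together."""
    logger.info(json.dumps({
        "correlation_id": correlation_id,
        "node": node,
        "action": action,
        "detail": detail,
    }))

# All events of one job share an ID, so a failure can be traced to its cause.
job_id = str(uuid.uuid4())
log_event(job_id, "node-01", "job_submitted", "nightly aggregation")
log_event(job_id, "node-07", "task_failed", "permission denied on /finance/raw")
```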
Another benefit Guardium offers for securing a big data environment is the horizontal scalability and transparency needed to work with big data. “The solution’s data encryption feature provides cryptographic encryption and decryption without interruption to the big data environment. Its centralised policy and key management services are a bonus, protecting structured and unstructured data by requiring that users attempting to access encrypted files hold the required encryption key or certificate.”
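The underlying principle, that encrypted data is unreadable without the right key, can be demonstrated with a minimal sketch. The example below uses the Python cryptography library’s Fernet recipe purely as a stand-in; it is not Guardium’s encryption mechanism, and the key would in practice come from a central key-management service.

```python
from cryptography.fernet import Fernet, InvalidToken

# Minimal key-gated access sketch; Fernet is a stand-in, not Guardium.
key = Fernet.generate_key()  # in practice, issued by central key management
ciphertext = Fernet(key).encrypt(b"Q3 payroll: confidential")

# A user holding the correct key can decrypt the data.
print(Fernet(key).decrypt(ciphertext))  # b'Q3 payroll: confidential'

# A user holding any other key cannot.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("Access denied: wrong key")
```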
He says policy specifications that guard against inference are an absolute necessity for semantic Webs, where security and privacy violations can stem from combining individually harmless facts.
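A toy version of such an inference-guard policy is sketched below. The attribute names, the threshold and the rule itself are illustrative assumptions, not any product’s actual policy language.

```python
# Toy inference-guard policy: attributes that are harmless on their own
# can, in combination, re-identify a person or reveal something sensitive.
QUASI_IDENTIFIERS = {"postcode", "birth_date", "gender"}  # assumed attributes
MAX_QUASI_IDENTIFIERS = 2  # assumed threshold: at most two per query

def allowed(requested_columns):
    """Deny any query whose column set enables re-identification by inference."""
    overlap = QUASI_IDENTIFIERS & set(requested_columns)
    return len(overlap) <= MAX_QUASI_IDENTIFIERS

print(allowed(["postcode", "salary_band"]))           # True: one quasi-identifier
print(allowed(["postcode", "birth_date", "gender"]))  # False: inference risk
```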
“It is easy to achieve a secure big data environment by using IBM’s Guardium. Don’t forget that big data is often subject to numerous compliance and privacy regulations, making it even more important to ward off threats and protect its integrity. This is particularly relevant for the public sector, which has significant big data environments to protect,” Naidoo concludes.