In response to the recent survey from LogRhythm*, I find it shocking that almost half of organisations that have suffered a data breach took more than four months to detect the issue, and I agree with Ross Brewer that businesses are still not doing enough to protect their networks from today’s threats.
The immediate reaction, fixing the threat once it is detected, is of course essential, and it is something businesses must put into practice far more quickly. A three-month lag between detecting a problem and mitigating the risk is frankly unacceptable. Quite simply, even a single week of unfettered data exfiltration can mean millions of compromised records.
As well as placing importance on fixing threats once they occur, it is also vital for organisations to concentrate on preventing attacks in the first place and, most importantly, containing breaches once they happen.
The real imperative is that the security architecture as a whole must be fixed; it has not evolved at the same speed as the new world of digitised data and the borderless applications exploited in every major breach. The common security architecture is still built around the antiquated notion that a firewall can keep the ‘bad guys’ out. Once an enterprise application is shared with any external party, even the most advanced next-generation firewalls cannot guarantee that the application is safe.
The need is for architectures to adapt to the new world of applications by ensuring that network segmentation and application isolation can be applied across all network environments, both external and internal. User access control policies must then be applied and enforced in real-time, across all users and all applications both inside and outside the traditional firewalled perimeter.
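To make the principle concrete, the sketch below shows what per-request, deny-by-default access control might look like when it is enforced regardless of where a request originates, inside or outside the traditional perimeter. All names, rules and the policy structure here are illustrative assumptions, not any specific product's implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Request:
    user: str
    application: str
    source_segment: str  # e.g. "internal", "partner", "internet"

# Hypothetical policy table: which users may reach which applications,
# and from which network segments. Anything not listed is denied.
POLICY = {
    ("alice", "payroll"): {"internal"},
    ("bob", "crm"): {"internal", "partner"},
}

def is_allowed(req: Request) -> bool:
    """Permit the request only if an explicit rule grants this user,
    this application and this source segment. Location inside the
    firewalled perimeter confers no privilege on its own."""
    allowed_segments = POLICY.get((req.user, req.application), set())
    return req.source_segment in allowed_segments
```

The key design point is deny-by-default: an internal user with no matching rule is refused just as an external one would be, which is the behaviour segmentation and application isolation are meant to guarantee.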
The time for the industry to recognise that a fresh approach is needed is now; if companies wait for another global data breach before making changes, it will simply be too late.