A look at the current state of security
"If we look at how we are doing now, we are not doing very well," commented Kaplan.
According to a report by the Privacy Rights Clearinghouse (PRC), approximately 868 million records, equivalent to 11 percent of the world's population, have been breached in the past nine years.
This is not an encouraging statistic, said Kaplan. Moreover, the rate at which these threats are hitting us is escalating, based on statistics from late 2012 through 2013. Unfortunately, Singapore fared no better than its global counterparts.
"So what can we do about it? There will definitely be several vendors that will look to sell silver-bullet solutions, but I am not that optimistic. I'm more of: what can we do when things go wrong?" said Kaplan.
You have to think like a bad guy, he advised. Many people have the perception that their data is not important, and that there is a low probability of their data security being compromised.
"If you are online and you have data, then you are vulnerable. There is no such thing as my division, my company or my business isn't interesting. Everybody's interesting," emphasised Kaplan.
Go with the 'flow'
According to Kaplan, there are many ways to obtain this visibility: link or edge-based visibility (such as firewalls, packet shapers and IDS), scanning-based visibility, agent and human-based visibility, as well as network flow-based visibility.
Among these, Kaplan said that he likes flow data the most, describing it as a "telephone bill for network conversations".
Every connection in a network generates a flow record. A flow record is information about one host's contact with another host, giving details such as the host's own source IP address, the destination IP address of the host it is contacting, the TCP port number, the protocol type, the volume of data transferred and many other statistics.
Flow data has become a fundamental source of intelligence for monitoring networks and network activity. It is generated by much of the existing infrastructure at no extra charge, and the flow records contain detailed information without the need for packet capture.
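The "telephone bill" analogy can be made concrete with a small sketch of the fields a flow record typically carries, as described above. This is an illustrative structure only, with hypothetical field names, not a real NetFlow or IPFIX schema:

```python
from dataclasses import dataclass

@dataclass
class FlowRecord:
    """One 'line item' on the network's telephone bill:
    metadata about a single host-to-host conversation."""
    src_ip: str      # source IP address of the initiating host
    dst_ip: str      # destination IP address of the contacted host
    dst_port: int    # TCP/UDP port number on the destination
    protocol: str    # protocol type, e.g. "TCP" or "UDP"
    bytes_sent: int  # volume of data transferred
    packets: int     # packet count for the conversation

# An example record: an HTTPS conversation summarised without
# capturing any of the packets themselves.
record = FlowRecord("10.0.0.5", "203.0.113.7", 443, "TCP", 15432, 21)
print(f"{record.src_ip} -> {record.dst_ip}:{record.dst_port} "
      f"({record.protocol}, {record.bytes_sent} bytes)")
```

Note that, like a phone bill, the record shows who talked to whom, for how much, and over what channel, but not what was said.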
"Solutions capable of interpreting this host connection metadata offer organisations a greater understanding of their networks and the data that flows around them," said Kaplan.
According to a Gartner report titled 'When is NetFlow good enough?', network monitoring and analysis is "shifting from primarily a probe-and-polling approach to a balanced approach that includes summarised flow-based data sources".
In the report, Gartner also estimates that NetFlow will provide 80 percent of the network visibility needed in the core of the network infrastructure, while probe-based technologies will provide the remaining 20 percent.