

Securing the new era of big data

Eric Chan, Regional Technical Director, Southeast Asia & Hong Kong, Fortinet | Dec. 16, 2014

In the digital age, information is the new currency.  And in order to get information, enterprises are mining data — and lots of it — for the knowledge that it can yield. 

On the scale of Internet commerce or social networks, the amount of data can be pretty large — think of the hundreds of millions of smartphones and end-user devices. On the scale of consumer, medical, scientific, or research data, it can be gigantic, as sensors and instruments can collect vast amounts of raw data, whether from a single source (such as instrumentation of a GE aircraft engine during a flight) or from the projected 26 billion devices that will make up the Internet of Things. 

The Gold Rush we currently see for collecting and analyzing Big Data, which in turn is being fed increasingly by the Internet of Things, is creating greater challenges for the networks and security of the data centers in three key areas:   

First, there is Aggregation. Increasingly, rather than processing and reducing raw data to a more manageable volume at the source, enterprises are transferring and storing it centrally, because it now can be, so that it can be analyzed in different ways over time. Today, enterprises transfer terabytes of data over long distances every day. The sheer quantity of data is forcing core network and data center upgrades, such as 100GbE switching fabric, to handle individual data transfers at 10Gbps or even higher. This also creates challenges for perimeter security, such as firewalls, as many security solutions today are not designed to handle such large inflows and sessions. For example, a firewall that boasts 10GbE ports or 40Gbps aggregate throughput may not have internal processing paths capable of carrying a single 10Gbps flow end to end. LAN congestion from normal enterprise campus traffic may further saturate the appliance's CPU or memory, causing large flows to stall or even drop.
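To get a feel for the scale of the aggregation problem, a quick back-of-envelope calculation shows why bulk transfers force these upgrades. The figures below (5TB of raw data, 80% usable bandwidth) are illustrative assumptions, not numbers from this article:

```python
# Back-of-envelope: how long does a bulk data transfer take at a given line rate?
# Data size and efficiency factor are illustrative assumptions.

def transfer_hours(data_terabytes: float, link_gbps: float, efficiency: float = 0.8) -> float:
    """Hours to move `data_terabytes` over a link at `link_gbps`,
    assuming `efficiency` of the line rate is usable after protocol
    overhead and congestion."""
    bits = data_terabytes * 8 * 10**12            # terabytes -> bits (decimal units)
    usable_bps = link_gbps * 10**9 * efficiency   # line rate -> usable bits/second
    return bits / usable_bps / 3600

for gbps in (1, 10, 100):
    print(f"5 TB over {gbps:>3} Gbps link: {transfer_hours(5, gbps):6.2f} hours")
```

At these assumed numbers, the same 5TB transfer drops from roughly half a day at 1Gbps to under ten minutes at 100Gbps, which is the gap that drives the fabric upgrades described above.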

Next comes Processing. Big data flows are not symmetric: the raw data that goes in does not necessarily leave in the same form or volume. Instead, the data held in storage arrays is typically analyzed by an intermediary set of servers, then further reduced and delivered, often by web server front-ends, as a condensed set of insights before exiting the data center. This means higher bandwidth, with a growing proportion of lateral, or east-west, traffic within the data center, as opposed to north-south traffic heading out to the Internet or elsewhere. Many studies show that east-west traffic now accounts for up to 70% of data center traffic, and that proportion will keep rising as big data analytics grows.
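One simple way to see the east-west/north-south split in practice is to classify each flow by whether both endpoints sit inside the data center's address space. The prefixes and sample flows below are hypothetical, chosen only to illustrate the idea:

```python
# Classify flows as east-west (both endpoints inside the data center)
# or north-south (at least one endpoint outside).
# The prefixes and sample flows are hypothetical.
import ipaddress

DC_PREFIXES = [ipaddress.ip_network(p) for p in ("10.0.0.0/8", "172.16.0.0/12")]

def in_data_center(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in DC_PREFIXES)

def flow_direction(src: str, dst: str) -> str:
    if in_data_center(src) and in_data_center(dst):
        return "east-west"
    return "north-south"

flows = [
    ("10.1.2.3", "10.4.5.6"),     # e.g. storage array -> analytics server
    ("10.4.5.6", "203.0.113.7"),  # e.g. web front-end -> Internet client
]
for src, dst in flows:
    print(f"{src} -> {dst}: {flow_direction(src, dst)}")
```

A storage-to-analytics transfer never crosses the perimeter firewall, which is why the traditional edge-centric security model covers a shrinking share of the traffic described above.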
