
Q&A with Oracle’s Christopher G. Chelliah: Move to big data decision-making is inevitable

Zafar Anjum | Jan. 9, 2014
There is no doubt that the move to big data decision-making is inevitable and compelling. However, with the level of vendor-noise around big data, most companies fail in these projects with their approach, says Christopher G. Chelliah, Vice President and Chief Architect, Exadata & Strategic Solutions at Oracle Asia Pacific in this interview.

Absolutely. We're in the smartphone era and already turn to big data for many aspects of personal decision-making, from trip planning to traffic monitoring to restaurant reviews. It's natural to start questioning the conventional norm and push to see the correlation between external and internal data. This is an even more likely agenda in Singapore, where there is an intrinsically competitive mindset - a recurring theme in our customer conversations.

How are databases enablers of business efficiency and uptime?

The database market - both structured and unstructured - is at the inflection point of a hockey-stick growth curve. Everything we do today involves data-centric decisions - traffic light signals, trading systems and MRT piloting systems, amongst others. Singapore has led the global transformation in driving efficiencies through data-centric (or fact-based) decision-making. This is a quest that is not going away in the near future.

The reality is that all businesses are already on the big data adoption curve - albeit at different speeds. The rewards will go to CIOs who can figure out how to manage the growing terabytes of data they collect, glean insights from that data, and capitalise on those insights. Hence, the deployment of next-generation databases is key here.

Adopting technologies that make better use of big data and manage it efficiently also enhances decision-making processes within organisations, which in turn helps businesses achieve a competitive edge. There is no denying that this exponential growth is forcing IT departments to rethink their strategies and bring in new tools to optimise database workloads. Data management, via in-memory and other specialised databases, is therefore crucial to enabling efficiency, as well as to maximising the value and opportunities that big data presents[ii].
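To make the in-memory point concrete, here is a minimal Python sketch - not Oracle-specific and not from the interview - that times the same aggregation query against an on-disk and an in-memory SQLite database. The table name, row count and workload are illustrative assumptions only; production in-memory databases work very differently, but the shape of the comparison is the same.

import os
import sqlite3
import time

def load_and_query(conn):
    # Build an illustrative 200,000-row table, then time one aggregation.
    cur = conn.cursor()
    cur.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)")
    cur.executemany(
        "INSERT INTO trades VALUES (?, ?, ?)",
        [("SYM%02d" % (i % 50), i % 1000, i * 0.01) for i in range(200000)],
    )
    conn.commit()
    start = time.perf_counter()
    cur.execute("SELECT symbol, SUM(qty * price) FROM trades GROUP BY symbol")
    cur.fetchall()
    return time.perf_counter() - start

if os.path.exists("trades_disk.db"):
    os.remove("trades_disk.db")  # start each run from a clean on-disk file

for target in ("trades_disk.db", ":memory:"):
    conn = sqlite3.connect(target)
    print("%-15s query took %.4fs" % (target, load_and_query(conn)))
    conn.close()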

How should companies approach big data to become more competitive?

There is no doubt that the move to big data decision-making is inevitable and compelling. However, with the level of vendor noise around big data, most companies get the approach wrong and fail in these projects[iii].

We would propose two key litmus tests for customers evaluating big data:

  • Reconsider an approach if it requires discarding skills and technologies that are already in-house.
    • The underlying big data technologies are open source and leave little room for vendor differentiation. Instead of making a sweeping change or introducing more complexity into your IT environment, look at your existing portfolio - start small, and leverage what you already have.
  • Reconsider an approach if it requires keeping any data in both your structured (traditional RDBMS) and unstructured (Hadoop or NoSQL) environments.
    • With the rapid explosion of big data, any duplication of stored data is bound to have a significant cost multiplier effect; the back-of-envelope sketch after this list illustrates why. Typical approaches count only the cost of storing the NoSQL databases on low-cost disk arrays - that is a small part of the total cost and will not be sustainable over the life of a project.
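As a rough illustration of that multiplier, the following Python sketch compares the yearly storage bill for a single governed copy against keeping the same data in both environments. Every figure in it - data volume, overhead factor, replication factor and $/TB rates - is an assumption made up for illustration, not vendor pricing or a number from the interview; note that it still counts only disk, which is exactly the understatement the answer above warns about.

# Back-of-envelope sketch; all figures are illustrative assumptions.
raw_tb = 100                 # source data volume, in TB (assumed)
rdbms_overhead = 1.3         # assumed factor for indexes, redo/undo, backups
hdfs_replication = 3         # HDFS's default block replication factor
cost_rdbms_per_tb = 2000     # assumed $/TB/year on an enterprise array
cost_hdfs_per_tb = 300       # assumed $/TB/year on commodity disk

single_copy = raw_tb * rdbms_overhead * cost_rdbms_per_tb
duplicated = single_copy + raw_tb * hdfs_replication * cost_hdfs_per_tb

print("RDBMS copy only: $%s/year" % format(int(single_copy), ","))
print("RDBMS + Hadoop:  $%s/year (+%.0f%%, before operations, "
      "network and skills costs)"
      % (format(int(duplicated), ","), 100 * (duplicated / single_copy - 1)))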

 
