
The next tech trends to disrupt data centres

Bonnie Gardiner | May 28, 2015
Trends are born from constant technological, societal, and organisational change.

Cappuccio likens this trend to the open software movement, which began in the '90s, when enterprises and software developers had free access to large amounts of code. But it only gained real momentum once vendors stood up and offered to ensure the free software was valid and tested, along with maintenance services.

"Those vendors took over responsibility to maintain it. It's hard to find an enterprise today without some type of open software running for the web servers or whatever, but it's all managed by these vendors," says Cappuccio.

"Open hardware is the same thing right now; we're in the early phases of the movement, where we're beginning to see vendors pop up. We're early in the life cycle right now, but we think it's going to have an impact."

Proactive infrastructures

IT leaders are seeing more and more intelligence built into infrastructure, especially as it moves toward sophisticated data analytics. So how does this affect them?

There has been a level of analytics in IT for quite some time, with many companies putting in data centre infrastructure management (DCIM) tools to manage energy consumption. In response, vendors have started to tie asset management and workflow management tools into their offerings, allowing a more granular view of the data surrounding these assets.

The next step involves adding a whole new level of diagnostics in order to understand not just what happened and why, but what will happen next.

"You can ask the data — if I install these new devices, what's going to happen to the rest of IT? Or, I want to accomplish something, how can we make it happen?" says Cappuccio.

"Either let me run the intelligence to understand what could happen down the road and how it can happen — or, better yet, do this analysis for me and let me know that you're seeing what I can't see."

We're starting to see this kind of analytics affecting generators, storage, and eventually networking. This is a level of analysis that most companies aren't using yet, but Gartner anticipates it taking off mostly in hybrid environments.
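To make the shift from descriptive to predictive diagnostics concrete, here is a minimal sketch of the idea in Python. The readings, the capacity threshold, and the function names are hypothetical illustrations, not the output or API of any real DCIM product; the point is simply that trending historical telemetry forward answers "what will happen next" rather than "what happened".

```python
# Minimal sketch: trend historical DCIM power readings forward to
# predict when a rack will breach its capacity threshold.
# All numbers and names here are hypothetical, for illustration only.

def linear_fit(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def periods_until_threshold(readings_kw, threshold_kw):
    """Estimate how many future periods until power draw crosses threshold_kw.

    Returns None when the trend is flat or declining (no breach predicted).
    """
    xs = list(range(len(readings_kw)))
    slope, intercept = linear_fit(xs, readings_kw)
    if slope <= 0:
        return None
    breach_x = (threshold_kw - intercept) / slope
    return max(0.0, breach_x - xs[-1])

# Hypothetical weekly rack power readings (kW) against a 12 kW limit:
readings = [8.0, 8.4, 8.9, 9.3, 9.8]
print(periods_until_threshold(readings, 12.0))
```

A real deployment would use far richer models and data, but the shape of the question is the same: instead of reporting last week's consumption, the tooling tells the operator how many weeks of headroom remain.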

"With hybrid today, I've got something running off-premise and something else running on-premise, and the problem is, at the end of the day, you still own that end-user experience.

"You need somebody who understands how all those pieces tie together and who can monitor them with very granular detail, and which software vendors can do that today? Not many," says Cappuccio.

This trend is useful in determining the right way to optimise workloads, including the right time to shut them down or move them, and how to restructure applications according to demand.
