
Building blocks are snapping together virtually to handle big storage

Stephen Lawson | Oct. 16, 2013
Easy expansion, lower costs and cloud-like internal services are common themes in storage today, and startup Coho Data is playing to all three with a "micro-array" architecture that it's introducing on Tuesday.

Enterprises will be able to combine Coho's micro-arrays in a grid to form huge storage systems with capacities reaching into the petabytes. Users will be able to tap into that capacity as if they were signing up for more storage on a public cloud, without worrying about what storage hardware makes up the underlying system, the company says. Coho will showcase the system this week at Storage Networking World, where backup veteran Sepaton will also unveil a new architecture designed to scale out to multiple nodes.

The heart of the Coho DataStream architecture is the company's software, which lets it take full advantage of fast PCIe and Ethernet interfaces without bottlenecks, according to Coho. The company first planned to sell its technology as pure software but found through pilot projects that enterprises didn't trust a storage platform that didn't come with its own hardware, Coho Co-Founder and CTO Andy Warfield said.

So Coho combined flash devices, hard disks, CPU sockets and network interface cards in rack-mounted boxes. It links multiple micro-arrays together logically through a software-defined networking switch. "We get the hardware that lets us build that balanced trio of resources as densely as we can," Warfield said. The hardware components can come from various sources, which may change over time.
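The idea of a balanced box of resources pooled behind one logical system can be sketched in a few lines. This is purely illustrative, not Coho's software or API; all names and figures here are hypothetical:

```python
# Illustrative sketch (hypothetical names, not Coho's API): each
# micro-array bundles flash, disk, compute and a network interface;
# a grid pools many of them into one logical system, the way the
# SDN switch lets Coho's software treat the boxes as a single pool.
from dataclasses import dataclass


@dataclass
class MicroArray:
    flash_tb: float
    disk_tb: float
    cpu_cores: int
    nic_gbps: int


class Grid:
    """A logical pool over many micro-arrays."""

    def __init__(self):
        self.nodes = []

    def add(self, node: MicroArray):
        # Expansion is just adding another balanced box to the pool.
        self.nodes.append(node)

    @property
    def capacity_tb(self) -> float:
        return sum(n.flash_tb + n.disk_tb for n in self.nodes)


grid = Grid()
for _ in range(3):
    grid.add(MicroArray(flash_tb=2, disk_tb=36, cpu_cores=8, nic_gbps=10))
print(grid.capacity_tb)  # 114.0
```

The point of the model: because every node carries its own flash, disk, CPU and network port, growing the pool never means growing just one of those dimensions.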

"We're a software company," Warfield said. "We're packaging on top of commodity hardware that will follow the commodity performance curve." Meanwhile, the software will also continue to evolve, bringing customers new capabilities, he said. Coho still plans to have a software-only product some day.

Coho's modular approach should be more economical than buying traditional arrays, said Bob Plankers, a virtualization architect at a major Midwestern university that has tested Coho's technology. The system scales up in a linear fashion, with performance keeping in step with capacity, he said.

"You're not just adding more drives, you're also adding the ability to keep up with what's using that additional capacity," Plankers said. That prevents over-investing in controllers or network components that aren't needed, he said.
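The economics Plankers describes can be made concrete with a toy comparison, assuming made-up per-unit figures (the numbers below are hypothetical, not Coho's or any vendor's specifications):

```python
# Illustrative comparison: a scale-out grid where each node adds both
# capacity and throughput, versus a traditional array whose throughput
# is capped by a fixed controller. All figures are hypothetical.

NODE_CAPACITY_TB = 40        # capacity added per micro-array
NODE_THROUGHPUT_MBPS = 2000  # throughput added per micro-array
CONTROLLER_CAP_MBPS = 6000   # controller ceiling on the traditional array


def scale_out(nodes: int) -> tuple:
    """Grid of micro-arrays: both dimensions grow with node count."""
    return nodes * NODE_CAPACITY_TB, nodes * NODE_THROUGHPUT_MBPS


def scale_up(shelves: int) -> tuple:
    """Traditional array: capacity grows, but throughput stops at the
    controller cap no matter how many shelves of drives are added."""
    capacity = shelves * NODE_CAPACITY_TB
    throughput = min(shelves * NODE_THROUGHPUT_MBPS, CONTROLLER_CAP_MBPS)
    return capacity, throughput


for n in (1, 4, 8):
    print(n, "nodes -> grid:", scale_out(n), "traditional:", scale_up(n))
```

At eight units, both designs hold the same 320 TB in this model, but the traditional array's throughput has flatlined at the controller's limit, which is the over-investment in capacity relative to performance that Plankers says the modular approach avoids.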

Coho fits the university's plans to offer internal cloud services. A private cloud could provide better service to internal customers for the same or less money than the university spends today, with better control and visibility over costs and security, Plankers said.

The university now uses conventional storage arrays, primarily from NetApp and Hitachi Data Systems. Plankers sees Coho's micro-arrays forming a middle storage tier between his high-speed, high-availability systems at the high end and inexpensive disk-based platforms at the low end. That tier could eventually account for 80 percent of the university's storage, he said. Coho can actually match the high-end systems for speed, but replacing those platforms with the new gear would require rethinking functions such as replication, Plankers said.
