HDS architecting solutions for 'business-defined IT'

  • Organisations need to invest in future-proof infra that can weather dynamic change
  • Investments must be made with eye towards supporting big data environment
In line with its mission to help businesses leverage the benefits of a data-driven economy more effectively, Hitachi Data Systems (HDS) has introduced its Continuous Cloud Infrastructure portfolio.

The portfolio comes under the banner of 'Business Defined IT,' which HDS says involves linking a company’s business and technology objectives more closely, a shift that demands a more responsive IT foundation.

The new solutions are intended to drive IT efficiency through a responsive, software-rich architecture that can quickly react to changing needs without continual redesign and disruption, the company claims.
 
“Our customers across industries have told us that to keep up with the frenetic pace of business, they are aligning the IT and business functions more closely than ever,” HDS Malaysia managing director Wee Kai Teck told a recent media briefing in Kuala Lumpur.
 
“In order to execute in this business-defined world, IT teams are looking to new infrastructure strategies to deploy more continuous, adaptable and scalable infrastructure. Businesses need solutions that don’t require constant and disruptive changes to the technology they support. And that is what we are delivering now,” he claimed.
 
The new offerings include the Hitachi Storage Virtualization Operating System (SVOS), Hitachi Virtual Storage Platform G1000 (VSP G1000), a new version of the Hitachi Command Suite management platform, and enhancements to its Hitachi Unified Compute Platform converged computing offerings.
 
HDS chief technology officer Hu Yoshida said the key characteristics of Business Defined IT are the ability to be mobile and to share information, with the associated economics centred on reducing cost and optimising value.
 
“This is so you don't spend on different silos but instead on economic solutions to get insights that can be used by all. We call it Continuous Cloud Infrastructure because the infrastructure is where we need to start,” he added.
 
Yoshida said that HDS’ announcement could be broken down into four parts, with the first being the company’s SVOS offering.
 
Touted as an industry first, SVOS is the standalone software implementation of HDS’ storage virtualisation technology.
 
The new storage operating system is intended to provide a common software architecture that will double the useful life of hardware architectures, span the breadth of the HDS infrastructure portfolio, and enhance and amplify the benefits of server virtualisation.
 
Primary features include flash optimisation, advanced storage virtualisation, automated tiering, non-disruptive data migration and a new native global active device feature that will provide multi-system and multi-datacentre active-active capabilities without the need for an appliance.
 
“Virtualisation until now has been vertical: you have a control unit and virtualise other people’s storage under it. By making it horizontal, if one [piece of] equipment fails it is easy to divert resources; it also means that if there is a need for a new control unit, one can seamlessly migrate to it and enable continuous availability.
 
“With this, we’re separating the software and making it independent of the development cycle of the hardware,” Yoshida said.
 
Meanwhile the Hitachi VSP G1000 is the first available system on which customers can natively deploy SVOS.
 
The system can start small and scale to block-storage throughput of well over three million IOPS (input/output operations per second), more than 48GB/sec of usable bandwidth, and over 1.2 million NFS ops/sec in unified configurations.
 
Yoshida said that performance has been quadrupled with the G1000, an intentional design decision to enable a longer shelf life and to meet the demands of a big data world.
 
“Most customers today are running 300-400 IOPS with some at 1000 IOPS. Today, they’ll say they don’t need that but in five to seven years, they will.
 
“This type of machine must be able to scale to meet big data requirements -- you’re going to see exabytes of storage, and by 2020, the world will be changed by the Internet of Things.
 
“If you think five years ago, how rapidly the world has changed since, you’re looking at the same type of change accelerating in the next five to seven years and the investments made today have to support us to 2020 and beyond,” Yoshida said.
 
In addition, collaboration with strategic partners such as Microsoft, SAP and VMware enables SVOS and the VSP G1000 to be certified in key initiatives like Microsoft Private Cloud deployments, SAP HANA Tailored Data Centre Integration and VMware Software-Defined Data Centre technology.
 
The other components of the HDS announcement include enhancements made to Hitachi Unified Compute Platform (UCP) and Unified Compute Platform Director 3.5.
 
On top of support for VSP G1000 and SVOS, there are new entry-level configurations of UCP for VMware vSphere and increased capabilities in Unified Compute Director, such as server profiling for simplified provisioning and enhanced disaster recovery integration.
 
“Openness in the cloud means playing well with others and we will be providing that,” Yoshida claimed.
 
The company also released a new version of Hitachi Command Suite (HCS), its integrated management platform, which supports the new global storage virtualisation features in SVOS, and offers a common REST API (representational state transfer application programming interface) across the platform, as well as an updated, streamlined user interface.
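HDS did not detail the API in the announcement, so the sketch below is purely hypothetical: the host, paths and resource names are invented for illustration. It only shows what a "common REST API across the platform" means in practice, namely that the same request shape works against any managed array, with only the storage-system identifier changing.

```python
from urllib.request import Request

# Hypothetical base URL and resource layout -- NOT the real HCS API.
BASE_URL = "https://hcs.example.com/api/v1"

def build_list_volumes_request(storage_system_id: str) -> Request:
    """Build (but do not send) a request listing volumes on one array.

    With a common REST API, the identical verb, path pattern and JSON
    content type apply across the whole portfolio; only the
    storage-system identifier in the path differs.
    """
    url = f"{BASE_URL}/storage-systems/{storage_system_id}/volumes"
    return Request(url, headers={"Accept": "application/json"}, method="GET")

# The same builder would serve a VSP G1000 or any other managed system.
req = build_list_volumes_request("vsp-g1000-01")
```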
 
These new class-leading technologies come to market with integration across virtualisation platforms, databases and a variety of clustering and operating system platforms, and can be quickly adopted to support a variety of workloads, according to HDS.
 
“What this does, with the operating system, the storage and the unified management system, is help us lay the foundation for a software-defined data centre, enabling the designation of templates that users can configure without needing to know all elements of the storage network.
 
“This simplifies things and makes it more agile. You don’t want IT people to be hooking wires together and managing infrastructure; you want them to focus more on the business and applications, and the solution will allow it, letting automation do the grunt work,” Yoshida said.
 
He said that in terms of investment involving capital expenditure and licensing fees, HDS’ solution can enable up to a 25% reduction in hardware and software costs, up to a 40% reduction in storage management costs, and up to a 30% reduction in power, cooling and floor space costs of a data centre.
 
In addition, HDS is touting up to 35% savings in total cost of ownership for storage, not to mention an up to 60% reduction in the cost of application infrastructure.
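Since these are all "up to" figures, they represent best-case ceilings rather than guaranteed outcomes. A short sketch, using an invented baseline budget purely for illustration, shows how the quoted maximum percentages would translate into absolute savings:

```python
# Illustrative only: the baseline figures below are invented; the
# percentages are the "up to" maximums HDS quotes, so these are
# best-case numbers, not guaranteed results.
baseline = {
    "hardware_software": 1_000_000,   # annual hardware + software spend
    "storage_management": 400_000,    # storage management costs
    "power_cooling_space": 300_000,   # power, cooling and floor space
}

max_reduction = {
    "hardware_software": 0.25,    # up to 25% (quoted)
    "storage_management": 0.40,   # up to 40% (quoted)
    "power_cooling_space": 0.30,  # up to 30% (quoted)
}

# Best-case saving per category, and the combined total.
savings = {k: baseline[k] * max_reduction[k] for k in baseline}
total_saved = sum(savings.values())
```

On this hypothetical budget the ceilings work out to 250,000 on hardware and software, 160,000 on management and 90,000 on power, cooling and floor space, or 500,000 in total; real figures would depend entirely on a customer's actual cost structure.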
 
“These are dramatic claims we’re making, but we have the data and customers to back them up,” Yoshida claimed.
 
Previous instalment: HDS aims to be ‘socially innovative’ solutions provider

Related Stories:

Big data benefits recognised in Asia but potential untapped: EIU study

Bullish outlook for Malaysia's storage market: HDS
