Data Mobility - Then and Now

By Mauricio Daher

Our industry has seen repeated cycles of centralisation and decentralisation: Mainframe or open systems? On-premises or off-premises? Build a DR site, or use Sungard? Which has been better? That's the past. What I can say is that today, IT leaders face a great number of new choices in how to deploy their business content, from cloud connectors for legacy applications to DevOps practices that make use of any number of cloud stacks.

This is compelling to marketing teams, but too many choices can be a hazard. By choosing solutions that make it easier to move content around and avoid vendor lock-in, organisations can reduce cost by eliminating the fork-lift upgrades that were needed in the past. Cost reduction is a very strong driver, but risk, or rather the cost of risk, also has to be considered.

The choice that matters most is whether or not to use a public cloud. The second most important choice is which deployment model to use: when, for example, is it more cost-effective to go to a SaaS model?

A clear example of a successful SaaS-run business is Salesforce.com. Why are companies still building their own infrastructure when they can pay someone else to do it for them? Well, here's one reason: Salesforce.com suffered a bad outage a few months ago, and it turned out the service was hosted in a Tier-2 data centre that experienced an internal power disruption.

Could they have avoided this by paying more for Tier-3? Hindsight is always 20/20. Whether the concern is trust or a need for flexibility, the important things to consider are the strengths of the underlying technologies: can they deliver access in a timely manner and protect the availability, integrity and confidentiality of all the content the business cares about? That matters more than the functionality on offer when everything is working at 100%, however flashy it may be. Resiliency and recoverability, as well as integrity and privacy, always have to be considered.

Private or Public? Ask yourself: can we do it better than Google, Amazon or Microsoft? But also ask: can I trust that my data will always be kept safe, private and retrievable in a timely manner, at a predictable cost? That may not be the case with these providers.

A growing number of IT customers have adopted content technologies such as the Hitachi Content Platform (HCP) on-premises to host both traditional and home-grown applications. Some have adopted software from Star Storage to retire legacy applications completely, and can still reach their content on HCP through a single pane of glass.

Customers with enterprise content management (ECM) platforms such as Documentum, OpenText or IBM FileNet have seen the value of putting that content on HCP using cloud protocols, while call-centre customers running NICE Engage software have found that HCP's REST interface provides a superior capability for meeting compliance needs.

The same goes for Enterprise Vault customers. Some customers use their DevOps capabilities to develop their own interfaces that access HCP over REST, or simply leverage HCP's Amazon S3-compatible capabilities. Star Storage SEAL provides an integrated approach to managing content from many of the legacy applications above, preserving that valuable content without the expensive licence costs.
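Because HCP presents an S3-compatible interface alongside its native REST API, teams can often point standard cloud tooling at an on-premises namespace with little more than a change of endpoint and credentials. Below is a minimal sketch of that idea in Python using boto3; the endpoint, bucket name and credentials are placeholders for illustration, not actual HCP configuration.

```python
import boto3

# A minimal sketch, assuming an HCP namespace exposed through its
# S3-compatible interface. The endpoint, bucket and credentials below
# are placeholders, not real HCP configuration.
s3 = boto3.client(
    "s3",
    endpoint_url="https://my-namespace.my-tenant.hcp.example.com",
    aws_access_key_id="HCP_ACCESS_KEY",        # placeholder credential
    aws_secret_access_key="HCP_SECRET_KEY",    # placeholder credential
)

# Write a call recording (or any business document) as an object.
with open("call-0001.wav", "rb") as f:
    s3.put_object(
        Bucket="compliance-archive",           # hypothetical bucket/namespace
        Key="recordings/2016/05/call-0001.wav",
        Body=f,
    )

# Read it back later through the same interface.
obj = s3.get_object(
    Bucket="compliance-archive",
    Key="recordings/2016/05/call-0001.wav",
)
data = obj["Body"].read()
```

The point of such an interface is portability: the same tooling works whether the namespace sits on-premises or behind a public cloud endpoint, which is exactly the kind of mobility that avoids lock-in.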

So, the question remains: Private or Public, and which deployment model should we use? The answer is up to you, the IT leader, but hopefully you can make a better decision after reading this article.

Mauricio Daher is Chief Information Security Technologist, Content and Cloud at Hitachi Data Systems