Kyle Hailey

Data Simplicity


Complexity costs us a lot. Managing data in databases is a big chunk of that cost. Applications voraciously consume ever-larger quantities of data, driving up storage spend and increasing IT budget scrutiny. Delivering application environments is already so complex that the teams of experts dedicated to that delivery demand many control points and much coordination. The flood of data and the complex delivery process make delivering environments slower and more difficult, and can lengthen refresh times so much that stale data becomes the norm. Complexity also grows as IT tries to accommodate the flood of data while application owners expect Service Level Agreements, backup/recovery protections, and compliance requirements to remain constant.

What’s the result? Even our most critical projects fall behind schedule and stay there because delivery of environments is so slow. We find ourselves accepting, even anticipating, production outages and the reputation risk they carry, because we just couldn’t test long enough on data that’s fresh enough, or on environments big enough, to find those problems before we went live. The cost and complexity of consolidating our data onto fewer platforms and datacenters has grown so high that we’re stuck year after year with a patchwork of datacenters, database versions, and aging infrastructure draining already over-strained IT budgets. Our data management processes are equally piecemeal, with no central point of control to provide the accountability and controls needed to ensure compliance.

When we talk to the folks who support these applications, they tell us that data management is complex, and that’s just the way it is. And it’s not only these high-visibility problems. We have highly paid experts who spend a lot of time copying, moving, and babysitting bits for complex architectures like Business Intelligence or Master Data Management. The striking thing about many of these situations is that we don’t think there is a data management problem; we’ve concluded that data management must be complex. But that conclusion is the problem. Instead of entertaining the idea that data management can be simpler, many leading technologists and business leaders shake their heads and say, “We’ve got the best tools, the best technology, and the latest processes. Our data management problems just aren’t that extreme.” Really?

We recently deployed Data Virtualization technology for a company on its own internal MDM project. That company is clearly expert at MDM, and as an industry leader it was certainly using the best tools and processes. It still cut its delivery timeline by 50%. We also deployed Data Virtualization technology for a Fortune 500 company with a large SAP implementation. Instead of delivering 2 similar-sized SAP releases every 6 months, they are now delivering 11 with the same team. Industry leaders are unlocking tremendous value because they are realizing that their processes can evolve and simplify, and that the bottlenecks can be removed.

Will you experience the same benefits? Maybe. Maybe not. But you’d agree that ignoring that much potential value is always the wrong decision. Disruptive technology is never well understood at first, because the essence of disruptive technology is that it finds a lever no one else can see to unlock value no one knew was there. It requires a new kind of thinking that challenges the way we’ve managed data. Data Virtualization can help you reduce workloads, streamline data management processes, remove bottlenecks by pushing work closer to end users, shorten the critical path, compress the application delivery cycle, deliver more features to market sooner, and gain competitive advantage. With Data Virtualization technology, we can massively simplify. And that’s where the value is.
