Delivering Business Value Through Data-First Modernization

BrandPost
14 Jan 2022
Cloud Computing

Getting strategic about data architecture is the key to maximizing data’s value as a business asset.

Despite the billions of dollars and countless hours channeled into digital transformation, too many companies are spinning their wheels, buried in petabytes of unusable data. As a result, many are saddled with initiatives that do little to advance the strategic agenda or deliver value for the business.

According to Forrester Research, companies invest three times as much in digital transformation as in other IT programs, yet progress is slow. Many companies say they are still navigating the early stages of the transformation journey, and Boston Consulting Group (BCG) research found that 70% admit they are underperforming, an indication that efforts have not met expectations or led to sustainable change.

The disconnect reveals an inconvenient truth: Regardless of how innovative or well architected the digital road map, transformation is incomplete — and likely unsuccessful at enterprise scale — without a strategy and an orchestration plan for data-first modernization that spans the data center, edge, and cloud.

“Data modernization is all about starting the journey from business goals and thinking about your data first: what data do you need, where do you use data, or how can you make the most out of your data,” explains Hande Sahin-Bahceci, senior marketing leader for Data and AI, HPE GreenLake Marketing. “Yet data architecture is typically an afterthought for most organizations. They start with modernizing infrastructure, then applications, and then data. What’s required is building a data strategy and planning for data-first modernization at the beginning, not the end.”

Pillars of data modernization

Why is data-first modernization so integral to digital transformation and, ultimately, business success? The answer is that without a proper modernization strategy, data remains in disarray and spread across legacy systems and multiple silos. This multigenerational IT sprawl creates significant hurdles for capitalizing on data’s real value.

For example, new data generated at the edge is essential for driving insights, yet only a small portion of it is leveraged effectively as part of strategic data and analytics programs. At the same time, considerable data and many systems are being pushed to the cloud, even though the cloud isn’t a good fit for all workloads and apps and can result in unforeseen costs, among other issues.

“Not having a modern data management system in place can lead to fragmented and isolated operations that aren’t managed in a coordinated way,” Sahin-Bahceci contends.

To steer toward data-first modernization, HPE advises companies to adopt these key principles:

  • Data is a core asset and a strategic part of business goals and thus should be controlled by the enterprise, not a public cloud vendor.
  • Essential data is everywhere and must be accessible at digital speed from its native location, whether that’s at the edge, in data centers, or in applications that have moved to the cloud.
  • Data has rights and sovereignty, requiring governance policies for security, privacy, and access (a minimal policy check is sketched after this list).
  • The public cloud is not the de facto platform, especially for industries or applications operating under strict regulatory requirements.
  • A unified view of data from edge to cloud and a single operational model will drive better performance and superior user experience.
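
To make the governance principle concrete, here is a minimal sketch in Python of what a sovereignty-aware policy check might look like. The dataset fields, regions, and clearance levels are invented for illustration and are not part of any HPE product:

  from dataclasses import dataclass

  @dataclass
  class Dataset:
      name: str
      region: str          # where the data physically resides
      classification: str  # "public", "internal", or "restricted"

  @dataclass
  class AccessRequest:
      user_region: str
      clearance: str

  # Hypothetical sovereignty rule: restricted data never leaves its home region.
  def is_allowed(ds: Dataset, req: AccessRequest) -> bool:
      if ds.classification == "restricted" and req.user_region != ds.region:
          return False
      levels = ["public", "internal", "restricted"]
      return levels.index(req.clearance) >= levels.index(ds.classification)

  print(is_allowed(Dataset("telemetry", "eu-west", "restricted"),
                   AccessRequest(user_region="us-east", clearance="restricted")))  # False

The point of checks like this is that they run the same way everywhere, so moving data between edge, data center, and cloud never changes who can see it.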

Plotting the path to data modernization

The journey starts with crafting a comprehensive data strategy and discovering data assets, aligning both with core business needs and KPIs. This exercise ensures that data is managed and treated as an asset while also providing a common set of goals. These objectives can then be leveraged across initiatives to ensure that data is used both effectively and efficiently. A data strategy should also establish governance policies and common methods for owning, managing, manipulating, and sharing data across the enterprise in a consistent, repeatable manner.
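
In practice, treating data as an asset usually means cataloging it with an accountable owner and a link to the business KPI it supports. The schema below is a hypothetical sketch of such a catalog entry, not a prescribed standard:

  from dataclasses import dataclass, field

  @dataclass
  class DataAsset:
      name: str
      owner: str        # accountable business owner
      kpi: str          # the business KPI this asset supports
      retention_days: int = 365
      tags: list[str] = field(default_factory=list)

  catalog: dict[str, DataAsset] = {}

  def register(asset: DataAsset) -> None:
      # One consistent, repeatable way to take ownership of a data asset.
      if asset.name in catalog:
          raise ValueError(f"{asset.name} is already registered")
      catalog[asset.name] = asset

  register(DataAsset("orders", owner="sales-ops", kpi="order-cycle-time",
                     tags=["pii", "transactional"]))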

With data and transformation strategy in sync, the next step involves identifying all facets of the data landscape, including the ways in which analysts, developers, and data scientists can work with a comprehensive and consistent collection of data. Here, organizations need to hammer out mechanisms for adding new data sources in a manner that won’t overwhelm IT and operations teams.

IT leaders should consider several factors at this stage (a brief ingestion sketch follows the list):

  • Data transportation and ingestion speeds, including those from heterogeneous and dispersed data sources
  • Data unification and virtualization requirements
  • Data management and governance along with data security
  • How data will be consumed through a variety of processes, channels, and instruments
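
The first two factors, ingestion speed and unification, can be pictured with a small Python sketch. The adapters and the common record shape are assumptions made for illustration; a real pipeline would use a streaming framework rather than in-memory lists:

  import json
  import time
  from typing import Callable, Iterable, Iterator

  # Hypothetical adapters that normalize heterogeneous sources
  # into one common record shape (unification in miniature).
  def from_csv_line(line: str) -> dict:
      ts, sensor, value = line.strip().split(",")
      return {"ts": ts, "source": sensor, "value": float(value)}

  def from_json_line(line: str) -> dict:
      raw = json.loads(line)
      return {"ts": raw["timestamp"], "source": raw["device"], "value": raw["reading"]}

  def ingest(lines: Iterable[str], parse: Callable[[str], dict]) -> Iterator[dict]:
      # Track throughput so ingestion speed can be measured per source.
      start, count = time.monotonic(), 0
      for line in lines:
          count += 1
          yield parse(line)
      print(f"ingested {count} records in {time.monotonic() - start:.3f}s")

  sensors_csv = ["t0,temp-1,21.5"]
  edge_feed = ['{"timestamp": "t1", "device": "cam-7", "reading": 0.82}']
  for record in ingest(sensors_csv, from_csv_line):
      print(record)
  for record in ingest(edge_feed, from_json_line):
      print(record)

Because every source lands in the same record shape, adding a new source means writing one adapter, which keeps the burden on IT and operations teams bounded.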

The next step is to unify, scale, and enable sharing of data with an edge-to-cloud platform. Creating a culture of trust is critical for sharing and unifying data. It’s also essential to understand the genesis of data silos and how data can be shared across them to avoid costly data duplication.
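
One way to picture sharing across silos without duplication is content addressing: each silo keeps a reference to a single stored copy. This is a simplified sketch of the idea, not a description of any particular product:

  import hashlib

  shared_store: dict[str, bytes] = {}          # one physical copy per fingerprint
  silo_refs: dict[str, list[str]] = {"marketing": [], "finance": []}

  def ingest_into_silo(silo: str, payload: bytes) -> None:
      key = hashlib.sha256(payload).hexdigest()
      if key not in shared_store:              # store the bytes only once
          shared_store[key] = payload
      silo_refs[silo].append(key)              # silos hold references, not copies

  ingest_into_silo("marketing", b"customer-list-v3")
  ingest_into_silo("finance", b"customer-list-v3")
  print(len(shared_store))                     # 1 -> no duplicate copies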

“All data must be available through a single, consistent global namespace, whether it resides in on-premises IT or in a public cloud or is distributed at the edge,” says Sahin-Bahceci. Moreover, the democratization of data and analytics should be enabled by a platform that supports a broad variety of protocols, data formats, and open APIs. “Secure authentication, authorization, and access control must be enacted in a consistent manner for different user types, no matter where data resides or on what system it’s running,” she adds.
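
A global namespace of this kind can be imagined as a resolver that maps one logical path to wherever the bytes actually live, applying the same authorization check every time. The paths, backends, and authorization rule below are invented for illustration; this is not the HPE GreenLake API:

  # Hypothetical mapping from logical paths to physical locations.
  LOCATIONS = {
      "/sales/orders":      ("on-prem",      "nfs://dc1/warehouse/orders"),
      "/factory/telemetry": ("edge",         "s3://edge-site-3/telemetry"),
      "/ml/features":       ("public-cloud", "s3://acme-ml/features"),
  }

  def authorized(user: str, path: str) -> bool:
      # Placeholder rule; a real system would consult a central policy store.
      return user == "analyst" or not path.startswith("/sales")

  def resolve(logical_path: str, user: str) -> str:
      tier, physical = LOCATIONS[logical_path]
      # The same check runs no matter where the data resides.
      if not authorized(user, logical_path):
          raise PermissionError(f"{user} may not read {logical_path}")
      print(f"{logical_path} -> {physical} ({tier})")
      return physical

  resolve("/factory/telemetry", "analyst")

Callers see only the logical path, so data can move between tiers without breaking the analysts, developers, and data scientists who depend on it.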

Bringing the cloud experience to data

Although the public cloud is valuable, a sizable number of applications and data cannot — or should not — migrate, due to issues concerning data gravity, latency, IP protection, performance, and even application entanglement. At the same time, although storage and computing capacity seems infinite in the cloud, costs can quickly skyrocket as organizations scale processing and analysis of key data. An alternative approach is to bring the cloud experience to data, which ensures the same speed, agility, and as-a-service benefits popularized by public cloud platforms.

The HPE GreenLake edge-to-cloud platform delivers this experience to apps and data wherever they reside, whether that’s at the edge, in a colocation facility, or in a data center. HPE GreenLake brings cloudlike services for data management, data storage, data analytics, MLOps, and high-performance compute/AI, all within the context of a managed, pay-per-use, scalable, self-service experience.

The GreenLake edge-to-cloud approach to data-first modernization is already helping many organizations shift to next-generation operating models. Examples include a global financial powerhouse that can now streamline transaction processing with a unified view of data, and a manufacturing company that now gains efficiency and quality improvements by leveraging real-time analytics at the edge.

“Those organizations that continue to transform digitally are making data the life force of their organization,” Sahin-Bahceci says. “These organizations tap data to inform business transformation, to change their business goals, and to shape their business vision. That’s very different from the traditional enterprise that uses data as an enabler.”

To learn more about the HPE GreenLake platform, visit hpe.com/greenlake.
