Despite the billions of dollars and countless work hours being funneled into digital transformation, many companies are spinning their wheels, buried in petabytes of unusable data. As a result, they are burdened with initiatives that do little to advance the strategic agenda or deliver business value.
Companies are investing three times as much in digital transformation as in other IT software, according to Forrester Research, yet progress is slow. Many say they are still in the early stages of their transformation journey, and 70% admit they are underperforming – an indication that efforts have not met expectations or led to sustainable change, Boston Consulting Group (BCG) research found.
The disconnect reveals an inconvenient truth: No matter how innovative or well-designed a digital roadmap is, transformation is incomplete – and potentially unsuccessful at the enterprise level – without a data-first modernization strategy and a coordination plan that spans the data center, edge, and cloud.
“Modernizing data is about starting the journey from business goals and thinking about your data first: what data you need, where you use it, and how you can get the most from it,” explains Hande Sahin-Bahceci, Senior Marketing Leader for Data and Artificial Intelligence, HPE GreenLake Marketing. “Yet data engineering is usually an afterthought for most organizations. They start with updating the infrastructure, then the applications, and then the data. What is required is to build a data strategy and a plan that modernizes the data first – at the beginning, not at the end.”
Pillars of data-first modernization
Why is modernizing data first so integral to digital transformation and, ultimately, business success? Without a proper modernization strategy, data remains messy and scattered across legacy systems and multiple silos. This multigenerational sprawl of information technology creates significant obstacles to harnessing the true value of data.
For example, new data is being generated at the edge that is essential for driving insights, yet only a small portion of it is effectively leveraged as part of strategic data and analytics efforts. At the same time, large volumes of data and systems are being pushed into the cloud, even though the cloud is not suitable for all workloads and applications and can lead to unexpected costs, among other problems.
“The lack of a modern data management system can lead to fragmented, isolated operations that are not managed in a coordinated manner,” asserts Sahin-Bahceci.
To move toward modernizing data first, HPE advises companies to adopt these basic principles:
- Data is a primary asset and a strategic part of business goals, and therefore should be controlled by the enterprise, not the public cloud vendor.
- Essential data is everywhere and must be accessed at digital speed from its original location, whether that is at the edge, in data centers, or in applications that have moved to the cloud.
- Data has rights and sovereignty, and requires governance policies for security, privacy, and access.
- The public cloud is not the right platform for every workload, especially for industries or applications that operate under strict regulatory requirements.
- A unified view of data from the edge to the cloud and a single operating model will lead to better performance and a superior user experience.
Charting the data modernization path
The journey begins with crafting a comprehensive data strategy and discovering data assets, aligning everything with core business needs and KPIs. This exercise ensures that data is managed and treated as an asset while providing a common set of goals, which can then be leveraged across initiatives to ensure that data is used effectively and efficiently. The data strategy must also establish governance policies and common methods for acquiring, managing, processing, and sharing data across the organization in a consistent and repeatable manner.
With the data and transformation strategy in sync, the next step involves identifying all aspects of the data landscape, including the ways in which analysts, developers, and data scientists can work with a comprehensive and consistent set of data. Here, organizations need to put in place mechanisms to add new data sources in a way that doesn’t overburden IT and operations teams.
IT leaders should consider several factors at this point:
- Data transfer and ingestion speeds, including those from heterogeneous and dispersed data sources
- Data standardization and virtualization requirements
- Data management and governance combined with data security
- How data will be consumed by a variety of processes, channels, and tools
The next step is to standardize, scale, and share data on an edge-to-cloud platform. Creating a culture of trust is critical to data sharing and standardization. It is also essential to understand where data silos originate and how data can be shared across them to avoid costly duplication.
“All data must be available through a single, consistent global namespace, whether it resides in on-premises IT, in the public cloud, or distributed at the edge,” says Sahin-Bahceci. Furthermore, the democratization of data and analytics must be enabled through a platform that supports a wide range of protocols, data formats, and open APIs. “Secure authentication, authorization, and access control should be enforced consistently for different types of users, regardless of where the data is located or what system it runs on,” she adds.
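The global-namespace idea can be made concrete with a small sketch. The snippet below is purely illustrative and does not show HPE GreenLake's own interfaces; it uses the open-source fsspec library as a stand-in, and the file paths and bucket names are hypothetical. The point it demonstrates is that one access pattern can cover data at the edge, on premises, and in the cloud, with only the URI changing.

```python
# Illustrative sketch only: approximating a "single, consistent global
# namespace" with the open-source fsspec library. This is NOT HPE GreenLake's
# API; the paths and bucket names below are hypothetical.
# Requires: pip install fsspec s3fs
import fsspec

# The same read logic works wherever the data lives; only the URI scheme
# changes, not the application code.
locations = [
    "file:///mnt/edge-gateway/telemetry/latest.csv",  # edge or on-prem file
    "s3://analytics-archive/telemetry/latest.csv",    # public cloud object store
]

for url in locations:
    # One consistent open() call, regardless of the backing system
    with fsspec.open(url, mode="rt") as f:
        print(f"{url} -> first line: {f.readline().strip()}")
```

In this sketch, credentials are still handled per backend; a true global namespace would also layer the consistent authentication, authorization, and access control described above on top of the unified naming scheme.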
Bringing the cloud experience to data
While the public cloud is valuable, a large number of applications and datasets cannot – or should not – migrate, due to issues with data gravity, latency, IP protection, performance, and even application entanglement. At the same time, although storage and compute capacity seem unlimited in the cloud, costs can rise very quickly as organizations scale up their data processing and analytics. An alternative approach is to bring the cloud experience to the data, delivering the same speed, agility, and as-a-service benefits popularized by public cloud platforms.
The HPE GreenLake edge-to-cloud platform delivers this experience for applications and data wherever they live, whether at the edge, in a colocation facility, or in a data center. HPE GreenLake offers cloud-like services for data management, data warehousing, data analytics, MLOps, and HPC/AI, all within a self-service, pay-per-use, scalable experience.
HPE GreenLake’s edge-to-cloud, data-first approach is helping many organizations shift to next-generation operating models. Examples include a global financial powerhouse that can now streamline transaction processing with a unified view of its data, and a manufacturer that is improving efficiency and quality by leveraging real-time analytics at the edge.
“Those organizations that continue to digitally transform are making data the lifeblood of their organization,” says Sahin-Bahceci. “These organizations leverage data to inform business transformation, to change their business goals, and to shape their business vision. This is very different from a traditional organization that uses data merely as an enabler.”
To learn more about the HPE GreenLake platform, visit hpe.com/greenlake.