7 enterprise data strategy trends

Every enterprise needs a data strategy that clearly defines the technologies, processes, people and rules needed to safely and securely manage its information assets and practices.

As with almost everything in IT, a data strategy must evolve over time to keep pace with emerging technologies, customers, markets, business needs and practices, regulations, and a nearly endless number of other priorities.

Here’s a quick overview of seven key trends that will reshape your organization’s current data strategy in the days and months ahead.

1. Real-time data becomes real – as does the complexity of dealing with it

Lan Guan, global data and AI lead at business consulting firm Accenture, recommends that CIOs prioritize their investment strategy to deal with the growing volume of complex, real-time data pouring into the enterprise.

Guan believes that there can be no compromise on the ability to harness data in today’s business environment. “The unique insights derived from an organization’s data create a competitive advantage that is inherent to their business and not easily copied by competitors,” she observes. “Failing to meet these needs means falling behind and missing out on the many opportunities made possible by advances in data analytics.”

The next step in every organization’s data strategy, says Guan, should be investing in artificial intelligence and machine learning to unlock more value from their data. “Initiatives such as automated predictive maintenance on machinery or workforce optimization through operational data are just a few of the many opportunities enabled by pairing a successful data strategy with the effective deployment of artificial intelligence.”
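
As an illustration of the predictive-maintenance use case Guan mentions, here is a minimal sketch that flags anomalous machine sensor readings with scikit-learn’s IsolationForest; the sensor values, features, and thresholds are synthetic and purely illustrative.

```python
# Minimal predictive-maintenance sketch: flag unusual machine sensor readings
# so maintenance can be scheduled before a failure. All data here is synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated operational data: vibration (mm/s) and temperature (deg C) per hour.
normal = rng.normal(loc=[2.0, 60.0], scale=[0.3, 2.0], size=(500, 2))
faulty = rng.normal(loc=[4.5, 75.0], scale=[0.5, 3.0], size=(10, 2))
readings = np.vstack([normal, faulty])

# Train an unsupervised anomaly detector on the operational readings.
model = IsolationForest(contamination=0.02, random_state=0)
labels = model.fit_predict(readings)  # -1 = anomaly, 1 = normal

anomalies = np.where(labels == -1)[0]
print(f"{len(anomalies)} readings flagged for maintenance review")
```

In a production setting the model would be trained on historical telemetry and scored against live readings, but the shape of the workflow is the same.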

2. Demand for in-house data access takes center stage

CIOs and data leaders are facing increasing demands for internal data access. “Data is no longer only used by analysts and data scientists,” says Dinesh Nirmal, general manager of data, AI and automation at IBM. “Everyone in their organization – from sales to marketing to HR to operations – needs access to data to make better decisions.”

The downside is that providing easy access to timely, relevant data is becoming increasingly challenging. “Despite massive investments, the data landscape within enterprises is still highly complex, spanning multiple clouds, applications, locations, environments and vendors,” says Nirmal.

As a result, a growing number of IT leaders are looking for data strategies that will allow them to manage massive amounts of disparate data housed in silos without introducing new risks and compliance challenges. “While the need for data access is growing internally, [CIOs] will also have to keep pace with rapidly evolving regulatory and compliance measures, such as the EU Artificial Intelligence Act and the newly released White House Blueprint for an AI Bill of Rights,” says Nirmal.
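
Nirmal’s point about pairing broad internal access with compliance can be made concrete with a small, purely hypothetical sketch: the roles, datasets, and rules below are invented, and real governance would live in a policy engine rather than application code.

```python
# Hypothetical sketch of a policy-aware access layer: every internal request
# for a dataset is checked against simple governance rules before data is served.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    contains_pii: bool
    region: str  # where the data is stored, e.g. "eu" or "us"

# Illustrative policy: PII datasets are limited to approved roles, and
# EU-resident data is only served to requests processed in the EU.
APPROVED_PII_ROLES = {"hr_analyst", "compliance"}

def can_access(dataset: Dataset, role: str, processing_region: str) -> bool:
    if dataset.contains_pii and role not in APPROVED_PII_ROLES:
        return False
    if dataset.region == "eu" and processing_region != "eu":
        return False
    return True

payroll = Dataset(name="payroll_2024", contains_pii=True, region="eu")
print(can_access(payroll, role="marketing", processing_region="us"))   # False
print(can_access(payroll, role="hr_analyst", processing_region="eu"))  # True
```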

3. External data sharing becomes strategic

Mike Bechtel, chief futurist at business advisory firm Deloitte Consulting, says that data sharing between trading partners is becoming much easier and more cooperative. “With the meaningful adoption of cloud-native data warehouses and adjacent data insights platforms, we’re starting to see interesting use cases where enterprises can link their data with their counterparties’ data to create entirely new, salable digital assets,” he says.

Bechtel sees data liquidity as the driver of the next big change in external data sharing. “For years, people in boardrooms and server rooms alike have talked about the value of all this data, but the geeks among us know that the ability to monetize that data requires it to be more liquid,” he says. “Organizations may have petabytes of interesting data, but if it’s calcified in old on-premises warehouses, you won’t be able to do much with it.”

4. Data fabric and data mesh adoption rises

Data fabric and data mesh technologies can help organizations squeeze maximum value out of all the elements in their technology stack and hierarchy in a practical and useful way. “Many enterprises have legacy solutions, old and new technologies, and inherited policies, processes, procedures, and approaches, but struggle to combine them all within a new architecture that enables greater agility and speed,” says Paola Saiben, principal advisor at IT advisory firm Consequential.

A data mesh enables an organization to get the information and insights it needs from its environment in its current state, without having to fundamentally change it or disrupt it in a big way. “In this way, CIOs can take advantage of [the tools] they already have, but add a layer on top that allows them to access all those assets in a modern and fast way,” explains Saiben.

Data fabric is an architecture that enables end-to-end integration of various data pipelines and cloud environments through the use of intelligent and automated systems. The fabric, especially at the active metadata level, is important, Saiben notes. “Interoperability agents will make it look like everything is incredibly well connected and intentionally designed that way,” she says. “That way, you’re able to get all the insights you need while avoiding overhauling your environment.”
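
As a rough illustration of the “layer on top” Saiben describes, the hypothetical sketch below uses a small metadata catalog to route a logical dataset name to whichever silo holds it, without moving the underlying data; the catalog entries and connector stubs are invented for illustration, not any vendor’s actual fabric product.

```python
# Hypothetical data-fabric sketch: a metadata catalog maps logical dataset
# names to the systems that hold them, and a thin access layer dispatches
# reads to the right connector so callers never deal with silo details.
from typing import Callable, Dict, List

# Connector stubs standing in for real warehouse / lake / SaaS clients.
def read_from_warehouse(table: str) -> List[dict]:
    return [{"source": "warehouse", "table": table}]

def read_from_lake(path: str) -> List[dict]:
    return [{"source": "lake", "path": path}]

CONNECTORS: Dict[str, Callable[[str], List[dict]]] = {
    "warehouse": read_from_warehouse,
    "lake": read_from_lake,
}

# Active metadata: where each logical dataset lives and how to address it.
CATALOG = {
    "customer_orders": {"system": "warehouse", "locator": "sales.orders"},
    "clickstream":     {"system": "lake",      "locator": "s3://events/clicks/"},
}

def read_dataset(logical_name: str) -> List[dict]:
    entry = CATALOG[logical_name]
    return CONNECTORS[entry["system"]](entry["locator"])

print(read_dataset("customer_orders"))
print(read_dataset("clickstream"))
```

The design choice to keep the catalog as metadata only is what lets existing silos stay where they are while consumers get a single, uniform entry point.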

5. Data observability becomes business-critical

Data observability extends the concept of data quality by closely monitoring data as it flows in and out of applications. This approach provides business-critical insights into an application’s data, schema, metrics, and lineage, says Andy Petrella, founder of data observability provider Kensu and author of Fundamentals of Data Observability (O’Reilly, 2022).

A key data observability feature is that it operates on metadata, providing a secure way to monitor data directly within applications: sensitive data stays within the data pipeline, while only its associated metadata is collected by an observability agent, says Petrella. “Thanks to this insight, data teams can rapidly troubleshoot data issues and prevent them from propagating, reduce maintenance costs, restore trust in data, and increase value creation from data,” he says.
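
Here is a minimal sketch of the metadata-only idea Petrella describes, not Kensu’s actual implementation: a decorator records schema, row counts, and null rates for each pipeline step, so monitoring never copies the underlying records. The pipeline step and data are hypothetical.

```python
# Hypothetical observability sketch: wrap pipeline steps so only metadata
# (schema, row count, null rates) is captured, never the sensitive rows.
import functools
import pandas as pd

OBSERVATIONS = []  # a real agent would ship these to a monitoring backend

def observe(step):
    @functools.wraps(step)
    def wrapper(df: pd.DataFrame, *args, **kwargs) -> pd.DataFrame:
        out = step(df, *args, **kwargs)
        OBSERVATIONS.append({
            "step": step.__name__,
            "rows": len(out),
            "schema": {col: str(dtype) for col, dtype in out.dtypes.items()},
            "null_rate": out.isna().mean().round(3).to_dict(),
        })
        return out
    return wrapper

@observe
def drop_incomplete_orders(df: pd.DataFrame) -> pd.DataFrame:
    return df.dropna(subset=["amount"])

orders = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, None, 25.0]})
clean = drop_incomplete_orders(orders)
print(OBSERVATIONS)
```

A sudden drop in row count or a shift in schema between runs is exactly the kind of signal a data team would alert on before bad data propagates downstream.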

Petrella says that data observability creates an entirely new solution category. “CIOs should first understand the different approaches to observing data and how it differs from quality management,” he notes. They should then identify the stakeholders on their data team who will be responsible for driving adoption of the observability technology.

The inability to improve data quality will hinder the productivity of data teams while reducing data trust throughout the entire data chain. “In the long term, this can push data activities into the background, affecting the organization’s competitiveness and ultimately its revenue,” Petrella says.

IT leaders are contending with growing complexity and an unfathomable volume of data spread across the technology stack, says Gregg Ostrowski, executive CTO at Cisco AppDynamics. “They are having to integrate a massively expanding set of cloud-native services with existing on-premises technologies,” he says. “From a data strategy perspective, the biggest trend is the need for IT teams to gain clear visibility and insight into their applications regardless of domain, whether in on-premises, cloud or hybrid environments.”

6. ‘Data as a product’ begins to deliver business value

Data as a product is a concept that aims to solve real-world business problems through the use of blended data drawn from many different sources. “This capture-and-analysis approach provides a new level of intelligence for companies that can result in real, bottom-line impact,” says Irwin Bishop, Jr., CIO of Black & Veatch, a global engineering, procurement, consulting, and construction company.

Bishop says that understanding how to harvest and apply data can be a game-changer in many ways. Black & Veatch is working with customers to develop data product roadmaps and establish relevant KPIs, he reports. “One example is how we use data within the water industry to better manage the physical health of critical infrastructure,” he says. “The data gives our water customers the ability to predict when a piece of equipment will need to be replaced and what type of environmental impact it may face based on past performance data.” Bishop says this approach gives participating customers reliable service and more control over their budgets.
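
To sketch the kind of data product Bishop describes, the hypothetical example below fits a simple linear degradation trend to past condition readings for a piece of equipment and estimates when it will cross a replacement threshold; the readings, threshold, and asset are invented for illustration and are not Black & Veatch data or methodology.

```python
# Hypothetical data-product sketch: estimate when an asset will need
# replacement by extrapolating a linear trend in its condition readings.
import numpy as np

# Months in service vs. a made-up condition score (100 = new, 0 = failed).
months = np.array([0, 6, 12, 18, 24, 30, 36])
condition = np.array([100, 94, 89, 82, 77, 70, 64])

REPLACEMENT_THRESHOLD = 40  # replace before the score drops this low

# Fit a straight line: condition ~ slope * months + intercept.
slope, intercept = np.polyfit(months, condition, deg=1)

# Solve for the month at which the fitted trend hits the threshold.
months_to_threshold = (REPLACEMENT_THRESHOLD - intercept) / slope
print(f"Estimated replacement due around month {months_to_threshold:.0f}")
```

Packaging an estimate like this with its KPIs, refresh schedule, and owners is what turns a one-off analysis into a maintained data product.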

7. Cross-functional data product teams arise

As organizations begin to treat data as a product, it is becoming necessary to establish product teams that span IT, business, and data science, says Tracy Gusher, data and analytics leader at business advisory firm EY Americas.

Data collection and management shouldn’t be classified as just another project, Gusher notes. “Data should be viewed as a fully functional business area, no different from HR or finance,” she says. “The move to a data product approach means that your data will be treated just like a physical product – developed, marketed, quality controlled, enhanced and with clearly tracked value.”


