When you work in IT, you see firsthand how the increasing business appetite for data stresses existing systems — and even in-flight digital transformations. You’re tasked with managing a sprawling data ecosystem that likely requires costly maintenance, while your analysts can’t find the data they need or, worse, are building their findings on poor-quality data.
If that’s the case, you aren’t alone. Only 25% of IT leaders say their technology maximizes employee productivity. They rank disconnected data and systems among their biggest challenges alongside budget constraints and competing priorities.
It might be time for a new approach.
Data fabrics are gaining momentum as the data management design for today’s challenging data ecosystems. At their most basic level, data fabrics leverage artificial intelligence and machine learning to unify and securely manage disparate data sources without migrating them to a centralized location. Data fabric governance assumes a federated environment, so data fabrics scale by connecting to new data sources as they emerge.
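The "query in place" idea behind federation can be sketched minimally: rather than copying every source into one warehouse, a federation layer queries each source where it lives and combines the results. This toy illustration (not Tableau’s implementation; the table and source names are invented) uses two independent SQLite databases as stand-ins for, say, a CRM and a warehouse:

```python
import sqlite3

# Two independent "sources" — stand-ins for a CRM and a data warehouse.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE accounts (id INTEGER, name TEXT)")
crm.executemany("INSERT INTO accounts VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE orders (account_id INTEGER, amount REAL)")
warehouse.executemany("INSERT INTO orders VALUES (?, ?)",
                      [(1, 250.0), (1, 100.0), (2, 75.0)])

def federated_revenue():
    """Query each source in place, then combine — no data is migrated."""
    names = dict(crm.execute("SELECT id, name FROM accounts"))
    totals = warehouse.execute(
        "SELECT account_id, SUM(amount) FROM orders GROUP BY account_id")
    return {names[acct]: total for acct, total in totals}

print(federated_revenue())  # {'Acme': 350.0, 'Globex': 75.0}
```

Each source keeps its own storage and security; only query results cross the boundary, which is the property that lets a fabric add new sources without a migration project.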
At Tableau, we believe data is most valuable when everyone in an organization can use it to make better, data-driven decisions. Our unique approach to data fabric converges a data management layer within our analytics platform — which data consumers already use and love — so you can govern and secure a federated environment and enable broad self-service without disrupting the flow of business.
Let’s take a look at the problems data fabrics are designed to solve and how Tableau’s data fabric capabilities address them.
Address technical debt
Many organizations find that making tools self-service can have unforeseen consequences. For example, data can be used in unintended ways, duplicated repeatedly, and stored out of sight, injecting chaos into data management.
Over the years, we’ve tackled the challenge of data and content reuse. Today, the Tableau platform offers capabilities that favor certified, trusted sources and measures, including:
Certified data sources carefully chosen by site administrators and project leaders
Recommended data sources, personally certified and/or automatically selected based on organizational usage patterns
Recommended database tables that are used frequently in data sources and workbooks published to your Tableau server
With our approach, you can take advantage of existing, relevant content and avoid duplicate work, leaving more time for your analysis. If you are tasked with enforcing data management, you have access to metrics on what data is being used, by whom, and how often, making data source cleanup easier.
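The cleanup workflow those usage metrics enable can be sketched as follows. This is an illustrative example, not Tableau’s implementation — the access log, source names, and staleness threshold are all hypothetical:

```python
from datetime import date, timedelta

# Hypothetical access log: (data_source, user, last_access_day).
access_log = [
    ("sales_q3", "ana", date(2023, 9, 1)),
    ("sales_q3", "ben", date(2023, 9, 2)),
    ("legacy_extract", "ana", date(2022, 1, 15)),
]

def cleanup_candidates(log, today, stale_after_days=180):
    """Flag data sources nobody has touched within the staleness window."""
    last_seen = {}
    for source, _user, day in log:
        last_seen[source] = max(last_seen.get(source, day), day)
    cutoff = today - timedelta(days=stale_after_days)
    return sorted(s for s, seen in last_seen.items() if seen < cutoff)

print(cleanup_candidates(access_log, today=date(2023, 9, 10)))
# ['legacy_extract']
```

The same per-source, per-user counts can drive the opposite decision too — promoting heavily used sources as recommended or certified rather than retiring them.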
At Tableau, we’re leading the industry with capabilities to connect to a wide variety of data, and we have made it a priority for the years to come. Here’s a look at what we provide today.
Connector library for accessing databases and applications outside of Tableau, regardless of the data source type (data warehouse, CRM, etc.) or where it is stored (cloud, hybrid, or on-premises)
Automatic ingestion of all data assets in your Tableau environment into one central list. This eliminates the need to set up an index schedule or configure connectivity
Data refresh failure detection that flags the issue to data users and downstream consumers for mitigation
Data modeling for every data source created in Tableau that defines how to query data in connected database tables, including a logical (semantic) layer and a physical layer
Secure data access and sharing through Virtual Connections (VConns), which enable data owners to share access to groups of tables that can be used across different workbooks, data sources, and prep flows. VConns also provide securely managed service accounts, agile physical database management, reduced data proliferation, and centralized row-level security
User role-based security that defines the maximum level of access a specific user can have on the site by predefined role and determines who can publish, interact with, or only view published content
Metadata management that supports a native analytics catalog with full view of your data assets and sources and provides metadata in context for fast data discovery
Data lineage and impact analysis of downstream impacts for planned changes like database migrations, field deprecation, and simpler changes that can impact assets in your environment, like the addition of a new column
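The impact analysis described above boils down to a reachability walk over a lineage graph: start at the asset you plan to change and collect everything downstream of it. A minimal sketch, assuming a hypothetical lineage graph whose edges point from each asset to its dependents (the asset names here are invented):

```python
# Hypothetical lineage graph: table -> data source -> workbooks.
lineage = {
    "db.orders": ["ds_sales"],
    "db.orders.region": ["ds_sales"],
    "ds_sales": ["wb_exec_dashboard", "wb_regional_report"],
}

def downstream_impact(asset, graph):
    """Walk the lineage graph to collect every asset affected by a change."""
    impacted, stack = set(), [asset]
    while stack:
        for dep in graph.get(stack.pop(), []):
            if dep not in impacted:
                impacted.add(dep)
                stack.append(dep)
    return sorted(impacted)

print(downstream_impact("db.orders", lineage))
# ['ds_sales', 'wb_exec_dashboard', 'wb_regional_report']
```

Running the same walk from a single column (here, "db.orders.region") scopes the blast radius of a narrower change, such as deprecating one field instead of migrating the whole table.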
Collaborate and drive adoption
Data fabrics do more than drive value with modern data management. They have the potential to foster cross-functional collaboration and drive business-side support and adoption.
In the past, data analysts and IT departments worked independently from one another, effectively decoupling the business’s data needs from IT’s governance and security rulemaking. Data fabric designs work best by turning that model on its head, with the business and IT collaborating on data initiatives, co-authoring data standards with the shared understanding that data self-service can only exist with data governance.
This means data analysts and IT finally get to move their projects past the enterprise data warehouse or data lake. With Tableau, business users keep working where they already work, which makes scaling realistic. Upleveling data management and governance projects to a federated zone for analytics can enable significant adoption: one Tableau customer onboarded thousands of users in less than a year.
You can see why IT leaders embrace data fabric designs. Industry-wide, data fabrics are still in their infancy, but their promise is significant: Gartner predicts that by 2024, data fabric deployments will quadruple efficiency in data utilization while cutting human-driven data management tasks in half, leaving data professionals more time for data strategy, process optimization, and innovation.