Data-Driven Decision Making (DDDM) is exactly what it sounds like. DDDM uses facts, metrics, and data to guide strategic business decisions that align with goals, objectives, and priorities. If organizations can realize the full value of their data, everyone will be empowered to make better decisions.
Part of why IT analytics and DDDM exist is that CEOs often make decisions based on intuition. Sometimes, and maybe often, depending on the executive and the context, their hunch is right.
For example, Fred Smith had great insight into transportation and, despite widespread skepticism, created Federal Express. Michael Eisner heard a brief rundown of an offbeat game show and, on intuition alone, committed millions to developing Who Wants to Be a Millionaire?
But gut instinct is not how we want to run projects day after day. We don’t want to make decisions that way. Data is a much more reliable basis for decision making.
Accurate data is current and timely data.
In IT, if the data is even a week old, let alone three months old, you are better off licking a finger and holding it to the wind than making a decision based on it. Ninety-day-old data doesn’t tell you where your apps, your workloads, or your customers are today. It tells you nothing about your current exposure to cyber attacks.
A huge amount of IT analytics runs on stale data that, by the time it is used, has effectively become false data. You can perform the smartest analyses on it and still arrive at a result no better than the false data you started with.
For example, imagine a hospital that has not updated its Configuration Management Database (CMDB) in 90 days. That is like flying a plane on instrument data that is 90 days old. If anything, the analogy understates the problem.
You don’t have to worry about a new mountain or skyscraper emerging every two weeks. But in the IT realm, a new mountain can emerge in a matter of hours or days.
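To make the point concrete, here is a minimal sketch, in Python, of the kind of freshness check a team might run against its CMDB. The CSV export format, the field names, and the 90-day threshold are illustrative assumptions, not any particular product’s schema or API.

    # Hypothetical sketch: flag CMDB records that have not been updated recently.
    # Assumes a CSV export with "ci_name" and "last_updated" (ISO 8601) columns.
    import csv
    from datetime import datetime, timedelta, timezone

    STALE_AFTER = timedelta(days=90)  # illustrative threshold from the example above

    def stale_records(cmdb_csv_path):
        now = datetime.now(timezone.utc)
        with open(cmdb_csv_path, newline="") as f:
            for row in csv.DictReader(f):
                last = datetime.fromisoformat(row["last_updated"])
                if last.tzinfo is None:
                    last = last.replace(tzinfo=timezone.utc)
                if now - last > STALE_AFTER:
                    yield row["ci_name"], (now - last).days

    for name, age_days in stale_records("cmdb_export.csv"):
        print(f"{name}: configuration item not updated in {age_days} days")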
What types of decisions are informed by accurate endpoint data?
There is a hierarchy of data-driven decisions across operations, security, and compliance, but let’s start with a straightforward one. In most organizations, the IT department updates software based on an alert from the vendor or because the help desk receives a large number of complaints.
Vendors usually alert their customers when it is time to update because of a recently discovered vulnerability, and it is often a “panic” alert. But the IT team rarely has enough staff to apply every update it should, and blindly applying them all wouldn’t be wise anyway: any change can cause instability downstream. So the decision to update or not becomes a judgment call rather than a data-driven one.
But with the right tooling, you can see, second by second, every application crash across all your enterprise apps. Having this data in real time means IT can say, “This vulnerability didn’t generate the top ten complaints this week, but we know this app is crashing and we will fix it centrally.” Knowing second by second what is crashing, what is degrading CPU performance, and what is causing blue screens lets IT make a decision that is also a business decision.
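A minimal sketch of what that looks like in practice: given a stream of crash events from endpoint telemetry, count crashes per application over the last hour and rank them, so the patching decision rests on observed impact rather than complaint volume. The event format and field names below are assumptions for illustration, not any vendor’s schema.

    # Hypothetical sketch: rank applications by crash count over a recent window.
    # Each event is assumed to be a dict like:
    #   {"app": "finance-client", "host": "LAPTOP-042", "ts": 1717430400.0}
    import time
    from collections import Counter

    WINDOW_SECONDS = 3600  # look at the last hour of telemetry

    def rank_crashing_apps(events, now=None):
        now = now or time.time()
        counts = Counter(e["app"] for e in events if now - e["ts"] <= WINDOW_SECONDS)
        return counts.most_common()

    # Example with made-up events: the top entries are the apps worth fixing
    # centrally right now, whether or not they topped this week's complaints.
    sample_events = [
        {"app": "finance-client", "host": "LAPTOP-042", "ts": time.time() - 120},
        {"app": "finance-client", "host": "LAPTOP-107", "ts": time.time() - 300},
        {"app": "vpn-agent", "host": "LAPTOP-019", "ts": time.time() - 900},
    ]
    for app, crashes in rank_crashing_apps(sample_events)[:10]:
        print(f"{app}: {crashes} crashes in the last hour")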
Vulnerability data matters in the same way, especially with a distributed workforce. You want to be able to see immediately where the weaknesses are at each endpoint. There may be a lot to fix, but when you know where the weaknesses are and how severe they are, you can make informed decisions about which ones to address first. For example, mitigations may be in place on the corporate network while users at home are in the “Wild West,” likely exposed to every attack.
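The sketch below applies the same idea to vulnerability findings: rank them by severity and raise the priority of machines sitting outside the corporate network, since those endpoints lack the network-level mitigations just described. The inventory format, the weighting factor, and the CVE identifiers are invented for illustration.

    # Hypothetical sketch: prioritize endpoint vulnerabilities by severity,
    # weighting off-network (work-from-home) machines more heavily.
    from dataclasses import dataclass

    @dataclass
    class Finding:
        endpoint: str
        cve: str               # placeholder identifiers below, for illustration only
        cvss: float            # base severity score, 0.0-10.0
        on_corp_network: bool  # False for endpoints outside corporate mitigations

    OFF_NETWORK_WEIGHT = 1.5   # illustrative multiplier, not an industry standard

    def priority(f: Finding) -> float:
        return f.cvss * (1.0 if f.on_corp_network else OFF_NETWORK_WEIGHT)

    findings = [
        Finding("LAPTOP-042", "CVE-XXXX-0001", 7.8, on_corp_network=False),
        Finding("SRV-DB-03",  "CVE-XXXX-0002", 8.1, on_corp_network=True),
        Finding("LAPTOP-019", "CVE-XXXX-0003", 5.5, on_corp_network=False),
    ]
    for f in sorted(findings, key=priority, reverse=True):
        print(f"{f.endpoint} {f.cve} priority={priority(f):.1f}")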
Just what is “fresh data” from an IT perspective?
The importance of data freshness is not uniform across IT operations. For example, if hardware is on a slow refresh cycle, such as two or three years, it doesn’t matter much if the CPU or HDD model data is a month old. But if you are making decisions about shutting down servers or migrating workloads from a physical environment to a virtual one, outdated data is very likely to cause problems. You may retire a server that a business unit depends on, or move workloads that support a critical service.
With data that is current to the second, or at least to the hour, you are in a much better position to act.
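As a concrete illustration of that point, here is a small, hypothetical guard a team might put in front of a server decommission: the decision can only proceed if the dependency data it rests on is no more than an hour old. The data source and field names are assumptions made for the sake of the example.

    # Hypothetical sketch: refuse to act on a decommission decision when the
    # dependency data behind it is stale.
    from datetime import datetime, timedelta, timezone

    MAX_DATA_AGE = timedelta(hours=1)  # "to the second, or at least to the hour"

    def safe_to_decommission(server, dependency_snapshot):
        # dependency_snapshot: {"collected_at": datetime, "dependents": [...]}
        age = datetime.now(timezone.utc) - dependency_snapshot["collected_at"]
        if age > MAX_DATA_AGE:
            raise RuntimeError(
                f"Dependency data for {server} is {age} old; refresh it before deciding."
            )
        return len(dependency_snapshot["dependents"]) == 0

    snapshot = {
        "collected_at": datetime.now(timezone.utc) - timedelta(minutes=5),
        "dependents": ["billing-service"],
    }
    print(safe_to_decommission("SRV-APP-17", snapshot))  # False: still in use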
IT Analytics and Digital Transformation.
Digital transformation is a term with no fixed meaning; it means different things to different people. Ask 10 engineers and you will get 12 different answers. One aspect of digital transformation is mobility and data centralization: separating data from services, which lets organizations switch providers of applications and services.
But if you look at where digital transformation efforts have gone south, the problem is often being mid-process without knowing which servers and endpoints connect to which business services. And this is where data timeliness intersects with digital transformation.
For example, if you can crawl through every .txt file, PDF, Word doc, and Excel spreadsheet on a laptop to find material that shouldn’t be there and should be stored centrally, it becomes much easier to switch your central storage provider.
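A rough sketch of that kind of crawl, assuming a simple local scan by file extension; in practice such a scan would run through an endpoint management agent rather than a standalone script:

    # Hypothetical sketch: find documents on a laptop that should live in
    # central storage rather than on the local disk.
    from pathlib import Path

    DOCUMENT_SUFFIXES = {".txt", ".pdf", ".doc", ".docx", ".xls", ".xlsx"}

    def local_documents(root):
        for path in Path(root).rglob("*"):
            if path.is_file() and path.suffix.lower() in DOCUMENT_SUFFIXES:
                yield path

    # Example: list everything under the user's home directory that is a
    # candidate for moving to the central store.
    for doc in local_documents(Path.home()):
        print(doc)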
Accurate data takes the risk out of that decision, and this is how fresh data increases speed. If it takes months of effort to go from a local system to a hosted system, or from one host to another, the friction and cost of moving are so high that you simply won’t move. With more agility and less friction, digital transformation becomes a buyer’s market.
Learn how to make better business decisions with accurate, complete, and up-to-date data on all your endpoints, wherever they are.