Top 3 worst data quality process errors and how to fix them [IBM]

Servers, Storage and Networking | Posted on December 17, 2012 by Deepesh Misra

In our previous post, Bust your top 3 worst data quality habits, we discussed the dirtiest data quality habits and how to bust them. All of those errors point to the need for a clean process. But even those processes are often flawed! Ack! How do you get this right? Gartner analyst Lyn Robinson puts it nicely: “A business that can’t produce useful information is like an airplane that can’t fly. How useful is that?”

In this blog post, we summarize our IBM and TechTarget article “Data Quality Process Needs All Hands on Deck” to pick out and fix the top 3 worst data quality process errors. You can break the habits, but can you maintain the process?

Process Error #1: You don’t understand the stakes

We already claimed that no company’s data is perfect, so why does it matter? According to TechTarget, “Overall, estimates of the added costs and business losses resulting from inaccurate data run into the billions of dollars per year.” You are likely spending more and more on business intelligence and on applications that use analytics to drive decision making. Do you still think it’s not that important?

Fix it: An effective data quality process starts with the business users who enter data into transaction systems. Think about how you interact with customers every day. What are the rewards when data is correct, compared to the fallout when some of it is inaccurate? You will understand the stakes when you understand that data quality directly shapes the customer experience, and with it the customer’s choice to trust you.

Process Error #2: Underestimating operations
Real (or real-time) operations now represent the central nervous system of many companies. “More enterprises are realizing that effective real-time operations give them a competitive advantage,” says Andres Perez, President of IRM Consulting Ltd. This is why good data should be valued throughout the organization.

Fix it: From the top of the company to the bottom, maintaining stellar levels of data quality should be a requirement of the job. Period.

Process Error #3: Avoiding ownership and accountability conversations
Who wants to put up their hand for that? Yet when you consider the alternative (dismissing data quality conversations altogether), avoidance creates far greater risk, from reduced productivity to lost business and, ultimately, lost profits. Granted, assuming this responsibility can complicate someone’s job if the action items are too demanding.

Fix it: You need to incorporate all business users into ownership and accountability conversations. This means creating a data governance program that gives central roles to departmental managers and workers. Show them the cause and effect between data quality and business outcomes, two things all too often treated as separate. Creating smooth processes for data quality will unveil unique challenges for every company.

Get started right now
What challenges have you faced that aren’t covered here? Do you have a unique question to ask our experts? Chances are someone else does too, so leave a comment! We will get back to you. In the meantime, take a look at our Data Integrity Healthcheck to see how we can help you get your data back on track and producing results, not bad habits!
