Digital transformation is about harnessing analytics, mobile devices, cloud services, and the Internet of Things (IoT) to deliver new customer experiences. It rewards organizations with greater visibility into key audiences, markets, and internal and external processes.

To succeed at digital transformation, organizations need accurate, complete, timely, and consistent data. Feed defective or irrelevant data into a business process or an analytics program, and the result may be a false picture of what has happened, is happening, or will happen in the real world. Such inaccurate insights can hurt customer satisfaction and increase costs while making compliance with internal policies and external mandates more complicated.

A data quality program can help ensure usable and valid data. While launching a program is an involved process, here are three important steps that most organizations can take to accelerate progress.

1. Use software to simplify data quality management

Maintaining data quality was easier when infrastructure was in one place and under the management of an in-house professional technology staff. Now, in what some call the post-ERP era, it is more common for infrastructure to be distributed and for in-house IT departments to be much smaller.

These two trends make ensuring data quality more difficult. Organizations can adapt by leveraging increasingly powerful data quality management software solutions. The most advanced packages can be integrated into existing business processes and workflows and configured to automatically cleanse, validate, and remediate critical data. They also allow business analysts and other non-IT users to create, refine, and apply rules as circumstances warrant without involving limited developer resources.
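The rule-based cleansing and validation these packages automate can be pictured in a few lines of code. This is a minimal, hypothetical sketch, not any vendor's actual product: the field names and rules are invented for illustration.

```python
import re

# Each rule pairs a field with a validity check and an optional auto-fix,
# mimicking the kind of rules a business analyst might configure in a
# data quality tool without developer help.
RULES = [
    ("email", lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "") is not None, None),
    ("country", lambda v: v in {"US", "DE", "FR"}, lambda v: (v or "").strip().upper()[:2]),
]

def cleanse(record):
    """Apply each rule: attempt the auto-fix first, then flag any field
    that still fails validation for manual remediation."""
    issues = []
    for field, check, fix in RULES:
        if not check(record.get(field)):
            if fix:
                record[field] = fix(record.get(field))
            if not check(record.get(field)):
                issues.append(field)
    return record, issues

record, issues = cleanse({"email": "jane@example.com", "country": " us "})
# The messy country code is normalized to "US"; no fields remain flagged.
```

Real solutions layer workflow, audit trails, and reference data on top of this idea, but the core pattern of declarative rules plus automated remediation is the same.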

“I think we need to embrace the changes we have in the world, especially around business, Big Data, and artificial intelligence,” said Tobias A. Bloch, Accenture vice president, sales, North America. “We need to deal with those changes and leverage them.”

For today’s typical IT department, “dealing” often entails leaning more heavily on highly automated data quality solutions, whether deployed onsite or in the cloud.

2. Reestablish critical technical competencies

As organizations continue to streamline their IT operations, they should consider maintaining (or reestablishing) technical competency in critical areas such as data migration.

“Data migration is not just moving data,” said Paul Medaille, SAP senior director. “It means ensuring you are going live with high quality data.” He advises that a digital transformation strategy is more likely to succeed when it includes deep expertise in connecting, deduplicating, and harmonizing data sources.

Similarly, it is easier for organizations to improve data quality when they understand – at a fundamental level – how data is obtained and used to drive key processes and analyses. In addition, there is a strong business case for treating data migration as a core business competency.

With new cyber-physical systems such as the IoT, organizations are monitoring an ever-increasing number of physical and digital environments. Data from these environments is being amassed at an incredible rate to support operational functions, customer services, and regulatory compliance needs. A lot of this data is ending up in on-premises and cloud-based silos. Organizations that know how to migrate this data while maintaining data quality will be in a stronger competitive position than their less tech-savvy competitors.

As with data management, there are solutions that include built-in best practices for remediating during the migration process. Implementing a good data migration strategy also provides an opportunity to onramp new governance programs.
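The deduplication and harmonization step that Medaille describes can be sketched simply. The code below is an illustrative toy, assuming two invented source silos (an ERP extract and a CRM extract) that must be merged on a normalized customer name before go-live; real migrations use far more sophisticated matching.

```python
def normalize(name):
    """Collapse case and whitespace so near-duplicate names match."""
    return " ".join(name.lower().split())

def merge_sources(*sources):
    """Keep one record per normalized name, preferring whichever record
    has the most populated fields -- a crude survivorship rule."""
    merged = {}
    for source in sources:
        for rec in source:
            key = normalize(rec["name"])
            existing = merged.get(key)
            if existing is None or sum(bool(v) for v in rec.values()) > sum(
                bool(v) for v in existing.values()
            ):
                merged[key] = rec
    return list(merged.values())

# Hypothetical silo extracts: same customer, inconsistent spelling.
erp = [{"name": "Acme  Corp", "city": "Berlin", "phone": ""}]
crm = [{"name": "acme corp", "city": "Berlin", "phone": "+49 30 1234"}]
customers = merge_sources(erp, crm)
# One harmonized record survives, carrying the more complete CRM data.
```

The survivorship rule here (keep the fuller record) is one of the built-in best practices migration tools make configurable.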

3. Promote a culture of data ownership

Data analytics used to be the purview of highly trained data scientists. That is no longer the case. As organizations have gotten better at integrating, managing, and consuming data, they have turned more of their workforce into “number crunching” business analysts.

“We expect [non-technical] employees to do some of the things that core IT professionals used to do,” Bloch says. “There is benefit and risk to this trend.”

The benefit is that more workers are relying on real-time data and analytics to discover business opportunities, reduce costs, and drive sales. That is a significant improvement over following your gut or waiting for a data scientist to work through a queue of report requests.

The risk is that employees may not fully understand the tools they are using to make important decisions. This incomplete knowledge means data anomalies may not be spotted until there is a significant financial impact on operations.
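Much of that risk can be blunted with simple automated checks that surface anomalies before they reach a decision. As an illustration (the data and threshold are invented), a standard-deviation screen over a daily metric flags the kind of outlier a non-expert analyst might otherwise miss:

```python
import statistics

def flag_anomalies(values, threshold=2.0):
    """Return indices of values more than `threshold` population
    standard deviations from the mean of the series."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

# A hypothetical week of order totals with one suspicious spike.
daily_totals = [100, 98, 103, 101, 99, 1000, 102]
suspects = flag_anomalies(daily_totals)  # flags the 1000 at index 5
```

A flagged value might be a real surge or a data-entry error; the point is that the question gets asked before the number drives a financial decision.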

“Organizations should see [data] as an ongoing competency,” said Owen Pettiford, BackOffice Associates digital transformation practice lead. “Now, it is often the novices, not the experts, who enter and maintain corporate data.”

Effective data migration, data quality, and governance programs can mitigate this risk, however. It is also important to develop a widespread culture of data ownership. Ownership helps to reinforce the value of data and the responsibility all data consumers share in following the organization’s governance strategy.

Digital transformation does not allow organizations the luxury of a time-out to validate or cleanse data before decisions are made. And as more back-office functions are pushed out to the front office, all employees need to become co-authors of every piece of data and every transaction they touch.

Want to learn more? Listen to the SAPRadio show “New Digital Realities: Does Your Data Quality Matter?” And check @SAPPartnerBuild on Twitter.


Source: Digitalistmag, Big Data technologies