The European market leader in the field of sanitary products successfully established a data-driven, end-to-end assortment management approach to support sales and distribution across multiple channels.
Geberit has won this year’s award for its exemplary approach to establishing an end-to-end assortment management concept together with a highly effective product information management system. Today, all of the Group’s product managers, plus 29 sales companies, use the new system, whose content can be translated into 34 languages. As of this year, product catalogs can be created with maximum efficiency and resilience against errors, as the content can be compiled quickly and easily using the catalog software. By establishing direct interfaces for daily data exchange with the SAP system, the timeliness of data was substantially enhanced and undesired redundancy reduced. Furthermore, workflows were integrated directly into the management application to better support processes and establish measurement capabilities. Geberit also showed a web version of its future, serverless product catalog, in which data is fed directly from the source systems (e.g. STEP, SAP). For more details, take a look at the press release. View Geberit’s submission here.
In their presentation, ABB outlined the company’s journey from a rather informal approach to master data management to professional management of master data based on an elaborate concept. In the course of ABB’s master data management program, launched in 2013, the company structured and orchestrated all of its master data management activities, which in the past had been quite diverse and locally controlled. ABB’s good practice covered two main aspects of the program. First, the company conducted a step-wise implementation of measures and activities in order to deliver quick wins, raise awareness of the issue across the entire organization, and demonstrate the benefits of effective master data management. Second, with a strong focus on people, it developed a master data management competency model, offered training to employees, issued certificates for various levels of expertise, established a network of data managers across different company functions and geographic regions, conducted an internal good practice award, and leveraged multiple communication channels (including videotaped interviews with top managers, a master data portal, a mobile app, and various promotional materials such as specially designed mouse pads). ABB’s good practice resulted in improved data quality, concrete process improvements (e.g. faster creation of reports), and increased stakeholder and user satisfaction. View ABB’s presentation here.
Bayer presented their approach to combining data quality management and metadata management to better support marketing & sales in doing business with customers. Bayer’s approach rests on two pillars. The first is “GREAT” (Governance Repository of Enterprise Architecture and Data), Bayer’s in-house developed tool for metadata management, incorporating the related roles and processes. In addition, GREAT establishes a canonical model that links the business data model to the technical data models of 29 solutions. The second pillar is “BRE” (Business Rules Engine), which supports data quality management through more than 100 million executions of business rules per day. Furthermore, BRE provides data stewards in Bayer’s national branches with KPIs and a transparent dashboard view of the quality of locally managed data. Bayer demonstrated the approach for the specific case of pharmaceutical event management, where legal restrictions and regulations require verifiable information about participants and the transfer of values to them. By combining data quality management and metadata management in an integrated approach, Bayer is now able to achieve a very high level of data quality across all national branches in a complex, global marketing & sales landscape. The integrated approach is particularly beneficial for local business units monitoring various marketing and sales processes, and for BI developers who need to design new KPIs and reports. View Bayer’s presentation here.
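To illustrate the general pattern of a business rules engine feeding data quality KPIs, the following is a minimal sketch in Python. The rule names, record fields, and KPI calculation are illustrative assumptions for the event-management example, not Bayer’s actual implementation.

```python
# Sketch of a rules-engine style data quality check: each rule is executed
# against each record, and the pass rate serves as a simple quality KPI.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]  # returns True if the record passes

# Hypothetical rules for pharmaceutical event records.
RULES = [
    Rule("participant_name_present", lambda r: bool(r.get("participant"))),
    Rule("value_transfer_non_negative", lambda r: r.get("value_transferred", 0) >= 0),
    Rule("event_date_present", lambda r: bool(r.get("event_date"))),
]

def quality_kpi(records: list[dict]) -> float:
    """Share of rule executions that pass, usable as a branch-level KPI."""
    executions = [rule.check(rec) for rec in records for rule in RULES]
    return sum(executions) / len(executions) if executions else 1.0
```

In a real deployment, a dashboard would aggregate such KPIs per branch and per rule, so local data stewards can see which checks fail most often.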
Deutsche Bank’s presentation centered on DQ Direct, a central repository and workflow tool the company uses to create a single, consistent view of data quality and to enable coordinated remediation of data quality issues. Establishing DQ Direct, together with the role of a Chief Data Officer, was Deutsche Bank’s response to the challenges of the past, when data quality issues were dealt with only at a local level, resulting in conflicting solutions, inefficiencies, and duplicate work. DQ Direct features a self-service visualization component, which allows data quality issues to be tracked, prioritized, and remediated. The stand-out quality of Deutsche Bank’s approach is its collaborative nature: data quality issues can be reported by any of the 97,000 employees through a Web portal, and the progress of each inquiry can then be tracked via the tool. DQ Direct also identifies and correlates duplicate data quality issues across the entire company, providing stronger business cases for issue remediation. With DQ Direct, Deutsche Bank was able to consolidate about 4,500 issues and remediate more than 1,000 of them (out of 7,500 on-boarded issues). DQ Direct, which is mandated by auditors, enables Deutsche Bank to meet regulatory requirements of the European Central Bank and the Basel Committee on Banking Supervision, among others. View Deutsche Bank’s presentation here.
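The idea of correlating duplicate issues to build a stronger remediation case can be sketched as follows. The issue fields and the grouping key (affected data element plus source system) are illustrative assumptions, not details of DQ Direct itself.

```python
# Sketch: group issue reports that refer to the same data element in the
# same system, summing their reported business impact into one
# consolidated issue with a stronger combined business case.
from collections import defaultdict

def correlate_issues(issues: list[dict]) -> list[dict]:
    groups = defaultdict(list)
    for issue in issues:
        groups[(issue["data_element"], issue["system"])].append(issue)
    consolidated = []
    for (element, system), members in groups.items():
        consolidated.append({
            "data_element": element,
            "system": system,
            "reports": len(members),  # how many employees reported it
            "total_impact": sum(i.get("impact", 0) for i in members),
        })
    return consolidated
```

A consolidated issue reported by many employees, with its impacts summed, is easier to prioritize than the same problem scattered across independent local tickets.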
If you are interested in participating in the CDQ Good Practice Award 2018, or would like to learn more about the CC CDQ, contact us now.