Data analytics, data management, and master data management are part of an overall imperative for public-sector organizations: they are central to organizational competitiveness and relevancy. The City of Cincinnati, Ohio, has developed a robust master data management process, and any government can use the city's achievements as a best-practices model for its own master data management strategy. This article looks at why master data management is essential, the benefits it can confer, how Cincinnati got started, the city's framework, and the lessons the city learned along the way.
AN ESSENTIAL PROCESS
Master data management integrates technologies and processes in a disciplined fashion across the organization, allowing data to flow from numerous stand-alone systems into one unified process via an enterprise-wide technology tool. This allows an organization to make the transition from a series of non-unified silos to a master data management architecture that integrates the outputs from all systems, creating an organization-wide management capability.
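The core of that transition is consolidating records that describe the same entity across stand-alone systems into a single master record. A minimal sketch of the idea follows; the field names, the two example source systems, and the simple "newest non-empty value wins" survivorship rule are illustrative assumptions, not Cincinnati's actual design:

```python
# Hypothetical sketch: consolidating records from stand-alone systems
# into a single master ("golden") record. Field names and the
# most-recently-updated survivorship rule are illustrative assumptions.

def build_master_record(records):
    """Merge records for one entity, preferring the newest non-empty value."""
    # Sort oldest-first so values from newer systems overwrite older ones.
    ordered = sorted(records, key=lambda r: r["updated"])
    master = {}
    for record in ordered:
        for field, value in record.items():
            if field == "updated":
                continue
            if value:  # skip blanks so newer empty fields do not erase older data
                master[field] = value
    return master

# Two hypothetical stand-alone systems holding the same resident.
permits = {"name": "J. Smith", "address": "100 Main St",
           "phone": "", "updated": "2022-01-15"}
utilities = {"name": "Jane Smith", "address": "100 Main Street",
             "phone": "555-0142", "updated": "2023-06-02"}

master = build_master_record([permits, utilities])
# → {"name": "Jane Smith", "address": "100 Main Street", "phone": "555-0142"}
```

Real master data management tools add matching, stewardship workflow, and auditing on top of this basic survivorship step, but the consolidation logic is the heart of the architecture described above.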
A "smart" city is data-driven. Increasing the scope, volume, quality, and utility of data can lead to improved performance, better customer service, and greater efficiency through creative problem solving, while also promoting government transparency and accountability.
Governments seek to deliver efficient and effective customer services at the lowest possible cost. One way to identify opportunities for improved delivery of services is by using a strong analytics infrastructure, which provides a framework for storing and using data. Because multiple departments work together to deliver services, a key component of this analytics infrastructure is a clear data governance policy, which outlines the expectations for data access, availability, and management to ensure cross-functional decision making, accountability, data integrity, and data availability.
Optimizing the availability of quality data can be challenging, but it's worth the effort. It forces the organization to ask itself a set of difficult questions, such as:
* What business process does this data represent?
* How is this data structured?
* What information populates these data categories?
* How is this data input?
* Is this data incomplete?
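The last question lends itself to a simple automated check. The sketch below flags incomplete records before they enter the master data process; the required fields and example values are illustrative assumptions, not an actual city schema:

```python
# Hypothetical sketch: flagging incomplete records. The required
# fields below are illustrative assumptions.

REQUIRED_FIELDS = ("parcel_id", "owner_name", "service_address")

def find_incomplete(records):
    """Return (index, missing_fields) for each record lacking required data."""
    problems = []
    for i, record in enumerate(records):
        missing = [f for f in REQUIRED_FIELDS if not record.get(f)]
        if missing:
            problems.append((i, missing))
    return problems

records = [
    {"parcel_id": "014-0002-0130", "owner_name": "A. Rivera",
     "service_address": "12 Elm St"},
    {"parcel_id": "014-0002-0131", "owner_name": "",
     "service_address": "14 Elm St"},
]

issues = find_incomplete(records)
# → [(1, ["owner_name"])]  (the second record is missing an owner name)
```

Running checks like this routinely turns the difficult questions above into measurable data-quality indicators.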
The specific objectives of an analytics infrastructure framework are to establish expectations and apply appropriate controls over:
* Data inventory and ownership.
* Data collection.
* Data use and disclosure.
* Data availability, retention, and disposal.
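One lightweight way to operationalize these expectations is a data inventory entry that records the ownership, collection, disclosure, and retention controls for each dataset. The sketch below is illustrative; all field names and example values are assumptions:

```python
# Hypothetical sketch: a data inventory entry capturing the controls
# listed above. All field names and values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class DataInventoryEntry:
    dataset: str
    owner_department: str    # data inventory and ownership
    collection_method: str   # data collection
    public_disclosure: bool  # data use and disclosure
    retention_years: int     # data retention
    disposal_method: str     # data disposal

entry = DataInventoryEntry(
    dataset="service_requests",
    owner_department="Public Services",
    collection_method="311 call center intake",
    public_disclosure=True,
    retention_years=7,
    disposal_method="secure deletion after retention period",
)
```

Maintaining one such entry per dataset gives the governance policy a concrete artifact to review, rather than leaving expectations implicit.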
Governments are stewards of the data they collect from residents, customers, and visitors, so they have a responsibility to protect that data. At the same time, the collected data is a valuable asset for managing the enterprise, and it can be critical for identifying opportunities to improve the quality, effectiveness, and efficiency of service delivery to residents and customers. An organization's ability to achieve these goals relies on a strong analytics infrastructure and direct access to enterprise data in order to perform the following core functions related to this mission:
* Automated data dashboards used to consistently monitor and evaluate performance.
* Real-time monitoring of operations.
* Self-service data discovery, which allows departments to fluidly use data for gaining insight about operations without relying on power users and database administrators for analysis.
* Predictive analytics, which help organizations to develop...