Case History: City of Austin, Texas

From 1985, when the Austin City Auditor started conducting performance audits (Practice 1a. Audit performance), through 1992, audit findings across many city agencies commonly pointed to weak performance measurement practices: measures were often irrelevant or nonexistent, data were unreliable, and management's use of the data was limited at best.  So in 1992, the City Auditor, who reports to the City Council, encouraged the Council (Practice 5a. External advocacy) to demonstrate its interest in performance measurement to city management.  That year, the City Auditor's Office drafted a resolution, which the Council passed, encouraging city agencies to measure and report performance.

Building on the Council resolution, the City Auditor began adding value in a new way: assessing the systems used by the city to measure, report, and improve performance (Practice 1b. Audit performance management systems).  The Auditor's Office conducted entity-wide performance management systems audits or assessments in 1994, 1996, 1998, and 2002, using the results both to continue advocating to the Council (Practice 5a. External advocacy) to require greater improvement and to help management design and improve these systems (Practice 4b. Assist management).  One approach to assisting management, taken in the late 1990s, involved the City Auditor's Office serving on a committee with management to oversee agency business planning, including development of new, more relevant performance measures.

In 2002, city management enhanced citizen access to city performance information by implementing a web-enabled searchable database, called e-performance measures, covering all city departments.  That same year, city management also took more responsibility for the reliability of performance information by assigning Corporate Internal Audit staff (who report to management) to develop and implement a certification process to test performance measure reliability (Practice 2a. Test relevance or reliability).

In 2004, city management enhanced measures certification when the Budget Office piloted a self-assessment process in which city departments assess, each year, the reliability of selected measures included in their annual business plans and submit the results to the Budget Office for review.  The self-assessment complements the formal certification process, still conducted selectively by Corporate Internal Audit, increasing both the number of performance measures reviewed for accuracy and departmental ownership of measure reliability.  The Budget Office defines the certification process in the city's Managing for Results Resource Guide, provided online for departments to use in conducting business planning and in self-assessing their performance measures.
 
Performance measurement in Austin has reached a level of maturity at which management is expected, and required, to provide performance information on city operations, to answer questions about that information, and to be accountable for its reliability.  Management has also been upgrading the systems that provide information to support effective performance measurement, and the City Auditor's Office is planning projects to test the reliability of those systems.  The trend in city government toward greater integration and centralization of these information systems and sources will likely lead to more centralized auditing.

The City Auditor continues to audit relevant performance measures during performance audits of individual departments and issue areas, and to assess the data reliability of selected systems that provide information used for performance measurement and other purposes.  Just as management has been upgrading the automation of performance information, the City Auditor's Office has upgraded its reliability testing (Practice 2a. Test relevance or reliability) to include software tools such as ACL, for data mining and analysis of entire data universes with less reliance on sampling, and ArcGIS, for testing the completeness and accuracy of geographic information that may be used in performance measurement.
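The case history does not spell out how this kind of full-population testing is scripted.  As a rough illustration of the general idea only, the sketch below uses Python and pandas rather than ACL, with a hypothetical data export and hypothetical field names and bounds: it checks every record for missing values, duplicate identifiers, and implausible values instead of drawing a sample.

```python
# Illustrative sketch only -- not the City Auditor's actual test scripts.
# Assumes a hypothetical CSV export of a department's performance data
# with columns: record_id, measure_code, period, value.
import pandas as pd

KEY_COLUMNS = ["record_id", "measure_code", "period", "value"]

def test_reliability(path: str) -> dict:
    """Run simple full-population checks rather than sampling."""
    df = pd.read_csv(path)
    return {
        "total_records": len(df),
        # Completeness: no missing values in key fields.
        "missing_values": int(df[KEY_COLUMNS].isna().sum().sum()),
        # Uniqueness: each record_id should appear exactly once.
        "duplicate_ids": int(df["record_id"].duplicated().sum()),
        # Validity: values should fall in a plausible range (bounds are hypothetical).
        "out_of_range": int(((df["value"] < 0) | (df["value"] > 1_000_000)).sum()),
    }

if __name__ == "__main__":
    for check, count in test_reliability("dept_performance_export.csv").items():
        print(f"{check}: {count}")
```

Because every record is examined, exceptions can be reported as complete lists rather than extrapolated from a sample, which is the main advantage the case history attributes to this style of testing.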

As the quality of department performance information has improved, it has become more useful not only to management and the City Council but also to the City Auditor.  The City Auditor's Office now reviews trends in department performance measures as a regular part of its citywide risk assessment process for determining which departments and programs to audit in the coming year, helping the office identify higher-value targets for its regular program of performance auditing (Practice 1a. Audit performance).
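The office's actual risk-assessment methodology is not described here.  As one hypothetical way to surface departments whose measures are trending in the wrong direction, a short script could compute year-over-year changes for each measure and rank departments by the share of measures that declined; the column names, sample data, and scoring rule below are all assumptions for illustration.

```python
# Hypothetical risk-screening sketch; not the City Auditor's actual methodology.
# Assumes rows of (department, measure, year, value) where higher values are better.
import pandas as pd

def rank_departments(df: pd.DataFrame) -> pd.Series:
    """Rank departments by the share of measures that declined year over year."""
    df = df.sort_values(["department", "measure", "year"])
    # Year-over-year change per measure within each department.
    df["change"] = df.groupby(["department", "measure"])["value"].diff()
    # Keep the most recent change for each measure.
    latest = df.dropna(subset=["change"]).groupby(["department", "measure"]).tail(1)
    scored = latest.assign(declined=latest["change"] < 0)
    # Risk score = fraction of measures that worsened in the most recent year.
    return (scored.groupby("department")["declined"].mean()
                  .sort_values(ascending=False))

# Example usage with made-up data:
data = pd.DataFrame({
    "department": ["Parks", "Parks", "Library", "Library"],
    "measure":    ["acres_maintained"] * 2 + ["items_circulated"] * 2,
    "year":       [2003, 2004, 2003, 2004],
    "value":      [120, 110, 500_000, 520_000],
})
print(rank_departments(data))
```

A screen like this only flags candidates; the case history makes clear that the results feed a broader citywide risk assessment rather than determining the audit plan on their own.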