Popular annual financial reports: current trends and future prospects.

Author: Hennessy, Barbara R.

What characteristics distinguished the outstanding popular reports selected in the GFOA's first Popular Reporting Awards competition?

The Governmental Accounting Standards Board (GASB) set forth nine goals of financial reporting for governmental entities in its Concepts Statement No. 1, "Objectives of Financial Reporting." They can be summarized as providing users with financial information that enables them to:

* assess the government's accountability;

* evaluate the government's operating results for the period; and

* determine the government's ability to meet its obligations as they become due, yet still provide acceptable levels of service to the public.

The Comprehensive Annual Financial Report (CAFR) represents governments' attempt to meet these objectives. These reports do an excellent job of conveying information to sophisticated users, such as bond counsel, underwriters, rating agencies, bankers and the like, for the purpose of helping governments efficiently finance their operations and capital improvements.

Although they satisfy legal reporting requirements and conform to generally accepted accounting principles (GAAP), the reports are notoriously baffling to nonspecialists. While the public typically relies on financial analysts to decipher the reports of large corporations, no one translates the technical language contained in CAFRs for legislators, the citizenry, the press, the business community and other potential users.

One way to ameliorate this situation is to supplement the CAFR with a Popular Annual Financial Report (PAFR). PAFRs are designed to provide citizens and other users with easily understood information about a government's finances and economic condition. PAFRs are supplements to, not replacements for, CAFRs.

GFOA's Popular Reporting Awards

To encourage the development of PAFRs, the Government Finance Officers Association's (GFOA) Executive Board approved the concept of a popular reporting awards program. In June 1991, the GFOA established the Popular Annual Financial Reporting Task Force to administer the program, evaluate reports submitted for awards and select winners. The first awards were presented at the GFOA annual conference in June 1992. This article reports on the program, the scoring and the characteristics of the reports cited by the task force.

The GFOA did not specify any particular format for PAFRs. Ingenuity was encouraged so that the PAFRs would better fit the needs of the reporting entity's constituency. In keeping with the theme of creativity, the 38 reports that were submitted from across the United States and Canada for the first awards were quite varied: at the extremes of technology were a folded sheet of paper filled with facts and figures and a video tape that the entity broadcast over its cable television channel. In general, however, the reports were in the format used by most organizations: a letter-size booklet containing financial data, figures, narratives and charts. Some, on plain white paper, were printed only on one side and bound with plastic fasteners, while others were professional jobs on glossy paper. Some had no illustrations, but most contained photographs. Again, diversity was encouraged, and diversity is what was evident.

The Selection Process. The Popular Reporting Task Force consisted of 12 members. Each member was asked to review roughly 15 of the 38 reports, and each report was reviewed by five members. No member reviewed his or her own report or any report from the same state or province. The GFOA Executive Board developed 22 criteria that were used by the task force in voting on the entries. Balloting was done using a five-point scale except for the summary, which was double-weighted with a 10-point rating.

In general, the task force assumed that the keys to successful reporting were brevity, readability and presentation of data derived from the CAFR. Additionally, some narrative that interpreted the numbers and described the accomplishments of the entity, the economy of the area and so forth was necessary to translate the CAFR to the public. The 22 criteria were broken down into five groups: reader appeal, understandability, reliability, distribution and other. Exhibit 1 lists the criteria and their interpretations.
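As a rough illustration of the balloting scheme described above, the sketch below tallies one hypothetical evaluator's ballot, assuming each criterion is scored from 1 to 5 except the double-weighted summary (scored from 1 to 10) and that per-criterion scores are grouped and summed. The criterion names, their group assignments and the tallying rule are illustrative assumptions, not GFOA's published procedure; only the five group labels come from the article.

```python
# Hypothetical ballot tally. Criterion names and group assignments are invented;
# only the five group labels (reader appeal, understandability, reliability,
# distribution, other) come from the article. Regular criteria use a 1-5 scale;
# the double-weighted summary uses a 1-10 scale. The article does not say how
# votes were combined, so the simple sum and group averages below are assumptions.

# criterion -> (group, score)
ballot = {
    "attractive layout":     ("reader appeal",     5),
    "brevity":               ("understandability", 4),
    "consistency with CAFR": ("reliability",       4),
    "distribution plan":     ("distribution",      3),
    "summary":               ("other",             8),  # scored out of 10
}

total_score = sum(score for _group, score in ballot.values())

# Average score within each criterion group.
scores_by_group: dict[str, list[int]] = {}
for group, score in ballot.values():
    scores_by_group.setdefault(group, []).append(score)
group_averages = {group: sum(s) / len(s) for group, s in scores_by_group.items()}

print(f"ballot total = {total_score}")
for group, average in group_averages.items():
    print(f"{group}: {average:.2f}")
```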

Analysis of the Scoring. After the awards were presented, the tally and written evaluations of the task force members were made available for this article. The authors made a statistical comparison of the scores for reports that did and did not receive awards to determine which criteria most differentiated the nonwinning reports from the winning entries. This information may be useful as a starting point for those governments wishing to prepare their own PAFRs.

All the votes were grouped by criteria into two categories, winners (four PAFRs) and nonwinners (34 PAFRs). The average score for each criterion was calculated by group. For example, winners averaged a score of 4.25 for brevity, while nonwinners averaged only 3.79.

The differences in the average votes (0.46 for brevity, for example) were statistically analyzed. These analyses determined whether the differences were due to chance or were real differences in the perception of the task force. Clearly, the greater the difference, the greater the likelihood that the task force perceived a real difference in criteria affecting quality.(1)

Social scientists categorize the results of this type of process into three classes depending on the likelihood that the differences are attributable to chance (i.e., that the variation was due to haphazard voting):

* very different (the probability of chance is 5 percent or less),

* different (the probability of chance is between 5 and 10 percent),

* not different (the probability of chance is greater than 10 percent).
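The article does not name the statistical procedure used to test these differences. As an illustration only, the sketch below applies a two-sample (Welch's) t-test, via scipy.stats, to hypothetical winner and nonwinner ballot scores for a single criterion and sorts the resulting probability of chance into the three classes above; the score lists and the choice of test are assumptions, not data or methods from the study.

```python
# Illustrative sketch only: compares hypothetical winner vs. nonwinner ballot
# scores on a single criterion and classifies the probability of chance using
# the thresholds described above. The score lists are invented for demonstration.
from scipy import stats

winner_scores = [5, 4, 4, 5, 4, 4, 5, 4]          # awarded PAFRs
nonwinner_scores = [4, 3, 4, 4, 3, 5, 4, 3, 4,
                    4, 3, 4, 4, 4, 3, 4, 4, 3]    # remaining PAFRs

# Difference in average votes (the article's brevity example: 4.25 - 3.79 = 0.46).
mean_diff = (sum(winner_scores) / len(winner_scores)
             - sum(nonwinner_scores) / len(nonwinner_scores))

# Welch's two-sample t-test: p estimates the probability that a difference this
# large could arise from chance (haphazard voting) alone.
result = stats.ttest_ind(winner_scores, nonwinner_scores, equal_var=False)
p = result.pvalue

if p <= 0.05:
    category = "very different"
elif p <= 0.10:
    category = "different"
else:
    category = "not different"

print(f"mean difference = {mean_diff:.2f}, p = {p:.3f} -> {category}")
```

In the actual competition, each report received evaluations from five task force members, and it was the task force's tally, not a calculation like this sketch, that determined the award winners.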
