Parlez-vous statistique?

By John Di Genio
The Journal of the American Society of Military Comptrollers

The emerging emphasis on efficiency, effectiveness, and quality requires professional comptrollers and resource managers to have a working knowledge of statistics--the "common language" for measuring productivity, efficiency, and process improvement.

There is an abundance of textbooks and reference materials showing readers how to arrive at a statistic. In reality, however, comptrollers and resource managers have little need for a textbook filled with statistical calculations; a typical spreadsheet application now performs the most cumbersome of these calculations with a single keystroke. As an example, MiniTab software can produce statistical data and graphs in a matter of minutes--output that at one time would have taken considerable effort to generate.
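
To illustrate, the short Python sketch below (offered only as a stand-in for the spreadsheet or MiniTab output described above; the voucher figures are invented) produces in a few lines the descriptive statistics that once demanded tedious hand calculation:

import statistics

# Hypothetical sample: processing times, in days, for ten travel vouchers
times = [4.2, 3.8, 5.1, 4.6, 4.9, 3.7, 5.3, 4.4, 4.8, 4.1]

print("mean:  ", round(statistics.mean(times), 2))   # arithmetic average
print("median:", statistics.median(times))           # middle value
print("stdev: ", round(statistics.stdev(times), 2))  # sample standard deviation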

Unfortunately, the "stat guides" a comptroller or resource manager may have conveniently tucked away in a desk drawer, as well as the various statistical applications currently available, do little to caution the professional resource warrior about the many pitfalls and misuses of statistics. These are the "traps" that can land him or her in "hot water" and cause professional embarrassment. Furthermore, textbooks and software fail to show users how to conduct sound statistical examinations.

This paper helps readers conduct basic empirical research without falling into some of the more common statistical "booby traps," such as introducing bias into a study or using the wrong statistical tool to make an inference about data. Hopefully, at the conclusion of this paper, you (the comptroller/resource manager) will be able to parler statistique.

IDENTIFYING THE PROBLEM

A statistical analysis must have a succinct, clearly defined problem statement to keep the analyst focused on the key objectives. Failure to define the problem concisely may lead the analyst to solve the wrong problem.

Data collection and analysis are costly. Therefore, organizations materially benefit from initiatives that reduce the cost of conducting a study. In this regard, cost savings may be realized during development of the problem statement. In formulating the problem statement, the analyst should conduct a literature search to see whether previous studies have addressed the specific problem. After all, there is no need to expend scarce resources to reinvent the wheel, especially when similar sets of data have been analyzed before and the problem at hand is compatible--that is, it fits an already established model. As part of his or her research, the analyst could ask other agencies with similar missions or functions whether they have encountered a similar situation and how they solved the problem. E-mail greatly facilitates this exchange of information.

DATA COLLECTION

Textbooks on statistics concentrate heavily on data analysis. In actual practice, however, data collection is just as important as data analysis. Poor data cannot be rescued by fancy analysis. Since bad data cannot be rescued, any portion of a study using those data must be repeated, thereby increasing study costs in an already austere budget environment. To avoid wasting scarce resources (such as time, effort, and funding), analysts need to focus on designing a sound data collection method to reasonably assure that the data truly capture the essence of the problem under study (i.e., the data are "representational").
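
One common safeguard is probability sampling. The short Python sketch below (the population and its figures are invented for illustration) shows how a simple random sample, which gives every record an equal chance of selection, tends to mirror the population from which it is drawn:

import random

random.seed(1)  # fixed seed so the illustration is reproducible

# Hypothetical population: processing times, in days, for 500 purchase requests
population = [random.gauss(10, 2) for _ in range(500)]

# A simple random sample gives every record an equal chance of selection,
# which helps keep the data "representational" of the whole population.
sample = random.sample(population, 50)

pop_mean = sum(population) / len(population)
samp_mean = sum(sample) / len(sample)
print(f"population mean: {pop_mean:.2f}   sample mean: {samp_mean:.2f}")

Because every record has the same chance of selection, the sample mean lands close to the population mean without anyone having to examine all 500 records.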

Poor recording techniques produce a large percentage of worthless data. For example, haphazard recording of observations while conducting work sampling will skew the data. Frequently, skilled analysts allow technicians or junior analysts--folks who may lack training in appropriate data collection techniques--to record observations. Data collection is far too important a responsibility to leave to an inexperienced novice. Hence, only those who have been properly trained should collect data.

Introducing "bias"--a systematic error introduced into investigative surveys by selecting or encouraging one outcome over the others--has ruined many a study. A biased study is worthless, and the cost associated with redoing it will likely make the sponsor(s) hit the roof.

For example, during a study to determine the feasibility of sustaining shuttle bus service every half hour from 7:00 A.M. to 10:00 P.M., the team leader decided that the number of passengers riding the bus at 0800, 1500, and 1700 would be enough to determine the number of shuttle bus passengers on a daily basis. The operations research systems analyst (ORSA) assigned to the study (... eh, me ...) eventually persuaded the team leader that passengers riding the shuttle bus during those times...
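
To see why such a plan is biased, consider the minimal Python sketch below. The hourly ridership figures are invented, but they follow the familiar pattern of morning and evening rush peaks; averaging only the 0800, 1500, and 1700 counts overstates typical ridership:

# Hypothetical hourly ridership for a shuttle running 7:00 A.M. to 10:00 P.M.
# (figures invented for illustration; ridership peaks at the commuting hours)
riders_by_hour = {
    7: 38, 8: 45, 9: 20, 10: 12, 11: 10, 12: 18, 13: 15, 14: 11,
    15: 25, 16: 33, 17: 44, 18: 28, 19: 14, 20: 9, 21: 7, 22: 5,
}

true_avg = sum(riders_by_hour.values()) / len(riders_by_hour)

# Sampling only at 0800, 1500, and 1700 captures mostly peak traffic
# and biases the estimate of hourly ridership upward.
biased_avg = sum(riders_by_hour[h] for h in (8, 15, 17)) / 3

print(f"true hourly average: {true_avg:.1f}")    # 20.9
print(f"peak-only estimate:  {biased_avg:.1f}")  # 38.0

With these invented figures, the peak-only estimate runs well above the true hourly average--precisely the kind of systematic error that renders a study worthless.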
