Bias, power, influence, and competence: the implications of human nature on the new NIH conflicts of interest regulations.

Author: Slocum, J. Michael
Position: Voice of Experience - National Institutes of Health
 

Introduction: Rules and Policies, or Ethical Culture

"The forbearing use of power does not only form a touchstone; but the manner in which an individual enjoys certain advantages over others is a test of a true gentleman. The power which the strong have over the weak, the magistrate over the employed, the educated over the unlettered, the experienced over the confiding, even the clever over the silly; the forbearing and inoffensive use of all this power and authority, or the total abstinence from it, when the case admits it, will show the gentleman in a plain light."

--General Robert E. Lee (as quoted in Bradford, p. 233)

The looming deadline for the revised conflicts of interest regulations imposed by the National Institutes of Health (42 CFR Part 50 & 45 CFR Part 94) has made me muse on the continued emphasis on new policies, procedures and rules to be developed by research institutions. It seems to me (and to many other commentators) that policies and rules concentrating on disclosure are too often seen as the panacea for the ethical and legal problems that arise from such conflicts. As Regan has said, there is not an "appreciation that even if an organization has adopted elaborate rules and policies designed to ensure legal compliance and ethical behavior, those pronouncements will be ineffective if other norms and incentives promote contrary conduct" (Regan, p. 942). Regan further states:

Responding to the call for creating and sustaining an ethical culture in organizations requires appreciating the subtle ways in which various characteristics of an organization may work in tandem or at cross-purposes in shaping behavior. The idea is to identify the influences likely to be most important, analyze how people are apt to respond to them, and revise them if necessary so that they create the right kinds of incentives when individuals are deciding how to act. (Regan, p. 942)

We See What We Want to See

This daunting task is complicated by basic facts of the human psyche and the nature of organizational behavior. Many commentators have recently made the point that we are often good at seeing the mote that is in another's eye, but not the beam that is in our own (Regan, Young, Page, Gospel of Matthew 7:3). These authors demonstrate (with citations to overwhelming scientific evidence in the case of the more recent writers, if the simple observation in the Bible was not enough) that it is often easy to see how others may be biased. They also document, again with many citations, that it is much harder to recognize one's own biases.

As Page says, "The simple fact is most of us believe that we are capable and impartial decision-makers ... Not only are we capable and impartial, we are more capable and impartial than others .... Ethical decisions are biased by a stubborn view of oneself as moral, competent, and deserving, and thus, not susceptible to conflicts of interest. To the self, a view of morality ensures that the decision maker resists temptations for unfair gain [and] a view of competence ensures that the decision maker qualifies for the role at hand ..." (Page, pp. 278-279)

This inability to see one's own biases extends to organizations, not just individuals. The cognitive processes and behavioral economics that underlie many of our individual tendencies are intensified in the organizational setting. Therefore, on an organizational level, as with each of us as individuals, efforts to promote ethical behavior are most likely to succeed if they build on durable human tendencies instead of fighting natural human instincts.

As so succinctly stated by Young, "The idea that scientists are objective seekers of truth is a pleasing fiction, but counterproductive insofar as it can lessen vigilance against bias" (Young, p. 412). Similarly, the idea that universities and other research institutions are pillars of impartiality and purity is, at best, naive.

Concern that the business community wields undue influence over American universities is at least 100 years old. AAUP's founding 1915 "Declaration of Principles on Academic Freedom and Academic Tenure" observed that the "governing body of a university is naturally made up of men who through their standing and ability are personally interested in great private enterprises." At [universities], businessmen (they are overwhelmingly men) do dominate the board, and most are leaders of [the] FIRE (Finance, Insurance, and Real Estate)-based economy .... "[C]orporate universities," ... are distinguished ... not merely by the extent of corporate domination of their boards but also by their extensive adoption of corporate structures and policies.

Corporatization develops as universities become diversified enterprises with revenues derived not only from on-campus tuition but also from extension, on-line and overseas programs, campus services, investments, real estate holdings, research, patents, industrial parks and partnerships, sports and entertainment, and medical and other professional services. University presidents and senior administrators thus become managers, fund-raisers, and competitive entrepreneurs. (Benjamin, p. 255)

This unrecognized bias experienced by the individual and by organizations of individuals is iterative: a constant feedback loop of good feeling and self-justification that can ensure that unethical behavior is neither seen by the actors nor policed by those charged with compliance in an organization. Indeed, the simple fact that an individual is part of a group is itself a major source of bias, known as "in-group" bias.

In-group bias is the general "tendency to evaluate one's own groups more positively in relation to other groups" (Page, p. 249). Page's summary of this concept is highly instructive in the realm of academia.

In general, the stronger the group ties or similarities, the stronger the bias, but groups based on as little as pleasant social or professional contacts can also lead to biased decision-making based on unconscious cognitive, affective, and motivational processes. Repeated exposure to people "tends to enhance their subjective value," and therefore also increases the bias. This can occur even when people have no conscious awareness of exposure. Members of high status or high power groups generally have higher levels of automatic in-group bias than members of groups with lower status or power, as do more loyal members (Emphasis added and internal references removed.) (Page, p. 251)

Page's review of bias and conflict in the corporate boardroom is equally appropriate for the research organization. As he says,

It is uncontroversial that people are self-interested. People will generally seek financial and social benefits for both themselves and close family members. [People] are likely to prefer outcomes that serve their pecuniary and social needs. (Page, p. 253)

Page's review of the underlying reasons for bias is even more relevant to academics and their research institutions when he turns to the non-monetary realm. Social rewards may be even more important than pecuniary ones. Doing something for which one is recognized by others contributes to one's sense of self-worth. For both the individual and the organization, that sense of worth may be more valuable than any "filthy lucre." "... [T]he way in which the outside world expects a company to behave and perform can be its most important asset." (Engardio & Arndt)

Page and many other commentators have noted that persons (particularly high-status persons) are not solely motivated by a desire to enrich themselves or their families or to enhance a group to which they belong. They are also motivated to maintain a desirable self-image and to be competent. There are many reasons to "do a good job," such as a sense of honor, feelings of responsibility, a sense of obligation to the organization, and a simple desire to do the right thing. As social and socialized beings, we want to be seen (and to see ourselves) as fair and even-handed.

This leads to what is called "motivated reasoning," where motivation refers to any wish, desire or preference that concerns the outcome of a given reasoning task.

... man always believes more readily that which he prefers. He, therefore, rejects difficulties for want of patience in investigation; sobriety, because it limits his hope; the depths of nature, from superstition; the light of experiment, from arrogance and pride, lest his mind should appear to be occupied with common and varying...
