When Public Relations Compromises Public Health

If you missed the Orange County Vector Control District’s press release announcing last year’s dramatic decrease in flea-borne typhus cases, you’re not alone. Apparently, the agency’s commitment to “inform and educate the public about the shared responsibility of vector control” is no match for its commitment to linking the area’s typhus cases almost exclusively to outdoor cats.

So, while some of us think the most recent statistics are newsworthy, OCVCD probably sees them largely as a most inconvenient truth. How, for example, does the agency explain the significant decline in typhus cases over the same period Orange County Animal Care implemented its return-to-field program? OCVCD has alleged repeatedly that this program increases the risk to the public—but the evidence suggests otherwise.

Well, I suppose that’s why there’s no press release.

Exploring Other Dimensions

Imagine yourself responding to a survey, and one of the questions posed is this:

How much do you enjoy seeing feral cats in the environment?

Optional responses include: very much, somewhat, no opinion, very little, and none at all.

If your experience is anything like mine, the question isn’t nearly as straightforward as it first appears. If we’re talking about the feral cat that I’ve been feeding for more than two years now—who rolls around in the grass to show me how happy he is to see me, and then becomes bashful when I try to pet him—the answer is very much. The pleasure centers in my brain are, I’m sure, lighting up like the Fourth of July.

If, on the other hand, I spot a pair of eyes peering out from behind a dumpster just a few minutes later, my heart sinks. No enjoyment in that at all.

It’s a trick question, but not in the usual sense. It’s not so much that the question trips the respondent up—though, clearly, that’s a strong possibility. What’s more problematic is the fact that the researcher posing the question doesn’t actually know what a particular answer means.

Which encounter am I thinking of when I answer? It makes all the difference in the world, but that critical bit of context is lost due to the blunt nature of the research instrument. That’s the trouble with surveys: to borrow a phrase from former Secretary of Defense Donald Rumsfeld, we don’t know what we don’t know.

And yet, such surveys are the foundation of human dimensions research, the investigation of attitudes, beliefs, and values—along with their underlying drivers—surrounding a particular issue.

Human dimensions research is fascinating stuff, especially as it relates to animal welfare—and, in particular, free-roaming cats and TNR. But what happens when we jump in thinking we know what we don’t know—and then use the results to shape policy? (Here’s a hint: it can’t be good.)

Worse, what happens when the respondents are misinformed—perhaps even (knowingly or not) by the very people asking the questions?

These are some of the questions I posed during the presentations at the Vertebrate Pest Conference’s Feral Cats session last week. The responses were, I’m afraid, disappointing. As was the fact that I was the only one asking.

Most unsettling, though, was the conviction with which both researchers presenting human dimensions work* responded—utterly unconcerned, it seemed, with the suggestion that the results of their hard work may not, in the end, be terribly meaningful. It’s hardly what one expects from bright, ambitious PhD candidates.

Not knowing what you don’t know is one thing; not wanting to know is something else altogether.

*I may not have the wording exactly right, but the question I refer to at the beginning of this post is “real,” in that it’s among those being used by one of the presenters.