Review of Andrew Hacker, “Who Knows the American Mind?”

Andrew Hacker, in his 29-paragraph review of five writings on surveys (including my book, The Problem with Survey Research), refers 20 times to survey “limitations”. That’s about one limitation per paragraph and a half, with the result that almost every one of his assertions to the effect that a particular survey helps us understand this or that is followed by a minifying or nullifying qualifier (“but”, “it remains to consider”, “seems too good to be true”, or some such) generated by a limitation of the very same survey that produced the initial—now questionable and perhaps negated—understanding. What a survey gives, a survey takes away.

20 Survey “Limitations”!

Even so, Hacker’s confidence in surveys is steadfast: “many of the Pew Center’s findings shed light on where the country is going”.
He’s aware of one of the most fundamental limitations of surveys; viz., surveys rely on what people say, but what people say does not necessarily correspond to what they actually do or think and, depending on who’s asking whom about what, often does not. Commenting on the declining number of whites “willing to say [my emphasis] their own race is natively more intelligent”, he writes: “In part, this may be cautiousness about what one says [my emphasis] aloud, even to anonymous interviewers”; i.e., what these respondents say does not correspond to what they’re actually thinking.

Respondents “Cautious . . . About What [They] Say”!

Nevertheless, Hacker trusts surveys: “The General Social Survey . . . [produces] interesting findings”. He mentions additional limitations of surveys: question wording biases answers, respondents lie, respondents are not informed, respondents “understate” and “exaggerate”, and survey answers are ambiguous. He also points out that in some instances survey answers are conflicting or inconsistent and do not “meld” into a coherent or meaningful view of “where the country is moving”.
Question Wording Biases Answers!

Respondents Lie!

Respondents Are Not Informed!

Respondents “Understate” And “Exaggerate”!

Survey Answers Are Ambiguous!

Survey Answers Are Conflicting Or Inconsistent!

Still, Hacker accepts the results of surveys, as is evident, for instance, when he writes that a survey-based chapter in one of the reviewed books is “revealing”.
Although he only hints at the limitation of nonresponse in phrases such as “not all who pick up [telephone calls] are willing to talk”, I’m sure he knows that rising nonresponse rates for most surveys make nonresponse an increasingly detrimental limitation. He also calls attention to the “hazards” of online surveys; e.g., it’s not possible to know who is answering: “It might be that someone with dementia is responding, it might be a teenager in Riga”.

Limitation Of Nonresponse!

“Hazards” Of Online Surveys!

In seeming disregard of recognized reasons and evidence against the asking method, Hacker’s faith remains firm: “surveys . . . [can] yield new, even unexpected information”.

Hacker also knows that question format is another limitation. Closed-ended survey questions, for instance, don’t allow “people [to] ‘speak’ on surveys . . . [because] they are choosing from options others have framed”. In addition, he acknowledges that survey researchers, themselves, make answers unreliable when they word questions to obtain the answers they want. For example, in discussing a result of questions concerning trends in views of marriage—namely, “that 39 percent of Americans agree that ‘marriage is becoming obsolete’”—he writes: “it’s hard to believe that so many were calling it ‘obsolete’ before an interviewer introduced the word”.

Question Format Is A . . . Limitation!

Survey Researchers, Themselves, Make Answers Unreliable!

Regardless, Hacker’s a fan of surveys: “most polls can expand our understanding”.

Hacker’s mixture of sustained commitment to surveys, thoroughly stirred with knowledge of their extensive limitations, is the usual conflicting swirl in survey research. But why isn’t confidence in asking cancelled by knowledge of its flaws? The best explanation is that askers and those who rely on answers to questions are addicted to asking and to the answers thereby obtained. Because they’re hooked, they can’t stop doing what they know they shouldn’t. There’s an almost incessant drive to ask; to score the fix and receive the answer-high. Such is the strength of the asking drug.

What to do? What to do? Given the near universal addiction to asking and answers, and the insusceptibility of the addicted to reason, the only sensible recourse is to focus on the non-addicted—on those open to evidence and logic—and to develop and promote the Counter Literature to Survey Research that erodes confidence in the asking method, thereby making other “proper” methods (as I name them in Part Six of The Problem with Survey Research)—such as observation and experimentation—more attractive and, thus, more extensively used. Presently, the only complete, consistent component of the Counter Literature to Survey Research is my book, The Problem with Survey Research, but there are chapters and parts of other books, as well as articles and sections of articles, that provide theory and evidence demonstrating the inability of components and/or types of survey research to produce reliable information. I’m expanding and publicizing the Counter Literature to Survey Research, in part, via my blog post, Counter Literature to Survey Research, where I list examples of this Literature.

Don’t ask! Promote the Counter Literature to Survey Research and encourage the use of proper methods; these are the only methods that produce the information needed for optimum policy-making and problem-solving.

About georgebeam

George Beam is an educator and author. The perspectives that inform his interpretations of the topics of this blog, as well as his other writings and university courses, are system analysis, behaviorism, and Internet effects. Specific interests include quality management, methodology, and politics. He is Associate Professor Emeritus, Department of Public Administration; Affiliated Faculty, Department of Political Science; and, previously, Head, Department of Public Administration, University of Illinois at Chicago.