Politicians use fallacies like scare tactics to manipulate voters.
Part of being a member of a successful democracy is to be an educated voter.
February 24, 2016

ASU philosophy professor urges voters to use reason, logic when making decisions

Advertising is a multibillion-dollar business for one reason: It works.

Ever since people have been buying things, there have been people telling them why they should. And these salespeople have become very good at it, often employing fallacious, yet extremely persuasive arguments for why we can’t live without a certain product.

From images of a literal chocolate man being mauled by women unable to resist his allure after using Axe body spray to the story of a man who was one bite of a Nutrigrain bar away from conquering the world, advertisers have proven they know how to convince the general public to buy their products.

When all that’s at stake is whether you opt for name-brand hot dogs or bargain-bin mystery meat, there’s not much cause for concern — except maybe for a case of indigestion. But when similar tactics are used to sway people’s opinions of issues like health care, immigration or, perhaps, who should be the leader of the free world, it becomes significantly more important to be aware of all the facts, and to employ logic when making decisions.

The infamous 1964 “Daisy” political ad is one example of the use of the scare-tactic fallacy. It opens on a scene of a child in a field counting daisy petals when suddenly, out of nowhere, an atomic bomb explodes, mercilessly taking the innocent child and, presumably, all life as we know it along with it. The ad — considered an important factor in former President Lyndon B. Johnson's landslide 1964 victory over Barry Goldwater — was pulled after airing only once for its controversial implication that electing Goldwater would result in nuclear war.

The practice of using fallacies to manipulate voters still happens today. To get a better understanding of how those fallacies are employed and how to avoid being manipulated by them, ASU Now sat down with Bertha Alvarez Manninen (pictured), an associate professor of philosophy in the School of Humanities, Arts and Cultural Studies, an academic unit of ASU's New College of Interdisciplinary Arts and Sciences, who relishes the opportunity to open her students’ eyes to logic and reason.


Question: What are some of the most common fallacies that voters can look out for?

Answer: This election year in particular, I am seeing a lot of scare tactics. Some politicians like to scare voters into thinking that something horrible will happen to the country if they aren't the one elected. Of course, they rarely offer any evidence that these horrible things will happen. This tactic relies on fear, misinformation and people's prejudices instead of approaching issues with rational thought, evidence and argumentation.

There are also always a lot of personal attacks in politics. When people disagree with each other on ethical issues, religious issues or policies, instead of debating the logic that underlies their arguments or citing any actual evidence in favor of their position, they start to personally attack each other. For example, people who are in favor of some kind of universal health care are often derided as socialist or communist, and rarely are their actual arguments or reasons heard.

Finally, the fallacy of hasty generalization is pretty prevalent in politics as well. This is when you use a small sample of a group to make generalizations about most or all members of the group. Any time we make a prejudicial remark about any group in particular, we are engaging in hasty generalization. It's a fallacy because we rarely have any actual evidence to suggest that most or all members of the group share the characteristics one is accusing them of having. So instead of gathering evidence, we extrapolate from a small sample to the whole. Clearly, nothing good ever comes out of making sweeping negative generalizations about whole groups of people, and yet this fallacy is one of the most commonly committed.

Q: Why are these fallacies so successful at manipulating voters?

A: Fallacies are successful because they often appeal to your emotions, both negative and positive. If someone can manipulate you using fear, for example, that is often very effective in getting you to do what they want. Sadly, we are not a society that is taught from an early age to use reason and logic as much as we are taught to go with our gut and use our initial reactions or emotions. So fallacies work because they appeal to those knee-jerk reactions that humans tend to have. But as history has shown us, leaders and politicians who rely on fallacies in arguments are typically successful only in getting people to blindly follow them. It is typically the rational thinkers of society that can battle this kind of manipulation. This is why I think it is so important to study critical thinking, logic and philosophy. All I want for my students is for them to leave my class a little bit more skeptical and a little bit more critical than they were when they first entered it. I want them to be able to see through bad arguments and to be able to create good ones. This is the kind of thinking that helps guard people against being manipulated.

Q: Can you point to some examples of these fallacies in action?

A: Most political ads are full of these fallacies. After 9/11, for example, many politicians would argue that if we didn't vote them, or their party, into office, we would probably experience another terrorist attack. This is an example of the fallacy of scare tactics. Rarely do they offer any evidence that their tactics will keep us safe or that the opposition's tactics would not.

People who argue that vaccines cause autism because the diagnosis of autism came shortly after their child received vaccines are committing the false-cause fallacy — that is, just because X came after Y doesn't mean that Y caused X. You need actual evidence of a causal relation, not just the fact that something came before another.

When people argue that we should get rid of welfare programs because they assume all or most welfare recipients abuse the program, and the only evidence they have is that they have witnessed a few abuses themselves, they are committing the hasty-generalization fallacy. They are drawing conclusions about the whole group based on a few. Imagine if I argued that we should get rid of student financial aid because some students abuse the program. That commits the same fallacy. You cannot judge the whole on the actions of a few, and you can't take steps that would hurt the whole because of those few.

Q: What are some other areas, besides political ad campaigns, where these fallacies can be used by politicians?

A: Sadly, I feel that as a country we are moving more toward exclusion of those who we perceive as different. Again, this commits the hasty-generalization fallacy. Just because certain members of a religious group may commit terrorist acts, it is patently fallacious and morally wrong to hold everyone in that group accountable.

Right now politicians have been using scare tactics to deny shelter to Syrian refugees on the grounds that a few of them might engage in violent activity, even though there is no reliable evidence that any one of them poses a threat and even though we have an extensive vetting system in this country. This is both hasty generalization and scare tactics.

I remember when there were a few cases of Ebola last year in the United States and everyone started freaking out, even though deaths from Ebola in a country such as ours are exceedingly rare, even though Ebola is actually quite difficult to contract and even though more people die of the flu every year in the United States. People were still freaking out and calling for the forced containment and quarantine of physicians and nurses who came back from helping Ebola patients (a nurse in New Jersey was forcibly quarantined because she showed a slight fever, even though she had no other signs of infection). Think about that: instead of acting rationally and according to the evidence, we let our fears take over, even if it meant violating the fundamental rights of our citizens. This is one reason why it's so imperative to be rational and critical thinkers. Voltaire said it best: “Those who can make you believe absurdities can make you commit atrocities.”

Q: If you could give voters one piece of advice as far as how to protect themselves from being manipulated, what would it be?

A: Americans often claim that they are so proud of their freedoms and that they live in a democracy. But part of being a member of a successful democracy is to be an educated voter and a critical thinker. A democracy where people are easily manipulated is one that can fall into the hands of the wrong leaders, and one that can be made to do horrible and unethical things. We have a moral responsibility to ourselves and our country to be the kind of citizen that is persuaded by logic, reason and evidence, rather than fear or prejudice. We should embrace dialogue among opposing viewpoints, rather than fall into the habit of fallaciously attacking each other.

 
Forensic expert bias "very common" in trials, ASU assistant professor says.
America's adversarial legal system clashes with empirical, objective science.
February 25, 2016

When forensic experts are swayed by an adversarial legal system

It’s one of those nightmare situations you hope you never find yourself in: sitting on death row for a crime you didn’t commit. In 1977 that nightmare became a reality for Randall Dale Adams, thanks to the testimony of Texas forensic psychiatrist James Grigson.

Despite doubt concerning Adams’ guilt in the murder of a Dallas police officer, Grigson told the jury that Adams would be an ongoing menace if kept alive, and Adams was subsequently sentenced to death.

Publicity surrounding the 1988 documentary “The Thin Blue Line” did much to prompt a review of Adams’ case, resulting in the overturning of his conviction and his release from prison in 1989. Six years later, Grigson was found guilty of unethical conduct and was expelled by the American Psychiatric Association and the Texas Society of Psychiatric Physicians.

Grigson was also known by another name: Dr. Death. Throughout his career, he developed a reputation for serving as an expert witness, but only for the side of the prosecution. When all was said and done, he had testified in 167 capital trials, nearly all of which resulted in death sentences.

Such implicit bias, for whatever reason — be it monetary gain or even anger at the defendant and desire for retribution — is “very common,” according to Arizona State University assistant professor Tess Neal, of the School of Social and Behavioral Sciences, an academic unit of ASU’s New College of Interdisciplinary Arts and Sciences.

Herself a licensed psychologist, Neal recently received the Saleem Shah Award for Early Career Excellence in Psychology and Law for her interdisciplinary research blending psychology, ethics and law to understand how people reach decisions in the legal system. She is the 20th recipient of the award, which honors individuals who have made significant contributions to the interdisciplinary field of psychology-law within six years of completing their highest academic degree. Specifically, she has analyzed how the biases of forensic experts inform their testimony and, therefore, the decisions made by judges, lawyers and other members of the courts.

“As a professional psychologist, you’re ethically obligated to not be influenced by your own [biases],” said Neal. “You’re supposed to know that they’re there, and then account for them. We know that that’s really hard to do, and in fact, probably impossible. But the clinicians are trained to think they can do it, and that they have to be objective, so they’re really invested in it.”

Unfortunately, bias isn’t always as obvious as in the case of Grigson — even to the experts themselves.

ASU assistant professor Tess Neal analyzes how the biases of forensic experts inform their testimony and, therefore, the decisions made by the courts. Photo by Charlie Leight/ASU Now

Because the American legal system is designed to be adversarial — that is, there are two sides to every case, and it is the job of each side to present its best argument based on the evidence — attorneys are ethically obligated to advocate for whichever side they happen to be on.

“This is where science and law clash,” said Neal, citing a recent study involving forensic experts. In the study, participants were asked to evaluate evidence in a hypothetical case. Before doing so, they were informed of which side — prosecution or defense — they were hired by. The result was that their evaluation of the evidence changed based on which side they were told they were working for.

What that shows, Neal explained, is that even trained forensic experts “can get subtly absorbed into that adversarial way of thinking.” And that biases the way they perceive information, interpret data and reach opinions.

“This is a touchy subject, and I don’t want to alienate myself from my field … but they’re human beings. … It’s impossible not to be impacted by the limitations of human cognition,” she said.

However, she also believes it’s possible to reduce the likelihood of bias, which is her ultimate goal:

“I hope with this body of research, and with my career, that I can help clinicians — this group of which I am a member — to understand the limitations of what we bring and not be overconfident in how objective we are, and not be overconfident in the opinions that we provide to the court, and try to stick with the science and not go beyond science.”

Neal made a big step toward that goal with the publication of the paper “Forensic Psychologists' Perceptions of Bias and Potential Correction Strategies in Forensic Mental Health Evaluations” in the journal Psychology, Public Policy, and Law.

In the paper, she and co-author Stanley L. Brodsky break down 25 methods forensic experts use to control their bias into four categories: things people say they do that actually work and that science says probably do; things people say they do that science hasn’t yet tested; things people say they do that science says don’t work and may actually make bias worse; and other strategies not recognized by forensic clinicians.

Identifying and evaluating current methods is just part of the process. “We still have some work to do,” Neal conceded — namely, testing and proposing newer, better methods. One method that she finds promising is known as “blinding procedures,” where the forensic expert hired to evaluate evidence isn’t told which side of the case their findings will be used for. The result is a more independent, neutral opinion, and many forensic labs are adopting this practice.

Also working in Neal’s favor is the fact that there is currently a lot of federal support for reducing the margin of error in such forensic methods as fingerprint and blood-spatter analysis. She hopes to take advantage of that by advocating for the same kind of support for reducing the likelihood of bias in forensic psychology. She’ll soon have the chance to do so on a higher level when she speaks on the topic at the American Psychology-Law Society annual conference March 10-12 in Atlanta.

Until then, Neal is enjoying teaching a course on forensic psychology at ASU. The class recently observed footage from the trial of infamous serial killer Jeffrey Dahmer, which included the testimony of no fewer than seven mental-health experts, all with varying opinions as to Dahmer’s mental state.

“I think it’s an interesting class,” said Neal. “But I’m biased.”