
Recursion

(56,582 posts)
Mon Oct 7, 2013, 07:07 AM Oct 2013

Type I and Type II errors, the left, and the right. A perspective from engineering.

It is better that ten guilty persons escape than that one innocent suffer
– William Blackstone

Connor Wood at patheos just wrote an interesting example of what is becoming a set piece in journalism: explaining the difference between the right and the left as one of fundamental psychology. Attempts to do this are legion, and always focus on different aspects of the divide (language! finality! squeamishness! etc.); this may be a case of the blind bats and the elephant.

But if one more bat can chime in here, the field of systems theory offers a perspective that I think may be a useful one on the difference between the right and the left (and I’m deliberately using those terms rather than “liberal” and “conservative”). That perspective involves their preference for Type I or Type II errors.

Let me take you back to the nightmare that was my Stochastic Processes, Detection, and Estimation class in grad school for a second and tell you the story the professor used to explain the difference (though already I'm tipping my hand as a leftie by arguing from model rather than anecdote -- it's not that scientists are liberal, but that liberals become scientists). Imagine you’re the radar operator during the Battle of Britain in World War II. The Luftwaffe is sending bombers over at unexpected times, and your job is to decide when to scramble the fighters to stop them.

But it’s not as simple as in the movies. A blip could be a bomber, or it could be a flock of geese. Or it could be an error in the radar itself. Some blips you can be nearly certain are bombers, some you can be nearly certain aren’t, but most fall somewhere in the middle. Let’s even say that you’ve been keeping track of things over time, and you can actually estimate the likelihood that a given kind of blip is a bomber or not. There are a few 99%’s and a few 1%’s, but most of the blips fall into the 40% to 60% range. You see a blip that weighs in at 45% (that is, you are 45% confident that this blip represents a bomber -- it's called a "confidence percentage"). What should you do?

There are two ways you can screw up here: you can scramble fighters when there is no bomber, and you can fail to scramble fighters when there is a bomber. Both have costs associated with them. Scrambling fighters when there isn’t a bomber wastes fuel and flight time, which are finite resources, and leaves you more vulnerable to a real bombing raid. This is called a false positive, or Type I error. Failing to scramble fighters when there is a bomber means the bomber gets through and blows up London. This is called a false negative, or Type II error.
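If code reads more easily than prose for you, here's the same bookkeeping in a few lines of Python (the 50% cutoff is an arbitrary illustration, not anything from the class):

THRESHOLD = 0.5  # scramble fighters if confidence exceeds this (arbitrary illustration)

def decide(confidence):
    """Scramble fighters when the blip's confidence percentage clears the cutoff."""
    return confidence > THRESHOLD

def outcome(confidence, is_bomber):
    scrambled = decide(confidence)
    if scrambled and not is_bomber:
        return "Type I error (false positive): fighters scrambled at geese"
    if not scrambled and is_bomber:
        return "Type II error (false negative): the bomber gets through"
    return "correct decision"

print(outcome(0.45, is_bomber=True))  # the 45% blip was real: a Type II error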

This has implications far beyond air mission control, of course; that’s just the situation it was first developed for. Consider cancer tests. A Type I error would be a test telling someone without cancer that he had it; this would require unnecessary treatment and waste money, medical staff’s time, and equipment, all of which are finite. A Type II error would be a test that told someone with cancer that he didn’t have it; this would mean the cancer goes untreated. The cost there is obvious. (Notice that in both cases the two costs differ in kind rather than in easily quantified amounts -- this is generally how it works.)

My thought is that in general the left fears Type I errors and the right fears Type II errors. How does this apply to politics? Well, take the Blackstone quote I started with. A Type I error in criminal justice is putting an innocent person in jail (false positive on guilt). A Type II error is letting a guilty person go free (a false negative on guilt). The harms of both of these are obviously difficult to compare, but Blackstone actually gave a ratio of 10 to 1: he would accept 10 false negatives if that would prevent 1 false positive. Without any data to back this up, it does (at least to me) seem intuitively right that the left is more worried about jailing the innocent and the right is more worried about not jailing the guilty. (There's no need to be doctrinaire here: neither side actually approves of either situation happening, but see below for why this preference becomes important.)

That number 10 seems arbitrary, but the first question for a lot of people is “why trade off in the first place? Just make sure criminals go to jail and innocent people don’t.” Unfortunately it’s not that simple, and here’s where systems theory rears its ugly head. I won’t go into the math (mostly because I have no idea how to post an integral sign) but even for an optimal detector, if there’s any uncertainty (and there always is), there always has to be a tradeoff. A detector more capable of avoiding Type I errors is by that very fact less capable of avoiding Type II errors. In a way it makes sense: ultimately the radar operator is going to have to have a confidence-percentage cutoff under which he doesn’t launch fighters and over which he does. He can ratchet that up, and make false positives less likely, but that will mean letting more bombers through. Or he can ratchet that down, and make false negatives less likely, but he’ll end up launching fighters at geese more often.
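I can show the flavor without the integral sign, though. Here's a toy Python model -- the Gaussian blip statistics are purely my illustrative assumption, not real radar data -- in which sliding the cutoff visibly trades one error type for the other:

import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Toy model: goose blips score ~ N(0,1), bomber blips score ~ N(2,1).
# The operator scrambles fighters whenever the score exceeds the cutoff t.
for t in (0.0, 0.5, 1.0, 1.5, 2.0):
    p_type1 = 1.0 - phi(t)    # fighters scrambled at a goose
    p_type2 = phi(t - 2.0)    # bomber gets through
    print(f"cutoff {t:.1f}: P(Type I) = {p_type1:.3f}, P(Type II) = {p_type2:.3f}")

Run it and the first column falls while the second rises; no cutoff drives both to zero. An optimal detector doesn't escape the tradeoff, it just offers the best available menu of tradeoffs.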

Let me move this to another political issue that really divides the right and left in the US: voter verification. You have some detector that’s looking for fraudulent voters (the bombers, cancer, or guilty parties in the language of the other examples). Now, I say this with no polling data, but I think most people would agree: the nightmare scenario for someone on the right is that a person ineligible to vote casts a ballot (Type II; cancer not diagnosed; bomber gets through; criminal goes free); the nightmare scenario for someone on the left is that a person eligible to vote is not allowed to cast a ballot (Type I; cancer falsely diagnosed; fighters scrambled for no reason; innocent person goes to jail). Forget actual numbers of how often both of those happen; think of your liberal and conservative friends and ask yourself which situation would just make them fundamentally more upset. So, currently, states with right-leaning governments are making it harder for ineligible voters to vote by (ultimately) ratcheting up the confidence percentage the poll worker needs to allow them. But it is a mathematical necessity that doing that increases the likelihood that an eligible voter is denied a ballot.

Another interesting result from the detection and estimation class is that the real difficulty in establishing a “good” detector is deciding what “good” means. To do that, you have to be able to quantify the costs of Type I and Type II errors. I mentioned above that many of these differ qualitatively rather than quantitatively; it’s not as if one costs X amount of fuel and the other costs Y amount of fuel -- one costs fuel and the other drops a bomb on a city. To go back to the bombers, here's one way to think about assigning costs: would you change your required confidence percentage if you found out the bombers had nukes? You’d probably drop it way down and find a way to get more fuel. On the other hand, if they had very small bombs, you might judge it a better idea to push it way up and just find a way to protect your buildings and people better. Similarly, you’d probably find a high false positive rate more acceptable on a bubonic plague test than on a test for the common cold.
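In the same toy model (all numbers still illustrative assumptions), you can watch the assigned costs move the operating point. Make a miss expensive enough -- nukes -- and the cost-minimizing cutoff plummets:

import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def best_cutoff(cost_type1, cost_type2, p_bomber=0.1):
    """Scan cutoffs and return the one minimizing expected cost, with
    geese ~ N(0,1) and bombers ~ N(2,1) as before (illustrative only)."""
    best_t, best_cost = None, float("inf")
    for i in range(-300, 301):
        t = i / 100.0
        p_fp = 1.0 - phi(t)     # Type I rate
        p_fn = phi(t - 2.0)     # Type II rate
        cost = cost_type1 * (1.0 - p_bomber) * p_fp + cost_type2 * p_bomber * p_fn
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

print(best_cutoff(cost_type1=1, cost_type2=10))    # ordinary bombs: cutoff ~ 0.95
print(best_cutoff(cost_type1=1, cost_type2=1000))  # nukes: cutoff ~ -1.36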

So, how do you compare the “costs” of an innocent person going to jail versus a guilty person going free? Or of an eligible voter being denied a chance to vote versus an ineligible voter being allowed to vote? This is, ultimately, a moral and political question, and this fundamental disagreement about them does seem to track with a fundamental divide in our moral and political landscape.

Those of us on this board, as disparate as our views are, seem pretty much agreed on the Error Type preference. We agree that an innocent person going to jail is worse than a guilty person going free, that a legitimate voter denied the ballot is worse than an illegitimate voter granted one (though this particular case is so rare as to not really matter from a statistical standpoint), or in general that a benefit needlessly denied is worse than a benefit baselessly given. This is why we're DUers and not Freepers. But it also, I think, may be an occasionally enlightening way to look at the conceptual gulf that divides us from them.

EDIT: it's more complex than this, and Motown_Johnny called me out on it: you can always cast Type I as Type II by changing what you're looking for. You could say a criminal going free is a false positive on innocence rather than a false negative on guilt. Culturally, however, engineers tend to assume that anything they're trying to detect is what's bad.
28 replies
Type I and Type II errors, the left, and the right. A perspective from engineering. (Original Post) Recursion Oct 2013 OP
Great post, but too long. DetlefK Oct 2013 #1
To a barber with your beard, then (nt) Recursion Oct 2013 #3
Most might agree with you PeoViejo Oct 2013 #15
Being an English Major, I enjoyed the post and learned from it RVN VET Oct 2013 #22
It's as long as it needs to be to describe and illustrate a very important Nay Oct 2013 #21
I found it to be an interesting read Skittles Oct 2013 #28
This would be a great point if either side were completely rational Fumesucker Oct 2013 #2
Exactly in what ways, specifically, is the left not rational? ananda Oct 2013 #4
You have thirteen thousand posts on DU and have to ask that? Fumesucker Oct 2013 #9
Non ratio, sed rationis capax RVN VET Oct 2013 #23
Possibly outdated seveneyes Oct 2013 #13
"Karl Rove to resign within 24 *business* hours" Recursion Oct 2013 #14
I'm not sure if I agree that rationality is a factor here Recursion Oct 2013 #5
"What? Those climate-scientists just admitted that they are discriminating data!" DetlefK Oct 2013 #8
It's a dangerous "both sides do it" temptation, which is why I brought up qualitative differences Recursion Oct 2013 #10
In my mind at least the difference is more in what context the discrimination is made Fumesucker Oct 2013 #11
Thanks, good point. I loved that piece (nt) Recursion Oct 2013 #12
My brother is getting false positives on his cancer screenings Motown_Johnny Oct 2013 #6
Fair enough, and props for calling out my weasel-ing Recursion Oct 2013 #7
Excellent discussion point! telclaven Oct 2013 #16
Feel free Recursion Oct 2013 #17
We live in an erroneous world, infinite precision is not to be had. bemildred Oct 2013 #18
A shorter version zipplewrath Oct 2013 #19
Well put (nt) Recursion Oct 2013 #20
I agree with your dichotomy, but it seems like the issue determines the error preference. D23MIURG23 Oct 2013 #24
You're too pessimistic about improving accuracy. Jim Lane Oct 2013 #25
It's proven; it's a basic result of detection theory Recursion Oct 2013 #26
Great post (albeit long). An addition for the argument. longship Oct 2013 #27

RVN VET

(492 posts)
22. Being an English Major, I enjoyed the post and learned from it
Mon Oct 7, 2013, 10:01 AM
Oct 2013

It gave me pause to think, and provided me with a thankfully unemotional way of viewing the way the RW and LW differ in their view of problems.

There are, of course, other issues involved. Both LW and RW will naturally lean towards voices in the media and elsewhere that offer support to their basic predilections. However -- and I may be going out on a limb here, but I think it's a sturdy one -- the RW has more bullsh*t to feast on: FOX "News", little FOX "News" at CNN, mass media insistence on giving more than equal airtime to flat earthers (and birthers). As a result the RW becomes more and more convinced of things that are less and less real. A fly on the radar screen causes them to muster all aircraft.

And, of course, the RW political machine has learned that when things go bad because of things they've done, two words redirect the bad vibes: "Thanks, Obama."

Nay

(12,051 posts)
21. It's as long as it needs to be to describe and illustrate a very important
Mon Oct 7, 2013, 10:01 AM
Oct 2013

and illuminating engineering idea that applies to politics.

Fumesucker

(45,851 posts)
2. This would be a great point if either side were completely rational
Mon Oct 7, 2013, 07:24 AM
Oct 2013

The left is a long way from completely rational but the right is a damn sight further from it at the moment and probably for the foreseeable future.

Just as one example, we hear very little from the right about prosecuting economic criminals who do billions if not trillions of dollars in damage to average people, while they rant about criminals who stole food because they were hungry.

The right in this country has become more and more about class, race, and privilege; rational analysis of threats has very little to do with the way they act any more.

Fumesucker

(45,851 posts)
9. You have thirteen thousand posts on DU and have to ask that?
Mon Oct 7, 2013, 07:37 AM
Oct 2013


Just as one example: Obots vs. Firebaggers -- the battles here have been epic, and often anything but completely rational on either side.

Good grief, we even have a Conspiracy Theory dungeon on DU.

Man is not so much a rational animal as an animal that rationalizes.



 

seveneyes

(4,631 posts)
13. Possibly outdated
Mon Oct 7, 2013, 07:48 AM
Oct 2013

The notion that Communism would ever work as intended could be considered irrational. I don't see it suggested as a solution much lately, so it may have fallen off the edge by now.

Recursion

(56,582 posts)
5. I'm not sure if I agree that rationality is a factor here
Mon Oct 7, 2013, 07:30 AM
Oct 2013

An irrational person has an irrational confidence function, but still either prefers judgment or more data.

The right avoids data (Connor gets to that at patheos better than I can) and so is irrational from the perspective of our modern data-driven world, but I think the selector bias (or, to use the engineering word, "discriminator" bias, though obviously that term has political baggage) is still worth thinking about sometimes.

DetlefK

(16,423 posts)
8. "What? Those climate-scientists just admitted that they are discriminating data!"
Mon Oct 7, 2013, 07:34 AM
Oct 2013

I wonder if republicans go for more purity and democrats go for more efficiency...

Recursion

(56,582 posts)
10. It's a dangerous "both sides do it" temptation, which is why I brought up qualitative differences
Mon Oct 7, 2013, 07:38 AM
Oct 2013

But, yeah, I think we do sometimes value working systems too much, simply for their efficacy, while the right definitely values consistent systems too much simply for their consistency.

Fumesucker

(45,851 posts)
11. In my mind at least the difference is more in what context the discrimination is made
Mon Oct 7, 2013, 07:45 AM
Oct 2013

This piece in Good Reads rather supports my point, I think: the smarter you are, the easier it is to fool yourself. As Feynman once remarked, "The first principle is that you must not fool yourself, and you are the easiest person to fool."

http://www.democraticunderground.com/101675192

The result? Survey respondents performed wildly differently on what was in essence the same basic problem, simply depending upon whether they had been told that it involved guns or whether they had been told that it involved a new skin cream. What’s more, it turns out that highly numerate liberals and conservatives were even more – not less — susceptible to letting politics skew their reasoning than were those with less mathematical ability.
 

Motown_Johnny

(22,308 posts)
6. My brother is getting false positives on his cancer screenings
Mon Oct 7, 2013, 07:31 AM
Oct 2013

It has happened twice now, and when he goes into the hospital for more extensive screenings it comes back negative.


He told me Saturday that he is going to start ignoring his positives. I told him to not fall into the "boy who cried wolf" trap because someday there might really be a wolf.

My brother is to the right of center and I am pretty far left of center.

This is just one example, but in this example you seem to be wrong. He fears the Type II and I fear the Type I.



Now, in the health care debate: the left fears not doing something (Type II) and the right fears doing something (Type I).

I would say that in general you have it backward. Conservatives fear change and therefore fear taking action. Progressives embrace change and do not fear taking action.

The one exception is military intervention. Conservatives fear not taking action and Progressives fear taking action.


I don't think voter ID laws are a good example for your analogy because they are actually trying to keep things the way they were before the Civil Rights Act was passed. They have been doing this for decades. They see allowing minorities to vote as the change.

Basically, I am not buying your theory. Sorry.

Recursion

(56,582 posts)
7. Fair enough, and props for calling out my weasel-ing
Mon Oct 7, 2013, 07:33 AM
Oct 2013

You can always re-cast a Type I error as Type II, or vice versa (I could have said a criminal going free is a false positive on innocence, rather than a false negative on guilt). But from an engineering standpoint, we always assume the thing we're trying to detect is bad, so either you detect the bad thing or you don't.

 

telclaven

(235 posts)
16. Excellent discussion point!
Mon Oct 7, 2013, 08:49 AM
Oct 2013

As a stats guy, I have to applaud your analysis. Mind if I lift this for further distribution?

bemildred

(90,061 posts)
18. We live in an erroneous world, infinite precision is not to be had.
Mon Oct 7, 2013, 09:19 AM
Oct 2013

Often, one digit of precision is not to be had. And we have a strong bias towards Type I errors: better many false alarms than once getting eaten. And that is very handy from a marketing point of view. Fear can be sold many times.

But in fact, as you point out, a Type I bias is only optimal in an adversarial setting, where you are eater or eaten; it's no way to run a cooperative modern society -- you won't be able to compete. Washington DC right now is a perfect example of what that leads to.

zipplewrath

(16,646 posts)
19. A shorter version
Mon Oct 7, 2013, 09:27 AM
Oct 2013

A democrat lives in fear that someone, somewhere isn't getting something they need.

A republican lives in fear that someone, somewhere is getting something they don't deserve.

D23MIURG23

(2,850 posts)
24. I agree with your dichotomy, but it seems like the issue determines the error preference.
Mon Oct 7, 2013, 07:20 PM
Oct 2013

For instance, with climate change and other environmental issues the left seems to be the group fearing the Type II error, and the right seems to fear the Type I.

The right doesn't want to risk any inconvenience to industry unless it is absolutely necessary, whereas the left isn't worried about a few extra regulations to put off a crisis that isn't 100% certain. I guess the error preference is determined by who is most likely to be hurt by the error, and what the damage is most likely to be.

 

Jim Lane

(11,175 posts)
25. You're too pessimistic about improving accuracy.
Tue Oct 8, 2013, 03:01 AM
Oct 2013

You write, "A detector more capable of avoiding Type I errors is by that very fact less capable of avoiding Type II errors." That's the case frequently (I'll even concede usually) but not always.

For example, if you take the criminal justice system as it existed a few decades ago, and change it by the introduction of DNA-matching technology, the result is an improved detector that's more capable of avoiding both types of errors. (DNA evidence has helped convict some people and helped acquit others, and my guess is that in the overwhelming majority of those cases, the DNA evidence helped lead to the right decision.)

Recursion

(56,582 posts)
26. It's proven; it's a basic result of detection theory
Tue Oct 8, 2013, 03:04 AM
Oct 2013

Note, however, that I said an "optimal" detector. A suboptimal detector can be made more optimal, which may decrease errors of either or both types, but in a given system, even if it's not optimal, the tradeoff exists in the discriminator. (Best example I can think of: juries are a better system than trials by ordeal; they are closer to optimal. But within the context of whichever system we use, decreasing one type of error will increase the other.)
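A toy calculation, reusing the Gaussian score model from the OP (the separations below are illustrative assumptions, nothing more), shows both halves of the point at once:

import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def error_rates(separation, cutoff):
    """Toy detector: 'innocent' scores ~ N(0,1), 'guilty' scores ~ N(separation,1)."""
    p_type1 = 1.0 - phi(cutoff)           # innocent found guilty
    p_type2 = phi(cutoff - separation)    # guilty goes free
    return p_type1, p_type2

# A weak detector vs. a sharper one (think: adding DNA evidence), each run
# at the midpoint cutoff between its two distributions:
for sep in (1.0, 3.0):
    p1, p2 = error_rates(sep, sep / 2.0)
    print(f"separation {sep}: P(Type I) = {p1:.3f}, P(Type II) = {p2:.3f}")
# The sharper detector cuts BOTH error rates (0.067 vs 0.309), but sliding
# either detector's cutoff still trades one error type for the other.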

longship

(40,416 posts)
27. Great post (albeit long). An addition for the argument.
Tue Oct 8, 2013, 04:11 AM
Oct 2013

It really comes down to statistics (and now I'm going to get into some technicalities here, too).

Let's take the example of voter fraud. We already know that there is very little voter fraud. Statistics and checks and balances go back many decades; it's what election boards do after all. There are bipartisan observers throughout the process, every step of the way. I have many times served as a volunteer to do this. Many others from all parties have, too.

So given that the data indicates a very low occurrence of fraud, any imperfect system to detect fraud will necessarily have many more false positives than false negatives.

I will repeat that for emphasis. In any imperfect system to detect something (like fraud) when its occurrence is low, there will be many more false positives (detection of fraud when there is no fraud) than false negatives (actual fraud not detected).

This is basic statistics of the system. And it's the same for any other similar system (drug testing, and yes, cancer screening). In any heterogeneous population the results must be the same, just from the numbers alone. All imperfect tests (and by nature they are all imperfect) will make the same systematic errors in the same proportions.
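Rough numbers make the point (the prevalence and accuracy figures below are made-up illustrations, not real election data):

# Illustrative assumptions: 1 voter in 100,000 casts a fraudulent ballot,
# and the screen is 99% accurate in both directions.
voters = 10_000_000
prevalence = 1 / 100_000
sensitivity = 0.99   # P(flagged | fraud)
specificity = 0.99   # P(cleared | legitimate)

frauds = voters * prevalence   # 100 fraudulent voters
legit = voters - frauds

fraud_caught = frauds * sensitivity            # ~99
fraud_missed = frauds * (1 - sensitivity)      # ~1
falsely_flagged = legit * (1 - specificity)    # ~99,999

print(f"legitimate voters falsely flagged: {falsely_flagged:,.0f}")
print(f"fraudulent ballots caught:         {fraud_caught:,.0f}")
print(f"fraudulent ballots missed:         {fraud_missed:,.0f}")
# When what you hunt is rare, false positives swamp true positives: here
# over 99.9% of the 'fraud' the screen finds isn't fraud at all.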

Bayes laid this shit down in the 18th century. It was expanded by Gauss (for measurements) later. Prior plausibility is important and cannot be factored out. And there will be scatter in your measurements. If you ignore those, you don't know what you've got.

Great post, my friend.
R&K
