Everyday Erinyes #174

 Posted at 8:08 am  Politics
Jul 13, 2019
 

Experts on autocracies have pointed out that it is, unfortunately, easy to slip into normalizing the tyrant, hence it is important to hang on to outrage. These incidents, which seem to call for the efforts of the Greek Furies (Erinyes) to come and deal with them, will, I hope, help with that. As a reminder, though no one really knows how many there were supposed to be, the three names we have are Alecto, Megaera, and Tisiphone. These roughly translate as “unceasing,” “grudging,” and “vengeful destruction.”

I suspect that everyone who read it remembers the article in which I asked the Furies to take on facial recognition technology, since it was fairly scary. If it scared you, you are not alone. The people of Detroit, and, in particular, the people of color of Detroit, are pretty nervous at this point.

Let me start with some background: Detroit already uses facial recognition technology on cameras at over 500 traffic lights. What is now being considered is employing facial recognition technology in neighborhood policing, specifically for violent crimes (after the fact) or in cases where terrorism is suspected. “After the fact” as used here means:

The chief said facial recognition technology is only used after-the-fact, when police have already identified a suspect on video. A still image of that suspect would be fed into the software and compared with police mugshots and pictures in the Michigan Secretary of State database, and on social media. 

The chief also compared the use of the technology to the use of a police sketch.

“That [recognition by the technology] by itself is not good enough,” he said. “This is no different from a sketch, which police have used for years. It’s a tool we use to get violent suspects off the streets. Is there a chance you may arrest the wrong person? Yes. But it won’t be used as the sole evidence against someone.” 
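For anyone curious what “fed into the software and compared” looks like in practice, here is a minimal sketch of an after-the-fact search, assuming a hypothetical embed_face() model, a made-up similarity threshold, and toy data; it is meant only to show the general shape of such a system, not Detroit’s actual software.

```python
# Illustrative sketch only -- NOT Detroit's system. embed_face() stands in
# for a trained face-embedding model; the threshold and data are invented.
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a face-embedding neural network.
    Here we just flatten and normalize pixels so the sketch runs."""
    v = image.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def search_database(probe, database, threshold=0.7):
    """Return (record_id, similarity) pairs above threshold, best first."""
    probe_vec = embed_face(probe)
    hits = []
    for record_id, photo in database.items():
        similarity = float(np.dot(probe_vec, embed_face(photo)))  # cosine
        if similarity >= threshold:
            hits.append((record_id, similarity))
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

# Toy usage: random arrays stand in for a video still and a mugshot database.
rng = np.random.default_rng(0)
probe_image = rng.random((64, 64))
mugshots = {f"record-{i}": rng.random((64, 64)) for i in range(5)}
for record_id, score in search_database(probe_image, mugshots):
    print(record_id, round(score, 3))  # candidates for a human to review
```

Tellingly, with features this crude, all five unrelated random “mugshots” come back as matches, which is exactly why a hit from such a system can never be more than a lead for a human to verify.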

If you have read as many news stories as I have in which innocent people of color were falsely stopped because they “fit a profile,” stops which too often ended in tragedy, you might be tempted to say that facial recognition technology could hardly do a worse job than humans do on this point. But you would be wrong. There have been tests in which facial recognition technology was unable to identify Oprah Winfrey and Michelle Obama. Apparently the technology can say “all those people look alike” as well as a human can, or maybe even better.

As a demonstration – and this is NOT intended to excuse anyone; on the contrary, it is to show how much MORE care we must take when something is difficult to see – here is a small part of a screen shot from a “hidden object” game, where one is supposed to find objects more or less hidden in plain sight. Within this screen shot there is a silhouette of a common swift (the bird shown in the smaller cut). I am not going to swear it is in the same position as in the smaller picture, because I never did succeed in seeing it. But a hint allowed me to click on it (after I took the screen shot), so I know it is there. Bottom line: it is just harder to see things when there is not a lot of contrast in the lighting, and apparently this also applies to recognition technology. But that is ALL THE MORE reason why no one should ever rush into its use. San Francisco – hardly an enemy of technology in general – has outlawed its use in law enforcement for exactly this reason.

Back in Michigan, the state legislature is considering a five-year moratorium on the use of facial recognition technology – the bill was introduced in the state house Wednesday by a Democratic representative from Detroit. I don’t know whether he failed to mention the bill to the police chief and police commissioners, or whether they decided to try to get their new technology in place before the bill could go into effect. But it would seem to me that Thursday, then, was not the best time to consider adopting that technology. It would seem instead a good time to postpone such discussion, so as not to commit to wasting money on a program which might have to be scrapped. But maybe that’s just me.

Well, me and Commissioner Willie Butler, elected from District 6 in Detroit. He attempted three times to ask questions of the chair on behalf of his constituents (many of whom were present, wearing masks, to protest the technology – not that all the protesters were from his district, of course), and the third time the chair ordered him removed.

He was in fact physically removed by police officers and, once out of the meeting room, was arrested, handcuffed, and taken to jail. (I hope Trump doesn’t see this … but I kind of wish Congress would.) He was released Friday morning on bail, posted by another commissioner. So we are now up to me, Commissioner Butler, the commissioner who bailed him out, and a substantial number of protesters.

My head hurts.

Alecto, Megaera, and Tisiphone, I already asked you in March to look into facial recognition technology and the ways it is being used and misused. I now reiterate that request, but I also suggest that there is something to find in Detroit that goes beyond disagreements over technology, if they are willing to lock up, for heaven’s sake, a police commissioner who disapproves of its use.

The Furies and I will be back.

Cross posted to Care2 HERE.


  11 Responses to “Everyday Erinyes #174”

  1. Is the bird at the lower left corner?  Is it true that any butt with cheeks spread will open Trump’s iPhone?  Seriously, it’s fine for a phone, but not good enough for police work.  The abuse that Rep. suffered is intolerable!

    Thanks JD!

    • It is within the blue (violet) rectangle in this cut, which shows more area.  The arch on the right of my original cut is about where the dotted green line is in this one.  You can also compare the column on the left for height.  No, I wouldn’t say lower left corner.  I’d say about the middle, maybe slightly left of center.  I can’t be 100% positive since I don’t know for sure how big it is.

      Yeah, I’d say the technology is about the iPhone’s speed at this point.

  2. I agree with Gene’s comment on this subject. Profiling through facial recognition has its setbacks, and it makes me feel uncomfortable, like Mr. Butler’s horrible experience after questioning the chair.

    True story here: Years ago, while I was working at the middle school, a teacher came up to me and asked why I had been so belligerent and rude to one of the front office staff, and why I had left the campus in a huff. I had no idea what she was talking about… and a while later, the same teacher who had confronted me apologized to me about it. Obviously, somewhere in this line of thinking, it wasn’t ME, but a woman who looked like me, talked like me, and walked like me. We had no cameras on campus at the time, but I was falsely blamed for another woman’s actions/behavior. I never found out the lady’s name until much later, but I joked (thereafter) that I must have had a twin lurking about. So, to point out: quick judgments, whether by sight/sound or technology, are not foolproof.

    Unfortunately, I believe that PoC folks have it much worse than my personal experience suggests, if the technology can’t tell the difference between Ms. Winfrey and Ms. Obama.

    I understand how useful technology can be, but it’s not an airtight answer either.

    • I had a “double” when I was in the Marines, at my first duty station.  She was one rank senior to me and about 2″ shorter, but, apart, people didn’t notice.  We were able to use it to our advantage after we figured it out (which didn’t take long).  We both worked in specialties about which people tend to complain a lot, so we would each listen to the other’s complaints, let the complainers run down, and only then tell them, “That’s not me.”  Your experience was much less pleasant – sorry that happened.

      And, yes, it would be hard to disagree with Gene.

  3. Thank you, Joanne (and Furies), for the post!!

  4. Agree with Gene. I have never cared for seeing these new facial recognition systems anywhere, and I’ve been signing petitions to ban them and keep them from being used. Like Pat’s experience shows, it can get innocent people in trouble.

  5. Facial recognition software is not only dodgy, it is a blatant violation of the U.S. Constitution, specifically, the 4th Amendment that is supposed to protect our right to be “secure in [our] persons.”

  6. In principle, there’s nothing wrong with facial recognition software (FRS), but everything with the way it is used. Facial recognition technology is a tool, and any tool can be used wrongly and can be dangerous in the wrong hands. For a long time police put a lot of trust in witness testimony and line-up identifications, but studies have shown how unreliable they were, no matter how honest and reliable the witnesses themselves were. FRS can be helpful, as long as the user understands that it has its limitations and could point to an innocent face.
    Given the bad reputation many police forces have on racial profiling, and the larger margin of error for darker skin colours, FRS should never be the only ‘evidence’ the police act upon. As a rule, there should be other evidence linked to that person first, before FRS is used to confirm the identity.

  7. I think we need to revisit a general assumption that technology is going to solve a great many of our concerns and problems.

    A study from the Massachusetts Institute of Technology illuminates the problem with that assumption:

    Now a new study from researchers at the M.I.T. Media Lab has found that Amazon’s system, Rekognition, had much more difficulty in telling the gender of female faces and of darker-skinned faces in photos than similar services from IBM and Microsoft. The results raise questions about potential bias….

    Facial recognition is a scary frontier.  It basically asks us: Are you willing to give up your privacy to catch a few criminals?

    • As you point out, there are differences in competence between different technologies designed to do the same thing.  That’s not surprising.  You probably remember the old acronym GIGO: garbage in, garbage out.  Sounds like Amazon’s programmers are putting more G in than the others cited.
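      To make the GIGO point concrete, here is a minimal sketch of the kind of per-subgroup audit the M.I.T. researchers performed: computing a classifier’s error rate separately for each demographic group instead of one overall number. The subgroups and records below are invented for illustration, not data from the study.

```python
# Minimal per-subgroup accuracy audit, in the spirit of the M.I.T. Media Lab
# study quoted above. All records here are made up for illustration.
from collections import defaultdict

# (subgroup, true_label, predicted_label) -- toy gender-classification results
predictions = [
    ("lighter-skinned male",   "male",   "male"),
    ("lighter-skinned female", "female", "female"),
    ("darker-skinned female",  "female", "male"),    # the kind of miss
    ("darker-skinned female",  "female", "female"),  # the study measured
    ("darker-skinned male",    "male",   "male"),
]

totals = defaultdict(int)
errors = defaultdict(int)
for subgroup, truth, predicted in predictions:
    totals[subgroup] += 1
    errors[subgroup] += (truth != predicted)

# An overall error rate would hide the disparity; per-subgroup rates expose it.
for subgroup in sorted(totals):
    rate = errors[subgroup] / totals[subgroup]
    print(f"{subgroup}: {rate:.0%} error over {totals[subgroup]} sample(s)")
```

Run on real benchmark photos rather than toy rows, this is how auditors can show that a system which looks accurate “on average” still fails particular groups far more often.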
