We have seen the future, and it sucks.

Row Over AI That ‘Identifies Gay Faces’

11th September 2017

Read it.

A facial recognition experiment that claims to be able to distinguish between gay and heterosexual people has sparked a row between its creators and two leading LGBT rights groups.

Science deniers!

The Stanford University study claims its software recognises facial features relating to sexual orientation that are not perceived by human observers.

Although human observers are pretty cunning. Google the term ‘gaydar’.

The work has been accused of being “dangerous” and “junk science”.

Sure. ‘LGBT rights groups’ are famous for their scientific expertise compared to Stanford.

Dangerous it may be; reality tends to be dangerous, and admitting reality is certainly dangerous to political prejudices.

But the scientists involved say these are “knee-jerk” reactions.

Got it in one.

2 Responses to “Row Over AI That ‘Identifies Gay Faces’”

  1. Elganned Says:

    *shrug* The proof is in the pudding.
    Line up 1,000 people randomly comprised of gays and straights; run them past the software, and see what percentage (if any) it gets right.

    There was a meme going around the internet years ago (may still be) about “You may be an engineer…” mimicking Foxworthy’s schtick about “You may be a redneck…”. The one I remember is, “If you’ve ever argued for an hour over the expected results of a test that only takes five minutes to run, you might be an engineer.”

  2. Tim of Angle Says:

    ‘The proof is in the pudding’? Jesus, you guys can’t even use clichés correctly.