
New AI can guess whether you're gay or straight from a photograph, 91% of the time

SCIENTIST 1: We have done it. This machine can detect homosexuality with 91% accuracy.

SCIENTIST 2: It has been a pleasure working with you. This has truly been a dream come true.

SCIENTIST 1: I am almost sad that it's over.

(Suddenly, the machine activates.)

MACHINE: t w o t a r g e t s i d e n t i f i e d. s c a n n i n g......

SCIENTIST 2: Oh no!

SCIENTIST 1: Turn it off!

MACHINE: s c a n c o m p l e t e. r e s u l t:
t a r g e t s s h o w a t t r i b u t e s o f h o m o s e x u a l i t y w i t h i n a 9 % m a r g i n o f e r r o r

(The scientists are stunned, but also, in love.)

SCIENTIST 1: Did... did you know?

SCIENTIST 2: No... I mean, maybe. But I thought I just loved science.

SCIENTIST 1: It seems the answer we were looking for was right in front of us.

SCIENTIST 2: All along...

SCIENTIST 1: Perhaps our experiment is just beginning.

SCIENTIST 2: I am eager to work with you again in this pursuit.

(They hold hands.)

(The robot beeps, knowingly.)
 
For a coin flip to guess it with 50% certainty, half of the population would have to be gay.

If you select a sample from the population where 50% of them are gay then you can test your computer's accuracy relative to random guessing which would have a 50% success rate. That seems like the only way to do it.
 

Ether_Snake

安安安安安安安安安安安安安安安
I think the point is to try and prove there are biological differences between gay and straight people???? but it really does come off as "gays totally look gay lulz."

It only comes off as that as a result of the comments posted here.
 
I think the point is to try and prove there are biological differences between gay and straight people???? but it really does come off as "gays totally look gay lulz."

Not a deep learning expert, but that seems like quite a subpar basis for such an algorithm.
It's also absurd how angry people can get over such a trivial application of deep learning.
 

N3DS

Member
The article does mention some implication of this study:

The paper suggested that the findings provide “strong support” for the theory that sexual orientation stems from exposure to certain hormones before birth, meaning people are born gay and being queer is not a choice. The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid.
 
So, the real question here is where can I start to upload pictures of my friends?



Okay, that was a bad joke; I don't have friends in real life :(
 

M3d10n

Member
Even if irrefutable proof of a biological basis for homosexuality is found, some religious groups have already prepared for it: after countless failures at praying the gay away, they now tell gay people to simply remain permanently celibate to avoid acting on their "sinful urges".
 

shandy706

Member
You know what, this literally proves that being "gay" isn't a "choice". The physical body shows "traits" we may not even notice... and I find that pretty awesome.

Though rather than this proving that to horrible human beings, I suppose they'd just use it against people... derp.

Like the article said, it's not 100%... but I find the things the AI picks up on and uses fascinating (and pretty obvious).

90+% is darn impressive.
 

Neoweee

Member
Accuracy is such a terrible measure for this. If you're being fed a normal ratio of people, just guess Straight. You'll be right about 95% of the time, crushing most algorithms that aim for nuance.
 

Eridani

Member
Accuracy is such a terrible measure for this. If you're being fed a normal ratio of people, just guess Straight. You'll be right about 95% of the time, crushing most algorithms that aim for nuance.

1.) It's not accuracy, it's AUC.

2.) The dataset is balanced, so a default classifier would only have an accuracy of 50% anyway.
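To make the balanced-dataset point concrete, here's a quick sketch (my own toy numbers, not from the study): a classifier that always guesses "straight" looks great on a population-like class ratio, but drops to coin-flip accuracy on a 50/50 sample, so on a balanced dataset any lift above 50% reflects real signal.

```python
# Why a balanced dataset matters: accuracy of a "default" classifier
# that always predicts the majority class.

def baseline_accuracy(labels, guess="straight"):
    """Accuracy of a classifier that always predicts `guess`."""
    return sum(1 for y in labels if y == guess) / len(labels)

imbalanced = ["straight"] * 95 + ["gay"] * 5   # population-like ratio
balanced   = ["straight"] * 50 + ["gay"] * 50  # the study's design

print(baseline_accuracy(imbalanced))  # 0.95 -- looks impressive, learns nothing
print(baseline_accuracy(balanced))    # 0.5  -- coin flip, as it should be
```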
 

Despera

Banned
Honestly, I don't think being wrong 9% of the time on 35,000 images really means all that much. Basically this confirms that no one should be making any assumptions, even with the aid of a computer. You could be wrong. It's private unless someone wants it known; don't make assumptions.
hmmm

What if the 9% are just gay people in denial of their sexual orientation? The AI could still be right...
 

Moose Biscuits

It would be extreamly painful...
Even if irrefutable proof of a biological basis for homosexuality is found, some religious groups have already prepared for it: after countless failures at praying the gay away, they now tell gay people to simply remain permanently celibate to avoid acting on their "sinful urges".

Hasn't that been the stance of the Church for the longest time anyway?
 

Wildo09

Member
Fascinating. But what's the purpose of this research? Just cos?

It's right there in the OP.

The paper suggested that the findings provide “strong support” for the theory that sexual orientation stems from exposure to certain hormones before birth, meaning people are born gay and being queer is not a choice. The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid.

They want to find out whether sexuality is correlated with exposure to certain hormones before birth.
 

ZOONAMI

Junior Member
1.) It's not accuracy, it's AUC.

2.) The dataset is balanced, so a default classifier would only have an accuracy of 50% anyway.

What if the AI is so smart it knows that data set is balanced ahead of time, because that would be the easiest way to have a proper sample, and is basically just guessing half gay half straight?

Yeah I know it wouldn't actually work, would still just come out at 50% or so.

What is interesting here: did the AI come up with the criteria for classifying, or did the people who programmed the AI? Because in the latter case it's really just people deciding what features are anecdotally more likely to make someone be considered gay upon close observation, and the AI is just good at discerning those features from photos. Essentially just doing the work more quickly and crunching the numbers with less hemming and hawing than people.
 
Errr, this is a pointless objection. If you believe sexuality doesn't matter, then who cares what this algorithm classifies you as? It's basically just a tool that identifies certain traits... if that doesn't matter to you, cool. The purpose it serves is very simple: it's an AI classifier, which can be analyzed and improved, etc.
It doesn't matter to me, but I'd imagine a good number of people wouldn't like to be classified as anything by a machine.
 

MogCakes

Member
An uncomfortable fact but it is what it is. Our choice of fashion and appearance sends out messages we may or may not intend about ourselves. I wonder if AI will be able to spot bigots next.
 

Minarik

Member
This is really dumb and pointless

Sexuality isn't a look or a way of life; it's just what kind of person you like. Even if this works, what actual purpose does this thing serve?

Kinda pointless, but apparently it is somewhat of a look if the thing works with that high of an accuracy from a photograph. Personally, I'm kinda glad there might be some physiological signs to combat the idiots that think sexuality is a choice.
 

Eridani

Member
What if the AI is so smart it knows that data set is balanced ahead of time, because that would be the easiest way to have a proper sample, and is basically just guessing half gay half straight?

Yeah I know it wouldn't actually work, would still just come out at 50% or so.

What is interesting here, did the AI come up with the criteria for classifying or did the people who programmed the AI? Because then it's really just people deciding what features all anecdotally more likely to have people consider that person gay upon close observation, and then the AI is just good at discerning those features from photos. Essentially just doing the work more quickly and crunching the numbers with less hemming and hawing than people.

50% is the lowest you can go in 2-class classification. If you get less than 50%, you can just tell the classifier to choose the opposite of what it was going to choose and get more than 50%.

Not sure specifically how features were handled in this case, since I just glanced over the study, but deep learning is generally capable of coming up with features without human help, which is one of its biggest strengths. There are also a lot of algorithms used in face recognition that do extract specific features (things like cheek shape and nose shape), but even then you usually extract a whole bunch of features and then let the AI figure out which subset to use. You also definitely never explicitly state "this feature is more likely to be present in this specific class of images", because then it's not really AI anymore.
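The "choose the opposite" point above can be sketched in a few lines (toy labels of my own, nothing from the study): inverting a worse-than-chance two-class classifier always lands you above 50%, which is why 50% is the effective floor.

```python
# In two-class classification, a classifier with accuracy below 50%
# can be inverted to get accuracy above 50%.

def accuracy(preds, labels):
    """Fraction of predictions matching the true labels."""
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

labels = [0, 1, 0, 1, 1, 0, 1, 0, 1, 1]
bad    = [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]  # a worse-than-chance classifier

flipped = [1 - p for p in bad]           # always predict the opposite

acc_bad = accuracy(bad, labels)
acc_flipped = accuracy(flipped, labels)
print(acc_bad)      # 0.1
print(acc_flipped)  # 0.9 -- flipping guarantees acc_flipped == 1 - acc_bad
```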
 
I'm guessing this primarily works because the people in the photos confirm whether they are gay or straight, and are not closeted or in denial in any way. A closeted person might dress or groom or carry themselves differently in an effort to hide it.

In other words I would guess this isn't a "gay detection machine" you can point at any photo to determine secrets, something that would make politicians or religious people nervous. They're likely not going to look gay to the AI, and will say "nope I'm not gay, see, it works." The people who look gay to the AI are the kind of people who would tell the scientists and researchers "yes, I am gay."
 
Damn progressive GAF, when did you get so anti-science? The OP lists valid scientific reasons for doing this kind of research. It strengthens support for the claim that homosexuality is by birth, not choice. Plus, it's interesting to see that humans do so poorly compared to the AI.
 

cromofo

Member
Some people look, behave, dress, and go about their way in a certain fashion. Nothing new.

This AI seems to have it on point. Quite an amazing achievement.
 

Oppo

Member
SCIENTIST 1: We have done it. This machine can detect homosexuality with 91% accuracy.

[...]

(The robot beeps, knowingly.)

pretty good.

I was going to do a much darker version of this, taking place in a surveillance control room in Saudi Arabia.
 

Eridani

Member
I'm guessing this primarily works because the people in the photos confirm whether they are gay or straight, and are not closeted or in denial in any way. A closeted person might dress or groom or carry themselves differently in an effort to hide it.

In other words I would guess this isn't a "gay detection machine" you can point at any photo to determine secrets, something that would make politicians or religious people nervous. They're likely not going to look gay to the AI, and will say "nope I'm not gay, see, it works." The people who look gay to the AI are the kind of people who would tell the scientists and researchers "yes, I am gay."

They actually address this in the author's notes:

“This must be wrong; you used a sample of openly gay/straight people!”

We could not think of an ethically sound approach to collecting a large number of facial images of non-openly gay people.

Thus, we were worried that the images obtained from a dating website might be especially revealing of sexual orientation. However, this did not seem to be the case.

First, we tested our classifier on an external sample of Facebook photos. It achieved comparable accuracy, suggesting that the photos used here were not more revealing than Facebook profile pictures.

Second, we also asked humans to judge the sexual orientation of these faces, and their accuracy was no better than in the past studies where humans judged sexual orientation from carefully standardized images taken in the lab. This suggests that the images used here were not especially revealing of sexual orientation—at least, not to humans.

Finally, as mentioned before, the deep neural network used here was specifically trained to focus on fixed facial features that cannot be easily altered, such as the shape of facial elements. This helped in reducing the risk of the classifier discovering some superficial and not face-related differences between facial images of gay and straight people used in this study.

Like they say, it's obviously very hard to test on closeted people with a statistically significant sample size, but it's not unreasonable to assume it would work.
 

Neoweee

Member
1.) It's not accuracy, it's AUC.

2.) The dataset is balanced, so a default classifier would only have an accuracy of 50% anyway.

Awesome, thanks for the info. The article was really bad at describing the actual underlying stats work.
 

Daedardus

Member
I want to try this on myself because I'm pretty sure it will say I'm gay due to how feminine I can look sometimes, even though I'm straight. Guess the AI doesn't know that gender expression and sexuality are not the same.
 

Peltz

Member
I can already see how conservatives will interpret this:

 
I want to try this on myself because I'm pretty sure it will say I'm gay due to how feminine I can look sometimes, even though I'm straight. Guess the AI doesn't know that gender expression and sexuality are not the same.

"The data also identified certain trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men"

It's not just looking at "gender expression" but actual physical traits.
 

PK Gaming

Member
This is interesting but also depressing in that there are literal discernible differences between gay and straight people. I'm probably overthinking it, but I could see this information being used to "other" gay people even more.
 

Stumpokapow

listen to the mad man
The 91% accuracy apparently is the AUC value (quote taken from the study):

And the dataset was balanced:

From a quick glance, I don't see anything particularly wrong with the methodology here. Also, for those that are interested, the author's notes go over some interesting things in a fairly understandable manner, so people interested should give them a read. For example, about the results:

Quick followup -- I had missed the preprint link initially when I commented, so I was going off the coverage and the abstract. Thanks for the followup; it does sound like the inference is closer to correct (and I'm glad they include a discussion of how thresholding impacts the tradeoff between precision and recall, which is a basic thing you'd think every ML paper does). I'd have to think a little more about how the paper's balanced-sample design would impact true out-of-sample testing -- for example, it's not clear the AI wouldn't just end up guessing everyone is straight in a design without a balanced sample.

I also maintain that their dismissal of the sampling concerns is ungrounded. This is an AI that can tell the difference between dating site photos of gay and straight persons, not between photos of gay and straight persons in general. Even granted that they've identified morphological features as being predictive, I would like to see true out-of-sample testing with, say, randomly harvested non-dating photos.
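The thresholding tradeoff mentioned above is easy to sketch (toy scores and labels of my own, not the paper's data): a classifier outputs a score per image, and moving the decision threshold trades recall for precision.

```python
# Thresholding a scored binary classifier: stricter thresholds call
# fewer positives (higher precision, lower recall); laxer thresholds
# call more (lower precision, higher recall).

def precision_recall(scores, labels, threshold):
    preds = [s >= threshold for s in scores]
    tp = sum(p and y for p, y in zip(preds, labels))          # true positives
    fp = sum(p and not y for p, y in zip(preds, labels))      # false positives
    fn = sum((not p) and y for p, y in zip(preds, labels))    # false negatives
    precision = tp / (tp + fp) if tp + fp else 1.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]  # classifier confidence
labels = [1,   1,   0,   1,   0,   1,   0,   0]     # ground truth

print(precision_recall(scores, labels, 0.75))  # strict: (1.0, 0.5)
print(precision_recall(scores, labels, 0.15))  # lax: (~0.571, 1.0)
```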
 

caliph95

Member
Phrenology is back baby … awooo
Not even close
This is interesting but also depressing in that there are literal discernible differences between gay and straight people. I'm probably overthinking it, but I could see this information being used to "other" gay people even more.
Look at it another way: it does prove that it isn't just a choice.

Though I'm worried about what would happen if less desirable people got hold of it.
 

Peltz

Member
This is interesting but also depressing in that there are literal discernible differences between gay and straight people.
How is that depressing? Isn't it liberating?

It had been argued for decades that sexual orientation has biological origins and is not a "life choice". This data certainly supports such arguments.

I also read research that showed that homosexual men consistently have a more nuanced and refined sense of smell (like women) than heterosexual men. That data also supports the same argument.

People cannot control their genetics, so this makes the case that homosexuality is not a choice, and also that people should not be penalized or disadvantaged in life for that which they didn't choose (such as sex, race, physical impairments, etc.).

This appears to be just one of many types of genetic traits that deserves equal treatment to that of the majority trait (heterosexuality).
 