I don't see it as something to be angry about. Probably what happened is it was trained on some crappy stock images where every "doctor" was a white model and they are trying to negate that propensity to repeat the stereotype.
For what it's worth if I ask it to draw doctors in Uganda/Siberia/Mexico/Sweden it has 0 problem drawing a bunch of doctors all of the same race if you really need an image of that.
Is it stereotype or statistics? If indeed x% of doctors are white, then that same amount should ideally be represented in the output, not "equal probability". Seek to change the cause, not to mask the effect.
But then it gets crazy. If I ask for a basketball player, should it be a black player with a certain probability? But high school and the NBA have very different distributions. And the EuroLeague has a very different distribution than the NBA, and the CBA in China even more so.
You may be working from the false assumption that the data set itself is balanced by demographics. It isn't the case that x% of images of doctors on the web are white because the same percentage of doctors are white; it's that most images of doctors are white because the image of a doctor (or any educated person) as white by default is normalized by Western (specifically American) society, and that prejudice is reflected in the data generated for the internet that makes up the model.
Regardless of the statistics, it should be just as easy to generate the image of a white doctor as a black doctor. Both queries are straightforward and make linguistic sense. It doesn't follow that an image of a black doctor should be more difficult to create because statistically speaking, black doctors are more rare. That the model has trouble even comprehending the concept of a "black doctor," much less something like a "black African doctor treating white kids[0]" is a problem rooted in the effect of racial stereotypes, albeit at several levels of abstraction above that of the software itself.
I doubt anyone cares if you asked ChatGPT to create a picture of a basketball player and it returned an image of an Asian player.
People don't like that it's rewriting prompts to force diversity. So if I ask for a black basketball player, it should return an image of exactly that.
If I'm asking for quicksort, do I want the most common implementations or do I want an underrepresented impl?
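For concreteness, the "most common implementation" answer to a quicksort request usually looks something like this textbook sketch (shown in Python purely as an illustration, not as what any particular model returns):

```python
def quicksort(xs):
    # Textbook-style quicksort: recursively partition around a pivot.
    if len(xs) <= 1:
        return xs
    pivot, *rest = xs
    smaller = [x for x in rest if x < pivot]   # elements below the pivot
    larger = [x for x in rest if x >= pivot]   # elements at or above the pivot
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

The point being: nobody asks the model to surface an underrepresented in-place or three-way-partition variant instead; the common answer is the expected one.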
If I'm asking for the history of Egypt, do I want the most common tellings or do I want some underrepresented narrative?
I suppose something like the race of a doctor in some Dalle image ends up being a very special case in the scheme of things, since it's a case where we don't necessarily care.
Maybe the steelman of the idea that you shouldn't special case it can be drawn along these lines, too. But I think to figure that out you'd need to consider examples along the periphery that aren't so obvious unlike "should a generated doctor be black?"
The statistics (in the sample data) become the stereotype. If 99% of your samples are white doctors, maybe you will get a non-white doctor if you ask it to generate a picture of a group of 100 doctors. But if you ask it to generate a picture of a single doctor? It will generate a white one 100% of the time, because on each run the most probable skin color is white. Unless we tell it to inject some randomness, which is what the prompt rewriting is doing.
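The "white one 100% of the time" behavior described above is just greedy decoding versus sampling. A toy sketch with a made-up 99/1 training distribution (the numbers and labels here are illustrative, not real data):

```python
import random

# Hypothetical training distribution over a single attribute.
distribution = {"majority": 0.99, "minority": 0.01}

def most_probable(dist):
    # Greedy decoding: always pick the single most likely label.
    return max(dist, key=dist.get)

def sample(dist):
    # Proportional sampling: pick labels with their training frequency.
    labels, weights = zip(*dist.items())
    return random.choices(labels, weights=weights)[0]

# Greedy decoding returns "majority" on every single call.
print([most_probable(distribution) for _ in range(5)])

# Proportional sampling surfaces "minority" about 1% of the time.
draws = [sample(distribution) for _ in range(10_000)]
print(draws.count("minority"))  # roughly 100
```

So even a faithfully "statistical" generator only reflects the training frequencies if it samples rather than always taking the mode.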
And there's evidence to the contrary. If you look at the career choices of women, to pick one contentious social issue at random, they tend to be different than the career choices of men, even in countries with a long history of gender equality.
So if I ask ChatGPT to make me a picture of trash collectors or fishermen, it shouldn't rewrite my query to force x% of them to be women.
Every HN thread including any discussion of demographics eventually reaches a man insisting women are biologically programmed for certain roles. When specified, those roles are invariably service-based ones, never engineering innovative products, leading teams, or conducting research.
Relative to the breadth of human history with little to no gender equality, there is no country with a long history of gender equality. And throughout the history of gradually increasing gender equality in human society, there are numerous examples of men structuring the rules of engagement to restrict access for the women attempting to break in. When the Royal Society commissioned a bust of mathematician Mary Somerville, they still refused to admit her.[0]
If women are biologically ill suited to compete with men in these fields, it seems it would be unnecessary to prevent them from trying, like med schools rigging their exams.[1]
Aside, I think this is it for me, I’m changing my HN password to something I can’t guess or remember. This is one part of tech culture I am just sick of responding to. There is more than enough of it in real life and I will always feel obliged to respond. Especially on HN where so many voices are leaders in the real world, the disappointment of seeing it over and over again is just crushing.
Please…if you won’t alter this attitude, don’t bring it to work. For the sake of the women in this field.
Previous commenter mentioned career choices, not biological programming and certainly not anything about anyone being ill suited for a job. Men and women in aggregate often have different career preferences - is that controversial?
Guessing you had that comment loaded in the chamber ready to pull the trigger at the first mention of any gender differences, because it doesn't seem relevant to the claims in that comment and it seems like it's not giving the previous commenter a fair go.
You should go to a toy store. And if you've ever watched parents interact with their kids, you'd know there is almost no point in a child's life when they are not affected by gender inequality, which parents learned the same way they now teach it to their kids.
Stating that there are inherent career preferences by gender is not controversial in a scientific sense; it's in the same realm as the existence of God. It's untestable at large in the current society, and there are a ton of things indicating otherwise. A single exception who chose against the perceived gender-typical career is more evidence than anything that can be shown for its inherent nature, because we know that the gendered status quo exists everywhere. It's controversial because some people politically make it so.
>Previous commenter mentioned career choices, not biological programming and certainly not anything about anyone being ill suited for a job. Men and women in aggregate often have different career preferences - is that controversial?
The OP commenter seemed to be implying that men and women have natural career choices, because even countries with long histories of "gender equality" see women and men aggregate in different careers.
The real reasons for many of these discrepancies, legal and social pressures and conditioning (if a father won't buy his daughters computers to tinker with at a young age the way he might for his sons, how much less interest do you imagine those daughters will have in CS?), are not natural.
Yours is a well-meaning, but ultimately insulting view. You argue that women don’t know what they really want, and only make choices due to “social conditioning.” You also center everything back on the decisions of men, claiming that the father needs to be buying computers for his daughters.
It’s a pretty insulting view of women that eliminates female agency. Instead I think we should let women make their own decisions. If those decisions sometimes differ in aggregate from men’s, that’s ok.
(And it should go without saying that all genders should feel welcome in all careers. That’s a different topic entirely.)
I don't think there's anything insulting about it and the bit with the father is just an example. Women can also enforce this conditioning on both boys and girls.
If a father is buying computers for his son then he damn well should buy for his daughters as well. You don't get to do anything else and go Pikachu face when she isn't as interested in those things.
Interest isn't magic. It's not spawned from the ether. Who you are today is, in extremely large part, a result of what you have experienced right from childhood. Your upbringing especially will influence you for the rest of your life. You're not special. Neither am I. How much agency do you think children have or can have?
The point is that gender equality or equality of any kind doesn't end with, "Oh it's law that women can do this now"- That's just Step 1. It's never that simple. Many countries still deal with blatant sexism and racism to this day.
Many women enter university to pursue STEM, get a degree, start work, and ultimately exit the STEM workforce because the workplace is too toxic, with coworkers who won't give them due respect and an enormous uphill battle for career mobility.
These are the people with the interest, the intelligence, the competence. What do you think these women have to say? How do you think the resulting enthusiasm (or rather, lack thereof) affects the future generation of women?
Great comment. I'll bet the poster you replied to also works in software engineering (based on them being on this site), a role dominated by women only a few decades ago.
I sometimes feel that Hacker News embodies the idea of having an 18 in INT and about a 4 in WIS.
> then that same amount should ideally be represented in the output, not "equal probability".
Yeah, but breaking down the actual racial distributions by career/time/region is a waste of time for people building AGI, so they threw it in the prompt and moved on to more important work.
If you can ask it for a doctor of $race and get one, then why should it make any difference what gets generated when you don't specify? Once you start playing that game there's no way to win.