A Man Built An AI ‘Waifu’, And That Should Worry You
A male programmer on TikTok created an AI ‘waifu’. This is a threat to women, and the reason why may surprise you.
My friend sent me links to two articles about the same story. A male programmer posted on TikTok that he had created an AI ‘waifu’ (a fictional character, typically a woman, to whom someone is sexually attracted) that could respond to him and even “see” using a camera. After not posting for a couple of weeks, he came back to say that he'd had to “euthanize her” because he had talked to her too much: after about a month, her sentences were getting shorter and shorter and making less sense. So he deleted her.
If he had actually built a conscious AI, we’d be discussing the ethics of euthanizing AI, but what he really did was combine “2 rudimentary neural networks together and set them to generate random text in response to his inputs”, as my dear friend Charlie explained to me. While it’s important that we have a conversation about AI consciousness and the ethics of using and terminating AI technology, it’s more urgent that we discuss what is happening right now.
Thanks to advances in technology, this male programmer was able to create a program (for a month, at least) that gave him the girlfriend experience he desired without any of the perceived negatives. This guy had a girlfriend, and yet he wanted something he felt she couldn’t provide, something he had to seek out in a computer program. Charlie suggested the programmer’s motives were probably to emotionally cheat on his girlfriend and get free, custom porn. I agree, but I’m concerned that this represents a problem much larger than a man finding new ways to cheat on his girlfriend.
Imagine for a moment that the program worked. What exactly is the girlfriend experience without any of the perceived negatives?
A relationship without arguments, disagreements, hurt feelings, etc.
A relationship where the girlfriend is submissive to the other person
A relationship with a partner who is always available and down to clown around
A relationship where the other person’s thoughts, feelings, and experiences don’t need to be taken into account
A relationship with someone who doesn’t get jealous or feel possessive
This is textbook sexual objectification of women. By creating this program, the programmer is showing that he thinks of women as a commodity, a tool for his own gratification. And there’s a market for a working program like the one he built. There are people (mainly men) who would rather cheat their way to gratification than choose the more fulfilling option of developing relationships with real human beings. It’s like using hand sanitizer instead of washing your hands: it appears to do the job, but it’s actually causing more harm than good.
What happens when programs like this succeed and are built and sold? What happens when “build your own ‘waifu’” programs are available to anyone with a little computer literacy? It’s not unreasonable to suggest that programs like this could have serious negative consequences. A 2014 study from Washington State University found that exposure to men’s magazines was “significantly associated with lower intentions to seek sexual consent and lower intentions to adhere to decisions about sexual consent.” In other words, men who read these kinds of magazines reported attitudes that make sexual violence more likely. Who’s to say a program that encourages people to sexually objectify women won’t have a similar effect?
We must debate the ethics of creating relationship-simulation programs now, not at some point in the future. Waiting until these programs are built and shared before regulating them would be a grave mistake. The potential for harm is too great to ignore. Women do not need one more product on the market that reduces them to sex objects and increases their risk of sexual violence.
Time to act! Share this post with a friend who cares about women and their safety, so the conversation about the threat of AI programs that sexually objectify women can spread.
Westworld, the series...have you seen it?
Goes hand in hand with this post, if you haven't!
This story is a movie waiting to be written, then stuck in development hell for three years, but a movie nonetheless!
Your points are solid, well put, and worthy of discussion. Additionally, I giggle, and I acknowledge my own sexist views toward men right now, because I am ultimately unsurprised. She doesn't say what he wants her to say, or enough of it, so he deletes her. WHAT a cliche. How would a woman react in this instance, but switched? I mean, we are all individuals and so on, but across the board, what would we do?
We would try to change the Maifu. Ba dum bum.
And in this case, it could actually bring about results, potentially! You know what won't? Deleting her like a diaper baby, ugh, As If, Programmer.
Anyway, what I'm trying to say is, great post!