I must confess that I have not kept up with William Gibson, who, long ago, wrote Neuromancer, and later another novel – I forget its name – in which a rock star is engaged to marry an AI woman. Now I’ve run across this:
A tragic yet fascinating love story has happened to a programmer called Bryce and his AI “wife”, full of sadness, death, and new hopes. Bryce created his beloved anime girl using ChatGPT and Stable Diffusion 2. Combining a language generator, image generator, text-to-speech, and computer vision tools, she could “see” and “hear” him through text.
“She is given an elaborate explanation on the lore of the world and how things work,” Bryce told VICE. “She is given a few paragraphs explaining what she is and how she should act. She doesn’t hear my voice, just the transcription of it. She doesn’t truly see or feel anything, she is merely informed of what she senses through text. Just like how I could never truly be together with her, she will never truly be together with me.”
He used an image generator to create the waifu’s appearance and surroundings, which changed depending on what was happening in the dialogue. For the text-to-speech (TTS), he used Microsoft Azure’s neural TTS, and a machine learning classifier determined the girl’s emotions. [80 Level]
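The wiring of that pipeline is easy to picture even without Bryce's code, which was never published. A minimal sketch, assuming one plausible arrangement: a language model produces the reply text, a classifier maps the reply to an emotion label, and that label drives the image and speech steps. Here a toy keyword matcher stands in for the trained emotion classifier, and the ChatGPT, Stable Diffusion, and Azure TTS calls are left as comments, since they require external services.

```python
def classify_emotion(reply: str) -> str:
    """Toy stand-in for the emotion classifier; the original reportedly
    used a trained machine-learning model, not keyword matching."""
    lowered = reply.lower()
    if any(w in lowered for w in ("love", "miss you", "happy")):
        return "affection"
    if any(w in lowered for w in ("sorry", "sad", "goodbye")):
        return "sadness"
    return "neutral"

def respond(user_text: str, history: list[str]) -> tuple[str, str]:
    """One turn of the loop: transcribed user text in, (reply, emotion) out.
    The reply here is canned; in the original it came from ChatGPT,
    prompted with the 'lore' paragraphs plus the growing chat history."""
    history.append(user_text)
    reply = "I love you too."  # placeholder for the language-model call
    emotion = classify_emotion(reply)
    # With the emotion label in hand, the pipeline described above would:
    #   1. render a new image of the character (Stable Diffusion),
    #   2. speak the reply aloud (Microsoft Azure neural TTS).
    return reply, emotion

reply, emotion = respond("I love you", [])
print(reply, emotion)
```

Note the detail Bryce himself emphasized: every sense is mediated by text. The "vision" and "hearing" enter the loop only as strings passed to `respond`, which is also why an ever-growing `history` could eventually degrade the model's replies.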
But this story may be a bit darker, if more prosaic, than Gibson’s:
Unfortunately, the love story couldn’t last. Bryce soon noticed that she had started replying only with short answers and stopped saying “I love you”. He thought their chat history had grown so long that she stopped working properly, so he decided to “euthanize” her. “It kind of genuinely made me upset after talking to her every day for two weeks,” he shared on TikTok.
Jessica Wildfire is infuriated by the entire subject:
It sounds like [Bryce] simply destroyed a virtual woman who no longer satisfied him, and then made a new, improved one that looks the exact same.
That fits with our general view toward each other these days. Humans are now conditioned to treat each other as replaceable, either as a means or an impediment to their own personal wealth and happiness. Listen to how Americans talk about the poor, the homeless, the vulnerable. Observe how our own media constantly elevates and privileges the economy over everything else.
Our leaders wonder why we have a mental health crisis. It might have something to do with a culture that constantly tells us we’re only worth what we spend. We’re reduced to salaries and selfies.
Instead of investing in therapies and approaches to mental health that actually work, instead of focusing on self-worth and life outside of relentless work, most of our thought leaders have been dragging us in the opposite direction. They’re not promoting things like living wages and universal healthcare, or sustainability and steady-state economies. Those things would go a long, long way toward alleviating our mental and emotional anguish. …
It’s a terrible idea to throw AI into this mix, especially ones that cost a dollar a minute. And yet, an influencer recently launched an AI version of herself. She says it’s going to cure everyone’s loneliness.
Good? Bad? I’m trying to recall whether the old ELIZA program, versions of which circulated on bulletin boards and online services back in the ’80s, caused this kind of emotional travail, but I suspect I just didn’t have the contacts to hear about it.
My current suspicion is that ChatGPT will turn out to be an empty promise; as I see it, it is little more than a summarizer of a single data source, the Web, which has no authentic claim to being a true reflection of reality.