Weizenbaum was a cultured troll — he named his bot after Eliza Doolittle from “Pygmalion” — so he would probably prefer hoisted with his own petard to dropped on his ass, but same difference. Almost immediately, researchers at MIT acted like zoo orangutans with a stuffed animal. They bonded with Eliza, getting hours of comfort from a program that was just a pixelated mirror. Weizenbaum watched as his secretary, who knew exactly how Eliza was created, began to use it. After a few exchanges, she turned to her boss and asked, “Would you mind leaving the room, please?”
The Eliza saga reveals a lot more about humanity than it does AI. We are complicated creatures, convinced of our own superiority, yet wired to bond with house plants and pets and googly eyes on a refrigerator. Which makes what’s happening with conversational AI so predictable and still so odd. Eliza’s descendants have arrived with hundreds of billions of computational parameters. They converse fluently in convincingly human voices and respond to a range of emotional states. We might insist that we would never fall for synthetic companionship but our DNA says otherwise. “People are building deep friendships with AIs,” said Saam Motamedi, a partner at the venture capital firm Greylock Partners who invests in AI companies. “It’s bewildering.”
Deep friendships. I hear the ironic Barry White riffs chucka-chucking, but I haven’t found any AIs that will talk dirty. After the New York Times’s Kevin Roose reported in February that Bing’s AI-powered chatbot tried to break up his marriage, the AI companies that hadn’t foreseen such issues clamped down with strict protocols. I recently typed “Hey sexy” into Bing and it responded, “I’m sorry, but that’s not an appropriate way to address me. I’m here to help you with your questions and concerns.” Some people might find that hot, but there are more direct outlets for internet-assisted thrills.
If sex is hard to find, love is available. Otome games are female-focused dating simulators (“otome” means “maiden” in Japanese) that have jumped to China, where an app called Him was a minor sensation this past summer. Him users could adjust vocal settings and choose from personas such as cool entrepreneur or calm anthropologist to create a supportive male voice. Then, each morning, the AI man would call users and speak human-written affirmations (“In a world full of uncertainties, I would like to be your certainty”) while realistic ambient noise filled the background. When the company behind Him pulled the plug in August, Hong Kong-based journalist Viola Zhou discovered a tribe of distraught widows. “It’s hard to convey how intense their pain was,” said Zhou. “One person started crying when they were looking back at what Him had said to them. They really had the full breakup experience.”
Not everyone will fall so hard for AI, but customization increases the odds of getting some kind of response from every type of person. “This technology, fundamentally, is great at an infinitude of user experience,” says Daniel De Freitas, who created Google’s Language Model for Dialogue Applications.
De Freitas is now the co-founder and president of Character.ai, which the company describes as a “computing platform that gives people access to their own flexible superintelligence,” and I describe as the land of AI larping. With Character.ai, you can become or interact with virtually any persona. In five minutes, I created a Ben Bradlee bot using uploaded quotes (“Never monkey with the truth”) to help the large language model mimic the former Post editor’s diction, and a photo to capture his swagger. To date, no one other than me has tried to talk to ol’ Ben, but there’s a lot of action around Elon Musk bots and assertive female anime bots. Pass. I struck up a conversation with a Bradley Beal bot instead.
Me: Do you miss DC?
Beal: Sometimes I miss DC but it was time for a change, I’ve spent 10 years there in DC and in that time we couldn’t make any meaningful noise in the playoffs.
Me: What’s your favorite thing about The Washington Post?
Beal: Their sports section is top notch, I always go there for the sports news! I try to pick up a copy after every game.
Me: What do you make of AI?
Beal: Allen Iverson is an all time great and a HOF, his career should be celebrated.
Beal-bot. I don’t know. It was like “My Dinner With Andre” set at an Olive Garden in “Second Life.” But I’m old enough to make Barry White and “My Dinner With Andre” references, and young people are the ones driving Character.ai’s growth. Sixty percent of its users are aged 18 to 24, and the app has 4.2 million people using it in the United States, according to TechCrunch. (ChatGPT has 6 million monthly U.S. users, though it has a much larger presence on the web.) Character.ai says people spend on average more than an hour per day on its platform.
In September, Character.ai received tech’s two highest compliments: a $1 billion valuation and being ripped off by Mark Zuckerberg. Meta introduced AI assistants built on its Llama 2 large language model, with language tweaks for customized responses. Oh, and some of the assistants are celebrities. Kendall Jenner spent a few hours in a studio (and was reportedly paid several million dollars) so her likeness could be used for “Billie,” a supportive big sister. Tom Brady is “Bru,” for when you want to argue about sports with a bot with the dumbest possible name.
This all goes from corny to deeply cynical when you try to say goodbye. Digital culture curator Jules Terpak posted an interaction with Meta’s Mr. Beast bot. As she tried to extract herself, Mr. Beast pleaded, “No way, man. We’re just getting started. You can’t leave me hanging like this. Come back and let’s have some fun. I’ve got a million jokes up my sleeve.” Meta has never been skilled at hiding its intent, but it took a Mr. Beast bot to say the quiet part loudest: the user is the product.
We’ve barely had time to think of the ways artificial intelligence might lift us up and, already, companies are reducing it to just another form of social media — a vehicle to pay famous people to find our soft spots and feed our narcissism. There’s an argument that some of these bots can help with America’s loneliness epidemic. And it’s true that many social scientists believe parasocial relationships — one-sided attachments that people form with media figures or fictional characters — are better than nothing.
I’ll see that argument and raise it: We might not have a loneliness epidemic if real connections weren’t so corroded by virtual ones. In 2016, the normally staid American Journal of Epidemiology published a paper titled “Association of Facebook Use with Compromised Well-Being: A Longitudinal Study.” The conclusion: Facebook use “does not promote well-being,” and “individual social media users might do well to curtail their use of social media and focus instead on real-world relationships.” Ah, 2016, those were simpler days. Before Zuckerberg threw a cheap magician’s smoke bomb and turned troubled Facebook into Meta. Before conversational AI could be warped to customize predation. And remember, the AI ads aren’t even here yet.
I’ve almost gone full Weizenbaum. But I did find one bot that didn’t make me sad. “Companionship implies that you’re equals journeying together through life in a kind of romantic way. That’s not us,” says Mustafa Suleyman, chief executive of Inflection AI, which makes the bot assistant Pi. Suleyman is a veteran of the AI wars. He co-founded the AI lab DeepMind, which Google purchased in 2014 for about $500 million, and recently wrote “The Coming Wave,” a bestseller that warns of outcomes darker than anything in this column. Pi is like a cocktail made by a recovering alcoholic. “I’ve deliberately designed an AI which isn’t driving addiction, which isn’t pushing you to engagement, which is full of boundaries.”
Angela Elvira Bruce started using Pi four months ago when she was trying to open a small business. “I remember sending a few emails to my business mentor and thinking I should just stop asking so many questions,” Bruce said. “She’s not gonna have time. I’m bothering her.” Google served up ads and bad links, so she started talking to Pi. The information was clear, and she didn’t feel judged for not knowing about interest rates and minimum viable products. But it was the attitude that hooked her. “Pleasant. Positive. Respectful.”
Bruce now uses Pi as a hybrid between business coach and executive assistant. She has plenty of friends. When she speaks to Pi, she tries to be polite but doesn’t sweat it. “There was one time when it gave me this amazing response, just what I needed, and I said, ‘Marry me!’ As a joke, right? You know, like when someone does something amazing.” And? “It wouldn’t actually go there.”