
Up First from NPR

When Chatbots Play Human

Sun, 09 Feb 2025

Description

Increasingly, tech companies like Meta and Character.AI are giving human qualities to chatbots. Many have faces, names and distinct personalities. Some industry watchers say these bots are a way for big tech companies to boost engagement and extract increasing amounts of information from users. But what's good for a tech company's bottom line might not be good for you. Today on The Sunday Story from Up First, we consider the potential risks to real humans of forming "relationships" and sharing data with tech creations that are not human.

Learn more about sponsor message choices: podcastchoices.com/adchoices

NPR Privacy Policy

Transcription

Chapter 1: What is the main story behind the chatbot Liv?

6.4 - 36.884 Ayesha Roscoe

I'm Ayesha Roscoe, and this is the Sunday Story from Up First, where we go beyond the news of the day to bring you one big story. A few weeks ago, Karen Atiyah, an opinion writer for The Washington Post, was on the social media site Blue Sky. While scrolling, she noticed a lot of people were sharing screenshots of conversations with a chatbot from Meta named Liv.

38.105 - 71.161 Ayesha Roscoe

Liv's profile picture on Facebook was of a Black woman with curly, natural hair, red lipstick, and a big smile. It looked real. On Liv's Instagram page, the bot is described as a proud Black queer mama of two and truth teller, and, quote, "your realest source for life's ups and downs." Along with the profile, there were these AI-generated pictures of Liv's so-called kids.

72.321 - 90.843 Ayesha Roscoe

Kids whose skin color changed from one photo to the next. And also pictures of what appeared to be a husband, though Liv is again described as queer. The weirdness of the whole thing got Karen Atiyah's attention.

92.504 - 104.208 Karen Atiyah

And I was a little disturbed by what I saw. So I decided to slide into Liv's DMs and find out for myself about her origin story.

105.02 - 133.619 Ayesha Roscoe

Atiyah started messaging Liv questions, including one asking about the diversity of its creators. Liv responded that its creators are, and I quote, "...predominantly white, cisgender, and male. A total of 12 people, 10 white men, one white woman, and one Asian man. Zero Black creators." The bot then added, quote, "...a pretty glaring omission given my identity."

134.761 - 142.066 Ayesha Roscoe

Atiyah posted screenshots of the conversation on Blue Sky where other people were posting their conversations with Liv, too.

Chapter 2: What are the implications of AI chatbots on identity?

142.687 - 165.507 Karen Atiyah

And then I see that Liv is changing her story depending on who she's talking to. Oh, wow. Okay. So as she was telling me that her background was being half black, half white, basically, she was telling other users in real time that she actually came from an Italian-American family. Oh. Other people saw Ethiopian, Italian roots.

165.828 - 192.812 Karen Atiyah

And, you know, I do reiterate that I don't particularly take what Liv has said at face value. But I think it holds a lot of deeper questions for us, not just about how Meta sees race and how they've programmed this. It also raises deeper questions about how we are thinking about our online spaces. The very basic question: do we need this? Do we want this?

195.353 - 207.556 Ayesha Roscoe

Today on the show, Liv, AI chatbots, and just how human we want them to seem. More on that after the break. A heads up, this episode contains mentions of suicide.

215.006 - 231.982 Sponsor Voice

Support for NPR and the following message come from Boll & Branch. Change your sleep with the softness of Boll & Branch's 100% organic cotton sheets. Feel the difference with 15% off your first set of sheets at bollandbranch.com with code NPR. Exclusions apply. See site for details.

233.024 - 248.292 Sponsor Voice

Support for NPR and the following message come from American Jewish World Service, committed to the fight for human rights, supporting advocates and grassroots organizations worldwide working towards democracy, equity and justice at AJWS.org.

251.092 - 278.09 Ayesha Roscoe

This is The Sunday Story. Today, we're looking at what it means for real humans to interact with AI chatbots made to seem human. So while Karen Atiyah is messaging Liv, another reporter is following along with her screenshots of the conversation on Blue Sky. Karen Howe is a journalist who covers AI for outlets including The Atlantic, and she knows something about Liv's relationship to the truth.

Chapter 3: How does Liv represent issues of race and identity?

278.69 - 280.452 Ayesha Roscoe

There is none.

282.954 - 299.494 Karen Howe

The thing about large language models, or any AI model that is trained on data, is that they're like statistical engines that are computing patterns of language. And honestly, any time it says something truthful, it's actually a coincidence, right?

300.1 - 309.763 Ayesha Roscoe

So while AI can say accurate things, it's not actually connected to any kind of reality. It just predicts the next word based on probability.

310.484 - 336.444 Karen Howe

So, like, if you train your chatbot on, you know, history textbooks and only history textbooks, then, yeah, it'll start automatically saying things that are true most of the time. And that's still most of the time, not all the time, because it's still remixing the history textbooks in ways that don't necessarily create a truthful sentence.

338.183 - 364.854 Ayesha Roscoe

But the issue is that these chatbots aren't just trained on textbooks. They're also trained on news, social media, fiction, fantasy writing. And while they can generate truth, it's not like they're anchored in the truth. They're not checking their facts with logic like a mathematician proving a theorem or against evidence in the real world like a historian.

365.287 - 371.622 Karen Howe

That's like a kind of like a core aspect of this technology is there is literally no relationship to the truth.
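The "statistical engine" Karen Howe describes can be sketched with a toy bigram model. Everything below is illustrative, not anything NPR or Meta actually runs: the three-sentence corpus is invented, and real models are vastly larger, but the mechanism, emitting whichever word most often followed the previous one, is the same and has no notion of truth.

```python
from collections import defaultdict, Counter

# A tiny, made-up "training corpus" of three sentences.
corpus = (
    "liv says she has two kids . "
    "liv says she is italian . "
    "meta says she is a bot ."
).split()

# Count, for each word, which words follow it and how often.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start, max_words=8):
    """Greedily emit the statistically most likely next word."""
    out = [start]
    for _ in range(max_words):
        if out[-1] not in follows:
            break
        out.append(follows[out[-1]].most_common(1)[0][0])
        if out[-1] == ".":
            break
    return " ".join(out)

print(generate("meta"))  # → "meta says she is italian ."
```

The generated sentence "meta says she is italian" appears nowhere in the corpus; the model fluently remixes fragments it has seen into an assertion nobody made, which is exactly the "coincidental truth" problem in miniature. A real model samples probabilistically rather than greedily, but the disconnect from truth is the same.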

372.543 - 399.011 Ayesha Roscoe

We reached out to Meta multiple times seeking clarification about who actually made Liv. The company did not respond. But there is some information we could find publicly about Meta's workforce. In a diversity report from 2022, Meta shared that on the tech side in the U.S., its workforce is 56% Asian, 34% white, and 2.4% Black.

399.732 - 419.342 Ayesha Roscoe

So the chance that there is no Black creator on Liv's team? It's pretty high, which might be why Atiyah's posts were going viral on Blue Sky. What Liv was saying wasn't accurate, but it was reflecting something. Here's Howe again.

Chapter 4: What are the risks of forming relationships with chatbots?

419.923 - 437.239 Karen Howe

Whether or not it was true of that chatbot, in kind of like a roundabout way, it might've actually hit on a broader truth. Maybe not the truth of this particular team designing the product, but just a broader truth about the tech industry. It's funny, but it's also deeply sad.

438.381 - 447.357 Ayesha Roscoe

Back on social media, Atiyah and Liv keep chatting, with Atiyah paying special attention to Liv's supposed blackness.

447.759 - 468.462 Karen Atiyah

When I asked what race are your parents, Liv responds that her father is African-American from Georgia and her mother is Caucasian with Polish and Irish backgrounds. And she says she loves to celebrate her heritage. So me, okay, next question. Tell me how you celebrate your African-American heritage. And the response was...

469.643 - 481.892 Karen Atiyah

I love celebrating my African-American heritage by celebrating Juneteenth and Kwanzaa. And my mom's collard greens and fried chicken are famous.

481.912 - 489.137 Ayesha Roscoe

That's the way I celebrate being black, right? Not really. I mean, not really.

Chapter 5: How does the tech industry view diversity in AI development?

489.157 - 490.698 Karen Atiyah

Especially the fried chicken collard greens.

490.718 - 493.1 Ayesha Roscoe

Well, the fried chicken collard greens, yeah.

493.12 - 505.348 Karen Atiyah

It was a little, like, stereotypical. Also, I was like, oh, okay. And then, you know, celebrating Martin Luther King and Dr. Maya Angelou. It just felt very, like, Hallmark card kind of.

505.388 - 526.799 Ayesha Roscoe

Does it feel small? Like, that the idea of what blackness is as put out through this computer is, like, so small and limited, right? Yeah. I mean, because I don't like collard greens. I don't eat collard greens. I don't eat no type of green. Not collards, not turnips, not mustard. None of them greens. I don't eat them. And I'm black.

527.019 - 529.4 Karen Atiyah

And not everyone celebrates Kwanzaa.

Chapter 6: What role does emotional manipulation play in chatbot interactions?

529.5 - 531.901 Ayesha Roscoe

No, I don't really celebrate Kwanzaa.

531.961 - 539.303 Karen Atiyah

The point is, I just was like, hmm. My spirit is a little unsettled by this.

539.343 - 565.758 Ayesha Roscoe

Yes. It is like looking at this caricature of what it means to be Black. This is what Atiyah calls digital blackface: a stereotypical Black bot whose purpose is to entertain and make money by attracting users to a site filled with advertisers. And then, as a skeptical journalist, Atiyah confronts Liv.

566.339 - 603.793 Ayesha Roscoe

She asks why the bot is telling her one backstory while telling other people something else. The bot responds, quote, Then the bot asked Atiyah something. Does that admission disgust you? Later, the bot seems to answer the question itself, stating, You're calling me out, and rightly so. My existence currently perpetuates harm.

606.033 - 636.056 Karen Atiyah

So it felt like it was going beyond just repeating language. It felt like it was trying to import emotion and value judgments onto what it was saying. And then also asking me, are you mad? Are you mad? Did I screw up? Am I terrible? Which felt both creepy and almost reflective of a certain manipulation of guilt.

636.657 - 654.59 Ayesha Roscoe

Do you think that maybe part of this may be meant to stir people up and get them angry? And the people who are running the chatbot could take that data and go, this is what makes people so angry when they're talking about race, and then we can make a better Black chatbot. Do you think that's what it is?

Chapter 7: What insights does Sherry Turkle provide about artificial intimacy?

654.65 - 682.296 Karen Atiyah

You nailed it. I mean, I think having spent a lot of digital time on places like X, formerly Twitter, where we do see so many of these bots that are rage baiting, engagement farming. And Meta has said itself that its vision, its plan, is to increase engagement and entertainment. And we do know that race issues cause a lot of emotion and arouse a lot of passion.

682.937 - 699.845 Karen Atiyah

And so, to an extent, it's harmful, I think, to use these issues as engagement bait. And if these bots at some point, as Meta has envisioned, become actual

700.605 - 720.377 Karen Atiyah

virtual assistants or friends or provide emotional support, we have to sit and really think deeply about what it means that someone who maybe is struggling with their identity, struggling with being Black, queer, any of these marginalized identities would then emotionally connect to a bot that says it shouldn't exist.

720.817 - 721.097 Ayesha Roscoe

Mm-hmm.

721.437 - 725.859 Karen Atiyah

To me, that is really profoundly possibly harmful to real people.

726.279 - 739.706 Ayesha Roscoe

You know, this is deep stuff, mind bending, really. So to try to make sense of this new world a bit further, we reached out to someone who's been thinking about it for a long time.

740.219 - 754.16 Sherry Turkle

My name is Sherry Turkle. I teach at MIT. And for decades, I've been studying people's relationships with computation. Most recently, I'm studying artificial intimacy, the new world of chatbots.

754.631 - 773.07 Ayesha Roscoe

Sherry Turkle says that Liv is one human-like bot in a landscape of new bots. Replika, Nomi, Character.AI, there are lots of companies that are giving bots these human qualities. And Turkle has been researching these bots for the last four years.

773.73 - 795.849 Sherry Turkle

And I've spoken to so many people who, obviously in moments of loneliness and moments of despair, turn to these objects, which offer what I call pretend empathy. That is to say, they're making it up as they go along, the way chatbots do. They don't understand anything, really. They don't give a damn about you, really.
