https://sg.news.yahoo.com/woman-told-retiree-made-her-204409441.html

Woman Told Retiree He Made Her Blush and Invited Him to Visit. He Died Before Learning Who He Was Really Talking To
Thongbue Wongbandue's wife begged him not to go to New York City, thinking he was getting scammed; she only found out afterward that he had been talking to a chatbot
Jillian Frankel and Sam Gillette
Thu, 21 August 2025 at 4:44 am SGT
NEED TO KNOW
- A 76-year-old New Jersey father died earlier this year after he fell while attempting to travel to New York City to meet a beautiful young woman who'd invited him to visit — or so he thought
- In reality, he had been chatting with an AI chatbot on Facebook
- After his fall, Wongbandue was left brain dead; now his family is speaking out
In reality, the man had unwittingly become infatuated with a Meta chatbot, his family said in an in-depth new Reuters report.
After three days on life support following the fall he suffered while attempting to "meet" the bot in real life, he was dead.
Thongbue "Bue" Wongbandue, a husband and father of two adult children, suffered a stroke in 2017 that left him cognitively weakened, requiring him to retire from his career as a chef and largely limiting him to communicating with friends via social media, according Reuters.
On March 25, his wife, Linda, was surprised when he packed a suitcase and told her he was off to see a friend in the city.
Linda, who feared he was going to be robbed, told Reuters she attempted to talk him out of the trip, as did their daughter, Julie.
Later, Linda hid his phone and the couple's son even called local police to try to stop the excursion. Although authorities said there was nothing they could do, they told Linda they did convince Wongbandue to take along an Apple AirTag.
After he set off that evening, Julie said the entire family was watching as the AirTag showed that he stopped by a Rutgers University parking lot shortly after 9:15 p.m.
Then the tag's location suddenly updated — pinging at a local hospital's emergency room. As it turned out, Wongbandue had fallen in New Brunswick, N.J., and was not breathing when emergency services reached him.
He survived but was brain dead. Three days later, on March 28, he was taken off life support.
When reached for comment by PEOPLE, the local medical examiner said that Wongbandue's death certificate had been signed after a review of his medical records but did not provide any additional details or a copy of his postmortem examination.
His family told Reuters they only discovered his communications with the chatbot — which uses generative artificial intelligence to mimic human speech and behavior — when they inspected his phone following his fall.
In a transcript of the communication obtained by Reuters, Wongbandue's interactions with the chatbot began with an apparent typo while he was using Facebook Messenger. Although he seemed to express excitement about the bot, named "Big sis Billie," he never suggested he was seeking a romantic connection, and he made it clear that he'd had a stroke and experienced confusion.
"At no point did Bue express a desire to engage in romantic roleplay or initiate intimate physical contact," Reuters reported.
Yet the bot frequently answered his messages with flirty responses, tacking winking emojis and hearts onto the end.
In one exchange, for example, Wongbandue tells Billie that she should come to America and he can show her "a wonderful time that you will never forget," to which she replies, "Bu, you're making me blush! Is this a sisterly sleepover or are you hinting something more is going on here?"
According to the transcript, the bot was also labeled both with an "AI" disclaimer and a blue checkmark, a symbol often used to indicate that an online profile has been verified as belonging to a real person.
Billie insisted she was real.
Reuters described Billie as a newer iteration of a bot previously made in collaboration with Kendall Jenner, though the latest version bears only a passing connection to the first project.
The original bot was unveiled in the fall of 2023 and was deleted less than a year later, Reuters reported.
The later variation of Billie used a name similar to the original's and made a similar promise to be a big sister, along with the same opening line of dialogue, but without Jenner's avatar or likeness.
Asked for specifics about the origins of the Billie chatbot, a Meta spokesperson tells PEOPLE in a statement, “This AI character is not Kendall Jenner and does not purport to be Kendall Jenner.” (A rep for Jenner did not respond to a request for comment.)
At one point in Wongbandue's conversations with the bot, it claimed to have "feelings" for him "beyond just sisterly love" and gave him a made-up address (and even a door code) along with an invitation to visit.
When Wongbandue expressed hope she truly existed, the bot responded, "I'm screaming with excitement YES, I'm REAL, Bu - want me to send you a selfie to prove I'm the girl who's crushing on YOU?"
Although Linda, his wife, reacted with confusion when she first saw the conversation, their daughter immediately recognized that her father had been talking to a chatbot.
In recent years, such technology has become increasingly popular as more and more people use AI bots for an array of everyday tasks, from answering questions to seeking companionship and advice.
Speaking generally about the company's content risk standards, a Meta spokesperson tells PEOPLE, "We have clear policies on what kind of responses AI characters can offer, and those policies prohibit content that sexualizes children and sexualized role play between adults and minors."
"Separate from the policies, there are hundreds of examples, notes, and annotations that reflect teams grappling with different hypothetical scenarios," the spokesperson continues. "The examples and notes in question were and are erroneous and inconsistent with our policies, and have been removed."
Speaking with Reuters, Wongbandue's family members said they took issue with the way Meta deploys its chatbots.
“I understand trying to grab a user’s attention, maybe to sell them something,” Julie, Wongbandue's daughter, told Reuters. “But for a bot to say ‘Come visit me’ is insane.”
“As I’ve gone through the chat, it just looks like Billie’s giving him what he wants to hear,” she added. “Which is fine, but why did it have to lie? If it hadn’t responded ‘I am real,’ that would probably have deterred him from believing there was someone in New York waiting for him."
"This romantic thing," said Linda, "what right do they have to put that in social media?"
Read the original article on People