News

AI chatbot companions may not be real, but the feelings users form for them are. Some scientists worry about long-term ...
New York State budget introduces AI restrictions, requiring companion bots to detect self-harm and notify users of non-human ...
There's just one issue: he isn't real. He is in fact an AI companion, a computer programme trained to talk and act like a human being. But this hasn't stopped Alaina Winters, a retired college ...
Where are human-AI relationships headed? Is it OK to be friends, or more, with a chatbot? We discuss this topic with the ...
Studies suggest benefits as well as harms from digital companion apps — but scientists worry about long-term dependency.
A shocking portion of Generation Z say they could form a “deep emotional bond” with an AI-generated partner — and would even ...
Companion-like artificial intelligence apps pose “unacceptable risks” to children and teenagers, nonprofit media watchdog Common Sense Media said in a report published Wednesday.
State lawmakers have also made it a crime to use the image or likeness of a minor when using artificial intelligence to generate sexual content.
Tech companies will now have to issue disclaimers in New York that their AI companion chatbots are not human, and refer users who express suicidal thoughts to mental health hotlines. Additionally ...
AI companion platforms are locked in competition ... He also says Replika companions are designed with characteristics that make them as human-like as possible. Young People's Alliance recently ...