When Kayla Chege, a high school student in Kansas, uses artificial intelligence (AI), no question is too small.
The 15-year-old asks ChatGPT for guidance on back-to-school shopping, makeup colors, low-calorie options at Smoothie King, as well as ideas for her 16th birthday party and her younger sister’s birthday party.
The sophomore makes sure the chatbots don’t do her schoolwork and tries to limit her interactions to mundane questions. But in interviews with The Associated Press and a new study, teenagers say they are increasingly interacting with AI as if it were a companion, capable of providing advice and friendship.
“Everyone uses AI for everything now. It’s really taking over,” said Chege, who wonders how AI tools will affect her generation. “I think kids use AI to avoid thinking.”
Over the past two years, concerns about cheating at school have dominated the conversation about kids and AI. But artificial intelligence is playing a much larger role in many of their lives. AI, teenagers say, has become a go-to source for personal advice, emotional support, everyday decision-making and problem-solving.
“AI is always available. It never gets bored of you”
More than 70% of teens have used AI companions and half use them regularly, according to a new study from Common Sense Media, a group that studies and advocates for the sensible use of screens and digital media.
The study defines AI companions as platforms designed to serve as “digital friends,” such as Character.AI or Replika, which can be customized with specific traits or personalities and can offer emotional support, companionship and conversations that can feel human. But popular sites such as ChatGPT and Claude, which mainly answer questions, are being used in the same way, the researchers say.
As the technology rapidly becomes more sophisticated, teenagers and experts worry about AI’s potential to redefine human relationships and exacerbate crises of loneliness and youth mental health.
“AI is always available. It never gets bored with you. It’s never judgmental,” says Ganesh Nair, an 18-year-old in Arkansas. “When you talk to AI, you are always right. You’re always interesting. You’re always emotionally justified.”
All of that used to be appealing, but as Nair heads to college this fall, he wants to step back from using AI. Nair got spooked after a high school friend who relied on an “AI companion” for heartfelt conversations with his girlfriend then had the chatbot write the breakup message that ended his two-year relationship.
“That felt a bit dystopian, that a computer generated the end of a real relationship,” said Nair. “It’s almost like we are allowing computers to replace our relationships with people.”
How many teenagers are using AI?
In the Common Sense Media survey, 31% of teens said their conversations with AI companions were “as satisfying or more satisfying” than talking with real friends. Even though half of teens said they distrust AI’s advice, 33% had discussed serious or important issues with AI instead of real people.
Those findings are worrisome, says Michael Robb, the study’s lead author and head researcher at Common Sense, and should send a warning to parents, teachers and policymakers. The now-booming and largely unregulated AI industry is becoming as integrated into adolescence as smartphones and social media are.
“It’s revealing,” Robb said. “When we set out to do this survey, we had no idea how many kids are actually using AI companions.” The study surveyed more than 1,000 teens nationwide in April and May.
Adolescence is a critical time for developing identity, social skills and independence, Robb said, and AI companions should complement, not replace, real-world interactions.
“If teens are developing social skills on AI platforms where they are constantly validated, not challenged, not learning to read social cues or understand another person’s perspective, they are not going to be adequately prepared in the real world,” he said.
The nonprofit analyzed several popular AI companion platforms in a “risk assessment,” finding ineffective age restrictions and that the platforms can produce sexual material, give dangerous advice and offer harmful content. The group recommends that minors not use AI companions.
A worrying trend for teens and adults alike
Researchers and educators worry about the cognitive costs for youth who rely heavily on AI, especially in their creativity, critical thinking and social skills. The potential dangers of children forming relationships with chatbots gained national attention last year when a 14-year-old boy in Florida died by suicide after developing an emotional attachment to a Character.AI chatbot.
“Parents really have no idea this is happening,” said Eva Telzer, a professor of psychology and neuroscience at the University of North Carolina at Chapel Hill. “We are surprised by how fast this exploded.” Telzer is leading multiple studies on youth and AI, a new research area with limited data.
Telzer’s research has found that children as young as 8 are using generative AI, and also that teens are using AI to explore their sexuality and for companionship. In focus groups, Telzer found that one of the top apps teens frequent is SpicyChat AI, a free role-playing app intended for adults.
Many teens also say they use chatbots to write emails or messages to strike the right tone in delicate situations.
“One of the concerns that comes up is that they no longer trust themselves to make a decision,” Telzer said. “They need feedback from AI before feeling like they can check the box that an idea is OK or not.”
Bruce Perry, a 17-year-old from Arkansas, says he relates to that and relies on AI tools to draft outlines and proofread essays for his English class.
“If you tell me to plan out an essay, I would think about going to ChatGPT before getting out a pencil,” Perry said. He uses AI daily and has asked chatbots for advice in social situations, to help him decide what to wear and to write emails to teachers, saying AI articulates his thoughts faster.
Perry says he feels lucky that AI companions weren’t around when he was younger.
“I worry that kids could get lost in this,” Perry said. “I could see a kid that grows up with AI not seeing a reason to go to the park or try to make a friend.”
Other teens agree, saying the issues with AI and its effect on children’s mental health are different from those of social media.
“Social media complemented the need people have to be seen, to be known, to meet new people,” Nair said. “I think AI complements another need that runs much deeper: our need for attachment and our need to feel emotions. It feeds off of that.”
“It’s the new addiction,” Nair added. “This is how I see it.”