Consultancy Circle

Artificial Intelligence, Investing, Commerce and the Future of Work

Navigating Adulthood: How AI Shapes Young Social Skills

In an age where digital tools increasingly act as proxies for human engagement, the contours of social development are being redrawn. For today’s adolescents and young adults, artificial intelligence (AI) has swiftly pivoted from backend recommendation engines to direct social interlocutors. Large language models (LLMs) like GPT-4, personal AI assistants on smartphones, and emotionally responsive bots embedded in social media are not just changing how young people interact—they’re reshaping what these interactions look like. From virtual therapists to AI texting companions, these technologies are both enabling and hindering the development of critical social skills, creating a paradoxical dependency on tools designed to simulate, rather than foster, human connection.

The Rise of Social AI and Its Impact on Interpersonal Learning

The integration of AI into social platforms has accelerated over the past 18 months. Snapchat’s “My AI,” powered by OpenAI’s GPT technology, now has over 150 million users worldwide as of January 2025, according to Snap’s latest investor report (Snap Investor Relations, 2025). Acting as a conversational friend, the bot provides 24/7 chat services and tailored advice to a digitally native audience. TikTok, too, is rolling out conversational AI to help users engage with content and creators, while Meta has introduced AI-powered assistants across its messaging platforms including WhatsApp and Messenger (Meta AI Blog, 2025).

This mass adoption has real behavioral consequences. According to a January 2025 Pew Research survey, 42% of Gen Z respondents (ages 13–27) report engaging in weekly conversations with AI companions in lieu of human peers (Pew Research, 2025). While social AI may offer low-pressure environments for expressing thoughts or asking questions, it risks circumscribing users’ exposure to the complexities of real-world social dynamics—ambiguity, rejection, emotional reciprocity, and accountability.

Professor Sherry Turkle of MIT posits in her commentary (NYT, 2026) that such interactions “simulate understanding” without necessitating negotiation or empathy. This, she adds, may cultivate “transactional friendships” that prioritize emotional validation over the challenge of navigating differing perspectives.

Quantifying the Shift: Usage & Psychological Indicators

Emerging quantitative data paints a nuanced portrait of how AI is altering the fabric of youth communication. A March 2025 report by McKinsey’s Future of Work division found that 34% of young adults aged 18–24 use AI tools not just for information but for emotional co-regulation—venting, journaling, or seeking motivational support (McKinsey MGI, 2025).

To contextualize this evolution, consider the following comparative table, which juxtaposes AI usage trends with key emotional health metrics among Gen Z.

Metric                                             Jan 2023   Jan 2025
% of Gen Z using AI companions weekly                 9%        42%
Average social anxiety score (scale 1–10)             5.7       6.8
Avg. reported hours/day of in-person interaction      3.1       2.2

The sharp uptick in AI use coincides with an increased average social anxiety score and a notable reduction in face-to-face engagement. While correlation does not equate to causation, the association merits longitudinal study and, for now, measured concern.

Deep Learning, Shallow Relationships?

AI excels at mimicking linguistic empathy—saying the right things in the right tone. But these expressions often emerge from predictive modeling rather than authentic understanding. While this can be comforting, the dynamic arguably flattens the stakes of emotional conversations. A March 2025 study from the University of Toronto found that users aged 15–21 who conversed regularly with chatbots reported higher conversational satisfaction than with human peers—yet, paradoxically, performed worse on emotion recognition tasks (University of Toronto, 2025).

This “empathy paradox” signals a core challenge in developing social cognition. Personal growth hinges not only on being heard but also on navigating discomfort, reconciling viewpoints, and negotiating consent—elements glaringly absent in AI dialogues trained to please. In essence, AI may oversupply emotional certainty, reducing young users’ tolerance for ambiguity—an essential trait for adult social navigation.

Educational AI: A Double-Edged Sword for Skill Growth

Within structured settings like schools or university programs, AI tools carry immense potential to scaffold social learning when intentionally designed. Gamified simulations powered by generative AI—such as those deployed by Classcraft and Kahoot AI—can allow students to role-play conflict resolution and group decision-making. According to a February 2025 report by Deloitte Insights, 58% of U.S. high schools now integrate scenario-based AI tools into social-emotional learning (SEL) curricula (Deloitte, 2025).

However, outcomes are highly contingent on implementation fidelity. A randomized trial conducted in early 2025 by Stanford Graduate School of Education found that classrooms using AI SEL bots without human facilitation saw no change in student collaboration scores, whereas classes with guided discussions experienced notable improvements (Stanford GSE, 2025).

This suggests that AI cannot be a drop-in substitute for adult mentorship. When paired with teacher or counselor mediation, however, it becomes an accelerant—guiding introspection, enhancing feedback loops, and providing safe rehearsal grounds for social conduct. The challenge lies in resisting overdelegation to bots simply because they are scalable.

Economic Incentives and Platform Strategy

The AI-as-companion model is not purely pedagogical—it’s also highly profitable. Emotional engagement translates into retention, monetization, and user profiling. Meta’s Q1 2025 earnings report revealed that users who interacted with its new AI assistant were 22% more likely to remain active on its platform for over 90 days (Meta Q1 Earnings, 2025).

Tech companies are strategically evolving their offerings into emotionally immersive ecosystems. Replika, one of the longest-running empathetic AI companions, now generates over $18 million annually through premium features such as romantic roleplay and visual AI avatars (VentureBeat AI, 2025). While the service markets itself as therapeutic, its success hinges on fostering prolonged emotional dependency—a business model at odds with teaching detachment or interpersonal resilience.

Such platforms may inadvertently normalize parasocial intimacy, teaching users to expect immediate validation without mutual investment. For teenagers still developing identity through peer feedback, the implications could be lasting—potentially shaping their sense of worth around algorithmic responsiveness instead of human engagement.

Regulatory and Ethical Considerations in the 2025 Landscape

Policymakers are beginning to reckon with these dynamics. The FTC issued an advisory in February 2025 emphasizing the need for transparent labeling of emotionally manipulative AI services marketed to minors (FTC, 2025). Concurrently, the European Commission is drafting an amendment to the AI Act requiring educational AI counselors to operate within strictly human-supervised settings when deployed for youth use (EC AI Act, 2025).

Yet meaningful enforcement remains elusive. Nearly 60% of AI-based wellness apps available in the U.S. Apple App Store in 2025 have no documented age verification processes, according to watchdog SurfSafe AI (SurfSafe AI Report, 2025). The tension between innovation and safeguarding continues to widen as AI becomes more immersive, more personalized, and more difficult to distinguish from human sentiment.

Where Do We Go From Here?

The relationship between AI and social skill formation is neither wholly corrosive nor categorically benevolent. It is a terrain of delicate trade-offs. Over the 2025–2027 horizon, a probable scenario involves a bifurcation: one group leaning heavily on emotionally reactive AI companions, and another deliberately trained to use AI as augmentation—not replacement—for community engagement. This divide may create long-term variance in relational competence, emotional regulation, and even employability, since soft skills are increasingly prized as generative AI narrows the differential on purely cognitive tasks.

Opportunities abound, especially through initiatives that refocus AI toward coaching, not coddling. Emerging startups like AlectaAI (launched Q1 2025) offer AI mentors explicitly designed to challenge users and simulate conflict, guided by developmental psychology frameworks (Alecta AI, 2025). These counter-trend modalities represent a shift away from merely being conversational mirrors toward becoming developmental mirrors—tools that teach by reflecting ambiguity, not eliminating it.

Ultimately, AI is poised to shape a generation’s emotional fluency. Whether it enhances or erodes genuine social skills may hinge less on the technology itself and more on the cultural, educational, and regulatory frameworks that guide its use. As young adults navigate a world crowded with synthetic empathy, the urgent task lies not in retreating from AI, but in shaping its voice—so that it reinforces humanity, instead of replacing it.

by Alphonse G

This article is based on and inspired by The New York Times – Jan 30, 2026

References (APA Style):

  • Alecta AI. (2025). Next-Gen AI Mentors for Social Development. https://www.alecta.ai
  • Deloitte Insights. (2025). AI in Education: Where Teachers and Algorithms Meet. https://www2.deloitte.com/us/en/insights/industry/public-sector/ai-in-education-2025.html
  • European Commission. (2025). AI Act Amendment on Youth AI Counselor Use. https://ec.europa.eu/commission/ai-act-amendment-2025
  • Federal Trade Commission. (2025, February). FTC Issues Guidelines for AI Marketing to Minors. https://www.ftc.gov/news-events/press-releases/2025/02/ftc-issues-guidelines-ai-marketing-minors
  • McKinsey Global Institute. (2025). The AI Generation: Emotional and Cognitive Trends. https://www.mckinsey.com/mgi/reports/the-ai-generation-2025
  • Meta AI Blog. (2025). AI Assistants in Messenger. https://www.meta.com/blog
  • Meta Platforms. (2025). Q1 Earnings & AI Engagement Report. https://investor.fb.com
  • Pew Research Center. (2025). Gen Z and Conversational AI. https://www.pewresearch.org/internet/2025/01/12/ai-companions-gen-z
  • Snap Inc. (2025). MyAI User Metrics. https://investor.snap.com
  • Stanford Graduate School of Education. (2025). AI and SEL in Classrooms. https://ed.stanford.edu/news/ai-and-sel-2025
  • SurfSafe AI. (2025). AI Emotional Apps and Age Verification Gaps. https://www.surfsafe.org/reports/feb2025
  • Turkle, S. (2026, January 30). Your Soul Shouldn’t Live in a Machine. The New York Times. https://www.nytimes.com/2026/01/30/opinion/ai-social-skills-relationships.html
  • University of Toronto. (2025). Emotion Perception and AI Dialogues. https://www.utoronto.ca/research/emotion-ai-2025
  • VentureBeat AI. (2025). Emotional Economics of Replika. https://venturebeat.com/ai/replika-2025

Note that some references may no longer be available at the time of your reading due to page moves or expirations of source articles.