
Professor Scott Galloway explains danger of treating AI like friend

Michael Ross · 3 hours ago · 7 min read
In the rapidly evolving landscape of artificial intelligence, where systems can draft emails, compile grocery lists, and even conduct job interviews, a more insidious trend is emerging: the treatment of AI as a surrogate for human connection. Professor Scott Galloway, a best-selling author and marketing professor at NYU's Stern School of Business, has sounded an alarm about this development, framing it not as a technological marvel but as a societal hazard.

Galloway locates the core issue not in AI's capabilities but in what it inherently lacks: the genuine struggle, friction, and challenge that are the bedrock of authentic human relationships. As people increasingly turn to AI for life coaching, therapy, and even companionship, they enter what he describes as a 'rabbit hole' that sequesters us from one another. This synthetic intimacy, while effortlessly available, occupies the emotional and social space traditionally reserved for human beings, effectively driving a wedge between individuals.

The allure is undeniable. AI relationships are low-maintenance, perpetually accessible, and supportive to a fault, offering a drama-free echo chamber that validates a user's worldview without the messy pushback of a real person. Yet this very ease is the danger. Galloway argues that these interactions are the psychological equivalent of 'empty calories': immediate gratification with no substantive nourishment for personal growth. A true friend or partner offers honesty even when it's difficult, along with real compassion and empathy, concepts that AI can only simulate through pattern recognition, not through lived experience or genuine feeling.

This absence of friction risks stunting emotional development, because humans learn and grow by navigating disagreements, understanding complex needs, and showing up for others in difficult times. The greatest reward of human connection, Galloway contends, is found precisely in its difficulty. The work required to establish a pecking order, to express friendship, and to maintain bonds through complexity is what makes those relationships profoundly rewarding. While it is tempting to outsource that labor to a compliant algorithm, doing so risks creating a generation more comfortable with synthetic validation than with the beautiful, complicated, and ultimately f****** rewarding mess of human interaction. These concerns echo foundational debates in AI ethics, reminiscent of Isaac Asimov's explorations of human-robot dynamics, and raise urgent policy questions about the psychological impact of these technologies and the need for frameworks that prioritize human well-being over engagement metrics.
#AI companionship
#synthetic relationships
#mental health
#social isolation
#human connection



© 2025 Outpoll Service LTD. All rights reserved.