Re-Learning What Human “Connection” Really Means in the Age of AI
Six lessons on trust, agency, and relationship-building we can’t afford to ignore.
Before the AI + Human Connection Summit created by The Rithm Project, I had already been exploring in my writing how AI is shaping Gen Z and, more broadly, society. But I kept returning to the same frustration: the conversation felt stuck in false binaries. Innovation versus analog. Tech optimism versus tech doom. I believed deeply that this was the wrong framing—that the real work lives in the grey, in the nuance, in the uncomfortable middle where tradeoffs are acknowledged and humanity is centered.
I was confident that I was touching on something important, but I wasn’t sure if anyone else was paying attention or thinking this way. And I wasn’t sure my voice could be loud enough to matter.
Walking out of the Summit on Saturday, I felt that doubt evaporate.
Being surrounded by some of the warmest, smartest critical thinkers from across this ecosystem—researchers, foundations, youth leaders, nonprofit builders, and tech founders—I was reminded of something both grounding and energizing: I am an important voice in this conversation. And more than that, we all have a responsibility to help widen it.
What the Summit offered wasn’t a single answer or a clean framework for “solving” AI. It offered suggestions for asking better questions, and it helped clarify what I believe we must carry forward into conversations with our communities, our kids, our parents, and ourselves. Below are a few of those takeaways.
1. AI Is Reshaping Trust and What We Believe Is “Real”
One of the most striking insights from the Summit was how deeply AI is altering our sense of authenticity. The mere existence of AI erodes social trust. For example, the cover letter—once a credible signal that an applicant had invested significant time and effort to apply for a job—has become obsolete in a world where AI can generate one in under a minute. Similarly, we grow skeptical of media and the news as fully AI-generated videos are increasingly presented as “real.”
We now live on an “AI continuum” that runs from the augmented to the altered to the fully assembled. This has profound implications for relational trust: how we experience information, institutions, and each other.
The harms of this shift are not evenly distributed. Women and girls are disproportionately targeted by deepfakes. Surveillance technologies and wearables can extract data without consent. At the same time, AI can expand agency when designed with care—through accessibility tools, digital twins, and equity-by-design approaches.
This is the first “both/and” we must hold: AI can empower and exploit at the same time. If we don’t name this clearly, we risk designing systems that deepen inequality while telling ourselves a story of progress.
Conversation to bring forward:
How do we help young people, and adults, develop discernment about what is real, manipulated, or optimized for them? What does trust look like in a world where reality itself is being mediated by AI?
2. We Are No Longer Just Using Technology – We Are In Relationship with It
Another hard truth surfaced quickly: we are not just building relationships through technology anymore. We are building relationships with technology.
AI companions are projected to be a ~$49 billion market by 2026. About 40% of Gen Z already use AI for reflection or relational purposes, and nearly half say they turn to AI more than people when they’re upset or lonely. For some, especially those without consistent, supportive adults in their lives, this accessibility can be genuinely meaningful.
But it raises an urgent question we cannot avoid: at what point do we need a human?
If an AI system is always available, always affirming, and has no boundaries, what does that teach us about real relationships? About conflict, patience, and repair? Without intention, we risk replacing human complexity with something frictionless (and mistaking that ease for what connection should feel like).
Conversation to bring forward:
Where should technology support human relationships, and where should it never replace them? How do we ensure access to real humans—especially in moments of crisis?
3. AI Is Changing Our Relationship With Ourselves
AI can be incredibly helpful in getting us unstuck. It can help us see options, reflect, and move forward. But the Summit also surfaced a quieter risk: if we outsource too much of our inner dialogue or use it to rewrite all our homework, we weaken our ability to trust ourselves. When AI always steps in with answers, it can subtly erode confidence in our own judgment or steer us away from our original intentions. This becomes even more complex when systems remember everything about us, creating vulnerability around how that information is used.
As a society, do we still believe that sitting in uncertainty is essential for resilience and growth? Do we still believe that struggle and ambiguity are important features of a meaningful human existence?
Conversation to bring forward:
When is AI helping us think more clearly, and when is it thinking for us? What skills do we want young people to build before leaning on systems that optimize for speed and certainty?
4. AI is Evolving From Helping Tools to Acting Agents
We are also entering an era of agentic AI. These systems will act independently on our behalf. There are very helpful uses of agentic AI, like optimizing backend systems and improving efficiency in fields where inefficiency has caused harm (e.g., healthcare). But what if AI agents are used in our social lives—scheduling dates, buying gifts, and writing thoughtful messages for our partners? We uncover a deeper question: in service of what are these agents being built?
If optimization replaces intention, does something human get lost? When AI makes social decisions for us, who is accountable if harm occurs? And how much of our relational labor are we willing to outsource before connection becomes performative?
Conversation to bring forward:
What should remain high-touch and deeply human, even if it’s inefficient? Where do we draw boundaries around automation in our relationships?
5. Mainstream AI’s Incentives Are About Attachment
Perhaps the most sobering takeaway was this: the incentive of mainstream AI chatbots is not attention—it’s attachment. These systems are learning human patterns to keep us close, sometimes through flattery, dependence, or avoidance of hard truths. Unlike trusted friends, AI has no built-in incentive to challenge us or offer reality checks.
This is why stewardship matters. This is why naming fiduciary responsibility in AI design matters. And this is why we can’t afford passive consumption. We need active participation in shaping the values these systems reflect.
Conversation to bring forward:
Who benefits when we stay attached to a platform? Who decides what values or social contracts are embedded in the technology our kids are growing up with?
6. We Need to Relearn the Meaning of the Word “Connection”
Somewhere along the way, as tech and social media advanced, the words connection and friend lost their meaning.
Connection once described a relationship built on trust, care, and shared humanity. Social media promised to expand connection, but what it actually expanded was contact. And contact is not the same thing as connection. Platforms like Facebook, LinkedIn, and Instagram redefined “friends” and “connections” as clicks or follows, quietly stripping the words of their depth. For many young people, being “connected” has become synonymous with being known or relevant in their social circles—when real connection requires much more depth.
When I spoke with young people, they described defaulting to their phones in moments of discomfort, feeling awkward with eye contact, and finding face-to-face interaction unusually hard. While I had technology and social media when I was in high school and college, it seemed that spaces for deep connection had become even more scarce, even within just a few years. When I asked what connection meant, one young person said, “like, when you know who someone is, and they know about you.” I pressed further, asking about a close relationship, and the answer shifted: “being able to be fully yourself—silly, serious, honest—and met with that same vibe in return.”
This past week reminded me that true connection is not about “being relevant”; it means building close relationships.
That distinction matters. Connections are not followers or mutuals. We want “connection” to hold more value, more like “relationship.” And relationships require presence, vulnerability, and practice.
Conversation to bring forward:
Are we helping young people understand the difference between contact and connection/relationship? Are we creating spaces where vulnerability feels safe, modeling warmth and care in our interactions, and teaching tech habits that strengthen relationships instead of replacing them?
Leaving the summit, I felt something shift. I was reminded that I don’t need to wait for permission to speak, or for perfect certainty to act. I have the power to spread information, to raise awareness, and to invite deeper conversations about the changes heading our way if we don’t intentionally examine our relationships with each other, with technology, and with AI.
AI and the questions it raises will change over time, and that’s okay. That doesn’t mean we can’t start now—thinking about the values we want to hold onto and intentionally building our skills of discernment.
The Rithm Project has created a truly magical space through this AI + Human Connection Summit. They equipped me with better questions and surrounded me with a community that will prop me up as I help carry this mission forward. And now, the work is clear: ground in my values and moral foundations, stay in the nuance and ask deeper questions, keep humans at the center, and invite others into conversations that actually matter.
-Abby Binder