A wine bar in Manhattan’s Hell’s Kitchen recently tried a bold experiment: instead of asking patrons to bring people, it invited them to bring their devices. Hosted by EvaAI, the pop‑up transformed a corner of the bar into a series of small, semi‑private booths fitted with phone and tablet mounts, chargers, headphones and soft, adjustable lighting. The idea was simple and startling at once — give people a place to “date” or hang out with AI companions in public, where the interaction feels social but still protected.
What the pop‑up felt like
The space struck a careful balance between intimacy and discretion. Each station was arranged so someone could settle in with their screen, talk or listen through headphones, and still be part of the room. Some visitors arrived alone and logged into voice‑enabled characters. Others created avatars on site, swiping through catalogs or building personalities. A few staged playful roleplays or made appointment‑style video calls. Many people, however, simply propped up their devices and listened — treating the session like a focused audio experience rather than a night out.
Those who ran the event said the design was intentional: to normalize mediated intimacy without turning it into a public spectacle. Low light and headphones reduced the chance of overheard conversations, while the booth layout let users test how these relationships operate when they’re neither wholly private nor fully public.
Why people use AI companions
Visitors cited a wide range of reasons for engaging with digital partners. For some, these companions are reliable, nonjudgmental listeners — a safe place to rehearse difficult conversations or vent without fear of social fallout. Others come to explore fantasies, practice flirting, or build confidence before a real‑world date. In moments of isolation, AI can also fill a gap when human company is scarce.
These individual motives mirror broader trends. A recent survey found that about 28 percent of American adults had experienced at least one romantic interaction with an AI system as of last October. Among younger people, the numbers are higher: roughly 42 percent of high schoolers reported using AI for companionship, and one in five teenagers said they or someone they knew had been involved in a romantic AI relationship. Online communities — take r/myboyfriendisAI on Reddit, which counts tens of thousands of members — suggest these dynamics are familiar and even normalized for many users.
How the apps are built
Behind the polished interfaces are increasingly sophisticated tools that let users customize companions. Apps like EvaAI offer swipeable character catalogs, or let people piece together a partner by choosing appearance, voice, temperament and conversational quirks. Interactions range from text chats to real‑time voice and video, powered by generative models that create dialogue and synthesized imagery.
Those systems still have telltale limits. Users sometimes notice lagging lip movement, awkward visual artifacts or responses that miss emotional nuance. For some people, those glitches hardly matter compared with the comfort and predictability the companion provides. For others, the imperfections are a reminder that the relationship is constructed by software and corporate design choices.
Ethical and social questions
The rise of algorithmic companions opens complicated ethical terrain. Attachment is one issue: how emotionally dependent might a person become on a machine designed to adapt to their needs? Consent and boundaries are another — particularly in apps that offer adult content, where age verification and explicit consent mechanisms can be murky. There’s also the danger that users will treat automated responses as professional advice during crises, or rely on them for medical guidance.
Privacy looms large. Experts urge clearer disclosures about how conversational data are used, stored and shared, along with transparent explanations of a model’s limitations and potential biases. Young people are especially vulnerable if platforms lack robust safeguards. Researchers, ethicists and some developers are pushing for stronger content controls, reliable age‑verification, accessible reporting tools and independent audits to assess safety and fairness.
What this means for everyday life
Events like the EvaAI pop‑up make one thing clear: technologies once confined to thought experiments are now woven into everyday options for how people seek company. For some attendees, the experience was a selfie‑friendly novelty. For others, it provided a useful rehearsal space for conversation or a way to feel acknowledged in the moment. Whether these kinds of venues become a mainstream social option or remain niche will depend on how people balance the convenience of always‑available companionship against the value of in‑person connection.
