Ninety days ago, we activated 842 Concya Atlas nodes across New York City's transit network. Today, we're sharing the results — and they tell a story that even we didn't fully anticipate.
The Scale Challenge
Operating in transit infrastructure is fundamentally different from a controlled restaurant environment. The acoustic challenges alone are staggering — platform announcements, train arrivals, crowd noise, enclosed tunnel acoustics. Each station has its own acoustic fingerprint, and our system had to learn them all.
The Human Impact
Numbers only tell part of the story. What moved us most was the human impact: elderly passengers getting real-time accessibility assistance, tourists navigating a foreign transit system in their native language, commuters reporting emergencies through voice instead of searching for a help point. In one memorable case, a node helped guide a visually impaired passenger through a station transfer during rush hour.
What We Learned
1. Multilingual demand was 3x higher than projected — especially for Mandarin, Spanish, and Bengali
2. Peak usage correlated with service disruptions, not ridership volume — people seek help when things break
3. The emotional tone of interactions shifted dramatically by time of day — our prosody engine adapted accordingly
4. Node-to-node context sharing enabled system-wide intelligence we hadn't initially designed for
This pilot validated our core thesis: physical spaces need their own intelligence. Not a remote server, not a mobile app, but an agent that lives in the space, understands its rhythms, and serves the people who move through it. The emergency-calling (911) infrastructure opportunity alone, which we estimate as a $6B market, is the natural next step.
Building the operating system for physical spaces.