Pokémon Go Player Data Now Powers Coco Robotics Urban Delivery Navigation
Every PokéStop scan you ever submitted in Pokémon Go is now helping a fleet of 1,000 sidewalk delivery robots navigate Los Angeles, Chicago, and Helsinki without GPS.

Every time a Pokémon Go player held up their phone and scanned a PokéStop or local landmark, they were quietly contributing to what Niantic now claims is a dataset of more than 30 billion real-world images. That dataset is no longer just powering augmented reality games. Niantic Spatial, the offshoot formed to commercialize Niantic's spatial-AI work, announced a partnership with Coco Robotics this week to use that crowdsourced urban imagery to navigate a fleet of sidewalk delivery robots through cities where GPS simply cannot be trusted.
Coco Robotics operates roughly 1,000 flight-case-sized robots across Los Angeles, Chicago, Jersey City, Miami, and Helsinki, each capable of carrying up to eight extra-large pizzas or four grocery bags. The fundamental problem those robots face is the same one that sends the blue dot on your phone sliding half a block in the wrong direction. "The urban canyon is the worst place in the world for GPS," Niantic Spatial's Brian McClendon, quoted by IGN, explained. "If you look at that blue dot on your phone, you'll often see it drift 50 meters, which puts you on a different block going a different direction on the wrong side of the street." A delivery robot that drifts 50 meters has just crossed into traffic or dropped your groceries at the wrong address.
Niantic's Visual Positioning System, known as VPS, addresses that problem by letting a device determine its location from its surroundings rather than from a GPS signal. Coco's robots use onboard cameras to continuously scan the environment around them, then compare those images against the VPS database built from years of player-submitted scans to pinpoint their exact position and orientation relative to buildings and landmarks. The Pokémon Go scanning mechanic was particularly well-suited to building this kind of dataset: by encouraging players to capture locations from multiple angles and heights, and in varying weather conditions, Niantic generated coverage robust enough to handle the variability a moving robot encounters on a real sidewalk.
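The matching step at the heart of any visual positioning system can be illustrated with a toy sketch: a database maps image descriptors to surveyed world coordinates, and a query view is localized by finding its nearest descriptor. The sketch below is a minimal, hypothetical illustration in plain NumPy; the descriptor format, database layout, and `localize` function are assumptions for demonstration, not Niantic's actual VPS.

```python
import numpy as np

# Illustrative only: random unit vectors stand in for learned image
# descriptors, and each descriptor is paired with a known position.
rng = np.random.default_rng(42)

descriptors = rng.normal(size=(500, 128))
descriptors /= np.linalg.norm(descriptors, axis=1, keepdims=True)
positions = rng.uniform(0, 1000, size=(500, 2))  # metres, local map frame

def localize(query):
    """Return the surveyed position of the landmark whose stored
    descriptor best matches the query (cosine similarity)."""
    q = query / np.linalg.norm(query)
    best = np.argmax(descriptors @ q)
    return positions[best]

# A noisy re-observation of landmark 42 should still localize there,
# because its similarity to the true descriptor far exceeds chance
# matches against the other 499 entries.
query = descriptors[42] + 0.05 * rng.normal(size=128)
est = localize(query)
```

A production system would retrieve many candidate matches and then solve for full six-degree-of-freedom camera pose geometrically, but nearest-neighbor retrieval against a prebuilt map is the core idea that replaces the GPS fix.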
Niantic Spatial founder and CEO John Hanke framed the technical overlap bluntly. "It turns out that getting Pikachu to realistically run around and getting Coco's robot to safely and accurately move through the world is actually the same problem," he said. "If robots are ever going to assimilate into that environment in a way that's not disruptive for human beings, they're going to have to have a similar level of spatial understanding. We can help robots find exactly where they are when they've been jostled and bumped."

Worth noting for anyone who played during the scanning era: those submissions were opt-in and deliberate. The scans were of specific in-game locations, typically a PokéStop or Gym tied to a piece of street art or a notable building, not ambient footage captured while the app ran in the background.
There is a notable wrinkle in the ownership picture. Scopely acquired Pokémon Go in March 2025, along with Pikmin Bloom and Monster Hunter Now, and now operates all three titles. Niantic, however, retained access to the historical AR dataset accumulated during the years it ran those games. The precise contractual terms governing that data retention after the Scopely sale have not been disclosed, and it remains unclear whether "30 billion images" refers to distinct photographs or individual video frames, a distinction IGN flagged when reporting the partnership.
For players who spent years scanning statues and murals to unlock in-game bonuses, the legacy of that effort turns out to be considerably more tangible than a PokéStop waypoint upgrade.

