A lot of people are talking about “the metaverse” these days. Coming off eighteen months of Zoom, Netflix, and Doordash, you can count me out — at least in the form that most folks are imagining. I’m not denying that the metaverse is a cool concept from a technology point of view; it comes from one of my favorite sci-fi writers, Neal Stephenson, who coined the term in his 1992 novel, Snow Crash. Along with the works of William Gibson, that book helped define the cyberpunk genre, in which characters spend time wired into a digital universe where they explore, socialize, fight, and (at least in the novels) save the world from villainous plots. The concept reached one of its most complete expressions in Ernest Cline’s Ready Player One, where virtually everyone has abandoned reality for an elaborate VR massively multiplayer video game.
Many people these days seem eager to bring this near-future vision of a virtual world to life, including some of the biggest names in technology and gaming. But these novels were warnings about a dystopian future of technology gone wrong.
As a society, we can hope that the world doesn’t devolve into the kind of place that drives sci-fi heroes to escape into a virtual one — or we can work to make sure that doesn’t happen. At Niantic, we choose the latter. We believe we can use technology to lean into the ‘reality’ of augmented reality — encouraging everyone, ourselves included, to stand up, walk outside, and connect with people and the world around us. This is what we humans are born to do, the product of two million years of human evolution, and as a result these are the things that make us happiest. Technology should be used to make these core human experiences better — not to replace them.
Technology to enhance the human experience
Some might argue that we ought to ditch technology completely and return to a simpler way of life. But we don’t think that’s the answer either. Technology isn’t going away. The benefits of connecting us with information, friends, and family are simply too great. But over recent decades, those benefits have exacted a huge toll, increasingly cutting us off from the experiences we enjoy the most. It’s all too easy to get lulled into a routine of Zoom calls, online shopping, gaming, and scrolling through our social feeds. That always-online life encourages behavior toward one another that we would never tolerate in person, and it is dividing our society by algorithmically pushing people into bubbles that reinforce the most extreme views.
At Niantic, we ask the question: what if technology could make us better? Could it nudge us off the couch and out for an evening stroll or a Saturday in the park? Could it draw us into public space and into contact with neighbors we might never have met? Could it give us a reason to call a friend, make plans with our families, or even discover brand new friends? Collectively, could it help us discover the magic, history, and beauty hiding in plain sight?
If this fresh perspective is the goal, what are we doing to achieve it? For us, it starts with a technology that connects the real world (the atoms) with the digital one (the bits). You could call it the ‘real world metaverse’ to distinguish it from the virtual videogame version, but honestly, I think we are just going to experience it as reality made better: one infused with data, information, services, and interactive creations. This has guided our work to date, both in terms of our first attempts to incorporate these concepts into products like Field Trip, Ingress, and Pokémon GO, and in terms of inventing critical technology to enable them. The core of this isn’t only the computer graphics challenge of adding annotations and animations to the physical world; it’s also — maybe even mainly — about the information, services, and experiences where digital meets physical.
Building the real world metaverse lies at the intersection of two major technical undertakings: synchronizing the state of hundreds of millions of users around the world (along with the virtual objects they interact with), and tying those users and objects precisely to the physical world. The first exists today in the Niantic Lightship platform, which underpins Pokémon GO and all of our products and supports hundreds of millions of users around the world. It means that those millions of users can create, change, and interact with digital objects in the physical world and that experience is consistent and shared by everyone. In the world of software, we call that a ‘shared state’ — we are all seeing the same thing, the same enhancements to the world. If you change something it’s reflected in what I see, and vice versa, for the millions of participants using the system.
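The ‘shared state’ idea can be sketched in a few lines of code. This is purely an illustrative toy (the names here are invented for this example, not Lightship APIs, and a real system must synchronize state across servers and millions of devices): one authoritative world store that every client reads from and writes to, so a change made by one player is what every other player sees.

```python
# Toy sketch of a shared world state: one authoritative store, many clients.
# All names are illustrative; this is not how Lightship is implemented.

class SharedWorld:
    """Authoritative store of virtual objects tied to real-world locations."""

    def __init__(self):
        self._objects = {}   # object_id -> {"lat", "lng", "state"}
        self._version = 0    # bumps on every change, so clients can sync

    def place(self, object_id, lat, lng, state):
        """Any client can create or update an object; everyone sees it."""
        self._objects[object_id] = {"lat": lat, "lng": lng, "state": state}
        self._version += 1

    def snapshot(self):
        """What every client sees: the same version of the same world."""
        return self._version, dict(self._objects)


world = SharedWorld()

# Player A places a virtual object in a park...
world.place("lure-42", 37.7694, -122.4862, "active")

# ...and player B, reading the same world, sees exactly that object.
version, objects = world.snapshot()
print(version, objects["lure-42"]["state"])   # 1 active
```

The hard part, of course, is doing this consistently at planetary scale and in real time, which is what the platform itself provides.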
Tying all of that precisely to the physical world is an even bigger project. It requires a new kind of map, similar in concept to something like Google Maps, but different because this map is built for computers, not people. It requires an unprecedented level of detail so that a phone or headset can recognize its location and orientation in a highly accurate way anywhere in the world. It is designed to enable the ultimate kind of digital wayfinding and coordination. Think of it as a kind of GPS, but without the satellites and with a much higher level of accuracy. Niantic is building that map, in collaboration with our users. This is one of the grand challenges of augmented reality, and it’s the key to making it work the way we want it to — to make the real world come alive with information and interactivity.
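To give a flavor of how a machine-readable map differs from GPS, here is a heavily simplified sketch (invented for this post, not Niantic’s actual method): the device matches what its camera currently sees against visual landmarks stored in the map, and recovers its position from the best match rather than from satellites. Real systems match thousands of 3-D feature points per frame and solve for full position and orientation.

```python
# Illustrative sketch: localizing a device by matching camera observations
# against a stored visual map. A toy stand-in for visual positioning.
import math

# A toy "AR map": visual landmarks, each with a descriptor vector and a
# precisely surveyed position. Real maps hold millions of 3-D features.
AR_MAP = [
    {"descriptor": (0.9, 0.1, 0.0), "x": 10.0, "y": 5.0},   # cafe sign
    {"descriptor": (0.0, 0.8, 0.2), "x": 42.0, "y": 7.5},   # park bench
    {"descriptor": (0.1, 0.1, 0.9), "x": 3.0,  "y": 60.0},  # oak tree
]

def localize(observed_descriptor):
    """Return the position of the stored landmark whose descriptor is
    closest to what the camera currently sees (nearest neighbor)."""
    best = min(AR_MAP, key=lambda lm: math.dist(lm["descriptor"],
                                                observed_descriptor))
    return best["x"], best["y"]

# The camera sees a feature that closely resembles the park bench...
x, y = localize((0.05, 0.75, 0.25))
print(x, y)   # 42.0 7.5
```

The precision and scale required, worldwide, in all lighting and weather, are what make the real map a grand challenge rather than a weekend project.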
Other big opportunities and challenges lie in semantically understanding the world. What are those pixels: an oak tree, a pond? A park bench, a cafe, or a historical building? Human cartographers have been doing this for hundreds of years. The new twist is in using computer vision to do this more or less automatically. Think of the opportunity as an analog to the web crawlers that search the web for pages to be indexed by Google. Today, computer vision powered by deep learning algorithms can provide a basic version of this in real time. In the future, offline processing can extend this to a much higher degree of fidelity and persistently tie this understanding to an ever-evolving AR map of the world. Niantic is pursuing these and other capabilities within the Lightship platform.
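The last step of semantic understanding, turning a vision model’s raw scores into labels like “oak tree” or “park bench,” can be sketched simply. This is a toy illustration with made-up numbers, not Lightship’s pipeline: a real segmentation model emits a score per class for every pixel, and the labeled result is then fused into the evolving map.

```python
# Illustrative sketch: converting per-region class scores from a vision
# model into semantic labels. Scores and labels here are invented.

LABELS = ["oak tree", "pond", "park bench", "building"]

def label_region(scores):
    """Pick the most likely semantic label for one image region,
    given one score per class (as a segmentation model would emit)."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return LABELS[best]

# Fake model output for three regions of a single camera frame.
frame_scores = [
    [0.7, 0.1, 0.1, 0.1],   # mostly "oak tree"
    [0.1, 0.1, 0.6, 0.2],   # mostly "park bench"
    [0.0, 0.2, 0.1, 0.7],   # mostly "building"
]

semantic_map = [label_region(s) for s in frame_scores]
print(semantic_map)   # ['oak tree', 'park bench', 'building']
```

The interesting engineering lies upstream and downstream of this step: running the model in real time on a phone, and persisting the labels into a shared, ever-evolving map.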
Given the innovations required to achieve the necessary optical and computing performance, and to deliver it in a socially acceptable form factor, there’s a lot of skepticism about AR smart glasses. In the spirit of pioneers like Alan Kay, who took a long view of where technology was heading and accurately anticipated its evolution, we are not deterred by the challenges. As predicted in Kay’s 1972 Dynabook paper, the trend over the last several decades has taken us from mainframes to minicomputers to personal computers to laptops and now to smartphones, each generation increasing in performance as it shrank in size. Mark Weiser and other luminaries noted this trend as far back as the 1980s, predicting that our computing devices would ultimately disappear into the world around us — a concept Weiser described as “ubiquitous computing.”
It’s in this spirit that we have joined with our partners at Qualcomm to invest in a reference design for outdoor-capable AR glasses that can orient themselves using Niantic’s map and render information and virtual worlds on top of the physical world. Unlike the walled gardens others are attempting to create, our open consortium will enable many partners to create and distribute compatible glasses. By moving the primary computing surface from a smartphone that demands hands and attention to glasses, the friction for accessing data and services is radically reduced. And by overlaying those services on the real world, reality itself can appear malleable, enabling brand new experiences. The Pokémon will finally be able to truly walk among us!
Although we already have the first versions of hardware for internal R&D, we are just beginning this work, which we expect to continue over many years. It will evolve in fits and starts, just as past platform transitions have unfolded. Of course, we think games are a great way to explore this technology frontier. Games have been at the forefront of technology adoption since Atari brought some of the first microprocessors into the home with Pong. This trajectory was carried forward with innovations like the Nintendo Game Boy (perhaps the first successful consumer handheld computer) and computer games like Quake and World of Warcraft, which drove demand for internet connectivity in the home. We’ll be using games to explore the potential for AR smart glasses, and are eager to share our work with early adopters of this hardware.
Going back to our vision for software and content, we imagine a future of worlds that can be overlaid on the real world. For now, we’re calling these ‘reality channels’ to give the idea a name. Think of Pokémon GO, upgraded for smart glasses where the Pokémon wander through your local park, seeming to actually inhabit the world. In this future version, Pokémon appear to you as if they are really there, scurrying around passing pedestrians, hiding behind a park bench, or roaming in herds through your favorite park. Buildings might take on the pastel hues of the Pokémon universe; a 10-story Pokémon GO Gym might rise above your local shopping center. If you encounter another player on the street, they might even appear transformed into the guise of their in-game persona. Multiply this kind of channel by a thousand: Mario, Transformers, Marvel’s superhero universe, the world of Wakanda, Star Wars, Indiana Jones, Blade Runner, Sherlock Holmes, Nancy Drew, The Maltese Falcon — all of these and countless more will exist as reality channels that you can turn on, transforming your daily routine into something a bit more magical, intriguing, exciting — and most of all, a little more fun. Importantly, all of these experiences will be shared by countless other people, so that the adventure is a catalyst for spending time together and deepening social relationships.
But it’s not just games. Although we expect games and entertainment to be key drivers for this new platform, reality channels are a way of seeing the world that will power more activities that entertain, educate, guide, explain, and assist us, from assembly lines and construction sites to the most complex knowledge work, all without taking us away from the thing we do best: reality.
We have a responsibility to do all this in a way that respects the people using our services, as well as people who don’t. User privacy, responsible use, inclusive development processes, and the recognition and mitigation of AR technology’s potential impacts on society all need to be considered now, not after the fact.
Come build with us
The shift to the real-world metaverse represents a sea change in computing that’s as significant as earlier developments like personal computing, the internet, and mobile. Helping to achieve this meta-shift is our purpose at Niantic, but we recognize that it won’t be solely our work. Such a massive evolution will require the work of big companies, startups, and individuals around the world to become a reality. We have identified critical leverage points where we think Niantic can make a difference in moving this massive transformation forward: our own games and applications, the Lightship Platform, the map, and industry-spanning initiatives like our hardware reference design that leverage the expertise and investment of many towards a common goal.
I hope these thoughts help you to see what we see, which is the opportunity to do what we humans do best — harness technology to serve our needs, rather than the other way around. The future is what we make it, and we’re devoting our time and energy to building the future we want to live in, and want to pass on to the next generation.
We’ve got a lot of work to do, so if this appeals to you as a partner, a Lightship developer, or a potential Niantic employee, we’d love to hear from you!
– John Hanke, Founder & CEO, Niantic