The metaverse needs open infrastructure

Pre-pandemic, we did most things offline. 2020 brought a sea change: we worked, shopped, studied, cooked, watched movies, exercised, dated, and even got married over the Internet — in short, we began doing most things online by default. While it looked like the Brady Bunch credits, the moment we collectively began living online was actually V1 of the metaverse. VR and 3D will help make these experiences more immersive and lifelike, but the metaverse will be more than that. As we return to meatspace, after two years of cultivating online-only relationships, wearables offer a persistent, synchronous connection to our tribe. This is the next iteration of the metaverse: the perpetual blurring between our analog and digital lives.

It’s an exciting time. We’re simultaneously reentering and remaking our world. It’s too early to know the metaverse’s final form, but we hope it won’t be an MMO, accessed through proprietary hardware, controlled by a single company. No, it should look more like the early Web, built atop open and decentralized infrastructure. Web3 and NFTs push us toward direct distribution of content, with value accruing to communities and creators. Crypto provides disintermediated payments and compute infrastructure. And DAOs are consensus-driven corporations. What about communications? Teleporting me (my presence, my voice, my face, my avatar) to someone’s XR glasses, phone, or holographic display requires real-time, globally-distributed network infrastructure. Infrastructure that’s open, censorship-resistant, but also robust and scalable. That’s what we’re building at LiveKit: an open source, real-time communications stack for the metaverse.

This past summer, we released LiveKit as free, modern, open source infrastructure for running WebRTC at scale. Our repo was one of the fastest-growing this fall and has gone from zero to 2k stars on GitHub. Hundreds of developers have joined our community; they’ve fixed bugs, improved features like screen sharing and speaker detection, and bootstrapped SDKs for new platforms. Numerous projects and products, from tabletop games to classrooms to doctor’s offices and drones, are now streaming audio and video using LiveKit.
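For a sense of what running it yourself involves, here’s a minimal, illustrative sketch of a single-node server config. The key names follow the sample config.yaml in the repo, but exact fields and defaults may differ across versions, so treat the values below as placeholders rather than a reference:

```yaml
# Illustrative single-node livekit-server config (key names per the
# sample config.yaml; check the repo for your version).
port: 7880                 # HTTP/WebSocket port clients connect to
rtc:
  tcp_port: 7881           # TCP fallback for restrictive networks
  port_range_start: 50000  # UDP port range used for WebRTC media
  port_range_end: 60000
  use_external_ip: true    # advertise the node's public IP behind NAT
keys:
  # API key/secret pair your backend uses to mint client access tokens
  myapikey: myapisecret
```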

It’s only been four months, but the response has greatly surpassed our expectations. The best part is, this is just the beginning. We’re focused on three core areas as we build out this platform: developer experience, reliability, and scalability.

Developer experience

Today’s metaverse offers a wide variety of applications, but future experiences will have no existing analogue. When we began working on LiveKit, developers told us they’d struggled to adapt existing infrastructure to their use cases. They needed things like continuous bandwidth estimation, raw track access, or an SDK for a specific framework. To that end, we’ve designed flexible server and client APIs that are consistent and native across all major platforms. Over time we’ll add higher-level, plug-and-play abstractions. Most importantly, we listen and react quickly to community feedback, and our entire stack is open source (contributions encouraged!), so LiveKit improves faster than any alternative.

Reliability

Live shows, gatherings, meetings, and classes are just some of the experiences that require low-latency connections between people around the world. There’s a reason we’ve co-opted Zoom for most of these activities: it just works. The UX penalty for a dropped connection, call, or lagging stream is extremely high, second only to payment processing. At LiveKit, we spend a lot of our time on the finer details of network latency and bandwidth management, so that we can bring Zoom-level reliability to every application. We’re also working on a sophisticated telemetry system for both our team and developers to gauge QoS and react accordingly.

Scalability

The metaverse will eventually include every person on Earth. (We can discuss interplanetary comms another time.) That means people interacting together, online, at every scale: from tens in a meeting to millions at the World Cup. Right now, a single LiveKit gathering is capped at thousands of participants, and while that’s sufficient for most practical applications, it limits a developer’s imagination. The next version of our platform will be “metaverse-scale”: built for a world increasingly living online and capable of supporting interactive, million-person events. The only technical limitation will be a user’s bandwidth.

We’ve got a lot of work to do, but we’re energized by what we’ve seen people build. Tools to help loved ones connect with family overseas, students joining geographically inaccessible courses, and healthcare workers helping patients through their phones — all in small part enabled by LiveKit.

Since we started in late 2020, some incredible folks who shared our vision have reached out and asked to get involved. So, I’m excited to announce our initial core team of three is now 15 strong, with deep experience in audio, video, distributed systems, networking, cloud gaming, and netcode. We’ve also raised $7M in seed funding from Redpoint, Goat (Justin and Robin), Elad, Ev, Adam, Kayvon, Packy, Furqan, Max, Lenny, Emil, Sujay, Cailen, PrimeSet, and other amazing investors and operators.

There’s a lot more to talk about in the coming months — in particular, the 🤯 things people have built with LiveKit. Until then, come hang and let us know how we can help.