Streaming video from a canvas with WebRTC and React

With WebRTC you can live stream video from a canvas. This post is a step-by-step guide that shows you how. We use LiveKit’s WebRTC stack to build a real-time application for sending canvas video. Check out the full code.

VTuber Sample App Gif

A lot of people know WebRTC as the technology that powers video chats on the web. This is great for things like Google Meet, but it can be used to build much more than simple conference calls.

In fact, anything you can put on a canvas (that’s a lot of things!) can be sent as video into a WebRTC session.

Before we go into the details, note that the techniques here should apply generally to any stack. We use LiveKit to simplify the publishing flow. Follow this mini tutorial to learn how to set up your own LiveKit project.

Capturing a Canvas Using captureStream()

WebRTC works on the concept of MediaStreams, which are…uhhh…streams of media. In order to send a canvas into a session, we need to get a MediaStream from the canvas. Luckily, this is super easy using the HTMLCanvasElement.captureStream() API.
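Stripped of any framework, the core of the technique is a single call. Here's a minimal sketch, assuming the page already contains a `<canvas>` element:

```typescript
// Grab an existing canvas element from the page.
const canvas = document.querySelector("canvas") as HTMLCanvasElement;

// Capture it as a MediaStream at 30 frames per second.
// captureStream() lives on HTMLCanvasElement itself, so it works
// regardless of which rendering context drew the pixels.
const stream: MediaStream = canvas.captureStream(30);

// The stream contains a single video track sourced from the canvas.
const [track] = stream.getVideoTracks();
console.log(track.kind); // "video"
```

Everything below wraps this same call in React so the stream's lifecycle follows the component's.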

Let’s create a simple canvas that draws a circle and then create a MediaStream from it:

import { useEffect, useRef } from "react";

export const CircleCanvasPublisher = () => {
  const canvasRef = useRef<HTMLCanvasElement | null>(null);
  const mediaStreamRef = useRef<MediaStream | null>(null);
  
  // Draw a circle
  useEffect(() => {
    if (!canvasRef.current) return;
    const ctx = canvasRef.current.getContext("2d");
    if (!ctx) return;
    ctx.beginPath();
    ctx.arc(canvasRef.current.width / 2, canvasRef.current.height / 2, 50, 0, 2 * Math.PI);
    ctx.fillStyle = "blue";
    ctx.fill();
  }, []);
  
  // Create MediaStream
  useEffect(() => {
    if(mediaStreamRef.current || !canvasRef.current) return;

    // Create 30fps MediaStream
    mediaStreamRef.current = canvasRef.current.captureStream(30);
  }, [])

  return (
    <canvas width={300} height={300} ref={canvasRef} />
  )
}

In the above snippet we:

  • Created a canvas and stored a reference to it
  • Drew a small circle to the canvas (because why not?)
  • Created a MediaStream from the canvas at 30 frames per second and stored a reference to it
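One detail worth knowing about the frame-rate argument: passing `0` tells the browser not to capture frames automatically. Instead, you push a frame each time you call `requestFrame()` on the capture track, which is handy for canvases that repaint irregularly. A sketch, again assuming an existing `canvas` element:

```typescript
const canvas = document.querySelector("canvas") as HTMLCanvasElement;

// frameRate of 0: no frames flow until we explicitly request them.
const stream = canvas.captureStream(0);
const track = stream.getVideoTracks()[0] as CanvasCaptureMediaStreamTrack;

function repaintAndEmit() {
  // ...redraw the canvas here...

  // Emit exactly one frame into the stream.
  track.requestFrame();
}
```

For our circle demo a steady 30fps is fine, so we stick with `captureStream(30)`.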

Now that we’ve got the MediaStream, all that’s left is to publish it to your session. With LiveKit this is just a couple lines of code:

// At the top of the component
const { localParticipant } = useLocalParticipant()

// In the useEffect that creates the MediaStream
const publishedTrack = mediaStreamRef.current.getVideoTracks()[0]
localParticipant.publishTrack(publishedTrack, {
  name: "video_from_canvas",
  source: Track.Source.Camera
})

Putting it All Together

And, voila! That’s how simple it is to send a canvas as video. We used the same technique on a WebGL canvas for our VTuber streaming app.
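Nothing about `captureStream()` is specific to the 2D context; it captures whatever pixels end up on the canvas, which is why the WebGL case works identically. A minimal sketch:

```typescript
const canvas = document.createElement("canvas");

// Draw with WebGL instead of the 2D context.
const gl = canvas.getContext("webgl");
if (gl) {
  gl.clearColor(0.2, 0.4, 0.8, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT);
}

// captureStream() doesn't care which context produced the pixels.
const stream = canvas.captureStream(30);
```

The resulting stream publishes to a session exactly the same way as the 2D example below.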

Here’s the full code including some additional code to create a session using LiveKit:

import { useEffect, useRef } from "react";
import { Track } from "livekit-client";
import { useLocalParticipant } from "@livekit/components-react";

export const CircleCanvasPublisher = () => {
  const canvasRef = useRef<HTMLCanvasElement | null>(null);
  const mediaStreamRef = useRef<MediaStream | null>(null);
  const { localParticipant } = useLocalParticipant()
  
  // Draw a circle
  useEffect(() => {
    if (!canvasRef.current) return;
    const ctx = canvasRef.current.getContext("2d");
    if (!ctx) return;
    ctx.beginPath();
    ctx.arc(canvasRef.current.width / 2, canvasRef.current.height / 2, 50, 0, 2 * Math.PI);
    ctx.fillStyle = "blue";
    ctx.fill();
  }, []);
  
  // Create MediaStream
  useEffect(() => {
    if(mediaStreamRef.current || !canvasRef.current) return;

    // Create 30fps MediaStream
    mediaStreamRef.current = canvasRef.current.captureStream(30);
    
    // Publish the canvas video
    const publishedTrack = mediaStreamRef.current.getVideoTracks()[0]
    localParticipant.publishTrack(publishedTrack, {
      name: "video_from_canvas",
      source: Track.Source.Camera
    })
  }, [localParticipant])

  return (
    <canvas width={300} height={300} ref={canvasRef} />
  )
}

@/components/CircleCanvasPublisher.tsx

import { LiveKitRoom } from "@livekit/components-react";
import { CircleCanvasPublisher } from "@/components/CircleCanvasPublisher";

type RoomProps = {
  token: string
  wsUrl: string
}

export const Room = ({token, wsUrl}: RoomProps) => {
  return (
    <LiveKitRoom token={token} serverUrl={wsUrl} connect={true}>
      <CircleCanvasPublisher />
    </LiveKitRoom>
  )
}

@/components/Room.tsx

Join Us

You should now have everything you need to publish canvas media in a WebRTC session. If you have any feedback on this post or end up building something cool using or inspired by it, pop into the LiveKit Slack community and tell us. We love to hear from folks!