Our Blog

How to build a simple live streaming solution


Live streaming to HTML5 in 2016 is still annoyingly hard. Like stupidly annoying. So here is the state of things.

HTTP Live Streaming (HLS) isn’t natively supported by any desktop browser except Safari, although it is supported on mobile. This means you will need a Silverlight or Flash player to play the stream, which is great except that Chrome no longer supports either of them.

MPEG-DASH is better and is clearly the way forward. There is no native support in desktop browsers yet, but there are JavaScript implementations that allow its use via Media Source Extensions (MSE).

The problem with both MPEG-DASH and HLS is that they add a large delay to the feed, because they slice the video into segments which the browser then downloads separately.


I wanted to live stream a webcam broadcast from our office to our website, and I wanted a native HTML5 solution for the player. I had all but given up when, by chance, I discovered jsmpeg.

“jsmpeg is a MPEG1 Decoder, written in JavaScript”

First you will need a server to accept an incoming MPEG stream and distribute it to all connected browser sockets. Fortunately, the jsmpeg repository includes a simple Node.js server script, stream-server.js, that does just this.
jsmpeg.js then decodes the MPEG stream on the client and renders the frames to a canvas element.


If you don’t have a Node.js host, I recommend Digital Ocean. Their bottom-tier droplets start at $5 per month with 1 TB of traffic, which should more than cover my needs for this project.

Once connected to the Node server, simply clone the jsmpeg repository, use npm to install the WebSocket dependency, and then start the server, specifying your own password:

 git clone https://github.com/phoboslab/jsmpeg.git
 cd jsmpeg
 npm install ws
 node stream-server.js password


I will be using ffmpeg to push the webcam feed from our office to the public Node.js server. On Linux (I will be using a Raspberry Pi 3 running Raspbian), once you have an up-to-date version of ffmpeg installed, simply run:

ffmpeg -s 640x480 -f video4linux2 -i /dev/video0 -f mpeg1video \
  -b 800k -r 30 http://example.com:8082/password/640/480/

This will MPEG1-encode the video at 800kbps, at a resolution of 640×480, and at 30fps. The encoded video is then pushed to the Node server at the specified address and port.
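If you’re not sure what resolutions and pixel formats your webcam actually supports, ffmpeg can query the device and list them; this assumes your camera is at /dev/video0, as in the command above:

```shell
# List the pixel formats and frame sizes the v4l2 device advertises
ffmpeg -f video4linux2 -list_formats all -i /dev/video0
```

Pick an -s value from this list; asking for a size the camera can’t produce forces an extra scaling step or fails outright.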

Amend the resolution, bitrate and frame rate as necessary. On my terrible home broadband I get pretty good results with a 400kbps bitrate, 320×240 resolution and 24fps configuration. Ideally you want to make sure that the ffmpeg encoding speed stays above 1x.
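If you want to test the server and player before pointing a real webcam at them, ffmpeg’s built-in test source can stand in for the camera. This sketch assumes stream-server.js is running on the same machine with the password "password":

```shell
# Push a synthetic 320x240, 24fps test pattern to the local stream server
ffmpeg -f lavfi -i testsrc=size=320x240:rate=24 \
  -f mpeg1video -b 400k -r 24 \
  http://localhost:8082/password/320/240/
```

Because the test pattern is cheap to generate, this also gives you a quick read on whether your encoding settings can sustain 1x speed on the hardware in question.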

On Windows it’s a little different: we need to use dshow to specify the webcam input, and we need a real-time buffer. The following configuration works for me:

ffmpeg -s 320x240 -r 24 -f dshow -rtbufsize 500000k -i video="Logitech HD Pro Webcam C920" -f mpeg1video -b 400k -r 24 http://example.com:8082/yourpassword/320/240
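The device name passed to -i video= must match what DirectShow reports, exactly. If you don’t know your webcam’s name, ffmpeg can list the available devices for you:

```shell
# List DirectShow audio and video devices (the names are printed to stderr)
ffmpeg -list_devices true -f dshow -i dummy
```

Copy the video device name from this output verbatim, quotes and all, into the -i video="..." argument.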

Putting it in a webpage

Finally, to view the stream you need to include jsmpeg on your page, add a canvas element, and start the player, passing it the WebSocket client and the canvas element.
Like so:

	<canvas id="videoCanvas" width="640" height="480">
		Please use a browser that supports the Canvas Element, like
		<a href="http://www.google.com/chrome">Chrome</a>,
		<a href="http://www.mozilla.com/firefox/">Firefox</a>,
		<a href="http://www.apple.com/safari/">Safari</a> or Internet Explorer 10
	</canvas>
	<script type="text/javascript" src="jsmpg.js"></script>
	<script type="text/javascript">
		// Set up the WebSocket connection and start the player
		var client = new WebSocket('ws://example.com:8084/');

		var canvas = document.getElementById('videoCanvas');
		var player = new jsmpeg(client, {canvas: canvas});
	</script>