For over a year now, we have been producing high-quality 360º 3D video content for virtual reality headsets here at Purple Pill VR. We plan and script a production, shoot it using our (open-source) 16-GoPro rig, and then spend anywhere from a couple of days to a couple of weeks stitching and editing the videos. On the other end of the spectrum you have companies like NextVR who are actually able to live stream their videos directly from their camera into VR headsets around the world. So isn’t streaming much cooler than pre-recorded content?

Well, we have received several requests from clients for 360º live streaming, especially from the sports industry, which is used to live streaming every event it holds. But producing 360º (3D) video and streaming it are two totally different ball games, so we had not yet put much R&D into figuring out this whole streaming thing. To be honest, we had no idea what made a stream different from a regular video…

That was, until the guys from MOMENTUMXR told us they were going to live stream the Grammys red carpet event in 360º and were looking for an Android and iOS Cardboard app, as well as a web player, through which people could actually view their stream. And are you really going to say no to the Grammys?! You don’t, so we started educating ourselves and managed to successfully display the live stream in our apps and web player. Here is what we learned…


As soon as we started looking into streaming, the acronym HLS kept popping up. It stands for HTTP Live Streaming, a streaming format originally developed by Apple. The immediate benefit of HLS is that both iOS and Android support it, while a format like RTMP, for example, is currently only supported on Android. HLS works in an ingenious way: it simply chops the video up into small chunks of, say, 10 seconds each and then dynamically updates a .m3u8 index file, which is nothing more than a plain-text document containing links to the various parts of the chopped-up video.
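For reference, a minimal media playlist might look like this (the segment names and durations here are illustrative, not taken from an actual stream):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
fileSequence0.ts
#EXTINF:10.0,
fileSequence1.ts
#EXTINF:10.0,
fileSequence2.ts
```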





As you can see in the example .m3u8 file above, the videos are encoded as an MPEG-2 Transport Stream (.ts). As you are streaming, a new .ts segment is created every 10 seconds or so and its link is added to the .m3u8 index file. The URL of the .m3u8 file is in fact your stream URL. The beauty is that one index file can link to another index file, which enables you to set up the same stream at different resolutions and bitrates, allowing the viewer to automatically switch to the best stream for his or her internet connection and display device.
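That index-of-indexes setup looks roughly like this — a hypothetical master playlist in which the bandwidth values, resolutions, and file names are purely illustrative:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=640x360
360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
720p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=8000000,RESOLUTION=1920x1080
1080p.m3u8
```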





In this case your main stream URL would be the .m3u8 file above, which then links through to the various .m3u8 streams you set up at different data rates. Click here for a list of (non-360) HLS test streams.


Encoding Settings

If you have experience with 360º video, you know that encoding your 360º videos can be a pain, since every VR headset has different resolution and bitrate limitations. Well, I’m sorry to break it to you, but streaming these videos brings even more limitations with it. Looking at Apple’s HLS recommendations, you will see that the maximum resolution they recommend is 1080p, while we all know that even at a 4K (2160p) resolution a 360º video does not always look as sharp as we would like it to be, due to the fact that this resolution is spread out over an entire sphere.

As you could read in the previous section, the HLS stream files are not your regular .mp4 files, but sport a more exotic .ts extension and require a .m3u8 index file. While we haven’t figured out the best encoding settings for a 360º stream yet — we were mainly focused on creating the apps and web player, not the stream itself — we have figured out how you can use the powerful FFmpeg encoding tool to transform a regular .mp4 file into an HLS stream.

  1. Download a static build of FFmpeg from the FFmpeg website
  2. Open Terminal on Mac or cmd on Windows
  3. Then use the following code to encode an input file into a 1080p HLS stream:

    /Users/nick/Downloads/ffmpeg -v 9 -loglevel 99 -re -i /Users/nick/Desktop/input/myvideo.mp4 -an -c:v libx264 -profile:v baseline  -pix_fmt yuv420p -b:v 128k -bf 0 -maxrate 8500000 -bufsize 30000000 -flags -global_header -map 0 -f segment -segment_time 4 -segment_list /Users/nick/Desktop/1080p.m3u8 -segment_format mpegts /Users/nick/Desktop/1080pstream%05d.ts

    /Users/nick/Downloads/ffmpeg — the path to your downloaded FFmpeg binary (add .exe on Windows)
    /Users/nick/Desktop/input/myvideo.mp4 — the path to the 360º video you wish to encode
    /Users/nick/Desktop/ — the path where your .m3u8 and .ts files will be written


As I said, these encoding settings are far from perfect (in fact, they are pretty appalling), but at least they might give you a starting point to run some tests and to start experimenting.
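As another starting point: newer FFmpeg builds also ship a dedicated HLS muxer that writes the segments and the index file in one go. The command below is only a sketch — the bitrates and paths are placeholders, and as stated above we have not figured out ideal settings for a 360º stream:

```shell
ffmpeg -re -i myvideo.mp4 -an \
  -c:v libx264 -profile:v baseline -pix_fmt yuv420p \
  -b:v 5000k -maxrate 8500k -bufsize 30000k \
  -f hls -hls_time 4 -hls_list_size 0 \
  1080p.m3u8
```

Here `-hls_time` sets the target segment length in seconds and `-hls_list_size 0` keeps all segments listed in the playlist, which is what you want for video-on-demand tests.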


Capturing a 360º Live Stream

Now that you have a basic understanding of what an HLS stream actually looks like at the file level, let’s see how you can actually capture such a stream. You obviously need a camera, but is there a camera capable of live streaming 360-degree video?

Well, it appears there are quite a lot of options. For example, NextVR uses two side-by-side RED cameras to live stream 180º in stereoscopic 3D, Freedom360 sells the F360 Broadcaster rig based on 6 GoPros to create a full monoscopic 360º stream, or you could simply use 2 back-to-back GoPros equipped with Entaniya fish-eye lenses (I believe such a rig was used to stream the Grammys). If you’re looking for the lowest cost option, then you could consider the Ricoh Theta S, which has built-in HD live streaming support.


Most of the above-mentioned camera rigs use HDMI to stream the video from the camera to a capture device, which can be a PC or a laptop. However, you will either have to upgrade it with a 2-port or 4-port Magewell HDMI capture card, or use an HDMI-to-USB adapter to capture the live stream through a USB 3.0 port.

It is also a good idea to get yourself a USB hub and cables through which you can power your rig, so your batteries don’t run out halfway through your stream.


Stitching & Broadcasting a 360º Live Stream

Once you have a camera that is streaming to your capture device, you still need to “stitch” these individual images into a seamless 360º panorama… on the fly! For obvious reasons you will not be able to manually stitch your videos while live streaming, so you’ll have to find a way to automate this process. Luckily this is exactly what Video Stitch had in mind when they created their Vahana VR live streaming software. Vahana VR handles the stitching process for you and can then output the stream in RTMP format. Click here to watch a video which explains the Vahana VR workflow.

Once you have this 360º RTMP stream set up correctly, you need a Content Delivery Network (CDN) or streaming service that will distribute your video as an adaptive HLS stream in various resolutions and bitrates. We ran some tests with the Wowza media streaming service and were very satisfied with it. Wowza was also used as the CDN for the 360º live stream of the Grammys.
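To give you an idea of the shape of that hand-off, this is roughly how you could push a (pre-stitched) file to an RTMP ingest point with FFmpeg — the hostname, application name, and stream name below are hypothetical, and in a real live setup Vahana VR outputs the RTMP stream for you:

```shell
# -re reads the input at its native frame rate, simulating a live source;
# -c copy passes the video through untouched, and -f flv wraps it for RTMP.
ffmpeg -re -i stitched360.mp4 -c copy -f flv rtmp://example.entrypoint.wowza.com/live/myStream
```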


Viewing a 360º Live Stream

You’ve purchased all the gear, done all the hard work and finally have a working 360º HLS stream, but how will your audience be able to view it? Well, according to The Verge, YouTube will soon support 360º live streaming, which would be an amazing development! Until then, however, you might have to look for a different solution, especially if you also want your stream to be viewable on VR headsets other than Android Cardboard.

Since there seemed to be no player available yet capable of playing back a 360-degree live stream, we decided to develop our own. Our friends at The Ambassadors focused on developing the web player, which turned out to be very doable, thanks to two brilliant open-source projects: WebVR and hls.js. WebVR allowed us to easily create a 360º video player in the browser, and hls.js allowed us to accept the HLS stream.
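In case you want to experiment with this combination yourself, the glue between the two libraries is surprisingly small. The sketch below assumes the hls.js API and a `<video>` element with a hypothetical id; WebVR (via three.js) can then sample that element as a video texture on the inside of a sphere:

```javascript
// Feed the HLS stream into a regular <video> element via hls.js.
// The 360º player then uses this element as its video source.
var video = document.getElementById('video360'); // hypothetical element id
var hls = new Hls();
hls.loadSource('https://example.com/live/playlist.m3u8'); // hypothetical stream URL
hls.attachMedia(video);
hls.on(Hls.Events.MANIFEST_PARSED, function () {
  video.play();
});
```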

>> Click here to test our 360º HLS web player <<


AppFactory Logo

We used our AppFactory to create dedicated 360º streaming apps for Gear VR, as well as Cardboard on both iOS and Android. Streaming to Oculus should also be possible, but this still needs a bit more development time on our end.


Advanced 360º Streaming: Cubemaps & Pyramids

Now, we could easily stop our 360º streaming discussion here, but you’ve actually just reached the point where it becomes interesting! As mentioned before, bandwidth is a serious issue when it comes to streaming 360-degree video, since you would preferably have a resolution of 4k or more, which is not realistic with the current state of technology.

That’s why companies like Facebook, who have integrated 360º video playback into their platform, came up with some ingenious tricks to significantly reduce the file size of 360º videos: they transform the default equirectangular video format into a cube or a pyramid, saving between 20 and 80% in file size! Watch the video below for more details:



The best part of this story is that Facebook decided to open-source their cubemap FFmpeg code! Here is an example of a 360º 3D video production we shot during the Amsterdam Dance Event last year which we then converted to a cubemap using Facebook’s code:


Facebook 360 Cubemap Example Video

>> Click here to download the example cubemap video (67MB) <<


So with this open-source code, everyone is now capable of transforming their equirectangular videos into cubemaps! At least, that was the idea… In practice it turns out to be quite a technical challenge to integrate Facebook’s cubemap functionality into FFmpeg, since it requires you to download the FFmpeg source code and then compile the Facebook code into it.
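For the adventurous, the compile-it-yourself route looks roughly like this. This is only a sketch based on our understanding of Facebook’s instructions — the exact file names and registration steps depend on the versions of FFmpeg and of Facebook’s transform repository:

```shell
git clone https://github.com/FFmpeg/FFmpeg.git ffmpeg
git clone https://github.com/facebook/transform.git
cp transform/vf_transform.c ffmpeg/libavfilter/
# register the new filter in ffmpeg/libavfilter/Makefile and allfilters.c,
# then configure and build FFmpeg with the filter included:
cd ffmpeg && ./configure && make
```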

Since we at Purple Pill VR love to make your life as a VR content producer a bit easier, we took care of all these technical hassles and want to offer you an OS X version of FFmpeg with the cubemap code enabled! Fill in your name and email address in the form below to receive a download link in your mailbox:



You can then use Terminal to run the following command:

ffmpeg -i input.mp4 -vf transform=input_stereo_format=MONO:w_subdivisions=4:h_subdivisions=4:max_cube_edge_length=512 output.mp4


You can change MONO to TB if you are encoding a stereoscopic top-bottom 360 video. Have fun!




14 thoughts on “Live Streaming 360 Degree Video to VR Headsets: The Basics”

  1. Thanks! Just to understand the scope better: The cube-based and pyramid-based formats are what eventually reach the 360 player? So the *player* also needs to know how to cope with those formats?

  2. Is cube-map supported with the Python Injector? Do we still need to use the injector to let FB know that the video is a Spherical Video? Is there a special injector for the cube map 360VR format?

    1. Hi Alex, the cube-map is not supported in the Python injector. If you upload to FB, they encode it to a cube-map for you.

  3. Wow, great post.
    Is it possible that you elaborate further on the way you managed the hls.js handling within webvr? That seems to be a great solution.

    1. Hi Rob, we basically just included both webvr and hls.js libraries and used the hls.js input as the webvr input

  4. Thanks for sharing the link! One thing I didn’t see mentioned is the use of emerging codecs such as HEVC and VP9. I do think we’ll see these used in an adaptive context (HLS and DASH respectively) in the very near future.
