OwlLive captures user-defined video inputs and stitches them with pre-defined stitching templates to provide live stitching and streaming, supporting real-time, high-quality 3D cinematic virtual-reality video at up to 4K resolution and 60 fps.
- Linux (Ubuntu 14.04 preferred) / Windows (Windows 8 preferred)
- NVIDIA graphics card with Compute Capability 5.2 or higher
- NVIDIA graphics driver and CUDA toolkit
- Multi-channel memory architecture
- Video capture card driver
- Install python3
- Download CUDA 7.5 from https://developer.nvidia.com/cuda-downloads and install the .deb package
- sudo apt-get update
- sudo apt-get install cuda
- sudo apt-get install libsdl-dev
- sudo apt-get install libva-dev
- Start the software by running “start-linux.sh” or “start-win.bat”.
- Select inputs. In the Input tab, all available video capture inputs will be automatically loaded.
Select all inputs to be stitched and streamed, and deselect all inputs that are not required.
– Frame rate: in the “FPS” input box, specify the capturing frame rate.
– Crop: to remove black areas on the left/right side of a video input, specify the left margin (in pixels) in the first “Crop” input box and the width of the input after cropping in the second “Crop” input box.
– Save Images: OwlLive takes a snapshot of all selected inputs at the same instant and saves them to a folder, which is convenient for generating stitching templates.
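The crop settings above amount to a simple horizontal slice of each frame. A minimal sketch of that operation (the function name and frame representation are illustrative, not OwlLive's API):

```python
import numpy as np

def crop_input(frame: np.ndarray, left_margin: int, width_after_crop: int) -> np.ndarray:
    """Drop `left_margin` pixels from the left edge and keep the next
    `width_after_crop` pixels, mirroring the two "Crop" input boxes."""
    return frame[:, left_margin:left_margin + width_after_crop]

# Example: a 1920x1080 frame with 100-pixel black bars on each side.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
cropped = crop_input(frame, left_margin=100, width_after_crop=1720)
# cropped.shape == (1080, 1720, 3)
```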
- Use stitching software (e.g. Hugin or PTGui) to generate a stitching template. Users can use the images saved via “Save Images” in the previous step to generate the template.
- Import the stitching template. Click the Template tab. Here users can import the generated stitching template.
For 2D videos:
Click the button “Load Template (Left eye / non-3D)” on the left side and select the generated template from storage.
For 3D videos:
Check the “3D” checkbox, and click the button “Load Template (Left eye / non-3D)” to import the template for left eye, and click the button “Load Template (Right eye)” to import the template for the right eye.
For 3D videos by camera group (e.g. Google Jump):
Check the “3D” checkbox and the “Split on longitude for first X inputs” checkbox, specify the number of cameras on the equator of the capture rig in the following input box, and use the same template for both “Load Template” buttons. When this option is selected, input 0 must be at the center of the image (longitude 0), input 1 to the left of input 0, and so on (i.e. the cameras are labeled counter-clockwise when viewed from the top). This option only takes effect if it is selected before loading the template.
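Under the labeling convention above (input 0 centered at longitude 0, each later input one slot further to the left), the center longitude of each equatorial camera can be sketched as follows. This helper is illustrative only, and it assumes longitude decreases to the left in the panorama; it is not part of OwlLive:

```python
def camera_longitude(index: int, equator_count: int) -> float:
    """Center longitude (degrees) of equatorial camera `index`, assuming
    input 0 sits at longitude 0 and each subsequent input is one slot to
    its left (counter-clockwise in the top view)."""
    step = 360.0 / equator_count
    lon = -index * step
    # Wrap into the (-180, 180] range used for longitudes.
    while lon <= -180.0:
        lon += 360.0
    return lon

# With 8 equatorial cameras, input 1 is centered 45 degrees left of input 0:
# camera_longitude(1, 8) == -45.0
```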
- Start stitching/streaming
Now that the inputs and stitching templates are all specified, we can start the stitching/streaming process.
Click the Run tab. Here users can specify the parameters for stitching and streaming.
In “Panorama Settings”, users can specify:
– Stitching algorithm: “Linear” is the fastest; “Multi-band 128” is the slowest but gives the highest quality.
– Width of the panorama video.
– Height of the panorama video.
On the right side, users can specify the streaming method, including:
– Apple HLS (prerequisite: an HTTP server)
– Save the panorama video to local file
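Templates produced by Hugin/PTGui commonly use an equirectangular projection, which covers 360°×180° and therefore has a 2:1 width-to-height ratio. A quick sketch for picking matching width/height values for the settings above (the helper is illustrative, and the 2:1 assumption only holds for equirectangular output):

```python
def equirect_size(width: int):
    """Return a (width, height) pair for a 2:1 equirectangular panorama.
    Assumes the stitching template uses an equirectangular projection."""
    if width % 2 != 0:
        raise ValueError("width must be even for a 2:1 panorama")
    return width, width // 2

# A 4K-wide panorama would be 3840 x 1920.
```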
In each tab, users can enable/disable streaming via the selected method; multiple streaming methods can run simultaneously. Video encoding parameters (bitrate, codec, GOP size, etc.) can be specified in each tab. Once the parameters are set, click the “Start” button to start live stitching and streaming: OwlLive first prepares the resources (showing “Preparing”) and then starts the live stitching and streaming process (showing “Running”). Click the “Stop” button to stop the process (showing “Not running”).
Users can have a live preview of the stitched panorama video.
First, in the Run tab, check the “Enable” checkbox in “Preview Settings”. The width/height of the preview video can also be specified.
Once OwlLive shows “Running” in the “Run” section, click the Preview tab to view the panorama video. The actual delivered frame rate is also displayed.
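The delivered frame rate shown in the Preview tab is, conceptually, the number of frames received over the elapsed time. A small illustrative sketch of that measurement (not OwlLive's internal code):

```python
def delivered_fps(timestamps):
    """Average frame rate from a list of frame arrival times in seconds."""
    if len(timestamps) < 2:
        return 0.0
    span = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / span

# 61 frames arriving every 1/60 s span exactly one second -> 60.0 fps.
```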