The default method produces the choppiest video due to its large number of stitching failures. Also notice that some frames suffer from serious perspective jerkiness. Interestingly, if you pause the video, you will find that some "jerky" frames actually look fine statically.
In contrast to the default method, SkyStitch delivers a much smoother video experience thanks to several optimization techniques. Stitching can still fail if there are too few inliers or the estimated homography H is ill-formed; in that case prediction helps recover the lost frames. When both stitching and optical flow succeed, the fusion procedure kicks in to further reduce perspective jerkiness.
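The acceptance logic above can be sketched as follows. This is a minimal illustration, not SkyStitch's actual implementation: the inlier threshold, conditioning check, and the `stitch_or_predict` helper are all assumptions, and "prediction" is simplified to reusing the previous homography.

```python
import numpy as np

def stitch_or_predict(H_new, num_inliers, H_prev,
                      min_inliers=15, max_cond=1e7):
    """Accept H_new only if RANSAC found enough inliers and the
    homography is well-formed; otherwise fall back to a predicted
    homography (here, simply the previous one).
    Thresholds are illustrative, not SkyStitch's actual values."""
    if H_new is None or num_inliers < min_inliers:
        return H_prev, False
    # An ill-formed H is nearly singular or badly conditioned,
    # which produces the extreme perspective distortion seen in
    # "jerky" frames.
    if abs(np.linalg.det(H_new)) < 1e-8 or np.linalg.cond(H_new) > max_cond:
        return H_prev, False
    return H_new, True

H_prev = np.eye(3)
H_good = np.array([[1.0, 0.02, 5.0],
                   [0.01, 1.0, -3.0],
                   [0.0, 0.0, 1.0]])

# Plenty of inliers and a well-conditioned H: accept it.
H, ok = stitch_or_predict(H_good, num_inliers=40, H_prev=H_prev)
# Too few inliers: fall back to the predicted (previous) homography.
H2, ok2 = stitch_or_predict(H_good, num_inliers=5, H_prev=H_prev)
```

In a real pipeline the fallback would extrapolate camera motion (e.g. from optical flow) rather than freeze the last homography, but the accept/reject structure is the same.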
When stitching more than two videos, the first and last videos may not be perfectly aligned because of errors accumulated in the estimated homographies. Without loop closing you would see a noticeable misalignment between the first and last videos. Our simple and fast loop closing approach distributes this misalignment error across the estimated homographies, making the end result much more pleasant.
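The idea of distributing the loop error can be illustrated with a simplified model in which each pairwise alignment is reduced to a 2D translation (real homographies have eight degrees of freedom, so this is only a sketch of the principle, and `close_loop` is a hypothetical helper, not SkyStitch's API):

```python
import numpy as np

def close_loop(offsets):
    """offsets[i] is the estimated 2D translation from camera i to
    camera i+1, with the last entry mapping the final camera back to
    the first. In a perfect loop the offsets sum to zero; any residual
    is the accumulated alignment error, which we spread evenly across
    all pairwise estimates."""
    offsets = np.asarray(offsets, dtype=float)
    residual = offsets.sum(axis=0)        # accumulated loop error
    correction = residual / len(offsets)  # distribute it uniformly
    return offsets - correction

# Three cameras in a loop; a perfect loop would sum to (0, 0),
# but these raw estimates sum to (1, 1).
raw = [[10.0, 0.0], [-4.0, 3.0], [-5.0, -2.0]]
closed = close_loop(raw)
# After closing, the corrected offsets sum to (0, 0): the first and
# last views line up, and no single pair absorbs the whole error.
```

Spreading the correction uniformly means each pairwise alignment is perturbed only slightly, which is why the misalignment becomes visually unnoticeable rather than being dumped onto one seam.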
SkyStitch is able to stitch 12 concurrent HD video streams at 20 fps.
In this demo a person (me!) walked across the field. Since the cameras are synchronized using GPS clocks, there is no ghosting effect. However, the white balance was not synchronized: the color of the grass differs slightly between the two images, and there is a noticeable seam near the center.
In this demo not only the cameras but also the white balance is synchronized. Color is consistent across the two images and the result is perfectly seamless.
The two quadcopters were buffeted by strong wind turbulence due to bad weather. SkyStitch recovered most of the lost frames; you may still notice that a few were dropped, but there are no bad frames in the stitched video.