Sunday, November 24, 2019

FFMpeg Transitions -- Part 1


As I look back at the content of this blog, a good chunk of the posts revolve around FFMpeg, roughly 1/4 of them to date.  I've been playing with FFMpeg for years, mostly out of curiosity, as I lack the creativity or ambition to create any significant media content.  I think I'm drawn to this topic for a few reasons: 1) I first used it out of necessity some years ago to transcode videos for image-detection software, 2) it's extremely powerful and a tribute to the sophisticated command-line utilities that Unix emphasizes, and 3) despite its power, its documentation is IMO lacking, so I feel obligated to a degree to document what I discover.

This form of topic also fits well with my goal for this blog: short, consumable content that can be authored in spare hours of the evening.  I intend to depart from that format temporarily, however, and author a multi-part series on video transition effects, to give the topic sufficient attention and hopefully shed some light on the underlying mechanics used to create these effects.

Let's skip to the end temporarily and look at the result we will be shooting for.  We will use two input videos of the legendary duo and perform transitions between the two using various transition effects.  Although these videos are of static images, the same effects can be applied to dynamic videos.
So, that's our bogey; we will tackle each effect one-by-one in this and future posts.

Fundamentals

The effects all build on a few FFMpeg and mathematical fundamentals; we will try to set the stage in this section.

Two key FFMpeg filters enable these transitions: overlay and blend.

The overlay filter allows placing a video or image atop another and can look something like this:

We create this effect by instructing FFMpeg to first render the first image, then render the second image pinned at position (x,y).  Altering the (x,y) position of the second image over time can give the appearance of the image moving.

The blend filter allows blending two images or videos into an output video, giving the appearance of one video melting and morphing into the other; it looks something like this:

This effect is created by instructing FFMpeg to render each pixel as a composite of both input videos.  For example, we can tell FFMpeg to apply 50% of the first video and 50% of the second video to get something that looks like the above; a pixel value of 200 in the first video blended with 100 in the second becomes 0.5*200 + 0.5*100 = 150.  It is similar in nature to the effect you'd get if you printed the frames on transparencies (i.e. clear plastic), lined them atop one another, and held them up to a strong light.

These two filters will be the basis of future sections, so we'll spend a bit more time with them here.

We need to start with two input videos; they must have the same dimensions and resolution, and it helps if they are of the same duration.


image01.mp4



image02.mp4
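Before filtering, it is worth confirming the two inputs actually match.  A quick sketch using ffprobe (the fields queried here are just an illustrative choice) reports each video's dimensions and duration:

$ ffprobe -v error -select_streams v:0 -show_entries stream=width,height,duration -of csv=p=0 image01.mp4
$ ffprobe -v error -select_streams v:0 -show_entries stream=width,height,duration -of csv=p=0 image02.mp4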

An example usage of the overlay filter takes the form:
$ ffmpeg -i image01.mp4 -i image02.mp4 -filter_complex "[0:v][1:v]overlay=x='W/4':y=0[out]" -map "[out]" -y example01.mp4

In short, the above command says: render an image01.mp4 frame, then overlay the corresponding image02.mp4 frame at (x=W/4, y=0).
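To illustrate how the (x,y) expressions steer the overlay, here is a variation of the same command (a sketch; example01b.mp4 is just an illustrative output name) that pins the overlay a quarter of the way across and a quarter of the way down:

$ ffmpeg -i image01.mp4 -i image02.mp4 -filter_complex "[0:v][1:v]overlay=x='W/4':y='H/4'[out]" -map "[out]" -y example01b.mp4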



An example of the blend filter takes the form:
$ ffmpeg -i image01.mp4 -i image02.mp4 -filter_complex "[0:v][1:v]blend=all_expr='A*(0.5)+B*(0.5)'" example02.mp4

In short, the above command says to blend 50% of the first video's frame with 50% of the second video's frame, looking like this:
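The weights don't need to be equal.  As a sketch (same inputs, illustrative output name), weighting the first video at 75% and the second at 25% keeps the first image dominant:

$ ffmpeg -i image01.mp4 -i image02.mp4 -filter_complex "[0:v][1:v]blend=all_expr='A*(0.75)+B*(0.25)'" example02b.mp4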

Future video transitions will make heavy use of these filters, but instead of using static values the expressions will be based on time and/or pixel position, making for more sophisticated effects.  FFMpeg filter expressions can make use of internal variables and functions; let's consider the ones we may make use of:

Variables

W -- width of video
H -- height of video
X -- pixel x location
Y -- pixel y location
t/T -- time (in seconds)
Note: variables are filter-specific; for example, the overlay filter exposes time as lowercase t, while the blend filter uses uppercase T

Functions

between(i,min,max) -- returns 1 if min <= i <= max, 0 otherwise
lte(i,j) -- returns 1 if i <= j (less than or equal to), 0 otherwise
gte(i,j) -- returns 1 if i >= j (greater than or equal to), 0 otherwise
sqrt(i) -- square root of i
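To make these concrete, here is a sketch (an illustrative example, not one of the transitions we are building toward; halfAndHalf.mp4 is just an illustrative output name) that uses lte() inside a blend expression to show the left half of the first video and the right half of the second:

$ ffmpeg -i image01.mp4 -i image02.mp4 -filter_complex "[0:v][1:v]blend=all_expr='if(lte(X,W/2),A,B)'" halfAndHalf.mp4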

Time-Based Expressions

Applying time-based expressions opens the door to a number of sophisticated effects.  Suppose we use the following equation:

fraction = T / T0

The T variable is an internally available time variable, and T0 is the duration of the video.  The result is a fraction that runs from 0.0 at the start of the video to 1.0 at the end.  This concept can be used to apply dynamic filters.

For example, the overlay x position can be driven by the equation x = (t/T0)*W.  The result: x begins at 0 and ends at the width of the video, linearly traversing across the screen in T0 seconds.
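As a sketch of that equation in action (assuming a T0 of 5 seconds and an illustrative output name), the overlay slides across the frame over the first five seconds, then continues off the right edge:

$ ffmpeg -i image01.mp4 -i image02.mp4 -filter_complex "[0:v][1:v]overlay=x='(t/5)*W':y=0[out]" -map "[out]" -y slideAcross.mp4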

Wipe Right Transition

$ ffmpeg -i image01.mp4 -i image02.mp4 -filter_complex "[0:v][1:v]overlay=x='min(0,-W+(t/1)*W)':y=0[out]" -map "[out]" -y wipeRight.mp4

In the above example we are making use of the previous time-based location equation with a minor tweak: specifically, the use of the min() function.

We start the overlay far left, outside the rendering dimensions at x = -W, then slowly move the x start location to the right.  The min() function prevents the x overlay position from exceeding 0, snapping the final location directly on top of the lower image.  Also of note, T0 in this case is defined as 1 second (the t/1 term); this means the wipe effect completes within 1 second.  Tweaking that value will speed up or slow down the motion.
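For instance, stretching the denominator to 3 (a sketch, with an illustrative output name) slows the same wipe to roughly three seconds:

$ ffmpeg -i image01.mp4 -i image02.mp4 -filter_complex "[0:v][1:v]overlay=x='min(0,-W+(t/3)*W)':y=0[out]" -map "[out]" -y wipeRightSlow.mp4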

Wipe Left Transition

Let's reverse the effect, moving the overlay from right to left:
$ ffmpeg -i image01.mp4 -i image02.mp4 -filter_complex "[0:v][1:v]overlay=x='max(0,W-(t/1)*W)':y=0[out]" -map "[out]" -y wipeLeft.mp4

This is a similarly natured equation to the one above, only starting at x = W and moving left.  The max() function halts the position at its final location of 0.

This is a reasonable stopping point; we will put a pin in it for now and follow up with additional effects in later posts.

Cheers.


