Sunday, September 29, 2019

FFMpeg + Imagemagick Perspective Mapping

Calvin and Hobbes shared many a lesson in perspective.  While this post will be far from a quality life lesson, it will focus on perspective...specifically, perspective mapping in image processing.

In its simplest form, this means projecting a video onto an angled surface, giving the appearance of casting it onto a three-dimensional object.  Such feats are not directly supported by FFMpeg, but pair it with Imagemagick and you can accomplish them.

Let's jump ahead to see what we're trying to accomplish, then walk through the steps to do so.

Let's say we want to project a video like this;
Why would you want to do such a thing?  Welp, perhaps you have a sales video, and while the presenter is pitching your product you want to show the product in a sub-window; rather than rendering it in a flat 2D manner, you want it to look like it has some depth.  Or perhaps you want to cast it onto a computer/phone/tablet screen (real or virtual).  You can do either by overlaying the perspective-mapped video atop your full video.

So, how do you accomplish it?  Let's call it a three-step process;

  1. Extract the frames and audio from the input video using FFMpeg.
  2. Feed each frame through the Imagemagick distort perspective utility.
  3. Rejoin the resultant frames and audio into the output video.

Step 1 -- Extract the Frames/Audio
$ ffmpeg -y -i ~/Downloads/BigBuckBunny.mp4 frame%05d.png audio.mp3

Since this process disassembles a video into raw frames and later reassembles them along with the audio, it's important to retain the same frame rate as the input video; otherwise the audio will be out of sync in the output video.  For example, our input video is 24fps; if you reassemble the frames at 25fps, the video will run ahead of the audio.
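To put a number on that drift, a quick back-of-the-envelope check (assuming a hypothetical 60-second clip):

```shell
# 60 seconds of 24fps video yields this many frames
frames=$((60 * 24))
echo "$frames frames"        # 1440 frames

# replayed at 25fps those same frames span fewer seconds, so video runs ahead
awk -v f="$frames" 'BEGIN { printf "%.1f seconds\n", f / 25 }'   # 57.6 seconds
```

That's roughly 2.4 seconds of drift per minute of footage, which is very noticeable.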

Step 2 -- Run through Distort Perspective
Let's jump into the command first, then spend a bit of time describing how we came up with the specifics.  Our input video is 1280x720; we wish to maintain those dimensions and project the video onto a diagonal plane within the frame.

$ find . -name "frame*png" -exec convert {} -matte -virtual-pixel transparent -distort Perspective '0,0,170,0 0,720,170,720 1280,720,854,500 1280,0,854,100' -background none {} \;

The distort perspective utility requires that you provide a mapping from the original frame coordinates to the destination coordinates.  In our example;

  • orig frame (0,0) => mapFrame (170,0)
  • orig frame (0,720) => mapFrame (170,720)
  • orig frame (1280,720) => mapFrame (854,500)
  • orig frame (1280,0) => mapFrame (854,100)
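The same distortion can be parameterized and looped, which makes the corner mapping easier to tweak.  A sketch (the `convert` call is guarded so nothing runs if Imagemagick is absent; frames are edited in place, as in the find version above):

```shell
# src x,y -> dst x,y for each corner: top-left, bottom-left, bottom-right, top-right
coords="0,0,170,0 0,720,170,720 1280,720,854,500 1280,0,854,100"

if command -v convert >/dev/null 2>&1; then
  for f in frame*.png; do
    [ -e "$f" ] || continue   # no frames extracted yet
    convert "$f" -matte -virtual-pixel transparent \
            -distort Perspective "$coords" -background none "$f"
  done
fi
```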

Details for this distortion effect can be found in the Imagemagick documentation on perspective distortion;


Step 3 -- Reassemble Frames to Output Video
$ ffmpeg -r 24/1 -i frame%05d.png -i audio.mp3 -c:a copy output.mp4

Specifying the same framerate as our input video (24fps), we can reassemble the frames and stay synchronized with the audio.
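Rather than hard-coding the rate, it can be read back from the source with ffprobe.  A sketch (the calls are guarded so nothing runs when ffprobe/ffmpeg or the files aren't present; filenames match the earlier steps):

```shell
src="$HOME/Downloads/BigBuckBunny.mp4"

# read the frame rate back from the source; fall back to 24/1 when
# ffprobe or the source file isn't available
if command -v ffprobe >/dev/null 2>&1 && [ -f "$src" ]; then
  rate=$(ffprobe -v error -select_streams v:0 \
                 -show_entries stream=r_frame_rate \
                 -of default=noprint_wrappers=1:nokey=1 "$src")
else
  rate="24/1"
fi
echo "reassembling at $rate"

# ffmpeg accepts the fractional form (e.g. 24/1) directly
if command -v ffmpeg >/dev/null 2>&1 && [ -f frame00001.png ]; then
  ffmpeg -r "$rate" -i frame%05d.png -i audio.mp3 -c:a copy output.mp4
fi
```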

The result will look something like this;


Now go ahead and make something cool.
Cheers.




Sunday, September 22, 2019

FFMpeg Blur Filter


Whether it be a loose Johnson, nip slip, license plate, or simply a desire to draw the viewer's attention to a specific area of a video, a blur filter comes in handy at times.  Being able to apply a contoured filter is even better.  This post focuses on FFMpeg's boxblur filter, which does exactly that.  Let's get to it.

The filter takes two inputs: 1) a black-and-white filter mask, and 2) the video file.  The mask defines the blurred/unblurred areas within the video frame.

Let's look at a simple rectangular blur mask;
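Incidentally, if you need to conjure such a mask, Imagemagick can draw one.  With alphamerge, white areas of the mask let the blurred copy show through while black areas keep the original.  The geometry below (a sharp centered rectangle inside a blurred border, at an assumed 1280x720 frame size) is purely illustrative; match it to your own footage:

```shell
# hypothetical 1280x720 mask: white border gets blurred, black center stays sharp
W=1280; H=720
x1=$((W / 4));     y1=$((H / 4))        # 320,180
x2=$((3 * W / 4)); y2=$((3 * H / 4))    # 960,540

if command -v convert >/dev/null 2>&1; then
  convert -size "${W}x${H}" xc:white -fill black \
          -draw "rectangle $x1,$y1 $x2,$y2" boxFilter.png
fi
```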


$ ffmpeg -i clip.mp4 -loop 1 -i boxFilter.png -filter_complex "[0:v][1:v]alphamerge,hue=s=0,boxblur=5[fg]; [0:v][fg]overlay[v]" -map "[v]" -map 0:a -c:a copy output1.mp4

Results in a video with a blurred border, as follows:

Similarly, we can apply a circular mask filter;

$ ffmpeg -i clip.mp4 -loop 1 -i circleFilter.png -filter_complex "[0:v][1:v]alphamerge,hue=s=0,boxblur=5[fg]; [0:v][fg]overlay[v]" -map "[v]" -map 0:a -c:a copy output0.mp4


The filter mask can be as complex as you need.

Cheers.

Sunday, September 15, 2019

Pausing/Resuming Processes

For those of you not born in the Elizabethan Era, you may be surprised to learn that the norm back in the ‘days of old’ was crowds of folks using a shared computer server. If you had administration rights, you could hoard the server resources and divvy out the scraps to the serfs, but you’d need to prepare for a revolt from the sun-deprived masses.


I recall one of our professors had a long-running (weeks, if not months) simulation that he ran on the server. It ate up memory and CPU resources like Cookie Monster in a room full of Oreos. Rather than disrupt the masses, he scheduled it to run in the wee hours and paused it during peak hours. How he did it always eluded me, but I’ve recently found a means to do just that, though it’s unclear whether this is how he did it.

Most of you are somewhat familiar with the ‘kill’ command: ‘kill -9 -1’ kills all of your processes, while ‘kill -9 [pid]’ is more typically used to kill a particular process. You supply a kill signal, and two signals in particular allow you to pause and resume a process as you wish.

‘kill -STOP [pid]’ will suspend a process, and ‘kill -CONT [pid]’ will resume it. Slap this pair in a crontab and you could suspend/resume a particular process based on time/day/…
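For instance, crontab entries along these lines would suspend a job during working hours and resume it in the evening (PID 12345 is a placeholder; in practice you'd capture the real PID when launching the job):

```shell
# print example crontab entries; 12345 stands in for the real process id
cat <<'EOF'
# suspend during working hours, resume in the evening (weekdays)
0 8  * * 1-5 kill -STOP 12345
0 18 * * 1-5 kill -CONT 12345
EOF
```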

$ kill -STOP [pid]

This will pause the specified process; while it sits suspended, its memory can be paged out to disk, freeing resources up for current users.

$ kill -CONT [pid]

Following with this command later in the day will resume the process, letting it pick up where it left off.  A four-day process can be toggled on and off over the course of a month, making use of idle time without disrupting current users.
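The pair is easy to sanity-check with a throwaway process; here `sleep` stands in for the long-running simulation:

```shell
sleep 60 &               # throwaway stand-in for the long-running job
pid=$!

kill -STOP "$pid"
ps -o stat= -p "$pid"    # prints a state beginning with 'T' (stopped)

kill -CONT "$pid"
ps -o stat= -p "$pid"    # no longer 'T' once resumed

kill "$pid"              # clean up the demo process
```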

Cheers.

Sunday, September 8, 2019

FFMpeg Deshake Filter



While I can't speak for others, I've got a pretty shaky hand when shooting videos.  As a result, I tend to set up a tripod and leave the stability to good old Mother Earth.

But if you've got a shaky video, perhaps due to an over-abundance of caffeine, then this post may be for you.

FFMpeg provides a deshake video filter which may reduce the shakiness of a video.  According to the source code, it applies SAD (sum of absolute differences) block-matching motion compensation.  In other words, it compares sequential frames, attempts to detect the motion vector, and corrects for it.

This won't, by any means, eliminate the shakiness entirely (in most cases), but it certainly reduces it substantially, perhaps enough to remove it as a distraction.

Let's go through an example: starting with a static image, we'll generate a short video, add artificial shakiness, then run it through the deshake video filter.  Lastly, we'll lay the shaky and deshaken videos side by side for comparison.

Let's get started.

$ ffmpeg -loop 1 -i image.jpg -c:v libx264 -t 30 -pix_fmt yuv420p -vf scale=640:480 input.mp4

Apply some shakiness;

$ ffmpeg -i input.mp4 -filter:v "crop=in_w/2:in_h/2:(in_w-out_w)/2+((in_w-out_w)/2)*sin(0.015*random(1)):(in_h-out_h)/2+((in_h-out_h)/2)*sin(0.015*random(n))" shake.mp4

Pass the shaky video through the deshake video filter;
$ ffmpeg -i shake.mp4 -vf deshake deshake.mp4
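The filter also takes a handful of tuning options, e.g. widening the motion search range and changing how the exposed edges are filled (option names per FFMpeg's deshake documentation; the guard simply skips the encode when ffmpeg or the clip isn't present, and the output filename is arbitrary):

```shell
# rx/ry widen the motion search extent (default 16, max 64);
# edge=mirror fills exposed borders by mirroring rather than leaving them blank
opts="deshake=rx=32:ry=32:edge=mirror"

if command -v ffmpeg >/dev/null 2>&1 && [ -f shake.mp4 ]; then
  ffmpeg -i shake.mp4 -vf "$opts" deshake-tuned.mp4
fi
```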

Finally, lay these two side by side;
$ ffmpeg -i shake.mp4 -i deshake.mp4 -filter_complex '[0:v]pad=iw*2:ih[int];[int][1:v]overlay=W/2:0[vid]' -map [vid] -c:v libx264 -crf 23 -preset veryfast compare.mp4

Resulting in the following video;