In this post I want to explore converting a set of images into a video or gif using FFmpeg.
For this, I have a set of ~40 iPhone photos that I want to string together into an animated, stop-motion-style video or gif. Here's a sample of the images I'll be trying this with:
Convert HEIC images to JPG/PNG
Before we get started: because I'm using images taken on a newer iPhone, these files initially come in the HEIC format instead of a more familiar format like PNG or JPG, so I want to convert them first. If you have ImageMagick installed, this can actually be done quite easily using its mogrify tool. For example:
mogrify -format jpg *.HEIC
This command will create a new JPG file for every HEIC file in the directory we run it in. Note that this will not delete the original HEIC files from the directory.
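As a rough sketch, a full convert-and-clean-up pass might look like this (the rm step is my own addition, and should only run after you've eyeballed the JPG output):

```shell
# Convert every HEIC in the current directory to JPG, then optionally
# remove the originals. Assumes ImageMagick's mogrify is installed.
mogrify -format jpg *.HEIC
ls *.jpg     # sanity-check that the JPGs were actually created
rm *.HEIC    # optional cleanup, once you're happy with the JPGs
```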
Using multiple input files with FFmpeg
So now, to recap from the previous post, the most basic structure of an FFmpeg command is this:
ffmpeg -i input_file output_file
In this example, input_file is the file we are starting with, and output_file is the new file we want to create.
The first issue we are going to run into here is that I have multiple image files that I want to use for the input, so we need to figure out how to handle that. After a little bit of research, it appears that you can glob together files by doing something like this for the input:
-pattern_type glob -i '*.png'
So basically, I'm going to sequentially name all the images in the folder (e.g. img-01.png, img-02.png, etc.), and we can use this pattern to glob them all together for our input.
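For what it's worth, that sequential renaming can itself be scripted. Here's a sketch (the img-%02d naming pattern and the scratch-directory setup are my own, purely illustrative, and it assumes fewer than 100 images):

```shell
# Sketch: give every .png in a directory a zero-padded sequential name
# (img-01.png, img-02.png, ...) so that a shell glob lists them in order.
# Demonstrated on a scratch directory with empty stand-in files.
dir=$(mktemp -d)
cd "$dir"
touch photo_c.png photo_a.png photo_b.png  # stand-ins for the real photos
i=1
for f in *.png; do  # the glob expands once, in sorted order
  mv -- "$f" "$(printf 'img-%02d.png' "$i")"
  i=$((i + 1))
done
ls  # the directory now contains img-01.png through img-03.png
```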
Converting images to video with FFmpeg
Now that we have the input for our command, we can see what happens if we simply try to create an output file with an .mp4 (video) extension.
I also want to set a specific frame rate for the output file we will be creating. Looking at FFmpeg's documentation, there are a few different flags related to this, for example -r. Here's what the FFmpeg docs say:
-r[:stream_specifier] fps (input/output,per-stream) Set frame rate (Hz value, fraction or abbreviation). As an input option, ignore any timestamps stored in the file and instead generate timestamps assuming constant frame rate fps. This is not the same as the -framerate option used for some input formats like image2 or v4l2 (it used to be the same in older versions of FFmpeg). If in doubt use -framerate instead of the input option -r. As an output option, duplicate or drop input frames to achieve constant output frame rate fps.
I don't know exactly what all the implications of this are, but I've seen examples on Stack Overflow where people set the -framerate flag before the input and the -r flag before the output, so let's try that out. Here's my initial command to create a video from my images:
ffmpeg -framerate 15 -pattern_type glob -i '*.png' -r 15 output.mp4
This seemed to work okay initially. However, after sending the result to my iPhone and finding it wouldn't play, it seems the mp4 file this command creates is somehow corrupted.
After digging through some Stack Overflow threads, I found some additional options we can add to try to fix this problem. Here's what my new command looks like with these added:
ffmpeg -framerate 15 -pattern_type glob -i '*.png' -c:v libx264 -vf fps=15 -pix_fmt yuv420p output-video.mp4
Let's dissect this a little:
-c:v libx264 — use the libx264 codec for encoding/compressing the video stream we want to create.
-vf fps=15 — apply an fps filter set to the specific frames per second that I want.
-pix_fmt yuv420p — set the pixel format to yuv420p, a chroma-subsampled color format that most video players (including the iPhone's) expect for H.264 video.
After putting all of that together and re-running my command, the resulting video now works and can be shared to my phone without issue! Here is the result:
Because the original images coming from the iPhone were rather large, we can reduce the final output file size by adding a scale=width:height filter to our -vf filter chain. So here's what my final command ended up being:
ffmpeg -framerate 15 -pattern_type glob -i '*.png' -c:v libx264 -vf "fps=15,scale=1158:1544" -pix_fmt yuv420p out.mp4
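As an aside, if you'd rather not hard-code both dimensions, FFmpeg's scale filter can derive one from the other: passing -2 for the width keeps the source aspect ratio while rounding to an even value, which the yuv420p pixel format requires. A variant of the command (the target height and output name here are just my own examples) might look like:

```shell
# Scale to 720 px tall; -2 tells FFmpeg to pick an even width that
# preserves the source aspect ratio (yuv420p needs even dimensions).
ffmpeg -framerate 15 -pattern_type glob -i '*.png' -c:v libx264 \
  -vf "fps=15,scale=-2:720" -pix_fmt yuv420p out-720.mp4
```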
If you want to take this approach and create a gif as your output file instead of a video, just change the output file's extension in the command from .mp4 to .gif. Because you are no longer encoding H.264 video, you can also remove -c:v libx264 from your command (and you may want to drop -pix_fmt yuv420p too, since GIF uses a palette-based format and FFmpeg will pick a suitable one for you).
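Putting those changes together, the GIF version of the final command would look something like this (a sketch reusing the same glob and filters as before; the output name is my own):

```shell
# Same input and filters as the video version, but with a .gif output
# and none of the H.264-specific flags.
ffmpeg -framerate 15 -pattern_type glob -i '*.png' \
  -vf "fps=15,scale=1158:1544" out.gif
```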