Convert between framerates (24 fps, 25 fps, 30 fps, …)

The 5D originally only shot in 30 fps (NTSC-inspired, undoubtedly, since it was meant for US news agencies anyway). So when the footage was to be used for TV (PAL in Europe = 25 fps) or for cinema (24 fps), it had to be converted. The same applies when you shoot 720p60 (at 60 fps) for a nice slow-motion effect: you have to ‘convert’ your material. We already referred to an article about how to ‘conform with Cinema Tools/Final Cut Pro’. But let’s talk some more about this conversion and how to do it with a free open-source tool like ffmpeg.

1. Conversion with fixed # of frames.

This is what is referred to as ‘conforming’. You keep the same number of frames, so a 20-second 30 fps clip (20 × 30 = 600 frames) becomes 25 seconds at 24 fps (600 / 24 = 25). The same frames are played at a slower speed. For the video, the only thing you have to change is the clip’s metadata. The audio will be played slower too, so it will sound lower in pitch. If you want to fix this, you have to pitch it up (with Audacity or SoX).
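The arithmetic behind conforming can be sketched in a few lines (the numbers are just the 20-second example above, nothing clip-specific):

```python
# Conforming: the frame count stays fixed, only the playback speed
# (and hence the running length) changes.
src_fps = 30
dst_fps = 24
duration_s = 20

frames = duration_s * src_fps          # 600 frames, unchanged by conforming
new_duration_s = frames / dst_fps      # 600 / 24 = 25.0 seconds

# The audio is slowed by the same factor, so its pitch drops by
# dst_fps / src_fps; to restore it you pitch up by the inverse ratio.
pitch_correction = src_fps / dst_fps   # 1.25, i.e. pitch up by 25%

print(frames, new_duration_s, pitch_correction)  # → 600 25.0 1.25
```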

How to do this with ffmpeg? Well, in two steps:

  • extract the frames as rawvideo
    ffmpeg -i input.mov -f rawvideo -b 50000000 -pix_fmt yuv420p -vcodec rawvideo -s 1920x1080 -y temp.raw
  • recreate the video with new framerate
    ffmpeg -f rawvideo -b 50000000 -pix_fmt yuv420p -r 24 -s 1920x1080 -i temp.raw -y output.mov
  • To save on intermediate disk space, you could send the output of the first command to stdout and pipe it into the input of the second
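To see why that intermediate file is worth avoiding: a yuv420p frame stores 1.5 bytes per pixel (a full-resolution luma plane plus two chroma planes subsampled 2×2), so a quick back-of-the-envelope check for the 600-frame example clip gives:

```python
# Size of one uncompressed 1920x1080 yuv420p frame: 1.5 bytes/pixel.
width, height = 1920, 1080
frame_bytes = width * height * 3 // 2      # 3,110,400 bytes per frame

# The 20-second / 600-frame example clip from above:
clip_bytes = 600 * frame_bytes             # roughly 1.87 GB of temp.raw
print(frame_bytes, clip_bytes)
```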

2. Conversion with fixed running length.

This method keeps the total running length, but in order to do so it has to interpolate (estimate frames that lie between two original frames). In other words, for each second you get 30 input frames and need to create e.g. 24 output frames. Output frame 3 of that second will be created out of input frames 3 and 4, combined in some way. If this interpolation is done with a fast, low-quality algorithm, any steady, smooth movement in the input might become jerky and unnatural. This procedure is therefore slow (possibly slower than real time: you may need more than one hour of conversion time for each hour of footage). The audio remains untouched, since the total length does not change.
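The frame mapping described above can be sketched as timing math: each output frame at 24 fps has a timestamp that usually falls between two input frames at 30 fps, and a fractional weight says how to combine them. This is only the timing side; real interpolators (like MVTools) add motion estimation on top of it:

```python
def blend_sources(k, src_fps=30, dst_fps=24):
    """For output frame k, return the two input frame indices it falls
    between and the weight of the later one (0.0 = exactly on the first)."""
    t = k / dst_fps               # timestamp of output frame k in seconds
    pos = t * src_fps             # the same instant measured in input frames
    first = int(pos)
    weight = pos - first          # fractional part -> blend factor
    return first, first + 1, weight

# Output frame 3, the example in the text: it sits between input
# frames 3 and 4, 75% of the way towards frame 4.
print(blend_sources(3))   # → (3, 4, 0.75)
```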

How to do this with ffmpeg? Easy:

  • convert framerate in one step:
    ffmpeg -i input.mov -sameq -r 24 -y output.mov

Unfortunately, this is quite slow AND low quality. So you might want to look at tools like Final Cut Pro for this kind of conversion. Also, MVTools (for AviSynth) seems to be able to do this better; I still have to check that out.

5 Responses to “Convert between framerates (24 fps, 25 fps, 30 fps, …)”


  • Unfortunately the way ffmpeg (also the ffmpeg broadcast version http://code.google.com/p/ffmbc/) reads the H.264 MOV file is not very good: the highlights come out very compressed and you lose information if you want to do good grading!!!
    Personally, after long research, I use another way:
    Use AviSynth with the QTInput plugin (to open the MOV file) and open the .avs file with VirtualDub or ffmpeg to export to your favorite format and frame rate.

    If you open the .avs file with ffmpeg you have to add a FlipVertical() command in the .avs file; that is not needed if you use VirtualDub (I don’t know why).
    I will soon post a small tutorial somewhere on how to manage all this and also export to a lossless free codec (HuffYUV or Lagarith).
    Sorry for my poor English;
    I hope this is useful.

  • @TheToad, I find QTInput scales the luma, even with the latest QT Alternative installed. It appears to scale the luma twice though, so levels look correct until you check them on a waveform or histogram and see combing and data loss.

    So I use Haali Media Splitter to remux into a Matroska container, then use AviSynth and Haali’s bundled .dlls to read the MKVs and decode the H.264.

    Also, for anyone reading: never use FFmpeg to convert to RGB, always use AviSynth. FFmpeg’s swscale expands the luma, crushing blacks and clipping whites, and on default interpolation settings loses about 50% of the colour data compared to AviSynth’s conversion.

    This all works on Linux too with Wine.
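The luma expansion this commenter describes is, as far as I can tell, the standard limited-range to full-range mapping (broadcast “studio swing” puts legal luma in 16–235); expanding that straight to 0–255 destroys anything outside the legal range. A minimal sketch of that mapping, assuming simple rounding and clipping:

```python
def expand_luma(y):
    """Map limited-range luma (16-235) to full range (0-255), clipping
    values outside the legal range -- the crushing/clipping effect."""
    out = round((y - 16) * 255 / 219)
    return max(0, min(255, out))

# Legal black and white map cleanly...
print(expand_luma(16), expand_luma(235))   # → 0 255
# ...but super-blacks and super-whites are clipped away:
print(expand_luma(8), expand_luma(240))    # → 0 255
```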

  • I tried to “upframe” – convert an MP4 from 23.976 to a 30 fps frame rate. It does not work; I mean, the output file still has a 23.976 frame rate.

  • Found this post and it was very helpful start. I wanted to give back some additional info I discover for anyone who comes here via google.

    1. This is how you pipe the output of step 1 into step 2 so you don’t have to create (and then delete) the temp.raw file:

    ffmpeg -i input.mov -f rawvideo -b 50000000 -pix_fmt yuv420p -vcodec rawvideo -s 1920x1080 -y pipe:1 | ffmpeg -f rawvideo -b 50000000 -pix_fmt yuv420p -r 24 -s 1920x1080 -i - output.mov

    2. By default this will create an H.264 file; if you want to use a different codec, such as Apple ProRes (only in the newer builds of FFmpeg):

    ffmpeg -i input.mov -f rawvideo -b 50000000 -pix_fmt yuv420p -vcodec rawvideo -s 1920x1080 -y pipe:1 | ffmpeg -f rawvideo -b 50000000 -pix_fmt yuv420p -r 24 -s 1920x1080 -i - -vcodec prores -profile 2 output.mov

  • I think I found a more elegant way to do that, using the “-vsync passthrough” option. It conforms my 59.97 fps footage to slow-motion 23.98 without frame skipping:

    for %%a in ("*.mov") do ffmpeg -i "%%a" -s 1280x720 -sws_flags lanczos -r 23.98 -vsync passthrough -vcodec dnxhd -b 110M -mbd rd -acodec copy "%%~na.DNxHD.mov"
