So with the iPhone 13 and iPhone 13 Pro, Apple added a Cinematic mode for video that is basically Portrait mode for video: it uses the different lenses to build a depth map and applies a post-processed shallow depth of field. It is not going to be perfect, but it will certainly look more cinematic than the deep depth of field you normally get from an iPhone lens, and you can re-focus after the fact in Final Cut Pro, iMovie, and the Photos app. It even records in HDR, but then why is it limited to 30 frames per second?
I just don’t get what Apple was thinking here, especially since 23.976 should be easier to process than 30 frames a second (there are simply fewer frames to compute). Films are shot at 24 frames per second (23.976 in NTSC-derived video workflows), and our eyes associate that frame rate’s motion blur with a cinematic look, not 30 frames, which reads as much more TV-like because everything is sharper.
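For what it’s worth, 23.976 and 29.97 aren’t arbitrary decimals: they are the NTSC-compatible rationals 24000/1001 and 30000/1001. Here’s a minimal Swift sketch, just to illustrate the arithmetic, of how a 23.976 capture rate is expressed as an exact rational frame duration in AVFoundation’s standard camera API; it assumes a device format whose supported frame-rate ranges include that rate, and of course Cinematic mode itself doesn’t expose this control:

```swift
import AVFoundation

// 23.976 fps is exactly 24000/1001 frames per second (29.97 is 30000/1001).
// AVFoundation works in frame *durations* (the inverse of the rate), so
// 23.976 fps is a CMTime of 1001/24000 seconds per frame.
let filmFrameDuration = CMTime(value: 1001, timescale: 24000)

func setFilmRate(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    // Pin min and max duration to the same value to lock the frame rate.
    device.activeVideoMinFrameDuration = filmFrameDuration
    device.activeVideoMaxFrameDuration = filmFrameDuration
    device.unlockForConfiguration()
}
```

Using the exact rational rather than a rounded 23.98 matters in editorial too: rounding drifts audio sync by about one frame every eleven minutes.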
I think Apple will eventually add 23.976 to Cinematic mode, but likely not until iOS 16 or the next iPhone with an improved Cinematic mode.
Honestly, I have an iPhone 11 Pro, and Cinematic mode could possibly tip me into getting a new iPhone, but it is the 30 fps limitation that really screws it up for me.
Now, I do have a weird relationship with 23.976 vs. 29.97: when I am cutting commercials I prefer 29.97, because motion graphics look less stuttery at 29.97. But we are talking “CINEMATIC” here, and “CINEMATIC” means 23.976!
I think Apple made a misstep here, and I hope they fix it sooner rather than later.
Of course, I also think they should have figured out a way to use the LiDAR scanner on the Pro models, which is already used for low-light photography, to build a better depth map. I know it would split Cinematic mode quality between the 12 Pro/13 Pro models and the rest of the lineup, but a better depth map would be welcome.
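The per-frame LiDAR depth map is already exposed to developers, so the raw data is there. Here’s a minimal sketch of reading it through ARKit’s public scene-depth API on a LiDAR-equipped device; this is the generic developer API, not whatever Apple uses internally for Cinematic mode:

```swift
import ARKit

// Minimal sketch: reading the LiDAR depth map via ARKit scene depth.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // sceneDepth is only available on devices with the LiDAR scanner.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // A per-pixel depth map (in meters) as a CVPixelBuffer, updated each frame.
        guard let depth = frame.sceneDepth?.depthMap else { return }
        let width = CVPixelBufferGetWidth(depth)
        let height = CVPixelBufferGetHeight(depth)
        print("LiDAR depth map: \(width)x\(height)")
    }
}
```

Fusing that with the stereo disparity from the lenses is presumably the hard part, but it is exactly the kind of thing Apple’s Neural Engine exists for.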