Macworld on how the A16 of the iPhone 14 Pro is really an A15+

Jason Cross at Macworld has a really interesting article on the new A16 for the iPhone 14 Pro.

It seems it isn’t even really a 4-nanometer process, but an enhanced 5-nanometer process, just running faster with faster memory, and with none of what was expected for a true A16.

And if the true A16 is delayed, that might also explain why the M2 for the Mac Pro is delayed and why Apple might not hit the two-year mark for moving the Mac Pro to Apple Silicon.

ProVideoCoalition on How to Troubleshoot HDR iPhone Footage in Premiere, According to Adobe

Michelle DeLateur at ProVideo Coalition has an excellent article, “How to Troubleshoot HDR iPhone Footage in Premiere, According to Adobe.”

It is a problem with tone mapping, which is currently handled best by the Premiere Pro beta.

You won’t need this until you run into it and think, “Wait, it doesn’t look like this on my iPhone.” But the iPhone has an HDR screen, so…
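If you want a mental model of what tone mapping actually does here, this is a minimal sketch: a generic Reinhard-style operator in Python, my own illustration and not what Premiere or the beta actually does.

```python
# A toy sketch of what "tone mapping" means in this context: squeezing HDR
# luminance (values that can exceed 1.0) into the 0-1 SDR range.
# Generic extended-Reinhard operator -- NOT Adobe's or Apple's actual math.
import numpy as np

def reinhard_tonemap(hdr_linear, white_point=4.0):
    """Map linear-light HDR values into [0, 1].

    hdr_linear: float array where 1.0 is SDR reference white and
                highlights sit above 1.0.
    white_point: the HDR level that lands exactly at 1.0 in SDR;
                 anything brighter clips.
    """
    x = np.asarray(hdr_linear, dtype=np.float64)
    mapped = x * (1.0 + x / (white_point ** 2)) / (1.0 + x)
    return np.clip(mapped, 0.0, 1.0)

# Without a step like this, HDR highlights simply clip, which is why a clip
# can look fine on the iPhone's HDR screen but wrong on an SDR timeline.
```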

TechCrunch says Congress should secure the app store supply chain and I agree

Jorge Grueul has a great article at TechCrunch, “To protect consumers, Congress should secure the app store supply chain.”

I so agree. The bills currently proposed do nothing to protect consumers from malware; members of Congress have so little tech knowledge. If we get unfettered access to the iPhone, with apps installing off a link, every cell phone will be compromised in a single day. If they just get rid of the walled garden and allow easy, unfettered installs, every iPhone will be compromised so quickly it is a joke.

And of course app stores themselves need more regulation too: how could Apple have let through so many bogus apps?

We need security standards that apply to all app stores, including any new ones.

Nick Lear at ProVideoCoalition on Using the iPhone 13 Pro as your B Cam

PVC again for the win. Damn, this is a great site.

Nick Lear has an awesome article on using an iPhone 13 as a B Cam, the pitfalls that entails, and how he got it to work. It covers the things you don’t realize, like yes, you can and should shoot ProRes, but if you do, you can’t shoot LOG in Filmic Pro, and the possibility of using Cinematch to help with balancing.

Jan Kaiser on iPhone 13 Cinematic Mode

Now, I saw this Twitter thread thanks to Scott Simmons at ProVideoCoalition and this post.

And here is Jan Kaiser’s eskocz channel on YouTube.

Now, you need to look at the whole thread, not just the first post.

Wow, I am blown away that the depth map is only 320×180, and that it only works so close up.
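Just to put that in perspective, some quick arithmetic, assuming the 1920×1080 frame Cinematic mode records:

```python
# How coarse a 320x180 depth map is against a 1920x1080 Cinematic mode frame.
depth_w, depth_h = 320, 180
frame_w, frame_h = 1920, 1080
print(frame_w / depth_w, frame_h / depth_h)              # 6.0 6.0 -> one depth sample per 6x6 block of pixels
print(f"{depth_w * depth_h / (frame_w * frame_h):.1%}")  # 2.8% of the frame's pixel count
```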

I was also wondering why the Pro didn’t use LiDAR to enhance the depth map, but while LiDAR captures a lot more depth detail than Cinematic mode, it is also fluttery and noisy, so I can see it causing some serious issues in a depth map.

And the TrueDepth sensor on the front camera would be useless for a depth map here.

Between the 1080p30 limit and the low-resolution depth map, I think I am OK with skipping the iPhone 13 and seeing what the 14 will offer. The technology is impressive, but I want to see future generations.

Allan Tépper at ProVideoCoalition on how ProRes on the iPhone 13 turns out to be Variable Frame Rate

Allan Tépper at PVC has this article on tests by Carolina Bonnelly that show that the ProRes footage shot by the iPhone 13 is in fact variable frame rate, not constant frame rate.

This is so disappointing, as to really use this footage you will need to conform it to a constant frame rate. It is especially disappointing since the footage already takes up much more space, and now you need to re-compress it to get it to a constant frame rate, losing a generation in compression (I know ProRes handles that better than HEVC, but still).
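If you want to conform a clip yourself outside the NLE, here is a minimal sketch of one way to do it, wrapping ffmpeg in Python. It assumes ffmpeg is installed, and the 29.97 target, the ProRes HQ profile, and the file names are just my choices for illustration; your NLE’s own conform or optimize step does the same job.

```python
# Conform an iPhone VFR ProRes clip to constant frame rate via ffmpeg.
import subprocess

def conform_to_cfr(src, dst, fps="30000/1001"):
    """Re-encode a variable-frame-rate clip to constant-frame-rate ProRes.

    This is the generation loss mentioned above: the video is decoded and
    re-encoded (prores_ks, profile 3 = 422 HQ); the audio is passed through.
    """
    subprocess.run([
        "ffmpeg", "-i", src,
        "-vsync", "cfr",       # duplicate/drop frames to hit a fixed rate
        "-r", fps,             # target frame rate (29.97 here)
        "-c:v", "prores_ks",   # re-encode video as ProRes
        "-profile:v", "3",     # 3 = ProRes 422 HQ
        "-c:a", "copy",        # leave audio untouched
        dst,
    ], check=True)

conform_to_cfr("clip_vfr.mov", "clip_cfr.mov")
```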

Why is Cinematic Mode on the iPhone 13 and 13 Pro only 30 FPS and not 23.976?

So with the iPhone 13 and iPhone 13 Pro, Apple added a Cinematic mode to video that is basically Portrait mode for video: it does post-processed depth of field, using the different lenses to create a depth map. It is not going to be perfect, but it certainly will look more cinematic than the deep depth of field you normally get with an iPhone lens, and you can re-focus after the fact in Final Cut Pro, iMovie, and the Photos app. It even has HDR color depth, but then why is it only 30 frames per second?

I just don’t get what Apple was thinking here, especially since it should be easier to process 23.976 frames a second than 30. Films are shot at 24 fps (23.976 in video terms), and our eyes associate that amount of motion blur with looking cinematic, not 30 frames, which looks much more like TV, as everything is sharper.
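For reference, here is where those awkward-looking numbers come from; nothing iPhone-specific, just the NTSC-era 1000/1001 math:

```python
# The "odd" frame rates are just 24 and 30 divided by 1.001 (an NTSC legacy).
print(24000 / 1001)                               # 23.976... -> "23.976", the film-style rate
print(30000 / 1001)                               # 29.970... -> "29.97", the TV/broadcast rate
print(1001 / 24000 * 1000, 1001 / 30000 * 1000)   # ~41.7 ms vs ~33.4 ms per frame
```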

I think Apple will eventually add 23.976 to Cinematic mode, but likely not until iOS 16 or the next iPhone with a better Cinematic mode.

Honestly, I have an iPhone 11 Pro, and Cinematic mode could possibly tip me into getting a new iPhone, but it is the 30 fps thing that really screws it up for me.

Now, I do have a weird relationship with 23.976 vs. 29.97: when I am cutting commercials I prefer 29.97, because motion graphics look less stuttery at 29.97. But we are talking “CINEMATIC” here, and for “CINEMATIC” that means 23.976!

I think Apple made a misstep here, and I hope they will fix it sooner rather than later.

Of course, I also think they should have figured out a way to use the LiDAR on the Pro models, which is used for low-light photos, to make a better depth map. I know it would split Cinematic mode between the 12 and 13 Pro models and the rest of the lineup, but a better depth map would be welcome.

Adobe updated Adobe Fresco to version 3.0, the October 2021 release, at Adobe Max

Adobe has updated its awesome drawing app for tablets, Adobe Fresco, to its October 2021 release, version 3.0.

Its new features are:

•Apply Motion (iPad and Windows), which I am interested in checking out

•Perspective Grids (iPad and Windows), to help do proper perspective

•Send to Illustrator on iPad

•Vector jitter brushes, to give a more organic feel to vector brushes

Awesome. I need to spend more time in Fresco, as it really is such an impressive and powerful painting program, and I am certainly interested in seeing what the motion features do and whether they will be useful for any video projects in the future.

iPhone 13 Cinematic mode and 13 Pro ProRes coming soon

Apple’s new iPhone 13 and iPhone 13 Pro both feature a new Cinematic mode, which is basically the iPhone’s Portrait mode for video. The video also gets a depth map, and with AI it can track faces and even switch focus between faces, or do a manual rack focus. It is not true depth of field, but fake depth like Portrait mode, and it only works in 1080p30. According to the video, you should also be able to focus after the fact, which would be incredible!

This will be a very impressive feature and will enhance video, though it won’t be perfect, just like Portrait mode, which enhances but is not like a real lens with real depth of field. Hopefully someone will make a plug-in for NLEs that allows focus after the fact, much like the awesome Focos app. And let’s hope it can be integrated into Filmic Pro as well, as I find it unlikely to record at non-variable speeds.
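To make “focus after the fact” concrete, here is a toy sketch of what a depth-map refocus amounts to: blur each pixel in proportion to how far its depth is from the chosen focus plane. This is my own illustration in Python with numpy/scipy, not Apple’s or Focos’s actual pipeline, and the function and parameter names are made up.

```python
# Toy depth-map "refocus in post" -- a crude synthetic bokeh, not real optics.
import numpy as np
from scipy.ndimage import gaussian_filter

def refocus(rgb, depth, focus_depth, max_sigma=8.0, bands=8):
    """Blur each pixel in proportion to |depth - focus_depth|.

    rgb:   HxWx3 float image, depth: HxW float map in [0, 1].
    Pixels near focus_depth stay sharp; far ones get a wider Gaussian.
    Done cheaply by pre-blurring the frame at a few sigma levels and
    picking, per pixel, the level closest to its desired blur.
    """
    sigma_map = max_sigma * np.abs(depth - focus_depth)

    # Pre-blur at evenly spaced sigma levels (level 0 = unblurred original).
    sigmas = np.linspace(0.0, max_sigma, bands)
    stack = [rgb] + [gaussian_filter(rgb, sigma=(s, s, 0)) for s in sigmas[1:]]

    # Choose the nearest pre-blurred level for each pixel.
    idx = np.clip(np.rint(sigma_map / max_sigma * (bands - 1)), 0, bands - 1).astype(int)
    out = np.empty_like(rgb)
    for i, blurred in enumerate(stack):
        mask = idx == i
        out[mask] = blurred[mask]
    return out

# Example: fake a small frame and a left-to-right depth ramp, then pull focus.
rgb = np.random.rand(90, 160, 3)
depth = np.tile(np.linspace(0.0, 1.0, 160), (90, 1))
near = refocus(rgb, depth, focus_depth=0.2)
far = refocus(rgb, depth, focus_depth=0.9)
# Calling refocus() again with a new focus_depth is all "rack focus in post"
# amounts to here -- which is why a stored depth map makes it possible.
```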

And the iPhone 13 Pro, with its extra GPU core and at least 256 GB of storage, will eventually be getting ProRes recording. Now, I am sure it won’t be ProRes RAW or even ProRes 422 HQ, but instead LT or Proxy, though even that will beat H.264 or H.265. Of course, ProRes is going to wreak havoc on iCloud photo storage and really show how slow Lightning cables are.
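Some back-of-the-envelope math on why, using Apple’s approximate published 1080p ~30 fps ProRes target bitrates; the Lightning throughput figure is my own assumption, since Lightning is USB 2.0-class and real-world numbers vary.

```python
# Rough ProRes storage and Lightning-offload math at 1080p ~30 fps.
# Bitrates are Apple's approximate published targets for these profiles.
BITRATES_MBPS = {"ProRes Proxy": 45, "ProRes 422 LT": 102, "ProRes 422 HQ": 220}
LIGHTNING_EFFECTIVE_MBPS = 320  # assumed real-world throughput, not a spec

for name, mbps in BITRATES_MBPS.items():
    gb_per_min = mbps * 60 / 8 / 1000                     # megabits -> gigabytes per minute
    offload_s = mbps * 60 / LIGHTNING_EFFECTIVE_MBPS      # seconds per recorded minute
    print(f"{name}: ~{gb_per_min:.2f} GB per minute, ~{offload_s:.0f} s to offload each minute")

# Ignoring everything else on the phone, 256 GB holds roughly this much 422 HQ:
print(f"~{256 / (220 * 60 / 8 / 1000):.0f} minutes")
```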

The Pro can also capture in Dolby Vision HDR at up to 4K 60 fps. I think this means that Cinematic mode doesn’t include Dolby Vision HDR, but then again, what computer monitor can play that back for adjusting color?

I also wonder if the LiDAR scanner in the Pro will enhance Cinematic mode. I guess they would have said so if it did, though you would think its depth map would be better; maybe it is only for Night mode.