I haven't gotten to play with Shortcuts yet, as I haven't moved to Monterey: the company I am working for still runs Premiere Pro 2019, and it is doubtful that it runs on Monterey, and even if it does, it will likely not be that stable. Not that I want to be running Premiere Pro 2019 anyway; I basically don't want to live without the new caption feature when running Premiere Pro.
Very impressive results, especially since I have an iMac Pro, though I did get the Radeon Pro Vega 64X 16GB, which might get a tiny bit more performance.
My hope for the Apple Silicon Mac Pro keeps growing, though it will certainly be expensive, especially if it is basically two M1 Max chips tied together (I know it is more complicated than that).
Another great video from Chadwick Shoults, this one showing just how fast the new M1 Max MacBook Pro is in DaVinci Resolve, which says a lot not only about what Apple did, but also about what Blackmagic Design has done to make DaVinci Resolve really shine on Apple Silicon.
Man, I can't wait for an Apple Silicon iMac Pro and Mac Pro to really see what these chips can do. If the rumors are right, they will likely use M2 Max chips, as the MacBook Air next year is likely to be the first M2 machine.
The chips here aren't only able to outclass any competitor laptop design; they also compete against the best desktop systems out there. You'd have to bring out server-class hardware to get ahead of the M1 Max; it's just generally absurd.
Wow, this sounds amazing. As I have said, I can't wait for the Apple Silicon iMac Pro and Mac Pro to see what they can do!
I don’t have one of the Intel machines from back in 2016, but here is a result from one of the most powerful Intel integrated GPUs Apple ever shipped – an Iris Pro 655 from a 13” MBP: pic.twitter.com/w2V7dPP4P3
This is pretty amazing, beating a $6,000 GPU from the 2019 Mac Pro! Wow. The Apple Silicon iMac Pro and Mac Pro will certainly be impressive machines.
Things sound great so far, and it really gives me hope for the Apple Silicon iMac Pro and Mac Pro.
I still hope that plug-in makers start speeding up the process of rewriting their software for M1. It bothers me that even companies like Maxon with Red Giant haven't upgraded everything to M1 yet, even though it is a subscription, which means they really should be updating their applications quickly, because I am paying for them constantly. At least Adobe has the beta of After Effects working on M1, but it is going to be limited on plug-ins for sure.
So with the iPhone 13 and iPhone 13 Pro, Apple added a Cinematic Mode to video that is basically Portrait Mode for video: it applies a post-processed depth of field, using the different lenses to create a depth map. It is not going to be perfect, but it will certainly look more cinematic than the deep depth of field you normally get with an iPhone lens, and you can re-focus after the fact in Final Cut Pro, iMovie, and the Photos app. It even has HDR color depth, but then why is it only at 30 frames per second?
I just don't get what Apple was thinking here, especially since it should be easier to process 23.976 frames per second than 30. Films in movie theaters run at 24 frames per second (23.976 in video workflows), and our eyes are used to that amount of motion blur reading as cinematic, not 30 frames, which looks much more like TV, as everything is sharper.
I think Apple will eventually add 23.976 to Cinematic Mode, but likely not until iOS 16 or the next iPhone with a better Cinematic Mode.
Honestly, I have an iPhone 11 Pro, and Cinematic Mode could possibly tip me into getting a new iPhone, but it is the 30 fps thing that really screws it up for me.
Now, I do have a weird relationship with 23.976 vs. 29.97: when I am cutting commercials I prefer 29.97, because motion graphics look less stuttery at 29.97. But we are talking "CINEMATIC" here, and for "CINEMATIC", that is 23.976!
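As an aside, both of those odd-looking rates come from the same NTSC-era trick of slowing a whole-number frame rate by a factor of 1000/1001. A quick bit of illustrative arithmetic (nothing app-specific, just the math) shows the exact values:

```python
from fractions import Fraction

# NTSC-family frame rates are integer rates slowed by a factor of 1000/1001.
film = Fraction(24000, 1001)   # the "23.976" film-derived rate
video = Fraction(30000, 1001)  # the "29.97" broadcast rate

print(round(float(film), 3))   # -> 23.976
print(round(float(video), 2))  # -> 29.97
```

So "23.976" and "29.97" are really 24 and 30 pulled down by the same tiny amount, which is part of why it is strange that Apple shipped only the 30-family rate.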
I think Apple made a misstep here, and I hope they will fix it sooner rather than later.
Of course, I also think they should have figured out a way to use the LiDAR on the Pro models, which is already used for low-light photos, to make a better depth map. I know it would split Cinematic Mode between the 12 and the 13 Pro models, but a better depth map would be welcome.
And this means it is time for the awesome deep-dive review by Andrew Cunningham at Ars Technica. This is always the way to learn the ins and outs of the new system; I always look forward to the article, and have been reading them for every OS release for many years.
I would love to upgrade, but I always have to wait. Before you upgrade, you should always check which apps might have issues, and to find out I always check the MacRumors forum, which has a sticky post at the top listing apps that are and aren't working. It doesn't track version numbers, though, so it will almost never answer my most pressing question, which is what versions of Premiere Pro will work on the new OS (yes, the new version works). This does seem better than most OS upgrades, but some apps are still having issues for sure.
I have companies that work completely in 2019 (which you can't even install anymore and should not be using), and I don't know how you run a newish Mac Pro on software that old, but I digress.
• Apply Motion (iPad and Windows), which I am interested in checking out
• Perspective Grids (iPad and Windows), to help draw in proper perspective
• Send to Illustrator on iPad
• Vector jitter brushes, to give a more organic feel to vector brushes
Awesome. I need to spend more time in Fresco, as it really is such an impressive and powerful painting program, and I am certainly interested in seeing what the motion feature does, and whether it will be useful for any video projects in the future.
So yesterday I had a post about how Adobe's new color correction feature in Premiere Pro 2022 does nothing to fix the gamma shift issue on Mac exports, and I posted a link to it in a Facebook group for professional Premiere Pro users.
Responses included finishing everything in DaVinci, which does nothing about Premiere's handling of the issue, and people saying to just work on a PC, which won't fix the issue either if you have people viewing on a Mac, because the gamma shift will still happen there.
And then there were the responses about setting the Gamma 2.4 tag in DaVinci, which in effect tags the clip "wrong" so that it displays correctly on Mac. So I decided to do a little test with Parallels to see how the clips show up in Windows. And I know that the Gamma 2.4 tag is ignored by YouTube, which throws out the 121 tag on footage and in fact forces 111, which will then have the gamma shift.
So to start, this is a short film I am working on, and the first part covers the clips exported with the DaVinci Gamma tag set to Gamma 2.4.
This is the Gamma Tag I am talking about.
So I exported the show with this tag.
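If you want to verify which tag an export actually carries without trusting a particular player, you can read the QuickTime "colr" atom yourself; the three numbers inside it are exactly the 111 or 121 everyone refers to. Here is a minimal Python sketch (the function and the brute-force byte scan are my own illustration, not anything from Resolve or Premiere; a real parser would walk the atom tree properly):

```python
import struct

def read_nclc_tag(data: bytes):
    """Scan QuickTime/MP4 bytes for a 'colr' atom carrying an 'nclc'
    (or MP4-style 'nclx') block and return its three color indices:
    (primaries, transfer, matrix) -- e.g. (1, 1, 1) or (1, 2, 1)."""
    i = data.find(b"colr")
    while i != -1:
        # The four bytes after the atom type name the parameter format.
        if data[i + 4 : i + 8] in (b"nclc", b"nclx"):
            return struct.unpack(">HHH", data[i + 8 : i + 14])
        i = data.find(b"colr", i + 4)
    return None

# Synthetic colr atom shaped like a "121" export: size + 'colr' + 'nclc'
# + primaries 1, transfer 2, matrix 1 (big-endian 16-bit values each).
atom = struct.pack(">I", 18) + b"colr" + b"nclc" + struct.pack(">HHH", 1, 2, 1)
print(read_nclc_tag(atom))  # -> (1, 2, 1)
```

On a real export, `read_nclc_tag(open("export.mov", "rb").read())` should give the same answer; ffprobe can also report these fields (`color_primaries`, `color_transfer`, `color_space`) if you have FFmpeg installed, which is an easier check.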
So the left is the 111 tag and the right side is the 121 tag on Mac in QuickTime. The 121 is much closer to what I am seeing on my external monitor.
So this is the 111 tag on the left and the 121 tag on the right; these look the same, and actually look better than the 121 tag in QuickTime.
So the left is the clip with the 121 tag in QuickTime, and the right is the same clip in VLC; VLC looks correct, so VLC is still the best solution.
Now I wanted to see how the tags would look in Windows, so I have Windows 11 installed in Parallels, and I used both the stock Windows viewer and VLC in Windows. I just used whatever size the clips opened at, so I will scale down the other images to match framing, and the frame might be slightly different, as I couldn't figure out how to step frame by frame in the Windows player. This is all on my iMac Pro, with screenshots taken from the Mac side. I find it interesting that the 121 Gamma 2.4 tag matches between Windows and Mac, but VLC on Mac and Windows doesn't match, and VLC on Mac looks closest to what is on my external monitor.
So the 111 tag on the left and the 121 tag on the right look the same in Windows.
So this is the 111 tag on the left and the 121 tag on the right in VLC in Windows.
And this is the 111 tag on the left and the 121 tag on the right, and to me the color looks the same.
On the left is the 111-tagged clip in QuickTime on Mac, and on the right the 121-tagged clip in VLC in Windows, to show the gamma shift.
And this one I don't get at all, but here it is: VLC on Mac vs. VLC on Windows. The Mac version seems correct, and closest to what I am getting on my external monitor. If anyone can explain what is going on here, please contact me. Windows 11 is on the same display as the Mac, though I am taking the screenshots from the Mac of the Parallels window.
On the left we have the 111 clip on Mac in VLC, and on the right the same clip in VLC on Windows.
I am very confused about this last one. I think VLC on the Mac looks most like what the image looks like on my external monitor, but on the Mac, at least, the clips exported from DaVinci with the 121 Gamma 2.4 tag look closer to that image than the ones exported with the 111 tag, which fully show the weird gamma shift.
So for exports from Premiere Pro, I guess getting people to use VLC is the best option, but if you can't, I would love it if Premiere were able to add the 121 Gamma 2.4 tag to exported movies, because they would look better on client machines.
I have also read that YouTube ignores the 121 tag and plays video as 111, so these clips will look blown out on YouTube. Does Vimeo do the same? I might have to do some tests and see what the results are when I have a chance.
If anyone has any thoughts please let me know.
EDIT:
So since there is such a difference in VLC, I decided to try another app that doesn't do ColorSync on the Mac, and that is Firefox, on both Windows 11 and the Mac.
The left is 111 in Firefox on Mac and the right is 111 in Firefox on Windows. The Windows one looks better to me, but the Mac one is better than QuickTime.
So strangely, Firefox looks different on Mac and Windows: better on Windows, though the Mac version is not as bad as QuickTime for sure.
OK, so I have posted about this on Adobe UserVoice to ask Adobe to add the ability to change the gamma tag of a clip on export. I know you don't want to do this for a final export, but for client viewing copies it would be nice to guarantee that those on Mac and those on Windows see something approximating what I see on my external monitor.