Blackmagic RAW Updated to 2.1 with Improved Premiere Pro Performance and M1 Support

Blackmagic Design has updated its Blackmagic RAW plugin to version 2.1, adding M1 support and improved Premiere Pro performance.

  • Added native support for Apple Silicon on Mac.
  • Added optimised CPU decoding for clips captured by Blackmagic URSA Mini Pro 12K.
  • Added Blackmagic Generation 5 Color Science Technical Reference document.
  • Added support for Panasonic Lumix S1H, S1 and S5 Blackmagic RAW clips captured by Blackmagic Video Assist.
  • Added support for Nikon Z 6II and Z 7II Blackmagic RAW clips captured by Blackmagic Video Assist.
  • Blackmagic RAW Adobe Premiere Pro plugin performance and stability improvements.
  • General performance and stability improvements.

Performance improvements are always welcome, and so is M1 support.

OWC on Which Used Mac Is Best for Video Users

Brian Levin at Other World Computing's Rocket Yard blog has a great article on the pros and cons of various used Macs for video editors.

It's a great article and something to consider, given the lack of pro M1 computers with more than 16 GB of combined RAM, and the possibility that some plug-ins or software might not yet work on an M1.

Video Editor Chris Salters with 2 Levels of Remote Collaboration Workflows for Any Budget

Trailer editor Chris Salters has posted two remote collaboration workflows, one free and one that requires a second computer. It's great to see an almost-free solution alongside a much more robust one that beats Zoom in quality, though at a much higher price (starting at $399 a month).

I like that he sets up a solution that will work with other programs as well.

The budget solution creates a new virtual video monitor, which means eating into your precious video memory. That worries me a little, but for screenings it should be fine.
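To put that worry in rough perspective, here is a back-of-envelope sketch of the raw framebuffer cost of one extra monitor. It assumes 8-bit RGBA at 4 bytes per pixel; keep in mind drivers hold multiple buffers and playback engines allocate far more on top of this, so treat it as a lower bound:

```python
# Rough framebuffer cost of one extra virtual monitor, per buffer.
# Assumes 4 bytes/pixel (8-bit RGBA); drivers typically keep 2-3 buffers.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1_000_000

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h):.1f} MB per buffer")
```

So even a UHD virtual monitor is tens of megabytes per buffer, which is small next to a modern GPU's memory; the bigger squeeze comes from the playback engine's own frame caches.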

The other solution requires a second computer and some hardware, so it may be better suited to situations where a company can provide the gear.

Adobe Desperately Needs to Enable Reading of iXML in Audio Tracks in Premiere Pro

Adobe Premiere Pro desperately needs to add support for iXML audio data for secondary audio. Currently, when you add your external audio in Premiere Pro, you get none of the metadata the audio recordist added; you just get the audio.

Here is the same audio in DaVinci Resolve, where you can see that the engineer added Mix, TrkA, and TrkB to the metadata! Why can't we see this in Premiere?

Being able to see which track is which would make my whole life so much easier, especially with lav, boom, and mix tracks. If the data is there, Premiere Pro should certainly be able to read it, and export it for the mixer to see as well.
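And the data really is just sitting there: iXML lives in a plain `iXML` RIFF chunk inside the broadcast WAV, right alongside the audio. A minimal Python sketch (using a toy in-memory file; the `TRACK_LIST`/`NAME` element names follow the iXML specification) shows how little work is involved in reading it:

```python
import io
import struct

def read_ixml(wav_bytes):
    """Walk the RIFF chunks of a WAV/BWF file and return the iXML payload, if any."""
    f = io.BytesIO(wav_bytes)
    riff, _size, wave = struct.unpack("<4sI4s", f.read(12))
    if riff != b"RIFF" or wave != b"WAVE":
        raise ValueError("not a RIFF/WAVE file")
    while True:
        header = f.read(8)
        if len(header) < 8:
            return None  # reached end of file without finding an iXML chunk
        chunk_id, chunk_size = struct.unpack("<4sI", header)
        data = f.read(chunk_size)
        if chunk_id == b"iXML":
            return data.decode("utf-8", errors="replace")
        if chunk_size % 2:
            f.read(1)  # RIFF chunks are padded to even lengths

# Toy in-memory BWF with the kind of TRACK_LIST a recordist's mixer writes
ixml = (b'<?xml version="1.0"?><BWFXML><TRACK_LIST>'
        b'<TRACK><CHANNEL_INDEX>1</CHANNEL_INDEX><NAME>Mix</NAME></TRACK>'
        b'<TRACK><CHANNEL_INDEX>2</CHANNEL_INDEX><NAME>Boom</NAME></TRACK>'
        b'</TRACK_LIST></BWFXML>')
body = b"WAVE" + b"iXML" + struct.pack("<I", len(ixml)) + ixml
wav = b"RIFF" + struct.pack("<I", len(body)) + body
print(read_ixml(wav))
```

If a twenty-line script can pull the track names out of the file, a professional NLE certainly can.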

This request is in various places on Adobe Voice, and I have voted for them all, but it hasn't gotten enough votes so far: 11 votes here, 5 votes here, and 20 votes here.

Adobe, Please Add the Ability to See Both Video and Audio Waveform in the Viewer Window Like DaVinci Resolve

As a full-fledged editor, I still much prefer Adobe Premiere Pro over DaVinci Resolve. DaVinci's Edit page is functional, but it just doesn't feel as fully featured, and I like that I can set up the windows in Premiere Pro however the hell I want, while DaVinci forces me much more into its way of working. That is also my big complaint with Final Cut Pro X: engineers forcing you to do things their way. And I am not a big fan of the new Cut page; it feels very Final Cut Pro X to me. Yes, I like how you can quickly scan through a folder as if it were a single sequence, but the actual editing is just not how I edit, and I would much rather have an edit control surface for the Edit page than for the Cut page.

That being said, there is certainly one feature in DaVinci's Edit page that is far superior to Premiere Pro's: the monitor window can overlay your audio waveform on the video, either zoomed in or showing the full clip. To activate it, click the three dots at the upper right of the viewer window and select either Show Zoomed Audio Waveform or Show Full Clip Audio Waveform.

Premiere is either/or. You can work that way in Resolve too, but I just find the overlay so much faster.

Either Video

Or audio in Premiere

But both together is such a time saver! Adobe, I hope you adopt this soon.

There are over 500 votes for this at Adobe Voice, but if you agree, please add your voice. This is a timesaving feature that I really love and would love to see added to Premiere.

Blackmagic Design Updates DaVinci Resolve to Version 17.2

Blackmagic Design has updated DaVinci Resolve to version 17.2.

These are the new features for the Studio version.

What’s new in DaVinci Resolve 17.2

  • Dramatically improved application startup performance.
  • Live save is now on by default.
  • Support for custom naming for individual timeline clips.
  • Support for adding transitions by double clicking or dragging to viewer.
  • Support for decoding AV1 clips on Windows.
  • Accelerated AV1 decodes on supported Intel, NVIDIA and AMD platforms.
  • Support for decoding MKV clips.
  • Support for exporting IMSC-1 compatible TTML captions in IMF clips.
  • Support for option to include project name subfolder in media management.
  • Support for pasting HDR and color warper attributes in the Color page.
  • Support for Fusion template bundles.
  • Support for applying and managing crossfades in the Fairlight timeline.
  • Support for a batch fade and crossfade editor in the Fairlight page.
  • Support for persisting Fairlight edit mode between application restarts.
  • Support for moving audio clips to match timeline timecode position.
  • Support for setting handles when performing audio only renders.
  • Support for controlling track processing order in the Fairlight mixer.
  • Support for accessing Fairlight patch and link in the edit and deliver page.
  • Ability to show or hide specific audio I/O ports for patching in Fairlight.
  • Support for Fairlight console firmware 1.6 with full FlexBus mixing support.
  • Improved waveform displays in the Fairlight timeline.
  • Improved auto scroll behavior when dragging clips in the Fairlight timeline.
  • Support for ACES color science 1.2.
  • Support for selecting per-clip ACES DCTLs from context menu.
  • Support for new IDTs for the Canon EOS-R5 cameras.
  • Option to use white point adaptation in project settings for RCM workflows.
  • Option to use white point adaptation in Resolve FX color space transform.
  • Support for codec passthrough when rendering IMF JPEG2000 clips.
  • Support for trimming Sony Raw and XAVC MXF in media management.
  • Support for reading gyroscopic metadata on Sony Venice clips.
  • Ability to update RMD metadata files for R3D clips.
  • Improved spatial and temporal deinterlace quality.
  • Improved curves range display for position and zoom on the edit timeline.
  • Improved color management for Blackmagic RAW Gen 5 color science.
  • Improved decode performance for 8K H.265 clips on Apple Silicon systems.
  • Improved scripting API with the ability to import custom frame sequences.
  • Improved scripting API with the ability to delete timelines.
  • Improved scripting API with the ability to query current page.
  • Improved scripting API with the ability to add generators and titles.
  • Improved scripting API with the ability to specify render alpha options.
  • Improved scripting API with the ability to switch layout presets.
  • Improved scripting API with the ability to quit the application.
  • Improved iXML data support with AAF export workflows.
  • General performance and stability improvements.

Frame.io's Lisa McNamara and Zack Arnold ACE on Adopting a Remote Post-Production Workflow, from March 2020, Is Well Worth a Read

Lisa McNamara, with the help of Zack Arnold ACE, has written an article at Frame.io on best practices for adopting a remote post-production workflow, and it is well worth a read. It covers the challenges and security concerns, managing media, communication, collaboration, and even morale, well-being, and sanity. It is, of course, also selling Frame.io, but it is their article after all, and it is great and very in-depth, even covering other companies' solutions.

Every post supervisor or producer overseeing a team working from home should read this article.

The Hidden Cost of Apple Changing Its Hardware Architecture for Editors and Motion Graphics Artists Is Plug-ins

So of course Apple is moving all of its computers to the M1 processor, away from Intel. This is the third architecture switch Apple has made: from its initial Motorola processors, to PowerPC, to Intel, and now to the ARM-based M1. And while the current M1 is very fast, it is not a pro processor, especially with its shared graphics memory and a limit of 16 GB of total RAM.

For everyone sticking with Apple this will eventually mean new hardware to move to M1 from Intel, though for a few years at least Apple will continue to support Intel hardware.

The hidden cost, though, is something different: for a professional editor or motion graphics artist, the hidden cost is plug-ins.

Plug-ins can be an expensive investment, but they can really help your workflow, speed things up, and let you do things you couldn't do without them. The move to M1 will certainly mean paid upgrades, even for those still on Intel hardware, and those plug-in upgrades can cost hundreds of dollars. Over the coming transition there are going to be a lot of upgrades to M1.

And while DaVinci and Final Cut Pro X already run on M1, and the Premiere Pro beta runs on M1, to get your old plug-ins to run you have to go through Rosetta 2, which means running the Intel versions of the host software. That means running the software slower through emulation, which could also introduce more stability issues.

Of course, subscription-based plug-ins will have the upgrade included in the subscription price, but without extra upgrade revenue it might take a lot longer before they move to M1, even though the ongoing monthly or yearly payments should arguably mean they upgrade sooner.

And yes, since our Intel hardware will be supported for a few more years, the upgrades will be spread out over time, so we can pay for them. But for me that is a lot of plug-in upgrades, followed by an expensive hardware upgrade to whatever form pro M1 Macs take.

And of course there will be exceptions: companies that treat their customers right and will move to the new architecture without charging anything. One such company is RE:Vision Effects, which e-mailed me to say they are developing M1 versions of the current releases of all their plug-ins. They have already released M1 betas of their OpenFX plug-ins and of Twixtor for FXPlug, and RSMB for FXPlug is next.

Sofi Marshall's Ultimate Real-Time Remote Editing Workflow and How I Am Considering Adapting It to My Workflow

Sofi Marshall, editor, writer, and workflow expert, has an awesome post on the Ultimate Real-Time Remote Editing Workflow. You can check out her IMDb page to see her extensive experience.

Now, I talked a bit about this in my recent post on working from home thanks to the COVID-19 pandemic, and I have been thinking about setting up such a workflow because I would like to be able to work from home as much as possible.

And I too loved iChat theater in Final Cut Pro 7. It was amazing technology that worked really well when the entire Internet was so much slower.

As for the workflow: the Blackmagic Web Presenter has obviously been updated to the Web Presenter HD, which adds a front panel, but there is another option as well, the ATEM Mini or ATEM Mini Pro, which are switchers and include picture-in-picture abilities.

Now, the Mini Pro can stream directly to YouTube, Facebook, or Twitch (I guess you can program in others like Vimeo, but it isn't easy, nor is that going to be all that secure). The Mini at $295, though, seems like a viable alternative: its USB port lets it act as a webcam, and it accepts multiple sources, so any camera with an HDMI port will work. I have an old GoPro Hero 2, which has a mini HDMI out and USB for power.

Personally, since I have a Blackmagic UltraStudio 4K, I would hook it up to the ATEM Mini as my main source, with the GoPro as my personal camera, and then connect the ATEM Mini to my Windows Surface and run Zoom there to save resources on my editing Mac.

The one issue is the single HDMI out on my UltraStudio 4K: to get the picture to my monitor, I would have to take the HDMI out of the switcher, which will likely add more delay relative to what I am editing, but this would only be necessary when live streaming.

And because of the UltraStudio, I would not be streaming any of my desktop, just the direct output from my editing system, though I would still uncheck "Disable video output when in the background" so the output signal remains available as a source.

Now, with my Windows tablet as the Zoom machine, I will have to use it as the audio monitor as well (which does skip all the setup with Rogue Amoeba's Loopback, making things easier), though I am not sure how good it will sound through the headphone port, and I will have to check audio levels using ATEM Software Control (which should work fine on Windows).

Best would be to cut the GoPro's audio and use a good mic through the switcher's microphone ports, but that is of course an extra expense.

And honestly, I should probably run the headphone output into my Emotiva XDA-1 and its connected headphone amp, just so I can control the volume better.

Of course, I have to hook it all up and try it to see how it works; if it doesn't, I could do everything from my iMac, but that shouldn't be necessary.

EDIT: Awesome, Sofi Marshall has made a small update to the post to add the ATEM Mini as an option. Basically it works the same, with the ability to do picture-in-picture from multiple sources, but it has one downside: the ATEM Mini can only accept 1080 inputs, while the Web Presenter can accept 4K inputs, and the ATEM Mini is a bit larger as well. Good to know.

EDIT: OK, so maybe the Web Presenter is the better solution after all. It has SDI, and I have SDI to spare on my UltraStudio 4K; it accepts up to 2160p60 video and outputs up to 1080p60. The ATEM Mini and Mini Pro accept only up to 1080p60, so they wouldn't take a UHD sequence direct from Premiere, but with the Web Presenter I would lose the ability to put a camera on myself via the switcher. Hmmm.

DaVinci Resolve Ext Matte Issues in the Color Page

OK, so I am running into some issues in DaVinci Resolve right now using an Ext Matte on a color corrector node, and I am wondering whether I am doing something wrong or this is how it is supposed to function.

I have some shots that were filmed just as the sun was going down at the end of a shoot, without enough lighting. The exterior is way too dark, and it has to match the daylight exteriors. I tried to pull mattes in DaVinci, and even tried the new smart mask on the people, but I found I could get a better matte in After Effects with the Roto Brush. The matte is the exact same length as the shot in the show's sequence, 1 second and 10 frames, so I made a matte that starts at the first frame and runs exactly 1:10. But when I add it as an Ext Matte to the shot, while the scale matches up, it certainly does not line up first frame to first frame.

So I unchecked Lock Matte to be able to adjust the clip manually, and had to adjust the zoom, tilt, and offset to get it to work. I don't get why I have to set the offset to -14, since the matte is the same length as the visible portion of the shot in the sequence, and the full shot is much longer than 14 frames. SO I AM TOTALLY CONFUSED BY THE OFFSET.

Even weirder to me than the offset, though, is the tilt. I had to set it to -571 to get it to line up (and why can't there be a match-to-position option instead of changing the offset?), which is strange, because I have the transform for that clip from the edit panel below. The zoom matches at 0.6, but the position there is -150, so where does the -571 come from? I should be able to match the two clips by using the same numbers, not completely different numbers! WTF!

I'm not sure what I am doing wrong here, but if anyone can explain why the offset and tilt behave this way, it would be greatly appreciated.