Quartz on whether commute time should be counted as part of the workday

The article is by Sarah Todd at Quartz.

I have been privileged enough to be able to work from home since the COVID pandemic hit, though my wife has started having to go in two days a week. And I have realized I really don’t want to return to the office at all if it can be helped. At home I have my editing computer set up just how I want it, with the plug-ins, scripts, keyboard, and mouse I want, and I am able to see my wife and my dogs during the day. But the thing I really, really don’t miss is the commute.

Honestly, in Los Angeles the minimum commute, if you are very lucky, is 30 minutes each way, and most are between 45 minutes and an hour and 15 minutes each way. And yes, that eats up so much time in the day.

Already I walk my dogs for about an hour and a half each morning (unless it rains, which is not that often), so I get up at 5:45 to start work at 9 AM. Having to drive too would mean getting up at 4:45 AM. I used to be able to do that, but I am getting older, and 5:45 is already early.

And being able to start dinner and eat pretty much as soon as work is done is also amazing; it gives you an evening.

And this is without kids.

Now an inexperienced editor would need more supervision, and I can see them working in an edit bay with closer oversight, but after 20 years, I don’t need daily oversight.

And with all the talk that four-day work weeks without longer hours actually increase productivity, so does working from home, and having commute time count as part of your day on the days you go in really would be amazing. It would mean that employers had to think about the commute, but it might also mean they would only hire employees who live close by, so there would have to be laws setting commute allowances based on where you are. Some places could easily do 30 minutes, but Los Angeles would have to give at least an hour every day. I am assuming employers would make apps that track your drive as well, so they don’t get cheated out of a few minutes, but all of that might let more people work from home.

9to5Mac on the ‘compute module’ in iOS 16.4 code: what is it referring to?

An article by Jeff Benjamin at 9to5Mac speculates on the ‘compute module’ found in iOS 16.4 code.

I hope it really is for the Mac Pro, but honestly I doubt it. I think the Mac Pro will be a higher-end Mac Studio with more Thunderbolt ports. After the seeming upgradeability of the last Mac Pro amounted to only PCI slot upgrades, I just doubt they are building some upgradeable aspect into the new one.

It seems much more likely to be a processor for the Reality Pro headset.

Late last night I got into the Adobe Firefly Generative AI Beta and started playing around

This week Adobe announced their new Firefly Generative AI Beta, which you can sign up for. I did on the first day and got in around 8 PM last night. I was playing on my iPad last night (which was fun), though the download button seems a little funky on iOS, and it didn’t download everything that I asked it to. So I played some more on my Mac today.

There are currently two active modules, Text to Image and Text Effects.

My name as a text effect with humpback whales, and the effect set to loose on the characters. You can see the “Created by Firefly Beta, not for commercial use” bug in the corner.
And my mom’s name done in orchids.
And my wife’s name with a tight Steampunk effect.

The type effects are very cool and fast. My biggest complaint is that it certainly can’t do one plant or animal per letter; it uses them more as textures than as objects. But still, I will be using this quite a lot for certain things.

And as for Text to Image, you certainly have to get the prompt right.

For “1960s psychedelic guitar playing man with beard onstage at Woodstock,” it went for a modern and disturbing image.

Fairly impressive hands, though I’m not sold on those sunglasses, and not sure about the hair in some.

I then tightened it to “psychedelic guitar playing man with beard on stage at Woodstock in 1968” and things got more interesting.

Love the first and third. Then I went for graphic instead of photo with the same prompt.

The second one is much more interesting this way, and the third has eyes like Floyd from The Muppets, or maybe the Grateful Dead, no idea, but creepy. And the fourth is pretty damn fantastic, though with a strange seven-string guitar that only has four strings.

And then went for an art variation.

These are all pretty damn amazing!

And I do love the controls for image refinement.

And I tried to do some imagery for my short film THE MISADVENTURES OF BEAR. While I do love what it came up with, it certainly isn’t listening to the “dark hallway at night” prompt: it is a hallway, but it is certainly lit, and I can’t seem to get it to go dark. The first four are art, and then photo afterwards.

And the next are photo.

And then I stayed on photo but tried to go darker by setting color and tone to cool, and lighting to low lighting.

And then I changed the prompt to “down a hallway in the dark” to see if that would do it, and it didn’t, but I do like the clothing variations it added.

So wow, a really powerful tool that I will keep playing with, but it doesn’t seem to know how to do low-light or unlit images, or day for night.

Tangent releases a beta of its Warp Engine for mapping color grading controls

Tangent Beta Program – Tangent : Tangent

This new engine in the beta allows you to do custom mapping not only for DaVinci Resolve, but for other programs that don’t natively support the panel, though you must go in and program the on-screen positions yourself, since it is basically faking mouse moves. So it is not full customization (that isn’t possible unless Blackmagic allows it, which is highly unlikely).

And everyone will need to program it for their own monitor setup, which is a pain, but once programmed it will work.

Adobe’s AI art generation tool Firefly enters Beta

And you can currently sign up for the beta on the Firefly site right here.

AI has made such an explosion lately, especially Midjourney, which I so want to play with but don’t really want to pay for just to play. This Adobe version seems ideal for me as someone who already subscribes to Creative Cloud, with elements that can be used within Adobe products.

And the Text Effect seems really amazing.

I really want to see one of these where you can put your own images in and get variations on them out (I do believe Midjourney does that).