
RODNEY CHARTERS: BM POCKET CAMERA, 4K ACQUISITION & ALEXA


4k streaming is so dumb.

Yeah, white balance on the A7S is weird. Raw photos you can correct, but video? Forget about it; it's just not worth the hassle. After getting a taste of raw video, I'm never going back.


The worst thing about the Sony cameras is the white balance. It's just so... off. Daylight looks like it is 4500K or something. No matter how much you fine-tune the custom settings, it never (ever) looks correct.

Yet Mad Max will be one of the first UHD Blu-rays released. Go figure.

Netflix demands 4K, and soon other online platforms will as well. It might not mean much to the filmmakers, but it does to the distributors and the platforms where your work could be sold, so... there is that.

What I always wonder is: what's stopping people from uprezzing to 4K and passing it off? Same with the BBC 4:2:2 requirements. How are they really going to know if it's from a 4:2:0 camera?

I'm certain the 5D/7D were used and presented as 1080p cameras, even though they most certainly were not... what's to say that *good* 1080p uprezzed would be that different from native 4K? Especially compared with older 4K like the Red One, or cameras that shoot 4K but aren't actually resolving anywhere near that.
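Out of curiosity, here is one way a platform *could* catch an uprez: a band-limited upscale leaves almost no energy in the top octave of the image spectrum, while natively resolved footage does. A toy numpy sketch, with synthetic noise frames standing in for real footage (all function names are mine):

```python
import numpy as np

def highfreq_fraction(img, cutoff=0.5):
    # Fraction of spectral energy above `cutoff` of Nyquist in either axis.
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    h, w = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None] * 2  # -1..1, 1 = Nyquist
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :] * 2
    mask = np.maximum(np.abs(fy), np.abs(fx)) > cutoff
    return spec[mask].sum() / spec.sum()

def bandlimit(img, cutoff=0.5):
    # Simulate a clean 2x uprez: zero out everything above half Nyquist.
    spec = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None] * 2
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :] * 2
    spec[np.maximum(np.abs(fy), np.abs(fx)) > cutoff] = 0
    return np.fft.ifft2(np.fft.ifftshift(spec)).real

rng = np.random.default_rng(0)
native = rng.standard_normal((512, 512))  # stand-in for a detailed native-4K frame
uprezzed = bandlimit(native)              # same pixel count, 1080p-worth of detail

print(round(highfreq_fraction(native), 3), round(highfreq_fraction(uprezzed), 3))
```

Real codecs and sharpening filters muddy the picture, but the broad signature (an empty top octave) is exactly what delivery QC tools look for.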


Film on heavily compressed 4K or uprezzed 1080p, compress it heavily again, and stream it to the masses, only a tiny percentage of whom have 4K TVs, and who won't experience any discernible IQ improvement over 1080p sitting on their lounge 10 feet away from the screen. But hey, they still have the 4K sticker on their TVs to impress their friends.


I think we all agree that story and content matter most. On the technical side, color/skintones and sound are more important, by far, than 4K. However, if you can offer 4K too, why not? As for skin issues (regardless of age), there are amazing plugins which track faces/skintones, allowing for smoothing/defect reduction in post, applied only to the face/skintones. That way the rest of the scene can maintain full 4K, vs. using a diffusion filter which softens everything.

Streaming 4K makes sense: with VP9 and H.265 it costs less than 50% more bandwidth. On the 4K displays on my desk, the difference between 4K and 1080p is dramatic. Sitting comfortably close to a large HDTV, for the cinema experience, 4K makes a difference there too.

I haven't seen it in person, however the new HDR displays shown at CES are getting glowing reviews. 4K HDR is going to be very cool.

As for 4K disk space, the C300 II uses 440 Mbps for 10-bit 4:2:2 4K vs. 225 Mbps for 1080p 4:4:4 12-bit (160 Mbps for 10-bit 4:2:2). That's only 1.95x (or 2.75x over the lower-quality 10-bit 4:2:2) for 4K. At these efficiently compressed bitrates, that's not a big deal for any serious project. Disk space is incredibly cheap, and CFast 2.0 copies are blazing fast. XF-AVC is a much newer and more efficient codec than ProRes (which is basically 10/12-bit 4:2:2/4:4:4 MJPEG). Even 100 Mbps GH4 and A7S/A7R II long-GOP H.264 looks great in 4K vs. 1080p.
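Those bitrates translate to quite manageable storage; a quick sanity check of the arithmetic (Mbps to decimal GB per hour of footage):

```python
def gb_per_hour(mbps):
    # Mbps -> megabytes/sec (÷8) -> per hour (×3600) -> decimal GB (÷1000).
    return mbps / 8 * 3600 / 1000

for label, mbps in [("4K 10-bit 4:2:2 (440 Mbps)", 440),
                    ("1080p 12-bit 4:4:4 (225 Mbps)", 225),
                    ("1080p 10-bit 4:2:2 (160 Mbps)", 160)]:
    print(f"{label}: {gb_per_hour(mbps):.0f} GB/hr")
```

So an hour of the 4K flavor is about 198 GB vs. roughly 101 GB for the 12-bit 1080p: a real increase, but nothing a cheap spinning disk can't absorb.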

We had issues figuring out how to get fast editing of 4K C300 II files in Premiere Pro CC (FCPX had no issues). Once we figured it out, editing 4K footage is blazing fast.

Anyone with a Retina device looking to go back to fuzzy pixels? Once story/content/color/sound are up to par, 4K (and HDR) are nice bonuses when available!


Thankfully there are companies like Arri and Blackmagic, and groups like Magic Lantern and Axiom, who are striving to create the best cinematic digital images instead of milking old tech. The 5D raw and BMPCC meet my baseline 2K resolution requirement for theatrical release. Color science, dynamic range, and 12/14-bit raw trump 4K every time. 4K delivery is an unreasonable demand on independent filmmakers. People still listen to crappy 16-bit audio, and a lot of it is heavily compressed!


What I always wonder is: what's stopping people from uprezzing to 4K and passing it off?

Absolutely nothing. Which is exactly what the UHD Blu-ray of Mad Max: Fury Road will be.

4k streaming is so dumb.

I agree, but it will obviously get better and make more sense soon enough. Right now I pretty much cannot tell the difference between a good HD stream and a Blu-ray. We just rented "The Martian" online and it looked great.


Thankfully there are companies like Arri and Blackmagic, and groups like Magic Lantern and Axiom, who are striving to create the best cinematic digital images instead of milking old tech. The 5D raw and BMPCC meet my baseline 2K resolution requirement for theatrical release. Color science, dynamic range, and 12/14-bit raw trump 4K every time. 4K delivery is an unreasonable demand on independent filmmakers. People still listen to crappy 16-bit audio, and a lot of it is heavily compressed!

I think it comes down to familiarity with a system and sticking with that system instead of ditching it, buying a new camera, and having to relearn what you do well. I have spent the last 3 years with RED cameras and I am having trouble using anything else. From the menus, to my personal workflow for getting the colors I like, to knowing how far I can push the sensor... all of that stuff. I hate relearning tech. Hate it. I think it's key to find "your system" and get really comfortable with it. I think the idea is for the camera side of things to be effortless. Some people hate RED, but I would imagine if you spent enough time with it and adapted to it... you would change your mind. I know I did:

 

https://www.facebook.com/greenshredder


I think it comes down to familiarity with a system and sticking with that system instead of ditching it, buying a new camera, and having to relearn what you do well.

My mantra to noobs: know thy shit; OR, as they used to say back in the olden days: it's not the size, it's what you do with it; OR: practice makes perfect.

Gear porn is addictive. I've bought and spent a lot of time testing cameras that I shouldn't have bothered with. The A7S is one example: great for photos, but for filmmaking there are better options for the money. Every camera has its limitations: try helmet-mounting an Alexa. A good rule of thumb is that you should test a camera for a film for at least as long as you'll be shooting the film.


I think it comes down to familiarity with a system and sticking with that system instead of ditching it, buying a new camera, and having to relearn what you do well. I have spent the last 3 years with RED cameras and I am having trouble using anything else. From the menus, to my personal workflow for getting the colors I like, to knowing how far I can push the sensor... all of that stuff.

https://www.facebook.com/greenshredder

100% agree.


Thankfully there are companies like Arri and Blackmagic, and groups like Magic Lantern and Axiom, who are striving to create the best cinematic digital images instead of milking old tech. The 5D raw and BMPCC meet my baseline 2K resolution requirement for theatrical release. Color science, dynamic range, and 12/14-bit raw trump 4K every time. 4K delivery is an unreasonable demand on independent filmmakers. People still listen to crappy 16-bit audio, and a lot of it is heavily compressed!

I watched a drama shot in 4k and displayed on a Samsung 4K OLED TV last week.

It wasn't an enjoyable or involving experience, because you could see imperfections in the makeup and see the actors sweating under the lights.

Instead of being immersed in the story, I was distracted by these production issues ruthlessly revealed by too much resolution.

  


After getting a taste of raw video, I'm never going back.

That's why I am delaying and delaying the first bite! ;-DD

 

Instead of being immersed in the story, I was distracted by these production issues ruthlessly revealed by too much resolution. 

The first DDD CDs sounded awful: engineers were still using all the tricks that compensated for tape losing high frequencies, in a digital format that does not attenuate high frequencies. You wonder sometimes whether sound engineers have any hearing capacity left.

Same goes for 4K production. You need a 4K monitor, two eyes, and a BRAIN. From "it seems to be there" to "wow, look at that wig on that actor" is a short step!


Hello everybody. These days I'm looking around the web to buy a new camera. I had a 5D Mark II in 2009 (I shot the first Italian full-length feature film on DSLR with it), then a 7D, GH2, GH4, BMPCC, and a Nikon D750.

I was about to buy a Sony A7S II. So I watched some videos from this camera on Vimeo: ungraded, graded, S-Log, Cine2, Cine4, etc. I didn't find one video that satisfied me completely. You know, even excellent videos like Brandon Li's (A7R II) didn't appeal to me that much. Then I found myself watching videos from a camera I had always considered 'shit': the Canon C100, which is now priced on par with the A7S II.

And... I liked many of the videos shot on it. Very much.

Then I said: OK, let me watch some Hollywood trailers to look at the images and the colors. I watched the "Bridge of Spies" trailer: the colors tend toward cold shadows and warm/orange lights (a bit), not so far from a FilmConverted Sony A7xxx. But when I look at Titanic, Jurassic Park, The Big Short, etc., the colors are different. Arguably much more Canon-style. Natural. No orange/blue (what a fashion it is! everything tends toward orange/blue with faded shadows these days...). The difference (obviously) is made by the lighting.

I totally agree with the DOP, and... having set out to buy a Sony A7S II, I'm now going to buy a Canon C100 (a C100 Mk II if I find the money).

 


Hello everybody. These days I'm looking around the web to buy a new camera. I had a 5D Mark II in 2009 (I shot the first Italian full-length feature film on DSLR with it), then a 7D, GH2, GH4, BMPCC, and a Nikon D750.

I was about to buy a Sony A7S II. So I watched some videos from this camera on Vimeo: ungraded, graded, S-Log, Cine2, Cine4, etc. I didn't find one video that satisfied me completely. You know, even excellent videos like Brandon Li's (A7R II) didn't appeal to me that much. Then I found myself watching videos from a camera I had always considered 'shit': the Canon C100, which is now priced on par with the A7S II.

And... I liked many of the videos shot on it. Very much.

Then I said: OK, let me watch some Hollywood trailers to look at the images and the colors. I watched the "Bridge of Spies" trailer: the colors tend toward cold shadows and warm/orange lights (a bit), not so far from a FilmConverted Sony A7xxx. But when I look at Titanic, Jurassic Park, The Big Short, etc., the colors are different. Arguably much more Canon-style. Natural. No orange/blue (what a fashion it is! everything tends toward orange/blue with faded shadows these days...). The difference (obviously) is made by the lighting.

I totally agree with the DOP, and... having set out to buy a Sony A7S II, I'm now going to buy a Canon C100 (a C100 Mk II if I find the money).

 

The difference is that back then a lot of the look was simply the film stock, and different film stocks produce different looks.

 

The "orange/teal" look has been around forever, and is simply a result of using lights with color temps of 3200K and/or 5600-6000K while shooting on stock balanced for either daylight or tungsten. When they mixed lighting temperatures, the result would often be a variation of the "orange/teal" look by default.
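As a toy illustration of that mechanism: balance the camera (or stock) for one source, and light from the other source skews warm or cool. A numpy sketch with made-up, purely illustrative channel energies, not real colorimetry:

```python
import numpy as np

# Illustrative linear-RGB energies for the two sources (not measured values).
tungsten_3200k = np.array([1.00, 0.75, 0.45])  # warm: strong red, weak blue
daylight_5600k = np.array([0.95, 1.00, 1.05])  # slightly cool

def white_balance(rgb, illuminant):
    # Von Kries-style balance: scale channels so `illuminant` renders neutral.
    return rgb / illuminant

# Camera balanced for daylight: daylight areas render neutral grey,
# tungsten-lit areas come out with red > green > blue, i.e. orange.
print(white_balance(daylight_5600k, daylight_5600k))
print(white_balance(tungsten_3200k, daylight_5600k))
```

Balance for tungsten instead and the skew flips: daylight areas go blue/teal, which is the other half of the look.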


 

The "orange/teal" look has been around forever, and is simply a result of using lights with color temps of 3200K and/or 5600-6000K while shooting on stock balanced for either daylight or tungsten. When they mixed lighting temperatures, the result would often be a variation of the "orange/teal" look by default.

It is also how the actual world looks before sunset at this time of year, if you live in snowy Sweden or somewhere equivalent.


Yes, of course. I have nothing against using different temperatures for lights. It's easy to see when a blue/orange tint is a product of the set lighting and when it is color correction, and I don't like the latter. Most of the videos I saw on Vimeo from the Sony A7S II had this problem. Of course, this depends on the taste of the filmmaker, not on the camera. But watching many big movies carefully, like the ones I mentioned, Lord of the Rings, etc., this doesn't happen.

But the heart of the matter is that I was sure I'd buy the A7S II, and I watched hundreds of videos. Now I'm almost sure I'm getting the C100 (damned slow motion, I love it).

Guest Ebrahim Saadawi

Ironically, 4K is more expensive and a much heavier lift for the high-end world than it is for us!

A large portion of video worldwide is consumed through broadcast television. In fact, in most of the world it is the ''only'' video consumption channel available to the population. So what is the 4K proposition for broadcasters? Hundreds of millions of dollars to make the shift. Not exaggerating. I lived through the broadcast transition from SD to HD, and that is how much it cost: massive damage to the facilities' budgets, but the quality leap made it necessary.

When everybody saw HD, they went WOW: I never want SD again. It's so much better, so much clearer. Does the shift to 4K have the same effect on broadcast viewers? No. So this huge portion of the video world will not be 4K anytime soon, not for 5+ years. Perhaps a few experimental channels in the US/UK/Japan. It just costs SO much.

Also, the cost increase 4K represents for episodic television and cinema is SO much greater than it is for us. It's not about the cameras; it's about the data, how the post-production houses deal with it, and the eventual projection, whether on a TV or in a theatre. It's millions of dollars. This is why people like Rodney are taking this stand.

But for us in the low-end world? 4K barely increases cost at all compared to HD. We have dirt-cheap 4K acquisition cameras and lenses; the mere home computers we use as a post-production ''house'' can handle the small amount of 4K data we shoot for a short/music video/corporate/doc; and projection is now free, one click away on YouTube.

So really, it's ironic that the lower-end video world IS going to shift to 4K MUCH faster and more easily than the high-end world. It's just how it worked out.

To me, 4K is a bonus. I wouldn't trade it for dynamic range, colour science, or freedom from artefacts, but I WOULD take it as a bonus if it has no negative effect on the other aspects of image quality. It's just better. It's exactly the same as when digital stills went from 2MP Nikons/Canons to 8MP models. And I do shoot these high-resolution images (36MP stills and 8MP video) even for eventual viewing on 1920x1080 screens, small prints, or even SD screens, because starting with a high resolution gives a better/cleaner image. In many cases the 4K resolution actually improves the other IQ elements: it decreases noise and compression artefacts, since noise reduction plus down-sampling gives a cleaner image, and it increases the colour information, not to mention the ability to make dynamic moves/crops in post.
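The down-sampling claim is easy to demonstrate: averaging 2x2 blocks of independent noise halves its standard deviation (and, as a side effect, a 4K 4:2:0 file downscaled to 1080p carries full-resolution chroma). A numpy sketch on a synthetic patch standing in for a frame region:

```python
import numpy as np

rng = np.random.default_rng(1)
# A flat grey patch plus Gaussian sensor noise (std 0.1) stands in for 4K.
patch_4k = 0.5 + 0.1 * rng.standard_normal((512, 512))

# 2x2 box downsample, the simplest model of a 4K -> 1080p scale:
# each output pixel averages four input pixels.
patch_1080 = patch_4k.reshape(256, 2, 256, 2).mean(axis=(1, 3))

print(f"noise std: 4K {patch_4k.std():.4f} -> downsampled {patch_1080.std():.4f}")
```

Averaging four independent samples divides the noise std by two, which is the "cleaner 1080p from 4K" effect described above. Real scalers use fancier kernels, but the statistics are the same.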

-Would I take higher resolution if it negatively affects IQ (BM4K vs. BMCC 2.5K)? No.

-Would I take higher resolution if it doesn't negatively affect IQ (GH3 vs GH4)? Yes.

-Would I take higher resolution if it positively affects IQ (1Dx vs 1Dc)? Hell yes.

-What about 1080p RAW 14/12bit 4:4:4 high DR vs 4K with high compression? I'd take the 1080p (5D raw/BM Raw vs FS5/GH4). 

Colour thickness is paramount for me, even more than dynamic range; as long as that's 10/11+ stops, I am fine. But colours, how people look, are the most important IQ element I strive for. If you give me 4K on top of that with no downside, I'd be stupid to refuse it. It's a higher-end image, a better image, all things being equal. More information: just as we want more colour depth (bit depth) or more highlight/shadow information (DR), this too is more information to work with in your image. And it has no budget effect on me as a low-end video shooter, for the reasons explained earlier.

4K is coming very fast, and will be the standard for Internet video and TV screen resolutions in less than a year. But it's not going to be the standard for broadcast television or cinema for quite a few years to come, more than you would think. The world is still recovering from the cost of going HD, and most of the world actually STILL hasn't gone HD (and much of what has is 1080i/720p!), in neither broadcast nor web consumption (broadband speed). So 4K in that area of the video world is years and years away from us.


I remember that. Caused quite a stir.

The iPhone and the 7D were the ones that noticeably stuck out as worse. Not AWFUL by any stretch, just noticeably softer.

The GH2 did look good. In the third episode they revealed the lighting setups and the post time. That's where people came back to reality. It took a lot more tweaking with the lights and tons more post work to make the GH2 look good.

Still, it was like 50x cheaper. Remarkable, given the price difference. But that security and speed in getting a nice shot is worth a lot.

That was years ago though, and I'd argue that the low end is crushing up into the high end more and more with each passing year. The test would be even closer now. 

I expect the BIG change with low-end cameras since that test is not just the image quality getting closer, but that you'll be able to light faster and do less in post as well.

 

It still won't quite match the speed and efficiency of an Arri Alexa, of course, which is why the big guys will stick with that! But it is getting closer.



A large portion of video worldwide is consumed through broadcast television. In fact, in most of the world it is the ''only'' video consumption channel available to the population. So what is the 4K proposition for broadcasters? Hundreds of millions of dollars to make the shift. Not exaggerating. I lived through the broadcast transition from SD to HD, and that is how much it cost: massive damage to the facilities' budgets, but the quality leap made it necessary.
 

Ebrahim,

Why does it cost hundreds of millions of dollars? That is, what costs the most for a broadcaster shifting to 4K?

Thanks,

Rob

