Posts posted by TheRenaissanceMan

  1. That 20MP sensor has a lot more latitude in the shadows than the highlights, so it actually responds fairly well to underexposing (to avoid blown highlights) and bringing it up in post. This is a lot like the D800E, which also carries most of its dynamic range in the shadows. This can be nice for recovering botched exposures, but it requires a fair bit of tonal work to get it looking natural. Also, all that shadow DR on a small sensor means that once you get to ISO 1600 or so, you're losing a ton of it to noise. 

    The newer M4/3 sensors, on the other hand, have a much more pleasing, well-balanced DR distribution. They do pretty well in the highlights with a very nice roll-off, but can also be recovered quite well (at low ISOs). I prefer this, because there's less I need to do to make it look good out of the camera. Plus, in low light, I can crush the blacks a little and still have that nice dynamic range in the highlights to keep things looking realistic. 

    I've been shooting the GH3 and the RX10 side by side for a few months now, so I have a ton of experience trying to get them to match well in post. They require VERY different treatment to look their best. Both are quite capable, but if I had to choose, I'd stick with M4/3 in a heartbeat. 

    EDIT: This is all in regard to stills. For video, my findings are similar, which puts the RX10 (even with the XAVC-S codec update) at a distinct disadvantage, because all that shadow DR is heavily compressed and doesn't always respond well to being raised. Compounding the issue is that the most accurate Sony Picture Profile (Deep) is very contrasty and crushes the blacks. Portrait is probably the second best, but it's so oversaturated that you can easily clip a color channel, even at -3 saturation. The GH3 in Natural or Standard blows it out of the water for out-of-the-box color and ease of CC/grading. The only points I'd put in the RX10's favor are that lens (an amazing piece of engineering) and the internal ND.

  2. Here is the full video with the Arri Alexa SL LUT applied. My camera settings are under the description.

    Definitely a little flat in the skintones, luma-wise. It needs a little more contrast in post or the internal profile dropped to -2. Tests show that cranking the contrast lower than that just flattens the image without providing any extra DR. 

    I don't know if people just don't test their cameras these days or what, but...Natural IS one of the normal profiles. Just because one of them has "Cine" in the name doesn't mean you have to use it. Natural for low-contrast and Standard for high-contrast have been my go-to GH3 profiles for years and provide excellent results in all kinds of shooting environments. All this "bad GH4 colors" and "lousy skintones" nonsense is 100% operator error. Glad to see people are finally digging into the camera properly.

  3. Has anyone used the SLR Magic 10mm and the 12mm? I have the opportunity to get one or the other--the 12 for $350, the 10 for $450. I'm not sure which one to go for. I currently have the SLR Magic 25mm and plan to get either the 35mm f/1.4 or the 50mm f/0.95 next.

    For context, I shoot mostly narrative, occasionally foraying into music videos and events. I want a wide angle for wide vistas, deep space compositions, and moving/steadicam shots. I have a BMPCC and plan to pick up a GH4 or G7 in the next month or so. I don't mind distortion (in fact, I kind of prefer a little) and I don't care if the 12 is soft at t/1.6 if it sharpens up by t/2. 

    How, in your experience, do the two compare in those situations? Thanks in advance!

  4. Yeah, that Alexa LUT does make for a nice look. It's a little more contrasty than I'd like to go, personally, but an excellent starting point overall. Here are some more clips shot in the Natural profile (0, -5, -5, -2, 0) that I've applied my own custom LUT to. 

     

    1. Those look great. Natural, understated colors, nice neutral highlights, and healthy skintones. Excellent work.

    2. You are a stunningly rugged man. Respect.

    3. Any chance you'd be willing to share your custom LUT? Or is that too much of a trade secret?

  5. They are affiliated with or paid by Blackmagic. I've never read a negative thing about any BM cameras from them. 

    Or maybe they just like their products? 

    Seriously, why are people acting like BM cameras are so unstable? They have some quirks, but as sensors in boxes, they deliver. Not to mention that they have led the charge on providing features via firmware--features that were never promised and that they had no obligation to provide. They don't crash like Magic Lantern, the more aggressive GH2 hacks, or even the newer RED cameras. And overheating? Their cameras have some of the best internal cooling of any camera out there. 


    In terms of low-light, I'm not sure why people have so much trouble. I can easily push the BMPCC two stops (to 3200) without objectionable noise, and the Ursa Mini is reported to have a stop more latitude in the lows. Combine that with a downsample to 2K from 4.6K and you should be able to get a decent 6400 out of it. If you need more than ISO 6400 and an F/2 lens, your light probably sucks anyway.

  6. Also, oddly enough, when I use the Ninja 2 and record to DNxHD 220x (10-bit), there seems to be smoother gradation, despite the fact that it's only an 8-bit output from the C100. The DNxHD actually looks a bit nicer than ProRes.

     

    10-bit would be nice; I'm sure we'll all have it in coming years. I'm just not willing to sacrifice a lot of convenience right now for a little more bit depth by using Blackmagic cameras, I'm afraid.

    That's why a lot of us would like companies like Samsung or Panasonic to start offering an internal 10-bit option. Then we wouldn't have to choose! :) As it is, all these micro HDMI options are just a stopgap solution.

  7. As long as we're testing, I'd also love to see someone with an external recorder take a 10-bit capable camera like the GH4 or BMPCC and record the uncompressed output in both 8-bit and 10-bit. This theoretical stuff with RAW frames is interesting, but those conditions are where I see big differences in rendering. 

    For example, this video was pretty eye-opening to me. It's a LUT that attempts to imitate Dragon color on the GH4, and you can clearly see in the examples how much more natural the image is from the 10-bit HDMI after the Dragon LUT and standard grading are applied. It's a drastic difference. In many cases, the color palette is completely different between the two, because the 10-bit captures shades the 8-bit can't even see. 

    https://vimeo.com/101350338
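
    If you want to see the levels argument in isolation, here's a minimal sketch of the idea (plain Python/NumPy, synthetic data--not GH4 measurements): it quantizes a smooth luminance ramp, a stand-in for a sky gradient, to 8-bit and 10-bit, applies an arbitrary shadow-lifting grade, and counts how many distinct display values survive. Fewer surviving values means more visible banding after the grade.

    import numpy as np

    # Synthetic smooth ramp standing in for a sky gradient (0.0-1.0 linear light).
    ramp = np.linspace(0.0, 1.0, 100_000)

    def quantize(signal, bits):
        # Round to the nearest code value at the given bit depth.
        levels = 2 ** bits - 1
        return np.round(signal * levels) / levels

    for bits in (8, 10):
        coded = quantize(ramp, bits)
        # Lift the shadows (an arbitrary grade), then re-quantize for an 8-bit display.
        graded = quantize(coded ** 0.6, 8)
        print(f"{bits}-bit source -> {np.unique(graded).size} distinct display values")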

    This aesthetic gap is night-and-day compared to the gap between compressed and uncompressed 8-bit from the Nikon D5300, for example. 

    And a quick question: if your main argument against 10-bit is that the larger file sizes aren't worth it, how can you argue for the 1DC shooting 500 Mbps files? Even ProRes HQ, a 10-bit 4:2:2 codec, is only 220 Mbps, and most people could shoot LT without noticing any degradation. Panasonic's 10-bit 4:2:2 AVC-Ultra codec is even more efficient, at half the size in 1080p and almost a third the size in 4K. How does it look? You tell me. 

    https://vimeo.com/105522587
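
    To put those bitrates in storage terms, here's a quick back-of-the-envelope sketch (straight arithmetic, ignoring audio and container overhead; the AVC-Ultra figure just assumes "half of ProRes HQ" as described above):

    # Rough storage cost per minute of footage at a given video bitrate.
    def gb_per_minute(mbps):
        return mbps * 60 / 8 / 1000  # megabits per second -> gigabytes per minute

    bitrates = {
        "1DC (~500 Mbps)": 500,
        "ProRes HQ 1080p (~220 Mbps)": 220,
        "AVC-Ultra 10-bit 4:2:2 1080p (~110 Mbps, assumed)": 110,
    }

    for name, mbps in bitrates.items():
        print(f"{name}: about {gb_per_minute(mbps):.2f} GB per minute")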

  8. They do. I've shot 60p many times on the 5D Mark III in raw. The image has to be desqueezed and the aspect ratio is a bit funky (also the resolution has to be lowered), but it still looks great, better than the 60p H.264.

    On another note, most people think 8-bit is really bad because of compression, not the 8 bits themselves. Remember that H.264 is basically 6-7 bits of precision.

     Can you post any article or white paper about that last bit? I had no idea H.264 affected tonal precision that much.

  9. What Andrew and I are saying is that 8bit compression vs 10bit compression, everything else being equal, is not a visible difference for 90% of viewers, and even for editors/colourists. 

    We are not saying there is NO difference, just that it's not enough to take the size penalty that comes with it. 

    10bit doesn't make the image look better; it doesn't transform it into a visibly higher-quality image. Higher resolution, more DR, better low-light performance, highlight roll-off, less rolling shutter: all are way more important than the jump to 10bit, and they are visible to everybody. 

    8bit, when implemented well enough, is absolutely gorgeous: C300, C100, 1DC, GH4, NX1, A7S images. In fact, if you test it scientifically, the jump from 4:2:0 to 4:2:2 chroma subsampling makes a more visible difference. 

    Bottom line: it's not the golden ticket to higher IQ; compression is complicated, and bit depth is only one part of it. 

    What makes images look better, compression-wise, is not just bit depth. There's the downscaling of the image, chroma subsampling, bit rate, gamma curve, codec efficiency, and the actual algorithm of the codec: whether it's All-I or Long GOP, whether it compresses motion more or colour more, how much it prioritizes certain colours or frequencies, etc. 

    A great sensor with a great downscale and a great 8bit codec (C300) gives way better images than a poor sensor with a 12bit codec (BM4K). An identical sensor with an 8bit codec and a 10bit codec shows practically identical images for all purposes, just with slightly less banding in your skies, yet with a bigger file. 

    I would hate to see companies start pushing 10bit to consumers as the holy grail; I want them to keep pushing better sensors, better noise, better DR, better downscaling, better colour science, better gamma curves, better ergonomics, and audio. 

    Numbers and theory mean absolutely nothing. Yes, you read on the internet that 10bit has 4 times the colour values of 8bit, so it must be a 4-times-better image in colour; it's not, it just reduces banding slightly, that's it. 

    And a side note: the overestimation of 10bit comes from the theoretical numbers, but also from certain cameras, ones that shoot 10bit in a vastly superior codec to their 8bit one, so people assumed it was bit depth that made the difference. 

    For example, shooting 8bit AVCHD on the FS700 vs 10bit out, or shooting 8bit internal on the F3 vs 10bit out, and so on. The fact is, the improvement from going to 10bit is extremely minimal and only visible in a sky shot when grading it; the visible increase in video quality comes from jumping from poor AVCHD with heavy NR and Long GOP compression to pristine ProRes, All-I, 4:2:2 (4:4:4 on the F3, plus the added S-Log gamma). 

    Remember this also: all your monitors show only 8bit; anything extra is solely for grading purposes. 

    JPEG is 8bit 4:2:2. I am still looking for a camera that shoots 550D-JPEG-quality video for under 20K. The closest is the 1DC, but at 8MP. JPEG is a wonderfully robust 8bit codec when implemented the way Canon/Nikon do it on their DSLRs. 

    Long story short, I believe 10bit is overestimated in the video world, and I would hate it if companies started marketing it to consumers instead of other features. 

     

    I disagree with almost all of that, but you're also getting off topic. This is about which manufacturer we think will implement internal 10-bit, not whether we think 10-bit footage is a meaningful gain over 8-bit. If you want to discuss that, I suggest making a new topic. 

  10. 10bit is good when working with log; shoot JPEGs with a log profile and you can see the problems. Personally, I want good compressed raw, because it's actually a better way of compressing files (14 bits per pixel vs 30 bits per pixel for 10bit 4:4:4).

    See, now that's a valid argument. I was really excited when BM gave the BMCC compressed CinemaDNG in firmware, because the increase in quality is huge, the file size isn't that different, and the workflow is already built into DaVinci Resolve. 

    It makes me wonder why CineForm isn't more widespread. 
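
    For anyone following the bits-per-pixel argument above, here's a rough sketch of where those numbers come from, using uncompressed figures at an assumed 1080p24 (i.e., before any wavelet or DCT compression like CinemaDNG 3:1 or CineForm is applied):

    # Uncompressed bits per pixel for a few capture formats.
    formats = {
        "14-bit Bayer raw (one value per photosite)": 14,
        "10-bit 4:4:4 RGB (3 x 10 bits per pixel)": 30,
        "10-bit 4:2:2 YCbCr (20 bits per pixel on average)": 20,
    }

    width, height, fps = 1920, 1080, 24  # assumed frame size and rate

    for name, bpp in formats.items():
        mbps = width * height * fps * bpp / 1e6  # uncompressed megabits per second
        print(f"{name}: {bpp} bpp, ~{mbps:.0f} Mbps uncompressed")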

  11. Spare a millisecond proving it then, and do us all a fucking favour.

    The science is pretty well-known to serious video shooters, so I think the burden of proof is on you to show the uselessness of 10-bit. 

    Back on topic...I think Samsung seems the most likely. Ambitious and without a professional line to protect. 

    The GH4 already does 10-bit, albeit through the HDMI port. Does anyone know if the DVX200 does too? If so, I don't think an internal 10-bit codec in the GH5 would kill it.

    Show us then!!

    I'm going to fucking bed!

    Sweet dreams. Tomorrow, try running a Google search or two and making your own assessments. :)

  12. Because it isn't the 10bit you're liking.

    I notice a large difference in quality between the GH4's 8-bit and 10-bit output, even on my 8-bit monitor. Maybe nail down my opinion before you attempt to disprove it. ;)
     

     

    The 1D C proves that you can get amazing amounts of colour information out of an 8bit file.

    I never said the 1DC didn't have a good look in terms of color. But you're forcing everyone to agree with your opinion (and it is an opinion, despite how loudly and forcefully you club us with it). If you were showing us carefully shot comparisons with a Macbeth/OneShot chart and a vectorscope to demonstrate an actual technical difference or lack thereof, I'd be more willing to see it your way.
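
    To be concrete about what I'd consider an actual technical comparison, something along these lines would do: average each chart patch in frame grabs from the 8-bit and 10-bit recordings and report the per-patch differences. This is just a sketch--the filenames and patch coordinates are placeholders you'd mark up for your own chart framing, and it assumes OpenCV and NumPy:

    import cv2
    import numpy as np

    # Placeholder patch boxes (x, y, w, h); mark these up for your own chart framing.
    PATCHES = [(100, 100, 40, 40), (160, 100, 40, 40), (220, 100, 40, 40)]

    def patch_means(path):
        # Average BGR value of each chart patch in a frame grab.
        img = cv2.imread(path).astype(np.float64)
        return np.array([img[y:y+h, x:x+w].mean(axis=(0, 1)) for x, y, w, h in PATCHES])

    eight_bit = patch_means("chart_8bit_grab.png")   # placeholder filename
    ten_bit = patch_means("chart_10bit_grab.png")    # placeholder filename

    for i, diff in enumerate(np.abs(eight_bit - ten_bit).mean(axis=1)):
        print(f"patch {i}: mean absolute difference of {diff:.2f} code values")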
     

    Hell, 42MP A7R II JPEGs prove this

    There's a big difference between 8-bit JPEG processing and 8-bit video processing. Except in the 1DC's case, which is part of why it has such a great image.
     

    10bit is massively overrated because people blame 8bit for stuff that is really the fault of heavy compression (banding) and sensor related shortcomings in DSLRs.

    No, I blame it for having less color and tonal information. Which is a fact. And I enjoy the aesthetic more. Which is my personal preference. So maybe slow down on the coffee.
     

    If a line-skipping DSLR went 10bit tomorrow in a firmware update you would notice no difference.

    None of the camera manufacturers we're talking about are using line-skipping anymore except for slow motion and the NX1's 1080p mode, so this point is completely irrelevant. 
     

    And as for grading, I have tried grading 8bit and 10bit from the same camera and you have to pixel peep ridiculously hard to see a difference. Raw makes all the difference for grading, it is a different ballgame. 10bit is not.

    Where is that test and how was it performed? What monitor were you viewing it on? Did you blow it up on a projector to see how it scales? How intense was your grade? 

    I get disagreement on what's important, but you're just bullying us with your opinion at this point. 

  13. Because you're wrong!

    About liking 10-bit footage? How exactly does that work? "You know that thing you like? Well, you don't. Stop it!" 

    And again, it's so funny that you cry out for LOG profiles meant to imitate higher bit depth and write articles about a 4:2:2 camera's color superiority over a 4:2:0 one, then come out and argue tooth and nail that higher bit and color depth is bunk. 

  14. Does Panasonic have a full frame sensor I don't know about? 

    I think there are physics barriers that the magicians at Sony haven't yet figured out how to cross. 

    Where are you reading "full frame" in the title "who will break the internal 10-bit hybrid barrier"? Or are you demonstrating your impressive ability to remember sensor sizes? 

  15.  

    Please, can we wait another 20 minutes for my camera to cool down and repeat the "will you marry John" part??

    I'm sorry, have you SEEN the heatsink on the URSA and URSA Mini? Heat is NOT a problem.

  16. I could be wrong here, but my understanding is that this is possible because there is 'no' compression going on. Not to mention they are dealing with a larger body and, I assume, a larger heat sink. My feeble understanding of this topic is that, at this point, getting full-frame 4K 10-bit out of a small mirrorless body is faaaar more difficult than getting full-frame HD raw out of a large DSLR body. I'm not suggesting that the 8-bit/10-bit divide won't be used as a barrier between prosumer and pro products, because it probably will. But I don't think Sony could deliver 4K 10-bit in that body even if they wanted to. 

    What's stopping them from making a bigger body, à la the GH4 and NX1? Besides losing what little size advantage FF mirrorless has over DSLRs?
