
Thomas S


About Thomas S

Profile Information

  • My cameras and kit
    GH4, Canon M6 MK2, P4k


Thomas S's Achievements

New member (1/5)


  1. The topic of DR is kind of interesting. A lot of cameras live around the 12-stop range. M43 cameras like the GH5, GH5S and the Pocket 4K are all around 12-13 stops. On the Pocket 4K, because of how the ISO works by shifting the stops over and under middle gray, many people avoid ISO 1000, which has the most stops above middle gray, due to the amount of visible noise. So they get 13 stops, but most of that is in the shadows.

Recording format can limit DR as well, and 8bit video just cannot capture the same DR as raw, which is why Sony has always been a tad misleading about DR. You don't get 14+ stops of DR out of a Sony camera by shooting Rec.709 video profiles. You have to use log to get the extra DR, and even then you have to use the flatter log profiles, which do not work as well with 8bit recording. So while the raw stills may have higher DR on Sony cameras, the video has likely never had a ton of extra DR. You really have to shoot S-Log3 to get even close to the sensor's full DR, and that doesn't always look the best in 8bit. It has the most risk of falling apart and of excessive noise, which users will hate more than clipped clouds in the sky. We as humans are used to seeing lights and clouds clipped in video. We are used to seeing a window blown out. We are used to thinking of noise and heavy noise reduction as cheap quality.

Just looking at how much is protected in the highlights does not give us the full picture of how much log has gained on each camera. Some cameras like the Pocket 4K still have 13 stops of DR but use very little of that in the highlights if shot at ISO 100 or ISO 1250 using the dual native ISO. It all really comes down to acceptable noise in the shadows. So far I like what I see with the R6 in the highs and shadows with C-Log3. It's no Alexa, but then again neither is BMD or Sony. No camera really is, which is why people love the Alexa so much.

DR is a flawed form of measurement, and nobody has ever tried to pretend a Sony DSLR's DR is the same as what an Alexa measures. Could the R6 be better? I'm sure it could. I think every DSLR could be better. No camera right now matches what the Alexa can do with DR. What the R6 does have is a good amount for professionals to make use of. We need to stop treating 12 stops as if it were 6, like it's in some way inferior or unusable. We get way too hung up on a single stop of DR difference when it's really splitting hairs, especially to those viewing the final product. I would like 13 or 14 of course, but not enough to think the camera isn't worth it. I also know many Pocket 4K users consider their measly 13-stop cameras to look much better than cameras claiming 14+ stops, proving that DR is a flawed spec used more as a marketing term than anything of real value. Kind of like what the lux rating used to be on video cameras. It's a nice-to-have, not a must-have. That's my opinion, of course.
  2. One other thing I would like to add: coming from Panasonic cameras, I did find it odd that the M6 Mk2 and R6 do not have a shutter or aperture priority mode for video. With that said, in my years using a GH1, GH2, GH3 and GH4 I never used those modes and shot fully manual almost 100% of the time. I have at times considered using the other modes but in the end just did not, so I'm not entirely sure how many people are actually negatively impacted by this.

When it comes to video, under no circumstances should the shutter speed be allowed to adjust on its own like some crappy smartphone camera. Nothing screams amateur more than video with different shutter speeds depending on the lighting. Auto aperture has a lot more practical use, but given how much a lens can change optically based on aperture alone, it really is not the best option either. Since the ISO is so clean on the R6, letting ISO adjust automatically and smoothly is likely the best option if one absolutely cannot manage exposure manually. I'm frankly surprised any professional would have an issue with this, and I cannot recall the last person who actually used an automatically adjusting aperture besides a soccer mom. Those soccer moms still have the full auto mode, which they are more likely to use anyway since they don't care about shutter speed or ISO either. Plus, if one absolutely wanted to, in full auto mode they could set the shutter range to 60 for both the low and high limits. That would lock the shutter so it doesn't change at all, which would effectively give the same result as only auto-adjusting the aperture. I guess I'm just not entirely sure why this is considered such a negative and worthy of a rant.
  3. Are you sure your camera was not a dud? I'm getting absolutely insane levels of detail out of the camera. Like, too sharp compared to the 1:1 sensor readout on my GH4 and P4K cameras. Much more detailed than my M6 Mk2, and it's not even close. The R6 I have is razor sharp. I did notice in FCPX that when pausing the video, the decoder will display the frame softer until I add something like a filter that impacts all the pixels; then it suddenly snaps into full detail. During playback it's that sharp as well. I think there may be something interesting going on with my M1 MBA and its hardware decoding of Canon 10bit material. Are you sure you are not seeing some weird issue caused by Premiere trying to handle the material? I'm not sure I would fault a camera for an NLE not handling the material well. The 10bit H.265 files perform insanely fast on my M1 MBA. Super fast and snappy. Plays back great and renders super fast. I couldn't ask for a better combination, really.
  4. Line skipping wouldn't even be possible on the R6. In order to line skip, a camera has to use pixels in even multiples, 2x or 3x for example. A 2x skip would be only 2736 wide. Even less when you factor in the 94% of the FF sensor width that video uses; that's about 2572 wide, well short of 4K. I realize some cameras like the M6 Mk2 do this, but there is a massive detail difference between what the M6 Mk2 does and what the R6 is doing in 4K 60p mode. Not even close to the same detail between the two. I think the article needs to retract the part about line skipping because it's simply not true at all. I have seen plenty of videos online where the detail was super sharp shooting 4K 60p. I just went outside and shot 23.98 and 59.94 4K on my R6, and the detail was identical in both.
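The skip math above can be sanity-checked in a few lines. A rough sketch: the 94% video coverage figure is from the post, and the 5472-pixel sensor width is my assumption based on the R6's 20 MP stills.

```python
# Back-of-the-envelope check of the line-skipping claim.
# Assumptions: the R6 sensor is 5472 photosites wide (20 MP stills),
# and video reads out roughly 94% of the full sensor width.

SENSOR_WIDTH = 5472      # assumed full sensor width in photosites
VIDEO_COVERAGE = 0.94    # assumed fraction of the width used for video
UHD_WIDTH = 3840         # pixels needed for UHD 4K

for skip in (2, 3):
    usable = SENSOR_WIDTH * VIDEO_COVERAGE / skip
    verdict = "enough" if usable >= UHD_WIDTH else "not enough"
    print(f"{skip}x skip -> {usable:.0f} px wide ({verdict} for UHD)")
```

Either skip factor lands well under 3840 pixels, which is the point: a skipped readout simply could not produce a 4K-wide image from this sensor.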
  5. As a long-time Panasonic M43 user, I actually just upgraded to the Canon R6 and absolutely love it. For me it came down to 100% EF lens support. The 20 MP completely makes sense to me for the same reason 12 MP makes sense for the Sony A7S: it helps sensitivity. The R6 has about a 1-stop advantage over the R5, and even photographers are very happy with the 20 MP stills. The sensitivity gain matters more than the tiny bit of extra detail, detail that debatably few lenses really take advantage of anyway.

Yes, the R6 has some head-scratching limitations, but I think the amount of hatred is a bit overly dramatic. It's a solid stills and video camera with really good IBIS and really good DPAF. I also have a P4K, and while it's nice that I'm not limited in what tools and options I have on the P4K, the addition of super clean low light, super reliable DPAF and very good IBIS makes the R6 a more enjoyable camera to use in many situations. Every camera out there has flaws. The latest Sony A7S finally has 10bit, but its stills suck. If we are going to complain about 20 MP on the R6, it's funny that Sony fans are willing to look the other way for 12 MP stills on the A7S. Most of the other Sony cameras still don't have 10bit. Talk about outdated thinking.

I didn't realize 4K 60p line skips. Is that 100% confirmed? If you look at this review and use the studio tool, the 4K 60p seems to have the same detail as the UHD option in the drop-down: https://www.dpreview.com/reviews/canon-eos-r6-review/8 Are you sure you didn't do something wrong? I rarely use 4K 60p even on the P4K. I'm not as obsessed with slow motion as others are, I guess. I also don't really have an issue with the rolling shutter on the R6. Maybe it's not as good as other options out there, but I rarely run into a rolling shutter issue because I don't whip my camera around, and the 4K 60p APS-C crop eliminates a lot of those concerns.

Yes, it sucks to have a crop, but my other camera choice was the Panasonic S5, which can only shoot 4K 60p with an APS-C crop. So technically the R6 having a FF 4K 60p mode is a bonus over the S5, even if it is compromised. I really see no decent affordable option out there for FF 4K 60p 10bit that doesn't have some kind of compromise or a much higher cost. I also find it odd to criticize the RF lens options when I think many EF users will happily use their trusted EF lenses on the camera. EF glass works great on the R6, and there are a ton of EF owners out there.

I am not even close to a Canon fanboy. I have used Panasonic since before the GH1. I owned a Canon XL1 many years ago, but that was my last Canon camera. I used to laugh at what passed for quality from their DSLRs; the HD made me want to vomit, I hated it that much. The R6 is the first Canon camera I feel finally gets 4K video right. Now that it has C-Log3 it's pretty solid and has a bit better DR. Every camera out there has disadvantages, so it's silly to only trash one camera.

The S5 is an amazing camera, but the L-mount lenses are insanely overpriced and Panasonic only focused on the high-end market for glass. Nothing can really be adapted to the L mount either with any hope of video AF, and even adapted stills AF is hit or miss. That's assuming one finds value in the contrast-detect AF of the S5 in the first place. Much better than it used to be, but still not reliable enough to trust. Plus, I'm just not sure of the future of the L mount and Panasonic. It feels like a risky investment at this point, and I just cannot afford the L-mount lenses I really want.

Sony does make nice cameras, but the lack of 10bit is a deal breaker for me, and I will not compromise on that. No matter what other features Sony does better, it's not enough to make up for that. I don't want to have to buy a $4,000 camera body just to get a Sony with 10bit. That's essentially R5 price territory, and the R5 arguably does a lot of things better than the Sony A7S. Those with hybrid needs who shoot professional stills and video will never consider an A7S, so basically 10bit is a dead end with Sony for those users. I was about 30 seconds away from getting a Panasonic S5 instead; in the end it was the AF and the bleak lens options that killed it for me. I just didn't have $8,000 to invest in a new set of f2.8 zooms and a new body. With the R6 I get 100% perfect performance from my $1,000 Tamron 70-200 f2.8 lens.
  6. That's why I got a Pocket 4K. I got tired of farting around with compressed formats. External recorders are one way around that, but that's also a hassle. Now I can shoot 3:1 BRAW to a cheap SSD directly on the camera, or at least 5:1 to an internal SD card. Now that some DSLRs are getting external raw support, I still don't really care. It's an added cost to get an external recorder, and it's a lot more hassle than what I have now.

I'm also kind of a freak for 4:4:4. I know visually it actually doesn't make a huge difference. I studied VFX in college and can pull damn good keys from 4:2:0. To me it's more about asking why 4:2:2 at all. It's a leftover from the analogue days, and we don't really need it anymore. We have fast and cheap enough media now not to worry about 4:2:2. It's an old broadcast video standard that really has no place in our digital world today. H.264 and H.265 are also very capable of 4:4:4, but we are barely getting cameras to add 4:2:2 and 10bit, let alone 4:4:4.

So BRAW on the P4K represents something I have been chasing ever since I started with S-VHS: something better than broadcast video standards. It's not just because it's raw. To me it's because it's RGB, 4:4:4 and color-space agnostic. No more butchered Rec.709, no more unnecessary 4:2:2. I know visually I could probably do the same with a lesser format, but to me it's about starting clean and going from there. It represents what I always dreamed of being able to do with video.

Oh yeah, and it's 12bit, which will be even harder to make an argument for than 10bit vs 8bit. But hey, it's there and doesn't hurt, so why not. Fun fact: 12bit has 4096 values per channel, and DCI 4K is 4096 pixels wide. That's exactly one unique value per pixel for a full-width subtle gradient, or in other words the perfect bit depth for 4K. Not sure anyone could ever tell versus one value every four pixels like 10bit has, but hey, there it is.

Basically, posterization should be physically impossible on the P4K shooting BRAW.
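The bit-depth fun fact works out like this. A minimal sketch, assuming a worst-case gradient that spans the full DCI 4K frame width and walks through every code value of one channel exactly once:

```python
# How many pixels share each code value in a full-width gradient,
# per bit depth, across a DCI 4K (4096 px wide) frame.

DCI_4K_WIDTH = 4096

for bits in (8, 10, 12):
    values = 2 ** bits                    # code values per channel
    px_per_step = DCI_4K_WIDTH / values   # pixels sharing each value
    print(f"{bits}bit: {values} values -> one step every {px_per_step:g} px")
```

At 12 bits the step width hits exactly one pixel, which is the "one sample per pixel" claim in the post; 10bit lands at one step every 4 pixels.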
  7. Kind of depends. ProRes is, in my opinion, one of the best formats the industry has ever had. With that said, it's not a very smart format. It just throws the same amount of bits at a frame no matter what the content is. The beauty of formats like H.264 is that they are smarter: they look at each frame and try to figure out how much is really needed. Yes, bitrate is important for that, but it's such an efficient format that it can get away with far fewer bits than a dumb format like ProRes could. When ProRes drops down to LT or Proxy, it sacrifices quality across the whole frame no matter what the visual impact might be.

H.264 breaks the image into blocks. MPEG-2 did the same thing, but the image was all 8x8 pixel blocks. H.264 is more sophisticated and can subdivide down to 4x4 pixel blocks, so the blocks can be much smaller in fine-detail areas than the fixed 8x8 blocks MPEG-2 would use. This means we don't see macroblocking as much, and visually we get a very solid picture. The encoder saves bits in the flat areas so they can be spent on the more detailed areas.

H.264 also spreads bits across many frames. It uses the differences within a group of frames to determine what has actually changed. So if a camera is locked down and only a small red ball rolls across the screen, then each of the following frames only has to spend bits on the blocks that cover that red ball. The more the frames change, the more bits each following frame needs. The problem is that yes, sometimes some H.264 encoders do not get enough bits. Most of the time 100 Mbps is enough, but if you get a shot with a lot of random moving fine detail, like tree leaves blowing in heavy wind, then that 100 Mbps may fall apart.

ProRes HQ is dumb, but it has the advantage of looking good no matter what the situation. A locked-down blue sky will compress as well as those complex moving tree leaves. It's just that both will take the same roughly 700 Mbps no matter how simple or complex the scene is. H.264, on the other hand, can get by with much less. It would be an epic waste to give H.264 the same 700 Mbps; it would not need it at all. In the example above, that small red ball just does not need that much data to be stored essentially perfectly.

I'm not sure there is a magic number for what bitrate should be used. It really depends on the scene, but for the most part 100 Mbps has been pretty solid on many 4K cameras, and 150 Mbps for 10bit on some cameras has been even more solid. That's another thing to factor in: ProRes HQ is 10bit 4:2:2, so it's not really fair to compare it to 8bit 4:2:0 H.264 formats directly in terms of bitrate. Again, it's a dummy format, and even if you feed ProRes an 8bit 4:2:0 camera source it still encodes it as if it were 10bit 4:2:2. So the 150 Mbps 10bit 4:2:2 H.264 formats are a better comparison, and visually they hold up very well against ProRes HQ.
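To put rough numbers on the comparison, here is a back-of-the-envelope per-frame budget. It deliberately ignores that H.264 shares bits unevenly across a GOP, and the 700 Mbps ProRes HQ figure is just the approximate UHD rate used in this thread:

```python
# Average bits available per frame at a given constant bitrate.
# Simplification: H.264 actually redistributes bits across a GOP,
# so its real per-frame allocation varies; this is only an average.

FPS = 23.98

def mbits_per_frame(bitrate_mbps: float) -> float:
    """Average megabits available to encode one frame."""
    return bitrate_mbps / FPS

for name, rate in [("ProRes HQ", 700), ("H.264 8bit", 100), ("H.264 10bit", 150)]:
    print(f"{name:12s} ~{mbits_per_frame(rate):5.1f} Mbit/frame")
```

The point of the arithmetic: ProRes HQ has roughly 7x the bits of a 100 Mbps H.264 stream for every single frame, which is why it can afford to be "dumb" about content.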
  8. Agreed. All the photos we look at online are 8bit, and rarely is 8bit an issue. For normal Rec.709 video profiles, 10bit is a tad overkill, although there can be some extremely rare cases where it helps. The bigger plague of 8bit is some H.264 encoders that are too aggressive in deciding which color samples will not be noticed as different and merging them within a block. You can have a frame from an 8bit H.264 video and an uncompressed 8bit PNG of the same image and get more banding from the H.264. Encoders try to figure out what is safe to compress together, the stuff we can't see with the naked eye; if we can't see it, there is no point wasting bits on it. So for an 8x8 pixel block with very subtle variations in green, the encoder may decide to output an 8x8 block of a single green value. This can cause banding where one would normally not have any in 8bit.

Panasonic suffered from this on the GH4 when they added log. The log was so flat that the encoder assumed it could compress areas of color into big blocks because the differences would not be noticeable. If the shot stayed as log they were right, but because the log was flatter than other log profiles, it really struggled with areas of flat color like walls once graded. Sony's encoder did better at not grouping similar colors, at least up to S-Log2; on older Sony cameras, S-Log3 could suffer from the same issues as Panasonic V-Log on the GH4. The GH5 had an improved encoder that wasn't as aggressive with 8bit in areas of similar color.
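The merging behavior described above can be shown with a toy example. This is not a model of any real H.264 encoder, just a sketch of why a flattened block only becomes visible after grading:

```python
# Toy illustration of encoder-induced banding: an almost-flat run of
# 8bit values gets collapsed to its average (an "invisible" loss in log),
# and a later contrast-boosting grade turns it into a visible flat band.

subtle = [100, 100, 101, 101, 102, 102, 103, 103]  # gentle 8bit gradient
flattened = [round(sum(subtle) / len(subtle))] * len(subtle)  # merged block

def grade(v, gain=8, pivot=96):
    """Crude contrast boost around a pivot, clamped to 8bit range."""
    return max(0, min(255, pivot + (v - pivot) * gain))

print([grade(v) for v in subtle])     # still steps smoothly after grading
print([grade(v) for v in flattened])  # one flat band: the gradient is gone
```

Before grading, the two blocks differ by at most a couple of code values and look identical; after the boost, the merged block has lost the gradient entirely, which is the banding the post describes on flat log walls.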
  9. A lot of this is due to the 32bit float color space in NLEs. As long as the 8bit source doesn't already have posterization, 32bit float will likely be able to fill in any gaps as the image is pushed hard. Grading is much easier for math to fill in than, say, upscaling an image. With upscaling, new pixels can be averaged, but averaging doesn't work for fine details like a hair. In grading, we are trying to prevent posterizing, which is done through smooth gradients, and there averaging surrounding values often works perfectly. For example, if you have a value of 200 next to a value of 250, it's easy in grading to average an in-between value of 225, which still creates a nice smooth gradient. That's what 32bit float processing was created for.

Where 10bit is important is making sure the shot is captured well the first time. Once you have posterization it will always be there, and no 32bit float processing can magically make it go away. Conversely, if the shot visually has no posterizing, then no matter how hard it is pushed it likely never will, or pushing 10bit just as hard would show just as much. 10bit is a lot like 32bit audio, or 16 stops of DR graded down to 10: we record more so we have it and can manipulate it more safely. Most of the shots above likely would have still looked good with 6 bits. You need a very long and subtle gradient to break 8bit. It can and does happen, but the more noise the camera has, the less it happens, because of dithering. I think this is partially why Sony always had such a high base ISO for log.

Finally, 10bit never promised better color, more dynamic range or fewer compression artifacts. That's not what bit depth does. It's all just about how many different values can be used across the image, and the single real side effect of too few is posterizing. Many computer monitors at one point were only 6bit panels even if they claimed 8bit, and most people never noticed unless they did something like span a full 1920-wide image with the gradient tool in Photoshop.

In the case of the clear blue sky image in the article, that wasn't even a difficult gradient; most of the sky was a similar shade of blue. To break 8bit you need a gradient going from 0 blue to 255 blue across the full 3840 pixels of UHD video. That means a unique blue value every 15 pixels. So your sky would need to go from black on one end of the screen to fully bright blue on the other. Not always realistic, though skies around dusk and dawn spread the values out a lot more than midday. By comparison, 10bit has a unique blue value every 3.75 pixels for UHD video. It doesn't even have to be something that covers the full screen: a gradient over 100 pixels from 200 blue to 205 blue still means a new blue value only every 20 pixels, even though the screen area is very small. I develop mobile apps, and when I add a subtle gradient across something like a button I run into the same problem: the gradient needs enough range to cover the area of pixels or it will look steppy. 10bit and higher is a safety net, a near guarantee of never having any kind of posterizing. In the professional world that's important, and nobody wants surprises after the shoot is done.
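The pixels-per-step arithmetic in this post generalizes to one small helper. A sketch, assuming a linear gradient spread evenly over a given pixel width:

```python
# How wide each band of identical values is, for a linear gradient
# spanning `width_px` pixels across `delta_values` distinct code values.

def px_per_step(width_px: int, delta_values: int) -> float:
    """Pixels sharing each code value before the next visible step."""
    return width_px / delta_values

# Full-range blue ramp across a UHD frame:
print(px_per_step(3840, 256))    # 8bit:  15.0 px per step
print(px_per_step(3840, 1024))   # 10bit: 3.75 px per step

# A subtle 5-value gradient across a 100 px button:
print(px_per_step(100, 5))       # 20.0 px per step -> visibly steppy
```

The wider each band, the more likely the steps read as banding; the button case shows why even a tiny on-screen area can posterize when the value range is narrow.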