Andrew Reid

How Mac OSX still *screws* your GH2 / FS100 / NEX footage - A must read!!

Recommended Posts

You'll notice in your non-32bit filter screenshot that it's only the RGB representation of your YCbCr source files that is crushed. There is no clipping, by the way. The YCbCr luma waveform is not affected.

The reason your RGB scopes show crushed peaks is the difference between YCbCr source levels and RGB levels, ie: 16 YCbCr is equal to 0 RGB and 235 YCbCr is equal to 255 RGB; that's the correct mapping and conversion. So as many cameras capture the full 0-255 YCbCr luma range, sources won't match in contrast and brightness, and either crushing or scaling has to happen.
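That mapping can be sketched in a few lines. This is a hypothetical illustration (the function name is mine, not any NLE's actual code), assuming a simple linear stretch with clipping:

```python
# Stretch studio-range luma (16-235) to full-range (0-255), as an 8bit
# RGB conversion would. Values outside 16-235 get crushed/clipped.

def limited_to_full(y):
    """Map YCbCr studio range onto full range; clip out-of-range input."""
    out = round((y - 16) * 255 / 219)
    return max(0, min(255, out))

# The legal endpoints map cleanly:
print(limited_to_full(16))   # -> 0
print(limited_to_full(235))  # -> 255
# ...but camera superwhites (236-255) clip to the same value as 235,
# which is why full-luma sources look crushed on RGB scopes:
print(limited_to_full(255))  # -> 255
```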

So the decision is: do I want the NLE codec to do that arbitrarily at decompression, either stretching and crushing, or scaling luma into 16-235 so the RGB scopes look right (aka Vegas)? Or do I want full control, with my sources left well alone, so I can grade luma into 16-235 where I choose, for correct RGB preview and playback on all the devices, including the web, that expect a 16-235 luma signal?

CS6 Premiere offers adjustment layers for applying across all clips?

If cameras didn't shoot full luma and were strictly so-called broadcast legal, then the issue wouldn't exist. But not all output is destined to be broadcast legal, and a wider gamut is possible with 32bit processing over 8bit.

Hi Yellow,

Now let me show you what I understand at this point:

0-255          0-255                             16-235
NEX    --->    PC Monitor (for NLE work)   --->  Output File for Broadcast

There are several ways of converting the NEX 0-255 signal to broadcast 16-235, as you stated.

https://docs.google.com/open?id=0B-B11XUNq0_hdTlCQTZDVmxLWTA
    This is a file that I just recorded with my NEX-5N. The histogram on the NEX looked like this:
[IMG]http://i46.tinypic.com/2q19soo.jpg[/img]

You will see there are superwhite levels, and when you bring them down a little bit there will be subtle detail at the top of the T-shirt... I want that detail, but while I can bring it back, I don't know if there is something wrong with the blacks; I simply cannot tell... If I trust Sony Vegas then the blacks in Premiere are definitely not OK. But if I set the project in Vegas to 32 bit FULL RANGE then everything looks like it does in Premiere CS6.

So could you please check it out for me? Look into the scopes and waveforms of your NLE, and if you could, please download Vegas Pro; it's really a small program compared to Premiere in terms of download size.
And/or just post a screenshot of how you think the perfect conversion of the levels should look. I am not talking about personal taste but about how the perfect conversion should look.

I know that 0 = 16 and 255 = 235, but I don't know which program is doing what here in terms of stretching or scaling, and I am starting to lose the last bit of confidence here.


Now, when it comes to broadcast safe: which output files are 16-235, and how do I know if my NLE (Premiere) is outputting broadcast-safe levels?
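One rough way to answer the "how do I know" part yourself, outside any NLE: decode a frame to raw luma samples and check whether anything falls outside 16-235. This is a sketch, not a Premiere feature; the function name and the sample lists are assumptions:

```python
def is_broadcast_safe(luma_samples, lo=16, hi=235):
    """Return True if every luma sample sits inside the legal range."""
    return all(lo <= y <= hi for y in luma_samples)

print(is_broadcast_safe([16, 120, 235]))       # -> True
print(is_broadcast_safe([16, 120, 235, 250]))  # -> False (superwhite)
```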

I just figured it out. The GH2 records in 16-235, and the NEX FS100 and NEX-5N record in [b]16-255[/b]!!! This is unusual.
So this means Premiere is interpreting the black levels the right way, but the superwhite levels 235-255 are cut off! Bring them down 20 steps and you have much more detail in the highlights.

So when Andrew Reid says that there is less detail in the blacks, that is [b]not[/b] true. Only the part about the highlights is true! This is because of the unusual 16-255 levels of the NEX system, and because Adobe Premiere is set to 16-235, so 16 = 16 and 235 = 235, and everything above 235 is cut off and you have to bring it back!


In Vegas you can change the levels in the project, but in Premiere you can't! But even in Vegas you will either get wrong blacks, which are too bright, or wrong whites... crushed.

The best solution is Premiere, which has spot-on blacks; then add an adjustment layer with the Fast Color Corrector to bring the whites down 20 levels to get all the detail.
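The "bring the whites down" step amounts to rescaling a 16-255 source into 16-235. A minimal sketch of that rescale (a plain linear map for illustration, not the Fast Color Corrector itself):

```python
def nex_to_broadcast(y):
    """Map 16-255 source luma into 16-235, leaving the black level alone."""
    return round(16 + (y - 16) * (235 - 16) / (255 - 16))

print(nex_to_broadcast(16))   # -> 16  (blacks already spot on)
print(nex_to_broadcast(255))  # -> 235 (superwhite detail pulled back)
```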


In After Effects you will have to set the whole project to 32 bit to get all the info in the highlights, and the problem then is that some effects won't work in 32 bit!

Still haven't found a satisfactory conclusion to this thread. Several ambiguities:

1. Is the GH2's matrix BT.601, BT.709, or neither? I have heard all of the above from contradicting sources I'd thought were reputable. 709 seems to be the most popular answer, though it differs from the recommendation at the beginning of this thread. I haven't managed to get that information out of MediaInfo for .MTS streams. Is it listed anywhere below?

[img]http://public.fennworld.com/Misc/Media_Info_MTS.png[/img]

2. Can the GH2 record 'full' unscaled luma? I believe it records 16-235, but if evidence exists to the contrary please present.

3. If the range is indeed scaled, then one should not use 'Full Range' in the 5dtoRGB options, correct? Or would that not potentially degrade the image as that program doesn't seem to export anything but scaled video anyway?

4. Premiere Pro CS6 recognizes the full range flag and scales video, as pointed out by Yellow and readily confirmed by his test files available for download.  Opening GH2 .MTS files natively shows a scaled image in the waveform, but could this be due to flagging issues like with Canon files? If the GH2 ends up delivering a full range image then this is an issue, no?

5. Recommendations often center on transcoding via 5DtoRGB to the chroma subsampled flavors of ProRes (anything but 4444). A screenshot [u]from Premiere[/u] shows the following difference in my tests between 422 HQ and 444 at 400%:

422HQ:
[img]http://public.fennworld.com/Misc/422.png[/img]

4444:
[img]http://public.fennworld.com/Misc/444.png[/img]

Why the jagged stair-stepping? I don't believe this is carried through to output--certainly hope not or no one would recommend it--but why does the file sent through 5dtoRGB and converted to 422 or 422HQ look so much more aliased than the original or a 4444 transcode inside of Premiere?

6. How does one change the full range flag on .MTS streams? I've run some basic commands via MP4Box in terminal before, but am not familiar with the patch for modifying that flag, nor do I know if it would be Mac compatible.

Thank you all for your time and insights.

Hi Pyriphlegethon.

1. The GH2 will be BT.709 color matrix, BT.709 color primaries. If the color matrix is not specifically stated in the stream from the camera encoder, then assume BT.709 for HD sources. This is not to say that a media player will not do its own thing and use BT.601. The color matrix only matters when a YCbCr-to-RGB conversion is done, for display on a computer monitor or for creating an image snapshot or extraction from the video source. Transcoding can lose a color matrix declaration in the stream, though; that's really only a concern for Canon & Nikon DSLRs, which use BT.601 for HD.

2. From the sample files I've seen, the GH2 is restricted-range luma, 16-235, from the camera. Yes, sure, it's easy to get 0-255 by mishandling the source file and doing uncontrolled color space conversions to and from RGB. But it's 16-235 from source, I think.

This would appear to be the reason why, for example, an FS100 and a GH2 clip together in an NLE give different appearances. The FS100 is full luma, and the suggestion is that a tool like 5DToRGB solves a problem, which it doesn't really, as it just scales full luma into 16-235, from the limited tests I've done anyway.

3. If the range is 'scaled', that would suggest that somewhere the codec has done full range to restricted at decompression, or worse, restricted to full. Do you mean 'restricted', ie: 16-235 from source, when you say scaled?

This is where it can get messy depending on codec handling, but generally, if a source file has been encoded with full luma it should be treated that way, and vice versa. Best to test codec handling in the NLE.

From tests I've done, 5DToRGB scales into 16-235, so it appears you're getting all the data. You are, but it's squeezed into fewer levels than originally encoded and quantized over, ie: 0-255 into 16-235, and it gives the same appearance as a comparably shot/exposed GH2 clip, which is already 16-235.

But as mentioned previously, if using a 32bit float NLE then there's really no need to transcode and squeeze levels; just grade in the NLE or apply a levels mapping. All the data will be there outside of the 16-235 range; only the preview will appear crushed and clipped, as the 'proper' range for 8bit playback is 16-235, but at capture we want as much as we can get.
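The cost of the 8bit squeeze can be counted directly. A rough sketch, assuming a simple linear scale with rounding (5DToRGB's actual resampling details are not published here):

```python
def full_to_limited_8bit(y):
    """Scale full-range luma (0-255) into studio range (16-235)."""
    return round(16 + y * 219 / 255)

# 256 distinct input levels collapse onto at most 220 output codes,
# so some adjacent input levels become indistinguishable after the
# squeeze; a 32bit float grade in the NLE avoids this quantization.
unique_out = {full_to_limited_8bit(y) for y in range(256)}
print(len(unique_out))  # -> 220
```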

4. Yes, because the GH2 is 16-235, and Canon or Nikon DSLR sources, which both use the fullrange flag, get luma scaled at decompression into 16-235, so comparable GH2 and Canon/Nikon shots should look very similar in levels. This is the purpose of the fullrange flag: to get a 'proper' preview in an 8bit media player. With 32bit processing in an NLE there's no need for this.

But the flag does avoid the playback confusion that the FS100 appears to suffer from: because it shoots full luma and doesn't flag it as such, at playback it looks crushed and clipped, and the decompressing codec or choice of media player gets blamed for screwing it up. Again, 16-235 is the 'proper' range for playback, but we want as much as possible at capture.
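A toy model of what the fullrange flag changes at decompression (assumed behaviour for illustration, not a real decoder API):

```python
def decode_luma(y, fullrange_flag):
    """Scale flagged full-range luma into 16-235; pass limited through."""
    if fullrange_flag:
        return round(16 + y * 219 / 255)  # Canon/Nikon: flag set 'on'
    return y                              # GH2: already 16-235, untouched

print(decode_luma(235, fullrange_flag=False))  # -> 235 (GH2 passthrough)
print(decode_luma(255, fullrange_flag=True))   # -> 235 (flagged source scaled)
```

An unflagged full-luma source like the FS100 falls through the `False` branch, so its 236-255 values reach the player unscaled and look clipped, matching the confusion described above.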

5. 4:4:4 is not chroma subsampled; 4444 is no subsampling plus alpha; 4:2:2 is subsampled chroma. But I think this is more about interpolation methods from the original 4:2:0 source (GH2/Canon/Nikon DSLRs), where this occurs, and at what bit depth and precision it's done. It's more useful for compositing etc., as the output is almost certainly going to be 4:2:0, and there's no reason this interpolation can't just be done at playback in the media player, resources depending.
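The interpolation point can be shown with a toy 1D chroma row (a sketch of two generic upsampling strategies, not any particular decoder): sample replication gives blocky stair-step edges, while linear interpolation smooths them.

```python
def upsample_nearest(row):
    """Double a chroma row by sample replication (blocky edges)."""
    return [c for c in row for _ in (0, 1)]

def upsample_linear(row):
    """Double a chroma row with midpoint interpolation (smoother edges)."""
    out = []
    for i, c in enumerate(row):
        out.append(c)
        nxt = row[i + 1] if i + 1 < len(row) else c
        out.append((c + nxt) // 2)
    return out

edge = [100, 100, 200, 200]
print(upsample_nearest(edge))  # [100, 100, 100, 100, 200, 200, 200, 200]
print(upsample_linear(edge))   # [100, 100, 100, 150, 200, 200, 200, 200]
```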

6. I don't think the MTS container has the flag as an option; unfortunately, the patched build of MP4Box doesn't recognize MTS as a container, I've recently discovered. :-( If this is for the GH2 then it's probably not important, as the source is 16-235.

Yellow, thanks for the quick reply.

So the GH2 doesn't present too many problems here, it seems, but there are some issues with cameras delivering full range luma from the source. I observed the same thing: that 5DtoRGB was scaling full range luma to limited range regardless of source material. I suppose the issue prevalent for many of us involves moving media to another application for grading. If you have source material with full range luma that you want to ingest in Resolve via XML or EDL, then you're fine, assuming it's a file format Resolve can handle (not MTS). It means grading with the full range of luma underneath a 16-235 LUT and exporting the standard/limited range.

It gets trickier if you need to export from Premiere to Resolve because of file format. Using 5dtoRGB beforehand gives you a clip scaled to "limited" range, which, while not as bad as unknowingly switching between limited and full range, might irritate those who use it for the utmost image quality they hear it provides. Premiere has no options for encoding only the MTS clips in its 'Project Manager', so I suppose I'll test the DPX output from Premiere and do a 'preconform' in Resolve with an EDL (or go the Adobe route to SpeedGrade). Might be interesting to see what the DPX output does with the unscaled data in the RGB conversion compared to a 5DtoRGB conversion to PR4444; then again, with 16-235 footage I'd probably just convert with the latter from the get-go.


I'm still confused as to why the ProRes 4444 clip looks like the original MTS in Premiere (it should be a full RGB source compared to a 4:2:0 source, no?) while the other versions of ProRes from 5dtoRGB exhibit the stairstepping.

I wonder if Andrew might do well to amend this post, as the majority of the information presented seems to be inaccurate for the GH2: both the color primaries and the luma range in the 5DtoRGB settings.

[quote author=Pyriphlegethon link=topic=726.msg6765#msg6765 date=1341327905]
And a quick note on 5dtoRGB "scaling" full range data down on export: Thomas said this is due to the ProRes Codec spec itself being compliant with broadcast range. DPX files can be exported to maintain the full luma range if desired.
[/quote]

Interesting; the tests I did with 5DToRGB were DNxHD exports, and they were luma scaled too.

Alexander, I'd missed your post, and Py's earlier one on Resolve for that matter.

From the few NEX samples I've seen, I was aware it was not restricted 'broadcast' range like the GH2; same for the FS100, with the additional factor of its shooting modes, black level tweaks, etc. But again, it captures beyond the broadcast 16-235 zone of levels, which is really the point: media player and NLE handling of the levels outside of 16-235 and methods of dealing with them.

So whether it's 16-255 or 0-255 is a little academic, but certainly good to know. Thanks.

Py, missed your post too; the result of browsing a long thread with large embedded images on an old-gen iPhone and trying to reach the bottom of the thread rapidly :-)

I think you have it right there, and the DPX route, which I'm assuming will be 10bit, should be enough, whereas going to 8bit RGB is insufficient to hold the full YCbCr data in the original source files.

I'd hazard a guess that, with regard to interpolation of edges, the lesser refinement in 5DToRGB is perhaps due to a non-RGB transform at 8bit versus a 32bit conversion in the NLE? I've not had a chance to look properly at your images and may be misunderstanding your observations.

[quote author=yellow link=topic=726.msg6786#msg6786 date=1341383748]
I think you have it right there and the DPX route which I'm assuming will be 10bit should be enough whereas going to 8bit RGB is insufficient to hold the full YCbCr data in the original source files.[/quote]
That's correct. 10 bits is enough, even when scaled to broadcast range. And you are correct that 8 bit scaled is less data than 8 bit full range, but that assumes the 8 bit H.264 footage was transcoded to an 8 bit codec. ProRes is 10 bit, so it should be enough even though it's scaled to broadcast range.
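The arithmetic behind "10 bits is enough" is quick to check: 8bit full range offers 256 code values, while the 10bit broadcast range (64-940) offers many more, so scaling 8bit full-range data into it loses none of the original distinctions.

```python
full_8bit_levels = 256               # 0-255 inclusive
limited_10bit_levels = 940 - 64 + 1  # 10bit studio range, 64-940

print(limited_10bit_levels)                      # -> 877
print(limited_10bit_levels >= full_8bit_levels)  # -> True
```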

[quote]
I'd hazard a guess that with regard to interpolation of edges that less refinement in 5DToRGB is perhaps due to a non RGB transform at 8bit versus a 32bit conversion in the NLE? I've not had chance to look properly at your images and may be misunderstanding your observations.
[/quote]
Hey Pyriphlegethon, I recall seeing what you're seeing in the past, but my copy of CS6 on the Mac doesn't have this problem. Are you using the latest version of 5DtoRGB? Can you send me a clip that renders like this in Premiere? I'd like to investigate this.

To be honest, I didn't understand 50% of what you were talking about. Partly this may be because English is not my native language, but I showed the white feather in German discussions of similar topics as well.

Be that as it may, I do believe my eyes. I transcoded a GH2 clip using 5D2RGB "as" 709 Full Range and 601 Full Range, and I exported the same ProRes from FCP X. I laid the original and the three ProRes versions into the same timeline and compared them with the RGB scopes. Now, since I color correct everything anyway, I balanced each of the graphs until they looked the same - the graphs as well as the output video clips. To me this seems to be much ado about nothing. I am glad I only downloaded the "lite" version and did not pay 39,99 € for the Batch-5D.
What can you make of [i]this[/i]?
[size=8pt]Film quiz: "This? Why, I can make a hat or a brooch or a ..." (?)[/size]

EDIT: I think nothing can beat the original data. With FCP7, and by transcoding with 5D2RGB, you throw away the original. With Adobe you work natively; with FCP X you work with Proxy or ordinary ProRes (not "HQ") to smooth the background rendering. But what data do the final calculations (no matter if [i]share[/i] or [i]export[/i]) fall back on?

[quote name='Axel' timestamp='1341859454' post='13636'] I transcoded a GH2 clip using 5D2RGB "as" s.th. 709 Full Range, 601 Full Range and I exported it as the same ProRes from FCP X. I laid the original and the three ProRes versions into the same story line and compared them with the RGB scopes. Now since I color correct everything anyway, I balanced every of the graphs until they looked the same - the graphs as well as the output video clips. To me this seems to be much ado about nothing. I am glad I only downloaded the "lite" version and did not yet pay 39,99 € for the Batch-5D.[/quote]

The decision to use valid tools like 5DToRGB can be made for numerous reasons. But with regard to clipping / crushing / getting all the data etc., I feel it's pointless using such a tool on GH2 files, which are 16-235 luma anyway, ie: NOT full range, so using 5DToRGB's fullrange option on the GH2 is incorrect. A 16-235 luma range from the GH2 is what a media player / HDTV / NLE expects and requires for any conversion to RGB at 8bit.

I haven't tested 5DToRGB on GH2 sources, but I'd assume that if you feed a limited range 16-235 GH2 file through 5DToRGB and tell it that the source is full range, ie: 0-255 or 16-255 luma, it will squeeze the GH2's already limited luma range into even fewer 8bit levels: 16-235 assumed as 0-255 results in roughly 30-218.
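That double-squeeze can be sketched numerically (a simple linear rescale for illustration; the exact endpoints vary with rounding):

```python
def full_to_limited(y):
    """Scale an assumed full-range (0-255) value into 16-235."""
    return round(16 + y * 219 / 255)

# Feeding already-limited GH2 endpoints through the full-range scale:
print(full_to_limited(16))   # -> 30  (blacks lifted)
print(full_to_limited(235))  # -> 218 (whites pulled down)
```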

The full range option is for sources like Canons and Nikons that have 0-255 luma (assuming the NLE ignores the fullrange flag, which generally it doesn't, instead honoring it) and for 16-255 luma cameras like the NEX-5 / FS100, because the decompressing codec passes 0-255 or 16-255 through to the NLE. But as NLEs expect 16-235, they treat the source as 16-235 and crush shadows and highlights when stretching what they think is a 16-235 range into 0-255 RGB for preview / playback.

For many NLEs, particularly those working at 32bit precision, that's fine; it's just the preview. Look at the original source files in the NLE's waveform and it'll show above 100% IRE, ie: levels greater than 235. Same at the lower end. The data is there; it just needs grading into place, and in a decent NLE that will be done at 32bit precision rather than via an 8bit-precision transcode squeezing levels outside of 16-235 into that legal range. It's pointless, and in many cases detrimental, to transcode solely to appear to get all the data; however, for playback reasons transcoding may well be required. With regard to Canon & Nikon files this can be a bit more tricky, as discussed previously.

But media players will show the crushed shadows and highlights by default for native 0-255 and 16-255 source files, so it all looks bad. Of course a squeezed source from a transcode will 'look' much better, not crushed, and that's the point about making sure, when encoding to a delivery codec, that a 16-255 or 0-255 source has been graded into 16-235 for proper display of levels by default in a typical non-color-managed media player.

[quote name='yellow' timestamp='1341921644' post='13670']For many NLE's particularly those working at 32bit precision that's fine it's just the preview, look at the original source files in the NLE's waveform and it'll show above 100% IRE ie: levels greater than 235. Same for lower end, The data is there just needs grading into place and in a decent NLE that will be at 32bit precision rather than an 8bit precision transcode squeezing levels outside of 16 - 235 into that legal range. Pointless and detrimental in many cases to transcode solely to be appearing to get all the data, however for playback reasons transcoding may well be required.[/quote]

I feel I understand around 60 % of this, which is an improvement :)

The bottom line is, OSX doesn't do any harm to my precious GH2 or 7D footage (though the title says it [i]screws[/i] it).

But 5D2RGB does, because for Premiere, access to the original data is lost. Okay, this is no disaster, since it is the old FCS workflow (or Cineform, DNxHD or the like), and whoever heard of serious quality loss? However, Premiere doesn't [i]need[/i] it. As you said, yellow, it can map the values anew using 32-bit precision, perhaps a more reliable procedure than tossing away the original. And look what Pyriphlegethon has found out about Premiere's treatment of ProRes HQ, something that at least didn't happen in classic FCP...

If you know there is improper presentation of the original footage in the players, for whatever historic reasons of ancient norms, and if you know you have the means to correct the aberrations, the problem is solved.

Or did I again miss the point?


Well, 'harm' may be too strong a description, depending on how 'precious' we feel the source files are. For me it's more about gaining awareness of what is happening, to avoid 'processing' that is unnecessary or unhelpful.

By OSX I guess you refer to QT? Premiere doesn't use QT even on Mac, but FCP X and FCP do, of course. I'm not aware of how QT handles GH2 sources, but for Canon DSLRs it takes a very similar approach to 5DToRGB with regard to levels; that is, it scales them into restricted range. Whenever I've used QT to decompress, it always gives 4:2:2 even from 4:2:0 sources, upsampling chroma; I'm not sure what interpolation it uses for that, though, as I try to avoid QT for anything.

With Canon DSLR sources, including the 7D and even the prototype 1D C, the MOV container has that fullrange flag metadata set 'on', so many decompressing codecs will scale luma to 16-235, as per a 5DToRGB transcode.

Regarding improper presentation of original footage: yes, it's just about being aware of why and how, so that when things don't look right we stand a better chance of fixing it.

[quote name='Axel' timestamp='1341928279' post='13683']
The bottom line is, OSX doesn't do any harm to my precious GH2 or 7D footage (as the title says [i]screws it[/i]).

But 5D2RGB does. Because for Premiere, the access to the original data is lost. Okay, this is no disaster, since it is the old FCS workflow (or Cineform, DNxHD or the like), and who ever heard about serious quality loss? However, Premiere doesn't [i]need[/i] it. As you said, yellow, it can map the values anew, using 32-bit precision, perhaps a more reliable procedure than to toss away the original.[/quote]

This is a non-issue. I'll explain. There's nothing Premiere is doing that 5DtoRGB isn't. They both take 4:2:0 H.264 and re-create an RGB (or 4:2:2, in the case of ProRes) representation of the original data. Actually, you should really be referring to the MainConcept decoder, which is responsible for supplying Premiere with decompressed H.264 output. At that point, all Premiere is doing is reconstructing 4:2:2 or RGB from the decompressed source.

Remember that the "original" data is useless without a bunch of additional processing. H.264 video stored as 4:2:0 (full-res luma, quarter-res Cb and Cr chroma) must, in 100% of cases, be rebuilt by the NLE or transcoding program into something that can be viewed on a monitor. It's this "rebuilding" process that sets one NLE apart from another. FCP7/QuickTime does a pathetic job rebuilding H.264. Premiere is better. 5DtoRGB, of course, is best.

Keeping the original data is always good for archival purposes, but relying on the original files as your source for editing is also a liability. The reason is that if you ever use more than one piece of software in your workflow, there is a possibility that different programs will render H.264 differently. This is certainly the case with FCP7. Anyone who's done motion graphics work knows it's a pain to get footage rendered out of After Effects to match footage in FCP perfectly. This issue is completely solved by transcoding all of your footage beforehand, [i]with one piece of software[/i] (5DtoRGB). That way, you know the footage will look consistent in every program that opens it, and all the gamma shifting/noise issues that have plagued FCP7 users for years are gone forever!

