How Mac OSX still *screws* your GH2 / FS100 / NEX footage - A must read!!


Andrew Reid


[quote name='Thomas Worth' timestamp='1342003046' post='13737']This is a non-issue. I'll explain. There's nothing Premiere is doing that 5DtoRGB isn't. They both take 4:2:0 H.264 and re-create an RGB (or 4:2:2, in the case of ProRes) representation of the original data.[/quote]

Except that Premiere CS5 onwards, after decompressing the source files, recreates the RGB data in a 32-bit range based on the native levels range of the camera source. The YCbCr is carried across to 32-bit RGB losslessly, YCbCr <> RGB color space operations can be swapped between in the NLE without loss, and the full color gamut created exceeds what an 8-bit conversion using the 16 - 235 luma range from a transcode can achieve.

This is not possible with 8-bit color space conversions, and it is one of the reasons that camera source files with levels exceeding 16 - 235 have the luma scaled: to reduce the amount of 'invalid' RGB, gamut errors and resultant artifacts that can be created in the color space conversion to RGB when source files have levels outside of the 16 - 235 range.
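
As a rough illustration of the difference (a minimal sketch in Python/NumPy, using assumed BT.709 video-range maths rather than anything from Premiere's actual code): a camera luma value above 235 converts to an RGB value above 1.0, which a 32-bit float pipeline can carry through, but which an 8-bit conversion has to clip or pre-scale away.

[code]
import numpy as np

# Illustrative BT.709 "video range" YCbCr -> RGB conversion
# (Y nominally 16-235, Cb/Cr nominally 16-240). Assumed maths, not Premiere's code.
def ycbcr709_to_rgb(y, cb, cr):
    yf = (y - 16.0) / 219.0            # 16 -> 0.0, 235 -> 1.0
    cbf = (cb - 128.0) / 224.0
    crf = (cr - 128.0) / 224.0
    r = yf + 1.5748 * crf
    g = yf - 0.1873 * cbf - 0.4681 * crf
    b = yf + 1.8556 * cbf
    return np.array([r, g, b])

# A highlight from a camera that records luma above 235 ("full range" source).
rgb_float = ycbcr709_to_rgb(250, 128, 128)
print(rgb_float)                        # ~[1.068 1.068 1.068] (kept in 32-bit float)

rgb_8bit = np.clip(np.round(rgb_float * 255), 0, 255).astype(np.uint8)
print(rgb_8bit)                         # [255 255 255] (the overshoot is clipped in 8-bit)
[/code]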

This was mentioned previously in the thread, re: the 10-bit comments and the advice to export 10-bit DPX image sequences from 5DToRGB at 10MB+ a frame as the solution: so either folders of images, or 32-bit data within the NLE.

I do understand that there are reasons to use image sequences for VFX and so on, but that's specific to the task, not a way to solve the problem of maintaining color and processing precision when transcoding.

My understanding, rightly or wrongly, is that 5DToRGB scales the luma to 16 - 235? That is different from Premiere; they don't do the same thing in that instance. Not so with a QT-based NLE like FCP, or even FCPX I believe, which autoscale.

[quote]Actually, you should really be referring to the MainConcept decoder, which is responsible for supplying Premiere with decompressed H.264 output. At that point, all Premiere is doing is reconstructing 4:2:2 or RGB from the decompressed source.[/quote]

And that decompressing codec does just that, decompresses? Or are you aware of it doing the 4:2:0 -> 4:2:2 chroma upsample, and the algorithm used? Or is that done by Premiere's own code? With QT I find I can only get 4:2:2 out, even when feeding it 4:2:0.

[quote]Remember that the "original" data is useless without a bunch of additional processing. H.264 video stored as 4:2:0 (full res luma, 1/4 res red and 1/4 res blue) must, in 100% of cases, be rebuilt by the NLE or transcoding program into something that can be viewed on a monitor. [/quote]

However, we must separate 'Display' from 'Processing'? On many occasions I work in YCbCr right through: deblock, denoise, upsample chroma, grade, deband, luma sharpen and encode, without a single RGB operation. Of course this is limited in scope compared to RGB operations for VFX, motion graphics etc, but going to RGB is not 100% required as far as processing is concerned.

Display: do we really need to upsample edges with some interpolation routine in the transcode just so it looks better on display? Media players give options for that at playback (bicubic, lanczos etc), and Premiere gives options for preview quality. This doesn't have to be 'baked in' at transcode though?

Interpolating edges has side effects: halos, ghosting and oversharpening. Is this not better done by a plugin within a 32-bit workflow, as part of the grade along with other 'improvements', rather than as some edge 'fix' at transcode? Sure, there are improvements to be made (deblock, denoise, edge interpolation), but surely at 32-bit precision in the NLE?

Processing: I understand your assertion regarding chroma interpolation from 4:2:0 to 4:2:2 to RGB, but we're talking about the difference between a couple of interpolation methods and correct chroma placement, that's all, with negligible differences, perhaps visible at 400% zoom. Bearing in mind, is being that particular about chroma interpolation, after first scaling the luma and requantising in the transcode, really doing no harm?
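
To put the 'couple of interpolation methods' point in concrete terms, here is a minimal NumPy sketch (hypothetical sample values, chroma siting offsets ignored) of restoring the vertical chroma resolution from 4:2:0 to 4:2:2 by nearest-neighbour versus linear interpolation; the two only disagree on the reconstructed lines around chroma edges.

[code]
import numpy as np

# A tiny hypothetical 4:2:0 chroma plane (half resolution vertically and horizontally).
cb_420 = np.array([[100, 100, 200, 200],
                   [100, 100, 200, 200],
                   [ 60,  60, 220, 220]], dtype=np.float32)

# 4:2:0 -> 4:2:2: double the vertical chroma resolution.
# Method 1: nearest neighbour, simply repeat each chroma row.
cb_422_nearest = np.repeat(cb_420, 2, axis=0)

# Method 2: linear interpolation, average adjacent chroma rows for the
# in-between lines (siting offsets ignored to keep the sketch short).
rows, cols = cb_420.shape
cb_422_linear = np.empty((rows * 2, cols), dtype=np.float32)
cb_422_linear[0::2] = cb_420
cb_422_linear[1:-1:2] = (cb_420[:-1] + cb_420[1:]) / 2.0
cb_422_linear[-1] = cb_420[-1]          # repeat the last row at the border

# The methods only differ on the interpolated lines, and only across chroma edges.
print(np.abs(cb_422_nearest - cb_422_linear).max())   # 20.0 in this tiny example
[/code]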

[quote]It's this "rebuilding" process that sets one NLE apart from another. FCP7/QuickTime does a pathetic job rebuilding H.264. Premiere is better. 5DtoRGB, of course, is best.[/quote]

:-)

[quote name='Thomas Worth' timestamp='1342003046' post='13737']Remember that the "original" data is useless without a bunch of additional processing. H.264 video stored as 4:2:0 (full res luma, 1/4 res red and 1/4 res blue) must, in 100% of cases, be rebuilt by the NLE or transcoding program into something that can be viewed on a monitor. It's this "rebuilding" process that sets one NLE apart from another. FCP7/QuickTime does a pathetic job rebuilding H.264. Premiere is better. 5DtoRGB, of course, is best.[/quote]

Very interesting what you said about Premiere before. As I understood you (please correct me with patience), what I see as a preview of the MPEG-4 original (all the codecs in question are MPEG-4 in the end) in Premiere is also a decompressed version, just one that is not dumped onto my hard drive as a huge file; it is rendered "on the fly" and not saved.

I know Premiere and FCP very well, and have done for at least a decade. I have access to more NLEs through my friends: Avid, Vegas, Edius. They all may have their limitations or advantages. What would have become instantly known, and would have meant the end of the software, is bad quality output.

That's my objection regarding 5D2RGB. Thousands of films using the trendy codecs are edited with any of the afore-mentioned programs and published, and, whether you see through all the intricacies discussed here or not, you never see striking differences that lead you to the conclusion that one NLE is better than the other.

That I fail to understand these things fully doesn't mean I wouldn't be sensitive to them. When I skipped from FCP7 to Premiere (because my first impressions of FCP X were unfavourable, to put it mildly), I re-edited some of the projects I had started, and that made a good starting point for a comparison. Though I did not export to QuickTime, I found the results to be of the same quality. Meanwhile, since Lion, the QuickTime gamma shift bug has at last been overcome, and in FCP X you can choose not to transcode to ProRes in the background but to work with the original. With no difference in appearance, using ProRes just improves the RT performance.

One of the advantages of Premiere is its native workflow. My Adobe-teacher friend (PC) refuses to use QT, and I can understand him. Premiere isn't as fast with ProRes as with the native codecs. Can it be that the PC QT version is still not fit for 64-bit? I don't know. All I say is, if you like intermediates, work with FCS ...

[quote name='Axel' timestamp='1342019912' post='13749']
Very interesting what you said about Premiere before. As I understood you (please correct me with patience), what I see as a preview of the MPEG-4 original (all the codecs in question are MPEG-4 in the end) in Premiere is also a decompressed version, just one that is not dumped onto my hard drive as a huge file; it is rendered "on the fly" and not saved.
[/quote]

Yes, Premiere decodes H.264 in realtime the same way any video player would (like VLC, for example). What you're looking at has been decompressed and its chroma rebuilt to RGB for display in the source/program monitors.

[quote]
That's my objection regarding 5D2RGB. Thousands of films using the trendy codecs are edited with any of the afore-mentioned programs and published, and, whether you see through all the intricacies discussed here or not, you never see striking differences that lead you to the conclusion that one NLE is better than the other.
[/quote]

This is, of course, completely subjective. Some may be annoyed by the blocky artifacts QuickTime produces around red taillights or traffic lights in night exterior footage. Some may not care. With 5DtoRGB, you always have the option to transcode without artifacts (and for free, too!).

[quote]
One of the advantages of Premiere is its native workflow. My Adobe-teacher friend (PC) refuses to use QT, and I can understand him. Premiere isn't as fast with ProRes as with the native codecs. Can it be that the PC QT version is still not fit for 64-bit? I don't know. All I say is, if you like intermediates, work with FCS ...
[/quote]

You must be talking about the Windows version of Premiere. On the Mac, editing ProRes in CS6 is still faster than editing native H.264. I edited my last film ([url="https://vimeo.com/42391572"]watch here[/url]) in Premiere CS6, but only after using 5DtoRGB to transcode everything. ProRes is much easier to decompress in realtime than H.264. In general, I-frame only formats will be easier for the NLE to work with compared to GOP-based formats.

[quote name='Thomas Worth' timestamp='1342041170' post='13756']Some may be annoyed by the blocky artifacts QuickTime produces around red taillights or traffic lights in night exterior footage. Some may not care. With 5DtoRGB, you always have the option to transcode without artifacts (and for free, too!)[/quote]

Blocky artifacts around red lights? This must be a case of Premiere/AME mistreating ProRes, as the findings of Pyriphlegethon suggest (page 4, with images). Anyone here to confirm this with FCP?

The other day I made a few screenshots of my own comparison:
[img]https://dl.dropbox.com/u/57198583/5D2RGB.jpg[/img]

Notice that it must read "709", not "701". Also, I know only now, thanks to yellow, that I should have chosen 601 and [i]not[/i] full range; nevertheless there are no blatant problems after cc, let alone blocky reds, which would have been fatal in this red, dark scene ...

I know the red shot by heart, because I edited an event clip with Premiere that contained it.

BTW: Liquid is hot of course. The FS 700 is hot.

[quote name='Axel' timestamp='1342067085' post='13776']
Notice that it must read "709", not "701". Also, I know only now, thanks to yellow, that I should have chosen 601 and [i]not[/i] full range; nevertheless there are no blatant problems after cc, let alone blocky reds, which would have been fatal in this red, dark scene ...
[/quote]

Axel, I'm not sure where I suggested that. The GH2 would be 709 and not full range; the 7D would be 601 and full range. That's if the whole 601/709 thing in 5DToRGB is about the color matrix. I assume this only matters if going to RGB, like DPX, or maybe 5DToRGB transfers the matrix from 601 to 709 in transcodes as well, to avoid potential mishandling of HD sources by certain media players oblivious to the color matrix declaration in the file header.
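
For what it's worth, here is a minimal sketch (NumPy, a made-up pixel value, video-range scaling assumed) of why the matrix declaration matters: the same YCbCr triplet decoded with the BT.601 coefficients versus the BT.709 coefficients gives noticeably different RGB, which is the kind of mishandling a matrix-oblivious player can introduce.

[code]
import numpy as np

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    # Generic video-range YCbCr -> RGB using the matrix defined by (Kr, Kb).
    kg = 1.0 - kr - kb
    yf = (y - 16.0) / 219.0
    cbf = (cb - 128.0) / 224.0
    crf = (cr - 128.0) / 224.0
    r = yf + 2.0 * (1.0 - kr) * crf
    g = yf - (2.0 * kb * (1.0 - kb) / kg) * cbf - (2.0 * kr * (1.0 - kr) / kg) * crf
    b = yf + 2.0 * (1.0 - kb) * cbf
    return np.round(np.clip(np.array([r, g, b]), 0, 1) * 255).astype(int)

pixel = (90, 90, 190)                               # one hypothetical saturated pixel
print(ycbcr_to_rgb(*pixel, kr=0.299,  kb=0.114))    # decoded as BT.601 -> [185  51  10]
print(ycbcr_to_rgb(*pixel, kr=0.2126, kb=0.0722))   # decoded as BT.709 -> [197  61   6]
[/code]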

[quote name='yellow' timestamp='1342069820' post='13777']
Axel, I'm not sure where I suggested that. GH2 would be 709 and not full range. 7D would be 601 and full range.[/quote]

Even better: I transcoded a GH2 clip with 5D2RGB using 709 Broadcast Range (the only alternative to Full Range). Then I transcoded the same clip with QT7 (containing the Panasonic AVCCAM importer plugin) to ProRes. Then I transcoded it again with AME, same format. Then I opened all three and the AVCHD original in Color Finesse and made screenshots of the first frames' luma waveforms: perfect matches!

Also, as is possible with the QT launcher, I opened all versions at the same time: ab-so-lutely identical. Getting better: I opened the .mts in VLC beside the ProRes. No difference in hue, saturation or luma, at least none [i]I[/i] can detect. You may not trust me, but follow these steps and compare the clips with videoscopes and so forth.

  • 2 weeks later...
  • Administrators
Bringing this topic back up again. Thanks for your contributions guys. Is there a conclusion you can give us? This would be helpful and reduce the need for people to read the whole 5 pages. It needs summarising. This thread went deep into the details. What is the consensus on a fix?

Is 5DtoRGB transcode the only solution?

I'm using 5DToRGB but need to avoid having so much ProRes; the disk space it uses is insane.

I get the feeling from skimming through that a simple lowering of contrast in Premiere, prior to any further adjustment layers, does similar things to a 5DtoRGB export. No actual losses are being imparted on the footage by OSX or QuickTime; it is more a playback issue with any footage that originated as AVCHD.

I was about to pull the trigger on 5DtoRGB, but after trying the demo I was put off by the file sizes. I literally saw this topic two days after moving from a Windows 7 PC to a hackintosh with Lion. A bit annoying really!

  • Administrators
I've tried adjusting the contrast and gamma of the luma, changing the luma curve, changing the highlight luma, adjusting the master RGB curve, changing all sorts of stuff, and still cannot get the full range in Premiere from the native AVCHD.

It definitely is an issue and it looks like 5DToRGB is a must-do step.

I just wish there was a definitive fix. Maybe the Mountain Lion update, which I am downloading now, will fix it?

By the way... Problem doesn't go away in CS6 either!

[quote name='EOSHD' timestamp='1343243008' post='14497']
Bringing this topic back up again. Thanks for your contributions guys. Is there a conclusion you can give us? This would be helpful and reduce the need for people to read the whole 5 pages. It needs summarising. This thread went deep into the details. What is the consensus on a fix?

Is 5DtoRGB transcode the only solution?

I'm using 5DToRGB but need to avoid having so much ProRes; the disk space it uses is insane.
[/quote]

My personal opinion:

If you're using a QT-based import method into an NLE and a camera source with luma levels outside 16 - 235, then the levels squeeze happens in QT's decompression, regardless of any 'fullrange' flag setting, so there is no chance of getting full levels into the NLE for 32-bit ops / grade / levels adjustment. You get 16 - 235.
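
A minimal sketch of why that squeeze is destructive when it happens in 8-bit at decode (illustrative NumPy only; I'm not claiming this is QuickTime's exact code path): map full-range luma 0 - 255 into 16 - 235 and round back to 8-bit, and only 220 of the original 256 luma codes survive.

[code]
import numpy as np

# Simulate an 8-bit "levels squeeze" of full-range luma (0-255) into 16-235
# at decode time. Illustrative only, not QuickTime's actual implementation.
full_range = np.arange(256, dtype=np.float32)
squeezed = np.round(16.0 + full_range * (235.0 - 16.0) / 255.0).astype(np.uint8)

print(squeezed.min(), squeezed.max())   # 16 235
print(len(np.unique(squeezed)))         # 220 distinct codes remain (36 collapsed into neighbours)
[/code]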

If using Premiere CS5 onwards with its default MainConcept H.264 decoder, then as long as the 'fullrange' flag is set off in containers like MP4 and MOV (the MTS container doesn't support the flag anyway, I think), full levels pass straight through. You'll see luma levels over 100% on a YCbCr or luma waveform, as you should, because 100% on a typical NLE luma waveform is 235 RGB, not 255.
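
As a quick worked example of the waveform point (assuming the common convention that 0% corresponds to code 16 and 100% to code 235): a full-range peak at code 255 reads at roughly 109%.

[code]
# Assuming the usual NLE waveform convention: 0% = code 16, 100% = code 235.
def waveform_percent(code):
    return (code - 16) / (235 - 16) * 100.0

print(waveform_percent(235))   # 100.0
print(waveform_percent(255))   # ~109.1, full-range peaks read as "super-white"
[/code]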

If using a 32-bit NLE or grading suite and a decompressing codec that doesn't screw with the levels, then there are various approaches to massage the levels into the 16 - 235 range to suit the 'look' you're after. An arbitrary pro-rata input-range to output-range mapping filter is one option, but a 3-way and other more flexible tools probably give more options on how you move those levels outside 16 - 235 into the restricted range.

If you've shot in camera the way you want it to look, then yes, a 0 - 255 to 16 - 235 mapping would suffice. But if shooting a flat / custom profile nowhere near the final 'look', then perhaps just do the levels adjustment in the grade via a 3-way or some such tool.

BUT if you are doing this in RGB, as many NLEs do, then 32-bit mode really is required, not 8-bit. If doing it with a YCbCr filter (YCC for short, incorrectly described in many NLEs as a 'YUV' filter), and the NLE is actually working on the native YCC rather than mangling it via native YCC source -> 8-bit RGB in the NLE -> YCC in the NLE, then 8-bit is OK, though 32-bit is still preferred.
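
To illustrate the 8-bit versus 32-bit part with a minimal sketch (a generic gain-down / gain-up pair in NumPy, not any particular NLE's filter): when every intermediate result is rounded back to 8-bit, codes are lost that the inverse operation cannot bring back, whereas the same pair of operations in 32-bit float rounds only once at the end.

[code]
import numpy as np

# Two cancelling levels operations, once with 8-bit intermediates and once in float.
# Generic sketch, not any particular NLE's processing path.
src = np.arange(256, dtype=np.float32)

# 8-bit path: round back to integers after every operation.
down_8 = np.round(src * (219.0 / 255.0))             # gain down
up_8   = np.round(down_8 * (255.0 / 219.0))          # inverse gain back up
print(len(np.unique(up_8)))                          # 220: codes have been lost for good

# 32-bit float path: same two operations, rounded only once at the very end.
up_f = np.round(src * (219.0 / 255.0) * (255.0 / 219.0))
print(len(np.unique(up_f)))                          # 256: nothing lost
[/code]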

One of the reasons for 32-bit mode is not only precision and rounding errors but the fact that the original 8-bit camera YCC source is a different color model from RGB and a wider color space than 8-bit RGB. It's said that only about 25% of the color values in the YCC source can be contained in 8-bit RGB with an sRGB gamut, hence terms like 'invalid RGB'. However, when converting YCC to 32-bit RGB, 100% of the color values generated can be held (not displayed, just held in memory for working on); we still have to restrict that color range on output from the NLE for typical 8-bit playback.
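
A rough way to sanity-check that 'about 25%' figure (a NumPy sketch assuming a BT.709 matrix and video-range scaling; the exact number depends on those assumptions): sample the YCbCr cube, convert to RGB, and count how many results land inside 0 - 1.

[code]
import numpy as np

# Sample the video-range YCbCr cube and count how many triplets convert to
# "valid" RGB (all channels inside 0..1). BT.709 matrix assumed; rough figure only.
y = np.linspace(16, 235, 32)
c = np.linspace(16, 240, 32)
Y, Cb, Cr = np.meshgrid(y, c, c, indexing="ij")

yf, cbf, crf = (Y - 16) / 219, (Cb - 128) / 224, (Cr - 128) / 224
r = yf + 1.5748 * crf
g = yf - 0.1873 * cbf - 0.4681 * crf
b = yf + 1.8556 * cbf

valid = (r >= 0) & (r <= 1) & (g >= 0) & (g <= 1) & (b >= 0) & (b <= 1)
print(f"{valid.mean():.0%} of sampled YCC values are valid RGB")   # roughly a quarter
[/code]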

It's also possible to shift the color primaries to, say, AdobeRGB or a custom color space in order to hold much more of the gamut generated by a typical 8-bit YCC to RGB conversion, but that will also shift colors. That's not a problem for a flat / desaturated / custom shooting profile source, as the object of the game is to capture as much as possible, not to maintain, for example, the exact shade of red in the scene. It may also be possible to create a custom picture profile to represent the custom color space primaries.

If using 8-bit mode in any NLE or grading suite then you'll really need 16 - 235 luma going in. The same goes for final playback in a typical media player: 16 - 235 luma is required for correct representation of your files.

Not sure if Mac-based Premiere uses QT, but Windows-based certainly doesn't; it uses MainConcept.
