Andrew Reid

How Mac OSX still *screws* your GH2 / FS100 / NEX footage - A must read!!

Recommended Posts

[html]

[img]http://www.eoshd.com/wp-content/uploads/2012/05/fs100-and-gh2.jpg[/img]

I’ve had my suspicions for a while now that AVCHD footage shows a different exposure on my Mac than on the camera. When I edit my GH2 and FS100 footage natively in Adobe Premiere Pro CS5.5 or the new CS6, or preview AVCHD MTS files in VLC (QuickTime X still does not support AVCHD), blacks are crushed and the image is darker overall, with far less shadow detail than on the camera’s LCD.

[url="http://www.eoshd.com/content/8076/how-mac-osx-still-screws-your-gh2-fs100-nex-footage-a-must-read/"]Read full article[/url]

[/html]

Are you sure this isn't a Premiere Pro thing? Do you notice the same shifts using, say, FCP7? As you say, Quicktime doesn't handle AVCHD, so is PP 'unpacking' the h.264 and then QT is screwing it up, or does PP use its own codecs for AVCHD (not being able to rely on QT to decode it)? What does 5DtoRGB use for its encoder? Are you aware of other unwrapping/transcoding tools such as ClipWrap having this same issue?

I know in the past FCP has done some funky gamma shifts in the output stages when exporting from a timeline unless you tell it not to.

So this is why I have been getting a darker image since Panasonic's plugin for Final Cut came out... It was better before...
Oh well, I'll go back to 5DtoRGB...

It's one of those moments when you've been facing an issue without knowing it until someone points it out for you.
Thanks Andrew.

Since the clipping (I wouldn't call it a gamma shift, though it looks similar to one) happens in VLC when previewing an MTS file direct from the AVCHD folder on the card, no, it is not a Premiere Pro issue.

I suspect this is deep in OSX's Quicktime code, on both Snow Leopard and Lion. It could even be the footage flagging some metadata the wrong way, so Apple may even be blameless. What isn't acceptable is their complete silence over it, which has gone on for 3 years, and the complete lack of AVCHD support in QT X's player.

But if I understand correctly, Andrew, VLC doesn't utilise QT at all? Perhaps it is a Core Video thing then? (Not that I know a lot about that.)

Edit: Which would go on to suggest it is a playback issue, rather than something embedded in the video during transcoding? You mention it is fine when played on a TV, which has rather different gamma characteristics; especially when using Mini DisplayPort to HDMI, the Mac automatically adjusts the output gamma. I am in no way suggesting you are wrong; it would just be good to get to the root of the issue.

After throwing a high-contrast clip into 5DtoRGB and loading both it and the original into CS5.5, I found that by lowering the contrast on the original clip to -14 it was virtually identical to my eye, and very close on all of the scopes to the converted clip. Very interesting though. Some more testing is needed for sure.
Nice find!

I recently spoke in great detail to the creators of ClipWrap, and what we discovered is that QuickTime is simply misinterpreting the superbright and sub-black data in the MTS files. However, the suggested fix of using the full-range option in 5DtoRGB actually takes the original full range of 0-109 IRE and compresses it down to 0-100, tossing out data just to squeeze down the superbrights and lows. What you should do instead is convert with the original 0-109 IRE intact and, inside your NLE, adjust the clips down to a proper 100 IRE clip point, so you are not simply compressing and discarding data from the original file. This is somewhat difficult to put into words, so I hope I am getting the message across. The short version: while "full range" in 5DtoRGB looks better at a glance, it achieves its results by throwing away data to compress the IRE range. You are better off carrying over the entire IRE range and grading/correcting the values to the proper 0-100 IRE range for presentation.
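To make the IRE arithmetic concrete, here is a rough numeric sketch in Python, assuming the common 8-bit studio mapping where 0 IRE sits at code 16 and 100 IRE at code 235 (the function name `ire_to_code` is just for illustration):

```python
def ire_to_code(ire):
    """8-bit luma code for a given IRE level, assuming the studio
    mapping 0 IRE -> code 16 and 100 IRE -> code 235."""
    return round(16 + ire * 219 / 100)

# Legal broadcast range tops out at 100 IRE...
assert ire_to_code(100) == 235
# ...while the camera's superbrights occupy the codes above it,
# up to roughly 109 IRE at the 8-bit ceiling:
assert ire_to_code(109) == 255
# Compressing 0-109 IRE down into 0-100 IRE at conversion time merges
# those ~20 extra codes into the existing range -- data gone for good --
# whereas grading the clip down to 100 IRE inside the NLE happens at the
# NLE's higher internal precision.
```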

So, Allupons, do you think that while editing the original .mts file and lowering contrast, we are missing any information compared to a conversion? Or is it all still there, as when editing on a PC?

QT converts sources to YUY2 4:2:2 interleaved and in the process squeezes luma to 16-235. Canon & Nikon h264 has a flag set in the metadata to signal full-range levels, so QT, Premiere CS5 and FCPX all squeeze DSLR luma to 16-235.
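A simplified model of that squeeze as a sketch, assuming a plain linear remap of the 8-bit codes (real codecs work per-pixel on the luma plane, but the arithmetic is the same):

```python
def qt_squeeze(y):
    """Linear 0-255 -> 16-235 luma squeeze: a simplified model of what
    a decoder does when it treats a source as full range."""
    return round(16 + y * 219 / 255)

# Black and white points move inward, so shadow and highlight detail
# gets compressed into fewer 8-bit codes:
print(qt_squeeze(0), qt_squeeze(255))  # 16 235
```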

Thomas Worth, the creator of 5DToRGB, is well aware of this. However, the last time I tested 5DToRGB it still squeezed luma. I was using a beta though.

For Canon & Nikon sources, transcoding via 5DToRGB is not necessarily required to maintain levels; just remuxing and switching the fullrange flag metadata off via a custom build of MP4Box is sufficient.

The problem with media players like VLC is that they are not color managed, so the typical 16-235 mapped to 0-255 can happen; playback handling can also be at the mercy of video card configuration, whether hardware acceleration is used, and whether overlay is used.

Switch off HW accel in VLC and levels will be full range.

I have a link to some full range test files on my blog at www.blendervse.wordpress.com under the waiving full range flag post, and a couple of threads at Cinema5D in the Picture Styles sub-forum. I would post direct links but I'm currently on the move on a stupid iPhone keyboard. :-)

Ooooohhhh hi guys,

I have been following the eoshd blog for some time now, but I finally decided to sign up to this forum just because of this very subject!! Soooooooo glad that you came up with this one Andrew!

Because I have also been experimenting with the gamma issues I encounter, and really, it's a mess.


My configuration is the following :
- I shoot with a 7D
- Edit in Premiere CS5.5 or FCPX (depending on who I am working with) running on OSX Snow Leopard.
- My screen is calibrated with a Spyder3 Elite. And yes, it is the laptop screen only. And no, I do not intend to invest in an external monitor, as some recommend for proper grading, because my videos will never be seen in any environment other than computers.

My goal is the following :
Find a workflow that ensures consistent gamma or "clipping state" at all stages, whatever player or application the video is viewed in. (By "clipping state" I mean that I don't know whether the tonal differences I notice are due to a gamma problem, or to shadows and highlights being crushed or boosted by Rec.709 profile compensation.)
I know color management in video is still in the stone age, but since Hollywood manages it, why can't we?


Anyway, here are my findings.
I made some screen captures, which you can see here:
http://ronfya.free.fr/ronfya/all_over_the_web_data/pics/gamma_problem_comparison.jpg
[img]http://ronfya.free.fr/ronfya/all_over_the_web_data/pics/gamma_problem_comparison.jpg[/img]

The first row is what the video looks like in the Final Cut Pro X monitor and Premiere CS5.5.
The last column is the video straight out of camera viewed in VLC, MPlayer, QuickTime X and QuickTime 7.
The others are exports from FCPX and Premiere with the default H.264 settings, viewed in the same players.

After a close comparison of the screen caps, toggling one over another, I grouped the matches into color groups to describe the problem:
The green, yellow and red groups are nearly perfect matches within themselves (and in the case of the greens, probably 100% perfect because it's Apple QT).
A rough classification from dark and saturated to bright and desaturated would be: RED, BLUE, YELLOW, GREEN, PINK.
[BLUE & YELLOW] are different but very close, [GREEN & PINK] as well. I can live with that. No problem.

But RED is very different from [BLUE & YELLOW], which are both very different from [GREEN & PINK].
The strangest thing for me is the export from Premiere, which displays differently in QT7 and QTX.

And the [GREEN & PINK] group, which includes the FCPX monitor, is washed out in comparison to [BLUE & YELLOW] from Premiere, and even to the FCPX export in QT7.

Notice that in QT7, if I check the "Enable Final Cut Studio color compatibility" preference, then the exported video from FCPX also displays correctly in QT7.

Well, anything but consistent.

Do you have any advice on this?

What are your personal workflows to ensure proper tones at all stages ?

Thank you sooooooooooo much !
  :D

Wait, so you are saying that even after you export your files from Premiere Pro, they're forever tainted by the crushed blacks and highlights? Like if you make a video project in PP, convert it to h.264 and send me (on PC) the file, the video will be crushed? That's a huge workflow issue if you need to pre-convert everything before being able to edit it without ruining your footage.

Truly bizarre.

Thanks Andrew, This is why I visit your site. Great blog.

Some observations not yet noted: when comparing footage converted to ProRes 422 with ClipWrap against the same footage converted to ProRes 422 with 5DtoRGB, I find the 5DtoRGB file size slightly smaller but the image size slightly larger. In other words, ClipWrap crops the shot slightly.

What's that about?

 

The basics:

YCbCr (YCC) video comprises a luma channel (Y) and two chroma channels, Cb & Cr (blue & red). For many 8-bit cameras this is captured with the luma (Y) free to go full range 0-255 and chroma 16-240. It's very hard to get chroma anywhere near saturated, i.e. close to the 16-240 limits, especially shooting flat.

The metadata encoded into these files comprises three main elements with regard to reproducing the image.

1. The color primaries that define the gamut. For many cameras this is BT709 / sRGB.

2. The transfer curve applied to the linear data off the sensor to produce gamma-encoded video like h264. Generally BT709.

3. The color matrix luma coefficients, which tweak the results of 1 & 2 for final display characteristics. BT601 or BT709.

When HD arrived on the scene and BT709 was introduced, there was argument over which color matrix was best to use, BT601 or BT709. Some encoders chose BT601, others BT709. This is the color matrix coefficients only, not color space or luma levels. Display a BT601-matrix-encoded source like Canon or Nikon h264 as BT709 on an RGB display and skin tones turn yellow/orange. :-) A common f--k up when transcoding.
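The matrix mismatch is easy to demonstrate numerically. A minimal sketch (normalized values, full-range math with no 16-235 offsets, so only the coefficient choice differs; the sample pixel is hypothetical):

```python
def ycbcr_to_rgb(y, cb, cr, kr, kb):
    """Convert one YCbCr pixel to RGB using the given luma coefficients.
    y in [0, 1], cb/cr in [-0.5, 0.5]. Simplified full-range math to
    isolate the effect of the matrix choice alone."""
    kg = 1.0 - kr - kb
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return r, g, b

BT601 = (0.299, 0.114)    # (Kr, Kb)
BT709 = (0.2126, 0.0722)

# The same encoded values decode to visibly different RGB under the
# two matrices -- which is exactly the transcoding skin-tone shift:
pixel = (0.6, -0.05, 0.12)
print(ycbcr_to_rgb(*pixel, *BT601))
print(ycbcr_to_rgb(*pixel, *BT709))
```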

When we view our videos on RGB displays a color space conversion has to be done. By specification, the conversion from YCC to RGB maps YCC 8-bit value 16 to RGB value 0, so YCC levels 0 to 15 get compressed to RGB black. Similarly, YCC value 235 maps to RGB value 255 white, and YCC levels 236 to 255 get compressed to white. This is where the under- and over-brights description comes from.
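That mapping can be sketched in a few lines (a simplified per-pixel model of the luma channel only; real conversions also involve the chroma channels and the color matrix):

```python
def ycc_luma_to_rgb(y):
    """Spec-style YCC -> RGB luma mapping: code 16 -> RGB 0,
    code 235 -> RGB 255. Codes outside 16-235 clip, which is exactly
    where the sub-blacks and superbrights disappear."""
    v = round((y - 16) * 255 / 219)
    return max(0, min(255, v))

# Sub-blacks all crush to black, superbrights all crush to white:
print([ycc_luma_to_rgb(y) for y in (0, 15, 16, 235, 236, 255)])
# [0, 0, 0, 255, 255, 255]
```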

This is all correct and proper at final playback if your display device / color-managed media player etc. is calibrated to 16-235, as many home theatres are, including playback on PCs, i.e. YCC 16-235 to 0-255 RGB. It's also dependent on your underlying graphics card setup and whether HW acceleration is on or off.

For web-based playback the majority is 16-235, not full range; Flash supports full range in some versions, I think. So if encoding for the web, squeeze luma to 16-235 for accurate playback. The whole YCC-to-RGB mapping thing.

If we want the full 8-bit luma range of the YCC, 0-255, represented in RGB as 0-255, then we have three options:

1. Squeeze full-range luma to 16-235 as the last operation before encoding out for delivery, assuming we've been previewing with a LUT'd color-managed app, or by applying a 0-255 to 16-235 levels effect and grading under it with full 0-255 levels rather than working on squeezed 16-235 sources. Then in the conversion from YCC to RGB, levels get expanded out to RGB 0-255.

OR

2. Keep 0-255 full levels through the NLE, encode out to h264 and set the h264 VUI option 'fullrange' to on, so that on decompression a codec will squeeze luma levels to 16-235 ready for the conversion to RGB, giving full levels on playback. This is how Nikon and Canon h264 is set up.

OR

3. Use a properly color-managed media player like Media Player Classic with a LUT'd color management system and a calibrated screen / projector, and run with full luma levels. This rules out codecs that can't be controlled by the color management app to provide the levels handling the media player configuration requires.
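The point of option 2 is that the squeeze-at-decode and the expand-to-RGB cancel out. A sketch of that round trip (idealized floating-point math, ignoring 8-bit rounding):

```python
def squeeze(y):
    """Decoder honoring the fullrange flag: 0-255 -> 16-235."""
    return 16 + y * 219 / 255

def expand(y):
    """YCC -> RGB levels conversion: 16-235 -> 0-255."""
    return (y - 16) * 255 / 219

# Net effect: a full-range source comes out the other end unchanged,
# so the full 0-255 luma survives to RGB playback.
for y in (0, 64, 128, 235, 255):
    assert abs(expand(squeeze(y)) - y) < 1e-6
```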

For truly 'accurate' color / contrast appearance in RGB playback, it's important that the correct color matrix is used in the conversion to RGB. I.e. Canons and Nikons, apart from the 5D Mark III, use a BT601 color matrix, declared in the metadata; the 5D Mark III has moved to BT709 for whatever reason.

My personal take is that because so many people were transcoding to whatever codecs, the BT601 color matrix coefficients metadata got lost; if a codec is not told what color matrix to use, it chooses BT709 based on pixel count, i.e. HD resolutions. The result of using the wrong matrix is that pink skin tones go yellow/orange and contrast increases slightly.

Then we get comments and judgements about camera capability and aesthetics based on incorrect conversion of the source files rather than accurate representation. You know, comments like "Canon skin tone looks worse than another camera". Not the camera, but the skewed transfer to RGB. ;-)

The thing to separate here is display / playback versus acquisition, i.e. getting as much data out of the camera to grade. Frankly, I don't give a s--t about the 16-235 'correct'-to-specification conversion of YCC to RGB until I encode out the final delivery and squeeze levels to 16-235 at 32-bit precision, rather than have some cheap 8-bit codec squeeze at ingest into the NLE. I prefer control over the process: when, where and how.

It's crazy to screen grab from NLEs or media players for analysis of a camera's capability. Yes, it's OK to show how a particular player handles the source, but not for analysis. For that we need full control over the process. I use:

1. A free app called MediaInfo to get details of the camera source.

2. ffmpeg in combination with AVISynth and AVSPmod to handle decompression and preview, under full control, setting the color matrix to use etc. based on MediaInfo / camera specification info.

AVIsynth does nothing to the source: no assumptions, no squeezing levels, no cropping the 1088 h264 stream to 1080, no conversion to YUY2, no nearest neighbour / bilinear / bicubic inconsistency. I can then see what is really there, compared to what an NLE decides to give me.

Having extracted properly converted images with the right luma levels and color matrix, and a waveform / histogram that represents the camera source rather than a skewed media player / NLE output, I have reference images to compare with other playback handling, instead of "this player does this and that player does that" with no idea which is 'correct'.

Canon & Nikon h264 also carries additional metadata: a full-range flag.

