Andrew Reid

How Mac OSX still *screws* your GH2 / FS100 / NEX footage - A must read!!

[quote author=Leo James link=topic=726.msg5369#msg5369 date=1337099934]
Anyone tried Neo Scene from Cineform? Will that retain the information lost?
[/quote]

David, the CEO of what was Cineform, advised that full-range levels are squeezed because of suggested problems further down the process; whether that affects an individual's workflow can only be found out by testing.

If I understand correctly, luma is squeezed, but into a 10bit range of levels, not 8bit.
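If that reading is right, the numbers work out. Here's a minimal sketch (my own illustration of a straight linear squeeze, not Cineform's documented code) showing why mapping 8bit full-range luma into the 10bit legal range loses nothing:

```python
# Assumption: a linear map from 8bit full-range luma (0 - 255) into the
# 10bit legal range (64 - 940), as a stand-in for whatever Cineform
# actually does internally.

def squeeze_8bit_full_to_10bit_legal(y8: int) -> int:
    """Map an 8bit full-range luma code to a 10bit legal-range code."""
    return round(64 + y8 * (940 - 64) / 255)

codes = [squeeze_8bit_full_to_10bit_legal(y) for y in range(256)]

# No two source levels collide, so nothing is lost.
assert len(set(codes)) == 256
print(codes[0], codes[255])  # 64 940
```

The 10bit legal range has 877 codes for 256 source levels, so every source level keeps a distinct code and the squeeze is reversible.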

Wow... many thanks, Yellow, for "The Basics." If you asked for a show of hands of who didn't quite follow along... that would be me. I get the gist though, so thanks!

This is a great heads-up for those of us who, like Andrew, were wondering why we didn't seem to be getting what we thought we shot.

Woohoo, Yellow,

Your post has been very enlightening to me. Thank you very much for that ... even though I didn't catch all of it yet and still have my load of questions pending. :P

So here I go.

1. Where did you learn about all this? Do you have any sources on the net or a good book to recommend for reading and learning?


2. About this
[quote]For many 8bit cameras this is captured with the luma (Y) free to go full range 0 - 255 and chroma 16 - 240.[/quote]
Does this mean that cameras like the 5D2, 5D3, 7D, D800, FS100 and GH2 actually record full-range luma and only limited-range chroma to the memory card?
Where did you get the info about what exactly is recorded by the cams (codec, range, flags ...)?
Any trustworthy source?

3. About this
[quote]For web-based playback the majority is 16 - 235, not full range; Flash supports full range in some versions, I think. So if encoding for the web, squeeze luma to 16 - 235 for accurate playback.[/quote]
Does this mean that if one uploads a video to Vimeo or YouTube, these websites read the vids assuming they were encoded in limited-range BT.709? The consequence being that no problem occurs if that was the case, but otherwise the display is wrong?

Moreover, does this mean that when exporting for the web, we HAVE to export AND convert to limited-range spaces such as BT.709?

Or export in whatever color space with the appropriate flag set? (This would be color-managed video, but as I understand it, we would not be having this discussion if CM were a standard in video today.)


4. About this
[quote]The thing to separate here is display / playback versus acquisition, getting as much data out of the camera to grade. Frankly I don't give a s--t about 16 - 235 being 'correct' to specification conversion of YCC to RGB until I encode out final delivery and squeeze levels to 16 - 235 at 32bit precision, rather than have some cheap 8bit codec squeeze at ingest into the NLE. I prefer control over the process: when and where and how.[/quote]

Does this mean that when programs such as Premiere CS5.5 read h.264 video natively, straight from a 7D (without transcoding), they already squeeze it to 16-235 to display the video for editing? The app forces us to edit in the 16-235 space and export in that space too, instead of editing in the 0-255 space and converting to 16-235 during export?

On the other hand, if we transcode to formats such as ProRes before importing into Premiere, is the edit in Premiere done in the 0-255 range, with the conversion to 16-235 at the end?

5. About AviSynth, ffmpeg, AVSPmod ...
Do you know of any equivalents to these on OSX?

6. Do you have any practical workflow tutorial from start to finish anywhere ? Hopefully ?
:D

I know these are a lot of questions, but correctly understanding this issue and the good workarounds would be so helpful to relieve what is just a pain in the ass.

Thank you very much in advance !

R.

First, let me say that Premiere Pro does NOT use QT for h.264. It has its own h.264 encoder/decoder as part of MediaCore.

QT Player 7 and X often handle the gamma of files very differently... that's just how it is. In QT Player 7 there is a preference setting for handling gamma like FCP 7, FWIW, which may or may not do things right.

Screen calibration can cause all sorts of problems too...

But back to the original concern. Any number of things could be going on. One could be that the AVC files are simply being tagged with the wrong color space, or none at all, leaving the NLE to guess or assume one. Another is that maybe PP for some reason doesn't like that particular flavor of AVCHD file. There is a preference file that tells PP how to handle given media types; it's a text file you can edit, just like AE's. If you know where AE's interpretation file is, then you can find PP's as well. I'm not saying where the file is, because I don't want people going in there and hacking it up, making for worse problems...

[quote author=ronfya link=topic=726.msg5389#msg5389 date=1337117322]
Woohoo, Yellow,

Your post has been very enlightening to me. Thank you very much for that ... even though I didn't catch all of it yet and still have my load of questions pending. :P

So here I go.

1. Where did you learn about all this? Do you have any sources on the net or a good book to recommend for reading and learning?[/quote]

Forum at Doom9.org http://forum.doom9.org/

Any books by Charles Poynton.

That's not to say I've learnt anything.


[quote]2. About this
[quote]For many 8bit cameras this is captured with the luma (Y) free to go full range 0 - 255 and chroma 16 - 240.[/quote]
Does this mean that cameras like the 5D2, 5D3, 7D, D800, FS100 and GH2 actually record full-range luma and only limited-range chroma to the memory card?[/quote]

Chroma is limited to 16 - 240, but that's plenty of room for the ITU 601 or 709 color space. If chroma were full range, 1 - 254 that is, then we'd be in a different gamut, ie: xvColor / xvYCC, 1.8x more colors than ITU 709, but we'd need a better monitor to display them. :-)
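To make those ranges concrete, here's a rough sketch of the standard limited-range BT.709 YCbCr to RGB conversion (coefficients rounded by me for illustration; real decoders vary in precision and clipping behaviour):

```python
# Limited-range BT.709 YCbCr -> RGB, scalar version with rounded
# coefficients (1.164 = 255/219, etc.). Luma 16 - 235 and chroma
# 16 - 240 fill RGB 0 - 255; anything outside gets clipped.

def ycbcr709_to_rgb(y: float, cb: float, cr: float) -> tuple:
    r = 1.164 * (y - 16) + 1.793 * (cr - 128)
    g = 1.164 * (y - 16) - 0.213 * (cb - 128) - 0.533 * (cr - 128)
    b = 1.164 * (y - 16) + 2.112 * (cb - 128)
    clip = lambda v: max(0, min(255, round(v)))
    return (clip(r), clip(g), clip(b))

print(ycbcr709_to_rgb(16, 128, 128))   # (0, 0, 0)        video black
print(ycbcr709_to_rgb(235, 128, 128))  # (255, 255, 255)  video white
print(ycbcr709_to_rgb(255, 128, 128))  # (255, 255, 255)  super-white luma is clipped
```

The last line is the point of this whole thread: a decoder that assumes limited range throws everything above luma 235 at RGB 255.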

[quote]Where did you get the info about what exactly is recorded by the cams (codec, range, flags ...)?
Any trustworthy source?[/quote]

mediainfo http://mediainfo.sourceforge.net/en ffmpeg: http://ffmpeg.org/

[quote]3. About this
[quote]For web-based playback the majority is 16 - 235, not full range; Flash supports full range in some versions, I think. So if encoding for the web, squeeze luma to 16 - 235 for accurate playback.[/quote]

Does this mean that if one uploads a video to Vimeo or YouTube, these websites read the vids assuming they were encoded in limited-range BT.709? The consequence being that no problem occurs if that was the case, but otherwise the display is wrong?

Moreover, does this mean that when exporting for the web, we HAVE to export AND convert to limited-range spaces such as BT.709?

Or export in whatever color space with the appropriate flag set? (This would be color-managed video, but as I understand it, we would not be having this discussion if CM were a standard in video today.)[/quote]

If I understand correctly, 16 - 235 luma in YCbCr is expected for correct contrast levels at playback on the web.

[quote]4. About this
[quote]The thing to separate here is display / playback versus acquisition, getting as much data out of the camera to grade. Frankly I don't give a s--t about 16 - 235 being 'correct' to specification conversion of YCC to RGB until I encode out final delivery and squeeze levels to 16 - 235 at 32bit precision, rather than have some cheap 8bit codec squeeze at ingest into the NLE. I prefer control over the process: when and where and how.[/quote]

Does this mean that when programs such as Premiere CS5.5 read h.264 video natively, straight from a 7D (without transcoding), they already squeeze it to 16-235 to display the video for editing? The app forces us to edit in the 16-235 space and export in that space too, instead of editing in the 0-255 space and converting to 16-235 during export?[/quote]

Yes, Premiere CS5 squeezes the full-range luma of Nikon and Canon h264 on import because of the VUI Options 'fullrange' metadata flag attached to the MOV container. It's set on by the camera encoder. So Premiere CS5's h264 decompressor assumes levels will exist outside 16 - 235, and in order to maintain 'correct' contrast for the RGB preview it squeezes the luma so it can do the 16 - 235 YCC to 0 - 255 RGB color space conversion. But this also means that any encode out of Premiere is restricted to 16 - 235, rather than having the option to encode out 0 - 255 in h264 with the flag set on. Basically, you can't get out what you put in. Output contrast is the same, but the range of 8bit levels used is reduced compared to the camera source.
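The "can't get out what you put in" part is simple arithmetic. A sketch (my own illustration, not Premiere's actual code) of squeezing 8bit full-range luma into 16 - 235 at 8bit precision:

```python
# Squeezing 256 full-range luma codes into the 220 codes of 16 - 235
# at 8bit precision forces neighbouring source levels to collide;
# stretching back out later cannot recover the lost distinctions.

def squeeze_full_to_legal_8bit(y: int) -> int:
    return round(16 + y * (235 - 16) / 255)

squeezed = [squeeze_full_to_legal_8bit(y) for y in range(256)]
print(len(set(squeezed)))  # 220 distinct codes survive out of 256
```

Doing the same squeeze at 32bit float precision, as described above, avoids the collisions because intermediate values aren't forced back onto 8bit codes until final delivery.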

[quote]On the other hand, if we transcode to formats such as ProRes before importing into Premiere, is the edit in Premiere done in the 0-255 range, with the conversion to 16-235 at the end?[/quote]

Transcoding should be considered a last resort, to resolve a specific problem, generally one resulting from playback issues.

A simple remux of Canon and Nikon MOVs with a VUI-Options-patched build of MP4Box, switching the fullrange flag off, gives us no luma squeeze in the NLE, but the wrong contrast in the preview, because the 16 - 235 zone of our 0 - 255 luma is now stretched out to 0 - 255 RGB.

But we can now work on the full luma levels as shot in camera, at 32bit precision, and grade as required, including squeezing luma into 16 - 235 for delivery if and as needed.

[quote]5. About AviSynth, ffmpeg, AVSPmod ...
Do you know of any equivalents to these on OSX?[/quote]

Not to my knowledge; ffmpeg, I'd imagine, yes, but not the others. If you're concerned simply about luma levels, getting MP4Box for Mac would suffice? But unless you see specific problems in your NLE, the luma squeeze is not necessarily going to cause much of a problem.

[quote]6. Do you have any practical workflow tutorial from start to finish anywhere ? Hopefully ?
:D

I know these are a lot of questions, but correctly understanding this issue and the good workarounds would be so helpful to relieve what is just a pain in the ass.

Thank you very much in advance !

R.
[/quote]

Sorry, no tutorials. If you are seeing a specific problem, then I could perhaps help.

Yellow, thank you so much for your explanations and links to even more explanations :D
That is uncovering some foggy issues about conversions.

It is quite strange that necessary operations like import/export/convert are so often overlooked.
We don't know exactly what's going on behind the curtain; when it seems fine we don't question the process, but when it goes wrong, we cannot say why or find a solution.

Well ... more learning is needed.

[quote][quote]6. Do you have any practical workflow tutorial from start to finish anywhere ? Hopefully ?


I know these are a lot of questions, but correctly understanding this issue and the good workarounds would be so helpful to relieve what is just a pain in the ass.

Thank you very much in advance !

R.[/quote]

Sorry, no tutorials. If you are seeing a specific problem, then I could perhaps help.[/quote]

So a practical question now.

In your previous post, you explained 3 possible solutions to take control of proper conversions at each stage of import/export into the NLE.

In order to clarify how it is done exactly [b][u]in practice[/u][/b], could you, for example, detail the specific steps of the process when:
- properly importing videos from a 7D into Premiere
- setting any effects in Premiere for correct preview during editing & grading (if needed)
- exporting correctly from Premiere to h.264 video for the web, such that it is displayed the same way in media players (VLC, QT7, QTX, Windows Media Player, ...)

Finally, what parameters do we have to change if we have to import from another type of cam (a 5D3 or FS100, for example) and/or edit in another NLE (FCP7, FCPX, Vegas, ...)?

Thank you so much again.

R.

[quote author=ronfya link=topic=726.msg5401#msg5401 date=1337158698]
Yellow, thank you so much for your explanations and links to even more explanations :D
That is uncovering some foggy issues about conversions.

It is quite strange that necessary operations like import/export/convert are so often overlooked.
We don't know exactly what's going on behind the curtain; when it seems fine we don't question the process, but when it goes wrong, we cannot say why or find a solution.

Well ... more learning is needed.[/quote]

It's really down to where individuals place importance on such things. Hopefully for many it falls low on the list, as it's an obstruction to just getting on and enjoying the creative process; as long as it doesn't cause problems, it doesn't matter. For others, the technical side gets the upper hand.

But I don't think it's too much to ask that what is captured in camera makes it into the NLE without being skewed, and that a decent color-managed workflow presents that data as we wish to deliver it, without first screwing around with it based on assumptions.

[quote]So a practical question now.

In your previous post, you explained 3 possible solutions to take control of proper conversions at each stage of import/export into the NLE.

In order to clarify how it is done exactly [b][u]in practice[/u][/b], could you, for example, detail the specific steps of the process when:
- properly importing videos from a 7D into Premiere
- setting any effects in Premiere for correct preview during editing & grading (if needed)
- exporting correctly from Premiere to h.264 video for the web, such that it is displayed the same way in media players (VLC, QT7, QTX, Windows Media Player, ...)[/quote]

I wouldn't say that there is anything improper done by Premiere with regard to Canon h264; it's doing what is expected for displaying a video, ie: squeezing to 16 - 235 for 'correct' playback on an RGB monitor, for the web etc. It's just doing that at import into the NLE rather than before export. From an editing and grading point of view, why reduce the range of 8bit levels at the beginning of the process if working in a 32bit float neutral-gamut color space in CS5, for example?


[quote]Finally, what parameters do we have to change if we have to import from another type of cam (a 5D3 or FS100, for example) and/or edit in another NLE (FCP7, FCPX, Vegas, ...)?

Thank you so much again.

R.
[/quote]

I think it's up to individuals, if they feel the need, to work through their NLE's possible peculiarities.

Andrew's findings are more with regard to the FS100, GH2 and NEX, as the thread heading suggests, so it would be good to look at native samples from those cameras and get the thread back on track.

[quote author=evanamorphic link=topic=726.msg5409#msg5409 date=1337181257]
Doesn't the YCC -> RGB conversion in Premiere only occur when using RGB labeled effects?  I was under the impression that if you use YUV effects and export using a YUV codec, no such conversion will occur.
[/quote]

Premiere has been renowned for its internal RGB processing, and even though it was supposed to do native YCC processing, the actual number of YCC filters was very small, I believe.

CS5 touts a full 32bit float precision, neutral-gamut, higher-order RGB color space as the working space. :-) This allows processing with either RGB or YCC color filters, and swapping between color spaces without loss.

I only use FCP7 and QT7.
I check my MTS files with Toast Video Player and use FCP Log and Transfer to import into FCP as ProRes.
The MTS looks the same in Toast Video Player and FCP.
Rob

Rob, your test doesn't really prove anything; either your NLE has squeezed levels on import, or they were squeezed at encode out.

The attached images show that there's nothing below about YCC luma level 20, and highlights, such as the light in the 5DtoRGB version, have clipped at 235 rather than 255.

Is there any chance of providing the original MTS to test, or can you confirm you squeezed at encode out?

Right now I'm going to sleep (it's night in Italy); I'll post one tomorrow.

I don't know what happens with this transcoding, but the files from 5DtoRGB look much better for grading to me, and reveal lights and shadows not visible in the original file converted to MOV when imported into FCPX. There is also less noise.

Yellow, thanks for the great info. It's very interesting to know all the details behind the artistic and creative process.

The final step is that we all want our pictures to be viewed at their full dynamic range. So if an output device expects a YCC signal/file (range 16-235), it will correct the range for optimal viewing. When an output device expects an RGB signal/file (0-255), it's already at its full range. So it all goes wrong when there is an RGB signal or file with a range of 16-235: the output display will not correct it properly. Or when there is a full-range YCC signal and the display shows it clipped.

Is this correct? Then it all comes down to knowing the final playout and working in that dynamic range. Encoding on final export or on ingest doesn't matter, as long as all the proper flags are tagged in the metadata of the files.

[quote author=Henk Willem link=topic=726.msg5433#msg5433 date=1337250388]
Yellow, thanks for the great info. It's very interesting to know all the details behind the artistic and creative process.[/quote]

It can be a distraction too. :-) But learning how our camera sources are handled by one's favourite NLE is useful.

[quote]The final step is that we all want our pictures to be viewed at their full dynamic range. So if an output device expects a YCC signal/file (range 16-235), it will correct the range for optimal viewing.[/quote]

Yes.

[quote]When an output device expects an RGB signal/file (0-255), it's already at its full range.[/quote]

But the file, if we're discussing typical camera video, will be YCbCr, not RGB: differing color spaces, so a conversion has to be done, and there are numerous ways to do it. Some good, some bad.

[quote]So it all goes wrong when there is an RGB signal or file with a range of 16-235: the output display will not correct it properly.[/quote]

What is important is that the conversion for display uses the correct luma range for the video file. Converting full-luma-range YCC to RGB while assuming the wrong luma range will result in 'spikey' or combed histograms, depending on the direction of the error.
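The comb is easy to demonstrate. A small sketch (again my own illustration) of stretching legal-range luma out to full range at 8bit precision:

```python
# Stretching legal-range 16 - 235 luma out to 0 - 255 at 8bit
# precision: 220 input codes land on only 220 of the 256 output
# codes, leaving regular gaps, i.e. a combed histogram.

def stretch_legal_to_full_8bit(y: int) -> int:
    return round((y - 16) * 255 / (235 - 16))

used = {stretch_legal_to_full_8bit(y) for y in range(16, 236)}
gaps = [v for v in range(256) if v not in used]
print(len(used), len(gaps))  # 220 36
```

Going the other way (squeezing at 8bit) produces the spikes instead: two source codes collide into one, so some histogram bins get double counts.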

So you find codecs adjusting levels at decompression time to suit the target use, whether that be media playback or the NLE. An h264 file flagged full-luma will automatically get squeezed to 16 - 235. The NLE in this situation is a $1000 media player. jk

[quote]Or when there is a full-range YCC signal and the display shows it clipped.[/quote]

Clipped or crushed in RGB. Generally YCC 16 is crushed to RGB 0 and YCC 235 pushed up to RGB 255 these days, or there's auto scaling like QT's, where the full luma range is scaled into 16 - 235. This is what happens with h264 off DSLRs, and why NLE luma waveforms show 0 to 100 IRE for these sources when they are really full range on the camera memory card. Unfortunately, even removing the full range flag with MP4Box doesn't help; the auto scaling still goes into 16 - 235, just with slightly less levels compression, so I assume QT has a 'clever' levels-detection thing going on. Other decompressors, such as Adobe's MediaCore, treat unflagged h264 as full range and show the full 109 IRE, as long as we've remuxed and removed the flag.

[quote]Is this correct? Then it all comes down to knowing the final playout and working in that dynamic range. Encoding on final export or on ingest doesn't matter, as long as all the proper flags are tagged in the metadata of the files.
[/quote]

Dynamic range is what it is, as captured by the camera. I would disagree with your last comment, though, and it's really why I mention the fullrange flag at all. But it really depends on individual choice, keeping options open, and delivery.

Why grade over 220 8bit levels when 255 are available? Would one do that with an image or photo?

Why alter the camera source at ingest? Why not work with it and allow the user to make the choice in the grade? Squeezing at ingest prevents getting out what was put in, and with picture styles like CineStyle for Canons, or flat profiles on other cameras, the more levels the better.

If someone prefers an alternative grading system and there's no EDL facility, so an intermediate file needs creating, would it be better to maintain the camera source, or to squeeze levels to the delivery format?

If the final playout is to multiple formats, perhaps film print or DCP digital cinema projection, which would be better to have as a mezzanine file: squeezed levels, which can't be expanded back to the original source since the squeeze is destructive, or levels maintained as per the camera source?

Agreed, Yellow, and good questions at the end. Good arguments for keeping the source as it is and making decisions at grading.

The main issue then seems to be the metadata and the proper reading of it by converters, players and NLEs. And the users themselves, who have no clue what to do when settings need to be set manually. And then there's QuickTime, which completely does its own thing. Oh, how hateful that is.

It really takes some effort to understand these issues. Just like the QT gamma shift and plenty more stuff like that :)

