Woohoo Yellow,
Your post has been very enlightening to me. Thank you very much for that ... even though I haven't caught all of it yet and still have a load of questions pending. :P
So here I go.
1. Where did you learn about all this? Do you have any source on the net, or a good book to recommend for reading and learning?
2. About this
[quote]For many 8bit cameras this is captured with the luma (Y) free to go full range 0 - 255 and chroma 16 - 240.[/quote]
Does this mean that cameras like the 5D2, 5D3, 7D, D800, FS100 and GH2 actually record full-range luma and only limited-range chroma to the memory card?
Where did you get the info about what exactly the cameras record (codec, range, flags ...)?
Any trustworthy source?
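For what it's worth, here is how I've been poking at the files myself to see what gets flagged; a minimal sketch assuming ffprobe is installed, with "MVI_0001.MOV" standing in for a real clip (and no idea whether every camera actually writes the flag):
[code]
# Sketch: read the color range flag a camera wrote into its file.
# Assumes ffprobe is on the PATH; "MVI_0001.MOV" is a made-up clip name.
import json
import subprocess

def probe_color_range(path):
    out = subprocess.check_output([
        "ffprobe", "-v", "quiet",
        "-print_format", "json",
        "-show_streams", path,
    ])
    for stream in json.loads(out)["streams"]:
        if stream.get("codec_type") == "video":
            # "pc" = full range (0-255), "tv" = limited range (16-235);
            # many files carry no flag at all.
            return stream.get("color_range", "not flagged")

print(probe_color_range("MVI_0001.MOV"))
[/code]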
3. About this
[quote]For web based playback majority is 16 - 235, not full range, Flash supports full range in some versions i think. So if encoding for the web squeeze luma 16 - 235 for accurate playback.[/quote]
Does this mean that if one uploads a video to Vimeo or YouTube, these sites decode it assuming it was encoded in limited-range BT.709? The consequence being that nothing goes wrong if that was indeed the case, but otherwise the display is wrong?
Moreover, does this mean that when exporting for the web, we HAVE to export AND convert to a limited-range space such as BT.709?
Or can we export in whatever color space with the appropriate flag set? (That would be color-managed video, but as I understand it, we wouldn't be having this discussion if CM were a standard in video today.)
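To check my own understanding of the squeeze itself, here is the 8-bit arithmetic as I currently picture it (just a sketch of my mental model, not a claim about how any particular encoder does it):
[code]
# Full range -> limited range luma, 8 bit: map 0-255 onto 16-235,
# i.e. scale by 219/255 and offset by 16.
def full_to_limited_luma(y_full):
    return round(16 + y_full * 219 / 255)

# Spot checks: black, mid grey, white.
for y in (0, 128, 255):
    print(y, "->", full_to_limited_luma(y))  # 0 -> 16, 128 -> 126, 255 -> 235
[/code]
(I gather ffmpeg's scale filter can do this with in_range=full:out_range=limited, though I haven't tested that on every build.)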
4. About this
[quote]The thing to seperate here is display / playback verses aquisition, getting as much data our of the camera to grade, frankly I don't give a s--t about 16 - 235 being 'correct' to specification convertion of YCC to RGB until I encode out final delivery and squeeze levels to 16 - 235 at 32bit precision rather than have some cheap 8bit codec squeeze at ingest into the NLE. I prefer control over the process, when and where and how.[/quote]
Does this mean that when a program such as Premiere CS5.5 natively reads H.264 video straight from a 7D (without transcoding), it already squeezes it to 16-235 to display the video for editing? So the app forces us to edit in the 16-235 space and export in that space too, instead of editing in 0-255 and converting to 16-235 during export?
On the other hand, if we transcode to a format such as ProRes before importing into Premiere, is the edit in Premiere done in the 0-255 range, with the conversion to 16-235 happening only at the end?
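Trying to convince myself why the precision of the squeeze matters, I wrote this toy comparison (my own numbers and function names, purely illustrative):
[code]
# Toy sketch: squeezing 0-255 into 16-235 with 8-bit rounding merges
# neighbouring code values (banding); a 32-bit float squeeze keeps them
# distinct until the one rounding step at final delivery.
def squeeze_8bit(y):
    return round(16 + y * 219 / 255)   # rounds away the fraction immediately

def squeeze_float(y):
    return 16.0 + y * 219.0 / 255.0    # keeps full precision for grading

merged = sum(1 for y in range(255) if squeeze_8bit(y) == squeeze_8bit(y + 1))
print(merged, "of 255 neighbouring input pairs collapse to one value at 8 bit")
print(squeeze_8bit(100), "vs", squeeze_float(100))  # 102 vs 101.88...
[/code]
If that is right, it would explain why you'd rather squeeze once at 32-bit on export than let an 8-bit codec do it at ingest.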
5. About AviSynth, ffmpeg, AvsPmod ...
Do you know of any equivalents to these on OS X?
6. Do you have any practical workflow tutorial, from start to finish, anywhere? Hopefully?
:D
I know these are a lot of questions, but correctly understanding this issue and the right workarounds would be so helpful to relieve what is just a pain in the ass.
Thank you very much in advance !
R.