see ya

Everything posted by see ya

  1. Hi, along with camera settings and codec as Axel detailed, banding can be induced at playback on a PC / Mac or other device when a conversion to RGB is done, ie: computer monitor. You mention you don't see the banding going from camera to TV; that would make sense. The banding may not even be in the original MTS files, as you mention, but can be introduced by such things as: handling by the decompressing codec (what version of QT are you using, assuming iMovie uses QT); conversion to an intermediate in the NLE with the de facto QuickTime levels wrangling, either clipping and crushing with older implementations of QT or squeezing with current ones; 8bit precision color processing, with, I assume, no RGB parade, scopes or histogram to keep a check on color and luma levels in iMovie; also, I assume, a non color calibrated monitor and no color management available; wrong conversion to RGB for preview in the NLE due to graphics card settings, ie: BT601/BT709 color matrix skewing the appearance of color and contrast, which means we grade judging via a skewed interpretation, not what we wrote to the card in the camera; rendering to a codec with either no control over final levels at output or a requantized output using a different range of levels to that initially encoded with in camera. All of these lead to an output more prone to banding in RGB playback. A good starting point to find out what iMovie is doing would be to do a test capture with your preferred camera settings, import it into iMovie, render out to the 'best' codec choice iMovie gives you and compare levels between the two. Personally I use Avisynth for this as it gives full control over the process without needing to second guess what the NLE might be doing; it's MS Windows based however. 
It is possible to 'improve' any videos you have that suffer banding, where there are no MTS files any longer to redo the project, by using a debanding filter and a gradient rebuilding / dithering tool, again Avisynth based plugins: noise reduction -> debanding -> gradient rebuilding at 16bit -> adding encoder friendly dithering back to 8bit before reencoding.
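The idea behind that last step, processing at high precision and dithering back down to 8bit, can be sketched in a few lines. This is a toy illustration of the principle only, not the Avisynth plugins themselves: requantizing a smooth high-precision ramp straight to 8 bits produces flat repeated steps (banding), while adding a little noise before rounding breaks the steps up so the encoder sees varied values.

```python
import random

random.seed(42)

# A smooth 16-bit gradient (0..65535) across 512 samples,
# standing in for a "gradient rebuilt at 16bit" image region.
ramp16 = [int(i * 65535 / 511) for i in range(512)]

def to8_truncate(v16):
    # Plain requantization: 16-bit -> 8-bit, banding-prone flat steps.
    return v16 >> 8

def to8_dithered(v16):
    # Add roughly +/-0.5 LSB of 8-bit noise before rounding
    # (a simple random dither; real tools use smarter error diffusion).
    v = v16 / 256.0 + random.uniform(-0.5, 0.5)
    return max(0, min(255, round(v)))

banded = [to8_truncate(v) for v in ramp16]
dithered = [to8_dithered(v) for v in ramp16]

# In `banded` each 8-bit level repeats as an identical flat run;
# in `dithered` the runs are broken up, which reads as smooth gradation.
```

The dithered version carries the same average levels but avoids the uniform steps that show up as visible bands after encoding.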
  2. [quote author=popalock link=topic=776.msg5664#msg5664 date=1338025782] and who has good lens compatibility/options, I'll be leaving Canon with a massive sigh of relief [/quote] And that's one reason I sold my Nikon years ago and went to Canon, apart from Nikons being p-ss poor with video and manual control for video: although Nikon make great glass and there's a lot of it about, Nikon's design 'decision' years ago with regard to the small back-of-lens to sensor distance means that a Nikon camera has a very limited choice of lens mounts that can be utilised without optics in the adaptor. So any Canon owner wanting to move to Nikon has to reinvest in Nikon mount glass; that's a barrier when the cameras' performance is on a par all things considered. This is not true of a Canon though: with the deeper distance to sensor, infinity focus can be achieved with a massive number of lens mounts and options via a simple non-optic adaptor, including using Nikon glass. Except Canon FD needs optics for EOS; a shrewd move in the change to digital? :-)
  3. This kind of concurs with my earlier posts that it could quite possibly be better to import into the NLE as-is: yes, the preview may appear overly contrasty / dark, but at least the files made it into the NLE unaffected, then grade to suit delivery, rather than what happens to Canon & Nikon files, which force the decompressing codec to squeeze levels at import so the display looks right immediately but the image has been squeezed into fewer 8bit levels, like 5DToRGB output. Remuxing Canon & Nikon files with MP4Box to switch off the full range flag prevents this and gives full range levels in the NLE with a waveform like your Original unwrapped. I prefer to adjust levels at 32bit precision in the NLE rather than transcode or squeeze levels at import at 8bit.
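For what it's worth, the arithmetic of that squeeze is easy to verify. A rough sketch in plain Python, assuming the straight linear 0 - 255 to 16 - 235 mapping, shows why doing it at 8bit is destructive: 256 input codes have to share 220 output codes, so the original values can't all be recovered.

```python
def squeeze(y):
    # Full range 0-255 luma mapped linearly into 16-235 (8-bit rounding).
    return round(16 + y * (235 - 16) / 255)

def expand(y):
    # The inverse stretch back out to 0-255.
    return round((y - 16) * 255 / (235 - 16))

squeezed = [squeeze(y) for y in range(256)]
roundtrip = [expand(v) for v in squeezed]

print(len(set(squeezed)))   # 220 distinct codes left, out of 256
# Some inputs collide on the way down, so they can't round-trip exactly.
print(sum(1 for y in range(256) if roundtrip[y] != y) > 0)   # True
```

Done once at 32bit float precision at export, the same mapping costs nothing visible; done at 8bit at ingest, the lost codes are gone for good.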
  4. Thanks, so it's just remuxing them and should take no more than a few seconds per file. The reason I asked was there's a lot of Apple related metadata added, of course, but bitrate, duration etc vary between original and unwrapped. Thanks anyway. There seems absolutely no point in using 5DToRGB for FCPX or Premiere, at least for these MTS files and the settings used, unless it's to solve playback issues.
  5. [quote author=sfrancis928 link=topic=726.msg5590#msg5590 date=1337799274] And I just got the same results in Premiere Pro using the original .MTS file. [/quote] Many thanks for the files. Using Avisynth to check levels, I see identical waveforms for both the original MTS and unwrapped files, as per your FCPX Unwrapped waveform. The 5DToRGB files have been squeezed to 16 - 235, and the conversion to RGB gives quantisation errors, ie: a spikey histogram if treated as a 16 - 235 clip; and if treated as a 0 - 255 YCC to RGB clip it uses fewer 8bit levels in RGB and appears brighter / washed out. I'm not familiar with 'Unwrap', could you explain a little? What waveform did you see for the original MTS, if you don't mind?
  6. Could you make your NEX original MTS, 5DToRGB and UnWrap versions available to download?
  7. [quote author=emgesp link=topic=726.msg5568#msg5568 date=1337706587] This is what happens to all my T2i files on my PC.  VLC doesn't crush the blacks, but it clips the highlights, Windows Media Player crushes blacks and clips highlights, and Quicktime player keeps all the shadow and  highlight detail, but it has gamma curve that makes everything look washed out.  In Sony Vegas I choose full range and then use basic contrast/brightness controls to make everything look right before final encode. [/quote] Most non color managed media players will simply pull shadows down, 16 YCC to RGB 0, and push 235 YCC up to 255 RGB, so crush blacks and squeeze whites rather than clip. But with media players, including VLC, handling also depends on whether hardware acceleration is on or not and the video card's settings, 0 - 255 or 16 - 235 etc. With regard to any Canon or Nikon DSLR video, any decent media player + codec combination will read the fullrange flag in the h264 and squeeze the full levels range into 16 - 235 to ensure nothing is clipped or crushed. The only decent media player I'm aware of is Media Player Classic, free and open source; combined with the madVR color management system and a LUT created by color calibrating your monitor, it will give you ideal playback of HD & SD with correct levels and color rendition. Here's a simple couple of files to test media player handling, such as having HW acceleration on and off etc. The files are created to 'match' the handling of Canon & Nikon DSLR h264. http://www.yellowspace.webspace.virginmedia.com/fullrangetest.zip
  8. [quote author=Francisco Ríos link=topic=726.msg5569#msg5569 date=1337707996] I bought 5toRGB batch. The results are much better . But, can anyone explain me about the decoding matrix? Andrew suggest to use BT.601. Best regards. [/quote] BT601 for the color matrix luma coefficients. There are three bits of metadata in camera video files that relate to 'correct' presentation of color and contrast. You can see all this metadata using something like mediainfo. The decompressing codec in your NLE reads the metadata for hints on how to handle and display the video. 1. The color primaries, for HD this is ITU BT709. This defines the color gamut in relation to the CIE 1931 spectrum. 2. The transfer curve used in the conversion from linear RGB off the sensor to gamma encoded values for the h264 encoder. 3. The color matrix, luma coefficients to be used for the YCbCr <> RGB color space conversions. There are two usual choices, BT601 or, more commonly, BT709; with Canon and Nikon they chose BT601. A common mistake in transcoding is the loss of the color matrix metadata: if BT601 is not specifically flagged in the stream, and transcoding usually removes the BT601 flag, then the default matrix of BT709 will be assumed by the NLE or media player, so as a result certain colors are slightly skewed and contrast is slightly higher. The color skew is seen with pinks going towards orange, so skin tone looks off, and blues go a little green. 5DToRGB, I believe, transfers the BT601 matrix to BT709 for codecs like DNxHD & ProRes that may well never see BT601 flagged and where BT709 is always assumed, to avoid the color skew.
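To put numbers on that skew, here's a minimal sketch, using the textbook BT601 / BT709 luma coefficients and normalized 0.0 - 1.0 values with levels/range handling ignored, of encoding a pinkish tone with one matrix and decoding it with the other:

```python
def rgb_to_ycbcr(r, g, b, kr, kb):
    # Generic RGB -> YCbCr from luma coefficients Kr, Kb (Kg = 1 - Kr - Kb).
    kg = 1.0 - kr - kb
    y = kr * r + kg * g + kb * b
    cb = (b - y) / (2 * (1 - kb))
    cr = (r - y) / (2 * (1 - kr))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    # Exact inverse of the above when the SAME coefficients are used.
    kg = 1.0 - kr - kb
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return r, g, b

BT601 = (0.299, 0.114)     # Kr, Kb
BT709 = (0.2126, 0.0722)

# A pink-ish skin tone, encoded in camera with BT601...
pink = (0.9, 0.6, 0.6)
ycc = rgb_to_ycbcr(*pink, *BT601)

# ...decoded correctly (BT601) vs. wrongly assumed BT709.
correct = ycbcr_to_rgb(*ycc, *BT601)
skewed = ycbcr_to_rgb(*ycc, *BT709)
# `correct` returns the original pink; `skewed` raises red and green
# and drops blue, ie: pink drifts towards orange.
```

Same file, same pixels; only the assumed matrix differs, and the hue shifts exactly the way described above.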
  9. Which camera? They're affected differently. If it's the GH2 or FS100, would you possibly have a sample via Dropbox or somewhere?
  10. [quote author=Henk Willem link=topic=726.msg5433#msg5433 date=1337250388] Yellow, thanks for the great info. It's very interesting to know all the details behind the artistic and creative process.[/quote] It can be a distraction too. :-) But learning how our camera sources are handled by one's favourite NLE is useful. [quote]The final step is that we all want our pictures to be looked at in the full dynamic range. So if a output device expects a YCC signal/file (range 16-235), it will correct the range for optimal view.[/quote] Yes. [quote]When an output device expects a RGB signal/file (0-255), it's already at it's full range.[/quote] But the file, if discussing typical camera video, will be YCbCr not RGB, differing color spaces, so a conversion has to be done; there are numerous ways to do this, some good, some bad. [quote]So it all goes wrong when there is an RGB signal or file with a range of 16-235, the output display will not correct it properly.[/quote] What is important is that the conversion for display uses the correct luma range to suit the video file. Converting full luma range YCC to RGB assuming the wrong luma range will result in 'spikey' or combed histograms depending. So you find codecs at decompression time adjusting levels to suit the target use, whether that be media playback or NLE. So a h264 file flagged full luma will automatically get squeezed to 16 - 235. The NLE in this situation is a $1000 media player. jk [quote]Or there is an full range YCC signal and the display will show it clipped.[/quote] Clipped or crushed in RGB: generally YCC 16 crushed to RGB 0 and YCC 235 pushed up to RGB 255 these days, or auto scaling like QT, where the full luma range is scaled into 16 - 235. 
This is what happens with h264 off DSLRs, and why NLE luma waveforms show 0 to 100% IRE for these sources when they are really full range on the camera memory card. But unfortunately even removing the full range flag with MP4Box doesn't help: the auto scaling still goes into 16 - 235, just with slightly less levels compression; I assume QT has a 'clever' levels detection thing going on. Other decompressors such as Adobe's MediaCore treat non flagged h264 as full range and show the full 109 IRE as long as we've remuxed and removed the flag. [quote]Is this correct? Then it all comes down to knowing the final play out and working in that dynamic range. Encoding on final export or on ingest doesn't matter, as long as all the proper flags are tagged in the metadata of the files. [/quote] Dynamic range is what it is, as captured by the camera. I would disagree with this last comment, and it's really why I mention the fullrange flag at all. But it really depends on the individual's choice, keeping options open, and delivery. Why grade over 220 8bit levels when 255 are available? Would one do that with an image or photo? Why alter the camera source at ingest? Why not work with it and allow the user to make the choice in the grade? It prevents getting out what is put in, and with picture styles like Cinestyle for Canons, or flat profiles with other cameras, the more levels the better? If someone prefers to use an alternative grading system and there's no EDL facility, so an intermediate file needs creating, would it be better to maintain the camera source or squeeze levels to the delivery format? If final playout is multiple formats, perhaps film print or DCP digital cinema projection, which would be better to have as a mezzanine file: squeezed levels that can't be expanded back to the original source, as the squeeze is destructive, or levels maintained as camera source?
  11. Rob, your test doesn't really prove anything; either your NLE has squeezed levels on import or at encode out. The attached images show that there's nothing below about YCC luma level 20, and highlights such as the light in the 5DtoRGB version have clipped at 235 rather than 255. Is there any chance of providing the original MTS to test, or can you confirm you squeezed at encode out?
  12. [quote author=evanamorphic link=topic=726.msg5409#msg5409 date=1337181257] Doesn't the YCC -> RGB conversion in Premiere only occur when using RGB labeled effects?  I was under the impression that if you use YUV effects and export using a YUV codec, no such conversion will occur. [/quote] Premiere has been renowned for its RGB internal processing, and even though it was supposed to do native YCC processing the actual number of YCC filters was very small, I believe. CS5 touts a full 32bit float precision RGB neutral gamut higher order color space as the work space. :-) Which allows processing with RGB or YCC color processing and swapping between color spaces without loss.
  13. [quote author=ronfya link=topic=726.msg5401#msg5401 date=1337158698] Yellow, thank you so much for your explanations and links to even more explanations :D That is uncovering some foggy issues about conversions. It is quite strange that necessary operations like import/export/convert are so often overlooked. We don't know exactly what's going behind the curtain, when it seems to be fine we don't question the process, but when it is wrong, we cannot say why and find a solution. Well ... more learning is needed.[/quote] It's really down to where individuals place importance on such things; hopefully for many it falls low down the list, as it's an obstruction to just getting on and enjoying the creative process, and as long as it doesn't cause problems it doesn't matter; for others the technical side gets the upper hand. But I don't think it's too much to ask that what is captured in camera makes it into the NLE without being skewed, and that a decent color managed workflow presents that data as we wish to deliver it without first screwing around with it based on some assumptions. [quote]So a practical question now. In your previous post, you explained 3 possible solutions to take control of proper conversions at each stage of import/export into the NLE. In order to clarify how it is done exactly [b][u]in practice[/u][/b], could you for example detail the specific steps of the process when - importing properly videos from a 7D into Premiere - setting any effects in Premiere for correct preview during editing & grading (if needed) - exporting correctly from Premiere into h.264 video for the web or such as it is displayed the same way on multimedia players (VLC, QT7, QTX, Win media player, ...)[/quote] I wouldn't say that there is anything improper done by Premiere with regard to Canon h264; it's doing what is expected for displaying a video, ie: squeezing to 16 - 235 for 'correct' playback on an RGB monitor, for the web etc. 
It's just doing that at import into the NLE rather than before export; from an editing and grading point of view, why reduce the range of 8bit levels at the beginning of the process if working in a 32bit float neutral gamut color space in CS5, for example? [quote]Finally what parameters do we have to change if we have to import from another type of cam (5D3 or FS100 for example) and/or edit in another NLE (FCP7, FCPX, Vegas, ...) Thank you so much again. R. [/quote] I think it's up to individuals, if they feel the need, to work through their NLE's possible peculiarities. Andrew's findings are more with regard to the FS100 & GH2 & NEX, as the thread heading suggests, so it would be good to look at native samples from those cameras and get the thread back on track.
  14. [quote author=ronfya link=topic=726.msg5389#msg5389 date=1337117322] whohooo Yellow, Your post has been very enlighting to me. Thank you very much for that ... even though I didn't yet catch all of it and still have my load of questions pending. :P So here I go. 1. Where did you learn about all this ? Do you have any source on the net or a good book to recommend for reading and learning ?[/quote] The forum at Doom9.org http://forum.doom9.org/ Any books by Charles Poynton. That's not to say I've learnt anything. [quote]2. About this [quote]For many 8bit cameras this is captured with the luma (Y) free to go full range 0 - 255 and chroma 16 - 240.[/quote] Does this mean that cameras like 5D2, 5D3, 7D, D800, FS100, GH2 actually record full range luma and only limited range chroma on the memory card ?[/quote] Chroma is limited to 16 - 240, but that's plenty of room for ITU 601 or 709 color space. If chroma were full range, 1 - 254 that is, then we'd be in a different gamut, ie: xvColor / xvYCC, 1.8x more colors than ITU 709, but you'd need a better monitor to display those. :-) [quote]Where did you get the info about what is exactly recorded by the cams (code, range, flags ...) ? Any trusful source ?[/quote] mediainfo http://mediainfo.sourceforge.net/en ffmpeg: http://ffmpeg.org/ [quote]3. About this [quote]For web based playback majority is 16 - 235, not full range, Flash supports full range in some versions i think. So if encoding for the web squeeze luma 16 - 235 for accurate playback.[/quote] Does this mean that if one uploads a video to Vimeo or Youtube, these websites read the vids supposing they were encoded within limited range BT709 ?? Consequence being that no problem occurs if that was the case, otherwise the display is wrong ?? Moreover, does this mean that when exporting for the web, we HAVE to export AND convert to limited range spaces such as BT709 ? Or export in whatever color space with the appropriate flag ? 
(this would be a color managed video, but as I understand, we would not have this discussion if CM was a standard in video today) [/quote] If I understand correctly, 16 - 235 luma in YCbCr is expected for correct contrast levels at playback on the web. [quote]4. About this [quote]The thing to seperate here is display / playback verses aquisition, getting as much data our of the camera to grade, frankly I don't give a s--t about 16 - 235 being 'correct' to specification convertion of YCC to RGB until I encode out final delivery and squeeze levels to 16 - 235 at 32bit precision rather than have some cheap 8bit codec squeeze at ingest into the NLE. I prefer control over the process, when and where and how.[/quote] Does this mean that when programs such as Premiere CS5.5 reads natively h.264 video straight from a 7D (without transcoding), it already squeezes it to 16-235 to display the video for editing ? The app forces us to edit in the 16-235 space and export in that space too instead of editing in the 0-255 space and converting to 16-235 during export ??[/quote] Yes, Premiere CS5 squeezes the full range luma of Nikon and Canon h264 on import because of the VUI Options metadata 'fullrange' flag attached to the MOV container. It's set on by the camera encoder. So the Premiere CS5 h264 decompressor assumes levels will exist outside of 16 - 235, so in order to maintain 'correct' contrast for RGB preview it squeezes the luma so it can do the 16 - 235 YCC to 0 - 255 RGB color space conversion. But this also means that any encoding from Premiere is restricted to 16 - 235, rather than the option to encode out 0 - 255 in h264 and set the flag to 'on'. Basically you can't get out what you put in. Output contrast is the same but the range of 8bit levels used is reduced compared to the camera source. 
[quote]On the other hand, if we transcode to formats such as ProRes before importing into Premiere, the edit in Premiere is done in the 0-255 range with the conversion to 16-235 at the end ?[/quote] Transcoding should be considered a last resort to resolve a specific problem, generally as a result of playback issues. A simple remux of Canon and Nikon MOVs using a VUI Options patched build of MP4Box, switching the fullrange flag off, gives us no luma squeeze in the NLE, but the wrong contrast in the preview, due to it now stretching the 16 - 235 zone of our 0 - 255 luma out to 0 - 255 RGB. But we can now work on the full luma levels as shot in camera at 32bit precision and grade as required, including squeezing luma into 16 - 235 for delivery if and as required. [quote]5. About AVIsynth, ffmepg, AVSPmod ... Do you know of any equivalent to these in OSX ? [/quote] Not to my knowledge; ffmpeg I'd imagine yes, but not the others. If you're concerned simply about luma levels, getting MP4Box for Mac would suffice. But unless you see specific problems in your NLE then the luma squeeze is not necessarily going to cause much of a problem. [quote]6. Do you have any practical workflow tutorial from start to finish anywhere ? Hopefully ? :D I know these are a lot of questions, but correctly understanding this issue and the good workarounds would be so helpful to releive what is just a pain in the ass. Thank you very much in advance ! R. [/quote] Sorry, no tutorials, but if you are seeing a specific problem then I could perhaps help.
  15. [quote author=Leo James link=topic=726.msg5369#msg5369 date=1337099934] Anyone Tried Neo Scene from cuneiform ? will that retain the information lost ? [/quote] David, the CEO at what was Cineform, advised that the full levels are squeezed due to suggested problems further down the process; whether that relates to an individual's workflow is only going to be found by testing. If I understand correctly, luma is squeezed, but into a 10bit range of levels, not 8bit.
  16. The basics: YCbCr (YCC) video comprises a luma channel (Y) and two chroma channels, Cb & Cr (blue & red). For many 8bit cameras this is captured with the luma (Y) free to go full range 0 - 255 and chroma 16 - 240. It's very hard to get chroma anywhere near saturated, ie: to get close to the 16 - 240 limits, especially shooting flat. The metadata encoded into these files comprises three main elements with regard to reproducing the image. 1. The color primaries that define the gamut. For many cameras this is BT709 / sRGB. 2. The transfer curve applied to the linear data off the sensor to gamma encoded video like h264. Generally BT709. 3. The color matrix luma coefficients, that tweak the results of 1 & 2 for final display characteristics. BT601 and BT709. When HD arrived on the scene and BT709 was introduced, there was argument over which was the best color matrix to use, BT601 or BT709. Some encoders chose BT601, others BT709. This is color matrix coeffs only, not color space or luma levels. Display a BT601 matrix encoded source like Canon or Nikon h264 as BT709 on an RGB display and watch skin tones turn yellow/orange. :-) A common f--k up when transcoding. When we view our videos on RGB displays a color space conversion has to be done. By specification the conversion from YCC to RGB is to map YCC 8bit value 16 to RGB value 0, so YCC levels 0 to 15 get compressed to RGB black. Similarly YCC value 235 gets mapped to RGB value 255, white; YCC levels 236 to 255 get compressed to white. This is where the under and over brights description comes from. This is all correct and proper at final playback if your display device / color managed media player etc is calibrated to 16 - 235, as many home theatres etc are, including playback on PCs. ie: YCC 16 - 235 to 0 - 255 RGB. It's also dependent on your underlying graphics card setup and whether HW acceleration is on or off. 
For web based playback the majority is 16 - 235, not full range; Flash supports full range in some versions I think. So if encoding for the web, squeeze luma to 16 - 235 for accurate playback. The whole YCC to RGB mapping thing: if we want the full 8bit luma range of the YCC, 0 - 255, represented in RGB as 0 - 255, then we have three options: 1. Squeeze full range luma to 16 - 235 as the last operation before encoding out for delivery, assuming we've been previewing with a LUT'd color managed app, or by applying a 0 - 255 to 16 - 235 levels effect and grading under it with full 0 - 255 levels rather than working on squeezed 16 - 235 sources. Then in the conversion from YCC to RGB, levels get expanded out to RGB 0 - 255. OR 2. Keep 0 - 255 full levels through the NLE, encode out to h264 and set the h264 VUI Option 'fullrange' to on, so that on decompression a codec will squeeze luma levels to 16 - 235 ready for the conversion to RGB, giving full levels on playback. This is how Nikon and Canon h264 is set up. OR 3. Use a proper color managed media player like Media Player Classic with a LUT'd color management system and calibrated screen / projector and run with full luma levels. This rules out codecs that can't be controlled by the color management app to provide the levels handling the media player configuration requires. For true 'accurate' color / contrast appearance in RGB playback it's important that the correct color matrix is used in the conversion to RGB. ie: Canons and Nikons, apart from the 5D MkIII, use a BT601 color matrix, declared in the metadata; the 5D MkIII has moved to BT709 for whatever reason. My personal take on it is that as so many people were transcoding to whatever codecs, the BT601 color matrix coeffs metadata got lost, so if a codec is not told what color matrix to use it chooses BT709 based on pixel count, ie: HD resolutions. The result of using the wrong matrix is pink skin tone goes yellow/orange and contrast increases slightly. 
Then we get comments and judgements about camera capability and aesthetics based on incorrect conversion of the source files rather than an accurate representation. You know, comments like 'Canon skin tone looks worse than another camera'. Not the camera, but the skewed transfer to RGB. ;-) The thing to separate here is display / playback versus acquisition, getting as much data out of the camera to grade; frankly I don't give a s--t about 16 - 235 being 'correct' to specification conversion of YCC to RGB until I encode out final delivery and squeeze levels to 16 - 235 at 32bit precision, rather than have some cheap 8bit codec squeeze at ingest into the NLE. I prefer control over the process: when and where and how. It's crazy to screen grab from NLEs or media players for analysis of a camera's capability; fine to show how a particular player handles the source, but not for analysis. For that we need full control over the process. I use: 1. A free app called mediainfo to get details of the camera source. 2. ffmpeg in combination with AVISynth and AVSPmod to handle decompression and preview, under full control, setting the color matrix to use etc based on mediainfo / camera specification info. AVIsynth does nothing to the source: no assumptions, no squeezing levels, no cropping the 1088 h264 stream to 1080, no conversion to YUY2, no nearest neighbour / bilinear / bicubic inconsistency. I can then see what is really there compared to what an NLE decides to give me. Having extracted properly converted images with the right luma levels and color matrix, and a waveform / histogram that represents the camera source rather than a skewed media player / NLE output, I have reference images to compare with other playback handling, instead of 'this player does this and this player does that' with no idea which is 'correct'. Canon & Nikon h264 has additional metadata: a full range flag.
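The 16 -> 0 / 235 -> 255 mapping described in the post above is just a linear stretch with clipping at the ends. A minimal sketch of the luma channel only (my own illustration of the by-the-book studio-range conversion, not any particular codec's implementation):

```python
def ycc_luma_to_rgb(y):
    # Studio-range YCC luma (16-235) stretched to full RGB (0-255);
    # anything outside 16-235 lands outside 0-255 and gets clipped.
    v = round((y - 16) * 255 / (235 - 16))
    return max(0, min(255, v))

print(ycc_luma_to_rgb(16))    # 0   (black point)
print(ycc_luma_to_rgb(235))   # 255 (white point)
print(ycc_luma_to_rgb(8))     # 0   (shadow detail below 16 is crushed)
print(ycc_luma_to_rgb(250))   # 255 (highlight detail above 235 is clipped)
```

Which is exactly why full range camera luma, decoded with this mapping, loses its under and over brights unless the levels are dealt with first.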
  17. QT converts sources to YUY2 4:2:2 interleaved and in the process squeezes luma to 16 - 235. Canon & Nikon h264 has a flag set in the metadata to signal full range levels, so QT, Premiere CS5 and FCPX all squeeze DSLR luma to 16-235. Thomas Worth, the creator of 5DToRGB, is well aware of this. However, the last time I tested 5DToRGB it still squeezed luma; I was using a beta though. For Canon & Nikon sources, transcoding via 5DToRGB is not necessarily required to maintain levels; just remuxing and switching the fullrange flag metadata off via a custom build of MP4Box is sufficient. The problem with media players like VLC is that they are not color managed, so the typical 16-235 mapped to 0-255 can happen; also, playback handling can be at the mercy of video card configuration, whether or not hardware acceleration is used, and whether overlay is used. Switch off HW accel in VLC and levels will be full range. I have a link to some full range test files on my blog at www.blendervse.wordpress.com under the 'waiving full range flag' post, and a couple of threads at Cinema5D in the Picture Styles sub forum; I would post direct links but I'm currently on the move on a stupid iPhone keyboard. :-)
  18. [quote author=Andrew Reid link=topic=724.msg5336#msg5336 date=1337036645] 5D3 and GH2 are 709 in-camera, not 601... As far as I am aware.[/quote] I don't know how long it's going to take me to get this through my thick skull: yes, Canon have changed to BT709 color matrix coeffs for the Mk III, whereas the Mk II, 7D, T2i etc use BT601. I even posted that the matrix had changed in a comment at Prolost in response to Dan Chung's test files back when the Mk III came out; still forget. :-) But the matrix is more to do with tweaking the color in specific hues, pinks/orange, blue/green, and very little to do with luma levels, full range etc. All these cameras are BT709 color primaries, defining the extent of the gamut in the 1931 CIE color space horse shoe. But that is a separate thing altogether. Thanks for the correction. :-) [quote]The FS100 picture profile settings have an option for 709 mode. It looks dreadful with it enabled.[/quote] I think that probably relates to the rec709 curve on the linear data rather than the color matrix. Or perhaps its primaries, not knowing the camera. [quote]I'll look into MP4Box, cheers.[/quote] It's a special patched build and also commandline. So to use it would be something like: mp4box -add my.MOV:fullrange=off -new my.mp4 The build doesn't like the sowt audio though, but will give the necessary luma levels. @MattH there's a bit here at my blog that may help http://blendervse.wordpress.com/2012/04/02/waiving-the-fullrange-flag/ Although NLE specific, the gist is the same.
  19. [quote author=Andrew Reid link=topic=724.msg5332#msg5332 date=1337025294] [quote author=richg101 link=topic=724.msg5327#msg5327 date=1337017730]what post processing are you applying to this footage?  any de-noise in software?[/quote] None. Straight off the card and onto Vimeo. [/quote] Except the 5D MkIII full range levels have been squeezed into 16 - 235, skewing results, whereas the FS100 full luma levels have been left as shot in camera, and for whatever reason the GH2 levels are 16 - 235 as well. The 5D MkIII MOV, as well as Nikon's MOVs, uses h264 VUI Options metadata, including a fullrange flag set 'on' by the camera firmware; this forces the decompressing codec to squeeze luma levels, evident in the image extracts in the zip below showing the luma waveform. Notice the fine horizontal lines at regular intervals in the waveform on the 5D MkIII image, clearly showing luma levels have been compressed. http://www.yellowspace.webspace.virginmedia.com/bike.zip The images were extracted using AVISynth. If the Canon MOV is remuxed with a h264 VUI Options patched build of MP4Box here: http://komisar.gin.by/tools/ setting the fullrange flag to 'off' in the process, then the full levels as shot in camera are left well alone by the decompressing codec and the outcome would be very similar to the FS100, including less contrast and a brighter image than the 5D MkIII is showing currently. As an aside, the GH2 levels in the attached image show as restricted to 16 - 235. I'm not aware of the MTS container holding a fullrange flag, and looking at the waveform in comparison to the 5D MkIII there are no fine regular horizontal lines to suggest luma is getting squeezed; gradation looks 'fine', like the FS100 and what the 5D MkIII would look like if decompressed correctly, but the GH2 looks to capture to a restricted 8bit range? 
A further minor skew is that Canon native files use BT601 luma coeffs, whereas the GH2 and FS100 are BT709, so there will be a slight skew in contrast / color of a Canon source due to the Vimeo mp4 being flagged as BT709 color matrix, which is incorrect for the Canon file unless a BT601 to BT709 color matrix transfer has been done?
  20. @jsmiller, be aware that whichever you use, you really need to take the WB reading from the subject position. It's not such a problem in broad daylight, but if your subject is under specific lighting, that's where you need to white balance from. So with a cap you need to carry your camera to the subject position, point the camera back at the shooting position and do the custom white balance under that specific lighting, which can be a pain if you're on a tripod or have a lens hood or a VariND filter, whereas a card is just placed at the subject position. But then you have to carry a card rather than a small lens cap. There are pros and cons both ways, and they're cheap enough to have both.
  21. Regardless of grading, if the NLE chooses the wrong color matrix, i.e. not the one encoded in camera, then the NLE preview is skewed and so is the starting point for grading. For example, Nikons use a BT601 color matrix like Canons; this is unusual for HD, which is usually BT709. Transcode a Nikon or Canon MOV to ProRes or DNxHD and it'll be assumed BT709, as no one transfers the matrix, so pinks turn to orange. It looks orange in the preview when it's really pink in the camera file, so you try to correct for orange in the grade to get a more natural skin tone, when it was more natural in the camera encoding all along. I'd hazard a guess they transcoded to ProRes before going to FCPX, lost the matrix, the NLE assumed BT709 based on the pixel count, and skin tone turned orange. Example: 'Left' is the conversion to RGB using BT601, ie: the correct matrix as encoded in camera; 'Right' is the wrong color matrix, flagged as BT709 in their output for Vimeo. [img]http://www.yellowspace.webspace.virginmedia.com/BT601-BT709.jpg[/img] Looking at the image on a color managed display I'd suggest the 'Left' is the more natural.
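The matrix-mismatch hue shift can be shown with a few lines of arithmetic: encode an RGB pink with BT601 coefficients (as a Canon or Nikon would), then decode it back assuming BT709 (as an NLE does after a transcode loses the matrix info). This sketch uses full-range coefficients and a made-up pink; real pipelines add limited-range scaling, so treat the numbers as illustrative only.

```python
# Sketch: encode a pink with BT601, decode assuming BT709.
# Full-range coefficients; sample color is hypothetical.

def rgb_to_ycbcr_601(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) / 1.772 + 128
    cr = (r - y) / 1.402 + 128
    return y, cb, cr

def ycbcr_to_rgb_601(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clip = lambda v: max(0, min(255, round(v)))
    return clip(r), clip(g), clip(b)

def ycbcr_to_rgb_709(y, cb, cr):
    r = y + 1.5748 * (cr - 128)
    g = y - 0.1873 * (cb - 128) - 0.4681 * (cr - 128)
    b = y + 1.8556 * (cb - 128)
    clip = lambda v: max(0, min(255, round(v)))
    return clip(r), clip(g), clip(b)

pink = (255, 105, 180)
encoded = rgb_to_ycbcr_601(*pink)

right = ycbcr_to_rgb_601(*encoded)   # matrix matches: pink survives
wrong = ycbcr_to_rgb_709(*encoded)   # matrix mismatch: hue drifts

print(right)   # (255, 105, 180)
print(wrong)   # green lifts ~20 codes, skewing the pink
```

Same YCbCr numbers on disk, two different colors on screen, and neither waveform nor vectorscope in the NLE will warn you which interpretation is in play.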
  22. Hi, how are you judging your white balance, from jpgs or video? Have you tried white balancing via a white card? Re: old Nikon glass, one small benefit of a Canon is that you can put pretty much any lens on it, thanks to its shorter distance from lens mount to sensor, whereas you can't with a Nikon. Not sure if your Nikon has full manual exposure control, which is a must for video, so don't rule out Canons just because you have Nikon glass if you consider changing camera at a later date.
  23. Blah, nothing to see; there's no native file off the camera, just a reencode called 'original' at half the bitrate using a lower AVC profile, the levels have been crushed, and it looks oversharpened in camera judging from the skin texture. What can anyone judge from that? Before skin tone gets mentioned, it also looks like the wrong color matrix is declared in the Vimeo encodings, based on the 'orangeness' of the skin; use BT601 and it looks more 'correct'. Subjectively.
  24. The imaging circle laid down by a lens designed for full frame or 35mm onto the film plane or sensor falls off the edge of a smaller sensor, hence the idea of crop. But forget crop factors, it's camera manufacturers' speak, an attempt to dumb things down for consumers into something easier to grasp than the relationship between field of view, focal length and sensor size. If you put an EF-S lens on a T2i there is no dumb ass crop factor to consider, as the lens system is designed for the APS-C sensor. That's what the EF-S system exists for; it's particularly important at the wide end.
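The field-of-view relationship that the "crop factor" shorthand papers over is just one line of trigonometry. A rough sketch, assuming a 36mm-wide full-frame sensor and a 22.3mm-wide Canon APS-C sensor (the T2i's nominal width):

```python
import math

def hfov_degrees(focal_mm, sensor_width_mm):
    """Horizontal angle of view for a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

FULL_FRAME = 36.0   # mm, horizontal
CANON_APSC = 22.3   # mm, horizontal (assumed T2i sensor width)

# Same 50mm lens, different sensors: the smaller sensor simply sees less.
print(hfov_degrees(50, FULL_FRAME))   # ~39.6 degrees
print(hfov_degrees(50, CANON_APSC))   # ~25.1 degrees

# To match the full-frame view you need roughly focal / (36/22.3) ~= 31mm,
# which is where the ~1.6x "crop factor" number comes from.
print(hfov_degrees(50 / (FULL_FRAME / CANON_APSC), CANON_APSC))  # ~39.6
```

So the "factor" is only a conversion between sensor widths; with EF-S glass designed for the smaller circle there's nothing to convert, you just read the field of view off the focal length directly.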
  25. [quote author=Axel link=topic=290.msg4295#msg4295 date=1334810129] @yellow GH2s AVCHD and 5Ds H.264 both use YCbCr, and both are treated in an RGB environment (the colorspace of your NLE, your monitors). No differences to be expected from there. Correct me. [/quote] Yes, however the color matrix luma coefficients differ between sources: the GH2 is BT709, Canon's is BT601, and Canon h264 carries metadata in the header of the MOVs that signals 'full range' luma, whereas the GH2's MTS doesn't. A color matrix is used in the conversion to RGB, and if the wrong matrix is assumed by the NLE then pink hues shift to orange, which affects skin tone. The full range flag causes the decompressing codec to squeeze the full range luma into 16 - 235 at ingest. So there is room for misinterpretation; I'm not saying it's always that way, just that there are affecting factors between YCbCr native files and the RGB preview and RGB based scopes. The image illustrates the various combinations of color matrix coefficients and luma handling possible, leading to what could be incorrect assumptions about a camera's capability.
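Those combinations of matrix and range handling can be enumerated directly. A minimal sketch, decoding one hypothetical YCbCr pixel under each of the four matrix/range assumptions, with standard (illustrative) coefficients:

```python
# Sketch: one YCbCr pixel decoded under each matrix/range assumption.
# Coefficients are the standard ones; the sample pixel is made up.

def decode(y, cb, cr, matrix, full_range):
    if full_range:
        yv, cbv, crv = float(y), cb - 128.0, cr - 128.0
    else:
        # limited range: 16-235 luma, 16-240 chroma, expanded to 0-255
        yv = (y - 16) * 255.0 / 219.0
        cbv = (cb - 128) * 255.0 / 224.0
        crv = (cr - 128) * 255.0 / 224.0
    if matrix == "BT601":
        r = yv + 1.402 * crv
        g = yv - 0.344136 * cbv - 0.714136 * crv
        b = yv + 1.772 * cbv
    else:  # BT709
        r = yv + 1.5748 * crv
        g = yv - 0.1873 * cbv - 0.4681 * crv
        b = yv + 1.8556 * cbv
    clip = lambda v: max(0, min(255, round(v)))
    return clip(r), clip(g), clip(b)

pixel = (158, 140, 197)   # hypothetical pinkish sample
results = {}
for matrix in ("BT601", "BT709"):
    for full in (True, False):
        results[(matrix, full)] = decode(*pixel, matrix, full)
        print(matrix, "full" if full else "limited", results[(matrix, full)])

# Four different RGB values from the one encoded pixel: guess the matrix
# or the range wrong and you grade against a skewed preview.
print(len(set(results.values())))   # 4
```

One set of YCbCr numbers, four plausible RGB interpretations; which one you see depends on what the decoder, graphics card and NLE each assume.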