Everything posted by sunyata

  1. ah, i think he was saying "working space". i don't know of any apps that have a 10bit working space, except maybe the older versions of flame. i was going to include shake's "bytes" node, but actually shake also only had 8, 16 or 32. either way, 8, 16, or 32 are usually your options for working space depths.
  2. ru asking me that? 16bit kills banding for CG and 10bit is good enough for film/video. largely because of, you guessed it, grain.
  3. yea, the whole add-noise-to-remove-banding workflow is not something i would recommend, though i've used it in the past. the problem is that banding can stay consistent across motion while noise is always dancing around, so your eye picks out the telltale pattern of banding as it moves but tunes out the noise. the technique of course also adds a lot of additional noise to footage that is probably already too noisy! the unfortunate answer is to avoid introducing banding in the first place. that's another topic. but you shouldn't need a product to do it, most programs have a grain filter or noise filter (quick ffmpeg sketch below). i'm also very leery of de-noise as part of any standard workflow. de-noise is an imperfect science and usually creates an unsharp mask effect.
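     for what it's worth, if you do want to experiment, ffmpeg's noise filter is one free way to add temporal grain. this is just a sketch, the file names are placeholders and the strength is something you'd tune by eye:

     ffmpeg -i input.mov -vf noise=alls=6:allf=t+u output.mov

     "alls" sets the noise strength for all planes and "allf=t+u" makes it temporal and uniform, so the noise re-randomizes every frame instead of sitting still like a fixed pattern.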
  4. You're right about this, but now LUTs are used for simulating film stocks or looks.. it used to be that you used a LUT for matching film and a lab's process while you worked, and then you would take it off when you rendered. Just keep in mind that in order for the LUT to show an accurate film stock profile etc., you need to make sure your scene-referred gamut is correctly matched (i.e. what is coming from the camera). There's a quick example of applying a LUT outside of a grading app below.
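     If you want to try a LUT outside of a grading app, ffmpeg's lut3d filter will apply a .cube file. A minimal sketch, the file names are placeholders, and it assumes the LUT was built for the gamut/log curve of the footage you feed it (per the caveat above):

     ffmpeg -i input.mov -vf lut3d=my_film_stock.cube -c:v prores -pix_fmt yuv422p10le output.mov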
  5. Other general suggestions, aside from the video card emphasis, would be to think about a pedestal server chassis, an extended board, lots of PCI slots, a really nice power supply and internal backup! Anyway, those are things that I think make a workstation functional, I only notice clock speed when I'm cursing a render that is hanging on a single frame. Shouldn't be an issue with grading and editing.
  6. Haswell has advantages in speed but it's marginal; if you want to save a few hundred bucks and put it into a video card, that will be a more noticeable investment than an extra 0.2GHz, whatever architecture you choose. Setting up a small 1U render farm, I used a mix of Sandy Bridge and Ivy Bridge (Haswell wasn't out yet) and they are still going strong. Not yet to the point where the render speed difference compared to Haswell is worth upgrading. One thing about Ivy Bridge, it seems to run hotter for some reason, so get a good heatsink! For a desktop computer, a Quadro card can cost more than your entire system, and since more apps are writing acceleration that uses CUDA cores over the CPU, I'd say focus more on the video card and drives than the processor.
  7. Yes, I do, the moire was subtle btw and the picture looks incredibly sharp otherwise. Just doing a little test. Not pointing out any "sampling issues" that are deal-breakers with the camera.. I'm not trying to criticize the quality of the GH4. No disrespect. Simply noticed that there were subtle scanlines in the signal from the UHD to 1080p scaling and this is something that might make a difference for anyone that is considering getting the Ninja Shogun.
  8. Jacob-- Got it, that makes sense about the lag.. thanks. jcs-- Watching that Zacuto video just to hear the Panasonic guy say it :) 09:52 <= "If you want access to a true 4:2:2, at 10bit, even at 4k.. you can output that over the HDMI port." Said the guy wearing a Panasonic shirt.
  9. Ahh, so it's the GH4 that is undersampling.. from what you wrote, it sounded like you were outputting UHD from the camera. just found this: "Does the GH4 output 4K over the micro-HDMI port? When the camera is set to record in 4K internally, the onboard HDMI output is down-converted automatically to 1080p. In playback mode the HDMI output is 4K capable (relaying the 8bit 4:2:0 footage from SD card). Only the DMW-YAGH add-on base can output 4K (uncompressed, 10bit 4:2:2) from the GH4. However the micro-HDMI port is 8bit / 10bit switchable and can deliver impressive 1080p in 10bit 4:2:2 format, uncompressed, to an external recorder." So if this is correct, the Atomos Shogun will be recording 4k 8bit 4:2:0 from the GH4 through the micro-HDMI port, and limited to the size of your SD card? confused. At least there's an easy way to verify what a recorder actually captured, see the ffprobe check below.
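     Once you have a clip off the recorder, ffprobe will report exactly what was written (the file name is a placeholder):

     ffprobe -show_streams shogun_clip.mov

     The pix_fmt line tells you the chroma subsampling and bit depth, e.g. yuv422p10le for 10bit 4:2:2 vs yuv420p for 8bit 4:2:0.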
  10. matt- Yea, it kinda does answer the question. Based on that test, since 8bit 4k downscaled in post seemed to be the best for killing moire, 4k 10bit downscaled in post to 2k should be even better, but what a waste of space. It seems like the hardware-based undersampling is best avoided. A rough sketch of the post downscale is below.
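     This is just one way to do the downscale in post, the file names are placeholders and lanczos is only my pick for a decent scaler:

     ffmpeg -i gh4_uhd.mov -vf scale=1920:1080:flags=lanczos -c:v prores -pix_fmt yuv422p10le gh4_1080.mov

     A nice side effect is that UHD 4:2:0 downscaled in software like this effectively recovers full chroma resolution at 1080p, which is part of why it beats the in-camera downconvert.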
  11. Yea, it looks good for the price for sure :) The scanlines don't look perfectly horizontal like fields, they seem to curve and go vertical in places. Not sure what the exact method is. Would be good to compare 2k out instead of the 4k and see if the GH4 downsampling is better.
  12. hey matt-- I was doing a test with the GH4 footage and I noticed that the fine hair started to show symmetrical scanlines when you push the luma (this was done by raising the blackpoint and isolating the green channel), so I zoomed in to look closer.. this must be the Ninja Blade downsampling from 4k? it's the GH4 outputting 1080p. I think this answers the question as to why someone might want to shell out for the Shogun, capture 4k 10bit and reformat to HD in post. I put up some Alexa footage just to show something with the same luma tweak and zoom so you can see it's not happening in the resize etc. The Alexa footage was very shallow focus, so the hairline is a little soft (and a lot less hair!). I know they are very different price points too.
  13. jg- On the topic of what clients want to review work on, we have lots of clients at studios that look at work in progress on their iPhones and even make comments about color grading that way, while having lunch outside in the direct sun I suppose. To be fair, that is how more people will see the end results these days, but really, an iPhone? Lemme guess, everything looks too small to you?
  14. ha.. "i don't get this place". as if there was any consensus on this forum.
  15. jg- you can download a windows binary here: http://ffmpeg.zeranoe.com/builds/ basically, you just unzip it and run it straight from the binaries folder. it comes with several tools like ffprobe, to get info on a file, and ffplay. it's command line based so you can use it in a batch convert scenario. to make it easy to execute from the windows console, you need to add the location of the unzipped binaries to your system variables (one-liner below). tutorial on how to add a path to your system for java, but the same applies for any executable you want to call directly from the prompt: http://www.java.com/en/download/help/path.xml It doesn't need quicktime.
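     the quick version, assuming you unzipped to C:\ffmpeg (adjust to wherever yours actually lives):

     setx PATH "%PATH%;C:\ffmpeg\bin"

     then open a new console window and "ffmpeg -version" should work from any folder. note setx has a length limit on what it will save, so if your PATH is already very long, use the GUI method from the link instead.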
  16. Grading: Those promos are really misleading, the show is better...
  17. Grading: Interesting discussion with the DP for American Horror Story on how he's using film to do in-camera effects. Technical specs:

      Arriflex 16 SR2
      Arriflex 16 SR3
      Arriflex 235, Panavision Primo Lenses
      Arriflex 435 Xtreme, Panavision Primo Lenses
      Panavision Panaflex Millennium XL2, Panavision Primo, PCZ and Angenieux Optimo Lenses
      Digital Intermediate (2K) (master format)
  18. ffmpeg has prores support and it's free. you can do this in the windows terminal (once you add the folder to your shell path): ffmpeg -framerate 23.976 -i mysequence.%04d.dpx -i myaudio.wav -c:v prores -pix_fmt yuv422p10le -c:a aac myvideo.mov "%04d" means a 4 digit padded numeric sequence is the input ("-i"), and "-framerate" goes before the sequence input so ffmpeg reads it at 23.976 instead of its 25fps default. that would be 10bit 4:2:2 prores. there are lots of options for codecs and pixel formats. "ffmpeg -codecs" will list all codecs supported and "ffmpeg -pix_fmts" will output all the pixel formats.
  19. I think I'm going to go for the projected 2D film print, if I can find it showing anywhere in Los Angeles. Landmark is showing digital projection, still not sure about the Bruin, all the others are multiplex, it's not looking too promising.
  20. The director's comment on his choice of D.P. Dion Beebe (Memoirs of a Geisha): "I didn't want it to look like one of those plastic-looking video-game movies that Hollywood loves to release over the summer. I wanted to set it apart from the pack and Dion's style is far more evocative of the classic war movies that I love." POST: Did you shoot film or digital? LIMAN: "Film, which was very exciting for me as I wasn't sure I'd ever get that chance again." http://www.postmagazine.com/Publications/Post-Magazine/2014/June-1-2014/Directors-Chair-Doug-Liman-Edge-of-Tomorrow.aspx
  21. One of the last jobs at SPI Culver City.
  22. We feel bad for you too.. but not that bad; you're winning.
  23. You're referring to film to SD broadcast; I was referring to a film DI, back-to-film pipeline. But I'm working on a TV show right now that came off digi-beta at 10bit 4:2:2 1080 24p HD via a D5 deck, and those are pretty old (Panasonic introduced them in 94!). Digital beta can be either 8 or 10 bit and there are several varieties, including 10bit SD. If you want to see an example of some 2k 10bit 4:2:2 digi-beta that came from an Arri Alexa, I used an old clip for background on a test in the Grading sticky, page 3.
  24. jcs- random noise and dither are not going to "kind of reverse the error diffusion and put some bits back into luma" anywhere near the color depth of the actual scene recorded at 10bit. I'm not even sure it kinda helps, because it creates other problems, like artificial noise. As a side note.. when a light's drop-off radius moves across footage where the noise technique has been used, even with super grainy film, you can spot the clean arcs of the banding through the noise. This is an example of the visual system you reference: we ignore the noise and see the symmetrical patterns when there is motion. Similar to the fixed pattern noise tests that have been posted on Vimeo. Unfortunately there is no free lunch if you want to get rid of these types of artifacts.. gotta go higher bit depth and lower compression.
  25. JCS- There is a difference between the visual illusion of banding and actual banding in a file. If you run a sampler over a clear band you will see that the numbers don't change at all.. this is not your visual system creating an illusion, this is in the file (a quick way to check with ffmpeg is below). It's really when something moves, like the light's drop-off radius, that your eye will not see the noise but will see the bands.. i.e. like the FPN noise videos on Vimeo for the BMPC. Grain can help, but only to an extent. Increasing bit depth (at capture, not post facto) removes banding.
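     Newer ffmpeg builds include a datascope filter that prints the raw component values onto the frame, so you can see the runs of identical numbers across a band for yourself. The file names are placeholders, and you'd point x/y at the banded area:

     ffmpeg -i banded_frame.png -vf datascope=x=200:y=200:mode=color -frames:v 1 datascope.png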