Everything posted by sunyata

  1. I added 8bit 4:2:2 and 10bit 4:2:2 versions to my CG bit depth test. If you go fullscreen and use the pause button (it goes by pretty fast) you can see how different they are. But the gamma is cranked on this test to around 2x, intentionally, to isolate the Y, Cb, Cr and emphasize banding and artifacts.
  2. Anyway, the issue of quality is not ONE thing. Bit depth (mathematically) is the foundation, then compression. YES, you can have badly compressed 10bit, but you can't have uncompressed high-dynamic-range 8bit.
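A minimal sketch of the "foundation" point, in plain Python with no camera specifics assumed: an n-bit channel can only hold 2^n code values, so the smallest representable tonal step is fixed by the container before compression even enters the picture.

```python
# An n-bit channel holds 2**n code values; the minimum tonal step
# (normalized 0..1) is fixed by the container before any compression.
for bits in (8, 10, 12):
    levels = 2 ** bits
    step = 1.0 / (levels - 1)
    print(f"{bits}bit: {levels} levels, minimum step {step:.6f}")
```

10bit gives four times the tonal resolution of 8bit, which is the "multiplier" framing used elsewhere in this thread.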
  3. Not literally "can't", hmcindie. It took us around 10x the time (honest estimate) to do what we had to do because the footage was terrible. It wasn't all due to the 8bit issue, but we were working with film that was not only old, we had an 8bit ProRes copy, and it made our lives very difficult. I'm working on a show right now, it's in season 6, and I've been working on it since season 1. Every roto is unique and floats off target, the track is just a starting point, and there's always a lot of manual work. There is nothing you "can't" do; that's just hyperbole. A clear example of a problem with keying 8bit is fine hair: that is when you really need the extra dynamic range.
  4. He's blinking Morse code.. "I'm being held prisoner by Sony's marketing department, help me."
  5. It's significant. Can this camera do 10bit 4:2:2 if you output 1080?
  6. As promised, the disco bit depth test.. This will either explain the artifacting issue perfectly or just confuse and anger people. I can live with that. Enjoy!
  7. Anyway, more interesting than arguing over the importance of bit depth.. how about a test. I'll just illustrate the artifacts I'm referring to with a simple radial in Nuke..and some disco music. I'm bored. Gimme a sec.
  8. Sean, I wasn't really discussing what is a bigger factor in tracking, bit depth vs film grain or chroma compression, because this is a forum on the GH4 camera... remember? I don't really want to get into the details of what makes it easy to pull a good key. With respect to the topic of cameras though, bit depth is the multiplier that all binary information will pass through, so it is important..
  9. This is really off topic, but yea, that's just wrong Sean. Bit depth is not a factor in pulling keys and in tracking?
  10. Okay, well that's great to know, Sean, thanks. You realize I'm working on a show right now at 11:44 PM on a Friday night, and it's a bitch to track even with 10bit footage. By the way, there were no planar trackers a decade ago. There is an unfortunate rule in CG: the faster the software gets, the more the clients want. So what was accomplished a decade ago is unfortunately not what we have to do today. Before digital compositing, artists used glass and motion control too... before that it was clay... I think before that it was puppets?
  11. Yeah, I hear your frustration. I think there is still some confusion about removing artifacts with some conversion workflow, but I don't want to beat a dead horse. People will just need to get their hands on the footage (the source-from-camera footage) and give it a shot. We had a studio that shall go nameless (but the name rhymes with Disney) once deliver footage to us as 8bit ProRes! So we "had" to work with it. It took maybe 10x the amount of work, because it's more challenging to camera track, planar track, color key, and degrain; it's just degraded once it goes to 8bit. The information is not there to recover, ever, no matter what your subsampling scheme is. A black spot stays black, even if it goes from [0,0,0] black to [0.0] black... it's still black! So when you grade that footage, those artifacts come back. But I agree, it's interesting to see what Panasonic is doing. I kinda wish vendors were focusing more on nice 1080, though, and less on selling 4k TVs. To be honest, 4k is about 2k too much IMO, unless you're making a movie.
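The "not there to recover, ever" point can be illustrated with a hypothetical gradient in NumPy (synthetic data, not the actual delivered footage): once values have been quantized to 8bit, promoting them into a 10bit or float container adds headroom but no new tonal levels.

```python
import numpy as np

# A smooth gradient, quantized to 8bit, then re-encoded at 10bit:
# the bigger container cannot restore the levels 8bit threw away.
ramp = np.linspace(0.0, 1.0, 4096)   # float "source" gradient
q8 = np.round(ramp * 255) / 255      # delivered as 8bit
back = np.round(q8 * 1023) / 1023    # re-encoded at 10bit

print(np.unique(ramp).size)  # 4096 distinct values in the source
print(np.unique(q8).size)    # 256 after the 8bit bottleneck
print(np.unique(back).size)  # still 256
```

Grading that re-encoded file just spreads the same 256 levels further apart, which is why the artifacts come back under a push.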
  12. Right, I follow. The ffprobe on Andrew's files: Video: prores (apcs / 0x73637061), yuv422p10le, 4096x2160, 339440 kb/s, SAR 4096:4096 DAR 256:135, 24 fps, 24 tbr, 24 tbn, 24 tbc
  13. Andrew, Is what you posted yesterday on the copy site directly from the camera? Thanks
  14. Okay, I'm not going to belabor the issue (too late).. it's an awesome camera. Thanks for the footage!
  15. The files for me read as 4:2:2 10bit, but I thought it might be a Linux thing. Anyway, I've seen enough to know the footage looks great. Unless something unexpected comes out at NAB, I'm getting one... but will need an HDMI capture solution.
  16. Hey Andrew, basically if you take a grayscale gradient and you capture, resample, whatever, from 10bit down to 8bit, you wind up with 256 shades of gray (wow, just thought of a great title for a book). And if you push the crap out of that, you will see banding, no matter what the resolution is. But with a camera, there is Bayer-pattern grain, algorithms, and lots of interesting IP going on that can make it look much better and conceal some of the obvious artifacts, but you're still stuck with the math: 256 values per channel at 8bit. And lower quality when you factor in chroma subsampling.
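To make the "push the crap out of it" point concrete, here's a sketch (NumPy, a synthetic 8bit gradient, and an arbitrary gamma value chosen for illustration): an aggressive lift widens the gaps between adjacent 8bit shades, and those widened gaps are what read as banding.

```python
import numpy as np

# An 8bit grayscale gradient, then a strong gamma lift applied in float.
gray8 = np.round(np.linspace(0.0, 1.0, 1920) * 255) / 255
pushed = gray8 ** (1 / 2.4)  # exaggerated grade, like a QC gamma test

# Largest jump between adjacent shades after the push -- far bigger
# than the uniform 1/255 step of the ungraded gradient.
print(np.diff(np.unique(pushed)).max())
```

The first step out of black grows from 1/255 (about 0.004) to roughly 0.1, a band about 25x wider in the shadows, which is exactly where low-light footage falls apart.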
  17. Okay, for anyone who already feels their head exploding over the 4k-to-2k bit depth issue, please do NOT click this link: https://vimeo.com/90836503 This is a quick 3D render done initially at high settings, 32bit EXR, then saved as 12, 10, and 8bit uncompressed DPX at 4k. Those renders were then reformatted to HD 1080 in a float comp and given a gamma curve nobody in their right mind would ever use. I wanted to see for myself how much the artifacts improve with a 50% 4k resize in a 32bit comp. Short answer: not much. But I don't want to be a buzzkill... the GH4 looks awesome and we're getting one here.
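The Vimeo test can be approximated in a few lines (NumPy, with a synthetic banded gradient standing in for the render, and a simple box filter standing in for the comp's resize kernel): averaging a banded 4k row down to HD in float smooths the band edges slightly but leaves the flat plateaus intact.

```python
import numpy as np

# Banding baked in at 8bit/4k width, then a 50% box-filter resize in float.
banded = np.round(np.linspace(0.0, 1.0, 4096) * 255) / 255
hd = banded.reshape(-1, 2).mean(axis=1)  # 2:1 average down to 2048 samples

def longest_flat_run(a):
    """Length of the longest run of identical adjacent samples."""
    best = run = 1
    for i in range(1, len(a)):
        run = run + 1 if a[i] == a[i - 1] else 1
        best = max(best, run)
    return best

# Long runs of identical pixels survive the downscale: visible bands.
print(longest_flat_run(hd))
```

A real comp uses a fancier kernel than a box filter, but the plateaus, i.e. the bands, persist either way; only their edges get feathered.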
  18. Weird, in ffmpeg the GH4 footage is showing up as "yuv422p10le" and it's telling me it's 10bit? Did you record this to the internal card?
  19. Crazy. Anyway, moving on... I see you commented that the full sensor is used on the 7D when it's cropping 5x, so then it's using 1/5th of the resolution of the chip to achieve 1080 24p RAW? Maybe this is largely responsible for the moiré that needs to be removed. It seems like something that could be designed out of the downsampling process with the hack: if you mimicked how film grain works, for example, you would not get moiré because there would be no symmetry in the sampling. I wonder if you could sample 1/5th of the sensor using Perlin noise. Anyway, you're still using a fraction of the sensor.
  20. Yes, if it's the law then it's the law. No argument there. I can't even imagine how you could enforce that in the States. At a Starbucks, someone takes a group shot and you're right next to the table... what are you going to do, jump up and demand they delete the photo? You're walking on the boardwalk and everyone is snapping selfies.. you happen to be in the shot, are you going to flip them off? When I go surfing people are taking photos from the parking lot.. what am I going to do, flip them off and scream at them to delete their photos? It seems ridiculous.
  21. Hey Jon-R I get what you are saying, I hate it when I'm out and someone is taking a photo while I eat, walk, ride, hike or probably if I keeled over on Santa Monica Blvd, while I grasped my chest and breathed my last breath; but this is the world we are living in today, for better or worse. There is no going back. Also, ironically, in a police state, citizens would not be allowed to take photos and post them for public consumption.
  22. I've been going crazy for weeks now trying to decide what camera to get, and for a second I thought this answered my question! But wait, little snag here: Magic Lantern is capturing at a 5x crop and then you have to upscale to 1080, so we are getting 1/5th the resolution of the sensor? Granted, it's nice video, but I need to find a camera I can use to capture stock footage for compositing and effects. It's a bummer, but we have these clock-punching, sadistic QC people who look for banding frame by frame and even do ridiculous gamma tests to catch artifacts, which is common if you are shooting something dark and luminous like fog, smoke, fire, or neon, which is like 90% of what our clients want. I'm thinking I'll have to pony up for the 5D Mark III or the Blackmagic Cinema or Production Camera. When you get footage that was shot on film, the grain of the film, 10bit gamma, and motion blur help knock all the banding out. I'm trying to find an "affordable" digital camera that matches the live action I usually get.
Going from 4k 4:2:0 down to 1080 in a float comp actually won't get rid of bad artifacting (I'm thinking of the GH4 4k-to-1080 4:4:4 workflow idea), not if it clearly exists in the source. It's also doing a lot of pixel removal, depending on the program and algorithm; it softens the grain, but if you have clear banding in the source, it will still be there in the smaller version. 8bit is only 256 steps of gray, and that's really problematic in low light. It seems to cancel out the benefits of the GH4's improved sensor. And Magic Lantern is for Canon, not Panasonic, so forget about waiting for a crack to fix the 8bit internal problem. Capturing the 10bit output is an option, but with the Blackmagic cams you can use SSDs.
Sorry, these are the internal ramblings of a man who has officially been researching this topic for 2 weeks. I think the bottom line is that these camera companies just don't get us...
oh wait, that is unless you have 3k to spend, then it seems we're all good.
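For the 4:2:0 point, the raw bookkeeping (standard subsampling definitions, nothing camera-specific): per 4x2 block of pixels, each scheme keeps the following number of luma and chroma samples.

```python
# Samples kept per 4x2 pixel block as (Y, Cb, Cr) under each scheme.
schemes = {"4:4:4": (8, 8, 8), "4:2:2": (8, 4, 4), "4:2:0": (8, 2, 2)}
for name, (y, cb, cr) in schemes.items():
    kept = y + cb + cr
    print(f"{name}: {kept}/24 samples ({kept * 100 // 24}%)")
```

So 4:2:0 discards three quarters of the chroma before compression even starts, which is why keying and tracking suffer on top of the 8bit quantization.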