maxotics

Everything posted by maxotics

  1. From the Sony technical document you claim there is "no mention of pushing and pulling": "3.1 Changing the Sensitivity (Push Process / Pull Process): The most common method is to adjust the MASTER GAIN of the camera as shown in Table 4. The image contrast that appears on camera viewfinders and the on-set displays will remain consistent, hence it is easier to monitor. 3.1.2 Push Process: Increasing camera gain will improve camera sensitivity but will increase the camera noise floor. When extra dynamic range is required, the exposure value should be defined according to the light meter rea
  2. maxotics

    NX2 rumors

    Maybe all of us who have SOMEHOW lived without a talking refrigerator will now be rewarded. I strongly predicted that Samsung would not give up on their cameras. I was wrong, but not completely wrong, because they still sell them. And we must keep that in mind. It's still speculation what's going on with their R&D and factory. If Samsung is listening to all the passion for HDR around here (@jonpais, I'm talking to you), then the NX2 is on the way! 6K would greatly improve color texture in 4K HDR!
  3. The dynamic range may be more perceptible, but that doesn't mean it gives the filmmaker higher quality than the gain in color information would, no matter how small. You accept that the trade-off exists above, and then try to argue with me again. I don't get it. It's always up to the filmmaker's subjective decision. I've said that from day one and will keep saying it. I only start arguing again when someone says the trade-off doesn't exist. Again, I don't know why we're arguing. Small or big, it's up to the filmmaker. As for @HockeyFan12's second article, which is from the beginning of Sony's S-LOG d
  4. Thanks, I think you've finally given me the information for my report, when I finish it! THANKS!!!! LOG makes sense for a 16-bit linear scan in old scanners because the data DOES need to be on a LOG scale to work with displays (see the sketch after this list). That makes total sense to me. However, I believe some people assume that the camera manufacturers don't already place data into a LOG distribution for cameras. That is, they believe LOG is a trick that gives more DR which the camera manufacturers didn't "notice" until they put LOG gammas into their shooting profiles. I believe that is totally false. Needed g
  5. Yes, it's better in "10-bit" because it's shooting 4:2:2 chroma sampling, but calling it "10-bit", well, I'll leave that alone. I don't dispute that I'd rather grade 4:2:2 than 4:2:0. What I question is whether the 10-bit from the GH5 is the same as the 10-bit ProRes from the BMPCC, as @Damphousse Anyway, like you, I don't have any real problems with 8-bit. Will the C-LOG from my Canon C100 look better on an HDR TV? Most likely! But there are other reasons for that than dynamic range. The whole "15 stops of DR in 8-bit" claims are beyond ignorant to me, but again, I'll say no more. The C100 gives
  6. I'm not going to go that far, Mark. What I said about the limitations of HDR I still believe to be true. If you believe I have given incorrect information, please post it right here. Please quote me verbatim and give technical proof of any technical inaccuracy I have given. I have given technical data above to show the difficulties inherent in providing increased dynamic range. I am the closest person here to a real engineer, as I have worked with RAW data at a very low level. For example, when you tell me you can understand this, then let's talk: https://bitbucket.org/maxotics/focuspixelfixer/sr
  7. I never said it wouldn't be more pleasing to me. I said I was doubtful it would solve the DR problem inherent in 8-bit equipment. It can be better for a lot of other reasons having nothing to do with DR! I've said this a lot but feel my statements have been taken out of context. If I could do it all over again I wouldn't have said or speculated anything about HDR, since it just wasn't appropriate: some people are just getting into HDR and it dilutes the worth of what they're doing (which is the last thing I want to do). For that, I am sorry.
  8. You guys are killing me! You know, I want to be as liked as the next guy. When I first started this stuff years ago I got into a huge fight with someone on the Magic Lantern forum. I insisted each pixel captures a full color. I went on and on and on. Much like you guys are doing to me. I feel shame just thinking about it. In the end, I learned two things: 1) what a CFA is and what de-mosaicing does, and 2) to always consider the possibility that I might be not just wrong, but horrendously, embarrassingly wrong. It's what we do after learning our errors that defines us (hint, hint @IronFilm). Anyway
  9. Yes, it is. The question is how the 10 bits are measured. In RAW, 10-bit would mean 1,024 values for each of R, G and B. That is certainly more dynamic range than storing 256 values (8-bit). If we're talking about 10-bit in that first sense, we need 1,024 x 1,024 x 1,024 = 1,073,741,824 full-color values. What amount of memory do you need to store a pixel's color in that range? (There's a worked version of this arithmetic in the sketch after this list.) I'm attaching a table of data that I suggest studying and thinking about. The truth is, 10-bit is not 10-bit the way you (and I) would like to think about it. The extra 2 bits go into reducing chroma-subsampling
  10. Sorry, I'm just frustrated. I believe you get everything I'm saying. My guess is you have a reverse blind-spot to Jon. Sorry! You obviously shoot with high-end equipment, so your cameras have fat-pixel sensors and powerful electronics. LOG IS useful to you. But I believe sometimes when you think about LOG you forget that you're thinking about LOG in high bit-depth or cinema-sensor contexts. Many people on this forum have never shot RAW or high bit-depth cameras. All they know/have is 8-bit. That's always what I'm focused on. Anyway, THANK YOU SO MUCH for your observations. I haven'
  11. Yes! Because C-LOG has been tuned to give the greatest increase in the DR "look" without overly compromising color. It's a beautiful look, but it's also a sensor made for video. Anyway, I shoot LOG; I never said anyone shouldn't. All that said, C-LOG isn't my first choice.
  12. Oh, I could pull my hair out! Those Alexa files (HD) are at around 42 MB/sec, in other words, pretty close to what you need for BMPCC or ML RAW. At 4K it would be 4x that amount. 10-bit isn't the same for all cameras. That is, 10-bit on an Alexa isn't the same as 10-bit on a GH5, because the former is, in the case above, doing 4:4:4, which is essentially full-color compressed RAW (no chroma-subsampling); see the data-rate sketch after this list. It's not a "tiny fraction of the size" in my book. It's more like half the size of RAW, which, don't get me wrong, is nice! No matter how many ways or times I try, some people don't want to r
  13. HDR is a scheme to sell more television sets. They are not artists living off free love. How much the technology can/will deliver is the question. I NEVER said manufacturers were lying about depth. Please, if you're going to put words in my mouth, please quote me. I thought I answered all your questions. I don't know exactly what the eye can take in. I only explained my experience. You said it looks fantastic. I said, 'great, I look forward to it'. Yes, you can't see the difference between TRUE 8-bit and 10-bit image data, but that was NOT what we were talking about. We're talking
  14. Am I speaking English? I don't disapprove of HDR or LOG or anything. I only DISAGREE with claims that LOG can fit more DR into a fixed data depth without, essentially, overwriting data. And I'm not making a judgment on you, or anyone else, who believes they can grade 10-bit footage better than 8-bit. This stuff is esoteric and, in the scheme of things, completely pointless. One shoots with what one can afford. Even if I approved, who gives a sh_t? I've gone to great pains to figure some of this arcane stuff out. If you say, "I don't care what you say, Max, I love my image."
  15. 98% false. You might be enjoying a placebo effect. But I'm not going to beat a dead horse here. In order to display a continuous color gradient, say, each code value must hold a color that will blend in with the next one. If it doesn't, we notice banding. I'm just using banding because it is the easiest way to visualize whether the bit depth is great enough to spread a color out evenly through the data space (there's a small ramp-rendering sketch after this list). So let's assume that we never see banding in 5 stops of DR with 256 shades of red (8-bit). If we reduce it to 200 shades, we notice banding (and if we increase it to 512 we don't n
  16. Me too. I'm bored with video/photography lately!!!!! Again, Jon, I appreciate that you're reporting back from HDR land! I can definitely see how dual ISO, or rather dual-voltage tech, could fit right in!
  17. Again, I recorded 10-bit video using the Sony X70. I could find no real difference in DR compared to 8-bit. I have no idea if the GH5 and Atomos Ninja fully "saturate" the 10-bit space with visual data. I haven't seen anything online that leads me to believe there is a huge difference. In theory, I'm with you! 10-bit video should be serious competition to RAW. I just haven't seen it. Jon, really, I'm not trying to claim there is any conspiracy or that 10-bit or HDR can't deliver real improvements. I just haven't seen them in camera recordings. So my opinion: When large sensors wi
  18. Please, Jon. I'm not accusing them of deceiving anyone. It's NOT their job to educate people in how to use cameras. They are completely right to let the user decide when/if to use LOG. From my experience, users can't be educated anyway. I'm only using VW as an example that many companies are desperate for market share and aren't looking at things the way others are. And I'm only pointing out that the data is available, and one should at least ask why it isn't given out. As for whether it's 9-1/2 or 10-bit: you can do that experiment yourself (there's a sketch of one way to do it after this list). (It's what I'm trying to do but is
  19. My latest video covered this subject. The short answer is that the sensor is helpless. When making video, the manufacturer will try to maximize visual information in its 8-bit space. I don't believe 10-bit really exists at the consumer level because camera chips, AFAIK, are 8-bit. I am NOT AN EXPERT, in that I don't work for a camera company and never have. I don't have an engineering degree. I'm completely self-taught. My conclusions are based on personal experiments where I have to build a lot of the tests. I'm happy to help anyone find an excuse to ignore me. The question you want
  20. Hi Mark. In real life we move slowly between dark and bright, inside to outside, etc. On screen, scenes can jump between inside and outside every 3 seconds. Too much flickering brightness would give one a headache, like neon signs or strobes. If I can expose well, I can't see much of a difference either. However, I can improve exposure after the fact better in RAW. Also, there is something about video compression, especially 4:2:0, that adds a lot of unnatural noise (lack of chroma) to the image. RAW provides a natural "grain"-like look to the footage, and allows me to set the amount of
  21. That supports my findings that standard profiles maintain the most color information (white balance solves the equation where R+G+B equals white; the more RGB values you have, the better), while LOG trades color data for essentially gray-scale DR. Many people confuse our brain's ability to composite 20 stops of DR (from multiple visualizations) with the 5 stops of DR we can discern from a single visualization where our pupils remain the same size. When one is outdoors, say, one doesn't see the bright color of a beach ball and the clouds in the sky at the same time; the brain creates that image from w
  22. Excellent point! Exactly the kind of question that fascinates me and for which there is no data to be found. With 4K one gets better color in post-compression highlights because it reduces the color distortions that pass through from the CFA. So has 4K cured that problem? How much of one, or the other? I don't know what anyone here thinks, but I've noticed a shift in filmmaking philosophy about color. For example, I've been watching the "Dirty Money" docs on Netflix (highly recommended). Each scene has its own color model. One person might be interviewed and look flat, like a LOG gamma. An
  23. Yes, that article, from 2009, goes into the trade-offs of LOG gammas! My theory is that many young filmmakers identified LOG with "professional", so when LOG appeared on their cameras they began to shoot with it, forgetting the fine points of data capture mentioned in technical papers like that one. Also, the cottage industry of color-profile makers set out to "fix" LOG footage, conveniently forgetting to educate their customers that sometimes what a filmmaker needs, to get the colors they want, is to shoot in a normal Rec.709 gamma.
  24. I like to think of myself as an advocate of understanding how one's camera works. Linear vs LOG is exhibit A in many filmmakers' choice to skip math class and watch cartoons all day. I've done a couple of videos recently on my YT Maxotics channel, which will be the foundation for my new video on LOG. Anyway, the short answer is that a camera records a set of data (pixel) points. Each value is meant to convert into a color. The manufacturer gives you a few options with color profiles, like neutral, portrait, landscape (there's a small linear-vs-LOG sketch after this list). What's the technical difference between linear and
  25. Does anyone know when the first video cameras, both professional and consumer, that recorded a LOG gamma were introduced? I assume something like the Sony F35? Anyway, each manufacturer and an approximate date would be helpful. Also, does anyone have an opinion on when most professional filmmakers moved from LOG on professional 8-bit cameras to RAW formats (like on RED, etc.)? THANKS! I want to use this information for the next video I do on the subject.
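
Below are a few small, illustrative sketches for the technical points raised in the posts above. Each one is a rough example under stated assumptions, not production code and not any camera maker's actual math.

Sketch for post 4 (why a 16-bit linear scan gets LOG-encoded): a minimal, self-contained illustration. The toy log curve simply spends roughly the same number of 10-bit code values on each stop of light, while a naive linear requantization spends nearly all of them on the brightest stops. The black floor of 1 and the resulting ~16-stop range are arbitrary assumptions for the demo, not any scanner's real transfer function.

```python
import math

def linear_to_10bit_linear(x, max_in=65535):
    """Naive linear requantization of a 16-bit linear value into 10 bits."""
    return round(x / max_in * 1023)

def linear_to_10bit_log(x, max_in=65535, black=1):
    """Toy log encoding: the same number of code values for every stop (illustrative only)."""
    total_stops = math.log2(max_in / black)
    return round(math.log2(max(x, black) / black) / total_stops * 1023)

# Compare how many 10-bit codes each encoding spends on the darkest five stops
# above black versus the brightest five stops below clipping.
dark_top = 1 * 2 ** 5            # linear value at the top of the darkest 5 stops
bright_bottom = 65535 / 2 ** 5   # linear value at the bottom of the brightest 5 stops

for name, fn in (("linear", linear_to_10bit_linear), ("toy log", linear_to_10bit_log)):
    print(f"{name:8}: darkest 5 stops -> codes 0..{fn(dark_top)}, "
          f"brightest 5 stops -> codes {fn(bright_bottom)}..1023")
```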
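Sketch for post 9 (the bit-depth arithmetic): a worked version of the numbers in that post, counting per-channel values, full-color combinations, and per-pixel storage, assuming one full sample per channel with no chroma subsampling.

```python
# How many distinct values each channel can hold at a given bit depth, how many
# full-color combinations that makes, and how much storage one full-color pixel needs.

def per_channel_values(bits):
    return 2 ** bits

def full_color_values(bits):
    return per_channel_values(bits) ** 3   # R x G x B combinations

for bits in (8, 10, 12, 14):
    combos = full_color_values(bits)
    bits_per_pixel = bits * 3              # one value per channel, no subsampling
    print(f"{bits}-bit: {per_channel_values(bits):>6} values/channel, "
          f"{combos:>20,} colors, {bits_per_pixel} bits "
          f"({bits_per_pixel / 8:.2f} bytes) per pixel")
```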
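Sketch for post 12 (data rates): the 42 MB/sec Alexa figure in that post is a compressed-file observation; this back-of-the-envelope calculator only shows how resolution, bit depth and chroma subsampling scale the uncompressed rate, so the numbers are illustrative rather than any camera's real bitrate. The 24 fps frame rate is an assumption.

```python
# Rough uncompressed data-rate calculator for different subsampling schemes.

def uncompressed_mb_per_sec(width, height, fps, bits, subsampling="444"):
    # samples per pixel: 3.0 for 4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0
    samples = {"444": 3.0, "422": 2.0, "420": 1.5}[subsampling]
    bytes_per_frame = width * height * samples * bits / 8
    return bytes_per_frame * fps / 1e6

for label, w, h in (("HD", 1920, 1080), ("4K UHD", 3840, 2160)):
    for sub in ("444", "422", "420"):
        rate = uncompressed_mb_per_sec(w, h, 24, 10, sub)
        print(f"{label:6} 10-bit {sub}: {rate:7.1f} MB/s uncompressed at 24 fps")
```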
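Sketch for post 15 (banding): one way to see the 256-vs-200-shade argument for yourself. It writes two plain ASCII PGM ramp images (a format most viewers open without extra libraries); the ramp size is arbitrary, and the final display values are still 8-bit, so the comparison is illustrative only.

```python
def write_ramp_pgm(path, width, height, shades):
    """Write a horizontal gray ramp quantized to `shades` levels as an ASCII PGM."""
    # quantize a 0..1 ramp to `shades` levels, then map back to 8-bit display values
    levels = [round(round(x / (width - 1) * (shades - 1)) / (shades - 1) * 255)
              for x in range(width)]
    with open(path, "w") as f:
        f.write(f"P2\n{width} {height}\n255\n")
        for _ in range(height):
            f.write("\n".join(str(v) for v in levels) + "\n")

write_ramp_pgm("ramp_256_shades.pgm", 1920, 200, 256)
write_ramp_pgm("ramp_200_shades.pgm", 1920, 200, 200)
print("Open the two .pgm files side by side and look for stepping in the 200-shade ramp.")
```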
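Sketch for post 18 (the "is it really 10-bit?" experiment): count how many distinct code values actually occur in one decoded frame; if a nominally 10-bit recording only ever uses around 700 levels, its effective depth is closer to log2(700), roughly 9.5 bits. Getting raw 16-bit samples out of a camera file is the part that varies; the ffmpeg command in the comment is one possible route, assuming your build supports those options, and a single-frame count like this is only a crude lower-bound estimate.

```python
# Count distinct 16-bit sample values in a raw dump of one decoded frame.
# One possible way to produce frame.raw (assumption, depends on your ffmpeg build):
#   ffmpeg -i clip.mov -frames:v 1 -pix_fmt gray16le -f rawvideo frame.raw

import math
import struct
import sys

def effective_bit_depth(raw_path):
    with open(raw_path, "rb") as f:
        data = f.read()
    data = data[: (len(data) // 2) * 2]   # guard against an odd trailing byte
    # little-endian unsigned 16-bit samples
    samples = struct.unpack("<%dH" % (len(data) // 2), data)
    distinct = len(set(samples))
    return distinct, math.log2(distinct)

if __name__ == "__main__":
    distinct, bits = effective_bit_depth(sys.argv[1])
    print(f"{distinct} distinct code values ~= {bits:.2f} effective bits")
```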
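Sketch for post 24 (linear vs LOG): the same scene-linear values, one per stop below a chosen 100% white, mapped to 8-bit code values by three encodings. The Rec.709 OETF here is the standard ITU-R BT.709 formula; the log curve is a generic toy curve, not any manufacturer's S-Log/C-Log/V-Log.

```python
import math

def rec709_oetf(L):
    # ITU-R BT.709 opto-electronic transfer function
    return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

def toy_log(L, black=0.001):
    # generic log curve: equal code spacing per stop between `black` and 1.0
    L = max(L, black)
    return math.log2(L / black) / math.log2(1.0 / black)

print(f"{'stops below white':>18} {'linear':>7} {'rec709':>7} {'toy log':>8}")
for stop in range(0, 9):
    L = 1.0 / 2 ** stop
    codes = [round(f(L) * 255) for f in (lambda x: x, rec709_oetf, toy_log)]
    print(f"{-stop:>18} {codes[0]:>7} {codes[1]:>7} {codes[2]:>8}")
```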