
HockeyFan12

Reputation Activity

  1. Like
    HockeyFan12 got a reaction from kaylee in Lights - 2018 : New, deals, low budget, DIY   
    Lots of cool looking stuff there!
    I was trying to build my own tiny flexible velcro LED lights, but it seems there are similar, better options here. I was also considering buying four 2X2s from Westcott when they were on sale for $500 apiece to make an 800w 4X4 to rent out. Looks like Litegear already has an 8X8 version, even better! That's the book light of the future and possibly bright enough for day exteriors, too. It claims 1600w, and while I doubt it has the punch (beam angle) of a 960w MacTech (which gaffers tell me is about equivalent to a 6k HMI through diffusion), the extra wattage might make up for it.
    I would love to rent that for the right project. I'm sure the price is extreme, but the smaller lite mats are excellent, so that could be useful even for day exteriors... maybe...
    That's a shame the lightning effect is gone; that's one of the main reasons I bought the light, but I was probably going for more of a strobe effect anyway. To be fair, at that power output (guessing about 10w equivalent) you could just get a bog-standard high CRI E26 bulb and flicker it with a switch or button on a dimming device. The brightness would not be that different, whereas "real" lightning units are brighter than a 6k HMI, so it would never be useful for that except maybe in dimly lit interiors on an A7S or something. The real issue is the attack and decay envelopes being too hard with LEDs, though some seem to be worse than others... I'm experimenting with that now actually. For all I know the "strobe" effect on the viola is just as harsh as standard lights, unless you can carefully tune the attack and decay (the sine wave effect looks good, but maybe too soft).
    Though I hope I'm surprised about the brightness. And I assume the more of those you buy, the better. A fire gag constructed out of three or four of them could be brighter and more varied. I actually just found two more at $120 used, so I picked those up. Could be useful for fire gags, club light gags, etc., and there's not much else in this price range that is.
     
  2. Like
    HockeyFan12 got a reaction from kaylee in Lights - 2018 : New, deals, low budget, DIY   
    Couldn't help myself, picked up a used one for $118.
    Can you program in animations to simulate flame or a flickering fluorescent or something? I've been looking for a budget magic gadget for years now. Can you program color loops?
    Of course the trick with magic gadget flame gags is ganging up a bunch of different lights to create some real chaos, usually including a constant base source dimmed way down to a low color temp like embers, so one of these will never replace but merely supplement that. Aputure's approach seems to understand the need for being able to program in animations; recording a flame and then recreating it on a larger scale is sooooo cool. But even a poor man's version of that would be wonderful, especially for $118.
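    To make that "constant base plus chaos" idea concrete, here's a minimal sketch that just generates dimmer levels for one flame layer; the base level, swell rate, and flicker depth are made-up values, and how you'd actually drive a given fixture or app is left open:

```python
import math
import random

# Sketch of the "constant base + chaos" flame idea: a steady ember-level
# base, a slow sine swell, and a bounded random-walk flicker on top.
# Output is just 0-100% dimmer levels at ~24 steps/second; the constants
# are illustrative assumptions, not any fixture's real parameters.

random.seed(1)
BASE = 0.25           # constant ember-level base, dimmed way down
SWELL_DEPTH = 0.10    # slow breathing of the whole gag
FLICKER_DEPTH = 0.30  # fast chaotic component
STEPS_PER_SEC = 24

def flame_levels(seconds: float):
    flicker = 0.0
    for i in range(int(seconds * STEPS_PER_SEC)):
        t = i / STEPS_PER_SEC
        swell = SWELL_DEPTH * math.sin(2 * math.pi * 0.3 * t)       # ~0.3 Hz swell
        flicker += random.uniform(-0.12, 0.12)                      # random walk
        flicker = max(-FLICKER_DEPTH, min(FLICKER_DEPTH, flicker))  # keep it bounded
        yield max(0.0, min(1.0, BASE + swell + flicker))

for level in flame_levels(1.0):
    print(f"{level * 100:5.1f}%")
```

    Ganging a few of these with different seeds and rates (plus that constant warm base source) is where the chaos actually starts to read as fire.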
    Lighting with projectors as hard sources and RGB(W) displays as soft sources (look up Sony's Crystal LED technology) is sort of the holy grail of control. This is sort of like a super low res Crystal LED.
    And until then, Digital Sputniks aren't bad.
    They are too expensive, though. Hoping the app offers some decent control not just for color but also for animation. What are your experiences?
    Edit: just watched this video: 
    So cool
    As for Deakins, I think that stuff is pretty common. I see a lot of batten strips/covered wagons, and when I talked with the DP of The Sopranos he explained that most of their sets are lit by massive grids of 60w (or maybe 100w) incandescent lights on dimmers, at the top corner of each set. The approach is simple: always keep the scene backlit by those and then accent as needed. But you need to be very, very good to dial in the details, and of course the cost of constructing custom lights is often greater than just buying something fast and cheap, and only justifiable or feasible on a big set in the first place...
  3. Like
    HockeyFan12 got a reaction from kidzrevil in The cameras used on Netflix's Original Films and Series   
    I don't think I've ever disagreed with a post I've read on this forum more than I disagree with this one. The features you mention are about as closely associated with the specific look of their respective formats (as well as the style they adopt from production circumstances surrounding the use of those formats) as any in recent history. Sure, 16mm has no more resolution or dynamic range than most mirrorless cameras today, but it has an inimitable physicality to its look (film grain, halation, gate weave, etc.) and even a physicality to the approach one is forced to take. And that offers a distinct (and in my opinion frequently gorgeous) look and disciplined style I've yet to see anyone get close to approximating for the duration of a feature shot digitally. No one at that level is shooting 16mm because it's the best format technically or because it's cheaper than 35mm; they're shooting with it because they prefer the process and the look. Sure, you can compare 16mm and today's video cameras on a technical basis and they're surprisingly comparable. But that's like saying dinner at the French Laundry and a bottle of Soylent are comparable on the basis of nutrition and ending the comparison at that. (They might or might not be, but you get what I mean.)
    I've seen Chungking Express projected from a good 35mm print at a high end screening facility and the texture and color of it are beyond anything I've seen shot on any digital camera, technically superior cameras such as the Alexa 65 (which can look just as good, only in a different way) included. Not in a way I can easily quantify, but in a way that resonates emotionally. I remember the look of that movie more closely than the story. If you think that film would look as good shot on a GH4, you need to question your own eyes, not Chris Doyle's, and definitely toss aside the resolution chart. There might be less grain on the GH4. It might be sharper. But the dreamlike quality of certain sequences derives specifically from the texture of film. Features shot on 16mm embrace the grain even more.
    I mean, Van Gogh's paintings don't measure well on a resolution chart. Should he have used a GH4 instead of a paintbrush? 
    Yes, Festen was shot on cheap video. And its success proves a great point, that a good story and good performances are more important than "look" for that kind of movie–I agree with you there. But the specific look of the cheap video (no lighting, miniDV) was still made with philosophical (Dogme 95, of course) and aesthetic (it looks like a home video and shares the aesthetic intimacy of one, crucial to the subject matter) intent. It had a large budget. They could have chosen their camera on the basis of cost; they chose miniDV for another reason. Even if the GH4 were around then, it would still have been shot on home video. It's the filmmakers emulating that look, and not emulating the bad amateur online filmmaker look typically associated with mirrorless cameras and DSLRs (it's not all bad, but I'm just talking about the audience's association with a look), that makes that story work. Another story might look great shot on a 70D. Maybe a story about a vlogger could be amazing on that format. But that would be an entirely different movie!
    I would try to be less disrespectful of the choices made by DPs and directors who can afford to make choices on bases other than cost, and not assume their only criteria are technical!
    I'm not trying to say I know what these filmmakers think, and maybe they would have shot on a mirrorless camera today if they had the opportunity... I doubt it, but I don't know. I'm just saying that you don't know, either.
    Furthermore, I think your response is a little ironic with respect to the original topic. Netflix is the company that most specifically chooses cameras based on technical specs; even Amazon and YouTube Red will allow the Alexa's upscaled 4k for original content, while Netflix won't because it's upscaled. And because of that, you have a lot of shows shot on the F55 or C300 Mk II that would look better in every respect (except resolution and again, imo) if they were shot on another camera (or even at 2.35:1 maybe, which they also won't allow). But resolution measures better on those cameras and it's in keeping with Netflix's brand and promise of technical quality, so that's what Netflix uses. I get it–part of their brand is 4k HDR original content. Maybe there are even legal reasons for the choice to stick with "true 4k" cameras, too. And it's fine. Those cameras are close to the Alexa anyway and the crews are super talented and most of the content on there is serialized tv type stuff that doesn't need the look of Chungking Express anyway. But if anyone is drinking Soylent over the good stuff on the basis of numbers, it's Netflix. 
    I do agree with the larger point, made many times on this forum, that amateurs like you and like myself, and those who don't have or don't want to spend the money to rent their format of choice, would do best to embrace what's available to them. Just because Chris Doyle probably wouldn't choose a GH4 if he were to shoot Chungking Express today (then again, who knows, he might) doesn't mean he couldn't shoot an awesome entirely different feature on digital. And I agree that a good story, such as Festen's, would work on almost any format, but I reject the notion that miniDV was used thoughtlessly or arbitrarily...
  4. Like
    HockeyFan12 got a reaction from elgabogomez in few random questions   
    I wouldn't know since I must be in the other 5%. It's a huge problem for me.  
  5. Like
    HockeyFan12 got a reaction from sam in few random questions   
  6. Like
    HockeyFan12 got a reaction from kye in How much bit depth? (10-bit / 12-bit / 14-bit)   
    Codec is WAY more important, but the whole bit thing is kind of a mess. The human eye is estimated to see about 10 million colors and most people can't flawlessly pass color tests online, even though most decent 8 bit or 6 bit FRC monitors can display well over ten million colors: 8 bit color is 16.7 million colors, more than 10 million. And remember sRGB/rec709 is a tiny colorspace compared with what the human eye can see anyway, meaning 16.7 million colors fit into a smaller space should be plenty, overkill even. But also remember that digital gamuts are triangle shaped and the human eye's gamut is a blob, so fitting the whole blob inside the triangle requires overshooting tremendously on the chromaticities, resulting in many of those colors in digital gamuts being imaginary colors... so the whole "8 bit is bad" thing needs a lot of caveats in the first place...
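    For what it's worth, the counting behind those numbers is easy to check; a quick sketch (it only counts addressable RGB triplets, it says nothing about where they land in a gamut):

```python
# Rough arithmetic behind the color-count claims above (illustrative only).

def displayable_colors(bits_per_channel: int) -> int:
    """Number of distinct RGB triplets a display with this bit depth can address."""
    levels = 2 ** bits_per_channel   # levels per channel
    return levels ** 3               # three channels: R, G, B

for bits in (6, 8, 10):
    print(f"{bits}-bit per channel: {displayable_colors(bits):,} addressable colors")

# 6-bit:        262,144  (why 6-bit panels lean on FRC dithering)
# 8-bit:     16,777,216  (the ~16.7 million figure above)
# 10-bit: 1,073,741,824
# Compare with the ~10 million colors the eye is commonly estimated to
# distinguish; as noted above, how those code values are spread across a
# gamut (and how much of that gamut is imaginary) matters more than the
# raw count.
```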
    I haven't tried 10 bit raw from the 5d, but I suspect in certain circumstances (100 ISO just above the noise floor) 10 bit will have visibly higher contrast noise than 14 bit after grading, though only if it's the exact same frame and you A/B it will the difference be apparent. That's my guess. Something VERY subtle but not truly invisible, though possibly effectively invisible. It's possible there could be banding, too, but the 5D III sensor is quite noisy.
    The science behind it is so complicated I gave up trying to understand. The more I learned the more I realized I didn't understand anything at all. First you're dealing with the thickness of the Bayer filter array and how that dictates how wide the gamut is, then you're dealing with the noise floor and quantization error and how that works as dithering, but there's also read noise that can have patterns, which don't dither properly; then you're dealing with linear raw data being transformed with a certain algorithm to a given display or grading gamma, as well as translated to a given gamut (rec709, rec2020, etc.), and how wide that gamut is relative to the human eye and how much of the color there is imaginary color, and then what bit depth you need to fit that transformed data (less than you started with, but how much less depends on a lot of variables), and then you introduce more dithering from noise or more banding from noise reduction, then compression artifacts working as noise reduction and increasing banding via macroblocking, then there's sharpening and other processing, then... then it goes on and on to the display and to the eye, and of course that's only for a still image. Macroblocking and banding aren't always visible in motion, even if they are in a still, depending on the temporal banding and whether the codec is intra-frame or inter-frame.
    It's possible everyone who's proselytizing about this understands it far better than I do (I don't understand it well at all, I admit). But I frequently read gross misunderstandings of bit depth and color space online, so I sort of doubt that every armchair engineer is also a real one. (That said, there are some real engineers online, I just don't understand everything they write since I'm not among them.) I know just enough about this to know I don't know anything about this.
    From the armchair engineers, we do have some useful heuristics (overexposed flat log gamma at 8 bits heavily compressed will probably look bad; raw will probably look good), but even those aren't hard and fast rules, not even close to it.
    All you can do beyond that is your own tests. Even inexpensive monitors these days can display close to 100% NTSC. They should be good enough for most of us until HDR catches on, and when it does bit depth will matter a lot more.
  7. Like
    HockeyFan12 got a reaction from sam in How much bit depth? (10-bit / 12-bit / 14-bit)   
  8. Like
    HockeyFan12 got a reaction from Liam in I'm thinking about starting a film festival...   
    Seems like a cool idea. I'd focus on staying small at first, and maybe even having it be informal. There are lots of generic local festivals, but few that are that interesting. Something small, a starting place to find collaborators, could be cool. If you're going to accept everyone, maybe put a tight limit on how long each short can be. I agree with that.
    I've heard similar things about Channel 101 being a more insular and self-serving community than it once was. It's sort of turned into what it was a reaction against. But I wouldn't let that sour you on submitting to other festivals entirely! If you like Channel 101 stuff, make a Channel 101 show.
    The difficulty with being really really creative is that your ideas exist BECAUSE they're unusual and innovative. And so if there are non-creative criteria for entrance somewhere, and the more established the venue the more established the criteria generally, the least creative stuff is valued at the low end or entry level (it ticks the boxes) and the most creative stuff at the high end (it innovates). So you won't be at the level of high end stuff, but you're too creative for the low end stuff, and it's going to be unduly challenging and you wonder what's wrong with you. Well, the question is also what's wrong with the world. 
    Really creative people often never get past the entry level. This is a real problem with companies: the visionary CEO eventually gets replaced with very conservative thinkers. You need to learn to think like your audience, and meet them halfway.
    If this is a problem for you, or you've faced rejection, what to do about it is up to you. Making a festival on your terms is a good idea–it's where Slamdance and Channel 101 were born, even if they later sort of turned into what they began as being defined against.
    But if you see stuff you like coming out of Slamdance and Channel 101, maybe meet them halfway, and once you break into those communities, get more and more creative. Once you're in, you can push the boundaries further! Either way, look for other films you like, and find collaborators. Don't be myopic.
    I'd start small either way! Five-minute run times aren't a bad thing. The challenge is to pack all the creativity into it! Or hone your idea down to the best, smallest version of itself. Your next film won't be your best unless you let it be your last. So keep creating! And don't look (too far) back.
  9. Like
    HockeyFan12 got a reaction from User in Future Proof with the Current Crop   
    I've yet to see 4k that, per pixel, is as sharp as 1080p downscaled from a 4k source. But stills I've seen from TOTL DSLRs certainly show that potential... the 8K Reds might provide meaningfully more resolution, I haven't seen raw footage from one. Early Red footage was SUPER soft. Now we're seeing 4k that's meaningfully sharper to the eye than 2k/1080p...
    On what display is the question. Even on IMAX, 2k Alexa looks good to me. And the C100 Mk II is just as sharp (worse in other respects, of course). Both are way sharper than 35mm film (as projected, but even look at Blu-ray stills from film and see how surprisingly soft they are).
    But on a 5k iMac, I notice a bigger difference since I sit so close to it. Even there, the difference isn't huge between 4k and 5k, though. It is with UI elements, not so much with video. And the difference between 4k cameras will be even less significant. For me, 2k is enough for video. For most, I think 4k will be.
    For now...
    I think the only substantive shift (beyond already significant, meaningful aesthetic differences between cameras and lenses) will be when HDR takes off. And HDR imo is going to evolve in a different direction entirely, even more naturalistic, maybe HFR, etc., maybe even integrating with VR. The C300 Mk II and Alexa (and in practice the F65 and new Reds and the Venice, probably) all meet that standard of 15+ stops, 10 bit, rec2020, etc.
    But HDR is changing fast so I wouldn't even sweat it unless HDR delivery is important to you.
    Of course, I don't know if by "reasonably affordable" you mean an A7S or an F55. I think there's already a rather big difference there, though the two can be seamlessly intercut if you're careful to mitigate the A7S' weaknesses.
     
  10. Like
    HockeyFan12 got a reaction from Aussie Ash in Future Proof with the Current Crop   
  11. Like
    HockeyFan12 got a reaction from Mark Romero 2 in How Important is 10-Bit Really?   
    Interesting. I've done similar tests with noisier cameras (C300, Alexa, F35) and not seen any banding because of the heavy dithering from the noise in the source footage. 
    I suppose it goes to show it's worth doing your own tests!
    Certainly it seems to make a big difference here.
  12. Downvote
    HockeyFan12 got a reaction from Deadcode in How Important is 10-Bit Really?   
    I'm gonna check this link out! Never seen this, but that's exactly what I'd expect the difference to be. Subtle difference when increasing bit depth, huge difference when increasing bit rate.
  13. Like
    HockeyFan12 got a reaction from Mark Romero 2 in GH5s good enough for photos?   
    How big do you print? I worked with an extremely successful high end photographer who would make wall-sized prints from 12MP files. 
    The shots were stunning, but IMO they didn't quite hold up when you stand close. If you print bigger than 11x17 I might reconsider, but it depends on your personal standards, so no one is going to be able to answer this question for you.
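    As a rough sanity check, here's the arithmetic I'd run, assuming a 12MP frame is roughly 4000x3000 pixels (an assumption, not any specific camera's output):

```python
# Quick print-resolution check (illustrative; assumes a 12MP frame is
# roughly 4000 x 3000 pixels, which is not any particular camera's output).

def ppi(pixels_long: int, pixels_short: int, inches_long: float, inches_short: float) -> float:
    """Pixels per inch on the limiting dimension of a print."""
    return min(pixels_long / inches_long, pixels_short / inches_short)

print(f"11x17 print: ~{ppi(4000, 3000, 17, 11):.0f} ppi")  # ~235 ppi, fine up close
print(f"40x60 print: ~{ppi(4000, 3000, 60, 40):.0f} ppi")  # ~67 ppi, reads only from a distance
```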
  14. Like
    HockeyFan12 reacted to OliKMIA in Show Us Your Best Video   
    On this one I'm kind of making fun of Instagram. It went crazy viral.
    Otherwise, I love doing hyperlapse and drone stuff
     
  15. Like
    HockeyFan12 reacted to Attila Bakos in 8bit → 10bit video with temporal noise filtering, stunning results   
    I would use masks/keys and Resolve's deband filter for this kind of stuff.
    Just one frame: before | after
    I did not fine-tune the key so there is some detail loss here and there, and I also modified the sky's color a bit. This is way faster than grain + denoiser.
  16. Like
    HockeyFan12 got a reaction from Inazuma in 8bit → 10bit video with temporal noise filtering, stunning results   
    This is way above my head, but doesn't this have more to do with denoising being able to reduce macroblocking than it does with bit depth?
    The banding and color problems in all these shots seem to be from color profiles and compression artifacts, not bit depth. None of these image problems are primarily correlated with an 8 bit codec. They're far more to do with macroblocking.
    Both of the shots of the guy by the stairs, for instance, look awful and riddled with compression artifacts.
    Admittedly, the sunset does look a lot better, improved by what's a pretty clever trick.
    Generally, I feel like there's some "10 bit magic" that I don't see. My experience has always been that the strength of the codec and the color space and gamma assigned to it are far more important than bit depth. The F5, for instance, still had a lot of banding in 10 bit XAVC or whatever the codec is, because the bit rate is too low and the gamma too flat. (This has since been improved in later updates.)
    Denoising is definitely powerful! I've used it before to remove compression artifacts in similar situations. I just don't understand what bit depth has to do with it. It's macroblocking that's a far far bigger issue in all these examples. Unless my eyes deceive me...
  17. Thanks
    HockeyFan12 got a reaction from Mark Romero 2 in Dynamic Range of ML RAW vs h.264 / h.265 Cameras?   
    Unless you have chroma clipping, are inept at setting white balance, or are dealing with a camera with inherently poor white balance (looking at Sonys here), mixed lighting is difficult to work with in general. The Alexa handles it best, even in ProRes, so it's not necessarily about RAW. (Fwiw I agree about dynamic range, the 5D Mark III has less than the Sonys, but better tonality.)
    The high end $100k stills camera guys I know use strobes and heavily light their real estate work. For video it's not so easy, but I think (I could be totally off base here) the ultimate solution for great quality is to bring color-correct fixtures and gels with you to swap out with what's there. Lots of LEDs or Kino Flo bulbs to swap, maybe. Really, really cheap to buy, but not always possible of course. 
  18. Thanks
    HockeyFan12 got a reaction from maxotics in What was the first professional camera to shoot LOG gamma?   
    http://www.digital-intermediate.co.uk/film/pdf/Cineon.pdf
    https://pro.sony.com/bbsccms/assets/files/mkt/cinema/solutions/slog_manual.pdf
    http://www.panavision.com/sites/default/files/docs/documentLibrary/Panalog Explained.pdf
     
  19. Like
    HockeyFan12 got a reaction from webrunner5 in What was the first professional camera to shoot LOG gamma?   
  20. Like
    HockeyFan12 got a reaction from EthanAlexander in What was the first professional camera to shoot LOG gamma?   
    Agreed, I don't even know how we got on the subject of HDR. :/ Maybe we should just move off that, since it's apparently a contentious topic... and hopefully when HDR hits it big it will obviate the need for log gammas, since we'll be able to see fifteen stops of information on screen at once without having to flatten it out.
    @Maxotics as best I can recall, log gammas were introduced for film scans to save file space. 10 bit log could represent the data from a 16 bit linear scan. I believe Cineon was the most common format, designed by Kodak, and–apocryphally–I hear some of that same magic lives on in the Alexa, even though Log C doesn't look at all like a film scan to my eye.
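    If it helps, here's a rough sketch of how that packing works, using the commonly published Cineon-style constants (black at code 95, 90% white at code 685, 0.002 density per code value, 0.6 gamma); treat it as an illustration of the idea, not Kodak's exact math:

```python
import math

# Sketch of a Cineon-style 10-bit log encode/decode using the commonly
# published constants. Illustrates why 10-bit log can hold what a 16-bit
# linear scan holds; not Kodak's exact implementation.

REF_BLACK, REF_WHITE = 95, 685
DENSITY_PER_CV, NEG_GAMMA = 0.002, 0.6
OFFSET = 10 ** ((REF_BLACK - REF_WHITE) * DENSITY_PER_CV / NEG_GAMMA)

def lin_to_cineon(lin: float) -> int:
    """Scene-linear (1.0 assumed at 90% white) to a 10-bit Cineon-style code value."""
    code = REF_WHITE + math.log10(lin * (1 - OFFSET) + OFFSET) * NEG_GAMMA / DENSITY_PER_CV
    return max(0, min(1023, round(code)))

def cineon_to_lin(code: int) -> float:
    """10-bit code value back to scene-linear."""
    return (10 ** ((code - REF_WHITE) * DENSITY_PER_CV / NEG_GAMMA) - OFFSET) / (1 - OFFSET)

# Equal log spacing gives every stop a similar number of code values
# (roughly 90 per stop here), which is the whole point: a stop near black
# survives in about as many codes as a stop near white, instead of the
# handful a linear encode leaves the shadows.
for stops in (-6, -3, 0, 2):
    lin = 2.0 ** stops
    print(f"{stops:+d} stops from 90% white -> code {lin_to_cineon(lin)}")
```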
    There are still formats out there that are wide DR in linear colorspaces, OpenEXR, for instance. Maybe DPX? Though I usually get DPX in log these days.
    Panalog was the first log video format I've heard of. I haven't worked with it. The first log format I worked with was possibly from Red (it was poorly implemented; their current IPP2 stuff is much better). The first good log format I worked with was SLOG, on an F3. I was so impressed. Or maybe the Alexa. I forget. I do remember SLOG on the F3 benefited tremendously from 10 bit capture. Later I got to work with in post (but not shoot) some SLOG footage from the F35, and that camera produces a great image to my eye, although it's pretty controversial on some forums, and a pain to use I'm told. The image reminds me a bit of a really beefed up C100.
    Here are some white papers for Cineon, SLOG, and Panalog:
    http://www.digital-intermediate.co.uk/film/pdf/Cineon.pdf
    https://pro.sony.com/bbsccms/assets/files/mkt/cinema/solutions/slog_manual.pdf
    http://www.panavision.com/sites/default/files/docs/documentLibrary/Panalog Explained.pdf
    I couldn't make it through them. Math isn't my strength; I've really struggled with it. It is interesting that Cineon is apparently 10 bit in a 12 bit wrapper. Not sure why... but it seems to indicate that 10 bit log is sufficient to capture the vast majority of film's dynamic range. Though Kaminski would note that the DI burns the highlights and crushes the shadows a bit... which he liked. 
    I really can't speak to 10 bit on the GH5. I'd be interested to hear a good colorist's take on the subject. I do think "8 bit" has a bad name among people who don't run their own tests, though. Like you, I have separate cameras for video and for stills... and mostly I just use my iPhone for both.
  21. Like
    HockeyFan12 got a reaction from webrunner5 in What was the first professional camera to shoot LOG gamma?   
    I don't get to use fancy cameras that often! I own an 8 bit camera and am happy with it for personal use. I'd rather focus more on filmmaking... something I seem to have neglected lately.
    I don't know if the GH5 is really better in 10 bit, or if it has "true" 10 bit color. Haven't used it! Online tests seem to be inconclusive. I'd guess it makes a difference, but not a huge one. Then again, the Alexa has 15+ stops of dynamic range, and the GH5 doesn't, so it shouldn't matter as much with the GH5. I'm also not sure why SLOG 2 looks so bad from the A7S. I remember it didn't look great from the F5, either, which is a 10 bit camera (I think). F55 RAW looks better to me, but still not as good as Alexa ProRes by any means... until it's expertly graded, at least. So each step up does improve the image, more in terms of grading potential than initial look imo. Still, I believe the arguments about 10 bit vs 8 bit on the C200, for instance, are overrated. I suspect 8 bit is enough for Canon Log 3. So it's only if you need those extra two stops of barely-there dynamic range that the 10 bit codec would be better (letting you use Canon Log 2).
    But that's not a camera I'm likely to buy. It isn't for me to say what others' needs are. If they need 10 bit, they need 10 bit. Not my concern since I don't. 
    I still agree with @jonpais about HDR. In theory (I think), any "normal" or "flat" look, no matter how stretched it is, is just a flat rec709 look, and only has colors within the rec709 gamut. Where log differs is that it takes a container designed for rec709 but lets you capture colors outside that gamut, and then a LUT brings those colors back to a given colorspace. RAW, likewise, is only limited by the thickness of the BFA (which I believe differs between the C100/C300 and C500, fwiw), so you can take a RAW image and bring it into rec709 or bring it into any other given color space if the color is there in the first place. While I bet you could take a rec709 image and map it into HDR, that would be "faking it." Even a GH5 is (presumably) capturing a significantly wider dynamic range and color gamut than rec709 in v log, and shooting log or RAW will let you access those colors. But yes, tonality will suffer. How much? I'm not sure.
    I'm also not that interested in producing HDR content, at least at the moment, so for me color and tonality matter more. I suspect it requires a really good source to get high end HDR that has great tonality, but I wouldn't be surprised if HDR from the GH5 still looks good.
    I completely agree with @Damphousse that highlight roll off matters more than dynamic range to the average viewer. That's why I never got behind the A7S. The chroma clipping is severe. With a C300 or F3, my image might clip, but I can make it clip in a way that's aesthetically pleasing. With the A7S (and the early F5, it's better now) colors clipped wrong and it looked like video. So for me, the camera with the lower dynamic range is the one with the better image and grading potential, subjectively. The Alexa gets both right, you'll want to grade to burn it out.
    Furthermore, I have reservations about HDR. My first taste of it was Dolby's 10,000 nit demo with high end acquisition, so I'm really spoiled on it. I also can't afford a new tv, so there's that. More than anything, HDR and 4k feel like a simulation of reality, whereas film feels like a more physical, organic medium. You go to a movie theater and watch film and the image is 24fps in a dark environment and your eyes are in mesopic vision, and the film has a sort of built-in tone mapping from the halation around bright spots, and diffusion filters or anamorphic lenses, both of which I love, bring it out even more with wild flares. There's not so much resolution, but the color is beautiful, and the contrast in the color allows the image to look much richer and higher dynamic range or higher contrast than it is. I like theatrical lighting with film, vivid and colorful. It feels a bit more like a dream, but I prefer stylized (not over-stylized) filmmaking.
    Whereas HDR, though stunning, seems to lend itself better to very naturalistic photography. Granted it makes anything look way better, but I see it being most useful in VR with like 8k per eye 120fps reality simulation. And I sort of think that's its own medium. Generally I think photography has tended a bit too much toward naturalism lately. So for my own purposes, I'm just happy doing weird stuff to 8 bit footage. But HDR is stunning. HDR nature documentaries are going to be breathtaking. 
     
  22. Like
    HockeyFan12 got a reaction from EthanAlexander in What was the first professional camera to shoot LOG gamma?   
    2k (not HD) 444 ProRes is about 38MB/sec; ArriRAW (2.8k Bayer for 2k delivery) is 168MB/sec.
    Yes, it's only about a 77% reduction in file size, which is significant on tv shows but perhaps not on the largest feature films. I suppose "tiny fraction" is an exaggeration.
    But ArriRAW has its own gamma mapping to a 12 bit container from a dual 14 bit ADC that then converts to a 16 bit signal in the camera. So, if you were starting with the true RAW signal, which is either 28 bit or 16 bit depending on how you look at it, the reduction in file size would be dramatically greater. In the case of ArriRAW, the RAW data itself has its gamma stretched (similar to, but different from, log) to save space. 
    So perhaps ArriRAW is not the best example because it compresses the gamma, too, and a 77% reduction in file size isn't that big for your needs (it is for mine).
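    For anyone who wants to check my arithmetic, here's the back-of-envelope version using the figures quoted above (my numbers, not official Arri specs):

```python
# Back-of-envelope check on the data rates quoted above (figures as given
# in the post, not official Arri numbers).

prores_444_2k_mb_s = 38   # 2K 444 ProRes, MB/s
arriraw_2_8k_mb_s = 168   # ArriRAW 2.8K, MB/s

reduction = 1 - prores_444_2k_mb_s / arriraw_2_8k_mb_s
print(f"ProRes vs ArriRAW: ~{reduction:.0%} smaller")  # ~77%

# Per hour of footage, at 3600 seconds/hour and 1000 MB per GB (decimal):
for name, rate in (("ProRes 444", prores_444_2k_mb_s), ("ArriRAW", arriraw_2_8k_mb_s)):
    print(f"{name}: ~{rate * 3600 / 1000:.0f} GB per camera-hour")
# ProRes 444 lands around 137 GB/hour and ArriRAW around 605 GB/hour,
# which is why the saving matters more on a long-form TV schedule than
# on a single feature.
```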
    I'm not sure what I "don't get." My own experience shooting 10bit SLOG 2 on the F5 indicated that the codec wasn't well-implemented for that flat a gamma, and I ended up not liking that camera when it was first released. (Overexposed by a stop, it's not so bad, and it's better now.) I think what you miss is that most serious shooters are running these tests for themselves. Problems like sensor banding in the C300 Mk II reducing the stated 15 stops of dynamic range and SLOG 2 on the A7S being too "thin" and Red's green and red chromaticities being placed too close are well-documented at the ASC and the ACES boards.
    Furthermore, the Alexa is pretty darned good even at 422, which you posit is too thin for log. (And Log C is very flat as gammas go.) Many tv shows shoot 1080p 422 (not even HQ) for the savings in file size. They still shoot log, the images still have good tonality, if slightly less flexibility than 444 ProRes or ArriRAW affords. Just because a few log profiles aren't all they're cracked up to be doesn't mean log profiles are inherently bad or wasteful. 
  23. Like
    HockeyFan12 got a reaction from EthanAlexander in What was the first professional camera to shoot LOG gamma?   
    Can you send me a link to the video you mention where you discuss bit depth per stop in various formats/gammas? I want to make sure I watch the right one. It is an interesting topic and worth exploring. There are, no doubt, trade offs with log gammas screwing with tonality. But by distributing data differently (I believe most camera sensors have 14 bit ADCs in RAW, but that data is not stored efficiently) you can maintain good tonality in a smaller package. Which is the whole point of log capture. No one says it's better than RAW capture, but in the case of the Alexa, for instance, 10 bit Log C 444 is maybe 99.9% as good–and a tiny fraction of the size. 
    Furthermore, dynamic range is not the question so much as tonality is. With adequate dithering (or in the case of most cameras, noisy sensors doing the job for you) you can obviate banding for any given dynamic range at an arbitrarily low bit depth. (At a certain point it'll just be dithered black and white pixels–but no banding!) The color and tonality, however, will suffer terribly. I shoot a bit with a Sigma DP2 and I do notice a lot of poor tonality on that camera relative to the gold standard of 4x5 slide film, despite both having poor dynamic range, and even in RAW. I believe it has a pretty low-bit ADC.
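    Here's a toy version of that dithering point, quantizing a smooth ramp to 2 bits with and without noise; it's purely illustrative and doesn't model any real sensor or codec:

```python
import random

# Toy illustration of the dithering point above: a smooth ramp quantized to
# 2 bits (4 levels) bands hard, but adding about one quantization step of
# noise before rounding trades the bands for grain.

random.seed(0)
LEVELS = 4                             # 2-bit "codec"
ramp = [i / 999 for i in range(1000)]  # smooth 0..1 gradient

def quantize(v: float) -> int:
    return min(LEVELS - 1, max(0, round(v * (LEVELS - 1))))

hard = [quantize(v) for v in ramp]  # long runs of identical codes: visible bands
dithered = [quantize(v + random.uniform(-0.5, 0.5) / (LEVELS - 1)) for v in ramp]

# Hard quantization collapses a long stretch of the ramp onto one code (a
# band); the dithered version averages back out to the original value.
segment = slice(200, 300)
print("mean of original segment :", sum(ramp[segment]) / 100)
print("mean of hard-quantized   :", sum(hard[segment]) / 100 / (LEVELS - 1))
print("mean of dithered         :", sum(dithered[segment]) / 100 / (LEVELS - 1))
```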
    While I admire your reasoning and rigor, I agree with @jonpais for the most part. I agree that a ten bit image, properly sourced, will hold up better in the grade than an 8 bit one, but will look the same to the eye ungraded. While I know (secondhand) of some minor "cover ups" by camera manufacturers, none are too nefarious and consistently it's stuff you can figure out for yourself by running camera tests, and things people online identified anyway, and which were eventually rectified to some extent. Most camera manufacturers are surprisingly transparent if you can talk to their engineers, and there are white papers out there:


    However, this is over my head.
    Where I disagree with Jon is his statement that a given log profile from any camera is adequate for HDR content production. In theory, since HDR standards are poorly defined, this might be true. But it doesn't mean it's giving you the full experience. My only exposure to HDR (other than displays at Best Buy, and trying HDR Video on an iPhone X) has been a Dolby 10,000 nit demonstration and a few subsequent conversations with Dolby engineers. The specs I was given for HDR capture by them were 10 bit log capture or RAW capture, rec2020 or greater color space, and 15 stops of dynamic range or greater. Of course, there are many HDR standards, and Dolby was giving the specs for top of the line HDR. But still, this was the shorthand for what Dolby thought was acceptable, and it's not something any consumer camera offers. They are, however, bullish on consumer HDR in the future. Fwiw, the 10,000 nit display is mind blowingly good.
    Just because Sony seems to be careless at implementing log profiles (which is weird, since the F35 is excellent and F3 is good, too) doesn't mean log profiles are universally useless. The idea is to compress the sensor data into the most efficient package while sacrificing as little tonality as possible. The problem arises when you compress things too far, either in terms of too low a bit depth or (much worse) too much compression. I do see this with A7S footage. And I think it's the reason Canon won't allow Canon Log 2 gammas in its intermediate 8 bit codec on the C200. I realize you wouldn't consider the Varicam LT and Alexa consumer-level, but the images from their 10 bit log profiles are great, with rich tonality and color that does not fall apart in the grade. Furthermore, I suspect the C200's RAW capture would actually fulfill even Dolby's requirements for high end HDR, and $6000 is not that expensive considering.
    Out of curiosity, do you use Canon Log on the C100? It's quite good, not true log, but well-suited for an 8 bit wrapper. 
  24. Like
    HockeyFan12 got a reaction from Nicholson Ruiz in Decisions decisions   
    I agree, a C100 Mk II and an 80D would be my choice. Not so much for image quality (which is still very, very good) but for convenience. Once your workload scales up, every hour spent transcoding or fixing weird Sony SLOG2 colors is going to cost you $50-$100 minimum... But at first you don't want to pay a ton and get into debt. So I think Canon's cameras combine the ease of use and image quality that's required without costing a ton, even if you can get a little more with a lot more effort by spending a little bit less elsewhere. For hobbyists and people stealing locations, I think mirrorless is cool. But for getting into paying work the C100 Mk II imo is the best choice (it would also be my choice for personal work, but that's a personal preference).
    I think the 1DC would offer a better image for the price, but not by much, and only with a lot more overhead for storage and a lot more kit to set up focus aids and accessories, etc. 
    If you're starting a business, I would say C100 Mk II for sure. Otherwise, just whatever intrigues you most as a hobbyist. 
  25. Like
    HockeyFan12 got a reaction from kaylee in Hey YouTube and Facebook - time to stop burying the good stuff with wall-to-wall bullshit   
    Broadly, it depends on whom and what you mean by "censor." (In this case it's just a warning before the video, so I'm not even sure it's censorship.) Generally, I'm for private self-censorship and rating boards but not for government-mandated censorship. One reason England has the BBFC (which is allowed to censor and/or ban movies as part of a government mandate) and we have the MPAA (which is simply a private rating board) is that the American film industry took it upon itself to "censor" (or just self-censor among its exhibitors) films produced by its members and exhibited in its theaters. This tamed the moral hysteria of the era and avoided government censorship (which would be, imo, anti-First Amendment) like you see in the UK. So as much as people (often fairly, imo) complain about MPAA ratings, I think the institution is a good one, and the institution of private censorship is a good one, too. YouTube, Facebook, Twitter, etc. are not public institutions. They have every right to ban and block whomever they want, and those people then have the right to look elsewhere to express their opinions.
    I'm not for government censorship of hate speech, though. I support The Daily Stormer's right to exist every bit as much as I support the web hosts (private companies) who refuse to host them. 
    In sum, yes, I support YouTube putting a warning before hate speech. It's their freedom of speech as a private enterprise that allows them to put that warning there, after all. So at the private level yes and at the public level no. 