
helium

Members
  • Content Count: 53
  • Joined
  • Last visited

About helium

  • Rank: Member

  1. High saturation and a steep curve yield the same artifacts in Resolve. There's a recent thread on the BMD forum with someone complaining about much the same in BMPCC 4K footage: the shot was 4 stops over and then stressed in post.
  2. Would you kindly link to the original clip? I'm not going to install Premiere, but I'm interested to see whether the clip can be broken similarly in Resolve. I think it probably can; it's just a matter of order of operations. Also, are you sure you did nothing in Premiere that required 8-bit processing? One 8-bit fx, and the whole thing goes 8-bit.
  3. I haven't examined the video, but one glaring uncontrolled factor above is the LUT, and where in the chain it's being applied in Premiere. As far as I know -- it's been a long time since I worked in Premiere -- there's no way to apply a potentially destructive normalizing LUT without actual destruction. This is obviously not the case with Resolve, with its sequential nodes and/or Color Management. Given the extent of the claims you're making, effectively calling Adobe a liar, I think you want to eliminate that LUT variable.
  4. Without knowing your exact workflow, it's impossible to draw any conclusions from those examples. Again, what's the source of this claim that Premiere timelines(??) are 8-bit, which amounts to saying that Premiere Lumetri is 8-bit....
  5. Premiere *does* offer 32-bit internal processing, with the exception of some fx, which could still be 8-bit (32-bit fx are labelled as such). Basic color correction should all be 32-bit Lumetri. What's the source of the claim that "it basically converts to 8 bit which causes many artifacts"? (There's a rough sketch of why a single 8-bit stage matters after this list.) https://mixinglight.com/color-tutorial/demystifying-premiere-pros-color-management-and-finishing-pipeline/
  6. Aspirations to cinema drive most of the posts on this site, but that aside, if your delivery is to YouTube or Instagram, what's the point of this endless dispute over cameras and methods? What does it matter? Why this endless torture and inquisition, when nobody goes to these delivery services for their technical excellence? It's this constant inanity: folks post to YouTube and everyone else is supposed to swoon over one camera or another, when all you're looking at is a crude representation of color correction, production value and compression quality, calibrated to who knows what standard of display, which could be approximated by dozens of other cameras.
  7. Or, when using a camera like the C200 or BMPCC 4K, or any other capable 10-bit 4:2:2 camera, you could just expose using the traditional approach, as most cinematographers do. It's ironic, because ETTR was suggested above as a way to avoid the bad results of YouTube posters, and yet ETTR, at YouTube levels of competence, is more likely to destroy images than "proper" traditional exposure techniques. This preoccupation with noise is also a bit strange for a so-called cinema camera. Cinema, after all, is full of noise, commonly known as grain.
  8. Try substantially over-exposing BMPCC 4K footage (but shy of clipping), since that camera is party to this discussion. Observe color and contrast -- as well as highlight roll-off, and the lack of adjustability in the highlights. How you concluded that it's "a non-destructive shifting of data" I don't know. Sensors don't work that way.
  9. Since when is ETTR an established and accepted practice for cinema, or even for 10-bit log shooting generally? For example, how can you assess highlight roll-off if you're cramming everything into the upper end of the IRE range? (There's a back-of-the-envelope headroom sketch after this list.) I can't speak to the S1H, but what makes you think you get the best image out of the BMPCC 4K/6K shooting ETTR? Drawing the conclusions you've offered here, based on 8-bit DSLR exposure techniques, may be risky.
  10. If by "sharpening" you mean the halo thing with the BMPCC 4K/6K (in comparison to the S1/S1H), I think it's a misleading term, because the footage doesn't have the obvious visible characteristics of video sharpening. I've never heard anyone else complain about the BMD footage in your terms. If it's there, nobody else is seeing it, or they attribute the apparent greater resolution of cDNG to false detail. As for color science -- people will argue. The reason I offered a version of those two shots in the S1 thread (you did one version, I did another -- BMPCC 4K vs S1) was to show people that apart from the obvious giveaways, like DOF and coverage, they couldn't tell them apart with even the most rudimentary adjustments. And we haven't started on the actual grading.
  11. Unless the sets are factory-calibrated, which is impossible at consumer prices, the best they can do is remove the worst default settings, like motion smoothing, digital enhancements and image sharpening. Beyond that, it remains to be seen how accurate the picture is or if they even attempt color accuracy.
  12. For anyone not persuaded that 4K and HD are only for snowflakes, wusses and Steve Yedlin, the S1 is going for $2200 with V-Log included.
  13. The difference between the LUTs is so insignificant, compared to the likely grading you'd want to do on this shot, that it doesn't seem worth worrying over. The more important question might be how much adjustment is available after the application of one LUT or another (a toy order-of-operations sketch follows the list below).
  14. That's not really what I was saying. Just pointing out the obvious. A great camera without the production values only money can buy is useless outside the realm of the hobbyist or dreamer-only filmmaker. And it's not a foregone conclusion that an actual no-budget movie benefits from a great camera, except to the extent that the movie is so badly shot it doesn't look like it came from a great camera.
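On the 8-bit question in posts 2, 4 and 5: here's a minimal sketch, in plain NumPy rather than anything from Premiere or Resolve, of why a single 8-bit stage in an otherwise float pipeline matters. A smooth gradient passes through one 8-bit quantization, then the same steep curve is applied to both paths; the float path keeps its levels, the quantized path collapses into bands. The gradient and the curve are made up for illustration.

```python
import numpy as np

# A smooth 0-1 gradient, standing in for something like a graded sky.
grad = np.linspace(0.0, 1.0, 4096, dtype=np.float32)

def steep_curve(x, gamma=0.3):
    # Stand-in for an aggressive contrast move (a steep curve).
    return np.clip(x, 0.0, 1.0) ** gamma

# Path A: stay in float the whole way (32-bit style processing).
float_path = steep_curve(grad)

# Path B: one 8-bit stage in the middle (256 possible values), then the same curve.
eight_bit = np.round(grad * 255.0) / 255.0
quantized_path = steep_curve(eight_bit)

# Count distinct output levels in the darkest 10% of the ramp, where the
# steep curve stretches values apart the most.
dark = slice(0, 410)
print("distinct levels, float path: ", np.unique(np.round(float_path[dark], 6)).size)
print("distinct levels, 8-bit stage:", np.unique(np.round(quantized_path[dark], 6)).size)
# The 8-bit path ends up with a fraction of the levels -> visible banding once stretched.
```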
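On ETTR in posts 7-9, a back-of-the-envelope headroom check. The figures are assumptions for illustration, not measured specs for any camera: if a log profile holds roughly H stops above middle grey and a scene highlight already sits S stops over, pushing exposure to the right by E stops leaves H - S - E stops before clipping.

```python
# Back-of-the-envelope headroom arithmetic. HEADROOM_STOPS is an assumed
# figure for illustration, not a measured spec for any particular camera.
HEADROOM_STOPS = 5.0  # assumed stops a log profile holds above middle grey

def stops_left(highlight_over_grey, ettr_push):
    """Stops remaining before clipping for a highlight sitting
    `highlight_over_grey` stops above middle grey, after pushing
    exposure to the right by `ettr_push` stops."""
    return HEADROOM_STOPS - highlight_over_grey - ettr_push

# A bright sky ~4 stops over middle grey, exposed traditionally vs pushed right:
for push in (0.0, 1.0, 1.5):
    print(f"ETTR +{push} stops -> {stops_left(4.0, push):+.1f} stops of headroom left")
# Once the result goes negative, that highlight clips; everything just short
# of clipping sits in the most compressed part of the roll-off.
```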
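And on the LUT order-of-operations point in posts 3 and 13, a toy sketch. The "LUT" here is just a clipping tone curve invented for the example, not a real camera or creative LUT: correcting before it preserves the highlight ramp, while baking it in first and correcting afterwards leaves the clipped values flat.

```python
import numpy as np

# Toy stand-in for a normalizing LUT: a 1D tone curve that clips above 1.0.
# (Invented for the example; not a real camera or creative LUT.)
def normalize_lut(x):
    return np.clip(1.2 * x - 0.05, 0.0, 1.0)

# A log-encoded highlight ramp pushed near the top of the range.
log_highlights = np.linspace(0.85, 1.0, 8, dtype=np.float32)

pull = 0.8  # an exposure/gain pull applied in the grade

# Order A: correct first, then apply the LUT.
corrected_then_lut = normalize_lut(log_highlights * pull)

# Order B: bake the LUT in first, then try to pull exposure afterwards.
lut_then_corrected = np.clip(normalize_lut(log_highlights) * pull, 0.0, 1.0)

print("correct before LUT:", np.round(corrected_then_lut, 3))
print("correct after LUT: ", np.round(lut_then_corrected, 3))
# In order B, everything the LUT already clipped to 1.0 comes down to the same
# flat value -- the highlight separation is gone before the grade starts.
```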