Everything posted by androidlad

  1. 8K RAW frame grab, shot with animal eye AF.
  2. BMD patent document on the URSA 12K sensor design: https://patentimages.storage.googleapis.com/88/08/39/428dbbc9e5dfca/US20190306472A1.pdf Actual CFA pattern: In the document, it specifically discusses colour-aware pixel-binning to increase sensitivity and double the framerate, at the expense of lower resolution. So the "oversampling" mentioned in marketing is without doubt BS, at least for full-FOV framerates higher than 60fps: if it oversampled from the full 12K, the oversampled modes could not exceed 60fps, which is the maximum full-readout frame rate.
  3. This is the CFA pattern of the URSA 12K sensor: a 6x6 block instead of the traditional 2x2, with 18 RGB pixels and 18 white/transparent pixels, which improves SNR a bit but reduces resolution. So the optimal shooting mode will be 4K (full RGB info without interpolation, from 3x3 pixel-binning); 8K will be softer, and 12K 1:1 will be very, very soft.
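The resolution arithmetic behind that post can be sketched with simple averaging binning in NumPy. This is a toy model only: the sensor's actual colour-aware binning per the patent is more involved, and the 12288 x 6480 dimensions are taken from BMD's published URSA Mini Pro 12K specs.

```python
import numpy as np

# Toy sketch: 3x3 binning of a 12K-wide sensor array down to DCI 4K.
W, H = 12288, 6480
sensor = np.arange(W * H, dtype=np.float64).reshape(H, W)

def bin3x3(img):
    """Average non-overlapping 3x3 pixel groups (simple binning model)."""
    h, w = img.shape
    return img.reshape(h // 3, 3, w // 3, 3).mean(axis=(1, 3))

binned = bin3x3(sensor)
print(binned.shape)  # (2160, 4096) -> exactly DCI 4K
```

Note how 12288/3 = 4096 and 6480/3 = 2160 fall out exactly, which is consistent with 4K being the "native" binned mode.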
  4. Yes, I invented this 12K sensor, which is precisely the same size as the RED Komodo's at 27.03mm x 14.25mm, with exactly double the horizontal and vertical resolution. And who made the RED Komodo sensor? OK fine, this 12K sensor is a custom version of Canon's 120MXSC, operating in an ROI mode: https://canon-cmos-sensors.com/canon-120mxs-cmos-sensor/
  5. I can confirm that the 4K HFR modes (50-120p) are achieved using 2x2 pixel binning. It's a relatively elegant solution because there's no line-skipping involved: the image will be very soft, but with lower moire and lower noise compared to other subsampling methods.
  6. Free download of F-log ACES IDT DCTL for use in Resolve: https://blog.dehancer.com/articles/fuji-f-log-aces-idt-for-davinci-resolve-download-dctl/
  7. R5/R6 both have 10bit internal recording. BTW the new look of the forum, especially the new font, is obnoxious.
  8. The actual resolution in pixels is 2048 x 1536, versus the current high-end EVF with 1600 x 1200.
  9. That's their 100MP X-Trans project that has since been shelved.
  10. It requires far more aggressive line-skipping to read out the full height of the sensor, which is 8736 pixels. Currently the GFX100 uses 2/3 subsampling vertically to derive a 4352-pixel-high Bayer image from a 6528-pixel height, and that already saturates the maximum readout time of 32ms required to achieve a 30fps video frame rate.
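A rough back-of-the-envelope version of that readout arithmetic, assuming (as a simplification) that readout time scales linearly with the number of lines read:

```python
# Numbers from the post; the linear per-line readout model is an assumption.
full_height = 8736      # full sensor height in pixels
video_height = 6528     # vertical window used for video
lines_read = 4352       # lines actually read after 2/3 subsampling

print(lines_read / video_height)   # ≈ 0.667 -> reading 2 of every 3 lines
frame_time_ms = 1000 / 30          # ≈ 33.3 ms budget per frame at 30p
readout_ms = 32                    # stated readout time, already near the budget

# Reading all 8736 lines at the same per-line speed would roughly double it:
per_line_ms = readout_ms / lines_read
full_height_ms = per_line_ms * full_height  # ≈ 64 ms -> far over the 30p budget
print(round(full_height_ms, 1))
```

So without heavier skipping, a full-height readout would land near 64ms per frame, roughly half the required 30fps rate, which is consistent with the post's point.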
  11. Related, but not directly correlated. Dynamic range is a measure of a camera system: how far it can see into the shadows and how far it can see into the highlights. Dynamic range can be measured objectively, but even then there's a subjective component, as each viewer has their own noise tolerance threshold, which governs how much of the shadow part of the dynamic range they find actually usable. Latitude is related to dynamic range, but it is also scene dependent: latitude is the degree to which you can over- or under-expose a scene and still be able to bring it back.
  12. Great, this aligns nicely with what C5D did with their over/under tests. But this is testing latitude, not dynamic range.
  13. The Sigma FP, as well as the Panasonic S1, has no OLPF and is indeed very prone to moire in real-world shooting scenarios. This is exactly the reason why the S1H has an OLPF.
  14. It's a shame the FP doesn't have an OLPF, with such a low pixel density it's very prone to moire/aliasing.
  15. ProRes HQ cannot compare to ProRes RAW for adjusting white balance or ISO: because RAW is linear and scene-referred, the results are much better than with gamma-encoded colour spaces. You can linearise, but it adds quite a few additional steps.
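A toy sketch of the underlying point: white balance is a per-channel gain, and a gain is only a clean multiply in linear light. The gamma value and gain below are illustrative assumptions, not any camera's real transfer function.

```python
# Illustrative power-curve gamma; real log/gamma curves are more complex.
GAMMA = 2.4

def encode(x):      # linear -> gamma-encoded
    return x ** (1 / GAMMA)

def decode(x):      # gamma-encoded -> linear
    return x ** GAMMA

linear_red = 0.18   # a scene-linear red-channel value (assumed)
wb_gain = 1.5       # warm-up gain for the red channel (assumed)

# Correct path: apply the gain in linear, then encode for display.
correct = encode(linear_red * wb_gain)

# Naive path: apply the same gain directly to the gamma-encoded value.
naive = encode(linear_red) * wb_gain

print(round(correct, 3), round(naive, 3))  # the two results differ
```

Applying the gain to gamma-encoded data gives a different (wrong) result, which is why grading software must linearise encoded footage first, the "additional steps" the post mentions.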
  16. "Supports HDR in movie shooting" Anyone tested this? I wonder what it does.
  17. If this were true, there would be accompanying empirical evidence to support it. For now, it's only your subjective opinion. Also, in BRAW the 6K scored 11.8 stops and the URSA Mini G2 scored 12.1.
  18. Is it difficult to make a 0.65x speedbooster? That way full frame glass would have a true and precise full frame equivalent field of view on APS-C cameras. With the current 0.71x speedbooster, there's still a crop factor of 1.09x: a 24mm lens would have a 26mm field of view.
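The crop-factor arithmetic can be checked quickly. The 1.53x APS-C crop used here is an assumption (Fuji's APS-C sensors are commonly quoted around 1.5x):

```python
# Effective crop = sensor crop factor x speedbooster magnification.
aps_c_crop = 1.53  # assumed APS-C crop factor

for booster in (0.71, 0.65):
    effective = aps_c_crop * booster
    # Show the effective crop and what a 24mm lens "becomes".
    print(booster, round(effective, 2), round(24 * effective, 1))
```

With 0.71x the effective crop is ~1.09x (24mm behaves like ~26mm, matching the post); with 0.65x it drops to ~0.99x, essentially true full frame.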
  19. I know what you meant, but it's worded incorrectly; what you wanted to say is that it would lose SNR. Note that pure pixel-binning actually increases SNR (the 2x2, 3x3 etc. binning you see on smartphone sensors).
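A quick Monte-Carlo sketch of why binning raises SNR: summing N shot-noise-limited pixels grows the signal by N but the noise only by sqrt(N), so SNR gains roughly sqrt(N). The per-pixel signal level here is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
mean_electrons = 100                      # assumed per-pixel signal level
# Simulate many 2x2 groups of shot-noise-limited (Poisson) pixels.
pixels = rng.poisson(mean_electrons, size=(1_000_000, 4))

single = pixels[:, 0]                     # one unbinned pixel
binned = pixels.sum(axis=1)               # 2x2 binned super-pixel

snr_single = single.mean() / single.std()
snr_binned = binned.mean() / binned.std()
print(round(snr_binned / snr_single, 2))  # ≈ 2.0, i.e. sqrt(4) for 2x2 binning
```

This models pure shot noise only; with read noise included, hardware binning before readout gains even more than sqrt(N), which is why smartphone sensors lean on it.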
  20. Oh yeah it's already in the hands of many influencers/industry pros, who are anxiously awaiting the NDA lift.
  21. Most cameras that output ProRes RAW at the moment are mirrorless cameras with HDMI output, and Atomos developed the RAW-over-HDMI protocol, which they license to camera manufacturers for free. For cameras that output RAW over SDI, BMD needs to develop support for each maker's RAW spec (the EVA1 outputs 10bit log-encoded RAW, Sony CineAlta outputs 16bit linear RAW). The same applies to Atomos, but Atomos owns the RAW-over-HDMI protocol and it's being widely adopted, so they pretty much have full control over the RAW spec. So instead of saying BRAW is sensor specific, you could say it's brand specific.
  22. Nope. BRAW is just a codec; it has nothing to do with sensors or camera models, but it requires BMD's FPGA for the encoding. Same for ProRes RAW: Apple has licensed the encoder to Atomos and DJI, and it can encode any incoming RAW signal.