NickSim

Members
  • Posts: 3
  1. Not to sound as if I'm offended, because I'm not... but why do you say that? Of course it isn't the top dog, but in my situation I can't justify having the best reference monitor when what I have is certainly better than 99% of the devices my audience is using; e.g. 61% of people are viewing my content on their smartphones. I'm not producing content intended for display on 70"+ HDR10/DV displays, I just want to do the best work that I can realistically do.
  2. @kye @Deadcode Thanks for the info, I've now got a good direction to further my investigations. @jonpais You can learn more about the Dell UP2718Q here. The picture gets a little strange at 1,000 nits: you can see some highlight 'glow', a bleeding effect that is especially visible when viewed off axis. So I wouldn't put it up against a $12,000 NEC or Eizo, but I feel that it's sufficient for my use. It was also a collaboration between Dell and X-Rite; it has built-in support for the X-Rite i1 Display at the hardware level, which is supposed to retain the entire color gamut, as opposed to software calibration, which may take a 100% AdobeRGB-rated panel down to, say, 98.5% after calibration (a rough gamut-coverage sketch follows after these posts). With that said, I am having issues performing calibration when outputting HDR. I believe it's because I have an older (2011 build) i1 Display which doesn't support 1,000-nit levels of output; I vaguely remember reading that newer models support 1,500(?) nits even though they aren't rated for that per se. I'm currently waiting to hear back from X-Rite support on that one.
  3. Hey guys, I'm new, so apologies if I'm posting in the wrong place or if I seem... confused, because I am! My goal is to create content of the highest quality I can generate today, which looks good on devices available to the masses right now and in the future, i.e. HLG/HDR10/DV. I'm seeking NLE advice, specifically for DaVinci Resolve (15 Beta), as it is my understanding that Resolve is ahead of the curve with regard to working with and triggering HDR content properly for YouTube delivery; please correct me if I'm wrong. I've jumped on the HLG bandwagon perhaps without fully comprehending the nuances of HLG, but alas, here I am...

     I'm shooting with an A7R III in HLG / 8-bit Rec.2020 color space, a GH5 in HLG 4:2:2, and a DJI X5S in D-Cinelike -1/-1/-1, all ETTR. Mostly 4K/30p, but some 1080/60p/120p to be played back at 30p for slow motion. I'm using an X-Rite color chart as a reference point to create my own LUTs. I have an HDR10 monitor that displays 100% AdobeRGB (Dell UP2718Q), running Windows 10 with HDR/WCG enabled.

     My questions are:
       • What should I be using for project settings? I'm assuming a 4K UHD 29.97 timeline and playback, DaVinci YRGB Color Managed, with Input/Timeline/Output set to Rec.2020 HLG ARIB STD-B67 (see the project-settings and HLG transfer-function sketches after these posts). And will a render using that output setting cause viewing issues on SDR devices?
       • Curiously, I can't enable the checkbox "Enable HDR Metadata over HDMI", I can't find "Enable HDR Scopes for ST 2084", and I can't enable the checkbox "HDR Mastering is for XXXX Nits" as shown in this tutorial, probably because I'm using an mDP cable to my monitor and/or because he's using Resolve 12.5... does this matter? My understanding of DP vs. HDMI is that for monitor use, DP is the preferred connection method.
       • I believe I'm dealing with mixed gammas between the HLG and D-Cinelike footage. How does one work with mixed gammas in a single timeline? How does one handle Rec.2020 and Rec.709 clips in a single timeline? If I'm working in a Rec.2020 color space, does that affect the appearance of Rec.709 footage in any way?
       • Does working and grading in an HDR space change the appearance of SDR content when delivered to SDR viewers? In other words, what looks good to me may be crushed when viewed in SDR/Rec.709; how do I know about, prevent, or deal with that issue?

     Thanks for your input, guys! As you can tell, I'm still new to all of this... Photography came to me easily, but video... phew.
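
A note on the gamut figures in post 2: the "100% AdobeRGB vs. ~98.5% after software calibration" comparison is essentially a ratio of chromaticity-triangle areas in CIE 1931 xy space. The Python sketch below is illustrative only; the AdobeRGB primaries are the standard ones, but the "measured" primaries are invented numbers, not measurements of the UP2718Q, and the simple area ratio ignores the intersection of the two gamuts.

    # Rough gamut-coverage estimate: ratio of triangle areas in CIE 1931 xy space.
    # NOTE: the "measured" primaries below are invented for illustration, and this
    # assumes the measured gamut sits roughly inside the reference gamut.

    ADOBE_RGB = [(0.6400, 0.3300), (0.2100, 0.7100), (0.1500, 0.0600)]  # R, G, B xy
    MEASURED  = [(0.6380, 0.3310), (0.2150, 0.7020), (0.1510, 0.0620)]  # hypothetical panel

    def triangle_area(points):
        """Shoelace formula for the area of a triangle given three (x, y) points."""
        (x1, y1), (x2, y2), (x3, y3) = points
        return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

    coverage = triangle_area(MEASURED) / triangle_area(ADOBE_RGB)
    print(f"Approximate AdobeRGB coverage: {coverage:.1%}")

A strict coverage number would use the area of the intersection of the two triangles rather than a plain ratio, but for near-coincident gamuts the simple ratio gives a similar figure.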
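For the HLG questions in post 3, it can help to see the transfer function itself. This is the ITU-R BT.2100 HLG OETF and its inverse written out in Python as a plain reference, nothing Resolve-specific: it maps scene-linear light E in [0, 1] to the non-linear signal E' and back.

    import math

    # ITU-R BT.2100 HLG OETF constants
    A = 0.17883277
    B = 1.0 - 4.0 * A                 # 0.28466892
    C = 0.5 - A * math.log(4.0 * A)   # 0.55991073

    def hlg_oetf(e: float) -> float:
        """Map scene-linear light E (0..1) to the HLG non-linear signal E' (0..1)."""
        if e <= 1.0 / 12.0:
            return math.sqrt(3.0 * e)
        return A * math.log(12.0 * e - B) + C

    def hlg_inverse_oetf(ep: float) -> float:
        """Map the HLG signal E' back to scene-linear light E."""
        if ep <= 0.5:
            return (ep * ep) / 3.0
        return (math.exp((ep - C) / A) + B) / 12.0

    # Sanity check: black, the 0.5 knee point, 18% grey, and peak
    for e in (0.0, 1.0 / 12.0, 0.18, 1.0):
        ep = hlg_oetf(e)
        print(f"E={e:.4f} -> E'={ep:.4f} -> back to {hlg_inverse_oetf(ep):.4f}")

The square-root segment below the knee is what makes HLG roughly backward-compatible with SDR gamma on conventional displays, which is why an HLG master can still be watchable on SDR devices even though highlights above the knee roll off differently.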
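On the project-settings question in post 3: those settings are normally chosen in Project Settings > Color Management, but Resolve also exposes a Python scripting API (Studio only in the 15 era). The sketch below is a minimal, hedged example: the resolution and frame-rate keys are documented for Project.SetSetting(), while the color-science key/value is an assumption that may differ by Resolve version.

    #!/usr/bin/env python
    # Minimal sketch using the DaVinci Resolve scripting API (Studio builds).
    # Assumes Resolve's scripting module is on the Python path, as described in
    # Resolve's scripting README.
    import DaVinciResolveScript as dvr

    resolve = dvr.scriptapp("Resolve")
    project = resolve.GetProjectManager().GetCurrentProject()

    # UHD 29.97 timeline (documented SetSetting keys).
    project.SetSetting("timelineResolutionWidth", "3840")
    project.SetSetting("timelineResolutionHeight", "2160")
    project.SetSetting("timelineFrameRate", "29.97")

    # ASSUMPTION: key/value names for color management vary by version; if this
    # returns False, pick "DaVinci YRGB Color Managed" and the Rec.2020 HLG
    # (ARIB STD-B67) spaces manually in Project Settings > Color Management.
    project.SetSetting("colorScienceMode", "davinciYRGBColorManaged")

If the scripting route isn't available, setting the same values in the Project Settings dialog is equivalent; the script only saves clicks when creating many projects with the same HLG setup.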